WO2021185085A1 - Display method and display control device - Google Patents

Display method and display control device

Info

Publication number
WO2021185085A1
Authority
WO
WIPO (PCT)
Prior art keywords
liquid crystal, image, crystal cell, displayed, target liquid
Application number
PCT/CN2021/078944
Other languages
English (en)
French (fr)
Inventor
罗伟城
高少锐
王海涛
张朋
李江
Original Assignee
Huawei Technologies Co., Ltd.
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021185085A1
Priority to US17/947,427 (published as US20230013031A1)


Classifications

    • G09G 3/03: Control arrangements for visual indicators other than cathode-ray tubes, specially adapted for displays having non-planar surfaces, e.g. curved displays
    • G02B 5/0236: Diffusing elements characterised by diffusion taking place within the volume of the element
    • G02B 30/52: Three-dimensional [3D] effects, the 3D volume being constructed from a stack or sequence of 2D planes, e.g. depth sampling systems
    • G02B 30/56: Three-dimensional [3D] effects by projecting aerial or floating images
    • G02F 1/1334: Liquid crystal cells based on polymer dispersed liquid crystals, e.g. microencapsulated liquid crystals
    • G02F 1/137: Liquid crystal cells characterised by the electro-optical or magneto-optical effect
    • G02F 1/13737: Field-induced phase transition in liquid crystals doped with a pleochroic dye
    • G02F 1/139: Orientation effects in which the liquid crystal remains transparent
    • G02F 1/1391: Bistable or multi-stable liquid crystal cells
    • G03B 21/56: Projection screens
    • G03B 21/62: Translucent screens
    • G09G 3/001: Display control using specific devices, e.g. projection systems
    • G09G 3/002: Projecting the image of a two-dimensional display
    • G09G 3/003: Producing spatial visual effects
    • G09G 3/36: Matrix displays by control of light from an independent source using liquid crystals
    • H04N 13/302: Image reproducers for viewing without the aid of special glasses (autostereoscopic displays)
    • H04N 13/363: Image reproducers using image projection screens
    • H04N 13/383: Image reproducers using viewer tracking with gaze detection
    • H04N 13/398: Synchronisation and control of stereoscopic or multi-view image reproducers
    • G09G 2340/0464: Positioning (changes in size, position or resolution of an image)
    • G09G 2354/00: Aspects of interface with display user

Definitions

  • This application relates to the field of display, and in particular to a display method and a display control device.
  • The present application provides a display method and a display control device, which help improve the stereoscopic effect when a user views a three-dimensional image with the naked eye.
  • The present application provides a display method, applied to a terminal device, where the terminal device includes a projection screen.
  • The projection screen includes a transparent substrate and a liquid crystal film covering the transparent substrate, and the liquid crystal film includes a plurality of liquid crystal cells.
  • The display method includes: acquiring an image to be displayed; determining the target liquid crystal cell among the plurality of liquid crystal cells based on the positions of pixels in the image to be displayed; and then setting the state of the target liquid crystal cell to a scattering state and the state of the non-target liquid crystal cells to a transparent state.
  • A non-target liquid crystal cell is any liquid crystal cell, among the plurality of liquid crystal cells, other than the target liquid crystal cell.
  • The projected image of the image to be displayed is displayed on the target liquid crystal cell.
  • The image to be projected, derived from the image to be displayed, can be projected onto the target liquid crystal cell, so that the projected image is displayed there.
  • Because the projected image of the image to be displayed is shown on a transparent projection screen, its background merges with the surrounding environment, which improves the visual effect.
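The flow just described (determine the target cells from pixel positions, then set cell states) can be sketched as follows. This is an illustrative model, not the patented implementation: the `pixel_to_cell` mapping and the state names are assumptions.

```python
SCATTERING, TRANSPARENT = "scattering", "transparent"  # the two cell states named above

def determine_target_cells(pixel_positions, pixel_to_cell):
    """Map each pixel of the image to be displayed to the liquid crystal
    cell that should show it. `pixel_to_cell` is a hypothetical calibration
    function from a pixel coordinate to a cell index."""
    return {pixel_to_cell(p) for p in pixel_positions}

def set_cell_states(all_cells, target_cells):
    """Target cells scatter the projected light (and so display the image);
    every other cell is left transparent, so the background shows through."""
    return {c: (SCATTERING if c in target_cells else TRANSPARENT) for c in all_cells}

# Toy usage: 4 cells, pixels 0..9 mapped 10 pixels per cell.
targets = determine_target_cells(range(10), lambda p: p // 10)
states = set_cell_states(range(4), targets)
```
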
  • The above "obtaining the image to be displayed" includes selecting the image to be displayed from a stored image library, or downloading it from the network.
  • The image to be displayed includes a three-dimensional image, and the projected image of the image to be displayed includes a two-dimensional image.
  • In this way, the two-dimensional projected image of the three-dimensional image to be displayed can be shown on a transparent projection screen.
  • The above "setting the state of the target liquid crystal cell to the scattering state and setting the state of the non-target liquid crystal cell to the transparent state" includes either of the following:
  • A first preset voltage is applied to the target liquid crystal cell to control its state to the scattering state, and a second preset voltage is applied to the non-target liquid crystal cell to control its state to the transparent state; or
  • the second preset voltage is applied to the target liquid crystal cell to control its state to the scattering state, and the first preset voltage is applied to the non-target liquid crystal cell to control its state to the transparent state.
  • The first preset voltage is greater than or equal to a preset value, and the second preset voltage is less than the preset value.
  • In this way, the projected image of the image to be displayed can be shown on a transparent projection screen, so that its background merges with the surrounding environment, which improves the visual effect.
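A minimal sketch of the voltage assignment above. The numeric voltages and the threshold are placeholders (the text only defines their ordering), and the `scattering_at_high_voltage` flag covers the two alternative assignments the text describes, since which preset voltage yields which state depends on the film type.

```python
PRESET_VALUE = 5.0    # assumed threshold (volts); not specified in the text
FIRST_PRESET = 6.0    # greater than or equal to PRESET_VALUE
SECOND_PRESET = 0.0   # less than PRESET_VALUE

def cell_voltage(is_target, scattering_at_high_voltage):
    """Return the preset voltage to apply to one liquid crystal cell.

    A target cell must end up in the scattering state and a non-target
    cell in the transparent state; whether the first or the second preset
    voltage produces scattering depends on the liquid crystal film."""
    wants_scattering = is_target
    if scattering_at_high_voltage:
        return FIRST_PRESET if wants_scattering else SECOND_PRESET
    return SECOND_PRESET if wants_scattering else FIRST_PRESET
```
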
  • The above liquid crystal film includes a polymer dispersed liquid crystal film, a bistable liquid crystal film, or a dye-doped liquid crystal film.
  • The above projection screen includes a curved screen, or the above projection screen includes a three-dimensional screen.
  • With a curved screen or a three-dimensional screen, when the user views the two-dimensional projected image of the three-dimensional image on the projection screen with the naked eye, the user can see a realistic three-dimensional image "floating" in the air, which improves the stereoscopic effect of viewing the three-dimensional image with the naked eye.
  • The above display method further includes: tracking the position of the human eye.
  • The above "determining the target liquid crystal cell among the plurality of liquid crystal cells based on the position of the pixel in the image to be displayed" includes: determining the position of the target liquid crystal cell among the plurality of liquid crystal cells based on the tracked human eye position and the position of the pixel in the image to be displayed.
  • Determining, from the tracked human eye position, the position of the target liquid crystal cell used to display the projected image can improve the stereoscopic effect of viewing the projected image at that eye position.
  • The above "determining the position of the target liquid crystal cell among the plurality of liquid crystal cells based on the tracked human eye position and the position of the pixel in the image to be displayed" includes: for each pixel, constructing the line connecting the tracked human eye position and the position of that pixel, finding the intersection of that line with the projection screen, and determining the liquid crystal cell at the position of the intersection as a target liquid crystal cell.
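For a planar screen, the intersection just described reduces to a line-plane computation. The sketch below makes that simplification (the text also covers curved and three-dimensional screens, where the intersection test would differ):

```python
import numpy as np

def eye_pixel_screen_intersection(eye, pixel, screen_point, screen_normal):
    """Intersect the line through the tracked eye position and one pixel of
    the image to be displayed with a planar projection screen.

    All arguments are 3-vectors; returns the intersection point, or None
    if the line is parallel to the screen plane. The liquid crystal cell
    containing the returned point would be selected as a target cell."""
    eye = np.asarray(eye, dtype=float)
    d = np.asarray(pixel, dtype=float) - eye          # line direction
    n = np.asarray(screen_normal, dtype=float)
    denom = n @ d
    if abs(denom) < 1e-12:
        return None                                   # line parallel to screen
    t = n @ (np.asarray(screen_point, dtype=float) - eye) / denom
    return eye + t * d
```
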
  • The above terminal device further includes a first projection lens.
  • The above "projecting the image to be projected of the image to be displayed on the target liquid crystal cell" includes: adjusting the projection area of the first projection lens so that the first projection lens projects the image to be projected onto the target liquid crystal cell, where the field of view of the first projection lens is less than or equal to a preset threshold.
  • In this way, even when the image to be displayed is a three-dimensional image, a projection lens with a smaller field of view can still project the image to be projected onto the target liquid crystal cell determined according to the position of the human eye, which improves the stereoscopic effect of viewing the projected image at that eye position.
  • The above terminal device further includes a second projection lens.
  • The above "projecting the image to be projected of the image to be displayed on the target liquid crystal cell" includes: the second projection lens projects the image to be projected onto the target liquid crystal cell, where the field of view of the second projection lens is greater than the preset threshold.
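The two lens cases above amount to a simple selection rule. The threshold value below is a placeholder; the text only states that a preset threshold exists:

```python
FOV_THRESHOLD_DEG = 60.0  # hypothetical preset threshold, in degrees

def projection_strategy(lens_fov_deg):
    """Choose how to project onto the target liquid crystal cell.

    A 'first' lens (field of view at or below the threshold) must have its
    projection area steered onto the target cell; a 'second' lens (field of
    view above the threshold) covers the target cell without adjustment."""
    if lens_fov_deg <= FOV_THRESHOLD_DEG:
        return "adjust projection area of first lens"
    return "project directly with second lens"
```
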
  • The above terminal device further includes an image source module configured to project the image to be projected onto the projection screen.
  • The tracking module may be installed inside the terminal device or outside the terminal device.
  • When the tracking module is installed inside the terminal device, the size of the terminal device can be reduced.
  • When the tracking module is installed outside the terminal device, the detection light rays of the tracking module do not intersect the projection screen, so the area of the projection screen available for displaying the projected image increases. In this way, the projected image can be viewed at any eye position tracked within a larger range.
  • The present application provides a display control apparatus, applied to a terminal device, and the apparatus can be used to execute any of the methods provided in the first aspect.
  • The present application may divide the display control apparatus into functional modules according to any of the methods provided in the first aspect.
  • Each functional module may correspond to one function, or two or more functions may be integrated into one processing module.
  • For example, the present application may divide the display control apparatus, by function, into an acquisition unit, a determination unit, a setting unit, a control unit, and the like.
  • For descriptions of the possible technical solutions and beneficial effects of the functional modules divided above, refer to the technical solutions provided in the first aspect or its corresponding possible designs; details are not repeated here.
  • The present application provides a terminal device, which includes a projection screen, a processor, and the like.
  • The terminal device may be used to execute any of the methods provided in the first aspect.
  • For descriptions of the possible technical solutions and beneficial effects implemented by each module or component in the terminal device, refer to the technical solutions provided in the first aspect or its corresponding possible designs; details are not repeated here.
  • The present application provides a chip system, including a processor configured to call a computer program stored in a memory and run it, to execute any method provided in the implementations of the first aspect.
  • The present application provides a computer-readable storage medium, such as a non-transitory computer-readable storage medium, on which a computer program (or instructions) is stored; when the computer program (or instructions) runs on a computer, the computer is caused to execute any method provided in any possible implementation of the first aspect.
  • The present application provides a computer program product that, when run on a computer, causes any method provided in any possible implementation of the first aspect to be executed.
  • Any of the apparatuses, computer storage media, computer program products, or chip systems provided above can be applied to the corresponding methods provided above; therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods, and details are not repeated here.
  • FIG. 1 is a schematic diagram of a projection area provided by an embodiment of the application
  • FIG. 2 is a schematic structural diagram of a display system provided by an embodiment of this application.
  • FIG. 3 is a schematic diagram of the structure of a liquid crystal film provided by an embodiment of the application.
  • FIG. 4 is a schematic structural diagram of a projection screen provided by an embodiment of the application.
  • FIG. 5A is a first schematic diagram of the hardware structure of a display system provided by an embodiment of the application.
  • FIG. 5B is a second schematic diagram of the hardware structure of a display system provided by an embodiment of the application.
  • FIG. 6 is a schematic flowchart of a display method provided by an embodiment of this application.
  • FIG. 7 is a first schematic diagram of a display method provided by an embodiment of the application.
  • FIG. 8 is a second schematic diagram of a display method provided by an embodiment of this application.
  • FIG. 9 is a third schematic diagram of a display method provided by an embodiment of this application.
  • FIG. 10 is a schematic structural diagram of an apparatus for controlling display provided by an embodiment of the application.
  • FIG. 11 is a schematic structural diagram of a chip system provided by an embodiment of the application.
  • FIG. 12 is a schematic structural diagram of a computer program product provided by an embodiment of this application.
  • the retina can only receive stimulation in two-dimensional space, and the reflection of three-dimensional space is mainly realized by binocular vision.
  • Depth perception: the ability of humans to perceive the world and judge the distance of objects in three-dimensional space by binocular vision is called depth perception.
  • Depth perception is a kind of comprehensive feeling, which is obtained by comprehensive processing of various information obtained by the human eye through the brain.
  • the information used to provide depth perception is called depth cues; the real world provides complete depth cues.
  • the "three-dimensional sense" of a three-dimensional display technology is related to whether the observer's depth perception of the displayed content is close to the real world. Therefore, the "three-dimensionality" of a three-dimensional display technology depends on whether the display technology can provide appropriate depth cues in its application.
  • the current 3D display technology can usually provide one or several depth cues.
  • the depth cues can be parallax, light and shadow relationship, or occlusion relationship.
  • Parallax refers to the change and difference in the position of an object in the field of view when the same object is observed from two different positions. When a target is viewed from two observation points, the angle between the two lines of sight is called the parallax angle of the two points, and the distance between the two points is called the parallax baseline.
  • the parallax may include binocular parallax and motion parallax.
  • Binocular parallax refers to the slight difference between the images of an object on the retinas of the left and right eyes, caused by the normal pupil distance and the difference in gaze angle.
  • Binocular parallax: when observing a three-dimensional target, since the distance between the two eyes is about 60 mm, the two eyes observe from slightly different angles. This tiny horizontal disparity between the retinal images of the two eyes is called binocular parallax or stereoscopic vision.
  • Motion parallax, also known as "monocular motion parallax", is a type of monocular depth cue: when the line of sight moves laterally across the field of view, objects appear to move in different directions and at different speeds. During such relative displacement, near objects seem to move quickly while distant objects seem to move slowly.
  • When the observer is close to the observed target, binocular parallax is obvious; when the observer is far from the observed target, for example farther than 1 m, binocular parallax can be ignored and motion parallax plays the leading role.
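The geometry above can be made concrete with a short calculation. This is an illustrative sketch only: the function name is invented here, and the only figure taken from the text is the roughly 60 mm eye separation used as the parallax baseline.

```python
import math

def parallax_angle_deg(baseline_m, distance_m):
    """Parallax angle between the two lines of sight from two observation
    points separated by baseline_m, looking at a target distance_m away.
    The target is assumed to sit on the perpendicular bisector of the
    baseline, so each line of sight makes atan(baseline/2 / distance)
    with that bisector."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

# With a ~60 mm baseline (approximate human eye separation):
near = parallax_angle_deg(0.06, 0.3)   # target at 0.3 m -> about 11.4 degrees
far = parallax_angle_deg(0.06, 3.0)    # target at 3 m   -> about 1.1 degrees
```

The rapid fall-off of the angle with distance illustrates why binocular parallax dominates at close range while motion parallax takes over for distant targets.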
  • the two-dimensional projection image to be projected (equivalent to the image to be projected in the embodiments of this application) is obtained by coordinate conversion of the image to be displayed (equivalent to the image to be displayed in the embodiments of the present application), and the two-dimensional projection image to be projected can be displayed on the projection image source module 211 described below.
  • the two-dimensional projection image (equivalent to the projection image in the embodiments of the present application) is the image formed when the two-dimensional projection image to be projected is projected onto a projection screen (the projection screen 212 described below).
  • the projection lens has a certain range of projection area when projecting on the projection screen.
  • the projection area can be used to display a two-dimensional projected image.
  • the projection area of the projection lens 11 on the projection screen 13 is the projection area 12 shown by the dashed ellipse.
  • the shape of the projection area 12 is related to the aperture shape of the diaphragm provided on the projection lens 11.
  • words such as "exemplary" or "for example" are used to mean serving as an example, illustration, or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present application should not be construed as more preferable or advantageous than other embodiments or designs. Precisely, words such as "exemplary" or "for example" are used to present related concepts in a specific manner.
  • the embodiment of the present application provides a display method, which is applied to a display system.
  • This method can provide suitable motion parallax.
  • the user can obtain the two-dimensional projection image information with the naked eye, and combine the motion parallax to obtain the viewing experience of the three-dimensional image through comprehensive processing by the brain.
  • FIG. 2 shows a schematic structural diagram of a display system provided by an embodiment of the present application.
  • the display system 20 shown in FIG. 2 may include a projection module 21, a tracking module 22 and a processor 23.
  • the display system 20 may further include a memory 24 and a communication interface 25.
  • the projection module 21, the tracking module 22, the processor 23, and the memory 24 may be integrated on one device, or may be set on different devices.
  • the display system 20 further includes a bus 26.
  • the projection module 21, the tracking module 22, the processor 23, the memory 24, and the communication interface 25 may be connected through a bus 26.
  • the terminal device may be any electronic device with a projection screen, which is not limited in the embodiment of the present application.
  • the electronic device may be a smart audio device with a projection screen.
  • the projection module 21 includes a projection image source module 211, a projection screen 212, and a projection lens 213.
  • the projection image source module 211 is used to display the two-dimensional projection image to be projected, and to project the two-dimensional projection image to be projected onto the projection screen 212 through the projection lens 213.
  • the projection image source module 211 includes a light source and a light modulation element.
  • the embodiments of the present application do not limit the specific forms of the light source and the light modulation element.
  • the light source may be a light emitting diode (LED) or a laser
  • the light modulation element may be a digital light processing (DLP) system or a liquid crystal display (LCD).
  • the two-dimensional projection image to be projected is displayed on the light modulation element, the light emitted by the light source is modulated on the light modulation element to form the two-dimensional projection image to be projected, and the two-dimensional projection image is projected onto the projection screen 212 through the projection lens 213 .
  • the projection screen 212 is used to display a two-dimensional projection image.
  • the projection screen 212 may be a curved screen or a three-dimensional screen.
  • the projection screen 212 may also be a flat screen.
  • the shape of the three-dimensional screen may be a variety of shapes, such as a spherical shape, a cylindrical shape, a prismatic shape, a cone shape, or a polyhedral shape, which is not limited in the embodiment of the present application.
  • the projection screen 212 includes a transparent substrate and a liquid crystal film covering the transparent substrate.
  • the embodiment of the present application does not limit the material of the transparent substrate.
  • the transparent substrate may be a transparent glass substrate, or may be a transparent resin substrate.
  • the liquid crystal film may be a polymer dispersed liquid crystal (PDLC) film, a bistable liquid crystal (BLC) film, a dye-doped liquid crystal (DDLC) film, or the like.
  • the above-mentioned liquid crystal film includes a plurality of liquid crystal cells, and each liquid crystal cell has a scattering state and a transparent state.
  • the processor 23 can control the state of each liquid crystal cell through an electrical signal.
  • the scattered state can also be referred to as the non-transparent state.
  • Each liquid crystal cell may correspond to one pixel of the two-dimensional projection image, or to multiple pixels of the two-dimensional projection image; of course, multiple liquid crystal cells may also correspond to one pixel of the two-dimensional projection image, which is not limited in the embodiments of the present application. It should be noted that the liquid crystal cells in the scattering state are used to display the two-dimensional projection image.
  • the above-mentioned liquid crystal film is a PDLC film as an example for description.
  • FIG. 3(a) shows a plurality of liquid crystal cells in the PDLC film (each square represents a liquid crystal cell). Each of the plurality of liquid crystal cells is set with a first preset voltage, and the first preset voltage is greater than or equal to a preset value.
  • the liquid crystal molecules of each liquid crystal cell of the plurality of liquid crystal cells are uniformly arranged along the electric field direction, so that the incident light exits in the original direction after passing through the liquid crystal cell, and therefore the state of the liquid crystal cell is a transparent state.
  • If the applied voltages of the liquid crystal cell 33, the liquid crystal cell 34, the liquid crystal cell 38, and the liquid crystal cell 39 shown in FIG. 3(a) are set to a second preset voltage, where the second preset voltage is less than the preset value, then the liquid crystal molecules in the liquid crystal cell 33, the liquid crystal cell 34, the liquid crystal cell 38, and the liquid crystal cell 39 are arranged in random directions. In this case, after incident light passes through these liquid crystal cells, the emitted light is scattered light, as shown in FIG. 3(b).
  • the states of the liquid crystal cell 33, the liquid crystal cell 34, the liquid crystal cell 38, and the liquid crystal cell 39 are the scattering state, that is, the opaque state.
  • the preset value of the voltage can be determined by the specific components of the liquid crystal film and the ratio of each component, which is not limited in the embodiment of the present application.
  • Alternatively, it can be set that when the first preset voltage is set for a liquid crystal cell, the state of the liquid crystal cell is the scattering state, and when the second preset voltage is set for the liquid crystal cell, the state of the liquid crystal cell is the transparent state.
  • For example, when the above-mentioned liquid crystal film is a dye-doped liquid crystal film, it can be set that when the first preset voltage is set for a liquid crystal cell, the state of the liquid crystal cell is the scattering state, and when the second preset voltage is set for the liquid crystal cell, the state of the liquid crystal cell is the transparent state; alternatively, it can be set that when the first preset voltage is set for a liquid crystal cell, the state of the liquid crystal cell is the transparent state, and when the second preset voltage is set for the liquid crystal cell, the state of the liquid crystal cell is the scattering state.
  • the embodiment of the application does not limit this.
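The voltage-to-state mapping described above can be sketched as a small helper. This is a minimal illustration, not the patent's control logic: the function name and the `scatter_when_high` flag are assumptions introduced here to model the two opposite polarities the text allows.

```python
def cell_state(voltage, threshold, scatter_when_high=False):
    """Optical state of one liquid crystal cell as a function of its
    applied voltage.

    For a PDLC film, a voltage at or above the preset threshold aligns
    the molecules with the field, so the cell is transparent; below the
    threshold the molecules are randomly oriented and the cell scatters
    light.  scatter_when_high=True models films configured with the
    opposite polarity (e.g. some dye-doped liquid crystal films)."""
    high = voltage >= threshold
    if scatter_when_high:
        return "scattering" if high else "transparent"
    return "transparent" if high else "scattering"

# PDLC-style behaviour: first preset voltage (>= threshold) -> transparent,
# second preset voltage (< threshold) -> scattering.
```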
  • the projection lens 213 is used to project the two-dimensional projection image to be projected displayed in the projection image source module 211 onto the projection screen 212.
  • the projection lens 213 may be a lens with a large field of view (FOV), such as a fisheye lens with a FOV greater than 150° (equivalent to the second projection lens in the embodiment of the present application).
  • the projection lens 213 may also be a projection lens with a FOV of about 40°-70° (equivalent to the first projection lens in the embodiment of the present application).
  • the field angle of the first projection lens is less than or equal to the preset threshold
  • the field angle of the second projection lens is greater than the preset threshold.
  • the embodiment of the present application does not limit the value of the preset threshold.
  • the projection module 21 may further include a rotating platform 214.
  • the rotating platform 214 is used to adjust the projection area of the projection lens 213 by rotating the angle.
  • the controller of the rotating platform 214 is connected to the processor 23, or the controller for controlling the rotation of the rotating platform 214 is the processor 23.
  • the projection lens 213 may be completely arranged inside the stereo screen, and the projection lens 213 may also be partially arranged inside the stereo screen.
  • the projection lens 213 can realize the projection function through an annular projection optical system.
  • the upper and lower surfaces of the cylindrical projection screen may not participate in the projection display, and the side walls of the cylindrical body can be used to display a two-dimensional projection image, but of course it is not limited to this.
  • FIG. 4 shows a structural diagram of a projection module 21.
  • the FOV of the projection lens 213 is 50°.
  • the projection screen 212 is a spherical three-dimensional screen, and the projection lens 213 is partially disposed inside the projection screen 212.
  • the projection lens 213 is located between the projection image source module 211 and the projection screen 212, and the positions of the projection lens 213 and the projection image source module 211 are relatively fixed.
  • the rotating platform 214 is used to adjust the projection area of the projection lens 213.
  • the projection area of the projection lens 213 at the current moment is A
  • the processor 23 instructs the rotating platform 214 to rotate X° at the next moment, so that the projection area of the projection lens 213 is as shown at B in FIG. 4.
  • the specific value of X is determined by the processor 23.
  • for the process by which the processor 23 determines the specific value of X, refer to the description of the display method below in the embodiments of the present application, which will not be repeated here.
  • the tracking module 22 is used to track the position of the human eye and send the tracked human eye position to the processor 23.
  • the tracking module may use infrared imaging technology to track the position of the human eye, although the embodiment of the present application is not limited to this.
  • the processor 23 is the control center of the display system 20, and the processor 23 may be a general-purpose central processing unit (central processing unit, CPU), or other general-purpose processors. Among them, the general-purpose processor may be a microprocessor or any conventional processor. As an example, the processor 23 may include one or more CPUs, such as CPU 0 and CPU 1 shown in FIG. 2.
  • the processor 23 is configured to determine the two-dimensional projection image to be projected of the three-dimensional image to be displayed according to the position of the pixel in the three-dimensional image to be displayed and the position of the human eye, and send the two-dimensional projection image to the projection image source Module 211.
  • the processor 23 is also used to determine the position of the target liquid crystal cell in the projection screen 212 according to the position of the pixel in the three-dimensional image to be displayed and the position of the human eye, and control the state of the target liquid crystal cell to the scattering state through the control circuit, and control the non- The state of the target liquid crystal cell is a transparent state.
  • the non-target liquid crystal cell is a liquid crystal cell other than the target liquid crystal cell in the projection screen 212.
  • the control circuit may be integrated on the liquid crystal film, which is not limited in the embodiment of the present application.
  • the memory 24 may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the memory 24 may exist independently of the processor 23.
  • the memory 24 may be connected to the processor 23 through the bus 26 for storing data, instructions or program codes.
  • the processor 23 calls and executes the instructions or program codes stored in the memory 24, it can implement the display method provided in the embodiment of the present application.
  • the memory 24 may also be integrated with the processor 23.
  • the communication interface 25 is used to connect the display system 20 with other devices (such as servers) through a communication network. The communication network may be an Ethernet, a radio access network (RAN), a wireless local area network (WLAN), etc.
  • the communication interface 25 may include a receiving unit for receiving data, and a sending unit for sending data.
  • the bus 26 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, or an Extended Industry Standard Architecture (EISA) bus, etc.
  • the bus can be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in FIG. 2, but this does not mean that there is only one bus or one type of bus.
  • the structure shown in FIG. 2 does not constitute a limitation on the display system.
  • the display system 20 may include more or fewer components than those shown in the figure, or combine certain components, or have a different component arrangement.
  • FIG. 5A shows a hardware structure of a terminal device (such as a smart speaker device) provided by an embodiment of the present application.
  • the smart speaker device 50 includes a projection module, a tracking module, and a processor 53.
  • the projection module includes a projection image source module 511, a projection screen 512, and a fisheye lens 513 with a FOV of 170°.
  • the tracking module includes a tracking lens 52, and the projection image source module 511 and the tracking lens 52 are respectively connected to and communicate with the processor 53 through a bus.
  • the projection screen 512 is a spherical projection screen.
  • the projection screen 512 includes a spherical transparent substrate and a liquid crystal film covering the spherical transparent substrate.
  • the liquid crystal film can cover the inner surface of the spherical transparent substrate or the outer surface of the spherical transparent substrate. The embodiment of the present application takes the liquid crystal film covering the inner surface of the spherical transparent substrate as an example.
  • the shadow area corresponding to the fisheye lens 513 is the projectable area of the fisheye lens
  • the shadow area corresponding to the tracking lens 52 is the range where the tracking lens can track the human eye. It is understandable that the smart speaker device 50 may include multiple tracking lenses, so as to track the position of the human eye within a 360° range.
  • the smart speaker device 50 may also include a voice collector and a voice player (not shown in FIG. 5A), and the voice collector and the voice player are respectively connected and communicated with the processor through a bus. Among them, the voice collector is used to collect the user's voice instructions, and the voice player is used to output voice information to the user.
  • the smart speaker device 50 may further include a memory (not shown in FIG. 5A), and the memory is connected to the processor for communication and is used to store local data.
  • the tracking lens 52 may also be located outside the projection screen 512, as shown in FIG. 5B, which is not limited in the embodiment of the present application. It is understandable that if the tracking lens 52 is inside the projection screen 512, the volume of the smart speaker device 50 can be reduced; if the tracking lens 52 is outside the projection screen 512, the conflict between the display area of the projection screen and the tracking optical path of the tracking lens can be avoided, thereby obtaining a larger projection display area.
  • the following takes the display method applied to the smart speaker device 50 shown in FIG. 5A as an example for description.
  • FIG. 6 shows a schematic flowchart of a display method provided by an embodiment of the present application.
  • the display method includes the following steps:
  • S101: The processor acquires an image to be displayed.
  • the image to be displayed may be a multi-dimensional image, such as a three-dimensional image.
  • the image to be displayed is a three-dimensional image to be displayed as an example for description.
  • the processor may obtain the three-dimensional image to be displayed from the network or the local gallery according to the obtained instruction information, which is not limited in the embodiment of the present application.
  • the embodiment of the present application does not limit the specific content and form of the instruction information.
  • the instruction information may be instruction information input by the user through voice, text, or keystrokes, and the instruction information may also be trigger information detected by the processor, for example, the smart audio device 50 is turned on or off.
  • the processor may obtain the voice information collected by the smart audio device through the voice collector.
  • the content of the voice information can be the wake-up words of the smart audio device, such as "little e".
  • the processor calls the three-dimensional cartoon character of "little e" from the local gallery, and the three-dimensional cartoon character is the three-dimensional image to be displayed.
  • the content of the voice message can be any question raised by the user after uttering the wake-up word, such as "Help me find the satellite map of this city". The processor then searches for and downloads the city's three-dimensional satellite map from the network, and the three-dimensional satellite map is the three-dimensional image to be displayed.
  • For another example, the content of the voice message is "Watch XX movie". The processor searches for and downloads a 3D version of the XX movie from the network, where the frame of the 3D version of the XX movie to be played at the current moment is the three-dimensional image to be displayed.
  • In other words, the processor can obtain the instruction information input by the user, and obtain the three-dimensional image to be displayed based on that instruction information.
  • the processor detects the power-on operation of the smart audio device 50.
  • the power-on operation triggers the processor to acquire a three-dimensional image corresponding to the power-on operation, and the processor determines that the three-dimensional image is the three-dimensional image to be displayed.
  • the three-dimensional image corresponding to the power-on operation may be a three-dimensional image of a cartoon character representing the smart audio device 50 waving greetings.
  • S102 The processor determines image information of the three-dimensional image to be displayed.
  • the processor determines the image information of the three-dimensional image to be displayed in a preset three-dimensional coordinate system.
  • the image information of the three-dimensional image to be displayed is used to describe the three-dimensional image to be displayed.
  • the three-dimensional image to be displayed may be composed of multiple pixels.
  • the image information of the three-dimensional image to be displayed may be the positions of the pixel points of the three-dimensional image to be displayed in the preset three-dimensional coordinate system.
  • the preset three-dimensional coordinate system is preset by the processor.
  • the preset three-dimensional coordinate system may be a three-dimensional coordinate system whose origin is the center of the spherical projection screen.
  • the preset three-dimensional coordinate system may also be a three-dimensional coordinate system with an arbitrary point as the origin, which is not limited in the embodiment of the present application.
  • the origin of the preset three-dimensional coordinate system is the center of the sphere of the spherical projection screen as an example for description.
  • any pixel point A of the plurality of pixels constituting the rectangular parallelepiped 70 can be represented by the coordinates (x_a, y_a, z_a). The coordinates (x_a, y_a, z_a) are coordinate values in the three-dimensional coordinate system whose origin is the sphere center of the spherical projection screen 512.
  • the positions of some pixels of the three-dimensional image to be displayed may be located outside the projection screen 512, so that the corresponding pixel points of this part of the pixels on the two-dimensional projection image cannot be displayed on the projection screen 512.
  • For example, the cuboid 80 shown in FIG. 8 is too large in size, so the positions of some of its pixels are located outside the projection screen, such as point B in FIG. 8.
  • the processor may reduce the size of the three-dimensional image to be displayed, so that the pixel point of the two-dimensional projection image corresponding to each pixel of the three-dimensional image to be displayed can be displayed on the projection screen.
  • the processor may perform the following steps:
  • Step 1: The processor determines the position of each pixel of the three-dimensional image to be displayed in a preset three-dimensional coordinate system.
  • Step 2: The processor determines whether each pixel in the three-dimensional image to be displayed is located on the same side of the projection screen.
  • the processor determines the distance between each pixel point in the three-dimensional image to be displayed and the origin of the coordinate according to the position of each pixel point of the three-dimensional image to be displayed in the preset three-dimensional coordinate system. Then, the processor determines whether the distance between each pixel point in the three-dimensional image to be displayed and the coordinate origin is less than or equal to the radius of the projection screen 512. If it is less than or equal to the radius of the projection screen 512, the processor determines that the position of each pixel of the 3D image to be displayed is located in the spherical projection screen 512, that is, the 3D image to be displayed is located on the same side of the projection screen 512.
  • Otherwise, the processor determines that the three-dimensional image to be displayed has pixels outside the spherical projection screen 512, that is, the three-dimensional image to be displayed is located on both sides of the projection screen 512.
  • Step 3: The processor reduces the three-dimensional image to be displayed (for example, according to a preset ratio), and repeats steps 1 and 2 until the processor determines that each pixel in the reduced three-dimensional image to be displayed is located on the same side of the projection screen.
  • the embodiment of the present application does not limit the specific value and value method of the preset ratio.
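Steps 1 through 3 above can be sketched as a simple loop over the pixel positions. This is an illustrative sketch under stated assumptions: the function name, the 0.9 shrink ratio, and scaling about the sphere center are all choices made here, since the patent does not fix a preset ratio or a scaling center.

```python
import math

def fit_inside_sphere(points, radius, shrink=0.9, center=(0.0, 0.0, 0.0)):
    """Shrink a 3D image (a list of (x, y, z) pixel positions) about the
    sphere center until every pixel lies inside the spherical projection
    screen of the given radius."""
    cx, cy, cz = center

    def inside(p):
        # Step 2: compare each pixel's distance from the origin (the
        # sphere center) with the screen radius.
        return math.dist(p, center) <= radius

    while not all(inside(p) for p in points):   # repeat steps 1 and 2
        # Step 3: reduce the image according to the preset ratio.
        points = [(cx + (x - cx) * shrink,
                   cy + (y - cy) * shrink,
                   cz + (z - cz) * shrink) for (x, y, z) in points]
    return points
```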
  • S103: The tracking lens tracks the position of the human eye, determines the observation position according to the position of the human eye, and sends the determined observation position to the processor. Alternatively, the tracking lens tracks the position of the human eye and sends the tracked position of the human eye to the processor, so that the processor determines the observation position according to the position of the human eye.
  • the observation position is a single point position determined based on the position of the eyes of a person, and the embodiment of the present application does not limit the relationship between the observation position and the position of the eyes.
  • the observation position may be the midpoint of the line connecting the positions of the eyes.
  • the tracking lens presets its position in the preset three-dimensional coordinate system.
  • the position of the tracking lens in the preset three-dimensional coordinate system and the observation position can be expressed by the coordinates in the preset three-dimensional coordinate system.
  • the tracking module includes a tracking lens and a calculation module.
  • the tracking lens can use infrared imaging technology to track the position of the person's eyes according to its position in the preset three-dimensional coordinate system.
  • the calculation module calculates the midpoint of the line connecting the two eye positions according to the binocular positions tracked by the tracking lens, and sends the calculated midpoint position to the processor as the observation position.
  • the specific process of the tracking lens using infrared imaging technology to track the position of the human eye can refer to the prior art, which will not be repeated here.
  • According to the positions of E1 and E2, the calculation module calculates the position E (x_e, y_e, z_e) of the midpoint of the line connecting E1 and E2, and sends E to the processor as the observation position.
  • the tracking module includes a tracking lens.
  • the tracking lens can use infrared imaging technology to track the positions of the eyes according to its position in the preset three-dimensional coordinate system, and send the positions of both eyes to the processor.
  • the processor determines the observation position according to the received positions of the eyes. For example, the processor may calculate the position of the midpoint of the line connecting the positions of the eyes, and determine the position of the midpoint as the observation position.
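The midpoint computation performed by either the calculation module or the processor is a one-liner; the function name below is illustrative, not taken from the patent.

```python
def observation_position(left_eye, right_eye):
    """Observation position E: the midpoint of the line connecting the
    two tracked eye positions E1 and E2, given as (x, y, z) tuples in
    the preset three-dimensional coordinate system."""
    return tuple((a + b) / 2.0 for a, b in zip(left_eye, right_eye))
```

The same helper serves both variants described above: in one, the tracking module runs it and sends E to the processor; in the other, the processor runs it on the received eye positions.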
  • S102 and S103 may be executed at the same time, or S102 may be executed first, and then S103 may be executed.
  • S104 The processor determines the intersection set and the information of each intersection in the intersection set according to the image information of the three-dimensional image to be displayed and the determined observation position.
  • the processor determines the intersection set and the information of each intersection in the intersection set according to the determined observation position and the position of each pixel of the three-dimensional image to be displayed in the preset three-dimensional coordinate system.
  • the intersection set includes multiple intersections, which are obtained by respectively connecting the observation position with multiple pixels in the three-dimensional image to be displayed to form multiple lines, and intersecting these lines with the projection screen.
  • for each of these pixels, the line connecting the pixel and the observation position has no intersection with the three-dimensional image to be displayed other than the pixel itself.
  • the above-mentioned multiple pixels are the pixels included in the picture of the three-dimensional image to be displayed that can be viewed by the human eye at the observation position.
  • the pixel point has a corresponding relationship with the intersection point.
  • FIG. 9 shows a schematic diagram of a processor determining any intersection point in the intersection point set.
  • the human eye shown by the dotted line represents the observation position E determined in step S103
  • the cuboid 70 is a three-dimensional image to be displayed placed in a preset three-dimensional coordinate system.
  • the connection line between any pixel point A on the cuboid 70 and the observation position E is the line AE
  • the line AE intersects the projection screen 512 at the intersection A1 (x_a1, y_a1, z_a1)
  • the line AE has no intersection with the cuboid 70 other than the pixel point A, so the pixel point A and the intersection point A1 have a corresponding relationship.
  • the line CE also intersects the projection screen 512 at the intersection A1 (x_a1, y_a1, z_a1), but the line CE and the cuboid 70 have an intersection other than the pixel point C, namely the pixel point A. Therefore, the pixel point C and the intersection point A1 have no corresponding relationship.
  • intersection A1 is any intersection in the intersection set.
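The geometry of FIG. 9 can be sketched numerically. Assuming, purely for illustration, that the projection screen is a sphere of a given radius centred at the origin of the preset coordinate system (the embodiment does not fix these values), the intersection A1 of the line AE with the screen reduces to a ray–sphere intersection:

```python
import math

def screen_intersection(e, a, radius):
    """First intersection of the ray from observation position e through pixel a
    with a spherical projection screen of the given radius centred at the origin,
    or None if the ray misses the sphere."""
    d = [ai - ei for ai, ei in zip(a, e)]              # ray direction E -> A
    aa = sum(di * di for di in d)                      # solve |e + t*d|^2 = radius^2
    bb = 2.0 * sum(ei * di for ei, di in zip(e, d))
    cc = sum(ei * ei for ei in e) - radius * radius
    disc = bb * bb - 4.0 * aa * cc
    if disc < 0.0:
        return None
    t = (-bb - math.sqrt(disc)) / (2.0 * aa)           # nearer root: the screen lies
    if t < 0.0:                                        # between the eye and the pixel
        return None
    return tuple(ei + t * di for ei, di in zip(e, d))

# Eye at E, pixel A at the sphere's centre, screen radius 100:
A1 = screen_intersection((0.0, 0.0, 400.0), (0.0, 0.0, 0.0), 100.0)
# A1 is the point on the screen surface facing the eye.
```

A full implementation would additionally verify that the segment from the pixel to E meets the three-dimensional image to be displayed only at the pixel itself before pairing the pixel with A1.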
  • the intersection point A1 may be a point on the liquid crystal film, or a point on the inner surface or the outer surface of the spherical transparent substrate in the projection screen.
  • the information of the intersection may include the position of the intersection, the color brightness information corresponding to the intersection, and the like.
  • the position of the intersection is the position of the intersection in the preset three-dimensional coordinate system.
  • the position of any intersection in the intersection set can be expressed as (x_s, y_s, z_s).
  • the color brightness information is the color brightness information of the pixel points in the three-dimensional image to be displayed that has a corresponding relationship with the intersection point.
  • the intersection of the above-mentioned line and the projection screen may be the intersection of the line and the inner surface of the projection screen, that is, a point on the liquid crystal film of the projection screen.
  • the intersection may also be the point where the line intersects the outer surface of the projection screen (that is, the outer surface of the spherical transparent substrate in the projection screen), or the point where the line intersects the inner surface of the spherical transparent substrate in the projection screen.
  • the embodiment of the present application does not limit this.
  • the processor determines the to-be-projected two-dimensional projection image information of the three-dimensional image to be displayed according to the determined information of each intersection in the set of intersections.
  • the two-dimensional projection image of the three-dimensional image to be displayed includes multiple pixels.
  • the two-dimensional projection image information of the three-dimensional image to be displayed includes the position of the pixel and the color and brightness information of the pixel.
  • the position of the pixel point may be determined according to the position of the corresponding intersection in the intersection set
  • the color brightness information of the pixel point may be determined based on the color brightness information of the intersection in the intersection set that is used to determine the position of the pixel point.
  • the processor determines the two-dimensional position of the two-dimensional projection image to be projected when displayed in the projection image source module according to the position of the intersection point in the set of intersection points in the preset three-dimensional coordinate system.
  • for details, reference may be made to coordinate transformation methods in the prior art, which will not be repeated here.
  • the processor may preset the positions of the projection image source module and the projection lens in the preset three-dimensional coordinate system, and preset the projection angle of the projection image source module to the projection lens.
  • the position of the projection image source module can be represented by the coordinates of the center point of the display interface of the projection image source module in the preset three-dimensional coordinate system
  • the position of the projection lens can be represented by the coordinates, in the preset three-dimensional coordinate system, of the intersection of the projection lens and its optical axis.
  • for each intersection in the intersection set, the processor calculates the angle between the line connecting the intersection and the projection lens and the optical axis of the projection lens, and based on the angle obtains the exit direction of that line relative to the projection lens. Then, based on the determined exit direction, the optical properties of the projection lens (such as focal length and distortion properties), and the positions of the projection image source module and the projection lens in the preset three-dimensional coordinate system, the processor determines the position of the pixel point in the projection image source module that is used to emit light in that exit direction.
  • the processor transforms the position of the intersection point in the set of intersection points in the preset three-dimensional coordinate system into the two-dimensional position of the two-dimensional projection image to be projected when it is displayed in the projection image source module according to the above-mentioned method.
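As a simplified sketch of this transformation, one may compute the exit angle of the line relative to the optical axis and map it to a radial position on the image source. An equidistant fisheye model r = f·θ, the focal length, and the vectors below are assumptions for illustration only; a real system would use the lens's measured focal length and distortion profile:

```python
import math

def exit_angle(direction, optical_axis):
    """Angle (radians) between the line's direction vector and the optical axis."""
    dot = sum(d * a for d, a in zip(direction, optical_axis))
    norm_d = math.sqrt(sum(d * d for d in direction))
    norm_a = math.sqrt(sum(a * a for a in optical_axis))
    return math.acos(max(-1.0, min(1.0, dot / (norm_d * norm_a))))

def image_radius(theta, focal_length):
    """Radial distance from the image centre under an equidistant model r = f * theta."""
    return focal_length * theta

theta = exit_angle((0.0, 1.0, 1.0), (0.0, 0.0, 1.0))   # line 45 degrees off-axis
r = image_radius(theta, 2.0)                            # assumed focal length of 2 mm
```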
  • S106 The processor determines the target liquid crystal cell in the projection screen according to the determined position of each intersection in the set of intersections.
  • the processor determines the position of the target liquid crystal cell on the projection screen according to the position of each intersection in the determined intersection set.
  • the target liquid crystal cell is used to display a two-dimensional projected image.
  • the position of the target liquid crystal cell is a two-dimensional coordinate position.
  • the processor may determine the position of the target liquid crystal cell based on the position of each intersection point.
  • the liquid crystal film covers the transparent substrate of the projection screen, and therefore, each point on the liquid crystal film corresponds to each point on the transparent substrate in a one-to-one correspondence.
  • the distance between two points having a corresponding relationship can be the thickness of the liquid crystal film, or the sum of the thicknesses of the transparent substrate and the liquid crystal film, depending on whether the above-mentioned intersection is a point on the outer surface or the inner surface of the spherical transparent substrate in the projection screen.
  • if the intersection is a point on the inner surface of the spherical transparent substrate in the projection screen, the distance between the two corresponding points is the sum of the thicknesses of the transparent substrate and the liquid crystal film.
  • if the intersection is a point on the outer surface of the spherical transparent substrate, the distance between the two points with the corresponding relationship is the thickness of the liquid crystal film.
  • in the latter case, for each intersection in the intersection set, the processor determines the position coordinates obtained by extending the intersection, along the normal direction of the spherical transparent substrate at that point and toward the side of the liquid crystal film, by the thickness of the liquid crystal film, and determines the x and y coordinates of that position as the position of the target liquid crystal cell.
  • in the former case, for each intersection in the intersection set, the processor determines the position obtained by extending the intersection, in the same normal direction toward the side of the liquid crystal film, by the sum of the thicknesses of the transparent substrate and the liquid crystal film, and determines the x and y coordinates of that position as the position of the target liquid crystal cell.
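Under the same assumption of a spherical substrate centred at the origin, extending an intersection point along the surface normal reduces to scaling its position vector. The sketch below is illustrative; the outward direction is assumed to be the side of the liquid crystal film, and the thickness passed in is whichever distance applies (film only, or substrate plus film):

```python
import math

def target_cell_position(p, thickness):
    """Extend intersection point p along the sphere's outward normal by
    `thickness` and return the x and y coordinates of the resulting point."""
    r = math.sqrt(sum(c * c for c in p))
    scale = (r + thickness) / r
    q = tuple(c * scale for c in p)
    return q[0], q[1]          # the target liquid crystal cell position is 2-D

x, y = target_cell_position((60.0, 80.0, 0.0), 1.0)   # |p| = 100, extend by 1
# (x, y) is approximately (60.6, 80.8).
```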
  • based on the determined target liquid crystal cell, the processor sets the state of the target liquid crystal cell to the scattering state, and sets the state of the non-target liquid crystal cells to the transparent state.
  • the target liquid crystal cell in the scattering state can be used to display the two-dimensional projection image of the three-dimensional image to be displayed.
  • the processor may use any of the following methods to set the state of the target liquid crystal cell to the scattering state, and set the state of the non-target liquid crystal cell to the transparent state:
  • Method 1: The processor sends the position of the target liquid crystal cell to the control circuit. If the liquid crystal film in the projection screen is a PDLC film, the processor also instructs the control circuit to set a second preset voltage for the target liquid crystal cell, so that the target liquid crystal cell is in the scattering state, and instructs the control circuit to set a first preset voltage for the non-target liquid crystal cells, so that the non-target liquid crystal cells are in the transparent state.
  • alternatively, the processor instructs the control circuit to set the first preset voltage for the target liquid crystal cell, so that the target liquid crystal cell is in the scattering state, and instructs the control circuit to set a second preset voltage for the non-target liquid crystal cells, so that the non-target liquid crystal cells are in the transparent state.
  • alternatively, the processor instructs the control circuit, according to the corresponding relationship between the first preset voltage and the second preset voltage and the scattering state and the transparent state respectively, to set the first preset voltage or the second preset voltage for the target liquid crystal cell and the non-target liquid crystal cells, so that the target liquid crystal cell is in the scattering state and the non-target liquid crystal cells are in the transparent state.
  • for example, a first preset voltage is set for the target liquid crystal cell, so that the target liquid crystal cell is in the scattering state, and a second preset voltage is set for the non-target liquid crystal cells, so that the non-target liquid crystal cells are in the transparent state.
  • or, the second preset voltage is set for the target liquid crystal cell, so that the target liquid crystal cell is in the scattering state, and the first preset voltage is set for the non-target liquid crystal cells, so that the non-target liquid crystal cells are in the transparent state.
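Method 1 can be sketched as a simple mapping from cell identity to driving voltage. This is a minimal sketch assuming a PDLC film, for which the second preset voltage yields the scattering state and the first the transparent state; the voltage values and cell identifiers are placeholders, not values from the embodiment:

```python
FIRST_PRESET_V = 24.0    # >= the preset value (placeholder)
SECOND_PRESET_V = 0.0    # <  the preset value (placeholder)

def cell_voltages(all_cells, target_cells, pdlc=True):
    """Map each cell id to the driving voltage that yields its required state."""
    if pdlc:
        scatter_v, clear_v = SECOND_PRESET_V, FIRST_PRESET_V
    else:
        scatter_v, clear_v = FIRST_PRESET_V, SECOND_PRESET_V
    return {cell: (scatter_v if cell in target_cells else clear_v)
            for cell in all_cells}

volts = cell_voltages({(0, 0), (0, 1), (1, 0)}, {(0, 1)})
# (0, 1) receives the scattering voltage; the other cells stay transparent.
```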
  • Method 2: The processor compares the position of the liquid crystal cells currently in the scattering state in the projection screen (referred to as the scattering-state liquid crystal cells in the embodiment of the present application) with the position of the target liquid crystal cell. If there is an intersection between the position of the scattering-state liquid crystal cells and the position of the target liquid crystal cell, the processor sends the positions of the target liquid crystal cells outside the intersection to the control circuit.
  • the processor also instructs the control circuit to set a second preset voltage for the target liquid crystal cells outside the intersection, so that the target liquid crystal cells outside the intersection are in the scattering state, and instructs the control circuit to set a first preset voltage for the non-target liquid crystal cells outside the intersection, so that the non-target liquid crystal cells outside the intersection are in the transparent state.
  • alternatively, the processor instructs the control circuit to set a first preset voltage for the target liquid crystal cells outside the intersection, so that they are in the scattering state, and instructs the control circuit to set a second preset voltage for the non-target liquid crystal cells outside the intersection, so that they are in the transparent state.
  • alternatively, the processor instructs the control circuit, according to the corresponding relationship between the first preset voltage and the second preset voltage and the scattering state and the transparent state respectively, to set the first preset voltage or the second preset voltage for the target liquid crystal cells and the non-target liquid crystal cells outside the intersection, so that the target liquid crystal cells outside the intersection are in the scattering state and the non-target liquid crystal cells outside the intersection are in the transparent state.
  • for example, a first preset voltage is set for the target liquid crystal cells outside the intersection, so that they are in the scattering state, and a second preset voltage is set for the non-target liquid crystal cells outside the intersection, so that they are in the transparent state.
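The comparison in Method 2 amounts to a set difference: only cells whose state must change are switched, while cells in the intersection are left untouched. A minimal sketch with illustrative (row, column) cell identifiers:

```python
def diff_update(current_scattering, target_cells):
    """Return (cells to switch to scattering, cells to switch to transparent)."""
    keep = current_scattering & target_cells   # the intersection stays untouched
    to_scatter = target_cells - keep           # new target cells outside it
    to_clear = current_scattering - keep       # stale scattering cells outside it
    return to_scatter, to_clear

to_scatter, to_clear = diff_update({(0, 0), (0, 1)}, {(0, 1), (1, 1)})
# (0, 1) is left as it is; (1, 1) is switched to scattering; (0, 0) to transparent.
```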
  • S105 and S106-S107 may be executed simultaneously, or S105 may be executed first, and then S106-S107, etc. may be executed.
  • S108 The processor sends the two-dimensional projection image information to be projected to the projection image source module.
  • the processor sends the two-dimensional projection image information to be projected determined in S105 to the projection image source module.
  • the projection image source module receives the two-dimensional projection image information to be projected, and displays the two-dimensional projection image to be projected according to the two-dimensional projection image information to be projected.
  • the projection image source module projects the two-dimensional projection image to be projected onto the target liquid crystal unit on the projection screen through the projection lens.
  • for S109, the projection of the two-dimensional projection image to be projected onto the target liquid crystal cell on the projection screen may refer to the prior art, which will not be repeated here.
  • the fisheye lens 513 with a FOV of 170° is used as the projection lens for projection. If a projection lens with a FOV of about 40°-70° is used for projection instead, the smart audio device 50 shown in FIG. 5A further includes a rotating platform.
  • in this case, the above S104 further includes S104a:
  • S104a: The processor determines the angle by which the rotating platform needs to rotate based on the intersection set and the observation position, so as to adjust the projection area of the projection lens.
  • the processor may first determine the position of the center point of the area where the intersection set is located on the projection screen. Then, the processor determines the line connecting the center point and the observation position, and the angle between this line and the current optical axis of the projection lens is the angle that the rotating platform needs to rotate. Then, the processor sends the angle value to the controller of the rotating platform, so that the rotating platform rotates by this angle. In this case, the line between the above-mentioned center point and the observation position can coincide with the optical axis of the projection lens. In other words, the projection area of the projection lens is adjusted to cover the area where the intersection set is located on the projection screen.
  • the rotating platform rotates the angle determined by the processor so that the projection area of the projection lens can cover the area where the intersection set on the projection screen is located.
  • the projection area needs to cover the area where the intersection set determined in S104 is located. In this way, the two-dimensional projection image to be projected can be projected onto the target liquid crystal cell by the projection lens.
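The angle computed in S104a is the angle between the line from the centre of the intersection-set area to the observation position and the current optical axis of the projection lens. The vectors below are assumed example values, not figures from the embodiment; a real device would additionally convert the angle into the platform controller's command format:

```python
import math

def rotation_angle(centre, observation, optical_axis):
    """Angle (radians) the rotating platform must turn."""
    line = [o - c for o, c in zip(observation, centre)]
    dot = sum(l * a for l, a in zip(line, optical_axis))
    norm_line = math.sqrt(sum(l * l for l in line))
    norm_axis = math.sqrt(sum(a * a for a in optical_axis))
    return math.acos(max(-1.0, min(1.0, dot / (norm_line * norm_axis))))

angle = rotation_angle((0.0, 0.0, 100.0), (0.0, 100.0, 100.0), (0.0, 0.0, 1.0))
# The line points along +y while the axis points along +z: a 90 degree turn.
```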
  • the display method uses tracking technology to track the position of the human eye, then determines the set of intersections between the three-dimensional image to be displayed and the projection screen based on the position of the human eye, and further determines the two-dimensional projection image of the three-dimensional image to be displayed based on the set of intersections.
  • therefore, after the two-dimensional projection image of the three-dimensional image to be displayed is projected onto the target liquid crystal cells in the scattering state in the projection screen, it has a realistic three-dimensional effect.
  • the non-target liquid crystal cell on the projection screen is in a transparent state, that is, the area where the non-target liquid crystal cell is located on the projection screen is transparent.
  • the two-dimensional projection image of the three-dimensional image to be displayed is displayed on the transparent projection screen, and therefore, the background of the two-dimensional projection image of the three-dimensional image to be displayed is fused with the surrounding environment.
  • the user views the two-dimensional projected image of the three-dimensional image to be displayed on the projection screen with the naked eye, a realistic three-dimensional image "floating" in the air can be seen. Therefore, the three-dimensional effect of the user viewing the three-dimensional image with the naked eye is improved.
  • the embodiment of the present application may divide the display control device into functional modules according to the above method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules. It should be noted that the division of modules in the embodiments of the present application is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
  • FIG. 10 shows a schematic structural diagram of an apparatus 100 for controlling display provided by an embodiment of the present application.
  • the device 100 for controlling display can be applied to a terminal device.
  • the terminal device includes a projection screen.
  • the projection screen includes a transparent substrate and a liquid crystal film covering the transparent substrate.
  • the liquid crystal film includes a plurality of liquid crystal cells.
  • the display control apparatus 100 can be used to control the display of an image to be displayed on a projection screen in a terminal device, and to perform the above-mentioned display method, for example, to perform the method shown in FIG. 6.
  • the device 100 for controlling display may include an acquiring unit 101, a determining unit 102, a setting unit 103, and a control unit 104.
  • the acquiring unit 101 is configured to acquire an image to be displayed.
  • the determining unit 102 is configured to determine a target liquid crystal cell among a plurality of liquid crystal cells based on the positions of pixels in the image to be displayed.
  • the setting unit 103 is used to set the state of the target liquid crystal cell to the scattering state, and to set the state of the non-target liquid crystal cells to the transparent state; wherein, the non-target liquid crystal cells are the liquid crystal cells among the plurality of liquid crystal cells other than the target liquid crystal cell.
  • the control unit 104 is configured to control the projection image of the image to be displayed to be displayed on the target liquid crystal cell.
  • the acquiring unit 101 may be used to perform S101
  • the determining unit 102 may be used to perform S106
  • the setting unit 103 may be used to perform S107.
  • the image to be displayed includes a three-dimensional image
  • the projected image of the image to be displayed includes a two-dimensional image
  • the setting unit 103 is specifically configured to:
  • a first preset voltage is set for the target liquid crystal cell to control the state of the target liquid crystal cell to a scattering state; and a second preset voltage is set for the non-target liquid crystal cell to control the state of the non-target liquid crystal cell to a transparent state.
  • the second preset voltage is set for the target liquid crystal cell to control the state of the target liquid crystal cell to the scattering state; and the first preset voltage is set for the non-target liquid crystal cell to control the state of the non-target liquid crystal cell to the transparent state.
  • the first preset voltage is greater than or equal to the preset value
  • the second preset voltage is less than the preset value
  • the setting unit 103 may be used to perform S107.
  • the above-mentioned liquid crystal film includes: a polymer dispersed liquid crystal film, a bistable liquid crystal film or a dyed liquid crystal film.
  • the above-mentioned projection screen includes a curved screen, or the projection screen includes a three-dimensional screen.
  • the aforementioned terminal device further includes a tracking module, which is used to track the position of the human eye.
  • the determining unit 102 is further configured to determine the position of the target liquid crystal cell among the multiple liquid crystal cells based on the tracked human eye position and the position of the pixel in the image to be displayed. As an example, referring to FIG. 6, the determining unit 102 may be used to perform S102-S106.
  • the determining unit 102 is specifically configured to determine, as the target liquid crystal cell, the liquid crystal cell among the plurality of liquid crystal cells located at the position of the intersection where the line connecting the tracked human eye position and the position of each pixel in the image to be displayed intersects the projection screen.
  • the determining unit 102 may be used to perform S102-S106.
  • the aforementioned terminal device further includes a rotating platform and a first projection lens.
  • the control unit 104 is specifically configured to control the rotating platform to adjust the projection area of the first projection lens, so that the first projection lens projects the image to be projected onto the target liquid crystal cell and the projection image of the image to be displayed is displayed on the target liquid crystal cell; wherein,
  • the field of view of the first projection lens is less than or equal to a preset threshold.
  • the control unit 104 may be used to perform S104a.
  • the aforementioned terminal device further includes a second projection lens.
  • the control unit 104 is specifically configured to control the second projection lens to project the image to be projected onto the target liquid crystal cell, so that the projection image of the image to be displayed is displayed on the target liquid crystal cell; wherein, the field of view of the second projection lens is greater than the preset threshold.
  • the device 100 for controlling display provided in the embodiment of the present application includes but is not limited to the above-mentioned units.
  • the device 100 for controlling display may further include a storage unit 105.
  • the storage unit 105 may be used to store the program code of the device 100 for controlling display and the like.
  • the acquiring unit 101 in the device 100 for controlling display may be implemented through the communication interface 25 in FIG. 2.
  • the functions implemented by the determining unit 102, the setting unit 103, and the control unit 104 may be implemented by the processor 23 in FIG. 2 executing the program code in the memory 24 in FIG. 2.
  • the functions implemented by the storage unit 105 can be implemented by the memory 24 in FIG. 2.
  • the chip system 110 includes at least one processor 111 and at least one interface circuit 112.
  • the processor 111 and the interface circuit 112 may be interconnected by wires.
  • the interface circuit 112 may be used to receive signals (e.g., receive signals from a tracking module).
  • the interface circuit 112 may be used to send signals to other devices (for example, the processor 111).
  • the interface circuit 112 may read an instruction stored in a memory and send the instruction to the processor 111.
  • when the instruction is executed by the processor 111, the device for controlling display can be made to execute the steps in the foregoing embodiments.
  • the chip system 110 may also include other discrete devices, which are not specifically limited in the embodiment of the present application.
  • Another embodiment of the present application further provides a computer-readable storage medium that stores instructions.
  • when the instructions are run on a device for controlling display, the device for controlling display executes the various steps that it performs in the method flow shown in the foregoing method embodiment.
  • the disclosed methods may be implemented as computer program instructions encoded on a computer-readable storage medium in a machine-readable format or encoded on other non-transitory media or articles.
  • FIG. 12 schematically shows a conceptual partial view of a computer program product provided by an embodiment of the present application, and the computer program product includes a computer program for executing a computer process on a computing device.
  • the computer program product is provided using the signal bearing medium 120.
  • the signal bearing medium 120 may include one or more program instructions, which, when executed by one or more processors, can provide the functions or part of the functions described above with respect to FIG. 6. Thus, for example, one or more features of S101 to S109 in FIG. 6 may be undertaken by one or more instructions associated with the signal bearing medium 120.
  • in addition, the program instructions in FIG. 12 are also described as example instructions.
  • the signal-bearing medium 120 may include a computer-readable medium 121, such as, but not limited to, a hard disk drive, a compact disc (CD), a digital video disc (DVD), a digital tape, a memory, a read-only memory (ROM), or a random access memory (RAM).
  • the signal bearing medium 120 may include a computer recordable medium 122, such as, but not limited to, memory, read/write (R/W) CD, R/W DVD, and so on.
  • the signal-bearing medium 120 may include a communication medium 123, such as, but not limited to, digital and/or analog communication media (eg, fiber optic cables, waveguides, wired communication links, wireless communication links, etc.).
  • the signal bearing medium 120 may be conveyed by a wireless form of the communication medium 123 (for example, a wireless communication medium complying with the IEEE 802.11 standard or another transmission protocol).
  • the one or more program instructions may be, for example, computer-executable instructions or logic-implemented instructions.
  • an apparatus for controlling display such as that described with respect to FIG. 6 may be configured to provide various operations, functions, or actions in response to one or more program instructions in the computer-readable medium 121, the computer recordable medium 122, and/or the communication medium 123.
  • the above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • when implemented by a software program, they may be implemented in the form of a computer program product in whole or in part.
  • the computer program product includes one or more computer instructions.
  • when the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are generated in whole or in part.
  • the computer can be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • Computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • for example, computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, integrating one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).


Abstract

This application discloses a display method and a display control apparatus, relating to the field of display technology. The method helps improve the stereoscopic effect perceived when a user views a three-dimensional image with the naked eye. The display method is applied to a display system. The display system includes a projection screen; the projection screen includes a transparent substrate and a liquid crystal film covering the transparent substrate, and the liquid crystal film includes a plurality of liquid crystal cells. The display method includes: obtaining an image to be displayed; determining target liquid crystal cells among the plurality of liquid crystal cells based on the positions of pixels in the image to be displayed; setting the state of the target liquid crystal cells to a scattering state and the state of non-target liquid crystal cells to a transparent state, where the non-target liquid crystal cells are the liquid crystal cells other than the target liquid crystal cells; and displaying a projected image of the image to be displayed on the target liquid crystal cells.

Description

Display method and display control apparatus
This application claims priority to Chinese Patent Application No. 202010203698.2, entitled "Display method and display control apparatus" and filed with the China National Intellectual Property Administration on March 20, 2020, which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of display, and in particular to a display method and a display control apparatus.
Background
Traditionally, images have been displayed in two dimensions. As display technology has developed, replacing two-dimensional display with three-dimensional display can bring users a better visual experience. At present, when users view three-dimensionally displayed images with the naked eye, the displayed three-dimensional images generally lack a strong sense of depth. How to improve the stereoscopic effect when users view three-dimensional images with the naked eye has therefore become an urgent technical problem.
Summary
This application provides a display method and a display control apparatus, which help improve the stereoscopic effect perceived when a user views a three-dimensional image with the naked eye.
To achieve the above objective, this application provides the following technical solutions:
According to a first aspect, this application provides a display method. The display method is applied to a terminal device that includes a projection screen. The projection screen includes a transparent substrate and a liquid crystal film covering the transparent substrate, and the liquid crystal film includes a plurality of liquid crystal cells. The display method includes: obtaining an image to be displayed; determining target liquid crystal cells among the plurality of liquid crystal cells based on the positions of pixels in the image to be displayed; then setting the state of the target liquid crystal cells to a scattering state and the state of the non-target liquid crystal cells to a transparent state, where the non-target liquid crystal cells are the liquid crystal cells other than the target liquid crystal cells; and then displaying a projected image of the image to be displayed on the target liquid crystal cells. Here, the image to be projected of the image to be displayed may be projected onto the target liquid crystal cells, so that the projected image of the image to be displayed is shown on the target liquid crystal cells.
With the above method, the projected image of the image to be displayed is shown on a transparent projection screen; therefore, the background of the projected image blends with the surrounding environment, improving the visual effect.
With reference to the first aspect, in a possible design, the foregoing "obtaining an image to be displayed" includes: selecting the image to be displayed from a stored image library, or downloading the image to be displayed from a network.
With reference to the first aspect, in another possible design, the image to be displayed includes a three-dimensional image, and the projected image of the image to be displayed includes a two-dimensional image. In this way, the two-dimensional projected image of the three-dimensional image to be displayed can be shown on the transparent projection screen; when the two-dimensional projected image is viewed with the naked eye on a curved or three-dimensional projection screen, a realistic three-dimensional image appearing to "float" in the air can be seen, improving the stereoscopic effect of viewing three-dimensional images with the naked eye.
With reference to the first aspect, in another possible design, the foregoing "setting the state of the target liquid crystal cells to the scattering state and the state of the non-target liquid crystal cells to the transparent state" includes:
applying a first preset voltage to the target liquid crystal cells to control their state to be the scattering state, and applying a second preset voltage to the non-target liquid crystal cells to control their state to be the transparent state; or, applying the second preset voltage to the target liquid crystal cells to control their state to be the scattering state, and applying the first preset voltage to the non-target liquid crystal cells to control their state to be the transparent state.
The first preset voltage is greater than or equal to a preset value, and the second preset voltage is less than the preset value.
Through this possible design, the projected image of the image to be displayed can be shown on the transparent projection screen, so that its background blends with the surrounding environment, improving the visual effect.
With reference to the first aspect, in another possible design, the liquid crystal film includes a polymer dispersed liquid crystal film, a bistable liquid crystal film, or a dye-doped liquid crystal film.
With reference to the first aspect, in another possible design, the projection screen includes a curved screen, or the projection screen includes a three-dimensional screen. By using a curved or three-dimensional screen, when the user views the two-dimensional projected image of the three-dimensional image to be displayed on the projection screen with the naked eye, a realistic three-dimensional image appearing to "float" in the air can be seen, improving the stereoscopic effect of viewing three-dimensional images with the naked eye.
With reference to the first aspect, in another possible design, the display method further includes: tracking the position of the human eyes. The foregoing "determining target liquid crystal cells among the plurality of liquid crystal cells based on the positions of pixels in the image to be displayed" includes: determining the positions of the target liquid crystal cells among the plurality of liquid crystal cells based on the tracked eye position and the positions of the pixels in the image to be displayed. When the image to be displayed is a three-dimensional image, determining, from the tracked eye position, the positions of the target liquid crystal cells used to display the projected image can improve the stereoscopic impression of viewing the projected image from the eye position.
With reference to the first aspect, in another possible design, if the image to be displayed is a three-dimensional image, the foregoing "determining the positions of the target liquid crystal cells among the plurality of liquid crystal cells based on the tracked eye position and the positions of pixels in the image to be displayed" includes: based on the intersection point of the projection screen with the line connecting the tracked eye position and the position of each pixel in the image to be displayed, determining the liquid crystal cell at the intersection position among the plurality of liquid crystal cells as a target liquid crystal cell.
With reference to the first aspect, in another possible design, the terminal device further includes a first projection lens, and the foregoing "projecting the image to be projected of the image to be displayed onto the target liquid crystal cells" includes: adjusting the projection region of the first projection lens so that the first projection lens projects the image to be projected onto the target liquid crystal cells, where the field of view of the first projection lens is less than or equal to a preset threshold. In this way, when the image to be displayed is a three-dimensional image, even with a projection lens with a small field of view, the image to be projected can still be projected onto the target liquid crystal cells determined according to the eye position, improving the stereoscopic impression of viewing the projected image from the eye position.
With reference to the first aspect, in another possible design, the terminal device further includes a second projection lens, and the foregoing "projecting the image to be projected of the image to be displayed onto the target liquid crystal cells" includes: projecting the image to be projected through the second projection lens onto the target liquid crystal cells, where the field of view of the second projection lens is greater than the preset threshold.
With reference to the first aspect, in another possible design, the terminal device further includes an image source module configured to project the image to be projected of the image to be displayed onto the projection screen.
With reference to the first aspect, in another possible design, the tracking module may be arranged inside the terminal device or outside the terminal device. When the tracking module is arranged inside the terminal device, the size of the terminal device can be reduced. When the tracking module is arranged outside the terminal device, since the detection light of the tracking module does not intersect the projection screen, the area of the projection screen available for displaying the projected image increases, so that the projected image can be viewed from eye positions tracked over a wider range.
According to a second aspect, this application provides a display control apparatus applied to a terminal device; the apparatus can be used to perform any of the methods provided in the first aspect. This application may divide the display control apparatus into functional modules according to any of the methods provided in the first aspect; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. For example, this application may divide the display control apparatus by function into an obtaining unit, a determining unit, a setting unit, a control unit, and so on. For descriptions of the possible technical solutions performed by the divided functional modules and their beneficial effects, refer to the technical solutions provided in the first aspect or its corresponding possible designs; details are not repeated here.
According to a third aspect, this application provides a terminal device including a projection screen, a processor, and the like. The terminal device can be used to perform any of the methods provided in the first aspect. For descriptions of the possible technical solutions performed by the modules of the terminal device and their beneficial effects, refer to the first aspect or its corresponding possible designs; details are not repeated here.
According to a fourth aspect, this application provides a chip system, including a processor configured to invoke and run, from a memory, a computer program stored in the memory, to perform any of the methods provided in the implementations of the first aspect.
According to a fifth aspect, this application provides a computer-readable storage medium, such as a non-transitory computer-readable storage medium, on which a computer program (or instructions) is stored. When the computer program (or instructions) is run on a computer, the computer is caused to perform any of the methods provided in any possible implementation of the first aspect.
According to a sixth aspect, this application provides a computer program product which, when run on a computer, causes any of the methods provided in any possible implementation of the first aspect to be performed.
It can be understood that any of the apparatuses, computer storage media, computer program products, or chip systems provided above can be applied to the corresponding methods provided above; therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods; details are not repeated here.
In this application, the names of the terminal device and the display control apparatus do not limit the devices or functional modules themselves; in actual implementation, these devices or functional modules may appear under other names. As long as the functions of the devices or functional modules are similar to those of this application, they fall within the scope of the claims of this application and their equivalent technologies.
These and other aspects of this application will be more concise and comprehensible in the following description.
Brief Description of Drawings
FIG. 1 is a schematic diagram of a projection region according to an embodiment of this application;
FIG. 2 is a schematic structural diagram of a display system according to an embodiment of this application;
FIG. 3 is a schematic structural diagram of a liquid crystal film according to an embodiment of this application;
FIG. 4 is a schematic structural diagram of a projection screen according to an embodiment of this application;
FIG. 5A is a first schematic diagram of the hardware structure of a display system according to an embodiment of this application;
FIG. 5B is a second schematic diagram of the hardware structure of a display system according to an embodiment of this application;
FIG. 6 is a schematic flowchart of a display method according to an embodiment of this application;
FIG. 7 is a first schematic diagram of a display method according to an embodiment of this application;
FIG. 8 is a second schematic diagram of a display method according to an embodiment of this application;
FIG. 9 is a third schematic diagram of a display method according to an embodiment of this application;
FIG. 10 is a schematic structural diagram of a display control apparatus according to an embodiment of this application;
FIG. 11 is a schematic structural diagram of a chip system according to an embodiment of this application;
FIG. 12 is a schematic structural diagram of a computer program product according to an embodiment of this application.
Detailed Description
The following explains some of the terms and technologies involved in the embodiments of this application:
1) Motion parallax
The retina can only receive stimuli in two-dimensional space; the perception of three-dimensional space relies mainly on binocular vision. The human ability to perceive the world in three dimensions and judge the distance of objects by binocular vision is called depth perception. Depth perception is a comprehensive sensation obtained by the brain through integrated processing of multiple kinds of information acquired by the human eye. The information used to provide depth perception is usually called depth cues. The real world provides complete depth cues.
In a popular sense, how strong the "stereoscopic feel" of a three-dimensional display technology is depends on whether the observer's depth perception of the displayed content is close to that in the real world. Therefore, the "stereoscopic feel" of a three-dimensional display technology depends on whether that technology can provide suitable depth cues in its application. Current three-dimensional display technologies can usually provide one or several depth cues. For example, a depth cue may be parallax, a light-and-shadow relationship, or an occlusion relationship.
Parallax refers to the change and difference in an object's position in the field of view when the same object is observed from two different positions. When a target is viewed from two observation points, the angle between the two lines of sight is called the parallax angle of the two points, and the distance between the two points is called the parallax baseline. Parallax includes binocular parallax and motion parallax.
Binocular parallax refers to a certain degree of horizontal difference between the images on the left and right retinas caused by the normal pupillary distance and differing gaze angles. When observing a three-dimensional target, because the two eyes are about 60 mm apart, they observe from different angles. This slight horizontal image-position difference in the retinal images of the two eyes is called binocular parallax or stereoscopic parallax (stereoscopic vision).
Motion parallax, also called "monocular motion parallax", is a monocular depth cue; it refers to the difference in the direction and speed of motion of objects seen when the line of sight moves laterally across the field of view. During relative displacement, near objects appear to move fast and distant objects appear to move slowly.
It should be noted that binocular parallax is obvious when the observer is close to the observed target. When the observer is far from the target, for example more than 1 m away, binocular parallax can be ignored and motion parallax plays the dominant role.
2) Two-dimensional projection image to be projected, two-dimensional projection image
The two-dimensional projection image to be projected (corresponding to the image to be projected in the embodiments of this application) is the two-dimensional projection image obtained by coordinate transformation of the three-dimensional image to be displayed (corresponding to the image to be displayed in the embodiments of this application); this two-dimensional projection image (corresponding to the projected image in the embodiments of this application) can be displayed on the projection image source module 211 described below.
The two-dimensional projection image (corresponding to the projected image in the embodiments of this application) is the image obtained by projecting the two-dimensional projection image to be projected onto a projection screen (such as the projection screen 212 described below).
3) Projection region
A projection lens has a projection region of a certain extent when projecting onto a projection screen. The projection region can be used to display the two-dimensional projection image.
For example, referring to FIG. 1, the projection region of the projection lens 11 on the projection screen 13 is the projection region 12 shown by the dashed ellipse; the shape of the projection region 12 is related to the aperture shape of the aperture stop provided on the projection lens 11. The angle between the lines connecting the projection lens to the two most distant points 121 and 122 of the projection region is called the field of view (FOV); here, the FOV is D°.
4) Other terms
In the embodiments of this application, words such as "exemplary" or "for example" are used to indicate an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of this application should not be construed as being preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.
In the description of the embodiments of this application, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, in the description of this application, unless otherwise specified, "a plurality of" means two or more.
An embodiment of this application provides a display method, which is applied to a display system. The method can provide suitable motion parallax. For the two-dimensional projection image displayed on the projection screen, the user can obtain the two-dimensional projection image information with the naked eye and, combined with the motion parallax, the brain's integrated processing yields the viewing experience of a three-dimensional image.
Referring to FIG. 2, FIG. 2 shows a schematic structural diagram of a display system according to an embodiment of this application. The display system 20 shown in FIG. 2 may include a projection module 21, a tracking module 22, and a processor 23.
Optionally, the display system 20 may further include a memory 24 and a communication interface 25. At least two of the projection module 21, the tracking module 22, the processor 23, the memory 24, and the communication interface 25 may be integrated in one device, or may be provided in different devices.
Taking the case in which the projection module 21, the tracking module 22, the processor 23, the memory 24, and the communication interface 25 are integrated in one terminal device as an example, the display system 20 further includes a bus 26, through which the projection module 21, the tracking module 22, the processor 23, the memory 24, and the communication interface 25 may be connected. In this case, the terminal device may be any electronic device with a projection screen, which is not limited in this embodiment. For example, the electronic device may be a smart speaker device with a projection screen.
The projection module 21 includes a projection image source module 211, a projection screen 212, and a projection lens 213.
The projection image source module 211 is configured to display the two-dimensional projection image to be projected and to project it onto the projection screen 212 through the projection lens 213. The projection image source module 211 includes a light source and a light modulation element. The specific forms of the light source and the light modulation element are not limited in this embodiment; for example, the light source may be a light emitting diode (LED) or a laser, and the light modulation element may be a digital light processing (DLP) system or liquid crystal on silicon (LCoS). The two-dimensional projection image to be projected is displayed on the light modulation element; light emitted by the light source is modulated by the light modulation element to form the two-dimensional projection image to be projected, which is projected onto the projection screen 212 through the projection lens 213.
The projection screen 212 is configured to display the two-dimensional projection image. The projection screen 212 may be a curved screen or a three-dimensional screen, and of course may also be a flat screen. The three-dimensional screen may take many shapes, for example spherical, cylindrical, prismatic, conical, or polyhedral; this is not limited in this embodiment.
The projection screen 212 includes a transparent substrate and a liquid crystal film covering the transparent substrate. The material of the transparent substrate is not limited in this embodiment; for example, it may be a transparent glass substrate or a transparent resin substrate. The liquid crystal film may be a polymer dispersed liquid crystal (PDLC) film, a bistable liquid crystal (BLC) film, a dye-doped liquid crystal (DDLC) film, or the like.
Specifically, the liquid crystal film includes a plurality of liquid crystal cells, each of which has a scattering state and a transparent state. The processor 23 can control the state of each liquid crystal cell through electrical signals. Here, relative to the transparent state, the scattering state may also be called the non-transparent state. Each liquid crystal cell may correspond to one pixel of the two-dimensional projection image or to a plurality of its pixels; of course, a plurality of liquid crystal cells may also correspond to one pixel of the two-dimensional projection image; this is not limited in this embodiment. It should be noted that liquid crystal cells in the scattering state are used to display the two-dimensional projection image.
As an example, take the case in which the liquid crystal film is a PDLC film. As shown in (a) of FIG. 3, (a) of FIG. 3 shows a plurality of liquid crystal cells in the PDLC film (each square represents one liquid crystal cell), and each of these cells is set to the first preset voltage, which is greater than or equal to the preset value. In this case, the liquid crystal molecules of each cell are uniformly aligned along the direction of the electric field, so that incident light exits the cell in its original direction, and the cell is therefore in the transparent state. If the applied voltage of the liquid crystal cells 33, 34, 38, and 39 shown in (a) of FIG. 3 is set to the second preset voltage, which is less than the preset value, the alignment of the liquid crystal molecules in the cells 33, 34, 38, and 39 becomes random. Then, after incident light passes through the cells 33, 34, 38, and 39, the emergent light is scattered, as shown in (b) of FIG. 3. At this time, the cells 33, 34, 38, and 39 are in the scattering state, i.e. the non-transparent state. The preset value of the voltage may be determined by the specific composition of the liquid crystal film and the proportions of its components; this is not limited in this embodiment.
It should be noted that, if the liquid crystal film is a BLC film, when the first preset voltage is applied to a liquid crystal cell, the cell is in the scattering state; when the second preset voltage is applied to the cell, it is in the transparent state. If the liquid crystal film is a dye-doped liquid crystal film, it may be set that applying the first preset voltage to a cell puts it in the scattering state and applying the second preset voltage puts it in the transparent state; or it may be set that applying the first preset voltage puts a cell in the transparent state and applying the second preset voltage puts it in the scattering state. This is not limited in this embodiment.
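The voltage-to-state mapping just described differs by film type. As a minimal illustrative sketch (the two preset voltages and the film behaviors are taken from the text above, while the function name and the concrete threshold value are hypothetical):

```python
# Illustrative sketch of the voltage-to-state mapping described above.
# The first preset voltage is >= PRESET_VALUE, the second is < PRESET_VALUE.
# The concrete number below is hypothetical; the text says the preset value
# depends on the film composition.
PRESET_VALUE = 5.0  # volts (hypothetical)

def cell_state(film: str, voltage: float) -> str:
    """Return 'transparent' or 'scattering' for one liquid crystal cell."""
    high = voltage >= PRESET_VALUE  # i.e. the first preset voltage is applied
    if film == "PDLC":
        # High voltage aligns the molecules along the field -> transparent.
        return "transparent" if high else "scattering"
    if film == "BLC":
        # The mapping is inverted relative to PDLC, per the text.
        return "scattering" if high else "transparent"
    # Dye-doped films may use either convention, so no default is given here.
    raise ValueError("dye-doped films may use either convention")

assert cell_state("PDLC", 6.0) == "transparent"
assert cell_state("PDLC", 1.0) == "scattering"
```

This makes explicit why S107 below must branch on the film type before choosing which preset voltage to apply to the target cells.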
The projection lens 213 is configured to project the two-dimensional projection image to be projected, displayed on the projection image source module 211, onto the projection screen 212. The projection lens 213 may be a lens with a large field of view (FOV), for example a fisheye lens with a FOV greater than 150° (corresponding to the second projection lens in the embodiments of this application). Of course, the projection lens 213 may also be a projection lens with a FOV of about 40°-70° (corresponding to the first projection lens in the embodiments of this application). The FOV of the first projection lens is less than or equal to a preset threshold, and the FOV of the second projection lens is greater than the preset threshold; the value of the preset threshold is not limited in this embodiment.
If the projection lens 213 is the first projection lens, the projection module 21 may further include a rotating platform 214. The rotating platform 214 is configured to adjust the projection region of the projection lens 213 by rotating through an angle. The controller of the rotating platform 214 is connected to the processor 23, or the controller that controls the rotation of the rotating platform 214 is the processor 23.
As an example, if the projection screen 212 is a three-dimensional screen, the projection lens 213 may be arranged fully inside the three-dimensional screen, or may be arranged partially inside it.
As an example, if the projection screen 212 is a columnar projection screen such as a cylinder or a square column, the projection lens 213 may implement the projection function through an annular-zone projection optical system. In this case, the top and bottom surfaces of the columnar screen may not participate in projection display, while the side wall of the column may be used to display the two-dimensional projection image, though this is not limiting.
As an example, FIG. 4 shows a structural diagram of a projection module 21. The FOV of the projection lens 213 is 50°. The projection screen 212 is a spherical three-dimensional screen, and the projection lens 213 is partially arranged inside the projection screen 212. The projection lens 213 is located between the projection image source module 211 and the projection screen 212, and the positions of the projection lens 213 and the projection image source module 211 are fixed relative to each other. The rotating platform 214 is configured to adjust the projection region of the projection lens 213; for example, at the current moment the projection region of the projection lens 213 is A, and at the next moment the processor 23 instructs the rotating platform 214 to rotate by X° so that the projection region of the projection lens 213 becomes B as shown in FIG. 4. The specific value of X is determined by the processor 23; for the process by which the processor 23 determines X, refer to the description of the display method below.
The tracking module 22 is configured to track the position of the human eyes and send the tracked eye position to the processor 23. Specifically, the tracking module may use infrared imaging technology to track the eye position, though the embodiments of this application are not limited thereto.
The processor 23 is the control center of the display system 20. The processor 23 may be a general-purpose central processing unit (CPU) or another general-purpose processor, where the general-purpose processor may be a microprocessor or any conventional processor. As an example, the processor 23 may include one or more CPUs, such as CPU 0 and CPU 1 shown in FIG. 2.
Specifically, the processor 23 is configured to determine, from the positions of the pixels of the three-dimensional image to be displayed and the eye position, the two-dimensional projection image to be projected of the three-dimensional image, and to send the two-dimensional projection image to the projection image source module 211. The processor 23 is further configured to determine, from the positions of the pixels of the three-dimensional image to be displayed and the eye position, the positions of the target liquid crystal cells on the projection screen 212, and to control, through a control circuit, the state of the target liquid crystal cells to be the scattering state and the state of the non-target liquid crystal cells to be the transparent state. Here, the non-target liquid crystal cells are the liquid crystal cells of the projection screen 212 other than the target liquid crystal cells. The control circuit may be integrated on the liquid crystal film; this is not limited in this embodiment.
The memory 24 may be a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a magnetic disk storage medium or another magnetic storage device, or any other medium that can carry or store desired program code in the form of instructions or data structures and can be accessed by a computer, but is not limited thereto.
In a possible implementation, the memory 24 may exist independently of the processor 23. The memory 24 may be connected to the processor 23 through the bus 26 and is used to store data, instructions, or program code. When the processor 23 invokes and executes the instructions or program code stored in the memory 24, the display method provided in the embodiments of this application can be implemented.
In another possible implementation, the memory 24 may also be integrated with the processor 23.
The communication interface 25 is used to connect the display system 20 with other devices (such as servers) through a communication network, which may be Ethernet, a radio access network (RAN), a wireless local area network (WLAN), or the like. The communication interface 25 may include a receiving unit for receiving data and a sending unit for sending data.
The bus 26 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in FIG. 2, but this does not mean that there is only one bus or one type of bus.
It should be noted that the structure shown in FIG. 2 does not constitute a limitation on the display system. Besides the components shown in FIG. 2, the display system 20 may include more or fewer components than shown, or combine some components, or use a different arrangement of components.
As an example, FIG. 5A shows the hardware structure of a terminal device (for example, a smart speaker device) according to an embodiment of this application. The smart speaker device 50 includes a projection module, a tracking module, and a processor 53. The projection module includes a projection image source module 511, a projection screen 512, and a fisheye lens 513 with a FOV of 170°; the tracking module includes a tracking lens 52; and the projection image source module 511 and the tracking lens 52 are each connected to and communicate with the processor 53 through a bus.
As shown in FIG. 5A, the projection screen 512 is a spherical projection screen including a spherical transparent substrate and a liquid crystal film covering the spherical transparent substrate; the liquid crystal film may cover the inner surface or the outer surface of the spherical transparent substrate. The embodiments of this application are described using the example in which the liquid crystal film covers the inner surface. The shaded region corresponding to the fisheye lens 513 is the projectable region of the fisheye lens, and the shaded region corresponding to the tracking lens 52 is the range within which the tracking lens can track the human eyes. It can be understood that the smart speaker device 50 may include a plurality of tracking lenses to track the eye position over a 360° range.
The smart speaker device 50 may further include a voice collector and a voice player (not shown in FIG. 5A), each connected to and communicating with the processor through a bus. The voice collector is used to collect the user's voice instructions, and the voice player is used to output voice information to the user. Optionally, the smart speaker device 50 may further include a memory (not shown in FIG. 5A), connected to and communicating with the processor and used to store local data.
Of course, the tracking lens 52 may also be located outside the projection screen 512, as shown in FIG. 5B; this is not limited in the embodiments of this application. It can be understood that if the tracking lens 52 is inside the projection screen 512, the size of the smart speaker device 50 can be reduced. If the tracking lens 52 is outside the projection screen 512, conflict between the display region of the projection screen and the tracking optical path of the tracking lens can be avoided, thereby obtaining a larger projection display region.
The following describes the display method provided in the embodiments of this application with reference to the accompanying drawings. The embodiments are described using the example in which the display method is applied to the smart speaker device 50 shown in FIG. 5A.
Referring to FIG. 6, FIG. 6 is a schematic flowchart of the display method provided in an embodiment of this application. The display method includes the following steps:
S101. The processor obtains an image to be displayed.
The image to be displayed may be a multi-dimensional image, for example a three-dimensional image. In the following description, the image to be displayed is a three-dimensional image to be displayed.
Specifically, the processor may obtain the three-dimensional image to be displayed from the network or from a local gallery according to obtained indication information; this is not limited in this embodiment.
The specific content and form of the indication information are not limited in this embodiment. For example, the indication information may be input by the user through voice, text, or keys, or may be trigger information detected by the processor, for example power-on or power-off of the smart speaker device 50.
In a possible implementation, if the indication information is voice information input by the user, the processor can obtain the voice information collected by the smart speaker device through the voice collector.
The content of the voice information may be the wake-up word of the smart speaker device, for example "Xiao e, Xiao e". In this case, the processor retrieves the three-dimensional cartoon figure of "Xiao e" from the local gallery; this three-dimensional cartoon figure is the three-dimensional image to be displayed.
Alternatively, the content of the voice information may be any question the user asks after saying the wake-up word, for example "find a satellite map of this city for me". In this case, the processor searches for and downloads a three-dimensional satellite map of the city from the network; this three-dimensional satellite map is the three-dimensional image to be displayed. As another example, the content of the voice information is "watch movie XX". In this case, the processor searches for and downloads the 3D version of movie XX from the network, and the current frame of the 3D version of movie XX about to be played is the three-dimensional image to be displayed at the current moment.
In another possible implementation, if the indication information is non-voice information input by the user, the user may input the indication information through keys, the touchscreen of the smart speaker device, or any other means capable of inputting indication information; this is not limited in this embodiment. Correspondingly, the processor obtains the indication information input by the user and obtains the three-dimensional image to be displayed based on the indication.
In yet another possible implementation, the indication information is trigger information detected by the processor, for example the processor detects a power-on operation of the smart speaker device 50. In this case, the power-on operation triggers the processor to obtain the three-dimensional image corresponding to the power-on operation and determine it as the three-dimensional image to be displayed. For example, the three-dimensional image corresponding to the power-on operation may be a three-dimensional image of the cartoon figure representing the smart speaker device 50 waving in greeting.
S102. The processor determines image information of the three-dimensional image to be displayed.
Specifically, the processor determines the image information of the three-dimensional image to be displayed in a preset three-dimensional coordinate system.
The image information of the three-dimensional image to be displayed describes the three-dimensional image to be displayed. The three-dimensional image to be displayed may consist of a plurality of pixels; for each of these pixels, the image information may include the coordinate position of the pixel in the preset three-dimensional coordinate system, as well as the color and luminance information of the three-dimensional image at that coordinate position.
The preset three-dimensional coordinate system is set in advance by the processor. For example, it may be a three-dimensional coordinate system with the center of the spherical projection screen as the origin. Of course, it may also be a three-dimensional coordinate system with any other point as the origin; this is not limited in this embodiment. For ease of description, the following uses the example in which the origin of the preset three-dimensional coordinate system is the center of the spherical projection screen.
For example, with reference to FIG. 5A and FIG. 7, if the three-dimensional image to be displayed is the cuboid 70 shown in FIG. 7, any pixel A among the plurality of pixels constituting the cuboid 70 can be represented by coordinates (x_a, y_a, z_a). Here, (x_a, y_a, z_a) are coordinate values in the three-dimensional coordinate system whose origin is the center of the spherical projection screen 512.
In addition, if the three-dimensional image to be displayed is large, some of its pixels may lie outside the projection screen 512, so that the pixels of the two-dimensional projection image corresponding to those pixels are not displayed on the projection screen 512. With reference to FIG. 5A and FIG. 8, because the cuboid 80 shown in FIG. 8 is too large, when the cuboid 80 is placed in the preset three-dimensional coordinate system, some pixels, such as point B in FIG. 8, lie outside the projection screen.
Optionally, to avoid the situation shown in FIG. 8, the processor may reduce the size of the three-dimensional image to be displayed, so that the pixel of the two-dimensional projection image corresponding to every pixel of the three-dimensional image can be displayed on the projection screen. Specifically, the processor may perform the following steps:
Step 1. The processor determines the position of each pixel of the three-dimensional image to be displayed in the preset three-dimensional coordinate system.
Step 2. The processor determines whether every pixel of the three-dimensional image to be displayed lies on the same side of the projection screen.
Specifically, based on the position of each pixel of the three-dimensional image in the preset three-dimensional coordinate system, the processor determines the distance between each pixel and the coordinate origin. The processor then determines whether the distance between every pixel and the origin is less than or equal to the radius of the projection screen 512. If every distance is less than or equal to the radius of the projection screen 512, the processor determines that every pixel lies inside the spherical projection screen 512, i.e. the three-dimensional image lies on one side of the projection screen 512. If the distance between at least one pixel and the origin is greater than the radius of the projection screen 512, the processor determines that the three-dimensional image has pixels lying outside the spherical projection screen 512, i.e. the three-dimensional image lies on both sides of the projection screen 512.
Step 3. The processor reduces the three-dimensional image to be displayed (for example, by a preset ratio) and repeats steps 1 and 2 until the processor determines that every pixel of the reduced three-dimensional image lies on the same side of the projection screen. The specific value and selection of the preset ratio are not limited in this embodiment.
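Steps 1 to 3 above amount to uniformly shrinking the image until every pixel lies within the sphere radius. A minimal sketch, assuming the pixel coordinates are already expressed in the preset coordinate system centered on the sphere (the shrink ratio of 0.9 is a hypothetical choice; the text leaves the preset ratio open):

```python
import math

def fit_inside_sphere(pixels, radius, ratio=0.9):
    """Uniformly shrink pixel coordinates about the origin until every
    pixel lies inside the sphere of the given radius (steps 1-3 above)."""
    pts = [tuple(p) for p in pixels]
    # Step 2's check: is any pixel farther from the origin than the radius?
    while any(math.dist((0.0, 0.0, 0.0), p) > radius for p in pts):
        # Step 3: reduce by the preset ratio and re-check.
        pts = [(x * ratio, y * ratio, z * ratio) for (x, y, z) in pts]
    return pts

pts = fit_inside_sphere([(0.0, 0.0, 3.0), (1.0, 0.0, 0.0)], radius=1.0)
assert all(math.dist((0.0, 0.0, 0.0), p) <= 1.0 for p in pts)
```

Because the shrink is uniform about the origin, the shape of the image is preserved while its extent is reduced until the FIG. 8 situation no longer occurs.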
S103. The tracking lens tracks the position of the human eyes, determines an observation position based on the eye positions, and sends the determined observation position to the processor. Alternatively, the tracking lens tracks the eye positions and sends the tracked eye positions to the processor, so that the processor determines the observation position based on the eye positions.
The observation position is a single-point position determined from the positions of the two eyes; the relationship between the observation position and the eye positions is not limited in this embodiment. For example, the observation position may be the midpoint of the line connecting the positions of the two eyes.
The tracking lens is preconfigured with its own position in the preset three-dimensional coordinate system. Both the position of the tracking lens in the preset three-dimensional coordinate system and the observation position can be expressed as coordinates in the preset three-dimensional coordinate system.
In one implementation, the tracking module includes a tracking lens and a computing module. The tracking lens can track the positions of the two eyes using infrared imaging technology, based on its own position in the preset three-dimensional coordinate system. The computing module then calculates the midpoint of the line connecting the tracked eye positions and sends the calculated midpoint position to the processor as the observation position. For the specific process of tracking the eye positions with infrared imaging technology, refer to the prior art; details are not repeated here.
For example, if the tracking lens tracks the left eye at position E1 (x_e1, y_e1, z_e1) and the right eye at position E2 (x_e2, y_e2, z_e2), the computing module calculates, from the positions of E1 and E2, the midpoint position E (x_e, y_e, z_e) of the line E1E2 and sends E to the processor as the observation position.
In another implementation, the tracking module includes a tracking lens, which tracks the positions of the two eyes using infrared imaging technology based on its own position in the preset three-dimensional coordinate system, and sends both eye positions to the processor. The processor then determines the observation position from the received eye positions; for example, the processor may calculate the midpoint of the line connecting the two eye positions and determine that midpoint as the observation position.
It should be noted that the execution order of S102 and S103 is not limited in this embodiment; for example, S102 and S103 may be performed simultaneously, or S102 may be performed before S103.
S104. The processor determines an intersection point set, and information of each intersection point in the set, based on the image information of the three-dimensional image to be displayed and the determined observation position.
Specifically, the processor determines the intersection point set, and the information of each intersection point in the set, based on the determined observation position and the position of each pixel of the three-dimensional image in the preset three-dimensional coordinate system.
The intersection point set includes a plurality of intersection points: the points at which the lines connecting the observation position with a plurality of pixels of the three-dimensional image respectively intersect the projection screen. For any of these pixels, the line connecting the pixel and the observation position has no intersection with the three-dimensional image other than the pixel itself. In other words, these pixels are the pixels included in the picture of the three-dimensional image that the eye can see from the observation position. Thus, for each of these pixels, based on the intersection point obtained by intersecting the line connecting the pixel and the observation position with the projection screen, the pixel corresponds to that intersection point.
For example, with reference to FIG. 5A and FIG. 9, FIG. 9 shows how the processor determines any one intersection point in the set. As shown in FIG. 9, the eye drawn in dashed lines represents the observation position E determined in step S103, and the cuboid 70 is the three-dimensional image to be displayed placed in the preset three-dimensional coordinate system. The line connecting any pixel A on the cuboid 70 with the observation position E is line AE, which intersects the projection screen 512 at the intersection point A1 (x_a1, y_a1, z_a1); moreover, line AE has no intersection with the cuboid 70 other than pixel A, so pixel A corresponds to intersection point A1. If the line connecting a pixel C on the cuboid 70 with the observation position E is line CE, which also intersects the projection screen 512 at the intersection point A1 (x_a1, y_a1, z_a1), but line CE intersects the cuboid 70 at a point other than pixel C, namely pixel A, then pixel C does not correspond to intersection point A1.
Here, intersection point A1 is any intersection point in the set. In addition, from the above description, A1 may be a point on the liquid crystal film, or a point on the inner or outer surface of the spherical transparent substrate of the projection screen.
It can be understood that, for a three-dimensional image, the picture of the three-dimensional image seen by the human eye differs with viewing angle. Therefore, the intersection point set determined by the processor differs for different observation positions.
For each intersection point in the determined set, the information of the intersection point may include its position and its corresponding color and luminance information. The position of the intersection point is its position in the preset three-dimensional coordinate system; for example, the position of any intersection point in the set may be (x_s, y_s, z_s). In addition, the color and luminance information is the color and luminance information of the pixel of the three-dimensional image that corresponds to the intersection point.
It should be noted that the intersection point of the line with the projection screen may be the intersection with the inner surface of the projection screen, i.e. a point on the liquid crystal film of the projection screen. Of course, the intersection point may also be the intersection with the outer surface of the projection screen (i.e. the outer surface of the spherical transparent substrate), i.e. a point on the outer surface of the spherical transparent substrate; or the intersection with the inner surface of the spherical transparent substrate, i.e. a point on the inner surface of the spherical transparent substrate. This is not limited in this embodiment.
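Geometrically, each intersection point in S104 is found by intersecting the line through the observation position E and a pixel with the spherical screen, which reduces to a standard ray-sphere quadratic. A minimal sketch, assuming the sphere is centered on the coordinate origin as in the example above (the function name is illustrative; which of the two intersections is used is left open by the text, so both are returned sorted by distance from the eye):

```python
import math

def line_sphere_intersections(e, a, radius):
    """Intersections of the line through eye position E and pixel A with a
    sphere of the given radius centered at the origin. Solves
    |E + t*(A - E)|^2 = radius^2 for t, a quadratic in t."""
    ex, ey, ez = e
    dx, dy, dz = (a[0] - e[0], a[1] - e[1], a[2] - e[2])
    qa = dx * dx + dy * dy + dz * dz
    qb = 2 * (ex * dx + ey * dy + ez * dz)
    qc = ex * ex + ey * ey + ez * ez - radius * radius
    disc = qb * qb - 4 * qa * qc
    if disc < 0:
        return []  # the sight line misses the sphere entirely
    ts = sorted(((-qb - math.sqrt(disc)) / (2 * qa),
                 (-qb + math.sqrt(disc)) / (2 * qa)))
    return [(ex + t * dx, ey + t * dy, ez + t * dz) for t in ts]

# Eye at (0, 0, 3) outside a unit sphere, pixel A at the sphere's center:
pts = line_sphere_intersections((0.0, 0.0, 3.0), (0.0, 0.0, 0.0), 1.0)
assert [tuple(round(c, 6) for c in p) for p in pts] == [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0)]
```

Repeating this for every visible pixel and keeping the screen point on line AE yields the intersection point set; the color and luminance information of the pixel is then attached to its intersection point as described above.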
S105. The processor determines, based on the information of each intersection point in the determined set, information of the two-dimensional projection image to be projected of the three-dimensional image to be displayed.
The two-dimensional projection image of the three-dimensional image includes a plurality of pixels. For any of these pixels, the information of the two-dimensional projection image includes the position of the pixel and the color and luminance information of the pixel. The position of the pixel can be determined from the positions of the intersection points in the set, and the color and luminance information can be determined based on the color and luminance information of the intersection point used to determine the pixel position.
The processor determines, from the positions of the intersection points in the preset three-dimensional coordinate system, the two-dimensional positions at which the image to be projected is displayed on the projection image source module; this can be determined with reference to coordinate transformation methods in the prior art and is not detailed here.
For example, the processor may be preconfigured with the positions of the projection image source module and the projection lens in the preset three-dimensional coordinate system, and with the emission angles from the projection image source module to the projection lens during projection. The position of the projection image source module may be represented by the coordinates of the center of its display interface in the preset three-dimensional coordinate system, and the position of the projection lens by the coordinates of the intersection of the lens with its optical axis. Then, for each of the lines connecting each intersection point in the set with the projection lens, the processor calculates the angle between the line and the optical axis of the projection lens and, from this angle, obtains the emergent direction of the line relative to the projection lens. Next, based on the determined emergent direction, the optical properties of the projection lens (for example, its focal length and distortion properties), and the positions of the projection image source module and the projection lens in the preset three-dimensional coordinate system, the processor determines, on the projection image source module, the position of the pixel used to produce light in that emergent direction. That is, with the above method, the processor transforms the positions of the intersection points in the preset three-dimensional coordinate system into the two-dimensional positions at which the image to be projected is displayed on the projection image source module.
S106. The processor determines target liquid crystal cells on the projection screen based on the position of each intersection point in the determined set.
Specifically, the processor determines the positions of the target liquid crystal cells on the projection screen from the position of each intersection point in the determined set. Here, the target liquid crystal cells are used to display the two-dimensional projection image. The position of a target liquid crystal cell is a two-dimensional coordinate position.
According to the description of S104, if the intersection points in the set are points on the liquid crystal film of the projection screen, the x and y coordinates of the position of each intersection point are the positions of the target liquid crystal cells. If the intersection points are on the outer or inner surface of the spherical transparent substrate of the projection screen, the processor can determine the positions of the target liquid crystal cells based on the position of each intersection point.
It can be understood that the liquid crystal film covers the transparent substrate of the projection screen, so each point on the liquid crystal film corresponds one-to-one with a point on the transparent substrate. The distance between two corresponding points may be the thickness of the transparent substrate, or the thickness of the transparent substrate plus the liquid crystal film, depending on whether the intersection point is on the outer or the inner surface of the spherical transparent substrate. If the intersection point is on the outer surface of the spherical transparent substrate, the distance between the corresponding points is the thickness of the transparent substrate plus the liquid crystal film; if the intersection point is on the inner surface, the distance is the thickness of the liquid crystal film.
Specifically, if an intersection point in the set is on the inner surface of the spherical transparent substrate, the processor determines the position coordinates obtained by extending the intersection point, along the normal of the spherical transparent substrate at that point and toward the liquid crystal film side, by the thickness of the liquid crystal film, and determines the x and y coordinates of that position as the position of the target liquid crystal cell. Alternatively, if the intersection point is on the outer surface of the spherical transparent substrate, the processor determines the position obtained by extending the intersection point, along the normal at that point and toward the liquid crystal film side, by the thickness of the transparent substrate plus the liquid crystal film, and determines the x and y coordinates of that position as the position of the target liquid crystal cell.
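For a sphere centered on the coordinate origin, the surface normal at any point is simply the radial direction, so the offset in S106 is a radial move toward the center. A minimal sketch, assuming the liquid crystal film lies on the inner surface as in the FIG. 5A example (the function name is illustrative; the offset argument would be the film thickness, or the substrate plus film thickness, per the two cases above):

```python
import math

def offset_to_film(point, offset):
    """Move a point on the spherical substrate radially inward by `offset`.
    For a sphere centered on the origin, the normal at a surface point is
    the radial direction, so scaling the position vector moves along it."""
    r = math.dist((0.0, 0.0, 0.0), point)
    scale = (r - offset) / r
    return tuple(c * scale for c in point)

# A point on a unit sphere moved inward by a film thickness of 0.1:
p = offset_to_film((0.0, 0.0, 1.0), 0.1)
assert tuple(round(c, 6) for c in p) == (0.0, 0.0, 0.9)
```

The x and y coordinates of the returned point would then index the target liquid crystal cell, as the text describes.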
S107. Based on the determined target liquid crystal cells, the processor sets the state of the target liquid crystal cells to the scattering state and the state of the non-target liquid crystal cells to the transparent state.
Target liquid crystal cells in the scattering state can be used to display the two-dimensional projection image of the three-dimensional image to be displayed.
Specifically, the processor may set the state of the target liquid crystal cells to the scattering state, and the state of the non-target liquid crystal cells to the transparent state, in either of the following ways:
Mode 1: The processor sends the positions of the target liquid crystal cells to the control circuit. If the liquid crystal film of the projection screen is a PDLC film, the processor further instructs the control circuit to apply the second preset voltage to the target liquid crystal cells, so that the target cells are in the scattering state, and to apply the first preset voltage to the non-target liquid crystal cells, so that the non-target cells are in the transparent state.
If the liquid crystal film of the projection screen is a BLC film, the processor further instructs the control circuit to apply the first preset voltage to the target liquid crystal cells, so that the target cells are in the scattering state, and to apply the second preset voltage to the non-target liquid crystal cells, so that the non-target cells are in the transparent state.
If the liquid crystal film of the projection screen is a dye-doped liquid crystal film, the processor further instructs the control circuit, according to the preset correspondence between the first and second preset voltages and the scattering and transparent states, to apply the first or second preset voltage to the target and non-target liquid crystal cells respectively, so that the target cells are in the scattering state and the non-target cells are in the transparent state. For example, the first preset voltage is applied to the target cells to put them in the scattering state, and the second preset voltage to the non-target cells to put them in the transparent state; or, the second preset voltage is applied to the target cells to put them in the scattering state, and the first preset voltage to the non-target cells to put them in the transparent state.
Mode 2: The processor compares the positions of the liquid crystal cells currently in the scattering state on the projection screen (referred to as scattering-state cells for short in the embodiments of this application) with the positions of the target liquid crystal cells. If there is an intersection between the positions of the scattering-state cells and the positions of the target cells, the processor sends to the control circuit the positions of the target cells outside that intersection.
If the liquid crystal film of the projection screen is a PDLC film, the processor further instructs the control circuit to apply the second preset voltage to the target cells outside the intersection, so that those target cells are in the scattering state, and to apply the first preset voltage to the non-target cells outside the intersection, so that those non-target cells are in the transparent state.
If the liquid crystal film of the projection screen is a BLC film, the processor further instructs the control circuit to apply the first preset voltage to the target cells outside the intersection, so that those target cells are in the scattering state, and to apply the second preset voltage to the non-target cells outside the intersection, so that those non-target cells are in the transparent state.
If the liquid crystal film of the projection screen is a dye-doped liquid crystal film, the processor, according to the preset correspondence between the first and second preset voltages and the scattering and transparent states, instructs the control circuit to apply the first or second preset voltage to the target and non-target cells outside the intersection respectively, so that the target cells outside the intersection are in the scattering state and the non-target cells outside the intersection are in the transparent state. For example, the first preset voltage is applied to the target cells outside the intersection to put them in the scattering state, and the second preset voltage to the non-target cells outside the intersection to put them in the transparent state; or, the second preset voltage is applied to the target cells outside the intersection to put them in the scattering state, and the first preset voltage to the non-target cells outside the intersection to put them in the transparent state.
It should be noted that the execution order of S105 and S106-S107 is not limited in this embodiment; for example, S105 and S106-S107 may be performed simultaneously, or S105 may be performed before S106-S107.
S108. The processor sends the information of the two-dimensional projection image to be projected to the projection image source module.
The processor sends the information of the two-dimensional projection image to be projected, determined in S105, to the projection image source module.
In response to the processor's operation, the projection image source module receives the information of the two-dimensional projection image to be projected and displays the image to be projected according to that information.
S109. The projection image source module projects the two-dimensional projection image to be projected, through the projection lens, onto the target liquid crystal cells on the projection screen.
Specifically, for S109, the two-dimensional projection image to be projected may be projected onto the target liquid crystal cells on the projection screen with reference to the prior art; details are not repeated here.
In the above description, the projection lens used for projection is the fisheye lens 513 with a FOV of 170°. If a projection lens with a FOV of about 40°-70° is used instead, the smart speaker device 50 shown in FIG. 5A further includes a rotating platform.
In this case, S104 further includes:
S104a. The processor determines, based on the intersection point set and the observation position, the angle by which the rotating platform needs to rotate, so as to adjust the projection region of the projection lens.
Optionally, the processor may first determine the position of the center point of the region on the projection screen occupied by the intersection point set. The processor then determines the angle between the line connecting this center point and the observation point and the current optical axis of the projection lens as the angle by which the rotating platform needs to rotate. The processor then sends this angle value to the controller of the rotating platform, so that the rotating platform rotates by that angle. In this way, the line connecting the center point and the observation point can coincide with the optical axis of the projection lens. That is, the projection region of the projection lens is adjusted so that it can cover the region of the projection screen where the intersection point set lies.
In response to the processor's operation, the rotating platform rotates by the angle determined by the processor, so that the projection region of the projection lens can cover the region of the projection screen where the intersection point set lies.
It should be noted that, because the target liquid crystal cells are determined based on the positions of the intersection points in the set, the projection region needs to cover the region of the intersection point set determined in S104; only then can the two-dimensional projection image to be projected be projected by the projection lens onto the target liquid crystal cells.
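The rotation angle in S104a is the angle between two direction vectors, which can be computed from their dot product. A minimal sketch (representing the optical axis as a direction vector is an assumption for illustration; the text specifies only the angle between the two lines):

```python
import math

def rotation_angle(center, observation, axis):
    """Angle in degrees between the line from the intersection-region center
    to the observation point and the lens's current optical-axis direction."""
    line = tuple(o - c for o, c in zip(observation, center))
    dot = sum(l * a for l, a in zip(line, axis))
    cos_t = dot / (math.hypot(*line) * math.hypot(*axis))
    # Clamp against floating-point drift before taking the arccosine.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# Axis along +z, sight line along +x: the platform must rotate 90 degrees.
assert round(rotation_angle((0, 0, 0), (1, 0, 0), (0, 0, 1)), 6) == 90.0
```

After the platform rotates by this angle, the sight line and the optical axis coincide, so the projection region covers the region where the intersection point set lies.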
In summary, the display method provided in the embodiments of this application tracks the eye position with tracking technology, determines the intersection point set of the three-dimensional image to be displayed with the projection screen based on the eye position, and further determines the two-dimensional projection image of the three-dimensional image based on the intersection point set. Therefore, after the two-dimensional projection image is projected onto the target liquid crystal cells in the scattering state on the projection screen, it has a realistic three-dimensional stereoscopic effect. Moreover, the non-target liquid crystal cells on the projection screen are in the transparent state, i.e. the region of the projection screen occupied by the non-target cells is transparent. In other words, the two-dimensional projection image of the three-dimensional image is displayed on a transparent projection screen, so its background blends with the surrounding environment. When the user views the two-dimensional projection image on the projection screen with the naked eye, a realistic three-dimensional image appearing to "float" in the air can be seen, improving the stereoscopic effect of viewing three-dimensional images with the naked eye.
The foregoing mainly describes the solutions provided in the embodiments of this application from the perspective of the method. To implement the above functions, corresponding hardware structures and/or software modules for performing each function are included. A person skilled in the art should easily be aware that, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein, this application can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered beyond the scope of this application.
In the embodiments of this application, the display control apparatus may be divided into functional modules according to the above method examples; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division into modules in the embodiments of this application is illustrative and is merely a logical functional division; other divisions are possible in actual implementation.
As shown in FIG. 10, FIG. 10 is a schematic structural diagram of the display control apparatus 100 provided in an embodiment of this application. The display control apparatus 100 may be applied to a terminal device that includes a projection screen; the projection screen includes a transparent substrate and a liquid crystal film covering the transparent substrate, and the liquid crystal film includes a plurality of liquid crystal cells. The display control apparatus 100 may be used to control the display of the image to be displayed on the projection screen of the terminal device, and to perform the display method described above, for example the method shown in FIG. 6. The display control apparatus 100 may include an obtaining unit 101, a determining unit 102, a setting unit 103, and a control unit 104.
The obtaining unit 101 is configured to obtain the image to be displayed. The determining unit 102 is configured to determine target liquid crystal cells among the plurality of liquid crystal cells based on the positions of pixels in the image to be displayed. The setting unit 103 is configured to set the state of the target liquid crystal cells to the scattering state and the state of the non-target liquid crystal cells to the transparent state, where the non-target liquid crystal cells are the liquid crystal cells other than the target liquid crystal cells among the plurality of liquid crystal cells. The control unit 104 is configured to control displaying the projected image of the image to be displayed on the target liquid crystal cells. As an example, referring to FIG. 6, the obtaining unit 101 may be used to perform S101, the determining unit 102 may be used to perform S106, and the setting unit 103 may be used to perform S107.
Optionally, the image to be displayed includes a three-dimensional image, and the projected image of the image to be displayed includes a two-dimensional image.
Optionally, the setting unit 103 is specifically configured to:
apply the first preset voltage to the target liquid crystal cells to control their state to be the scattering state, and apply the second preset voltage to the non-target liquid crystal cells to control their state to be the transparent state; or, apply the second preset voltage to the target liquid crystal cells to control their state to be the scattering state, and apply the first preset voltage to the non-target liquid crystal cells to control their state to be the transparent state.
The first preset voltage is greater than or equal to the preset value, and the second preset voltage is less than the preset value.
As an example, referring to FIG. 6, the setting unit 103 may be used to perform S107.
Optionally, the liquid crystal film includes a polymer dispersed liquid crystal film, a bistable liquid crystal film, or a dye-doped liquid crystal film.
Optionally, the projection screen includes a curved screen, or the projection screen includes a three-dimensional screen.
Optionally, the terminal device further includes a tracking module configured to track the position of the human eyes. The determining unit 102 is further configured to determine the positions of the target liquid crystal cells among the plurality of liquid crystal cells based on the tracked eye position and the positions of the pixels in the image to be displayed. As an example, referring to FIG. 6, the determining unit 102 may be used to perform S102-S106.
Optionally, if the image to be displayed is a three-dimensional image, the determining unit 102 is specifically configured to determine, based on the intersection point of the projection screen with the line connecting the tracked eye position and the position of each pixel in the image to be displayed, the liquid crystal cell at the intersection position among the plurality of liquid crystal cells as the target liquid crystal cell. As an example, referring to FIG. 6, the determining unit 102 may be used to perform S102-S106.
Optionally, the terminal device further includes a rotating platform and a first projection lens. The control unit 104 is specifically configured to control the rotating platform to adjust the projection region of the first projection lens, so that the first projection lens projects the image to be projected onto the target liquid crystal cells and the projected image of the image to be displayed is displayed on the target liquid crystal cells; the field of view of the first projection lens is less than or equal to a preset threshold. As an example, referring to FIG. 6, the control unit 104 may be used to perform S104a.
Optionally, the terminal device further includes a second projection lens. The control unit 104 is specifically configured to control the second projection lens to project the image to be projected onto the target liquid crystal cells, so that the projected image of the image to be displayed is displayed on the target liquid crystal cells; the field of view of the second projection lens is greater than the preset threshold.
Of course, the display control apparatus 100 provided in the embodiments of this application includes but is not limited to the above units; for example, the display control apparatus 100 may further include a storage unit 105. The storage unit 105 may be used to store the program code of the display control apparatus 100, and so on.
For specific descriptions of the above optional modes, refer to the foregoing method embodiments; details are not repeated here. In addition, for explanations of any display control apparatus 100 provided above and descriptions of its beneficial effects, refer to the corresponding method embodiments above; details are not repeated here.
As an example, with reference to FIG. 2, the obtaining unit 101 of the display control apparatus 100 may be implemented by the communication interface 25 in FIG. 2. The functions implemented by the determining unit 102, the setting unit 103, and the control unit 104 may be implemented by the processor 23 in FIG. 2 executing the program code in the memory 24 in FIG. 2. The function implemented by the storage unit 105 may be implemented by the memory 24 in FIG. 2.
An embodiment of this application further provides a chip system 110. As shown in FIG. 11, the chip system 110 includes at least one processor 111 and at least one interface circuit 112. The processor 111 and the interface circuit 112 may be interconnected by lines. For example, the interface circuit 112 may be used to receive signals (for example, from the tracking module). As another example, the interface circuit 112 may be used to send signals to other apparatuses (for example, the processor 111). For example, the interface circuit 112 may read instructions stored in a memory and send the instructions to the processor 111. When the instructions are executed by the processor 111, the display control apparatus may be caused to perform the steps in the above embodiments. Of course, the chip system 110 may further include other discrete devices; this is not specifically limited in the embodiments of this application.
Another embodiment of this application further provides a computer-readable storage medium storing instructions. When the instructions are run on a display control apparatus, the display control apparatus performs the steps performed by the display control apparatus in the method flow shown in the above method embodiments.
In some embodiments, the disclosed methods may be implemented as computer program instructions encoded in a machine-readable format on a computer-readable storage medium or encoded on other non-transitory media or articles.
FIG. 12 schematically shows a conceptual partial view of a computer program product provided by an embodiment of this application, the computer program product including a computer program for executing a computer process on a computing device.
In one embodiment, the computer program product is provided using a signal bearing medium 120. The signal bearing medium 120 may include one or more program instructions which, when run by one or more processors, may provide the functions, or some of the functions, described above with respect to FIG. 6. Thus, for example, one or more features of S101-S109 in FIG. 6 may be carried by one or more instructions associated with the signal bearing medium 120. In addition, the program instructions in FIG. 12 also describe example instructions.
In some examples, the signal bearing medium 120 may include a computer-readable medium 121, such as but not limited to a hard disk drive, a compact disc (CD), a digital video disc (DVD), digital tape, memory, read-only memory (ROM), or random access memory (RAM).
In some implementations, the signal bearing medium 120 may include a computer-recordable medium 122, such as but not limited to memory, a read/write (R/W) CD, an R/W DVD, and so on.
In some implementations, the signal bearing medium 120 may include a communication medium 123, such as but not limited to digital and/or analog communication media (for example, fiber-optic cables, waveguides, wired communication links, wireless communication links, and so on).
The signal bearing medium 120 may be conveyed by a wireless form of the communication medium 123 (for example, a wireless communication medium complying with the IEEE 802.11 standard or other transmission protocols). The one or more program instructions may be, for example, computer-executable instructions or logic-implementing instructions.
In some examples, a display control apparatus such as that described with respect to FIG. 6 may be configured to provide various operations, functions, or actions in response to one or more program instructions in the computer-readable medium 121, the computer-recordable medium 122, and/or the communication medium 123.
It should be understood that the arrangements described here are for example purposes only. Thus, those skilled in the art will understand that other arrangements and other elements (for example, machines, interfaces, functions, orders, and groups of functions) can be used instead, and that some elements may be omitted altogether depending on the desired result. In addition, many of the described elements are functional entities that can be implemented as discrete or distributed components, or in conjunction with other components in any suitable combination and location.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented with a software program, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer-executable instructions are loaded and executed on a computer, the processes or functions according to the embodiments of this application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or a data center, that integrates one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, hard disk, or magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid-state drive (SSD)), or the like.
The foregoing descriptions are merely specific implementations of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (21)

  1. A display method, wherein the display method is applied to a terminal device, the terminal device comprises a projection screen, the projection screen comprises a transparent substrate and a liquid crystal film covering the transparent substrate, and the liquid crystal film comprises a plurality of liquid crystal cells; and the method comprises:
    obtaining an image to be displayed;
    determining target liquid crystal cells among the plurality of liquid crystal cells based on positions of pixels in the image to be displayed;
    setting a state of the target liquid crystal cells to a scattering state, and setting a state of non-target liquid crystal cells to a transparent state, wherein the non-target liquid crystal cells are liquid crystal cells among the plurality of liquid crystal cells other than the target liquid crystal cells; and
    displaying a projected image of the image to be displayed on the target liquid crystal cells.
  2. The method according to claim 1, wherein the image to be displayed comprises a three-dimensional image, and the projected image of the image to be displayed comprises a two-dimensional image.
  3. The method according to claim 1 or 2, wherein the setting a state of the target liquid crystal cells to a scattering state and setting a state of non-target liquid crystal cells to a transparent state comprises:
    applying a first preset voltage to the target liquid crystal cells to control the state of the target liquid crystal cells to be the scattering state, and applying a second preset voltage to the non-target liquid crystal cells to control the state of the non-target liquid crystal cells to be the transparent state, wherein the first preset voltage is greater than or equal to a preset value, and the second preset voltage is less than the preset value;
    or,
    applying the second preset voltage to the target liquid crystal cells to control the state of the target liquid crystal cells to be the scattering state, and applying the first preset voltage to the non-target liquid crystal cells to control the state of the non-target liquid crystal cells to be the transparent state, wherein the first preset voltage is greater than or equal to a preset value, and the second preset voltage is less than the preset value.
  4. The method according to any one of claims 1 to 3, wherein the liquid crystal film comprises a polymer dispersed liquid crystal film, a bistable liquid crystal film, or a dye-doped liquid crystal film.
  5. The method according to any one of claims 1 to 4, wherein the projection screen comprises a curved screen, or the projection screen comprises a three-dimensional screen.
  6. The method according to any one of claims 1 to 5, wherein the method further comprises:
    tracking a position of human eyes; and
    the determining target liquid crystal cells among the plurality of liquid crystal cells based on positions of pixels in the image to be displayed comprises:
    determining positions of the target liquid crystal cells among the plurality of liquid crystal cells based on the tracked eye position and the positions of the pixels in the image to be displayed.
  7. The method according to claim 6, wherein, if the image to be displayed is a three-dimensional image, the determining positions of the target liquid crystal cells among the plurality of liquid crystal cells based on the tracked eye position and the positions of the pixels in the image to be displayed comprises:
    based on an intersection point of the projection screen with a line connecting the tracked eye position and the position of each pixel in the image to be displayed, determining the liquid crystal cell at the intersection position among the plurality of liquid crystal cells as the target liquid crystal cell.
  8. The method according to any one of claims 1 to 7, wherein the terminal device further comprises a first projection lens, and the displaying a projected image of the image to be displayed on the target liquid crystal cells comprises:
    adjusting a projection region of the first projection lens so that the first projection lens projects an image to be projected of the image to be displayed onto the target liquid crystal cells, so that the projected image of the image to be displayed is displayed on the target liquid crystal cells, wherein a field of view of the first projection lens is less than or equal to a preset threshold.
  9. The method according to any one of claims 1 to 7, wherein the terminal device further comprises a second projection lens, and the displaying a projected image of the image to be displayed on the target liquid crystal cells comprises:
    projecting an image to be projected of the image to be displayed onto the target liquid crystal cells through the second projection lens, so that the projected image of the image to be displayed is displayed on the target liquid crystal cells, wherein a field of view of the second projection lens is greater than a preset threshold.
  10. A display control apparatus, wherein the apparatus is applied to a terminal device, the terminal device further comprises a projection screen, the projection screen comprises a transparent substrate and a liquid crystal film covering the transparent substrate, and the liquid crystal film comprises a plurality of liquid crystal cells; and the apparatus comprises:
    an obtaining unit, configured to obtain an image to be displayed;
    a determining unit, configured to determine target liquid crystal cells among the plurality of liquid crystal cells based on positions of pixels in the image to be displayed;
    a setting unit, configured to set a state of the target liquid crystal cells to a scattering state, and set a state of non-target liquid crystal cells to a transparent state, wherein the non-target liquid crystal cells are liquid crystal cells among the plurality of liquid crystal cells other than the target liquid crystal cells; and
    a control unit, configured to control displaying a projected image of the image to be displayed on the target liquid crystal cells.
  11. The apparatus according to claim 10, wherein the image to be displayed comprises a three-dimensional image, and the projected image of the image to be displayed comprises a two-dimensional image.
  12. The apparatus according to claim 10 or 11, wherein the setting unit is specifically configured to:
    apply a first preset voltage to the target liquid crystal cells to control the state of the target liquid crystal cells to be the scattering state, and apply a second preset voltage to the non-target liquid crystal cells to control the state of the non-target liquid crystal cells to be the transparent state, wherein the first preset voltage is greater than or equal to a preset value, and the second preset voltage is less than the preset value;
    or,
    apply the second preset voltage to the target liquid crystal cells to control the state of the target liquid crystal cells to be the scattering state, and apply the first preset voltage to the non-target liquid crystal cells to control the state of the non-target liquid crystal cells to be the transparent state, wherein the first preset voltage is greater than or equal to a preset value, and the second preset voltage is less than the preset value.
  13. The apparatus according to any one of claims 10 to 12, wherein the liquid crystal film comprises a polymer dispersed liquid crystal film, a bistable liquid crystal film, or a dye-doped liquid crystal film.
  14. The apparatus according to any one of claims 10 to 13, wherein the projection screen comprises a curved screen, or the projection screen comprises a three-dimensional screen.
  15. The apparatus according to any one of claims 10 to 14, wherein the terminal device further comprises a tracking module, and the tracking module is configured to track a position of human eyes; and
    the determining unit is further configured to determine positions of the target liquid crystal cells among the plurality of liquid crystal cells based on the tracked eye position and the positions of the pixels in the image to be displayed.
  16. The apparatus according to claim 15, wherein, if the image to be displayed is a three-dimensional image,
    the determining unit is specifically configured to determine, based on an intersection point of the projection screen with a line connecting the tracked eye position and the position of each pixel in the image to be displayed, the liquid crystal cell at the intersection position among the plurality of liquid crystal cells as the target liquid crystal cell.
  17. The apparatus according to any one of claims 10 to 16, wherein the terminal device further comprises a rotating platform and a first projection lens; and
    the control unit is specifically configured to control the rotating platform to adjust a projection region of the first projection lens, so that the first projection lens projects an image to be projected onto the target liquid crystal cells, so that the projected image of the image to be displayed is displayed on the target liquid crystal cells, wherein a field of view of the first projection lens is less than or equal to a preset threshold.
  18. The apparatus according to any one of claims 10 to 16, wherein the terminal device further comprises a second projection lens; and
    the control unit is specifically configured to control the second projection lens to project an image to be projected onto the target liquid crystal cells, so that the projected image of the image to be displayed is displayed on the target liquid crystal cells, wherein a field of view of the second projection lens is greater than a preset threshold.
  19. A terminal device, wherein the terminal device comprises a projection screen, a memory, and a processor;
    the projection screen comprises a transparent substrate and a liquid crystal film covering the transparent substrate, the liquid crystal film comprises a plurality of liquid crystal cells, each liquid crystal cell has a scattering state and a transparent state, and liquid crystal cells in the scattering state are used to display a projected image; and
    the processor is configured to invoke and run, from the memory, a computer program stored in the memory, so that the processor performs the method according to any one of claims 1 to 9.
  20. A chip system, wherein the chip system comprises a processor; and the processor is configured to invoke and run, from a memory, a computer program stored in the memory, so that the processor performs the method according to any one of claims 1 to 9.
  21. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium; and when the computer program is run on a computer, the computer is caused to perform the method according to any one of claims 1 to 9.
PCT/CN2021/078944 2020-03-20 2021-03-03 Display method and display control apparatus WO2021185085A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/947,427 US20230013031A1 (en) 2020-03-20 2022-09-19 Display method and display control apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010203698.2 2020-03-20
CN202010203698.2A CN113497930A (zh) Display method and display control apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/947,427 Continuation US20230013031A1 (en) 2020-03-20 2022-09-19 Display method and display control apparatus

Publications (1)

Publication Number Publication Date
WO2021185085A1 true WO2021185085A1 (zh) 2021-09-23

Family

ID=77769170

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/078944 WO2021185085A1 (zh) 2020-03-20 2021-03-03 一种显示方法及控制显示的装置

Country Status (3)

Country Link
US (1) US20230013031A1 (zh)
CN (1) CN113497930A (zh)
WO (1) WO2021185085A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105139336A (zh) * 2015-08-19 2015-12-09 北京莫高丝路文化发展有限公司 Method for converting multi-channel panoramic images into dome-screen fisheye films
CN106488208A (zh) * 2017-01-03 2017-03-08 BOE Technology Group Co., Ltd. Display device and display method
CN107894666A (zh) * 2017-10-27 2018-04-10 杭州光粒科技有限公司 Head-mounted multi-depth stereoscopic image display system and display method
US10091482B1 * 2017-08-04 2018-10-02 International Business Machines Corporation Context aware midair projection display
CN109413403A (zh) * 2017-08-15 2019-03-01 Imagination Technologies Limited Single-pass rendering for head-mounted displays

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3108351B2 (ja) * 1995-12-25 2000-11-13 Sanyo Electric Co., Ltd. Projection-type stereoscopic image display device
JP2001305999A (ja) * 2000-04-26 2001-11-02 Nippon Telegr & Teleph Corp <Ntt> Display device
JP3918487B2 (ja) * 2001-07-26 2007-05-23 Seiko Epson Corporation Stereoscopic display device and projection-type stereoscopic display device
JP2006003867A (ja) * 2004-05-20 2006-01-05 Seiko Epson Corporation Image correction amount detection device, drive circuit for electro-optical device, electro-optical device, and electronic apparatus
JP4126564B2 (ja) * 2005-02-14 2008-07-30 Seiko Epson Corporation Image processing system, projector, program, information storage medium, and image processing method
JP4330642B2 (ja) * 2007-04-05 2009-09-16 Mitsubishi Electric Corporation Light diffusing element, screen, and image projection device
WO2014115884A1 (ja) * 2013-01-28 2014-07-31 JVC Kenwood Corporation Projection device, image correction method, and program
US11109015B2 (en) * 2014-09-08 2021-08-31 Sony Corporation Display apparatus, display apparatus driving method, and electronic instrument
CN107148591A (zh) * 2014-11-07 2017-09-08 Sony Corporation Display device and display control method
CN105704475B (zh) * 2016-01-14 2017-11-10 深圳前海达闼云端智能科技有限公司 Three-dimensional stereoscopic display processing method and apparatus for a curved two-dimensional screen
WO2018042413A1 (en) * 2016-08-28 2018-03-08 Siegel Gabriel A system for histological examination of tissue specimens
CN106657951A (zh) * 2016-10-20 2017-05-10 Beijing Xiaomi Mobile Software Co., Ltd. Projection control method and apparatus, mobile device, and projector
CN109076173A (zh) * 2017-11-21 2018-12-21 SZ DJI Technology Co., Ltd. Output image generation method, device, and unmanned aerial vehicle
JP7083102B2 (ja) * 2017-11-29 2022-06-10 Tianma Japan, Ltd. Light beam direction control device and display device


Also Published As

Publication number Publication date
CN113497930A (zh) 2021-10-12
US20230013031A1 (en) 2023-01-19

Similar Documents

Publication Publication Date Title
US9329682B2 (en) Multi-step virtual object selection
JP6433914B2 (ja) 裸眼立体拡張現実ディスプレイ
US10955665B2 (en) Concurrent optimal viewing of virtual objects
US9298012B2 (en) Eyebox adjustment for interpupillary distance
JP5913346B2 (ja) 拡張現実表示のための自動可変仮想焦点
US20130187943A1 (en) Wearable display device calibration
US20140368532A1 (en) Virtual object orientation and visualization
EP4026318A1 (en) Intelligent stylus beam and assisted probabilistic input to element mapping in 2d and 3d graphical user interfaces
CA2914012A1 (en) Shared and private holographic objects
US11353955B1 (en) Systems and methods for using scene understanding for calibrating eye tracking
CN107635132B (zh) Display control method and apparatus for naked-eye 3D display terminal, and display terminal
CN111095348A (zh) Camera-based transparent display
US11488365B2 (en) Non-uniform stereo rendering
CN113724309A (zh) Image generation method, apparatus, device, and storage medium
Bohdal Devices for Virtual and Augmented Reality
WO2021185085A1 (zh) Display method and display control apparatus
TWM497800U (zh) Combined optical lens and optical imaging device having the same
US11477419B2 (en) Apparatus and method for image display
US10852561B2 (en) Display device and method
CN114830011A (zh) Virtual, augmented, and mixed reality systems and methods
US11763517B1 (en) Method and device for visualizing sensory perception
US20240223780A1 (en) Generating tile-based region of interest representation of video frames for video encoding
US20230314846A1 (en) Configurable multifunctional display panel
CN117435041A (zh) Information interaction method and apparatus, electronic device, and storage medium
WO2024147919A1 (en) Generating tile-based region of interest representation of video frames for video encoding

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21770822

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21770822

Country of ref document: EP

Kind code of ref document: A1