WO2024041070A1 - Projection display method, projection device and storage medium - Google Patents

Projection display method, projection device and storage medium

Info

Publication number
WO2024041070A1
Authority
WO
WIPO (PCT)
Prior art keywords
controller
image mode
color
image
target
Prior art date
Application number
PCT/CN2023/097475
Other languages
English (en)
Chinese (zh)
Inventor
陈星
高力波
Original Assignee
青岛海信激光显示股份有限公司
Priority date
Filing date
Publication date
Priority claimed from CN202211030207.4A external-priority patent/CN115396641B/zh
Priority claimed from CN202211031744.0A external-priority patent/CN115396642B/zh
Application filed by 青岛海信激光显示股份有限公司
Publication of WO2024041070A1


Classifications

    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00: Projectors or projection-type viewers; Accessories therefor
    • G03B21/14: Details
    • G03B21/20: Lamp housings
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00: Details of colour television systems
    • H04N9/12: Picture reproducers
    • H04N9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Definitions

  • the present disclosure relates to the technical field of projection display, and in particular, to a projection display method, projection equipment and storage medium.
  • With the continuous development of projection display technology, projection equipment is becoming more and more popular among consumers. Since laser light has good monochromaticity and high brightness, projection equipment usually uses a laser as the light source to display the projected image.
  • a projection device includes a light valve, a first controller, a second controller, a camera device and a detection device.
  • the second controller is electrically connected to the first controller and the light valve, and the second controller is configured to drive the light valve for image display.
  • the camera device is electrically connected to the first controller.
  • the detection device is electrically connected to the first controller, and the detection device is configured to detect the current brightness of ambient light.
  • the first controller is configured to: after receiving a first instruction, control the camera device to turn on and control the detection device to detect the current brightness of the ambient light, where the first instruction instructs the first controller to control the camera device to turn on; determine a target camera image mode according to the correspondence between brightness and camera image modes, where the target camera image mode is the camera image mode corresponding to the current brightness, a camera image mode is an image mode used to display the image data collected by the camera device, the projection device has a plurality of camera image modes, and each of the plurality of camera image modes corresponds to a color gamut; and send first notification information to the second controller according to the target camera image mode, where the first notification information indicates the determined target camera image mode.
  • the second controller is further configured to: call, according to the first notification information, a first characteristic parameter set corresponding to the target camera image mode to process the image data collected by the camera device, and drive the light valve for image display. A plurality of first characteristic parameter sets are prestored in the second controller; each of the plurality of first characteristic parameter sets corresponds to one of the plurality of camera image modes; each first characteristic parameter set includes a plurality of color characteristic parameters, and the color characteristic parameters are located within the color gamut of the corresponding camera image mode.
  • a projection display method is provided.
  • the projection display method is applied to projection equipment.
  • the projection device includes a light valve, a first controller, a second controller, a camera device, and a detection device; the second controller is electrically connected to the first controller and the light valve, and is configured to drive the light valve for image display; the camera device is electrically connected to the first controller; and the detection device is electrically connected to the first controller and is configured to detect the current brightness of the ambient light.
  • the method includes: the first controller receives a first instruction, controls the camera device to turn on, and controls the detection device to detect the current brightness of the ambient light, where the first instruction instructs the first controller to control the camera device to turn on; the first controller determines the target camera image mode according to the correspondence between brightness and camera image modes, where the target camera image mode is the camera image mode corresponding to the current brightness, a camera image mode is an image mode used to display the image data collected by the camera device, the projection device has multiple camera image modes, and each of the multiple camera image modes corresponds to a color gamut; the first controller sends first notification information to the second controller according to the determined target camera image mode, where the first notification information indicates the determined target camera image mode; and the second controller calls the first characteristic parameter set corresponding to the target camera image mode according to the first notification information to process the image data collected by the camera device, and drives the light valve for image display. Multiple first characteristic parameter sets are prestored in the second controller; each of the multiple first characteristic parameter sets corresponds to one of the multiple camera image modes; each first characteristic parameter set includes a plurality of color characteristic parameters, and the color characteristic parameters are located within the color gamut of the corresponding camera image mode.
  • In another aspect, a projection device is provided. The projection device includes a light valve, a first controller and a second controller.
  • the second controller is electrically connected to the first controller and the light valve, and the second controller is configured to drive the light valve for image display.
  • the first controller is configured to: receive a target instruction, where the target instruction indicates switching the image mode; and send target notification information to the second controller according to the received target instruction, where the target notification information indicates the currently switched image mode, the projection device has multiple image modes, and each of the multiple image modes corresponds to a color gamut.
  • the second controller is further configured to: call the target feature parameter set corresponding to the currently switched image mode according to the target notification information to process the current image data, and drive the light valve for image display. The second controller pre-stores multiple target feature parameter sets; each of the multiple target feature parameter sets corresponds to one of the multiple image modes; each target feature parameter set includes a plurality of color characteristic parameters, and the plurality of color characteristic parameters are located within the color gamut of the corresponding image mode.
  • a projection display method is provided.
  • the projection display method is applied to a projection device.
  • the projection device includes a first controller, a second controller and a light valve; the first controller is electrically connected to the second controller, and the second controller is electrically connected to the light valve and is configured to drive the light valve for image display.
  • the method includes: the first controller receives a target instruction, where the target instruction indicates switching to a corresponding image mode for image display; the first controller sends target notification information to the second controller according to the received target instruction, where the target notification information indicates the currently switched image mode, the projection device has multiple image modes, and each of the multiple image modes corresponds to a color gamut; the second controller pre-stores multiple target feature parameter sets, where each of the multiple target feature parameter sets corresponds to one of the multiple image modes, each target feature parameter set includes a plurality of color feature parameters, and the color feature parameters are located within the corresponding color gamut; and the second controller calls the target feature parameter set corresponding to the currently switched image mode according to the target notification information, processes the current image data, and drives the light valve to display the image.
  • a computer-readable storage medium stores computer program instructions. When executed by a computer, the computer program instructions cause the computer to perform one or more steps in the projection display method.
  • Figure 1 is a structural diagram of a projection system according to some embodiments.
  • Figure 2 is a structural diagram of a projection device according to some embodiments.
  • Figure 3 is an optical path diagram of a light source, an optical engine and a lens in a projection device according to some embodiments;
  • Figure 4 is another optical path diagram of the light source, light engine and lens in the projection device according to some embodiments.
  • Figure 5 is an arrangement diagram of tiny reflective lenses in a digital micromirror device according to some embodiments.
  • Figure 6 is a structural diagram of another projection device according to some embodiments.
  • Figure 7 is a structural diagram of another projection system according to some embodiments.
  • Figure 8 is a structural diagram of yet another projection system according to some embodiments.
  • Figure 9 is a schematic diagram of a camera device taking pictures under dark light conditions according to some embodiments.
  • Figure 10 is a chromaticity diagram of various colors according to some embodiments.
  • Figure 11 is a schematic diagram of the hue, gain and saturation functional interface of multiple colors according to some embodiments.
  • Figure 12 is a flow chart of a projection display method according to some embodiments.
  • Figure 13 is a schematic diagram of a remote control of a projection device according to some embodiments.
  • Figure 14 is a structural diagram of another projection device according to some embodiments.
  • Figure 15 is a schematic diagram of a social application interface according to some embodiments.
  • Figure 16 is another flowchart of a projection display method according to some embodiments.
  • Figure 17 is a flow chart of a method for determining the value of the color characteristic parameter corresponding to each color in the first characteristic parameter set according to some embodiments.
  • Figure 18 is a schematic diagram of image data flow according to some embodiments.
  • Figure 19 is a schematic diagram of a projection screen according to some embodiments.
  • Figure 20 is another flowchart of a projection display method according to some embodiments.
  • Figure 21 is a color coordinate diagram of a projection device according to some embodiments.
  • Figure 22 is a structural diagram of yet another projection device according to some embodiments.
  • Figure 23 is yet another flowchart of a projection display method according to some embodiments.
  • Figure 24 is another schematic diagram of a remote control of a projection device according to some embodiments.
  • Figure 25 is a structural diagram of yet another projection device according to some embodiments.
  • Figure 26 is another schematic diagram of a menu interface according to some embodiments.
  • Figure 27 is yet another flowchart of a projection display method according to some embodiments.
  • Figure 28 is a flow chart of a method for determining the value of the color feature parameter corresponding to each color in the target feature parameter set according to some embodiments.
  • Figure 29 is a plot of image modes versus color gamut, according to some embodiments.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the quantity of indicated technical features. Therefore, features defined as "first" and "second" may explicitly or implicitly include one or more of these features.
  • The term "connection" should be understood in a broad sense. For example, a connection can be a fixed connection, a detachable connection, or an integrated connection; it can be a direct connection or an indirect connection through an intermediate medium.
  • "At least one of A, B, or C" includes the following combinations of A, B, and C: A only, B only, C only, a combination of A and B, a combination of A and C, a combination of B and C, and a combination of A, B, and C.
  • Figure 1 is a structural diagram of a projection system according to some embodiments.
  • the projection system 1 includes a projection device 100 and a projection screen 200 .
  • FIG. 2 is a structural diagram of a projection device according to some embodiments.
  • the projection device 100 includes a whole-machine housing 40 (only part of the housing 40 is shown in FIG. 2), and a light source 10, an optical engine 20, and a lens 30 assembled in the housing 40.
  • the light source 10 is configured to provide an illumination beam (laser beam).
  • the optical engine 20 is configured to modulate the illumination beam provided by the light source 10 using an image signal to obtain a projection beam.
  • Lens 30 is configured to project the projection beam onto a screen or wall.
  • the light source 10, the optical engine 20 and the lens 30 are connected in sequence along the direction of light beam propagation, and each is wrapped by a corresponding housing.
  • the respective housings of the light source 10, the optical engine 20 and the lens 30 support the corresponding optical components and enable each optical component to meet certain sealing or airtight requirements.
  • Figure 3 is an optical path diagram of a light source, an optical engine and a lens in a projection device according to some embodiments.
  • one end of the light engine 20 is connected to the light source 10 , and the light source 10 and the light engine 20 are arranged along the emission direction of the illumination beam of the projection device 100 (refer to the M direction in FIG. 3 ).
  • the other end of the optical engine 20 is connected to the lens 30 , and the optical engine 20 and the lens 30 are arranged along the emission direction of the projection light beam of the projection device 100 (refer to the N direction in FIG. 3 ).
  • the emission direction M of the illumination beam is approximately perpendicular to the emission direction N of the projection beam.
  • On the one hand, this connection structure can accommodate the optical path characteristics of the reflective light valve in the optical engine 20; on the other hand, it helps shorten the optical path length in a single dimension, which benefits the structural layout of the whole device. If the light source 10, the optical engine 20 and the lens 30 were all arranged along one dimension (for example, the M direction), the optical path in that dimension would be very long, which is not conducive to the structural arrangement of the whole device.
  • the reflective light valve will be described later.
  • The light source 10 can provide light of the three primary colors in a time-sequential manner (light of other colors can also be added on top of the three primary colors). Due to the persistence of vision of the human eye, the human eye perceives the white light formed by mixing the three primary colors. Alternatively, the light source 10 can output the three primary colors simultaneously to continuously emit white light. Alternatively, the projection device 100 may use a monochromatic light source combined with a fluorescent wheel for time-sequential display.
  • the light source 10 includes a light emitting diode (Light Emitting Diode, LED), an electroluminescence (Electro-Luminescence, EL) device, a laser, etc. Since the laser beam emitted by the laser has good monochromaticity, high color purity, high brightness, and good directivity, the laser is widely used as the light source of the projection device 100 .
  • The projection device 100 in which a laser serves as the light source 10 may be called a laser projection device.
  • the laser projection equipment uses red laser, green laser and blue laser as light sources, the red laser emits a red laser beam, the green laser emits a green laser beam, and the blue laser emits a blue laser beam.
  • Laser projection equipment realizes image display through three-color laser beams, which can obtain a larger color gamut and have better color expression.
  • Figure 4 is another optical path diagram of a light source, an optical engine and a lens in a projection device according to some embodiments.
  • the optical engine 20 includes a light pipe 210 , a reflector 220 , a lens assembly 230 , a prism assembly 240 and a light valve 250 .
  • the light pipe 210 can receive the illumination beam provided by the light source 10 and homogenize the illumination beam.
  • the outlet of the light pipe 210 can be rectangular, thereby having a shaping effect on the light spot.
  • Reflector 220 may reflect the illumination beam to lens assembly 230 .
  • Lens assembly 230 may focus the illumination beam onto prism assembly 240.
  • the prism assembly 240 reflects the illumination beam to the light valve 250 , the light valve 250 modulates the illumination beam to obtain a projection beam, and reflects the projection beam into the lens 30 .
  • the light pipe 210 can also be replaced by a fly-eye lens or other components with a light-homogenizing function, which is not limited in this disclosure.
  • the light valve 250 uses image signals to modulate the illumination beam provided by the light source 10 , that is, to control the projection beam to display different brightness and gray scale for different pixels of the image to be displayed, so as to finally form an optical image.
  • the light modulation device can be divided into a transmissive light modulation device or a reflective light modulation device.
  • the Digital Micromirror Device (DMD) 250A shown in Figure 4 reflects the illumination beam, which is a reflective light modulation device.
  • the liquid crystal light valve transmits the illumination beam, so it is a transmissive light modulation device.
  • the optical engine 20 can be divided into a single-chip system, a dual-chip system, or a three-chip system.
  • Figure 5 is an arrangement diagram of micro reflective lenses in a digital micromirror device according to some embodiments.
  • the light valve 250 in some embodiments of the present disclosure is a digital micromirror device 250A.
  • the digital micromirror device 250A includes thousands of tiny reflective mirrors 2501 that can be driven individually to rotate. These tiny reflective mirrors 2501 are arranged in an array.
  • Each tiny reflective mirror 2501 corresponds to a pixel in the projection image to be displayed.
  • the image signal can be converted into digital codes such as 0 and 1 after processing.
  • the tiny reflective mirror 2501 can swing.
  • the grayscale of each pixel in a frame of image is achieved by controlling the duration of each tiny reflective mirror 2501 in the on state and off state respectively.
  • the digital micromirror device 250A can modulate the illumination beam to display the projection image.
  • The on state of a tiny reflective mirror 2501 is the state in which the mirror is positioned, and can remain, such that the illumination beam emitted by the light source 10 is reflected by the mirror 2501 into the lens 30.
  • The off state of a tiny reflective mirror 2501 is the state in which the mirror is positioned, and can remain, such that the illumination beam emitted by the light source 10 is reflected by the mirror 2501 away from the lens 30 and does not enter it.
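  • As a minimal illustration of the grayscale principle described above, the following sketch converts an 8-bit gray level into the fraction of a frame a mirror spends in the on state. Binary-weighted bit planes are assumed here for the example; the actual timing scheme of a DMD controller is not specified in this disclosure.

```python
def mirror_on_time_fraction(gray_level: int, bit_depth: int = 8) -> float:
    """Fraction of a frame a tiny reflective mirror stays in the on state
    for a given gray level, assuming binary-weighted bit planes."""
    if not 0 <= gray_level < 2 ** bit_depth:
        raise ValueError("gray level out of range")
    # Bit plane k lasts a duration proportional to 2**k; the mirror is "on"
    # during plane k only if bit k of the gray level is set.
    on_time = sum(2 ** k for k in range(bit_depth) if gray_level & (1 << k))
    return on_time / (2 ** bit_depth - 1)

# Example: mid-gray (128) keeps the mirror on for roughly half the frame.
print(mirror_on_time_fraction(128))  # ~0.502
```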
  • the light pipe 210, the reflector 220 and the lens assembly 230 at the front end of the digital micromirror device 250A form an illumination light path.
  • the illumination beam emitted by the light source 10 passes through the illumination light path and forms a beam size and incident angle that meet the requirements of the digital micromirror device 250A.
  • the projection screen 200 is spaced apart from the projection device 100, and the projection screen 200 is configured to receive the projection beam emitted from the projection device 100 for image display.
  • the projection screen 200 can be a curtain, a wall, a front windshield of a car, a window of an exhibition cabinet, etc., which is not limited in this disclosure.
  • the following description takes the projection device 100 as a laser projection device, which uses red, green, and blue lasers as the light source 10, and the light valve 250 in the projection device 100 uses a DMD as an example.
  • Figure 6 is a structural diagram of another projection device according to some embodiments.
  • the projection device 100 further includes a first controller 11 , a second controller 12 and a camera device 13 .
  • the light valve 250 is located on the light-exit side of the light source 10, and the lens 30 is located on the path of the projection beam emitted from the light valve 250.
  • the first controller 11 is electrically connected to the second controller 12 , and the first controller 11 is configured to process the received image data and send the processed image data and corresponding instructions to the second controller 12 .
  • the second controller 12 is electrically connected to the light valve 250 and is configured to drive the light valve 250 to display images.
  • the first controller 11 receives image data and decodes the received image data. For example, the first controller 11 decodes the received image data into a low-voltage differential signal (Low-Voltage Differential Signaling, LVDS). The first controller 11 sends the decoded image data to the second controller 12. The second controller 12 receives and processes the decoded image data into a driving signal, and drives the light valve 250 for image display according to the driving signal.
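  • The division of labour between the two controllers can be sketched roughly as follows; the class and method names, and the toy frame format, are assumptions for illustration rather than the actual firmware interfaces of the device.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DecodedFrame:
    """Stand-in for the decoded (e.g. LVDS) image data from the first controller."""
    pixels: List[List[Tuple[int, int, int]]]  # rows of (R, G, B) values

class FirstController:
    """Hypothetical SoC side: decodes received image data and forwards it."""
    def decode(self, raw_bytes: bytes, width: int, height: int) -> DecodedFrame:
        it = iter(raw_bytes)
        rows = [[(next(it), next(it), next(it)) for _ in range(width)]
                for _ in range(height)]
        return DecodedFrame(rows)

class SecondController:
    """Hypothetical DLP-side chip: turns a decoded frame into a drive step."""
    def drive_light_valve(self, frame: DecodedFrame) -> None:
        # In the real device this becomes a DMD driving signal; here we only
        # report the frame geometry that would be displayed.
        print(f"driving light valve with a "
              f"{len(frame.pixels)}x{len(frame.pixels[0])} frame")

# Usage: the first controller decodes raw data, the second displays it.
raw = bytes(range(2 * 2 * 3))  # a tiny 2x2 RGB frame
frame = FirstController().decode(raw, width=2, height=2)
SecondController().drive_light_valve(frame)
```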
  • the first controller 11 can be a system on chip (SoC) in the projection device 100, and the second controller 12 can be a chip that controls the light valve 250 in a digital light processing (DLP) projection architecture.
  • the camera device 13 is electrically connected to the first controller 11, and the first controller 11 can control the camera device 13 to turn on or off.
  • the camera device 13 is electrically connected to the first controller 11 through a connecting wire or an adapter board, or is communicatively connected to the first controller 11 through Bluetooth.
  • the imaging device 13 includes a photosensitive element. The photosensitive element is used to photoelectrically convert the captured image light to convert the optical signal into an electrical signal that can be transmitted in the circuit, thereby generating image data.
  • the camera device 13 collects image data and sends the image data to the first controller 11 for image display.
  • Figure 7 is a structural diagram of another projection system according to some embodiments.
  • Figure 8 is a structural diagram of yet another projection system according to some embodiments.
  • the camera device 13 is installed on the projection screen 200 ; or, as shown in FIG. 8 , the projection device 100 further includes a housing 100A, and the camera device 13 is disposed on the housing 100A.
  • the camera device 13 can also be disposed at other positions to adapt to corresponding usage scenarios.
  • the camera device 13 is located in the housing 100A, as long as the camera device 13 can collect ambient light.
  • the projection device 100 further includes a detection device 14 .
  • the detection device 14 is electrically connected to the first controller 11 and is configured to detect the brightness of the ambient light during the display of the image and send the brightness of the ambient light to the first controller 11 .
  • the detection device 14 and the camera device 13 are integrated.
  • the detection device 14 may be integrated on the imaging device 13 , or the imaging device 13 may be integrated on the detection device 14 .
  • the detection device 14 is disposed close to the camera device 13 to accurately detect the brightness of the ambient light taken in by the camera device 13 .
  • the detection device 14 can also be provided separately from the camera device 13, and the detection device 14 can be integrated on the body of the projection device 100, or on the projection screen 200, which is not limited in this disclosure.
  • Figure 9 is a schematic diagram of a camera device taking pictures under dark light conditions according to some embodiments.
  • the imaging device 13 is disposed on the side (eg, the upper side) of the projection screen 200 away from the housing 100A.
  • the imaging device 13 uses the first light L1, the second light L2, and the third light L3 as fill light.
  • the first light L1 is natural light in the surrounding environment, and the second light L2 and the third light L3 are respectively laser light emitted from the projection device 100 .
  • Under dark-light conditions, the second light L2 and the third light L3 account for more of the fill light than the first light L1.
  • The photosensitive element in the camera device 13 is relatively sensitive to the laser light emitted by the projection device 100, especially the red laser light, and the red laser light exceeds the color correction range of the camera device 13 itself; without color correction, the displayed image appears reddish and the display effect is poor.
  • some embodiments of the present disclosure provide a projection display method. This method is applied to the projection device 100 described above.
  • Figure 10 is a chromaticity diagram of various colors in accordance with some embodiments.
  • red, green, and blue can mix to form white.
  • Red and blue can be mixed to form magenta
  • blue and green can be mixed to form cyan
  • green and red can be mixed to form yellow.
  • magenta coordinate point M is located on the line between the red coordinate point R and the blue coordinate point B.
  • the cyan coordinate point C is located on the connection line between the blue coordinate point B and the green coordinate point G; the yellow coordinate point Y is located on the connection line between the red coordinate point R and the green coordinate point G. .
  • In addition, the magenta coordinate point M is located on the extension of the line connecting the green coordinate point G and the white coordinate point W; the cyan coordinate point C is located on the extension of the line connecting the red coordinate point R and the white coordinate point W; and the yellow coordinate point Y is located on the extension of the line connecting the blue coordinate point B and the white coordinate point W.
  • Figure 11 is a schematic diagram of a functional interface for hue, gain, and saturation of multiple colors according to some embodiments.
  • each color includes color characteristic parameters such as hue (Hue, H), gain (Gain, G), and saturation (Saturation, S).
  • the second controller 12 can adjust the hue H, gain G, and saturation S of each color to adjust the color gamut of the displayed image.
  • the picture displayed by the projection device 100 includes a functional interface (referred to as the HSG functional interface) composed of the hue H, the gain G, and the saturation S of each color.
  • Users can call up the HSG functional interface through an external device and adjust the color gamut of the displayed image by adjusting the hue H, gain G and saturation S of the corresponding color in the HSG functional interface.
  • the above process may be called the HSG function of the second controller 12 .
  • the above-mentioned process of adjusting the color gamut of the displayed image can be performed in advance, and the adjusted color feature parameter values are set to fixed values so that the user can directly call them to display the image corresponding to the color gamut. For example, by adjusting the hue, gain, and saturation corresponding to red, green, blue, cyan, yellow, magenta, and white, the color coordinates of the corresponding white point and the corresponding color gamut can be determined. In this way, the second controller 12 processes the image data according to the corresponding color gamut determined by each adjusted color, and drives the light valve 250 to display the image to obtain a corresponding display effect.
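  • To make the role of an HSG preset concrete, the sketch below models a per-color (hue, saturation, gain) triple and applies it to a pixel before display. This is a simplified software analogy with made-up preset values; the actual HSG adjustment runs on the DLP chip and is not reproduced here.

```python
import colorsys

# Hypothetical HSG preset: per-color (hue offset, saturation scale, gain scale).
HSG_PRESET = {
    "red":   (0.0, 1.0, 0.9),  # illustrative values only
    "green": (0.0, 1.0, 1.0),
    "blue":  (0.0, 1.0, 1.0),
}

def apply_hsg(rgb, hue_offset, sat_scale, gain_scale):
    """Apply a hue/saturation/gain adjustment to one RGB pixel (components in 0..1)."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    h = (h + hue_offset) % 1.0
    s = min(1.0, s * sat_scale)
    v = min(1.0, v * gain_scale)
    return colorsys.hsv_to_rgb(h, s, v)

# Example: damp the gain of a saturated red pixel with the "red" preset.
print(apply_hsg((1.0, 0.0, 0.0), *HSG_PRESET["red"]))  # (0.9, 0.0, 0.0)
```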
  • projection equipment includes camera devices to meet user needs for functions such as video calls.
  • the controller of the projection device does not perform color processing on the image data collected by the camera device.
  • the camera device will use ambient light to provide fill light.
  • the camera device is more sensitive to the light of the laser light source, especially the red laser light, which will cause the overall display image of the projection device to be reddish, affecting the display effect of the projection device.
  • some embodiments of the present disclosure provide a projection display method.
  • Figure 12 is a flow chart of a projection display method according to some embodiments.
  • the method includes steps 101 to 104.
  • In step 101, the first controller 11 receives the first instruction, controls the camera device 13 to turn on, and controls the detection device 14 to detect the current brightness of the ambient light.
  • the first instruction instructs the first controller 11 to control the camera device 13 to turn on.
  • In step 102, the first controller 11 determines the target captured image mode according to the corresponding relationship between the brightness and the captured image mode.
  • the target captured image mode is a captured image mode corresponding to the current brightness.
  • In step 103, the first controller 11 sends first notification information to the second controller 12 according to the target camera image mode.
  • In step 104, the second controller 12 calls the first characteristic parameter set corresponding to the determined target camera image mode according to the first notification information to process the image data collected by the camera device 13, and drives the light valve 250 for image display.
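  • Putting steps 101 to 104 together, the control flow can be sketched as below. All names, thresholds and parameter values are assumptions made for the example; the correspondence table and the parameter sets stand in for the data prestored in the first and second controllers respectively.

```python
# Prestored in the second controller: one first characteristic parameter set
# per camera image mode (only the red entry is shown, values are made up).
FIRST_CHARACTERISTIC_PARAMETER_SETS = {
    "first_camera_image_mode":  {"red": (0.0, 1.0, 0.9)},   # (hue, sat, gain)
    "second_camera_image_mode": {"red": (0.0, 1.0, 0.8)},
}

# Prestored in the first controller: brightness range -> camera image mode.
BRIGHTNESS_TO_MODE = {
    (100.0, float("inf")): "first_camera_image_mode",
    (0.0, 100.0):          "second_camera_image_mode",
}

def step_102_determine_target_mode(current_brightness: float) -> str:
    """Step 102: find the camera image mode whose brightness range contains
    the brightness reported by the detection device."""
    for (low, high), mode in BRIGHTNESS_TO_MODE.items():
        if low <= current_brightness < high:
            return mode
    raise ValueError("brightness outside the prestored correspondence")

def step_104_process_and_display(first_notification: str, image_data):
    """Step 104: the second controller calls the parameter set named in the
    first notification information and, in the real device, uses it to
    process the camera image data before driving the light valve 250."""
    parameter_set = FIRST_CHARACTERISTIC_PARAMETER_SETS[first_notification]
    # ... color-gamut conversion of image_data with parameter_set goes here.
    return parameter_set

# Steps 101 and 103: the first instruction turns on the camera device, the
# detection device reports the brightness, and the chosen mode is sent to
# the second controller as the first notification information.
current_brightness = 35.0
target_mode = step_102_determine_target_mode(current_brightness)
step_104_process_and_display(target_mode, image_data=None)
```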
  • the projection device 100 can meet the needs of various scenes through the camera device 13 .
  • users make video calls with other users through the projection device 100.
  • the projection display has a larger display area and can provide the user with an immersive call experience.
  • Users can conduct remote meetings with other users through remote conferencing applications, and can display meeting materials while making video calls through a small window; they can also conduct remote learning through educational applications, in which the teacher can grasp the students' learning status through video and enhance interaction with the students through video calls.
  • The user's movements can also be captured through the camera device 13. For example, when the user plays a dance motion-sensing game, the camera device 13 captures the user's dance movements, which can be scored through limb detection and tracking, detection of human skeleton key-point data, and the like, while the user observes his or her own movements through a small window and adjusts them accordingly. The user's image can also be captured by the imaging device 13 for use as a mirror.
  • the projection device 100 can also meet the needs of other scenarios through the camera device 13 to achieve more or fewer functions, which is not limited by the present disclosure.
  • the user can send the first instruction to the first controller 11, and the first controller 11 controls the camera device 13 to turn on in response to the first instruction, thereby realizing the function of at least one of the above multiple scenarios.
  • Figure 13 is a schematic diagram of a remote control of a projection device according to some embodiments.
  • Figure 14 is a structural diagram of another projection device according to some embodiments.
  • Figure 15 is a schematic diagram of a social application interface according to some embodiments.
  • the user may send the first instruction to the first controller 11 through an external device.
  • External devices include the remote control, buttons on the projection device 100, and other control devices that can send control instructions.
  • the projection system 1 further includes a remote control 60 .
  • the remote control 60 and the projection device 100 can communicate through an infrared communication protocol, a Bluetooth communication protocol, a ZigBee communication protocol or other short-distance communication methods.
  • the user can send instructions to the projection device 100 by pressing buttons on the remote control 60, thereby controlling the projection device 100 to perform corresponding operations.
  • the remote controller 60 includes a first button, and the first button may be the “camera on/off” button in FIG. 13 .
  • the remote control 60 sends the first instruction to the projection device 100.
  • the first controller 11 controls the camera device 13 to turn on.
  • the remote control 60 sends a second instruction to the projection device 100.
  • the first controller 11 controls the camera device 13 to turn off.
  • the second instruction is used to instruct the first controller 11 to control the camera device 13 to turn off.
  • the projection device 100 has a voice recognition function, and the user can control the camera device 13 to turn on or off through voice input or other methods, which is not limited in this disclosure.
  • the projection device 100 is provided with multiple buttons.
  • The plurality of buttons includes the first button. After the user presses the first button, the manner in which the projection device 100 executes the instruction is similar to the above and will not be repeated here.
  • When the user needs to look in the mirror through the projection device 100, he or she can turn on the camera device 13 through the first button on the remote control 60 or the first button on the projection device 100. The camera device 13 then collects portrait data, which is projected for display, and the user can tidy his or her clothing according to the displayed content.
  • the external device may be a smart device, such as a mobile terminal, a tablet, a computer, a laptop, etc.
  • the external device can communicate with the projection device 100 through various methods such as network, infrared, and data lines, and send instructions through various methods such as buttons, voice input, gesture input, etc., which is not limited by this disclosure.
  • the user can control the camera device 13 to open by selecting a camera function such as taking photos or videos in the application software.
  • For example, social applications provide options such as taking photos or making video calls. After the user selects such an option, the first controller 11 receives the first instruction and thereby controls the camera device 13 to turn on.
  • the user can accept the video call invitation sent by other users, thereby controlling the camera device 13 to open and enter the video call, which is not limited by this disclosure.
  • the foregoing description mainly takes as an example the first controller 11 controlling the camera device 13 to turn on after receiving the first instruction in various scenarios.
  • The first controller 11 can also control the camera device 13 to turn on under various other conditions, which is not limited by this disclosure.
  • After the first controller 11 controls the camera device 13 to turn on in response to the first instruction, the first controller 11 controls the detection device 14 to detect the current brightness of the ambient light.
  • the detection device 14 sends the detected current brightness to the first controller 11 .
  • the corresponding relationship between brightness and camera image mode may be preset and stored in the first controller 11 .
  • the first controller 11 may determine the target camera image mode according to the current brightness and the corresponding relationship between the brightness and the camera image mode.
  • the target camera image mode is a camera image mode corresponding to the current brightness.
  • The camera image mode is an image mode used to display the image data collected by the camera device 13.
  • One camera image mode corresponds to one color gamut, and the color gamut of each camera image mode can be adjusted according to the characteristics of images captured by the camera device 13 under the corresponding brightness conditions. In this way, the captured image can be displayed using the corresponding camera image mode under different brightness conditions so that the image colors are balanced.
  • After determining the camera image mode corresponding to the current brightness (the target camera image mode), the first controller 11 sends the first notification information to the second controller 12 according to the target camera image mode.
  • the first notification information indicates the determined target captured image mode.
  • a plurality of first feature parameter sets are prestored in the second controller 12, and one first feature parameter set corresponds to one camera image mode.
  • the first characteristic parameter set includes a plurality of color characteristic parameters, and the plurality of color characteristic parameters are located in the corresponding color gamut.
  • the second controller 12 calls the first characteristic parameter set corresponding to the target camera image mode according to the first notification information to process the image data collected by the camera device 13, and drives the light valve 250 to display the image captured by the imaging device 13.
  • The second controller 12 processes the image data collected by the camera device 13 into a driving signal, and drives the light valve 250 according to the driving signal to display the image.
  • In this way, the detection device 14 can detect the current brightness of the ambient light, and the first controller 11 determines the target camera image mode based on the current brightness.
  • The second controller 12 calls the corresponding color feature parameter set (such as the corresponding first characteristic parameter set) according to the target camera image mode and performs color gamut conversion on the image data collected by the camera device 13, so that under different brightness conditions the image captured by the camera device 13 is displayed with the corresponding color gamut.
  • That is, the HSG function of the second controller 12 is used to display the image captured by the imaging device 13 with a color gamut matched to the ambient brightness, and thus the image captured by the imaging device 13 and displayed by the projection device 100 can be color balanced.
  • Figure 16 is another flowchart of a projection display method according to some embodiments.
  • the method includes steps 201 to 204.
  • In step 201, the first controller 11 receives the first instruction, controls the camera device 13 to turn on, and controls the detection device 14 to detect the current brightness of the ambient light.
  • In step 202, the detection device 14 determines whether the current brightness is greater than or equal to the preset brightness threshold. If "Yes", step 203 is performed; if "No", step 204 is performed.
  • The preset brightness threshold can be set to a fixed value in advance and stored in the first controller 11. It should be noted that the preset brightness threshold can be determined based on the shooting effects of the imaging device 13 under different brightness conditions. For example, the brightness at which a color deviation visible to the human eye occurs in the image captured by the imaging device 13 without color adjustment is used as the preset brightness threshold. Of course, another brightness can also be set as the preset brightness threshold; the present disclosure does not limit this.
  • In step 203, the first controller 11 determines that the target captured image mode is the first captured image mode.
  • the camera image mode includes a first camera image mode and a second camera image mode.
  • If the current brightness is greater than or equal to the preset brightness threshold, the first controller 11 determines that the target camera image mode is the first camera image mode. In this way, when the brightness of the ambient light is high, the image captured by the imaging device 13 can be displayed in the first captured image mode.
  • In step 204, the first controller 11 determines that the target captured image mode is the second captured image mode.
  • If the current brightness is less than the preset brightness threshold, the first controller 11 determines that the target camera image mode is the second camera image mode. In this way, when the brightness of the ambient light is low, the image captured by the imaging device 13 can be displayed in the second captured image mode.
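  • Steps 202 to 204 reduce to a single comparison against the preset brightness threshold. A minimal sketch, assuming a hypothetical threshold value and mode names (the disclosure fixes neither):

```python
PRESET_BRIGHTNESS_THRESHOLD = 100.0  # hypothetical value, e.g. in lux

def select_camera_image_mode(current_brightness: float) -> str:
    """Steps 202-204: choose the first mode for bright environments and the
    second mode for dark environments."""
    if current_brightness >= PRESET_BRIGHTNESS_THRESHOLD:
        return "first_camera_image_mode"   # step 203
    return "second_camera_image_mode"      # step 204

print(select_camera_image_mode(250.0))  # first_camera_image_mode
print(select_camera_image_mode(20.0))   # second_camera_image_mode
```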
  • the multiple color characteristic parameters of the first characteristic parameter set respectively correspond to multiple colors.
  • the plurality of colors include at least red, green, blue, cyan, magenta, yellow and white.
  • a color gamut can be determined.
  • The plurality of color characteristic parameters of the first characteristic parameter set may also include color characteristic parameters of multiple transition colors between red, green, blue, cyan, magenta, yellow and white, which is not limited by this disclosure. It should be noted that the more colors whose characteristic parameters are included in the characteristic parameter set, and the finer the adjustment, the better the display effect of the projection device 100.
  • the gain of the red color in the first feature parameter set corresponding to the first camera image mode can be set to be greater than the gain of the red color corresponding to the second camera image mode.
  • the gain of green in the first feature parameter set corresponding to the first captured image mode is smaller than the gain of green in the first feature parameter set corresponding to the second captured image mode.
  • the gain of blue in the first feature parameter set corresponding to the first captured image mode is smaller than the gain of blue in the first feature parameter set corresponding to the second captured image mode.
  • the hue, saturation and gain corresponding to multiple colors can be adjusted respectively according to the characteristics of images captured by different imaging devices 13, which is not limited by the present disclosure.
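  • As a purely illustrative numeric example of the gain relationships described above (the disclosure does not specify concrete values), two first characteristic parameter sets could look like the following, with the red gain larger in the first camera image mode and the green and blue gains smaller:

```python
# Illustrative (hue, saturation, gain) values only, chosen so that:
#   red gain:   first mode > second mode
#   green gain: first mode < second mode
#   blue gain:  first mode < second mode
FIRST_MODE_SET = {
    "red":   (0.0, 1.0, 0.95),
    "green": (0.0, 1.0, 0.85),
    "blue":  (0.0, 1.0, 0.85),
}
SECOND_MODE_SET = {
    "red":   (0.0, 1.0, 0.80),  # red is damped more in dark environments
    "green": (0.0, 1.0, 0.95),
    "blue":  (0.0, 1.0, 0.95),
}

assert FIRST_MODE_SET["red"][2]   > SECOND_MODE_SET["red"][2]
assert FIRST_MODE_SET["green"][2] < SECOND_MODE_SET["green"][2]
assert FIRST_MODE_SET["blue"][2]  < SECOND_MODE_SET["blue"][2]
```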
  • Figure 17 is a flow chart of a method for determining the value of the color characteristic parameter corresponding to each color in the first characteristic parameter set according to some embodiments.
  • the method includes steps 301 to 306.
  • In step 301, the second controller 12 determines the initial values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white respectively.
  • First, the second controller 12 needs to determine the initial values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white. For example, the hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white that satisfy the preset color gamut of the projection device 100 are used as the initial values.
  • For example, the initial value of hue is set to 0, the initial value of saturation is set to 1, and the initial value of gain is set to 1. The initial (hue, saturation, gain) values of the seven colors that satisfy the preset color gamut of the projection device 100 are then: red R0(0,1,1), green G0(0,1,1), blue B0(0,1,1), cyan C0(0,1,1), magenta M0(0,1,1), yellow Y0(0,1,1) and white W0(0,1,1).
  • the preset color gamut of the projection device 100 is the largest color gamut used by the projection device in displaying images.
  • In step 302, the detection device 14 determines whether the current brightness of the ambient light is greater than or equal to the preset brightness threshold. If "Yes", steps 303 to 304 are performed; if "No", steps 305 to 306 are performed.
  • the detection device 14 can determine whether the current brightness of the ambient light is greater than or equal to the preset brightness threshold. Of course, the present disclosure is not limited thereto.
  • Alternatively, the first controller 11 can also determine whether the current brightness of the ambient light is greater than or equal to the preset brightness threshold.
  • In step 303, the second controller 12 sequentially adjusts the hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white within the set adjustment interval, and the values of hue, saturation and gain for these colors that satisfy the color gamut corresponding to the first camera image mode are determined as the set values of the first characteristic parameter set corresponding to the first camera image mode.
  • For example, the (hue, saturation, gain) values of the seven colors that satisfy the color gamut corresponding to the first camera image mode are: red R1(HR1, SR1, GR1), green G1(HG1, SG1, GG1), blue B1(HB1, SB1, GB1), cyan C1(HC1, SC1, GC1), magenta M1(HM1, SM1, GM1), yellow Y1(HY1, SY1, GY1) and white W1(HW1, SW1, GW1).
  • In step 304, the second controller 12 determines, through linear interpolation, the hue, saturation and gain corresponding to the transition colors between red, green, blue, cyan, magenta, yellow and white, thereby determining the value of the color characteristic parameter corresponding to each color in the first characteristic parameter set corresponding to the first camera image mode.
  • That is, after determining the set values of the first characteristic parameter set corresponding to the first camera image mode, the second controller 12 obtains the values of hue, saturation and gain of the transition colors between these seven colors by linear interpolation, thereby completing the values of the color characteristic parameters corresponding to the multiple colors in the first characteristic parameter set.
  • In step 305, the second controller 12 sequentially adjusts the hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white within the set adjustment interval, and the values of hue, saturation and gain for these colors that satisfy the color gamut corresponding to the second camera image mode are determined as the set values of the first characteristic parameter set corresponding to the second camera image mode.
  • For example, the (hue, saturation, gain) values corresponding to each color are: red R2(HR2, SR2, GR2), green G2(HG2, SG2, GG2), blue B2(HB2, SB2, GB2), cyan C2(HC2, SC2, GC2), magenta M2(HM2, SM2, GM2), yellow Y2(HY2, SY2, GY2) and white W2(HW2, SW2, GW2).
  • In step 306, the second controller 12 determines, through linear interpolation, the hue, saturation and gain corresponding to the transition colors between red, green, blue, cyan, magenta, yellow and white, thereby determining the value of the color characteristic parameter corresponding to each color in the first characteristic parameter set corresponding to the second camera image mode.
  • Linear interpolation can be performed automatically by a built-in program of the second controller 12, or it can be performed by an external program and the resulting characteristic parameter set imported into the second controller 12 for storage; this disclosure does not limit this.
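  • The linear interpolation of steps 304 and 306 can be pictured as interpolating each HSG component of a transition color between its two neighbouring anchor colors. The helper below is one possible way to write it, with made-up anchor values; the disclosure only states that linear interpolation is used, not how it is implemented (hue wrap-around, for instance, is ignored in this sketch).

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b, with 0 <= t <= 1."""
    return a + (b - a) * t

def interpolate_transition_color(anchor_a, anchor_b, t):
    """Interpolate the (hue, saturation, gain) triple of a transition color
    lying a fraction t of the way from anchor_a to anchor_b."""
    return tuple(lerp(x, y, t) for x, y in zip(anchor_a, anchor_b))

# Example: a transition color halfway between red and yellow, using
# hypothetical first-mode anchor values R1 and Y1.
red_R1    = (0.02, 0.95, 0.90)
yellow_Y1 = (0.15, 0.90, 1.00)
print(interpolate_transition_color(red_R1, yellow_Y1, 0.5))
# -> (0.085, 0.925, 0.95)
```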
  • The above description takes the case in which the camera image modes include two image modes as an example. Of course, the camera image modes may also include three or more image modes, in which case the second controller 12 stores a corresponding number of first characteristic parameter sets; this is not limited by the present disclosure.
  • Figure 18 is a schematic diagram of image data flow according to some embodiments.
  • the image data received by the first controller 11 also includes first data and second data.
  • the first data may be multimedia data.
  • For example, the first data includes High Definition Multimedia Interface (HDMI) video data, analog television (ATV) video data, digital television (DTV) video data, video data input through a Universal Serial Bus (USB) interface, and the like.
  • the first data can be input to the first controller 11 through the network, antenna, closed-circuit television, memory card, etc.
  • the second data is image data (such as menu data) generated by an image generator inside the projection device 100 .
  • the first data corresponds to a first image (ie, a multimedia image), and the second data corresponds to a second image (ie, a menu image).
  • the first image and the second image will be described later.
  • the first controller 11 continues to receive image data, decodes the image data, and sends it to the second controller 12 .
  • the first controller 11 decodes image data in different formats into low-voltage differential signals (Low-Voltage Differential Signaling, LVDS).
  • Low-voltage differential signals have the characteristics of low power consumption, low bit error rate (Symbol Error Rate, SER), low crosstalk and low radiation, which can improve the transmission quality of image signals.
  • Figure 19 is a schematic diagram of a projection screen according to some embodiments.
  • The projection device 100 can display the image captured by the camera device 13 in full screen, so that the user can observe and adjust his or her clothing and posture.
  • the projection device 100 may also display the image captured by the camera device 13 in a small window mode, and display the first image or the second image in the background.
  • For example, after accepting a video call invitation sent by another user, the user can conduct the video call in a part of the projected image (such as the small window in the upper left corner) while watching a TV program.
  • the image displayed in the rectangular area surrounded by the vertex A, the vertex B, the vertex C, and the vertex D in FIG. 19 is an image captured by the imaging device 13 .
  • the first controller 11 can identify the type of the image data.
  • the type of image data may include at least one of the first data, the second data, and the third data.
  • the third data may be image data collected by the camera device 13 . As shown in FIG. 18 , when the first controller 11 recognizes the third data, the first controller 11 does not perform color gamut conversion on the third data.
  • If the projection device 100 uses the small-window mode to display the image captured by the camera device 13, the first controller 11 parses the image data and then sends the parsed image data, the first notification information, and the vertex coordinates of the display area corresponding to the image data collected by the camera device 13 (such as the coordinates of vertex A, vertex B, vertex C and vertex D) to the second controller 12. In response to the first notification information, the second controller 12 calls the first characteristic parameter set corresponding to the target camera image mode, processes the image data within the area defined by the vertex coordinates, and drives the light valve 250 to display the image.
  • the image data may include the first data and the third data.
  • If the projection device 100 uses the full-screen mode to display the image captured by the camera device 13, the first controller 11 can shield the first data and the second data, and send the parsed third data and the first notification information to the second controller 12. The second controller 12 calls the first characteristic parameter set corresponding to the target camera image mode according to the first notification information, processes the received image data, and drives the light valve 250 for image display.
  • the projection device 100 may also use the same method as in the small window mode to display the image captured by the camera device 13 in full screen, and the present disclosure is not limited to this.
  • the color gamut conversion method performed when the image captured by the camera device 13 is displayed in a small window or in full screen is not limited to the embodiments described above; other methods may also be used, which is not limited by the present disclosure.
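  • As a minimal sketch of the routing logic described above (the function name route_camera_image, the window-mode flag and the payload fields are assumptions for illustration, not part of the disclosure), the first controller forwards the parsed data, the first notification information and, in small-window mode, the vertex coordinates of the camera display area to the second controller:

```python
# Hypothetical sketch of how the first controller might route camera image data.
# "small_window" and "full_screen" mirror the two display modes described above.

def route_camera_image(window_mode, parsed_data, first_notification, vertices=None):
    """Build the payload sent from the first controller to the second controller."""
    if window_mode == "small_window":
        # Small-window mode: the background (first or second) data is kept, and
        # the vertex coordinates A/B/C/D delimit the camera display area.
        return {
            "image_data": parsed_data,           # may include first data and third data
            "notification": first_notification,  # characterizes the target camera image mode
            "vertices": vertices,                # e.g. [(xA, yA), (xB, yB), (xC, yC), (xD, yD)]
        }
    # Full-screen mode: the first data and second data are shielded, so only the
    # parsed third data and the first notification information are forwarded.
    return {"image_data": parsed_data, "notification": first_notification}

payload = route_camera_image(
    "small_window",
    parsed_data=b"...decoded frames...",
    first_notification={"target_camera_image_mode": 1},
    vertices=[(100, 80), (420, 80), (420, 260), (100, 260)],
)
print(payload["notification"])
```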
  • Figure 20 is another flowchart of a projection display method according to some embodiments.
  • the method further includes steps 401 to 406.
  • step 401 when receiving the second instruction, the first controller 11 controls the camera device 13 to turn off.
  • step 402 the first controller 11 determines whether the type of the currently input image data is the second data. If yes, steps 403 to 404 are performed; if no, steps 405 to 406 are performed.
  • after receiving the image data, the first controller 11 decodes the image data and determines whether the type of the currently input image data is the second data. It should be noted that, since the first controller 11 controls the camera device 13 to turn off upon receiving the second instruction, the currently input image data does not include the third data.
  • step 403 the first controller 11 sends the second notification information to the second controller 12.
  • when the first controller 11 determines that the currently input image data type is the second data, the first controller 11 usually does not perform color gamut conversion on the second data, and the second image is displayed directly using the color gamut preset by the projection device 100. In this case, since the preset color gamut of the projection device 100 is relatively large, the color display effect of the second image is poor. In some embodiments of the present disclosure, the display effect of the second image displayed by the projection device 100 can be improved by having the second controller 12 perform color gamut conversion on the second data. For example, after the first controller 11 determines that the currently input image data type is the second data, the first controller 11 sends the second data and the second notification information to the second controller 12. The second notification information indicates that the currently input image data is the second data.
  • step 404 the second controller 12 responds to the second notification information by calling a second characteristic parameter set to process the second data, and drives the light valve 250 to display the second image.
  • the second characteristic parameter set is also pre-stored in the second controller 12 .
  • the second characteristic parameter set includes a plurality of color characteristic parameters.
  • the plurality of color characteristic parameters correspond to the color gamut of the second image.
  • the second controller 12 responds to the second notification information, calls the second characteristic parameter set to process the second data, and drives the light valve 250 to display the second image.
  • the manner of determining the value of each color feature parameter in the second feature parameter set is the same as that of each color feature parameter in the first feature parameter set, and is not described again here.
  • step 405 the first controller 11 performs color gamut conversion on the first data according to the color gamut currently set by the projection device 100, and sends the color-gamut-converted first data to the second controller 12.
  • the first controller 11 may directly perform color gamut conversion on the first data. For example, after determining that the currently input image data type is the first data, the first controller 11 performs color gamut conversion on the decoded first data according to the color gamut currently set by the projection device 100, and sends the color-gamut-converted first data to the second controller 12.
  • step 406 the second controller 12 parses the received first data into a driving signal, and drives the light valve 250 according to the driving signal to display the first image.
  • the second controller 12 directly analyzes the received first data into a driving signal, and drives the light valve 250 according to the driving signal to display the first image.
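  • A condensed sketch of the branch in steps 402 to 406 follows; the function and class names are illustrative assumptions rather than the patent's interfaces, and the color gamut conversion is stubbed out:

```python
# Illustrative sketch of steps 402 to 406: after the camera device is turned off,
# the first controller inspects the type of the currently input image data.

SECOND_DATA = "menu"        # image data generated inside the projection device
FIRST_DATA = "multimedia"   # HDMI / ATV / DTV / USB video data

def convert_gamut(data, gamut):
    # Placeholder for the color gamut conversion done by the first controller.
    return {"gamut": gamut, "payload": data}

class SecondController:
    def display_second_image(self, data):
        # Steps 403-404: apply the second characteristic parameter set.
        print("second image displayed with the second characteristic parameter set")
    def display_first_image(self, converted):
        # Step 406: parse the already converted data into a driving signal.
        print("first image displayed with gamut", converted["gamut"])

def handle_input(data_type, data, current_gamut, second_controller):
    if data_type == SECOND_DATA:
        second_controller.display_second_image(data)
    else:
        # Steps 405-406: the first controller converts the gamut itself.
        second_controller.display_first_image(convert_gamut(data, current_gamut))

handle_input(SECOND_DATA, b"menu pixels", "device preset", SecondController())
handle_input(FIRST_DATA, b"video frame", "device preset", SecondController())
```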
  • multiple multimedia image modes for displaying the first image may be pre-stored in the first controller 11, and one multimedia image mode corresponds to one color gamut.
  • the color gamut corresponding to the multimedia image mode is the currently set color gamut.
  • some embodiments of the present disclosure may also include other projection display methods, and the present disclosure does not limit this.
  • in this way, for different ambient brightness levels, corresponding color gamuts are used to display the images captured by the camera device 13, so that the projection device 100 can achieve a color-balanced effect when displaying images captured by the camera device 13.
  • the projection device includes the above-mentioned light valve 250, the first controller 11, the second controller 12, the camera device 13 and the detection device 14.
  • the first controller 11 is configured to: receive a first instruction, control the camera device to turn on, and control the detection device to detect the current brightness; the first instruction is used to instruct the first controller to control the camera device to turn on.
  • the first controller 11 is further configured to determine the target camera image mode according to the correspondence between brightness and camera image modes; the target camera image mode is the camera image mode corresponding to the current brightness, a camera image mode is an image mode used to display the image data collected by the camera device, and each of the plurality of camera image modes corresponds to one color gamut.
  • the first controller 11 is further configured to send the first notification information to the second controller 12 according to the target camera image mode; the first notification information is used to characterize the target camera image mode.
  • the second controller 12 is also configured to call the first characteristic parameter set corresponding to the target camera image mode according to the first notification information, process the image data collected by the camera device, and drive the light valve to display the image; a plurality of first characteristic parameter sets are pre-stored in the second controller, each of the plurality of first characteristic parameter sets corresponds to one of the plurality of camera image modes, and the first characteristic parameter set includes a plurality of color characteristic parameters, the color characteristic parameters being located in the corresponding color gamut.
  • the image mode includes a first camera image mode and a second camera image mode.
  • the second controller 12 is further configured to determine initial values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white respectively.
  • the second controller 12 is further configured to: sequentially adjust the values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white within the set adjustment interval; determine the values of hue, saturation and gain of red, green, blue, cyan, magenta, yellow and white that satisfy the color gamut corresponding to the first camera image mode as the set values of the first characteristic parameter set corresponding to the first camera image mode; and determine the hue, saturation and gain corresponding to the transition colors between red, green, blue, cyan, magenta, yellow and white through linear interpolation, thereby determining the values of the color characteristic parameters corresponding to each color in the first characteristic parameter set corresponding to the first camera image mode.
  • the second controller 12 is further configured to: sequentially adjust the values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white within the set adjustment interval; determine the values of hue, saturation and gain of red, green, blue, cyan, magenta, yellow and white that satisfy the color gamut corresponding to the second camera image mode as the set values of the first characteristic parameter set corresponding to the second camera image mode; and determine the hue, saturation and gain corresponding to the transition colors between red, green, blue, cyan, magenta, yellow and white through linear interpolation, thereby determining the values of the color characteristic parameters corresponding to each color in the first characteristic parameter set corresponding to the second camera image mode.
  • the first controller 11 is configured to: when receiving a second instruction, control the camera device to turn off, and determine the type of the currently input image data, the image data including at least one of the first data or the second data; the second instruction is used to instruct the first controller 11 to control the camera device 13 to turn off.
  • in a case where the currently input image data is the first data, the first controller 11 is configured to perform color gamut conversion on the first data according to the color gamut currently set by the projection device, and send the color-gamut-converted first data to the second controller 12; in this case, the second controller 12 is configured to process the received first data into a driving signal, and drive the light valve 250 according to the driving signal to display the first image.
  • in a case where the currently input image data is the second data, the first controller 11 is configured to send second notification information; the second notification information is used to represent that the currently input image data is the second data; the second controller 12 is further configured to respond to the second notification information by calling a second characteristic parameter set to process the second data, and drive the light valve 250 to display the second image; the second characteristic parameter set is also pre-stored in the second controller 12.
  • the second characteristic parameter set includes a plurality of color characteristic parameters, and the plurality of color characteristic parameters correspond to the color gamut of the second image.
  • the first characteristic parameter set and the second characteristic parameter set respectively include color characteristic parameters of multiple colors, and the color characteristic parameters at least include hue, saturation and gain; the multiple colors Include at least: red, green, blue, cyan, magenta, yellow, and white.
  • the gain of red in the first feature parameter set corresponding to the first camera image mode is greater than the gain of red in the first feature parameter set corresponding to the second camera image mode; the first The green gain in the first feature parameter set corresponding to the camera image mode is smaller than the green gain in the first feature parameter set corresponding to the second camera image mode; the first feature corresponding to the first camera image mode The gain of blue in the parameter set is smaller than the gain of blue in the first feature parameter set corresponding to the second camera image mode.
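  • As an illustration of the gain relationships just stated, the sketch below uses invented gain values that merely respect those inequalities; the numbers are not taken from the disclosure:

```python
# Invented gain values that respect the stated relationships between the first
# feature parameter sets of the two camera image modes.

FIRST_FEATURE_PARAMETER_SETS = {
    "first camera image mode":  {"red_gain": 1.10, "green_gain": 0.90, "blue_gain": 0.92},
    "second camera image mode": {"red_gain": 1.00, "green_gain": 1.00, "blue_gain": 1.00},
}

m1 = FIRST_FEATURE_PARAMETER_SETS["first camera image mode"]
m2 = FIRST_FEATURE_PARAMETER_SETS["second camera image mode"]

# R gain (mode 1) > R gain (mode 2); G and B gains (mode 1) < (mode 2).
assert m1["red_gain"] > m2["red_gain"]
assert m1["green_gain"] < m2["green_gain"] and m1["blue_gain"] < m2["blue_gain"]
print("gain relationships hold")
```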
  • when a color gamut suitable for the content of the displayed image is used for image display, the image colors can be restored and a better display effect can be achieved.
  • a projection device can perform color gamut conversion through a chip with a color gamut conversion function; however, the conversion capability of such a chip is limited, and some other chips do not have a color gamut conversion function.
  • Figure 21 is a color coordinate diagram of a projection device according to some embodiments.
  • the size of the color gamut that the projection device 100 can display is related to the red laser beam, green laser beam and blue laser beam emitted by the light source 10.
  • the full width at half maximum of the wavelength of the laser beam emitted by a laser is narrow and the color purity is high. Therefore, as shown in Figure 21, the color gamut of the projection device 100 when displaying an image (the area of the triangle enclosed by the dotted line corresponding to the projection device 100 in Figure 21) is larger than the maximum color gamut defined by the color gamut standard (the area of the triangle enclosed by the dotted line corresponding to BT2020 in Figure 21). In this way, the image displayed by the projection device 100 can meet the user's demand for a large-color-gamut image.
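  • To make the comparison in Figure 21 concrete, the area of a color gamut triangle can be computed from its CIE 1931 chromaticity coordinates. The BT.2020 primaries below are the published standard values, while the three-color laser primaries are placeholder values assumed only for this example:

```python
# Compare the areas of two gamut triangles in CIE 1931 (x, y) chromaticity space.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# Published BT.2020 primaries (red, green, blue).
bt2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

# Hypothetical primaries of a three-color laser source, assumed only for this
# example; narrow-linewidth lasers lie close to the spectral locus.
laser = [(0.720, 0.280), (0.160, 0.820), (0.125, 0.040)]

a_std, a_dev = triangle_area(*bt2020), triangle_area(*laser)
print(f"BT.2020 area: {a_std:.4f}, device area: {a_dev:.4f}, ratio: {a_dev / a_std:.2f}")
```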
  • when the projection device displays an image, if a color gamut suitable for the content of the displayed image is used for image display, the image colors can be restored and a better display effect can be achieved. A projection device can perform color gamut conversion through a chip with a color gamut conversion function; however, the conversion capability of such a chip is limited, and some other chips do not have a color gamut conversion function.
  • Figure 22 is a structural diagram of yet another projection device according to some embodiments.
  • some embodiments of the present disclosure also provide a projection device.
  • the structure of the projection device is the same as the structure of the above-mentioned projection device 100 .
  • projection device 100 has multiple image modes.
  • the multiple image modes at least include an Artificial Intelligence (AI) mode, a standard mode, a soft mode, a vivid mode and a custom mode; the AI mode, the standard mode, the soft mode, the vivid mode and the custom mode are described below.
  • One image mode corresponds to one color gamut.
  • the projection device 100 may use different color gamuts for image display.
  • the user can switch the projection device 100 among multiple image modes.
  • target feature parameter sets are pre-stored in the second controller 12, and one target feature parameter set corresponds to one image mode.
  • the target feature parameter set includes a plurality of color feature parameters, and the color feature parameters are located in the corresponding color gamut.
  • the color characteristic parameters include hue, saturation and gain, and multiple color characteristic parameters included in a target characteristic parameter set correspond to multiple colors.
  • the target feature parameter set includes feature parameters of multiple colors.
  • the plurality of colors include at least red, green, blue, cyan, magenta, yellow and white. By setting the hue, saturation and gain of these seven colors, a color gamut can be determined.
  • the target feature parameter set may also include color feature parameters for multiple transition colors between red, green, blue, cyan, magenta, yellow and white.
  • Some embodiments of the present disclosure also provide a projection display method, which is applied to the projection device 100 .
  • Figure 23 is yet another flowchart of a projection display method according to some embodiments.
  • the method includes steps 501 to 503.
  • step 501 the first controller 11 receives the target instruction.
  • the first controller 11 may receive a target instruction.
  • the target instruction instructs the projection device 100 to switch to a corresponding image mode for image display.
  • step 502 the first controller 11 sends target notification information to the second controller 12 according to the target instruction.
  • step 503 the second controller 12 calls the target feature parameter set corresponding to the currently switched image mode according to the target notification information, processes the current image data into a driving signal, and drives the light valve 250 according to the driving signal to display the image.
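  • A minimal sketch of steps 501 to 503, assuming simple in-process objects for the two controllers (the class and method names and the parameter values are not from the disclosure):

```python
# Sketch of the mode-switch path: target instruction -> target notification
# information -> target feature parameter set -> driving signal.

TARGET_FEATURE_PARAMETER_SETS = {
    # hue, saturation, gain for white only, with illustrative values
    "AI mode":       {"white": (0.0, 1.0, 1.0)},
    "standard mode": {"white": (0.0, 1.0, 0.9)},
    "soft mode":     {"white": (0.0, 0.9, 0.8)},
}

class SecondController:
    def on_target_notification(self, image_mode, image_data):
        params = TARGET_FEATURE_PARAMETER_SETS[image_mode]  # step 503: call the set
        self.drive_light_valve({"params": params, "frame": image_data})
    def drive_light_valve(self, driving_signal):
        print("displaying with", driving_signal["params"])

class FirstController:
    def __init__(self, second_controller):
        self.second = second_controller
    def on_target_instruction(self, image_mode, image_data):
        # Step 502: forward the currently switched mode as target notification info.
        self.second.on_target_notification(image_mode, image_data)

FirstController(SecondController()).on_target_instruction("soft mode", b"frame")
```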
  • Figure 24 is another schematic diagram of a remote control of a projection device according to some embodiments.
  • Figure 25 is a structural diagram of yet another projection device according to some embodiments.
  • Figure 26 is another schematic diagram of a menu interface according to some embodiments.
  • the user can send the target instruction to the first controller 11 through an external device.
  • the external devices include remote controls, buttons on the projection device 100 and other devices and control devices that can send control instructions.
  • the projection system 1 further includes a remote control 60 .
  • the remote control 60 and the projection device 100 can communicate through an infrared communication protocol, a Bluetooth communication protocol, a ZigBee communication protocol or other short-distance communication methods.
  • the user can send instructions to the projection device 100 by pressing buttons on the remote control 60, thereby controlling the projection device 100 to perform corresponding operations.
  • the remote control 60 includes a second button, a third button, a fourth button, a fifth button and a sixth button.
  • the second button may be the AI mode button in FIG. 24 and corresponds to the AI mode.
  • the third button may be the standard mode button in FIG. 24 and corresponds to the standard mode.
  • the fourth button may be the soft mode button in FIG. 24 and corresponds to the soft mode.
  • the fifth button may be the vivid mode button in FIG. 24 and corresponds to the vivid mode.
  • the sixth button may be the custom mode button in FIG. 24 and corresponds to the custom mode.
  • One image mode button corresponds to one image mode, and one image mode corresponds to one color gamut, thereby achieving one image display effect.
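  • The button-to-mode mapping implied by Figure 24 can be summarized as a lookup table; the key names below are invented for illustration:

```python
# Hypothetical mapping from remote-control buttons to image modes; each image
# mode corresponds to one color gamut, as described above.

BUTTON_TO_IMAGE_MODE = {
    "second_button": "AI mode",
    "third_button":  "standard mode",
    "fourth_button": "soft mode",
    "fifth_button":  "vivid mode",
    "sixth_button":  "custom mode",
}

def on_button_press(button):
    """Return the target instruction carried by the pressed image mode button."""
    mode = BUTTON_TO_IMAGE_MODE.get(button)
    return {"target_instruction": mode} if mode else None

print(on_button_press("fifth_button"))
```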
  • the projection device 100 has a voice recognition function, and the user can switch image modes through voice input or other methods, which is not limited by the present disclosure.
  • the projection device 100 is provided with a plurality of buttons, the plurality of buttons including the second button, the third button, the fourth button, the fifth button and the sixth button.
  • one image mode button corresponds to one image mode, and one image mode corresponds to one color gamut, thereby achieving one image display effect.
  • when the user presses an image mode button, the button sends the target instruction corresponding to the currently selected image mode to the first controller 11.
  • the external device may be a smart device, such as a mobile terminal, a tablet, a computer, a laptop, etc.
  • the external device can communicate with the projection device 100 through various methods such as network, infrared, and data lines, and send instructions through various methods such as buttons, voice input, gesture input, etc., which is not limited by this disclosure.
  • the user can enter the menu interface of the projection device 100 through the external device.
  • the menu interface can provide multiple image mode options, such as the AI mode, the standard mode, the soft mode, the vivid mode and the custom mode.
  • One image mode button corresponds to one image mode, and one image mode corresponds to one color gamut, thereby achieving one image display effect.
  • the first controller 11 can receive the image mode selected by the user. For example, as shown in Figure 26, the user selects the soft mode through the external device.
  • the image modes and the methods of sending the target instruction may also be of other types and methods.
  • the first controller 11 may also receive the target instruction under other conditions, which is not limited by the present disclosure.
  • after receiving the target instruction, the first controller 11 sends the corresponding target notification information to the second controller 12 according to the received target instruction.
  • the target notification information indicates the currently switched image mode. For example, the first controller 11 sends the corresponding target notification information to the second controller 12 after receiving the target instruction sent by the user through an external device, or after receiving the image mode selected by the user in the menu interface; this is not limited in the present disclosure.
  • after receiving the target notification information sent by the first controller 11, the second controller 12 calls the target feature parameter set corresponding to the currently switched image mode according to the received target notification information, processes the current image data into a driving signal, and drives the light valve 250 according to the driving signal to display the image, so that the projection device 100 displays the image using the color gamut corresponding to the image mode selected by the user.
  • multiple target feature parameter sets corresponding to multiple image modes are pre-stored in the second controller 12 .
  • One image mode corresponds to one color gamut.
  • the second controller 12 can call the corresponding target feature parameter set according to the image mode selected by the user, process the image data, and drive the light valve 250 according to the processed driving signal to display the image.
  • in this way, switching between multiple color gamuts can be realized even when the first controller 11 has a limited color gamut conversion function or no color gamut conversion function, which provides a wide range of applicability.
  • Figure 27 is yet another flowchart of a projection display method according to some embodiments.
  • the method further includes steps 601 to 605.
  • step 601 the first controller 11 receives image data, decodes the image data and sends it to the second controller 12.
  • the first controller 11 continues to receive image data, decodes the image data, and then sends it to the second controller 12.
  • the first controller 11 decodes image data in different formats into low-voltage differential signals.
  • Low-voltage differential signals have the characteristics of low power consumption, low bit error rate, low crosstalk and low radiation, which can improve the transmission quality of image signals.
  • the image data may include first data and second data. For the first data and the second data, please refer to the previous relevant descriptions and will not be described again here.
  • step 602 the first controller 11 determines whether the target instruction is received. If yes, steps 603 to 604 are performed; if no, step 605 is performed.
  • step 603 the first controller 11 sends the corresponding target notification information to the second controller 12 according to the target instruction.
  • if the first controller 11 receives the target instruction while the projection device 100 is displaying an image, the first controller 11 sends the target notification information to the second controller 12.
  • step 604 the second controller 12 calls the target feature parameter set corresponding to the currently switched image mode according to the target notification information, processes the current image data, and drives the light valve 250 to display the image.
  • after receiving the target notification information, the second controller 12 calls the target feature parameter set corresponding to the image mode indicated by the target notification information, processes the currently received image data, and drives the light valve 250 to display the image.
  • step 605 the second controller 12 uses the current target feature parameter set to process the image data, and drives the light valve 250 to display the image.
  • the second controller 12 processes the image data using the current target feature parameter set after receiving the image data, and drives the light valve 250 to display the image.
  • the current target feature parameter set is the target feature parameter set corresponding to the image mode last switched to by the projection device 100. It should be noted that, if the projection device 100 is turned on for the first time to display an image, the second controller 12 calls the target feature parameter set corresponding to the default image mode, processes the image data, and drives the light valve 250 to display the image.
  • the first controller 11 receives the image data, decodes the image data, and sends the decoded image data to the second controller 12 .
  • the second controller 12 uses the target feature parameter set corresponding to the image mode switched to during the last image display process of the projection device 100 to display the image. For example, if during the last image display process of the projection device 100 the user selected the first sub-image mode, then during this image display process, if the user does not switch the image mode, the second controller 12 still uses the target feature parameter set corresponding to the image mode selected during the last image display process (that is, the first sub-image mode) to display the image.
  • the first sub-image mode will be described below.
  • if the user switches the image mode, the second controller 12 calls the corresponding target feature parameter set according to the switched image mode to display the image. For example, if during this image display process the user selects the second sub-image mode, the second controller 12 calls the target feature parameter set corresponding to the second sub-image mode according to the target notification information sent by the first controller 11 and uses it to display the image.
  • in this way, during the next image display, the second controller 12 continues to use the target feature parameter set corresponding to the second sub-image mode to display the image until the user switches the image mode again, after which the second controller 12 calls the target feature parameter set corresponding to the image mode switched to by the user to display the image.
  • the second controller 12 first uses the target feature parameter set corresponding to the default image mode of the projection device 100 to display the image. Afterwards, if the user switches the image mode, the second controller 12 calls the corresponding target feature parameter set to perform image display according to the image mode switched by the user.
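  • A sketch of how the second controller could remember the last-switched image mode across display processes (steps 602 to 605); the persistence mechanism and the default mode are assumptions:

```python
# Sketch of how the second controller could remember the last-switched image
# mode across display processes (steps 602 to 605); persistence details assumed.

class ModeMemory:
    def __init__(self, default_mode="standard mode"):
        self.current_mode = default_mode          # used when turned on for the first time

    def mode_for_display(self, target_notification=None):
        # Steps 603-604: a new target notification switches the mode immediately.
        if target_notification is not None:
            self.current_mode = target_notification
        # Step 605: otherwise keep using the current target feature parameter set.
        return self.current_mode

memory = ModeMemory()
print(memory.mode_for_display())                  # default image mode
print(memory.mode_for_display("vivid mode"))      # user switches the image mode
print(memory.mode_for_display())                  # persists for the next display
```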
  • the image data may include a notification message.
  • the notification message corresponds to first notification information, second notification information, and target notification information.
  • the notification message may be a separate instruction, and the instruction is sent to the second controller 12 .
  • the first controller 11 does not decode the second data and directly sends the second data to the second controller 12 .
  • the image display process may be any process in which the projection device 100 displays an image.
  • the image display process may include: a process from power-on to power-off, a process from the start of playing a video source to the end of playback, a process from the start of playing a video source to pausing playback, a process from resuming a paused video source to the end of playback, a process from the start of playing one video source to switching to the next video source, or any process shorter or longer than the above processes during image display; this is not limited by the present disclosure.
  • Figure 28 is a flow chart of a method for determining the value of the color feature parameter corresponding to each color in the target feature parameter set according to some embodiments.
  • the method includes steps 701 to 703.
  • step 701 the second controller 12 determines initial values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white respectively.
  • the second controller 12 needs to determine the initial values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white respectively.
  • the hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white that satisfy the preset color gamut of the projection device 100 are used as the initial values. For example, the initial value of hue is set to 0, the initial value of saturation to 1, and the initial value of gain to 1. In this way, the initial values of hue, saturation and gain of each color that satisfy the preset color gamut of the projection device 100 are determined as: red R0(0,1,1), green G0(0,1,1), blue B0(0,1,1), cyan C0(0,1,1), magenta M0(0,1,1), yellow Y0(0,1,1) and white W0(0,1,1).
  • step 702 the second controller 12 sequentially adjusts, within the set adjustment interval and according to the color gamut corresponding to the target feature parameter set, the values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white, and determines the values of hue, saturation and gain of red, green, blue, cyan, magenta, yellow and white that satisfy the color gamut corresponding to the target feature parameter set as the set values of the target feature parameter set.
  • for example, the second controller 12 stores the values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white that satisfy the color gamut corresponding to the target feature parameter set as the set values of the target feature parameter set.
  • step 703 the second controller 12 determines the values of hue, saturation and gain corresponding to the transition colors between red, green, blue, cyan, magenta, yellow and white through linear interpolation, thereby determining the value of the color feature parameter corresponding to each color.
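  • Read as a pipeline, steps 701 to 703 amount to: seed the seven anchor colors, tune them toward the target color gamut, then interpolate the transition colors. A schematic outline is given below; the tuning step is stubbed out, since in practice it depends on measured colorimetry, and the transition-color interpolation is shown in a separate sketch further below:

```python
# Outline of steps 701 to 703 for building one target feature parameter set.
# The adjustment step is a stub: in practice it is driven by measurements of
# the projected colors against the target color gamut.

ANCHOR_COLORS = ["red", "green", "blue", "cyan", "magenta", "yellow", "white"]

def initial_values():
    # Step 701: hue 0, saturation 1, gain 1 for each of the seven colors.
    return {color: (0.0, 1.0, 1.0) for color in ANCHOR_COLORS}

def adjust_to_gamut(params, target_gamut):
    # Step 702 (stub): adjust H/S/G within the set intervals until the target
    # color gamut is satisfied; here the gain is merely nudged for non-preset gamuts.
    return {color: (h, s, g if target_gamut == "preset" else g * 0.95)
            for color, (h, s, g) in params.items()}

def build_target_feature_parameter_set(target_gamut):
    params = adjust_to_gamut(initial_values(), target_gamut)
    # Step 703: transition colors would then be filled in by linear interpolation
    # (see the interpolation sketch further below).
    return params

print(build_target_feature_parameter_set("BT2020")["red"])
```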
  • the following uses five different image modes as examples for explanation.
  • the five different image modes include the first sub-image mode, the second sub-image mode, the third sub-image mode, the fourth sub-image mode and the fifth sub-image mode.
  • the color gamut corresponding to the fourth sub-image mode is the color gamut preset by the projection device 100 .
  • when adjusting the hue H, saturation S and gain G of each color, the initial values red R0(0,1,1), green G0(0,1,1), blue B0(0,1,1), cyan C0(0,1,1), magenta M0(0,1,1), yellow Y0(0,1,1) and white W0(0,1,1) can therefore be directly determined as the set values of the target feature parameter set corresponding to the fourth sub-image mode.
  • the hue H, saturation S and gain G of each color are adjusted within the set interval so that the values of the hue H, saturation S and gain G of each color satisfy the color gamut corresponding to the first sub-image mode.
  • the adjusted hue H, saturation S and gain G corresponding to each color are determined as the set values of the target feature parameter set corresponding to the first sub-image mode.
  • for example, the seven colors after adjustment are: red R1(H_R1, S_R1, G_R1), green G1(H_G1, S_G1, G_G1), blue B1(H_B1, S_B1, G_B1), cyan C1(H_C1, S_C1, G_C1), magenta M1(H_M1, S_M1, G_M1), yellow Y1(H_Y1, S_Y1, G_Y1) and white W1(H_W1, S_W1, G_W1); the hue H, saturation S and gain G corresponding to these seven colors are determined as the set values of the target feature parameter set corresponding to the first sub-image mode.
  • the hue H, saturation S and gain G of each color are adjusted within the set interval so that the values of the hue H, saturation S and gain G of each color satisfy the color gamut corresponding to the second sub-image mode.
  • the adjusted hue H, saturation S and gain G corresponding to each color are determined as the set values of the target feature parameter set corresponding to the second sub-image mode.
  • for example, the seven colors after adjustment are: red R2(H_R2, S_R2, G_R2), green G2(H_G2, S_G2, G_G2), blue B2(H_B2, S_B2, G_B2), cyan C2(H_C2, S_C2, G_C2), magenta M2(H_M2, S_M2, G_M2), yellow Y2(H_Y2, S_Y2, G_Y2) and white W2(H_W2, S_W2, G_W2); the hue H, saturation S and gain G corresponding to these seven colors are determined as the set values of the target feature parameter set corresponding to the second sub-image mode.
  • similarly, the hue H, saturation S and gain G of each color are adjusted within the set interval so that their values satisfy the color gamut corresponding to the third sub-image mode, and the adjusted values are determined as the set values of the target feature parameter set corresponding to the third sub-image mode.
  • for example, the seven colors after adjustment are: red R3(H_R3, S_R3, G_R3), green G3(H_G3, S_G3, G_G3), blue B3(H_B3, S_B3, G_B3), cyan C3(H_C3, S_C3, G_C3), magenta M3(H_M3, S_M3, G_M3), yellow Y3(H_Y3, S_Y3, G_Y3) and white W3(H_W3, S_W3, G_W3); the hue H, saturation S and gain G corresponding to these seven colors are determined as the set values of the target feature parameter set corresponding to the third sub-image mode.
  • likewise, the hue H, saturation S and gain G of each color are adjusted within the set interval so that their values satisfy the color gamut corresponding to the fifth sub-image mode, and the adjusted values are determined as the set values of the target feature parameter set corresponding to the fifth sub-image mode.
  • for example, the seven colors after adjustment are: red R5(H_R5, S_R5, G_R5), green G5(H_G5, S_G5, G_G5), blue B5(H_B5, S_B5, G_B5), cyan C5(H_C5, S_C5, G_C5), magenta M5(H_M5, S_M5, G_M5), yellow Y5(H_Y5, S_Y5, G_Y5) and white W5(H_W5, S_W5, G_W5); the hue H, saturation S and gain G corresponding to these seven colors are determined as the set values of the target feature parameter set corresponding to the fifth sub-image mode.
  • the adjustment interval of hue H may be a closed interval, and the lower limit of the closed interval is -1 and the upper limit is 1 (that is, hue H is greater than or equal to -1 and less than or equal to 1).
  • the adjustment interval of saturation S can be a closed interval, with a lower limit of 0 and an upper limit of 2 (that is, saturation S is greater than or equal to 0 and less than or equal to 2).
  • when the value of saturation S is 0, all color is removed; when the saturation value is 2, the color is saturated to its maximum; when the saturation value is 1, the saturation does not change.
  • the adjustment interval of gain G can be a closed interval, with a lower limit of 0 and an upper limit of 2 (that is, gain G is greater than or equal to 0 and less than or equal to 2).
  • the gain changes the intensity level of the corresponding color. When the gain value is 1, it is the nominal setting; when the gain value is less than 1, the color becomes darker; when the gain value is 2, the color is at its brightest.
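  • A small helper that enforces the adjustment intervals just described (hue in [-1, 1], saturation in [0, 2], gain in [0, 2]); the function names are illustrative:

```python
# Clamp a (hue, saturation, gain) triple to the adjustment intervals described
# above: hue in [-1, 1], saturation in [0, 2], gain in [0, 2].

def clamp(value, low, high):
    return max(low, min(high, value))

def clamp_hsg(hue, saturation, gain):
    return (clamp(hue, -1.0, 1.0), clamp(saturation, 0.0, 2.0), clamp(gain, 0.0, 2.0))

print(clamp_hsg(1.3, -0.2, 2.5))   # -> (1.0, 0.0, 2.0)
```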
  • after determining the set values of the color feature parameters of each color in each target feature parameter set, the second controller 12 determines the values of hue, saturation and gain corresponding to the transition colors between red, green, blue, cyan, magenta, yellow and white through linear interpolation, thereby determining the value of the color feature parameter corresponding to each color. For example, the linear interpolation may be performed automatically by a built-in program of the second controller 12, or performed by an external program, with the interpolated feature parameter set then imported into the second controller 12 for storage; this is not limited by the present disclosure.
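  • A minimal sketch of the linear interpolation for transition colors between two adjacent anchor colors; the numeric hue, saturation and gain values and the number of interpolation steps are assumptions for illustration:

```python
# Linearly interpolate hue/saturation/gain between two adjacent anchor colors,
# e.g. between red and yellow, to obtain the transition-color parameters.

def lerp(a, b, t):
    return a + (b - a) * t

def interpolate_hsg(hsg_start, hsg_end, steps):
    """Return `steps` evenly spaced H/S/G triples strictly between the anchors."""
    result = []
    for i in range(1, steps + 1):
        t = i / (steps + 1)
        result.append(tuple(lerp(s, e, t) for s, e in zip(hsg_start, hsg_end)))
    return result

red = (0.05, 1.2, 1.1)       # illustrative adjusted values, not measured data
yellow = (-0.10, 0.9, 1.0)
for hsg in interpolate_hsg(red, yellow, 3):
    print(tuple(round(v, 3) for v in hsg))
```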
  • the image mode may include a first image mode and a second image mode.
  • the color gamut corresponding to the first image mode meets the standard color gamut.
  • the color gamut corresponding to the second image mode is different from the standard color gamut.
  • the first image mode includes the first sub-image mode, the second sub-image mode and the third sub-image mode.
  • the color gamut corresponding to the first sub-image mode satisfies BT2020, the color gamut corresponding to the second sub-image mode satisfies DCI-P3, and the color gamut corresponding to the third sub-image mode satisfies Rec.709.
  • the second image mode may include the fourth sub-image mode and the fifth sub-image mode.
  • the color gamut corresponding to the fourth sub-image mode satisfies the color gamut preset by the projection device 100, and the color gamut preset by the projection device 100 is larger than the color gamuts corresponding to the first image mode.
  • the color gamut corresponding to the fifth sub-image mode is smaller than the color gamut preset by the projection device 100, and the color gamut corresponding to the fifth sub-image mode is different from the color gamut corresponding to the first image mode.
  • the color gamut corresponding to the fifth sub-image mode can be customized according to the display effect of the projection device 100 and the user's preference.
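  • The correspondence between sub-image modes and color gamuts can be summarized as a lookup table. The chromaticity primaries listed for BT2020, DCI-P3 and Rec.709 are the published standard values, while the preset and custom entries are placeholders:

```python
# Mapping of sub-image modes to their target color gamuts, with the CIE 1931
# primaries of the named standards.

GAMUT_PRIMARIES = {
    "BT2020":  [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
}

SUB_IMAGE_MODE_TO_GAMUT = {
    "first sub-image mode":  "BT2020",
    "second sub-image mode": "DCI-P3",
    "third sub-image mode":  "Rec.709",
    "fourth sub-image mode": "device preset (wider than the standard gamuts)",
    "fifth sub-image mode":  "custom (narrower than the device preset)",
}

for mode, gamut in SUB_IMAGE_MODE_TO_GAMUT.items():
    primaries = GAMUT_PRIMARIES.get(gamut, "defined by the device or the user")
    print(f"{mode}: {gamut} -> {primaries}")
```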
  • Figure 29 is a plot of image modes versus color gamut, according to some embodiments.
  • the first sub-image mode, the second sub-image mode, the third sub-image mode, the fourth sub-image mode and the fifth sub-image mode can be named according to the color characteristics of the images they respectively display.
  • the first sub-image mode is defined as the AI mode
  • the second sub-image mode is defined as the standard mode
  • the third sub-image mode is defined as the soft mode
  • the fourth sub-image mode with a large color gamut is defined as the vivid mode
  • the fifth sub-image mode is defined as the custom mode.
  • the user can intuitively distinguish the differences between the various image modes and select the desired image mode for image display.
  • fewer or more types and numbers of image modes may be determined according to different classification standards and user requirements, which is not limited by the present disclosure.
  • in this way, even when the first controller 11 has only a simple color gamut conversion function or no color gamut conversion function, conversion between multiple color gamuts can be achieved.
  • Some embodiments of the present disclosure also provide a projection device, which has a similar structure to the above-mentioned projection device 100.
  • the projection device includes the above-mentioned light valve 250, the first controller 11 and the second controller 12.
  • the first controller 11 receives the target instructions.
  • the target instruction is used to instruct the projection device 100 to switch to a corresponding image mode for image display.
  • the first controller 11 sends target notification information to the second controller 12 according to the received target instruction.
  • the target notification information is used to characterize the currently switched image mode, and one image mode corresponds to one color gamut; multiple target feature parameter sets are pre-stored in the second controller.
  • one target feature parameter set corresponds to one image mode; the target feature parameter set includes a plurality of color feature parameters that satisfy the corresponding color gamut.
  • the second controller 12 calls the target feature parameter set corresponding to the currently switched image mode according to the target notification information, processes the current image data into a driving signal, and drives the light valve 250 for image display according to the driving signal.
  • the target feature parameter set includes color feature parameters of multiple colors respectively, and the color feature parameters at least include hue, saturation and gain; the multiple colors include at least: red, green, blue , cyan, magenta, yellow and white.
  • the second controller 12 determines the initial values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white respectively; sequentially adjusts, within the set interval and according to the color gamut corresponding to the target feature parameter set, the values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white; determines the values of hue, saturation and gain corresponding to red, green, blue, cyan, magenta, yellow and white that satisfy the color gamut as the set values of the target feature parameter set; and linearly interpolates the values of hue, saturation and gain corresponding to the transition colors between red, green, blue, cyan, magenta, yellow and white to determine the value of the color feature parameter corresponding to each color.
  • the image modes include a standard image mode and a non-standard image mode; the color gamut corresponding to the standard image mode satisfies a standard color gamut, and the color gamut corresponding to the non-standard image mode is different from the standard color gamut.
  • the standard image mode includes a first image mode, a second image mode and a third image mode.
  • the color gamut corresponding to the first image mode meets BT2020.
  • the color gamut corresponding to the second image mode meets DCI-P3; the color gamut corresponding to the third image mode meets Rec.709.
  • the non-standard image modes include a fourth image mode and a fifth image mode.
  • the color gamut corresponding to the fourth image mode satisfies the color gamut range preset by the projection device, and the color gamut range preset by the projection device is larger than the color gamut corresponding to the standard image mode.
  • the color gamut corresponding to the fifth image mode is smaller than the color gamut preset by the projection device, and the color gamut corresponding to the fifth image mode is different from the color gamut corresponding to the standard image mode.
  • the first controller 11 also receives image data, decodes the image data and sends it to the second controller 12 .
  • the second controller 12 calls the target feature parameter set corresponding to the currently switched image mode according to the target notification information, and parses the image data received by the second controller 12 into a driving signal.
  • the second controller 12 drives the light valve 250 to display images according to the driving signal.
  • the first controller 11 receives the target instruction sent by the user through an external device, or the first controller receives the image mode selected by the user in the menu interface.
  • Some embodiments of the present disclosure provide a computer-readable storage medium (for example, a non-transitory computer-readable storage medium) having computer program instructions stored therein.
  • when the computer program instructions are run on a computer, the computer program instructions cause the computer to execute the projection display method as described in any of the above embodiments.
  • the above computer-readable storage medium may include, but is not limited to: magnetic storage devices (such as hard disks, floppy disks or magnetic tapes), optical disks (such as CDs (Compact Disks) and DVDs (Digital Versatile Disks)), smart cards, and flash memory devices (such as EPROM (Erasable Programmable Read-Only Memory), cards, sticks or key drives).
  • the various computer-readable storage media described in this disclosure may represent one or more devices and/or other machine-readable storage media for storing information.
  • the term "machine-readable storage medium" may include, but is not limited to, wireless channels and various other media capable of storing, containing and/or carrying instructions and/or data.
  • Some embodiments of the present disclosure also provide a computer program product.
  • the computer program product includes computer program instructions. When the computer program instructions are executed on the computer, the computer program instructions cause the computer to perform the projection display method as described in the above embodiment.
  • Some embodiments of the present disclosure also provide a computer program.
  • when the computer program is executed on a computer, the computer program causes the computer to perform the projection display method as described in the above embodiments.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Projection Apparatus (AREA)

Abstract

The invention relates to a projection device. The projection device comprises a light valve, a first controller, a second controller, a camera device and a detection device. The detection device is configured to detect the current brightness of ambient light. The first controller is configured to: after receiving a first instruction, control the camera device to turn on, and control the detection device to detect the current brightness of the ambient light; determine a target camera image mode according to the correspondence between brightness and camera image modes; and send first notification information to the second controller according to the target camera image mode. The second controller is further configured to call a first feature parameter set corresponding to the target camera image mode according to the first notification information, process the image data acquired by the camera device, and drive the light valve to display an image.
PCT/CN2023/097475 2022-08-26 2023-05-31 Procédé d'affichage de projection, dispositif de projection et support de stockage WO2024041070A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202211030207.4 2022-08-26
CN202211030207.4A CN115396641B (zh) 2022-08-26 2022-08-26 激光投影显示方法、三色激光投影设备及可读性存储介质
CN202211031744.0 2022-08-26
CN202211031744.0A CN115396642B (zh) 2022-08-26 2022-08-26 激光投影显示方法、三色激光投影设备及可读性存储介质

Publications (1)

Publication Number Publication Date
WO2024041070A1 true WO2024041070A1 (fr) 2024-02-29

Family

ID=90012331

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/097475 WO2024041070A1 (fr) 2022-08-26 2023-05-31 Procédé d'affichage de projection, dispositif de projection et support de stockage

Country Status (1)

Country Link
WO (1) WO2024041070A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103176339A (zh) * 2012-08-23 2013-06-26 深圳市金立通信设备有限公司 一种投影亮度自动调节系统及方法
US20170223318A1 (en) * 2014-07-24 2017-08-03 Tecnical Institute Of Physcs And Chemistry Of The Chinese Academy Of Sciences Laser display system
CN214375786U (zh) * 2021-04-08 2021-10-08 深圳市点睛创视技术有限公司 一种三基色激光消散斑匀光装置及投影系统
CN114584748A (zh) * 2022-03-17 2022-06-03 青岛海信激光显示股份有限公司 激光投影设备、其显示方法及可读性存储介质
CN114866752A (zh) * 2022-06-01 2022-08-05 青岛海信激光显示股份有限公司 激光投影显示方法、三色激光投影设备及可读性存储介质
CN115396641A (zh) * 2022-08-26 2022-11-25 青岛海信激光显示股份有限公司 激光投影显示方法、三色激光投影设备及可读性存储介质
CN115396642A (zh) * 2022-08-26 2022-11-25 青岛海信激光显示股份有限公司 激光投影显示方法、三色激光投影设备及可读性存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23856162

Country of ref document: EP

Kind code of ref document: A1