WO2023287024A1 - Electronic device and control method therefor - Google Patents

Electronic device and control method therefor

Info

Publication number
WO2023287024A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
projection
electronic device
projection area
projected
Prior art date
Application number
PCT/KR2022/008300
Other languages
English (en)
Korean (ko)
Inventor
김형철
Original Assignee
삼성전자주식회사
Priority date
Filing date
Publication date
Application filed by 삼성전자주식회사
Publication of WO2023287024A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3182Colour adjustment, e.g. white balance, shading or gamut
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B43/00Testing correct operation of photographic apparatus or parts thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3173Constructional details thereof wherein the projection device is specially adapted for enhanced portability

Definitions

  • the present disclosure relates to an electronic device for projecting a projection image and a method for controlling the same, and more particularly, to an electronic device for correcting interference of an external light source and a method for controlling the same.
  • a projector is an electronic device that enlarges and projects output light from a light source onto a wall or screen through a projection lens.
  • the present disclosure has been made in view of the above-described necessity, and more specifically, relates to an electronic device that corrects interference of an external light source by using an image of a projection area, and a method for controlling the same.
  • according to an embodiment of the present disclosure, a method for controlling an electronic device includes: acquiring a first image corresponding to a projection area onto which a projection image is to be projected, before the projection image is projected by the electronic device; projecting at least one test image based on the acquired first image; obtaining at least one second image corresponding to the projection area on which the at least one test image is projected; and projecting the projection image based on the first image and the at least one second image.
  • the control method may include: projecting the at least one test image based on first projection area information obtained from the first image when the first image is acquired; obtaining second projection area information based on the at least one projected test image; and projecting a corrected image obtained by correcting the projected test image based on the first projection area information and the second projection area information.
  • the control method may include: identifying whether to project a test image based on the first projection area information; projecting the at least one test image when it is identified that the test image is to be projected; and, when the second projection area information is acquired, correcting the projection image based on the first projection area information and the second projection area information and projecting the corrected projection image.
  • the method may further include projecting a projection image based on the first projection area information when it is identified that the test image is not to be projected.
  • the projecting of the corrected image may further include adjusting a color expression method of the projection image based on the first projection area information and the second projection area information.
  • the identifying of whether to project the test image may include identifying whether or not to project the test image based on a difference between the first projection area information and the preset projection area information.
  • the control method may further include: detecting a change in a light source in the house where the electronic device is located; and projecting the at least one test image when the change in the light source is detected.
  • when the projection image is corrected, the control method may further include: obtaining light emission information of a lighting device located in the house where the electronic device is located; and matching correction information for the corrected projection image with the light emission information and storing the matched information.
  • the first image may be obtained by photographing the projection area when the power of the electronic device is turned on.
  • according to an embodiment of the present disclosure, an electronic device includes: a projection unit for projecting a projection image; a camera; a memory storing at least one instruction; and a processor controlling the electronic device by executing the at least one instruction stored in the memory, wherein the processor acquires, through the camera, a first image corresponding to a projection area onto which the projection image is to be projected before the projection image is projected by the electronic device, controls the projection unit to project at least one test image based on the acquired first image, acquires, through the camera, at least one second image corresponding to the projection area on which the at least one test image is projected, and controls the projection unit to project the projection image based on the first image and the at least one second image.
  • the processor may control the projection unit to project the at least one test image based on first projection area information acquired from the first image, acquire second projection area information based on the at least one projected test image, and control the projection unit to project a corrected image obtained by correcting the projected test image based on the first projection area information and the second projection area information.
  • the processor may identify whether to project a test image based on the first projection area information, control the projection unit to project the at least one test image when it is identified that the test image is to be projected, and, when the second projection area information is acquired, correct the projection image based on the first projection area information and the second projection area information and control the projection unit to project the corrected projection image.
  • when it is identified that the test image is not to be projected, the processor may control the projection unit to project the projection image based on the first projection area information.
  • the processor may adjust a color expression method of the projection image based on the first projection area information and the second projection area information.
  • the electronic device can provide a projected image with minimal interference from an external light source device in the home.
  • FIG. 1 is a block diagram briefly illustrating the configuration of an electronic device 100 that projects a projection image according to the present disclosure.
  • FIG. 2 is a diagram for explaining the influence of an external light source on a projection area according to the present disclosure.
  • FIG. 3A is a diagram illustrating a first image obtained by capturing a projection area on which no projection image is projected, according to the present disclosure.
  • FIG. 3B is a diagram for explaining an embodiment of correcting a projection image based on the first image, according to the present disclosure.
  • FIG. 3C is a diagram for explaining an embodiment in which a corrected projection image is projected based on the first image, according to the present disclosure.
  • FIG. 4A is a diagram illustrating an image obtained by capturing a projection area on which no projection image is projected, according to the present disclosure.
  • FIG. 4B is a diagram illustrating at least one second image obtained by capturing a projection area on which at least one test image is projected, according to the present disclosure.
  • FIG. 4C is a diagram for explaining an embodiment of correcting a projection image based on the first image and the at least one second image, according to the present disclosure.
  • FIG. 4D is a diagram for explaining an embodiment in which a corrected projection image is projected based on the first image and the at least one second image, according to the present disclosure.
  • FIG. 5 is a flowchart for explaining a specific operation of an electronic device according to the present disclosure.
  • FIG. 6 is a flowchart illustrating an operation of an electronic device according to the present disclosure.
  • FIG. 7 is a perspective view illustrating an external appearance of an electronic device 700 according to an embodiment of the present disclosure.
  • FIG. 8 is a block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • FIG. 9 is a perspective view illustrating an external appearance of an electronic device 700 according to other embodiments of the present disclosure.
  • FIG. 10 is a perspective view illustrating an external appearance of an electronic device 700 according to still other embodiments of the present disclosure.
  • FIG. 11 is a perspective view illustrating an external appearance of an electronic device 700 according to another embodiment of the present disclosure.
  • FIGS. 12A and 12B are perspective views illustrating an external appearance of an electronic device 700 according to still other embodiments of the present disclosure.
  • FIG. 1 is a block diagram briefly illustrating the configuration of an electronic device 100 that projects a projection image according to the present disclosure.
  • the electronic device 100 may be a device of various types.
  • the electronic device 100 may be a projector device that enlarges and projects an image onto a wall or a screen.
  • the projector device may be an LCD projector or a digital light processing (DLP) projector using a digital micromirror device (DMD).
  • the electronic device 100 may also be implemented as a home or industrial display device, a lighting device used in daily life, a sound device including a sound module, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a wearable device, or a home appliance.
  • the electronic device 100 according to an embodiment of the present disclosure is not limited to the above devices, and the electronic device 100 may be implemented as an electronic device 100 having two or more functions of the above devices.
  • for example, the electronic device 100 may be used as a display device, a lighting device, or a sound device by turning off the projector function and turning on a lighting function or a speaker function according to manipulation of the processor, and may be used as an AI speaker including a microphone or a communication device.
  • the electronic device 100 may include a projection unit 110 , a memory 120 , a camera 130 and a processor 140 .
  • the configuration shown in FIG. 1 is merely an embodiment, and some configurations may be omitted and new configurations may be added.
  • the projection unit 110 is a component that projects an image to the outside.
  • the projection unit 110 may be implemented using various projection methods (e.g., a cathode-ray tube (CRT) method, a liquid crystal display (LCD) method, a digital light processing (DLP) method, a laser method, etc.).
  • the projection unit 110 may include various types of light sources.
  • the projection unit 110 may include at least one light source among a lamp, LED, and laser.
  • the projection unit 110 may output an image in a 4:3 aspect ratio, a 5:4 aspect ratio, or a 16:9 wide aspect ratio according to the purpose of the electronic device 100 or the user's settings, and, depending on the aspect ratio, may output images at various resolutions such as WVGA (854x480), SVGA (800x600), XGA (1024x768), WXGA (1280x720), WXGA (1280x800), SXGA (1280x1024), UXGA (1600x1200), and Full HD (1920x1080).
  • the projection unit 110 may perform various functions for adjusting an output image under the control of the processor 140 .
  • the projection unit 110 may perform functions such as zoom, keystone, quick corner (four corners) keystone, lens shift, and the like, which will be described in detail with reference to FIG. 8 later.
  • the memory 120 may store at least one command related to the electronic device 100 .
  • an operating system (O/S) for driving the electronic device 100 may be stored in the memory 120 .
  • various software programs or applications for operating the electronic device 100 may be stored in the memory 120 according to various embodiments of the present disclosure.
  • the memory 120 may include a semiconductor memory such as a flash memory or a magnetic storage medium such as a hard disk.
  • various software modules for operating the electronic device 100 may be stored in the memory 120 according to various embodiments of the present disclosure, and the processor 140 may control the operation of the electronic device 100 by executing the various software modules stored in the memory 120. That is, the memory 120 is accessed by the processor 140, and data may be read/written/modified/deleted/updated by the processor 140.
  • in the present disclosure, the term memory 120 may be used to include the memory 120, a ROM (not shown) and a RAM (not shown) in the processor 140, and a memory card (not shown) mounted in the electronic device 100 (e.g., a micro SD card or a memory stick).
  • the camera 130 is a component for capturing still images and moving images.
  • the camera 130 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the camera 130 may capture a projected image projected by the projection unit 110 and obtain a captured image. That is, the lens of the camera 130 may be disposed to face a direction in which a light source is projected through the projection unit 110 .
  • however, the present invention is not limited thereto, and when the electronic device 100 is implemented to adjust the projection angle or direction of the projection image while the main body is fixed, the camera 130 may be implemented to move according to the projection angle or projection direction of the projection image.
  • the processor 140 is electrically connected to the memory 120 to control the overall operation of the electronic device 100 . Specifically, the processor 140 may control the electronic device 100 by executing at least one command stored in the memory 120 .
  • the processor 140 may acquire, through the camera 130, the first image corresponding to the projection area 1000 onto which the projection image is to be projected. That is, the processor 140 may acquire the first image by capturing, through the camera 130, the projection area 1000 onto which the projection image is to be projected.
  • the projection area 1000 is an area where an image projected by the projection unit 110 is located, and may be any of various areas such as a wall surface, a ceiling surface, or a screen area in a house.
  • interference areas 1010, 1020, and 1030 representing a specific color may appear in partial areas of the projection area 1000 when light emitted from a plurality of external light sources is reflected onto the projection area. That is, the interference areas may include an interference area 1010 caused by light emitted from the light stand device 10, an interference area 1020 caused by light emitted from the bulb 20, and an interference area 1030 caused by external light entering through the window 30.
  • although each interference area is shown as a single divided area within the projection area 1000 in FIG. 2, the present invention is not limited thereto, and the interference area may be the entire area of the projection area 1000 without being divided.
  • the processor 140 may acquire the first image by capturing, through the camera 130, the projection area 1000 including the interference areas 1010, 1020, and 1030.
  • the processor 140 may control the camera 130 to capture the projection area 1000 based on throw ratio and offset information of the projection unit 110 .
  • the throw ratio means the ratio of the projection distance to the width of the projection area in the electronic device 100 that projects an image. A throw ratio of 1 or more may be defined as a long throw, a throw ratio of less than 1 as a short throw, and a throw ratio of less than 0.4 as an ultra short throw.
  • offset information means information on the height difference between the lens that projects the projection image and the projection area on which the projection image is projected, and the value of the offset information may increase as this height difference increases.
  • the throw ratio and offset information of the projection unit 110 may be preset when the electronic device 100 is manufactured. Accordingly, the processor 140 may control the camera 130, based on the predetermined throw ratio and offset information, so that the projection area 1000 is included in the first image captured by the camera 130.
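  • As an illustration of the throw-ratio definition above, the following sketch (the function names are illustrative, not from the present disclosure) classifies a projector by throw ratio and derives the projected image width implied by the definition; only the 1.0 and 0.4 thresholds come from the text:

```python
def classify_throw(throw_ratio: float) -> str:
    """Classify a projector by throw ratio = projection distance / image width."""
    if throw_ratio >= 1.0:
        return "long throw"
    if throw_ratio >= 0.4:
        return "short throw"
    return "ultra short throw"

def image_width(distance_m: float, throw_ratio: float) -> float:
    """Projected image width implied by throw ratio = distance / width."""
    return distance_m / throw_ratio

print(classify_throw(1.2))    # long throw
print(image_width(2.4, 1.2))  # 2.0 (meters)
```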
  • the processor 140 may perform calibration between the projection area 1000 and a perception area that can be photographed by the camera 130 based on distance information between the electronic device 100 and the projection area. Also, as an example, the processor 140 may obtain a first image by cropping the projection area 1000 from an image captured by the camera 130 .
  • however, the present disclosure is not limited thereto, and when the first image is obtained through an external camera instead of the camera 130 of the electronic device 100, the processor 140 may crop the image so that the recognition area of the external camera matches the projection area 1000 as closely as possible, and then perform calibration between the projection area 1000 and the recognition area.
  • the processor 140 may obtain first projection area information based on the first image.
  • the projection area information may include color information and light quantity information of the projection area 1000 .
  • the projection area information may include color information and light quantity information for each pixel of the projection area 1000 in the first image.
  • projection area information for one pixel of the projection area 1000 in the first image may be expressed as L(x,y), where L is brightness information of one pixel and may be in units of nits.
  • (x, y) is color information represented by the pixel and may mean a coordinate value on the CIE color coordinate system.
  • the projection area information may include various pieces of information for correcting a projection image on which the projection area 1000 is projected.
  • the first projection area information may be projection area information obtained based on the first image, captured by the camera 130, of the projection area 1000 while no projection image is projected by the projection unit 110.
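  • As a minimal sketch of how the first projection area information L(x, y) might be derived from a captured first image, assuming the capture has already been cropped to the projection area and converted to the CIE xyY space (the conversion step and array layout are assumptions, not the disclosed implementation):

```python
import numpy as np

def projection_area_info(xyY: np.ndarray):
    """Per-pixel projection area info L(x, y), plus the averages
    used later for threshold comparisons.

    xyY: H x W x 3 array with channels (x, y, Y), where (x, y) are
    CIE chromaticity coordinates and Y is assumed calibrated to nits.
    """
    x, y, Y = xyY[..., 0], xyY[..., 1], xyY[..., 2]
    per_pixel = {"L": Y, "xy": np.stack([x, y], axis=-1)}
    average = {"L": float(Y.mean()), "xy": (float(x.mean()), float(y.mean()))}
    return per_pixel, average
```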
  • the processor 140 may control the projection unit 110 to project a projection image based on the information on the first projection area. Specifically, the processor 140 may correct the projection image using the first projection area information and control the projection unit 110 to project the corrected projection image.
  • the processor 140 may correct each area of the projection image using projection area information corresponding to each pixel of the projection area 1000 included in the first projection area information. That is, the projection image can be corrected using projection area information for each pixel of the projection area 1000 included in the first projection area information. A specific embodiment for this will be described later with reference to FIGS. 3A to 3C.
  • the processor 140 may project a projection image based on the information on the first projection area, but the present disclosure is not limited thereto.
  • further, the processor 140 may obtain second projection area information based on the at least one projected test image, and control the projection unit 110 to project a corrected image obtained by correcting the projected test image based on the first projection area information and the second projection area information.
  • the second projection area information is projection area information obtained based on images obtained by photographing at least one test image projected on the basis of the first projection area information, and will be described in detail later.
  • the processor 140 may identify whether to project a test image based on the information on the first projection area.
  • the processor 140 may identify whether to project a test image based on a difference between first projection area information and preset projection area information.
  • the preset projection area information may refer to projection area information indicating the color reproducibility that the projection unit 110 is to express through the projection image.
  • the first projection area information for comparison with the preset projection area information may be an average value of color information and light amount information for each pixel of the projection area 1000 in the first image.
  • as an example, when the difference between the first projection area information and the preset projection area information is greater than or equal to a first ratio (e.g., 20% or more), it may be identified that the test image is to be projected. In addition, when the difference between the first projection area information and the preset projection area information is less than the first ratio (e.g., less than 20%), it may be identified that the test image is not to be projected.
  • the processor 140 may control the projection unit 110 to project at least one test image.
  • the processor 140 may control the projection unit 110 to sequentially project a RED color test image, a GREEN color test image, and a BLUE color test image onto the projection area 1000 .
  • the RED color test image, the GREEN color test image, and the BLUE color test image may be sequentially projected in various orders.
  • the processor 140 may acquire the at least one second image by capturing, through the camera 130, the projection area 1000 on which the at least one test image is projected. Specifically, the processor 140 may obtain a 2-1 image by capturing the projection area 1000 on which the RED color test image is projected, a 2-2 image by capturing the projection area 1000 on which the GREEN color test image is projected, and a 2-3 image by capturing the projection area 1000 on which the BLUE color test image is projected.
  • the processor 140 may obtain second projection area information based on at least one second image. For example, an average value of projection area information obtained from each of the 2-1, 2-2, and 2-3 images may be identified as the second projection area information.
  • that is, the first projection area information is projection area information for an area on which no image is projected, while the second projection area information means projection area information for the areas on which the RED, GREEN, and BLUE color test images are projected.
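  • A sketch of obtaining the second projection area information by averaging the information from the three test-image captures, under the same assumed xyY array layout as above:

```python
import numpy as np

def second_projection_area_info(img_r, img_g, img_b):
    """Average the projection area info of the 2-1, 2-2 and 2-3 images.

    Each argument is an H x W x 3 xyY capture of the projection area
    while the RED, GREEN and BLUE test images are projected.
    """
    stack = np.stack([img_r, img_g, img_b])  # 3 x H x W x 3
    return stack.mean(axis=0)                # per-pixel averaged info
```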
  • the processor 140 may control the projection unit 110 to project a projection image based on the first projection area information and the second projection area information.
  • as an embodiment, the processor 140 may control the projection unit 110 to project a projection image whose color expression method is corrected based on the first projection area information and the second projection area information. That is, when a projection image whose color expression method is corrected based on the first projection area information and the second projection area information is projected, the image finally expressed in the projection area 1000 can have color reproducibility corresponding to the preset projection area information.
  • in the above description, the test images are a RED color test image, a GREEN color test image, and a BLUE color test image, but the present disclosure is not limited thereto. That is, as another embodiment according to the present disclosure, when the first projection area information is obtained from the first image, the processor 140 may control the projection unit 110 to project at least one test image based on the first image.
  • the processor 140 may identify the color of the at least one test image based on the first projection area information obtained from the first image, and control the projection unit 110 to project the at least one test image based on the identified color. For example, the processor 140 may control the projection unit 110 to project a test image of a color that affects the projection area 1000 similarly to the external light source, using an average value of the color information and light amount information for each pixel within the projection area 1000 included in the first projection area information.
  • however, the present invention is not limited thereto, and the processor 140 may control the projection unit 110 to project test images of a plurality of colors corresponding to the respective pixels in the projection area 1000, based on the color information and light amount information of each pixel.
  • the processor 140 may control the projection unit 110 to project a projection image whose color expression method is corrected based on the first projection area information. That is, as described above, the processor 140 may correct the projection image using the first projection area information and control the projection unit 110 to project the corrected projection image.
  • however, the present invention is not limited thereto, and when the difference between the first projection area information and the preset projection area information is less than a second ratio (e.g., less than 10%), the processor 140 may control the projection unit 110 to project the projection image without correcting its color expression method.
  • in the above description, the processor 140 is controlled according to whether the difference between the first projection area information and the preset projection area information exceeds the first ratio or the second ratio, but the present disclosure is not limited thereto, and the processor 140 may be controlled according to whether the difference between the first projection area information and the preset projection area information exceeds a first value or a second value. An embodiment of the first ratio and the second ratio will be described later with reference to FIG. 5.
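  • The threshold logic described here (and elaborated in FIG. 5) can be sketched as follows, where diff_ratio is the relative difference between the averaged first projection area information and the preset information; only the 20% and 10% example figures come from the text:

```python
def choose_correction(diff_ratio: float,
                      first_ratio: float = 0.20,
                      second_ratio: float = 0.10) -> str:
    """Decide among projecting test images, correcting from the first
    image alone, or projecting without correction (sketch)."""
    if diff_ratio >= first_ratio:
        return "project test images, then correct"         # S535-S550
    if diff_ratio >= second_ratio:
        return "correct using first projection area info"  # S530
    return "project without correction"                    # S525
```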
  • the aforementioned process of controlling the processor 140 may be performed when the power of the electronic device 100 is turned on.
  • the present invention is not limited thereto, and the processor 140 may perform the above-described control process when a change in a light source in the house where the electronic device 100 is located is detected.
  • the processor 140 may obtain light emission information of an indoor lighting device.
  • the light emission information may include information on whether the lighting device emits light and information on light intensity.
  • the indoor lighting device may refer to a lighting device that affects the projection area 1000 .
  • the lighting device may include a light stand device 10, a light bulb 20, etc. as shown in FIG. 2, but is not limited thereto, and may include various lighting devices outputting a light source such as an AI speaker.
  • the lighting device may include a smart curtain device installed on the window 30 in the house, and in this case, light emission information may include operation information of the smart curtain.
  • the processor 140 may receive light emission information from the indoor lighting device.
  • the processor 140 may control the projection unit 110 to project a test image.
  • the present invention is not limited thereto, and when a change in the light source in the house is detected, the processor 140 may control the projection unit 110 to project a projection image including a UI notifying that image correction is required. Further, the processor 140 may control the projection unit 110 to project a test image when a user input for image correction is received.
  • the control process of the processor 140 described above may be performed whenever the power of the electronic device 100 is turned on, but the present disclosure is not limited thereto.
  • the processor 140 may obtain light emission information of the indoor lighting device. In addition, the processor 140 may match correction information for the corrected projection image with the acquired light emission information and store it in the memory 120 .
  • as an embodiment, the processor 140 may receive the light emission information of the indoor lighting device when the power of the electronic device 100 is turned on. If correction information matching the received light emission information is pre-stored, the processor 140 may correct the projection image based on the pre-stored correction information corresponding to the received light emission information, and control the projection unit 110 to project the corrected projection image.
  • that is, by storing in the memory 120 the correction information for the projection image in which the interference of the existing external lighting devices is reflected, the projection image can be corrected without a test process.
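  • A sketch of matching stored correction information to lighting-device emission information so that a previously computed correction can be reused without the test process; the key structure is an assumption for illustration:

```python
# Correction info cached per lighting state (hypothetical key format).
correction_store: dict[tuple, dict] = {}

def emission_key(devices: list[dict]) -> tuple:
    """Build a lookup key from (device id, on/off, intensity) tuples."""
    return tuple(sorted((d["id"], d["on"], d["intensity"]) for d in devices))

def save_correction(devices: list[dict], correction: dict) -> None:
    correction_store[emission_key(devices)] = correction

def lookup_correction(devices: list[dict]):
    # None means no match is stored, so the test process must run.
    return correction_store.get(emission_key(devices))
```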
  • FIG. 3A is a diagram illustrating a first image obtained by capturing a projection area on which no projection image is projected, according to the present disclosure.
  • FIG. 3B is a diagram for explaining an embodiment of correcting a projection image based on the first image, according to the present disclosure.
  • FIG. 3C is a diagram for explaining an embodiment in which a corrected projection image is projected based on the first image, according to the present disclosure.
  • the electronic device 100 may acquire the first image 300 by photographing the projection area 1000 where the projection image is to be projected before the projection image is projected, as shown in FIG. 3A.
  • the first image 300 may include interference areas 1010, 1020, and 1030 representing a specific color, as light emitted from an external light source is reflected onto the projection area 1000, as described with reference to FIG. 2.
  • in FIG. 3A, each interference area is shown as a single divided area within the projection area 1000, but the present disclosure is not limited thereto. That is, the interference area may be the entire area of the projection area 1000 without being divided.
  • the electronic device 100 may obtain first projection area information based on the first image 300 .
  • the first projection area information may include color information and light amount information of the projection area 1000 included in the first image 300 .
  • the projection area information includes projection area information of each pixel corresponding to the projection area, and the projection area information may include brightness information and color information.
  • the first projection area information obtained through the first image 300 may include projection area information of a first representative pixel 1000-1 and projection area information of a second representative pixel 1000-2.
  • the first representative pixel 1000 - 1 and the second representative pixel 1000 - 2 may mean a minimum unit in which a projected image within the projection area 1000 is expressed by the projection unit 110 .
  • the electronic device 100 may project projection images of N pixels on the projection area 1000 .
  • one pixel among the N pixels may be the first representative pixel 1000-1.
  • the projection area 1000 within the first image 300 may include M pixels, depending on the quality of the camera 130 that captures the first image 300 and the size of the projection area 1000 within the first image 300. Accordingly, the electronic device 100 may match the N pixels corresponding to the projection image with the M pixels corresponding to the projection area 1000 of the first image 300.
  • the electronic device 100 may match a plurality of M pixels corresponding to the projection area 1000 of the first image 300 to one of the N pixels corresponding to the projection image.
  • that is, the electronic device 100 may identify, among the M pixels included in the first image 300, at least one pixel corresponding to the first pixel of the N pixels corresponding to the projection image, and identify the identified at least one pixel as the first representative pixel 1000-1. The electronic device 100 may then obtain the average values of the brightness information and color information of the at least one pixel included in the first representative pixel 1000-1 in the first image 300 as the projection area information of the first representative pixel 1000-1. Likewise, the electronic device 100 may identify, among the M pixels included in the first image 300, at least one pixel corresponding to the second pixel of the N pixels corresponding to the projection image, identify it as the second representative pixel 1000-2, and obtain the average values of the brightness information and color information of the at least one pixel included in the second representative pixel 1000-2 in the first image 300 as the projection area information of the second representative pixel 1000-2. By repeating this process, the electronic device 100 may obtain projection area information for each of a plurality of representative pixels included in the projection area 1000 in the first image 300.
  • the electronic device 100 may obtain projection information of each of the representative pixels based on projection area information for each of a plurality of representative pixels and predetermined projection area information to be expressed through a projection image.
  • projection area information may be expressed as a brightness value together with an X-axis value and a Y-axis value on the CIE coordinate system, where the brightness value is the brightness information of one pixel and may be in units of nits. For example, if the projection area information of the first representative pixel 1000-1 is l1(x1, y1) and the preset projection area information corresponding to the first representative pixel 1000-1 is L1(X1, Y1), the electronic device 100 may obtain the projection information of the first pixel as in Equation 1.
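  • Equation 1 itself is not reproduced in this text. A plausible reconstruction, assuming it takes the per-pixel difference between the preset projection area information and the measured first projection area information, would be:

$$\Delta_1 = L_1(X_1, Y_1) - l_1(x_1, y_1) = \big(L_1 - l_1,\; X_1 - x_1,\; Y_1 - y_1\big)$$

This assumed form is consistent with the statement below that pixels whose projection information is negative are not corrected.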
  • the electronic device 100 may obtain the projection information for each pixel as in Equation 1, and correct each pixel of the projection image 30 based on the acquired per-pixel projection information, as shown in FIG. 3B. That is, the electronic device 100 may correct the projection image so that the image is projected using the brightness information and color information included in the per-pixel projection information acquired through Equation 1. In the case of a pixel whose projection information is negative, the electronic device 100 may not perform correction on the corresponding pixel. Then, the electronic device 100 may acquire the corrected projection image 35 and project it as shown in FIG. 3C.
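  • Under the assumed form of Equation 1 above, the per-pixel correction with the negative-value skip might look like the following sketch (the array layout and the additive correction are assumptions):

```python
import numpy as np

def correct_image(projection_image, measured, preset):
    """Correct each pixel by the assumed Equation 1 delta.

    measured, preset: H x W x 3 arrays of (L, x, y) per representative
    pixel; projection_image: H x W x 3 in the same space. Pixels whose
    correction would be negative are left untouched, as described above.
    """
    delta = preset - measured  # assumed Equation 1
    correctable = (delta >= 0).all(axis=-1, keepdims=True)
    return np.where(correctable, projection_image + delta, projection_image)
```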
  • the electronic device 100 obtains and projects the corrected projection image 35 as shown in FIG. 3C, so that the projected image projected on the projection area 1000 can be expressed to correspond to preset projection area information.
  • FIG. 4A is a diagram illustrating an image obtained by capturing a projection area on which no projection image is projected, according to the present disclosure.
  • FIG. 4B is a diagram illustrating at least one second image obtained by capturing a projection area on which at least one test image is projected, according to the present disclosure.
  • FIG. 4C is a diagram for explaining an embodiment of correcting a projection image based on the first image and the at least one second image, according to the present disclosure.
  • FIG. 4D is a diagram for explaining an embodiment in which a corrected projection image is projected based on the first image and the at least one second image, according to the present disclosure.
  • FIGS. 4A to 4D are examples in which a projection image is corrected based on a first image captured before the projection image is projected and at least one second image in which the projection area where the test image is projected is captured.
  • the electronic device 100 may obtain a first image 400 by photographing the projection area 1000 where the projection image is to be projected before the projection image is projected, as shown in FIG. 4A.
  • the first image 400 may include interference areas 1010, 1020, and 1030 representing a specific color, as light emitted from an external light source is reflected by the projection area 1000, as described with reference to FIG. 2.
  • the electronic device 100 may obtain first projection area information based on the first image 400 .
  • the first projection area information may include color information and light amount information of the projection area 1000 included in the first image 400 .
  • the first projection area information acquired through the first image 400 may include projection area information L(x,y) for each pixel.
  • the electronic device 100 may project at least one test image.
  • the electronic device 100 may control the projection unit 110 to sequentially project a RED color test image, a GREEN color test image, and a BLUE color test image onto the projection area 1000 .
  • for example, the RED color test image, the GREEN color test image, and the BLUE color test image may be sequentially projected in various orders.
  • the electronic device 100 may obtain the at least one second image by photographing, through a camera, the projection area 1000 on which the at least one test image is projected, as shown in FIG. 4B. Specifically, referring to FIG. 4B, the electronic device 100 may obtain a 2-1 image 410 by capturing the projection area 1000 on which the RED color test image is projected, a 2-2 image 420 by capturing the projection area 1000 on which the GREEN color test image is projected, and a 2-3 image 430 by capturing the projection area 1000 on which the BLUE color test image is projected.
  • the processor 140 may obtain second projection area information based on at least one second image.
  • second projection area information including projection area information obtained from each of the 2-1 image 410, 2-2 image 420, and 2-3 image 430 may be obtained.
  • the electronic device 100 may express the color corresponding to the test image as reference coordinate values for each of RED, GREEN, and BLUE on the CIE color coordinate system. That is, as an example, the color corresponding to the test image may be expressed as color information about coordinate values of [R(x1,y1), G(x2,y2), B(x3,y3)].
  • that is, the color corresponding to the test image can be generated by mixing the RED color corresponding to (x1, y1), the GREEN color corresponding to (x2, y2), and the BLUE color corresponding to (x3, y3).
  • the electronic device 100 may correct the projection image through the difference between the color information [R(x1,y1), G(x2,y2), B(x3,y3)] corresponding to the test images and the projection area information of the images captured while the test images corresponding to that color information are projected.
  • that is, the first projection area information is projection area information for an area on which no image is projected, while the second projection area information may include projection area information for each of the areas onto which the RED, GREEN, and BLUE color test images are projected.
  • the electronic device 100 may correct the projection image based on the first projection area information and the second projection area information and project the corrected projection image.
  • specifically, the electronic device 100 may correct the projection image using the first projection area information and the second projection area information acquired from the first image 400, the 2-1 image 410, the 2-2 image 420, and the 2-3 image 430, as shown in FIG. 4C.
  • the electronic device 100 may project the corrected projection image onto the projection area 1000 as shown in FIG. 4D, so that the projected image projected on the projection area 1000 may be expressed to correspond to preset projection area information.
  • the projection image may be corrected based on the first projection area information and the second projection area information.
  • that is, the electronic device 100 may project a test image of a color similar to that of the external light source affecting the projection area 1000, based on the first projection area information. In other words, the electronic device 100 may obtain color information of the external light source affecting the projection area 1000 based on the first projection area information, and project a test image based on the acquired color information.
  • the electronic device 100 may obtain the second projection area information based on an image of the projection area onto which the test image having a color similar to that of the external light source is projected. Then, the electronic device 100 may correct the projection image based on the first projection area information and the second projection area information, and project the corrected projection image so that the projection image projected on the projection area 1000 is expressed to correspond to the preset projection area information.
  • FIG. 5 is a flowchart for explaining a specific operation of an electronic device according to the present disclosure.
  • the electronic device 100 may acquire a first image by capturing a projection area where a projection image is to be projected (S505). Specifically, before the projection image is projected by the electronic device 100, the electronic device 100 may acquire the first image by capturing a projection area where the projection image is to be projected.
  • the electronic device 100 may obtain first projection area information based on the first image (S510).
  • the electronic device 100 may identify whether a difference between the first projection area information and the preset projection area information is greater than or equal to a first ratio (S515).
  • specifically, the electronic device 100 may identify whether the difference between the average value of the color coordinate values of each pixel included in the first projection area information and the average value of the color coordinate values of each pixel included in the preset projection area information is greater than or equal to the first ratio.
  • the first ratio may be a value preset by the user or the manufacturer of the electronic device 100.
  • if the difference is less than the first ratio, the electronic device 100 may identify whether the difference between the first projection area information and the preset projection area information is greater than or equal to a second ratio (S520). As an embodiment, the electronic device 100 may identify whether the difference between the average value of the color coordinate values of each pixel included in the first projection area information and the average value of the color coordinate values of each pixel included in the preset projection area information is greater than or equal to the second ratio.
  • the second ratio is a value smaller than the value of the first ratio, and may be a value preset by the user or the manufacturer of the electronic device 100 .
  • if the difference is identified as being less than the second ratio, the electronic device 100 may project the projection image without image correction (S525). That is, the electronic device 100 may project the projection image as it is, without correcting its color coordinates.
  • if the difference is identified as being greater than or equal to the second ratio, the electronic device 100 may project a corrected projection image based on the first projection area information (S530).
  • if the difference between the first projection area information and the preset projection area information is identified as being greater than or equal to the first ratio, the electronic device 100 may project at least one test image (S535). That is, the electronic device 100 may project the at least one test image onto the projection area captured in the first image.
  • the electronic device 100 may obtain at least one second image by capturing a projection area where at least one test image is projected (S540).
  • the projection area included in the second image and the projection area included in the first image may be the same area.
  • the electronic device 100 may acquire second projection area information based on at least one second image (S545).
  • the electronic device 100 may identify an average value of projection area information of each of at least one second image as second projection area information.
  • the electronic device 100 may project a corrected projection image based on the first projection area information and the second projection area information (S550).
  • the electronic device 100 may correct the projection image based on projection area information obtained by applying a preset weight to each of the first projection area information and the second projection area information.
  • the preset weight may be set in advance by the manufacturer or appropriately changed by a user's control command for changing the color reproducibility.
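  • The weighted combination mentioned above might be sketched as a simple blend, where the scalar weight stands in for the preset value (a scalar blend is an assumption):

```python
def combined_info(first_info, second_info, weight: float = 0.5):
    """Blend first and second projection area info with a preset weight
    (sketch); `weight` may be changed by a user's control command."""
    return (1.0 - weight) * first_info + weight * second_info
```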
  • FIG. 6 is a flowchart illustrating an operation of an electronic device according to the present disclosure.
  • the electronic device 100 may acquire a first image by photographing a projection area where a projection image is to be projected (S610). Specifically, before the projection image is projected by the electronic device 100, the electronic device 100 may obtain a first image by capturing a projection area where the projection image is to be projected through a camera.
  • the electronic device 100 may obtain first projection area information based on the first image (S620).
  • the first projection area information may include at least one of color information and light quantity information on the projection area.
  • the electronic device 100 may identify whether to project a test image based on the first projection area information (S630). As an example, the electronic device 100 may identify whether to project the test image based on the difference between the first projection area information and the preset projection area information. That is, when the difference between the first projection area information and the preset projection area information is relatively large, the electronic device 100 may identify that the test image is to be projected.
  • if it is identified that the test image is not to be projected, the electronic device 100 may project the projection image based on the first projection area information. That is, the electronic device 100 may correct the projection image using at least one of the color information and light quantity information about the projection area included in the first projection area information, and project the corrected projection image.
  • the electronic device 100 may project at least one test image (S640).
  • the electronic device 100 may acquire at least one second image by capturing a projection area where at least one test image is projected (S650).
  • the electronic device 100 may obtain second projection area information based on at least one second image (S660).
  • the second projection area information may also include at least one of color information and light quantity information about the projection area on which the test image is projected.
  • the electronic device 100 may project a projection image based on the first projection area information and the second projection area information (S670). For example, the electronic device 100 may adjust the color expression method of the projection image based on the first projection area information and the second projection area information, and project the adjusted projection image.
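  • Putting steps S610 to S670 together, the overall control flow of FIG. 6 can be sketched as follows; the device methods are hypothetical names standing in for the operations described above:

```python
def run_correction_flow(device) -> None:
    """Control flow mirroring S610-S670 (method names are assumptions)."""
    first_img = device.capture_projection_area()          # S610
    first_info = device.projection_area_info(first_img)   # S620
    if device.should_project_test(first_info):            # S630
        device.project_test_images()                      # S640
        second_imgs = device.capture_test_projections()   # S650
        second_info = device.average_info(second_imgs)    # S660
        device.project(device.correct(first_info, second_info))  # S670
    else:
        device.project(device.correct_with(first_info))
```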
  • referring to FIG. 7, the electronic device 700 may include a head 703, a main body 705, a projection lens 710, a connector 730, or a cover 707.
  • the electronic device 700 may be various types of devices.
  • the electronic device 700 may be a projector device that enlarges and projects an image onto a wall or a screen.
  • the projector device may be an LCD projector or a digital light processing (DLP) projector using a digital micromirror device (DMD).
  • the electronic device 700 may also be implemented as a home or industrial display device, a lighting device used in daily life, a sound device including a sound module, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a wearable device, or a home appliance.
  • the electronic device 700 according to an embodiment of the present disclosure is not limited to the above devices, and the electronic device 700 may be implemented as an electronic device 700 having two or more functions of the above devices.
  • for example, the electronic device 700 may be used as a display device, a lighting device, or a sound device by turning off the projector function and turning on a lighting function or a speaker function according to manipulation of the processor, and may be used as an AI speaker including a microphone or a communication device.
  • the main body 705 is a housing forming an exterior and may support or protect components (eg, the components shown in FIG. 8 ) of the electronic device 700 disposed inside the main body 705 .
  • the body 705 may have a structure close to a cylindrical shape as shown in FIG. 7 .
  • the shape of the main body 705 is not limited thereto, and according to various embodiments of the present disclosure, the main body 705 may be implemented in various geometric shapes such as a column, a cone, and a sphere having a polygonal cross section.
  • the size of the main body 705 may be a size that a user can hold or move with one hand, may be implemented in a very small size for easy portability, and may be implemented in a size that can be placed on a table or coupled to a lighting device.
  • the material of the main body 705 may be a matte metal or synthetic resin so as not to be stained with the user's fingerprints or dust, or the exterior of the main body 705 may have a smooth glossy finish.
  • a friction area may be formed on a portion of the exterior of the body 705 so that a user can grip and move the body 705 .
  • a bent gripping part or a support 708 (see FIG. 9) that the user can grip may be provided on at least a portion of the main body 705.
  • the projection lens 710 is formed on one surface of the body 705 to project light passing through the lens array to the outside of the body 705 .
  • the projection lens 710 of various embodiments may be an optical lens coated with low dispersion to reduce chromatic aberration.
  • the projection lens 710 may be a convex lens or a condensing lens, and the projection lens 710 according to an exemplary embodiment may adjust a focus by adjusting positions of a plurality of sub-lenses.
  • the head 703 is provided to be coupled to one surface of the body 705 to support and protect the projection lens 710 .
  • the head 703 may be coupled to the body 705 so as to be swivelable within a predetermined angular range based on one surface of the body 705 .
  • the head 703 is automatically or manually swiveled by a user or a processor to freely adjust the projection angle of the projection lens 710 .
  • in one embodiment, the head 703 may be coupled to the main body 705 and include a neck extending from the main body 705, so that the head 703 can be tilted to adjust the projection angle of the projection lens 710.
  • the electronic device 700 may project light or an image to a desired location by adjusting the direction of the head 703 and the emission angle of the projection lens 710 while the position and angle of the main body 705 are fixed.
  • the head 703 may include a handle that the user can grasp after rotating in a desired direction.
  • a plurality of openings may be formed in the outer circumferential surface of the main body 705, and audio output from the audio output unit may be emitted to the outside of the main body 705 of the electronic device 700 through the plurality of openings.
  • the audio output unit may include a speaker, and the speaker may be used for general purposes such as multimedia playback, recording playback, and audio output.
  • a heat dissipation fan (not shown) may be provided inside the main body 705, and when the heat dissipation fan is driven, air or heat inside the main body 705 can be discharged through the plurality of openings. Therefore, the electronic device 700 can discharge heat generated by its driving to the outside and prevent the electronic device 700 from overheating.
  • the connector 730 may connect the electronic device 700 with an external device to transmit/receive electrical signals or receive power from the outside.
  • the connector 730 may be physically connected to an external device.
  • the connector 730 may include an input/output interface, and may connect communication with an external device or receive power through a wired or wireless method.
  • the connector 730 may include an HDMI connection terminal, a USB connection terminal, an SD card accommodating groove, an audio connection terminal, or a power outlet, or may include a Bluetooth, Wi-Fi, or wireless charging connection module for wireless connection to an external device.
  • the connector 730 may have a socket structure connected to an external lighting device, and may be connected to a socket receiving groove of the external lighting device to receive power.
  • the size and standard of the connector 730 having a socket structure may be variously implemented in consideration of a receiving structure of a coupleable external device.
  • the diameter of the junction of the connector 730 may be implemented as 26 mm; in this case, the electronic device 700 may replace a conventionally used light bulb and be coupled to an external lighting device such as a stand. Meanwhile, when fastened to a socket located on an existing ceiling, the electronic device 700 projects from top to bottom, and if the electronic device 700 cannot rotate within the socket coupling, the screen cannot be rotated either.
  • accordingly, even when the electronic device 700 is socket-coupled to a ceiling stand and supplied with power, the head 703 can swivel from one side of the main body 705 and its emission angle can be adjusted, so that the electronic device 700 can project the screen to a desired position or rotate the screen.
  • the connector 730 may include a coupling sensor, and the coupling sensor may sense whether the connector 730 is coupled with an external device, the coupling state, or the coupling target, and transmit the detected value to the processor; driving of the electronic device 700 may be controlled based on the received detected value.
  • the cover 707 can be coupled to and separated from the main body 705 and can protect the connector 730 so that the connector 730 is not constantly exposed to the outside.
  • the shape of the cover 707 may have a shape continuous with the body 705 as shown in FIG. 7 , or may be implemented to correspond to the shape of the connector 730 .
  • the cover 707 can support the electronic device 700, and the electronic device 700 can be used by being coupled to the cover 707 and coupled to or mounted on an external cradle.
  • a battery may be provided inside the cover 707 .
  • a battery may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell or a fuel cell.
  • the electronic device 700 may include a protective case (not shown) to protect the electronic device 700 and facilitate transportation, a stand (not shown) supporting or fixing the main body 705, and a bracket (not shown) coupled to a wall or partition.
  • the electronic device 700 may provide various functions by being connected to various external devices using a socket structure.
  • the electronic device 700 may be connected to an external camera using a socket structure.
  • the electronic device 700 may use the projection unit 110 to provide an image stored in a connected camera or an image currently being captured.
  • the electronic device 700 may be connected to a battery module using a socket structure to receive power.
  • the electronic device 700 may be connected to an external device using a socket structure, but this is only an example, and may be connected to an external device using another interface (eg, USB, etc.).
  • the electronic device 800 includes a projection unit 110, a memory 120, a user interface 830, an input/output interface 840, an audio output unit 850, a power supply unit 860, and a processor 140. Meanwhile, the configuration shown in FIG. 8 is just one embodiment; some components may be omitted and new components may be added. Since the memory 120 and the processor 140 have been described with reference to FIG. 1, detailed descriptions thereof will be omitted.
  • the projection unit 110 is a component that projects an image to the outside.
  • the projection unit 110 may be implemented using various projection methods (eg, a cathode-ray tube (CRT) method, a liquid crystal display (LCD) method, a digital light processing (DLP) method, a laser method, etc.).
  • the principle of the CRT method is basically the same as that of a CRT monitor.
  • the CRT method enlarges the image with a lens in front of the cathode ray tube (CRT) and displays the image on the screen.
  • red, green, and blue cathode ray tubes can be separately implemented.
  • the LCD method is a method of displaying an image by transmitting light from a light source through a liquid crystal.
  • the LCD method is divided into a single-panel type and a three-panel type.
  • in the three-panel type, the light from the light source is separated into red, green, and blue by a dichroic mirror (a mirror that reflects only light of a specific color and passes the rest), passes through the liquid crystal, and then gathers in one place again.
  • the DLP method is a method of displaying an image using a DMD (Digital Micromirror Device) chip.
  • the projection unit of the DLP method may include a light source, a color wheel, a DMD chip, a projection lens, and the like.
  • Light output from a light source may exhibit a color while passing through a rotating color wheel.
  • the light that passed through the color wheel is input to the DMD chip.
  • the DMD chip includes numerous micromirrors and reflects light input to the DMD chip.
  • the projection lens may play a role of enlarging light reflected from the DMD chip to an image size.
  • the laser method includes a diode pumped solid state (DPSS) laser and a galvanometer.
  • the galvanometer includes a mirror and a high power motor to move the mirror at high speed.
  • a galvanometer can rotate a mirror at up to 40 KHz/sec.
  • the galvanometer is mounted according to the scanning direction. In general, since the projector scans in a plane, the galvanometer can also be arranged separately in the x and y axes.
  • the projection unit 110 may include various types of light sources.
  • the projection unit 110 may include at least one light source among a lamp, LED, and laser.
  • the projection unit 110 may output an image in a 4:3 aspect ratio, a 5:4 aspect ratio, or a 16:9 wide aspect ratio according to the purpose of the electronic device 800 or a user's setting, and depending on the aspect ratio, may output images at various resolutions such as WVGA (854*480), SVGA (800*600), XGA (1024*768), WXGA (1280*720), WXGA (1280*800), SXGA (1280*1024), UXGA (1600*1200), and Full HD (1920*1080).
  • the projection unit 110 may perform various functions for adjusting an output image under the control of the processor 140 .
  • the projection unit 110 may perform functions such as zoom, keystone, quick corner (4 corner) keystone, and lens shift.
  • the projection unit 110 may enlarge or reduce the image according to the distance from the screen (projection distance). That is, a zoom function may be performed according to the distance from the screen.
  • the zoom function may include a hardware method of adjusting the screen size by moving a lens and a software method of adjusting the screen size by cropping an image.
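  • as an illustration of the software (crop-based) zoom path, the following is a minimal sketch; the function name and the use of OpenCV for resampling are assumptions for illustration, not part of the disclosure.

```python
import cv2
import numpy as np

def digital_zoom(frame: np.ndarray, zoom: float) -> np.ndarray:
    """Software zoom: crop the centre 1/zoom of the frame, then scale
    it back to the original size. frame is an H x W x 3 image, zoom >= 1."""
    h, w = frame.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)   # size of the cropped region
    y0, x0 = (h - ch) // 2, (w - cw) // 2   # centred crop origin
    crop = frame[y0:y0 + ch, x0:x0 + cw]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)
```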
  • methods for adjusting the focus include a manual focus method and a motorized method.
  • the manual focus method refers to a method of manually adjusting the focus.
  • the motorized method refers to a method of automatically focusing using a motor built into the projector when a zoom function is performed.
  • the projection unit 110 may provide a digital zoom function through software, and may provide an optical zoom function that performs a zoom function by moving a lens through a driving unit.
  • the projection unit 110 may perform a keystone function. If the height does not match for front projection, the screen may be distorted upward or downward.
  • the keystone function means a function of correcting a distorted screen. For example, if distortion occurs in the left and right directions of the screen, it can be corrected using the horizontal keystone, and if distortion occurs in the vertical direction, it can be corrected using the vertical keystone.
  • the quick corner (4 corner) keystone function corrects the screen when the central area of the screen is normal but the corner area is not balanced.
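  • a minimal sketch of a quick corner (4 corner) style correction, assuming the four target corners have already been measured on the projection surface; the use of OpenCV's perspective transform is an illustrative choice, not named in the disclosure.

```python
import cv2
import numpy as np

def four_corner_keystone(frame: np.ndarray, dst_corners) -> np.ndarray:
    """Warp the frame so its corners land on dst_corners (four (x, y)
    points ordered TL, TR, BR, BL), compensating an unbalanced corner
    area before projection."""
    h, w = frame.shape[:2]
    src = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    dst = np.float32(dst_corners)
    m = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, m, (w, h))
```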
  • the lens shift function is a function that shifts the image as it is when the image falls outside the screen.
  • the projection unit 110 may provide zoom/keystone/focus functions by automatically analyzing the surrounding environment and the projection environment without user input. Specifically, the projection unit 110 may automatically provide the zoom/keystone/focus functions based on the distance between the screen and the electronic device 800 detected through sensors (a depth camera sensor, a distance sensor, an infrared sensor, an illuminance sensor, etc.), information about the space where the electronic device 800 is currently located, and the amount of ambient light.
  • the projection unit 110 may provide a lighting function using a light source.
  • the projection unit 110 may provide a lighting function by outputting a light source using LEDs.
  • the projection unit 110 may include one LED, and according to another embodiment, the electronic device may include a plurality of LEDs.
  • the projection unit 110 may output a light source using a surface-emitting LED according to an implementation example.
  • the surface-emitting LED may refer to an LED having a structure in which an optical sheet is disposed above the LED so that light sources are uniformly distributed and output. Specifically, when the light source is output through the LED, the light source may be evenly dispersed through the optical sheet, and the light source dispersed through the optical sheet may be incident to the display panel.
  • the projection unit 110 may provide a user with a dimming function for adjusting the intensity of the light source. Specifically, when a user input for adjusting the intensity of the light source is received from the user through the user interface 830 (eg, a touch display button or a dial), the projection unit 110 may control the LED to output the light source intensity corresponding to the received user input.
  • the projection unit 110 may provide a dimming function based on the content analyzed by the processor 140 without user input.
  • the projection unit 110 may control the LED to output the intensity of the light source based on information about currently provided content (eg, content type, content brightness, etc.).
  • the projection unit 110 may control the color temperature under the control of the processor 140 .
  • the processor 140 may control the color temperature based on content. Specifically, if the content is identified as being output, the processor 140 may obtain color information for each frame of the content for which output is determined. Also, the processor 140 may control the color temperature based on the obtained color information for each frame. Here, the processor 140 may obtain at least one primary color of a frame based on color information for each frame. Also, the processor 140 may adjust the color temperature based on the acquired primary color. For example, the color temperature controllable by the processor 140 may be classified into a warm type or a cold type.
  • for example, assume that a frame to be output (hereinafter referred to as an output frame) includes a scene in which a fire has occurred.
  • the processor 140 may identify (or obtain) that the primary color is red based on color information included in the current output frame. Also, the processor 140 may identify a color temperature corresponding to the identified main color (red). Here, the color temperature corresponding to red may be a warm type. Meanwhile, the processor 140 may use an artificial intelligence model to obtain color information or a primary color of a frame.
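  • the per-frame flow above (color information → primary color → warm or cold type) could be sketched as follows; the mean-channel heuristic and the two-way mapping are assumptions for illustration, since the disclosure leaves the classifier open (eg, an artificial intelligence model may be used instead).

```python
import numpy as np

def color_temperature_for_frame(frame: np.ndarray) -> str:
    """Classify an RGB frame (H x W x 3) as 'warm' or 'cold' from its
    primary color; a fire scene with a red primary color maps to warm."""
    mean_r, _, mean_b = frame.reshape(-1, 3).mean(axis=0)
    return "warm" if mean_r >= mean_b else "cold"
```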
  • the artificial intelligence model may be stored in the electronic device 800 (eg, the memory 120). According to another embodiment, the artificial intelligence model may be stored in an external server communicable with the electronic device 800 .
  • the electronic device 800 may control a lighting function in conjunction with an external device.
  • the electronic device 800 may receive lighting information from an external device.
  • the lighting information may include at least one of brightness information and color temperature information set by an external device.
  • the external device may be a device connected to the same network as the electronic device 800 (eg, an IoT device included in the same home/work network) or a device that is not on the same network as the electronic device 800 but can communicate with the electronic device (eg, a remote control server).
  • for example, assume that an external lighting device (an IoT device) included in the same network as the electronic device 800 outputs red light with a brightness of 50.
  • the external lighting device may directly or indirectly transmit lighting information (eg, information indicating that red light is output with a brightness of 50) to the electronic device 800 .
  • the electronic device 800 may control the output of the light source based on lighting information received from an external lighting device. For example, when lighting information received from an external lighting device includes information for outputting red light with a brightness of 50, the electronic device 800 may output red light with a brightness of 50.
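  • a sketch of this lighting hand-off, assuming the lighting information arrives as a small dictionary and that the projector exposes brightness and color setters; both the schema and the method names are illustrative placeholders.

```python
def apply_lighting_info(projector, lighting_info: dict) -> None:
    """Mirror an external lamp's state (eg, red light at brightness 50)
    on the projector's light source."""
    projector.set_brightness(lighting_info["brightness"])  # eg, 50
    projector.set_color(lighting_info["color"])            # eg, "red"
```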
  • the electronic device 800 may control a lighting function based on biometric information.
  • the processor 140 may obtain user's biometric information.
  • the biometric information may include at least one of the user's body temperature, heart rate, blood pressure, respiration, and electrocardiogram.
  • the biometric information may include various types of information in addition to the information described above.
  • an electronic device may include a sensor for measuring biometric information.
  • the processor 140 may obtain user's biometric information through a sensor and control the output of a light source based on the obtained biometric information.
  • the processor 140 may receive biometric information from an external device through the input/output interface 840 .
  • the external device may refer to a user's portable communication device (eg, a smart phone or a wearable device).
  • the processor 140 may obtain user's biometric information from an external device and control the output of the light source based on the obtained biometric information.
  • the electronic device may identify whether the user is sleeping, and if the user is identified as sleeping (or preparing for sleep), the processor 140 may control the output of the light source based on the user's biometric information.
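  • as a sketch of biometric-driven control, the following maps heart rate to an LED intensity; the thresholds and the use of heart rate alone are assumptions, since the disclosure lists several biometric signals without fixing a rule.

```python
def led_intensity_from_heart_rate(bpm: float, resting_bpm: float = 60.0) -> int:
    """Return an LED intensity (0-100), dimming as the heart rate
    approaches the resting rate (sleeping or preparing for sleep)."""
    if bpm <= resting_bpm * 1.05:
        return 10    # near sleep: strong dimming
    if bpm <= resting_bpm * 1.2:
        return 40    # winding down
    return 100       # awake: full lighting
```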
  • the user interface 830 may include various types of input devices.
  • the user interface 830 may include physical buttons.
  • the physical button may include a function key, a direction key (eg, a 4-direction key), or a dial button.
  • the physical button may be implemented as a plurality of keys.
  • the physical button may be implemented as one key.
  • the electronic device 800 may receive a user input in which one key is pressed for a critical period of time or longer.
  • the processor 140 may perform a function corresponding to the user input. For example, the processor 140 may provide a lighting function based on user input.
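  • the single-key input path could be sketched as below; the 1.5-second threshold stands in for the unspecified "critical period", and the lighting toggle is only an example action.

```python
LONG_PRESS_SEC = 1.5  # assumed threshold; the text only says "critical period"

def classify_press(pressed_at: float, released_at: float) -> str:
    """Distinguish a short press from a long press of the single key;
    a 'long' press might, for example, toggle the lighting function."""
    return "long" if released_at - pressed_at >= LONG_PRESS_SEC else "short"
```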
  • the user interface 830 may receive a user input using a non-contact method.
  • a method for controlling the electronic device regardless of physical force may be required.
  • the user interface 830 may receive a user gesture and perform an operation corresponding to the received user gesture.
  • the user interface 830 may receive a user's gesture through a sensor (eg, an image sensor or an infrared sensor).
  • the user interface 830 may receive a user input using a touch method.
  • the user interface 830 may receive a user input through a touch sensor.
  • the touch method may be implemented as a non-contact method.
  • the touch sensor may determine whether the user's body has approached within a critical distance.
  • the touch sensor may identify a user input even when the user does not contact the touch sensor.
  • the touch sensor may identify a user input in which a user contacts the touch sensor.
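  • the contact/non-contact distinction reduces to a threshold on the sensed distance, as in this sketch; the 20 mm "critical distance" is an assumed value.

```python
from typing import Optional

THRESHOLD_MM = 20.0  # assumed critical distance; none is given in the text

def touch_event(distance_mm: float) -> Optional[str]:
    """Interpret a touch-sensor distance reading: 'contact' on touch,
    'hover' when the user's body is within the critical distance,
    otherwise None (no input)."""
    if distance_mm <= 0.0:
        return "contact"
    if distance_mm <= THRESHOLD_MM:
        return "hover"
    return None
```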
  • the electronic device 800 may receive user input in various ways other than the above-described user interface.
  • the electronic device 800 may receive a user input through an external remote control device.
  • the external remote control device may be a remote control device corresponding to the electronic device 800 (eg, an electronic device-specific control device) or a user's portable communication device (eg, a smartphone or a wearable device).
  • the user's portable communication device may store an application for controlling the electronic device.
  • the portable communication device may obtain a user input through the stored application and transmit the acquired user input to the electronic device 800 .
  • the electronic device 800 may receive a user input from a portable communication device and perform an operation corresponding to a user's control command.
  • the electronic device 800 may receive a user input using voice recognition.
  • the electronic device 800 may receive a user's voice through a microphone included in the electronic device.
  • the electronic device 800 may receive a user's voice from a microphone or an external device.
  • the external device may acquire the user's voice through the microphone of the external device and transmit the acquired user's voice to the electronic device 800 .
  • the user's voice transmitted from the external device may be audio data or digital data obtained by converting the audio data (eg, audio data converted into a frequency domain).
  • the electronic device 800 may perform an operation corresponding to the received user voice.
  • the electronic device 800 may receive audio data corresponding to a user's voice through a microphone.
  • the electronic device 800 may convert the received audio data into digital data.
  • the electronic device 800 may convert the converted digital data into text data using a speech to text (STT) function.
  • the speech to text (STT) function may be performed directly in the electronic device 800, and according to another embodiment, the speech to text (STT) function may be performed in an external server.
  • the electronic device 800 may transmit digital data to an external server.
  • the external server may convert digital data into text data and obtain control command data based on the converted text data.
  • the external server may transmit control command data (at this time, text data may also be included) to the electronic device 800 .
  • the electronic device 800 may perform an operation corresponding to the user's voice based on the obtained control command data.
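  • a sketch of the two STT paths described above (on-device versus external server); the callables and the command table are illustrative stand-ins, not an actual assistant API.

```python
from typing import Callable, Optional

def handle_voice(audio: bytes,
                 stt_local: Callable[[bytes], str],
                 stt_server: Optional[Callable[[bytes], str]] = None) -> str:
    """Convert captured audio to text on-device or via a server,
    then map the recognized text to a control command."""
    text = (stt_server or stt_local)(audio)        # prefer the server path
    commands = {"zoom in": "ZOOM_IN", "lights on": "LIGHTING_ON"}
    return commands.get(text.lower().strip(), "UNKNOWN")
```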
  • the electronic device 800 may provide a voice recognition function using one assistant (or artificial intelligence assistant, eg, Bixby™, etc.), but this is only one embodiment, and a voice recognition function may be provided through a plurality of assistants.
  • the electronic device 800 may provide a voice recognition function by selecting one of a plurality of assistants based on a trigger word corresponding to the assistant or a specific key on the remote control.
  • the electronic device 800 may receive a user input using screen interaction.
  • Screen interaction may refer to a function of identifying whether a predetermined event occurs through an image projected on a screen (or a projection surface) by an electronic device and acquiring a user input based on the predetermined event.
  • the predetermined event may refer to an event in which a predetermined object is identified at a specific location (eg, a location where a UI for receiving a user input is projected).
  • the predetermined object may include at least one of a user's body part (eg, a finger), a pointing stick, and a laser point.
  • the electronic device 800 may identify that a user input for selecting the projected UI has been received. For example, the electronic device 800 may project a guide image to display a UI on the screen. And, the electronic device 800 can identify whether the user selects the projected UI. Specifically, the electronic device 800 may identify that the user has selected the projected UI when a predetermined event is identified at the location of the projected UI.
  • the projected UI may include at least one or more items.
  • the electronic device 800 may perform spatial analysis to identify whether a predetermined event is located at the location of the projected UI.
  • the electronic device 800 may perform spatial analysis through a sensor (eg, an image sensor, an infrared sensor, a depth camera sensor, a distance sensor, etc.).
  • the electronic device 800 may identify whether a predetermined event occurs at a specific location (the location where the UI is projected) by performing spatial analysis. And, if it is identified that a predetermined event occurs at a specific location (the location where the UI is projected), the electronic device 800 may identify that a user input for selecting a UI corresponding to the specific location has been received.
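  • at its simplest, the spatial-analysis step reduces to a hit test between the detected object and the projected UI's location, as in this sketch; coordinates are assumed to already be expressed in the projection area's frame.

```python
def ui_selected(event_xy, ui_rect) -> bool:
    """True if a predetermined event (eg, a fingertip found by the image
    sensor) occurred inside the projected UI rectangle (x, y, w, h)."""
    ex, ey = event_xy
    x, y, w, h = ui_rect
    return x <= ex <= x + w and y <= ey <= y + h

# usage: ui_selected((120, 80), (100, 60, 50, 40)) -> True
```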
  • the input/output interface 840 is a component for inputting/outputting at least one of an audio signal and a video signal.
  • the input/output interface 840 may receive at least one of audio and video signals from an external device and output a control command to the external device.
  • the input/output interface 840 may be implemented as at least one wired input/output interface among HDMI (High Definition Multimedia Interface), MHL (Mobile High-Definition Link), USB (Universal Serial Bus), USB C-type, DP (Display Port), Thunderbolt, VGA (Video Graphics Array) port, RGB port, D-SUB (D-subminiature), and DVI (Digital Visual Interface).
  • the wired input/output interface may be implemented as an interface for inputting/outputting only audio signals and an interface for inputting/outputting only video signals, or may be implemented as one interface for inputting/outputting both audio and video signals.
  • the electronic device 800 may receive data through a wired input/output interface, but this is merely an example, and power may be supplied through the wired input/output interface.
  • the electronic device 800 may receive power from an external battery through USB C-type or from an outlet through a power adapter.
  • an electronic device may receive power from an external device (eg, a laptop computer or a monitor) through a DP.
  • the input/output interface 840 may be implemented as a wireless input/output interface that performs communication using at least one of the Wi-Fi, Wi-Fi Direct, Bluetooth, Zigbee, 3rd Generation (3G), 3rd Generation Partnership Project (3GPP), and Long Term Evolution (LTE) communication methods.
  • the wireless input/output interface may be implemented as an interface for inputting/outputting only audio signals and an interface for inputting/outputting only video signals, or may be implemented as one interface for inputting/outputting both audio and video signals.
  • an audio signal may be input through a wired input/output interface, and a video signal may be input through a wireless input/output interface.
  • an audio signal may be input through a wireless input/output interface and a video signal may be input through a wired input/output interface.
  • the audio output unit 850 is a component that outputs an audio signal.
  • the audio output unit 850 may include an audio output mixer, an audio signal processor, and a sound output module.
  • the audio output mixer may synthesize a plurality of audio signals to be output into at least one audio signal.
  • the audio output mixer may combine an analog audio signal and another analog audio signal (eg, an analog audio signal received from the outside) into at least one analog audio signal.
  • the sound output module may include a speaker or an output terminal.
  • the sound output module may include a plurality of speakers; in this case, the sound output module may be disposed inside the main body, and sound emitted while covering at least a part of a diaphragm of the sound output module may be transmitted to the outside of the main body through a sound conduit (waveguide).
  • the sound output module includes a plurality of sound output units, and since the plurality of sound output units are symmetrically disposed on the exterior of the main body, sound can be emitted in all directions, that is, in all directions of 360 degrees.
  • the power supply unit 860 may receive power from the outside and supply power to various components of the electronic device 800 .
  • the power supply unit 860 may receive power through various methods.
  • the power supply unit 860 may receive power using the connector 730 shown in FIG. 7 .
  • the power supply unit 860 may receive power using a 120V DC power cord.
  • the present invention is not limited thereto, and the electronic device may receive power using a USB power cord or a wireless charging method.
  • the power supply unit 860 may receive power using an internal battery or an external battery.
  • the power supply unit 860 according to an embodiment of the present disclosure may receive power through an internal battery.
  • the power supply unit 860 may charge power of an internal battery using at least one of a 120V DC power cord, a USB power cord, and a USB C-Type power cord, and may receive power through the charged internal battery.
  • the power supply unit 860 according to an embodiment of the present disclosure may receive power through an external battery.
  • the power supply unit 860 may receive power through the external battery. That is, the power supply unit 860 may directly receive power from an external battery or may charge an internal battery through an external battery and receive power from the charged internal battery.
  • the power supply unit 860 may receive power using at least one of the plurality of power supply methods described above.
  • the electronic device 800 may have power consumption equal to or less than a predetermined value (eg, 43W) due to a socket type and other standards.
  • the electronic device 800 may vary power consumption to reduce power consumption when using a battery. That is, the electronic device 800 may vary power consumption based on a power supply method and power usage.
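  • a sketch of varying power consumption by supply method, using the 43 W figure above as the cap; the battery derating factors are assumed for illustration.

```python
from typing import Optional

POWER_CAP_W = 43.0  # socket-type limit mentioned above

def power_budget(source: str, battery_pct: Optional[float] = None) -> float:
    """Pick a power budget in watts from the supply method and, on
    battery, from the remaining charge."""
    if source == "battery" and battery_pct is not None:
        return POWER_CAP_W * (0.5 if battery_pct < 20 else 0.8)
    return POWER_CAP_W  # mains, socket, or USB power at full budget
```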
  • the electronic device 800 may provide various smart functions.
  • the electronic device 800 may be connected to a portable terminal device for controlling the electronic device 800, and a screen output from the electronic device 800 may be controlled through a user input received from the portable terminal device.
  • the portable terminal device may be implemented as a smartphone including a touch display; the electronic device 800 may receive and output screen data provided by the portable terminal device, and the screen output from the electronic device 800 may be controlled according to a user input entered on the portable terminal device.
  • the electronic device 800 may share content or music provided by the portable terminal device by connecting to the portable terminal device through various communication methods such as Miracast, Airplay, wireless DEX, and Remote PC.
  • the mobile terminal device and the electronic device 800 may be connected through various connection methods.
  • the portable terminal device may perform a wireless connection by searching for the electronic device 800 or the electronic device 800 may search for the portable terminal device and perform a wireless connection.
  • the electronic device 800 may output content provided by the portable terminal device.
  • when a predetermined gesture (eg, motion tap view) is detected through the display of the portable terminal device after the portable terminal device is placed near the electronic device 800 while specific content or music is being output on the portable terminal device, the electronic device 800 may output the content or music being output on the portable terminal device.
  • when, while specific content or music is being output from the portable terminal device, the portable terminal device is brought within a predetermined distance of the electronic device 800 (eg, non-contact tap view) or comes into contact with the electronic device 800, the electronic device 800 may output the content or music currently being output on the portable terminal device.
  • when a connection is established between the portable terminal device and the electronic device 800, the portable terminal device may output a first screen provided by the portable terminal device, and the electronic device 800 may output a second screen, provided by the portable terminal device, that is different from the first screen.
  • the first screen may be a screen provided by a first application installed on the portable terminal device
  • the second screen may be a screen provided by a second application installed on the portable terminal device.
  • the first screen and the second screen may be different screens provided by one application installed in the portable terminal device.
  • the first screen may be a screen including a remote control type UI for controlling the second screen.
  • the electronic device 800 may output a standby screen.
  • Conditions for the electronic device 800 to output the standby screen are not limited to the above examples, and the standby screen may be output under various conditions.
  • the electronic device 800 may output a standby screen in the form of a blue screen, but the present disclosure is not limited thereto.
  • the electronic device 800 may obtain an irregular object by extracting only the shape of a specific object from data received from an external device, and output a standby screen including the acquired irregular object.
  • FIG. 9 is a perspective view illustrating an external appearance of an electronic device 700 according to other embodiments of the present disclosure.
  • the electronic device 700 may include a support (or referred to as a “handle”) 708a.
  • the support 708a of various embodiments may be a handle or a hook provided for a user to grip or move the electronic device 700, or may be a stand supporting the main body 705.
  • the support 708a may be coupled to or separated from the outer circumferential surface of the main body 705 in a hinge structure, and may be selectively separated and fixed from the outer circumferential surface of the main body 705 according to the user's needs.
  • the number, shape, or arrangement of the supports 708a may be variously implemented without limitation.
  • the support 708a may be built into the main body 705 and taken out and used by the user as needed, or may be implemented as a separate accessory detachable from the electronic device 700.
  • the support 708a may include a first support surface 708a-1 and a second support surface 708a-2.
  • the first support surface 708a-1 may be a surface facing the outside of the main body 705 in a state where the support 708a is separated from the outer circumferential surface of the main body 705, and the second support surface 708a-2 may be a surface facing the inside of the main body 705 in a state where the support 708a is separated from the outer circumferential surface of the main body 705.
  • the first support surface 708a-1 may extend from the lower part of the main body 705 toward the upper part while moving away from the main body 705, and the first support surface 708a-1 may have a flat or uniformly curved shape.
  • the first support surface 708a-1 may support the main body 705 when the electronic device 700 is mounted such that the outer surface of the main body 705 contacts the bottom surface, that is, when the projection lens 710 is disposed facing the front.
  • in an embodiment including two supports 708a, the emission angle of the head 703 and the projection lens 710 may be adjusted by adjusting the distance between the two supports 708a or the hinge opening angle.
  • the second support surface 708a-2 is a surface that contacts the user or an external mounting structure when the support 708a is supported by the user or the external mounting structure, and may have a shape corresponding to the gripping structure of the user's hand or the external mounting structure so as not to slip when the electronic device 700 is supported or moved. The user may point the projection lens 710 toward the front, fix the head 703, and hold the support 708a to move the electronic device 700, using the electronic device 700 like a flashlight.
  • the support groove 704 is a groove structure provided in the main body 705 that can accommodate the support 708a when it is not in use, and as shown in FIG. 9, it may be implemented as a groove structure corresponding to the shape of the support 708a. Through the support groove 704, the support 708a can be stored on the outer circumferential surface of the main body 705 when not in use, and the outer circumferential surface of the main body 705 can be kept smooth.
  • the support 708a may be stored inside the main body 705, and in a situation where the support 708a is needed, the support 708a may be pulled out of the main body 705.
  • the support groove 704 may have a structure recessed into the main body 705 to accommodate the support 708a, and the second support surface 708a-2 may be in close contact with the outer circumferential surface of the main body 705, or a separate door (not shown) that opens and closes the support groove 704 may be included.
  • the electronic device 700 may include various types of accessories that help use or store the electronic device 700.
  • for example, the electronic device 700 may include a protective case (not shown) to protect the electronic device 700 and facilitate transport, or may be coupled to a tripod (not shown) that supports or fixes the main body 705 or to a bracket (not shown) that can be coupled to an external surface.
  • FIG. 10 is a perspective view illustrating an external appearance of an electronic device 700 according to still other embodiments of the present disclosure.
  • the electronic device 700 may include a support (or referred to as a “handle”) 708b.
  • the support 708b of various embodiments may be a handle or a ring provided for a user to grip or move the electronic device 700, or may be a stand that supports the main body 705 so that it can be directed at an arbitrary angle.
  • the support 708b may be connected to the main body 705 at a predetermined point (eg, 2/3 to 3/4 of the height of the main body) of the main body 705.
  • the main body 705 can be supported at an arbitrary angle in a state where the main body 705 is laid down in the lateral direction.
  • FIG. 11 is a perspective view illustrating an external appearance of an electronic device 700 according to another embodiment of the present disclosure.
  • the electronic device 700 may include a support (or referred to as a "pedestal") 708c.
  • the support 708c of various embodiments may include a base plate 708c-1 provided to support the electronic device 700 on the ground and two support members 708c-2 connecting the base plate 708c-1 and the main body 705.
  • in various embodiments, the two support members 708c-2 have the same height, and one end surface of each support member can be coupled to or separated from a groove provided on one outer circumferential surface of the main body 705 by a hinge member 708c-3.
  • the two supporting members may be hingedly connected to the main body 705 at a predetermined point (eg, 1/3 to 2/4 of the height of the main body) of the main body 705 .
  • when the two support members are coupled to the main body 705, the main body 705 can be rotated about an imaginary horizontal axis formed by the two hinge members 708c-3, so that the emission angle of the projection lens 710 can be adjusted.
  • although FIG. 11 shows an embodiment in which two support members 708c-2 are connected to the main body 705, the present disclosure is not limited thereto, and one support member and the main body 705 may be connected by one hinge member as shown in FIGS. 12A and 12B.
  • 12A and 12B are perspective views illustrating an external appearance of an electronic device 700 according to still other embodiments of the present disclosure.
  • the support 708d of various embodiments may include a base plate 708d-1 provided to support the electronic device 700 on the ground and one support member 708d-2 connecting the base plate 708d-1 and the main body 705.
  • the one support member 708d-2 may be coupled to or separated from the main body 705 through a groove provided on one outer circumferential surface of the main body 705 and a hinge member (not shown).
  • the supports shown in FIGS. 9, 10, 11, 12a, and 12b are merely examples, and the electronic device 700 may have supports in various positions or shapes.
  • expressions such as "has," "can have," "includes," or "can include" indicate the presence of a corresponding feature (eg, a numerical value, function, operation, or component such as a part) and do not preclude the existence of additional features.
  • expressions such as “A or B,” “at least one of A and/and B,” or “one or more of A or/and B” may include all possible combinations of the items listed together.
  • when a certain component (eg, a first component) is referred to as being connected to another component (eg, a second component), it should be understood that the certain component may be directly connected to the other component or connected through yet another component (eg, a third component).
  • the phrase “device configured to” may mean that the device is “capable of” in conjunction with other devices or components.
  • for example, a processor configured (or set) to perform A, B, and C may mean a dedicated processor (eg, an embedded processor) for performing the corresponding operations, or a general-purpose processor (eg, a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.
  • a 'module' or 'unit' performs at least one function or operation, and may be implemented with hardware or software, or a combination of hardware and software.
  • a plurality of 'modules' or a plurality of 'units' may be integrated into at least one module and implemented by at least one processor, except for 'modules' or 'units' that need to be implemented with specific hardware.
  • various embodiments described above may be implemented in a recording medium readable by a computer or a similar device using software, hardware, or a combination thereof.
  • for example, the embodiments described in this disclosure may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
  • the embodiments described herein may be implemented by a processor itself.
  • methods according to the various embodiments described above may be stored in a non-transitory readable medium, and may be loaded into and used in various devices.
  • a non-transitory readable medium is not a medium that stores data for a short moment, such as a register, cache, or memory, but a medium that stores data semi-permanently and can be read by a device.
  • programs for performing the various methods described above may be stored and provided in a non-transitory readable medium such as a CD, DVD, hard disk, Blu-ray disk, USB, memory card, or ROM.
  • the method according to various embodiments disclosed in this document may be included and provided in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • a computer program product may be distributed in the form of a device-readable storage medium (eg, compact disc read only memory (CD-ROM)) or online through an application store (eg, Play Store™).
  • at least a part of the computer program product may be temporarily stored or temporarily created in a storage medium such as a manufacturer's server, an application store server, or a relay server's memory.

Abstract

The present disclosure relates to an electronic device and a method for controlling the same. A method for controlling an electronic device according to the present disclosure comprises the steps of: acquiring, before a projection image is projected by the electronic device, a first image corresponding to a projection area in which the projection image is to be projected; projecting at least one test image on the basis of the acquired first image; acquiring at least one second image corresponding to the projection area in which the at least one test image is projected; and projecting the projection image on the basis of the first image and the at least one second image.
PCT/KR2022/008300 2021-07-16 2022-06-13 Electronic device and control method therefor WO2023287024A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0093754 2021-07-16
KR1020210093754A KR20230012909A (ko) Electronic device and control method therefor

Publications (1)

Publication Number Publication Date
WO2023287024A1 true WO2023287024A1 (fr) 2023-01-19

Family

ID=84919489

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/008300 WO2023287024A1 (fr) Electronic device and control method therefor

Country Status (2)

Country Link
KR (1) KR20230012909A (fr)
WO (1) WO2023287024A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005189542A * 2003-12-25 2005-07-14 National Institute Of Information & Communication Technology Display system, display program, and display method
KR20070072324A * 2005-12-29 2007-07-04 Samsung Electronics Co., Ltd. Method and apparatus for adjusting a projection screen of an image projector
KR20100081040A * 2009-01-05 2010-07-14 Samsung Electronics Co., Ltd. Method and apparatus for compensating image color loss of an image projector
KR101366329B1 * 2007-05-07 2014-02-20 LG Electronics Inc. Projector and image correction method of the projector
KR20190099509A * 2017-09-22 2019-08-27 popIn Co., Ltd. Projector and projector system

Also Published As

Publication number Publication date
KR20230012909A (ko) 2023-01-26

Similar Documents

Publication Publication Date Title
WO2017078356A1 (fr) Dispositif d'affichage et procédé d'affichage d'image associé
WO2019004570A1 (fr) Hotte de cuisine et procédé de commande de hotte de ladite cuisine
WO2017014429A1 (fr) Dispositif électronique, et procédé de fonctionnement associé
WO2022265428A1 (fr) Appareil électronique et son procédé de commande
WO2023013862A1 (fr) Appareil électronique et procédé de traitement d'image associé
WO2023287024A1 (fr) Dispositif électronique et son procédé de commande
WO2022191538A1 (fr) Appareil de production sonore
WO2023027285A1 (fr) Dispositif électronique et son procédé de commande
WO2023022422A1 (fr) Appareil électronique et son procédé de commande
WO2023286931A1 (fr) Appareil électronique et son procédé de commande
WO2023132625A1 (fr) Dispositif électronique et son procédé de commande
WO2022234942A1 (fr) Appareil électronique et son procédé de commande
WO2023249271A1 (fr) Dispositif électronique de recadrage et de projection d'images et procédé de commande de ce dispositif
WO2023055139A1 (fr) Appareil électronique et son procédé de commande
WO2019093674A1 (fr) Dispositif d'affichage, dispositif de détermination de reproduction et procédé de vérification de reproduction de contenu
WO2023282460A1 (fr) Appareil électronique et son procédé de commande
WO2023003140A1 (fr) Dispositif électronique et son procédé de commande
WO2022191404A1 (fr) Dispositif électronique et procédé de commande associé
WO2023090648A1 (fr) Appareil électronique
WO2022196877A1 (fr) Dispositif électronique et son procédé de commande
WO2024101616A1 (fr) Dispositif électronique et son procédé de commande
WO2023249275A1 (fr) Dispositif électronique pour projeter une image sur un écran comprenant une pluralité de surfaces, et procédé de commande
WO2023018008A1 (fr) Appareil électronique et son procédé de commande
WO2023027319A1 (fr) Dispositif électronique et procédé associé de commande
WO2022181865A1 (fr) Dispositif d'affichage et son procédé de fonctionnement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22842287

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE