WO2023018008A1 - Electronic apparatus and control method therefor - Google Patents

Electronic apparatus and control method therefor

Info

Publication number
WO2023018008A1
WO2023018008A1 (PCT/KR2022/009346)
Authority
WO
WIPO (PCT)
Prior art keywords
image
electronic device
shutter unit
shutter
unit
Prior art date
Application number
PCT/KR2022/009346
Other languages
English (en)
Korean (ko)
Inventor
김형철
김학재
문승현
박기홍
백원준
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자주식회사)
Publication of WO2023018008A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/147Optical correction of image distortions, e.g. keystone
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/142Adjusting of projection optics
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/145Housing details, e.g. position adjustments thereof
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • G09B9/02Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/08Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B9/10Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer with simulated flight- or engine-generated force being applied to aircraft occupant
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Definitions

  • The present disclosure relates to an electronic device that outputs an image on a projection surface and a control method thereof, and more particularly to an electronic device that controls a shutter unit to block the light output to the remaining portion not corresponding to content, and a control method thereof.
  • Keystone correction may be performed to minimize image distortion.
  • Keystone correction may refer to an operation of reversely distorting an image so that an image with reduced distortion is output on a projection surface.
  • Keystone correction may be performed on a panel inside the projector (e.g., a digital micromirror device (DMD)).
  • However, after such correction, the corresponding gray zone may fail to match the shape of the viewing area (e.g., a rectangle): its sides may be neither parallel nor uniform, and the user's discomfort may not be resolved due to this phenomenon.
  • The present disclosure has been devised to improve upon the above problems. An object of the present disclosure is to provide an electronic device, and a control method thereof, that output the light corresponding to the image portion corresponding to content onto the projection surface while blocking, using a shutter unit, the light corresponding to the remaining portion not corresponding to the content so that it does not reach the projection surface.
  • An electronic device for achieving the above object includes a memory, a shutter unit, a projection unit outputting an image onto a projection surface, and a processor configured to acquire a first image corresponding to content from the memory, obtain a second image by keystone-correcting the first image, identify, in the second image, an image portion corresponding to the content and a remaining portion not corresponding to the content, and control the shutter unit so that the remaining portion is not projected onto the projection surface.
  • The processor may control the projection unit so that, of the entire second image, the image portion is output to the projection surface, and may change a position of the shutter unit so that the remaining portion of the second image is not output to the projection surface.
  • The processor may control at least one of movement or rotation of the shutter unit based on the position information of the remaining portion.
  • The memory may store position information indicating positions to which the shutter unit is movable, and the processor may obtain movement information and rotation information of the shutter unit based on the movable-position information and the position information of the remaining portion.
  • the processor may move the shutter unit based on the movement information and rotate the shutter unit based on the rotation information.
  • In the shutter unit, a central axis of the shutter unit may be aligned with a central axis of the projection unit.
  • the processor may control the shutter unit so that the remaining portion is not projected onto the projection surface in a state in which a central axis of the projection unit coincides with a central axis of the shutter unit.
  • the processor may obtain an offset of the projection unit and change a position of the shutter unit based on the offset.
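As an illustration only (the coordinate convention, millimeter units, and function names below are assumptions, not taken from the disclosure), offset-based repositioning of the shutter unit might be sketched as follows, also clamping the result to the movable range that the memory is described as storing:

```python
# Hypothetical sketch: reposition the shutter unit based on the projection
# unit's offset. Coordinates are (x, y) in millimeters in the device's front
# plane; movable_range is ((x_min, x_max), (y_min, y_max)).

def shutter_position(projection_center, lens_offset, movable_range):
    """Shift the shutter center by the lens offset so the central axes of the
    projection unit and the shutter unit coincide, clamped to the positions
    the shutter unit can actually reach."""
    (x_rng, y_rng) = movable_range
    x = projection_center[0] + lens_offset[0]
    y = projection_center[1] + lens_offset[1]
    x = max(x_rng[0], min(x_rng[1], x))
    y = max(y_rng[0], min(y_rng[1], y))
    return (x, y)

# Lens shifted 12.5 mm upward; the mechanism only allows +/-10 mm vertically.
print(shutter_position((0.0, 0.0), (0.0, 12.5), ((-20.0, 20.0), (-10.0, 10.0))))
```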
  • The shutter unit may include at least one of a shutter, a fixing member, a rail, a body, or a motor; the body may include the shutter and the fixing member; and the processor may control the shutter to move along the rail and to rotate according to driving power generated by the motor.
  • The processor may identify a reference point of the shutter unit for covering the remaining portion, control a direction of the rail to move the body toward the reference point, and move the body along the rail so that the fixing member is positioned at the reference point.
  • The body may include the fixing member and at least two shutters corresponding to the fixing member, and the processor may control the at least two shutters to rotate about the fixing member according to driving power generated by the motor.
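To make the drive sequence above concrete, here is a minimal, purely illustrative Python sketch (the one-dimensional rail model, millimeter units, and class names are assumptions, not part of the disclosure): the body carrying the fixing member and two shutters first moves along the rail, then the shutters rotate about the fixing member.

```python
# Hypothetical sketch of the shutter-unit drive described above. In the
# actual device a motor would produce these state changes; this only models
# the resulting positions.

class ShutterUnit:
    def __init__(self, rail_length_mm):
        self.rail_length = rail_length_mm
        self.body_pos = 0.0               # fixing-member position along the rail (mm)
        self.shutter_angles = [0.0, 0.0]  # two shutters rotating about the fixing member (deg)

    def move_body(self, reference_point):
        """Drive the body along the rail so the fixing member reaches the
        identified reference point, clamped to the rail's extent."""
        self.body_pos = max(0.0, min(self.rail_length, reference_point))

    def rotate_shutters(self, angle_a, angle_b):
        """Rotate the two shutters about the fixing member."""
        self.shutter_angles = [angle_a % 360.0, angle_b % 360.0]

unit = ShutterUnit(rail_length_mm=80.0)
unit.move_body(30.0)               # move to the identified reference point
unit.rotate_shutters(35.0, -35.0)  # swing the two blades open symmetrically
print(unit.body_pos, unit.shutter_angles)
```

The clamp in `move_body` mirrors the idea that the memory stores the positions to which the shutter unit is actually movable.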
  • A control method of an electronic device including a projection unit outputting an image onto a projection surface and a shutter unit includes: obtaining a first image corresponding to content; obtaining a second image by performing keystone correction on the first image; identifying, in the second image, an image portion corresponding to the content and a remaining portion not corresponding to the content; and controlling the shutter unit so that the remaining portion is not projected onto the projection surface.
  • The control method further includes outputting, of the entire second image, the image portion to the projection surface, and the controlling of the shutter unit may change the position of the shutter unit so that the remaining portion of the second image is not output to the projection surface.
  • At least one of movement or rotation of the shutter unit may be controlled based on location information of the remaining portion.
  • movement information and rotation information of the shutter unit may be obtained based on movable position information of the shutter unit and position information of the remaining part.
  • the controlling of the shutter unit may include moving the shutter unit based on the movement information and rotating the shutter unit based on the rotation information.
  • the shutter unit may be controlled so that the remaining portion is not projected onto the projection surface in a state in which a central axis of the projection unit and a central axis of the shutter unit coincide.
  • an offset of the projection unit may be obtained, and a position of the shutter unit may be changed based on the offset.
  • The shutter unit may include at least one of a shutter, a fixing member, a rail, a body, or a motor; the body may include the shutter and the fixing member; and the controlling of the shutter unit may control the shutter to move along the rail and to rotate according to driving power generated by the motor.
  • A reference point of the shutter unit for covering the remaining portion may be identified, a direction of the rail may be controlled to move the body toward the reference point, and the body may be moved along the rail so that the fixing member is positioned at the reference point.
  • The body may include the fixing member and at least two shutters corresponding to the fixing member, and the controlling of the shutter unit may control the at least two shutters to rotate about the fixing member according to driving power generated by the motor.
  • FIG. 1 is a perspective view illustrating an external appearance of an electronic device according to an embodiment of the present disclosure.
  • FIG. 2A is a block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • FIG. 2B is a block diagram showing the configuration of the electronic device of FIG. 2A in detail.
  • FIG. 3 is a perspective view illustrating an external appearance of an electronic device according to other embodiments of the present disclosure.
  • FIG. 4 is a perspective view illustrating an external appearance of an electronic device according to another embodiment of the present disclosure.
  • FIG. 5 is a perspective view illustrating an external appearance of an electronic device according to another embodiment of the present disclosure.
  • FIG. 6A is a perspective view illustrating an external appearance of an electronic device according to another embodiment of the present disclosure.
  • FIG. 6B is a perspective view illustrating a rotated state of the electronic device of FIG. 6A.
  • FIG. 7 is a flowchart illustrating a method of controlling a shutter unit according to an embodiment of the present disclosure.
  • FIG. 8 is a flowchart explaining the control method of FIG. 7 in detail.
  • FIG. 9 is a diagram explaining an output operation of an image on which keystone correction has been performed, according to an embodiment of the present disclosure.
  • FIG. 10 is a diagram explaining an output operation of an image on which keystone correction has been performed, according to another embodiment of the present disclosure.
  • FIG. 11 is a diagram explaining an output operation of an image on which keystone correction has been performed, according to another embodiment of the present disclosure.
  • FIG. 12 is a diagram explaining an operation of a shutter unit according to an embodiment of the present disclosure.
  • FIG. 13 is a diagram explaining an operation of a shutter unit according to another embodiment of the present disclosure.
  • FIG. 14 is a diagram explaining an operation of a shutter unit according to another embodiment of the present disclosure.
  • FIG. 15 is a diagram explaining an operation of a shutter unit according to another embodiment of the present disclosure.
  • FIG. 16 is a diagram explaining an operation of a shutter unit according to another embodiment of the present disclosure.
  • FIG. 17 is a diagram explaining an operation of a shutter unit according to another embodiment of the present disclosure.
  • FIG. 18 is a diagram explaining an operation of a shutter unit according to another embodiment of the present disclosure.
  • FIG. 19 is a diagram explaining an arrangement of a projection lens and a shutter unit.
  • FIG. 20 is a diagram explaining an embodiment in which a shutter unit is disposed to correspond to an offset of a projection lens.
  • FIG. 21 is a diagram explaining identification of a remaining area.
  • FIG. 22 is a diagram explaining an operation of changing an image area.
  • FIG. 23 is a diagram explaining a remaining area that changes according to an image correction operation.
  • FIG. 24 is a diagram explaining an operation of changing the position of a shutter unit.
  • FIG. 25 is a diagram comparing the sizes of remaining areas that change according to the shutter unit.
  • FIG. 26 is a diagram explaining an operation of identifying the position of a shutter unit.
  • FIG. 27 is a flowchart explaining an operation of determining whether the body of a shutter unit moves.
  • FIG. 28 is a diagram explaining an operation of moving the body of a shutter unit.
  • FIG. 29 is a flowchart explaining an operation of controlling a shutter unit based on the size of an unoccluded area.
  • FIG. 30 is a diagram explaining an operation of controlling a shutter unit so that no remaining area exists.
  • FIG. 31 is a diagram explaining an operation of controlling a shutter unit based on the size of a remaining area, according to an exemplary embodiment.
  • FIG. 32 is a diagram explaining an operation of controlling a shutter unit based on the size of a remaining area, according to another embodiment.
  • FIG. 33 is a flowchart explaining a control method of an electronic device according to an embodiment of the present disclosure.
  • Expressions such as "has," "can have," "includes," or "can include" indicate the existence of a corresponding feature (e.g., a numerical value, function, operation, or component such as a part) and do not preclude the existence of additional features.
  • When a component (e.g., a first component) is described as being connected to another component (e.g., a second component), it should be understood that the component may be directly connected to the other component, or may be connected through yet another component (e.g., a third component).
  • a “module” or “unit” performs at least one function or operation, and may be implemented in hardware or software or a combination of hardware and software.
  • A plurality of "modules" or a plurality of "units" may be integrated into at least one module and implemented by at least one processor (not shown), except for "modules" or "units" that need to be implemented with specific hardware.
  • The term user may refer to a person using an electronic device or to a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
  • FIG. 1 is a perspective view illustrating an external appearance of an electronic device 100 according to an embodiment of the present disclosure.
  • the electronic device 100 may include a head 103, a body 105, a projection lens 110, a connector 130, or a cover 107.
  • the electronic device 100 may be a device of various types.
  • the electronic device 100 may be a projector device that enlarges and projects an image onto a wall or a screen
  • The projector device may be an LCD projector, or a digital light processing (DLP) projector using a digital micromirror device (DMD).
  • The electronic device 100 may be implemented as a home or industrial display device, a lighting device used in daily life, a sound device including a sound module, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a wearable device, or a home appliance.
  • the electronic device 100 according to an embodiment of the present disclosure is not limited to the above-described devices, and the electronic device 100 may be implemented as an electronic device 100 having two or more functions of the above-described devices.
  • The electronic device 100 may be used as a display device, a lighting device, or a sound device by turning off the projector function and turning on a lighting function or a speaker function according to manipulation of the processor, and may include a microphone or a communication device to be used as an AI speaker.
  • the main body 105 is a housing that forms an exterior, and may support or protect components (eg, the components shown in FIG. 2B ) of the electronic device 100 disposed inside the main body 105 .
  • the body 105 may have a structure close to a cylindrical shape as shown in FIG. 1 .
  • The shape of the main body 105 is not limited thereto, and according to various embodiments of the present disclosure, the main body 105 may be implemented in various geometric shapes, such as a column with a polygonal cross section, a cone, or a sphere.
  • the size of the main body 105 may be a size that a user can hold or move with one hand, may be implemented in a very small size for easy portability, and may be implemented in a size that can be placed on a table or coupled to a lighting device.
  • The material of the main body 105 may be a matte metal or synthetic resin so as not to be stained with the user's fingerprints or dust, or the exterior of the main body 105 may be finished with a smooth gloss.
  • a friction area may be formed on a portion of the exterior of the body 105 so that a user can grip and move the main body 105 .
  • At least a portion of the main body 105 may be provided with a bent gripping portion or a support 108a (see FIG. 3) that the user can grip.
  • the projection lens 110 is formed on one surface of the main body 105 to project light passing through the lens array to the outside of the main body 105 .
  • the projection lens 110 of various embodiments may be an optical lens coated with low dispersion in order to reduce chromatic aberration.
  • the projection lens 110 may be a convex lens or a condensing lens, and the projection lens 110 according to an embodiment may adjust the focus by adjusting the positions of a plurality of sub-lenses.
  • the head 103 is provided to be coupled to one surface of the main body 105 to support and protect the projection lens 110 .
  • the head 103 may be coupled to the main body 105 so as to be swivelable within a predetermined angular range based on one surface of the main body 105 .
  • the head 103 is automatically or manually swiveled by a user or a processor to freely adjust the projection angle of the projection lens 110 .
  • The head 103 may be coupled to the main body 105 and include a neck extending from the main body 105, so that the head 103 can be tilted to adjust the projection angle of the projection lens 110.
  • the electronic device 100 adjusts the direction of the head 103 and adjusts the emission angle of the projection lens 110 while the position and angle of the main body 105 are fixed, so that light or an image can be projected to a desired location.
  • the head 103 may include a handle that the user can grasp after rotating in a desired direction.
  • a plurality of openings may be formed on the outer circumferential surface of the main body 105 . Audio output from the audio output unit may be output to the outside of the main body 105 of the electronic device 100 through the plurality of openings.
  • the audio output unit may include a speaker, and the speaker may be used for general purposes such as multimedia playback, recording playback, and audio output.
  • A heat dissipation fan (not shown) may be provided inside the main body 105; when the heat dissipation fan is driven, it can discharge air or heat inside the main body 105 through the plurality of openings. Therefore, the electronic device 100 can discharge heat generated by driving the electronic device 100 to the outside and prevent the electronic device 100 from overheating.
  • the connector 130 may connect the electronic device 100 to an external device to transmit/receive electrical signals or receive power from the outside.
  • the connector 130 may be physically connected to an external device.
  • the connector 130 may include an input/output interface, and may communicate with an external device through wired or wireless communication or receive power.
  • The connector 130 may include an HDMI connection terminal, a USB connection terminal, an SD card receiving groove, an audio connection terminal, or a power outlet, or may include a Bluetooth, Wi-Fi, or wireless charging connection module for connecting to an external device wirelessly.
  • the connector 130 may have a socket structure connected to an external lighting device, and may be connected to a socket receiving groove of the external lighting device to receive power.
  • the size and standard of the connector 130 having a socket structure may be variously implemented in consideration of a receiving structure of a coupleable external device.
  • The diameter of the junction of the connector 130 may be implemented as 26 mm, in which case the electronic device 100 can replace a conventionally used light bulb and be coupled to an external lighting device such as a stand. Meanwhile, when fastened to a socket located on an existing ceiling, the electronic device 100 projects from top to bottom, and if the electronic device 100 cannot rotate on its socket coupling, the screen cannot be rotated either.
  • The electronic device 100 may be socket-coupled to a ceiling stand so that the electronic device 100 can rotate even when the socket is coupled and power is supplied, and the head 103 swivels from one side of the main body 105 and adjusts the emission angle so as to project the screen to a desired position or rotate the screen.
  • The connector 130 may include a coupling sensor; the coupling sensor may sense whether the connector 130 is coupled with an external device, the coupling state, or the coupling target, and transmit the result to the processor, and driving of the electronic device 100 may be controlled based on the received detection value.
  • the cover 107 can be coupled to and separated from the main body 105 and can protect the connector 130 so that the connector 130 is not constantly exposed to the outside.
  • the shape of the cover 107 may have a shape continuous with the body 105 as shown in FIG. 1, or may be implemented to correspond to the shape of the connector 130.
  • the cover 107 can support the electronic device 100, and the electronic device 100 can be used by being coupled to the cover 107 and coupled to or mounted on an external cradle.
  • a battery may be provided inside the cover 107 .
  • a battery may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell or a fuel cell.
  • the electronic device 100 may include a camera module, and the camera module may capture still images and moving images.
  • the camera module may include one or more lenses, image sensors, image signal processors, or flashes.
  • The electronic device 100 may include a protective case (not shown) so that the electronic device 100 can be protected and easily carried, a stand (not shown) supporting or fixing the main body 105, and a bracket (not shown) coupled to a wall or partition.
  • the electronic device 100 may provide various functions by being connected to various external devices using a socket structure.
  • the electronic device 100 may be connected to an external camera device using a socket structure.
  • the electronic device 100 may use the projection unit 111 to provide an image stored in a connected camera device or an image currently being captured.
  • the electronic device 100 may be connected to a battery module to receive power using a socket structure.
  • the electronic device 100 may be connected to an external device using a socket structure, but this is merely an example, and may be connected to an external device using another interface (eg, USB, etc.).
  • FIG. 2A is a block diagram illustrating a configuration of an electronic device 100 according to an embodiment of the present disclosure.
  • the electronic device 100 includes a projection unit 111 , a memory 112 , a shutter unit 120 and a processor 114 .
  • the projection unit 111 may output an image to be output from the electronic device 100 to the projection surface.
  • the projection unit 111 may include a projection lens 110 .
  • the projection unit 111 may perform a function of outputting an image on a projection surface. A detailed description related to the projection unit 111 is described in FIG. 2B.
  • the electronic device 100 may project images in various ways.
  • the projection surface may be a part of a physical space on which an image is output or a separate screen.
  • the memory 112 may store the first image and the second image output on the projection surface. A detailed description related to the memory 112 is described in FIG. 2B.
  • the shutter unit 120 may block light output from the projection unit 111 .
  • The shutter unit 120 may include at least one of a shutter 121, a fixing member 122, a rail 123, a body 124, or a motor 125; the body 124 may include the shutter 121 and the fixing member 122; and the processor 114 may control the shutter 121 to move along the rail 123 and to rotate according to the driving power generated by the motor 125.
  • The processor 114 may control the overall operation of the electronic device 100.
  • The processor 114 may acquire a first image corresponding to content from the memory 112, obtain a second image by keystone-correcting the first image, identify, in the second image, an image portion corresponding to the content and a remaining portion not corresponding to the content, and control the shutter unit 120 so that the remaining portion is not projected onto the projection surface.
  • the first image corresponding to the content may mean “the first image including the content”.
  • the image part corresponding to the content and the remaining part not corresponding to the content may mean “the image part including the content and the remaining part not including the content”.
  • an operation of controlling the shutter unit 120 so that the remaining portion is not projected onto the projection surface may mean “an operation of controlling the shutter unit 120 to cover the light output to the remaining portion”.
  • the first image may mean an image that the user wants to be able to output through the projection unit 111 .
  • the first image may or may not include content.
  • the contents may mean photos, videos, and the like.
  • the first image may be selected by the user.
  • a plurality of images may be stored in the memory 112 .
  • the processor 114 may acquire the first image.
  • the first image may be a predetermined image.
  • the predetermined image may mean a test image including a manufacturer's logo.
  • the predetermined image may be stored in the memory 112 .
  • the processor 114 may perform keystone correction on the first image after obtaining the first image.
  • the processor 114 may obtain a second image by performing keystone correction on the first image.
  • the processor 114 may perform keystone correction based on at least one of an image size, an inclination angle (or a tilting angle) of the electronic device 100, a projection surface size, or whether or not the projection surface is curved.
  • the processor 114 may obtain a second image on which keystone correction is performed.
  • An image before keystone correction is performed may be described as a first image, and an image after keystone correction is performed may be described as a second image.
  • When keystone correction is performed, the first image may be changed into a trapezoidal shape rather than a rectangular shape during the correction process.
  • the processor 114 may add an arbitrary remaining portion to generate a second image having a rectangular shape. Accordingly, the processor 114 may acquire the second image by combining the trapezoidal image portion and the residual portion.
  • For example, the image 910 of FIG. 9 may be the second image, the image portion 911 may be the portion of the first image deformed into a trapezoidal shape by keystone correction, and the remaining portion 912 may be the portion combined with it to create a rectangular second image.
  • Although the shape of the first image on which keystone correction is performed is described as a trapezoid, the first image may be changed into various shapes depending on the implementation.
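The composition just described, a trapezoidal image portion padded out to a rectangular second image, can be sketched as a mask computation. This is illustrative only: the point-in-polygon test, the 0/1 mask encoding, and the tiny 8x4 frame are assumptions for demonstration, not the disclosed implementation.

```python
# Illustrative sketch: mark, for each pixel of the rectangular second image,
# whether it belongs to the trapezoidal image portion (1) or the remaining
# portion (0) that the shutter unit should block.

def inside_convex(poly, x, y):
    """True if (x, y) lies inside the convex polygon 'poly' (CCW vertices)."""
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        # cross product of the edge vector with the vertex-to-point vector
        if (x1 - x0) * (y - y0) - (y1 - y0) * (x - x0) < 0:
            return False
    return True

def second_image_mask(width, height, trapezoid):
    """1 = image portion (content), 0 = remaining portion."""
    return [[1 if inside_convex(trapezoid, x + 0.5, y + 0.5) else 0
             for x in range(width)] for y in range(height)]

# Keystone correction squeezed one edge of an 8x4 frame inward (CCW vertices):
trap = [(2, 0), (6, 0), (8, 4), (0, 4)]
mask = second_image_mask(8, 4, trap)
remaining = sum(row.count(0) for row in mask)
print(remaining)  # number of pixels belonging to the remaining portion
```

Pixels marked 0 are exactly the remaining portion whose light the shutter unit is later controlled to block.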
  • the processor 114 may identify an image portion including content and a remaining portion not corresponding to the content.
  • A portion may mean a specific area, range, region, etc. occupying the image.
  • the second image may include an image part and a residual part.
  • the processor 114 may output the second image through the projection unit 111 .
  • the processor 114 may output all of the second image to the projection surface through the electronic device 100 .
  • When all of the second image is output on the projection surface, light may also be output to the remaining portion not corresponding to the content, and visibility may deteriorate when the user views the screen corresponding to the remaining portion.
  • the processor 114 may control the shutter unit 120 to block the light output to the remaining portion of the second image so that only the image portion is output. Specifically, the processor 114 may move the shutter unit 120 to a position where the remaining portion is output. Since the shutter unit 120 corresponds to the configuration of the electronic device 100, the shutter unit 120 can be controlled in consideration of the position of the remaining part included in the second image.
  • the processor 114 may control the projection unit 111 to output the image portion of the entire second image to the projection surface, and may change the position of the shutter unit 120 so that the remaining portion of the entire second image is not output to the projection surface.
  • the entire part of the second image may be divided into an image part and a residual part.
  • the processor 114 may obtain the location of the image portion and the location of the remaining portion. Also, the processor 114 may control the projection unit 111 to output an entire portion of the second image. Here, the processor 114 may change the position of the shutter unit 120 to cover the output corresponding to the remaining portion of the second image. Before the change, the shutter unit 120 may be in a state in which the projection unit 111 is not covered at all.
  • the processor 114 may control the shutter unit 120 to cover a portion of the projection unit 111 . Specifically, the shutter unit 120 may be controlled to cover an area corresponding to the remaining portion of the second image in the entire area of the projection unit 111 .
  • the area corresponding to the remaining portion of the second image may mean an area to which the projection unit 111 irradiates light to output the remaining portion.
  • the processor 114 may obtain position information of the remaining part, identify at least one of movement information or rotation information of the shutter unit 120 based on the position information of the remaining part, and based on the movement information Thus, the shutter unit 120 may be moved, and the shutter unit 120 may be rotated based on rotation information.
  • the movement information may mean a movement point or coordinates to which the shutter unit 120 should move to cover the remaining portion.
  • the movement information may include an optimal point for masking the remaining portion.
  • the rotation information may mean a rotation angle at which the shutter unit 120 should be rotated to cover the remaining portion after the shutter unit 120 moves.
  • the rotation angle may mean a rotation angle of the shutter 121 having a fixed size.
  • the rotation angle may refer to an expansion angle or a contraction angle of the shutter 121 whose size is changed.
  • the memory 112 of the electronic device 100 may store position information on which the shutter unit 120 is movable, and the processor 114 may obtain movement information and rotation information of the shutter unit 120 based on the movable position information and the position information of the remaining part.
  • the movable position information may mean coordinate information on which the shutter unit 120 can physically move.
  • the shutter unit 120 may be movable only to a predetermined position, and the movable coordinates may be stored in the memory 112 .
  • the processor 114 may compare the positions to which the shutter unit 120 can physically move with the position of the remaining part to obtain the optimal point and rotation angle to which the shutter unit 120 should move.
  • the rotation information may include a rotation angle.
  • the rotation angle may be defined as a positive (+) angle and a negative (-) angle. Therefore, the rotation direction can be defined according to the positive (+) or negative (-) of the rotation angle.
  • +30 degrees may mean 30 degrees clockwise and -30 degrees may mean 30 degrees counterclockwise.
  • the rotation information may include at least one of a rotation direction, including a clockwise direction or a counterclockwise direction, and a rotation angle, and the processor 114 may rotate the shutter unit 120 based on at least one of the rotation direction and the rotation angle.
  • the rotation direction may mean clockwise or counterclockwise.
  • the rotation angle may mean an angle between 0 degrees and 360 degrees.
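As a toy illustration of how movement and rotation information might be derived from the remaining portion's position and the stored movable positions, consider the sketch below. All names and the angle convention are assumptions; positive angles are clockwise, matching the sign convention in the text.

```python
import math

def plan_shutter(residual_center, movable_points):
    """Pick the movable point nearest the residual portion's centre,
    then derive the signed rotation angle the shutter needs to face it.
    Positive angles are clockwise, negative counter-clockwise."""
    best = min(movable_points, key=lambda p: math.dist(p, residual_center))
    dx = residual_center[0] - best[0]
    dy = residual_center[1] - best[1]
    # 0 degrees points "up" (negative y); clockwise angles are positive.
    angle = math.degrees(math.atan2(dx, -dy))
    return best, angle
```

For example, `plan_shutter((10, 0), [(0, 0), (5, 5)])` picks the movable point (5, 5) and a +45 degree (clockwise) rotation.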
  • the shutter unit 120 of the projection unit 111 may be arranged so that the central axis of the projection unit 111 and the central axis of the shutter unit 120 coincide.
  • the processor 114 may control the shutter unit 120 so that the remaining portion is not projected onto the projection surface in a state in which the central axis of the projection unit 111 and the central axis of the shutter unit 120 coincide.
  • the shutter unit 120 may be configured so that the central axis of the projection unit 111 and the central axis of the shutter unit 120 coincide. This is because the light output from the projection unit 111 can be blocked only when the central axes of each component coincide. A detailed description related to this is described in FIG. 19 .
  • the processor 114 may obtain an offset of the projection and change the position of the shutter unit 120 based on the offset.
  • the offset may mean a distance by which an output image deviates from the central axis of the projection unit 111.
  • the offset may be 0%.
  • the offset value may increase. A detailed description related to this will be described in FIG. 20 .
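A minimal sketch of how an offset could shift the point the shutter unit must cover. The percentage convention and the vertical-only shift are assumptions for illustration:

```python
def shutter_target_center(projection_center, offset_percent, image_height):
    """Shift the masking target vertically by the lens offset: 0% keeps
    the image centred on the optical axis; larger offsets push the image,
    and hence the region to mask, further off-axis."""
    x, y = projection_center
    return (x, y + image_height * offset_percent / 100.0)
```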
  • the shutter unit 120 of the projection unit 111 may include at least one of the shutter 121, the fixing member 122, the rail 123, the body 124, or the motor 125. The body 124 may include the shutter 121 and the fixing member 122, and the processor 114 may control the shutter 121 to move along the rail 123 according to driving power generated by the motor 125.
  • the shutter 121 may be controlled to rotate according to driving power generated by the motor 125 .
  • the motor 125 may generate driving power necessary for moving or rotating the elements of the shutter unit 120 and deliver driving energy according to the generated driving power to each element.
  • the shutter unit 120 may include a shutter 121 having a fixed size. According to another embodiment, the shutter unit 120 may include a shutter 121 whose size is expanded or reduced.
  • the shutter 121 may cover a portion of the screen projected through the projection unit 111 .
  • the shutter 121 may mean a shield.
  • the motor 125 may generate and transmit power for moving the shutter 121 or the body 124 .
  • the shutter unit 120 may include an angle adjusting unit for adjusting the angle of the shutter 121 and a movement adjusting unit for moving the body 124 up, down, left and right.
  • the shutter unit 120 may be controlled based on at least one of a throw ratio (TR) of the electronic device 100 and optical characteristics of the projection unit 111 .
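Since the shutter control depends on the throw ratio (TR = projection distance / image width), the projected image width, and thus the scale at which the shutter must mask, follows directly from the measured distance. A minimal sketch (function name assumed):

```python
def image_width_from_distance(distance: float, throw_ratio: float) -> float:
    """TR = distance / width, so width = distance / TR. For example, a
    short-throw projector (TR = 0.5) at 2 m yields a 4 m wide image."""
    return distance / throw_ratio
```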
  • the electronic device 100 may include an image sensor that captures images of a screen output on a projection surface. Based on the captured image acquired by the image sensor, the electronic device 100 may identify whether the remaining portion is covered by the shutter 121 . If it is identified that the remaining part is output in the captured image, the electronic device 100 may change the position of the shutter 121 again.
  • the electronic device 100 may identify whether the image portion is covered by the shutter 121 based on the captured image. And, if it is identified that the image portion is covered by the shutter 121, the electronic device 100 may change the position of the shutter 121 again.
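The two checks above (residual portion still visible, or image portion accidentally covered) amount to a small feedback loop over the captured image. A sketch under assumed formats: a normalised 2-D brightness array and boolean region masks, with an arbitrary threshold.

```python
import numpy as np

def verify_shutter(captured, residual_mask, image_mask, dark=0.05):
    """captured: normalised brightness of the camera frame, in [0, 1].
    Returns 'residual_visible' if light still leaks into the residual
    portion, 'image_covered' if the shutter intrudes on the content,
    and 'ok' otherwise."""
    if captured[residual_mask].mean() > dark:
        return "residual_visible"   # move the shutter 121 again
    if captured[image_mask].mean() < dark:
        return "image_covered"      # shutter covers the content
    return "ok"
```

Either non-ok result would trigger changing the position of the shutter 121 again, as described above.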
  • FIGS. 12 to 18 Various implementation forms of the shutter unit 120 are described in FIGS. 12 to 18 .
  • FIGS. 25 to 32 A detailed description of the movement and control operation of the shutter unit 120 will be described in FIGS. 25 to 32 .
  • the processor 114 may identify a reference point of the shutter unit 120 for covering the remaining portion, and may control the body 124 to move along the rail 123 so that the fixing member 122 is located at the reference point.
  • FIG. 28 a detailed description of the shutter 121, the fixing member 122, the rail 123, the body 124, and the motor 125 and a description related to the moving operation using the rail 123 are described in FIG. 28.
  • the body 124 of the electronic device 100 may include a fixing member 122 and at least two shutters corresponding to the fixing member 122, and the processor 114 may control the at least two shutters to rotate about the fixing member 122 according to the generated driving power.
  • FIG. 15 A detailed description related to this is described in FIG. 15 .
  • the electronic device 100 may block part of light output from the projection unit 111 by moving or rotating the shutter unit 120 .
  • Blocking the light itself output from the projection unit 111 may have many physical or software limitations. However, when light is physically blocked using the shutter unit 120, light corresponding to the remaining portion can be efficiently blocked. Therefore, when the light corresponding to the remaining portion is blocked, the user's convenience can be enhanced by increasing the visibility of the output image.
  • the shutter unit 120 may be controlled before keystone correction is performed, and the electronic device 100 may perform keystone correction after controlling the shutter unit 120 .
  • the electronic device 100 may obtain a first image to be output, and identify an area where the first image is to be output and a remaining area.
  • the remaining area may refer to an area where the first image is not output but light is irradiated by the projection unit 111 .
  • the electronic device 100 may control the shutter unit 120 to block light output to the remaining area.
  • the electronic device 100 may obtain a second image by performing keystone correction on the first image.
  • the first image may be changed due to keystone correction, and the electronic device 100 may obtain a keystone correction algorithm.
  • the electronic device 100 may change the position of the shutter unit 120 using the acquired keystone correction algorithm.
  • the control operation of the shutter unit 120 determined based on the first image may be changed by a keystone correction algorithm.
  • the electronic device 100 may perform keystone correction and control the shutter unit 120 .
  • the electronic device 100 may first perform keystone correction and then control the shutter unit 120 .
  • the electronic device 100 may control the shutter unit 120 and then perform keystone correction.
  • the operation of identifying that the projection surface is not flat may be identifying a case where the projection surface has a curve, a complex shape, or an obstacle object present near the projection surface.
  • the size of an image finally output to the projection surface may be reduced.
  • a screen flickering problem may occur during an operation of correcting an aspect ratio or an image.
  • the flickering of the screen may not occur in the operation of blocking light by using the shutter unit 120 as hardware.
  • the electronic device 100 may be implemented in a form including a liquid crystal shutter.
  • the liquid crystal shutter may be disposed to cover all or part of the projection unit 111 .
  • the liquid crystal shutter may be implemented in a form of blocking or passing light according to physical properties of liquid crystal.
  • the electronic device 100 may control the liquid crystal shutter to pass light through an area corresponding to the image portion and block light in an area corresponding to the remaining portion.
  • the electronic device 100 may control the liquid crystal shutter based on the resolution of the output image and the resolution (or size) of the liquid crystal shutter.
  • the resolution of the shutter may be described as the controllability of the shutter.
  • the electronic device 100 may first perform an operation of distinguishing, in the output image, the portion for outputting content from the remaining portion not outputting content, and thereafter may control the liquid crystal shutter so that the remaining portion is not projected, based on the position of the liquid crystal shutter and the projection area size suitable for the liquid crystal shutter.
  • An operation calculated in consideration of a resolution of a shutter may be a pixel size matching operation.
  • the electronic device 100 may control the shutter unit 120 so that the remaining portion is not projected based on the resolution of the electronic device 100 and the resolution of the shutter unit 120 .
  • the electronic device 100 may control the shutter unit 120 so that the image portion including content is not covered by the shutter unit 120, and may control the shutter unit 120 based on the resolution of the electronic device 100 and the resolution of the shutter unit 120.
  • when the resolution of the electronic device 100 is UHD and the resolution of the shutter unit 120 is FHD, the control capability of the shutter unit 120 may be 1/4 of that of the electronic device 100.
  • the shutter unit 120 may turn off 1 pixel at the corresponding location. If the number of pixels turned off by the shutter unit 120 is three or less per unit area, the shutter unit 120 may not separately turn off pixels at the corresponding location.
  • the electronic device 100 may control the shutter unit 120 using only the ratio between the location of the shutter unit 120 and the size of the output image.
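The pixel-size-matching rule above (UHD content with an FHD shutter, so one shutter pixel covers a 2x2 block of image pixels, and a block with three or fewer residual pixels is left alone) can be sketched as follows. The function name and the boolean-mask format are assumptions.

```python
import numpy as np

def map_residual_to_shutter(residual_uhd: np.ndarray, block: int = 2) -> np.ndarray:
    """residual_uhd: boolean mask at image (UHD) resolution, True where
    the pixel belongs to the residual portion. Each shutter pixel covers
    a block x block group and is switched off only when all pixels in
    the group are residual; with three or fewer, it is left alone."""
    h, w = residual_uhd.shape
    blocks = residual_uhd.reshape(h // block, block, w // block, block)
    count = blocks.sum(axis=(1, 3))        # residual pixels per block
    return count > block * block - 1       # off only when the block is full
```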
  • FIG. 2B is a block diagram showing the configuration of the electronic device of FIG. 2A in detail.
  • the electronic device 100 includes a projection unit 111, a memory 112, a sensor unit 113, a processor 114, a user interface 115, an input/output interface 116, an audio output unit 117, a power supply unit 118, a communication interface 119, and a shutter unit 120.
  • the configuration shown in FIG. 2B is merely an example, and some configurations may be omitted and new configurations may be added.
  • the projection unit 111 is a component that projects an image to the outside.
  • the projection unit 111 may be implemented using various projection methods (e.g., a cathode-ray tube (CRT) method, a liquid crystal display (LCD) method, a digital light processing (DLP) method, a laser method, etc.).
  • the principle of the CRT method is basically the same as that of a CRT monitor.
  • the CRT method enlarges the image with a lens in front of the cathode ray tube (CRT) and displays the image on the screen.
  • red, green, and blue cathode ray tubes can be separately implemented.
  • the LCD method is a method of displaying an image by transmitting light from a light source through a liquid crystal.
  • the LCD method is divided into a single-panel type and a three-panel type.
  • in the three-panel type, the light from the light source is separated into red, green, and blue by a dichroic mirror (a mirror that reflects only light of a specific color and passes the rest), passes through the liquid crystal, and then gathers in one place again.
  • the DLP method is a method of displaying an image using a DMD (Digital Micromirror Device) chip.
  • the projection unit of the DLP method may include a light source, a color wheel, a DMD chip, a projection lens, and the like.
  • Light output from a light source may exhibit a color while passing through a rotating color wheel.
  • the light that has passed through the color wheel is input to the DMD chip.
  • the DMD chip includes numerous micromirrors and reflects light input to the DMD chip.
  • the projection lens may play a role of enlarging light reflected from the DMD chip to an image size.
  • the laser method includes a diode pumped solid state (DPSS) laser and a galvanometer.
  • the galvanometer includes a mirror and a high power motor to move the mirror at high speed.
  • a galvanometer can rotate a mirror at up to 40 KHz/sec.
  • the galvanometer is mounted according to the scanning direction. In general, since the projector scans in a plane, the galvanometer can also be arranged separately in the x and y axes.
  • the projection unit 111 may include various types of light sources.
  • the projection unit 111 may include at least one light source among a lamp, LED, and laser.
  • the projection unit 111 may output an image in a 4:3 aspect ratio, a 5:4 aspect ratio, or a 16:9 wide aspect ratio according to the purpose of the electronic device 100 or the user's settings, and depending on the aspect ratio may output images at various resolutions such as WVGA (854*480), SVGA (800*600), XGA (1024*768), WXGA (1280*720), WXGA (1280*800), SXGA (1280*1024), UXGA (1600*1200), Full HD (1920*1080), etc.
  • the projection unit 111 may perform various functions for adjusting an output image under the control of the processor 114 .
  • the projection unit 111 may perform functions such as zoom, keystone, quick corner (4 corner) keystone, and lens shift.
  • the projection unit 111 may enlarge or reduce the image according to the distance from the screen (projection distance). That is, a zoom function may be performed according to the distance from the screen.
  • the zoom function may include a hardware method of adjusting the screen size by moving a lens and a software method of adjusting the screen size by cropping an image.
  • methods for adjusting the focus include a manual focus method and a motorized method.
  • the manual focus method refers to a method of manually focusing
  • the motorized method refers to a method of automatically focusing using a motor built into the projector when a zoom function is performed.
  • the projection unit 111 may provide a digital zoom function through software, and may provide an optical zoom function that performs a zoom function by moving a lens through a driving unit.
  • the projection unit 111 may perform a keystone function. If the height is not right for the front projection, the screen may be distorted up or down.
  • the keystone function means a function of correcting a distorted screen. For example, if distortion occurs in the left and right directions of the screen, it can be corrected using the horizontal keystone, and if distortion occurs in the vertical direction, it can be corrected using the vertical keystone.
  • the quick corner (4 corner) keystone function corrects the screen when the central area of the screen is normal but the corner area is not balanced.
  • the lens shift function is a function that moves the screen as it is when the screen falls outside the projection surface.
  • the projection unit 111 may provide zoom/keystone/focus functions by automatically analyzing the surrounding environment and the projection environment without user input. Specifically, the projection unit 111 determines the distance between the electronic device 100 and the screen detected through sensors (depth camera, distance sensor, infrared sensor, illuminance sensor, etc.) and the space where the electronic device 100 is currently located. Zoom/Keystone/Focus functions can be automatically provided based on information about the image and the amount of ambient light.
  • the projection unit 111 may provide a lighting function using a light source.
  • the projection unit 111 may provide a lighting function by outputting a light source using LEDs.
  • the projection unit 111 may include one LED, and according to another embodiment, the electronic device may include a plurality of LEDs.
  • the projection unit 111 may output a light source using a surface-emitting LED according to an implementation example.
  • the surface-emitting LED may refer to an LED having a structure in which an optical sheet is disposed above the LED so that light sources are uniformly distributed and output. Specifically, when the light source is output through the LED, the light source may be evenly dispersed through the optical sheet, and the light source dispersed through the optical sheet may be incident to the display panel.
  • the projection unit 111 may provide the user with a dimming function for adjusting the intensity of the light source. Specifically, when a user input for adjusting the intensity of the light source is received from the user through the user interface 240 (e.g., a touch display button or a dial), the projection unit 111 may control the LED to output the light source intensity corresponding to the received user input.
  • the projection unit 111 may provide a dimming function based on the content analyzed by the processor 114 without user input.
  • the projection unit 111 may control the LED to output the intensity of the light source based on information about currently provided content (eg, content type, content brightness, etc.).
  • the projection unit 111 may control the color temperature under the control of the processor 114 .
  • the processor 114 may control the color temperature based on content. Specifically, if the content is identified as being output, the processor 114 may obtain color information for each frame of the content for which output is determined. Also, the processor 114 may control the color temperature based on the obtained color information for each frame. Here, the processor 114 may obtain at least one main color of a frame based on color information for each frame. Also, the processor 114 may adjust the color temperature based on the obtained at least one primary color. For example, the color temperature controllable by the processor 114 may be classified into a warm type or a cold type.
  • a frame to be output (hereinafter referred to as an output frame) includes a scene in which a fire has occurred.
  • the processor 114 may identify (or obtain) that the main color is red based on the color information included in the current output frame. Also, the processor 114 may identify a color temperature corresponding to the identified main color (red). Here, the color temperature corresponding to red may be a warm type. Meanwhile, the processor 114 may use an artificial intelligence model to obtain color information or a primary color of a frame.
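A toy version of the frame-colour-to-temperature mapping described above. The mean-colour heuristic and the red-versus-blue comparison are assumptions for illustration; the source leaves obtaining the main color to color information per frame or an artificial intelligence model.

```python
import numpy as np

def main_color(frame):
    """Hypothetical helper: mean colour of an H x W x 3 frame."""
    return tuple(int(c) for c in frame.reshape(-1, 3).mean(axis=0))

def color_temperature_type(frame):
    """Classify the frame's main colour as 'warm' (red-leaning, e.g. a
    scene in which a fire has occurred) or 'cold' (blue-leaning)."""
    r, _, b = main_color(frame)
    return "warm" if r >= b else "cold"
```

A fire scene with a red main colour would thus map to the warm type, as in the example above.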
  • the artificial intelligence model may be stored in the electronic device 100 (eg, the memory 112). According to another embodiment, the artificial intelligence model may be stored in an external server communicable with the electronic device 100 .
  • the electronic device 100 may control a lighting function in conjunction with an external device.
  • the electronic device 100 may receive lighting information from an external device.
  • the lighting information may include at least one of brightness information and color temperature information set by an external device.
  • the external device may be a device connected to the same network as the electronic device 100 (e.g., an IoT device included in the same home/work network) or a device that is not on the same network as the electronic device 100 but can communicate with the electronic device (for example, a remote control server).
  • For example, an IoT device included in the same network as the electronic device 100 may output red light with a brightness of 50.
  • the external lighting device may directly or indirectly transmit lighting information (eg, information indicating that red light is output with a brightness of 50) to the electronic device 100 .
  • the electronic device 100 may control the output of the light source based on lighting information received from an external lighting device. For example, when lighting information received from an external lighting device includes information for outputting red light with a brightness of 50, the electronic device 100 may output red light with a brightness of 50.
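A minimal sketch of mirroring received lighting information onto the local light source. The dictionary keys and the `set_light` callable are hypothetical, not from the source:

```python
def apply_lighting_info(lighting_info, set_light):
    """Mirror an external device's lighting state (e.g. red light with a
    brightness of 50) on the local light source via `set_light`."""
    set_light(color=lighting_info.get("color", "white"),
              brightness=lighting_info.get("brightness", 100))
```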
  • the electronic device 100 may control a lighting function based on biometric information.
  • the processor 114 may obtain user's biometric information.
  • the biometric information may include at least one of the user's body temperature, heart rate, blood pressure, respiration, and electrocardiogram.
  • the biometric information may include various types of information in addition to the information described above.
  • an electronic device may include a sensor for measuring biometric information.
  • the processor 114 may obtain user's biometric information through a sensor and control the output of the light source based on the obtained biometric information.
  • the processor 114 may receive biometric information from an external device through the input/output interface 116 .
  • the external device may refer to a user's portable communication device (eg, a smart phone or a wearable device).
  • the processor 114 may obtain user's biometric information from an external device and control the output of the light source based on the obtained biometric information.
  • the electronic device may identify whether the user is sleeping, and if the user is identified as sleeping (or preparing for sleep), the processor 114 determines the light source based on the user's biometric information. You can control the output.
  • the memory 112 may store at least one command related to the electronic device 100 . Also, an operating system (O/S) for driving the electronic device 100 may be stored in the memory 112 . Also, various software programs or applications for operating the electronic device 100 may be stored in the memory 112 according to various embodiments of the present disclosure. Also, the memory 112 may include a semiconductor memory such as a flash memory or a magnetic storage medium such as a hard disk.
  • various software modules for operating the electronic device 100 may be stored in the memory 112 according to various embodiments of the present disclosure, and the processor 114 may control the operation of the electronic device 100 by executing the various software modules stored in the memory 112. That is, the memory 112 is accessed by the processor 114, and data can be read/written/modified/deleted/updated by the processor 114.
  • the term memory 112 may be used to include the memory 112, a ROM (not shown) and a RAM (not shown) in the processor 114, or a memory card (not shown) mounted in the electronic device 100 (e.g., a micro SD card or a memory stick).
  • the sensor unit 113 may include at least one sensor. Specifically, the sensor unit 113 may include at least one of a tilt sensor that senses the tilt of the electronic device 100 and an image sensor that captures an image.
  • the tilt sensor may be an acceleration sensor or a gyro sensor
  • the image sensor may mean a camera or a depth camera.
  • the sensor unit 113 may include various sensors other than a tilt sensor or an image sensor.
  • the sensor unit 113 may include an illuminance sensor and a distance sensor.
  • the sensor unit 113 may include a lidar sensor.
  • User interface 115 may include various types of input devices.
  • user interface 115 may include physical buttons.
  • the physical button may include a function key, a direction key (eg, a 4-direction key), or a dial button.
  • the physical button may be implemented as a plurality of keys.
  • the physical button may be implemented as one key.
  • the electronic device 100 may receive a user input in which one key is pressed for a threshold period of time or longer.
  • the processor 114 may perform a function corresponding to the user input. For example, processor 114 may provide a lighting function based on user input.
  • the user interface 115 may receive a user input using a non-contact method.
  • a method for controlling the electronic device regardless of physical force may be required.
  • the user interface 115 may receive a user gesture and perform an operation corresponding to the received user gesture.
  • the user interface 115 may receive a user's gesture through a sensor (eg, an image sensor or an infrared sensor).
  • the user interface 115 may receive a user input using a touch method.
  • the user interface 115 may receive a user input through a touch sensor.
  • the touch method may be implemented as a non-contact method.
  • the touch sensor may determine whether the user's body has approached within a threshold distance.
  • the touch sensor may identify a user input even when the user does not contact the touch sensor.
  • the touch sensor may identify a user input in which a user contacts the touch sensor.
  • the electronic device 100 may receive user input in various ways other than the above-described user interface.
  • the electronic device 100 may receive a user input through an external remote control device.
  • the external remote control device may be a remote control device corresponding to the electronic device 100 (eg, an electronic device-specific control device) or a user's portable communication device (eg, a smartphone or a wearable device).
  • the user's portable communication device may store an application for controlling the electronic device.
  • the portable communication device may obtain a user input through a stored application and transmit the acquired user input to the electronic device 100 .
  • the electronic device 100 may receive a user input from a portable communication device and perform an operation corresponding to a user's control command.
  • the electronic device 100 may receive a user input using voice recognition.
  • the electronic device 100 may receive a user's voice through a microphone included in the electronic device.
  • the electronic device 100 may receive a user's voice from a microphone or an external device.
  • the external device may acquire a user voice through a microphone of the external device and transmit the obtained user voice to the electronic device 100 .
  • the user's voice transmitted from the external device may be audio data or digital data obtained by converting the audio data (eg, audio data converted into a frequency domain).
  • the electronic device 100 may perform an operation corresponding to the received user voice.
  • the electronic device 100 may receive audio data corresponding to a user's voice through a microphone.
  • the electronic device 100 may convert the received audio data into digital data.
  • the electronic device 100 may convert the converted digital data into text data using a speech to text (STT) function.
  • the STT (speech to text) function may be performed directly in the electronic device 100, or may be performed in an external server.
  • the electronic device 100 may transmit digital data to an external server.
  • the external server may convert digital data into text data and obtain control command data based on the converted text data.
  • the external server may transmit control command data (which may also include the converted text data) to the electronic device 100 .
  • the electronic device 100 may perform an operation corresponding to the user's voice based on the obtained control command data.
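As an illustration only, the flow just described (microphone audio → digital data → text via STT → control command) can be sketched as below. The function names, the quantization scheme, and the command table are hypothetical stand-ins; a real device would call an actual STT engine on-device or on an external server.

```python
# Hypothetical sketch of the voice-command flow:
# microphone audio -> digital data -> text (STT) -> control command.

def audio_to_digital(audio_samples):
    """Stand-in for A/D conversion: quantize normalized samples to 8-bit ints."""
    return [max(0, min(255, int(s * 255))) for s in audio_samples]

def speech_to_text(digital_data):
    """Placeholder STT: a real device would invoke an on-device or
    server-side speech recognition engine here."""
    fake_transcripts = {0: "volume up", 1: "power off"}
    return fake_transcripts[sum(digital_data) % 2]

def text_to_command(text):
    """Map recognized text to a control command for the device."""
    command_table = {"volume up": "CMD_VOLUME_UP", "power off": "CMD_POWER_OFF"}
    return command_table.get(text, "CMD_UNKNOWN")

def handle_voice_input(audio_samples):
    digital = audio_to_digital(audio_samples)
    text = speech_to_text(digital)
    return text_to_command(text)
```

The split between `speech_to_text` running locally or remotely mirrors the two embodiments above (on-device STT versus transmitting digital data to an external server that returns control command data).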
  • the electronic device 100 may provide a voice recognition function using one assistant (or artificial intelligence assistant, eg, BixbyTM, etc.), but this is only an example, and a voice recognition function may be provided through a plurality of assistants.
  • the electronic device 100 may provide a voice recognition function by selecting one of a plurality of assistants based on a trigger word corresponding to the assistant or a specific key present on the remote control.
  • the electronic device 100 may receive a user input using screen interaction.
  • Screen interaction may refer to a function of identifying whether a predetermined event occurs through an image projected on a screen (or a projection surface) by an electronic device and acquiring a user input based on the predetermined event.
  • the predetermined event may refer to an event in which a predetermined object is identified at a specific location (eg, a location where a UI for receiving a user input is projected).
  • the predetermined object may include at least one of a user's body part (eg, a finger), a pointing stick, and a laser point.
  • the electronic device 100 may identify that a user input for selecting the projected UI has been received. For example, the electronic device 100 may project a guide image to display a UI on the screen. And, the electronic device 100 can identify whether the user selects the projected UI. Specifically, the electronic device 100 may identify that the user has selected the projected UI when a predetermined event is identified at the location of the projected UI.
  • the projected UI may include at least one or more items.
  • the electronic device 100 may perform spatial analysis to identify whether a predetermined event is located at the location of the projected UI.
  • the electronic device 100 may perform spatial analysis through a sensor (eg, an image sensor, an infrared sensor, a depth camera, a distance sensor, etc.).
  • the electronic device 100 may identify whether a predetermined event occurs at a specific location (the location where the UI is projected) by performing spatial analysis. And, if it is identified that a predetermined event occurs at a specific location (the location where the UI is projected), the electronic device 100 may identify that a user input for selecting a UI corresponding to the specific location has been received.
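The screen-interaction flow above amounts to a hit test: check whether the location of a predetermined event (a finger, pointing stick, or laser point) falls inside the area where a UI item is projected. Below is a minimal sketch, assuming projected items are tracked as axis-aligned rectangles; the representation and function names are illustrative assumptions, not part of the disclosure.

```python
def point_in_rect(point, rect):
    """rect = (x, y, width, height) of a projected UI item on the projection surface."""
    px, py = point
    x, y, w, h = rect
    return x <= px <= x + w and y <= py <= y + h

def select_ui_item(event_location, projected_items):
    """Return the UI item (if any) at the location where the predetermined
    event was identified; identifying one is treated as a user input."""
    for name, rect in projected_items.items():
        if point_in_rect(event_location, rect):
            return name
    return None

# Hypothetical layout of two projected UI items on the projection surface.
items = {"play": (10, 10, 40, 20), "stop": (60, 10, 40, 20)}
```

In practice the event location would come from the spatial analysis performed with an image sensor, infrared sensor, depth camera, or distance sensor, as described above.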
  • the input/output interface 116 is a component for inputting/outputting at least one of an audio signal and a video signal.
  • the input/output interface 116 may receive at least one of audio and video signals from an external device and output a control command to the external device.
  • the input/output interface 116 may be implemented as at least one wired input/output interface among HDMI (High Definition Multimedia Interface), MHL (Mobile High-Definition Link), USB (Universal Serial Bus), USB C-type, DP (Display Port), Thunderbolt, VGA (Video Graphics Array) port, RGB port, D-SUB (D-subminiature), and DVI (Digital Visual Interface).
  • the wired input/output interface may be implemented as an interface for inputting/outputting only audio signals and an interface for inputting/outputting only video signals, or may be implemented as one interface for inputting/outputting both audio and video signals.
  • the electronic device 100 may receive data through a wired input/output interface, but this is merely an example, and power may be supplied through the wired input/output interface.
  • the electronic device 100 may receive power from an external battery through USB C-type or from an outlet through a power adapter.
  • an electronic device may receive power from an external device (eg, a laptop computer or a monitor) through a DP.
  • the input/output interface 116 may be implemented as a wireless input/output interface that performs communication using at least one of Wi-Fi, Wi-Fi Direct, Bluetooth, Zigbee, 3rd Generation (3G), 3rd Generation Partnership Project (3GPP), and Long Term Evolution (LTE) communication methods.
  • the wireless input/output interface may be implemented as an interface for inputting/outputting only audio signals and an interface for inputting/outputting only video signals, or may be implemented as one interface for inputting/outputting both audio and video signals.
  • an audio signal may be input through a wired input/output interface, and a video signal may be input through a wireless input/output interface.
  • an audio signal may be input through a wireless input/output interface and a video signal may be input through a wired input/output interface.
  • the audio output unit 117 is a component that outputs an audio signal.
  • the audio output unit 117 may include an audio output mixer, an audio signal processor, and a sound output module.
  • the audio output mixer may synthesize a plurality of audio signals to be output into at least one audio signal.
  • the audio output mixer may combine an analog audio signal and another analog audio signal (eg, an analog audio signal received from the outside) into at least one analog audio signal.
  • the sound output module may include a speaker or an output terminal.
  • the sound output module may include a plurality of speakers, and in this case the sound output module may be disposed inside the main body, and sound emitted while covering at least a part of a diaphragm of the sound output module may pass through a sound conduit (waveguide) and be transmitted to the outside of the main body.
  • the sound output module includes a plurality of sound output units, and since the plurality of sound output units are symmetrically disposed on the exterior of the main body, sound can be emitted in all directions, that is, in all directions of 360 degrees.
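As a rough sketch of what the audio output mixer described above does, the snippet below sums two digitized signals sample by sample and clips the result to the valid range. This normalized toy form is an assumption for illustration; a real mixer operates on analog or PCM streams with proper gain control.

```python
def mix_signals(a, b, lo=-1.0, hi=1.0):
    """Synthesize two audio signals into one by summing corresponding
    samples and clipping to the representable range."""
    return [max(lo, min(hi, x + y)) for x, y in zip(a, b)]
```

For example, a decoded audio track could be combined with an externally received signal this way before the result is handed to the sound output module.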
  • the power supply unit 118 may receive power from the outside and supply power to various components of the electronic device 100 .
  • the power supply unit 118 may receive power through various methods. As an example, the power supply unit 118 may receive power using the connector 130 shown in FIG. 1 . In addition, the power supply unit 118 may receive power using a 220V DC power cord. However, the present invention is not limited thereto, and the electronic device may receive power using a USB power cord or a wireless charging method.
  • the power supply unit 118 may receive power using an internal battery or an external battery.
  • the power supply unit 118 according to an embodiment of the present disclosure may receive power through an internal battery.
  • the power supply unit 118 may charge power of an internal battery using at least one of a 220V DC power cord, a USB power cord, and a USB C-Type power cord, and may receive power through the charged internal battery.
  • the power supply unit 118 according to an embodiment of the present disclosure may receive power through an external battery.
  • the power supply unit 118 may receive power through the external battery. That is, the power supply unit 118 may directly receive power from an external battery, or may charge an internal battery through an external battery and receive power from the charged internal battery.
  • the power supply unit 118 may receive power using at least one of the plurality of power supply methods described above.
  • the electronic device 100 may have power consumption equal to or less than a preset value (eg, 43W) due to a socket type and other standards.
  • the electronic device 100 may vary power consumption to reduce power consumption when using a battery. That is, the electronic device 100 may vary power consumption based on a power supply method and power usage.
  • the electronic device 100 may provide various smart functions.
  • the electronic device 100 may be connected to a portable terminal device for controlling the electronic device 100, and a screen output from the electronic device 100 may be controlled through a user input entered on the portable terminal device.
  • the portable terminal device may be implemented as a smart phone including a touch display, and the electronic device 100 may receive and output screen data provided by the portable terminal device, and the screen output from the electronic device 100 may be controlled according to a user input entered on the portable terminal device.
  • the electronic device 100 may share content or music provided by the portable terminal device by connecting to the portable terminal device through various communication methods such as Miracast, Airplay, wireless DEX, and Remote PC.
  • the mobile terminal device and the electronic device 100 may be connected through various connection methods.
  • the portable terminal device may perform a wireless connection by searching for the electronic device 100 or the electronic device 100 may search for the portable terminal device and perform a wireless connection.
  • the electronic device 100 may output content provided by the portable terminal device.
  • the electronic device 100 may output content or music currently being output on the portable terminal device.
  • while specific content or music is being output from the portable terminal device, when the portable terminal device is brought within a predetermined distance of the electronic device 100 (eg, non-contact tap view) or brought close to the electronic device 100, the electronic device 100 may output the content or music currently being output on the portable terminal device.
  • when a connection is established between the portable terminal device and the electronic device 100, the portable terminal device may output a first screen provided by the portable terminal device, and the electronic device 100 may output a second screen, different from the first screen, provided by the portable terminal device.
  • the first screen may be a screen provided by a first application installed on the portable terminal device, and the second screen may be a screen provided by a second application installed on the portable terminal device.
  • the first screen and the second screen may be different screens provided by one application installed in the portable terminal device.
  • the first screen may be a screen including a remote control type UI for controlling the second screen.
  • the electronic device 100 may output a standby screen.
  • Conditions for the electronic device 100 to output the standby screen are not limited to the above examples, and the standby screen may be output under various conditions.
  • the electronic device 100 may output a standby screen in the form of a blue screen, but the present disclosure is not limited thereto.
  • the electronic device 100 may obtain an irregular object by extracting only the shape of a specific object from data received from an external device, and output an idle screen including the acquired irregular object.
  • the communication interface 119 is a component that performs communication with various types of external devices according to various types of communication methods.
  • the communication interface 119 may include a Wi-Fi module, a Bluetooth module, an infrared communication module, and a wireless communication module.
  • the Wi-Fi module and the Bluetooth module perform communication using the Wi-Fi method and the Bluetooth method, respectively.
  • the wireless communication module may include at least one communication chip that performs communication according to various wireless communication standards such as Zigbee, 3rd Generation (3G), 3rd Generation Partnership Project (3GPP), Long Term Evolution (LTE), LTE Advanced (LTE-A), 4th Generation (4G), and 5th Generation (5G).
  • the shutter unit 120 may include at least one of a shutter 121 , a fixing member 122 , a rail 123 , a body 124 , and a motor 125 .
  • the shutter 121 may block light output from the projection unit 111 .
  • the fixing member 122 may fix the position of the shutter 121 .
  • the rail 123 may be a path for moving the shutter 121 and the fixing member 122 .
  • the body 124 may be a component that houses the shutter 121 and the fixing member 122.
  • the motor 125 may be a component that provides driving power for movement of components in the shutter unit 120 (for example, movement of the body 124) or rotation of components (for example, rotation of the shutter 121).
  • FIG. 3 is a perspective view illustrating an external appearance of an electronic device 100 according to other embodiments of the present disclosure.
  • the electronic device 100 may include a support (or referred to as a “handle”) 108a.
  • the support 108a of various embodiments may be a handle or a hook provided for a user to grip or move the electronic device 100, or the support 108a may be a stand that supports the main body 105 in a state where the main body 105 is laid down in the lateral direction.
  • the support 108a may be coupled to or separated from the outer circumferential surface of the main body 105 in a hinge structure, and may be selectively separated and fixed from the outer circumferential surface of the main body 105 according to the user's needs.
  • the number, shape, or arrangement of the supports 108a may be variously implemented without limitation.
  • the support 108a may be built into the main body 105 and taken out and used by the user as needed, or the support 108a may be implemented as a separate accessory that is detachable from the electronic device 100.
  • the support 108a may include a first support surface 108a-1 and a second support surface 108a-2.
  • the first support surface 108a-1 may be a surface facing the outside of the main body 105 in a state where the support 108a is separated from the outer circumferential surface of the main body 105, and the second support surface 108a-2 may be a surface facing the inner direction of the main body 105 in a state where the support 108a is separated from the outer circumferential surface of the main body 105.
  • the first support surface 108a-1 may unfold from the lower part of the main body 105 toward the upper part of the main body 105 and move away from the main body 105, and the first support surface 108a-1 may have a flat or uniformly curved shape.
  • the first support surface 108a-1 may support the main body 105 when the electronic device 100 is mounted so that the outer surface of the main body 105 touches the bottom surface, that is, when the projection lens 110 is disposed facing the front direction.
  • the emission angle of the head 103 and the projection lens 110 may be adjusted by adjusting the interval or the hinge opening angle of the two supports 108a.
  • the second support surface 108a-2 is a surface that comes into contact with the user or an external mounting structure when the support 108a is supported by the user or the external mounting structure, and may have a shape corresponding to a gripping structure of the user's hand or the external mounting structure so as not to slip when the electronic device 100 is supported or moved. The user may also direct the projection lens 110 toward the front, fix the head 103, hold the support 108a, move the electronic device 100, and use the electronic device 100 like a flashlight.
  • the support groove 104 is a groove structure provided in the main body 105 that can accommodate the support 108a when it is not in use, and may be implemented as a groove structure corresponding to the shape of the support 108a. Through the support groove 104, the support 108a can be stored on the outer circumferential surface of the main body 105 when not in use, and the outer circumferential surface of the main body 105 can be kept smooth.
  • the support 108a may be stored inside the main body 105 and pulled out of the main body 105 in a situation where the support 108a is needed.
  • in this case, the support groove 104 may be a structure recessed into the main body 105 to accommodate the support 108a, and a separate door (not shown) that opens and closes the support groove 104 may be included.
  • the electronic device 100 may include various types of accessories that help in using or storing the electronic device 100. For example, the electronic device 100 may include a protective case (not shown) to protect and easily transport the electronic device 100, a tripod (not shown) that supports or fixes the main body 105, or a bracket (not shown) that can be coupled to an external surface to fix the electronic device 100.
  • FIG. 4 is a perspective view illustrating an external appearance of an electronic device 100 according to another embodiment of the present disclosure.
  • the electronic device 100 may include a support (or referred to as a “handle”) 108b.
  • the support 108b of various embodiments may be a handle or a hook provided for a user to grip or move the electronic device 100, or the support 108b may be a main body ( 105) may be a stand that supports it so that it can be directed at an arbitrary angle.
  • the support 108b may be connected to the main body 105 at a predetermined point (eg, 2/3 to 3/4 of the height of the main body) of the main body 105, so that the main body 105 can be supported at an arbitrary angle in a state where the main body 105 is laid down in the lateral direction.
  • FIG. 5 is a perspective view illustrating an external appearance of an electronic device 100 according to another embodiment of the present disclosure.
  • the electronic device 100 may include a support (or referred to as a “stand”) 108c.
  • the support 108c of various embodiments may include a base plate 108c-1 provided to support the electronic device 100 on the ground and two support members 108c-2 connecting the base plate 108c-1 and the main body 105.
  • the two support members 108c-2 have the same height, so that one end surface of each of the two support members 108c-2 may be coupled to or separated from a groove and a hinge member 108c-3 provided on one outer circumferential surface of the main body 105.
  • the two supporting members may be hingedly connected to the main body 105 at a predetermined point (eg, 1/3 to 2/4 of the height of the main body) of the main body 105 .
  • the main body 105 may be rotated based on an imaginary horizontal axis formed by the two hinge members 108c-3, so that the emission angle of the projection lens 110 can be adjusted.
  • FIG. 5 shows an embodiment in which two support members 108c-2 are connected to the main body 105, but the present disclosure is not limited thereto, and as shown in FIGS. 6A and 6B, one support member and the main body 105 may be connected by one hinge member.
  • FIG. 6A is a perspective view illustrating an external appearance of an electronic device 100 according to another embodiment of the present disclosure.
  • FIG. 6B is a perspective view illustrating a state in which the electronic device 100 of FIG. 6A is rotated.
  • the support 108d of various embodiments may include a base plate 108d-1 provided to support the electronic device 100 on the ground and one support member 108d-2 connecting the base plate 108d-1 and the main body 105.
  • the one support member 108d-2 may be coupled to or separated from a groove and a hinge member (not shown) provided on one outer circumferential surface of the main body 105.
  • the supports shown in FIGS. 3, 4, 5, 6A, and 6B are merely examples, and the electronic device 100 may have supports in various positions or shapes.
  • FIG. 7 is a flowchart for explaining a control method of the shutter unit 120 according to an embodiment of the present disclosure.
  • the electronic device 100 may acquire an image to be output on the projection surface (S705). And, the electronic device 100 may perform keystone correction using the obtained image (S710). Then, the electronic device 100 may identify a remaining portion based on the keystone-corrected image (S715). Specifically, the electronic device 100 may identify a remaining portion excluding an image portion where an image is output from an output portion to which light may be irradiated by the electronic device 100 . And, the electronic device 100 may control the shutter unit based on the remaining portion (S720).
  • FIG. 8 is a flowchart for explaining the control method of FIG. 7 in detail.
  • the electronic device 100 may obtain a first image to be output on the projection surface (S805). Then, the electronic device 100 may obtain a second image by performing keystone correction on the first image (S810).
  • the second image may include an image portion where the target image is displayed and a remaining portion where the target image is not displayed.
  • the electronic device 100 may identify the remaining portion of the second image on which the target image is not displayed (S815).
  • the electronic device 100 may control the shutter unit 120 so that the remaining portion is not projected onto the projection surface (S820). And, the electronic device 100 may output the second image (S825). The electronic device 100 may output the second image including both the image part and the remaining part through the projection lens 110 . However, since the shutter unit 120 covers the remaining portion, only the image portion may be displayed on the projection surface and the remaining portion may not be displayed.
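Steps S805 through S820 can be sketched minimally as follows, with a grid of cells standing in for a real image and a toy trapezoidal warp standing in for true keystone correction. The 1/0 cell encoding (1 = image portion containing the target image, 0 = remaining portion) and the column-wise shutter command are illustrative assumptions, not the disclosed implementation.

```python
def keystone_correct(width, height):
    """S810: build a second image in which the target image occupies a
    trapezoid (toy stand-in for keystone correction). 1 marks the image
    portion, 0 marks the remaining portion."""
    image = []
    for row in range(height):
        inset = row + 1  # each row shrinks, forming a trapezoid
        image.append([1 if inset <= col < width - inset else 0
                      for col in range(width)])
    return image

def remaining_portion(image):
    """S815: identify coordinates of the remaining portion (value 0)."""
    return [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v == 0]

def shutter_columns(image):
    """S820: toy shutter control, covering every column that contains only
    remaining-portion cells so no light is projected onto that strip."""
    width = len(image[0])
    return [c for c in range(width)
            if all(row[c] == 0 for row in image)]
```

When the second image is then output (S825), the covered strips correspond to the remaining area that never reaches the projection surface, so only the image portion is displayed.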
  • FIG. 9 is a diagram for explaining an output operation of an image on which keystone correction has been performed according to an embodiment of the present disclosure.
  • the electronic device 100 may acquire an image 910 on which keystone correction has been performed.
  • the image 910 on which keystone correction has been performed includes an image portion 911 including the target image and a residual portion 912 not including the target image.
  • the electronic device 100 may output the keystone-corrected image 910 to the projection surface 901 through the projection lens 110 .
  • an outputable area 920, an image area 921, and a remaining area 922 may exist on the projection surface.
  • the outputable area 920 is an area to which the electronic device 100 can irradiate light using the projection lens 110 .
  • the outputable area 920 is an area where the entire keystone-corrected image 910 is output.
  • the image area 921 is an area where the image portion 911 including the target image is output.
  • the remaining area 922 is an area where the remaining portion 912 not including the target image is output.
  • the electronic device 100 may output the image portion 911 including the target image to the image area 921 of the projection surface 901 . And, the electronic device 100 may output the remaining portion 912 to the remaining area 922 of the projection surface 901 .
  • FIG. 10 is a diagram for explaining an output operation of an image on which keystone correction is performed according to another embodiment of the present disclosure.
  • the electronic device 100 may acquire an image 1010 on which keystone correction has been performed.
  • the image 1010 on which keystone correction has been performed includes an image portion 1011 including the target image and a remaining portion 1012 not including the target image.
  • the electronic device 100 may output the keystone-corrected image 1010 to the projection surface 1001 through the projection lens 110 .
  • an outputable area 1020, an image area 1021, and a remaining area 1022 may exist on the projection surface.
  • the outputable area 1020 is an area to which the electronic device 100 can irradiate light using the projection lens 110 .
  • the outputable area 1020 is an area where the entire image 1010 on which keystone correction is performed is output.
  • the image area 1021 is an area where the image portion 1011 including the target image is output.
  • the remaining area 1022 is an area where the remaining portion 1012 not including the target image is output.
  • the electronic device 100 may output the image portion 1011 including the target image to the image area 1021 of the projection surface 1001 . And, the electronic device 100 may output the remaining portion 1012 to the remaining area 1022 of the projection surface 1001 .
  • FIG. 11 is a diagram for explaining an output operation of an image on which keystone correction is performed according to another embodiment of the present disclosure.
  • the electronic device 100 may obtain an image 1110 on which keystone correction has been performed.
  • the image 1110 on which keystone correction has been performed includes an image portion 1111 including the target image and a remaining portion 1112 not including the target image.
  • the electronic device 100 may output the keystone-corrected image 1110 to the projection surface 1101 through the projection lens 110 .
  • an outputable area 1120, an image area 1121, and a remaining area 1122 may exist on the projection surface.
  • the outputable area 1120 is an area to which the electronic device 100 can irradiate light using the projection lens 110 .
  • the outputable area 1120 is an area where the entire keystone corrected image 1110 is output.
  • the image area 1121 is an area where the image portion 1111 including the target image is output.
  • the remaining area 1122 is an area where the remaining portion 1112 not including the target image is output.
  • the electronic device 100 may output the image portion 1111 including the target image to the image area 1121 of the projection surface 1101 . And, the electronic device 100 may output the remaining portion 1112 to the remaining area 1122 of the projection surface 1101 .
  • FIG. 12 is a diagram for explaining an operation of the shutter unit 120 according to an embodiment of the present disclosure.
  • an embodiment 1210 shows a situation before the shutter unit 120 moves.
  • the shutter unit 120 may include a plurality of shutters 121-1, 121-2, 121-3, and 121-4.
  • before the shutter unit 120 moves, the plurality of shutters 121-1, 121-2, 121-3, and 121-4 may not cover the projection lens 110.
  • the embodiment 1220 shows a situation after the shutter unit 120 moves.
  • the electronic device 100 may control the shutter unit 120 to move a plurality of shutters.
  • the electronic device 100 may cover a portion of the projection lens 110 by moving the plurality of shutters 121-1, 121-2, 121-3, and 121-4.
  • FIG. 13 is a diagram for explaining the operation of the shutter unit 120 according to another embodiment of the present disclosure.
  • the shutter unit 120 may include a plurality of shutters 121-1, 121-2, 121-3, and 121-4 and a plurality of fixing members 122-1, 122-2, 122-3, and 122-4.
  • each of the plurality of shutters 121-1, 121-2, 121-3, and 121-4 may be fixed to the body 124 of the shutter unit 120 by the plurality of fixing members 122-1, 122-2, 122-3, and 122-4, respectively.
  • each of the plurality of fixing members 122-1, 122-2, 122-3, and 122-4 may be disposed at the central portion of each of the plurality of shutters 121-1, 121-2, 121-3, and 121-4.
  • the plurality of fixing members 122-1, 122-2, 122-3, and 122-4 may be disposed at one of the left central part, the right central part, the upper central part, and the lower central part of the projection lens 110.
  • the first shutter 121-1 may be fixed to the body 124 of the shutter unit 120 by the first fixing member 122-1.
  • the first fixing member 122-1 may be disposed at a central portion of the first shutter 121-1. Descriptions of other shutters and fixing members are omitted.
  • each of the plurality of shutters 121-1, 121-2, 121-3, and 121-4 may be rotated clockwise or counterclockwise based on each of the plurality of fixing members 122-1, 122-2, 122-3, and 122-4.
  • a portion of the projection lens 110 may be covered according to rotation of the plurality of shutters 121-1, 121-2, 121-3, and 121-4.
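The rotation-based covering described for FIG. 13 can be illustrated with a toy angular model: treat each shutter as a 90-degree sector anchored at its fixing member on the lens rim, so rotating a shutter shifts the angular range of the lens it blocks. The degree-level bookkeeping below is purely an assumed illustration, not the disclosed mechanism.

```python
def rotate_shutter(anchor_deg, rotation_deg):
    """Return the starting angle of the 90-degree arc a shutter covers
    after rotating by rotation_deg about its fixing member, which sits
    at anchor_deg on the lens rim."""
    return (anchor_deg + rotation_deg) % 360

def covered_degrees(sector_starts, sector_span=90):
    """Union of whole degrees of the lens aperture blocked by all shutters
    (toy 1-degree resolution)."""
    blocked = set()
    for start in sector_starts:
        for d in range(sector_span):
            blocked.add((start + d) % 360)
    return blocked
```

With four shutters anchored every 90 degrees and no rotation, the full aperture is blocked; dropping a shutter from the list leaves its quadrant open, which is the toy analogue of uncovering part of the projection lens.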
  • FIG. 14 is a diagram for explaining the operation of the shutter unit 120 according to another embodiment of the present disclosure.
  • the shutter unit 120 may include a plurality of shutters 121-1, 121-2, 121-3, 121-4, 121-5, 121-6, 121-7, and 121-8 and a plurality of fixing members 122-1, 122-2, 122-3, 122-4, 122-5, 122-6, 122-7, and 122-8.
  • each of the plurality of shutters 121-1, 121-2, 121-3, 121-4, 121-5, 121-6, 121-7, and 121-8 may be fixed to the body 124 of the shutter unit 120 by the plurality of fixing members 122-1, 122-2, 122-3, 122-4, 122-5, 122-6, 122-7, and 122-8, respectively.
  • each of the plurality of fixing members 122-1, 122-2, 122-3, 122-4, 122-5, 122-6, 122-7, and 122-8 may be disposed on a side surface (left side, right side, upper side, or lower side) of each of the plurality of shutters 121-1, 121-2, 121-3, 121-4, 121-5, 121-6, 121-7, and 121-8.
  • each of the plurality of fixing members 122-1, 122-2, 122-3, 122-4, 122-5, 122-6, 122-7, and 122-8 may be disposed in a corner region (upper left part, upper right part, lower left part, or lower right part) of the projection lens 110.
  • if the projection lens 110 has a circular shape, an arbitrary point may be designated, and the plurality of fixing members 122-1, 122-2, 122-3, 122-4, 122-5, 122-6, 122-7, and 122-8 may be disposed at the designated point.
  • the first shutter 121-1 may be fixed to the body 124 of the shutter unit 120 by the first fixing member 122-1.
  • the first fixing member 122-1 may be disposed on a side portion of the first shutter 121-1. Descriptions of other shutters and fixing members are omitted.
  • the electronic device 100 may rotate each of the plurality of shutters 121-1, 121-2, 121-3, 121-4, 121-5, 121-6, 121-7, and 121-8 clockwise or counterclockwise based on each of the plurality of fixing members 122-1, 122-2, 122-3, 122-4, 122-5, 122-6, 122-7, and 122-8.
  • according to the rotation of the plurality of shutters, a part of the projection lens 110 can be covered.
  • FIG. 15 is a diagram for explaining the operation of the shutter unit 120 according to another embodiment of the present disclosure.
  • the shutter unit 120 may include a plurality of shutters 121-1-1, 121-1-2, 121-2-1, 121-2-2, 121-3-1, 121-3-2, 121-4-1, and 121-4-2 and a plurality of fixing members 122-1, 122-2, 122-3, and 122-4.
  • each of the plurality of shutters 121-1-1, 121-1-2, 121-2-1, 121-2-2, 121-3-1, 121-3-2, 121-4-1, and 121-4-2 may be fixed to the body 124 of the shutter unit 120 by the plurality of fixing members 122-1, 122-2, 122-3, and 122-4.
  • each of the plurality of fixing members 122-1, 122-2, 122-3, and 122-4 may be disposed on a side part (left side, right side, upper side, or lower side) of each of the plurality of shutters 121-1-1, 121-1-2, 121-2-1, 121-2-2, 121-3-1, 121-3-2, 121-4-1, and 121-4-2.
  • each of the plurality of fixing members 122-1, 122-2, 122-3, and 122-4 may be disposed in a corner region (upper left part, upper right part, lower left part, or lower right part) of the projection lens 110. According to another embodiment, if the projection lens 110 has a circular shape, an arbitrary point may be designated, and the plurality of fixing members 122-1, 122-2, 122-3, and 122-4 may be disposed at the designated point.
  • the first shutter 121-1-1 may be fixed to the body 124 of the shutter unit 120 by the first fixing member 122-1.
  • the first fixing member 122-1 may be disposed on a side portion of the first shutter 121-1-1. Descriptions of other shutters and fixing members are omitted.
  • two or more shutters may be fixed to one fixing member.
  • the electronic device 100 may rotate each of the plurality of shutters 121-1-1, 121-1-2, 121-2-1, 121-2-2, 121-3-1, 121-3-2, 121-4-1, and 121-4-2 clockwise or counterclockwise about each of the plurality of fixing members 122-1, 122-2, 122-3, and 122-4.
  • according to the rotation of the plurality of shutters 121-1-1, 121-1-2, 121-2-1, 121-2-2, 121-3-1, 121-3-2, 121-4-1, and 121-4-2, a part of the projection lens 110 may be covered.
  • FIG. 16 is a diagram for explaining the operation of the shutter unit 120 according to another embodiment of the present disclosure.
  • the shutter unit 120 includes a plurality of shutters 121-1, 121-2, 121-3, and 121-4 and a plurality of fixing members 122-1, 122-2, 122-3, 122-4).
  • the plurality of shutters 121-1, 121-2, 121-3, and 121-4 may have a fan shape.
  • the fan-shaped sizes of the plurality of shutters 121-1, 121-2, 121-3, and 121-4 may be fixed. The fan-shaped shutters 121-1, 121-2, 121-3, and 121-4 having fixed sizes may be rotated about the fixing members 122-1, 122-2, 122-3, and 122-4.
  • the fan-shaped sizes of the plurality of shutters 121-1, 121-2, 121-3, and 121-4 may be changed.
  • the electronic device 100 may control the plurality of shutters 121-1, 121-2, 121-3, and 121-4 to change the size of the sector. A specific operation related to this is described in the embodiment 2810 of FIG. 28 .
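Since a fan-shaped shutter is characterized by its central angle, its size and coverage can be sketched with circular-sector arithmetic. These formulas are illustrative and assume each blade spans the full aperture radius; the disclosure itself does not give them:

```python
import math

def sector_area(radius, angle_deg):
    """Area of a fan-shaped (circular-sector) shutter blade."""
    return 0.5 * radius ** 2 * math.radians(angle_deg)

def fraction_covered(angle_deg):
    """Fraction of a circular aperture hidden by one fan blade of the
    given central angle, assuming the blade spans the full radius."""
    return angle_deg / 360.0

# Four 45-degree blades together hide half of the aperture.
print(sum(fraction_covered(45) for _ in range(4)))  # 0.5
```

Changing the size of the sector, as in the embodiment 2810 of FIG. 28, then corresponds to growing or shrinking `angle_deg` for a blade.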
  • FIG. 17 is a diagram for explaining the operation of the shutter unit 120 according to another embodiment of the present disclosure.
  • the shutter unit 120 includes a plurality of shutters 121-1, 121-2, 121-3, 121-4, 121-5, and 121-6 and a plurality of fixing members 122-1, 122-2, 122-3, 122-4, 122-5, 122-6).
  • the plurality of fixing members 122-1, 122-2, 122-3, 122-4, 122-5, and 122-6 may be disposed at arbitrary points around the projection lens 110.
  • the plurality of fixing members 122-1, 122-2, 122-3, 122-4, 122-5, and 122-6 are upper left, upper, upper right, lower right, lower, and lower left portions. can be placed in
  • FIG. 18 is a diagram for explaining the operation of the shutter unit 120 according to another embodiment of the present disclosure.
  • the projection lens 110 is completely covered by the shutter unit 120.
  • the projection lens 110 may be completely covered by the plurality of shutters 121-1, 121-2, 121-3, and 121-4.
  • the embodiment 1820 represents a situation in which only a portion of the projection lens 110 is covered by the shutter unit 120 .
  • the electronic device 100 may move the plurality of shutters 121-1, 121-2, 121-3, and 121-4, and according to the movement of the plurality of shutters 121-1, 121-2, 121-3, and 121-4, only a portion of the projection lens 110 may be covered by the plurality of shutters 121-1, 121-2, 121-3, and 121-4.
  • FIG. 19 is a view for explaining the arrangement of the projection lens 110 and the shutter unit 120.
  • the electronic device 100 may include a projection lens 110 and a shutter unit 120.
  • the central axis 1910 of the projection lens 110 and the central axis 1910 of the shutter unit 120 may be aligned.
  • the central axis of the projection lens 110 and the central axis of the shutter unit 120 may not coincide.
  • FIG. 20 is a view for explaining an embodiment in which a shutter unit is disposed to correspond to an offset of a projection lens.
  • an embodiment 2010 shows the arrangement of the projection lens 110 and the shutter unit 120 in a situation where the offset of the projection lens 110 is 0%.
  • the central axis of the projection lens 110 and the central axis of the shutter unit 120 may coincide.
  • Embodiment 2020 shows the arrangement of the projection lens 110 and the shutter unit 120 in a situation where the offset of the projection lens 110 is 100%.
  • the central axis of the projection lens 110 and the central axis of the shutter unit 120 may not coincide.
  • the shutter unit 120 may be disposed at a position corresponding to an offset (100%) of the projection lens 110 .
  • Embodiment 2030 shows the arrangement of the projection lens 110 and the shutter unit 120 in a situation where the offset of the projection lens 110 is 150%.
  • the central axis of the projection lens 110 and the central axis of the shutter unit 120 may not coincide.
  • the shutter unit 120 may be disposed at a position corresponding to an offset (150%) of the projection lens 110 .
  • Embodiment 2040 shows the arrangement of the projection lens 110 and the shutter unit 120 in a situation where the offset of the projection lens 110 is 200%.
  • the central axis of the projection lens 110 and the central axis of the shutter unit 120 may not coincide.
  • the shutter unit 120 may be disposed at a position corresponding to an offset (200%) of the projection lens 110 .
  • as the offset increases, the shutter unit 120 may be disposed further from the central axis of the projection lens 110.
  • the offset may be determined according to the characteristics of the projection lens 110 and may be changed by user settings of the electronic device 100 . Accordingly, when the offset is identified, the electronic device 100 may change the position of the shutter unit 120 to correspond to the identified offset.
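One way to model the offset-dependent placement described above is a linear mapping from the offset percentage to a displacement from the lens axis. The linear scale (100% offset = one half-image height of displacement) is an assumption for illustration, not stated in the disclosure:

```python
def shutter_center_y(lens_center_y, offset_percent, half_image_height):
    """Vertical position for the shutter unit given the lens offset.

    Assumption (not from the source): displacement grows linearly
    with the offset, so 0% keeps the central axes coincident and
    100% displaces the shutter unit by one half-image height.
    """
    return lens_center_y + (offset_percent / 100.0) * half_image_height

print(shutter_center_y(0.0, 0, 10.0))    # 0.0  (axes coincide, as at 0%)
print(shutter_center_y(0.0, 150, 10.0))  # 15.0 (displaced further, as at 150%)
```

When the electronic device identifies a new offset (e.g. after a user setting change), it would re-evaluate this position and move the shutter unit accordingly.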
  • FIG. 21 is a diagram for explaining an operation of identifying a remaining part.
  • the electronic device 100 may acquire an image 2110 on which keystone correction has been performed.
  • the electronic device 100 may identify the remaining portion in the image 2110.
  • a specific portion 2120 in the image 2110 will be described as a reference.
  • the image 2130 may be an image obtained by rotating the specific portion 2120 counterclockwise by 90 degrees.
  • the angle between the line segments 2131 and 2132 of PA(x1, y1) and PB(x2, y2) may be θ1.
  • the angle between the line segments 2131 and 2133 of PA(x1, y1) and PB(x2, y2) may be θ2.
  • θ1 may be obtained as arctan((y2 - y1) / (x2 - x1)) from PA(x1, y1) and PB(x2, y2), and a margin a1 obtained using tan(θ1) may be used.
  • the electronic device 100 may determine the moving position of the shutter unit 120 in consideration of d1 + a1.
  • the moving position may be determined by considering the distance between the DMD (Digital Micromirror Device) panel and the shutter in addition to d1 + a1.
  • the electronic device 100 may determine the moving position of the shutter unit 120 in consideration of the height ratio of the height of the DMD panel itself and the size projected on the shutter.
  • D_all_s = d_all * H / h.
  • D_all_s may mean a movement distance of the actual shutter 121 .
  • d_all may mean a movement distance within the DMD panel.
  • h may mean the height of the DMD panel.
  • H may mean the height of the shutter 121 (or the height of the shutter 121 position).
  • w may be used instead of h, and w may mean the width of the DMD panel.
  • W may be used instead of H, and W may mean the screen width at the shutter position.
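The height-ratio scaling above can be applied directly: a distance computed in DMD-panel coordinates is multiplied by H/h (or W/w) to get the physical shutter travel. A minimal sketch with illustrative dimensions:

```python
def shutter_travel(d_all, h, H):
    """D_all_s = d_all * H / h: scale a movement distance computed on
    the DMD panel (height h) up to the shutter plane (height H).
    The width pair (w, W) can be substituted in the same way."""
    return d_all * H / h

# A 2 mm movement on a 10 mm-high panel becomes 8 mm at a
# 40 mm-high shutter plane.
print(shutter_travel(2.0, 10.0, 40.0))  # 8.0
```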
  • P0 (x0, y0) is the origin where the shutter unit 120 is disposed.
  • PA(x1, y1) and PB(x2, y2) are edges of the image portion including the target image in the keystone-corrected image 2110.
  • PP(xp, yp), PC(x3, y3), and PD(x4, y4) are movable points of the shutter unit 120.
  • the electronic device 100 may move the shutter unit 120 from P0 (x0, y0) to a specific point to cover the remaining portion. Information on P0(x0, y0) may be pre-stored. Also, the electronic device 100 may acquire PA(x1, y1) and PB(x2, y2) based on the image 2110 on which keystone correction has been performed. In addition, the electronic device 100 may move the shutter unit 120 to a location other than PA(x1, y1) and PB(x2, y2) in consideration of the possibility of further expansion of the remaining portion.
  • the electronic device 100 may move the shutter unit 120 by 'y1-y0' (d1) in the y-axis direction and by 'x1-x0' in the x-axis direction. In addition, the electronic device 100 may rotate the shutter included in the shutter unit 120 by θ1 (or by an angle calculated based on θ1) after moving the shutter unit 120 to PA(x1, y1).
  • the electronic device 100 may move the shutter unit 120 by 'y2-y0' (d2) in the y-axis direction and by x2-x0 in the x-axis direction.
  • the electronic device 100 may rotate the shutter included in the shutter unit 120 by θ2 (or by an angle calculated based on θ2) after moving the shutter unit 120 to PB(x2, y2).
  • the electronic device 100 may move the shutter unit 120 by 'y3-y0' (d1 + a1) in the y-axis direction. In addition, the electronic device 100 may rotate the shutter included in the shutter unit 120 by θ1 (or by an angle calculated based on θ1) after moving the shutter unit 120 to PC(x3, y3).
  • the electronic device 100 may move the shutter unit 120 by 'x4-x0' in the x-axis direction.
  • the electronic device 100 may rotate the shutter included in the shutter unit 120 by θ2 (or by an angle calculated based on θ2) after moving the shutter unit 120 to PD(x4, y4).
  • the electronic device 100 may move the shutter unit 120 by 'yp-y0' (d1) in the y-axis direction. In addition, the electronic device 100 may rotate the shutter included in the shutter unit 120 by a new angle after moving the shutter unit 120 to PP(xp, yp).
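The movements to PA, PB, PC, PD, and PP above all follow the same pattern: translate the shutter unit from its origin P0 by a coordinate difference, then rotate the blade. A minimal sketch (the coordinates and angle are illustrative, not from the source):

```python
def plan_move(p0, target, rotation_deg):
    """Translation (dx, dy) that carries the shutter unit from its
    origin P0 to a target point such as PA or PB, together with the
    blade rotation to apply after arriving (the rotation value
    itself, e.g. theta1, is computed elsewhere)."""
    dx = target[0] - p0[0]
    dy = target[1] - p0[1]
    return dx, dy, rotation_deg

# Move from P0(0, 0) to PA(3, 4), then rotate the blade by 30 degrees.
print(plan_move((0.0, 0.0), (3.0, 4.0), 30.0))  # (3.0, 4.0, 30.0)
```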
  • the electronic device 100 may obtain movement information and rotation information of the shutter by analyzing the keystone-corrected image 2110 .
  • the electronic device 100 may obtain shutter movement information by considering a throw ratio in addition to distance information.
  • the electronic device 100 may obtain shutter movement information by using a difference between a virtual area and a display area of a panel (eg, Digital Micromirror Device, DMD).
  • the electronic device 100 may acquire the rotation angle of the shutter 121, the up/down/left/right movement distance, etc., based on various data obtained in the above-described calculation process.
  • a phenomenon in which each side of the correction area is distorted may occur.
  • a new reference line may be formed by taking, as a reference point, the distortion point farthest (in orthogonal distance) from the line connecting the corners, and the rotation angle and movement distance of each shutter blade may be determined based on that line and point.
  • A detailed description related to this is provided in FIG. 23.
  • a new baseline may be acquired from the outer points of the correction area.
  • an appropriate outer point can be acquired so that the inner side of the correction area is not covered. In this environment, the result is less precise than driving each shutter blade individually, but it is easier to calculate and adjust.
  • the electronic device 100 may consider priority in acquiring the rotation angle and the movement distance of the shutter 121 .
  • the electronic device 100 may minimize the operation time by preferentially considering the smaller rotation angle and the shorter moving distance.
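The priority rule above can be modeled, for illustration, as choosing the candidate motion that minimizes the rotation angle first and the travel distance second. The lexicographic cost model is an assumption; the source only says both are considered preferentially:

```python
def pick_fastest(candidates):
    """Pick the (rotation_deg, distance) candidate with the smallest
    rotation angle, breaking ties by the shorter movement distance.
    This lexicographic ordering is one simple way to model the
    'smaller rotation, shorter distance first' preference."""
    return min(candidates, key=lambda c: (abs(c[0]), c[1]))

moves = [(45.0, 2.0), (10.0, 5.0), (10.0, 1.0)]
print(pick_fastest(moves))  # (10.0, 1.0)
```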
  • FIG. 22 is a diagram for explaining an operation of changing an image area.
  • embodiments 2210 and 2220 may indicate a state in which an image for which keystone correction has been performed is output on a projection surface.
  • an image area 2211 and a residual area 2212 may exist.
  • in the embodiment 2220, the electronic device 100 may maintain the position of the remaining area 2222 the same as in the embodiment 2210, but output a changed image so that only the position of the image area 2221 differs from the embodiment 2210.
  • An image output in the embodiment 2220 may be output lower than an image output in the embodiment 2210.
  • the electronic device 100 may change the image part itself as in the embodiment 2220 of FIG. 22 .
  • FIG. 23 is a diagram for explaining a remaining portion changed according to an image correction operation.
  • an image 2311 means an image for which keystone correction has been completed.
  • one part 2312 of the corner in the image 2311 is specifically described.
  • the embodiment 2320 is an enlargement of the portion 2312.
  • An axis 2301 corresponding to a corner and an arbitrary axis 2302 may be parallel.
  • the electronic device 100 may control the shutter unit 120 based on at least one of the reference point 2321 and the reference point 2322 on the axis 2301 .
  • the embodiment 2330 represents a case in which a corner portion of the image 2311 is expanded during a correction process.
  • the image 2311 may be expanded further than in the embodiment 2320. Therefore, additional reference points other than the existing reference points 2321 and 2322 need to be identified.
  • the electronic device 100 may identify a point that is output at a position farthest from the axis 2301 and may acquire the identified point as a reference point 2331 .
  • the electronic device 100 may control the shutter unit 120 based on at least one of the reference point 2321 , the reference point 2322 , and the reference point 2331 .
  • Embodiment 2340 represents a case in which a corner portion of an image 2311 is reduced in a correction process.
  • the image 2311 may be reduced further than in the embodiment 2320. Therefore, additional reference points other than the existing reference points 2321 and 2322 need to be identified.
  • the electronic device 100 may identify a reference point 2432 located on the axis 2301 and may identify a reference point 2431 or 2433 located on the axis 2302 . Also, the electronic device 100 may control the shutter unit 120 based on at least one of the reference point 2431 , the reference point 2432 , and the reference point 2433 .
  • the electronic device 100 may control the shutter unit 120 based on the existing reference points 2321 and 2322 without identifying a new reference point.
  • FIG. 24 is a diagram for explaining an operation of changing the position of the shutter unit 120.
  • the electronic device 100 may identify an optimal point for moving the shutter unit 120.
  • the electronic device 100 may acquire an image 2410 on which keystone correction has been performed.
  • the electronic device 100 may identify an image portion 2411 in which the target image is displayed and a remaining portion 2412 in which the target image is not displayed in the image 2410 .
  • the electronic device 100 may identify an optimal point where the shutter should move to cover the remaining portion 2412 .
  • the electronic device 100 may identify an optimal point by considering the position of the image part 2411 .
  • the electronic device 100 may identify a line segment 2421 and a line segment 2422 based on the outline of the image portion 2411 . In addition, the electronic device 100 may identify a point where the line segment 2421 and the line segment 2422 intersect as the optimal point 2430 . And, the electronic device 100 may move the shutter unit 120 to the optimal point 2430 .
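Finding the optimal point 2430 as the crossing of the two outline segments reduces to a standard line-line intersection. A sketch, with illustrative coordinates:

```python
def line_intersection(a1, a2, b1, b2):
    """Intersection of the infinite lines through segments a1-a2 and
    b1-b2 (e.g. the outline segments 2421 and 2422 extended to find
    the optimal point 2430); returns None for parallel lines."""
    (x1, y1), (x2, y2) = a1, a2
    (x3, y3), (x4, y4) = b1, b2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None  # parallel or coincident lines
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

# Two perpendicular outline edges meet at (1, 1).
print(line_intersection((0, 1), (2, 1), (1, 0), (1, 2)))  # (1.0, 1.0)
```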
  • FIG. 25 is a diagram for comparing the size of the remaining area changed according to the shutter unit 120.
  • embodiments 2510, 2520, and 2530 represent various situations in which the shutter unit 120 is controlled based on an image 2500 on which keystone correction is performed.
  • the image 2500 includes an image portion 2501 and a residual portion 2502.
  • the electronic device 100 may not use the shutter unit 120. Therefore, the remaining portion 2502 can be output to the projection surface as it is.
  • the electronic device 100 may cover only a portion of the remaining portion 2502 with the plurality of shutters 121-1, 121-2, 121-3, and 121-4. Therefore, among the remaining portion 2502, portions covered by the plurality of shutters 121-1, 121-2, 121-3, and 121-4 may not be output on the projection surface.
  • the electronic device 100 may cover all portions of the remaining portion 2502 with a plurality of shutters 121-1, 121-2, 121-3, and 121-4. Therefore, not all of the remaining portion 2502 may be output on the projection surface.
  • FIG. 26 is a diagram for explaining an operation of identifying the location of the shutter unit 120.
  • embodiments 2610 and 2620 represent various situations in which the shutter unit 120 is controlled based on an image 2600 on which keystone correction has been performed.
  • the image 2600 includes an image portion 2601 and a residual portion 2602.
  • the electronic device 100 may not use the shutter unit 120. Therefore, the remaining portion 2602 can be output to the projection surface as it is.
  • the electronic device 100 may identify, based on the image portion 2601, at least one optimal point 2603-1, 2603-2, 2603-3, and 2603-4 to which the plurality of shutters 121-1, 121-2, 121-3, and 121-4 are to be moved in the remaining portion 2602.
  • the electronic device 100 may move the plurality of shutters 121-1, 121-2, 121-3, and 121-4 based on the optimal points 2603-1, 2603-2, 2603-3, and 2603-4.
  • FIG. 27 is a flowchart for explaining an operation of determining whether the body of the shutter unit 120 moves.
  • the electronic device 100 may store positional information of the movement area of the shutter unit 120 (S2705). And, when the first image to be output is obtained, the electronic device 100 may convert the first image into a second image by performing keystone correction. Then, the electronic device 100 may obtain positional information of the image part and the remaining part of the second image on which keystone correction has been performed (S2710).
  • the electronic device 100 may identify whether the moving area of the shutter unit 120 covers all remaining portions at the current location (S2715). When the moving area of the shutter unit 120 covers all remaining parts (S2715-Y), the electronic device 100 can control only the shutter 121 without moving the body 124 of the shutter unit 120 (S2720).
  • when the moving area of the shutter unit 120 does not cover all remaining parts at the current location (S2715-N), the electronic device 100 may control the shutter 121 after moving the body 124 of the shutter unit 120.
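The S2715 branch above can be sketched as a simple decision function; the string results are stand-ins for the actual actuator commands:

```python
def drive_plan(movement_area_covers_all):
    """Branch of the FIG. 27 flowchart: if the shutter unit's movement
    area already covers the whole remaining portion at the current
    position, only the shutter 121 is driven; otherwise the body 124
    is moved first and the shutter is driven afterwards."""
    if movement_area_covers_all:
        return "control shutter only"
    return "move body, then control shutter"

print(drive_plan(True))   # control shutter only
print(drive_plan(False))  # move body, then control shutter
```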
  • FIG. 28 is a view for explaining an operation of moving the body 124 of the shutter unit 120.
  • embodiments 2810 and 2820 show the position of the shutter unit 120 in a state in which the projection lens 110 is viewed from the front.
  • the shutter unit 120 may include a shutter 121 and a fixing member 122. Based on the direction in which the projection lens 110 is viewed from the projection surface, the fixing member 122 may be positioned on the upper left side of the projection lens 110 . And, the electronic device 100 may move the shutter 121 without changing the position of the fixing member 122 .
  • moving the shutter 121 may mean an operation of rotating the shutter 121 clockwise or counterclockwise with respect to the fixing member 122 while maintaining the size of the shutter 121 .
  • alternatively, moving the shutter 121 may mean an operation of changing the size of the shutter 121 with respect to the fixing member 122.
  • the operation described in the embodiment 2810 represents an operation of increasing the size of the fan-shaped shutter 121 .
  • Embodiment 2810 represents a situation in which the fan-shaped shutter 121 increases in a clockwise direction.
  • Embodiment 2820 represents a situation in which the shutter 121 of the shutter unit 120 is moved together with the fixing member 122 .
  • the body 124 of the shutter unit 120 may include a shutter 121 and a fixing member 122 .
  • the electronic device 100 may move the body 124 of the shutter unit 120 using the rail 123 .
  • the rail 123 may serve as a path along which the body 124 of the shutter unit 120 moves.
  • the electronic device 100 may identify a point where the shutter unit 120 should be moved, and control the direction of the rail 123 toward the identified point.
  • the electronic device 100 may change the direction of the rail 123 toward the identified point and move the body 124 of the shutter unit 120 to the end of the rail 123 .
  • the electronic device 100 may move the shutter 121 after moving the body 124 of the shutter unit 120 .
  • FIG. 29 is a flowchart for explaining an operation of controlling a shutter unit based on the size of an unoccluded area.
  • the electronic device 100 may store positional information of the movement area of the shutter unit 120 (S2905). And, when the first image to be output is obtained, the electronic device 100 may convert the first image into a second image by performing keystone correction. Then, the electronic device 100 may obtain positional information of the image part and the remaining part of the second image on which keystone correction is performed (S2910).
  • the electronic device 100 may identify whether the moving area of the shutter unit 120 covers all remaining portions (S2915). When the movement area of the shutter unit 120 covers the remaining portion (S2915-Y), the electronic device 100 may control the shutter unit 120 so that the remaining portion is not projected onto the projection surface (S2920). A detailed description related to this is provided in FIG. 30.
  • when the moving area of the shutter unit 120 does not cover all remaining portions (S2915-N), the electronic device 100 may identify the size of the non-coverable portion, i.e., the area of the remaining portion that the moving area of the shutter unit 120 cannot cover (S2925). Then, the electronic device 100 may identify whether the size of the non-coverable portion is less than a threshold value (S2930).
  • when the size of the non-coverable portion is less than the threshold value, the electronic device 100 may control the shutter unit 120 so that the remaining portion closest to the image portion is not projected onto the projection surface. A detailed description related to this is provided in FIG. 31.
  • when the size of the non-coverable portion is equal to or greater than the threshold value, the electronic device 100 may control the shutter unit 120 so that the remaining portion farthest from the image portion is not projected onto the projection surface. A detailed description related to this is provided in FIG. 32.
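The overall branching of this flowchart can be sketched as follows; the assignment of the two S2930 outcomes to the "closest" and "farthest" strategies is inferred from the explanations accompanying FIGS. 31 and 32:

```python
def coverage_strategy(covers_all, non_coverable_size, threshold):
    """Branching of the FIG. 29 flowchart: cover everything when the
    movement area allows it; otherwise compare the non-coverable size
    with a threshold to decide which part of the remaining portion to
    hide for the best visibility."""
    if covers_all:                      # S2915-Y -> S2920
        return "cover entire remaining portion"
    if non_coverable_size < threshold:  # S2930: small uncovered leak
        return "cover remaining portion closest to the image portion"
    return "cover remaining portion farthest from the image portion"

print(coverage_strategy(False, 2.0, 5.0))
```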
  • FIG. 30 is a diagram for explaining an operation of controlling the shutter unit so that there is no remaining area.
  • embodiments 3010 and 3020 of the electronic device 100 represent a situation in which the shutter unit 120 is controlled based on an image 3000 on which keystone correction has been performed.
  • the electronic device 100 may identify an image portion 3001 and a remaining portion 3002 of the image 3000 on which keystone correction has been performed.
  • the electronic device 100 may control the plurality of shutters 121-1, 121-2, 121-3, and 121-4 to cover the remaining portion 3002. Since the remaining portion 3002 is completely covered by the plurality of shutters 121-1, 121-2, 121-3, and 121-4, the remaining portion 3002 may not be output on the projection surface.
  • FIG. 31 is a diagram for explaining an operation of controlling a shutter unit based on a size of a remaining area, according to an exemplary embodiment.
  • embodiments 3110 and 3120 show a situation in which the shutter unit 120 is controlled based on an image 3100 on which keystone correction has been performed.
  • the electronic device 100 may identify an image part 3101 and a remaining part 3102 of the image 3100 on which keystone correction has been performed.
  • the electronic device 100 may identify whether the remaining portion 3102 can be completely covered by the plurality of shutters 121-1, 121-2, 121-3, and 121-4. If the remaining portion 3102 is not completely covered by the plurality of shutters 121-1, 121-2, 121-3, and 121-4, the electronic device 100 may identify the size (width) of the non-coverable portion.
  • when the size of the non-coverable portion is less than the threshold value, the electronic device 100 may control the plurality of shutters 121-1, 121-2, 121-3, and 121-4 to preferentially cover a portion adjacent to the image portion 3101 among the remaining portion 3102. If the size of the non-coverable portion is not large, the user may not easily recognize it even when light is irradiated to the outermost portion. Therefore, the electronic device 100 may control the plurality of shutters 121-1, 121-2, 121-3, and 121-4 to cover the portion adjacent to the image portion 3101 among the remaining portion 3102 to secure visibility.
  • FIG. 32 is a diagram for explaining an operation of controlling a shutter unit based on a size of a remaining area according to another embodiment.
  • embodiments 3210 and 3220 show a situation in which the shutter unit 120 is controlled based on an image 3200 on which keystone correction has been performed.
  • the electronic device 100 may identify an image part 3201 and a remaining part 3202 of the image 3200 on which keystone correction has been performed.
  • the electronic device 100 may identify whether the remaining portion 3202 can be completely covered by the plurality of shutters 121-1, 121-2, 121-3, and 121-4. If the remaining portion 3202 is not completely covered by the plurality of shutters 121-1, 121-2, 121-3, and 121-4, the electronic device 100 may identify the size (width) of the non-coverable portion.
  • when the size of the non-coverable portion is equal to or greater than the threshold value, the electronic device 100 may control the plurality of shutters 121-1, 121-2, 121-3, and 121-4 to preferentially cover the part of the remaining portion 3202 that is farthest from the image portion 3201. If the size of the non-coverable portion is large, not irradiating light to the outermost portion can increase user visibility. Accordingly, the electronic device 100 may control the plurality of shutters 121-1, 121-2, 121-3, and 121-4 to cover the part of the remaining portion 3202 farthest from the image portion 3201 to secure visibility.
  • FIG. 33 is a flowchart for explaining a control method of an electronic device according to an embodiment of the present disclosure.
  • a control method of an electronic device 100 including a projection unit 111 outputting an image on a projection surface and a shutter unit 120 includes acquiring a first image corresponding to content (S3305), keystone-correcting the first image to obtain a second image (S3310), identifying an image part corresponding to the content and a remaining part not corresponding to the content in the second image (S3315), and controlling the shutter unit 120 so that the remaining part is not projected on the projection surface (S3320).
  • the control method further includes outputting the image portion of the entire second image on the projection surface, and controlling the shutter unit 120 (S3320) may include changing the position of the shutter unit 120 so that the remaining portion of the entire second image is not output on the projection surface.
  • At least one of movement or rotation of the shutter unit 120 may be controlled based on the location information of the remaining part.
  • movement information and rotation information of the shutter unit 120 may be obtained based on the movable position information of the shutter unit 120 and the position information of the remaining part.
  • the shutter unit 120 may be moved based on movement information and rotated based on rotation information.
  • in controlling the shutter unit 120 (S3320), the shutter unit 120 may be controlled so that the central axis of the projection unit 111 coincides with the central axis of the shutter unit 120 and the remaining portion is not projected onto the projection surface.
  • an offset of the projection unit 111 may be obtained, and the position of the shutter unit 120 may be changed based on the offset.
  • the shutter unit 120 may include at least one of a shutter 121, a fixing member 122, a rail 123, a body 124, or a motor 125; the body 124 may include the shutter 121 and the fixing member 122; and controlling the shutter unit 120 (S3320) may include controlling the shutter 121 to move along the rail 123 according to the driving power generated by the motor 125 and controlling the shutter 121 to rotate according to the driving power generated by the motor 125.
  • a reference point of the shutter unit 120 for covering the remaining portion may be identified, the direction of the rail 123 may be controlled to move the body 124 to the reference point, and the fixing member may be positioned at the reference point by moving the body 124 along the rail 123.
  • the body 124 may include a fixing member and at least two shutters corresponding to the fixing member, and controlling the shutter unit 120 (S3320) is performed according to the driving power generated by the motor 125. At least two shutters may be controlled to rotate based on the fixing member.
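The four steps S3305 through S3320 of the control method can be sketched as a pipeline; the callables and the toy string-based "images" below are hypothetical stand-ins for the actual projection components:

```python
def control_method(first_image, keystone, split, drive_shutter):
    """End-to-end sketch of the FIG. 33 control method, with the steps
    passed in as callables (hypothetical signatures, for illustration):
    S3305 acquire the first image, S3310 keystone-correct it into a
    second image, S3315 split the second image into an image part and a
    remaining part, S3320 drive the shutter unit over the remainder."""
    second_image = keystone(first_image)              # S3310
    image_part, remaining_part = split(second_image)  # S3315
    drive_shutter(remaining_part)                     # S3320
    return image_part

# Toy stand-ins: images are strings, the two parts are separated by '|'.
result = control_method(
    "frame",
    keystone=lambda img: img + "|corrected",
    split=lambda img: tuple(img.split("|")),
    drive_shutter=lambda rem: None,
)
print(result)  # frame
```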
  • the method of controlling an electronic device as shown in FIG. 33 may be executed on an electronic device having the configuration of FIG. 2A or 2B and may also be executed on an electronic device having other configurations.
  • various embodiments of the present disclosure described above may be performed through an embedded server included in an electronic device or an external server of at least one of an electronic device and a display device.
  • a device is a device capable of calling a stored command from a storage medium and operating according to the called command, and may include an electronic device according to the disclosed embodiments.
  • the processor may perform a function corresponding to the command directly or by using other components under the control of the processor.
  • An instruction may include code generated or executed by a compiler or interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium does not contain a signal and is tangible, but does not distinguish whether data is stored semi-permanently or temporarily in the storage medium.
  • the method according to the various embodiments described above may be included in a computer program product and provided.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a device-readable storage medium (eg compact disc read only memory (CD-ROM)) or online through an application store (eg Play StoreTM).
  • at least part of the computer program product may be temporarily stored or temporarily created in a storage medium such as a manufacturer's server, an application store server, or a relay server's memory.
  • each of the components may be composed of a single entity or a plurality of entities, and some of the aforementioned sub-components may be omitted or other sub-components may be further included in various embodiments. Alternatively or additionally, some components (e.g., modules or programs) may be integrated into one entity and perform the same or similar functions performed by each corresponding component prior to integration. According to various embodiments, operations performed by modules, programs, or other components may be executed sequentially, in parallel, repetitively, or heuristically, or at least some operations may be executed in a different order, omitted, or supplemented with other operations.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Geometry (AREA)
  • Optics & Photonics (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

An electronic device includes: a memory; a shutter unit; a projection unit that outputs an image onto a projection surface; and a processor that acquires a first image corresponding to content from the memory, performs keystone correction on the first image to acquire a second image, identifies, in the second image, an image portion corresponding to the content and a remaining portion not corresponding to the content, and controls the shutter unit so that the remaining portion is not projected onto the projection surface.
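The abstract's masking step can be illustrated with a small sketch: after keystone correction, the content occupies a convex quadrilateral inside the rectangular frame, and every pixel outside that quadrilateral is the "remaining portion" the shutter unit blocks. The frame size, corner coordinates, and the `content_mask` helper below are hypothetical illustrations, not taken from the patent:

```python
import numpy as np

def content_mask(frame_shape, quad):
    """Boolean mask: True where the keystone-corrected content lands.

    quad: four (x, y) corners of the warped content region, listed
    clockwise in screen coordinates (y grows downward). Pixels where
    the mask is False form the 'remaining portion' that the shutter
    unit would block from reaching the projection surface.
    """
    h, w = frame_shape
    ys, xs = np.mgrid[0:h, 0:w]          # per-pixel row/column indices
    mask = np.ones((h, w), dtype=bool)
    # A convex quadrilateral is the intersection of four half-planes,
    # one per edge; the cross-product sign tells which side of the
    # edge each pixel lies on.
    for i in range(4):
        x0, y0 = quad[i]
        x1, y1 = quad[(i + 1) % 4]
        cross = (x1 - x0) * (ys - y0) - (y1 - y0) * (xs - x0)
        mask &= cross >= 0
    return mask

# Hypothetical 8x10 frame; keystone correction has warped the content
# into a trapezoid with these clockwise corners.
quad = [(2, 1), (8, 1), (7, 7), (3, 7)]
mask = content_mask((8, 10), quad)
shutter_region = ~mask  # pixels the shutter unit masks off
```

In a real device the blocking would be done optically by the shutter unit rather than in software, but the same per-pixel content/non-content decision has to be made somewhere in the pipeline.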
PCT/KR2022/009346 2021-08-09 2022-06-29 Electronic device and control method thereof WO2023018008A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0104828 2021-08-09
KR1020210104828A KR20230022717A (ko) 2021-08-09 2021-08-09 Electronic device and control method thereof

Publications (1)

Publication Number Publication Date
WO2023018008A1 true WO2023018008A1 (fr) 2023-02-16

Family

ID=85200008

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/009346 WO2023018008A1 (fr) 2021-08-09 2022-06-29 Electronic device and control method thereof

Country Status (2)

Country Link
KR (1) KR20230022717A (fr)
WO (1) WO2023018008A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080246893A1 (en) * 2005-10-13 2008-10-09 Gregory Jensen Boss Internal light masking in projection systems
KR101048785B1 (ko) * 2008-09-25 2011-07-15 에이티엘(주) 디지털 노광 장치
US20120327315A1 (en) * 2011-06-27 2012-12-27 Microsoft Corporation Video projection system for mobile device
KR20150020800A (ko) * 2013-08-19 2015-02-27 엘지전자 주식회사 헤드 마운트 디스플레이 및 그의 영상 처리 방법
KR101683788B1 (ko) * 2009-07-24 2016-12-08 삼성전자주식회사 입체 영상 투사 시스템


Also Published As

Publication number Publication date
KR20230022717A (ko) 2023-02-16

Similar Documents

Publication Publication Date Title
WO2016171403A1 (fr) 2016-10-27 Electronic device and method
WO2019004570A1 (fr) 2019-01-03 Kitchen hood and method for controlling the kitchen hood
WO2017082603A1 (fr) 2017-05-18 Oven and method for opening and closing an oven door
WO2015194773A1 (fr) 2015-12-23 Display device and control method thereof
WO2017111321A1 (fr) 2017-06-29 Image display device
WO2023013862A1 (fr) 2023-02-09 Electronic apparatus and image processing method thereof
WO2018190517A1 (fr) 2018-10-18 Electronic apparatus and method for displaying contents thereof
WO2023003140A1 (fr) 2023-01-26 Electronic device and control method thereof
WO2023017904A1 (fr) 2023-02-16 Electronic apparatus and control method thereof
WO2016143993A1 (fr) 2016-09-15 Lighting control apparatus and method
WO2017119571A1 (fr) 2017-07-13 Digital device, and color control system and method using same
WO2022191538A1 (fr) 2022-09-15 Sound producing apparatus
WO2021137630A1 (fr) 2021-07-08 Display apparatus and control method thereof
WO2023018008A1 (fr) 2023-02-16 Electronic apparatus and control method thereof
WO2023282460A1 (fr) 2023-01-12 Electronic apparatus and control method thereof
WO2022265428A1 (fr) 2022-12-22 Electronic apparatus and control method thereof
WO2023080421A1 (fr) 2023-05-11 Electronic device and control method thereof
WO2023286931A1 (fr) 2023-01-19 Electronic apparatus and control method thereof
WO2024025075A1 (fr) 2024-02-01 Electronic device for projecting images and control method thereof
WO2016125966A1 (fr) 2016-08-11 Image projection apparatus and operating method thereof
WO2023113201A1 (fr) 2023-06-22 Electronic device and control method thereof
WO2024025130A1 (fr) 2024-02-01 Electronic apparatus for projecting images and control method thereof
WO2023249232A1 (fr) 2023-12-28 Electronic device and control method thereof
WO2023249235A1 (fr) 2023-12-28 Electronic device and control method thereof
WO2022177207A2 (fr) 2022-08-25 Electronic device and operating method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22856010

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22856010

Country of ref document: EP

Kind code of ref document: A1