CN116711300A - Electronic device and control method thereof

Electronic device and control method thereof

Info

Publication number: CN116711300A
Application number: CN202280008498.9A
Authority: CN (China)
Prior art keywords: image, pattern image, color space, electronic device, color
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 朴在成, 金志晩, 禹浚熙, 郑圣运, 崔正和
Current and original assignee: Samsung Electronics Co Ltd
Application filed by: Samsung Electronics Co Ltd
Priority claimed from: KR1020210144246A (KR20220145743A); PCT/KR2022/000018 (WO2022225138A1)

Landscapes

  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An electronic device is disclosed. The electronic device includes: a memory storing a first pattern image and a second pattern image; a communication interface including a communication circuit for communicating with an external terminal device; a projection part; and a processor configured to: control the projection part to project the first pattern image onto a screen member located on the projection surface; when a first captured image in which the screen member is captured is received from the external terminal device through the communication interface, acquire conversion information based on the first captured image and the first pattern image; control the projection part to project the second pattern image onto the projection surface; and, when a second captured image in which the projection surface is captured is received from the external terminal device through the communication interface, perform color calibration according to the characteristics of the projection surface based on the second captured image, the second pattern image, and the conversion information.

Description

Electronic device and control method thereof
Technical Field
The present disclosure relates to an electronic device including a projection part and a control method thereof, and more particularly, to an electronic device performing a color calibration operation related to image projection and a control method thereof.
Background
When an electronic device projects an image through a projection part (e.g., a projector), a surface such as a wall or a ceiling may serve as the screen. If the screen is not a dedicated screen, the color of the surface onto which the image is projected may not be standard white, and in that case the colors of the original image are not reproduced as they are. For example, assume that the color of the screen is gray: when a white image is projected onto the gray projection surface, the user may perceive the image as light gray rather than white.
Even if the color of the screen appears white to the user, it may not completely coincide with standard white. Thus, when a dedicated screen is not employed, the user may perceive the colors of the image as different from those of the original image.
Further, even when a dedicated screen is employed, its color may change over time. The colors of an image projected onto a projection surface where such discoloration has occurred may therefore be perceived by the user as different from the colors of the original image.
Disclosure of Invention
[Problem]
The present disclosure addresses the above-described problems, and an object of the present disclosure is to provide an electronic device that outputs a pattern image onto a screen member to acquire color space conversion information, then outputs a pattern image onto a projection surface and performs color calibration suited to the projection surface, and a control method thereof.
[Technical Solution]
An electronic device for achieving the above object according to an embodiment of the present disclosure includes: a memory storing a first pattern image and a second pattern image; a communication interface including a communication circuit configured to communicate with an external terminal device; a projection part; and a processor configured to: control the projection part to project the first pattern image onto a screen member located on a projection surface; in response to receiving, from the external terminal device through the communication interface, a first captured image in which the screen member is captured, acquire transformation information based on the first captured image and the first pattern image; control the projection part to project the second pattern image onto the projection surface; and, in response to receiving, from the external terminal device through the communication interface, a second captured image in which the projection surface is captured, perform color calibration according to the characteristics of the projection surface based on the second captured image, the second pattern image, and the transformation information.
Meanwhile, the transformation information may include color space transformation information, and the processor may be configured to: acquire color space information corresponding to the first pattern image, and acquire the color space transformation information according to the characteristics of the projection part based on the first captured image and the color space information corresponding to the first pattern image.
Meanwhile, the processor may be configured to: acquire color space information corresponding to the second pattern image, and perform color calibration according to the characteristics of the projection surface based on the second captured image, the color space information corresponding to the second pattern image, and the color space transformation information.
Meanwhile, the processor may be configured to: acquire the color space transformation information based on RGB information corresponding to the first captured image and XYZ color space information corresponding to the first pattern image.
Meanwhile, the transformation information may include a color space transformation matrix that transforms RGB information into XYZ color space information.
Meanwhile, the processor may be configured to: transform RGB information corresponding to the second captured image into XYZ color space information based on the transformation information, acquire a color difference between the XYZ color space information corresponding to the second captured image and the XYZ color space information corresponding to the second pattern image, and perform color calibration based on the acquired color difference.
Meanwhile, the processor may be configured to: change at least one of a gain value or an offset value related to the RGB signal based on the acquired color difference.
Meanwhile, the processor may be configured to: in response to identifying that a predetermined object related to the screen member is included in the first captured image, acquire the transformation information based on the first captured image, and in response to identifying that the predetermined object is not included in the first captured image, control the projection part to project a user interface (UI) including information indicating that the screen member is not identified.
Meanwhile, the first pattern image may include at least one of a white pattern image, a red pattern image, a green pattern image, or a blue pattern image, and the second pattern image may include a white pattern image.
Meanwhile, the processor may be configured to: control the projection part such that a white pattern image among a plurality of pattern images included in the first pattern image is projected first, and the remaining pattern images are projected sequentially.
A method of controlling an electronic device that stores a first pattern image and a second pattern image and communicates with an external terminal device, according to an example embodiment of the present disclosure, includes: projecting the first pattern image onto a screen member located on a projection surface; in response to receiving, from the external terminal device, a first captured image in which the screen member is captured, acquiring transformation information based on the first captured image and the first pattern image; projecting the second pattern image onto the projection surface; and, in response to receiving, from the external terminal device, a second captured image in which the projection surface is captured, performing color calibration according to the characteristics of the projection surface based on the second captured image, the second pattern image, and the transformation information.
Meanwhile, the transformation information may include color space transformation information, and the step of acquiring the transformation information may include: acquiring color space information corresponding to the first pattern image; and acquiring the color space transformation information according to the characteristics of a projection part included in the electronic device, based on the first captured image and the color space information corresponding to the first pattern image.
Meanwhile, the step of performing color calibration may include: acquiring color space information corresponding to the second pattern image; and performing color calibration according to the characteristics of the projection surface based on the second captured image, the color space information corresponding to the second pattern image, and the color space transformation information.
Meanwhile, the step of acquiring the transformation information may include: acquiring the color space transformation information based on RGB information corresponding to the first captured image and XYZ color space information corresponding to the first pattern image.
Meanwhile, the transformation information may include a color space transformation matrix that transforms RGB information into XYZ color space information.
Meanwhile, the step of performing color calibration may include: transforming RGB information corresponding to the second captured image into XYZ color space information based on the transformation information; acquiring a color difference between the XYZ color space information corresponding to the second captured image and the XYZ color space information corresponding to the second pattern image; and performing color calibration based on the acquired color difference.
Meanwhile, the step of performing color calibration may include: changing at least one of a gain value or an offset value related to the RGB signal based on the acquired color difference.
Meanwhile, the control method may further include: in response to identifying that a predetermined object related to the screen member is included in the first captured image, acquiring the transformation information based on the first captured image; and in response to identifying that the predetermined object is not included in the first captured image, projecting a user interface (UI) including information indicating that the screen member is not identified.
Meanwhile, the first pattern image may include at least one of a white pattern image, a red pattern image, a green pattern image, or a blue pattern image, and the second pattern image may include a white pattern image.
Meanwhile, the step of projecting the first pattern image may include: projecting first a white pattern image among a plurality of pattern images included in the first pattern image, and sequentially projecting the remaining pattern images.
Drawings
Fig. 1 is a diagram for illustrating an image projection operation and an image capturing operation according to an embodiment of the present disclosure;
Fig. 2 is a block diagram for illustrating an electronic device according to an embodiment of the present disclosure;
Fig. 3 is a block diagram for illustrating a detailed configuration of the electronic device in Fig. 2;
Fig. 4 is a table for illustrating various embodiments of performing a color calibration operation;
Fig. 5 is a diagram for illustrating a color calibration operation according to the first embodiment;
Fig. 6 is a diagram for illustrating a color calibration operation according to the second embodiment;
Fig. 7 is a diagram for illustrating a color calibration operation according to the third embodiment;
Fig. 8 is a diagram for illustrating a color calibration operation according to the fourth embodiment;
Fig. 9 is a diagram for illustrating an operation of using a screen member according to the first and third embodiments;
Fig. 10 is a diagram for illustrating an operation of using a screen member according to the second and fourth embodiments;
Fig. 11 is a diagram for illustrating an operation of generating a color space conversion matrix;
Fig. 12 is a diagram for illustrating an operation of projecting a color calibration result;
Fig. 13 is a diagram for illustrating an operation of guiding a user action corresponding to a color calibration result;
Fig. 14 is a diagram for illustrating an operation of comparing projection before color calibration and projection after color calibration according to an embodiment;
Fig. 15 is a diagram for illustrating an operation of comparing projection before color calibration and projection after color calibration according to another embodiment;
Fig. 16 is a diagram for illustrating an operation of projecting information about terminal devices that can be connected to the electronic device;
Fig. 17 is a flowchart for schematically illustrating the overall process of acquiring a color space conversion matrix and performing a color calibration operation;
Fig. 18 is a flowchart for illustrating in detail the operation of acquiring a color space conversion matrix;
Fig. 19 is a flowchart for illustrating the color calibration operation in detail;
Fig. 20 is a flowchart for illustrating an operation of identifying whether a predetermined object is included in a screen member;
Fig. 21 is a diagram for illustrating an operation of identifying a screen member according to an embodiment;
Fig. 22 is a diagram for illustrating an operation of identifying a screen member according to another embodiment;
Fig. 23 is a flowchart for illustrating an embodiment of performing a color calibration operation in the electronic device;
Fig. 24 is a flowchart for illustrating an embodiment of performing a color calibration operation in the terminal device;
Fig. 25 is a flowchart for illustrating an embodiment of performing a color calibration operation using streaming data;
Fig. 26 is a diagram for illustrating a system including an electronic device, a terminal device, and a server;
Fig. 27 is a diagram for illustrating a process of acquiring a color space conversion matrix for acquiring XYZ color space information corresponding to a captured image;
Fig. 28 is a diagram for illustrating RGB information corresponding to a captured image and XYZ color space information corresponding to the captured image; and
Fig. 29 is a flowchart for illustrating a control method of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, the present disclosure will be described in detail with reference to the accompanying drawings.
General terms that are widely used at present are selected as terms used in the embodiments of the present disclosure as much as possible in consideration of the functions described in the present disclosure. However, these terms may vary according to the intention of those skilled in the relevant art or previous court decisions, the advent of new technology, etc. Furthermore, in certain cases, there may be arbitrarily selected terms, and in such cases, the meaning of the terms will be described in detail in the relevant description of the present disclosure. Accordingly, the terms used in the present disclosure should be defined based on the meaning of the terms and the entire contents of the present disclosure, not just the names of the terms.
Furthermore, in the disclosure, expressions such as "having," "may have," "include," and "may include" mean that such features (e.g., elements such as numerals, functions, operations, and components) are present, but that the presence of additional features is not precluded.
In addition, the expression "at least one of a and/or B" should be interpreted as referring to "a" or "B" or any of "a and B".
Further, the terms "first," "second," and the like, as used in this disclosure, may be used to describe various elements, regardless of their order and/or importance. Moreover, such expressions are merely used to distinguish one element from another and are not intended to limit such elements.
Furthermore, the description in this disclosure that one element (e.g., a first element) is "(operatively or communicatively) coupled with" or "connected to" another element (e.g., a second element) should be construed to include both the case where the one element is directly coupled to the other element and the case where the one element is coupled to the other element through yet another element (e.g., a third element).
In addition, a singular expression includes a plural expression, as long as it does not differ significantly in context. Furthermore, in the present disclosure, terms such as "include" and "consist of" should be construed as specifying the presence of the features, numbers, steps, operations, elements, components, or combinations thereof described in the present disclosure, and not as excluding the presence or possible addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
Further, in this disclosure, a "module" or "component" may perform at least one function or operation, and may be implemented as hardware or software, or as a combination of hardware and software. Furthermore, multiple "modules" or "components" may be integrated into at least one module and implemented as at least one processor (not shown), except for "modules" or "components" that need to be implemented as specific hardware.
Furthermore, in this disclosure, the term "user" may refer to a person using an electronic device or a device using an electronic device (e.g., an artificial intelligence electronic device).
Hereinafter, embodiments of the present disclosure will be described in more detail with reference to the accompanying drawings.
Fig. 1 is a diagram for illustrating an image projection operation and an image photographing operation according to an embodiment of the present disclosure.
Here, the electronic device 100 may refer to any of various devices that perform a projector function. The electronic device 100 may include a projection part (e.g., a projector) 120, which refers to hardware that projects a specific image; for example, the projection part 120 may include an image projection lens.
Here, the electronic device 100 may project the image 20 onto the screen 10 using the projection part 120.
Here, the terminal device 200 may capture the projected image 20. The terminal device 200 may include a camera 210, and may capture the image 20 using the camera 210.
According to an embodiment, the terminal device 200 may capture the projection surface including the projected image 20. The user may capture not only the area onto which the image 20 is projected but also surrounding areas; accordingly, the electronic device 100 may selectively use only the region including the projected image 20 from the image captured by the user.
According to another embodiment, the terminal device 200 may capture only the projected image 20. Since the information actually required for color calibration is the portion corresponding to the projected image 20, the user may capture only the projected image 20. The electronic device 100 may provide information guiding the user to capture only the projected image 20.
Fig. 2 is a block diagram illustrating an electronic device 100 according to an embodiment of the present disclosure.
Referring to fig. 2, the electronic device 100 may include at least one of a memory 110, a projection component (e.g., projector or projection lens) 120, a processor (e.g., including processing circuitry) 130, and/or a communication interface (e.g., including communication circuitry) 150.
In the memory 110, at least one instruction related to the electronic device 100 may be stored. Further, in the memory 110, an operating system (O/S) for driving the electronic apparatus 100 may be stored. In addition, in the memory 110, various software programs or applications for causing the electronic device 100 to operate according to various embodiments of the present disclosure may be stored. Further, the memory 110 may include a semiconductor memory such as a flash memory or a magnetic storage medium such as a hard disk.
Further, in the memory 110, various software modules for causing the electronic device 100 to operate according to various embodiments of the present disclosure may be stored, and the processor 130 may control the operation of the electronic device 100 by executing the various software modules stored in the memory 110. That is, the memory 110 may be accessed by the processor 130, and the processor 130 may perform reading/recording/correction/deletion/updating of data in the memory 110.
Further, in the present disclosure, the term memory 110 may be used to cover a ROM (not shown) and a RAM (not shown) inside the processor 130, or a memory card (not shown) mounted on the electronic device 100 (e.g., a micro SD card or a memory stick).
Meanwhile, the memory 110 may store information about the first pattern image and the second pattern image. Specifically, the memory 110 may store a first pattern image, RGB information corresponding to the first pattern image, color space information (e.g., XYZ color space information) corresponding to the first pattern image, and color space information (e.g., XYZ color space information) corresponding to the second pattern image.
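For illustration only, this stored reference data might be organized as follows. The patent does not prescribe any storage format, so every type and field name in this sketch is an assumption:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class PatternReference:
    """Hypothetical record for one stored pattern image (names are assumptions)."""
    name: str                         # e.g., "white", "red", "green", "blue"
    rgb: Tuple[int, int, int]         # RGB information of the original pattern image
    xyz: Tuple[float, float, float]   # XYZ color space information of the pattern

@dataclass
class CalibrationStore:
    """Hypothetical layout of the reference data kept in the memory 110."""
    # First pattern image: the test patterns projected onto the screen member.
    first_patterns: Dict[str, PatternReference] = field(default_factory=dict)
    # Second pattern image: the white pattern projected onto the bare surface.
    second_pattern: Optional[PatternReference] = None
```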
The projection part 120 may include a projector or a projection lens, and outputs an image from the electronic device 100 onto a projection surface.
The projection part 120 performs the function of outputting an image onto a screen (or projection surface); that is, it is the component that projects an image to the outside. The projection part 120 according to an embodiment of the present disclosure may be implemented with various projection methods, for example, a cathode ray tube (CRT) method, a liquid crystal display (LCD) method, a digital light processing (DLP) method, a laser method, and the like.
Meanwhile, the projection part 120 may perform various functions for adjusting the output image under the control of the processor 130. For example, the projection part 120 may perform functions such as zooming, keystone correction, quick-corner (4-corner) keystone correction, lens shift, and the like.
The processor 130 may include various processing circuits and perform overall control operations of the electronic device 100. Specifically, the processor 130 performs functions that control the overall operation of the electronic device 100.
The processor 130 may be implemented as a digital signal processor (DSP) that processes digital signals, a microprocessor, or a timing controller (TCON). However, the present disclosure is not limited thereto, and the processor 130 may include one or more of a central processing unit (CPU), a micro controller unit (MCU), a micro processing unit (MPU), a controller, an application processor (AP), a graphics processing unit (GPU), a communication processor (CP), or an ARM processor, or may be defined by the corresponding term. Further, the processor 130 may be implemented as a system on chip (SoC) or large scale integration (LSI) in which a processing algorithm is stored, or in the form of a field programmable gate array (FPGA). The processor 130 may perform various functions by executing computer-executable instructions stored in the memory 110.
The processor 130 may control the projection part 120 to project the first pattern image onto the screen member 30 (see Fig. 5) located on the projection surface. If a first photographed image in which the screen member 30 is captured is received from the external terminal device through the communication interface 150, the processor 130 may acquire transformation information based on the first photographed image and the first pattern image, and control the projection part 120 to project the second pattern image onto the projection surface. If a second photographed image in which the projection surface is captured is received from the external terminal device through the communication interface 150, the processor 130 may perform color calibration according to the characteristics of the projection surface based on the second photographed image, the second pattern image, and the transformation information.
Meanwhile, the transformation information may include color space transformation information, and the processor 130 may acquire color space information corresponding to the first pattern image, and acquire the color space transformation information according to characteristics of the projection part 120 based on the first photographed image and the color space information corresponding to the first pattern image. Here, the memory 110 may store color space information corresponding to the first pattern image.
Meanwhile, the processor 130 may acquire color space information corresponding to the second pattern image and perform color calibration according to characteristics of the projection surface based on the second photographed image, the color space information corresponding to the second pattern image, and the color space conversion information. Here, the memory 110 may store color space information corresponding to the second pattern image.
Here, the processor 130 may control the projection part 120 to project the first pattern image onto the screen member 30 (see fig. 5). Here, there are two methods of using the screen member 30. For example, the screen member 30 may be a member directly mounted by a user. An explanation of this will be described in more detail below with reference to fig. 5 and 7. For another example, the screen member 30 may be a member included in (or attached to) the electronic device 100. An explanation of this will be described in more detail below with reference to fig. 6 and 8.
Here, the screen member 30 may refer to a white reflector, and it may refer to a reflector having a standard white color provided by a manufacturer of the electronic device 100. When using a reflector made officially by the manufacturer of the electronic device 100, the correct color space conversion information can be obtained. The operation of identifying whether an official (or genuine) reflector is installed will be described in more detail below with reference to fig. 20 to 22. Here, the screen member 30 may be described as an official reflector, a genuine reflector, a standard reflector, or the like.
Here, the screen member 30 may be a member that satisfies at least one standard among a standard color, a standard specification, and a standard material. Here, the screen member 30 may be a plane.
Here, the first pattern image may be a test pattern image projected by the projection part 120. Here, the first pattern image may include a white pattern image, a red pattern image, a green pattern image, and a blue pattern image.
Specifically, after projecting the first pattern image onto the screen member 30, the processor 130 may acquire a first photographed image in which the screen member 30 with the first pattern image projected thereon is captured. That is, the first photographed image may include the screen member 30 onto which the first pattern image is projected. Here, there may be two methods of acquiring a photographed image. For example, the photographed image may be captured by a camera of the terminal device 200, and the electronic device 100 may receive the photographed image from the terminal device 200; this will be described in more detail below with reference to figs. 5 and 6. As another example, the photographed image may be captured by a camera attached to the electronic device 100, and the processor 130 may acquire the photographed image through that camera; this will be described in more detail below with reference to figs. 7 and 8.
Here, the characteristics of the projection part 120 may refer to hardware properties related to the projection part 120. For example, the characteristics of the projection part 120 may include information about the performance of the projection lens included in the projection part 120. Accordingly, the color space conversion information may vary according to hardware properties of the projection part 120. The processor 130 may acquire color space conversion information suitable for the projection part 120 based on the first photographed image and color space information corresponding to the first pattern image.
Meanwhile, the processor 130 may perform a pre-calibration operation by using the white pattern image in the first pattern image. Specifically, the processor 130 may project a white pattern image onto the screen member 30 and acquire a photographed image including the screen member 30 on which the white pattern image is projected (a photographed image including the white pattern image). Here, the processor 130 may compare the photographed image including the white pattern image with the white pattern image (original image) and adjust the sensitivity of the projection part 120. In particular, the processor 130 may adjust settings related to the performance of the projection lens included in the projection component 120. For example, the processor 130 may change a setting related to the shutter speed or aperture.
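As a rough illustration of this pre-calibration step, the luminance of the captured white patch could be compared against the reference white and the ratio used as a sensitivity-correction hint. This sketch is an assumption, not the method in the patent; the patent only states that settings such as shutter speed or aperture are adjusted, and all names and numbers below are illustrative:

```python
import numpy as np

def sensitivity_hint(white_patch: np.ndarray, reference_white=(235, 235, 235)):
    """Return a multiplicative exposure hint from a captured white-pattern patch.

    white_patch: HxWx3 uint8 crop of the projected white pattern (hypothetical input).
    reference_white: expected RGB of the white pattern (assumed value).
    """
    weights = np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 luma weights
    measured = float(np.mean(white_patch.reshape(-1, 3) @ weights))
    target = float(np.dot(reference_white, weights))
    return target / max(measured, 1e-6)  # >1: raise sensitivity, <1: lower it

# Example: a slightly dark capture suggests raising the exposure by about 17%.
patch = np.full((8, 8, 3), 200, dtype=np.uint8)
print(f"{sensitivity_hint(patch):.3f}")  # ~1.175
```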
Meanwhile, the processor 130 may perform color calibration using the red, green, and blue pattern images in the first pattern image.
First, the processor 130 may project a red pattern image onto the screen member 30 and acquire a photographed image including the screen member 30 onto which the red pattern image is projected (a photographed image including the red pattern image). The processor 130 may then compare the photographed image including the red pattern image with the red pattern image (the original image) and acquire at least one simultaneous equation related to the red pattern image.
Second, the processor 130 may project a green pattern image onto the screen member 30 and acquire a photographed image including the screen member 30 onto which the green pattern image is projected (a photographed image including the green pattern image). The processor 130 may then compare the photographed image including the green pattern image with the green pattern image (the original image) and acquire at least one simultaneous equation related to the green pattern image.
Third, the processor 130 may project a blue pattern image onto the screen member 30 and acquire a photographed image including the screen member 30 onto which the blue pattern image is projected (a photographed image including the blue pattern image). The processor 130 may then compare the photographed image including the blue pattern image with the blue pattern image (the original image) and acquire at least one simultaneous equation related to the blue pattern image.
Here, the processor 130 may acquire color space conversion information (a color space conversion matrix) based on at least one simultaneous equation related to the red pattern image, at least one simultaneous equation related to the green pattern image, and at least one simultaneous equation related to the blue pattern image.
The simultaneous equations will be described in more detail below with reference to Fig. 27.
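In the meantime, the simultaneous equations can be made concrete: each pattern capture pairs one measured RGB triple with the known reference XYZ value of that pattern, and stacking the red, green, and blue measurements determines the 3x3 matrix. A minimal sketch, where all numbers are illustrative stand-ins rather than values from the patent:

```python
import numpy as np

# Mean RGB measured in the photographed images of the screen member while the
# red, green, and blue pattern images were projected (illustrative values).
measured_rgb = np.array([
    [0.62, 0.08, 0.05],   # red pattern capture
    [0.10, 0.58, 0.09],   # green pattern capture
    [0.06, 0.07, 0.55],   # blue pattern capture
])

# Stored XYZ color space information for each pattern image (illustrative).
reference_xyz = np.array([
    [0.4124, 0.2126, 0.0193],   # red
    [0.3576, 0.7152, 0.1192],   # green
    [0.1805, 0.0722, 0.9505],   # blue
])

# Each row i gives the equation set  M @ measured_rgb[i] = reference_xyz[i],
# i.e. measured_rgb @ M.T = reference_xyz.  Three patterns give an exact solve;
# with more patches, np.linalg.lstsq would give the least-squares fit instead.
M = np.linalg.solve(measured_rgb, reference_xyz).T

print(np.round(M, 3))                       # the color space conversion matrix
print(np.round(M @ [0.5, 0.5, 0.5], 3))     # XYZ estimate for a gray capture
```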
Here, the processor 130 may acquire the color space conversion information in advance based on the first photographed image, which includes the first pattern image projected onto the screen member 30, and information related to the first pattern image stored in the memory 110. Here, the information related to the first pattern image may include the color space information corresponding to the first pattern image.
Here, the color space conversion information may refer to a matrix for converting general data into color space data.
Meanwhile, the processor 130 may acquire color space conversion information based on RGB information corresponding to the first photographed image and XYZ color space information corresponding to the first pattern image.
Here, there may be various ways of defining a color space. Although the XYZ color space is used in the above description, other color space definitions may be used depending on the implementation example.
Meanwhile, the color space conversion information may be a color space conversion matrix that converts RGB information into XYZ color space information. The specific operation of acquiring the color space transformation matrix will be described in more detail below with reference to fig. 17 and 18. Further, a specific calculation operation of acquiring the color space conversion matrix will be described in more detail below with reference to fig. 27 and 28.
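For orientation, the standard linear-sRGB-to-XYZ (D65) matrix below shows what such a matrix looks like; the matrix the device actually acquires would differ, since it also absorbs the characteristics of the projection part 120 and the capture pipeline:

```python
import numpy as np

# Standard linear sRGB -> CIE XYZ (D65) matrix, shown for orientation only.
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

linear_white = np.array([1.0, 1.0, 1.0])          # linear-light white
print(np.round(SRGB_TO_XYZ @ linear_white, 4))    # -> [0.9505 1. 1.089], the D65 white point
```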
After acquiring the color space conversion matrix, the processor 130 may project the second pattern image onto the screen 10, specifically onto the screen 10 on which the screen member 30 is no longer present. That is, the processor 130 may acquire the first photographed image in which the screen member 30 is captured and acquire the color space conversion matrix, and then acquire the second photographed image in which the screen 10 without the screen member 30 is captured and perform color calibration according to the projection surface. In particular, to solve the problem that the colors of the original image are not reproduced as they are because of the color of the screen 10, the processor 130 may photograph the screen 10 and perform color calibration. Through color calibration, the processor 130 may thus output a projection image suited to the projection surface. Furthermore, when the projection surface changes, the color calibration may change as well.
For example, the first pattern image and the second pattern image may be the same, e.g., the same white pattern image. As another example, the first pattern image and the second pattern image may be different: the first pattern image may be one of a red pattern image, a green pattern image, or a blue pattern image, and the second pattern image may be a white pattern image.
Here, the processor 130 may acquire the second photographed image in which the screen 10 is captured in a state where the screen member 30 is not disposed. Then, to solve the problem of color distortion caused by the color of the screen 10, the processor 130 may perform color calibration based on the second photographed image.
Here, the processor 130 may acquire RGB information corresponding to the second photographed image.
Meanwhile, the processor 130 may convert RGB information corresponding to the second photographed image into XYZ color space information based on color space conversion information (e.g., a color space conversion matrix), acquire color differences between the XYZ color space information corresponding to the second photographed image and the XYZ color space information corresponding to the second pattern image, and perform color calibration based on the acquired color differences.
Meanwhile, the processor 130 may change at least one of a gain value or an offset value related to the RGB signal based on the acquired color difference.
Here, the processor 130 may change a gain value related to the RGB signal by performing a color calibration operation. Here, the gain value may refer to an element that adjusts an output value of the RGB signal through a multiplication (or division) operation.
Here, the processor 130 may change the offset related to the RGB signal by performing a color calibration operation. Here, the offset may refer to an element that adjusts an output value of the RGB signal through an addition (or subtraction) operation.
Here, if the output value of the RGB signal is adjusted, at least one of brightness, contrast, or color of the projected image may be changed, and an RGB signal suitable for a screen (or projection surface) may be output (or projected).
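Putting the pieces above together, a minimal sketch of this calibration step might look as follows. The linear gain-correction rule is an assumption made for illustration (the patent does not specify the correction formula), and all numbers are invented:

```python
import numpy as np

def calibrate_gains(M, captured_rgb, reference_xyz):
    """Derive per-channel RGB gain corrections from the acquired color difference.

    M:             3x3 RGB -> XYZ color space conversion matrix
    captured_rgb:  mean RGB of the second photographed image (white pattern
                   projected onto the bare projection surface)
    reference_xyz: stored XYZ color space information of the second pattern image
    """
    captured_xyz = M @ captured_rgb                  # RGB -> XYZ via the matrix
    color_difference = reference_xyz - captured_xyz  # the acquired color difference
    # Map the XYZ difference back into RGB and fold it into channel gains;
    # this linear correction rule is an illustrative assumption.
    rgb_correction = np.linalg.solve(M, color_difference)
    gains = 1.0 + rgb_correction / np.maximum(captured_rgb, 1e-6)
    return np.clip(gains, 0.5, 2.0)                  # keep the correction bounded

# Illustrative numbers: a slightly warm (reddish) projection surface.
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])
gains = calibrate_gains(M, np.array([0.95, 0.88, 0.80]),
                        M @ np.array([0.90, 0.90, 0.90]))
print(np.round(gains, 3))   # red gain < 1 < blue gain, taming the red cast
```

An offset-based correction would instead add to or subtract from the channel output values, in line with the gain/offset description above.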
Meanwhile, the electronic device 100 according to the embodiment of the present disclosure may output RGB signals.
However, the electronic device 100 according to another embodiment of the present disclosure may output signals in forms other than RGB signals. For example, the electronic device 100 may output an RGBW signal or an RGBY signal. Further, the electronic device 100 may additionally output a signal of at least one of yellow, cyan, or magenta in addition to the RGB signals. Accordingly, the electronic device 100 does not necessarily control only RGB through color calibration, but may also calibrate additional pixels (or sub-pixels).
Meanwhile, the processor 130 may perform an operation of identifying whether the screen member 30 is a standard screen member manufactured by the manufacturer. This is because, if the screen member 30 is not a standard screen member, errors may occur in the color space conversion matrix.
Meanwhile, if it is identified that a predetermined (e.g., designated) object related to the screen member 30 is included in the first photographed image, the processor 130 may acquire the color space conversion information based on the first photographed image; if it is identified that the predetermined object is not included in the first photographed image, the processor 130 may control the projection part 120 to project a UI including information indicating that the screen member 30 is not identified. The specific operation in this regard will be described in more detail below with reference to figs. 20 to 22.
Meanwhile, the methods for the electronic device 100 to acquire a photographed image may be divided into two types.
According to an embodiment, the electronic apparatus 100 may receive a photographed image from the terminal apparatus 200, and the terminal apparatus 200 may be an external apparatus. Here, the electronic apparatus 100 may further include a communication interface 150 including various communication circuits, and the processor 130 may control the communication interface to receive the first photographed image and the second photographed image from the external terminal apparatus 200.
According to another embodiment, the electronic device 100 may acquire a photographed image using a camera 140 installed inside the electronic device 100. Here, the electronic device 100 may further include a camera 140, and the processor 130 may control the camera 140 to acquire the first photographed image and the second photographed image.
Meanwhile, the first pattern image may include at least one of a white pattern image, a red pattern image, a green pattern image, or a blue pattern image, and the second pattern image may include a white pattern image.
Meanwhile, the processor 130 may control the projection part 120 such that a white pattern image among a plurality of pattern images included in the first pattern image is first projected, and the remaining pattern images are sequentially projected.
Here, the first pattern image may be an image projected onto the screen member 30, and the processor 130 may project the white pattern image first. Then, after acquiring the photographed image in which the screen member 30 with the white pattern image projected thereon is captured, the processor 130 may project the red pattern image, the green pattern image, and the blue pattern image in any order. As long as the white pattern image is output first, the projection order of the red, green, and blue pattern images may be changed according to user settings.
Here, the reason why the second pattern image, unlike the first pattern image, is output as only a white pattern image is that, when the color of the screen 10 is taken into consideration, using white is advantageous for color calibration.
Meanwhile, in the above, only simple components of the electronic device 100 are shown and described, but in actual implementation, various components may be additionally included. This will be explained in more detail below with reference to fig. 3.
Meanwhile, there may be various methods to output the first pattern image and the second pattern image.
According to an embodiment, the first pattern image and the second pattern image may already be stored in the memory 110 of the electronic device 100. If a predetermined (e.g., specified) event occurs, the electronic device 100 may output the first pattern image stored in the memory 110 or the second pattern image stored in the memory 110.
According to another embodiment, the first pattern image and the second pattern image may be provided from the terminal device 200. Specifically, if a predetermined (e.g., designated) control signal is received from the electronic apparatus 100, the terminal apparatus 200 may transmit the first pattern image or the second pattern image to the electronic apparatus 100 in real time. Here, the predetermined control signal may be a signal requesting the first pattern image or a signal requesting the second pattern image. Then, the electronic apparatus 100 may output the first pattern image or the second pattern image received from the terminal apparatus 200.
According to yet another embodiment, the first pattern image and the second pattern image may be provided from the server 300. Specifically, if a predetermined control signal is received from the electronic device 100, the server 300 may transmit the first pattern image or the second pattern image to the electronic device 100. Here, the predetermined control signal may be a signal requesting the first pattern image or a signal requesting the second pattern image. Then, the electronic device 100 may output the first pattern image or the second pattern image received from the server 300.
Meanwhile, there may be various methods to acquire color space information corresponding to the first pattern image.
According to an embodiment, color space information corresponding to the first pattern image may already be stored in the memory 110. If a predetermined event occurs, the electronic device 100 may acquire the color space information corresponding to the first pattern image stored in the memory 110.
According to another embodiment, color space information corresponding to the first pattern image may be provided from the server 300. Specifically, if a predetermined control signal is received, the server 300 may transmit color space information corresponding to the first pattern image to the electronic device 100. Here, the predetermined control signal may be a signal requesting color space information corresponding to the first pattern image. Then, the electronic device 100 may receive color space information corresponding to the first pattern image.
Meanwhile, there may be various points in time at which the first pattern image and the second pattern image are output.
According to an embodiment, the electronic device 100 may output the first pattern image first and output the second pattern image after outputting the first pattern image. Specifically, the electronic device 100 may output the first pattern image in a state where the screen member 30 is mounted, and output the second pattern image in a state where the screen member 30 is not mounted.
According to another embodiment, the electronic device 100 may output the first pattern image and the second pattern image simultaneously. Specifically, the electronic device 100 may output the first pattern image and the second pattern image at the same time in a state in which the screen member 30 is mounted. Here, the electronic device 100 may output the first pattern image in a first area of the projection surface where the screen member 30 is located, and output the second pattern image in a second area of the projection surface where the screen member 30 is not present. Then, the electronic device 100 may acquire a photographed image in which the projection surface is captured. Here, the photographed image may include the first pattern image output in the first area and the second pattern image output in the second area. The electronic device 100 may acquire the conversion information (e.g., a color space conversion matrix) based on the original image of the first pattern image and the first pattern image included in the photographed image, and, at the same time, perform color calibration according to the characteristics of the projection surface based on the original image of the second pattern image, the second pattern image included in the photographed image, and the conversion information (e.g., the color space conversion matrix).
Meanwhile, the electronic device 100 may be implemented in a form in which the first pattern image and the second pattern image are identical and color calibration is performed using a single photographed image. Specifically, the electronic device 100 may output the pattern image in a state in which the screen member 30 is present. Here, the electronic device 100 may output the pattern image across both a first area including the screen member 30 and a second area not including the screen member 30; that is, some portions of the pattern image are output in the first area (including the screen member 30) and other portions in the second area (not including the screen member 30). Here, the electronic device 100 may acquire the conversion information (e.g., a color space conversion matrix) based on the original pattern image and the part of the pattern image in the photographed image that falls in the first area, and, at the same time, perform color calibration according to the characteristics of the projection surface based on the original pattern image, the part of the pattern image in the photographed image that falls in the second area, and the conversion information (e.g., the color space conversion matrix).
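If the two areas are captured in a single shot, both measurements can be pulled from one frame. A minimal sketch, assuming the area coordinates are already known (e.g., from detecting the screen member); all names and values are hypothetical:

```python
import numpy as np

def mean_rgb(frame: np.ndarray, box):
    """Mean RGB inside box = (top, left, bottom, right) of an HxWx3 frame."""
    top, left, bottom, right = box
    return frame[top:bottom, left:right].reshape(-1, 3).mean(axis=0)

# One photographed frame containing both areas (synthetic 100x200 example).
frame = np.zeros((100, 200, 3), dtype=np.uint8)
frame[:, :100] = (230, 228, 225)   # first area: pattern on the screen member
frame[:, 100:] = (210, 200, 185)   # second area: pattern on the bare surface

rgb_member = mean_rgb(frame, (0, 0, 100, 100))      # feeds matrix acquisition
rgb_surface = mean_rgb(frame, (0, 100, 100, 200))   # feeds color calibration
print(rgb_member, rgb_surface)
```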
Fig. 3 is a block diagram for illustrating a detailed configuration of the electronic apparatus 100 in fig. 2.
Referring to fig. 3, the electronic device 100 may include at least one of a memory 110, a projection component (e.g., including a projector or a projection lens) 120, a processor (e.g., including processing circuitry) 130, a camera 140, a communication interface (e.g., including communication circuitry) 150, a manipulation interface (e.g., including various circuitry) 161, an input/output interface (e.g., including various input/output circuitry) 162, a speaker 170, a microphone 180, or a power component (e.g., including power management circuitry) 190.
Meanwhile, regarding the memory 110, the projection part 120, and the processor 130, operations that are the same as those described above will not be described again in detail.
The camera 140 is a component for capturing an object and generating a captured image, and here, the captured image is a concept including both a moving image and a still image. The camera 140 may acquire an image for at least one external device, and may be implemented as a camera, a lens, an infrared sensor, or the like.
The camera 140 may include a lens and an image sensor. As the type of lens, there are a common general-purpose lens, a wide-angle lens, a zoom lens, and the like, and the type may be determined according to the type, characteristics, use environment, and the like of the electronic apparatus 100. As the image sensor, a Complementary Metal Oxide Semiconductor (CMOS), a Charge Coupled Device (CCD), or the like can be used.
The camera 140 outputs incident light as an image signal. Specifically, the camera 140 may include a lens, pixels, and an A/D converter. The lens collects light from an object and forms an optical image on the imaging area, and the pixels may output the light introduced through the lens as an analog image signal. Then, the A/D converter may convert the analog image signal into a digital image signal and output it. In particular, the camera 140 may be arranged to capture the front direction of the electronic device 100, and may capture a user present in front of the electronic device 100 and generate a photographed image.
The communication interface 150 is a component that performs communication with various types of external devices according to various types of communication methods. The communication interface 150 may include various communication circuits including, for example, a wireless communication module or a wired communication module. Here, each communication module may be implemented in the form of at least one hardware chip.
The wireless communication module may be a module that wirelessly communicates with an external device. For example, the wireless communication module may include at least one of a Wi-Fi module, a bluetooth module, an infrared communication module, or other communication module.
The Wi-Fi module and the bluetooth module may perform communication using a Wi-Fi method and a bluetooth method, respectively. In the case of using a Wi-Fi module or a bluetooth module, various types of connection information (e.g., SSID and session key) are first transmitted and received, and communication connection is performed using these information, after which various types of information can be transmitted and received.
The infrared communication module performs communication according to an infrared data association (IrDA) technology that wirelessly transmits data to a near field using infrared rays between visible light and millimeter waves.
The other communication module may include at least one communication chip that performs communication according to various wireless communication protocols other than the above-described communication methods, such as Zigbee, third generation (3G), third generation partnership project (3 GPP), long Term Evolution (LTE), LTE-advanced (LTE-a), fourth generation (4G), fifth generation (5G), and the like.
The wired communication module may be a module that communicates with an external device by wire. For example, the wired communication module may include at least one of a Local Area Network (LAN) module, an ethernet module, a twisted pair cable, a coaxial cable, a fiber optic cable, or an Ultra Wideband (UWB) module.
The manipulation interface 161 may include various circuits and may be implemented as devices such as buttons, a touch pad, a mouse, and a keyboard, or as a touch screen that can simultaneously perform the above-described display function and manipulation input function. Here, the buttons may be various types of buttons formed in any area (e.g., a front surface portion, a side surface portion, a rear surface portion, etc. of the main body of the electronic device 100), such as mechanical buttons, a touch pad, a wheel, etc.
The input/output interface 162 may be an interface including various input/output circuits of at least one of a high definition multimedia interface (HDMI), a mobile high-definition link (MHL), a universal serial bus (USB), a display port (DP), a Thunderbolt interface, a video graphics array (VGA) port, an RGB port, a D-subminiature (D-SUB), or a digital visual interface (DVI). The input/output interface 162 may input or output at least one of an audio signal or a video signal. Depending on the implementation example, the input/output interface 162 may include, as separate ports, a port that outputs only an audio signal and a port that outputs only a video signal, or it may be implemented as one port that inputs and outputs both audio signals and video signals. Meanwhile, the electronic device 100 may transmit at least one of an audio signal or a video signal to an external device (e.g., an external display device or an external speaker) through the input/output interface 162. Specifically, an output port included in the input/output interface 162 may be connected with an external device, and the electronic device 100 may transmit at least one of an audio signal or a video signal to the external device through the output port.
Here, the input/output interface 162 may be connected with a communication interface. The input/output interface 162 may transmit information received from an external device to the communication interface or transmit information received through the communication interface to the external device.
The speaker 170 may be a component that outputs not only various audio data but also various notification sounds, voice messages, or the like.
The electronic device 100 may include a microphone 180.
Microphone 180 is a component for receiving input of user speech or other sounds and converting it into audio data. The microphone 180 may receive the voice of the user in the activated state. For example, the microphone 180 may be integrally formed in an upper side or front surface direction, a side surface direction, or the like of the electronic device 100. The microphone 180 may include various components such as a microphone that collects user voice in an analog form, an amplifier circuit that amplifies the collected user voice, an a/D conversion circuit that samples the amplified user voice and converts the user voice into a digital signal, a filter circuit that removes a noise component from the converted digital signal, and the like.
The power supply part 190 may include various power management circuits, and may be supplied with power from the outside and supply power to the respective components of the electronic device 100. The power supply part 190 according to an embodiment of the present disclosure may be supplied with power through various methods. For example, power may be supplied to the power supply part 190 using a 220V DC power cord. However, the present disclosure is not limited thereto, and the electronic device 100 may be supplied with power using a USB power cord, or may be supplied with power using a wireless charging method.
The electronic device 100 may also include a display (not shown).
The display (not shown) may be implemented as various forms of display, such as a liquid crystal display (LCD), an organic light emitting diode (OLED) display, a plasma display panel (PDP), and the like. The display (not shown) may also include a driving circuit, a backlight unit, and the like, and the driving circuit may be implemented in the form of an a-Si TFT, a low temperature polysilicon (LTPS) TFT, an organic TFT (OTFT), or the like. Meanwhile, the display (not shown) may be implemented as a touch screen combined with a touch sensor, a flexible display, a 3D display, or the like. Further, the display (not shown) according to an embodiment of the present disclosure may include not only a display panel that outputs an image, but also a bezel that houses the display panel. In particular, the bezel according to an embodiment of the present disclosure may include a touch sensor (not shown) for detecting user interactions.
Fig. 4 is a table showing various embodiments for performing color calibration operations.
Referring to table 410 in fig. 4, various embodiments of the color calibration operation may exist depending on the subject that acquires the image and the method of using the screen member. The subject that acquires the image may be the terminal apparatus 200 or the electronic apparatus 100. Meanwhile, the screen member may be installed directly by the user or accommodated inside the electronic device.
Here, the method of installation by the user may refer to the user installing the individual screen member 30 directly on the projection surface. Further, the method of being accommodated inside the electronic device may refer to a method in which the screen member 30 is accommodated inside the electronic device 100 and the screen member 30 is automatically used.
According to the first embodiment, the color calibration operation may be performed based on the operation of the terminal apparatus 200 to acquire an image and the operation of the user to directly mount the screen member 30. A specific explanation of this will be described in more detail below with reference to fig. 5.
According to the second embodiment, the color calibration operation can be performed based on the operation of the terminal device 200 to acquire an image and the operation of using the screen member 30 housed inside the electronic device 100. A specific explanation of this will be described in fig. 6.
According to the third embodiment, the color calibration operation may be performed based on the operation of the electronic apparatus 100 to acquire an image and the operation of the user to directly mount the screen member 30. A specific explanation of this will be described in fig. 7.
According to the fourth embodiment, the color calibration operation can be performed based on the operation of the electronic apparatus 100 to acquire an image and the operation of using the screen member 30 housed inside the electronic apparatus 100. A specific explanation of this will be described in fig. 8.
Fig. 5 is a diagram for illustrating a color calibration operation according to the first embodiment.
Referring to fig. 5, a user may install a screen member (e.g., reflector) 30 on the screen 10. After the screen member 30 is mounted on the screen 10, the electronic device 100 may project the first pattern image 510 onto the screen member 30. Here, the terminal device 200 may capture the first pattern image 510 projected onto the screen member 30.
Fig. 6 is a diagram for illustrating a color calibration operation according to the second embodiment.
Referring to fig. 6, the electronic device 100 may include a screen member 30. For example, the screen member 30 included in the electronic device 100 may be unfolded as in fig. 6 by manipulation of a user. For another example, the screen member 30 included in the electronic device 100 may be unfolded as in fig. 6 by a motor (not shown). In a state in which the screen member 30 is unfolded, the electronic device 100 may project the first pattern image 610 on the screen member 30. Then, the terminal device 200 may capture the first pattern image 610 projected on the screen member 30.
Fig. 7 is a diagram for illustrating a color calibration operation according to the third embodiment.
Referring to fig. 7, a user may install the screen member 30 on the screen 10. After the screen member 30 is mounted on the screen 10, the electronic device 100 may project the first pattern image 710 onto the screen member 30 using the projection part 120. Here, the electronic device 100 may capture the first pattern image 710 projected on the screen member 30 using the camera 140 included in the electronic device 100.
Fig. 8 is a diagram for illustrating a color calibration operation according to the fourth embodiment.
Referring to fig. 8, the electronic device 100 may include a screen member 30. For example, the screen member 30 included in the electronic device 100 may be unfolded as in fig. 8 by manipulation of a user. For another example, the screen member 30 included in the electronic device 100 may be unfolded as in fig. 8 by a motor (not shown). In a state in which the screen member 30 is unfolded, the electronic device 100 may project the first pattern image 810 on the screen member 30. Then, the electronic device 100 may capture the first pattern image 810 projected on the screen member 30.
Fig. 9 is a diagram for illustrating an operation using the screen member 30 according to the first and third embodiments.
Referring to fig. 9, the electronic device 100 may project an image for guiding an arrangement position of a screen member through the projection part 120. In particular, the guide image projected by the electronic device 100 may include at least one of text 910 or UI 920 for guiding the arrangement of the screen member 30. The user can easily find the position of the screen member 30 based on the projected UI 920 and position the screen member 30 on the projected UI 920 accordingly.
Fig. 10 is a diagram for illustrating an operation using the screen member 30 according to the second and fourth embodiments.
Referring to fig. 10, the electronic device 100 may include a screen member 30. For example, the screen member 30 included in the electronic device 100 may be unfolded by manipulation of a user. For another example, the screen member 30 included in the electronic device 100 may be unfolded by a motor (not shown).
Embodiment 1010 shows a state in which the screen member 30 is folded. Embodiment 1020 shows a state in which the screen member 30 is unfolded. Embodiment 1030 represents a state in which the projection section 120 or the camera 140 is disposed on the electronic device 100.
Meanwhile, the screen member 30 disclosed in fig. 10 is shown as being attached to the electronic device 100. Here, the screen member 30 may be attached to or detached from the electronic device 100, and the user may separate the screen member 30 from the electronic device 100 and attach it on the projection surface. Therefore, the user does not have to store the screen member 30 separately, and can easily keep it by fixing it to the electronic device 100.
Fig. 11 is a diagram for illustrating an operation of generating a color space conversion matrix.
Referring to fig. 11, the electronic device 100 may perform an operation of acquiring a color space transformation matrix.
Referring to the embodiment 1110, the electronic device 100 may store color space information corresponding to the first pattern image 1111. Here, the color space information may refer to the CIE XYZ color space, where CIE refers to the International Commission on Illumination (Commission Internationale de l'Eclairage). Specifically, the electronic device 100 may project the first pattern image 1111 onto the screen member 30. Then, the professional measuring device 1100 may capture the first pattern image 1111 projected onto the screen member 30. Based on the image photographed with the first pattern image 1111, the electronic device 100 may acquire color space information corresponding to the first pattern image 1111. Here, the acquired color space information may be standard information. Since the professional measuring device 1100 is a high-precision device, it is difficult for an average consumer to own such a device. Therefore, there may be a difference between an image photographed by the professional measuring device 1100 and an image photographed by the general terminal device 200.
Referring to the embodiment 1120, the electronic device 100 may acquire RGB information corresponding to the first pattern image 1121. Specifically, the electronic device 100 may project the first pattern image 1121 onto the screen member 30, and the terminal device 200 may capture the projected first pattern image 1121. Here, the first pattern image 1121 may be the same as the first pattern image 1111 in embodiment 1110. The electronic apparatus 100 may acquire RGB information corresponding to the first pattern image photographed by the terminal apparatus 200.
The electronic device 100 may acquire a color space conversion matrix based on CIE XYZ color space information corresponding to the first pattern image 1111 acquired through the embodiment 1110 and RGB information corresponding to the first photographed image acquired through the embodiment 1120.
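As a minimal sketch of this step, the conversion matrix can be estimated by relating the RGB values measured from the first captured image to the stored reference XYZ values, for example in the least-squares sense. The measurement values, patch set, and solver below are assumptions for illustration; the patent itself does not prescribe a solver (a direct three-pattern solve is described later with reference to fig. 27).

```python
import numpy as np

# Hypothetical measurements: each row is the average camera RGB of one
# projected pattern patch, taken from the first captured image (embodiment 1120).
rgb_measured = np.array([
    [0.82, 0.11, 0.09],   # red pattern patch
    [0.12, 0.79, 0.10],   # green pattern patch
    [0.08, 0.13, 0.85],   # blue pattern patch
    [0.90, 0.91, 0.88],   # white pattern patch
], dtype=float)

# Stored reference CIE XYZ values for the same patterns (assumed values,
# e.g., measured once with a professional device as in embodiment 1110).
xyz_reference = np.array([
    [0.41, 0.21, 0.02],
    [0.36, 0.72, 0.12],
    [0.18, 0.07, 0.95],
    [0.95, 1.00, 1.09],
], dtype=float)

# Solve for the 3x3 matrix in the least-squares sense: rgb @ M.T ~= xyz.
M, _, _, _ = np.linalg.lstsq(rgb_measured, xyz_reference, rcond=None)
M = M.T  # M now maps a column RGB vector to a column XYZ vector

xyz = M @ np.array([0.5, 0.5, 0.5])  # transform an arbitrary RGB triple
print(M, xyz)
```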
Fig. 12 is a diagram for illustrating an operation of projecting a color calibration result.
Referring to fig. 12, after performing the color calibration operation, the electronic device 100 may project a result image 1210 onto the screen 10. The result image may include information indicating the degree to which the color calibration for the projection surface can reach a target level. For example, if the target level is 100%, the electronic device 100 may project onto the screen 10 a result image 1210 including information that the color calibration can reach 90% of the target level.
Fig. 13 is a diagram for illustrating an operation of guiding a user behavior corresponding to a color calibration result.
Referring to fig. 13, after performing the color calibration operation, the electronic device 100 may project a result image 1310 including a color calibration result 1311 and information 1312 for guiding user behavior onto the screen 10. The color calibration result 1311 may refer to information indicating the degree to which the color calibration can reach the target level. In addition, the information 1312 for guiding user behavior may include actions that the user may take in order to get better color calibration results. For example, the information 1312 may guide the user to dim the lighting to improve the accuracy of the color calibration. When the lighting is dimmed, the first pattern image and the second pattern image can be identified more clearly. Thus, the electronic device 100 may provide the user with information guiding the user to dim the lighting to improve the accuracy of the color calibration.
Fig. 14 is a diagram for illustrating an operation of comparing a projection before color calibration and a projection after color calibration according to an embodiment.
Referring to fig. 14, embodiment 1401 indicates that the electronic device 100 may project a result image 1410 according to the setting before color calibration. The electronic device 100 may project the result image 1410 after the color calibration operation. The result image 1410 may include at least one of the following: a UI 1411 indicating the color calibration result, a UI 1412 for guiding projection of the result image according to the setting before color calibration, a UI 1413 for guiding projection of the result image according to the setting after color calibration, or a UI 1414 for guiding the user to select application of the setting after color calibration. For example, if the user selects the UI 1412 for guiding projection of the result image according to the setting before color calibration, the electronic device 100 may control the projection part 120 to project the result image 1410 according to the setting before color calibration.
Meanwhile, embodiment 1402 indicates that the electronic device 100 may project a result image 1420 according to the setting after color calibration. For example, if the user selects the UI 1413 for guiding projection of the result image according to the setting after color calibration, the electronic device 100 may control the projection part 120 to project the result image 1420 according to the setting after color calibration.
The result image 1410 may be an image projected according to the setting before color calibration. Thus, the result image 1410 may be perceived differently due to the color of the screen 10. In contrast, the result image 1420 may be an image projected according to the setting after color calibration. Thus, regardless of the color of the screen 10, the result image 1420 may be displayed in the colors of the originally intended image.
The result image projected after the color calibration operation, without the user's separate selection, may be either the result image 1410 of embodiment 1401 or the result image 1420 of embodiment 1402.
Fig. 15 is a diagram illustrating an example operation of comparing a projection before color calibration and a projection after color calibration, in accordance with various embodiments.
Referring to fig. 15, the electronic device 100 may project a resulting image 1510. The resulting image 1510 can be divided into two regions. The first region 1511 may be a region projected in accordance with the setting before color calibration among the entire region of the result image 1510. The second area 1512 may be an area projected in terms of the color-calibrated settings among the entire area of the result image 1510. The electronic device 100 may project some areas in the resulting image 1510 in a pre-color-calibration setting and the remaining areas in a post-color-calibration setting.
Fig. 16 is a diagram illustrating an example operation of projecting information about a terminal device that may be connected with the electronic device 100, according to various embodiments.
Referring to fig. 16, the electronic device 100 may project an image 1610 including a list of at least one device that may be connected with the electronic device 100. For example, assume that the electronic apparatus 100 is in a state capable of being connected to the first terminal apparatus, the second terminal apparatus, and the third terminal apparatus. The electronic apparatus 100 may project information corresponding to three terminal apparatuses as a list. If the user selects the UI 1611 for connection with the first terminal device, the electronic device 100 may perform communication connection with the first terminal device.
When the electronic device 100 is communicatively connected with a specific terminal device, the electronic device 100 may project a result image 1620. The result image 1620 may include at least one of the following: information 1621 informing that the electronic device 100 is connected to the specific terminal device, or a UI 1622 for selecting whether to maintain the connection with the specific terminal device. The UI 1622 may include at least one of a UI 1623 corresponding to maintaining the connection or a UI 1624 corresponding to disconnection. If the user selects the UI 1624 corresponding to disconnection, the electronic device 100 may terminate the communication connection with the previously connected specific terminal device (e.g., the first terminal device).
After terminating the communication connection with the specific terminal device (e.g., the first terminal device), the electronic device 100 may again project an image 1630 including the list of connectable devices. In the image 1630, UIs 1631, 1632 corresponding to devices that may be connected to the electronic device 100 but whose connections were terminated may be displayed in a different layout from the other UIs. For example, the electronic device 100 may display the UIs 1631, 1632 corresponding to the first terminal device, whose connection has been terminated, in a dark color or in gray.
Fig. 17 is a flowchart for schematically showing an example procedure of performing an operation of acquiring a color space conversion matrix and performing a color calibration operation.
Referring to fig. 17, the electronic device 100 may acquire a first photographed image including the screen member 30 in operation S1705. Here, the first photographed image may be photographed by the electronic device 100 or the terminal device 200. Further, the electronic device 100 may acquire a color space conversion matrix in operation S1710. Here, the color space conversion matrix may be generated based on the first captured image. Further, the electronic apparatus 100 may acquire a second photographed image including the screen (e.g., projection surface) 10 in operation S1715. Here, the second photographed image may be photographed by the electronic device 100 or the terminal device 200. In addition, the electronic apparatus 100 may perform a color calibration operation in operation S1720. Here, the color calibration operation may be performed based on the second captured image.
Fig. 18 is a flowchart for showing in detail the operation of acquiring the color space conversion matrix.
Referring to fig. 18, the electronic device 100 may project a first pattern image onto the screen member 30 in operation S1805. Further, the electronic device 100 may acquire a first photographed image including the screen member 30 on which the first pattern image is projected in operation S1810. In addition, the electronic device 100 may acquire a color space conversion matrix based on RGB information corresponding to the first photographed image and color space information corresponding to the first pattern image in operation S1815. Further, the electronic device 100 may project the second pattern image onto the screen 10 in operation S1820. Further, the electronic device 100 may acquire a second photographed image including the screen 10 on which the second pattern image is projected in operation S1825. Further, the electronic device 100 may perform color calibration based on RGB information corresponding to the second photographed image, the color space conversion matrix, and color space information corresponding to the second pattern image in operation S1830.
Fig. 19 is a flowchart for showing the color calibration operation in detail.
Referring to fig. 19, after acquiring the color space conversion matrix, the electronic device 100 may project the second pattern image onto the screen (e.g., projection surface) 10 in operation S1905. Further, the electronic device 100 may acquire a second photographed image including the screen (e.g., projection surface) 10 on which the second pattern image is projected in operation S1910. In addition, the electronic device 100 may transform (or convert) RGB information corresponding to the second photographed image into color space information corresponding to the second photographed image based on the color space transformation matrix in operation S1915. Further, the electronic device 100 may acquire a color difference between the color space information corresponding to the second photographed image and color space information corresponding to the second pattern image in operation S1920.
Here, the electronic device 100 may recognize whether the color difference is greater than or equal to the threshold value in operation S1925. If the color difference is greater than or equal to the threshold value (yes) in operation S1925, the electronic device 100 may perform color calibration based on the color difference in operation S1930. Then, after performing the color calibration, the electronic device 100 may project the second pattern image again. Then, the electronic device 100 may repeat operations S1905 to S1925. Meanwhile, if the color difference is less than the threshold value (no) in operation S1925, the electronic device 100 may maintain the current set value without performing the color calibration in operation S1935.
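The repeat-until-converged flow of fig. 19 can be summarized in the following sketch. The helper callables (project_pattern, capture_image, average_rgb, apply_correction), the threshold value, and the Euclidean color-difference metric are assumptions for illustration; the patent does not specify the exact difference metric or capture API.

```python
import numpy as np

THRESHOLD = 3.0   # assumed convergence threshold for the color difference
MAX_ITER = 10     # guard against non-convergence

def calibrate(project_pattern, capture_image, average_rgb,
              M, xyz_target, apply_correction):
    """Repeat operations S1905-S1930 of fig. 19 until the color difference is small.

    M:                3x3 color space conversion matrix (RGB -> XYZ)
    xyz_target:       stored XYZ color space information of the second pattern image
    apply_correction: callable that adjusts the RGB output signal
    """
    for _ in range(MAX_ITER):
        project_pattern()                        # S1905: project the second pattern image
        shot = capture_image()                   # S1910: capture the projection surface
        rgb = average_rgb(shot)                  # RGB information of the captured image
        xyz = M @ rgb                            # S1915: transform RGB -> XYZ
        diff = np.linalg.norm(xyz - xyz_target)  # S1920: color difference (assumed metric)
        if diff < THRESHOLD:                     # S1925 "No": keep the current settings
            return True
        apply_correction(xyz_target - xyz)       # S1930: calibrate, then retry
    return False
```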
Fig. 20 is a flowchart for illustrating an operation of identifying whether a predetermined object is included in the screen member 30.
Referring to fig. 20, in the first and third embodiments of fig. 4, in which the user directly installs the screen member 30, it is necessary to check whether the screen member 30 is a genuine screen member corresponding to the electronic device 100. This is because, if the screen member 30 is not genuine, color calibration may not be performed properly.
The electronic device 100 may acquire a first photographed image including the screen member 30 in operation S2005. Further, the electronic apparatus 100 may recognize whether a predetermined (e.g., designated) object is included in the first photographed image in operation S2010. Here, the predetermined object may refer to an object for identifying whether the screen member 30 is a screen member corresponding to the electronic device 100. A specific explanation of this will be described in fig. 21 and 22.
If the predetermined object is included in the first photographed image in operation S2010 (yes), the electronic device 100 may acquire a color space transformation matrix in operation S2015. The fact that the predetermined object is included in the first photographed image may mean that the screen member 30 exists on the screen 10. The predetermined object may be an object related to the screen member 30.
If the predetermined object is not included in the first photographed image in operation S2010 (no), the electronic device 100 may project text indicating that the screen member 30 is not recognized in operation S2020. The fact that the predetermined object is not included in the first photographed image may mean that the screen member 30 is not present on the screen 10. However, a situation may occur in which a screen member of a different type from the genuine screen member 30 is present. In this case, the electronic device 100 may project text informing the user that a non-genuine screen member of a different type is present.
Fig. 21 is a diagram for illustrating an operation of the recognition screen member 30 according to the embodiment.
Referring to fig. 21, the screen member 30 may include a general region 2111 and a concave-convex region 2112. Here, the general region 2111 may be a region including a flat surface with uniform reflectivity, and the concave-convex region 2112 may be a region including a surface whose reflectivity varies. That is, in the general region 2111 the reflectance of light is uniform, while in the concave-convex region 2112 the reflectance may be irregular. The concave-convex region 2112 may be a region made by intaglio (engraving) or relief (embossing). Here, the concave-convex region 2112 may serve as the predetermined (e.g., specified) object. If the predetermined object (concave-convex region 2112) is included in the acquired first photographed image, the electronic device 100 may determine that the screen member 30 is genuine.
In fig. 21, the concave-convex region 2112 is indicated by oblique lines, but in a practical implementation, the concave-convex region 2112 may be a region that is normally imperceptible to the user's naked eye and becomes perceptible only when light is projected onto it. Further, since the concave-convex region 2112 is implemented as an uneven surface, its reflectance of light may be irregular.
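One way to detect such a region of irregular reflectance is to look for image blocks whose brightness variance is high while the surrounding flat region is uniform. The following sketch only illustrates this idea; the block size, the variance-ratio criterion, and the detection approach as a whole are assumptions, since the patent does not disclose a concrete detection algorithm.

```python
import numpy as np

def find_irregular_blocks(gray, block=16, var_ratio=4.0):
    """Flag blocks whose brightness variance is much higher than the median.

    gray: 2D array of brightness values from the captured image.
    Returns a boolean grid; True marks candidate concave-convex blocks.
    """
    h, w = gray.shape
    rows, cols = h // block, w // block
    var = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            tile = gray[r * block:(r + 1) * block, c * block:(c + 1) * block]
            var[r, c] = tile.var()
    # The flat general region should dominate the median variance,
    # so blocks far above it are candidates for the concave-convex region.
    return var > var_ratio * np.median(var)
```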
Fig. 22 is a diagram for illustrating an operation of the recognition screen member 30 according to another embodiment.
Referring to fig. 22, the screen member 30 may include a predetermined (e.g., designated) object 2210. Here, the predetermined object 2210 may mean an object that can indicate a genuine screen member. For example, the predetermined object 2210 may include at least one of a predetermined text, a predetermined image, a predetermined icon, or a predetermined pattern. Meanwhile, the predetermined object 2210 may be an object that cannot be clearly seen in a state where light is not irradiated, but can be clearly seen in a state where light is irradiated.
Fig. 23 is a flowchart for illustrating an embodiment of performing a color calibration operation in the electronic device 100.
Referring to fig. 23, the electronic device 100 may perform color calibration by acquiring a color space transformation matrix. Specifically, the electronic device 100 may project a first pattern image onto the screen member 30 in operation S2305. Further, after projecting the first pattern image, the electronic device 100 may transmit a first photographing command to the terminal device 200 in operation S2310. If the user directly performs photographing behavior after observing that the first pattern image is projected, operation S2310 may be omitted.
Here, after projecting the first pattern image, the terminal device 200 may acquire a first photographed image including the screen member 30 on which the first pattern image is projected in operation S2315. Then, the terminal device 200 may transmit the first photographed image to the electronic device 100 in operation S2320.
Here, the electronic device 100 may acquire a color space conversion matrix based on the first photographed image and the first pattern image received from the terminal device 200 in operation S2325. Specifically, the electronic device 100 may acquire the color space conversion matrix based on RGB information corresponding to the first captured image and color space information corresponding to the first pattern image.
Here, after acquiring the color space conversion matrix, the electronic device 100 may project the second pattern image onto the screen 10 in operation S2330. Further, after projecting the second pattern image, the electronic device 100 may transmit a second photographing command to the terminal device 200 in operation S2335. If the user directly performs photographing behavior after observing that the second pattern image is projected, operation S2335 may be omitted.
Here, after projecting the second pattern image, the terminal device 200 may acquire a second photographed image including the screen 10 on which the second pattern image is projected in operation S2340. Then, the terminal device 200 may transmit the second photographed image to the electronic device 100 in operation S2345.
Here, the electronic apparatus 100 may perform color calibration based on the second photographed image, the color space conversion matrix, and the second pattern image received from the terminal apparatus 200 in operation S2350. Specifically, the electronic device 100 may change RGB information corresponding to the second photographed image to color space information corresponding to the second photographed image based on the color space transformation matrix. Then, the electronic apparatus 100 may perform color calibration based on the color difference between the color space information corresponding to the second photographed image and the color space information corresponding to the second pattern image.
Fig. 24 is a flowchart for illustrating an embodiment of performing a color calibration operation in the terminal apparatus 200.
Referring to fig. 24, the electronic device 100 may perform color calibration by acquiring a color space transformation matrix. Specifically, the electronic device 100 may project a first pattern image onto the screen member 30 in operation S2405. Further, after projecting the first pattern image, the electronic apparatus 100 may transmit a first photographing command to the terminal apparatus 200 in operation S2410. If the user directly performs photographing behavior after observing the projection of the first pattern image, operation S2410 may be omitted.
Here, after the first pattern image is projected, the terminal apparatus 200 may acquire a first photographed image including the screen member 30 on which the first pattern image is projected in operation S2415.
Here, the terminal apparatus 200 may acquire a color space conversion matrix based on the first photographed image and the first pattern image in operation S2420. Specifically, the terminal apparatus 200 may acquire the color space conversion matrix based on RGB information corresponding to the first photographed image and color space information corresponding to the first pattern image. Here, the color space information corresponding to the first pattern image may be information already stored in the terminal apparatus 200 or information transmitted together in operation S2410. Then, the terminal device 200 may transmit a command for projecting the second pattern image to the electronic device 100 in operation S2425.
Here, if a command for projecting the second pattern image is received from the terminal apparatus 200, the electronic apparatus 100 may project the second pattern image onto the screen 10 in operation S2430. Further, after projecting the second pattern image, the electronic device 100 may transmit a second photographing command to the terminal device 200 in operation S2435. If the user directly performs photographing behavior after observing that the second pattern image is projected, operation S2435 may be omitted.
Here, after projecting the second pattern image, the terminal apparatus 200 may acquire a second photographed image including the screen 10 on which the second pattern image is projected in operation S2440.
Here, the terminal apparatus 200 may perform color calibration based on the second photographed image, the color space conversion matrix, and the second pattern image in operation S2445. Specifically, the terminal apparatus 200 may change RGB information corresponding to the second photographed image to color space information corresponding to the second photographed image based on the color space conversion matrix. Then, the terminal device 200 may perform color calibration based on the color difference between the color space information corresponding to the second photographed image and the color space information corresponding to the second pattern image. Here, the color space information corresponding to the second pattern image may be information already stored in the terminal apparatus 200 or information transmitted together in operation S2435. Then, the terminal device 200 may transmit the color calibration result to the electronic device 100 in operation S2450.
Meanwhile, unlike fig. 24, the operation of acquiring the color space conversion matrix and the color calibration operation described as being performed in the electronic apparatus 100 in the description of the present disclosure may be performed in the terminal apparatus 200.
Fig. 25 is a flowchart illustrating an embodiment of performing a color calibration operation using streaming data.
Referring to fig. 25, the electronic device 100 may acquire an image including the screen member 30 or the screen 10 in real time. Specifically, the electronic apparatus 100 may acquire a real-time streaming image corresponding to a space to which the electronic apparatus 100 projects an image in operation S2505. Here, the electronic device 100 may recognize whether the screen member 30 is included in the streaming image in operation S2510.
If the screen member 30 is not included in the streaming image in operation S2510 (no), the electronic device 100 may repeatedly acquire the streaming image. If the screen member 30 is included in the streaming image in operation S2510 (yes), the electronic device 100 may project a first pattern image onto the screen member 30 in operation S2515.
Further, the electronic device 100 may acquire a color space conversion matrix based on the streaming image including the screen member 30 on which the first pattern image is projected in operation S2520. Specifically, the electronic device 100 may acquire the color space conversion matrix based on RGB information corresponding to the streaming image including the screen member 30 and color space information corresponding to the first pattern image.
Further, after acquiring the color space conversion matrix, the electronic device 100 may recognize whether the screen member 30 is included in the streaming image again in operation S2525.
If the screen member 30 is included in the streaming image after the color space conversion matrix is acquired in operation S2525 (yes), the electronic device 100 may project a UI for guiding removal of the screen member 30 in operation S2530. Then, the electronic device 100 may repeatedly recognize whether the screen member 30 is included in the streaming image. If the screen member 30 is not included in the streaming image after the color space conversion matrix is acquired in operation S2525 (no), the electronic device 100 may project the second pattern image onto the screen 10 in operation S2535.
Further, the electronic device 100 may perform color calibration based on the streaming image including the screen 10 on which the second pattern image is projected in operation S2540. Specifically, the electronic device 100 may obtain RGB information corresponding to the streaming image including the screen 10 and color space information corresponding to the streaming image including the screen 10 based on the color space transformation matrix. Then, the electronic device 100 may perform color calibration based on a color difference between color space information corresponding to the streaming image including the screen 10 and color space information corresponding to the second pattern image.
Fig. 26 is a diagram for illustrating a system including the electronic apparatus 100, the terminal apparatus 200, and the server 300.
Referring to fig. 26, the server 300 may refer to a device that can be communicatively connected with the electronic device 100 and the terminal device 200. The server 300 may transmit necessary information to the electronic device 100 or the terminal device 200.
According to the embodiment, the electronic device 100 and the terminal device 200 can directly transmit and receive information.
According to another embodiment, the electronic device 100 and the terminal device 200 may transmit and receive information through the server 300.
For example, the terminal device 200 may transmit the photographed image to the server 300, and the server 300 may transmit the image received from the terminal device 200 to the electronic device 100.
For another example, the color calibration may be performed in the terminal apparatus 200 by using a color space transformation matrix. Here, the server 300 may transmit information related to the first pattern image and information related to the second pattern image to the terminal device 200. Here, the information related to the first pattern image may refer to color space information corresponding to the first pattern image. Here, the information related to the second pattern image may refer to color space information corresponding to the second pattern image. Further, the color calibration result generated by the terminal apparatus 200 may be transmitted to the server 300, and the server 300 may transmit the color calibration result received from the terminal apparatus 200 to the electronic apparatus 100.
Fig. 27 is a diagram for illustrating a process of acquiring a color space conversion matrix for acquiring XYZ color space information corresponding to a captured image.
Referring to fig. 27, an exemplary process of acquiring a color space transformation matrix is described in embodiment 2700. Here, the color space conversion matrix may refer to a matrix for converting from RGB information to XYZ color space information. Here, the matrix 2701 may refer to color space conversion information or a color space conversion matrix. In addition, matrix 2701 may refer to a 3×3 matrix.
The matrix 2701 may include nine unknowns. Here, the nine unknowns may be KXR, KXG, KXB, KYR, KYG, KYB, KZR, KZG and KZB. To obtain the correct values for the nine unknowns, the electronic device 100 may use three embodiments 2710, 2720, 2730. Embodiment 2710 is an embodiment in which a red pattern image is projected and then the image is photographed, embodiment 2720 is an embodiment in which a green pattern image is projected and then the image is photographed, and embodiment 2730 is an embodiment in which a blue pattern image is projected and then the image is photographed.
In the embodiment 2710, the electronic device 100 may project a red pattern image and acquire a photographed image including the screen member 30 on which the red pattern image is projected. Then, the electronic device 100 may acquire RGB information corresponding to the red pattern image from the acquired photographed image. The matrix 2701 may refer to a color space transformation matrix, and it may refer to a 3×3 matrix with nine unknowns. The matrix 2712 may refer to RGB information corresponding to a photographed image including the screen member 30 on which the red pattern image is projected. Further, the matrix 2713 may refer to XYZ color space information corresponding to a red pattern image. The matrix 2713 may be stored in the memory 110 in advance before the captured image is acquired. With the embodiment 2710, three simultaneous equations related to the color space conversion matrix can be acquired.
In embodiment 2720, electronic device 100 may project a green pattern image and acquire a captured image including screen member 30 on which the green pattern image is projected. Then, the electronic device 100 may acquire RGB information corresponding to the green pattern image from the acquired photographed image. The matrix 2701 may refer to a color space transformation matrix, and it may refer to a 3×3 matrix with nine unknowns. The matrix 2722 may refer to RGB information corresponding to a photographed image including the screen member 30 on which the green pattern image is projected. Further, the matrix 2723 may refer to XYZ color space information corresponding to a green pattern image. The matrix 2723 may be stored in the memory 110 in advance before capturing the photographed image. With embodiment 2720, three simultaneous equations related to the color space transformation matrix can be obtained.
In embodiment 2730, electronic device 100 may project a blue pattern image and acquire a captured image including screen member 30 on which the blue pattern image is projected. Then, the electronic device 100 may acquire RGB information corresponding to the blue pattern image from the acquired photographed image. The matrix 2701 may refer to a color space transformation matrix, and it may refer to a 3×3 matrix with nine unknowns. The matrix 2732 may refer to RGB information corresponding to a photographed image including the screen member 30 on which the blue pattern image is projected. Further, the matrix 2733 may refer to XYZ color space information corresponding to a blue pattern image. The matrix 2733 may be stored in the memory 110 in advance before capturing the photographed image. With embodiment 2730, three simultaneous equations related to the color space transformation matrix can be obtained.
The electronic device 100 may acquire nine simultaneous equations through the three embodiments 2710, 2720, 2730, and find all nine unknowns included in the color space transformation matrix by using the nine simultaneous equations. Finally, the electronic device 100 may acquire a color space transformation matrix of a 3×3 matrix.
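In matrix form, the three embodiments give M · c_i = x_i for the red, green, and blue patterns, where c_i is the measured RGB column vector (matrices 2712, 2722, 2732) and x_i is the stored XYZ column vector (matrices 2713, 2723, 2733). Stacking the three columns gives M · C = X, hence M = X · C⁻¹. A sketch of this solve, with the measurement values below being assumptions for illustration:

```python
import numpy as np

# Columns: measured RGB of the captured red, green, and blue pattern images
# (matrices 2712, 2722, 2732; values here are assumed for illustration).
C = np.array([[0.82, 0.12, 0.08],
              [0.11, 0.79, 0.13],
              [0.09, 0.10, 0.85]])

# Columns: stored XYZ of the red, green, and blue pattern images
# (matrices 2713, 2723, 2733; assumed values).
X = np.array([[0.41, 0.36, 0.18],
              [0.21, 0.72, 0.07],
              [0.02, 0.12, 0.95]])

# Nine equations, nine unknowns: M @ C = X  =>  M = X @ inv(C).
M = X @ np.linalg.inv(C)
print(M)  # the 3x3 color space transformation matrix (matrix 2701)

# Sanity check: M reproduces the red-pattern equation of embodiment 2710.
assert np.allclose(M @ C[:, 0], X[:, 0])
```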
Meanwhile, in fig. 27, a 3×3 matrix is acquired as the color space conversion matrix, corresponding to an embodiment that uses RGB information and XYZ color space information. However, according to implementation examples, various kinds of color space information other than XYZ color space information may be used. Also, according to an implementation example, the electronic device 100 may acquire a color space transformation matrix of a size different from 3×3.
Fig. 28 is a diagram for showing RGB information corresponding to a captured image and XYZ color space information corresponding to the captured image.
Referring to fig. 28, the electronic device 100 may transform RGB information corresponding to a photographed image into XYZ color space information. Specifically, the electronic device 100 may acquire the XYZ color space information by multiplying the RGB information corresponding to the captured image by the color space conversion matrix. Table 2810 may indicate the RGB information, and table 2820 may indicate the resulting XYZ color space information.
Fig. 29 is a flowchart for illustrating a control method of an electronic device according to an embodiment of the present disclosure.
A control method of an electronic device 100 that can store a first pattern image and a second pattern image and communicate with an external terminal device according to an embodiment of the present disclosure includes the steps of: projecting the first pattern image onto a screen member 30 located on a projection surface (S2905), acquiring conversion information based on a first captured image and the first pattern image in response to receiving the first captured image photographed with the screen member 30 from the external terminal device (S2910), projecting the second pattern image onto the projection surface (S2915), and performing color calibration according to characteristics of the projection surface based on a second captured image, the second pattern image, and the conversion information in response to receiving the second captured image photographed with the projection surface from the external terminal device (S2920).
Meanwhile, the transformation information may be color space transformation information, and in the step of acquiring transformation information (S2910), color space information corresponding to the first pattern image may be acquired, and color space transformation information according to characteristics of the projection part 120 included in the electronic device 100 may be acquired based on the first photographed image and the color space information corresponding to the first pattern image.
Meanwhile, in the step of performing color calibration (S2920), color space information corresponding to the second pattern image may be acquired, and color calibration according to the characteristics of the projection surface may be performed based on the second captured image, the color space information corresponding to the second pattern image, and the color space conversion information.
Meanwhile, in the step of acquiring the conversion information (S2910), the color space conversion information may be acquired based on RGB information corresponding to the first photographed image and XYZ color space information corresponding to the first pattern image.
Meanwhile, the transformation information may be a color space transformation matrix that transforms RGB information into XYZ color space information.
Meanwhile, in the step of performing color calibration (S2920), RGB information corresponding to the second photographed image may be converted into XYZ color space information based on the conversion information, a color difference between XYZ color space information corresponding to the second photographed image and XYZ color space information corresponding to the second pattern image may be acquired, and color calibration may be performed based on the acquired color difference.
Meanwhile, in the step of performing color calibration (S2920), at least one of a gain value or an offset value related to the RGB signal may be acquired based on the acquired color difference.
Meanwhile, the control method may further include the steps of: in response to identifying that a predetermined object related to the screen member 30 is included in the first captured image, acquiring the transformation information based on the first captured image and the first pattern image, and in response to identifying that the predetermined object related to the screen member 30 is not included in the first captured image, projecting a UI including information indicating that the screen member 30 is not identified.
Meanwhile, the first pattern image may include at least one of a white pattern image, a red pattern image, a green pattern image, or a blue pattern image, and the second pattern image may include a white pattern image.
Meanwhile, in the step of projecting the first pattern image (S2905), a white pattern image among the plurality of pattern images included in the first pattern image may be first projected, and the remaining pattern images may be sequentially projected.
Meanwhile, the first pattern image may include at least one of a red pattern, a green pattern, a blue pattern, or a white pattern. Here, the electronic device 100 may first output a white pattern. Then, the electronic device 100 may output red, green, and blue patterns. If the white pattern is output first, the order of the red, green, and blue patterns thereafter may be changed according to user settings.
Meanwhile, the screen member 30 may be a white screen member. For example, the screen member 30 may be a material such as plastic, and it may be a flat material that may be attached or mounted on a projection surface. An explanation of this has been described in fig. 5 and 7. As another example, a portion of the electronic device 100 may be designed to function as a screen member. The screen member 30 may be a mechanical component or panel, and it may be part of the assembly of the electronic device 100. An explanation of this has been described in fig. 6, 8 and 10.
Here, the edge of the screen member 30 may be designed with a pattern whose reflectivity differs according to a thickness and/or depth defined by the manufacturer. Thus, when particular light is projected from the projection part onto the screen member 30, the reflectivity of the edge and the reflectivity of the other areas may be different. Therefore, when a camera photographs the screen member 30, the gradation (brightness or color) can be recognized differently according to the area of the screen member 30.
In addition, patterns having different reflectivities may be designed as follows: the pattern can be recognized well when the directional light is projected by the electronic device 100, but is difficult to distinguish under normal natural light or indoor light. In this way, specific shapes or letters can be printed, and these specific shapes or letters can be recognized when the camera performs photographing.
The user positions the screen member 30, provided at the time of purchasing the electronic device 100, at the position where the output image is projected, such that the entire output image or a part of the output image is projected onto the screen member 30. In order to identify a screen member defined by the manufacturer, the electronic device 100 may enable naked-eye identification by projecting a predefined pattern or light. Alternatively, for automation, the electronic device 100 may provide a calibration function that is operated when the screen member is identified from an image photographed by a camera provided on the electronic device 100 or a smartphone camera.
According to an embodiment, when a menu provided at the electronic apparatus 100 is selected, a pre-calibration operation may be performed. Here, the pre-calibration operation may include an operation of communicatively connecting the electronic device 100, which projects the image, with the terminal device 200, and adjusting the sensitivity of the projection part 120.
If the pre-calibration menu among the software OSD menus of the electronic apparatus 100 is selected, the electronic apparatus 100 may find user terminal devices 200 equipped with cameras in the surrounding environment and output a list. If the user selects a particular device, the electronic apparatus 100 may output a message that it is attempting to connect to the particular device and wait for the user to accept (or select). The electronic apparatus 100 may automatically find surrounding devices through methods such as Wi-Fi, UPnP, and the like, and output the list. Here, the list may include at least one of a device name, a device ID, or a device shape that graphically represents a schematic shape of the device. When the user selects a specific device, the electronic apparatus 100 may temporarily set up the pairing connection and output an image indicating which device is currently selected by the user (i.e., the device on which the color calibration function is to be performed).
Here, if the device is not the device desired by the user, the electronic apparatus 100 may disconnect the temporary connection, output the list again, and output a guide image so that the user can select the device to connect again. Here, the electronic apparatus 100 may output information about the device whose connection was disconnected in gray, to avoid the disconnected device being selected again. By displaying the surrounding devices in distinguishable colors, the user can easily distinguish the devices that can be connected with the electronic apparatus 100.
When the user selects one device, one of the electronic apparatus 100 and the selected device may become an Access Point (AP) and operate pairing, and the devices may enter a state in which wireless communication between the electronic apparatus 100 and the selected device is possible. Here, the electronic apparatus 100 may store the name, physical address, or the like of the paired device, and reuse the stored information when pairing is required again later. Since the connection information is stored, the electronic device 100 can provide an automatic connection function, which can shorten the time for the user to make a selection.
According to another embodiment, the pre-calibration function may be performed when a menu provided at the terminal device 200 is selected.
If the pre-calibration menu provided in the application of the terminal apparatus 200 is selected, the terminal apparatus 200 may find surrounding devices that can be communicatively connected with the terminal apparatus 200 and display the subjects having a projection function as a list on the terminal apparatus 200. The user may select a specific device in the list displayed through the terminal apparatus 200. Here, the list may include at least one of a device name, a device ID, or a thumbnail of the image being reproduced (or output) on the surrounding device.
Here, if one surrounding device is selected in the list, one device between the terminal apparatus 200 and the selected device may become an Access Point (AP) and may operate pairing, and the device may enter a state where wireless communication between the terminal apparatus 200 and the selected device is possible. Here, the terminal apparatus 200 may store the name, physical address, or the like of the pairing device, and reuse the stored information when pairing is required again later. Since the connection information is stored, the terminal apparatus 200 can provide an automatic connection function, which can shorten the time for the user to make a selection.
Here, the pre-calibration may be a process of matching the display and the calibration reference points in order to address calibration accuracy problems arising from the wide variety of cellular phone cameras in circulation. In this way, the variation among various smartphone cameras can be accommodated.
Meanwhile, an embodiment is assumed in which the color calibration operation is performed in the terminal apparatus 200. The electronic device 100 may change the patterns of red, green, and blue, output the patterns on the screen member 30, and transmit a signal (a flag) for capturing an image to the terminal device 200. Further, the terminal device 200 may analyze the photographed image and transmit picture quality adjustment information according to the analysis result and a signal (flag) for changing to the next pattern to the electronic device 100. Here, after the terminal device 200 photographs the red pattern output by the electronic device 100, red, green, and blue wavelength information may be extracted from the acquired image and stored. Also, after the terminal device 200 photographs the green and blue patterns output by the electronic device 100, the corresponding red, green, and blue wavelength information may be analyzed from the acquired images and stored. In addition, the white pattern may be analyzed and stored. By combining the pattern photographing information with the R/G/B response characteristics of the camera included in the terminal apparatus 200, the terminal apparatus 200 can generate a color space conversion matrix for accurate RGB-to-XYZ color space conversion. The color space conversion matrix may be used in color space conversion for performing optical color coordinate adjustment based on RGB information of an image photographed by the camera of the terminal apparatus 200.
Even if the actual colors of the screens are different, the electronic device 100 or the terminal device 200 can automatically calibrate the colors to the picture quality of the original image or the picture quality set by the user.
If a communication connection is set between the terminal apparatus 200 and the electronic apparatus 100, the color calibration function can be immediately performed without separately selecting a device.
If only the color space conversion matrix is generated and the color calibration operation is not completed in the previous process, an operation for connecting the electronic apparatus 100 and the terminal apparatus 200 may be performed. If the user selects a specific menu, the list may be output through the electronic device 100 or may be output through an application of the terminal device 200.
Meanwhile, if a specific menu is selected at the electronic apparatus 100 or the terminal apparatus 200 after the communication connection is set, the electronic apparatus 100 may perform a color calibration function.
In the case where the electronic device 100 uses a wall surface having a specific color as the screen 10, a color calibration function may be required. Specifically, the electronic device 100 may project a white pattern onto the screen member 30, and when the projection is completed, the electronic device 100 may set a flag and transmit a signal for photographing to the terminal device 200. After the terminal device 200 photographs the white pattern output from the electronic device 100 and reflected on the wall surface, the color calibration function may be performed by using only the information of a predefined operation region (e.g., n×n pixels at the image center) among the entire region of the RGB image.
The electronic device 100 may transform the RGB information of the photographed image into XYZ color space information by using the color space transformation matrix and then calculate color coordinates. Here, it can be analyzed how much difference exists between the calculated color coordinates and the target color coordinates to be adjusted by the color calibration function. Further, the electronic device 100 may calculate a signal variation (adjustment) value for calibrating the color coordinates, inverse-transform the color coordinate calibration value in the XYZ color space back into the RGB color space (which may be calculated with the inverse of the transformation matrix), and calculate an RGB calibration value. The electronic device 100 may adjust the output signal level by utilizing the gain and/or offset of the RGB signals. By repeatedly performing this process, the calibration may be completed at the point in time when the error becomes less than or equal to a predetermined threshold.
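A sketch of one such adjustment step follows. It assumes the correction is applied as a per-channel gain on the RGB output signal (the patent allows gain and/or offset), and that the projector output can only be attenuated; both are illustrative assumptions.

```python
import numpy as np

def rgb_adjustment(M, rgb_shot, xyz_target, eps=1e-6):
    """One iteration of the calibration step described above.

    M:          3x3 RGB -> XYZ color space transformation matrix
    rgb_shot:   average RGB of the operation region of the captured image
    xyz_target: target XYZ color coordinates of the projected pattern
    Returns per-channel RGB gains (an assumed form of the correction).
    """
    xyz_shot = M @ rgb_shot                   # transform RGB -> XYZ
    delta_xyz = xyz_target - xyz_shot         # difference in color coordinates
    delta_rgb = np.linalg.inv(M) @ delta_xyz  # inverse-transform to an RGB calibration value
    # Express the correction as a multiplicative gain on the output signal;
    # clipping to [0, 1] assumes the output can only be attenuated.
    gain = (rgb_shot + delta_rgb) / np.maximum(rgb_shot, eps)
    return np.clip(gain, 0.0, 1.0)
```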
Meanwhile, an embodiment is assumed in which the color calibration operation is performed in the terminal apparatus 200. The electronic apparatus 100 may output a white pattern and transmit a flag for photographing to the terminal apparatus 200. Here, when the terminal device 200 completes the photographing and analysis, the terminal device 200 may transmit its analysis result value and a flag for changing to the next pattern to the electronic device 100.
Meanwhile, an embodiment in which the color calibration operation is performed in the electronic apparatus 100 is assumed. The camera of the terminal apparatus 200 may transmit the photographed RGB information to the electronic apparatus 100, and the electronic apparatus 100 may perform all of the following: the color space conversion, the calculation of the RGB signal output adjustment values, and the adjustment. For this purpose, the color space transformation matrix acquired in the pre-calibration step may be stored in the electronic device 100 and then shared with the terminal device 200, or vice versa. Alternatively, the color space transformation matrix may be saved in the server 300 and downloaded immediately to the party that needs it.
Here, in the case of performing the color calibration function, the electronic apparatus 100 may, for example, compute only over the operation region (the center n×n pixels) of the RGB image acquired by the camera of the terminal apparatus 200, and perform the color calibration based on that center region. For another example, the electronic device 100 may divide the acquired RGB image into X and Y rectangular areas in the horizontal and vertical directions, respectively, and extract the n×n pixels at the center of each of the X×Y rectangular areas. In addition, the electronic device 100 may convert these into the XYZ color space, respectively, compare the X×Y color coordinates with the color coordinates of the center area of the RGB image, and thereby also calibrate color unevenness of the wall surface. Specifically, the electronic device 100 may calculate the deviation between the color coordinates of the central area and the color coordinates of the surrounding areas, and thereby calculate the non-uniformity characteristics of the projected wall surface. Further, when signal adjustment is performed in the electronic device 100, the electronic device 100 may perform calibration through a deviation adjustment process so that the uniformity of the RGB signals is matched going outward from the center of the wall surface. When the uniform optical signal projected from the electronic device 100 is reflected onto the wall surface onto which the user projects and is recognized by the user's eyes, the color unevenness of the wall surface can thus be accurately adjusted.
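A sketch of the X×Y regional comparison follows; the grid size, patch size, and Euclidean deviation metric are assumed parameters for illustration.

```python
import numpy as np

def regional_color_deviation(img, M, grid=(3, 3), n=8):
    """Compare each region's XYZ coordinates against the image center's.

    img:  HxWx3 RGB capture of the wall surface
    M:    3x3 RGB -> XYZ transformation matrix
    grid: number of rectangular areas in the (vertical, horizontal) direction
    n:    side of the n x n pixel patch sampled at each region center
    Returns a grid-shaped array of XYZ deviations from the center region.
    """
    H, W, _ = img.shape

    def patch_xyz(cy, cx):
        # Average RGB of the n x n patch around (cy, cx), transformed to XYZ.
        p = img[cy - n // 2:cy + n // 2, cx - n // 2:cx + n // 2]
        return M @ p.reshape(-1, 3).mean(axis=0)

    center = patch_xyz(H // 2, W // 2)
    dev = np.zeros(grid)
    for r in range(grid[0]):
        for c in range(grid[1]):
            cy = int((r + 0.5) * H / grid[0])
            cx = int((c + 0.5) * W / grid[1])
            dev[r, c] = np.linalg.norm(patch_xyz(cy, cx) - center)
    return dev
```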
The division of the above operations between the terminal device 200 and the electronic device 100 may also be varied: in consideration of the processing capability of each device, the two devices may automatically swap roles entirely, or may decide between themselves to split off and process certain parts of the procedure. Furthermore, the assignment of such roles may be implemented as a user selection, so that it may be fixed as an option or changed flexibly.
During the initial capture and analysis, the electronic device 100 may output a result image indicating how accurately the wall surface can be calibrated by the wall color calibration function provided in the present disclosure. Here, the result image may include a UI conveying the result of processing by the product's internal logic and predetermined rules. To measure the color of the wall surface, the electronic device 100 may project a white pattern onto the wall surface at the current signal level. The electronic device 100 may then analyze the photographing result, calculate the signal change range (e.g., minimum and maximum values) that can be adjusted at the electronic device 100, and from this calculate the degree of accuracy to which the wall surface can be adjusted. Further, if the degree of accuracy is lower than the minimum level provided by the present disclosure, the electronic device 100 may output at least one of a warning image or guidance regarding the degree of accuracy. After providing this guidance predicting the calibration accuracy, the electronic device 100 may identify whether the user chooses to proceed with the calibration or to stop it.
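A minimal sketch of such a feasibility estimate follows; the gain limits, the required-gain values, and the minimum accuracy level are all assumptions made for illustration.

```python
import numpy as np

def predict_accuracy(required_gain, gain_min=0.5, gain_max=1.0):
    """Estimate how fully the required per-channel correction fits inside
    the adjustable gain range (1.0 = fully achievable)."""
    achievable = np.clip(required_gain, gain_min, gain_max)
    residual = np.abs(required_gain - achievable)   # portion outside the range
    return 1.0 - residual.max() / np.abs(required_gain).max()

required = np.array([0.92, 0.85, 0.40])  # hypothetical gains needed for a strongly tinted wall
accuracy = predict_accuracy(required)
if accuracy < 0.9:                       # assumed minimum accuracy level
    print(f"Warning: calibration accuracy limited to about {accuracy:.0%}")
```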
Through this process, even if the projection surface has a specific color other than white, the intended color coordinates in the XYZ color space can be reached accurately. Whereas conventional color calibration in the RGB color space can differ from the color actually perceived, according to the present disclosure the method of calibrating color in the XYZ space can adjust the projected color to look the same as the color of the original image (e.g., a white screen). After the adjustment is completed, the electronic device 100 may display the picture quality before and after the adjustment. The electronic device 100 may then provide a UI through which the user may select either the pre-adjustment or the post-adjustment setting.
If the user ultimately selects a setting, the electronic device 100 may store it. Further, if the electronic device 100 is later used on the same wall surface, the electronic device 100 may recall the stored setting value and use it again. The setting values may also be stored in the terminal device 200 or the server 300 rather than in the electronic device 100.
Meanwhile, the electronic device 100 may store information about the wall surface for which the picture quality was adjusted (information identifying the wall surface may be specified during the adjustment process, e.g., its position in the home, or the color distribution measured when a white pattern is projected onto it from the projector). This is particularly useful when the user carries a portable projector from place to place.
Meanwhile, the electronic device 100 can reproduce the picture quality of the original image unchanged, regardless of the material or color of the screen. The functionality and usability of the projector can thus be improved.

Claims (15)

1. An electronic device, comprising:
a memory storing a first pattern image and a second pattern image;
a communication interface including a communication circuit configured to communicate with an external terminal device;
a projection member; and
a processor configured to:
control the projection member to project the first pattern image onto a screen member located on a projection surface,
in response to receiving, from the external terminal device through the communication interface, a first photographed image capturing the screen member, acquire transformation information based on the first photographed image and the first pattern image,
control the projection member to project the second pattern image onto the projection surface, and
in response to receiving, from the external terminal device through the communication interface, a second photographed image capturing the projection surface, perform color calibration according to characteristics of the projection surface based on the second photographed image, the second pattern image, and the transformation information.
2. The electronic device of claim 1, wherein the transformation information comprises color space transformation information, and
the processor is configured to:
acquire color space information corresponding to the first pattern image, and
acquire the color space transformation information according to characteristics of the projection member, based on the first photographed image and the color space information corresponding to the first pattern image.
3. The electronic device of claim 2, wherein the processor is configured to:
acquire color space information corresponding to the second pattern image, and
perform the color calibration according to the characteristics of the projection surface, based on the second photographed image, the color space information corresponding to the second pattern image, and the color space transformation information.
4. The electronic device of claim 2, wherein the processor is configured to:
acquire the color space transformation information based on red, green, and blue (RGB) information corresponding to the first photographed image and XYZ color space information corresponding to the first pattern image.
5. The electronic device of claim 1, wherein the transformation information comprises a color space transformation matrix that transforms RGB information into XYZ color space information.
6. The electronic device of claim 1, wherein the processor is configured to:
transform RGB information corresponding to the second photographed image into XYZ color space information based on the transformation information,
acquire a color difference between the XYZ color space information corresponding to the second photographed image and XYZ color space information corresponding to the second pattern image, and
perform the color calibration based on the acquired color difference.
7. The electronic device of claim 6, wherein the processor is configured to:
change at least one of a gain value or an offset value related to an RGB signal based on the acquired color difference.
8. The electronic device of claim 1, wherein the processor is configured to:
in response to identifying that a predetermined object related to the screen member is included in the first photographed image, acquire the transformation information based on the first photographed image, and
in response to identifying that the predetermined object related to the screen member is not included in the first photographed image, control the projection member to project a user interface (UI) including information indicating that the screen member is not recognized.
9. The electronic device of claim 1, wherein the first pattern image comprises at least one of a white pattern image, a red pattern image, a green pattern image, or a blue pattern image, and
the second pattern image includes a white pattern image.
10. The electronic device of claim 9, wherein the processor is configured to:
control the projection member to first project a white pattern image among a plurality of pattern images included in the first pattern image, and to sequentially project the remaining pattern images.
11. A method of controlling an electronic device that stores a first pattern image and a second pattern image and communicates with an external terminal device, the method comprising:
projecting the first pattern image onto a screen member located on a projection surface;
in response to receiving, from the external terminal device, a first photographed image capturing the screen member, acquiring transformation information based on the first photographed image and the first pattern image;
projecting the second pattern image onto the projection surface; and
in response to receiving, from the external terminal device, a second photographed image capturing the projection surface, performing color calibration according to characteristics of the projection surface based on the second photographed image, the second pattern image, and the transformation information.
12. The method of claim 11, wherein the transformation information comprises color space transformation information, and
wherein acquiring the transformation information comprises:
acquiring color space information corresponding to the first pattern image; and
acquiring the color space transformation information according to characteristics of a projection member included in the electronic device, based on the first photographed image and the color space information corresponding to the first pattern image.
13. The method of claim 12, wherein performing the color calibration comprises:
acquiring color space information corresponding to the second pattern image; and
performing the color calibration according to the characteristics of the projection surface, based on the second photographed image, the color space information corresponding to the second pattern image, and the color space transformation information.
14. The method of claim 12, wherein acquiring the transformation information comprises:
acquiring the color space transformation information based on red, green, and blue (RGB) information corresponding to the first photographed image and XYZ color space information corresponding to the first pattern image.
15. The method according to claim 11,
wherein the transformation information is a color space transformation matrix that transforms RGB information into XYZ color space information.

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2021-0052220 2021-04-22
KR10-2021-0144246 2021-10-27
KR1020210144246A KR20220145743A (en) 2021-04-22 2021-10-27 Electronic apparatus and controlling method thereof
PCT/KR2022/000018 WO2022225138A1 (en) 2021-04-22 2022-01-03 Electronic device and control method therefor

Publications (1)

Publication Number Publication Date
CN116711300A true CN116711300A (en) 2023-09-05

Family

ID=87832606

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280008498.9A Pending CN116711300A (en) 2021-04-22 2022-01-03 Electronic device and control method thereof

Country Status (1)

Country Link
CN (1) CN116711300A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination