US8803903B2 - Color space determination devices and display devices and systems including the same - Google Patents

Info

Publication number
US8803903B2
US8803903B2 (application US13/419,085)
Authority
US
United States
Prior art keywords
video data
color space
input video
sampling
sampled
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/419,085
Other versions
US20120236205A1 (en)
Inventor
Sang-hyun Lee
Hong-Mi Choi
Taek-kyun Shin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. Assignment of assignors' interest (see document for details). Assignors: SHIN, TAEK-KYUN; CHOI, HONG-MI; LEE, SANG-HYUN
Publication of US20120236205A1
Application granted
Publication of US8803903B2

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/02: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/64: Circuits for processing colour signals
    • H04N 9/67: Circuits for processing colour signals for matrixing
    • H04N 1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/46: Colour picture communication systems

Definitions

  • FIG. 1 is a block diagram illustrating a color space determination device according to example embodiments.
  • a color space determination device 100 may include a sampling configuration unit 110 and/or a determination unit 120 .
  • the sampling configuration unit 110 may receive a resolution RES of input video data I_DATA from outside.
  • the sampling configuration unit 110 may determine a sampling ratio S_RATIO and/or a sampling number S_FNUM based on the resolution RES of the input video data I_DATA.
  • the sampling ratio S_RATIO may be a ratio of a number of the input video data I_DATA to be sampled to a total number of the input video data I_DATA included in a frame.
  • the sampling number S_FNUM may be a number of frames of the input video data I_DATA to be sampled.
  • the determination unit 120 may receive the input video data I_DATA in units of frames from outside, and/or may receive the sampling ratio S_RATIO and/or the sampling number S_FNUM from the sampling configuration unit 110.
  • the determination unit 120 may sample the input video data I_DATA with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames to generate sampled input video data for each of the sampling number S_FNUM of frames.
  • the determination unit 120 may determine a color space of the input video data I_DATA and/or may generate a color space signal CS representing the color space of the input video data I_DATA based on the sampled input video data that are sampled from the sampling number S_FNUM of frames of the input video data I_DATA.
  • the determination unit 120 may determine the color space of the input video data I_DATA as one of a wide color space and a narrow color space. Therefore, the color space signal CS may represent one of a wide color space and a narrow color space. A range of a value that each of the input video data I_DATA may have is relatively wide when the input video data I_DATA has a wide color space, and a range of a value that each of the input video data I_DATA may have is relatively narrow when the input video data I_DATA has a narrow color space.
  • an image processor may receive input video data stored in a storage device or received from a communication device, and/or may generate output video data by converting a format of the input video data into a RGB format, so that a display unit displays the output video data.
  • the image processor may be required to generate the output video data having a wide color space when the input video data has a wide color space, and/or may be required to generate the output video data having a narrow color space when the input video data has a narrow color space.
  • a conventional display device may require a user to select one of a wide color space and a narrow color space which will be used by an image processor in converting the input video data into the output video data.
  • a user may not be able to select the correct color space since the user usually does not know the color space of the input video data.
  • the color space determination device 100 may determine a color space of the input video data I_DATA based on the resolution RES of the input video data I_DATA and/or may generate the color space signal CS representing the color space of the input video data I_DATA during operation.
  • the color space determination device 100 may provide the color space signal CS to an image processor, so that the image processor may generate output video data having a same color space as a color space of the input video data I_DATA by converting a format of the input video data I_DATA into a RGB format based on the color space signal CS received from the color space determination device 100 .
  • if the color space determination device 100 were to determine the color space of the input video data I_DATA by sampling all of the input video data I_DATA included in a frame, the time required for the color space determination device 100 to analyze one frame would increase, and the color space determination device 100 could slow down an overall operation speed of a display device including the color space determination device 100. Therefore, the color space determination device 100 may sample only a part of the input video data I_DATA among all the input video data I_DATA included in a frame according to the sampling ratio S_RATIO to determine the color space of the input video data I_DATA.
  • the sampling configuration unit 110 may determine the sampling ratio S_RATIO, which represents a ratio of a number of the input video data I_DATA to be sampled to a total number of the input video data I_DATA included in a frame, and/or the sampling number S_FNUM, which represents a number of frames of the input video data I_DATA to be sampled, based on the resolution RES of the input video data I_DATA.
  • the determination unit 120 may sample the input video data I_DATA with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames, may determine the color space of the input video data I_DATA, and may generate the color space signal CS based on the sampled input video data that are sampled from the sampling number S_FNUM of frames of the input video data I_DATA. As such, the color space determination device 100 may decrease a time required to analyze one frame such that the color space determination device 100 may not influence an overall operation speed of a display device including the color space determination device 100 .
  • the color space determination device 100 may decrease the sampling ratio S_RATIO and increase the sampling number S_FNUM as the resolution RES of the input video data I_DATA increases.
  • the color space determination device 100 may increase the sampling ratio S_RATIO and decrease the sampling number S_FNUM as the resolution RES of the input video data I_DATA decreases. As such, the color space determination device 100 may control a time required to analyze one frame.
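  • As a rough editorial illustration of this trade-off (the pixel counts below are assumptions based on common display resolutions, not values taken from the patent), the number of samples examined per frame grows far more slowly than the raw pixel count when the sampling ratio is reduced as the resolution increases:

```python
# Samples examined per frame = sampling ratio x pixels per frame.
# Example ratios follow the table of FIG. 2; the resolutions are assumed.
for name, pixels, ratio in [("sub-VGA", 320 * 240, 0.05),
                            ("SVGA", 800 * 600, 0.03),
                            ("Full HD", 1920 * 1080, 0.01)]:
    print(name, round(pixels * ratio))  # -> 3840, 14400, 20736
```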
  • the sampling configuration unit 110 may include a table 111 that has entries each of which stores a sampling ratio, a sampling number, and a resolution. One or more of the sampling ratio, sampling number, and resolution may be predetermined.
  • FIG. 2 is a diagram illustrating an example of a table included in a sampling configuration unit of FIG. 1 .
  • the table 111 may include a resolution field, a sampling ratio field, and/or a sampling number field.
  • the resolution field may store a resolution (that may or may not be predetermined). As illustrated in FIG. 2 , the resolution field may store a standard resolution of video data.
  • the sampling ratio field may store a sampling ratio (that may or may not be predetermined) that may be selected as the sampling ratio S_RATIO when a resolution (that may or may not be predetermined) stored in the same entry is equal to the resolution RES of the input video data I_DATA.
  • the sampling number field may store a sampling number (that may or may not be predetermined) that may be selected as the sampling number S_FNUM when a resolution (that may or may not be predetermined) stored in the same entry is equal to the resolution RES of the input video data I_DATA.
  • the sampling configuration unit 110 may read the sampling ratio (that may or may not be predetermined) and/or the sampling number (that may or may not be predetermined) from an entry of the table 111 that stores the resolution (that may or may not be predetermined) corresponding to the resolution RES of the input video data I_DATA, and may output the sampling ratio (that may or may not be predetermined) and/or the sampling number (that may or may not be predetermined) as the sampling ratio S_RATIO and the sampling number S_FNUM, respectively.
  • the sampling configuration unit 110 may determine the sampling ratio S_RATIO as 5% and/or may determine the sampling number S_FNUM as 20 when the resolution RES of the input video data I_DATA is less than VGA (Video Graphics Array).
  • the sampling configuration unit 110 may determine the sampling ratio S_RATIO as 3% and/or may determine the sampling number S_FNUM as 33 when the resolution RES of the input video data I_DATA is equal to or greater than VGA (Video Graphics Array) and less than HD (High Definition).
  • the sampling configuration unit 110 may determine the sampling ratio S_RATIO as 1% and/or may determine the sampling number S_FNUM as 100 when the resolution RES of the input video data I_DATA is equal to or greater than HD (High Definition).
  • the sampling ratio (that may or may not be predetermined) and the sampling number (that may or may not be predetermined) according to the resolution (that may or may not be predetermined) in FIG. 2 serve only as an example.
  • the table 111 may store other values for the sampling ratio (that may or may not be predetermined) and/or the sampling number (that may or may not be predetermined) according to the resolution (that may or may not be predetermined).
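  • As an editorial illustration (not part of the original patent text), the resolution-to-configuration lookup of FIG. 2 might be sketched as follows; the Python function name and the pixel-count boundaries assumed here for VGA and HD are hypothetical, while the ratio/number pairs follow the example values above:

```python
# Hypothetical sketch of the sampling-configuration lookup of FIG. 2.
# A resolution is expressed here as the number of pixels per frame.
VGA_PIXELS = 640 * 480    # assumed boundary for "VGA"
HD_PIXELS = 1280 * 720    # assumed boundary for "HD"

def sampling_configuration(resolution_pixels):
    """Return (sampling_ratio, sampling_number) for a frame resolution:
    a lower ratio but more frames as the resolution grows."""
    if resolution_pixels < VGA_PIXELS:
        return 0.05, 20    # sample 5% of a frame, over 20 frames
    elif resolution_pixels < HD_PIXELS:
        return 0.03, 33    # sample 3% of a frame, over 33 frames
    return 0.01, 100       # sample 1% of a frame, over 100 frames

# Example: a 1920x1080 input would be sampled at 1% over 100 frames.
print(sampling_configuration(1920 * 1080))  # -> (0.01, 100)
```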
  • Images of consecutive frames may be similar to each other in general video data. Therefore, if the determination unit 120 samples the input video data I_DATA at the same locations in a frame for each of the sampling number S_FNUM of frames, the possibility of determining the color space of the input video data I_DATA incorrectly may increase. Accordingly, the determination unit 120 may sample the input video data I_DATA at different locations in a frame for each of the sampling number S_FNUM of frames.
  • the determination unit 120 may determine the color space of the input video data I_DATA and/or may generate the color space signal CS representing the color space of the input video data I_DATA based on a number of the sampled input video data having a value used in a wide color space and not used in a narrow color space.
  • FIG. 3 is a block diagram illustrating an example embodiment of a determination unit included in a color space determination device of FIG. 1 .
  • a determination unit 120a may include a sampling unit 121, an examination unit 123, and/or a comparison unit 125.
  • the sampling unit 121 may receive the input video data I_DATA in units of frames from outside, and/or may receive the sampling ratio S_RATIO from the sampling configuration unit 110.
  • the sampling unit 121 may sample the input video data I_DATA with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames, and/or may output the sampled input video data S_DATA for each of the sampling number S_FNUM of frames.
  • the sampling unit 121 may sample the input video data I_DATA at different locations in a frame for each of the sampling number S_FNUM of frames.
  • FIG. 4 is a diagram for describing an operation of a sampling unit included in a determination unit of FIG. 3 .
  • assume that the sampling number S_FNUM is N, where N is a positive integer; the N consecutive frames are illustrated in FIG. 4.
  • the sampling unit 121 may sample the input video data I_DATA included in a sampling region SA of a frame for each of the sampling number S_FNUM of frames.
  • the sampling region SA may have an area corresponding to the sampling ratio S_RATIO.
  • the sampling regions SA of the sampling number S_FNUM of frames may be different from each other.
  • the sampling unit 121 may sample the input video data I_DATA included in the sampling region SA of a frame for each of the sampling number S_FNUM of frames while the sampling unit 121 moves the sampling region SA to an adjacent region frame by frame.
  • the sampling region SA of a current frame may overlap with the sampling region SA of a previous frame.
  • the sampling region SA of a frame may be randomly determined for each frame.
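  • The moving sampling region of FIG. 4 could be sketched as follows; this is an editorial illustration under assumptions (row-band shaped regions, hypothetical function names), not the patented implementation:

```python
def sampling_region_rows(frame_index, frame_height, sampling_ratio):
    """Pick a horizontal band of rows whose area matches the sampling ratio,
    moving to an adjacent band on each successive frame."""
    band_height = max(1, int(frame_height * sampling_ratio))
    num_bands = max(1, frame_height // band_height)
    start = (frame_index % num_bands) * band_height
    return range(start, min(start + band_height, frame_height))

def sample_frame(frame, frame_index, sampling_ratio):
    """Return the pixel component values inside this frame's sampling region.
    `frame` is a list of rows, each row a list of 8-bit component values."""
    rows = sampling_region_rows(frame_index, len(frame), sampling_ratio)
    return [value for r in rows for value in frame[r]]
```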
  • the examination unit 123 may generate a wide number W_NUM by accumulatively counting a number of the sampled input video data S_DATA having a value used in a wide color space and not used in a narrow color space among the sampled input video data S_DATA sampled from the sampling number S_FNUM of frames of the input video data I_DATA.
  • the comparison unit 125 may generate the color space signal CS by comparing the wide number W_NUM with a threshold number W_TH.
  • when the input video data I_DATA has a wide color space, each of the input video data I_DATA may have a value from 0 to 255.
  • when the input video data I_DATA has a narrow color space, each of the input video data I_DATA may have a value from 16 to 235. Therefore, if input video data I_DATA having a value from 0 to 15 or from 236 to 255 exist, the color space of the input video data I_DATA may be determined as a wide color space. Alternatively, if input video data I_DATA having a value from 0 to 15 or from 236 to 255 do not exist, the color space of the input video data I_DATA may be determined as a narrow color space. However, even if the input video data I_DATA has a narrow color space, input video data I_DATA having a value from 0 to 15 or from 236 to 255 may exist because of noise, operational error, etc.
  • the examination unit 123 may generate the wide number W_NUM by accumulatively counting a number of the sampled input video data S_DATA having a value from 0 to 15 or from 236 to 255 among the sampled input video data S_DATA sampled from the sampling number S_FNUM of frames of the input video data I_DATA.
  • the comparison unit 125 may generate the color space signal CS by comparing the wide number W_NUM with the threshold number W_TH. For example, the comparison unit 125 may determine the color space of the input video data I_DATA as a narrow color space and/or may generate the color space signal CS having a first logic level when the wide number W_NUM is less than the threshold number W_TH.
  • the comparison unit 125 may determine the color space of the input video data I_DATA as a wide color space and/or may generate the color space signal CS having a second logic level when the wide number W_NUM is equal to or greater than the threshold number W_TH. As such, the color space determination device 100 may reduce a possibility of determining the color space of the input video data I_DATA incorrectly.
  • the examination unit 123 may receive the sampling number S_FNUM from the sampling configuration unit 110 .
  • the examination unit 123 may reset the wide number W_NUM after receiving the sampled input video data S_DATA from the sampling unit 121 the sampling number S_FNUM of times, so that the examination unit 123 may accumulatively count the number of the sampled input video data S_DATA having a value used in a wide color space and not used in a narrow color space only for the sampling number S_FNUM of frames.
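  • A minimal sketch of the examination unit 123 and the comparison unit 125 is given below; the value ranges (0 to 15 and 236 to 255), the accumulate-and-compare behaviour, and the reset after the sampling number S_FNUM of frames follow the text above, while the class and constant names are hypothetical:

```python
NARROW_COLOR_SPACE = 0  # first logic level of the color space signal CS
WIDE_COLOR_SPACE = 1    # second logic level of the color space signal CS

def is_wide_only_value(value):
    """True for 8-bit values used in a wide color space but not in a
    narrow (16..235) color space."""
    return value <= 15 or value >= 236

class ExaminationAndComparison:
    def __init__(self, sampling_number, threshold):
        self.sampling_number = sampling_number  # S_FNUM
        self.threshold = threshold              # W_TH
        self.wide_number = 0                    # W_NUM
        self.frames_seen = 0

    def examine_frame(self, sampled_values):
        """Accumulate the count of wide-only values over S_FNUM frames,
        then emit the color space signal and reset the counter."""
        self.wide_number += sum(1 for v in sampled_values if is_wide_only_value(v))
        self.frames_seen += 1
        if self.frames_seen < self.sampling_number:
            return None  # not enough frames examined yet
        cs = WIDE_COLOR_SPACE if self.wide_number >= self.threshold else NARROW_COLOR_SPACE
        self.wide_number = 0
        self.frames_seen = 0
        return cs
```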
  • FIG. 5 is a block diagram illustrating another example embodiment of a determination unit included in a color space determination device of FIG. 1 .
  • a determination unit 120b may include a sampling unit 121, an examination unit 123, a comparison unit 125, and/or a multiplexer 127.
  • the determination unit 120b of FIG. 5 may be substantially the same as the determination unit 120a of FIG. 3 except for the multiplexer 127. Therefore, a detailed description of the sampling unit 121, the examination unit 123, and/or the comparison unit 125 of FIG. 5 will be omitted.
  • the multiplexer 127 may receive the color space signal CS from the comparison unit 125 , and/or may receive an enable signal ENABLE and/or a default color space signal D_CS from outside.
  • the default color space signal D_CS may represent one of a wide color space and a narrow color space.
  • the multiplexer 127 may output one of the color space signal CS and the default color space signal D_CS in response to the enable signal ENABLE. For example, the multiplexer 127 may output the color space signal CS received from the comparison unit 125 when the enable signal ENABLE has a logic high level, and/or may output the default color space signal D_CS received from outside when the enable signal ENABLE has a logic low level.
  • the determination unit 120 b may output the color space signal CS representing the color space of the input video data I_DATA determined by an operation described above when the determination unit 120 b receives the enable signal ENABLE having a logic high level from outside, and/or may output the default color space signal D_CS (that may or may not be predetermined) as representing one of a wide color space and a narrow color space regardless of the input video data I_DATA when the determination unit 120 b receives the enable signal ENABLE having a logic low level from outside.
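  • The multiplexer 127 simply selects between the computed signal and the default; a short editorial sketch with hypothetical names might be:

```python
def select_color_space(enable, computed_cs, default_cs):
    """Return the computed color space signal CS when ENABLE is high,
    otherwise the externally supplied default D_CS."""
    return computed_cs if enable else default_cs
```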
  • FIG. 6 is a flow chart illustrating a method of determining a color space according to example embodiments.
  • the method of determining a color space may be performed by the color space determination device 100 of FIG. 1 .
  • a sampling ratio S_RATIO and/or a sampling number S_FNUM may be determined based on a resolution RES of input video data I_DATA (step S100).
  • the sampling ratio S_RATIO may be a ratio of a number of the input video data I_DATA to be sampled to a total number of the input video data I_DATA included in a frame.
  • the sampling number S_FNUM may be a number of frames of the input video data I_DATA to be sampled.
  • the input video data I_DATA may be received in units of frames, the input video data I_DATA may be sampled with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames, and/or a color space signal CS representing the color space of the input video data I_DATA may be generated based on the sampled input video data that are sampled from the sampling number S_FNUM of frames of the input video data I_DATA (step S200).
  • the color space of the input video data I_DATA may be determined as one of a wide color space and a narrow color space, and/or the color space signal CS may represent one of a wide color space and a narrow color space.
  • FIG. 7 is a flow chart illustrating an example embodiment of a step of generating a color space signal included in a method of determining a color space of FIG. 6 .
  • the input video data I_DATA may be received in units of frames, the input video data I_DATA may be sampled with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames, and/or the sampled input video data S_DATA for each of the sampling number S_FNUM of frames may be output (step S210).
  • a wide number W_NUM may be generated by accumulatively counting a number of the sampled input video data S_DATA having a value used in a wide color space and not used in a narrow color space among the sampled input video data S_DATA sampled from the sampling number S_FNUM of frames of the input video data I_DATA (step S220).
  • the color space signal CS may be generated by comparing the wide number W_NUM with a threshold number W_TH (step S230).
  • the wide number W_NUM may be generated by accumulatively counting a number of the sampled input video data S_DATA having a value from 0 to 15 or from 236 to 255 among the sampled input video data S_DATA sampled from the sampling number S_FNUM of frames of the input video data I_DATA.
  • the color space of the input video data I_DATA may be determined as a narrow color space and the color space signal CS having a first logic level may be generated when the wide number W_NUM is less than the threshold number W_TH.
  • the color space of the input video data I_DATA may be determined as a wide color space and the color space signal CS having a second logic level may be generated when the wide number W_NUM is equal to or greater than the threshold number W_TH.
  • if the color space of the input video data I_DATA were determined by sampling all of the input video data I_DATA included in a frame, the time required to analyze one frame would increase, such that an overall operation speed of a display device in which the method of determining a color space is performed could be slowed down. Therefore, in the method of determining a color space according to example embodiments, only a part of the input video data I_DATA among all the input video data I_DATA included in a frame may be sampled with the sampling ratio S_RATIO to determine the color space of the input video data I_DATA.
  • the method of determining a color space may decrease a time required to analyze one frame such that the method of determining a color space may not influence an overall operation speed of the display device.
  • the sampling ratio S_RATIO may be decreased and the sampling number S_FNUM may be increased as the resolution RES of the input video data I_DATA increases.
  • the sampling ratio S_RATIO may be increased and the sampling number S_FNUM may be decreased as the resolution RES of the input video data I_DATA decreases.
  • the method of determining a color space may control a time required to analyze one frame.
  • the input video data I_DATA at different locations in a frame for each of the sampling number S_FNUM of frames may be sampled to decrease a possibility of determining the color space of the input video data I_DATA incorrectly.
  • an enable signal ENABLE and/or a default color space signal D_CS representing one of a wide color space and a narrow color space may be received from outside, and one of the color space signal CS and the default color space signal D_CS may be output in response to the enable signal ENABLE (step S240).
  • for example, the color space signal CS may be output when the enable signal ENABLE has a logic high level, and the default color space signal D_CS may be output when the enable signal ENABLE has a logic low level.
  • the color space signal CS representing the color space of the input video data I_DATA determined by an operation described above may be output when the enable signal ENABLE having a logic high level is received from outside, and the default color space signal D_CS (that may or may not be predetermined) as representing one of a wide color space and a narrow color space regardless of the input video data I_DATA may be output when the enable signal ENABLE having a logic low level is received from outside.
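  • Putting the steps of FIGS. 6 and 7 together, one possible end-to-end sketch is shown below; it reuses the hypothetical helpers defined in the earlier illustrations (sampling_configuration, sample_frame, ExaminationAndComparison, select_color_space) and is an editorial illustration rather than the patented method itself:

```python
def determine_color_space(frames, resolution_pixels, threshold,
                          enable=True, default_cs=NARROW_COLOR_SPACE):
    """End-to-end sketch of FIGS. 6-7: configure sampling (S100), sample each
    frame (S210), accumulate and compare (S220-S230), select the output (S240)."""
    ratio, number = sampling_configuration(resolution_pixels)  # step S100
    unit = ExaminationAndComparison(number, threshold)
    cs = None
    for index, frame in enumerate(frames):
        sampled = sample_frame(frame, index, ratio)            # step S210
        result = unit.examine_frame(sampled)                   # steps S220-S230
        if result is not None:
            cs = result
            break
    if cs is None:       # fewer than S_FNUM frames were available
        cs = default_cs
    return select_color_space(enable, cs, default_cs)          # step S240
```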
  • FIG. 8 is a block diagram illustrating a display device according to example embodiments.
  • a display device 200 may include a color space determination device 210, an image processor 220, and/or a display unit 230.
  • the color space determination device 210 may generate a color space signal CS representing a color space of input video data I_DATA based on a resolution RES of the input video data I_DATA.
  • the color space determination device 210 of FIG. 8 may be embodied with the color space determination device 100 of FIG. 1 .
  • a structure and an operation of the color space determination device 100 of FIG. 1 are described above with reference to FIGS. 1 to 5. Therefore, a detailed description of the color space determination device 210 of FIG. 8 will be omitted.
  • the image processor 220 may receive the input video data I_DATA and/or may generate output video data O_DATA by converting a format of the input video data I_DATA based on the color space signal CS received from the color space determination device 210 .
  • the image processor 220 may generate the output video data O_DATA by converting a format of the input video data I_DATA into a RGB format.
  • the image processor 220 may convert a format of the input video data I_DATA such that the output video data O_DATA may have a wide color space when the color space signal CS represents a wide color space, and may convert a format of the input video data I_DATA such that the output video data O_DATA may have a narrow color space when the color space signal CS represents a narrow color space.
  • the image processor 220 may store a first conversion coefficient corresponding to a wide color space and a second conversion coefficient corresponding to a narrow color space, and/or may generate the output video data O_DATA by converting a format of the input video data I_DATA into a RGB format using one of the first conversion coefficient and the second conversion coefficient selected in response to the color space signal CS received from the color space determination device 210 .
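  • As an editorial illustration of storing two conversion coefficient sets and selecting one with the color space signal, a sketch is given below; it assumes, for concreteness, that the non-RGB input format is YCbCr, and the BT.601-style full-range ("wide") and limited-range ("narrow") coefficients are common example values assumed here, not coefficients disclosed in the patent:

```python
# Hypothetical image-processor sketch: YCbCr -> RGB with a coefficient set
# chosen by the color space signal (True = wide/full range, False = narrow).
WIDE_COEFFS = {"y_offset": 0.0, "y_gain": 1.0, "r_cr": 1.402,
               "g_cb": 0.344, "g_cr": 0.714, "b_cb": 1.772}
NARROW_COEFFS = {"y_offset": 16.0, "y_gain": 1.164, "r_cr": 1.596,
                 "g_cb": 0.392, "g_cr": 0.813, "b_cb": 2.017}

def ycbcr_to_rgb(y, cb, cr, wide_color_space):
    """Convert one YCbCr sample to 8-bit RGB using the selected coefficients."""
    c = WIDE_COEFFS if wide_color_space else NARROW_COEFFS
    yv = c["y_gain"] * (y - c["y_offset"])
    r = yv + c["r_cr"] * (cr - 128)
    g = yv - c["g_cb"] * (cb - 128) - c["g_cr"] * (cr - 128)
    b = yv + c["b_cb"] * (cb - 128)
    clamp = lambda v: max(0, min(255, int(round(v))))
    return clamp(r), clamp(g), clamp(b)
```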
  • the display unit 230 may receive the output video data O_DATA from the image processor 220 and/or may display the output video data O_DATA.
  • the display device 200 may determine the color space of the input video data I_DATA and/or may generate the output video data O_DATA by converting a format of the input video data I_DATA based on the determined color space of the input video data I_DATA. Therefore, the display device 200 according to example embodiments may increase a performance of a color reproduction.
  • FIG. 9 is a block diagram illustrating a system according to example embodiments.
  • a system 300 may include a processor 310, a display device 320, and/or a storage device 330.
  • the storage device 330 may store video data.
  • the storage device 330 may include a solid state drive (SSD), a hard disk drive (HDD), a compact disk read-only memory (CD-ROM) drive, etc.
  • the processor 310 may read the video data stored in the storage device 330 and/or may provide the read video data to the display device 320 as input video data.
  • the display device 320 may determine a color space of the input video data, may generate output video data by converting a format of the input video data into a RGB format based on the determined color space of the input video data, and/or may display the output video data.
  • the display device 320 may include a color space determination device 321 , an image processor 323 , and/or a display unit 325 .
  • the color space determination device 321 may generate a color space signal CS representing a color space of the input video data based on a resolution of the input video data.
  • the image processor 323 may receive the input video data and/or may generate the output video data by converting a format of the input video data into a RGB format based on the color space signal CS received from the color space determination device 321 .
  • the display unit 325 may receive the output video data from the image processor 323 and/or may display the output video data reflected, for example, in an output video signal.
  • the display device 320 of FIG. 9 may be embodied with the display device 200 of FIG. 8 . Therefore, the color space determination device 321 , the image processor 323 , and/or the display unit 325 included in the display device 320 of FIG. 9 may be embodied with the color space determination device 210 , the image processor 220 , and/or the display unit 230 included in the display device 200 of FIG. 8 , respectively.
  • a structure and an operation of the display device 200 of FIG. 8 are described above with reference to FIGS. 1 to 5 and 8. Therefore, a detailed description of the display device 320 of FIG. 9 will be omitted.
  • the processor 310 may perform various computing functions, such as executing specific software for performing specific calculations or tasks.
  • the processor 310 may be a microprocessor or a central processing unit.
  • the processor 310 may be connected to the display device 320 and the storage device 330 via buses, such as an address bus, a control bus, a data bus, etc.
  • the processor 310 may be connected to an extended bus, such as a peripheral component interconnect (PCI) bus.
  • the processor 310 may be embodied as a single core architecture or a multi core architecture.
  • the processor 310 may be embodied as a single core architecture when an operating frequency of the processor 310 is less than 1 GHz, and/or the processor 310 may be embodied as a multi core architecture when an operating frequency of the processor 310 is greater than 1 GHz.
  • the processor 310 that is embodied as a multi core architecture may communicate with peripheral devices via an advanced extensible interface (AXI) bus.
  • the system 300 may further include a camera 340 , a memory device 350 , a user interface 360 , and/or an input/output device 370 .
  • the system 300 may further include ports to communicate with a video card, a sound card, a memory card, a universal serial bus (USB) device, etc.
  • the camera 340 may generate video data by capturing images.
  • the processor 310 may provide the input video data to the display device 320 by reading the video data generated by the camera 340 as well as by reading the video data stored in the storage device 330 .
  • the memory device 350 may store data required for an operation of the system 300 .
  • the memory device 350 may be a volatile memory such as a dynamic random access memory (DRAM), a static random access memory (SRAM), etc., or a non-volatile memory such as an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a flash memory, etc.
  • the user interface 360 may include devices required for a user to control the system 300 .
  • the input/output device 370 may include an input device (e.g., a keyboard or a mouse), an output device (e.g., a printer), etc.
  • the system 300 may be a mobile device, a smart phone, a cellular phone, a desktop computer, a laptop computer, a work station, a handheld device, a digital camera, or the like.

Abstract

A color space determination device may include a sampling configuration unit and determination unit. The sampling configuration unit may be configured to determine a sampling ratio and a sampling number based on a resolution of input video data, the sampling ratio being a ratio of a number of the input video data to be sampled to a total number of the input video data included in a frame, the sampling number being a number of frames to be sampled. The determination unit may be configured to receive the input video data in units of frames, to sample the input video data with the sampling ratio for each of the sampling number of frames, and to generate a color space signal representing a color space of the input video data based on the sampled input video data that are sampled from the sampling number of frames of the input video data.

Description

CROSS-REFERENCE TO RELATED APPLICATION(S)
This application claims priority from Korean Patent Application No. 10-2011-0023135, filed on Mar. 16, 2011, in the Korean Intellectual Property Office (KIPO), the entire contents of which are incorporated herein by reference.
BACKGROUND
1. Technical Field
Example embodiments may relate to image processing. More particularly, example embodiments may relate to color space determination devices and display devices including the color space determination devices.
2. Description of the Related Art
Generally, video data may be classified as video data having a wide color space and video data having a narrow color space according to a range of a value that each of video data might have.
An image processor may receive input video data stored in a storage device or received from a communication device, and may generate output video data by converting a format of the input video data into a red, green, blue (RGB) format. To increase a performance of a color reproduction, the image processor may be required to generate the output video data having a wide color space when the input video data has a wide color space, and may be required to generate the output video data having a narrow color space when the input video data has a narrow color space.
SUMMARY
Example embodiments may be directed to color space determination devices able to determine a color space of input video data.
Example embodiments also may be directed to display devices including the color space determination devices.
According to example embodiments, a color space determination device may include a sampling configuration unit and/or a determination unit. The sampling configuration unit may determine a sampling ratio and/or a sampling number based on a resolution of input video data. The sampling ratio may be a ratio of a number of the input video data to be sampled to a total number of the input video data included in a frame. The sampling number may be a number of frames to be sampled. The determination unit may receive the input video data in units of frames (e.g., frame by frame), may sample the input video data with the sampling ratio for each of the sampling number of frames, and/or may generate a color space signal representing a color space of the input video data based on the sampled input video data that are sampled from the sampling number of frames of the input video data.
According to example embodiments, the sampling configuration unit may decrease the sampling ratio and/or increase the sampling number as the resolution of the input video data increases.
According to example embodiments, the sampling configuration unit may include a table having entries, each of which may store a sampling ratio, a sampling number, and/or a resolution. One or more of the sampling ratio, sampling number, and resolution may be predetermined.
The sampling configuration unit may read the stored sampling ratio and/or the stored sampling number from an entry of the table whose stored resolution corresponds to the resolution of the input video data, and may output them as the sampling ratio and the sampling number, respectively.
According to example embodiments, the determination unit may sample the input video data at different locations in a frame for each of the sampling number of frames.
According to example embodiments, the determination unit may generate the color space signal based on a number of the sampled input video data having a value used in a wide color space and not used in a narrow color space.
According to example embodiments, the determination unit may include a sampling unit, an examination unit, and/or a comparison unit. The sampling unit may be configured to receive the input video data in units of frames (e.g., frame by frame), to sample the input video data with the sampling ratio for each of the sampling number of frames, and/or to output the sampled input video data for each of the sampling number of frames. The examination unit may be configured to generate a wide number by accumulatively counting a number of the sampled input video data having a value used in a wide color space and not used in a narrow color space among the sampled input video data sampled from the sampling number of frames of the input video data. The comparison unit may be configured to generate the color space signal by comparing the wide number with a threshold number.
The sampling unit may sample the input video data included in a sampling region of a frame, which has an area corresponding to the sampling ratio, for each of the sampling number of frames, where the sampling regions of the sampling number of frames are different from each other.
The examination unit may generate the wide number by accumulatively counting a number of the sampled input video data having a value from 0 to 15 or from 236 to 255 among the sampled input video data sampled from the sampling number of frames of the input video data.
The examination unit may receive the sampling number from the sampling configuration unit, and/or may reset the wide number after receiving the sampled input video data from the sampling unit the sampling number of times.
The comparison unit may determine the color space of the input video data as a narrow color space and/or may generate the color space signal having a first logic level when the wide number is less than the threshold number, and/or may determine the color space of the input video data as a wide color space and/or may generate the color space signal having a second logic level when the wide number is equal to or greater than the threshold number.
The determination unit may further include a multiplexer configured to output one of the color space signal received from the comparison unit and a default color space signal representing one of a wide color space and a narrow color space in response to an enable signal.
The multiplexer may output the color space signal received from the comparison unit when the enable signal has a logic high level, and/or may output the default color space signal when the enable signal has a logic low level.
According to example embodiments, a display device may include a color space determination device, an image processor, and/or a display unit. The color space determination device may generate a color space signal representing a color space of input video data based on a resolution of the input video data. The image processor may receive the input video data, and/or may generate output video data by converting a format of the input video data based on the color space signal. The display unit may display the output video data reflected, for example, in an output video signal. The color space determination device may include a sampling configuration unit and/or a determination unit. The sampling configuration unit may determine a sampling ratio and/or a sampling number based on the resolution of the input video data, where the sampling ratio may be a ratio of a number of the input video data to be sampled to a total number of the input video data included in a frame, and/or the sampling number may be a number of frames to be sampled. The determination unit may receive the input video data in units of frames, may sample the input video data with the sampling ratio for each of the sampling number of frames, and/or may generate the color space signal based on the sampled input video data that are sampled from the sampling number of frames of the input video data.
According to example embodiments, the image processor may store a first conversion coefficient corresponding to a wide color space and/or a second conversion coefficient corresponding to a narrow color space, and/or may generate the output video data by converting a format of the input video data into a RGB format using one of the first conversion coefficient and the second conversion coefficient selected in response to the color space signal.
According to example embodiments, a determination unit may include a sampling unit configured to receive input video data in units of frames, to sample the input video data with a sampling ratio for each of a sampling number of frames, and/or to output the sampled input video data for each of the sampling number of frames; an examination unit configured to generate a wide number by accumulatively counting a number of the sampled input video data having a value used in a wide color space and not used in a narrow color space among the sampled input video data sampled from the sampling number of frames of the input video data; and/or a comparison unit configured to generate a color space signal by comparing the wide number with a threshold number.
According to example embodiments, the determination unit may further include a multiplexer configured to output one of the color space signal received from the comparison unit and a default color space signal representing one of the wide color space and the narrow color space in response to an enable signal.
According to example embodiments, a system may include a storage device and/or a display device. The storage device may be configured to store video data. The display device may include a color space determination device, an image processor, and/or a display unit. The color space determination device may be configured to sample the video data. The color space determination device may be configured to generate a color space signal representing a color space of the video data based on the sampled video data. The image processor may be configured to generate output video data based on the sampled video data and the color space signal. The display unit may be configured to display the output video data.
According to example embodiments, the color space signal may correspond to a wide color space.
According to example embodiments, the image processor may generate the output video data by converting a format of the video data into a red, green, blue (RGB) format.
According to example embodiments, the image processor may store a first conversion coefficient corresponding to a wide color space and a second conversion coefficient corresponding to a narrow color space and/or may generate the output video data by converting a format of the video data into a red, green, blue (RGB) format using one of the first conversion coefficient and the second conversion coefficient selected in response to the color space signal.
According to example embodiments, the video data is sampled at different locations for each frame of the video data that is sampled.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and/or other aspects and advantages will become more apparent and more readily appreciated from the following detailed description of example embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram illustrating a color space determination device according to example embodiments;
FIG. 2 is a diagram illustrating an example of a table included in a sampling configuration unit of FIG. 1;
FIG. 3 is a block diagram illustrating an example of a determination unit included in a color space determination device of FIG. 1;
FIG. 4 is a diagram for describing an operation of a sampling unit included in a determination unit of FIG. 3;
FIG. 5 is a block diagram illustrating another example of a determination unit included in a color space determination device of FIG. 1;
FIG. 6 is a flow chart illustrating a method of determining a color space according to example embodiments;
FIG. 7 is a flow chart illustrating an example of a step of generating a color space signal included in a method of determining a color space of FIG. 6;
FIG. 8 is a block diagram illustrating a display device according to example embodiments; and
FIG. 9 is a block diagram illustrating a system according to example embodiments.
DETAILED DESCRIPTION
Example embodiments will now be described more fully with reference to the accompanying drawings. Embodiments, however, may be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope to those skilled in the art. In the drawings, the thicknesses of layers and regions may be exaggerated for clarity.
It will be understood that when an element is referred to as being “on,” “connected to,” “electrically connected to,” or “coupled to” another component, it may be directly on, connected to, electrically connected to, or coupled to the other component or intervening components may be present. In contrast, when a component is referred to as being “directly on,” “directly connected to,” “directly electrically connected to,” or “directly coupled to” another component, there are no intervening components present. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that although the terms first, second, third, etc., may be used herein to describe various elements, components, regions, layers, and/or sections, these elements, components, regions, layers, and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer, and/or section from another element, component, region, layer, and/or section. For example, a first element, component, region, layer, and/or section could be termed a second element, component, region, layer, and/or section without departing from the teachings of example embodiments.
Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper,” and the like may be used herein for ease of description to describe the relationship of one component and/or feature to another component and/or feature, or other component(s) and/or feature(s), as illustrated in the drawings. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and should not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Reference will now be made to example embodiments, which are illustrated in the accompanying drawings, wherein like reference numerals may refer to like components throughout.
FIG. 1 is a block diagram illustrating a color space determination device according to example embodiments.
Referring to FIG. 1, a color space determination device 100 may include a sampling configuration unit 110 and/or a determination unit 120.
The sampling configuration unit 110 may receive a resolution RES of input video data I_DATA from outside. The sampling configuration unit 110 may determine a sampling ratio S_RATIO and/or a sampling number S_FNUM based on the resolution RES of the input video data I_DATA. The sampling ratio S_RATIO may be a ratio of a number of the input video data I_DATA to be sampled to a total number of the input video data I_DATA included in a frame. The sampling number S_FNUM may be a number of frames of the input video data I_DATA to be sampled.
The determination unit 120 may receive the input video data I_DATA in units of frames from outside, and/or may receive the sampling ratio S_RATIO and/or the sampling number S_FNUM from the sampling configuration unit 110. The determination unit 120 may sample the input video data I_DATA with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames to generate sampled input video data for each of the sampling number S_FNUM of frames. The determination unit 120 may determine a color space of the input video data I_DATA and/or may generate a color space signal CS representing the color space of the input video data I_DATA based on the sampled input video data that are sampled from the sampling number S_FNUM of frames of the input video data I_DATA.
For example, the determination unit 120 may determine the color space of the input video data I_DATA as one of a wide color space and a narrow color space. Therefore, the color space signal CS may represent one of a wide color space and a narrow color space. A range of a value that each of the input video data I_DATA may have is relatively wide when the input video data I_DATA has a wide color space, and a range of a value that each of the input video data I_DATA may have is relatively narrow when the input video data I_DATA has a narrow color space.
Generally, an image processor may receive input video data stored in a storage device or received from a communication device, and/or may generate output video data by converting a format of the input video data into an RGB format, so that a display unit displays the output video data. To increase a performance of a color reproduction, the image processor may be required to generate the output video data having a wide color space when the input video data has a wide color space, and/or may be required to generate the output video data having a narrow color space when the input video data has a narrow color space. To achieve this, a conventional display device may require a user to select one of a wide color space and a narrow color space which will be used by an image processor in converting the input video data into the output video data. However, the user may not be able to select a correct color space corresponding to the color space of the input video data since the user usually does not know the color space of the input video data.
By contrast, the color space determination device 100 according to example embodiments may determine a color space of the input video data I_DATA based on the resolution RES of the input video data I_DATA and/or may generate the color space signal CS representing the color space of the input video data I_DATA during operation. As will be described later with reference to FIG. 8, the color space determination device 100 may provide the color space signal CS to an image processor, so that the image processor may generate output video data having a same color space as a color space of the input video data I_DATA by converting a format of the input video data I_DATA into an RGB format based on the color space signal CS received from the color space determination device 100.
If the color space determination device 100 determines the color space of the input video data I_DATA by sampling all the input video data I_DATA included in a frame, a time for the color space determination device 100 to analyze one frame may increase such that the color space determination device 100 may slow down an overall operation speed of a display device including the color space determination device 100. Therefore, the color space determination device 100 may sample only a part of the input video data I_DATA among all the input video data I_DATA included in a frame according to the sampling ratio S_RATIO to determine the color space of the input video data I_DATA. That is, as described above, the sampling configuration unit 110 may determine the sampling ratio S_RATIO, which represents a ratio of a number of the input video data I_DATA to be sampled to a total number of the input video data I_DATA included in a frame, and/or the sampling number S_FNUM, which represents a number of frames of the input video data I_DATA to be sampled, based on the resolution RES of the input video data I_DATA. The determination unit 120 may sample the input video data I_DATA with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames, may determine the color space of the input video data I_DATA, and may generate the color space signal CS based on the sampled input video data that are sampled from the sampling number S_FNUM of frames of the input video data I_DATA. As such, the color space determination device 100 may decrease a time required to analyze one frame such that the color space determination device 100 may not influence an overall operation speed of a display device including the color space determination device 100.
If the color space determination device 100 samples the input video data I_DATA with a fixed sampling ratio regardless of the resolution RES of the input video data I_DATA, a time for the color space determination device 100 to analyze one frame may increase as the resolution RES of the input video data I_DATA increases such that the color space determination device 100 may slow down an overall operation speed of a display device including the color space determination device 100. Therefore, the color space determination device 100 may decrease the sampling ratio S_RATIO and increase the sampling number S_FNUM as the resolution RES of the input video data I_DATA increases. Alternatively, the color space determination device 100 may increase the sampling ratio S_RATIO and decrease the sampling number S_FNUM as the resolution RES of the input video data I_DATA decreases. As such, the color space determination device 100 may control a time required to analyze one frame.
In example embodiments, the sampling configuration unit 110 may include a table 111 that has entries each of which stores a sampling ratio, a sampling number, and a resolution. One or more of the sampling ratio, sampling number, and resolution may be predetermined.
FIG. 2 is a diagram illustrating an example of a table included in a sampling configuration unit of FIG. 1.
Referring to FIG. 2, the table 111 may include a resolution field, a sampling ratio field, and/or a sampling number field.
The resolution field may store a resolution (that may or may not be predetermined). As illustrated in FIG. 2, the resolution field may store a standard resolution of video data. The sampling ratio field may store a sampling ratio (that may or may not be predetermined) that may be selected as the sampling ratio S_RATIO when a resolution (that may or may not be predetermined) stored in the same entry is equal to the resolution RES of the input video data I_DATA. The sampling number field may store a sampling number (that may or may not be predetermined) that may be selected as the sampling number S_FNUM when a resolution (that may or may not be predetermined) stored in the same entry is equal to the resolution RES of the input video data I_DATA.
The sampling configuration unit 110 may read the sampling ratio (that may or may not be predetermined) and/or the sampling number (that may or may not be predetermined) from an entry of the table 111 that stores the resolution (that may or may not be predetermined) corresponding to the resolution RES of the input video data I_DATA, and may output the sampling ratio (that may or may not be predetermined) and/or the sampling number (that may or may not be predetermined) as the sampling ratio S_RATIO and the sampling number S_FNUM, respectively.
For example, as illustrated in FIG. 2, the sampling configuration unit 110 may determine the sampling ratio S_RATIO as 5% and/or may determine the sampling number S_FNUM as 20 when the resolution RES of the input video data I_DATA is less than VGA (Video Graphics Array). The sampling configuration unit 110 may determine the sampling ratio S_RATIO as 3% and/or may determine the sampling number S_FNUM as 33 when the resolution RES of the input video data I_DATA is equal to or greater than VGA and less than HD (High Definition). The sampling configuration unit 110 may determine the sampling ratio S_RATIO as 1% and/or may determine the sampling number S_FNUM as 100 when the resolution RES of the input video data I_DATA is equal to or greater than HD.
The sampling ratio (that may or may not be predetermined) and the sampling number (that may or may not be predetermined) according to the resolution (that may or may not be predetermined) of FIG. 2 serve only as an example. In example embodiments, the table 111 may store other values for the sampling ratio (that may or may not be predetermined) and/or the sampling number (that may or may not be predetermined) according to the resolution (that may or may not be predetermined).
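For illustration only, the table-driven selection of FIG. 2 may be modeled in software as a short lookup. The following Python sketch uses the example values of FIG. 2; the pixel-count thresholds assumed for VGA and HD, the table layout, and the function name are hypothetical and are not taken from the patent.

# Software sketch of the sampling configuration table of FIG. 2.
# The VGA/HD pixel-count thresholds are assumptions used only for illustration.
VGA_PIXELS = 640 * 480
HD_PIXELS = 1280 * 720

# Each entry: (exclusive upper bound on pixels per frame, sampling ratio, sampling number)
SAMPLING_TABLE = [
    (VGA_PIXELS, 0.05, 20),       # below VGA: 5 percent of the data, 20 frames
    (HD_PIXELS, 0.03, 33),        # VGA or larger, below HD: 3 percent, 33 frames
    (float("inf"), 0.01, 100),    # HD or larger: 1 percent, 100 frames
]

def sampling_config(width, height):
    """Return (S_RATIO, S_FNUM) for the given resolution RES."""
    pixels = width * height
    for upper_bound, ratio, frame_count in SAMPLING_TABLE:
        if pixels < upper_bound:
            return ratio, frame_count

# Example: a 1920x1080 frame is HD or larger, so 1 percent of the data over 100 frames.
assert sampling_config(1920, 1080) == (0.01, 100)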
Images of consecutive frames may be similar to each other in general video data. Therefore, if the determination unit 120 samples the input video data I_DATA at the same locations in a frame for each of the sampling number S_FNUM of frames, a possibility of determining the color space of the input video data I_DATA incorrectly may increase. Accordingly, the determination unit 120 may sample the input video data I_DATA at different locations in a frame for each of the sampling number S_FNUM of frames.
The determination unit 120 may determine the color space of the input video data I_DATA and/or may generate the color space signal CS representing the color space of the input video data I_DATA based on a number of the sampled input video data having a value used in a wide color space and not used in a narrow color space.
FIG. 3 is a block diagram illustrating an example embodiment of a determination unit included in a color space determination device of FIG. 1.
Referring to FIG. 3, a determination unit 120 a may include a sampling unit 121, an examination unit 123, and/or a comparison unit 125.
The sampling unit 121 may receive the input video data I_DATA in units of frames from outside, and/or may receive the sampling ratio S_RATIO from the sampling configuration unit 110. The sampling unit 121 may sample the input video data I_DATA with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames, and/or may output the sampled input video data S_DATA for each of the sampling number S_FNUM of frames. To decrease a possibility of determining the color space of the input video data I_DATA incorrectly, the sampling unit 121 may sample the input video data I_DATA at different locations in a frame for each of the sampling number S_FNUM of frames.
FIG. 4 is a diagram for describing an operation of a sampling unit included in a determination unit of FIG. 3.
In FIG. 4, the sampling number S_FNUM is N, where N is a positive integer. N consecutive frames are illustrated in FIG. 4.
Referring to FIG. 4, the sampling unit 121 may sample the input video data I_DATA included in a sampling region SA of a frame for each of the sampling number S_FNUM of frames. The sampling region SA may have an area corresponding to the sampling ratio S_RATIO. The sampling regions SA of the sampling number S_FNUM of frames may be different from each other. For example, as illustrated in FIG. 4, the sampling unit 121 may sample the input video data I_DATA included in the sampling region SA of a frame for each of the sampling number S_FNUM of frames while the sampling unit 121 moves the sampling region SA to an adjacent region frame by frame. In example embodiments, the sampling region SA of a current frame may overlap with the sampling region SA of a previous frame. In example embodiments, the sampling region SA of a frame may be randomly determined for each frame.
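For illustration only, the frame-by-frame movement of the sampling region SA may be sketched as follows. The sketch assumes each frame is a two-dimensional array of data values and uses a horizontal band whose height corresponds to the sampling ratio; the band shape, the NumPy representation, and the function name are illustrative choices rather than details taken from the patent.

import numpy as np

def sample_frames(frames, sampling_ratio):
    """Yield the data sampled from each frame, moving the sampling region each frame."""
    for index, frame in enumerate(frames):
        height = frame.shape[0]
        band_height = max(1, int(round(height * sampling_ratio)))
        start_row = (index * band_height) % height              # move to an adjacent region
        rows = [(start_row + r) % height for r in range(band_height)]
        yield frame[rows, :]                                     # sampling region SA of this frame

# Example: 100 synthetic 720p frames sampled at a 1 percent sampling ratio.
frames = [np.random.randint(0, 256, size=(720, 1280), dtype=np.uint8) for _ in range(100)]
sampled_data = list(sample_frames(frames, 0.01))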
Referring again to FIG. 3, the examination unit 123 may generate a wide number W_NUM by accumulatively counting a number of the sampled input video data S_DATA having a value used in a wide color space and not used in a narrow color space among the sampled input video data S_DATA sampled from the sampling number S_FNUM of frames of the input video data I_DATA.
The comparison unit 125 may generate the color space signal CS by comparing the wide number W_NUM with a threshold number W_TH.
When the input video data I_DATA has a wide color space, each of the input video data I_DATA may have a value from 0 to 255. Alternatively, when the input video data I_DATA has a narrow color space, each of the input video data I_DATA may have a value from 16 to 235. Therefore, if the input video data I_DATA having a value from 0 to 15 or from 236 to 255 exist, the color space of the input video data I_DATA may be determined as a wide color space. Alternatively, if the input video data I_DATA having a value from 0 to 15 or from 236 to 255 do not exist, the color space of the input video data I_DATA may be determined as a narrow color space. However, even if the input video data I_DATA has a narrow color space, the input video data I_DATA having a value from 0 to 15 or from 236 to 255 may exist because of noise, operational error, etc.
Therefore, the examination unit 123 may generate the wide number W_NUM by accumulatively counting a number of the sampled input video data S_DATA having a value from 0 to 15 or from 236 to 255 among the sampled input video data S_DATA sampled from the sampling number S_FNUM of frames of the input video data I_DATA. The comparison unit 125 may generate the color space signal CS by comparing the wide number W_NUM with the threshold number W_TH. For example, the comparison unit 125 may determine the color space of the input video data I_DATA as a narrow color space and/or may generate the color space signal CS having a first logic level when the wide number W_NUM is less than the threshold number W_TH. Alternatively, the comparison unit 125 may determine the color space of the input video data I_DATA as a wide color space and/or may generate the color space signal CS having a second logic level when the wide number W_NUM is equal to or greater than the threshold number W_TH. As such, the color space determination device 100 may reduce a possibility of determining the color space of the input video data I_DATA incorrectly.
The examination unit 123 may receive the sampling number S_FNUM from the sampling configuration unit 110. The examination unit 123 may reset the wide number W_NUM after receiving the sampled input video data S_DATA from the sampling unit 121 the sampling number S_FNUM of times, so that the examination unit 123 may accumulatively count the number of the sampled input video data S_DATA having a value used in a wide color space and not used in a narrow color space only for the sampling number S_FNUM of frames.
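For illustration only, the accumulation of the wide number W_NUM and the threshold comparison may be sketched as follows. The value ranges come from the description above; the example threshold value and the function names are hypothetical.

import numpy as np

NARROW_MIN, NARROW_MAX = 16, 235    # values available in the narrow color space

def wide_number(sampled_frames):
    """Accumulate W_NUM over the sampled data of the S_FNUM sampled frames."""
    count = 0
    for samples in sampled_frames:
        values = np.asarray(samples)
        count += int(np.count_nonzero((values < NARROW_MIN) | (values > NARROW_MAX)))
    return count

def color_space_signal(w_num, threshold):
    """Return the color space signal CS from W_NUM and the threshold number W_TH."""
    return "wide" if w_num >= threshold else "narrow"

# Example with synthetic sampled data and a hypothetical threshold of 64.
sampled_frames = [np.random.randint(0, 256, size=1000, dtype=np.uint8) for _ in range(100)]
cs = color_space_signal(wide_number(sampled_frames), threshold=64)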
FIG. 5 is a block diagram illustrating another example embodiment of a determination unit included in a color space determination device of FIG. 1.
Referring to FIG. 5, a determination unit 120 b may include a sampling unit 121, an examination unit 123, a comparison unit 125, and/or a multiplexer 127.
The determination unit 120 b of FIG. 5 may be substantially the same as the determination unit 120 a of FIG. 3 except for the multiplexer 127. Therefore, a detailed description of the sampling unit 121, the examination unit 123, and/or the comparison unit 125 of FIG. 5 will be omitted.
The multiplexer 127 may receive the color space signal CS from the comparison unit 125, and/or may receive an enable signal ENABLE and/or a default color space signal D_CS from outside. The default color space signal D_CS may represent one of a wide color space and a narrow color space. The multiplexer 127 may output one of the color space signal CS and the default color space signal D_CS in response to the enable signal ENABLE. For example, the multiplexer 127 may output the color space signal CS received from the comparison unit 125 when the enable signal ENABLE has a logic high level, and/or may output the default color space signal D_CS received from outside when the enable signal ENABLE has a logic low level.
Therefore, the determination unit 120 b may output the color space signal CS representing the color space of the input video data I_DATA determined by an operation described above when the determination unit 120 b receives the enable signal ENABLE having a logic high level from outside, and/or may output the default color space signal D_CS (that may or may not be predetermined) representing one of a wide color space and a narrow color space regardless of the input video data I_DATA when the determination unit 120 b receives the enable signal ENABLE having a logic low level from outside.
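For illustration only, the selection performed by the multiplexer 127 may be modeled as a simple conditional; the function name and the string values are illustrative.

def select_color_space(enable, determined_cs, default_cs):
    """Forward the determined CS when ENABLE is high, otherwise the default D_CS."""
    return determined_cs if enable else default_cs

# When ENABLE is low, the default is forwarded regardless of the input video data.
assert select_color_space(True, "wide", "narrow") == "wide"
assert select_color_space(False, "wide", "narrow") == "narrow"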
FIG. 6 is a flow chart illustrating a method of determining a color space according to example embodiments.
The method of determining a color space may be performed by the color space determination device 100 of FIG. 1.
Hereinafter, the method of determining a color space according to example embodiments will be described with reference to FIGS. 1 to 6.
Referring to FIG. 6, a sampling ratio S_RATIO and/or a sampling number S_FNUM may be determined based on a resolution RES of input video data I_DATA (step S100). The sampling ratio S_RATIO may be a ratio of a number of the input video data I_DATA to be sampled to a total number of the input video data I_DATA included in a frame. The sampling number S_FNUM may be a number of frames of the input video data I_DATA to be sampled.
The input video data I_DATA may be received in units of frames, the input video data I_DATA may be sampled with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames, and/or a color space signal CS representing the color space of the input video data I_DATA may be generated based on the sampled input video data that are sampled from the sampling number S_FNUM of frames of the input video data I_DATA (step S200). The color space of the input video data I_DATA may be determined as one of a wide color space and a narrow color space, and/or the color space signal CS may represent one of a wide color space and a narrow color space.
FIG. 7 is a flow chart illustrating an example embodiment of a step of generating a color space signal included in a method of determining a color space of FIG. 6.
Referring to FIG. 7, the input video data I_DATA may be received in units of frames, the input video data I_DATA may be sampled with the sampling ratio S_RATIO for each of the sampling number S_FNUM of frames, and/or the sampled input video data S_DATA for each of the sampling number S_FNUM of frames may be output (step S210). A wide number W_NUM may be generated by accumulatively counting a number of the sampled input video data S_DATA having a value used in a wide color space and not used in a narrow color space among the sampled input video data S_DATA sampled from the sampling number S_FNUM of frames of the input video data I_DATA (step S220). The color space signal CS may be generated by comparing the wide number W_NUM with a threshold number W_TH (step S230).
In example embodiments, the wide number W_NUM may be generated by accumulatively counting a number of the sampled input video data S_DATA having a value from 0 to 15 or from 236 to 255 among the sampled input video data S_DATA sampled from the sampling number S_FNUM of frames of the input video data I_DATA. The color space of the input video data I_DATA may be determined as a narrow color space and the color space signal CS having a first logic level may be generated when the wide number W_NUM is less than the threshold number W_TH. Alternatively, the color space of the input video data I_DATA may be determined as a wide color space and the color space signal CS having a second logic level may be generated when the wide number W_NUM is equal to or greater than the threshold number W_TH.
As described above, if the color space of the input video data I_DATA is determined by sampling all the input video data I_DATA included in a frame, a time required to analyze one frame may increase such that an overall operation speed of a display device in which the method of determining a color space is performed may be slowed down. Therefore, in the method of determining a color space according to example embodiments, only a part of the input video data I_DATA among all the input video data I_DATA included in a frame may be sampled with the sampling ratio S_RATIO to determine the color space of the input video data I_DATA. As such, the method of determining a color space according to example embodiments may decrease a time required to analyze one frame such that the method of determining a color space may not influence an overall operation speed of the display device. In addition, in the method of determining a color space according to example embodiments, the sampling ratio S_RATIO may be decreased and the sampling number S_FNUM may be increased as the resolution RES of the input video data I_DATA increases. Alternatively, the sampling ratio S_RATIO may be increased and the sampling number S_FNUM may be decreased as the resolution RES of the input video data I_DATA decreases. As such, the method of determining a color space according to example embodiments may control a time required to analyze one frame.
Images of consecutive frames are similar to each other in general video data. Therefore, if the input video data I_DATA at the same locations in a frame are sampled for each of the sampling number S_FNUM of frames, a possibility of determining the color space of the input video data I_DATA incorrectly may increase. Accordingly, in the method of determining a color space according to example embodiments, the input video data I_DATA may be sampled at different locations in a frame for each of the sampling number S_FNUM of frames to decrease a possibility of determining the color space of the input video data I_DATA incorrectly.
In example embodiments, an enable signal ENABLE and/or a default color space signal D_CS representing one of a wide color space and a narrow color space may be received from outside, and one of the color space signal CS and the default color space signal D_CS may be output in response to the enable signal ENABLE (step S240). For example, the color space signal CS may be output when the enable signal ENABLE has a logic high level, and the default color space signal D_CS may be output when the enable signal ENABLE has a logic low level. Therefore, in the method of determining a color space according to example embodiments, the color space signal CS representing the color space of the input video data I_DATA determined by an operation described above may be output when the enable signal ENABLE having a logic high level is received from outside, and the default color space signal D_CS (that may or may not be predetermined) representing one of a wide color space and a narrow color space regardless of the input video data I_DATA may be output when the enable signal ENABLE having a logic low level is received from outside.
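For illustration only, the overall flow of FIGS. 6 and 7 may be combined into one sketch. The sampling ratio and sampling number would normally be supplied by the sampling configuration step (step S100); the default values, the threshold, and the band-shaped sampling region used below are illustrative assumptions rather than details taken from the patent.

import numpy as np

def determine_color_space(frames, sampling_ratio=0.01, sampling_number=100,
                          threshold=64, enable=True, default_cs="narrow"):
    """Sketch of steps S210 to S240 for frames whose sampling was configured in step S100."""
    w_num = 0
    for index, frame in enumerate(frames):
        if index >= sampling_number:                    # analyze only S_FNUM frames
            break
        frame = np.asarray(frame)
        height = frame.shape[0]
        band = max(1, int(round(height * sampling_ratio)))
        start = (index * band) % height                 # step S210: different location per frame
        rows = [(start + r) % height for r in range(band)]
        samples = frame[rows, :]
        w_num += int(np.count_nonzero((samples < 16) | (samples > 235)))   # step S220
    determined = "wide" if w_num >= threshold else "narrow"                # step S230
    return determined if enable else default_cs                            # step S240

# Example: synthetic full-range frames are typically classified as having a wide color space.
frames = [np.random.randint(0, 256, size=(720, 1280), dtype=np.uint8) for _ in range(100)]
print(determine_color_space(frames))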
FIG. 8 is a block diagram illustrating a display device according to example embodiments.
Referring to FIG. 8, a display device 200 may include a color space determination device 210, an image processor 220, and/or a display unit 230.
The color space determination device 210 may generate a color space signal CS representing a color space of input video data I_DATA based on a resolution RES of the input video data I_DATA.
The color space determination device 210 of FIG. 8 may be embodied with the color space determination device 100 of FIG. 1. A structure and an operation of the color space determination device 100 of FIG. 1 are described above with reference to FIGS. 1 to 5. Therefore, a detailed description of the color space determination device 210 of FIG. 8 will be omitted.
The image processor 220 may receive the input video data I_DATA and/or may generate output video data O_DATA by converting a format of the input video data I_DATA based on the color space signal CS received from the color space determination device 210. The image processor 220 may generate the output video data O_DATA by converting a format of the input video data I_DATA into an RGB format. The image processor 220 may convert a format of the input video data I_DATA such that the output video data O_DATA may have a wide color space when the color space signal CS represents a wide color space, and may convert a format of the input video data I_DATA such that the output video data O_DATA may have a narrow color space when the color space signal CS represents a narrow color space. For example, the image processor 220 may store a first conversion coefficient corresponding to a wide color space and a second conversion coefficient corresponding to a narrow color space, and/or may generate the output video data O_DATA by converting a format of the input video data I_DATA into an RGB format using one of the first conversion coefficient and the second conversion coefficient selected in response to the color space signal CS received from the color space determination device 210.
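For illustration only, a conversion of the kind performed by the image processor 220 may be sketched with two coefficient sets. The patent does not specify the conversion coefficients; the BT.601 full-range and limited-range YCbCr-to-RGB values below are only one plausible choice for the first and second conversion coefficients, and the function and constant names are assumptions.

import numpy as np

# Hypothetical first (wide) and second (narrow) conversion coefficients:
# BT.601 full-range and limited-range YCbCr-to-RGB values.
FULL_RANGE = dict(scale=1.0, offset=0.0, r_cr=1.402, g_cb=-0.344, g_cr=-0.714, b_cb=1.772)
LIMITED_RANGE = dict(scale=255.0 / 219.0, offset=16.0, r_cr=1.596, g_cb=-0.392, g_cr=-0.813, b_cb=2.017)

def ycbcr_to_rgb(y, cb, cr, color_space_signal):
    """Convert one YCbCr sample to RGB using the coefficient set selected by CS."""
    c = FULL_RANGE if color_space_signal == "wide" else LIMITED_RANGE
    y0 = (y - c["offset"]) * c["scale"]
    r = y0 + c["r_cr"] * (cr - 128)
    g = y0 + c["g_cb"] * (cb - 128) + c["g_cr"] * (cr - 128)
    b = y0 + c["b_cb"] * (cb - 128)
    return tuple(int(np.clip(round(v), 0, 255)) for v in (r, g, b))

# Example: reference white is (235, 128, 128) in the narrow range and (255, 128, 128) in the wide range.
assert ycbcr_to_rgb(235, 128, 128, "narrow") == (255, 255, 255)
assert ycbcr_to_rgb(255, 128, 128, "wide") == (255, 255, 255)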
The display unit 230 may receive the output video data O_DATA from the image processor 220 and/or may display the output video data O_DATA.
As described above, the display device 200 may determine the color space of the input video data I_DATA and/or may generate the output video data O_DATA by converting a format of the input video data I_DATA based on the determined color space of the input video data I_DATA. Therefore, the display device 200 according to example embodiments may increase a performance of a color reproduction.
FIG. 9 is a block diagram illustrating a system according to example embodiments.
Referring to FIG. 9, a system 300 may include a processor 310, a display device 320, and/or a storage device 330.
The storage device 330 may store video data. The storage device 330 may include a solid state drive (SSD), a hard disk drive (HDD), a compact disk read-only memory (CD-ROM) drive, etc.
The processor 310 may read the video data stored in the storage device 330 and/or may provide the read video data to the display device 320 as input video data.
The display device 320 may determine a color space of the input video data, may generate output video data by converting a format of the input video data into an RGB format based on the determined color space of the input video data, and/or may display the output video data. The display device 320 may include a color space determination device 321, an image processor 323, and/or a display unit 325. The color space determination device 321 may generate a color space signal CS representing a color space of the input video data based on a resolution of the input video data. The image processor 323 may receive the input video data and/or may generate the output video data by converting a format of the input video data into an RGB format based on the color space signal CS received from the color space determination device 321. The display unit 325 may receive the output video data from the image processor 323 and/or may display the output video data reflected, for example, in an output video signal.
The display device 320 of FIG. 9 may be embodied with the display device 200 of FIG. 8. Therefore, the color space determination device 321, the image processor 323, and/or the display unit 325 included in the display device 320 of FIG. 9 may be embodied with the color space determination device 210, the image processor 220, and/or the display unit 230 included in the display device 200 of FIG. 8, respectively. A structure and an operation of the display device 200 of FIG. 8 are described above with reference to FIGS. 1 to 5 and 8. Therefore, a detailed description of the display device 320 of FIG. 9 will be omitted.
The processor 310 may perform various computing functions, such as executing specific software for performing specific calculations or tasks. For example, the processor 310 may be a microprocessor or a central processing unit. The processor 310 may be connected to the display device 320 and the storage device 330 via a bus, such as an address bus, a control bus, or a data bus. The processor 310 may be connected to an extended bus, such as a peripheral component interconnect (PCI) bus.
The processor 310 may be embodied as a single-core architecture or a multi-core architecture. For example, the processor 310 may be embodied as a single-core architecture when an operating frequency of the processor 310 is less than 1 GHz, and/or the processor 310 may be embodied as a multi-core architecture when an operating frequency of the processor 310 is greater than 1 GHz. The processor 310 that is embodied as a multi-core architecture may communicate with peripheral devices via an advanced extensible interface (AXI) bus.
The system 300 may further include a camera 340, a memory device 350, a user interface 360, and/or an input/output device 370. Although not illustrated in FIG. 9, the system 300 may further include ports to communicate with a video card, a sound card, a memory card, a universal serial bus (USB) device, etc.
The camera 340 may generate video data by capturing images. In this case, the processor 310 may provide the input video data to the display device 320 by reading the video data generated by the camera 340 as well as by reading the video data stored in the storage device 330.
The memory device 350 may store data required for an operation of the system 300. The memory device 350 may be a volatile memory such as a dynamic random access memory (DRAM), a static random access memory (SRAM), etc., or a non-volatile memory such as an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a flash memory, etc.
The user interface 360 may include devices required for a user to control the system 300. The input/output device 370 may include an input device (e.g., a keyboard or a mouse), an output device (e.g., a printer), etc.
The system 300 may be a mobile device, a smart phone, a cellular phone, a desktop computer, a laptop computer, a workstation, a handheld device, a digital camera, or the like.
While example embodiments have been particularly shown and described, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.

Claims (20)

What is claimed is:
1. A color space determination device, comprising:
a sampling configuration unit configured to determine a sampling ratio and a sampling number based on a resolution of input video data, the sampling ratio being a ratio of a number of the input video data to be sampled to a total number of the input video data included in a frame, the sampling number being a number of frames to be sampled; and
a determination unit configured to receive the input video data in units of frames, to sample the input video data with the sampling ratio for each of the sampling number of frames, and to generate a color space signal representing a color space of the input video data based on the sampled input video data that are sampled from the sampling number of frames of the input video data.
2. The color space determination device of claim 1, wherein the sampling configuration unit decreases the sampling ratio and increases the sampling number as the resolution of the input video data increases.
3. The color space determination device of claim 1, wherein the sampling configuration unit includes a table having entries, each of which stores data for the sampling ratio, the sampling number, and the resolution.
4. The color space determination device of claim 3, wherein the sampling configuration unit reads the data for the sampling ratio and the data for the sampling number from an entry of the table that stores the data for the resolution corresponding to the resolution of the input video data, and outputs the read sampling ratio and the read sampling number as the sampling ratio and the sampling number, respectively.
5. The color space determination device of claim 1, wherein the determination unit samples the input video data at different locations in a frame for each of the sampling number of frames.
6. The color space determination device of claim 1, wherein the determination unit generates the color space signal based on a number of the sampled input video data having a value used in a wide color space and not used in a narrow color space.
7. The color space determination device of claim 1, wherein the determination unit includes:
a sampling unit configured to receive the input video data in units of frames, to sample the input video data with the sampling ratio for each of the sampling number of frames, and to output the sampled input video data for each of the sampling number of frames;
an examination unit configured to generate a wide number by accumulatively counting a number of the sampled input video data having a value used in a wide color space and not used in a narrow color space among the sampled input video data sampled from the sampling number of frames of the input video data; and
a comparison unit configured to generate the color space signal by comparing the wide number with a threshold number.
8. The color space determination device of claim 7, wherein the sampling unit samples the input video data included in a sampling region of a frame, which has an area corresponding to the sampling ratio, for each of the sampling number of frames, the sampling regions of the sampling number of frames being different from each other.
9. The color space determination device of claim 7, wherein the examination unit generates the wide number by accumulatively counting a number of the sampled input video data having a value from 0 to 15 or from 236 to 255 among the sampled input video data sampled from the sampling number of frames of the input video data.
10. The color space determination device of claim 7, wherein the examination unit receives the sampling number from the sampling configuration unit, and resets the wide number after receiving the sampled input video data from the sampling unit the sampling number of times.
11. The color space determination device of claim 7, wherein the comparison unit determines a color space of the input video data as the narrow color space and generates the color space signal having a first logic level when the wide number is less than the threshold number, and determines the color space of the input video data as the wide color space and generates the color space signal having a second logic level when the wide number is equal to or greater than the threshold number.
12. The color space determination device of claim 7, wherein the determination unit further includes a multiplexer configured to output one of the color space signal received from the comparison unit and a default color space signal representing one of the wide color space and the narrow color space in response to an enable signal.
13. The color space determination device of claim 12, wherein the multiplexer outputs the color space signal received from the comparison unit when the enable signal has a logic high level, and outputs the default color space signal when the enable signal has a logic low level.
14. A display device, comprising:
a color space determination device configured to generate a color space signal representing a color space of input video data based on a resolution of the input video data;
an image processor configured to receive the input video data, and to generate output video data by converting a format of the input video data based on the color space signal; and
a display unit configured to display the output video data;
wherein the color space determination device includes:
a sampling configuration unit configured to determine a sampling ratio and a sampling number based on the resolution of the input video data, the sampling ratio being a ratio of a number of the input video data to be sampled to a total number of the input video data included in a frame, the sampling number being a number of frames to be sampled; and
a determination unit configured to receive the input video data in units of frames, to sample the input video data with the sampling ratio for each of the sampling number of frames, and to generate the color space signal based on the sampled input video data that are sampled from the sampling number of frames of the input video data.
15. The display device of claim 14, wherein the image processor stores a first conversion coefficient corresponding to a wide color space and a second conversion coefficient corresponding to a narrow color space, and generates the output video data by converting a format of the input video data into a red, green, blue (RGB) format using one of the first conversion coefficient and the second conversion coefficient selected in response to the color space signal.
16. A system, comprising:
a storage device; and
a display device;
wherein the storage device is configured to store video data,
wherein the display device includes:
a color space determination device;
an image processor; and
a display unit;
wherein the color space determination device is configured to sample the video data,
wherein the color space determination device is configured to generate a color space signal representing a color space of the video data based on the sampled video data,
wherein the image processor is configured to generate output video data based on the sampled video data and the color space signal, and
wherein the display unit is configured to display the output video data.
17. The system of claim 16, wherein the sampled video data is sampled based on a sampling ratio and a sampling number derived from a resolution of the video data.
18. The system of claim 16, wherein the image processor generates the output video data by converting a format of the video data into a red, green, blue (RGB) format.
19. The system of claim 16, wherein the image processor stores a first conversion coefficient corresponding to a wide color space and a second conversion coefficient corresponding to a narrow color space, and
wherein the image processor generates the output video data by converting a format of the video data into a red, green, blue (RGB) format using one of the first conversion coefficient and the second conversion coefficient selected in response to the color space signal.
20. The system of claim 16, wherein the video data is sampled at different locations for each frame of the video data that is sampled.
US13/419,085 2011-03-16 2012-03-13 Color space determination devices and display devices and systems including the same Active 2033-01-29 US8803903B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2011-0023135 2011-03-16
KR1020110023135A KR20120105615A (en) 2011-03-16 2011-03-16 Color space determination device and display device including the same

Publications (2)

Publication Number Publication Date
US20120236205A1 US20120236205A1 (en) 2012-09-20
US8803903B2 true US8803903B2 (en) 2014-08-12

Family

ID=46828160

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/419,085 Active 2033-01-29 US8803903B2 (en) 2011-03-16 2012-03-13 Color space determination devices and display devices and systems including the same

Country Status (2)

Country Link
US (1) US8803903B2 (en)
KR (1) KR20120105615A (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017128314A1 (en) * 2016-01-29 2017-08-03 深圳市大疆创新科技有限公司 Method, system and device for video data transmission, and photographic apparatus
US11178204B1 (en) * 2017-02-23 2021-11-16 Cox Communications, Inc. Video processor to enhance color space and/or bit-depth

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4682225A (en) * 1985-09-13 1987-07-21 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method and apparatus for telemetry adaptive bandwidth compression
US20110280307A1 (en) * 1998-11-09 2011-11-17 Macinnis Alexander G Video and Graphics System with Video Scaling
US20120321220A1 (en) * 2002-08-28 2012-12-20 Fujifilm Corporation Method and device for video image processing, calculating the similarity between video frames, and acquiring a synthesized frame by synthesizing a plurality of contiguous sampled frames
US8520009B1 (en) * 2003-05-29 2013-08-27 Nvidia Corporation Method and apparatus for filtering video data using a programmable graphics processor
KR100744018B1 (en) 2005-04-19 2007-07-30 엘지전자 주식회사 apparatus for converting color space automatically according to input signal and method thereof
JP2007125728A (en) 2005-11-01 2007-05-24 Seiko Epson Corp Printing device and image processor
KR20070058204A (en) 2005-12-01 2007-06-08 주식회사 대우일렉트로닉스 Method for controlling color space conversion of image display apparatus
US8427547B1 (en) * 2006-04-28 2013-04-23 Ambarella, Inc. Camera with high-quality still capture during continuous video capture
US20100208989A1 (en) * 2008-07-08 2010-08-19 Matthias Narroschke Image coding method, image decoding method, image coding apparatus, image decoding apparatus, program and integrated circuit

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110347635A (en) * 2019-06-28 2019-10-18 西安理工大学 A kind of heterogeneous polynuclear microprocessor based on multilayer bus
CN110347635B (en) * 2019-06-28 2021-08-06 西安理工大学 Heterogeneous multi-core microprocessor based on multilayer bus

Also Published As

Publication number Publication date
KR20120105615A (en) 2012-09-26
US20120236205A1 (en) 2012-09-20


Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, SANG-HYUN;CHOI, HONG-MI;SHIN, TAEK-KYUN;SIGNING DATES FROM 20120306 TO 20120309;REEL/FRAME:027881/0728

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8