WO2008047518A1 - Image composition device, image composition method, image composition program, integrated circuit - Google Patents
Image composition device, image composition method, image composition program, integrated circuit
- Publication number
- WO2008047518A1 PCT/JP2007/067631
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image data
- data
- image
- composition
- format
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6072—Colour correction or control adapting to different types of images, e.g. characters, graphs, black and white image portions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0233—Improving the luminance or brightness uniformity across the screen
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0242—Compensation of deficiencies in the appearance of colours
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/02—Graphics controller able to handle multiple formats, e.g. input or output formats
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/02—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
Definitions
- Image synthesis device, image synthesis method, image synthesis program, integrated circuit
- the present invention relates to a technique for combining a plurality of images into one image.
- the present invention relates to a technique for synthesizing image data described in different formats.
- An image displayed on a television or personal computer display is generated based on data representing the color of each pixel by a number.
- RGB, YUV, and HSV are formats for expressing colors by numerical values.
- the RGB format represents the intensity of each of the three primary colors of light, red, green, and blue, for each pixel. As the value of each color increases, a color close to white is expressed, and as the value decreases, a color close to black is expressed.
- This RGB format is generally used for personal computer displays. Therefore, text image data and graphic images generated using a personal computer are described in RGB format.
- YUV formats include the YUV422 format and the YUV420 format, which reduce the amount of data by thinning out color difference data shared by adjacent pixels. These exploit the fact that the human eye is more sensitive to brightness than to color, and are suitable for images with gradually changing colors, such as natural images (for example, landscape photos and portraits). They are often used in JPEG, MPEG, and so on, and are also used for digital TV broadcasting.
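The data savings from chroma thinning can be sketched with a short calculation. This is illustrative only (8 bits per sample assumed; the function name is not from the patent):

```python
def yuv_size_bytes(width, height, fmt):
    """Bytes needed for an 8-bit-per-sample YUV frame under each subsampling scheme."""
    luma = width * height  # Y: one sample per pixel in every scheme
    if fmt == "YUV444":
        chroma = 2 * width * height                 # U and V: one sample each per pixel
    elif fmt == "YUV422":
        chroma = 2 * (width // 2) * height          # U, V shared by 2 horizontally adjacent pixels
    elif fmt == "YUV420":
        chroma = 2 * (width // 2) * (height // 2)   # U, V shared by a 2x2 block of 4 pixels
    else:
        raise ValueError(fmt)
    return luma + chroma

# A 640x480 frame needs 3/4 (YUV422) or 1/2 (YUV420) the chroma-inclusive
# size of a full YUV444 frame.
```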
- Image data representing one screen may be created by combining image data described in various formats, for example, when placing graphic image data in RGB format and photo data in YUV format on a web page.
- Patent Document 1 Japanese Patent Application Laid-Open No. 61-156091
- Non-Patent Document 1 Hitoshi Kiya “Easy-to-understand video and still image processing technology” September 1, 2004 CQ Publishing
- Non-Patent Document 2 Hideyuki Tamura “Computer Image Processing” December 20, 2002 Ohm
- However, the optimal processing algorithm differs between the portion that was originally in RGB format and the portion that was originally in YUV format.
- Since the combined image data is described in a single display format, all parts to be processed must be processed using the same algorithm. Therefore, some parts may be processed by a non-optimal algorithm, and the image quality may become coarse.
- The present invention takes the above problem into consideration, and has as its object to provide an image composition device, an image composition system, an image composition method, an image composition program, and an integrated circuit capable of selecting an optimal processing algorithm even when processing a composite image.
- In order to achieve the above object, the present invention provides an image composition device that generates composite image data by combining a plurality of image data, comprising: acquisition means for acquiring a plurality of target image data to be combined and attribute information relating to encoding at the time of generating each target image data; composition means for combining the plurality of acquired target image data to generate the composite image data; and association means for associating each piece of attribute information with the portion of the composite image data corresponding to each target image data.
- The present invention is also an image processing system including an image composition device that generates composite image data by combining a plurality of image data, and a processing device that processes the composite image data to generate processed image data.
- The image composition device includes: acquisition means for acquiring a plurality of target image data to be combined and attribute information relating to encoding at the time of generating each target image data; composition means for combining the plurality of target image data to generate the composite image data; and association means for associating each piece of attribute information with the portion of the composite image data corresponding to each target image data.
- The processing device includes: data acquisition means for acquiring the composite image data; reception means for receiving a processing instruction for the composite image data; selection means for selecting a processing algorithm based on the attribute information corresponding to the part for which processing is instructed; and processing means for processing the composite image data in accordance with the selected processing algorithm.
- In the embodiments described later, the composition control unit 141 and the format conversion unit 142 provide the function of the acquisition means, and the compositing engine 143, the buffer 144, and the memory 101 provide the functions of the composition means and the association means.
- With this configuration, the associating means associates the attribute information with the portion of the composite image data corresponding to each target image data. Therefore, based on the attribute information, it is possible to know how each part constituting the composite image data was encoded before composition. Accordingly, when the composite image data is further processed, the optimal processing algorithm can be selected for each part based on the attribute information, and each part can be processed with that algorithm.
- Here, the plurality of target image data may be generated by unifying input image data of different data formats into one data format, and the acquisition means may acquire the attribute information indicating the data format of the input image data corresponding to each target image data.
- With this configuration, the associating means associates the attribute information indicating the data format of each input image data with the portion of the composite image data corresponding to each target image data. Therefore, for each part of the composite image data, it is possible to know the data format before format conversion.
- Here, any of the plurality of target image data may be generated by processing input image data, and the acquisition means may acquire the attribute information indicating the processing history of each target image data.
- In the following embodiments, processing includes image quality correction processing in addition to operations such as enlargement, reduction, and inversion.
- With this configuration, the acquisition means acquires the attribute information indicating the processing history of each input image data, so it is possible to know whether each part of the composite image data was processed before composition.
- Here, the associating means may associate with each part the attribute information of the target image data corresponding to the target image displayed in the forefront of the region corresponding to that part on the composite image generated from the composite image data.
- With this configuration, the associating means associates with each part the attribute information of the target image data underlying the target image displayed in the forefront of the corresponding region on the composite image. Therefore, attribute information can be attached to each part in a way that reflects the intention of the producer.
- Here, the composite image data is composed of a plurality of pixel data, and the associating means may associate each piece of pixel data constituting the portion corresponding to each target image data with the corresponding attribute information.
- With this configuration, the associating means associates the attribute information with each piece of pixel data constituting the composite image data. Therefore, when using a part of the composite image data, the attribute information corresponding to each pixel data included in that part can be reliably detected.
- Here, the associating means may include: a storage unit that stores data in units of an even multiple of 16 bits; a complementing unit that generates complementary data whose bit length brings the sum of the bit length of one pixel data and the bit length of the attribute information up to an even multiple of 16 bits; and a writing unit that writes, in succession, each pixel data constituting the composite image data, the attribute information corresponding to that pixel data, and the complementary data to the storage unit.
- With this configuration, the sum of the bit lengths of one pixel data, the attribute information, and the complementary data is an even multiple of 16 bits, and the writing unit writes each pixel data constituting the composite image data and the attribute information corresponding to that pixel data in succession. Therefore, address management of each pixel data and attribute data is simplified, and when using a part of the composite image data, the pixel data included in that part and the attribute information corresponding to each pixel data can be read quickly.
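The padding arithmetic above can be sketched as follows; the function name and the 32-bit rounding unit are illustrative assumptions, not part of the patent:

```python
def complement_bits(pixel_bits, attr_bits, unit_bits=32):
    """Bits of complementary (blank) data needed so that one pixel data plus its
    attribute information plus the padding fills an even multiple of 16 bits
    (here taken as 32 bits, matching the P1X7R8G8B8 example)."""
    total = pixel_bits + attr_bits
    return (-total) % unit_bits

# An R8G8B8 pixel (24 bits) plus a 1-bit format flag needs 7 blank bits,
# which is exactly the X7 field of the P1X7R8G8B8 layout.
```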
- Here, the attribute information may be data of the minimum number of bits indicating the characteristics relating to encoding at the time of generating the corresponding target image data, and the associating means may include a storage unit and a writing unit that writes the composite image data and the corresponding attribute information group separately to the storage unit.
- With this configuration, the attribute information is data of the minimum number of bits indicating the characteristics relating to encoding at the time of generating the corresponding target image data, and the writing unit writes the composite image data and the attribute information group separately. Therefore, the total amount of data can be minimized.
- Here, the associating means may include: a storage unit that stores data; a generation unit that generates partial information indicating the portion of the composite image data corresponding to each target image data; and a writing unit that writes each piece of partial information and the corresponding attribute information to the storage unit.
- With this configuration, the writing unit writes each piece of partial information and the attribute information corresponding to it to the storage unit. Therefore, the amount of data can be greatly reduced compared with holding attribute information for each pixel.
- Here, the composition means may include: a selection unit that selects a processing algorithm based on each piece of attribute information; a processing unit that processes the target image data corresponding to each piece of attribute information in accordance with the selected processing algorithm; and a combining unit that combines the processed target image data.
- With this configuration, when each target image data needs to be processed before composition, a processing algorithm can be quickly selected and applied based on the attribute information.
- Here, the image composition device may further include: reception means for receiving a processing instruction for the composite image data; selection means for selecting a processing algorithm based on the attribute information corresponding to the part of the composite image data to be processed; and processing means for processing the composite image data in accordance with the selected processing algorithm.
- With this configuration, the selection means selects the optimal processing algorithm based on the attribute information group corresponding to the pixel data included in the part to be processed, and the processing means can apply the optimal processing to that part.
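As a hypothetical sketch of this selection step: a 1-bit attribute flag per pixel or region chooses between two correction algorithms. The names and the 0/1 convention are illustrative, not taken from the patent:

```python
def select_algorithm(flag):
    """Choose a correction algorithm from the attribute flag recorded at
    composition time: 0 = originally RGB (graphics/text), 1 = originally YUV."""
    if flag == 0:
        return "correction_A"  # edge/contrast oriented, suits graphics and text
    return "correction_B"      # smoothing oriented, suits natural images

def algorithms_needed(pixels_with_flags):
    """Given (pixel, flag) pairs from the region to be processed, list the
    distinct algorithms that the selection means would dispatch to."""
    return sorted({select_algorithm(f) for _, f in pixels_with_flags})
```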
- FIG. 1 is a block diagram showing a functional configuration of an image composition device 100 according to Embodiment 1
- FIG. 2 shows details of the composition instruction table 112.
- FIG. 3 visually shows the image data stored in the image storage area 121.
- FIG. 4 shows a composite image 127 and a processed image in which a part of the composite image 127 is enlarged.
- FIG. 5 is a visual representation of the arrangement of the value of the format flag P included in each pixel data constituting the composite image data 127.
- FIG. 6 is a flowchart showing the operation of the image composition device 100 in image composition processing.
- FIG. 7 is a flowchart showing the operation of the image composition device 100 in the image composition process, continued from step S114 in FIG. 6.
- FIG. 8 is a flowchart showing the operation of the image composition device 100 in the image composition process, continued from step S116 in FIG. 7.
- FIG. 9 is a flowchart showing the operation of the image composition device 100 in the processing of the composite image data.
- FIG. 10 is a flowchart showing the operation of the image composition device 100 in the composite image data processing, continued from FIG.
- FIG. 11 is a block diagram showing a functional configuration of the image composition device 200 in Embodiment 2.
- FIG. 12 shows details of the composition instruction table 212.
- FIG. 13 is a composite image displayed based on the composite image data generated according to the composite instruction table 212.
- FIG. 14 is a flowchart showing the operation of the image composition device 200 in the image composition processing.
- FIG. 15 is a flowchart showing the operation of the image composition device 200 in the image composition process, continued from step S213 in FIG. 14.
- FIG. 16 is a flowchart showing the operation of the image composition device 200 in the image composition process, continued from step S226 in FIG. 15.
- FIG. 17 is a flowchart showing the operation of the image composition device 200 in the processing of the composite image data.
- FIG. 18 is a flowchart showing the operation of the image composition device 200 in the composite image data processing, continued from FIG.
- FIG. 20 is a flowchart showing the operation of the image composition device 300 in the image composition processing.
- FIG. 21 is a flowchart showing the operation of the image composition device 300 in the image composition process, continued from step S312 in FIG. 20.
- FIG. 22 is a flowchart showing the operation of the image composition device 300 in the image composition process, continued from step S322 in FIG. 21.
- FIG. 23 is a flowchart showing the operation of the image composition device 300 in the processing of the composite image data.
- FIG. 24 is a flowchart showing the operation of the image composition device 300 in the composite image data processing, continued from FIG.
- FIG. 25 is a block diagram showing a functional configuration of an image composition device 400 according to Embodiment 4.
- FIG. 26 is a flowchart showing the operation of the image composition device 400 in the image composition processing.
- FIG. 27 is a flowchart showing the operation of the image composition device 400 in the image composition process, continued from FIG.
- FIG. 28 is a flowchart showing the operation of the image composition apparatus 400 in the image composition process, continued from FIG.
- the image synthesizing apparatus synthesizes a plurality of image data to generate one synthesized image data.
- video displayed on a monitor is expressed as “image”
- data for generating the video is expressed as “image data”.
- The image data to be combined include RGB format image data generated using a computer and YUV format image data generated by photographing a landscape or a person with a digital camera or the like.
- the image composition device unifies each image data into RGB format.
- each color is expressed in 8-bit length.
- For RGB image data, when the data lengths allocated to represent each pixel's red, green, and blue are a, b, and c (integers, in bits), the format of the image data is expressed as RaGbBc.
- Insignificant n-bit data (hereinafter, blank bits) added to bring the data length to a predetermined length is expressed as Xn.
- Data corresponding to one pixel is called pixel data.
- The image composition device of the present invention converts each image data into the X8R8G8B8 format, in which each color is represented by 8-bit data, replaces the first of the blank bits with a flag P indicating which of the RGB and YUV formats the data originally used, and thus unifies the data into the P1X7R8G8B8 format.
- P1X7R8G8B8 format image data is arranged at a predetermined position to generate composite image data.
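A minimal sketch of this packing, assuming the flag P occupies the most significant bit of the 32-bit word (the patent fixes only that P replaces the first blank bit):

```python
def pack_p1x7r8g8b8(p, r, g, b):
    """Pack one pixel into a 32-bit word in P1X7R8G8B8 layout: format flag P in
    the top bit, 7 blank bits, then 8 bits each for R, G, and B."""
    assert p in (0, 1) and all(0 <= v <= 255 for v in (r, g, b))
    return (p << 31) | (r << 16) | (g << 8) | b

def format_flag(word):
    """Recover the format flag P from a packed pixel."""
    return (word >> 31) & 1
```

With this layout a processing step can inspect each pixel's origin (RGB or YUV) by reading a single bit, without any side table.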
- Each pixel data is adjusted to be 32 bits long because of the physical structure of existing memory: address management becomes easier when the data length of one pixel data is 16 bits or an even multiple of 16 bits. Therefore, the length does not necessarily have to be 32 bits.
- The image composition device 100 is a device that combines a plurality of image data to generate composite image data, and that generates a processed image by processing (enlarging, reducing, rotating, inverting, etc.) part or all of the composite image.
- FIG. 1 is a block diagram showing a functional configuration of the image composition device 100, and arrows in the figure indicate a data flow in image data composition processing and composite image data processing processing.
- the image composition device 100 is composed of a memory 101, a display unit 102, an input unit 103, a main control unit 104, and an image processing unit 105.
- The image processing unit 105 includes a composition control unit 141, a format conversion unit 142, a compositing engine 143, a composition buffer 144, and the like.
- The image composition device 100 is a computer system including a microprocessor, a RAM, and a ROM, and the RAM and ROM store a computer program. By the microprocessor operating in accordance with the computer program, the image composition device 100 realizes part of its functions.
- The computer program is configured by combining a plurality of instruction codes indicating instructions to the computer in order to achieve a predetermined function.
- the memory 101 is configured to include a hard disk unit, and the inside thereof includes a general area 111 and an image storage area 121.
- The general area 111 stores a composition instruction table 112, enlargement data A113, reduction data A114, inversion data 115, rotation data A116, correction data A117, and correction data B118.
- The general area 111 also stores various programs used by the main control unit 104, image data (start screen, menu display, etc.), functions, calculation parameters, and the like.
- the image storage area 121 stores image data saved by a user operation.
- Specifically, paint image data 122, character image data 123, photographic image data 124 and 126, composite image data 127, and processed image data 128 are stored.
- The composition instruction table 112 is a table indicating the information necessary for executing composition.
- FIG. 2 shows details of the composition instruction table 112.
- the composition instruction table 112 is composed of a plurality of composition information 131, 132, 133, and 134.
- Each piece of composition information corresponds to the paint image data 122, character image data 123, or photographic image data 124 or 126, described later, and to the image generated and displayed based on that image data.
- Each composite information includes input image data, data format, overlay order, output coordinates, and transmittance.
- the input image data is information indicating the image data to be combined.
- Here, the name of the image data is described, but the information is not limited to this; for example, the start address of the image data may be used.
- The data format indicates the data format of the corresponding image data. "X8R8G8B8" indicates that the corresponding image data is in RGB format and that the pixel data corresponding to one pixel is represented by 8 blank bits followed by 8 bits each for the red, green, and blue intensities.
- "R5G6B5" indicates that the corresponding image data is in RGB format and that the data corresponding to one pixel is represented by 5 bits for red, 6 bits for green, and 5 bits for blue.
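A common way to expand such a pixel to 8 bits per channel is bit replication; the patent does not specify the exact expansion rule, so this is an assumed sketch:

```python
def r5g6b5_to_x8r8g8b8(pixel16):
    """Expand one R5G6B5 pixel (16 bits) to an X8R8G8B8 word by bit replication,
    so that the 5/6-bit maxima map to 0xFF and zero maps to zero."""
    r5 = (pixel16 >> 11) & 0x1F
    g6 = (pixel16 >> 5) & 0x3F
    b5 = pixel16 & 0x1F
    # Replicate the high bits into the vacated low bits.
    r8 = (r5 << 3) | (r5 >> 2)
    g8 = (g6 << 2) | (g6 >> 4)
    b8 = (b5 << 3) | (b5 >> 2)
    return (r8 << 16) | (g8 << 8) | b8
```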
- "YUV422" and "YUV420" are YUV formats that express pixel data by luminance and color difference as described above, and differ in the method of thinning out the color difference data.
- The superposition order is information indicating the order in which the corresponding images are stacked on the screen: the higher the value, the higher (front) the image; the lower the value, the lower (rear). In this embodiment, since the images to be combined do not overlap on the screen, all the superposition orders are "1".
- the output coordinates indicate the starting point of the coordinates at which the corresponding image is displayed.
- Transmittance indicates how much the lower image is transmitted when the corresponding image is displayed.
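For illustration, a per-channel blend under the assumed convention that transmittance 0 means the upper image is fully opaque and 1.0 means the lower image shows through completely (the patent does not give the blending formula):

```python
def blend_channel(upper, lower, transmittance):
    """Blend one 8-bit channel of an upper image over the image beneath it.
    transmittance 0 -> only the upper image shows; 1.0 -> only the lower."""
    assert 0.0 <= transmittance <= 1.0
    return round(upper * (1 - transmittance) + lower * transmittance)
```

Under this convention the transmittance "0" in composition information 132 would leave the character image fully opaque over whatever lies behind it.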
- the composite information 132 corresponds to the character image data 123
- The data format of the character image data 123 to be combined is "R5G6B5", the overlay order when displayed is the forefront, and the start coordinates of the display position are (50, 150).
- The transmittance "0" indicates that the character image displayed based on the character image data 123 does not transmit the image behind it.
- composition of the composition instruction table 112 described here is an example, and each composition information includes the data size of each input image data and the image size (for example, horizontal 5
- The processing data stored in the general area 111 include enlargement data A113, reduction data A114, inversion data 115, rotation data A116, and so on.
- Enlargement data A113, reduction data A114, inversion data 115, and rotation data A116 are the parameters, functions, and programs necessary for enlarging, reducing, inverting, and rotating R8G8B8 format image data, and are particularly suitable for processing image data generated by a computer, such as graphic images and text images.
- The general area 111 of the memory 101 similarly stores processing data suitable for processing image data of natural images (landscapes, people, etc.): enlargement data B, reduction data B, inversion data B, and rotation data B (not shown).
- The correction data A117 contains the functions, parameters, and programs necessary to perform image quality correction on image data in RGB format. Since RGB images are likely to have been generated by a computer using drawing software or the like, the correction data A117 includes, for example, functions and parameters for an algorithm that eliminates the blurring of contours by eliminating intermediate colors, or that enhances contrast.
- The correction data B118 includes the functions, parameters, and programs necessary to perform image quality correction on image data of natural images (landscape photos, portraits, etc.). Specifically, it includes, for example, functions and parameters for an algorithm that adjusts contour lines to smooth curves and smooths color changes (tone changes).
- Correction data for performing image quality correction associated with enlargement processing and correction data for performing image quality correction associated with reduction processing may also be stored.
- the image storage area 121 stores paint image data 122, character image data 123, photographic image data 124 and 126, composite image data 127, and processed image data 128.
- the paint image data 122 is image data created using drawing software or the like, and the data format is X8R8G8B8.
- The character image data 123 is image data created using the text drawing function of document creation software or drawing software, and the data format is R5G6B5.
- the photographic image data 124 and 126 are image data generated by digitizing natural images taken with a digital camera, and the data formats are YUV422 and YUV420, respectively.
- the combined image data 127 is image data generated by combining the above four image data based on the combining instruction table 112, and the data format is P1X7R8G8B8.
- Figure 3 shows the paint image 122g, character image 123g, photo image 124g, 126g, and composite image 127g displayed based on the paint image data 122, character image data 123, photographic image data 124, 126, and composite image data 127. Show.
- the processed image data 128 is image data generated by enlarging a part of the composite image data.
- FIG. 4 shows a composite image 127g generated based on the composite image data 127 and a processed image 128g generated based on the processed image data 128.
- a pointer 161 is displayed on the composite image 127g in FIG. 4, and an image obtained by enlarging an area of 100 ⁇ 100 pixels around the point indicated by the pointer 161 is a processed image 128g.
- the input unit 103 is connected to a keyboard, a mouse, and the like, receives an instruction and data input from a user, and outputs the received instruction and data to the main control unit 104.
- The input unit 103 also provides an interface for connecting an external device or a portable recording medium, reads data from these devices under the control of the main control unit 104, and outputs the read data to the main control unit 104.
- composition control unit 141 has a function of controlling the operation by outputting a control signal to each function unit in the image composition process.
- the composition control unit 141 receives an image composition instruction from the main control unit 104.
- Upon receiving the instruction, the composition control unit 141 first initializes the composition buffer 144; specifically, for example, P1X7R8G8B8 format background data (a blue background, background image data specified by the user, etc.) is written to it.
- The format flag P included in each pixel data constituting the background image data is set to 0.
- the composition control unit 141 searches the composition instruction table 112 stored at a predetermined address in the general area 111 of the memory 101, and reads the composition information having the smallest overlay order value.
- the top composition information 131 is read.
- the composition control unit 141 outputs the data format included in the read composition information 131 to the format conversion unit 142 (described later). Subsequently, the input image data of the composite information 131 is read from the image storage area 121 of the memory 101 and output to the format conversion unit 142 to instruct format conversion.
- Next, the composition control unit 141 outputs the transmittance and output coordinates included in the read composition information 131, together with the image size of the paint image 122g corresponding to the composition information 131, to the compositing engine 143 (described later).
- the image size may be calculated by the composition control unit 141, or may be included in advance in each piece of composition information in the composition instruction table 112.
- the composition control unit 141 monitors the operation of the composition engine 143.
- the composition control unit 141 reads the next composition information from the composition instruction table 112.
- composition control unit 141 repeats the same processing for all composition information constituting the composition instruction table 112.
- when the same processing is completed for all the composition information, the composition control unit 141 writes the mid-composition image data held in the composition buffer 144 to the image storage area 121 of the memory 101 as composite image data.
- the main control unit 104 is notified that the image composition processing has been completed.
- if the received data format is an RGB format other than the X8R8G8B8 format, blank bits are inserted and the bit length of each color is aligned to 8 bits; the leading blank bit is then rewritten as the format flag P to generate converted image data in the P1X7R8G8B8 format, which is output to the composition engine 143.
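The blank-bit rewrite described above can be sketched as follows. The exact bit layout of the P1X7R8G8B8 format is not reproduced in this text, so placing the format flag P in the most significant bit of a 32-bit pixel word is an assumption, and the function name `to_p1x7r8g8b8` is hypothetical:

```python
def to_p1x7r8g8b8(x8r8g8b8_word, was_yuv):
    # Clear the 8 leading blank bits of the X8R8G8B8 pixel word.
    word = x8r8g8b8_word & 0x00FFFFFF
    # Rewrite the head bit as the format flag P (0 = RGB origin, 1 = YUV origin).
    if was_yuv:
        word |= 0x80000000
    return word
```

For an RGB-origin pixel the flag stays 0, so only the blank byte is cleared; for a YUV-origin pixel the top bit is set.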
- each pixel data is converted into the R8G8B8 format by the following formula.
- the above conversion formula is an example of a conversion formula based on ITU-R BT.601.
- the YUV format is stipulated in detail in, for example, ITU-R BT.601 and ITU-R BT.709; other conversion formulas may be used depending on the type of the input image data and converted image data (analog or digital, bit precision), the standard to be complied with, and the method of parameter selection.
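The conversion formula itself is elided in this text. As a sketch, the commonly used ITU-R BT.601 relation for 8-bit data with chroma centered at 128 looks like the following; the coefficients are the standard approximations, not necessarily the exact values the device uses:

```python
def yuv_to_rgb_bt601(y, u, v):
    # ITU-R BT.601-style conversion; U and V are offset by 128.
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    # Clamp each component into the 8-bit range for R8G8B8 output.
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)
```

A neutral gray pixel (Y = 128, U = V = 128) maps to equal R, G, and B values, which is a quick sanity check on the coefficients.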
- the composition engine 143 receives the transmittance, the output coordinates, and the image size (vertical a × horizontal b pixels) from the composition control unit 141. It also receives the converted image data from the format conversion unit 142.
- upon receiving these, the composition engine 143 reads the P1X7R8G8B8 format mid-composition image data from the composition buffer 144.
- the received converted image data is composited with the read mid-composition image data, and the mid-composition image data is updated. Specifically, among the pixel data group constituting the mid-composition image data, each color component (denoted D(b) in the following formula) of the pixel data corresponding to each pixel belonging to the range of vertical a × horizontal b pixels in which the input image is arranged with the output coordinates as the origin on the screen (hereinafter referred to as the composition target range), and each color component (denoted D(f)) of the corresponding pixel data in the pixel data group constituting the converted image data, are composited using the following formula.
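The blending formula referenced here is elided in this text. A conventional per-component blend of a foreground component D(f) over a background component D(b) using a transmittance t is sketched below; the exact expression, and the convention that t = 0 means a fully opaque foreground, are assumptions:

```python
def blend_component(d_f, d_b, transmittance):
    # Higher transmittance lets more of the background D(b) show through.
    t = transmittance
    return int(round(d_f * (1 - t) + d_b * t))
```

Applying this to each of the R, G, and B components of corresponding pixels in the composition target range updates the mid-composition image data.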
- the composition engine 143 writes the updated mid-composition image data back to the composition buffer 144.
- the synthesis buffer 144 temporarily stores image data that is being synthesized during image synthesis processing by the synthesis engine 143.
- the image data being composited at the time the composition processing by the composition engine 143 is completed becomes the composite image data 127.
- the first bit of each pixel data constituting the composite image data 127 is the format flag P.
- FIG. 5 is a visual representation of each pixel in the composite image 127g generated from the composite image data 127 and the value of the format flag P included in each pixel data.
- the format flag P corresponding to the pixels included in the region 151a where the paint image 122a is displayed on the composite image 127g and in the region 152a where the character image 123a is displayed is set to 0.
- the format flag P corresponding to the pixels included in the other region 156 is also set to 0.
- under the control of the main control unit 104, the deformation engine 147 performs processing on part or all of the P1X7R8G8B8 format composite image data 127 using the functions, parameters, and the like stored in the deformation memory 146. Specifically, under the instruction of the main control unit 104, the pixel data corresponding to each pixel in the range to be processed (referred to as the processing target range) in the composite image 127g is read pixel by pixel, in order from the pixel at the left end of the first line to the pixel at the right end of the last line, and the read pixel data is processed using the data stored in the deformation memory 146. If necessary, pixel data corresponding to pixels around the pixel to be processed is also read out and used for the processing. When the processing is completed, the processed pixel data is output to the correction engine 149.
- the processing is, for example, processing such as enlargement, reduction, inversion, and rotation.
- the deformation engine 147 may be instructed by the main control unit 104 to pause. In this case, the pixel data corresponding to the processing target range is read and output to the correction engine 149 as it is.
- the deformation memory 146 is constituted by a RAM as an example, and temporarily stores functions and parameters necessary for the deformation engine 147.
- the correction engine 149 receives the processed image data generated by the deformation engine 147, performs image quality correction processing on the received image data, generates processed image data, and stores the generated processed image data in the memory 101. Write to image storage area 121. Specifically, processed pixel data is sequentially received from the deformation engine 147, and the received processed pixel data group is temporarily stored. Next, using the functions and parameters for image quality correction stored in the correction memory 148, the image data correction processing is performed on the pixel data group, and processed image data is generated.
- the image quality correction processing is mainly edge enhancement, noise removal, and image distortion correction.
- the correction engine 149 receives an instruction to stop the operation from the main control unit 104.
- the correction engine 149 writes the processed image data received from the deformation engine 147 as it is into the image storage area 121 of the memory 101 as processed image data.
- the correction memory 148 is composed of a RAM as an example, and temporarily stores functions, parameters, and the like used for image quality correction.
- the main control unit 104 outputs a control signal to each component constituting the image composition device 100 and controls these operations.
- the main control unit 104 receives a user operation via the input unit 103, reads out the image data recorded in the external device or on the recording medium, and writes it to the image storage area 121 of the memory 101.
- the main control unit 104 receives a composite image generation request via the input unit 103. Furthermore, a composite image processing request is received from the user.
- a list or a thumbnail screen showing the image data stored in the image storage area 121 is generated, and selection of the images to be composited and input of display position and transmittance are accepted.
- upon receiving these inputs, the main control unit 104 generates a composition instruction table 112 as shown in FIG. 2 based on the input information, and writes the generated composition instruction table 112 to a predetermined address in the general area 111 of the memory 101.
- the data format may be input by the user, or may be obtained by the main control unit 104 examining the image data designated by the user. For example, there is a method of extracting the data format from the header information indicated in the first 16 or 32 bits of each image data.
- an image synthesis instruction is output to the synthesis control unit 141.
- the main control unit 104 reads the composite image data 127, generates a composite image 127g based on the read composite image data 127, and causes the display unit 102 to display the composite image 127g.
- with the composite image 127g generated based on the composite image data 127 displayed on the display unit 102, the user operates the mouse or the like to specify the position of the pointer 161.
- upon receiving the position designation by the user, the main control unit 104 displays a processing menu screen including options indicating the processing that can be performed on the composite image, such as "enlarge", "reduce", "invert", and "rotate". Alternatively, the above-described options may be displayed in a scrollable list at the position of the pointer 161.
- the main control unit 104 receives an operation of a user who selects one of the displayed options.
- the main control unit 104 determines the range to be processed in the composite image 127g according to the processing indicated by the selected option. For example, if the selected option indicates enlargement, a range of 100 × 100 pixels centered on the position of the pointer 161 is determined as the enlargement target range. If the selected option indicates inversion, a range of 200 × 200 pixels centered on the position of the pointer 161 is determined as the inversion target range.
- the main control unit 104 repeats the following processing on the pixel data corresponding to each pixel of the determined processing target range, in order from the leftmost pixel of the first line to the rightmost pixel of the last line.
- the main control unit 104 initializes the deformation memory 146 and the correction memory 148.
- the first bit (the format flag P) of the pixel data is read.
- when the read format flag P is 0, the main control unit 104 reads out the processing data suitable for processing of graphic images from the processing data stored in the general area 111, and writes the read processing data to the deformation memory 146. Similarly, the correction data A117, which is suitable for image quality correction of graphic images, is read and written to the correction memory 148.
- when the read format flag P is 1, the main control unit 104 reads out the processing data suitable for natural image processing from the processing data stored in the general area 111, and writes the read processing data to the deformation memory 146. Similarly, the correction data B118 suitable for image quality correction of natural images is read, and the read correction data B118 is written to the correction memory 148.
- the correction memory is initialized and the correction engine is instructed to stop the operation.
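The flag-driven selection of processing and correction data in the steps above can be sketched as a simple lookup. The table contents are placeholders, and locating the flag P in the most significant bit of a 32-bit P1X7R8G8B8 word is an assumption:

```python
# Placeholder names standing in for data stored in the general area 111.
PROCESSING_DATA = {0: "graphic_processing_data", 1: "natural_processing_data"}
CORRECTION_DATA = {0: "correction_data_A117", 1: "correction_data_B118"}

def select_tables(pixel_word):
    # Format flag P: 0 = RGB origin (graphic), 1 = YUV origin (natural).
    p = (pixel_word >> 31) & 1
    return PROCESSING_DATA[p], CORRECTION_DATA[p]
```

Each pixel of the processing target range is thus routed to the algorithm best suited to its pre-composition format.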
- the display unit 102 includes a monitor and an image buffer, and displays various screens on the monitor under the control of the main control unit 104.
- FIGS. 6 to 8 are flowcharts showing the operation of the image composition device 100 related to the image composition processing. The operation of the image composition device 100 in the image composition processing will be described below with reference to FIGS. 6 to 8.
- the main control unit 104 displays a predetermined input screen on the display unit 102 and accepts input such as selection of input image data and display position by the user (step S101).
- a composition instruction table is generated based on the received information, the generated composition instruction table is written in the general area 111 of the memory 101, and an image composition is instructed to the composition control unit 141 (step S102).
- the composition control unit 141 in the image processing unit 105 is instructed by the main control unit 104 to perform image composition.
- synthesis information is read from the synthesis instruction table 112 stored in the general area 111 of the memory 101 (step S103).
- the composition control unit 141 reads the mid-composition image data from the composition buffer 144, and writes the read image data to the image storage area 121 of the memory 101 as composite image data (step S107).
- if all the composition information has not been read (NO in step S106), the composition control unit 141 outputs the data format included in the read composition information to the format conversion unit 142 (step S111). Subsequently, the composition control unit 141 reads the image data corresponding to the read composition information and outputs the read image data to the format conversion unit 142 (step S112).
- the composition control unit 141 outputs the output coordinates and transmittance included in the read composition information, and the image size (vertical a × horizontal b pixels) of the image corresponding to the read composition information, to the composition engine 143 (step S120).
- the composition engine 143 receives the transmittance, output coordinates, and image size from the composition control unit 141, and receives the converted image data from the format conversion unit 142. When these are received, the composition engine 143 reads the mid-composition image data from the composition buffer 144 (step S121). Of the read mid-composition image data, the pixel data corresponding to each pixel included in the range indicated by the image size starting from the output coordinates (the composition target range) is composited with the corresponding pixel data in the converted image data to update the mid-composition image data (step S122). Subsequently, the composition engine 143 writes the updated mid-composition image data back to the composition buffer (step S123).
- the format conversion unit 142 receives input image data from the composition control unit 141.
- the format converter 142 converts the received YUV format image data into the R8G8B8 format (step S126).
- the format converter 142 moves the process to step S119.
- FIG. 9 to FIG. 10 are flowcharts showing operations in the composite image data processing by the image composition device 100.
- the operations in the composite image data processing will be described below with reference to FIGS.
- the processing for enlarging a part of the composite image will be described.
- the main control unit 104 displays the composite image 127g on the display unit 102 in accordance with the user operation (step S131).
- the user operates the mouse, cursor, etc. to move the pointer 161 on the composite image 127g and designates an arbitrary position.
- the main control unit 104 accepts designation of a position by a user operation (step S132).
- a menu is displayed (step S133).
- the menu lists the names of the processing operations that can be applied to the composite image, such as "Enlarge" and "Reduce".
- the main control unit 104 accepts an operation by the user selecting "enlarge" (step S144).
- by repeating the following processing for each pixel data in a range of 100 × 100 pixels (hereinafter referred to as the secondary usage range) centered on the specified position on the composite image 127g, a processed image (enlarged image) obtained by enlarging the image in that range is generated.
- the main control unit 104 reads the first bit (P) of the pixel data (step S147).
- the enlargement data B includes functions and parameters according to an algorithm suitable for enlargement of natural images in the P1X7R8G8B8 format.
- the main control unit 104 reads the correction data B118 and outputs the read correction data B118 to the correction memory 148 (step S152).
- the correction data B 118 includes a function and parameters according to an algorithm suitable for image quality correction of natural images in the P1X7R8G8B8 format.
- the main control unit 104 shifts the processing to step S158.
- the main control unit 104 reads the enlargement data A113 from the general area 111 of the memory 101, and outputs the read enlargement data A113 to the deformation memory 146 (step S156).
- the enlargement data A113 includes functions and parameters according to an algorithm suitable for enlargement of graphic images in the P1X7 R8G8B8 format.
- the main control unit 104 reads the correction data A117 from the memory 101, and outputs the read correction data A117 to the correction memory 148 (step S157).
- the correction data A117 includes functions and parameters according to an algorithm suitable for image quality correction of computer-generated P1X7R8G8B8 format graphic images.
- the main control unit 104 designates an address on the memory 101 where the pixel data currently being processed is stored, and instructs the deformation engine 147 to perform the processing (enlargement in this case) (step S158).
- the deformation engine 147 receives an address of pixel data to be processed and a processing instruction from the main control unit 104.
- upon receiving the processing instruction, the deformation engine 147 reads the pixel data stored at the specified address and the functions and parameters stored in the deformation memory 146.
- the read pixel data is enlarged and output to the correction engine 149 (step S159).
- the multiple pieces of pixel data generated by enlarging one piece of pixel data are hereinafter referred to as a pixel data group.
- the correction engine 149 receives the pixel data group that has been enlarged by the deformation engine 147, and applies image quality correction processing to the received pixel data group using the functions and parameters stored in the correction memory 148.
- when the repetition for all pixel data in the secondary usage range is completed, the correction engine 149 writes the processed image data including the pixel data group after the enlargement and correction processing to the image storage area 121 of the memory 101.
- the main control unit 104 reads the processed image data 128 from the image storage area 121 of the memory 101, generates a processed image 128g based on the read processed image data 128, and displays the processed image 128g on the display unit 102 ( Step S163).
- in each pixel data constituting the composite image data, a format flag is embedded that indicates whether the image data before composition was in the RGB format.
- RGB format image data is often text or graphic data generated using a computer, while YUV format image data is often generated based on a natural image captured by a digital camera or the like.
- the optimum processing method and image quality correction method differ between the portion of the composite image generated based on RGB format image data and the portion generated based on YUV format image data.
- the image composition apparatus 200 performs processing such as enlargement or reduction on the image data to be synthesized according to a user instruction, and then converts the format of each image into R8G8B8.
- a 1-bit format flag indicating the format of the image data before conversion is generated separately from the converted image data.
- the image composition device 200 synthesizes the converted image data using the equation shown in Embodiment 1, and generates composite image data in the R8G8B8 format.
- a format flag group including a format flag corresponding to each pixel of the synthesized image is generated.
- the format flag group is n × m-bit data in which the format flags of the pixels are arranged in order from the upper left to the lower right.
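The row-major bit packing of the format flag group can be sketched as follows; `pack_format_flags` is a hypothetical helper name, and representing the group as a Python integer plus a bit count is only an illustration of the n × m-bit layout:

```python
def pack_format_flags(flags_2d):
    # Pack per-pixel 1-bit flags, upper-left to lower-right, into one bit string.
    bits = 0
    count = 0
    for row in flags_2d:
        for f in row:
            bits = (bits << 1) | (f & 1)
            count += 1
    return bits, count
```

For a 2 × 2 image with flags [[1, 0], [0, 1]], the packed value is the 4-bit string 1001.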
- FIG. 11 is a block diagram showing a functional configuration of the image composition device 200.
- the arrows in the figure indicate the data flow related to the image composition processing (described later), and the bold arrows indicate the data flow related to the composite image processing (described later).
- the image composition device 200 includes a memory 201, a display unit 202, an input unit 203, a main control unit 204, and an image processing unit 205.
- the image processing unit 205 includes a composition control unit 241, a format conversion unit 242, a composition engine 243, a composition buffer 244, a deformation memory 246, a deformation engine 247, a correction memory 248, and a correction engine 249.
- the image composition device 200 is a computer system configured to include a microprocessor, RAM, and ROM, and the RAM and ROM store computer programs.
- the image synthesizing apparatus 200 achieves part of its functions by the microprocessor operating according to the computer program.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- each component will be described.
- the configuration and operation of the input unit 203, the display unit 202, the deformation memory 246, and the correction memory 248 are the same as those of the input unit 103, the display unit 102, the deformation memory 146, and the correction memory 148 of the first embodiment. Therefore, explanation is omitted.
- the deformation engine 247 and the correction engine 249 operate according to instructions from the composition control unit 241 as well as the main control unit 204; when instructed to pause, they output the received pixel data directly to the next functional unit. Except that the output destination of the data from the correction engine 249 is the format conversion unit 242, these are functional units that, as in the first embodiment, process the input data using the functions and parameters stored in the corresponding memories. Therefore, detailed description is omitted here.
- the memory 201 includes a hard disk unit, and is divided into a general area 211 and an image storage area 221 for storing image data in accordance with a user operation.
- the general area 211 stores, for example, the composition instruction table 212, RGB enlargement data 213, RGB reduction data 214, YUV enlargement data 215, YUV secondary enlargement data 216, RGB correction data 217, YUV correction data 218, and natural image correction data 219. The image storage area 221 stores the paint image data 122, character image data 123, photographic image data 124, photographic image data 126, composite image data 227, format flag group 228, processed image data 229, and the like.
- FIG. 12 shows details of the composition instruction table 212.
- the composition instruction table 212 is composed of a plurality (four in this case) of pieces of composition information 231, 232, 233, and 234. The pieces of composition information correspond to the paint image data 122, the character image data 123, and the photographic image data 124 and 126, respectively.
- each piece of composition information includes input image data, data format, overlay order, output coordinates, deformation information, image quality correction information, and transmittance.
- the input image data, data format, overlay order, output coordinates, and transmittance are the same as those included in the composition instruction table 112 of Embodiment 1, and therefore description thereof is omitted.
- Deformation information is information indicating whether or not to perform processing before combining corresponding image data, and if so, what processing is to be performed. If processing is not performed before synthesis, the synthesis information may not include deformation information.
- the image quality correction information is information indicating whether or not to perform image quality correction processing before combining the corresponding image data, and the image quality correction information “required” indicates that the image quality correction processing needs to be performed.
- the image quality correction information “unnecessary” indicates that there is no need to perform image quality correction processing.
- the composition information 231 corresponds to the paint image data 122, and includes the input image data "paint image data", the data format "X8R8G8B8", the overlay order "3", the output coordinates "(100, 100)", the deformation information "1.4 times", and the image quality correction "required".
- the paint image data 122 is subjected to enlargement processing with a magnification of 1.4 times and image quality correction.
- an image generated based on the image data after the enlargement process and the image quality correction is displayed in a predetermined range starting from the coordinates (100, 100) in the composite image after the composite process.
- when images are displayed in overlapping layers, it indicates that this image is displayed third from the back.
- the composite information 233 corresponds to the photographic image data 124.
- it includes the input image data "photographic image data", the data format "YUV422", the overlay order "3", the output coordinates "(400, 150)", the deformation information "horizontal inversion", and the image quality correction "unnecessary".
- the image generated from the image data after the horizontal inversion is displayed within a range starting from the coordinates (400, 150) in the composite image, and when multiple images are displayed in an overlapping manner, it is displayed first from the back, indicating that part of it may be hidden behind other images.
- data that is temporarily transferred to the deformation memory 246 and used for processing by the deformation engine 247 is collectively referred to as processing data.
- the RGB enlargement data 213 includes functions and parameters necessary for enlarging RGB image data.
- the RGB reduction data 214 includes functions and parameters necessary for performing reduction processing on RGB format image data.
- the YUV enlargement data 215 includes functions and parameters necessary for enlarging the YUV format image data.
- the YUV secondary enlargement data 216 consists of functions and parameters suitable for performing enlargement processing on those portions of the composite image data 227 (described later), generated by compositing various types of image data, that were in the YUV format before composition.
- in addition, processing data used for other processing, such as RGB inversion data for inversion processing of RGB format image data and data for rotation processing, is also stored.
- like the YUV secondary enlargement data 216, a large amount of processing data corresponding to the data format of the image data before composition (called secondary processing data) is stored, and these are used appropriately when processing the composite image data.
- the RGB correction data 217 includes a function and a parameter suitable for image quality correction of RGB image data.
- the YUV correction data 218 includes functions and parameters suitable for image quality correction of image data in the YUV format.
- the natural image correction data 219 includes functions and parameters suitable for image quality correction of RGB format image data constituting a natural image. More precisely, a portion of the composite image data whose data format before composition was the YUV format is predicted, in many cases, to display a natural image. Therefore, in the present embodiment, the natural image correction data 219 is used for the image quality correction of those portions of the R8G8B8 format composite image data 227 whose data format before composition was the YUV format.
- the paint image data 122, the character image data 123, the photographic image data 124, and the photographic image data 126 are the same as those in the first embodiment, description thereof will be omitted.
- the combined image data 227 is image data generated by combining the paint image data 122, the character image data 123, the photographic image data 124, and the photographic image data 126 in accordance with the combining instruction table 212.
- FIG. 13 shows a composite image 227g displayed based on the composite image data 227.
- the composite image 227g includes a paint image 122G obtained by enlarging the paint image 122g 1.4 times, a character image 123G obtained by enlarging the character image 123g 1.2 times, and a photographic image 126G obtained by reducing the photographic image 126g 0.8 times. The character image 123G, photographic image 124G, and photographic image 126G partially overlap: the photographic image 126G is displayed frontmost, the character image 123G second, and the photographic image 124G rearmost.
- the format flag group 228 is composed of 1-bit format flags, one corresponding to each pixel of the composite image 227g, each indicating whether the pixel data was in the RGB format or the YUV format before composition. The flags are arranged in order from the upper-left pixel to the lower-right pixel.
- the format flag corresponding to a pixel in a region where none of the paint image 122G, the character image 123G, the photographic image 124G, and the photographic image 126G is drawn is "0", indicating the RGB format.
- the processed image data 229 is generated by further applying processing such as enlargement or reduction to part or all of the composite image data 227 in the R8G8B8 format.
- the composition control unit 241 has a function of controlling the generation of composite image data by outputting a control signal (not shown) to each functional unit.
- the composition control unit 241 receives an image composition instruction from the main control unit 204.
- the composition buffer 244 is initialized. The mid-composition image data and format flag group immediately after initialization will be described later.
- the synthesis control unit 241 reads synthesis information from the synthesis instruction table 212 stored at a predetermined address in the general area 211 of the memory 201. At this time, reading is performed in order from the one with the smallest overlapping order.
- the composition control unit 241 reads processing data suitable for the deformation information included in the read composition information from the general area 211 of the memory 201 and writes it into the deformation memory 246.
- the magnification and rotation angle included as deformation information in the read composition information are also written to the deformation memory 246.
- the composition control unit 241 instructs the deformation engine 247 to pause.
- based on the data format included in the read composition information, the composition control unit 241 reads the optimal correction data from the memory and outputs the read correction data to the correction memory 248. When the image quality correction information is "unnecessary", the correction engine 249 is instructed to pause.
- the composition control unit 241 outputs the data format included in the read composition information to the format conversion unit 242. Further, the transmittance and output coordinates included in the read composition information are output to the composition engine 243. The image size (horizontal a × vertical b pixels) corresponding to the read composition information is detected, and the detected image size is output to the composition engine 243.
- when the deformation information includes a magnification, the composition control unit 241 multiplies each of the vertical and horizontal pixel counts by the magnification (a ← a × magnification, b ← b × magnification), and outputs the resulting horizontal a × vertical b pixels as the image size.
- the composition control unit 241 performs the above processing for all composition information in order from the smallest overlay order.
- the synthesis control unit 241 reads the synthesis information in the order of the synthesis information 233, 232, 231 and 234, and performs the above processing.
- the composition control unit 241 reads out the mid-composition image data stored in the composition buffer 244, and writes it to the image storage area 221 of the memory 201 as the composite image data 227. Also, the format flag group is read from the composition buffer 244, and the read format flag group is written to the image storage area 221 of the memory 201 in association with the composite image data 227.
- the composition control unit 241 informs the main control unit 204 that the image composition processing has been completed.
- the format conversion unit 242 receives, from the correction engine 249, image data that has been subjected to processing and image quality correction as necessary (hereinafter referred to as corrected image data).
- the data format of the corrected image data received here is the same as the data format received from the composition control unit 241.
- the format converter 242 converts the image data received from the correction engine 249 into the R8G8B8 format to generate image data for synthesis, and outputs the generated image data for synthesis together with the format flag to the composition engine 243.
- the format converter 242 deletes the first 8 bits of each pixel data to generate the R8G8B8 format image data for synthesis.
- when the data format received from the synthesis control unit 241 is YUV420, the image data is converted to RGB format using the following three formulas, the data length of each color is aligned to 8 bits, and R8G8B8 format image data for synthesis is generated.
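The three conversion formulas themselves are not reproduced in this excerpt; as a hedged sketch, a conventional BT.601 full-range YUV-to-RGB conversion (an assumption, with hypothetical names) would look like:

```python
def yuv_to_rgb(y, u, v):
    # Standard BT.601 full-range conversion; assumed here because the
    # excerpt's own "three formulas" are not reproduced.
    r = y + 1.402 * (v - 128)
    g = y - 0.344 * (u - 128) - 0.714 * (v - 128)
    b = y + 1.772 * (u - 128)
    # Clamp and align each color component to an 8-bit length.
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)
```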
- the composition buffer 244 stores image data in the middle of composition processing by the composition engine 243 (composition intermediate image data) and a format flag group.
- the composition intermediate image data is configured by arranging pixel data corresponding to each of the monitor pixels (n × m pixels) provided in the display unit 202 in order from the upper left to the lower right of the screen.
- the format flag group is n × m-bit long data that lists, for each pixel data constituting the composition intermediate image data, the corresponding format flag.
- the synthesis buffer 244 stores predetermined background image data as synthesis image data. At this time, all the format flags constituting the format flag group are initialized to “0”.
- the composition engine 243 receives the transmittance, output coordinates, and image size from the composition control unit 241.
- composition engine 243 receives the image data for composition in the R8G8B8 format and the format flag P from the format conversion unit 242.
- upon receiving these, the composition engine 243 reads the composition intermediate image data from the composition buffer 244.
- the range defined by the image size starting from the received output position on the synthesized image 227g is called the synthesis target range.
- the composition engine 243 composites the received image data for synthesis into the portion corresponding to the synthesis target range. Specifically, for each pixel included in the synthesis target range, each color component (denoted D(b)) of the pixel data in the read composition intermediate image data and each color component (denoted D(f)) of the corresponding pixel data in the received image data are combined by the following calculation to calculate each color component (denoted D(com)) of the new pixel data, and the composition intermediate image data is updated by replacing each color component D(b) with the calculated color component D(com) of the new pixel data.
- the composition engine 243 reads out the format flag group, and rewrites all the portions corresponding to each pixel in the composition target range in the format flag group to the received format flag P.
- the composition engine 243 writes back the composition image data and the format flag group updated in the above procedure to the composition buffer 244.
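The blend-and-flag-update procedure above can be sketched as follows. The exact blend calculation is not reproduced in this excerpt, so a conventional alpha blend weighting the foreground by `alpha` is assumed; all identifiers are hypothetical:

```python
def blend_into_buffer(buf, flags, src, flag_p, out_x, out_y, w, h, alpha, buf_w):
    """Composite src (w*h R8G8B8 pixels, row-major) into the row-major
    composition buffer at (out_x, out_y), and rewrite the matching
    format flags to flag_p.  D(com) = alpha*D(f) + (1-alpha)*D(b) is
    an assumed blend; the excerpt names the operands but not the formula."""
    for row in range(h):
        for col in range(w):
            dst_i = (out_y + row) * buf_w + (out_x + col)
            src_i = row * w + col
            buf[dst_i] = tuple(
                int(round(alpha * f + (1 - alpha) * b))
                for f, b in zip(src[src_i], buf[dst_i]))
            flags[dst_i] = flag_p   # format flag group tracks the new pixel
    return buf, flags
```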
- the main control unit 204 receives a user operation via the input unit 203, and controls the operation of each unit constituting the image composition device 200 according to the received operation.
- the main control unit 204 receives a request for image composition via the input unit 203. Further, the composite image 227g is displayed on the display unit 202, and designation of a specific position in the composite image is accepted.
- upon receiving a request for image composition by a user operation, the main control unit 204 displays a predetermined input screen and accepts selection of the image data to be composed and input of the display position of each selected image, the overlay order, the necessity of processing and image quality correction, and the transmittance. Next, based on the input information, the main control unit 204 generates a composition instruction table 212, writes it in a predetermined area of the general area 211 of the memory 201, and instructs the composition control unit 241 to perform image composition.
- the main control unit 204 receives an end notification indicating that the image composition processing has been completed and information (for example, an identifier, an address, etc.) indicating the generated composite image data.
- the composite image data 227 is read from the image storage area 221 of the memory 201, a composite image is generated from the read composite image data 227, and the generated composite image 227g is displayed on the display unit 202.
- a menu is displayed in the vicinity of the specified position.
- a separate menu screen may be generated and displayed.
- the menu lists the names of processing that can be performed on the composite image (for example, enlargement, reduction, inversion, etc.). Next, a menu selection by user operation is accepted.
- a range to be processed (hereinafter referred to as a secondary usage range) is determined in the composite image 227g according to the accepted menu selection. For example, if “enlarge” is selected, the secondary usage range is 100 × 100 pixels centered on the selected position; if “reduce” is selected, it is 300 × 300 pixels centered on the selected position; and if “rotate” is selected, the entire screen is set as the secondary usage range.
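A minimal sketch of the menu-to-range mapping described above (the (center_x, center_y, width, height) return convention and the function name are hypothetical):

```python
def secondary_usage_range(menu, pos, screen_w, screen_h):
    """Map a menu choice to the secondary usage range, per the sizes
    given in the text: 100x100 for enlarge, 300x300 for reduce,
    the whole screen for rotate."""
    x, y = pos
    if menu == "enlarge":
        return (x, y, 100, 100)
    if menu == "reduce":
        return (x, y, 300, 300)
    if menu == "rotate":
        return (screen_w // 2, screen_h // 2, screen_w, screen_h)
    raise ValueError(f"unsupported menu item: {menu}")
```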
- a secondary use range may be determined by displaying a screen that prompts the user to select a range to be processed.
- the main control unit 204 sequentially reads out the portion corresponding to the secondary usage range of the format flag group 228 bit by bit. Processing data corresponding to the read format flag and the selected processing type is read from the general area 211 of the memory 201 and output to the deformation memory 246. Further, according to the read format flag and the selected processing, the correction data is read from the general area 211 of the memory 201 as necessary, and the read correction data is output to the correction memory 248.
- the main control unit 204 reads the YUV secondary enlargement data 216 from the general area 211 of the memory 201.
- the read YUV secondary enlargement data 216 is output to the deformation memory 246.
- the natural image correction data is read and output to the correction memory 248.
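The flag-driven selection of processing and correction data can be sketched as follows; the RGB-side names are hypothetical, since this excerpt only names the YUV secondary enlargement data 216 and the natural image correction data:

```python
def select_processing_data(format_flag, processing):
    """Pick the processing data and correction data by format flag
    (assumed: 0 = RGB-family pixel, 1 = YUV-family pixel) and the
    selected processing type.  The RGB-side entries are illustrative."""
    tables = {
        (1, "enlarge"): "YUV secondary enlargement data 216",
        (0, "enlarge"): "RGB secondary enlargement data",   # hypothetical
    }
    correction = ("natural image correction data" if format_flag == 1
                  else "graphics correction data")          # hypothetical
    return tables[(format_flag, processing)], correction
```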
- the main control unit 204 repeats the same processing for all the pixels included in the secondary usage range and controls the deformation engine 247 and the correction engine 249, thereby generating the processed image data.
- the generated processed image data 229 is written in the image storage area of the memory 201.
- the processed image data 229 is read to generate a processed image, which is displayed on the display unit 202.
- FIGS. 14 to 16 are flowcharts showing operations in the image composition processing by the image composition device 200.
- hereinafter, the operation of the image composition apparatus 200 related to image composition will be described with reference to FIGS. 14 to 16.
- upon receiving a request for image composition by a user operation, the main control unit 204 displays a predetermined input screen and accepts selection of the image data to be composed and input of the display position of each selected image, the overlay order, the necessity of processing and image quality correction, and the transmittance (step S201). Next, based on the input information, the main control unit 204 generates a composition instruction table 212, writes it in a predetermined area of the general area 211 of the memory 201, and instructs the composition control unit 241 to perform image composition (step S202).
- upon receiving an image composition instruction from the main control unit 204, the composition control unit 241 reads, from the composition instruction table 212 stored in a predetermined area in the general area 211 of the memory 201, the piece of composition information with the smallest overlay order among those not yet read (step S203). At this time, if all the composition information has already been read (YES in step S206), the process proceeds to step S207.
- the synthesis control unit 241 reads the optimum processing data and correction data from the memory 201 based on the data format, deformation information, and image quality correction information included in the read synthesis information (step S211).
- the read processing data is written to the deformation memory 246, and the read correction data is written to the correction memory 248 (step S212).
- the composition control unit 241 reads the image data corresponding to the composition information from the image storage area 221 of the memory 201, and outputs it to the deformation engine 247 (step S213). Next, the composition control unit 241 notifies the format conversion unit 242 of the data format included in the read composition information (step S216).
- the deformation engine 247 receives the image data and uses the functions and parameters stored in the deformation memory 246 to perform processing (enlargement, reduction, rotation, etc.) on the received image data to generate deformed image data. To do. Subsequently, the deformation engine 247 outputs the generated deformed image data to the correction engine 249 (step S217).
- the correction engine 249 receives the deformed image data from the deformation engine 247, performs the image quality correction processing on the received deformed image data using the functions and parameters held in the correction memory 248, and the corrected image data. Is generated (step S218). Next, the correction engine 249 outputs the generated corrected image data to the format conversion unit 242 (step S219).
- the format conversion unit 242 receives the corrected image data from the correction engine 249, and receives the data format from the synthesis control unit 241.
- the data format of the corrected image data received from the correction engine 249 matches the data format received from the composition control unit 241.
- the format conversion unit 242 converts the data format of the received image data into the R8G8B8 format based on the data format received from the synthesis control unit 241, and generates image data for synthesis (step S221). .
- if the received data format is an RGB format, the format conversion unit 242 generates a 1-bit format flag “0” (step S223); if the received data format is a YUV format, the format conversion unit 242 generates the format flag “1” (step S224).
- the format conversion unit 242 outputs the generated image data for synthesis and the 1-bit format flag P to the synthesis engine 243 (step S226).
- the composition control unit 241 outputs the transmittance and output coordinates included in the read composition information, and the image size corresponding to the read composition information to the composition engine 243 (step S231).
- the composition engine 243 receives the transmittance, output coordinates, and image size from the composition control unit 241. Further, it receives the image data for synthesis and the format flag P from the format conversion unit 242.
- upon receiving these, the compositing engine 243 reads out the composition intermediate image data and the format flag group from the compositing buffer 244 (step S232).
- each pixel data in the synthesis target range defined by the image size starting from the output coordinates and the corresponding pixel data of the synthesis target data are combined by the above-described calculation, and the composition intermediate image data is updated with the combined pixel data (step S234).
- the format flag group is updated by rewriting all the portions corresponding to the composition target range in the read format flag group to the received format flag P (step S236).
- the updated composition intermediate image data and format flag group are overwritten in the composition buffer 244 (step S237).
- after overwriting, the image composition device 200 returns to step S203 and proceeds to the processing for the next piece of composition information.
- the synthesis control unit 241 reads the synthesis intermediate image data held in the synthesis buffer 244, and writes the read synthesis intermediate image data as synthesized image data to the image storage area 221 of the memory 201.
- the format flag group is read from the composition buffer 244 and similarly written to the image storage area of the memory 201 (step S207).
- the main control unit 204 generates a composite image from the generated composite image data, and displays the generated composite image on the display unit 202 (step S208).
- FIGS. 17 and 18 are flowcharts showing the operation of the image composition device 200 in the processing (secondary use) of the composite image data 227.
- the main control unit 204 generates a composite image from the composite image data 227 and displays it on the display unit 202 (step S 241).
- the main control unit 204 receives selection of a position (pixel) by a user operation via the input unit 203 (step S242).
- the main control unit 204 displays a menu on the display unit 202 (step S243).
- the menu includes items such as “enlarge”, “reduce”, and “rotate”; here, the case where “enlarge” is selected will be described (step S244).
- the secondary usage range is 100 × 100 pixels centered on the specified position in the composite image.
- in steps S246 to S262, the processed image data is generated by repeating the processing of steps S247 to S261 for the pixel data corresponding to each pixel included in the secondary usage range.
- the main control unit 204 reads the format flag P corresponding to one pixel in the secondary usage range from the format flag group 228 (step S247).
- the main control unit 204 reads the YUV secondary enlargement data 216 from the general area 211 of the memory 201 and writes it to the deformation memory 246 (step S251). Next, the natural image correction data 219 is read and output to the correction memory 248 (step S252).
- the main control unit 204 designates the address where the pixel data corresponding to the read format flag is stored, and instructs the transformation engine 247 to perform enlargement processing (step S 257).
- the deformation engine 247 reads pixel data from a specified address in accordance with an instruction from the main control unit 204, and performs an enlargement process on the read pixel data using the functions and parameters stored in the deformation memory 246. Pixel data after enlargement processing ( Group) is output to the correction engine 249 (step S258).
- the correction engine 249 receives pixel data (group) from the deformation engine 247, and performs image quality correction processing on the received pixel data (group) using the functions and parameters stored in the correction memory 248. (Step S259).
- the correction engine 249 sequentially writes the pixel data group after the correction process to the image storage area 221 of the memory 201 (step S261).
- the main control unit 204 reads the processed (enlarged) image data 229 composed of the pixel data (groups) written by the correction engine 249, generates the processed (enlarged) image, and displays it on the display unit 202 (step S263).
- the image composition device 200 performs processing such as enlargement on the image data before composition while maintaining the original data format, and then performs format conversion and composition processing. To generate composite image data.
- a format flag group is generated separately from each pixel data constituting the composite image data. For this reason, there is no blank bit in the composite image data, so the total data amount can be reduced.
- in the above description, the data format of the composite image data is the R8G8B8 format; however, the data length of the pixel data may be adjusted so that each pixel data is 16 bits (for example, R5G6B5) or an even multiple of 16 bits, which may make address management easier.
- each pixel data constituting the composite image data includes an attribute flag indicating whether or not processing such as enlargement or image quality correction processing has been performed before the composition.
- consider the case where the composite image data is generated after processing and image quality adjustment are performed before composition. If image data that has undergone processing or image quality correction is subjected to processing or image quality correction again using functions and parameters based on the same algorithm, an unnatural image may result. For example, if image quality correction is performed before synthesis, and image quality correction based on the same algorithm is applied again to a portion where the contrast is already clear, the contrast ratio may become larger than necessary, resulting in an unnatural image. Also, if enlargement processing based on the same algorithm is further performed on a portion that was enlarged before composition, the contour lines may become unclear or the image quality may become rough.
- an attribute flag indicating the presence or absence of processing and correction before composition is added to each pixel data constituting the composite image data.
- when processing or correction is further performed on a portion of the composite image data that was already processed or corrected before combining (hereinafter referred to as multiple processing or multiple correction), multiple processing data including optimal functions and parameters different from those used the first time is used.
- assume that the image composition apparatus according to this embodiment synthesizes one or more pieces of image data selected by the user after enlarging them at a magnification of 1 or more and performing image quality correction processing as necessary.
- FIG. 19 is a block diagram showing a functional configuration of the image composition apparatus 300 according to the present embodiment.
- the arrows in the figure indicate the data flow related to the generation of the composite image data 314, and the bold arrows indicate the data flow related to the generation of the processed image data.
- the image composition device 300 includes a memory 301, a display unit 302, an input unit 303, a main control unit 304, and an image processing unit 305.
- the image processing unit 305 is composed of a composition control unit 341, a deformation memory 346, a deformation engine 347, a correction memory 348, a correction engine 349, a format conversion unit 342, a synthesis engine 343, a synthesis buffer 344, and an attribute flag addition unit 351.
- the image composition apparatus 300 is a computer system including a microprocessor, RAM, and ROM; the RAM and ROM store computer programs, and the image composition apparatus 300 achieves part of its functions through the microprocessor operating according to the computer programs.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- each configuration will be described below. Note that the configuration and operation of the input unit 303, the display unit 302, and the format conversion unit 342 are the same as those in the second embodiment, and a description thereof will be omitted. Also, as in the second embodiment, the deformation engine 347 and the correction engine 349 process the input pixel data using the functions and parameters stored in the deformation memory 346 and the correction memory 348, respectively; therefore, detailed description is omitted.
- the memory 301 includes a hard disk unit, and includes a general area 311 and an image storage area 321.
- the general area 311 stores a composition instruction table 312, RGB enlargement data 213, YUV enlargement data 215, RGB correction data 217, YUV correction data 218, multiple enlargement data 313, and multiple correction data 315. Where necessary, the RGB enlargement data 213, YUV enlargement data 215, RGB correction data 217, and YUV correction data 218 are called the normal processing data group, and the multiple enlargement data 313 and multiple correction data 315 are called the multiple processing data group.
- the general area 311 stores various programs, parameters, image data, and the like used by the main control unit 304 in addition to these.
- the image storage area 321 includes image data stored in accordance with user operations such as paint image data 122 and photographic image data 124, composite image data 314 generated by the image composition device, and processed image data 316. Is remembered.
- the composition instruction table 312 has the same configuration as the composition instruction table 212 described in Embodiment 2 and is composed of a plurality of pieces of composition information. However, here the deformation information is only “unnecessary” or “n times” (n ≥ 1).
- the other detailed configuration is the same as that of the synthesis instruction table 212 of the second embodiment, and thus the description thereof is omitted.
- the data for multiple enlargement 313 includes functions and parameters in accordance with an algorithm suitable for further enlarging the portion of the composite image data 314 that has been subjected to enlargement processing before composition.
- the multiple correction data 315 includes a function and parameters according to an algorithm for further correcting the image quality of the portion of the composite image data 314 that has been subjected to the image quality correction processing before the synthesis.
- Image quality correction processing roughly includes noise removal, edge enhancement, and distortion correction processing.
- suppose that image quality correction processing is performed multiple times according to the same algorithm; in that case, the data added for edge enhancement may be mistaken for noise in the second and subsequent image quality correction processing.
- the contour enhancement process a portion (contour) in which the density of each color changes rapidly is detected, and the high-frequency (high concentration) component of the detected portion is further enhanced.
- in the contour enhancement process using the second derivative, the second derivative of the input image data is used (this is described in detail in Non-Patent Document 2).
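A one-dimensional sketch of the second-derivative (Laplacian) contour enhancement referenced above; the gain `k` and the function name are illustrative, not from the disclosure:

```python
def enhance_contour_1d(signal, k=1.0):
    """Sharpen a 1-D signal: output = input - k * second_derivative.
    At an edge this adds overshoot/undershoot, enhancing the contour;
    repeating it can treat that added component as a new contour."""
    out = list(signal)
    for i in range(1, len(signal) - 1):
        second = signal[i - 1] - 2 * signal[i] + signal[i + 1]
        out[i] = signal[i] - k * second
    return out
```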
- the addition component of the second derivative described above may be mistaken for a new contour.
- the RGB correction data 217 and the YUV correction data 218 each include functions and parameters relating to noise removal, edge enhancement, and distortion correction, whereas the multiple correction data 315 includes only functions and parameters relating to distortion correction.
- the edge enhancement coefficients included in the RGB correction data 217, the YUV correction data 218, and the multiple correction data 315 may be set to be small.
- the composite image data 314 is image data in which the paint image data 122, the photographic image data 124, etc. are enlarged as necessary and subjected to image quality correction processing, and the data format is P2X6R8G8B8 format.
- the first bit of each pixel data indicates whether or not the pixel data was enlarged during the synthesis process; “1” indicates that enlargement was performed, and “0” indicates that it was not.
- the second bit of each pixel data indicates whether or not the pixel data was subjected to image quality correction processing in the synthesis process; “1” indicates that image quality correction was performed, and “0” indicates that it was not.
- the first two bits of each pixel data are called attribute bits and are represented by the variable P.
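Under the assumption that the 2-bit attribute field P occupies the two most significant bits of a 32-bit P2X6R8G8B8 pixel (a layout consistent with the format name but not spelled out bit-by-bit in this excerpt), packing and recovering the attribute bits can be sketched as:

```python
def pack_p2x6r8g8b8(enlarged, corrected, r, g, b):
    """Pack one pixel: 2 attribute bits (enlarged, corrected),
    6 unused bits, then 8 bits each of R, G, B (assumed layout)."""
    p = (enlarged << 1) | corrected      # 2-bit attribute field P
    return (p << 30) | (r << 16) | (g << 8) | b

def attribute_bits(pixel):
    """Recover the 2-bit attribute field P from a packed pixel."""
    return (pixel >> 30) & 0b11
```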
- the processed image data 316 is image data generated by further processing a part or all of the composite image data 314.
- although not shown, the synthesis control unit 341 has a function of outputting a control signal to each functional unit and controlling its operation in the composite image data generation process.
- the composition control unit 341 receives an image composition instruction from the main control unit 304.
- the synthesis buffer 344 is initialized. Specifically, background image data (blue background, user-specified background image data) is written.
- the background image data written here is in the P2X6R8G8B8 format, and the attribute bit of each pixel data is “00”.
- composition control unit 341 reads the composition information included in the composition instruction table stored in the predetermined area of the memory 301 in order from the smallest overlapping order.
- the composite image data 314 is generated by repeating the following processing for all the composite information.
- if the deformation information included in the read composition information is “unnecessary”, the deformation engine 347 is instructed to pause; if it is not “unnecessary”, either the RGB enlargement data 213 or the YUV enlargement data 215 is read based on the data format included in the read composition information and written to the deformation memory 346. At this time, the magnification n is also written.
- if the image quality correction information is “unnecessary”, the correction engine 349 is instructed to pause; if it is not “unnecessary”, either the RGB correction data 217 or the YUV correction data 218 is read based on the data format included in the read composition information and written to the correction memory 348.
- composition control unit 341 outputs the data format included in the read composition information to the format conversion unit 342.
- the image size is calculated from the deformation information included in the read composition information, and the calculated image size, output coordinates, and transmittance are output to the composition engine 343.
- the composition control unit 341 generates a 2-bit attribute flag P based on the deformation information and the image quality correction information included in the composition information. Specifically, if the deformation information is “unnecessary”, the first bit “0” is generated; otherwise, the first bit “1” is generated. If the image quality correction information is “unnecessary”, the second bit “0” is generated; otherwise, the second bit “1” is generated. Next, the generated 2-bit attribute flag is output to the attribute flag addition unit 351.
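The attribute-flag derivation described above can be sketched as follows (the helper name is hypothetical):

```python
def make_attribute_flag(deformation_info, image_quality_info):
    """Derive the 2-bit attribute flag P from the composition information:
    first bit = 0 only when deformation is "unnecessary", second bit = 0
    only when image quality correction is "unnecessary"."""
    first = 0 if deformation_info == "unnecessary" else 1
    second = 0 if image_quality_info == "unnecessary" else 1
    return (first, second)
```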
- composition control unit 341 outputs the image data corresponding to the read composition information to the deformation engine 347.
- the composition buffer 344 includes a RAM, and stores composition image data in the middle of image composition processing by the composition engine 343 and the attribute flag addition unit 351.
- the format conversion unit 342 receives the data format from the synthesis control unit 341. Further, the format conversion unit 342 receives the corrected image data from the correction engine 349. The data format of the corrected image data received here matches the data format received from the composition control unit 341.
- the format conversion unit 342 converts the data format of the received corrected image data into the X8R8G8B8 format to generate converted image data. Details of the conversion method are the same as in Embodiment 1, so the description is omitted.
- the format conversion unit 342 outputs the generated converted image data to the composition engine 343.
- the composition engine 343 receives output coordinates, transmittance, and image size from the composition control unit 341.
- composition engine 343 receives the converted image data from the format conversion unit 342.
- the image size of the converted image generated based on the converted image data received here matches the image size received from the composition control unit 341.
- upon receiving these, the compositing engine 343 reads the composition intermediate image data from the compositing buffer 344 and combines the range indicated by the image size, starting from the received output coordinates, of the read data with the received converted image data. Since the composition operation is as described in Embodiment 1, it is omitted here. At this time, the format of each pixel data in the composition target range is the X8R8G8B8 format.
- composition engine 343 outputs the combined image data to the attribute flag adding unit 351.
- the attribute flag addition unit 351 receives the 2-bit attribute flag P from the synthesis control unit 341 and temporarily stores it.
- the attribute flag adding unit 351 receives the image data after the composition process from the composition engine 343.
- the attribute flag adding unit 351 rewrites the first two bits of each pixel data included in the compositing target range of the received image data with the stored attribute flag P.
- the in-combination image data held in the compositing buffer 344 is updated with the image data after rewriting.
- the main control unit 304 has a function of outputting a control signal to each unit constituting the image composition device 300 and controlling the operation of each unit.
- the main control unit 304 accepts a user operation via the input unit 303, reads image data recorded in an external device or a recording medium, and writes it in the image storage area of the memory 301.
- the main control unit 304 receives a composite image creation request via the input unit 303.
- a composition instruction table 312 is generated in accordance with a user operation, and image composition is instructed to the composition control unit 341. Since control of composite image generation is the same as in Embodiment 1, description thereof is omitted here.
- the main control unit 304 accepts designation of a position by the user in the input unit 303 in a state where a composite image generated based on the composite image data 314 is displayed on the display unit.
- when a position designation is accepted, an input screen for the magnification m (m ≥ 1) is displayed on the display unit 302, and the magnification input is accepted.
- the main control unit 304 enlarges an image in a range of, for example, 100 pixels vertically and 100 pixels horizontally (referred to as an enlargement target range) to the input magnification with the designated position as the center. 31 6 is generated.
- the main control unit 304 repeats the following processing for each pixel data included in the enlargement target range.
- the main control unit 304 reads the multiple enlargement data 313 from the memory 301 and writes the read multiple enlargement data 313 into the deformation memory 346.
- the main control unit 304 reads the RGB enlargement data 213 and writes the read RGB enlargement data 213 into the deformation memory 346.
- the main control unit 304 reads the multiple correction data 315 and writes the read multiple correction data 315 to the correction memory 348.
- the main control unit 304 reads the RGB correction data 217, The read RGB correction data 217 is written into the correction memory 348.
- the main control unit 304 reads the processed image data 316, generates a processed image based on the read processed image data 316, and displays the generated processed image on the display unit 302.
- when image composition is instructed by a user operation, the image composition apparatus 300 starts the image composition process.
- FIGS. 20 to 22 are flowcharts showing the operation of the image composition device 300 in the image composition processing.
- the operation of the image composition device 300 will be described with reference to FIGS.
- the main control unit 304 displays a predetermined selection screen, input screen, and the like, and accepts input of the selection of images to be synthesized, the display position, the overlay order, the necessity of processing (enlargement), and the necessity of image quality correction (step S301).
- a synthesis instruction table 312 is generated based on the received data, and the generated synthesis instruction table 312 is written in a predetermined area of the memory 301 (step S302).
- the main control unit 304 instructs the synthesis control unit 341 to synthesize image data (step S303).
- the composition control unit 341 receives an image composition instruction from the main control unit 304.
- the combination instruction table 312 stored in the predetermined area of the memory 301 is searched, and the combination information constituting the combination instruction table 312 is read in ascending order of overlay order (step S306).
- if all the combination information has already been read (YES in step S307), the composition control unit 341 moves the process to step S308.
- the composition control unit 341 reads the optimum processing data from the memory 301 based on the deformation information, image quality correction information, and data format included in the read composition information, writes the read processing data and the magnification into the deformation memory 346, and reads the optimum correction data from the memory 301 and writes it into the correction memory 348 (step S311). However, if the deformation information included in the read composition information is "unnecessary", the deformation engine 347 is instructed to pause. If the image quality correction information is "unnecessary", the correction engine 349 is instructed to pause.
- the composition control unit 341 outputs the data format included in the read composition information to the format conversion unit 342 (step S312).
- the composition control unit 341 reads image data corresponding to the read composition information and outputs the image data to the deformation engine 347 (step S313).
- the deformation engine 347 processes (enlarges) the received image data using the parameters and functions stored in the deformation memory 346, generates deformed image data, and outputs the generated deformed image data to the correction engine 349 (step S314). However, when the composition control unit 341 instructs it to pause, the received image data is output to the correction engine 349 as it is.
- the correction engine 349 receives the deformed image data from the deformation engine 347.
- the correction engine 349 performs image quality correction processing on the received deformed image data using the functions and parameters stored in the correction memory 348 to generate corrected image data (step S316).
- the correction engine 349 outputs the generated corrected image data to the format conversion unit 342 (step S317).
- when instructed to pause, the correction engine 349 outputs the received deformed image data to the format conversion unit 342 as it is.
- upon receiving the corrected image data, the format conversion unit 342 refers to the data format notified from the composition control unit 341, converts the data format of the received corrected image data into the X8R8G8B8 format, and generates converted image data. Subsequently, the generated converted image data is output to the composition engine 343 (step S318). Next, the composition control unit 341 outputs the output position, transmittance, and image size included in the read composition information to the composition engine 343 (step S321).
- upon receiving the converted image data, the composition engine 343 reads the in-composition image data from the composition buffer 344 (step S322).
- the composition engine 343 combines the pixel data of the converted image data with the corresponding pixel data in the composition target range of the read in-composition image data (step S323).
- the composition control unit 341 generates a 2-bit attribute flag P based on the deformation information and image quality correction information included in the read composition information, and outputs the generated attribute flag P to the attribute flag addition unit 351 (step S326).
- the attribute flag addition unit 351 receives the attribute flag P from the composition control unit 341 and temporarily stores it. Next, in the image data received from the composition engine 343, the attribute flag addition unit 351 rewrites the first two of the blank bits of each piece of pixel data included in the composition target range with the received attribute flag, thereby updating the in-composition image data (step S327).
- the attribute flag addition unit 351 writes the updated in-composition image data back into the composition buffer 344 (step S328).
- the image composition device 300 returns to step S306 and repeats the same processing for all the composition information.
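The embedding of the 2-bit attribute flag P into the first two blank bits of each pixel (steps S326 to S328 above) can be sketched as follows. This is a minimal illustration assuming the blank X8 byte occupies the most significant bits of a 32-bit X8R8G8B8 word; the helper names are not from the patent.

```python
def embed_attribute_flag(pixel, flag):
    """Embed a 2-bit attribute flag P into the top two (blank) bits of an
    X8R8G8B8 pixel, leaving the 24-bit RGB value untouched."""
    assert 0 <= flag <= 0b11
    return (pixel & 0x3FFFFFFF) | (flag << 30)

def extract_attribute_flag(pixel):
    """Recover the embedded flag for secondary use of the composite image."""
    return (pixel >> 30) & 0b11
```

On secondary use (the enlargement flow in FIGS. 23 and 24), `extract_attribute_flag` would tell the main control unit which processing and correction data to select for each pixel.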
- if all the composition information has been read (YES in step S307), the composition control unit 341 reads the in-composition image data stored in the composition buffer 344, and writes the read image data into the memory 301 as composite image data (step S308). Next, it notifies the main control unit 304 that the generation of the composite image has been completed.
- the main control unit 304 reads the composite image data 314 from the memory 301, generates a composite image based on the read composite image data 314, and displays the composite image on the display unit 302 (step S309).
- FIGS. 23 and 24 are flowcharts showing the operation of the image composition apparatus 300 in the processing (enlargement) of the composite image data.
- the main control unit 304 generates a composite image based on the composite image data 314 according to the user's operation, and displays the generated composite image on the display unit 302 (step S341). Subsequently, designation of a position (pixel) by the user's operation is accepted via the input unit 303 (step S342). Next, the main control unit 304 accepts input of a magnification by the user (step S343).
- in steps S346 to S358, the main control unit 304 generates processed (enlarged) image data by repeating the processing of steps S347 to S357 for each piece of pixel data included in the secondary usage range of the composite image data.
- the main control unit 304 reads the multiple enlargement data 313 from the memory 301 and writes the deformation memory 346 (step S348).
- the main control unit 304 reads the RGB enlargement data 213 from the memory 301 and writes it into the deformation memory 346 (step S349).
- the main control unit 304 writes the received magnification into the deformation memory 346 (step S351).
- the main control unit 304 reads the multiple correction data 315 from the memory 301 and writes it into the correction memory 348 (step S353).
- the main control unit 304 reads the RGB correction data 217 from the memory 301 and writes it into the correction memory 348 (step S354).
- the main control unit 304 outputs pixel data to the deformation engine 347, and the deformation engine 347 enlarges the lower 24 bits (that is, R8G8B8) of the received pixel data using the magnification, functions, and parameters stored in the deformation memory 346, and outputs the result to the correction engine 349 (step S356).
- the correction engine 349 performs image quality correction processing on the pixel data received from the deformation engine 347 using the functions and parameters stored in the correction memory 348, and writes the result in the image storage area 321 of the memory 301 (step S357).
- when the repetition has been completed for all the pixel data included in the secondary usage range (step S358), the main control unit 304 reads, from the memory 301, the processed image data 316 composed of the pixel data group written by the correction engine 349 (step S359). Next, a processed image is generated from the read processed image data and displayed on the display unit 302 (step S361).
- as described above, the image compositing apparatus 300 generates, before compositing, the attribute flag P indicating the presence or absence of enlargement processing and of image quality correction, and embeds the generated attribute flag in each piece of pixel data constituting the composite image data.
- when the composite image data is used secondarily, the attribute flag embedded in each piece of pixel data of the secondary usage range is referred to, and the data to be used is selected depending on whether the image was enlarged and image-quality-corrected during the synthesis process. By doing so, it is possible to reduce the unnaturalness of the image caused by repeating the same process.
- in addition, the data format of the image data to be combined is converted into the RGB format, and a flag indicating the format of the image before conversion is also generated.
- the deformation engine that executes processing such as enlargement or reduction selects and uses the optimum processing data.
- the correction engine that performs image quality correction also selects and uses the optimum correction data based on the format flag P.
- the generation of the composite image data can be speeded up.
- FIG. 25 is a functional block diagram showing a functional configuration of the image composition device 400.
- the arrows in the figure indicate the data flow in the image composition process.
- the image composition device 400 includes a memory 401, a display unit 402, an input unit 403, a main control unit 404, and an image processing unit 405.
- the image processing unit 405 is composed of a composition control unit 441.
- the image composition device 400 is a computer system that includes a microprocessor, RAM, and ROM, and the RAM and ROM store a computer program. The microprocessor operates according to the computer program, whereby the image composition device 400 achieves a part of its functions.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- the configuration of the memory 401 is the same as that in Embodiment 1, and the composition instruction table 212, the paint image 122, the photograph image 124, and the like stored in the memory 401 are as described in Embodiment 2 and Embodiment 1, respectively, so their description is omitted.
- the composite image data 414 is generated by a different process, but its data structure is the same as that of the composite image data 127 described in Embodiment 1, and a description thereof is therefore omitted.
- the processing data group 453 and the correction data group 454 will be described later.
- the composition control unit 441 has a function of outputting a control signal (not shown) to each functional unit to control its operation in the process of generating the composite image data.
- the composition control unit 441 is instructed by the main control unit 404 to perform image composition.
- the composition control unit 441 reads the composition information constituting the composition instruction table 212 in ascending order of overlay order, and repeats the following processing for all the composition information.
- the composition control unit 441 outputs the data format included in the read composition information to the format conversion unit 442.
- the deformation information and the image quality correction information are output to the deformation engine 447 and the correction engine 449, respectively.
- the output position, transmittance, and image size are output to the composition engine 443.
- composition control unit 441 outputs image data corresponding to the read composition information to the format conversion unit 442.
- the compositing control unit 441 monitors the operation of the compositing engine 443.
- when the repetition has been completed, the in-composition image data stored in the composition buffer 444 is read, and the read in-composition image data is written into the memory 401 as composite image data 414.
- the format conversion unit 442 receives the data format and the image data from the composition control unit 441.
- a 1-bit format flag P is generated according to the received data format.
- the format conversion unit 442 converts the data format of the received image data into the R8G8B8 format and generates converted image data. Since the detailed procedure of conversion is described in Embodiment 1, the description is omitted here.
- the generated format flag P and converted image data are output to the transformation engine 447.
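The detailed conversion procedure into R8G8B8 is deferred to Embodiment 1. Purely as an illustration, a YUV-to-R8G8B8 conversion using the common BT.601 relation might look like the following sketch; the coefficients and clamping are an assumption, not necessarily the exact procedure used in the patent.

```python
def yuv_to_rgb888(y, u, v):
    """Convert one YUV sample (8-bit, video range) to an R8G8B8 triple
    using the widely used BT.601 conversion."""
    c, d, e = y - 16, u - 128, v - 128
    clamp = lambda x: max(0, min(255, int(round(x))))
    r = clamp(1.164 * c + 1.596 * e)
    g = clamp(1.164 * c - 0.392 * d - 0.813 * e)
    b = clamp(1.164 * c + 2.017 * d)
    return r, g, b
```

The format conversion unit 442 would apply such a per-sample conversion only when the notified data format indicates YUV; RGB input is passed through unchanged.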
- the deformation memory 446 is composed of a RAM, and stores a processing data group 451, which is the same as the processing data group 453 stored in the memory 401.
- the processing data group 451 includes a plurality of pieces of data used for processing such as enlargement, reduction, rotation, and inversion. Specifically, it includes the enlargement data A 113, reduction data A 114, inversion data A 115, and rotation data A 116 for graphics described in Embodiment 1, and enlargement data B, reduction data B, inversion data B, and rotation data B for natural images.
- the deformation engine 447 receives deformation information from the composition control unit 441 and receives a format flag P and converted image data from the format conversion unit 442.
- upon receiving these, the deformation engine 447 reads the optimum processing data from the processing data group stored in the deformation memory 446 based on the received format flag P and deformation information, and processes the received converted image data using the read processing data to generate deformed image data.
- for example, the reduction data B for natural images is read, and using the read data and the scale factor "0.8", the received converted image data is reduced to generate deformed image data.
- the deformation engine 447 outputs the generated deformed image data and the received format flag P to the correction engine 449.
- if the deformation information received from the composition control unit 441 is "unnecessary", the received converted image data is output as it is as deformed image data.
- the correction memory 448 is composed of a RAM, and stores a correction data group 452, which is the same as the correction data group 454 stored in the memory 401.
- the correction data group 452 includes a plurality of data used for image quality correction. Specifically, it includes graphic correction data A117 and natural image correction data B118 described in the first embodiment.
- the correction engine 449 receives image quality correction information from the composition control unit 441. Also, the deformed image data and the format flag P are received from the deformation engine 447.
- the correction engine 449 selects the optimal correction data from the correction data group 452 based on the image quality correction information and the format flag P, and reads the selected correction data.
- the correction engine 449 performs image quality correction processing on the received deformed image data using the read correction data, and generates corrected image data.
- for example, the correction data B for natural images is read, and image quality correction processing is performed on the deformed image data using the read correction data B to generate corrected image data.
- the correction engine 449 outputs the generated corrected image data and the received format flag P to the synthesis engine 443.
- if the image quality correction information received from the composition control unit 441 is "unnecessary", the above processing is not performed and the received deformed image data is output as corrected image data as it is.
- the composition buffer 444 is composed of RAM, and stores image data being synthesized by the composition engine 443.
- the data format of the in-composition image data stored in the composition buffer 444 is P1X7R8G8B8 format.
- the composition engine 443 receives the output position, transmittance, and image size from the composition control unit 441. Further, the composition engine 443 receives the corrected image data and the format flag P from the correction engine 449. The image size received here matches the data size of the corrected image generated from the corrected image data.
- the composition target range is a range whose position is determined from the output position and whose size is determined by the image size.
- the composition engine 443 reads out the image data during composition from the composition buffer 444.
- of the read in-composition image data, the portion corresponding to the composition target range is combined with the corrected image data. The details of the combination are calculated using the data of each color and the transmittance, as described in Embodiment 1.
- next, the first bit of each piece of pixel data in the composition target range is rewritten with the received format flag, and the in-composition image data is updated.
- the composition engine 443 writes the updated in-composition image data back into the composition buffer 444.
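The per-pixel work of the composition engine 443 described above (blending the corrected pixel with the transmittance, then rewriting the first bit of the P1X7R8G8B8 word with the format flag P) can be sketched as follows. This is a hedged sketch: the helper name is illustrative, and the convention that a transmittance of 0 makes the new image fully opaque is an assumption consistent with the overwrite case for transmittance "0" mentioned elsewhere in the description.

```python
def composite_pixel(dst, src, transmittance, format_flag):
    """Blend one corrected R8G8B8 pixel (src) into the in-composition
    P1X7R8G8B8 pixel (dst), then rewrite the first bit with flag P."""
    out = 0
    for shift in (16, 8, 0):  # R, G, B channels
        d = (dst >> shift) & 0xFF
        s = (src >> shift) & 0xFF
        blended = round(s * (1 - transmittance) + d * transmittance)
        out |= blended << shift
    # first (most significant) bit carries the format flag P
    return (out & 0x7FFFFFFF) | (format_flag & 1) << 31
```

With transmittance 0 the source pixel replaces the destination entirely; with transmittance 1 the destination shows through unchanged, matching the weighted combination described in Embodiment 1.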
- the main control unit 404 has a function of outputting a control signal not shown in the drawing to each functional unit and controlling its operation.
- the main control unit 404 receives a composite image generation request from the user via the input unit 403.
- the main control unit 404 initializes the deformation memory 446 and the correction memory 448.
- the processing data group 453 is read from the memory 401 and written into the deformation memory 446.
- the correction data group 454 is read from the memory 401 and written into the correction memory 448.
- the timing of writing these data is not limited to this; they may be written when the image compositing apparatus 400 is activated, or may be written by the composition control unit 441 after an image composition instruction (described later).
- the main control unit 404 displays a predetermined selection screen and input screen on the display unit 402, and accepts input of the selection of images to be synthesized, the necessity of processing, and the necessity of correction.
- a synthesis instruction table 212 is generated based on the received data, and the generated synthesis instruction table 212 is written in the memory 401.
- the composition control unit 441 is instructed to perform image composition.
- the main control unit 404 generates a composite image based on the composite image data 414 in accordance with a user operation, and displays it on the display unit 402. At this time, when a position (pixel) in the composite image is designated by the user, processing of the composite image data is controlled. Since the control of the processing of the composite image data is the same as in the first embodiment, the description thereof is omitted here.
- upon receiving a user operation requesting image composition, the image composition device 400 starts the image composition process.
- FIGS. 26 to 28 are flowcharts showing the operation of the image composition device 400 in the image composition process. The operation of the image composition device 400 in the image composition process is described below with reference to FIGS. 26 to 28.
- the main control unit 404 initializes the deformation memory 446 and the correction memory 448, reads the processing data group 453 from the memory 401, and writes the read processing data group 453 into the deformation memory 446.
- the correction data group 454 is read from the memory 401, and the read correction data group 454 is written into the correction memory 448 (step S401).
- the main control unit 404 receives, via the input unit 403, input of the selection of images to be combined, the display position, the overlay order, the necessity of processing, and the necessity of image quality correction (step S402).
- the main control unit 404 generates a composition instruction table 212 based on the received data, writes it into the memory 401, and instructs the composition control unit 441 to perform image composition (step S404).
- the composition control unit 441 reads the composition information included in the composition instruction table 212 in ascending order of overlay order (step S406). At this time, if all the composition information has already been read (YES in step S407), the process moves to step S408.
- synthesis control unit 441 outputs the data format included in the read synthesis information to format conversion unit 442 (step S411).
- the composition control unit 441 outputs the deformation information included in the read composition information to the deformation engine 447 (step S412).
- the synthesis control unit 441 outputs the image quality correction information included in the read synthesis information to the correction engine 449 (step S413).
- the composition control unit 441 outputs the output position, transmittance, and image size included in the read composition information to the composition engine 443 (step S414).
- the composition control unit 441 reads the image data corresponding to the read composition information from the memory 401, and outputs the read image data to the format conversion unit 442 (step S416).
- the format converter 442 converts the received image data into the R8G8B8 format based on the received data format, and generates converted image data (step S421).
- the generated converted image data and the format flag P are output to the transformation engine 447 (step S422).
- the deformation engine 447 receives the deformation information, the format flag P, and the converted image data. If the received deformation information is not "unnecessary" (NO in step S424), the deformation engine 447 selects the optimum processing data from the processing data group 451 using the received format flag P and the deformation information, and reads the selected processing data from the deformation memory 446 (step S426).
- the deformation engine 447 processes the received converted image data using the read processing data, and generates deformed image data (step S427).
- the deformation engine 447 outputs the generated deformed image data and the received format flag P to the correction engine 449 (step S428).
- if the received deformation information is "unnecessary" (YES in step S424), the received converted image data is output as it is to the correction engine 449 as deformed image data (step S431).
- the received format flag P is output to the correction engine 449 (step S432).
- the correction engine 449 receives the correction information, the format flag P, and the transformed image data.
- if the received image quality correction information is not "unnecessary", the correction engine 449 selects the optimum correction data from the correction data group 452 based on the received format flag P, and reads the selected correction data from the correction memory 448 (step S434). Next, the correction engine 449 performs image quality correction processing on the received deformed image data using the read correction data, and generates corrected image data (step S436). Subsequently, the correction engine 449 outputs the received format flag P and the generated corrected image data to the synthesis engine 443 (step S437).
- otherwise, the correction engine 449 outputs the received deformed image data as it is as corrected image data to the synthesis engine 443 (step S441). Subsequently, the received format flag P is output to the synthesis engine 443 (step S442).
- the composition engine 443 receives the output position, the transmittance, the image size, the corrected image data, and the format flag. Upon receiving these, the composition engine 443 reads out the image data during composition from the composition buffer 444 (step S443). Of the read in-combination image data, the portion corresponding to the synthesis target range and the corrected image data are synthesized (step S444). Subsequently, the composition engine 443 rewrites the first bit of each piece of pixel data corresponding to the composition target range with the received format flag P to update the image data during composition (step S446).
- the composition engine 443 writes the updated in-composition image data back to the composition buffer 444 (step S447).
- the composition control unit 441 reads out the in-composition image data held in the composition buffer 444 and writes it into the memory 401 as composite image data (step S408).
- the main control unit 404 reads the composite image data 414 from the memory 401, generates a composite image based on the read composite image data, and displays it on the display unit 402 (step S409).
- as described above, in the image composition device 400, the format conversion unit first converts the format of the image data to be combined into R8G8B8 and generates a format flag indicating the format of the image data before conversion.
- each engine can then quickly select and read the optimum processing data or correction data from the associated deformation memory or correction memory based on the format flag.
- in the above embodiments, a format flag (an attribute flag in Embodiment 3) P is generated for each piece of pixel data, but the present invention is not limited to this. It is sufficient that the part corresponding to each image data before composition and the flag P corresponding to that part can be identified.
- the format conversion unit 142 receives the data format from the composition control unit 141, and generates a 1-bit format flag P in accordance with the received data format.
- the format converter 142 receives the input image data, converts the data format of the received input data into, for example, the R8G8B8 format, and generates converted image data. Next, the format converter 142 outputs the generated format flag P and the converted image data to the synthesis engine.
- the in-composition image data held in the composition buffer 144 and the composite image data stored in the memory 101 are both in the R8G8B8 format.
- the composition engine 143 receives the transmittance, the overlay order, the output coordinates, and the image size from the composition control unit 141, and receives the format flag P and the converted image data from the format conversion unit 142. Upon receiving these, it reads the in-composition image data from the composition buffer 144, and combines the read in-composition image data with the converted image data using the received transmittance. Next, it generates attribute information consisting of the received format flag P, the overlay order, the output coordinates, and the image size. Then, the in-composition image data held in the composition buffer 144 is updated with the combined image data. Next, the generated attribute information is written into the composition buffer 144.
- every time the composition engine 143 receives the transmittance, the output coordinates, the image size, the format flag P, and the converted image data, it repeats the updating of the in-composition image data and the generation and writing of attribute information in the same manner.
- the attribute information generated here indicates format flags corresponding to the pixels included in each of the areas 151a, 152a, 153a, and 154a in the composite image.
- the main control unit 104 performs an enlargement process on each pixel included in the secondary usage range in the same manner as in the first embodiment.
- the main control unit 104 detects attribute information indicating a region including the pixel to be processed, and uses the format flag P included in the detected attribute information.
- when the pixel to be processed is included in a plurality of regions, the main control unit 104 compares the overlay orders included in the detected pieces of attribute information, and adopts the attribute information with the largest overlay order.
- in this way, attribute information including information indicating the region occupied by each image on the composite image, a format flag (or attribute flag), and an overlay order may be stored. This makes it unnecessary to store a format flag for each pixel, so the total amount of data can be reduced.
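The per-region alternative described above can be sketched as follows. The dictionary keys and the function name are illustrative assumptions; the point is that a pixel's flag is found by locating the regions containing it and, where regions overlap, taking the entry with the largest overlay order.

```python
def flag_for_pixel(x, y, attribute_infos):
    """Find the format flag for pixel (x, y) from per-region attribute
    information (region position/size, overlay order, flag). Where regions
    overlap, the entry with the largest overlay order wins."""
    best = None
    for info in attribute_infos:
        left, top, w, h = info["pos"] + info["size"]
        if left <= x < left + w and top <= y < top + h:
            if best is None or info["order"] > best["order"]:
                best = info
    return None if best is None else best["flag"]
```

One such record per combined image replaces millions of per-pixel flag bits, which is the data-amount saving the text refers to.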
- the main control unit 104 selects the processing data and the correction data according to the value of the format flag P included in each piece of pixel data in the secondary usage range of the composite image data, and writes them into the deformation memory 146 and the correction memory 148, respectively.
- alternatively, the main control unit 104 may select the processing data and the correction data according to the value of the format flag included in the pixel data corresponding to the position (pixel) specified by the user, and the deformation engine 147 and the correction engine 149 may process and correct all the pixel data corresponding to the secondary usage range using the data written in the deformation memory 146 and the correction memory 148, respectively.
- alternatively, based on the format flag P (the attribute flag P in Embodiment 3) corresponding to the pixel selected by the user, the data used for processing and correction of the secondary usage range may be selected only once and written to the deformation memory and the correction memory, respectively.
- alternatively, the total number of pixels in the secondary usage range whose P value is 1 may be compared with the total number whose P value is 0, and the data to be used for processing and correction may be selected according to the majority P value and written to each memory only once.
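The majority comparison just described can be sketched as follows; the assumption that the 1-bit flag P sits in the most significant bit of each 32-bit pixel word follows the P1X7R8G8B8 layout used earlier, and the function name is illustrative.

```python
def majority_flag(pixels):
    """Pick one flag for the whole secondary usage range by majority:
    count pixels whose embedded P bit is 1 versus 0."""
    ones = sum((p >> 31) & 1 for p in pixels)
    return 1 if ones > len(pixels) - ones else 0
```

The processing and correction data matching the returned value would then be written to the deformation memory and correction memory a single time, instead of once per pixel.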
- the compositing engine reads the in-composition image data (in addition to this, the format flag group in the second embodiment) from the compositing buffer, and performs the compositing process.
- however, when the transmittance received from the composition control unit is "0", the reading may be omitted, and the portion corresponding to the composition target range in the in-composition image data may simply be overwritten with the received image data.
- in the above embodiments, the flag indicating the format of the pre-composition image data is 1 bit long, with "0" indicating the RGB format and "1" indicating the YUV format, but the present invention is not limited to this.
- in the YUV format, there are two types of methods, the packed method and the planar method, which differ in how the data is organized. Furthermore, there are several types of formats such as YUV420 and YUV422, depending on how the data is thinned out.
- therefore, the format flag P may be multi-bit data indicating a format including these differences. Specifically, the first bit may indicate whether the format is RGB or YUV, the second bit whether it is the packed method or the planar method, and the third bit (or several bits from the third bit onward) may indicate how the data is thinned out in the YUV format.
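The multi-bit flag just described can be sketched as a small bit-packing helper. The exact bit widths are assumptions chosen to match the description (bit 0 for RGB/YUV, bit 1 for packed/planar, two further bits for the thinning method); the function name is illustrative.

```python
def encode_format_flag(is_yuv, is_planar=False, subsampling=0):
    """Pack the colour family, the packed/planar layout, and the chroma
    thinning method (e.g. 0 = 4:4:4, 1 = 4:2:2, 2 = 4:2:0) into one
    multi-bit format flag P."""
    flag = int(is_yuv)                 # bit 0: 0 = RGB, 1 = YUV
    flag |= int(is_planar) << 1        # bit 1: 0 = packed, 1 = planar
    flag |= (subsampling & 0b11) << 2  # bits 2-3: thinning method
    return flag
```

A deformation or correction engine could then mask out the bits it cares about when selecting its processing or correction data.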
- in the above embodiments, the format conversion unit 242 converts the data format of the input corrected image data into the R8G8B8 format, but the present invention is not limited to this.
- for example, the R5G6B5 format may be used to facilitate address management of the generated composite image data.
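Repacking into R5G6B5 by keeping the most significant bits of each channel is the common way to produce 16-bit pixels (and thus power-of-two pixel addresses); the following sketch is an illustration of that approach, not necessarily the exact procedure intended by the patent.

```python
def x8r8g8b8_to_r5g6b5(pixel):
    """Repack a 32-bit X8R8G8B8 pixel into 16-bit R5G6B5 by keeping the
    most significant bits of each colour channel."""
    r = (pixel >> 16) & 0xFF
    g = (pixel >> 8) & 0xFF
    b = pixel & 0xFF
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)
```

Halving the bytes per pixel is what makes address calculation for the composite image data simpler in the 16-bit case.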
- the image composition device 300 has been described as a device that synthesizes a plurality of image data as they are or in an enlarged manner.
- the image data may also be combined after processing such as reduction, inversion, or rotation.
- in this case, the attribute flag may be data of 8 bits or less, and each bit may indicate a different process:
- the first bit may indicate the presence / absence of expansion
- the second bit may indicate the presence / absence of reduction
- the third bit may indicate the presence / absence of inversion.
- in Embodiment 2, the value of P indicates the data format of the image data before composition, and in Embodiment 3 it indicates the presence or absence of processing and image quality correction before image composition. However, the information indicated is not limited to these.
- P may instead indicate the type of software that generated each piece of image data, for example paint software or software that generates photographic image data. In the embodiments, a plurality of pieces of image data are combined, but the targets are not limited to image data: text data created by document creation software, tables created by spreadsheet software, and the like may also be combined.
- in that case, the format conversion unit converts each piece of data into RGB-format image data, and generates a format flag several bits long indicating the type and format of the data before conversion.
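A hypothetical sketch of such a front end, which normalizes heterogeneous inputs to RGB while recording the original data type in a format flag (the rasterization itself is stubbed out; all names are illustrative):

```python
# Hypothetical front end: every input (image, text, table, ...) is
# normalized to RGB pixels, and a format flag records the source kind.
SOURCE_KINDS = {"rgb_image": 0, "yuv_image": 1, "text": 2, "spreadsheet": 3}

def render_as_rgb(data, kind):
    # Stub rasterizer: the real conversion (YUV->RGB, text layout,
    # table drawing, ...) is elided; the input is passed through unchanged.
    return data

def convert_to_rgb(data, kind):
    """Return (rgb_pixels, format_flag) for any supported source kind."""
    flag = SOURCE_KINDS[kind]
    return render_as_rgb(data, kind), flag
```

The composition stage then operates on uniform RGB data, while the flag preserves enough provenance to choose type-specific correction later.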
- each piece of image data is stored in a memory within the image composition device; however, the storage location is not limited to this.
- the input unit may include a mechanism for mounting a portable storage medium and a mechanism for connecting a recording device, and the user may select the image data to be combined from data stored in the memory within the image composition device, on an external recording medium, or in the recording device.
- the generated composite image data and processed image data are written to the internal memory.
- however, the present invention is not limited to this; the device may be configured to output them to an external recording medium, a recording device, or the like.
- in the embodiments, the image composition device generates the composite image data and further generates the processed image data, outputting it to an external device or a recording medium.
- alternatively, an external device may generate the processed image data based on the composite image data.
- Each of the above devices is, specifically, a computer system including a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like.
- a computer program is stored in the RAM, ROM, or hard disk unit. Each device achieves its functions by the microprocessor operating according to the computer program.
- the computer program is configured by combining a plurality of instruction codes indicating instructions to the computer in order to achieve a predetermined function.
- a part or all of the constituent elements constituting each of the above devices may be constituted by one system LSI (Large Scale Integration).
- a system LSI is an ultra-multifunctional LSI manufactured by integrating multiple components on a single chip; specifically, it is a computer system including a microprocessor, ROM, RAM, and so on. A computer program is stored in the RAM. The system LSI achieves its functions by the microprocessor operating according to the computer program.
- a part or all of the constituent elements constituting each of the above devices may be constituted by an IC card or a single module that can be attached to and detached from each device.
- the IC card or the module is a computer system including a microprocessor, ROM, RAM, and the like.
- the IC card or the module may include the ultra-multifunctional LSI described above. The IC card or the module achieves its functions by the microprocessor operating according to a computer program. The IC card or module may be tamper resistant.
- the present invention may be the method described above.
- the present invention may be a computer program that realizes these methods by a computer, or may be a digital signal composed of the computer program.
- the present invention may also be the computer program or the digital signal recorded on a computer-readable recording medium such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc), or a semiconductor memory.
- the present invention may also be the computer program or the digital signal transmitted via an electric telecommunication line, a wireless or wired communication line, a network represented by the Internet, data broadcasting, or the like.
- the present invention may also be a computer system including a microprocessor and a memory.
- the memory may store the computer program, and the microprocessor may operate according to the computer program.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- the present invention includes the case where all of the functional blocks in the above embodiments are realized as an LSI, which is an integrated circuit, as well as the case where only some of them are realized as an LSI. These may be individually made into separate chips, or made into one chip so as to include some or all of them. Depending on the degree of integration, such a circuit is sometimes called an IC, a system LSI, a super LSI, or an ultra LSI.
- the method of circuit integration is not limited to LSI; implementation using a dedicated circuit or a general-purpose processor is also possible. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- the present invention can be used continually and repeatedly in industries that manufacture and sell devices that combine and output image data, and in industries that provide and manage various services using such devices.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Editing Of Facsimile Originals (AREA)
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/444,624 US8179403B2 (en) | 2006-10-19 | 2007-09-11 | Image synthesis device, image synthesis method, image synthesis program, integrated circuit |
JP2008539698A JP4814337B2 (ja) | 2006-10-19 | 2007-09-11 | 画像合成装置、画像合成方法、画像合成プログラム、集積回路 |
CN200780038746XA CN101529465B (zh) | 2006-10-19 | 2007-09-11 | 图像合成装置、方法、集成电路 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006284529 | 2006-10-19 | ||
JP2006-284529 | 2006-10-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008047518A1 true WO2008047518A1 (fr) | 2008-04-24 |
Family
ID=39313768
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2007/067631 WO2008047518A1 (fr) | 2006-10-19 | 2007-09-11 | Dispositif de synthèse d'image, procédé de synthèse d'image, programme de synthèse d'image, circuit intégré |
Country Status (4)
Country | Link |
---|---|
US (1) | US8179403B2 (ja) |
JP (1) | JP4814337B2 (ja) |
CN (1) | CN101529465B (ja) |
WO (1) | WO2008047518A1 (ja) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8243098B2 (en) * | 2009-06-16 | 2012-08-14 | Mitre Corporation | Authoritative display for critical systems |
JP2011109245A (ja) * | 2009-11-13 | 2011-06-02 | Renesas Electronics Corp | 画像処理装置及び画像処理方法 |
US20120242693A1 (en) * | 2009-12-11 | 2012-09-27 | Mitsubishi Electric Corporation | Image synthesis device and image synthesis program |
US8570617B2 (en) * | 2010-11-30 | 2013-10-29 | Kyocera Mita Corporation | Image reading apparatus for reading adjustment document used for image adjustment, image forming apparatus with image reading apparatus and adjustment document used in image reading apparatus |
US8836727B2 (en) | 2010-12-22 | 2014-09-16 | Apple Inc. | System level graphics manipulations on protected content |
JP2013200639A (ja) * | 2012-03-23 | 2013-10-03 | Canon Inc | 描画データ生成装置、描画データ生成方法、プログラム、および描画データ生成システム |
KR101978239B1 (ko) * | 2012-06-22 | 2019-05-14 | 삼성전자주식회사 | 컨텐츠를 편집하는 방법 및 그 전자 장치 |
JP2014067310A (ja) * | 2012-09-26 | 2014-04-17 | Olympus Imaging Corp | 画像編集装置、画像編集方法、およびプログラム |
JP2014068274A (ja) * | 2012-09-26 | 2014-04-17 | Olympus Imaging Corp | 画像編集装置、画像編集方法、およびプログラム |
KR20140112891A (ko) * | 2013-03-14 | 2014-09-24 | 삼성전자주식회사 | 미디어 편집을 위한 영역 선택 처리 장치, 방법 및 컴퓨터 판독 가능한 기록 매체 |
KR102047704B1 (ko) * | 2013-08-16 | 2019-12-02 | 엘지전자 주식회사 | 이동 단말기 및 이의 제어 방법 |
AU2013248213A1 (en) | 2013-10-24 | 2015-05-14 | Canon Kabushiki Kaisha | Method, apparatus and system for generating an attribute map for processing an image |
JP6450589B2 (ja) | 2014-12-26 | 2019-01-09 | 株式会社モルフォ | 画像生成装置、電子機器、画像生成方法及びプログラム |
KR101967819B1 (ko) * | 2017-11-02 | 2019-04-10 | 주식회사 코난테크놀로지 | 타일 영상 기반 다중 재생을 위한 영상 처리장치 및 그 타일 영상 구성방법 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003310602A (ja) * | 2002-04-19 | 2003-11-05 | Fuji Photo Film Co Ltd | 画像表示装置 |
JP2003317078A (ja) * | 2002-04-23 | 2003-11-07 | Konica Minolta Holdings Inc | 画像管理方法、プログラム、及び記録媒体 |
JP2005108105A (ja) * | 2003-10-01 | 2005-04-21 | Canon Inc | 画像処理装置及び方法 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61156091A (ja) | 1984-12-27 | 1986-07-15 | 株式会社東芝 | カ−ソル制御装置 |
US5638498A (en) * | 1992-11-10 | 1997-06-10 | Adobe Systems Incorporated | Method and apparatus for reducing storage requirements for display data |
JP3814456B2 (ja) * | 2000-02-07 | 2006-08-30 | キヤノン株式会社 | 画像処理装置及びその方法 |
JP2003320715A (ja) * | 2002-04-30 | 2003-11-11 | Canon Inc | 情報処理装置、情報処理システム、情報出力制御方法、記憶媒体、及びプログラム |
US7715640B2 (en) * | 2002-11-05 | 2010-05-11 | Konica Minolta Business Technologies, Inc. | Image processing device, image processing method, image processing program and computer-readable recording medium on which the program is recorded |
CN100493160C (zh) * | 2003-04-09 | 2009-05-27 | 松下电器产业株式会社 | Osd合成图像译码装置和osd合成图像译码方法 |
US8612847B2 (en) * | 2006-10-03 | 2013-12-17 | Adobe Systems Incorporated | Embedding rendering interface |
JP5031668B2 (ja) * | 2008-05-30 | 2012-09-19 | キヤノン株式会社 | 印刷装置、その制御方法、プログラム |
-
2007
- 2007-09-11 CN CN200780038746XA patent/CN101529465B/zh not_active Expired - Fee Related
- 2007-09-11 US US12/444,624 patent/US8179403B2/en not_active Expired - Fee Related
- 2007-09-11 JP JP2008539698A patent/JP4814337B2/ja not_active Expired - Fee Related
- 2007-09-11 WO PCT/JP2007/067631 patent/WO2008047518A1/ja active Search and Examination
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003310602A (ja) * | 2002-04-19 | 2003-11-05 | Fuji Photo Film Co Ltd | 画像表示装置 |
JP2003317078A (ja) * | 2002-04-23 | 2003-11-07 | Konica Minolta Holdings Inc | 画像管理方法、プログラム、及び記録媒体 |
JP2005108105A (ja) * | 2003-10-01 | 2005-04-21 | Canon Inc | 画像処理装置及び方法 |
Also Published As
Publication number | Publication date |
---|---|
US8179403B2 (en) | 2012-05-15 |
CN101529465B (zh) | 2012-07-04 |
CN101529465A (zh) | 2009-09-09 |
JP4814337B2 (ja) | 2011-11-16 |
US20100079492A1 (en) | 2010-04-01 |
JPWO2008047518A1 (ja) | 2010-02-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4814337B2 (ja) | 画像合成装置、画像合成方法、画像合成プログラム、集積回路 | |
JP3701204B2 (ja) | 画質調整機能を有する画像表示装置および画像表示方法、並びに記録媒体 | |
US6657655B1 (en) | Stereoscopic-image display apparatus | |
JP2002027260A (ja) | カラー画像のグレー変換方法および装置 | |
JP2001186332A (ja) | 画像処理装置、画像処理方法およびその方法をコンピュータに実行させるプログラムを記録したコンピュータ読み取り可能な記録媒体 | |
KR20090041169A (ko) | 디지털 이미지 처리장치, 그 제어방법, 제어방법을실행시키기 위한 프로그램을 저장한 기록매체 및 디지털이미지 압축방법 | |
KR20110035981A (ko) | 화상 처리 장치, 화상 처리 방법, 및 기억 매체 | |
US20010004258A1 (en) | Method, apparatus and recording medium for generating composite image | |
JP4079163B2 (ja) | 投影型画像表示装置 | |
US20070053009A1 (en) | Image processing method, image processor, and image forming apparatus | |
JPH09190546A (ja) | 自動画像編集装置 | |
JP2003046908A (ja) | 投影型画像表示装置 | |
US20020154343A1 (en) | System and method of capturing a digital picture | |
WO2024038651A1 (ja) | プログラム、画像処理方法及び画像処理装置 | |
JP2009005081A (ja) | プロファイル作成装置およびプロファイル作成方法 | |
JP2009182746A (ja) | 画像管理方法および画像管理装置 | |
JPH0546718A (ja) | 画像編集装置 | |
JP2011244364A (ja) | 画像処理装置、画像合成方法、画像合成プログラム及び記録媒体 | |
JP5202263B2 (ja) | 画像処理装置、画像処理方法、及びコンピュータプログラム | |
KR20050017060A (ko) | 이미지 일괄 편집 시스템 | |
JP3219382B2 (ja) | 画像処理システム並びに読取装置及びホストコンピュータ | |
JP2006330094A (ja) | カラー画像表示装置、カラー画像表示方法、プログラムおよび記録媒体 | |
TWI507831B (zh) | 車機系統之介面切換方法、裝置與系統 | |
JP6202991B2 (ja) | 画像表示装置、その制御方法及びプログラム | |
JP4314525B2 (ja) | 補正画像表示方法及び装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200780038746.X Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07807040 Country of ref document: EP Kind code of ref document: A1 |
|
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 12444624 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2008539698 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 07807040 Country of ref document: EP Kind code of ref document: A1 |
|
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) |