KR101681840B1 - Real time overlay image processing apparatus of multiple camera image - Google Patents

Real time overlay image processing apparatus of multiple camera image

Info

Publication number
KR101681840B1
Authority
KR
South Korea
Prior art keywords
type
data
unit
video
image
Prior art date
Application number
KR1020150120468A
Other languages
Korean (ko)
Inventor
이형
김연호
Original Assignee
대전보건대학 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 대전보건대학 산학협력단 filed Critical 대전보건대학 산학협력단
Priority to KR1020150120468A
Application granted
Publication of KR101681840B1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2625 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 Mixing

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a multi-image real-time overlay image processing apparatus for overlaying each image frame input from a plurality of image output apparatuses into one image frame in real time.
The real-time overlay image processing apparatus for multiple images according to the present invention overlays the image frames input from a plurality of video apparatuses and matches them to one image frame in real time. Each input image frame is accessed according to the setting type selected by the user through a user interface, image frame data is extracted for each setting type, and the extracted data is overlaid and matched to one image frame, so that the image frames input from the plurality of video apparatuses can be converted into a single frame in real time and output through a display device.

Description

REAL TIME OVERLAY IMAGE PROCESSING APPARATUS OF MULTIPLE CAMERA IMAGE

The present invention relates to a real-time overlay image processing apparatus, and more particularly, to a real-time overlay image processing apparatus that overlays multiple image frames input from a plurality of image output apparatuses into a single image frame in real time.

Examples of devices for outputting video images include video cameras, still cameras capable of video recording, and video players. Generally, in order to acquire images individually from these image output devices, devices such as a frame grabber for inputting and storing images individually from the image output device are required. That is, N image acquisition apparatuses are required to store output images of N image output apparatuses.

In recent years, image processing methods have been used in which video frames output from a plurality of video output devices are synchronized in real time and overlaid to form one video frame. Such an overlay image processing method can combine the outputs of various image output apparatuses such as a general video camera, a digital still camera, a thermal camera, an infrared camera, and a radar, and output them as one image. To synchronize the image frames output from the plurality of image output devices at the same specific time and construct one image frame, metadata such as time information is added to each image frame acquired through the image acquisition devices; the frames are stored in a storage medium and then read back from it based on the metadata to form one frame.

In addition, to acquire images of the same specific time through such time synchronization, a separate external device is needed to trigger the image frames acquired through the plurality of image acquisition devices.

Furthermore, because the image frames stored in the storage medium must be searched for frames of the same time based on metadata such as time information, and the frames input at a specific time must then be overlaid to form a single frame, the resulting image cannot be confirmed in real time. For example, when a specific scene is photographed by a plurality of cameras and overlaid into one scene, the displayed scene is delayed because of the serial internal processing.

Korean Patent Publication No. 10-2010-0060396 (published on Jun. 7, 2010)

An object of the present invention is to provide a multi-image real-time overlay image processing apparatus that overlays the image frames transmitted from a plurality of video apparatuses into a single image frame in real time.

In order to accomplish the above object, the present invention provides a real-time overlay image processing apparatus for multiple images that overlays image frames input from a plurality of video apparatuses and matches them to one image frame. The image frames input from the plurality of video apparatuses are accessed according to the setting type selected by the user through a user interface, image frame data is extracted for each setting type, the extracted data is overlaid and matched to one image frame, and the result is stored in a frame buffer.

The setting type selected through the user interface is one of three types: Type 1, which selects one video device among the plurality of video devices and stores its video frames directly in the frame buffer by passing them through; Type 2, which simultaneously accesses the pixels at a specific position within the image frames input from all of the video devices, performs an overlay operation, and stores the result at the same position in the frame buffer; and Type 3, which simultaneously accesses the pixels at a specific position within the image frames input from two or more selected video devices, performs an overlay operation, and stores the result at the same position in the frame buffer.
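For illustration only, the user setting could be captured by a small configuration object along the lines of the Python sketch below; the names OverlayType and OverlaySetting, and the validation rules, are assumptions made here and are not part of the patent.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List

    class OverlayType(Enum):
        TYPE_1 = 1  # pass one selected video device straight through to the frame buffer
        TYPE_2 = 2  # overlay the same pixel position across frames from all video devices
        TYPE_3 = 3  # overlay the same pixel position across frames from a selected subset of devices

    @dataclass
    class OverlaySetting:
        # Setting chosen by the user through the user interface (hypothetical shape).
        overlay_type: OverlayType
        selected_devices: List[int] = field(default_factory=list)  # exactly one index for Type 1

        def validate(self, num_devices: int) -> None:
            if self.overlay_type is OverlayType.TYPE_1 and len(self.selected_devices) != 1:
                raise ValueError("Type 1 passes exactly one device through to the frame buffer")
            if self.overlay_type is OverlayType.TYPE_3 and len(self.selected_devices) < 2:
                raise ValueError("Type 3 overlays two or more selected devices")
            if any(d < 0 or d >= num_devices for d in self.selected_devices):
                raise ValueError("selected device index out of range")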

The real-time overlay image processing apparatus for multiple images according to the present invention includes a simultaneous storage unit that simultaneously stores the image frame data transmitted from the plurality of video apparatuses in a memory module unit, a simultaneous extracting unit that accesses the stored data by type according to the user setting and simultaneously extracts valid data, and an arithmetic unit that overlays the data extracted through the simultaneous extracting unit and stores the result in the frame buffer.

If the access type set by the user is Type 1, the simultaneous storage unit bypasses the video frame data transmitted from the selected single video device so that it is stored directly in the frame buffer. If the set access type is Type 2 or Type 3, the n image frame data simultaneously input from the N video devices are accessed at the pixels corresponding to the same position, following the Type 2 approach, rearranged into m data, and simultaneously stored in the m memory modules of the memory module unit.

For this, the simultaneous storage unit is provided with an access type selection module that selects the Type 1 simultaneous storage method when the type set by the user is Type 1 and the Type 2 simultaneous storage method when the type set by the user is Type 2 or Type 3; a memory module selection module that selects n memory modules out of the m memory modules of the memory module unit when the Type 2 simultaneous storage method is selected by the access type selection module; a data routing module that routes the data transmitted from the n video devices to the m memory modules of the memory module unit; and an address calculation and routing module that calculates and routes n addresses for simultaneously storing the n data transmitted from the n video devices in the m memory modules of the memory module unit.

In addition, the simultaneous extracting unit simultaneously extracts n data from the data stored in the memory module unit according to the Type 2 or Type 3 access type set by the user, and transmits the extracted data to the arithmetic unit.

For this, the simultaneous extraction unit is provided with an access type selection module that selects whether the access type of the data stored in the memory module unit is Type 2 or Type 3; a memory module selection module that selects the n valid memory modules out of the m memory modules of the memory module unit according to the type selected through the access type selection module; a data routing module that extracts the n valid data out of the m data simultaneously extracted from the m memory modules; and an address calculation and routing module that calculates and routes the n addresses of the data to be extracted from the m memory modules.

According to the present invention, image frames input from a plurality of video apparatuses such as a general video camera, a digital still camera, a thermal camera, an infrared camera, and a radar can be matched into a single frame and output through a display device, so that the multiple images can be overlaid and displayed in real time.

FIG. 1 is a conceptual diagram illustrating an image processing method of a real-time overlay image processing apparatus for multiple images according to the present invention.
FIG. 2 is a block diagram of a real-time overlay image processing apparatus for multiple images according to the present invention.
FIG. 3 is a detailed block diagram of the simultaneous storage unit and the simultaneous extraction unit of the real-time overlay image processing apparatus according to the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a conceptual diagram illustrating an image processing method of a real-time overlay image processing apparatus for multiple images according to an embodiment of the present invention.

Referring to FIG. 1, the real-time overlay image processing apparatus according to an exemplary embodiment of the present invention processes, according to three setting types (Type 1, Type 2, Type 3), a plurality of image frames A output from at least two individual video apparatuses (video output apparatuses) such as general video cameras, digital still cameras, thermal cameras, infrared cameras, and radars. In FIG. 1, the plurality of image frames A arranged for each type represent the image frames output by the individual video apparatuses, and the single overlaid image frame B represents the one image frame obtained by overlaying the video frames A of the individual video apparatuses according to each type and storing the result in the frame buffer.

The real-time overlay image processing apparatus according to the present invention performs overlay processing of the image frames in one of three types (Type 1, Type 2, and Type 3) according to the user setting made through the user interface. Each processing method is as follows.

- Type 1: A one-to-one processing method in which one specific video device among the plurality of individual video devices is selected and its video frame (A) is passed directly through to the frame buffer.

- Type 2: An n-to-1 processing method in which the pixels at a specific position within the image frames (A) of all video devices are simultaneously accessed, an overlay operation is performed, and the result is stored at the same position in the frame buffer.

- Type 3: A processing method in which the pixels at a specific position within the image frames (A) of two or more video apparatuses selected from the plurality of video apparatuses are simultaneously accessed, an overlay operation is performed, and the result is stored at the same position in the frame buffer (where n is a multiple of r).

As described above, the real-time overlay image processing apparatus according to the present invention extracts the image frames A transmitted from the respective video apparatuses according to one of the three types set by the user and performs overlay processing to form one image frame.
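A minimal Python sketch of this per-type processing follows. The frame representation (NumPy arrays) and the overlay operation (a plain per-pixel average) are illustrative assumptions; the text leaves the concrete overlay operation to the arithmetic unit.

    import numpy as np

    def overlay_frames(frames, setting_type, selected=None):
        # Combine same-sized image frames (H x W arrays) into one frame.
        #   frames       : list of numpy arrays with identical shape and dtype
        #   setting_type : 1, 2 or 3, matching Type 1 / Type 2 / Type 3 above
        #   selected     : device indices used by Type 1 (one index) and Type 3 (two or more)
        # The overlay operation is a plain per-pixel average, an illustrative assumption.
        if setting_type == 1:
            # One-to-one: pass the selected device's frame straight through.
            return frames[selected[0]].copy()
        if setting_type == 2:
            chosen = frames                              # every device contributes
        elif setting_type == 3:
            chosen = [frames[i] for i in selected]       # only the selected subset contributes
        else:
            raise ValueError("setting_type must be 1, 2 or 3")
        stacked = np.stack(chosen, axis=0)               # shape: (num_chosen, H, W)
        return stacked.mean(axis=0).astype(frames[0].dtype)

    # Example: three 480 x 640 frames overlaid with Type 2.
    frame_buffer = overlay_frames([np.zeros((480, 640), np.uint8) for _ in range(3)], 2)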

FIG. 2 is a block diagram of a real-time overlay image processing apparatus for multiple images according to an embodiment of the present invention.

Referring to FIG. 2, the real-time overlay image processing apparatus according to the present invention includes a simultaneous storage unit 100 for storing the image frames transmitted from the plurality of video apparatuses in a memory module unit 200, a simultaneous extracting unit 300 for accessing the image frames stored in the memory module unit 200 by type according to the user setting and simultaneously extracting valid data, an arithmetic unit 400 for overlaying the data extracted through the simultaneous extracting unit 300, and a frame buffer 500 for storing the single image frame overlaid through the arithmetic unit 400.

When the user setting is Type 1, the simultaneous storage unit 100 forwards the image frame data directly to the frame buffer. When the user setting is Type 2 or Type 3, it simultaneously accesses the pixels corresponding to the same position of the N image frames, following the Type 2 approach, rearranges the data into m data, and stores them in the memory module unit 200.

The memory module unit 200 includes m memory modules so that the simultaneous storage unit 100 and the simultaneous extraction unit 300 can access n data simultaneously according to the user-set type. Each memory module of the memory module unit 200 stores the video frame data rearranged into m data by the simultaneous storage unit 100. The simultaneous extractor 300 simultaneously accesses the data stored in the memory module unit 200 according to the user-set type, that is, the Type 2 or Type 3 access method, extracts the n valid data out of the m data, and transmits them to the arithmetic unit 400.

The arithmetic unit 400 overlays the data extracted by the simultaneous extracting unit 300: of the n data transmitted for the same position, it selectively overlays t (1 < t <= n) of them and stores the result in the frame buffer 500.
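As a brief illustration of this t-of-n step (a sketch only: which t values are selected and how they are combined are assumptions, and averaging the first t is used purely for concreteness):

    def overlay_t_of_n(values, t):
        # Combine t (1 < t <= n) of the n same-position values delivered by the
        # simultaneous extracting unit; averaging the first t values is only an
        # illustrative choice, not something fixed by the text.
        if not 1 < t <= len(values):
            raise ValueError("t must satisfy 1 < t <= n")
        return sum(values[:t]) / t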

Hereinafter, a process of processing image frame data through the multi-image real-time overlay image processing apparatus having the above-described configuration will be described in detail.

FIG. 3 is a detailed block diagram of the simultaneous storage unit and the simultaneous extracting unit of the real-time overlay image processing apparatus according to an embodiment of the present invention.

When the type set by the user is Type 2 or Type 3, the simultaneous storage unit 100 rearranges the n input data into m data following the Type 2 approach and simultaneously transmits them to the m memory modules of the memory module unit 200. The m data stored in the memory module unit 200 are then extracted by the simultaneous extractor 300 and transferred to the arithmetic unit 400. On the other hand, if the type set by the user is Type 1, the image frame data transmitted from the selected video device is not stored in the memory module unit 200 but is transferred directly to the frame buffer 500 and stored.

The simultaneous storage unit 100 is provided with an access type selection module W1, a memory module selection module W2, a data routing module W3, and an address calculation and routing module W4, and the simultaneous extraction unit 300 is provided with an access type selection module R1, a memory module selection module R2, a data routing module R3, and an address calculation and routing module R4. In the embodiment of the present invention, the modules of the simultaneous storage unit 100 and the simultaneous extraction unit 300 operate under the following preconditions.

- An image frame A input from each of the N video devices has a size of L (frame length) × M (frame width).

- p, q, and r are design factors of the real-time overlay image processing apparatus; each is a natural number greater than or equal to 2, and r represents the minimum number of data to process at once.

- m is the number of memory modules in the memory module unit 200; it is a prime number greater than p × q × r and also greater than n (i.e., m > p × q × r and m > n). One way to pick such an m is sketched below.
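One way to satisfy this precondition is to take the smallest prime exceeding both p × q × r and n, as in the Python sketch below; the text only requires m to be some prime larger than both, so this particular choice is an assumption.

    def is_prime(x: int) -> bool:
        if x < 2:
            return False
        i = 2
        while i * i <= x:
            if x % i == 0:
                return False
            i += 1
        return True

    def choose_memory_module_count(p: int, q: int, r: int, n: int) -> int:
        # Smallest prime m with m > p*q*r and m > n, which meets the stated precondition.
        m = max(p * q * r, n) + 1
        while not is_prime(m):
            m += 1
        return m

    # Example: p = q = r = 2 and n = 4 video streams give m = 11,
    # the smallest prime greater than 8 and greater than 4.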

The basic functions for selecting the n memory modules used to simultaneously write (store) or read (extract) n data across the m memory modules under the above conditions are expressed by the following Equation (1).

[Equation 1: published as image 112015083018729-pat00001]

Here, BF2(c) is the basic function used by the simultaneous storage unit 100 and the simultaneous extraction unit 300 to select n memory modules out of the m memory modules in the Type 2 scheme, and BF3(c, t) is the basic function used by the simultaneous extraction unit 300 to select n memory modules out of the m memory modules in the Type 3 scheme.

When the access type set by the user is Type 1, the simultaneous storage unit 100 bypasses the video frame data transmitted from the selected single video device so that it is stored in the frame buffer 500. When the set access type is Type 2 or Type 3, the n image frame data simultaneously input from the N video devices are accessed by the Type 2 access method and simultaneously stored in the m memory modules of the memory module unit 200.

The access type selection module W1 included in the simultaneous storage unit 100 selects whether the simultaneous storage method is Type 1 or Type 2: when the type set by the user is Type 1, the Type 1 simultaneous storage method is selected, and when the type set by the user is Type 2 or Type 3, the Type 2 simultaneous storage method is selected. If Type 1 is selected by the access type selection module W1, the single video frame data received from the selected video device is bypassed without further processing and stored in the frame buffer 500.

The memory module selection module W2 included in the simultaneous storage unit 100 selects n memory modules out of the m memory modules of the memory module unit 200 when the Type 2 simultaneous storage method is selected by the access type selection module W1; the n memory modules are selected according to the following Equation (2).

[Equation 2: published as image 112015083018729-pat00002]

The data routing module W3 included in the simultaneous storage unit 100 routes the data transmitted from the n video devices to the m memory modules of the memory module unit 200. It performs D1(k) => D2((p·q·k) % m) and then shifts the data in D2 to the right by BF2(c). Here, D1 denotes the n input terminals associated with the n video devices, and D2 denotes the m output terminals connected to the memory module unit 200. Both D1 and D2 have m entries, and the (m - n) entries that are not allocated are treated as don't-care.
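A sketch of this Type 2 write-side routing is given below. BF2 is defined by Equation (1), which is published only as an image, so it is passed in as a function parameter here, and "shifting to the right" is read as a cyclic rotation of the m output slots; both readings are assumptions.

    from typing import Callable, List, Optional

    def route_write(data: List[int], m: int, p: int, q: int, c: int,
                    bf2: Callable[[int], int]) -> List[Optional[int]]:
        # Route n input values (one per video device, held in D1) onto the m
        # memory-module slots (D2), following the two steps described above:
        #   1. D1(k) -> D2((p*q*k) % m) for every input index k
        #   2. cyclically rotate the m slots to the right by bf2(c)
        # The (m - n) unassigned slots stay None ("do not care").
        n = len(data)
        d2: List[Optional[int]] = [None] * m
        for k in range(n):
            d2[(p * q * k) % m] = data[k]                     # step 1: modular placement
        shift = bf2(c) % m
        return d2[-shift:] + d2[:-shift] if shift else d2     # step 2: rotate right

Since m is a prime larger than p × q × r, p × q is coprime to m, so the modular placement in step 1 never maps two inputs to the same slot; this appears to be the purpose of the precondition on m.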

The address calculation and routing module W4 included in the simultaneous storage unit 100 calculates the n addresses for simultaneously storing the n data transmitted from the n video devices in the m memory modules of the memory module unit 200; the addresses at which the n data are stored in the n memory modules determined by Equation (2) are calculated according to the following Equation (3).

[Equation 3: published as image 112015083018729-pat00003]

Here, k is a natural number satisfying 0 <= k < n, s1 = L / (p × r), and s2 = M / q.

For routing, after performing A1(k) => A2((p·q·k) % m) on the n calculated addresses, the address calculation and routing module W4 shifts the data in A2 to the right by BF2(c). The n addresses calculated in the above process are stored in A1, and the routed addresses are stored in A2. A2 consists of the m output terminals connected to the memory module unit 200; both A1 and A2 have m entries, and the (m - n) entries that are not allocated are treated as don't-care.

The simultaneous storage unit 100 configured as described above selects the simultaneous storage method according to the access type set by the user and simultaneously stores the n data input from the n video devices in the m memory modules of the memory module unit 200.

The simultaneous extractor 300 simultaneously extracts n data according to an access type (Type 2 or Type 3) set by the user among the data stored in the memory module 200 and provides the extracted data to the calculation device 400.

The access type selection module R1 included in the simultaneous extraction unit 300 selects the access type of the data stored in the memory module unit 200; it selects Type 2 or Type 3 according to the type set by the user through the user interface.

The memory module selection module R2 included in the simultaneous extraction unit 300 selects the n valid memory modules among the m memory modules of the memory module unit 200 according to the selected type. In the Type 2 scheme, the memory modules are selected through the same method as in the memory module selection module W2 of the simultaneous storage unit 100, that is, Equation (2). In the Type 3 scheme, the n memory modules are selected according to the following Equation (4).

[Equation 4: published as image 112015083018729-pat00004]

The data routing module R3 included in the simultaneous extracting unit 300 extracts the n valid data out of the m data simultaneously extracted from the m memory modules of the memory module unit 200. In the Type 2 scheme, it performs D1(k) => D2((p·q·k) % m) in the same manner as the data routing module W3 of the simultaneous storage unit 100, where D1 here represents the output stage of the m memory modules, and the routing portion shifts the data in D2 to the left by BF2(c). In the Type 3 scheme, the m data extracted from the memory module unit 200 are shifted to the left by BF3(c, t).
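The Type 2 read-side routing can then be sketched as the inverse of the write-side sketch above: rotate the m memory-module outputs to the left by BF2(c) and read the valid positions back out. BF2 again stands in for the unpublished Equation (1), and the cyclic-rotation reading remains an assumption.

    from typing import Callable, List, Optional

    def route_read(slots: List[Optional[int]], m: int, p: int, q: int, c: int,
                   bf2: Callable[[int], int], n: int) -> List[Optional[int]]:
        # Undo the write-side right rotation, then read the n valid values back
        # out of positions (p*q*k) % m, which restores the original device order.
        shift = bf2(c) % m
        undone = slots[shift:] + slots[:shift] if shift else list(slots)  # rotate left
        return [undone[(p * q * k) % m] for k in range(n)]

    # Round trip with route_write from the earlier sketch, using a placeholder BF2:
    #   bf2 = lambda c: c % 5
    #   written = route_write([10, 20, 30, 40], m=11, p=2, q=2, c=3, bf2=bf2)
    #   route_read(written, m=11, p=2, q=2, c=3, bf2=bf2, n=4)   # -> [10, 20, 30, 40]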

The address calculation and routing module R4 included in the simultaneous extractor 300 calculates and routes the n addresses of the data to be extracted from the m memory modules of the memory module unit 200. In the Type 2 scheme, it calculates the addresses of the n data to be extracted through the same method as the address calculation and routing module W4 of the simultaneous storage unit 100, that is, Equation (3). In the Type 3 scheme, the n addresses for the n memory modules determined by Equation (4) are calculated according to the following Equation (5).

[Equation 5: published as image 112015083018729-pat00005]

The n addresses calculated by the AD (c, t) operation are shifted to the right by BF3 (c, t).

The simultaneous extraction unit 300 having the above configuration selects the simultaneous extraction method according to the access type set by the user, simultaneously extracts the n valid data among the data stored in the memory module unit 200, and outputs the extracted n data to the arithmetic unit 400.

The arithmetic unit 400 receives the n data transmitted through the simultaneous extracting unit 300, overlays t of the n data for the same position, and transmits the data overlaid into one frame to the frame buffer 500.

Since the image frame stored in the frame buffer 500 is provided to the display device, the image output from the plurality of video cameras can be overlay processed and displayed in real time.

As described above, the real-time overlay image processing apparatus for multiple images according to the present invention performs overlay processing of the image frames photographed and input by a plurality of video apparatuses in one of three types and displays the result in real time.

The real-time overlay image processing apparatus according to the present invention described above is not limited to the above-described embodiments; it is to be understood that various changes and modifications may be made by a person having ordinary skill in the art to which the present invention pertains without departing from the scope of the present invention.

100: simultaneous storage unit 200: memory module unit
300: simultaneous extraction unit 400: arithmetic unit
500: frame buffer
W1, R1: access type selection module
W2, R2: memory module selection module
W3, R3: data routing module
W4, R4: address calculation and routing module

Claims (7)

1. A real-time overlay image processing apparatus for multiple images, wherein an image frame input from each of a plurality of video apparatuses is accessed according to a setting type selected by a user through a user interface, image frame data is extracted for each setting type, and the extracted image frame data is overlaid, matched to one image frame, and stored in a frame buffer (500),
wherein the setting type selected through the user interface is one of:
Type 1, in which one video device among the plurality of video devices is selected and its video frames are stored directly in the frame buffer (500);
Type 2, in which pixels at a specific position within the video frames input from all of the plurality of video devices are accessed at the same time, an overlay operation is performed, and the result is stored at the same position in the frame buffer (500); and
Type 3, in which pixels at a specific position within the image frames input from two or more video apparatuses selected from the plurality of video apparatuses are accessed at the same time, an overlay operation is performed, and the result is stored at the same position in the frame buffer (500).
2. (Deleted)
3. The apparatus according to claim 1, wherein the multi-image real-time overlay image processing apparatus comprises:
a simultaneous storage unit (100) for simultaneously storing the image frame data transmitted from the plurality of video apparatuses in a memory module unit (200);
a simultaneous extracting unit (300) for accessing the image frame data stored in the memory module unit (200) by type according to the user setting and simultaneously extracting valid data; and
an arithmetic unit (400) for overlaying the data extracted through the simultaneous extracting unit (300) and storing the result in the frame buffer (500).
4. The apparatus according to claim 3, wherein, when the access type set by the user is Type 1, the simultaneous storage unit (100) bypasses the video frame data transmitted from the selected single video device so that it is stored in the frame buffer (500), and when the set access type is Type 2 or Type 3, the n image frame data simultaneously input from N video devices are accessed at the pixels corresponding to the same position, following the Type 2 approach, rearranged into m data, and simultaneously stored in the m memory modules of the memory module unit (200).
5. The apparatus according to claim 4, wherein the simultaneous storage unit (100) comprises:
an access type selection module (W1) that selects the Type 1 simultaneous storage method when the type set by the user through the user interface is Type 1 and the Type 2 simultaneous storage method when the type set by the user is Type 2 or Type 3;
a memory module selection module (W2) that selects n memory modules out of the m memory modules of the memory module unit (200) when the Type 2 simultaneous storage method is selected by the access type selection module (W1);
a data routing module (W3) that routes the data transmitted from the n video devices to the m memory modules of the memory module unit (200); and
an address calculation and routing module (W4) that calculates and routes n addresses for simultaneously storing the n data transmitted from the n video apparatuses in the m memory modules of the memory module unit (200).
6. The apparatus according to claim 3, wherein the simultaneous extraction unit (300) simultaneously extracts n data from the data stored in the memory module unit (200) according to the Type 2 or Type 3 access type set by the user and transmits the extracted data to the arithmetic unit (400).
7. The apparatus according to claim 6, wherein the simultaneous extraction unit (300) comprises:
an access type selection module (R1) that selects whether the access type of the data stored in the memory module unit (200) is Type 2 or Type 3;
a memory module selection module (R2) that selects the n valid memory modules among the m memory modules of the memory module unit (200) according to the type selected through the access type selection module (R1);
a data routing module (R3) that extracts the n valid data out of the m data simultaneously extracted from the m memory modules of the memory module unit (200); and
an address calculation and routing module (R4) that calculates and routes the n addresses of the data to be extracted from the m memory modules of the memory module unit (200).
KR1020150120468A 2015-08-26 2015-08-26 Real time overlay image processing apparatus of multiple camera image KR101681840B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150120468A KR101681840B1 (en) 2015-08-26 2015-08-26 Real time overlay image processing apparatus of multiple camera image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150120468A KR101681840B1 (en) 2015-08-26 2015-08-26 Real time overlay image processing apparatus of multiple camera image

Publications (1)

Publication Number Publication Date
KR101681840B1 true KR101681840B1 (en) 2016-12-12

Family

ID=57574075

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150120468A KR101681840B1 (en) 2015-08-26 2015-08-26 Real time overlay image processing apparatus of multiple camera image

Country Status (1)

Country Link
KR (1) KR101681840B1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050011050A (en) * 2003-07-21 2005-01-29 주식회사 팬택앤큐리텔 Dual camera phone
KR20100060396A (en) 2008-11-27 2010-06-07 주식회사 케이티 Apparatus and method of overlay image processing for real time image conversion
KR20110038371A (en) * 2009-10-08 2011-04-14 엘지전자 주식회사 Mobile terminal and method for extracting data thereof
KR20110097512A (en) * 2010-02-25 2011-08-31 성균관대학교산학협력단 Image generation apparatus and panoramic image generation method thereof
KR20120081496A (en) * 2011-01-11 2012-07-19 주식회사 창성에이스산업 The method for fire warning using analysis of thermal image temperature

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20191218

Year of fee payment: 4