KR20170019159A - Electronic device and image processing method thereof - Google Patents

Electronic device and image processing method thereof Download PDF

Info

Publication number
KR20170019159A
Authority
KR
South Korea
Prior art keywords
image
alpha value
pixel
filtering
value
Prior art date
Application number
KR1020150113176A
Other languages
Korean (ko)
Inventor
이신준
이주형
주영훈
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자주식회사 filed Critical 삼성전자주식회사
Priority to KR1020150113176A priority Critical patent/KR20170019159A/en
Publication of KR20170019159A publication Critical patent/KR20170019159A/en

Links

Images

Classifications

    • H04M1/72522
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration by the use of local operators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Abstract

An electronic device according to various embodiments of the present invention includes: a memory; a processor; and an application that is stored in the memory and is operable on the processor, the application performing image filtering on an original image to generate a filtered image, wherein the memory stores an instruction to perform an operation of restoring an alpha value of the filtered image using an alpha value of the original image, and the processor executes the instruction stored in the memory after the application generates the filtered image.

Description

ELECTRONIC DEVICE AND IMAGE PROCESSING METHOD THEREOF

FIELD OF THE INVENTION

Various embodiments of the present invention are directed to an electronic device, for example, an electronic device capable of performing image filtering.

Portable terminal devices have evolved with the development of mobile communication technology and hardware technology and can perform various functions required by users. In particular, a portable terminal device includes a camera module for acquiring digital images, and provides a function of filtering an image obtained through the camera module or an image received from the outside.

Conventionally, analog camera filters have been widely used. Recently, portable terminal devices have provided digital real-time (or pre-processing) camera filters that give effects similar to those of conventional analog camera filters, and image filters capable of digital post-processing filtering are also being provided. Recent portable terminal devices provide as many as dozens of image filtering techniques (or correction, editing, and conversion techniques), such as grayscale to give a monochrome effect to an image, cartoon to give a cartoon effect, and sketch, and it is possible to apply multiple filtering techniques to one image in an overlapping manner.

Such digital filters can be implemented through an application, and a portable terminal device can install a corresponding application through various paths such as downloading from a market.

Many of the various image filtering applications do not preserve alpha values during image filtering. Here, the alpha value is a value indicating the degree of transparency of the image, and an alpha value may be determined for each pixel. Such applications may perform filtering on a completely transparent area of an image by treating that area as having the background color. When the alpha value is not maintained before and after image filtering in this way, there is a problem in that the user cannot obtain the expected result from the image filtering. That is, a transparent region of the image is not maintained, and the transparent region may be changed to a different color, contrary to the user's intention.

It is an object of the present invention to provide an electronic device and an image processing method of an electronic device capable of restoring a lost alpha value after image filtering.

An electronic device according to various embodiments of the present invention includes: a memory; a processor; and an application that is stored in the memory and is operable on the processor, the application performing image filtering on an original image to generate a filtered image, wherein the memory stores an instruction to perform an operation of restoring an alpha value of the filtered image using an alpha value of the original image, and the processor may execute the instruction stored in the memory after the application generates the filtered image.

An image processing method of an electronic device according to various embodiments of the present invention includes: executing an application for performing image filtering; performing image filtering on an original image through the application to generate a filtered image; and generating a restored image by restoring an alpha value of the filtered image using an alpha value of the original image.

According to the present invention described above, it is possible to provide an electronic device and an image processing method of an electronic device capable of restoring a lost alpha value after image filtering.

FIG. 1 illustrates an electronic device in a network environment according to various embodiments.
FIG. 2 is a block diagram of an electronic device according to various embodiments.
FIG. 3 is a block diagram of a program module according to various embodiments.
FIGS. 4A to 4C illustrate examples of an original image, a filtered image, and a restored image according to various embodiments.
FIG. 5 is a block diagram of an electronic device according to various embodiments.
FIG. 6 illustrates a portion of the pixel values of an original image, a filtered image, and a restored image according to various embodiments.
FIGS. 7 and 8 are flowcharts of an image processing method of an electronic device according to various embodiments.

Hereinafter, various embodiments of the present document will be described with reference to the accompanying drawings. It should be understood, however, that this document is not intended to limit the invention to the particular embodiments described herein, but includes various modifications, equivalents, and/or alternatives of the embodiments of this document. In connection with the description of the drawings, like reference numerals may be used for similar components.

In this document, the expressions "having," " having, "" comprising," or &Quot;, and does not exclude the presence of additional features.

In this document, the expressions "A or B," "at least one of A or / and B," or "one or more of A and / or B," etc. may include all possible combinations of the listed items . For example, "A or B," "at least one of A and B," or "at least one of A or B" includes (1) at least one A, (2) Or (3) at least one A and at least one B all together.

As used herein, terms such as "first," "second," and the like may denote various components regardless of their order and/or importance, are used only to distinguish one component from another, and do not limit the components. For example, a first user equipment and a second user equipment may represent different user equipment regardless of order or importance. For example, without departing from the scope of the rights described in this document, a first component may be named a second component, and similarly a second component may also be named a first component.

When it is mentioned that a component (e.g., a first component) is "(operatively or communicatively) coupled with/to" or "connected to" another component (e.g., a second component), it is to be understood that the component may be directly connected to the other component or may be connected through yet another component (e.g., a third component). On the other hand, when it is mentioned that a component (e.g., a first component) is "directly connected" or "directly coupled" to another component (e.g., a second component), it can be understood that there is no other component (e.g., a third component) between them.

As used herein, the expression "configured to" (or "set to") may be used interchangeably, depending on circumstances, with, for example, "suitable for," "having the capacity to," "designed to," "adapted to," "made to," or "capable of." The term "configured to (or set to)" does not necessarily mean "specifically designed to" in hardware. Instead, in some situations, the expression "a device configured to" may mean that the device can perform an operation together with other devices or components. For example, "a processor configured (or set) to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a generic-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the scope of other embodiments. Singular expressions may include plural expressions unless the context clearly dictates otherwise. Terms used herein, including technical or scientific terms, may have the same meaning as commonly understood by one of ordinary skill in the art. Generally predefined terms used in this document may be interpreted in the same or similar sense as their contextual meanings in the related art and, unless expressly defined in this document, are not to be interpreted as having ideal or excessively formal meanings. In some cases, even terms defined in this document cannot be construed as excluding the embodiments of this document.

An electronic device according to various embodiments of the present document may be, for example, a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a workstation, a server, a personal digital assistant (PDA), a portable multimedia player (PMP), a medical device, a camera, or a wearable device. According to various embodiments, the wearable device may be of an accessory type (e.g., a watch, a ring, a bracelet, a necklace, a pair of glasses, a contact lens, or a head-mounted device (HMD)), a fabric- or clothing-integrated type (e.g., electronic apparel), a body-attached type (e.g., a skin pad or tattoo), or a bio-implantable type (e.g., an implantable circuit).

In some embodiments, the electronic device may be a home appliance. Home appliances may include, for example, at least one of a television, a digital video disc (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™ or PlayStation™), a digital camera, a camcorder, or an electronic photo frame.

In an alternative embodiment, the electronic device may include at least one of various medical devices (e.g., various portable medical measurement devices such as a blood glucose meter, a heart rate meter, a blood pressure meter, or a body temperature meter, and magnetic resonance angiography (MRA) equipment), a navigation system, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), a vehicle infotainment system, automotive electronic equipment (e.g., a marine navigation system, a gyro compass, etc.), avionics, a security device, a head unit for a vehicle, an industrial or home robot, an automatic teller machine (ATM), a point of sale (POS) device, or an Internet of Things device (e.g., a light bulb, various sensors, an electric or gas meter, a sprinkler device, a fire alarm, a thermostat, a street light, a toaster, exercise equipment, a hot water tank, a heater, a boiler, etc.).

According to some embodiments, the electronic device may include at least one of a piece of furniture or a part of a building/structure, an electronic board, an electronic signature receiving device, a projector, or various measuring instruments (e.g., instruments for measuring water, electricity, gas, or radio waves). In various embodiments, the electronic device may be a combination of one or more of the various devices described above. An electronic device according to some embodiments may be a flexible electronic device. Further, the electronic device according to the embodiments of the present document is not limited to the above-described devices, and may include a new electronic device according to technological advancement.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, an electronic device according to various embodiments will be described with reference to the accompanying drawings. In this document, the term user may refer to a person using an electronic device or a device using an electronic device (e.g., an artificial intelligence electronic device).

Referring to Figure 1, in various embodiments, an electronic device 101 in a network environment 100 is described. The electronic device 101 may include a bus 110, a processor 120, a memory 130, an input / output interface 150, a display 160, and a communication interface 170. In some embodiments, the electronic device 101 may omit at least one of the components or additionally include other components.

The bus 110 may include circuitry for connecting the components 110 to 170 to one another and conveying communications (e.g., control messages and/or data) between the components.

The processor 120 may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). The processor 120 may perform computations or data processing related to, for example, control and / or communication of at least one other component of the electronic device 101.

Memory 130 may include volatile and/or non-volatile memory. The memory 130 may store instructions or data related to at least one other component of the electronic device 101. According to one embodiment, the memory 130 may store software and/or a program 140. The program 140 may include, for example, a kernel 141, middleware 143, an application programming interface (API) 145, and/or an application program 147. At least a portion of the kernel 141, the middleware 143, or the API 145 may be referred to as an operating system (OS).

The kernel 141 may control or manage system resources (e.g., the bus 110, the processor 120, or the memory 130) used to execute operations or functions implemented in other programs (e.g., the middleware 143, the API 145, or the application program 147). The kernel 141 also provides an interface through which the middleware 143, the API 145, or the application program 147 can access individual components of the electronic device 101 to control or manage the system resources.

The middleware 143 can perform an intermediary role such that the API 145 or the application program 147 can communicate with the kernel 141 to exchange data.

In addition, the middleware 143 may process one or more task requests received from the application program 147 according to a priority order. For example, the middleware 143 may assign to at least one of the application programs 147 a priority for using the system resources (e.g., the bus 110, the processor 120, or the memory 130) of the electronic device 101. For example, the middleware 143 may perform scheduling or load balancing of the one or more task requests by processing them according to the priority assigned to the at least one application program.

The API 145 is an interface through which the application 147 controls functions provided by the kernel 141 or the middleware 143, and may include, for example, at least one interface or function (e.g., an instruction) for such control.

The input/output interface 150 may serve as an interface through which commands or data input from, for example, a user or another external device can be transferred to the other component(s) of the electronic device 101. The input/output interface 150 may also output commands or data received from the other component(s) of the electronic device 101 to a user or another external device.

Display 160 may include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display. Display 160 may display various content (e.g., text, images, videos, icons, or symbols) to a user. Display 160 may include a touch screen and may receive a touch, gesture, proximity, or hovering input using, for example, an electronic pen or a portion of the user's body.

The communication interface 170 establishes communication between the electronic device 101 and an external device (e.g., the first external electronic device 102, the second external electronic device 104, or the server 106). For example, the communication interface 170 may be connected to a network 162 via wireless or wired communication to communicate with an external device (e.g., the second external electronic device 104 or the server 106).

Wireless communication may include, for example, a cellular communication protocol such as long-term evolution (LTE), LTE Advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), Wireless Broadband (WiBro), or Global System for Mobile Communications (GSM). The wireless communication may also include, for example, local communication 164. The local communication 164 may include at least one of, for example, wireless fidelity (WiFi), Bluetooth, near field communication (NFC), or a global navigation satellite system (GNSS). The GNSS may include at least one of, for example, the Global Positioning System (GPS), the Global Navigation Satellite System (Glonass), or the Beidou Navigation Satellite System (Beidou), according to the use area or bandwidth. Hereinafter, in this document, "GPS" may be used interchangeably with "GNSS." Wired communication may include at least one of, for example, a universal serial bus (USB), a high definition multimedia interface (HDMI), recommended standard 232 (RS-232), or plain old telephone service (POTS). The network 162 may include at least one of telecommunication networks, for example, a computer network (e.g., a LAN or WAN), the Internet, or a telephone network.

Each of the first and second external electronic devices 102 and 104 may be the same type of device as the electronic device 101 or a different type. According to one embodiment, the server 106 may comprise a group of one or more servers. According to various embodiments, all or a portion of the operations performed in the electronic device 101 may be performed in one or more other electronic devices (e.g., the electronic devices 102 and 104, or the server 106). According to one embodiment, when the electronic device 101 is to perform a function or service automatically or on demand, the electronic device 101 may request another device (e.g., the electronic device 102 or 104, or the server 106) to perform at least some functions associated therewith, instead of or in addition to executing the function or service by itself. The other electronic device may execute the requested function or an additional function and forward the result to the electronic device 101. The electronic device 101 may process the received result as is or additionally to provide the requested function or service. For this purpose, for example, cloud computing, distributed computing, or client-server computing techniques can be used.

FIG. 2 is a block diagram of an electronic device 201 according to various embodiments. The electronic device 201 may include all or part of the electronic device 101 shown in FIG. 1, for example. The electronic device 201 may include one or more processors (e.g., an application processor (AP)) 210, a communication module 220, a subscriber identification module 224, a memory 230, a sensor module 240, an input device 250, a display 260, an interface 270, an audio module 280, a camera module 291, a power management module 295, a battery 296, an indicator 297, and a motor 298.

The processor 210 may control a plurality of hardware or software components connected to the processor 210, for example, by driving an operating system or an application program, and may perform various data processing and calculations. The processor 210 may be implemented with, for example, a system on chip (SoC). According to one embodiment, the processor 210 may further include a graphics processing unit (GPU) and/or an image signal processor. The processor 210 may include at least some of the components shown in FIG. 2 (e.g., the cellular module 221). The processor 210 may load instructions or data received from at least one of the other components (e.g., a non-volatile memory) into a volatile memory, process them, and store the resulting data in the non-volatile memory.

The communication module 220 may have the same or similar configuration as the communication interface 170 of FIG. 1. The communication module 220 may include, for example, a cellular module 221, a WiFi module 223, a Bluetooth module 225, a GNSS module 227 (e.g., a GPS module, a Glonass module, a Beidou module, or a Galileo module), an NFC module 228, and a radio frequency (RF) module 229.

The cellular module 221 can provide voice calls, video calls, text services, or Internet services, for example, over a communication network. According to one embodiment, the cellular module 221 may utilize a subscriber identity module (e.g., a SIM card) 224 to perform the identification and authentication of the electronic device 201 within the communication network. According to one embodiment, the cellular module 221 may perform at least some of the functions that the processor 210 may provide. According to one embodiment, the cellular module 221 may include a communication processor (CP).

Each of the WiFi module 223, the Bluetooth module 225, the GNSS module 227, and the NFC module 228 may include, for example, a processor for processing data transmitted and received through the corresponding module. According to some embodiments, at least some (e.g., two or more) of the cellular module 221, the WiFi module 223, the Bluetooth module 225, the GNSS module 227, or the NFC module 228 may be included in one integrated circuit (IC) or IC package.

The RF module 229 can, for example, send and receive communication signals (e.g., RF signals). The RF module 229 may include, for example, a transceiver, a power amplifier module (PAM), a frequency filter, a low noise amplifier (LNA), or an antenna. According to another embodiment, at least one of the cellular module 221, the WiFi module 223, the Bluetooth module 225, the GNSS module 227, or the NFC module 228 may transmit and receive an RF signal through a separate RF module.

The subscriber identification module 224 may include, for example, a card containing a subscriber identification module and/or an embedded SIM, and may include unique identification information (e.g., an integrated circuit card identifier (ICCID)) or subscriber information (e.g., an international mobile subscriber identity (IMSI)).

Memory 230 (e.g., the memory 130) may include, for example, an internal memory 232 or an external memory 234. The internal memory 232 may be implemented as, for example, at least one of a volatile memory (e.g., a dynamic RAM, a static RAM, or a synchronous dynamic RAM) or a non-volatile memory (e.g., a one-time programmable ROM (OTPROM), an erasable and programmable ROM (EPROM), an electrically erasable and programmable ROM (EEPROM), a mask ROM, a flash ROM, a flash memory (e.g., NAND flash or NOR flash), a hard drive, or a solid state drive (SSD)).

The external memory 234 may be a flash drive, for example, a compact flash (CF), a secure digital (SD), a micro secure digital (Micro-SD), a mini secure digital (Mini-SD), a multi-media card (MMC), a memory stick, or the like. The external memory 234 may be functionally and/or physically connected to the electronic device 201 through various interfaces.

The sensor module 240 may, for example, measure a physical quantity or sense an operating state of the electronic device 201 and convert the measured or sensed information into an electrical signal. The sensor module 240 may include, for example, at least one of a gesture sensor 240A, a gyro sensor 240B, an air pressure sensor 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a temperature sensor 240G, a color sensor 240H (e.g., an RGB (red, green, blue) sensor), a biometric sensor 240I, and a UV sensor 240M. Additionally or alternatively, the sensor module 240 may include, for example, an E-nose sensor, an electromyography sensor, an electroencephalogram sensor, an electrocardiogram sensor, an infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor. The sensor module 240 may further include a control circuit for controlling at least one or more sensors belonging to the sensor module 240. In some embodiments, the electronic device 201 may further include a processor configured to control the sensor module 240, either as part of the processor 210 or separately, so that the sensor module 240 can be controlled while the processor 210 is in a sleep state.

The input device 250 may include, for example, a touch panel 252, a (digital) pen sensor 254, a key 256, or an ultrasonic input device 258. The touch panel 252 may use, for example, at least one of an electrostatic type, a pressure-sensitive type, an infrared type, or an ultrasonic type. Further, the touch panel 252 may include a control circuit. The touch panel 252 may further include a tactile layer to provide a tactile response to the user.

The (digital) pen sensor 254 may be, for example, part of the touch panel or may include a separate recognition sheet. The key 256 may include, for example, a physical button, an optical key, or a keypad. The ultrasonic input device 258 can sense an ultrasonic wave generated by an input tool through a microphone (e.g., the microphone 288) and identify data corresponding to the detected ultrasonic wave.

Display 260 (e.g., the display 160) may include a panel 262, a hologram device 264, or a projector 266. The panel 262 may include the same or similar configuration as the display 160 of FIG. 1. The panel 262 may be implemented to be, for example, flexible, transparent, or wearable. The panel 262 may be configured as one module together with the touch panel 252. The hologram device 264 can display a stereoscopic image in the air using interference of light. The projector 266 can display an image by projecting light onto a screen. The screen may be located, for example, inside or outside the electronic device 201. According to one embodiment, the display 260 may further comprise control circuitry for controlling the panel 262, the hologram device 264, or the projector 266.

The interface 270 may include, for example, a high-definition multimedia interface (HDMI) 272, a universal serial bus (USB) 274, an optical interface 276, or a D-subminiature (D-sub) interface. The interface 270 may, for example, be included in the communication interface 170 shown in FIG. 1. Additionally or alternatively, the interface 270 may include, for example, a mobile high-definition link (MHL) interface, a secure digital (SD) card/multi-media card (MMC) interface, or an infrared data association (IrDA) standard interface.

The audio module 280 can, for example, convert sound into an electrical signal and vice versa. At least some of the components of the audio module 280 may be included, for example, in the input/output interface 150 shown in FIG. 1. The audio module 280 may process sound information input or output through, for example, a speaker 282, a receiver 284, an earphone 286, or a microphone 288.

The camera module 291 is, for example, a device capable of capturing still images and moving images, and may include one or more image sensors (e.g., a front sensor or a rear sensor), a lens, an image signal processor (ISP), or a flash (e.g., an LED or a xenon lamp).

The power management module 295 can, for example, manage the power of the electronic device 201. According to one embodiment, the power management module 295 may include a power management integrated circuit (PMIC), a charger integrated circuit, or a battery or fuel gauge. The PMIC may have a wired and/or wireless charging scheme. The wireless charging scheme may include, for example, a magnetic resonance scheme, a magnetic induction scheme, or an electromagnetic wave scheme, and may further include an additional circuit for wireless charging, for example, a coil loop, a resonant circuit, or the like. The battery gauge can measure, for example, the remaining charge of the battery 296, or the voltage, current, or temperature during charging. The battery 296 may include, for example, a rechargeable battery and/or a solar battery.

The indicator 297 may indicate a particular state of the electronic device 201 or a part thereof (e.g., the processor 210), for example, a booting state, a message state, or a charging state. The motor 298 can convert an electrical signal into mechanical vibration and can generate vibration, a haptic effect, and the like. Although not shown, the electronic device 201 may include a processing unit (e.g., a GPU) for mobile TV support. The processing unit for supporting mobile TV can process media data conforming to a standard such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or mediaFLO™, for example.

Each of the components described in this document may be composed of one or more components, and the name of the component may be changed according to the type of the electronic device. In various embodiments, the electronic device may comprise at least one of the components described herein, some components may be omitted, or may further include additional other components. In addition, some of the components of the electronic device according to various embodiments may be combined into one entity, so that the functions of the components before being combined can be performed in the same manner.

FIG. 3 is a block diagram of a program module according to various embodiments. According to one embodiment, the program module 310 (e.g., the program 140) may include an operating system (OS) that controls resources associated with an electronic device (e.g., the electronic device 101) and/or various applications (e.g., the application programs 147) running on the operating system. The operating system may be, for example, Android, iOS, Windows, Symbian, Tizen, or Bada.

The program module 310 may include a kernel 320, middleware 330, an application programming interface (API) 360, and/or an application 370. At least a portion of the program module 310 may be preloaded on the electronic device or may be downloaded from an external electronic device (e.g., the electronic device 102 or 104, or the server 106).

The kernel 320 (e.g., the kernel 141) may include, for example, a system resource manager 321 and/or a device driver 323. The system resource manager 321 can perform control, allocation, or recovery of system resources. According to one embodiment, the system resource manager 321 may include a process manager, a memory manager, or a file system manager. The device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, an audio driver, or an inter-process communication (IPC) driver.

The middleware 330 may provide functions commonly required by the application 370, or may provide various functions to the application 370 through the API 360 so that the application 370 can efficiently use the limited system resources within the electronic device. According to one embodiment, the middleware 330 (e.g., the middleware 143) may include at least one of a runtime library 335, an application manager 341, a window manager 342, a multimedia manager 343, a resource manager 344, a power manager 345, a database manager 346, a package manager 347, a connectivity manager 348, a notification manager 349, a location manager 350, a graphic manager 351, or a security manager 352.

The runtime library 335 may include, for example, a library module that a compiler uses to add new functionality through a programming language while the application 370 is executing. The runtime library 335 may perform input/output management, memory management, or arithmetic functions.

The application manager 341 can manage the life cycle of at least one of the applications 370, for example. The window manager 342 can manage GUI resources used in the screen. The multimedia manager 343 can recognize the format required for reproducing various media files and can encode or decode the media file using a codec suitable for the format. The resource manager 344 can manage resources such as source code, memory or storage space of at least one of the applications 370.

The power manager 345 operates together with a basic input / output system (BIOS), for example, to manage a battery or a power source, and can provide power information and the like necessary for the operation of the electronic device. The database manager 346 may create, retrieve, or modify a database for use in at least one of the applications 370. The package manager 347 can manage installation or update of an application distributed in the form of a package file.

The connection manager 348 may manage wireless connections, such as, for example, WiFi or Bluetooth. The notification manager 349 may display or notify events such as arrival messages, appointments, proximity notifications, etc. in a manner that is unobtrusive to the user. The location manager 350 may manage the location information of the electronic device. The graphic manager 351 may manage the graphic effect to be provided to the user or a user interface related thereto. The security manager 352 can provide all security functions necessary for system security or user authentication. According to one embodiment, when an electronic device (e.g., electronic device 101) includes a telephone function, middleware 330 further includes a telephony manager for managing the voice or video call capabilities of the electronic device can do.

Middleware 330 may include a middleware module that forms a combination of various functions of the above-described components. The middleware 330 may provide a module specialized for each type of operating system in order to provide differentiated functions. In addition, the middleware 330 may dynamically delete some existing components or add new ones.

The API 360 (e.g., the API 145) is, for example, a set of API programming functions and may be provided in a different configuration depending on the operating system. For example, in the case of Android or iOS, one API set may be provided per platform, and in the case of Tizen, two or more API sets may be provided per platform.

The application 370 (e.g., the application program 147) may include, for example, one or more applications capable of performing functions such as home 371, dialer 372, SMS/MMS 373, instant message (IM) 374, browser 375, camera 376, alarm 377, contact 378, voice dial 379, email 380, calendar 381, media player 382, album 383, or clock 384, or functions such as health care (e.g., measuring the amount of exercise or blood glucose) or providing environmental information (e.g., providing atmospheric pressure, humidity, or temperature information).

According to one embodiment, the application 370 may include an application that supports the exchange of information between the electronic device (e.g., the electronic device 101) and an external electronic device (e.g., the electronic device 102 or 104) (hereinafter, for convenience, an "information exchange application"). The information exchange application may include, for example, a notification relay application for communicating specific information to an external electronic device, or a device management application for managing an external electronic device.

For example, the notification relay application may have a function of delivering notification information generated by another application of the electronic device (e.g., an SMS/MMS application, an email application, a health care application, or an environmental information application) to an external electronic device (e.g., the electronic device 102 or 104). Further, the notification relay application can, for example, receive notification information from an external electronic device and provide it to the user.

The device management application may, for example, manage at least one function of an external electronic device (e.g., the electronic device 102 or 104) communicating with the electronic device (e.g., turning the external electronic device (or some components thereof) on or off, or adjusting the brightness (or resolution) of its display), and may manage (e.g., install, delete, or update) an application operating in the external electronic device.

According to one embodiment, the application 370 may include an application (e.g., a health care application of a mobile medical device) designated according to an attribute of the external electronic device (e.g., the electronic device 102 or 104). According to one embodiment, the application 370 may include an application received from an external electronic device (e.g., the server 106 or the electronic device 102 or 104), and may include a preloaded application or a third-party application downloadable from a server. The names of the components of the program module 310 according to the illustrated embodiment may vary depending on the type of operating system.

According to various embodiments, at least some of the program modules 310 may be implemented in software, firmware, hardware, or a combination of at least two of them. At least some of the program modules 310 may be implemented (e.g., executed) by, for example, a processor (e.g., processor 210). At least some of the program modules 310 may include, for example, modules, programs, routines, sets of instructions or processes, etc. to perform one or more functions.

As used in this document, the term "module" may refer to a unit comprising, for example, one or a combination of two or more of hardware, software, or firmware. A "module" may be interchangeably used with terms such as, for example, unit, logic, logical block, component, or circuit. A "module" may be a minimum unit or a portion of an integrally constructed component. A "module" may be a minimum unit or a portion thereof that performs one or more functions. A "module" may be implemented mechanically or electronically. For example, a "module" may include at least one of an application-specific integrated circuit (ASIC) chip, field-programmable gate arrays (FPGAs), or a programmable-logic device that performs certain operations.

At least a portion of a device (e.g., modules or functions thereof) or a method (e.g., operations) according to various embodiments may be implemented, for example, as instructions stored in a computer-readable storage medium in the form of a program module. When the instructions are executed by a processor (e.g., the processor 120), the one or more processors may perform the functions corresponding to the instructions. The computer-readable storage medium may be, for example, the memory 130.

The computer-readable recording medium may include a hard disk, a floppy disk, magnetic media (e.g., magnetic tape), optical media (e.g., a compact disc read only memory (CD-ROM) or a digital versatile disc (DVD)), magneto-optical media (e.g., a floptical disk), and hardware devices (e.g., a read only memory (ROM) or a random access memory (RAM)). The program instructions may include machine language code such as that produced by a compiler, as well as high-level language code that may be executed by a computer using an interpreter. The above-described hardware devices may be configured to operate as one or more software modules to perform the operations of the various embodiments, and vice versa.

Modules or program modules according to various embodiments may include at least one or more of the elements described above, some of which may be omitted, or may further include additional other elements. Operations performed by modules, program modules, or other components in accordance with various embodiments may be performed in a sequential, parallel, iterative, or heuristic manner. Also, some operations may be performed in a different order, omitted, or other operations may be added. And the embodiments disclosed in this document are presented for the purpose of explanation and understanding of the disclosed technology and do not limit the scope of the technology described in this document. Accordingly, the scope of this document should be interpreted to include all modifications based on the technical idea of this document or various other embodiments.

An image consists of a plurality of pixels; a digital image contains information about each pixel, and each pixel may store a pixel value (or pixel data) of a predefined size according to the format of the image. The pixel value may include information such as the color and brightness of the image, and may include an alpha value.

The alpha value is a value indicating the transparency of a pixel. For example, when the alpha value is represented by 8-bit data, the pixel has a value of 0 to 255. Here, a pixel with an alpha value of 255 is completely opaque, a pixel with an alpha value between 0 and 255 is translucent, and a pixel whose alpha value converges to 0 is transparent. When displaying on the display 440 of the electronic device 400, the area of a pixel that is completely opaque (e.g., alpha = 255) is output as is according to its RGB values without being affected by the background, whereas the area of a pixel that is translucent (e.g., 0 < alpha < 255) may be output with the color of the image's pixel value and the color of the background's pixel value combined according to the alpha value. In the present specification, pixel values are represented by 8-bit data having values of 0 to 255, but the present invention is not limited thereto; pixel values (or alpha values, RGB values) may be defined as values of 0 to 1, or may be defined in other ways.
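For reference, the blending described above can be illustrated with a short sketch. The following is a minimal, hedged example (not part of the claimed method) assuming 8-bit channels and a simple per-channel blend:

```python
# A minimal sketch of alpha compositing, assuming 8-bit (0-255) channels.
def composite_over_background(fg_rgb, bg_rgb, alpha):
    """Blend a pixel's color with the background according to its alpha value."""
    a = alpha / 255.0  # 255 -> fully opaque, 0 -> fully transparent
    return tuple(round(a * f + (1 - a) * b) for f, b in zip(fg_rgb, bg_rgb))

# An opaque pixel keeps its own color; a fully transparent pixel shows only the background.
print(composite_over_background((200, 30, 30), (255, 255, 255), 255))  # (200, 30, 30)
print(composite_over_background((200, 30, 30), (255, 255, 255), 0))    # (255, 255, 255)
```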

Among the various formats of digital images, formats such as portable network graphics (PNG), bitmap (BMP), and TGA may include an alpha channel. The pixel value of each pixel can be defined as an RGB (red, green, blue) value together with an alpha value, and in this case can be represented as (A, R, G, B). In the case of JPEG (or JPG), an alpha channel is not defined, and the electronic device 400 can convert an image in the JPEG format into a format that includes an alpha channel, such as PNG.
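As one hedged illustration of such a conversion (assuming the Pillow library and a hypothetical input file name; this is not the patented method itself), a JPEG image without an alpha channel can be converted to a PNG image whose pixels carry an (A, R, G, B)-style value:

```python
from PIL import Image

jpeg_image = Image.open("photo.jpg")      # hypothetical JPEG input (no alpha channel)
rgba_image = jpeg_image.convert("RGBA")   # add an alpha channel; pixels default to fully opaque (255)
rgba_image.save("photo.png")              # PNG preserves the alpha channel

r, g, b, a = rgba_image.getpixel((0, 0))  # each pixel now carries color plus an alpha value
```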

Digital images can be filtered (or corrected, edited, transformed, enhanced, adjusted) according to various image processing techniques. Image filtering is a process of editing the color, contrast, and outline of at least some pixels or objects in an image, and includes, for example, grayscale to give a monochrome effect to an image, cartoon to give a cartoon effect, sketch to transform an image as if drawn with a pen, and the like. According to various embodiments, image filtering may be understood as including a series of processes of correcting, editing, converting, enhancing, and adjusting an image according to known image processing techniques. That is, an image filtering application can be defined as a process of receiving an original image and generating a modified image from the original image through a predetermined sequence of correction, editing, conversion, enhancement, and adjustment.
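For illustration only, a grayscale filter of the kind mentioned above can be sketched as a per-pixel transform; the sketch below assumes pixels given as (R, G, B, A) tuples and is not the patent's own code. Note that a filter written purely in terms of RGB, like this one, silently discards the alpha channel, which is the situation addressed by the embodiments described below.

```python
# A minimal sketch of a grayscale image filter over (R, G, B, A) pixel tuples.
def grayscale_filter(pixels):
    filtered = []
    for r, g, b, a in pixels:
        y = round(0.299 * r + 0.587 * g + 0.114 * b)  # luminance of the pixel
        filtered.append((y, y, y))                    # the alpha value is dropped here
    return filtered
```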

In the following, various embodiments of the present invention for restoring alpha values after an image filtering operation will be described in more detail.

Figures 4A-4C illustrate an example of an original image 10, a filtered image 20, and a restored image 30 in accordance with various embodiments of the present invention.

According to various embodiments of the present invention, an image obtained by filtering the original image 10 using a filtering application (or an image filter) is defined as a filtered image 20, and an image obtained by restoring the alpha value of the filtered image 20 is defined as a restored image 30. The original image 10, the filtered image 20, and the restored image 30 are described below using an example including a flower-shaped object and a background, as shown in FIGS. 4A to 4C.

The electronic device 400 may temporarily create, on the memory 420, a first buffer for storing the original image 10 and a second buffer for storing the filtered image 20, and may generate the restored image 30 from the filtered image 20 stored in the second buffer by using the alpha value of the original image 10 stored in the first buffer.
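A minimal sketch of this buffer-based flow, assuming the Pillow library and a hypothetical apply_filter() callable standing in for the filtering application, might look as follows; it is an illustration under those assumptions, not the claimed implementation:

```python
from PIL import Image

def filter_and_restore(path, apply_filter):
    first_buffer = Image.open(path).convert("RGBA")              # original image 10 kept in a first buffer
    second_buffer = apply_filter(first_buffer).convert("RGBA")   # filtered image 20 (alpha may be lost)
    second_buffer.putalpha(first_buffer.getchannel("A"))         # restore alpha from the original image
    restored = second_buffer.copy()                              # restored image 30
    del first_buffer, second_buffer                              # the temporary buffers are then removed
    return restored
```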

FIG. 4A shows, as an example of the original image 10, an image that includes an opaque (e.g., alpha = 255) flower-shaped object and a fully transparent (e.g., alpha = 0) background.

FIG. 4B shows a filtered image 20 obtained by filtering the original image 10, for example by grayscale; as shown in FIG. 4B, the color of at least part of the flower object may be changed as an effect of the image filtering. Also, unlike what was intended in the image filtering, the alpha value of the background of the original image 10 may be lost in the filtered image 20, and the background of the filtered image may be changed to opaque white (or gray), which shows that the alpha value was lost as a result of the image filtering.

FIG. 4C shows an example of the restored image 30, in which the alpha value of the background whose alpha value was lost in the filtered image 20 is restored using the alpha value of the original image 10. As shown in FIG. 4C, the restored image 30 includes the flower-shaped object appropriately transformed according to the image filtering, and, because the lost alpha value of the background is restored, it can be confirmed that the background of the restored image 30 becomes transparent as in the original image 10.

In various embodiments, the original image 10 may be an image generated by the camera module, an image generated by the electronic device 400 (for example, through an application such as Paint or Photoshop), or an image obtained from an external device (e.g., another electronic device, an external storage device, or a server device), but is not limited thereto.

In various embodiments, the electronic device 400 may install and execute at least one application that includes an image filtering function. The electronic device 400 can acquire an application through various paths, such as accessing an application market and downloading the application. The application is stored and installed in the memory 420 of the electronic device 400, and the electronic device 400 can execute the installed application according to the user's input.

When the application is executed, the electronic device 400 may display a UI for selecting the original image 10 from at least one image stored in the electronic device 400 or accessible through the electronic device 400, and may receive a user's input.

The electronic device 400 may filter the original image 10 through the application to generate the filtered image 20. Image filtering can mean digital pre-processing (or real-time) image filtering or post-processing image filtering. Here, the image filtering may include at least one of image processing techniques such as image correction, editing, conversion, enhancement, and adjustment. Pre-processing (or real-time) image filtering means a method of filtering, in real time, an image being captured in a photographing step using the camera module and displaying it on a preview screen, and post-processing image filtering means filtering an already obtained image (e.g., the original image 10) to produce a filtered (or corrected, edited, transformed, enhanced, adjusted) image.

The application may support at least one of pre-processing (or real-time) image filtering and post-processing image filtering, and various embodiments of the present invention provide a technique that can be used to restore an alpha value after such image filtering.

Image filtering is a process of filtering the color, contrast, outline, and the like of the original image 10 according to digital image processing techniques, for example grayscale and the like, and an application can use one or more of these filtering techniques to filter the original image 10 and generate the filtered image 20.

The application for performing image filtering may be a CPU (central processing unit)-based or GPU (graphics processing unit)-based image filtering application. As is known, a CPU is a processor that performs control and operations for the overall system, and a GPU is a processor specialized for graphics processing. Various image filtering applications can therefore be classified into CPU-based applications executed by a CPU and GPU-based applications executed by a GPU. Because a GPU includes a large number of cores designed for efficient parallel processing, it is advantageous for parallel processing of relatively simple operations, whereas a CPU consists of a small number of cores optimized for serial processing and is advantageous for processing complicated logic; according to these respective strengths, applications may be designed to operate on a CPU basis or on a GPU basis. For example, an application for pre-processing (or real-time) image filtering that requires fast processing may operate on a GPU basis, and an application for post-processing image filtering may be designed to operate on a CPU basis. However, GPU-based applications can perform post-processing image filtering as well as pre-processing (or real-time) image filtering, and CPU-based applications can perform pre-processing (or real-time) image filtering as well as post-processing image filtering.

Some image filtering applications may cause the alpha value of the original image 10 to be lost when the original image 10 is filtered to generate the filtered image 20. For example, an application may treat a fully transparent region of an image (a region having an alpha value of 0) as having the background color, and thereby perform image filtering on the transparent region in the same way as on other, opaque regions. When the alpha value is lost as a result of the image filtering, the alpha value of the lost region may be stored as completely opaque (e.g., alpha = 255).
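To make this failure mode concrete, the following hedged sketch (an assumption for illustration, not code from the patent) shows a filter that first reflects transparent pixels in a white background color and then stores every output pixel as completely opaque:

```python
# A minimal sketch of an alpha-losing filter over (R, G, B, A) pixel tuples.
def careless_grayscale_filter(pixels):
    out = []
    for r, g, b, a in pixels:
        w = a / 255.0
        # The transparent region is reflected in the background color (white here)...
        r = round(w * r + (1 - w) * 255)
        g = round(w * g + (1 - w) * 255)
        b = round(w * b + (1 - w) * 255)
        y = round(0.299 * r + 0.587 * g + 0.114 * b)
        # ...and the lost alpha value is stored as completely opaque (255).
        out.append((y, y, y, 255))
    return out
```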

Some filtering applications are GPU-based applications suited to pre-processing (or real-time) image filtering, and since pre-processing image filtering filters images currently being captured through the camera module of the electronic device 400, such images do not need to have an alpha channel. Thus, when post-processing image filtering is performed using such an application to generate the filtered image 20, the alpha value may be lost. That is, even if the original image 10 is in a format in which an alpha channel exists, the alpha value of some pixels of the filtered image 20 may be lost depending on the application used or the type of filtering technique selected. For example, as shown in FIG. 4B, the alpha value of the background region may be lost, and the background may be displayed as a completely opaque region.

The electronic device 400 according to various embodiments of the present invention may recover the loss of alpha values that may occur after image filtering due to the above reasons. To this end, the electronic device 400 may be configured to recover the alpha value of the filtered image 20 using the alpha value of the original image 10.

FIG. 5 is a block diagram of an electronic device 400 in accordance with various embodiments of the present invention.

As shown, the electronic device 400 may include a processor 410, a memory 420, an input device 430, a display 440, and a communication module 450, some of which may be omitted or substituted in various embodiments of the present invention.

The input device 430 is a configuration for receiving a user input (e.g., a touch input, a cursor input, etc.) and may include at least a portion of the configuration of the input device 250 of FIG. 2. The electronic device 400 may support selection of a menu, an icon, and the like according to a user's input through the input device 430.

The display 440 is a configuration for displaying an image and may include at least a portion of the display 160 of FIG. 1 and/or the display 260 of FIG. 2. A panel (not shown) of the display 440 may be configured as one module together with a touch panel (not shown) of the input device 430. The display 440 can display the original image 10, the filtered image 20, and the restored image 30 under the control of the processor 410, and can display various graphical user interfaces (GUIs).

The communication module 450 is a configuration for transmitting and receiving data to and from an external device (e.g., another electronic device, a server device, etc.) and may include at least a portion of the communication interface 170 of FIG. 1 and/or the communication module 220 of FIG. 2.

The memory 420 is a configuration for storing digital data, such as an image, and may include at least a portion of the memory 130 of FIG. 1 and/or the memory 230 of FIG. 2. The electronic device 400 may include a plurality of memories, and in FIG. 5 the memory 420 may be interpreted as including at least some of the plurality of memories included in the electronic device 400.

The memory 420 may include a volatile memory 422 and a non-volatile memory 424. The non-volatile memory 424 may be implemented as at least one of, for example, a one-time programmable ROM (OTPROM), an erasable and programmable ROM (EPROM), an electrically erasable and programmable ROM (EEPROM), a mask ROM, a flash ROM, or a flash memory, and the volatile memory 422 may include at least one of a dynamic RAM (DRAM), a static RAM (SRAM), or a synchronous dynamic RAM (SDRAM).

The memory 420 stores at least one application for performing image filtering, and the stored application may be loaded by the processor 410. The memory 420 may also store various instructions that perform operations for restoring the alpha value from the filtered image 20 generated by the application. For example, the instructions may include an instruction to perform an operation of restoring the alpha value of the filtered image 20 using the alpha value of the original image 10, an instruction to create and/or remove, on the memory 420, a first buffer for storing the original image 10 and a second buffer for storing the filtered image 20, and the like.

In various embodiments of the invention, the instructions may include control instructions, such as arithmetic and logic operations, data movement, and input/output, that can be recognized by the processor 410. The instructions may be loaded and executed by the processor 410 after the filtered image is generated by the application, and the operations in which the processor 410 executes the instructions may be implemented in the operating system or in an application other than the image filtering application; however, the present invention is not limited to a specific embodiment.

In the following description, except for the operation in which the application generates the filtered image, the operations performed by the processor 410 may be performed by the processor 410 executing the instructions stored in the memory 420 in an area separate from the application.

The processor 410 is a configuration capable of performing operations or data processing relating to the control and/or communication of each component of the electronic device 400, and may include at least a portion of the processor 120 of FIG. 1 and/or the application processor 210 of FIG. 2.

The processor 410 controls the installation and execution of applications and can execute an application for performing image filtering according to a user's input. That is, the processor 410 may load the application stored in the non-volatile memory 424 into the volatile memory 422. The processor 410 controls the display 440 to generate and display a GUI implemented in the application, and the GUI may include menus for selecting the original image 10 and an image filtering technique.

The application may generate the filtered image 20 from the original image 10. The filtered image 20 may be an image filtered (or corrected, edited, converted, enhanced, adjusted) from the original image 10 according to various image filtering techniques implemented in the application (e.g., grayscale to give a monochrome effect to the image, cartoon to give a cartoon effect, sketch to transform the image as if drawn with a pen, and the like), as illustrated in FIG. 4B. The generated filtered image 20 may be an image in which the alpha value has been lost; that is, at least part of the information of the alpha channel may be deleted in the course of filtering through the application.

According to various embodiments, the processor 410 may restore the lost alpha value in the filtered image 20 to generate a restored image 30; that is, the alpha value of the filtered image 20 can be restored using the alpha value of the original image 10. The operation of generating the restored image 30 may, in terms of software, be an operation by the operating system or by an application other than the application that generates the filtered image 20 described above. The operation of the processor 410 to generate the restored image 30 may be performed by executing the instructions for generating the restored image stored in the memory 420 after the filtered image is generated by the application.

The processor 410 may generate a first buffer for storage of the original image 10 and a second buffer for storage of the filtered image 20 on the memory 420 (or volatile memory 422). Here, the first buffer and the second buffer may correspond to a specific (or unspecified) segment on the memory 420 (or the volatile memory 422), and need not be implemented as separate hardware buffers.

The first buffer and the second buffer may be created to temporarily store the original image 10 and the filtered image 20, respectively, so that the alpha value of the original image 10 can be used in restoring the alpha value of the filtered image 20. Accordingly, the processor 410 may restore the alpha value of the filtered image 20 to generate the restored image 30, and then remove the first buffer and the second buffer.

The digital image data constituting the original image 10, the filtered image 20, and the restored image 30 is composed of a plurality of pixels. Each pixel value may include information such as the color and brightness of the image, and may include an alpha value. The alpha value is a value indicating the transparency of the pixel. For example, when the alpha value is represented by 8-bit data, it has a value from 0 to 255: the pixel is completely opaque when the alpha value is 255, translucent when the value lies between 0 and 255, and transparent as the value converges to 0.
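
For illustration only (not part of the claimed embodiments), the 8-bit alpha representation described above can be sketched for a pixel stored as a packed 32-bit ARGB integer; the packing order is an assumption of this sketch:

```python
def unpack_argb(pixel: int) -> tuple[int, int, int, int]:
    """Split a packed 32-bit ARGB pixel into its 8-bit channels."""
    a = (pixel >> 24) & 0xFF  # 255: completely opaque, 0: completely transparent
    r = (pixel >> 16) & 0xFF
    g = (pixel >> 8) & 0xFF
    b = pixel & 0xFF
    return a, r, g, b


def pack_argb(a: int, r: int, g: int, b: int) -> int:
    """Recombine 8-bit channels into a packed 32-bit ARGB pixel."""
    return (a << 24) | (r << 16) | (g << 8) | b
```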

The processor 410 may restore the alpha value of the filtered image 20 for each pixel of the filtered image 20 to produce the restored image 30. The processor 410 may extract the pixels whose alpha value has been lost in the filtered image 20. For example, the processor 410 may compare, for each pixel coordinate of the filtered image 20, the alpha value of the filtered image 20 with the matching alpha value of the original image 10, and extract the pixel as one whose alpha value has been lost if the difference between the alpha values is greater than a predetermined limit value.
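
A minimal sketch of this per-pixel comparison follows; the RGBA array layout (alpha in the last channel), the numpy representation, and the default limit value are assumptions made for illustration, not details taken from the disclosure:

```python
import numpy as np


def lost_alpha_mask(original: np.ndarray, filtered: np.ndarray, limit: int = 1) -> np.ndarray:
    """Return a boolean mask of pixels whose alpha value differs between the
    original and filtered images by at least `limit`.

    Both inputs are assumed to be H x W x 4 uint8 arrays with the alpha value
    in the last channel."""
    alpha_orig = original[..., 3].astype(np.int16)
    alpha_filt = filtered[..., 3].astype(np.int16)
    return np.abs(alpha_orig - alpha_filt) >= limit
```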

The processor 410 can restore the alpha value of a pixel of the filtered image 20 whose alpha value has been lost by using the alpha value of the pixel of the original image 10 corresponding to the coordinate (x, y) of that pixel.

The processor 410 may compare the resolution of the original image 10 with the resolution of the filtered image 20 to determine whether they have the same resolution. When the original image 10 and the filtered image 20 have the same resolution, the alpha value of a pixel (x, y) whose alpha value has been lost in the filtered image 20 can be restored using the alpha value of the pixel at the same coordinate (x, y) of the original image 10. If the resolutions of the original image 10 and the filtered image 20 are different, that is, if the resolution has been changed as a result of the image filtering by the application, the alpha value of the filtered image 20 can be restored using the alpha value of the pixel of the original image 10 that corresponds, according to the scaling ratio between the resolutions of the original image 10 and the filtered image 20, to the pixel whose alpha value has been lost in the filtered image 20.
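
The coordinate mapping for the two cases (equal and differing resolutions) might be sketched as follows, assuming (width, height) tuples and nearest-neighbour rounding, which the disclosure does not specify:

```python
def source_coordinate(x: int, y: int,
                      orig_size: tuple[int, int],
                      filt_size: tuple[int, int]) -> tuple[int, int]:
    """Map a pixel coordinate (x, y) of the filtered image back to the
    corresponding pixel coordinate of the original image.

    `orig_size` and `filt_size` are (width, height). When the resolutions
    match, the coordinate is returned unchanged; otherwise it is scaled by
    the ratio between the two resolutions."""
    ow, oh = orig_size
    fw, fh = filt_size
    if (ow, oh) == (fw, fh):
        return x, y
    sx = min(int(round(x * ow / fw)), ow - 1)
    sy = min(int(round(y * oh / fh)), oh - 1)
    return sx, sy
```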

The processor 410 may determine the pixel value of the restored image 30 by combining, for each pixel, the alpha value of the original image 10 and the RGB values of the filtered image 20. For example, assume that the pixel value at the pixel coordinate (x1, y1) of the original image 10 is (A1a, R1a, G1a, B1a) and the pixel value at the pixel coordinate (x1, y1) of the filtered image 20 is (A1b, R1b, G1b, B1b). The pixel value at the pixel coordinate (x1, y1) of the restored image 30 may then be (A1a, R1b, G1b, B1b), combining the alpha value of the original image 10 with the RGB values of the filtered image 20. This will be described in more detail later with reference to FIG. 6.
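
Under the same assumed RGBA array layout as above, the per-pixel combination of the original image's alpha value with the filtered image's RGB values could look like this sketch:

```python
import numpy as np


def restore_alpha(original: np.ndarray, filtered: np.ndarray) -> np.ndarray:
    """Build the restored image: RGB values from the filtered image combined
    with the alpha values of the original image (both H x W x 4 uint8)."""
    restored = filtered.copy()           # keep the filtered RGB values
    restored[..., 3] = original[..., 3]  # replace alpha with the original's
    return restored
```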

The processor 410 may determine the pixel values of the restored image 30 by combining the alpha value of the original image 10 and the RGB values of the filtered image 20 for every pixel of the filtered image 20. In another embodiment, the processor 410 may search the filtered image 20 or the original image 10 for pixels that satisfy a certain condition on the alpha value (for example, pixels whose difference in alpha value is equal to or greater than a predetermined limit value) and perform the alpha-value restoration only for the searched pixels.

According to various embodiments, the processor 410 may extract at least one pixel whose alpha value is greater than or equal to a predetermined upper limit value in the filtered image 20, and restore the alpha value for the extracted at least one pixel. For example, the upper limit value may be 255, or a specific value of 128 or more, based on 8-bit data. If the upper limit value is set to 255, the alpha value is restored only for completely opaque pixels in the filtered image 20; this is because pixels whose alpha value has been lost are treated as completely opaque, so their alpha value may have been set to 255 by the filtering.

According to various embodiments, the processor 410 may extract at least one pixel whose alpha value is less than or equal to a predetermined lower limit value in the original image 10, and restore the alpha value of each pixel of the filtered image 20 corresponding to the extracted at least one pixel. Here, the lower limit value may be 0, or a specific value of 127 or less, based on 8-bit data. When the lower limit value is set to 0, the alpha values of the corresponding pixels of the filtered image 20 are restored using the alpha values of the completely transparent pixels of the original image 10. This is because the transparency information of the pixels whose alpha value is less than or equal to the predetermined lower limit value in the original image 10 needs to be maintained after the image filtering, so that the filtering result preserves the shape of the original image 10.
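
A sketch combining the two threshold embodiments above (an upper limit applied to the filtered image and a lower limit applied to the original image); the default values of 255 and 0 follow the 8-bit examples in the text, and the array layout remains an assumption:

```python
import numpy as np


def restore_alpha_by_thresholds(original: np.ndarray, filtered: np.ndarray,
                                upper: int = 255, lower: int = 0) -> np.ndarray:
    """Restore the alpha value only for pixels selected by the two threshold
    embodiments: alpha >= `upper` in the filtered image, or alpha <= `lower`
    in the original image (both H x W x 4 uint8, alpha in the last channel)."""
    restored = filtered.copy()
    # Pixels of the filtered image treated as completely opaque (their alpha
    # may have been overwritten as 255 when it was lost during filtering).
    mask_upper = filtered[..., 3] >= upper
    # Pixels at or below the lower limit (e.g. completely transparent) in the
    # original image, whose transparency must be preserved after filtering.
    mask_lower = original[..., 3] <= lower
    mask = mask_upper | mask_lower
    restored[..., 3][mask] = original[..., 3][mask]
    return restored
```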

As described above, the processor 410 may generate the filtered image 20 and restore the alpha value of the filtered image 20 using the alpha value of the original image 10 to generate the restored image 30. Here, the operation of generating the filtered image 20 is an operation using the application, and the operation of generating the restored image 30 may be an operation of the operating system or of an application different from that application. The processor 410 may determine whether to perform the operation of generating the restored image, which is the operation of the other application or the operating system, in consideration of the application that performs the filtering and the type of the filtering technique.

The processor 410 may filter the original image 10 according to a filtering technique selected according to a user's input, and restore the alpha value of the filtered image 20 only when the selected filtering technique is one of predetermined filtering techniques.

According to various embodiments, the processor 410 may restore the alpha value of the filtered image 20 only if the selected filtering technique is a filtering technique that does not change the shape of the image. For example, image filtering techniques include techniques that do not change the shape of objects (e.g., grayscale), in which the RGB values of each pixel are filtered depending only on the RGB values of that pixel, and techniques in which the shape of objects in the result is changed (e.g., blurring), in which the RGB values of some pixels are filtered depending on the RGB values of adjacent pixels. In the case of a filtering technique that changes the shape of the image, directly applying the alpha value of the original image 10 to the filtered image 20 may not produce the intended filtering effect. Accordingly, referring to the detailed information of the filtering technique, the processor 410 may perform the operation of restoring the alpha value of the filtered image 20 only when the selected filtering technique is one of the previously identified filtering techniques that do not change the shape of the image.

The processor 410 may restore the alpha value of the filtered image 20 only if the application performing the filtering of the original image 10 is one of predetermined applications. Here, the processor 410 may refer to the detailed information of the application, check whether it is an application in which the alpha value is lost after filtering is performed, and determine whether to perform the alpha-value restoration of the filtered image 20 according to the result of the check.

According to various embodiments, the processor 410 may restore the alpha value of the filtered image 20 only if the application is an application that performs GPU (Graphical Processing Unit) based image filtering. As described above, an application for performing image filtering may be a CPU (Central Processing Unit) based or a GPU based image filtering application. Some filtering applications are GPU-based applications suitable for pre-processing (or real-time) image filtering; since pre-processing image filtering filters the image currently being captured through the camera module of the electronic device 400, there is no need to process the alpha value, and the alpha value may therefore be lost when the filtered image 20 is generated using such a GPU-based application. Accordingly, the processor 410 may perform the operation of restoring the alpha value of the filtered image 20 when the application is an application that performs GPU based image filtering.
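
The two gating checks described above (a shape-preserving filtering technique and a GPU-based filtering application) might be combined as in the following sketch; the technique names and the app_info fields are hypothetical placeholders, not values defined by the disclosure:

```python
# Filtering techniques known not to change the shape of the image
# (illustrative; extend with other per-pixel techniques as needed).
SHAPE_PRESERVING_TECHNIQUES = {"grayscale"}


def should_restore_alpha(technique: str, app_info: dict) -> bool:
    """Decide whether the alpha-restoration step should run for this filtering pass."""
    uses_gpu_filtering = app_info.get("gpu_based", False)
    preserves_shape = technique in SHAPE_PRESERVING_TECHNIQUES
    return uses_gpu_filtering and preserves_shape
```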

FIG. 6 depicts some of the pixel values of the original image 10, the filtered image 20, and the restored image 30 according to various embodiments.

In FIG. 6, the first image shows the original image 10 of FIG. 4A, the second image shows the filtered image 20 of FIG. 4B, and the third image shows the restored image 30 of FIG. 4C.

According to various embodiments, in restoring the alpha value of the filtered image 20, the processor 410 of the electronic device 400 may restore the alpha value for each pixel of the filtered image 20 to generate the restored image 30.

As shown in the figure, it is assumed that the pixel value at the pixel coordinate (x1, y1) of the original image 10 is (A1a, R1a, G1a, B1a) and the pixel value at the pixel coordinate (x1, y1) of the filtered image 20 is (A1b, R1b, G1b, B1b). In this case, the pixel value at the pixel coordinate (x1, y1) of the restored image 30 is (A1a, R1b, G1b, B1b), combining the alpha value of the original image 10 with the RGB values of the filtered image 20. It is also assumed that the pixel value at the pixel coordinate (x2, y2) of the original image 10 is (A2a, R2a, G2a, B2a) and the pixel value at the pixel coordinate (x2, y2) of the filtered image 20 is (A2b, R2b, G2b, B2b); the pixel value at the pixel coordinate (x2, y2) of the restored image 30 is then (A2a, R2b, G2b, B2b), again combining the alpha value of the original image 10 with the RGB values of the filtered image 20.

An electronic device according to various embodiments of the present invention includes: a memory; A processor; And an application that is stored in the memory and is operable in the processor, wherein the application performs image filtering from the original image to generate a filtered image, the memory stores an instruction for performing an operation of restoring an alpha value of the filtered image using an alpha value of the original image, and the processor may execute the instruction stored in the memory after the application generates the filtered image.

According to various embodiments, the processor may execute instructions to create a first buffer for storage of the original image and a second buffer for storage of the filtered image on the memory.

According to various embodiments, the processor may perform an instruction to restore the alpha value of the filtered image to generate a reconstructed image, and then to remove the first buffer and the second buffer.

According to various embodiments, the processor may perform an instruction to reconstruct an alpha value for each pixel of the filtered image to generate a reconstructed image.

According to various embodiments, each pixel value of the original image, the filtered image, and the restored image may include an alpha value and an RGB value.

According to various embodiments, the processor may perform an instruction to determine a pixel value of the restored image by combining the alpha value of the original image and the RGB value of the filtered image for each pixel.

According to various embodiments, the processor may perform an instruction to extract at least one pixel whose alpha value is equal to or greater than a predetermined upper limit value in the filtered image, and to restore an alpha value for the extracted at least one pixel.

According to various embodiments, the processor may execute an instruction to extract at least one pixel whose alpha value is less than or equal to a predetermined lower limit value in the original image, and to restore the alpha value of the pixel of the filtered image corresponding to each of the extracted at least one pixel.

According to various embodiments, the processor may filter the original image according to a filtering technique selected according to a user's input, and execute the instruction to restore the alpha value of the filtered image only when the selected filtering technique is a filtering technique that does not change the shape of the image.

According to various embodiments, the processor may perform an instruction to restore an alpha value of the filtered image only when the application is an application that performs GPU (Graphical Processing Unit) -based image filtering.

FIG. 7 is a flowchart of an image processing method of an electronic device according to various embodiments of the present invention.

The illustrated method can be performed by the above-described electronic apparatus, and a description of the same or similar features of the electronic apparatus described above will be omitted.

At 710, the electronic device may perform an operation of receiving an original image.

At 720, the electronic device may include an operation of driving an image filtering application and performing image filtering using the driven application. As a result of performing the image filtering, a filtered image may be generated.

At 730, the electronic device may include an operation of restoring the alpha value of the filtered image using the alpha value of the original image. As a result, a restored image can be generated.

FIG. 8 is a flowchart of an image processing method of an electronic device according to various embodiments of the present invention.

The illustrated method can be performed by the above-described electronic apparatus, and a description of the same or similar features of the electronic apparatus described above will be omitted.

At 810, the electronic device may include an operation to execute an image filtering application stored in memory (or non-volatile memory), depending on the user's input.

At 820, the electronic device may include, depending on the user's input, an operation of receiving a selection of the original image and of an image filtering technique. Here, the image filtering technique can be grayscale to give a monochrome effect to an image, cartoon to give a cartoon effect, sketch to transform the image as if drawn with a pen, and so on, and various filtering techniques can be selected for one original image.

At 830, the electronic device may include an operation of filtering the original image using the executed application and generating a filtered image.

At 840, the electronic device may include an operation of determining whether the application is an application that performs GPU-based image filtering. This is because some filtering applications are GPU-based applications suitable for pre-processing (or real-time) image filtering; since pre-processing image filtering is for filtering the image currently being photographed through the camera module of the electronic device, there is no need to process the alpha value, and the alpha value may therefore be lost when a filtered image is generated using such a GPU-based application.

At 850, the electronic device may include an operation of confirming whether the selected filtering technique is a technique that does not change the shape of the image. This is because, in the case of a filtering technique that changes the shape of the image, such as blurring, directly applying the alpha value of the original image to the filtered image may not produce the intended filtering effect.

Here, the electronic device may omit at least some of operations 840 and 850, or may execute them in a different order.

At 860, the electronic device may include an operation of generating, on the memory, a first buffer for storage of the original image and a second buffer for storage of the filtered image, and storing the original image and the filtered image in the first buffer and the second buffer, respectively. Here, the first buffer and the second buffer may correspond to specific (or unspecified) segments of the memory (or volatile memory), and need not be implemented as separate hardware buffers.

At 870, the electronic device may include an operation of restoring the alpha value of the filtered image stored in the second buffer using the alpha value of the original image stored in the first buffer.

In restoring the alpha value of the filtered image, the electronic device may restore the alpha value for each pixel of the filtered image to produce a reconstructed image. The electronic device can recover the alpha value of the corresponding pixel of the filtered image using the alpha value of the pixel of the original image corresponding to the coordinates (x, y) of the pixel whose alpha value has been lost in the filtered image.

The electronic device can determine the pixel value of the restored image by combining, for each pixel, the alpha value of the original image and the RGB values of the filtered image. For example, if the pixel value at the pixel coordinate (x1, y1) of the original image is (A1a, R1a, G1a, B1a) and the pixel value at the pixel coordinate (x1, y1) of the filtered image is (A1b, R1b, G1b, B1b), the pixel value at the pixel coordinate (x1, y1) of the restored image may be (A1a, R1b, G1b, B1b), combining the alpha value of the original image with the RGB values of the filtered image.

The electronic device can determine the pixel values of the restored image by combining the alpha value of the original image and the RGB values of the filtered image for all the pixels of the filtered image. In another embodiment, the electronic device may search the filtered image or the original image for pixels that satisfy a certain condition on the alpha value and perform the alpha-value restoration only for the searched pixels.

According to various embodiments, the electronic device may extract at least one pixel whose alpha value is greater than or equal to a predetermined upper limit value in the filtered image, and restore the alpha value only for the extracted at least one pixel. Here, the upper limit value may be 255, or a specific value of 128 or more, based on 8-bit data.

According to various embodiments, the electronic device may extract at least one pixel whose alpha value is less than or equal to a predetermined lower limit value in the original image, and restore the alpha value of the pixel of the filtered image corresponding to each extracted pixel. Here, the lower limit value may be 0, or a specific value of 127 or less, based on 8-bit data.

At 880, the electronic device may include an operation of generating a reconstructed image as a result of alpha value restoration of the filtered image.

At 890, the electronic device may include removing the first and second pre-created buffers after the creation of the restored image is complete. However, the operation of 890 may be omitted depending on the setting of the electronic device and the memory storage space.
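
To tie operations 830 to 890 together, the following sketch outlines the flow of FIG. 8, reusing should_restore_alpha from the earlier sketch; the apply_filter callable, the app_info dictionary, and the RGBA array layout are all assumptions made for illustration:

```python
import numpy as np


def process_image(original: np.ndarray, apply_filter, technique: str,
                  app_info: dict) -> np.ndarray:
    """Sketch of operations 830-890: filter the image, optionally restore the
    alpha value using temporary buffers, then release the buffers."""
    filtered = apply_filter(original, technique)        # operation 830
    if not should_restore_alpha(technique, app_info):   # operations 840-850
        return filtered
    first_buffer = original.copy()                      # operation 860: buffer the original image
    second_buffer = filtered.copy()                     # operation 860: buffer the filtered image
    restored = second_buffer.copy()                     # operations 870-880: restore alpha
    restored[..., 3] = first_buffer[..., 3]
    del first_buffer, second_buffer                     # operation 890: remove the buffers
    return restored
```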

An image processing method according to various embodiments of the present invention includes: executing an application for performing image filtering; Performing image filtering on the original image through the application to generate a filtered image; And generating a restored image by restoring an alpha value of the filtered image using an alpha value of the original image.

According to various embodiments, the method may further include generating a first buffer for storing the original image and a second buffer for storing the filtered image on a memory.

According to various embodiments, after generating the restored image, the method may further include removing the first buffer and the second buffer.

According to various embodiments, the operation of generating the restored image may include generating the restored image by restoring an alpha value for each pixel of the filtered image.

According to various embodiments, each pixel value of the original image, the filtered image, and the restored image may include an alpha value and an RGB value.

According to various embodiments, the operation of generating the restored image may include determining the pixel value of the restored image by combining, for each pixel, the alpha value of the original image and the RGB value of the filtered image.

According to various embodiments, the operation of generating the restored image may include extracting at least one pixel whose alpha value is equal to or greater than a predetermined upper limit value in the filtered image, and restoring the alpha value of the extracted at least one pixel.

According to various embodiments, the operation of generating the restored image may include extracting at least one pixel whose alpha value is less than or equal to a predetermined lower limit value in the original image, and restoring the alpha value of the pixel of the filtered image corresponding to each of the extracted at least one pixel.

According to various embodiments, the operation of generating the filtered image may include filtering the original image according to a filtering technique selected according to a user's input, and the operation of generating the restored image may include restoring the alpha value of the filtered image only when the selected filtering technique is a filtering technique that does not change the shape of the image.

According to various embodiments, the operation of generating the restored image may include restoring the alpha value of the filtered image only when the application is an application that performs GPU (Graphical Processing Unit) based image filtering.

According to the various embodiments of the invention described above, it is possible to provide an electronic device and an image processing method of an electronic device capable of restoring a lost alpha value when the alpha value of at least some of the pixels in the filtered image has been lost through an application that filters the image.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; on the contrary, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

400: electronic device
410: Processor
420: memory
430: input device
440: Display
450: Communication module

Claims (20)

Memory;
A processor; And
An application stored in the memory and operable in the processor,
The application performs image filtering from the original image to generate a filtered image,
Wherein the memory stores an instruction for performing an operation of restoring an alpha value of the filtered image using an alpha value of the original image,
Wherein the processor executes the instructions stored in the memory after the application generates the filtered image.
The electronic device according to claim 1,
The processor comprising:
Executes an instruction to generate a first buffer for storage of the original image and a second buffer for storage of the filtered image on the memory.
The electronic device according to claim 2,
The processor comprising:
And performing an instruction to remove the first buffer and the second buffer after restoring an alpha value of the filtered image to generate a restored image.
The electronic device according to claim 1,
The processor comprising:
And to restore the alpha value for each pixel of the filtered image to produce a reconstructed image.
The electronic device according to claim 4,
Wherein each pixel value of the original image, filtered image, and restored image comprises an alpha value and an RGB value.
The electronic device according to claim 5,
The processor comprising:
And performing an instruction to determine a pixel value of the restored image by combining the alpha value of the original image and the RGB value of the filtered image for each of the pixels.
The electronic device according to claim 5,
The processor comprising:
Extracts at least one pixel whose alpha value is equal to or greater than a predetermined upper limit value in the filtered image, and restores an alpha value for the extracted at least one pixel.
The electronic device according to claim 5,
The processor comprising:
Extracts at least one pixel whose alpha value is equal to or less than a predetermined lower limit value in the original image, and restores an alpha value of a pixel of the filtered image corresponding to each of the extracted at least one pixel.
The electronic device according to claim 1,
The processor comprising:
Filtering the original image according to a filtering technique selected according to a user's input,
Wherein the command is to restore an alpha value of the filtered image only when the selected filtering technique is a filtering technique that does not change the shape of the image.
The electronic device according to claim 1,
The processor comprising:
Wherein the processor executes an instruction to restore an alpha value of the filtered image only when the application is an application that performs GPU (Graphical Processing Unit) based image filtering.
Executing an application for performing image filtering;
Performing image filtering on the original image through the application to generate a filtered image; And
And generating a restored image by restoring an alpha value of the filtered image using an alpha value of the original image.
The method according to claim 11,
Further comprising generating a first buffer for storage of the original image and a second buffer for storage of the filtered image on a memory.
The method according to claim 12,
Further comprising: after generating the reconstructed image, removing the first buffer and the second buffer.
The method according to claim 11,
Wherein the generating the reconstructed image comprises:
And restoring an alpha value for each pixel of the filtered image to produce a reconstructed image.
The method according to claim 14,
Wherein each pixel value of the original image, the filtered image, and the restored image comprises an alpha value and an RGB value.
The method according to claim 15,
Wherein the generating the reconstructed image comprises:
And combining the alpha value of the original image and the RGB value of the filtered image for each pixel to determine a pixel value of the reconstructed image.
The method according to claim 15,
Wherein the generating the reconstructed image comprises:
Extracting at least one pixel whose alpha value is equal to or greater than a predetermined upper limit value in the filtered image, and restoring an alpha value for the extracted at least one pixel.
The method according to claim 15,
Wherein the generating the reconstructed image comprises:
Extracting at least one pixel whose alpha value is less than or equal to a predetermined lower limit value in the original image and restoring an alpha value of the pixel of the filtered image corresponding to each of the extracted at least one pixel.
The method according to claim 11,
The operation of generating the filtered image comprises:
Filtering the original image according to a filtering technique selected according to a user's input,
Wherein the generating the reconstructed image comprises:
And restoring an alpha value of the filtered image only when the selected filtering technique is a filtering technique that does not change the shape of the image.
The method according to claim 11,
Wherein the generating the reconstructed image comprises:
And restoring an alpha value of the filtered image only when the application is an application that performs GPU (Graphical Processing Unit) based image filtering.

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150113176A KR20170019159A (en) 2015-08-11 2015-08-11 Electronic device and image processing method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150113176A KR20170019159A (en) 2015-08-11 2015-08-11 Electronic device and image processing method thereof

Publications (1)

Publication Number Publication Date
KR20170019159A true KR20170019159A (en) 2017-02-21

Family

ID=58313775

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150113176A KR20170019159A (en) 2015-08-11 2015-08-11 Electronic device and image processing method thereof

Country Status (1)

Country Link
KR (1) KR20170019159A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019209007A1 (en) * 2018-04-24 2019-10-31 주식회사 지디에프랩 Ai-based image compression and decompression system
