KR20170052264A - Electronic device and method for tracking an object using a camera module of the same - Google Patents

Electronic device and method for tracking an object using a camera module of the same

Info

Publication number
KR20170052264A
KR20170052264A KR1020150154511A
Authority
KR
South Korea
Prior art keywords
line elements
electronic device
line
processor
image
Prior art date
Application number
KR1020150154511A
Other languages
Korean (ko)
Inventor
그르제고르즈 파웰 그르지시악
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Priority to KR1020150154511A priority Critical patent/KR20170052264A/en
Publication of KR20170052264A publication Critical patent/KR20170052264A/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • G06K9/00355
    • G06K9/6296
    • H04N5/23267

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

According to embodiments of the present invention, an electronic device comprises: a housing; one or more cameras included in the housing that obtain an image including one or more objects; a display module included in the housing that displays the obtained image; a processor included in the housing and electrically connected to the cameras and/or the display module; and a memory electrically connected to the processor. The memory stores instructions that, when executed, cause the processor to detect edges of the object from a single image frame obtained through the cameras, analyze line elements of the object based on the edges detected from the single image frame, and confirm movement of the object based on the result of analyzing the line elements of the object in the single image frame. Other embodiments are possible.

Description

TECHNICAL FIELD [0001] The present invention relates to an electronic device and a method for tracking an object using a camera module of the electronic device.

Various embodiments of the present invention are directed to an electronic device for tracking an object through a camera module and to an object tracking method thereof.

The electronic device can photograph a subject or object through the camera module and can detect edges of the subject or object from the photographed image. The electronic device can identify various information using the edges in the image. For example, the direction of movement of an object can be determined by comparing the edges of the previous image frame with those of the current image frame. The electronic device may use a line detection algorithm, such as the Hough transform, to analyze the edges. For a general description of the Hough transform, see references [1], [2], and [3] below.

[references]

[1] Hough Transform, http://homepages.inf.ed.ac.uk/rbf/HIPR2/hough.htm

[2] Hough Transform, http://en.wikipedia.org/wiki/Hough_transform, Oct 20, 2011

[3] MACHII KIMIYOSHI, "INFORMATION PROCESSOR WITH HAND SHAPE RECOGNIZING FUNCTION", Patent no. JP 9035066, Feb 07, 1997
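The line detection the description relies on, the classical ρ-θ Hough transform covered in the references above, can be sketched in a few lines of NumPy. This is a generic textbook implementation, not code from the patent; the image size, angular resolution, and the test line below are illustrative.

```python
import numpy as np

def hough_transform(binary, n_theta=180):
    """Vote in (rho, theta) space: each foreground pixel (x, y) votes for
    every line rho = x*cos(theta) + y*sin(theta) passing through it."""
    h, w = binary.shape
    diag = int(np.ceil(np.hypot(h, w)))
    rhos = np.arange(-diag, diag + 1)           # possible rho values
    thetas = np.deg2rad(np.arange(n_theta))     # 0..179 degrees
    acc = np.zeros((len(rhos), len(thetas)), dtype=np.int32)
    for y, x in zip(*np.nonzero(binary)):
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[r + diag, np.arange(len(thetas))] += 1
    return acc, rhos, thetas

# a 10x10 edge map containing a single vertical line at x = 4
img = np.zeros((10, 10), dtype=np.uint8)
img[:, 4] = 1
acc, rhos, thetas = hough_transform(img)
# the 10 collinear pixels all vote for the bin (rho = 4, theta = 0)
```

Collinear pixels accumulate in a single (ρ, θ) bin, so a peak in the accumulator identifies a line in the image, which is the property the embodiments below exploit.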

To identify the motion of an object through the camera module, the electronic device must use at least two image frames, and it can determine the velocity or direction of movement of the object only after both frames have been analyzed. In this case, the processing time required for at least two image frames makes it difficult to determine an accurate moving direction or moving speed.
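For contrast, the conventional two-frame approach described here measures displacement between consecutive frames. A minimal sketch, using synthetic frames, centroid tracking as the matching step, and a hypothetical 30 fps frame rate:

```python
import numpy as np

def centroid_x(binary):
    """Horizontal centroid of the foreground pixels of a binary mask."""
    ys, xs = np.nonzero(binary)
    return xs.mean()

# two consecutive synthetic frames: a blob moves from x = 2 to x = 5
f0 = np.zeros((8, 8), dtype=np.uint8); f0[3, 2] = 1
f1 = np.zeros((8, 8), dtype=np.uint8); f1[3, 5] = 1

dx = centroid_x(f1) - centroid_x(f0)   # pixels moved between the frames
speed_px_per_s = dx * 30.0             # assuming 30 frames per second
```

Note that the estimate is only available after the second frame has been captured and processed, which is exactly the latency the single-frame method of this document avoids.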

Further, when a blurred image frame is acquired, the electronic device has difficulty determining the moving direction or moving speed of the object from the blurred image of a single frame.

Various embodiments of the present invention may provide an apparatus and method for analyzing line elements of an object from a single image frame and for verifying the motion of the object based on the analysis results.

An electronic device in accordance with various embodiments of the present invention includes a housing; at least one camera included in the housing that obtains an image comprising at least one object; a display module included in the housing that displays the obtained image; a processor included in the housing and electrically coupled to the at least one camera and/or the display module; and a memory electrically coupled to the processor. The memory may store instructions that, when executed, cause the processor to detect edges of the object in a single image frame acquired through the camera, analyze line elements of the object based on the edges detected in the single image frame, and determine a movement of the object based on the result of analyzing the line elements of the object in the single image frame.

The object tracking method of an electronic device according to various embodiments of the present invention can confirm the movement of an object in a short time by analyzing a single image frame.

The object tracking method of an electronic device according to various embodiments of the present invention can provide an efficient image analysis method by confirming the movement of an object from a blurred image of a single frame.

FIG. 1 illustrates a network environment including an electronic device according to an embodiment of the present invention.
FIG. 2 is a block diagram of an electronic device in accordance with various embodiments of the present invention.
FIG. 3 is a block diagram of a program module in accordance with various embodiments of the present invention.
FIG. 4 is a flow chart illustrating how the electronic device of FIG. 2, in accordance with various embodiments of the present invention, confirms the movement of an object through a single image frame.
FIG. 5A illustrates an electronic device in accordance with various embodiments of the present invention detecting the edges of an object in a single frame image using an edge detection method.
FIG. 5B illustrates an electronic device according to various embodiments of the present invention performing image binarization on an edge detection image of an object to display the line elements of the object.
FIG. 6 shows the result of an electronic device according to various embodiments of the present invention performing a Hough transform on an image representing a human hand.
FIG. 7 illustrates the parametric representation of a straight line used by the Hough transform on an image according to various embodiments of the present invention.
FIG. 8 shows an image of one line and the result of its Hough transform according to various embodiments of the present invention.
FIG. 9 shows an image of two lines and the result of the Hough transform according to various embodiments of the present invention.
FIG. 10 illustrates the Hough transform result of two parallel lines having the same angular value in Hough space according to various embodiments of the present invention.
FIG. 11 shows an image of a plurality of lines and the result of the Hough transform according to various embodiments of the present invention.
FIG. 12 illustrates a contour image of a human finger and the result of its Hough transform according to various embodiments of the present invention.
FIG. 13 illustrates an electronic device according to various embodiments of the present invention identifying parallel line elements having a maximum value within a predetermined angular range from the result of a Hough transform of a plurality of lines.
FIG. 14A shows an electronic device according to various embodiments of the present invention displaying a linear shape according to the angular velocity of a hand.
FIG. 14B shows an electronic device according to various embodiments of the present invention displaying a wavy shape according to the angular velocity of the hand.
FIG. 15A illustrates an electronic device according to various embodiments of the present invention performing various functions according to various gesture inputs of a hand.
FIG. 15B illustrates an electronic device according to various embodiments of the present invention performing various functions according to various gesture inputs of a finger that is part of a hand.

Hereinafter, various embodiments of the present document will be described with reference to the accompanying drawings. The embodiments and the terminology used herein are not intended to limit the invention to the particular forms described, but to include various modifications, equivalents, and/or alternatives of the embodiments. In the description of the drawings, like reference numerals may be used for similar components. Singular expressions may include plural expressions unless the context clearly dictates otherwise. In this document, expressions such as "A or B" or "at least one of A and/or B" may include all possible combinations of the items listed together. Expressions such as "first" or "second" may modify the corresponding components regardless of order or importance, are used only to distinguish one component from another, and do not limit those components. When it is mentioned that some (e.g., first) component is "(functionally or communicatively) connected" or "coupled" to another (e.g., second) component, the component may be connected directly to the other component or may be connected through yet another component (e.g., a third component).

In this document, the term "configured to (or set to)" may be used interchangeably with, for example, "suitable for," "having the capacity to," "adapted to," "made to," "capable of," or "designed to," in hardware or software. In some situations, the expression "a device configured to" may mean that the device "can" operate together with other devices or components. For example, the phrase "a processor configured (or set) to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a general-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.

Electronic devices in accordance with various embodiments of this document may include, for example, smartphones, tablet PCs, mobile phones, video phones, e-book readers, desktop PCs, laptop PCs, netbook computers, workstations, portable multimedia players, MP3 players, medical devices, cameras, or wearable devices. Wearable devices may be of an accessory type (e.g., watches, rings, bracelets, anklets, necklaces, glasses, contact lenses, or head-mounted devices (HMDs)), a body-attached type (e.g., a skin pad or tattoo), or a bio-implantable circuit. In some embodiments, the electronic device may be a home appliance such as, for example, a television, a digital video disk (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation control panel, a media box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™, PlayStation™), an electronic dictionary, an electronic key, a camcorder, or an electronic photo frame.

In alternative embodiments, the electronic device may be any of a variety of medical devices (e.g., portable medical measurement devices such as a blood glucose meter, heart rate meter, blood pressure meter, or body temperature meter, or magnetic resonance angiography (MRA) devices), a navigation system, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), an automotive infotainment device, marine electronic equipment (e.g., marine navigation systems, gyro compasses, etc.), avionics, security devices, head units for vehicles, industrial or domestic robots, drones, ATMs at financial institutions, or Internet-of-Things devices (e.g., a light bulb, fire detector, fire alarm, thermostat, streetlight, toaster, fitness device, hot water tank, heater, or boiler). According to some embodiments, the electronic device may be a piece of furniture, part of a building/structure or automobile, an electronic board, an electronic signature receiving device, a projector, or one of various measuring devices (e.g., water, electricity, gas, or radio wave measuring instruments). In various embodiments, the electronic device may be flexible, or may be a combination of two or more of the various devices described above. The electronic device according to the embodiments of this document is not limited to the above-described devices. In this document, the term "user" may refer to a person using an electronic device or to a device using an electronic device (e.g., an artificial intelligence electronic device).

Referring to FIG. 1, an electronic device 101 in a network environment 100 according to various embodiments is described. The electronic device 101 may include a bus 110, a processor 120, a memory 130, an input/output interface 150, a display 160, and a communication interface 170. In some embodiments, the electronic device 101 may omit at least one of the components or may additionally include other components. The bus 110 may include circuitry that connects the components 110-170 to one another and conveys communications (e.g., control messages or data) between them. The processor 120 may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). The processor 120 may perform computations or data processing related to, for example, control and/or communication of at least one other component of the electronic device 101.

The memory 130 may include volatile and/or non-volatile memory. The memory 130 may store instructions or data related to at least one other component of the electronic device 101. According to one embodiment, the memory 130 may store software and/or a program 140. The program 140 may include, for example, a kernel 141, middleware 143, an application programming interface (API) 145, and/or an application program 147. At least some of the kernel 141, the middleware 143, or the API 145 may be referred to as an operating system. The kernel 141 may control or manage the system resources (e.g., the bus 110, the processor 120, or the memory 130) used to execute operations or functions implemented in the other programs (e.g., the middleware 143, the API 145, or the application program 147). The kernel 141 may also provide an interface through which the middleware 143, the API 145, or the application program 147 can access the individual components of the electronic device 101 to control or manage the system resources.

The middleware 143 can act as an intermediary so that the API 145 or the application program 147 can communicate with the kernel 141 to exchange data. In addition, the middleware 143 may process one or more task requests received from the application program 147 according to their priority. For example, the middleware 143 may assign at least one application program 147 a priority for using the system resources (e.g., the bus 110, the processor 120, or the memory 130) of the electronic device 101, and may process the one or more task requests accordingly. The API 145 is an interface through which the application program 147 controls the functions provided by the kernel 141 or the middleware 143, and may include, for example, at least one interface or function (e.g., a command). The input/output interface 150 may transfer commands or data entered from a user or another external device to the other component(s) of the electronic device 101, or may output commands or data received from the other component(s) of the electronic device 101 to the user or another external device.

The display 160 may include, for example, a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display. The display 160 may present various content (e.g., text, images, video, icons, and/or symbols) to a user. The display 160 may include a touch screen and may receive a touch, gesture, proximity, or hovering input using, for example, an electronic pen or a part of the user's body. The communication interface 170 may establish communication between the electronic device 101 and an external device (e.g., the first external electronic device 102, the second external electronic device 104, or the server 106). For example, the communication interface 170 may be connected to a network 162 via wireless or wired communication to communicate with an external device (e.g., the second external electronic device 104 or the server 106).

The wireless communication may include cellular communication using, for example, at least one of LTE, LTE-A (LTE Advanced), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), or Global System for Mobile Communications (GSM). According to one embodiment, the wireless communication may include, for example, at least one of wireless fidelity (WiFi), Bluetooth, Bluetooth Low Energy (BLE), Zigbee, near field communication (NFC), Magnetic Secure Transmission, radio frequency (RF), or body area network (BAN). According to one embodiment, the wireless communication may include GNSS. GNSS may be, for example, the Global Positioning System (GPS), the Global Navigation Satellite System (Glonass), the Beidou Navigation Satellite System (Beidou), or Galileo, the European global satellite-based navigation system. Hereinafter, in this document, "GPS" may be used interchangeably with "GNSS." The wired communication may include, for example, at least one of universal serial bus (USB), high-definition multimedia interface (HDMI), recommended standard 232 (RS-232), power line communication, or plain old telephone service (POTS). The network 162 may include at least one telecommunications network, for example, a computer network (e.g., a LAN or WAN), the Internet, or a telephone network.

Each of the first and second external electronic devices 102 and 104 may be of the same or a different kind as the electronic device 101. According to various embodiments, all or some of the operations performed in the electronic device 101 may be performed in one or more other electronic devices (e.g., the electronic devices 102 and 104, or the server 106). According to one embodiment, when the electronic device 101 is to perform a function or service automatically or on request, the electronic device 101 may, instead of or in addition to executing the function or service itself, request another device (e.g., the electronic device 102 or 104, or the server 106) to perform at least some of the functions associated therewith. The other electronic device may execute the requested function or additional functions and forward the results to the electronic device 101. The electronic device 101 may process the received results as-is or additionally to provide the requested function or service. To this end, for example, cloud computing, distributed computing, or client-server computing techniques can be used.

FIG. 2 is a block diagram of an electronic device 201 according to various embodiments. The electronic device 201 may include all or part of the electronic device 101 shown in FIG. 1. The electronic device 201 may include one or more processors (e.g., an AP) 210, a communication module 220, a subscriber identification module 224, a memory 230, a sensor module 240, an input device 250, a display 260, an interface 270, an audio module 280, a camera module 291, a power management module 295, a battery 296, an indicator 297, and a motor 298.

The processor 210 may control a number of hardware or software components connected to it, for example by running an operating system or application programs, and may perform various data processing operations and calculations. The processor 210 may be implemented with, for example, a system on chip (SoC). According to one embodiment, the processor 210 may further include a graphics processing unit (GPU) and/or an image signal processor. The processor 210 may include at least some of the components shown in FIG. 2 (e.g., the cellular module 221). The processor 210 may load instructions and/or data received from at least one of the other components (e.g., non-volatile memory) into volatile memory, process them, and store the resulting data in non-volatile memory.

The communication module 220 may have the same or a similar configuration as the communication interface 170. The communication module 220 may include, for example, a cellular module 221, a WiFi module 223, a Bluetooth module 225, a GNSS module 227, an NFC module 228, and an RF module 229. The cellular module 221 can provide voice calls, video calls, text services, or Internet services, for example, over a communication network. According to one embodiment, the cellular module 221 may use a subscriber identification module (e.g., a SIM card) 224 to identify and authenticate the electronic device 201 within the communication network. According to one embodiment, the cellular module 221 may perform at least some of the functions that the processor 210 can provide. According to one embodiment, the cellular module 221 may comprise a communication processor (CP). According to some embodiments, at least some (e.g., two or more) of the cellular module 221, the WiFi module 223, the Bluetooth module 225, the GNSS module 227, or the NFC module 228 may be included in a single integrated circuit (IC) or IC package. The RF module 229 can, for example, send and receive communication signals (e.g., RF signals). The RF module 229 may include, for example, a transceiver, a power amplifier module (PAM), a frequency filter, a low-noise amplifier (LNA), or an antenna. According to another embodiment, at least one of the cellular module 221, the WiFi module 223, the Bluetooth module 225, the GNSS module 227, or the NFC module 228 may transmit and receive RF signals through a separate RF module. The subscriber identification module 224 may include, for example, a card containing a subscriber identification module or an embedded SIM, and may contain unique identification information (e.g., an integrated circuit card identifier (ICCID)) or subscriber information (e.g., an international mobile subscriber identity (IMSI)).

The memory 230 (e.g., the memory 130) may include, for example, an internal memory 232 or an external memory 234. The internal memory 232 may include, for example, at least one of volatile memory (e.g., DRAM, SRAM, or SDRAM) and non-volatile memory (e.g., OTPROM, PROM, EPROM, EEPROM, mask ROM, flash ROM, flash memory, a hard drive, or a solid state drive (SSD)). The external memory 234 may be a flash drive such as, for example, compact flash (CF), secure digital (SD), micro-SD, mini-SD, extreme digital (xD), a multi-media card (MMC), or a memory stick. The external memory 234 may be functionally or physically connected to the electronic device 201 through various interfaces.

In accordance with various embodiments of the present invention, the memory 230 may store instructions that, when executed, cause the processor 210 to detect the edges of an object in a single image frame acquired through the camera module 291, analyze line elements of the object based on the edges detected in the single image frame, and confirm the movement of the object based on the result of analyzing the line elements of the object in the single image frame.

The memory 230 according to various embodiments of the present invention may store instructions that, when executed, cause the processor 210 to obtain an edge detection image in which the edges of the object have been detected, and to perform image binarization on the acquired edge detection image to obtain a line detection image representing the line elements of the object.
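The edge-detection-then-binarization step described here can be sketched in pure NumPy. The Sobel operator and the threshold value are illustrative choices made for this sketch; the description does not fix a particular edge detector.

```python
import numpy as np

def edge_magnitude(img):
    """Sobel gradient magnitude of a 2-D grayscale image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w)); gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            win = pad[i:i + h, j:j + w]
            gx += kx[i, j] * win     # horizontal gradient
            gy += ky[i, j] * win     # vertical gradient
    return np.hypot(gx, gy)

def binarize(edges, thresh):
    """Image binarization: keep only strong edge responses as line elements."""
    return (edges >= thresh).astype(np.uint8)

# synthetic frame: dark background with a bright vertical bar
frame = np.zeros((8, 8)); frame[:, 3:5] = 255.0
line_img = binarize(edge_magnitude(frame), thresh=255)
```

The resulting binary line detection image is what a Hough-style line detection algorithm would then consume.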

The memory 230 in accordance with various embodiments of the present invention may store instructions that, when executed, cause the processor 210 to identify the line elements of the object through a line detection algorithm.

The memory 230 according to various embodiments of the present invention may store instructions that, when executed, cause the processor 210 to identify, among the line elements of the object, line elements that are parallel to each other through a line detection algorithm.
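In a ρ-θ Hough accumulator, parallel lines share the same θ and differ only in ρ (as FIG. 10 illustrates), so identifying parallel line elements amounts to finding a θ column with two or more strong peaks. A self-contained sketch with a minimal accumulator and a synthetic two-line image (all thresholds illustrative):

```python
import numpy as np

def hough_acc(binary, thetas):
    """Minimal rho-theta Hough accumulator (thetas in radians)."""
    h, w = binary.shape
    diag = int(np.ceil(np.hypot(h, w)))
    rhos = np.arange(-diag, diag + 1)
    acc = np.zeros((len(rhos), len(thetas)), dtype=int)
    for y, x in zip(*np.nonzero(binary)):
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[r + diag, np.arange(len(thetas))] += 1
    return acc, rhos

def parallel_line_elements(acc, rhos, min_votes):
    """Theta columns holding two or more strong peaks: parallel lines
    share an angle and differ only in their rho offsets."""
    hits = {}
    for t in range(acc.shape[1]):
        strong = np.nonzero(acc[:, t] >= min_votes)[0]
        if len(strong) >= 2:
            hits[t] = [int(rhos[i]) for i in strong]
    return hits

# two vertical parallel lines at x = 2 and x = 6
img = np.zeros((10, 10), dtype=np.uint8)
img[:, 2] = 1
img[:, 6] = 1
thetas = np.deg2rad(np.arange(0, 90))     # 0..89 degrees
acc, rhos = hough_acc(img, thetas)
pairs = parallel_line_elements(acc, rhos, min_votes=10)
```

At θ = 0 the accumulator holds two full-strength peaks, at ρ = 2 and ρ = 6, which is the signature of the parallel line elements the claim refers to.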

The memory 230 according to various embodiments of the present invention may store instructions that, when executed, cause the processor 210 to ascertain the angular velocity of the motion of the object based on the result of analyzing the line elements of the object.

The memory 230 according to various embodiments of the present invention may store instructions that, when executed, cause the processor 210 to ascertain the angular velocity of the motion of the object using at least one of the average length, the median length, and the minimum length of the parallel line elements.
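The description does not spell out the conversion from line-element length to angular velocity. One plausible reading is that the parallel line elements are motion-blur streaks, whose length in pixels divided by the exposure time gives an image-plane rate, which the camera's per-pixel angular resolution turns into an angular velocity. A sketch under those assumptions; the field of view, image width, exposure time, and streak lengths below are all hypothetical:

```python
from statistics import mean, median

def angular_rate_deg_s(streak_px, fov_deg, width_px, exposure_s, reduce=min):
    """Angular velocity estimate from motion-blur streak lengths.

    reduce selects the aggregation the description allows: the average
    (mean), middle (median), or minimum length of the parallel line elements.
    """
    deg_per_px = fov_deg / width_px        # small-angle approximation
    return reduce(streak_px) * deg_per_px / exposure_s

# hypothetical camera: 60-degree FOV, 640 px wide, 1/50 s exposure
streaks = [24, 30, 27]                     # lengths of parallel line elements
w_min = angular_rate_deg_s(streaks, 60.0, 640, 0.02, reduce=min)
w_avg = angular_rate_deg_s(streaks, 60.0, 640, 0.02, reduce=mean)
```

Taking the minimum length gives a conservative estimate; the mean and median trade robustness against outlying streaks.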

The memory 230 in accordance with various embodiments of the present invention may store instructions that, when executed, cause the processor 210 to determine the speed of the motion of the object based on the result of analyzing the line elements of the object and distance information from the device to the object.
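Combining a blur-streak length with distance information suggests a pinhole-camera conversion: a pixel at distance Z spans roughly Z/f meters, where f is the focal length in pixels. This is one possible reading of the claim, not the patent's stated formula; all values below are hypothetical:

```python
def object_speed_m_s(blur_px, distance_m, focal_px, exposure_s):
    """Linear speed estimate: pixel blur scaled to meters at the object's
    distance (pinhole model), divided by the exposure time."""
    meters_per_px = distance_m / focal_px
    return blur_px * meters_per_px / exposure_s

# hypothetical values: 25-px streak, object 2 m away,
# 500-px focal length, 1/100 s exposure
v = object_speed_m_s(25, distance_m=2.0, focal_px=500.0, exposure_s=0.01)
```

The same streak length thus yields either an angular rate (no distance needed) or a linear speed once the device-to-object distance is known.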

The memory 230 according to various embodiments of the present invention may store instructions that, when executed, cause the processor 210 to determine the direction of the motion of the object based on the result of analyzing the line elements of the object.

The memory 230 according to various embodiments of the present invention may store instructions that, when executed, cause the processor 210 to display on the display at least one of the velocity and the angular velocity of the motion of the object.

The sensor module 240 may, for example, measure a physical quantity or sense an operating state of the electronic device 201 and convert the measured or sensed information into an electrical signal. The sensor module 240 may include, for example, at least one of a gesture sensor 240A, a gyro sensor 240B, an air pressure sensor 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a color sensor 240H (e.g., an RGB (red, green, blue) sensor), a biometric sensor 240I, and a UV sensor 240M. Additionally or alternatively, the sensor module 240 may include, for example, an e-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor. The sensor module 240 may further include a control circuit for controlling at least one of the sensors belonging to it. In some embodiments, the electronic device 201 may further include a processor configured to control the sensor module 240, either as part of the processor 210 or separately, so that the sensor module 240 can be controlled while the processor 210 is in a sleep state.

The input device 250 may include, for example, a touch panel 252, a (digital) pen sensor 254, a key 256, or an ultrasonic input device 258. The touch panel 252 may use, for example, at least one of an electrostatic, pressure-sensitive, infrared, or ultrasonic scheme. The touch panel 252 may further include a control circuit, and may further include a tactile layer to provide a tactile response to the user. The (digital) pen sensor 254 may be, for example, part of the touch panel or may include a separate recognition sheet. The key 256 may include, for example, a physical button, an optical key, or a keypad. The ultrasonic input device 258 can sense ultrasonic waves generated by an input tool through a microphone (e.g., the microphone 288) and identify the data corresponding to the sensed ultrasonic waves.

The display 260 (e.g., the display 160) may include a panel 262, a hologram device 264, a projector 266, and/or control circuitry for controlling them. The panel 262 may be implemented, for example, to be flexible, transparent, or wearable. The panel 262 and the touch panel 252 may be configured as one or more modules. The hologram device 264 can display a stereoscopic image in the air using the interference of light. The projector 266 can display an image by projecting light onto a screen. The screen may be located, for example, inside or outside the electronic device 201. The interface 270 may include, for example, an HDMI 272, a USB 274, an optical interface 276, or a D-subminiature (D-sub) 278. The interface 270 may, for example, be included in the communication interface 170 shown in FIG. 1. Additionally or alternatively, the interface 270 may include, for example, a mobile high-definition link (MHL) interface, an SD card/multi-media card (MMC) interface, or an infrared data association (IrDA) interface.

The audio module 280 can, for example, convert between sound and electrical signals in both directions. At least some components of the audio module 280 may be included, for example, in the input/output interface 150 shown in FIG. 1. The audio module 280 may process sound information input or output through, for example, a speaker 282, a receiver 284, earphones 286, or a microphone 288. The camera module 291 is, for example, a device capable of capturing still images and moving images, and according to one embodiment may include one or more image sensors (e.g., a front sensor or a rear sensor), a lens, an image signal processor (ISP), or a flash (e.g., an LED or xenon lamp). The power management module 295 can, for example, manage the power of the electronic device 201. According to one embodiment, the power management module 295 may include a power management integrated circuit (PMIC), a charging IC, or a battery or fuel gauge. The PMIC may support a wired and/or wireless charging scheme. The wireless charging scheme may include, for example, a magnetic resonance scheme, a magnetic induction scheme, or an electromagnetic wave scheme, and may further include additional circuitry for wireless charging, for example, a coil loop or a resonant circuit. The battery gauge can measure, for example, the remaining charge of the battery 296 and its voltage, current, or temperature during charging. The battery 296 may include, for example, a rechargeable battery and/or a solar cell.

The indicator 297 may indicate a particular state of the electronic device 201 or a part thereof (e.g., the processor 210), for example a boot state, a message state, or a charging state. The motor 298 can convert an electrical signal into mechanical vibration and can generate vibration, haptic effects, and the like. The electronic device 201 may include, for example, a mobile TV support device (e.g., a GPU) capable of processing media data in accordance with standards such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or MediaFLO™. Each of the components described in this document may be composed of one or more parts, and the name of a component may vary with the type of electronic device. In various embodiments, an electronic device (e.g., the electronic device 201) may omit some components, include additional components, or combine some of the components into one entity that performs the functions of the original components in the same manner.

FIG. 3 is a block diagram of a program module according to various embodiments. According to one embodiment, the program module 310 (e.g., the program 140) may include an operating system that controls resources associated with an electronic device (e.g., the electronic device 101) and/or various applications (e.g., the application program 147) running on the operating system. The operating system may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™. Referring to FIG. 3, the program module 310 may include a kernel 320 (e.g., the kernel 141), middleware 330 (e.g., the middleware 143), an API 360 (e.g., the API 145), and/or an application 370 (e.g., the application program 147). At least a portion of the program module 310 may be preloaded on the electronic device or downloaded from an external electronic device (e.g., the electronic devices 102 and 104, the server 106, and the like).

The kernel 320 may include, for example, a system resource manager 321 and/or a device driver 323. The system resource manager 321 can perform control, allocation, or recovery of system resources. According to one embodiment, the system resource manager 321 may include a process manager, a memory manager, or a file system manager. The device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, an audio driver, or an inter-process communication (IPC) driver. The middleware 330 may, for example, provide functions commonly needed by the application 370, or may provide various functions to the application 370 through the API 360 so that the application 370 can use the limited system resources within the electronic device. According to one embodiment, the middleware 330 may include at least one of a runtime library 335, an application manager 341, a window manager 342, a multimedia manager 343, a resource manager 344, a power manager 345, a database manager 346, a package manager 347, a connectivity manager 348, a notification manager 349, a location manager 350, a graphic manager 351, or a security manager 352.

The runtime library 335 may include, for example, a library module used by a compiler to add new functions through a programming language while the application 370 is running. The runtime library 335 may perform input/output management, memory management, or arithmetic function processing. The application manager 341 can manage, for example, the life cycle of the application 370. The window manager 342 can manage the GUI resources used on the screen. The multimedia manager 343 can recognize the format required for reproducing a media file and can perform encoding or decoding of the media file using a codec appropriate to the format. The resource manager 344 can manage the source code of the application 370 or the space of the memory. The power manager 345 may, for example, manage the capacity or power of the battery and provide the power information necessary for the operation of the electronic device. According to one embodiment, the power manager 345 may interoperate with a basic input/output system (BIOS). The database manager 346 may, for example, create, retrieve, or modify a database to be used by the application 370. The package manager 347 can manage the installation or update of an application distributed in the form of a package file.

The connectivity manager 348 may, for example, manage the wireless connection. The notification manager 349 may provide the user with an event such as, for example, an arrival message, an appointment, or a proximity notification. The location manager 350 can manage, for example, the location information of the electronic device. The graphic manager 351 may, for example, manage the graphical effects to be presented to the user or a user interface associated therewith. The security manager 352 may provide, for example, system security or user authentication. According to one embodiment, the middleware 330 may include a telephony manager for managing the voice or video call function of the electronic device, or a middleware module capable of forming a combination of the functions of the above-described components. According to one embodiment, the middleware 330 may provide a module specialized for each type of operating system. The middleware 330 may dynamically delete some existing components or add new components. The API 360 is, for example, a set of API programming functions, and may be provided in a different configuration depending on the operating system. For example, in the case of Android or iOS, a single API set may be provided for each platform, and in the case of Tizen, two or more API sets may be provided for each platform.

The application 370 may include, for example, a home 371, a dialer 372, an SMS/MMS 373, an instant message 374, a browser 375, a camera 376, an alarm 377, a contact 378, a voice dial 379, an email 380, a calendar 381, a media player 382, an album 383, a watch 384, a health care application (e.g., for measuring exercise or blood glucose), or an environmental information application (e.g., for air pressure, humidity, or temperature information). According to one embodiment, the application 370 may include an information exchange application capable of supporting the exchange of information between the electronic device and an external electronic device. The information exchange application may include, for example, a notification relay application for communicating specific information to an external electronic device, or a device management application for managing an external electronic device. For example, the notification relay application can transmit notification information generated in another application of the electronic device to the external electronic device, or receive notification information from the external electronic device and provide the notification information to the user. The device management application may, for example, in communication with the electronic device, control the turn-on/turn-off of the external electronic device itself (or some components thereof) or the brightness (or resolution) of its display, or install, delete, or update an application running on the external electronic device. According to one embodiment, the application 370 may include an application (e.g., a health care application of a mobile medical device) designated according to the attributes of the external electronic device. According to one embodiment, the application 370 may include an application received from an external electronic device.
At least some of the program module 310 may be implemented (e.g., executed) in software, firmware, hardware (e.g., the processor 210), or a combination of at least two thereof, and may include, for example, a module, a program, a routine, an instruction set, or a process for performing one or more functions.

FIG. 4 is a flow diagram illustrating an operation in which the electronic device 201 of FIG. 2 according to various embodiments of the present invention confirms the movement of an object through a single image frame.

In operation 401, the electronic device 201 may detect the edges of the object in a single image frame acquired through the camera. The electronic device 201 may use basic image processing and computer vision methods from the fields of feature detection and feature extraction to identify points at which the brightness in a digital image changes abruptly or has discontinuities. The edge detection method may include, for example, the Sobel edge detection method, the Canny edge detection method, and the like. Referring to FIG. 5A, the electronic device 201 according to various embodiments of the present invention may obtain, in <501>, a single image frame representing a hand with a blurry shape. In <502>, the electronic device 201 can detect the edges of a rapidly moving object (e.g., a hand) using an edge detection method. After performing the edge detection, the electronic device 201 can obtain an image in which the edges of the object are detected (an edge detection image).
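As an illustration of this edge detection step, a gradient-magnitude edge map in the Sobel style can be sketched in plain numpy. This is a toy sketch on a synthetic step-edge image, not the patent's implementation; the 3x3 kernels and the test image are assumptions for illustration only.

```python
import numpy as np

def sobel_edges(img):
    """Gradient-magnitude edge map using 3x3 Sobel kernels (edge-padded)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    pad = np.pad(img.astype(float), 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):          # accumulate the 3x3 correlation
        for j in range(3):
            patch = pad[i:i + h, j:j + w]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)    # gradient magnitude

# A vertical step edge: dark left half, bright right half.
img = np.zeros((8, 8))
img[:, 4:] = 255
edges = sobel_edges(img)
print(edges[4, 3], edges[4, 4])  # strong responses at the step
print(edges[4, 0])               # flat region: no response
```

A blurred moving object produces a band of such responses along its trailing and leading contours, which the following steps turn into line elements.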

The electronic device 201 according to various embodiments of the present invention may perform image binarization (e.g., thresholding or binarization techniques) on the edge detection image to identify the line elements of the object. For example, referring to FIG. 5B, the electronic device 201 may perform image binarization on the edge detection image to obtain a line detection image in which the line elements are clearly represented. The electronic device 201 according to various embodiments of the present invention may extract only horizontal line elements or only vertical line elements from the line detection image, to obtain an image containing only horizontal line elements or an image containing only vertical line elements.
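The binarization step can be illustrated with a minimal fixed-threshold sketch. The threshold value 128 and the sample array are arbitrary assumptions; a real pipeline might instead choose the threshold adaptively (e.g., Otsu's method).

```python
import numpy as np

def binarize(edge_img, thresh):
    """Fixed-threshold binarization: 1 where the edge response exceeds thresh."""
    return (edge_img > thresh).astype(np.uint8)

# Hypothetical edge-response values; only the strong responses survive.
edge_img = np.array([[0.0, 10.0, 250.0],
                     [5.0, 300.0, 0.0]])
mask = binarize(edge_img, 128)
print(mask)  # 1 only where the response > 128
```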

In operation 403, the electronic device 201 may analyze the line elements of the object based on the edges detected in the single image frame. The electronic device 201 can identify the line elements of the object through a line detection algorithm. The line detection algorithm may include, for example, the Hough transform. The line detection algorithm is not limited to the Hough transform and may include various previously known algorithms.

The electronic device 201 in accordance with various embodiments of the present invention may use the linear Hough transform to identify the line elements of the object based on operations in the Hough space. Examples of general techniques for the Hough transform are given in [1] and [2]. Reference [3] also discloses a technique for detecting straight lines using the Hough transform. For example, after the Hough transform of the image of the hand, a plurality of dots and sinusoids can be seen in the Hough space, as shown in FIG. Each point on every sinusoid within the Hough space can represent the r and θ parameters of a line element in the image.

Hereinafter, how to interpret the Hough space will be described.

FIG. 7 illustrates, in accordance with various embodiments of the present invention, a straight line in a Cartesian coordinate system where the linear equation is y = mx + b. The main idea in the Hough transform is to characterize the straight line not by image points, for example (x1, y1) and (x2, y2), but by its parameters. Based on the fact that the straight line y = mx + b can be expressed as a point (b, m) in the parameter space, the parameter r in FIG. 7 represents the distance between the line and the origin, and θ is the angle of the vector from the origin to the closest point on the line. Using these parameters, the line equation can be written as

r = x·cos θ + y·sin θ.

In theory, for every point of the image, infinitely many lines can be generated passing through that point at angles θ ∈ [0, 2π]. All lines passing through a point of the image can be created with individual r and θ values, and together they form a Hough space represented by the (r, θ) plane. Each point of the image is thus transformed into the Hough space through the set of generated lines, and can be represented as a sinusoidal curve in the Hough space. The points of that sinusoid represent the different lines passing through the image point. If there are two points in the image, there is a line that connects the two points. Since each point of the original image appears as a sinusoid in the Hough space, and the two points of the original image can be connected by a single line, there will be two sinusoids in the Hough space, and they will intersect at the r and θ corresponding to the line passing through the two points in the original image. If three points of the original image are aligned in a straight line, they are represented by three sinusoids in the Hough space, and those sinusoids intersect at a single point. That is, the coordinates r and θ of that point in the Hough space represent the line connecting the corresponding points in the original image.
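The point-to-sinusoid voting described above can be sketched as a small numpy accumulator. This is an illustrative toy, not the patent's implementation; the bin counts, the r range, and the three sample points are arbitrary assumptions. Three collinear points produce three sinusoids that meet in a single accumulator bin.

```python
import numpy as np

def hough_accumulate(points, n_theta=180, n_r=200, r_max=100.0):
    """Vote each point's sinusoid r(theta) = x*cos(theta) + y*sin(theta)
    into an (r, theta) accumulator; collinear points share one peak bin."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_r, n_theta), dtype=int)
    for x, y in points:
        r = x * np.cos(thetas) + y * np.sin(thetas)
        r_idx = np.round((r + r_max) / (2 * r_max) * (n_r - 1)).astype(int)
        ok = (r_idx >= 0) & (r_idx < n_r)
        acc[r_idx[ok], np.arange(n_theta)[ok]] += 1
    return acc, thetas

# Three collinear points on the line y = x.
pts = [(1, 1), (5, 5), (9, 9)]
acc, thetas = hough_accumulate(pts)
r_i, t_i = np.unravel_index(np.argmax(acc), acc.shape)
print(acc[r_i, t_i])  # all three sinusoids meet in one bin -> 3
```

The local maxima of `acc` play the role of the maximum values 810, 1010, etc. discussed in the figures below.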

FIG. 8 shows, on the left side, a plurality of points forming a straight line in an image, and on the right side, the corresponding Hough space. One straight line element including a plurality of points in the image may be represented by a local maximum value 810 at which a plurality of sinusoids intersect in the Hough space. Other maximum values are also presented by way of example in FIG. 8.

FIG. 9 shows, for example, two line elements in an image <901> and their representation in the Hough space. The sine curves intersect at two points defining the r and θ of the two lines in the original image.

FIG. 10 shows a Hough space having two maximum values 1010 with the same θ parameter and different r parameters. Such a Hough space may correspond to an image comprising two parallel lines.

Returning to operation 403 in FIG. 4, due to the motion of the object, the line elements derived from an object in a single image frame may be parallel to each other. For example, referring to FIG. 5B, the electronic device 201 can identify a plurality of line elements in a moving hand. The line elements according to various embodiments of the present invention are not limited to a rectilinear shape and may have various shapes, for example, S-shaped curves, circles, polygons, and the like.

The electronic device 201 according to various embodiments of the present invention can identify line elements of the object that are parallel to each other by analyzing the results obtained through the line detection algorithm. For example, FIG. 11 illustrates an image having a plurality of line elements and the Hough space in which those line elements are analyzed. In <1101>, two parallel line elements of the image and another line element intersecting the parallel line elements are shown by dotted lines, and as shown in <1102>, the two parallel line elements appear in the Hough space as maximum values with the same θ and different r parameters. In FIG. 11, the arrows indicate which line is represented by which maximum value in the Hough space.

The electronic device 201 according to various embodiments of the present invention can identify the shape of a fingertip within a single image frame through the line detection algorithm. For example, referring to FIG. 12, the semicircle of maximum values representing the curved element of the fingertip starts from one maximum value 1210 representing the first line element of the finger in the Hough space and continues to another maximum value 1220 representing the second line element of the finger. The electronic device 201 can identify the fingertip or the fingertip position by detecting parallel lines and the semicircles connecting them. The fingertip position can be defined by the θ and r parameters characterizing the maximum values of the detected semicircular curve at the position where the tangents to the semicircle are perpendicular to the lines defining the edges of the finger. This means that the θ parameter of the semicircle should differ by about (+/-) 90 degrees from the θ of the lines.

The electronic device 201 according to various embodiments of the present invention can identify the most parallel line elements based on the identified parallel line elements. The electronic device 201 can identify, among predetermined angular ranges in the Hough space, the angular range having the largest number of maximum values, and identify the parallel line elements corresponding to the respective maximum values. For example, referring to FIG. 13, there are six maximum values within a predetermined angular range (line a to line b) in the Hough space, whereby six parallel line elements can be identified.
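Finding the most numerous group of mutually parallel line elements, as described above, amounts to grouping the detected (r, θ) parameters by angle and taking the largest group. A minimal sketch follows; the 5-degree bin width and the sample lines are assumptions for illustration.

```python
import numpy as np

def most_parallel_group(lines, bin_deg=5):
    """Group detected (r, theta) line parameters into angular bins and
    return the largest group, i.e. the most numerous set of parallel lines."""
    thetas = np.array([t for _, t in lines])
    bins = np.floor(np.degrees(thetas) / bin_deg).astype(int)
    best = max(set(bins.tolist()), key=lambda b: int((bins == b).sum()))
    return [lines[i] for i in range(len(lines)) if bins[i] == best]

# Six lines share theta ~ 30 degrees; two outliers at other angles.
lines = [(r, np.radians(30.0)) for r in (5, 12, 20, 27, 35, 44)]
lines += [(10, np.radians(80.0)), (3, np.radians(120.0))]
group = most_parallel_group(lines)
print(len(group))  # -> 6
```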

In operation 405, the electronic device 201 can confirm the movement of the object based on the analysis result of the line elements of the object. The electronic device 201 can confirm the angular velocity of the motion of the object. The electronic device 201 can use at least one of the average length, the median length, and the minimum length of the parallel line elements among the line elements of the object to calculate the angular velocity.

The electronic device 201 according to various embodiments of the present invention can determine the angular velocity of the object motion using at least one of the average length, the median length, and the minimum length of the most parallel line elements. For example, referring to FIG. 13, the electronic device 201 can identify the six parallel line elements, which are the most parallel line elements, and use the average length of the six parallel line elements, the length of the middle line element among them, or the length of the shortest line element among them to confirm the angular velocity of the movement of the object. The electronic device 201 can check the time stamp of the image frame, or determine the time taken to capture the frame by checking the frames per second (FPS) of the camera, in order to calculate the angular velocity of the motion of the object.
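Under the assumption that the blur streak length approximates the distance traveled during one frame period (taken here as 1/FPS), the estimate reduces to a one-line calculation. The streak lengths and frame rate below are hypothetical values, not values from the patent.

```python
import numpy as np

def blur_speed_px_per_s(lengths_px, fps, stat="median"):
    """Apparent speed from motion-blur streak lengths: the chosen length
    statistic divided by the per-frame exposure time, approximated as 1/fps."""
    stats = {"mean": np.mean, "median": np.median, "min": np.min}
    streak = float(stats[stat](np.asarray(lengths_px, dtype=float)))
    return streak * fps  # pixels per second

# Six parallel blur streak lengths (px) at an assumed 30 frames per second.
speed = blur_speed_px_per_s([180, 185, 190, 195, 200, 210], fps=30)
print(speed)  # -> 5775.0 (median 192.5 px over 1/30 s)
```

The same function with `stat="mean"` or `stat="min"` corresponds to the other length statistics mentioned above.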

The electronic device 201 according to various embodiments of the present invention can determine the angular velocity of the hand displayed in the image based on the line element at the end of the hand. The electronic device 201 can confirm the length of the fingertip through the frame preceding the current frame to be analyzed. For example, if the previous frame is a non-blurred image, the electronic device 201 can determine the position of the fingertip and determine how many pixels the fingertip occupies in the image. The electronic device 201 can then identify the distance the fingertip has moved through the length of the line element of the fingertip in the current blurred image frame. For example, if the length of the line element at the end of the hand is 200 pixels and the length of the finger obtained from the previous frame is 20 pixels, the electronic device 201 can confirm that the moving distance of the hand is 180 pixels. The electronic device 201 can check the time stamp of the image frame, or determine the time taken to capture the frame by checking the frames per second (FPS) of the camera, in order to calculate the angular velocity of the motion of the hand.
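The 200-pixel/20-pixel example above can be written out directly, under the same assumption that the blur spans one frame period; the 30 FPS value is an illustrative assumption.

```python
def travel_from_blur(blur_line_px, object_len_px, fps):
    """Distance traveled during one exposure: the blur streak length minus
    the object's own length (measured in a previous, sharp frame)."""
    moved = blur_line_px - object_len_px
    return moved, moved * fps  # (pixels moved, pixels per second)

# 200 px streak, 20 px fingertip from the sharp previous frame.
moved, speed = travel_from_blur(blur_line_px=200, object_len_px=20, fps=30)
print(moved, speed)  # -> 180 5400
```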

The electronic device 201 according to various embodiments of the present invention can determine the speed of the motion of the object based on the analysis results of the line elements of the object and information about the distance from the electronic device 201 to the object. The electronic device 201 can confirm the distance from the electronic device 201 to the object. For example, the electronic device 201 can mount a time-of-flight (ToF) camera and confirm the distance to the object through the ToF camera. The electronic device 201 can accurately determine the distance the object has moved (the length of the line element of the object) by using the confirmed distance to the object, and thereby confirm the moving speed of the object. The electronic device 201 can confirm the moving speed of the object by using the information on the distance to the object together with the method of checking the angular velocity of the motion of the object described above.
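Converting the image-plane speed to a metric speed with the ToF distance can be sketched under a pinhole camera model. The focal length in pixels and the sample values below are hypothetical assumptions, not values from the patent.

```python
def metric_speed(pixel_speed, distance_m, focal_px):
    """Convert an image-plane speed (px/s) into a metric speed (m/s) under a
    pinhole model: real size = pixel size * depth / focal length."""
    return pixel_speed * distance_m / focal_px

# Assumed values: 5400 px/s apparent speed, 0.5 m ToF distance,
# 1080 px focal length.
v = metric_speed(5400, 0.5, 1080)
print(v)  # -> 2.5 (m/s)
```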

The electronic device 201 according to various embodiments of the present invention can determine the direction of the movement of the object based on the analysis results of the line elements of the object. The electronic device 201 according to various embodiments of the present invention can determine the direction of movement of the object based on the position information of the edges in the previous image frame and the analysis results of the line elements of the object. The electronic device 201 can determine the position of the edges of the object in the frame preceding the image frame currently being analyzed. By comparing the position of the edges in the previous image frame with the position of the edges of the object in the current image frame, the electronic device 201 can confirm the direction of movement of the object from the positional change of the edges. The electronic device 201 according to various embodiments of the present invention can also confirm the direction of movement of the object using sensor data of an external device together with the analysis results of the line elements of the object. For example, when the electronic device 201 senses a gesture motion of a hand wearing an external device (e.g., a smart watch), the electronic device 201 may receive accelerometer data from the external device and, based on the information represented by the accelerometer data and the results of analyzing the line elements of the object, confirm the progress direction of the hand gesture motion. The electronic device 201 can confirm the time stamp of the image frame currently being analyzed and use the sensor data of the external device whose reception time is synchronized with the time stamp.
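One simple way to realize the edge-position comparison between the previous and current frames is to track the centroid of the edge pixels. This is an illustrative sketch; the patent does not prescribe this particular measure, and the two synthetic edge maps are assumptions.

```python
import numpy as np

def motion_direction(prev_edges, curr_edges):
    """Direction of motion as the displacement of the edge-pixel centroid
    between the previous and current binary edge maps."""
    p = np.argwhere(prev_edges).mean(axis=0)  # (row, col) centroid
    c = np.argwhere(curr_edges).mean(axis=0)
    dy, dx = c - p
    return dx, dy

# A small edge blob moving from the left side to the right side.
prev = np.zeros((10, 10), dtype=np.uint8); prev[4:6, 1:3] = 1
curr = np.zeros((10, 10), dtype=np.uint8); curr[4:6, 7:9] = 1
dx, dy = motion_direction(prev, curr)
print(dx, dy)  # -> 6.0 0.0 (moving right)
```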

Hereinafter, embodiments in which the electronic device 201 displays the result of analyzing the motion of the object will be described.

The electronic device 201 according to various embodiments of the present invention may display at least one of the angular velocity and the velocity corresponding to the motion of the object on the display module 260. The electronic device 201 can display the progress direction of the object on the display module 260. The electronic device 201 can display an object photographed by the camera module as a preview image through the display module 260. The electronic device 201 can identify at least one of the angular velocity and the velocity of a moving object displayed in blurred form in a single image frame and display the determined angular velocity or velocity on the display module 260. For example, referring to FIG. 14A, the electronic device 201 can analyze the line elements of a moving hand and determine the angular velocity of the hand based on the analysis results. The electronic device 201 may display the identified angular velocity (speed = 40) on the display module 260. The electronic device 201 according to various embodiments of the present invention may display on the display module 260 shapes corresponding to the line elements of the object based on at least one of the angular velocity and the velocity corresponding to the motion of the object. The electronic device 201 may display different shapes corresponding to the line elements depending on the angular velocity or the speed of the movement of the object. For example, referring to FIG. 14A, the electronic device 201 may display a straight line on the display 260 when the angular velocity of the hand is 40, and referring to FIG. 14B, if the angular velocity of the hand is 80, a wavy pattern can be displayed on the display 260.
The shape corresponding to a line element according to various embodiments of the present invention is not limited to the above examples, and the electronic device 201 can display various shapes, such as a wind shape or a lightning shape, depending on the angular velocity or speed of the object.

The electronic device 201 according to various embodiments of the present invention may perform a function corresponding to the direction or the angular velocity of the movement of an object displayed within a single image frame. For example, referring to <1501> of FIG. 15A, the electronic device 201 may analyze the line elements of a hand in a single image frame to confirm that the movement of the hand is a gesture moving from right to left, and may perform a specific function corresponding to the gesture (for example, a function of switching the current page to the next page). Referring to FIG. 15A, the electronic device 201 can analyze the line elements of a hand in a single image frame to confirm that the movement of the hand is a gesture moving in a circular shape, and may perform a specific function corresponding to the gesture (for example, a function of rotating an image in the image viewer or a zoom function for zooming an image). Referring to FIG. 15A, the electronic device 201 can analyze the line elements of a hand in a single image frame to confirm that the movement of the hand is a gesture moving in a wave shape, and may perform a specific function corresponding to the gesture (for example, an eraser function that erases a picture in an image editing application). The electronic device 201 may also perform a function corresponding to the angular velocity or the velocity of the object, and may perform different functions depending on the determined angular velocity or speed even for the same gesture input. For example, if the gesture input of <1501> is below a predetermined speed, the electronic device 201 may perform a page enlargement/reduction function, and if the gesture input of <1501> is at or above the predetermined speed, the electronic device 201 may perform a different function.

The electronic device 201 according to various embodiments of the present invention may perform a function corresponding to the direction or the angular velocity of the movement of a portion of an object displayed within a single image frame. For example, referring to <1504> of FIG. 15B, the electronic device 201 can analyze the line elements of a specific finger in a single image frame to confirm that the movement of the finger is a diagonal reciprocating gesture, and may perform a specific function corresponding to the gesture (for example, an eraser function for erasing a picture in an image editing application). Referring to FIG. 15B, the electronic device 201 can analyze the line elements of a specific finger in a single image frame, confirm that the movement of the finger proceeds in the downward direction and then in the rightward direction, and perform a specific function corresponding to the gesture (for example, a function of terminating the currently executed web browser). Referring to FIG. 15B, the electronic device 201 can analyze the line elements of a specific finger in a single image frame, confirm that the movement of the finger proceeds in the leftward direction and then in the upward direction, and perform a specific function corresponding to the gesture (for example, a function of moving to a previous window in a web browser). Referring to FIG. 15B, the electronic device 201 can analyze the line elements of a specific finger in a single image frame to confirm that the movement of the finger is a gesture advancing in a U-shaped direction, and may perform a specific function corresponding to the gesture (for example, a function of opening the same window as the current window in the web browser).
The function corresponding to the velocity or the angular velocity of the object according to various embodiments of the present invention is not limited to the above examples, and may be set by the user or variously set by the manufacturer at the time of manufacture.

As used herein, the term "module" includes a unit comprised of hardware, software, or firmware and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit. A "module" may be an integrally constructed component or a minimum unit or part thereof that performs one or more functions. A "module" may be implemented mechanically or electronically, and may include, for example, an application-specific integrated circuit (ASIC) chip, field-programmable gate arrays (FPGAs), or programmable logic devices that perform certain operations, known or to be developed. At least some of the devices (e.g., modules or functions thereof) or methods (e.g., operations) according to various embodiments may be implemented as instructions stored in a computer-readable storage medium (e.g., the memory 130) in the form of a program module. When such an instruction is executed by a processor (e.g., the processor 120), the processor may perform a function corresponding to the instruction. The computer-readable recording medium may include a hard disk, a floppy disk, magnetic media such as magnetic tape, optical recording media such as a CD-ROM or a DVD, and magneto-optical media such as a floptical disk. The instructions may include code generated by a compiler or code that can be executed by an interpreter. Modules or program modules according to various embodiments may include at least one or more of the components described above, may omit some of them, or may further include other components. Operations performed by modules, program modules, or other components, in accordance with various embodiments, may be performed sequentially, in parallel, iteratively, or heuristically, or at least some operations may be executed in a different order or omitted, or other operations may be added.

Claims (20)

In an electronic device,
An electronic device housing;
At least one camera included in the housing and obtaining an image including at least one object;
A display module included in the housing, the display module displaying the acquired image;
A processor included in the housing and electrically connected to the at least one camera and / or the display module; And
And a memory electrically coupled to the processor,
Wherein the memory, when executed,
Detecting edges of the object in a single image frame acquired through the camera, analyzing line elements of the object based on the edges detected in the single image frame, And stores the instructions for confirming the movement of the object based on a result of analyzing the line elements of the object in a single image frame.
The method according to claim 1,
Wherein the instructions cause the processor to:
Acquires an edge detection image in which the edges of the object are detected,
And performs image binarization on the obtained edge detection image to obtain a line detection image representing line elements of the object.
The method according to claim 1,
Wherein the instructions cause the processor to:
And to identify line elements of the object through a line detection algorithm.
The method of claim 3,
Wherein the line detection algorithm comprises a Hough transform.
The method according to claim 1,
Wherein the instructions cause the processor to:
And to identify line elements parallel to each other among the line elements of the object through a line detection algorithm.
The method according to claim 1,
Wherein the instructions cause the processor to:
And to determine an angular velocity of the motion of the object based on an analysis result of the line elements of the object.
The method according to claim 1,
Wherein the instructions cause the processor to:
Wherein the angular velocity of the object is determined using at least one of an average length, a middle length, and a minimum length of parallel line elements among the line elements of the object.
The method according to claim 1,
Wherein the instructions cause the processor to:
And to determine a speed of movement of the object based on the analysis result of the line elements of the object and the distance information from the device to the object.
The method according to claim 1,
Wherein the instructions cause the processor to:
And confirms the progress direction of the motion of the object based on an analysis result of the line elements of the object.
The method according to claim 1,
Wherein the instructions cause the processor to:
Displaying on the display at least one of a velocity and an angular velocity with respect to movement of the object,
And causes the display to display shapes corresponding to line elements of the object on the display.
A method of tracking an object in an electronic device comprising at least one camera,
Detecting edges of the object in a single image frame acquired through the camera;
Analyzing line elements of the object based on the edges detected in the single image frame;
And checking movement of the object based on a result of analyzing line elements of the object in the single image frame.
12. The method of claim 11,
The detecting of the edges of the object comprises:
Obtaining an edge detection image where edges of the object are detected; And
And performing an image binarization on the obtained edge detection image to obtain a line detection image representing line elements of the object.
13. The method of claim 11, wherein the analyzing of the line elements of the object comprises:
identifying the line elements of the object through a line detection algorithm.
14. The method of claim 13,
wherein the line detection algorithm comprises a Hough transform.
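Claim 14 names the Hough transform as the line detection algorithm but gives no implementation details. A minimal accumulator-based version, written from the standard rho = x·cos(theta) + y·sin(theta) parameterization, might look like:

```python
import numpy as np

def hough_accumulator(binary, n_theta=180):
    """Vote for (rho, theta) line parameters from a binary
    line detection image; peaks in the accumulator are lines."""
    ys, xs = np.nonzero(binary)
    thetas = np.deg2rad(np.arange(n_theta))          # 0..179 degrees
    diag = int(np.ceil(np.hypot(*binary.shape)))     # max |rho|
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in zip(xs, ys):
        # rho = x*cos(theta) + y*sin(theta), offset so indices are >= 0
        rhos = np.round(x * cos_t + y * sin_t).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1
    return acc, diag
```

A horizontal row of edge pixels at height y, for example, votes maximally at theta = 90 degrees with rho = y.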
15. The method of claim 11, wherein the analyzing of the line elements of the object comprises:
detecting, through a line detection algorithm, line elements parallel to each other among the line elements of the object.
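Detecting mutually parallel line elements, as in the claim above, can be illustrated by grouping detected segments whose orientations agree within a tolerance; the segment representation and the 5-degree tolerance are assumptions, not part of the claim:

```python
import numpy as np

def group_parallel(lines, tol_deg=5.0):
    """Group line segments ((x1, y1), (x2, y2)) whose orientations,
    taken modulo 180 degrees, differ by less than tol_deg."""
    groups = []   # each: {"angle": representative angle, "lines": [...]}
    for (x1, y1), (x2, y2) in lines:
        ang = np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 180.0
        for g in groups:
            d = abs(g["angle"] - ang)
            if min(d, 180.0 - d) < tol_deg:   # angles wrap at 180
                g["lines"].append(((x1, y1), (x2, y2)))
                break
        else:
            groups.append({"angle": ang, "lines": [((x1, y1), (x2, y2))]})
    return groups
```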
16. The method of claim 11, wherein the identifying of the motion of the object comprises:
determining an angular velocity of the motion of the object based on a result of analyzing the line elements of the object.
17. The method of claim 11, wherein the identifying of the motion of the object comprises:
determining the angular velocity of the motion of the object using at least one of an average length, a median length, and a minimum length of mutually parallel line elements among the line elements of the object.
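The claim leaves the mapping from line-element lengths to angular velocity unspecified. One hedged reading, assuming the parallel line elements are motion-blur streaks and assuming known camera parameters (degrees per pixel and exposure time, neither of which is stated in the claims), is:

```python
import statistics

def angular_velocity_deg_s(lengths_px, deg_per_px, exposure_s, stat="mean"):
    """Estimate angular velocity from the lengths (in pixels) of
    mutually parallel line elements, using the claim's choice of
    average, median, or minimum length. All parameters beyond the
    length statistic are assumptions for illustration."""
    pick = {"mean": statistics.mean,
            "median": statistics.median,
            "min": min}[stat]
    return pick(lengths_px) * deg_per_px / exposure_s
```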
18. The method of claim 11, wherein the identifying of the motion of the object comprises:
determining a speed of the motion of the object based on the result of analyzing the line elements of the object and information on a distance from the electronic device to the object.
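Given an angular velocity inferred from the line elements and the distance information named in the claim, a linear speed follows from the small-angle relation v ≈ ω·d; the relation itself is an assumption, since the claim only states that both quantities are used:

```python
import math

def linear_speed_m_s(angular_velocity_deg_s, distance_m):
    """Convert an angular velocity (deg/s) and a device-to-object
    distance (m) into a linear speed (m/s), small-angle assumption."""
    return math.radians(angular_velocity_deg_s) * distance_m
```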
19. The method of claim 11, wherein the identifying of the motion of the object comprises:
identifying a direction of the motion of the object based on a result of analyzing the line elements of the object.
20. The method of claim 11, further comprising:
displaying, on the display, at least one of a velocity and an angular velocity of the motion of the object; and
displaying, on the display, shapes corresponding to the line elements of the object.
KR1020150154511A 2015-11-04 2015-11-04 Electronic device and method for tracking an object using a camera module of the same KR20170052264A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150154511A KR20170052264A (en) 2015-11-04 2015-11-04 Electronic device and method for tracking an object using a camera module of the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150154511A KR20170052264A (en) 2015-11-04 2015-11-04 Electronic device and method for tracking an object using a camera module of the same

Publications (1)

Publication Number Publication Date
KR20170052264A true KR20170052264A (en) 2017-05-12

Family

ID=58740534

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150154511A KR20170052264A (en) 2015-11-04 2015-11-04 Electronic device and method for tracking an object using a camera module of the same

Country Status (1)

Country Link
KR (1) KR20170052264A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114663443A (en) * 2022-02-24 2022-06-24 清华大学 12-lead paper electrocardiogram digitization method and device
KR20220098463A (en) 2021-01-04 2022-07-12 강유미 D-log Cameraman with fixed and driving assistance functions

Similar Documents

Publication Publication Date Title
US10871798B2 (en) Electronic device and image capture method thereof
US10948949B2 (en) Electronic apparatus having a hole area within screen and control method thereof
EP3346696B1 (en) Image capturing method and electronic device
EP3086217B1 (en) Electronic device for displaying screen and control method thereof
EP3457268B1 (en) Screen output method and electronic device supporting same
EP3358455A1 (en) Apparatus and method for controlling fingerprint sensor
CN110476189B (en) Method and apparatus for providing augmented reality functions in an electronic device
EP3367282B1 (en) Electronic device for authenticating using biometric information and method of operating electronic device
US20190324640A1 (en) Electronic device for providing user interface according to electronic device usage environment and method therefor
KR20180013277A (en) Electronic apparatus for displaying graphic object and computer readable recording medium
US10444920B2 (en) Electronic device and method for controlling display in electronic device
KR20170097884A (en) Method for processing image and electronic device thereof
KR20180089810A (en) Electronic device and method for determining touch coordinate thereof
KR20160124536A (en) Method and electronic apparatus for providing user interface
KR20170084558A (en) Electronic Device and Operating Method Thereof
CN108632529B (en) Electronic device providing a graphical indicator for a focus and method of operating an electronic device
KR20170052984A (en) Electronic apparatus for determining position of user and method for controlling thereof
US20180032174A1 (en) Method and electronic device for processing touch input
KR102513147B1 (en) Electronic device and method for recognizing touch input using the same
KR20180024238A (en) Electronic apparatus for reducing burn-in and computer readable recording medium
EP3327551A1 (en) Electronic device for displaying image and method for controlling the same
KR20160137258A (en) Electronic apparatus and method for displaying screen thereof
US10091436B2 (en) Electronic device for processing image and method for controlling the same
KR20180091380A (en) Electronic device and operating method thereof
KR20170052264A (en) Electronic device and method for tracking an object using a camera module of the same