KR20170052264A - Electronic device and method for tracking an object using a camera module of the same - Google Patents
- Publication number
- KR20170052264A KR1020150154511A
- Authority
- KR
- South Korea
- Prior art keywords
- line elements
- electronic device
- line
- processor
- image
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
- G06K9/00355
- G06K9/6296
- H04N5/23267
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
Description
Various embodiments of the present invention are directed to an electronic device for tracking an object through a camera module and to an object tracking method thereof.
The electronic device can photograph a subject or an object through the camera module and can detect the edges of the subject or object in the captured image. Using the edges in the image, the electronic device can identify various kinds of information; for example, it can determine the direction of the object by comparing the edges of the previous image frame with the edges of the current image frame. To analyze the edges, the electronic device may use a line detection algorithm such as the Hough transform. For a general description of the Hough transform, see references [1], [2], and [3] below.
[references]
[1] "Hough Transform", http://homepages.inf.ed.ac.uk/rbf/HIPR2/hough.htm
[2] "Hough Transform", http://en.wikipedia.org/wiki/Hough_transform, Oct 20, 2011
[3] Machii Kimiyoshi, "Information Processor with Hand Shape Recognizing Function", Patent No. JP 9035066, Feb 07, 1997
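As an illustration of the edge-detection step described above, the following is a minimal sketch using a plain Sobel-style gradient threshold. An actual device would likely use an optimized library routine (e.g., a Canny or Sobel implementation); the tiny image and the threshold value here are made-up values for illustration only.

```python
# A minimal Sobel-style edge detector in pure Python, for illustration.
# The image and threshold are hypothetical; a real device would use an
# optimized camera/vision library instead.

def sobel_edges(img, threshold=2.0):
    """Return a binary edge map: True where gradient magnitude exceeds threshold."""
    h, w = len(img), len(img[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal and vertical Sobel responses.
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            edges[y][x] = (gx * gx + gy * gy) ** 0.5 > threshold
    return edges

# A tiny image with a vertical step edge between columns 2 and 3.
image = [[0, 0, 0, 9, 9, 9] for _ in range(5)]
edge_map = sobel_edges(image)
# Edge pixels cluster along the step (columns 2 and 3 of interior rows).
```

The resulting binary edge map is the kind of input a line detection algorithm such as the Hough transform would consume.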
To identify the motion of an object through the camera module, the electronic device must ordinarily acquire at least two image frames and can determine the velocity or direction of movement of the object only after the two image frames are analyzed. In this case, the processing time required for at least two image frames makes it difficult to determine the moving direction or the moving speed accurately.
Further, when a blurred image frame is acquired, it is difficult for the electronic device to determine the moving direction or the moving speed of the object from the blurred image of that single frame.
Various embodiments of the present invention may provide an apparatus and method for analyzing the line elements of an object in a single image frame and determining the motion of the object based on the analysis result.
An electronic device in accordance with various embodiments of the present invention includes a housing; at least one camera included in the housing and configured to obtain an image including at least one object; a display module included in the housing and configured to display the acquired image; a processor included in the housing and electrically connected to the at least one camera and/or the display module; and a memory electrically connected to the processor. The memory stores instructions that, when executed, cause the processor to detect edges of the object in a single image frame acquired through the camera, analyze line elements of the object based on the edges detected in the single image frame, and determine the movement of the object based on the result of analyzing the line elements of the object in the single image frame.
The object tracking method of an electronic device according to various embodiments of the present invention can confirm the movement of an object in a short time by analyzing a single image frame.
The object tracking method of an electronic device according to various embodiments of the present invention can provide an efficient image analysis method by confirming the movement of an object from a blurred image of a single frame.
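As a hedged sketch of how a single blurred frame could yield motion information: if the parallel line elements are motion-blur streaks, their length, together with the exposure time and the camera-to-object distance, could be converted into an angular velocity and a tangential speed. The formula and every parameter value below are illustrative assumptions; the embodiments themselves do not specify them.

```python
# Hypothetical sketch: estimating speed from the length of motion-blur
# streaks ("line elements") in a single frame. All parameters and the
# small-angle formula are assumptions for illustration.
import math

def estimate_speed(blur_length_px, distance_m, exposure_s, focal_length_px):
    """Estimate tangential object speed (m/s) from a motion-blur streak.

    blur_length_px  : average length of the parallel line elements (pixels)
    distance_m      : camera-to-object distance (e.g., from a depth sensor)
    exposure_s      : exposure time of the frame
    focal_length_px : camera focal length expressed in pixels
    """
    # Small-angle conversion: pixels -> angle swept during the exposure.
    angle_rad = math.atan(blur_length_px / focal_length_px)
    angular_velocity = angle_rad / exposure_s   # rad/s
    return angular_velocity * distance_m        # m/s (tangential component)

# Example: a 30-pixel streak, object 2 m away, 1/100 s exposure,
# focal length of 600 pixels -> roughly 10 m/s.
speed = estimate_speed(30, 2.0, 0.01, 600)
```

A zero-length streak yields zero speed, and longer streaks (or shorter exposures) yield proportionally higher estimates, which matches the intuition that blur encodes motion within a single frame.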
FIG. 1 illustrates a network environment including an electronic device according to an embodiment of the present invention.
FIG. 2 is a block diagram of an electronic device in accordance with various embodiments of the present invention.
FIG. 3 is a block diagram of a program module in accordance with various embodiments of the present invention.
FIG. 4 is a flow chart illustrating an operation of the electronic device of FIG. 2, in accordance with various embodiments of the present invention, to confirm movement of an object through a single image frame.
FIG. 5A is a diagram illustrating an electronic device in accordance with various embodiments of the present invention detecting edges of an object in a single frame image using an edge detection method.
FIG. 5B is a diagram illustrating an electronic device according to various embodiments of the present invention performing image binarization on an edge detection image of an object to display the line elements of the object.
FIG. 6 is a diagram showing the result of an electronic device according to various embodiments of the present invention performing a Hough transform on an image representing a human hand.
FIG. 7 is a diagram illustrating the parameterization used by the Hough transform for a straight line in an image according to various embodiments of the present invention.
FIG. 8 is a diagram showing an image of one line and the result of its Hough transform according to various embodiments of the present invention.
FIG. 9 is a diagram showing an image of two lines and the result of its Hough transform according to various embodiments of the present invention.
FIG. 10 is a diagram illustrating the Hough transform result of two parallel lines having the same angular value in the Hough space according to various embodiments of the present invention.
FIG. 11 is a diagram showing an image of a plurality of lines and the result of its Hough transform according to various embodiments of the present invention.
FIG. 12 is a diagram illustrating a contour image of a human finger and the result of its Hough transform according to various embodiments of the present invention.
FIG. 13 is a diagram illustrating an electronic device according to various embodiments of the present invention identifying parallel line elements having a maximum value within a predetermined angular range from the result of a Hough transform of a plurality of lines.
FIG. 14A is a diagram showing an electronic device according to various embodiments of the present invention displaying a linear shape according to the angular velocity of a hand.
FIG. 14B is a diagram showing an electronic device according to various embodiments of the present invention displaying a wavy shape according to the angular velocity of the hand.
FIG. 15A is a diagram illustrating an electronic device according to various embodiments of the present invention performing various functions according to various gesture inputs of a hand.
FIG. 15B is a diagram illustrating an electronic device according to various embodiments of the present invention performing various functions according to various gesture inputs of a finger that is a part of a hand.
Hereinafter, various embodiments of the present document will be described with reference to the accompanying drawings. The embodiments and the terminology used herein are not intended to limit the invention to the particular embodiments described, but to cover various modifications, equivalents, and/or alternatives of those embodiments. In connection with the description of the drawings, like reference numerals may be used for similar components. Singular expressions may include plural expressions unless the context clearly dictates otherwise. In this document, expressions such as "A or B" or "at least one of A and/or B" may include all possible combinations of the items listed together. Expressions such as "first" or "second" may modify the corresponding components regardless of order or importance, are used only to distinguish one component from another, and do not limit those components. When it is mentioned that some (e.g., first) component is "(functionally or communicatively) connected" or "coupled" to another (e.g., second) component, the former component may be connected directly to the latter component, or may be connected through another component (e.g., a third component).
In this document, the term "configured to (or set to)" may be used interchangeably, depending on the situation, with expressions such as "suitable for", "having the capacity to", "adapted to", "made to", "capable of", or "designed to", in hardware or software. In some situations, the expression "a device configured to" may mean that the device can perform an operation together with other devices or components. For example, the phrase "a processor configured (or set) to perform A, B, and C" may mean a dedicated processor (e.g., an embedded processor) for performing the corresponding operations, or a general-purpose processor (e.g., a CPU or an application processor) capable of performing the corresponding operations by executing one or more software programs stored in a memory device.
Electronic devices in accordance with various embodiments of this document may include, for example, at least one of smartphones, tablet PCs, mobile phones, video phones, electronic book readers, desktop PCs, laptop PCs, netbook computers, workstations, portable multimedia players, MP3 players, medical devices, cameras, and wearable devices. Wearable devices may be of an accessory type (e.g., watches, rings, bracelets, anklets, necklaces, glasses, contact lenses, or head-mounted devices (HMD)), a fabric- or clothing-integrated type, a body-attached type (e.g., a skin pad or tattoo), or a bio-implantable circuit. In some embodiments, the electronic device may be, for example, a television, a digital video disk (DVD) player, an audio system, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a washing machine, an air cleaner, a set-top box, a home automation control panel, a security control panel, a media box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), a game console (e.g., Xbox™, PlayStation™), an electronic dictionary, an electronic key, a camcorder, or an electronic photo frame.
In an alternative embodiment, the electronic device may include at least one of various medical devices (e.g., various portable medical measurement devices such as a blood glucose meter, a heart rate meter, a blood pressure meter, or a body temperature meter, and magnetic resonance angiography (MRA) devices), a navigation system, a global navigation satellite system (GNSS), an event data recorder (EDR), a flight data recorder (FDR), an automobile infotainment device, marine electronic equipment (e.g., marine navigation systems, gyro compasses, etc.), avionics, security devices, head units for vehicles, industrial or domestic robots, drones, ATMs at financial institutions, and Internet-of-things devices such as a light bulb, a fire detector, a fire alarm, a thermostat, a streetlight, a toaster, a fitness device, a hot water tank, a heater, or a boiler. According to some embodiments, the electronic device may be a piece of furniture, part of a building/structure or an automobile, an electronic board, an electronic signature receiving device, a projector, or various measuring devices (e.g., water, electricity, gas, or radio wave measuring instruments). In various embodiments, the electronic device may be flexible, or may be a combination of two or more of the various devices described above. The electronic device according to the embodiments of this document is not limited to the above-described devices. In this document, the term "user" may refer to a person using an electronic device or to a device (e.g., an artificial intelligence electronic device) that uses an electronic device.
Referring to FIG. 1, in various embodiments, an electronic device 101 in a network environment is described.
The
The
The wireless communication may include, for example, LTE, LTE-A (LTE Advanced), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), or global system for mobile communications (GSM). According to one embodiment, the wireless communication may include, for example, wireless fidelity (WiFi), Bluetooth, Bluetooth low energy (BLE), Zigbee, near field communication (NFC), magnetic secure transmission (MST), radio frequency (RF), or body area network (BAN). According to one embodiment, the wireless communication may include GNSS. The GNSS may be, for example, the Global Positioning System (GPS), the Global Navigation Satellite System (Glonass), the Beidou Navigation Satellite System (Beidou), or Galileo, the European global satellite-based navigation system. Hereinafter, in this document, "GPS" may be used interchangeably with "GNSS". The wired communication may include, for example, at least one of a universal serial bus (USB), a high definition multimedia interface (HDMI), a recommended standard 232 (RS-232), power line communication, and plain old telephone service (POTS).
Each of the first and second external
FIG. 2 is a block diagram of an electronic device in accordance with various embodiments of the present invention.
The
May have the same or similar configuration as communication module 220 (e.g., communication interface 170). The
Memory 230 (e.g., memory 130) may include, for example,
In accordance with various embodiments of the present invention, the
The
The
The
The
The
The
The
The
The
Display 260 (e.g., display 160) may include
The
The
FIG. 3 is a block diagram of a program module according to various embodiments. According to one embodiment, the program module 310 (e.g., program 140) may include an operating system that controls resources associated with an electronic device (e.g., electronic device 101) and/or various applications (e.g., application program 147) running on the operating system. The operating system may include, for example, Android™, iOS™, Windows™, Symbian™, Tizen™, or Bada™. Referring to FIG. 3,
The kernel 320 may include, for example, a system resource manager 321 and/or a device driver 323. The system resource manager 321 can perform control, allocation, or recovery of system resources. According to one embodiment, the system resource manager 321 may include a process manager, a memory manager, or a file system manager. The device driver 323 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, an audio driver, or an inter-process communication (IPC) driver. The middleware 330 may provide various functions through the
The
The connectivity manager 348 may, for example, manage the wireless connection. The
The application 370 may include a
4 is a flow diagram illustrating an operation in which the
The
The
The
The
Hereinafter, how to interpret the Hough space will be described.
FIG. 7, in accordance with various embodiments of the present invention, illustrates a straight line in a Cartesian coordinate system, where the line has the equation y = mx + b. The main idea of the Hough transform is to characterize the straight line not by its image points, for example (x1, y1) and (x2, y2), but by its parameters: the straight line y = mx + b can be expressed as the point (b, m) in the parameter space. Because the slope m is unbounded for near-vertical lines, the Hough transform instead uses the parameters r and θ shown in FIG. 7, where r represents the distance between the line and the origin, and θ is the angle of the vector from the origin to the closest point of the line. Using these parameters, the line equation can be written as r = x·cos θ + y·sin θ.
In theory, for every point of the image, infinitely many lines can be generated passing through that point, one for each angle θ in [0, 2π). All lines passing through a point of the image can be described by individual pairs (r, θ), and together they form the Hough space, represented by the (r, θ) plane. Each point of the image is thus transformed, through the set of generated lines, into a sinusoidal curve in the Hough space; each point on that sinusoid represents a different line through the image point. If there are two points in the image, there is exactly one line that connects them. Since every point of the original image appears as a sinusoid in the Hough space, two points of the original image produce two sinusoids, and these sinusoids intersect at the (r, θ) corresponding to the line passing through both points. If three points of the original image are aligned on a straight line, they are represented by three sinusoids in the Hough space, and those sinusoids intersect at a single point. That is, the coordinates (r, θ) of an intersection point in the Hough space represent the line connecting the corresponding points in the original image.
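The voting procedure just described can be sketched as follows, assuming a simple discretization of the (r, θ) plane; real implementations are far more optimized. The example points are hypothetical.

```python
# Minimal Hough voting sketch: each edge point votes for every (r, theta)
# bin consistent with a line through it; collinear points pile their
# votes into the same bin, producing the intersection of sinusoids
# described in the text. Discretization choices here are illustrative.
import math
from collections import Counter

def hough_votes(points, n_theta=180, r_step=1.0):
    """Accumulate Hough votes for a set of (x, y) edge points."""
    acc = Counter()
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            # Normal form of a line: r = x*cos(theta) + y*sin(theta)
            r = x * math.cos(theta) + y * math.sin(theta)
            acc[(round(r / r_step), t)] += 1
    return acc

# Three collinear points on the line y = x + 2.
points = [(0, 2), (1, 3), (2, 4)]
acc = hough_votes(points)
peak_bin, peak_votes = acc.most_common(1)[0]
# The strongest bin collects one vote from each of the three points.
```

The peak of the accumulator identifies the (r, θ) of the line through all three points, which is exactly the intersection point of their three sinusoids in the Hough space.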
FIG. 8 shows, on the left side, an image containing a plurality of points forming a straight line and, on the right side, the corresponding Hough space. One straight line element including a plurality of points in the image may be represented by a local maximum value 810 at which a plurality of sinusoids intersect in the Hough space. Other local maxima are also shown by way of example in FIG. 8.
FIG. 9 shows, for example, two line elements in an image and their representations in the Hough space. The sine curves intersect at two points, which define the r and θ values for the two lines in the original image.
FIG. 10 shows a Hough space having two local maxima that share the same angular value θ, corresponding to two parallel lines in the image.
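Following the observation illustrated by FIG. 10, that parallel lines share the same θ and differ only in r, a sketch of grouping Hough peaks by angle might look like this. The peak list and the angular tolerance are illustrative assumptions, not values from the patent.

```python
# Sketch: identify parallel line elements among Hough peaks by grouping
# peaks whose angles agree within a tolerance. Peaks are (r, theta_deg)
# pairs; the tolerance of 2 degrees is a made-up value.

def parallel_groups(peaks, theta_tol=2):
    """Group Hough peaks (r, theta_deg) whose angles agree within theta_tol."""
    groups = []
    for r, theta in sorted(peaks, key=lambda p: p[1]):
        for g in groups:
            if abs(g[0][1] - theta) <= theta_tol:
                g.append((r, theta))
                break
        else:
            groups.append([(r, theta)])
    # Keep only groups with at least two lines: these are parallel sets.
    return [g for g in groups if len(g) >= 2]

# Peaks from a hypothetical transform: two parallel lines at ~45 degrees
# plus one unrelated line at 120 degrees.
peaks = [(10, 45), (24, 46), (7, 120)]
sets = parallel_groups(peaks)
# One parallel set is found; the r-gap between its two peaks (14) measures
# the spacing between the two parallel line elements in the image.
```

Such an r-gap between parallel peaks is the kind of length measurement that the embodiments associate with the parallel line elements of a moving object.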
Returning to
The
The
The
In
The
The
The
The
Hereinafter, embodiments in which the
The
The
The
As used herein, the term "module" includes a unit composed of hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit. A "module" may be an integrally constructed component, or a minimum unit or part thereof, that performs one or more functions. A "module" may be implemented mechanically or electronically, and may include, for example, application-specific integrated circuit (ASIC) chips, field-programmable gate arrays (FPGAs), or programmable logic devices that perform certain operations. At least some of the devices (e.g., modules or functions thereof) or methods (e.g., operations) according to various embodiments may be implemented as instructions stored in a computer-readable storage medium (e.g., memory 130) in the form of a program module. When an instruction is executed by a processor (e.g., processor 120), the processor may perform a function corresponding to the instruction. The computer-readable recording medium may include a hard disk, a floppy disk, magnetic media (e.g., magnetic tape), optical recording media (e.g., CD-ROM or DVD), magneto-optical media (e.g., a floptical disk), and the like. The instructions may include code generated by a compiler or code that can be executed by an interpreter. Modules or program modules according to various embodiments may include at least one of the components described above, may omit some of them, or may further include other components. Operations performed by modules, program modules, or other components, in accordance with various embodiments, may be executed sequentially, in parallel, iteratively, or heuristically, or at least some operations may be executed in a different order or omitted, or other operations may be added.
Claims (20)
An electronic device housing;
At least one camera included in the housing and obtaining an image including at least one object;
A display module included in the housing, the display module displaying the acquired image;
A processor included in the housing and electrically connected to the at least one camera and / or the display module; And
And a memory electrically coupled to the processor,
Wherein the memory stores instructions that, when executed, cause the processor to:
Detect edges of the object in a single image frame acquired through the camera, analyze line elements of the object based on the edges detected in the single image frame, and confirm the movement of the object based on a result of analyzing the line elements of the object in the single image frame.
Wherein the instructions cause the processor to:
Acquires an edge detection image in which the edges of the object are detected,
And performs image binarization on the obtained edge detection image to obtain a line detection image representing line elements of the object.
Wherein the instructions cause the processor to:
And to identify line elements of the object through a line detection algorithm.
Wherein the line detection algorithm comprises a Hough transform.
Wherein the instructions cause the processor to:
And to identify line elements parallel to each other among the line elements of the object through a line detection algorithm.
Wherein the instructions cause the processor to:
And to determine an angular velocity of the motion of the object based on an analysis result of the line elements of the object.
Wherein the instructions cause the processor to:
Wherein the angular velocity of the object is determined using at least one of an average length, a median length, and a minimum length of the parallel line elements among the line elements of the object.
Wherein the instructions cause the processor to:
And to determine a speed of movement of the object based on the analysis result of the line elements of the object and the distance information from the device to the object.
Wherein the instructions cause the processor to:
And confirms the moving direction of the object based on an analysis result of the line elements of the object.
Wherein the instructions cause the processor to:
Displaying on the display at least one of a velocity and an angular velocity with respect to movement of the object,
And causes the display to display shapes corresponding to the line elements of the object.
Detecting edges of the object in a single image frame acquired through the camera;
Analyzing line elements of the object based on the edges detected in the single image frame;
And checking movement of the object based on a result of analyzing line elements of the object in the single image frame.
The detecting of the edges of the object comprises:
Obtaining an edge detection image where edges of the object are detected; And
And performing an image binarization on the obtained edge detection image to obtain a line detection image representing line elements of the object.
The act of analyzing line elements of the object,
And identifying line elements of the object through a line detection algorithm.
Wherein the line detection algorithm comprises a Hough transform.
The act of analyzing line elements of the object,
And detecting line elements parallel to each other among the line elements of the object through a line detection algorithm.
The operation for confirming the movement of the object may include:
And determining an angular velocity of the object based on an analysis result of the line elements of the object.
The operation for confirming the movement of the object may include:
Determining an angular velocity with respect to a motion of the object using at least one of an average length, a median length, and a minimum length of the parallel line elements among the line elements of the object.
The operation for confirming the movement of the object may include:
Determining a speed of movement of the object based on the analysis result of the line elements of the object and the distance information from the device to the object.
The operation for confirming the movement of the object may include:
And checking the direction of movement of the object based on an analysis result of the line elements of the object.
Displaying on the display at least one of a velocity and an angular velocity of the motion of the object; And
Further comprising displaying on the display shapes corresponding to line elements of the object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150154511A KR20170052264A (en) | 2015-11-04 | 2015-11-04 | Electronic device and method for tracking an object using a camera module of the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170052264A true KR20170052264A (en) | 2017-05-12 |
Family
ID=58740534
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150154511A KR20170052264A (en) | 2015-11-04 | 2015-11-04 | Electronic device and method for tracking an object using a camera module of the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170052264A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114663443A (en) * | 2022-02-24 | 2022-06-24 | 清华大学 | 12-lead paper electrocardiogram digitization method and device |
KR20220098463A (en) | 2021-01-04 | 2022-07-12 | 강유미 | D-log Cameraman with fixed and driving assistance functions |
- 2015-11-04 KR KR1020150154511A patent/KR20170052264A/en unknown