KR101774556B1 - User identification method based on coordinate recognition of IR type input device - Google Patents

User identification method based on coordinate recognition of IR type input device

Info

Publication number
KR101774556B1
KR101774556B1 (Application No. KR1020150188764A)
Authority
KR
South Korea
Prior art keywords
infrared
user
input device
user identification
point
Prior art date
Application number
KR1020150188764A
Other languages
Korean (ko)
Other versions
KR20170078344A (en)
Inventor
박홍식
윤태수
채일진
정규홍
Original Assignee
동서대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 동서대학교 산학협력단
Priority to KR1020150188764A
Publication of KR20170078344A
Application granted
Publication of KR101774556B1

Classifications

    • G06K9/62
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Abstract

The present invention relates to a method for identifying users based on position recognition of multiple infrared input devices. In the method, a plurality of infrared input devices are pointed at a screen, and an infrared camera captures, in real time, images of the infrared points (or patterns) that blink where each device's pointing direction intersects the screen. A first step detects the infrared point corresponding to a user identification code matched with each frame of the captured image, and a second step identifies the user from the user identification code and derives coordinates from the detected infrared point.
Accordingly, no separate system needs to be built for user recognition, so the cost of constructing an additional system does not increase. Even when multiple infrared points are input, user identification and real-time computation can be handled with low system resources, and users can be identified promptly even when many users participate.

Description

Technical Field [0001] The present invention relates to a user identification method based on position recognition of a multi-infrared input device.

More particularly, the present invention relates to a user identification method based on position recognition of a multi-infrared input device that can identify users quickly, with low system resources, even when multiple users operate infrared input devices pointed at the screen from different directions.

FIG. 1 is a diagram illustrating a conventional infrared point position recognition system. Referring to FIG. 1, the conventional system includes a computing device 4 that calculates the coordinates of the infrared point projected onto a screen 1 by an infrared input device 3; by recognizing these coordinates, the system can determine the position at which each user's infrared input device 3 is aimed and fired.

More specifically, as the infrared input device 3 with an infrared pointing function is operated, the computing device 4 captures, through the infrared camera 2, an image of the infrared point (or pattern) formed where the pointing direction of the infrared input device 3 meets the screen 1, and then calculates the position of that infrared point.

Here, the infrared camera 2 may be provided either as a unit separate from the infrared input device 3 or built into the infrared input device 3, and the overall operation of the system differs considerably between the two forms.

Referring to FIG. 2, the conventional system performs a three-step calculation consisting of an infrared point (pattern) sensing process S1, a coordinate derivation and user analysis process S2, and a user and coordinate matching process S3. That is, the infrared point position recognition system senses the infrared point generated at the intersection of the pointing direction of the infrared input device 3 and the screen 1, and analyzes it to obtain the coordinates of the infrared point and the corresponding user. Finally, by matching the coordinates with the user, the system can apply the user-tagged coordinates to various subsequent calculations.

However, the existing system must compute the final data through three steps, namely user identification, coordinate recognition, and user and coordinate matching, which makes it difficult to process the data in real time.

In other words, the infrared point position recognition system cannot determine which user produced a detected infrared point when multiple people participate. To solve this, methods that output a device-specific pattern or color are mainly used, but these have the additional limitation that extra computation for user identification is incurred.

Accordingly, there is a need in the related art for a technology that reduces the number of steps in the overall computation, greatly shortening the time required for user identification, so that the system can operate even when a multi-infrared input device is used.

Korean Patent Laid-Open Publication No. 10-2014-0104539, "Three-dimensional motion recognition input device equipped with infrared sensor"
Korean Patent Registration No. 10-1375056, "Touch Screen System Recognizing Infrared Pen Coordinates"
Korean Patent Laid-Open Publication No. 10-2013-0017773, "Infrared Image Output Device and Infrared Image Output ..."

SUMMARY OF THE INVENTION The present invention has been made to solve the above-mentioned problems, and it is an object of the present invention to provide a user identification method based on position recognition of a multi-infrared input device that can handle user identification and real-time computation with low system resources, even in a multi-user environment, and can identify users quickly.

It is another object of the present invention to provide a user identification method based on position recognition of a multi-infrared input device that secures system resources while requiring no separate system for user recognition, thereby preventing an increase in the cost of constructing an additional system.

However, the objects of the present invention are not limited to the above-mentioned objects, and other objects not mentioned can be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided a user identification method based on position recognition of a multi-user infrared input device, the method comprising: a first step in which a plurality of infrared input devices are used and the infrared point corresponding to a user identification code matched with each frame of a captured image is detected; and a second step of identifying the user from the user identification code and deriving coordinates from the detected infrared point.

Before the first step, the method may further include a step in which, given a plurality of infrared input devices and an infrared camera that captures a preset number of frames per second of the screen onto which infrared light is projected, a user is designated for each frame according to a preset number of users.

In addition, the step of designating a user for each frame includes:

[Equation 1 (image)]
The user (1 to N, where N is a natural number of 2 or more) is designated for each frame by a calculation using Equation 1, according to the preset number of users.
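The image for Equation 1 is not reproduced above; a minimal reconstruction, consistent with the round-robin per-frame designation described here and shown for four users in FIG. 4, is given below as an assumption rather than the patent's exact expression:

```latex
% Assumed form of Equation 1: frame f (0-based) is designated to one of N users
\mathrm{user}(f) = (f \bmod N) + 1, \qquad N \ge 2
```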

In the first step, since no code identifying the user is provided as data within the frames of the captured image received from the infrared camera, user identification information is added as header information to each frame so that the user can be identified.

The user identification method based on position recognition of a multi-infrared (IR) input device according to an embodiment of the present invention does not require construction of a separate system for user recognition, so the cost of building an additional system does not increase; even when multiple infrared points are input, user identification and real-time computation can be handled with low system resources; and computation proceeds smoothly even when many users participate, so users can be identified quickly.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram showing an infrared point position recognition system according to the prior art.
FIG. 2 is a flow diagram illustrating the data processing method of an infrared point position recognition system according to the prior art.
FIG. 3 is an overall flowchart illustrating a user identification method based on position recognition of a multi-infrared input device according to an exemplary embodiment of the present invention.
FIG. 4 is a diagram showing user-specific frame indices (for four users) in the user identification method based on position recognition of a multi-infrared input device according to an embodiment of the present invention.
FIG. 5 is a view illustrating how a user identification code is inserted into a frame, i.e., image information that originally contains no user identification code, in the user identification method according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating the synchronization process for each infrared input device using the image information into which the user identification code has been inserted, in the user identification method according to an exemplary embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the following description, detailed descriptions of known functions and configurations are omitted where they would obscure the subject matter of the present invention.

In the present specification, when an element 'transmits' data or a signal to another element, the element may transmit the data or signal to the other element directly, or through at least one further element.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings. For convenience, the description uses the same constituent elements as the conventional system of FIG. 1, namely the screen 1, the infrared camera 2, the infrared input device 3, and the computing device 4.

FIG. 3 is an overall flowchart illustrating a user identification method based on position recognition of a multi-infrared input device according to an exemplary embodiment of the present invention. Referring to FIG. 3, in order to identify users quickly without consuming excessive system resources or degrading performance when a plurality of users each operate an infrared input device 3, the method identifies each user's infrared point through a two-step process: an infrared point detection process matched with a user identification code (S10) and a user analysis and coordinate derivation process (S20).

That is, of the three steps in FIG. 2, namely the infrared point (pattern) sensing process S1, the coordinate derivation and user analysis process S2, and the user and coordinate matching process S3, the sensing step S1 and the user and coordinate matching step S3 are carried out as a single process, reducing the processing by one step.

Hereinafter, each process of FIG. 3 will be described in detail.

[Infrared point detection process (S10) matching with the user identification code]

In the present invention, when the infrared camera 2 operates at 120 fps (frames per second), it can capture 120 frames per second. For convenience of explanation, the infrared camera 2 is assumed to operate at 120 fps in the description below.

In the conventional infrared point (pattern) sensing process S1, the computing device 4 fetches every frame of the captured image received from the infrared camera 2 without distinguishing users and then analyzes coordinates and users in a separate step (S2). In the present invention, by contrast, the sensing process S1 and the user and coordinate matching process S3 are carried out as one process, the infrared point detection process matched with the user identification code (S10): a user is designated for each frame, as in the table of FIG. 4 (for four users), and instead of recognizing the data of all users in every frame, only the data of the user designated for that frame is processed.

To this end, the computing device 4 designates a user for each frame according to a preset number of users.
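As an illustrative sketch, not taken from the patent, the same designation can be expressed in code; the round-robin reading of Equation 1 and the four-user demonstration below are assumptions:

```python
def designate_user(frame_index: int, num_users: int) -> int:
    """Round-robin designation: frame f (0-based) is assigned to user (f mod N) + 1."""
    return (frame_index % num_users) + 1

if __name__ == "__main__":
    # With four users (cf. FIG. 4), frames 0..7 map to users 1, 2, 3, 4, 1, 2, 3, 4.
    for f in range(8):
        print(f"frame {f} -> user {designate_user(f, num_users=4)}")
```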

delete

After the users have been designated for each frame, the computing device 4 receives from the infrared camera 2 the images, captured in real time, of the infrared points (or patterns) produced by the plurality of infrared input devices 3 in the directions they are pointed, and performs infrared point detection corresponding to the user identification code matched with each frame of the captured image.

Since no code identifying the user is provided as data within the frames of the captured image received from the infrared camera 2, the computing device 4 adds user identification information as header information to each frame, thereby enabling user identification.

FIG. 5 shows how the computing device 4 inserts a user identification code into a frame, i.e., image information that originally contains no user identification code.

[Table 1] below shows an example of the per-user indexes that the computing device 4 inserts into frames as user identification codes.

user   index   user   index
1p     00      09p    08
2p     01      10p    09
3p     02      11p    0a
4p     03      12p    0b
5p     04      13p    0c
6p     05      14p    0d
7p     06      15p    0e
8p     07      16p    0f

Referring to [Table 1], the user identification code may have various sizes depending on the system type, and preferably takes the index value of the user's infrared input device 3.

[Table 1] is a simple example of per-user indexes, and the indexes corresponding to the user identification codes can be added or changed for each system.
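A minimal sketch of how the user identification code could be attached as frame header information, assuming each frame arrives as raw bytes and the one-byte index of [Table 1] is simply prepended; the function names and framing are illustrative assumptions, not an interface defined in the patent:

```python
NUM_USERS = 4  # assumed, matching the four-user example of FIG. 4

def user_identification_index(frame_index: int, num_users: int = NUM_USERS) -> int:
    """Index of the user designated for this frame (0x00, 0x01, ... as in Table 1)."""
    return frame_index % num_users  # user 1p -> 0x00, 2p -> 0x01, ...

def insert_user_identification_code(frame_index: int, frame_bytes: bytes) -> bytes:
    """Prepend the one-byte user identification code as frame header information."""
    index = user_identification_index(frame_index)
    return bytes([index]) + frame_bytes
```

A receiver can then read the first byte of each frame to recover the designated user before deriving coordinates.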

[Coordinate derivation and user analysis process (S20)]

After step S10, the computing device 4 identifies the user from the user identification code and derives the coordinates of the detected infrared point, using the frame, i.e., the image information that now contains both the user identification code and the infrared point.

The computing device 4 may store the result per user matched to the user identification code, or may complete the coordinate calculation for the infrared input device 3 matched to the user identification code and transmit it to an external terminal or server.
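The patent does not spell out the coordinate calculation itself; one plausible sketch, assuming a grayscale infrared frame and taking the centroid of pixels above a brightness threshold as the infrared point, is:

```python
import numpy as np

def derive_infrared_point(frame: np.ndarray, threshold: int = 200):
    """Return the (x, y) centroid of pixels brighter than `threshold`, or None."""
    ys, xs = np.nonzero(frame > threshold)   # frame: 2-D grayscale IR image
    if xs.size == 0:
        return None                          # no infrared point in this frame
    return float(xs.mean()), float(ys.mean())
```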

FIG. 6 illustrates the process of synchronizing the infrared input devices 3 using the image information into which the user identification code was inserted in step S10, as shown in FIG. 5. Referring to FIG. 6, the computing device 4 inserts a user identification code into the image information received from the infrared camera 2 (S110).

After step S110, the computing device 4 identifies the user's infrared input device 3 from the user identification code through image data processing and calculates the infrared point coordinates of the identified infrared input device 3 (S120).

After step S120, the computing device 4 sets the infrared input device 3 corresponding to the next frame, among the preset frames per second, to ON for use, and sets the other infrared input devices 3 to OFF (S130).

In other words, even though the infrared input devices 3 keep operating and frames of image information keep arriving through the infrared camera 2, a user cannot be identified in a situation where no user identification code is provided; the user identification method based on position recognition of a multi-infrared input device of the present invention therefore gates data reception from the infrared input devices 3 so that, for each frame, only the data of the specific user's infrared input device 3 designated for that frame is received and processed.
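Putting S110 through S130 together, a sketch of the per-frame loop of FIG. 6 could look like the following; it reuses the helpers sketched above, and decode_frame and the device set_enabled interface are assumptions rather than anything defined in the patent:

```python
def synchronize(devices, camera_frames, num_users=4):
    """Illustrative FIG. 6 loop: insert the code (S110), identify the device and
    derive coordinates (S120), then enable only the next frame's device (S130)."""
    results = []
    for frame_index, frame_bytes in enumerate(camera_frames):
        tagged = insert_user_identification_code(frame_index, frame_bytes)  # S110
        user_index = tagged[0]                     # header byte = user identification code
        frame = decode_frame(tagged[1:])           # assumed helper: bytes -> 2-D grayscale array
        point = derive_infrared_point(frame)       # S120
        if point is not None:
            results.append((user_index, point))
        next_user = (frame_index + 1) % num_users  # S130: only next frame's device stays ON
        for i, device in enumerate(devices):
            device.set_enabled(i == next_user)
    return results
```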

As described above, the system using user identification code insertion recognizes users more easily than the conventional system and can operate without additional system cost or increased data throughput.

The present invention can also be embodied as computer-readable code on a computer-readable recording medium. A computer-readable recording medium includes all kinds of recording devices in which data readable by a computer system is stored.

Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet).

The computer-readable recording medium may also be distributed over networked computer systems so that the computer-readable code is stored and executed in a distributed manner. Functional programs, code, and code segments for implementing the present invention can easily be inferred by programmers skilled in the art to which the present invention pertains.

As described above, preferred embodiments of the present invention have been disclosed in the specification and drawings. Although specific terms have been used, they are used only in a general sense to describe the technical content of the present invention easily and to aid understanding of the invention, not to limit its scope. It will be apparent to those skilled in the art that other modifications based on the technical idea of the present invention are possible in addition to the embodiments disclosed herein.

1: Screen 2: Infrared camera
3: infrared input device 4: computing device

Claims (4)

A user identification method based on position recognition of a multi-infrared input device, for an infrared point position recognition system that recognizes the coordinates of an infrared point on a screen 1 through an infrared camera 2 built into an infrared input device 3 and then calculates the position of the infrared point on the screen 1, the method comprising:
a first step in which the computing device 4 designates a user for each of the frames per second of the image captured by the infrared camera 2, which captures 120 frames per second, according to a preset number of users;
a second step in which the infrared camera 2 built into each infrared input device 3 captures, in real time, images in the direction each of the plurality of infrared input devices 3 is pointed, photographing the blinking infrared point formed at the aimed position on the screen 1;
a third step in which the computing device 4 receives the captured image from the infrared camera 2 built into each infrared input device 3;
a fourth step in which the computing device 4 inserts the user identification code designated in the first step into the frames of the captured image, in which no user identification code exists, the index value of the user's infrared input device 3 being used as the user identification code;
a fifth step in which the computing device 4 identifies the user's infrared input device 3 from the user identification code through the frame, i.e., the image information containing the user identification code information and the infrared point, and calculates the infrared point coordinates of the identified infrared input device 3; and
a sixth step in which the computing device 4 sets the infrared input device 3 corresponding to the next frame, among the preset frames per second, to ON for use and sets the remaining infrared input devices 3 to OFF.
Claims 2 to 4: (deleted)
KR1020150188764A 2015-12-29 2015-12-29 User identification method based on coordinate recognition of IR type input device KR101774556B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150188764A KR101774556B1 (en) 2015-12-29 2015-12-29 User identification method based on coordinate recognition of IR type input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150188764A KR101774556B1 (en) 2015-12-29 2015-12-29 User identification method based on coordinate recognition of IR type input device

Publications (2)

Publication Number Publication Date
KR20170078344A KR20170078344A (en) 2017-07-07
KR101774556B1 (en) 2017-09-04

Family

ID=59353269

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150188764A KR101774556B1 (en) 2015-12-29 2015-12-29 User identification method based on coordinate recognition of IR type input device

Country Status (1)

Country Link
KR (1) KR101774556B1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040239653A1 (en) * 2003-05-27 2004-12-02 Wolfgang Stuerzlinger Collaborative pointing devices
KR101348346B1 (en) 2007-09-06 2014-01-08 삼성전자주식회사 Pointing apparatus, pointer controlling apparatus, pointing method and pointer controlling method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040239653A1 (en) * 2003-05-27 2004-12-02 Wolfgang Stuerzlinger Collaborative pointing devices
KR101348346B1 (en) 2007-09-06 2014-01-08 삼성전자주식회사 Pointing apparatus, pointer controlling apparatus, pointing method and pointer controlling method

Also Published As

Publication number Publication date
KR20170078344A (en) 2017-07-07

Similar Documents

Publication Publication Date Title
US10762649B2 (en) Methods and systems for providing selective disparity refinement
WO2019071664A1 (en) Human face recognition method and apparatus combined with depth information, and storage medium
US8897502B2 (en) Calibration for stereoscopic capture system
IL256885A (en) Apparatus and methods for facial recognition and video analytics to identify individuals in contextual video streams
CN106454079B (en) Image processing method and device and camera
US9609305B1 (en) Feature-based rectification of stereo cameras
US9041776B2 (en) 3-dimensional depth image generating system and method thereof
TWI669664B (en) Eye state detection system and method for operating an eye state detection system
JP2021177399A (en) Information processor, control method, and program
KR20150039252A (en) Apparatus and method for providing application service by using action recognition
US9268408B2 (en) Operating area determination method and system
CN112543343A (en) Live broadcast picture processing method and device based on live broadcast with wheat and electronic equipment
CN111222432A (en) Face living body detection method, system, equipment and readable storage medium
US9811916B1 (en) Approaches for head tracking
KR101360999B1 (en) Real time data providing method and system based on augmented reality and portable terminal using the same
US8803941B2 (en) Method and apparatus for hands-free control of a far end camera
WO2022179251A1 (en) Image processing method and apparatus, electronic device, and storage medium
KR101797814B1 (en) Teaching apparatus, method for child based on image comparison algorithm
Chowdhury et al. Robust human detection and localization in security applications
KR101774556B1 (en) User identification method based on coordinate recognition of IR type input device
CN111104921A (en) Multi-mode pedestrian detection model and method based on Faster rcnn
EP3206188A1 (en) Method and system for realizing motion-sensing control based on intelligent device, and intelligent device
Hadi et al. Fusion of thermal and depth images for occlusion handling for human detection from mobile robot
KR20190001873A (en) Apparatus for searching object and method thereof
US11263784B2 (en) Determine image capture position information based on a quasi-periodic pattern

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant