WO2021167312A1 - Touch recognition method and device having LiDAR sensor - Google Patents
Touch recognition method and device having LiDAR sensor
- Publication number
- WO2021167312A1 (PCT/KR2021/001945, KR2021001945W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch object
- coordinates
- touch
- frame
- distance
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
Definitions
- The present invention relates to a touch recognition method and apparatus having a lidar sensor and, more particularly, to a method and apparatus for recognizing a touch object using sensing information collected through a lidar sensor.
- The piezoelectric-sensor approach is similar to that of the arcade dance machines (e.g., DDR or Pump It Up): because the sensor must be installed directly below the location where the touch actually occurs, touches around the sensor are difficult to recognize and the sensor itself easily fails from repeated touches.
- The approach using 3D image information can acquire various information, such as the location and distance at the moment of touch, but it entails the technical difficulty of having to recognize and track a person in the image before a touch can be recognized.
- An object of the present invention is to solve the above-described problem, and to recognize a touch using a lidar sensor.
- Another object of the present invention is to determine a state of a touch object using sensing information obtained from a lidar sensor, and to recognize a user's movement.
- To achieve these objects, the present invention provides a method by which a server recognizes the touch of an object, comprising: step a of extracting the coordinates of at least one object based on sensing information sensed through a lidar sensor; step b of performing clustering on the object coordinates extracted from a first frame to calculate a first touch object of the first frame and first coordinates of the first touch object; step c of determining the state of the first touch object using the first touch object and the first coordinates; step d of converting the first coordinates into real coordinates and determining whether the first coordinates are located in a preset effective area; and step e of displaying the first touch object at the real coordinates when the first touch object is determined to be located in the effective area.
- The present invention also provides an apparatus comprising: a sensor unit that generates sensing information by detecting at least one object through a lidar sensor; a coordinate extraction unit that extracts the coordinates of the object based on the sensing information; a data management unit that performs clustering on the object coordinates extracted from a first frame to extract a first touch object of the first frame and first coordinates of the first touch object; a state determination unit that determines the state of the first touch object using the first touch object and the first coordinates; a conversion unit that converts the first coordinates into real coordinates and determines whether the first touch object is located in a preset effective area; and a display unit that displays the first touch object at the real coordinates when the touch object is located in the effective area.
- the present invention can determine the user's movement by determining the state of the touch object using the sensing information obtained from the lidar sensor.
- The present invention can give users a heightened sense of realism by providing highly accurate touch recognition for body-motion-recognition technologies such as AR and VR content.
- FIG. 1 is a view showing the configuration of a touch recognition device according to an embodiment of the present invention.
- FIG. 2 is a view for explaining a method for recognizing a touch object according to an embodiment of the present invention.
- FIG. 3 is a view for explaining a state of a touch object according to an embodiment of the present invention.
- FIG. 4 is a view for explaining a touch recognition method according to an embodiment of the present invention.
- Each component may be implemented as a separate hardware processor, the components may be integrated into a single hardware processor, or they may be combined with one another and implemented as a plurality of hardware processors.
- The touch recognition device may include a sensor unit 100, a coordinate extraction unit 200, a data management unit 300, a state determination unit 400, a conversion unit 500, a motion recognition unit 600, and a display unit 700.
- the sensor unit 100 may detect at least one object through a lidar sensor and generate sensing information for each object.
- A lidar sensor emits light and calculates distance from the time the light takes to return after reflecting off an object. Lidar sensors are used in various fields such as autonomous vehicle driving, construction, and aerial surveying, and can acquire relatively accurate distance information compared to other distance sensors.
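As a back-of-the-envelope illustration of this time-of-flight principle (a minimal sketch; the function and constant names and the example value are ours, not the patent's):

```python
# Minimal time-of-flight distance estimate: the emitted light travels to the
# object and back, so the one-way distance is half the round trip.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting object from the measured round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

print(tof_distance_m(30e-9))  # a 30 ns round trip is roughly 4.5 m
```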
- the sensing information generated by the sensor unit 100 may include at least one of a distance to an object, a direction, a speed, a temperature, a material distribution and concentration characteristics, or 3D image information.
- The sensor unit 100 senses the floor over a distance of about 4.6 m across a 180-degree angle in front, so it can recognize a user's motion of touching the floor within the sensable area.
- The sensable area may depend on the performance of the lidar sensor used.
- the coordinate extraction unit 200 may extract the coordinates of the object based on sensing information for all frames received from the sensor unit 100 .
- The coordinate extraction unit 200 may extract coordinates only for touch objects recognized within a preset angle range.
- the coordinate extraction unit 200 may extract the coordinates of the object by using the distance and the direction included in the sensing information.
- The coordinates of the object extracted by the coordinate extraction unit 200 are not coordinates in the actual area but coordinates on a two-dimensional distance map whose origin is the location where the sensor unit 100 is installed.
- the coordinate extractor 200 may display the object for each frame in a planar coordinate system that is a two-dimensional distance map according to coordinates.
- The coordinate extractor 200 may display the object included in the first frame at the corresponding coordinates of a first planar coordinate system, and the object included in the second frame at the corresponding coordinates of a second planar coordinate system. In this way, the coordinate extraction unit 200 allocates a planar coordinate system to each frame, as sketched below.
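A minimal sketch of this per-frame coordinate extraction, assuming each reading is a (distance, angle) pair and that the preset angle range is the 180-degree front sweep mentioned above; the data layout and names are illustrative assumptions, not the patent's prescribed format:

```python
import math

def extract_coordinates(readings, min_angle_deg=0.0, max_angle_deg=180.0):
    """Map one frame's (distance_m, angle_deg) readings onto the 2D distance
    map whose origin is the sensor position; readings outside the preset
    angle range are discarded."""
    coords = []
    for distance, angle in readings:
        if not (min_angle_deg <= angle <= max_angle_deg):
            continue  # keep only objects within the preset angle range
        rad = math.radians(angle)
        coords.append((distance * math.cos(rad), distance * math.sin(rad)))
    return coords

# One planar coordinate system per frame: here, two nearby points survive
# and the 200-degree reading is dropped.
frame_readings = [(1.20, 45.0), (1.25, 46.0), (3.00, 200.0)]
print(extract_coordinates(frame_readings))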
- the data management unit 300 may perform clustering on at least one or more objects displayed in a planar coordinate system to extract a touch object and its coordinates from each frame.
- the data manager 300 may extract the first touch object of the first frame and the first coordinates of the first touch object by using a planar coordinate system in which objects belonging to the first frame are displayed.
- The data management unit 300 calculates first distances between the coordinates of the objects extracted from the first frame and performs clustering, forming clusters according to those distances.
- The data manager 300 sets as the first touch object a cluster generated by clustering, i.e., by comparing each first distance with a preset first distance threshold.
- The data manager 300 may regard the at least one object in a cluster, that is, objects extracted from the first frame whose mutual distances are less than or equal to the first distance threshold, as the first touch object. If a plurality of clusters exist in the first frame, the data management unit 300 may set a plurality of first touch objects. The data management unit 300 may form at least one cluster based on the coordinates according to the sensing information in the first frame.
- When the data management unit 300 sets the first touch object, it extracts the first coordinates of the first touch object.
- the first coordinate may be a point or an area.
- The data management unit 300 may generate a touch object list, which may include the first touch object, the first coordinates of the first touch object, and the number of objects belonging to the first touch object.
- the data manager 300 may determine noise from the first touch object included in the touch object list by using the number of objects included in the first touch object.
- the data manager 300 may determine the first touch object as noise when the number of objects included in the first touch object is less than a predetermined level.
- the predetermined level may be different according to the setting of the administrator.
- The data management unit 300 removes the first touch object determined as noise from the touch object list.
- The removal of a first touch object determined as noise may be performed before or after information on that first touch object is added to the touch object list. A combined sketch of the clustering and noise-removal steps follows.
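The clustering and noise-removal steps above might look as follows; the single-linkage grouping, the centroid representation of the first coordinates, and all threshold values are illustrative assumptions rather than the patent's prescribed algorithm:

```python
from math import dist

def cluster_touch_objects(points, dist_threshold=0.15, min_points=3):
    """Group one frame's object coordinates into touch objects.

    Points whose mutual distance is at most `dist_threshold` end up in the
    same cluster (single-linkage flood fill); clusters with fewer than
    `min_points` members are treated as noise and left out of the list.
    """
    unvisited = set(range(len(points)))
    touch_object_list = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unvisited
                    if dist(points[i], points[j]) <= dist_threshold]
            for j in near:
                unvisited.remove(j)
                cluster.append(j)
                frontier.append(j)
        if len(cluster) < min_points:
            continue  # too few member objects: treat the cluster as noise
        members = [points[i] for i in cluster]
        # Represent the touch object's first coordinates as the centroid
        # (the text allows a point or an area; a centroid is one choice).
        cx = sum(x for x, _ in members) / len(members)
        cy = sum(y for _, y in members) / len(members)
        touch_object_list.append({"coord": (cx, cy), "count": len(members)})
    return touch_object_list

# Three tightly packed points form one touch object; the lone point is noise.
frame = [(0.00, 0.00), (0.10, 0.00), (0.05, 0.10), (2.00, 2.00)]
print(cluster_touch_objects(frame))
```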
- The state determination unit 400 determines the state of the first touch object using the first touch object and the first coordinates of the first frame, extracted by the data management unit 300 and included in the touch object list. Referring to FIG. 4, to determine the state of the first touch object, the state determination unit 400 may consider the second frame, which is the previous frame contiguous with the first frame, and the third frame, which is the next frame.
- The state determination unit 400 checks whether a second touch object exists in the second frame. According to (1) of FIG. 4, if no second touch object exists in the second frame, the state determination unit 400 may determine the first touch object to be in the enter state. This means that although there was no sensing information in the previous frame, a new touch object has been sensed and recognized.
- The enter state is analogous to pressing a mouse button.
- If a second touch object exists, the state determination unit 400 may calculate a second distance between the first coordinates and the second coordinates of the second touch object in the second frame. According to (2) of FIG. 4, the state determination unit 400 compares the second distance with a preset second distance threshold and, when the second distance is greater than the threshold, may determine the first touch object to be in the enter state. Because the first touch object is located in a different area from the second touch object, the state determination unit 400 may determine that a new touch object has been recognized.
- When the first coordinates and the second coordinates are the same, the state determination unit 400 may determine the first touch object to be in the stay state.
- The stay state indicates that an already-sensed touch object continues to exist.
- When the first coordinates and the second coordinates are not the same and the second distance is less than the second distance threshold, the state determination unit 400 may determine the first touch object to be in the move state.
- The move state is analogous to dragging with the mouse.
- If no third touch object exists in the third frame, the state determination unit 400 may determine the first touch object to be in the exit state. This means that the first touch object no longer exists in the third frame; the exit state is analogous to releasing the mouse button.
- If a third touch object exists, the state determination unit 400 may calculate a third distance between the first coordinates and the third coordinates of the third touch object in the third frame.
- The state determination unit 400 may compare the third distance with the preset second distance threshold and, if the third distance is greater than the second distance threshold, determine the first touch object to be in the exit state. Since the first touch object and the third touch object are located in different regions, the state determination unit 400 may determine that the first touch object has disappeared and a new third touch object has been recognized.
- The state determination unit 400 may determine the first touch object to be in the stay state or the move state when the third distance is smaller than the second distance threshold.
- The state determination unit 400 may determine the state of the first touch object by considering the various possible cases, since at least one touch object may exist in each frame; a compact sketch follows.
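A compact sketch of the enter/stay/move/exit decision described above, assuming nearest-neighbour matching between frames; the matching strategy and the threshold value are our assumptions, not the patent's:

```python
from math import dist

ENTER, STAY, MOVE, EXIT = "enter", "stay", "move", "exit"

def touch_state(first_coord, prev_coords, next_coords, threshold=0.2):
    """Classify one touch object of the current frame from the touch-object
    coordinates of the previous frame (`prev_coords`) and the next frame
    (`next_coords`), mirroring the rules described above."""
    # No touch object in the previous frame: a new touch appeared.
    if not prev_coords:
        return ENTER
    nearest_prev = min(prev_coords, key=lambda c: dist(first_coord, c))
    # Farther than the threshold from every previous touch: also a new touch.
    if dist(first_coord, nearest_prev) > threshold:
        return ENTER
    # Gone in the next frame (or only far-away touches remain): exit.
    if not next_coords or all(dist(first_coord, c) > threshold
                              for c in next_coords):
        return EXIT
    # Same coordinates as before: held in place; nearby but moved: dragged.
    return STAY if first_coord == nearest_prev else MOVE
```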
- The conversion unit 500 may convert the first coordinates into real coordinates to determine whether the first touch object is located in a preset effective area.
- The effective area may be the entire sensing area or an area obtained by dividing the entire sensing area. In the touch recognition method according to an embodiment of the present invention, setting an effective area makes it possible to confirm that a touch object has entered a specific area or has been recognized at a specific location, as sketched below.
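One way the coordinate conversion and effective-area test might be realized, assuming a simple affine calibration and rectangular areas (both are assumptions; the patent does not fix the conversion):

```python
def to_real_coordinates(sensor_coord, scale=1.0, offset=(0.0, 0.0)):
    """Affine map from the sensor's 2D distance map to real (display/floor)
    coordinates; (scale, offset) stand in for whatever calibration is used."""
    return (sensor_coord[0] * scale + offset[0],
            sensor_coord[1] * scale + offset[1])

def in_effective_area(real_coord, area):
    """True if the converted coordinate lies inside a rectangular effective
    area given as (x_min, y_min, x_max, y_max)."""
    x, y = real_coord
    x_min, y_min, x_max, y_max = area
    return x_min <= x <= x_max and y_min <= y <= y_max

# The whole sensing area, or a subdivision of it, can serve as effective areas.
whole = (0.0, 0.0, 4.6, 4.6)
halves = [(0.0, 0.0, 2.3, 4.6), (2.3, 0.0, 4.6, 4.6)]
print(in_effective_area(to_real_coordinates((1.0, 1.0)), whole))  # True
```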
- the motion recognizer 600 may recognize the movement of the touch object located in the effective area.
- The motion recognition unit 600 may measure the speed of a user's footsteps based on the touch objects located in the effective area, or analyze the information of touch objects recognized in the effective area to measure how the user moves sequentially between areas or how long the user stays in a specific area.
- the motion recognition unit 600 may use the actual coordinates of the touch object and the time at which the touch object is recognized in order to recognize the above-described movement of the touch object.
- For example, the motion recognition unit 600 may judge that a recognized touch object stayed in the effective area for 1 minute and 10.09 seconds.
- the motion recognition unit 600 may measure a touch count such as a jump within the effective area.
- the motion recognition unit 600 may determine the number of jumps by determining that the user is jumping when the first touch object repeatedly appears at the same first coordinate in several consecutive frames.
- The motion recognizer 600 may track the user's movement within the effective area using the first recognized touch object, the last recognized touch object, and the touch objects existing in between. In addition, the motion recognition unit 600 may track a motion of lifting or dragging an object located at a specific location. A sketch of the dwell-time and jump-count measurements follows.
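The stay-time and jump-count measurements described above could be approximated as follows; the frame rate, tolerance, and the exact jump rule are illustrative assumptions:

```python
from math import dist

def stay_time_s(n_frames_recognized, frame_period_s=0.025):
    """Dwell time from the number of frames in which the touch object was
    recognized inside the effective area."""
    return n_frames_recognized * frame_period_s

def count_jumps(frames_coords, landing_coord, tolerance=0.05):
    """Count one jump each time a touch reappears near `landing_coord`
    after one or more frames with no touch there."""
    jumps, airborne = 0, False
    for coords in frames_coords:
        present = any(dist(landing_coord, c) <= tolerance for c in coords)
        if present and airborne:
            jumps += 1  # touched down again after leaving the floor
        airborne = not present
    return jumps

# Present, absent, present, absent, present at (1, 1): two jumps counted.
seq = [[(1.0, 1.0)], [], [(1.0, 1.0)], [], [(1.0, 1.0)]]
print(count_jumps(seq, (1.0, 1.0)))
```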
- the touch recognition apparatus may further include a camera module including a depth camera and/or a 3D camera to simultaneously photograph the same area as the sensor unit 100 .
- The motion recognition unit 600 may recognize not only the movement of the touch object but also the user's overall movement from the image captured by the camera module. Since touch-object motion alone can only detect the simple act of touching the floor, the present invention can obtain information on how the user's body is moving by further including a camera module.
- the display unit 700 may display a touch object for the effective area.
- A beam projector may be used so that the screen on which the display unit 700 displays the touch object is projected at the location corresponding to the real coordinates, that is, within the sensor's effective area.
- The display unit 700 may induce the user to touch a target position by displaying not only the touch object but also the target position where the touch object should be located. The target position may be displayed differently according to the user's body information.
- the display unit 700 may set the same effective area in each area to induce multi-touch. Also, the display unit 700 may induce the user to touch a specific section reciprocally.
- the display unit 700 may induce a user's movement by displaying the target location. Due to these characteristics, the touch recognition device of the present invention may be used to guide a user to an appropriate exercise posture. The user may correct the exercise posture by checking the target position displayed on the display unit 700 and touching the target position with a hand and/or a foot.
- the touch recognition device may be implemented as a server, hereinafter referred to as a server.
- the server may detect at least one object from the lidar sensor and generate sensing information for each object.
- the sensing information may include at least one of distance, direction, speed, temperature, material distribution and concentration characteristics from the sensor to the object, or 3D image information.
- the lidar sensor may recognize a motion of the user touching the floor.
- the server may extract the coordinates of the object based on all the sensing information collected from the lidar sensor.
- the server may extract coordinates only for the recognized touch object within the preset angle range. More specifically, the server may extract the coordinates of the object by using the distance and direction included in the sensing information.
- the coordinates of the object mean coordinates corresponding to the two-dimensional distance map having the location where the sensor is installed as the origin.
- the server may display the coordinates of the extracted object in a planar coordinate system.
- the server may perform clustering on at least one or more objects displayed in the planar coordinate system to extract the first touch object of the first frame and the first coordinates of the first touch object.
- the server calculates a first distance between coordinates of each object in a plane coordinate system in which an object for the first frame is displayed, and performs clustering to form a cluster according to the distance.
- the server may view at least one or more objects included in the cluster having the first distance equal to or less than the first distance threshold value as the first touch object.
- If a plurality of clusters are formed in the first frame, a plurality of first touch objects may also be formed. In this case, the sensing information of objects not included in any first touch object may be deleted.
- the server may extract the first coordinates of the first touch object.
- the first coordinate may be a point or an area.
- The server determines the state of the first touch object using the first touch object and its first coordinates. To determine the state of the first touch object, the server may consider the previous frame (second frame) and the next frame (third frame) contiguous with the first frame.
- If no second touch object exists in the second frame, the server may determine the first touch object to be in the enter state.
- If a second touch object exists, the server may calculate a second distance between the first coordinates and the second coordinates of the second touch object. In this case, if the second distance is greater than the second distance threshold, the server may determine the first touch object to be in the enter state.
- If the first coordinates and the second coordinates are the same, the server determines the first touch object to be in the stay state; if they are not the same and the second distance is within the threshold, the server may determine the first touch object to be in the move state.
- If no third touch object exists in the third frame, the server may determine the first touch object to be in the exit state.
- If a third touch object exists, the server may calculate a third distance between the first coordinates and the third coordinates of the third touch object. In this case, if the third distance is greater than the second distance threshold, the first touch object may be determined to be in the exit state; otherwise, it may be determined to be in the stay state or the move state.
- the effective area may be the entire sensing area or an area obtained by dividing the entire sensing area.
- The server then displays the first touch object on the display unit.
- The display unit may use a beam projector capable of projecting the screen on which the touch object is displayed at a position corresponding to the real coordinates, i.e., within the sensing area.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (11)
- A method for a server to recognize a touch of an object, comprising: step a of extracting coordinates of at least one object based on sensing information on the object sensed through a lidar sensor; step b of performing clustering on the coordinates of the object extracted from a first frame to calculate a first touch object of the first frame and first coordinates of the first touch object; step c of determining a state of the first touch object using the first touch object and the first coordinates; step d of converting the first coordinates into real coordinates and determining whether the first coordinates are located in a preset effective area; and step e of displaying the first touch object at the real coordinates when it is determined that the first touch object is located in the effective area.
- The touch recognition method of claim 1, wherein in step a the coordinates of the object are two-dimensional coordinates, and step a includes displaying the coordinates of the object in a planar coordinate system for each frame.
- The touch recognition method of claim 2, wherein step b further includes: calculating a first distance between the coordinates of at least one object displayed in the planar coordinate system corresponding to the first frame; performing clustering by comparing the first distance with a preset first distance threshold; and setting a cluster generated by the clustering as the first touch object.
- The touch recognition method of claim 3, further comprising: generating a touch object list using the first touch object, the first coordinates of the first touch object, and the number of objects included in the first touch object; and, if the touch object list contains a first touch object whose number of objects is less than or equal to a preset number, removing it from the touch object list.
- The touch recognition method of claim 1, wherein step c includes: determining whether a second touch object exists in a second frame, which is the previous frame contiguous with the first frame; if the second touch object exists, calculating a second distance between the first coordinates and second coordinates of the second touch object; and determining the state of the first touch object by comparing the second distance with a preset second threshold.
- The touch recognition method of claim 1, wherein step c includes: determining whether a third touch object exists in a third frame, which is the next frame contiguous with the first frame; if the third touch object exists, calculating a third distance between the first coordinates and third coordinates of the third touch object; and determining the state of the first touch object by comparing the third distance with a preset second threshold.
- An apparatus for recognizing a touch of an object, comprising: a sensor unit that detects at least one object through a lidar sensor and generates sensing information; a coordinate extraction unit that extracts coordinates of the object based on the sensing information; a data management unit that performs clustering on the coordinates of the object extracted from a first frame to extract a first touch object of the first frame and first coordinates of the first touch object; a state determination unit that determines a state of the first touch object using the first touch object and the first coordinates; a conversion unit that converts the first coordinates into real coordinates and determines whether the first touch object is located in a preset effective area; and a display unit that displays the first touch object at the real coordinates when the first touch object is located in the effective area.
- The touch recognition apparatus of claim 7, wherein the data management unit calculates a first distance between the coordinates of at least one object included in the first frame, performs clustering by comparing the first distance with a preset first distance threshold, and sets a generated cluster as the first touch object.
- The touch recognition apparatus of claim 7, wherein the state determination unit determines whether a second touch object exists in a second frame, which is the previous frame contiguous with the first frame; determines the first touch object to be in an enter state if the second touch object does not exist; and, if the second touch object exists, calculates a second distance between the first coordinates and second coordinates of the second touch object and determines the first touch object to be in an enter state if the second distance is greater than or equal to the second distance threshold.
- The touch recognition apparatus of claim 9, wherein the state determination unit determines the first touch object to be in a stay state if the first coordinates and the second coordinates are the same, and determines the first touch object to be in a move state if the first coordinates and the second coordinates are not the same and the second distance is less than or equal to a preset second distance threshold.
- The touch recognition apparatus of claim 7, wherein the state determination unit determines whether a third touch object exists in a third frame, which is the next frame contiguous with the first frame; determines the first touch object to be in an exit state if the third touch object does not exist; and, if the third touch object exists, calculates a third distance between the first coordinates and third coordinates of the third touch object and determines the first touch object to be in an exit state if the third distance is greater than or equal to the second distance threshold.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180027951.6A CN115668113A (zh) | 2020-02-19 | 2021-02-16 | Touch recognition method and device having lidar sensor |
US17/801,043 US11868567B2 (en) | 2020-02-19 | 2021-02-16 | Touch recognition method and device having LiDAR sensor |
US18/522,853 US20240111382A1 (en) | 2020-02-19 | 2023-11-29 | Touch recognition method and device having lidar sensor |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020200020301A KR102340281B1 (ko) | 2020-02-19 | 2020-02-19 | Touch recognition method and device having LiDAR sensor |
KR10-2020-0020301 | 2020-02-19 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/801,043 A-371-Of-International US11868567B2 (en) | 2020-02-19 | 2021-02-16 | Touch recognition method and device having LiDAR sensor |
US18/522,853 Continuation US20240111382A1 (en) | 2020-02-19 | 2023-11-29 | Touch recognition method and device having lidar sensor |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021167312A1 (ko) | 2021-08-26 |
Family
ID=77391033
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2021/001945 WO2021167312A1 (ko) | 2021-02-16 | Touch recognition method and device having LiDAR sensor |
Country Status (4)
Country | Link |
---|---|
US (2) | US11868567B2 (ko) |
KR (1) | KR102340281B1 (ko) |
CN (1) | CN115668113A (ko) |
WO (1) | WO2021167312A1 (ko) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102506127B1 (ko) * | 2021-10-29 | 2023-03-07 | (주)아이원 | Virtual touch screen device using lidar |
KR102400435B1 (ko) * | 2022-03-03 | 2022-05-20 | 주식회사 에이치아이엔티 | Method for accelerating data processing of a lidar-based real-time detection system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20090006543A (ko) * | 2007-07-12 | 2009-01-15 | TOVIS Co., Ltd. | Touch sensing method using light, and touch panel device and system therefor |
KR20110057146A (ko) * | 2008-08-07 | 2011-05-31 | Owen Drumm | Method and apparatus for detecting multi-touch events in an optical touch-sensitive device |
KR20140098282A (ko) * | 2013-01-30 | 2014-08-08 | LG Display Co., Ltd. | Touch recognition device and touch recognition method |
KR20170114045A (ko) * | 2016-03-31 | 2017-10-13 | 주식회사 아이유플러스 | Apparatus and method for tracking a target's trajectory using an image sensor and a radar sensor |
CN109656457A (zh) * | 2017-10-10 | 2019-04-19 | 北京仁光科技有限公司 | Multi-finger touch method, apparatus, device, and computer-readable storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101560726B1 (ko) * | 2009-06-30 | 2015-10-16 | Naver Corporation | System and method for generating clusters using link-based seeds |
KR102143760B1 (ko) * | 2013-12-02 | 2020-08-13 | LG Display Co., Ltd. | Touch tracking method |
EP3360119B1 (en) * | 2015-10-05 | 2023-07-26 | Zurface Group ApS | A hygiene nudging system |
2020
- 2020-02-19 KR KR1020200020301A patent/KR102340281B1/ko active IP Right Grant
2021
- 2021-02-16 CN CN202180027951.6A patent/CN115668113A/zh active Pending
- 2021-02-16 US US17/801,043 patent/US11868567B2/en active Active
- 2021-02-16 WO PCT/KR2021/001945 patent/WO2021167312A1/ko active Application Filing
2023
- 2023-11-29 US US18/522,853 patent/US20240111382A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US11868567B2 (en) | 2024-01-09 |
US20240111382A1 (en) | 2024-04-04 |
KR102340281B1 (ko) | 2021-12-17 |
KR20210105618A (ko) | 2021-08-27 |
CN115668113A (zh) | 2023-01-31 |
US20230062630A1 (en) | 2023-03-02 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21756342; Country of ref document: EP; Kind code of ref document: A1
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 21756342; Country of ref document: EP; Kind code of ref document: A1
 | 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 13.01.2023)