WO2022250238A1 - Information input system for a computer and method for calculating coordinates of input information using an information input system - Google Patents


Info

Publication number
WO2022250238A1
WO2022250238A1 (application PCT/KR2021/020053)
Authority
WO
WIPO (PCT)
Prior art keywords
information input
coordinates
image
area
display area
Prior art date
Application number
PCT/KR2021/020053
Other languages
English (en)
Korean (ko)
Inventor
박상욱
이인호
김체호
윤동수
Original Assignee
(주) 이즈커뮤니케이션즈
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주) 이즈커뮤니케이션즈
Publication of WO2022250238A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by opto-electronic means

Definitions

  • The present invention relates to an information input system for inputting information into a computer by entering information in a display area where the computer's image is displayed, and to a coordinate calculation method for calculating the coordinates of information input in the display area using the information input system.
  • When a display device with an attached touch panel is used, a user inputs information into the computer by touching the display to control the image it presents or by drawing characters or figures on the display screen, and the input information is displayed on the screen again.
  • When the user merely advances or stops the image with a remote control or the like, it is impossible to interact with the computer beyond the commanded operation, such as controlling the computer through the displayed screen or monitor or drawing figures or letters on the screen or monitor as input to the computer.
  • Patent No. 820573 (Document 1) for 'Computer Input Device Recognizing Location and Blinking by Comparing a Laser Pointed Image Using a Camera and a Computer Image'
  • The prior art also includes an invention related to an 'infrared screen type projection image touch device' disclosed in Patent No. 936666 (Document 2).
  • The invention of Document 1 irradiates a point light source onto a screen or monitor with a laser pointer, photographs the monitor with a camera, recognizes the image of the point light source from the laser pointer, and receives it as input to a computer; by displaying and photographing an image for coordinate-system correction on the monitor, the coordinate system of the image captured by the camera is corrected to match the coordinate system of the image displayed by the computer.
  • In the invention of Document 2, an infrared screen is formed by an infrared LED array, and an infrared camera facing the infrared screen detects the distortion of the infrared rays caused by a user's finger or the like, thereby detecting the point where the user's finger is located.
  • In a system in which information is input using a point light source or a finger in the area where such computer images are displayed, a camera is commonly used to photograph the display area, so it is essential to install the camera so that it captures the entire display area.
  • The camera should be positioned so that it can capture the entire display area; however, if the camera cannot be installed far from the display area, or if it is mounted close to the display, it is difficult to photograph the entire area.
  • In such a case, each camera captures a portion of the display area, and the coordinate systems of the cameras are synchronized with each other so that a user's input captured by each camera can be processed as a single information input.
  • However, the coordinates of the display area photographed by the respective cameras may become inconsistent with each other due to deformation of the display or vibration of the display area and the projector during use.
  • As a result, a line segment captured and processed through the cameras may be disconnected at the interface between the display areas or may take a form different from the one the user input.
  • The present invention solves the above problems of the prior art and provides an information input system that photographs a display area with a plurality of cameras and calculates the coordinates of information input by a user from the captured images, and a method for calculating the coordinates of information input using the information input system.
  • The present invention divides the display area into a plurality of pieces, each photographed by one of a plurality of cameras. In calculating the coordinates of information input from the captured images, it provides an information input system that solves the problem of inconsistent or discontinuous coordinates of continuous information input at the boundary between display areas photographed by different cameras, and a method for calculating the coordinates of information input using this information input system.
  • The problem described above is solved by an information input system that acquires an image of a display area where a computer image is displayed, calculates the coordinates of information input in the display area, and processes them as input information, and by a coordinate calculation method of information input using this information input system.
  • It includes a plurality of image capturing devices that divide and capture a display area and an image processing module that analyzes images captured by the image capturing devices and extracts a signal related to information input from the captured images,
  • Each image capture device is arranged to capture one divided display area of the display area, and has a unique capture area of the image capture device and an overlapping capture area overlapped with other image capture devices;
  • the image processing module is configured such that:
  • for information input in the unique capture area of each image capturing device, the coordinates of the information input are determined according to the coordinate values of that device in the display area; and
  • for information input in the overlapping capture area, the coordinates of the information input are determined according to the coordinates in the display area of each image capturing device covering the overlapping area, and the determined coordinates are processed according to a blending function and calculated as the coordinates of the information input.
  • An image reproduced or formed by a computer is displayed on a display area by a display device, and a plurality of image capturing devices divides the display area and captures images.
  • the image processing module matches a coordinate system of an image captured by each image capturing device to a coordinate system of a display area. Accordingly, information input determined from images captured by each of the separate image capturing devices may be processed as coordinates in one coordinate system.
  • the image processing module extracts an image of information input by a user from an image captured by each image capturing device, and calculates coordinates of the extracted image.
  • the calculated coordinates are processed according to the blending function and calculated as coordinates of information input.
  • Input information may have continuous coordinate values.
  • In one embodiment, the overlapping capture area extends in one axial direction of the display area, and
  • the image processing module assigns a linear weight in the range of 1 to 0 to the coordinate value of the information input according to the relative distance in that axial direction from the boundary between any one unique capture area and the overlapping area, and calculates the coordinates of the information input by summing the weighted coordinate values of the information input from the image capturing devices; or
  • a weight in the range of 1 to 0 is assigned according to a sigmoid function that takes as its input value the relative distance in that axial direction from the boundary between any one unique capture area and the overlapping area, and the weighted coordinate values of the information input from the image capturing devices are summed and calculated as the coordinates of the information input.
  • In this way, the coordinate values of the information input obtained from the images captured by each image capturing device are summed with weights calculated according to the relative distance within the overlapping area and calculated as the coordinates of the information input.
  • When there is continuous information input from one display area through the overlapping area to the other display area, the coordinates are calculated using the coordinates of the information input obtained from each imaging device, not only the coordinate value obtained from a single imaging device.
  • Accordingly, the coordinates of the input information are neither disconnected nor deviate greatly from the originally intended coordinates.
  • FIGS. 1 and 2 are diagrams each illustrating an information input system in which a coordinate calculation system according to an embodiment of the present invention is implemented.
  • FIG. 3 is a diagram showing a state in which a display area is divided in a coordinate calculation system according to an exemplary embodiment.
  • FIGS. 4 and 5 are graphs illustrating a method of calculating weights for coordinates of information input in different regions in a coordinate calculation system according to an exemplary embodiment.
  • FIGS. 1 and 2 show an information input system in which information input by a user in the area where an image displayed by a computer appears is determined and entered into the computer as information input.
  • This information input system includes a computer 40 that reproduces or forms the image to be displayed, a display device 10 that forms a display area by displaying the image reproduced by the computer, two camera devices 21 and 22 that capture the display area, and an infrared pen 30 configured to irradiate infrared rays from its tip, which a user holds and uses to irradiate infrared rays onto the display area to input information into the computer.
  • A user may touch a specific part of the display area, where the display device 10 displays the image transmitted from the computer, with the infrared pen 30, or may draw a character, number, figure, or line segment with the infrared pen 30.
  • the infrared light emitted from the tip of the infrared pen 30 by the user's action is photographed by the cameras 21 and 22, and the photographed images are transmitted to the computer 40 and processed.
  • The computer 40 is provided with an image processing module that processes the images from the cameras, identifies the infrared light from the infrared pen 30 included in the images, calculates the coordinates of the infrared light in the display area, and processes the information input at the position of the calculated coordinates.
  • such an image processing module is implemented as a computer program and is driven by a memory and a CPU provided in the computer 40 to process an image from a camera and calculate coordinates.
  • the image processing module may be provided as a separate hardware device and a program loaded on the device and installed as part of the computer 40 or as a separate device.
  • In this embodiment, the display device 10 is a flat panel display on which the screen reproduced or formed by the computer 40 is displayed; alternatively, a projector that projects the screen of the computer 40 onto a projection screen may serve as the display device, with the cameras 21 and 22 capturing the screen onto which the image is projected, that screen then forming the display area.
  • The display device 10 is much wider than it is tall, and two cameras 21 and 22, horizontally spaced apart from each other, are provided at the top of the display device 10.
  • One camera 21 mainly photographs the left side of the display area and the other camera 22 mainly photographs the right side, but the areas they capture overlap in the horizontal direction of the display area.
  • the infrared pen 30 used by the user to input information emits infrared rays from the front end when the user touches the surface of the display device 10, which is a display area, or when the user presses a button.
  • a user may hold the infrared pen 30 and touch a specific part of the surface of the display device 10 or draw a character, number, or figure.
  • In this way, infrared light is irradiated onto the display area of the display device 10, which displays the image transmitted from the computer 40.
  • Accordingly, the images captured by the cameras 21 and 22 include the infrared light, and the image processing module of the computer 40 extracts the infrared light from the captured images, calculates the coordinates of the infrared light in the coordinate system of the images, and processes the result as the user's information input at the corresponding location.
  • In this embodiment, the infrared pen 30 held by the user serves as the means of inputting information; alternatively, laser light or infrared light may be irradiated parallel to the surface of the display device 10, and when the user's finger or the like touches the display area or moves adjacent to it, the cameras may capture the laser or infrared light reflected from the user's finger.
  • For coordinate-system matching, an image including reference-point images is displayed on the display device 10; the image processing module of the computer 40 calculates the coordinates of the reference-point images from the images captured by the cameras and performs the matching operation by matching them with the coordinates of the reference points in the image transmitted by the computer 40.
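The reference-point matching described above amounts to fitting a transform from each camera's image coordinates to the display-area coordinate system. As a minimal sketch (the patent does not specify the transform model; a simple affine map and the function names below are assumptions for illustration), a least-squares fit could look like:

```python
import numpy as np

def fit_affine(cam_pts, disp_pts):
    """Least-squares affine map from camera-image coordinates to
    display-area coordinates, using matched reference points."""
    cam = np.asarray(cam_pts, dtype=float)
    disp = np.asarray(disp_pts, dtype=float)
    A = np.hstack([cam, np.ones((len(cam), 1))])  # rows [x, y, 1]
    # Solve A @ M ~= disp for the 3x2 affine matrix M.
    M, *_ = np.linalg.lstsq(A, disp, rcond=None)
    return M

def to_display(M, point):
    """Map one camera-image point into display-area coordinates."""
    x, y = point
    return np.array([x, y, 1.0]) @ M
```

Each camera would get its own matrix; every extracted infrared-light coordinate is then mapped through `to_display` before any blending, so both cameras report positions in one common coordinate system.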
  • The left area (a) of the display area is set as the display area photographed by the left camera 21, and the right area (c) of the display area is set as the display area photographed by the right camera 22, and the coordinate systems are matched accordingly.
  • the left area (a) and the right area (c) of the display area include an overlapping area (b).
  • When the user touches the overlapping area (b), the infrared light generated by the touch is captured by both cameras 21 and 22 and processed by the image processing module of the computer; the images from both cameras contain the image of the infrared light.
  • If the coordinates of the infrared-light images in the images from the respective cameras coincide with each other, the image processing module of the computer can process them as an input at the same coordinates.
  • the cameras 21 and 22 or the display device 10 may be shaken or twisted due to vibration or the like, so that the position of the infrared light captured by both cameras may be different.
  • the calculated coordinates of the infrared light images extracted from the images from the respective cameras 21 and 22 may be different from each other with respect to the infrared light projected on one point of the overlapping region b.
  • Therefore, the information input system of this embodiment calculates the coordinates of the infrared-light image in the image taken by the left camera 21 and the coordinates of the infrared-light image in the image taken by the right camera 22, and processes the two sets of calculated coordinates together to obtain the final coordinates of the infrared-light image as the coordinates of the information input by the user.
  • the image processing module of the computer calculates the coordinates of the information input by assigning weights to the calculated coordinates and summing them.
  • the camera 21 on the left and the camera 22 on the right have areas b overlapping each other in the horizontal direction in the display area.
  • For the display area (a) of the left camera 21 excluding the overlapping area (b), a weight of 1 is given to the coordinates of the infrared-light image from the left camera 21; even if an infrared-light image from the right camera 22 is included in its image and its coordinates are calculated, those coordinates are given a weight of 0.
  • In other words, in the left display area (a), an infrared-light image included in the image from the right camera 22 is disregarded.
  • Likewise, in the display area (c) of the right camera 22 excluding the overlapping area (b), the coordinate values of the infrared-light image from the left camera 21 are given a weight of 0 and ignored.
  • In the overlapping area (b), a weight is given according to the position in the horizontal direction of the display area.
  • The relative horizontal position of the infrared light within the overlapping area is the variable of the weighting function, and the weight is the value of that function.
  • For the coordinates from the left camera 21, a weight of 1 is assigned at the boundary between the left display area (a) and the overlapping area (b), a weight of 0 is assigned at the boundary between the right display area (c) and the overlapping area (b), and a value between 1 and 0 is assigned linearly according to the distance at positions between the boundaries.
  • Conversely, for the coordinates from the right camera 22, a weight of 0 is given at the boundary between the left display area (a) and the overlapping area (b), a weight of 1 is assigned at the boundary between the right display area (c) and the overlapping area (b), and a value between 1 and 0 is assigned linearly according to the distance between the boundaries.
  • the coordinates of the images of the infrared light weighted in this way are summed to calculate the coordinates of the information input.
  • The coordinates have values in both the horizontal and vertical directions, and the two are processed independently of each other.
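The linear weighting just described can be sketched in a few lines. This is a hedged illustration, not the patent's implementation: the names, and the convention that `x_left`/`x_right` are the horizontal positions of the two overlap boundaries (with all coordinates already mapped into the common display coordinate system), are assumptions.

```python
def left_weight(x, x_left, x_right):
    """Weight for the left camera's coordinate: 1 at the boundary with the
    left-only area (x_left), 0 at the boundary with the right-only area
    (x_right), and linear in between."""
    if x <= x_left:
        return 1.0
    if x >= x_right:
        return 0.0
    return (x_right - x) / (x_right - x_left)

def blend(coord_left, coord_right, x, x_left, x_right):
    """Weighted sum of the two cameras' coordinate values for one axis."""
    w = left_weight(x, x_left, x_right)
    return w * coord_left + (1.0 - w) * coord_right
```

As the text notes, the horizontal and vertical coordinate values would each be blended independently with the same weight.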
  • The weight described above is calculated by a linear function, but it may instead be calculated as a nonlinear value by a sigmoid function.
  • As the sigmoid function, a logistic function, a hyperbolic tangent function, an arctangent function, or an error function may be used. Like the linear function used for the linear weights, these functions take the horizontal coordinate value within the overlapping area (b) as their variable, and the constants of the function may be chosen so that its value ranges from 0 to 1 over the range of horizontal coordinate values of the overlapping area.
  • Weights according to such a sigmoid function have a form as shown in the graph shown in FIG. 5 .
  • As the position within the overlapping area (b) approaches the left display area (a) or the right display area (c), the weight converges rapidly to 0 or 1, so the coordinate values remain continuous across the boundaries between the overlapping area (b) and the surrounding display areas (a, c); compared with the case where linear weights are applied, a result in which the change of the coordinate values is even smoother can be obtained.
  • The coordinate value is thus not calculated to change abruptly at the boundaries of the display areas, and highly continuous coordinate values can be obtained.
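A logistic variant of the weight, which converges rapidly toward 1 and 0 near the overlap boundaries as described, might look like the sketch below. The steepness constant `k` and the function name are assumptions; the patent only requires that the constants be chosen so the value spans roughly 0 to 1 across the overlap.

```python
import math

def left_weight_sigmoid(x, x_left, x_right, k=10.0):
    """Smooth weight for the left camera: ~1 near x_left, ~0 near x_right.
    A falling logistic curve over the overlap region."""
    # Normalize the horizontal position to [-1, 1] across the overlap.
    t = 2.0 * (x - x_left) / (x_right - x_left) - 1.0
    return 1.0 / (1.0 + math.exp(k * t))
```

A hyperbolic tangent, arctangent, or error function could be substituted the same way, rescaled to the 0-to-1 range; the blending step (weighted sum of the two cameras' coordinates) is unchanged.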
  • In this embodiment, the display area is divided in the horizontal direction and the overlapping area is set in the horizontal direction; however, the display area may instead be divided vertically, with an overlapping area provided between the capture areas of vertically arranged cameras.
  • In this embodiment, the relative distance within the overlapping area is used as the variable, the weights are calculated over the range given by the linear function or the sigmoid function, and the coordinates of the image multiplied by the weights are summed to calculate the coordinates of the information input. However, the present invention is not limited to this method of blending coordinate values; besides weight calculation using a linear function or a sigmoid function, various blending functions that take as input values the coordinates of the infrared-light image calculated from the captured images of each camera may be used to calculate the coordinates of the information input.
  • In this embodiment, two cameras are used as the image capture devices, but three or more cameras may be used, with the display area divided into three or more parts and overlapping areas set between the capture areas of the cameras.

Abstract

The present invention relates to an information input system for acquiring an image of a display area in which an image from a computer is displayed, calculating the coordinates of information input in the display area, and processing them as input information, comprising: a plurality of image capture devices that divide and photograph the display area; and an image processing module that analyzes the images captured by the image capture devices and extracts from them a signal relating to information input. Each image capture device is arranged to photograph one divided portion of the display area, and has a unique capture area of its own and an overlapping capture area shared with another image capture device. The image processing module: for information input in the unique capture area of each image capture device, determines the coordinates of the information input according to the coordinate values of that device in the display area; for information input in the overlapping capture area, determines the coordinates of the information input according to the coordinates of each image capture device for that overlapping area in the display area; and processes the determined coordinates according to a blending function to calculate them as the coordinates of the information input.
PCT/KR2021/020053 2021-05-25 2021-12-28 Information input system for a computer and method for calculating coordinates of input information using an information input system WO2022250238A1

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2021-0067307 2021-05-25
KR20210067307 2021-05-25

Publications (1)

Publication Number Publication Date
WO2022250238A1 2022-12-01

Family

ID=84229929

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/020053 WO2022250238A1 2021-05-25 2021-12-28 Information input system for a computer and method for calculating coordinates of input information using an information input system

Country Status (1)

Country Link
WO (1) WO2022250238A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100047793A (ko) * 2008-10-29 2010-05-10 한국전자통신연구원 User interface apparatus based on a wearable computing environment, and method thereof
KR101137003B1 (ko) * 2010-10-29 2012-04-19 (주)더게이트테크놀러지스 Touch operation method for a touch screen equipped with a camera sensor
KR101593950B1 (ko) * 2014-05-28 2016-02-15 숭실대학교산학협력단 Hand-gesture-based interface apparatus and pointing method using the same
WO2016204068A1 (fr) * 2015-06-19 2016-12-22 ソニー株式会社 Image processing apparatus, image processing method, and projection system
JP6154905B2 (ja) * 2013-08-30 2017-06-28 クラリオン株式会社 Camera calibration device, camera calibration system, and camera calibration method


Similar Documents

Publication Publication Date Title
US6554431B1 (en) Method and apparatus for image projection, and apparatus controlling image projection
EP2610718A1 (fr) Système de tableau noir électronique et programme
WO2014189216A1 (fr) Procédé et système de suivi automatique de position du visage et de reconnaissance faciale
WO2013129792A1 (fr) Procédé et terminal portable pour corriger la direction du regard de l'utilisateur dans une image
US20100103099A1 (en) Pointing device using camera and outputting mark
WO2014104521A1 (fr) Appareil et procédé de transformation d'image
KR20010020668A (ko) 컴퓨터-생성 투사 영상의 캘리브레이션을 위한 방법 및 장치
JPH08240407A (ja) 位置検出入力装置
WO2015182904A1 (fr) Appareil d'étude de zone d'intérêt et procédé de détection d'objet d'intérêt
WO2018135906A1 (fr) Caméra et procédé de traitement d'image d'une caméra
WO2013025011A1 (fr) Procédé et système de suivi d'un corps permettant de reconnaître des gestes dans un espace
WO2022250238A1 (fr) Système d'entrée d'informations dans un ordinateur et procédé de calcul de coordonnées d'informations d'entrée à l'aide d'un système d'entrée d'informations
WO2020204366A2 (fr) Procédé fournissant un guide de balayage et dispositif de traitement d'image associé
WO2020111389A1 (fr) Structure de mla multicouche pour corriger une anomalie d'indice de réfraction d'un utilisateur, panneau d'affichage et procédé de traitement d'image
WO2018110810A1 (fr) Système de création de contenu de réalité virtuelle
WO2020101121A1 (fr) Procédé d'analyse d'image basée sur l'apprentissage profond, système et terminal portable
WO2023149603A1 (fr) Système de surveillance par images thermiques utilisant une pluralité de caméras
WO2011040653A1 (fr) Appareil de photographie et procédé pour fournir un objet 3d
WO2022131720A1 (fr) Dispositif et procédé pour générer une image de construction
WO2018084381A1 (fr) Procédé de correction d'image utilisant une analyse d'apprentissage profond basée sur un dispositif gpu
WO2017026834A1 (fr) Procédé de génération et programme de génération de vidéo réactive
WO2019054534A1 (fr) Procédé et appareil de capture d'écran pour un terminal électronique
WO2019231119A1 (fr) Appareil et procédé de visualisation d'encombrement
KR100873445B1 (ko) 영상차이 인식 시스템 및 영상 시스템을 이용한 영상차이인식방법
WO2013047944A1 (fr) Écran tactile pour une personne ayant une vision défaillante et procédé d'affichage de celui-ci

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 21943222

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE