WO2016072610A1 - Recognition method and recognition device - Google Patents

Recognition method and recognition device

Info

Publication number
WO2016072610A1
WO2016072610A1 (PCT/KR2015/009648)
Authority
WO
WIPO (PCT)
Prior art keywords
terminal device
terminal
content
recognition
table device
Prior art date
Application number
PCT/KR2015/009648
Other languages
English (en)
Korean (ko)
Inventor
김태일
최재필
Original Assignee
(주)라온스퀘어
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)라온스퀘어
Priority to CN201580000684.8A (publication CN105900043A)
Publication of WO2016072610A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/03547 Touch pads, in which fingers can move on a surface
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Definitions

  • The present invention relates to an object recognition method and an object recognition apparatus, and more particularly, to a method and apparatus for recognizing a terminal device placed on a table device.
  • Such a table not only accepts touch input from an object but also identifies the object and displays content on the table accordingly, enabling more effective provision of information to the user.
  • However, an existing table merely recognizes the object and does not consider the state of the object, for example whether the object is moving or in which direction, which limits what it can do.
  • In the prior art, a multimedia table has been provided that includes: a touch screen monitor that generates a touch signal corresponding to an external touch and outputs corresponding image information; an integrated control device that receives the generated touch signal, outputs corresponding image information to the touch screen monitor, and controls an external input/output device in response to the touch signal; a table top plate that accommodates the touch screen monitor; and a table body with a support frame that houses the integrated control device and supports the table top plate.
  • The background art described above is technical information that the inventors possessed for, or acquired in the process of, deriving the present invention, and is not necessarily a publicly known technique disclosed to the general public before the filing of the present application.
  • One embodiment of the present invention is to provide a method and apparatus for recognizing a terminal apparatus for a table apparatus.
  • According to one aspect of the present invention, a recognition device configured to communicate with a table device that detects contact with an arbitrary object may include: a state updater configured to receive motion information of a terminal device from the terminal device paired with the table device; and a terminal recognition unit configured, when contact with the table device is detected, to determine whether the terminal device has contacted the table device based on the motion information of the terminal device.
  • According to another aspect, a terminal device configured to transmit motion information to a recognition device may include: a sensing unit configured to generate motion information of the terminal device based on at least one of a tilt sensor that detects the tilt of the terminal device, a compass sensor that detects the orientation of the terminal device, and an acceleration sensor that detects the acceleration of the terminal device; and a state transmitter configured to transmit the motion information to the recognition device.
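The sensing unit in this aspect can be sketched as follows. This is a minimal illustration, not the patented implementation; the `MotionInfo` record and the sensor-reading callbacks are hypothetical names standing in for the tilt, compass, and acceleration sensors.

```python
from dataclasses import dataclass
import time

@dataclass
class MotionInfo:
    tilt_deg: float      # tilt detected by the tilt sensor
    heading_deg: float   # orientation detected by the compass sensor
    accel_ms2: float     # acceleration magnitude from the accelerometer
    timestamp: float

def make_motion_info(read_tilt, read_heading, read_acceleration):
    """Combine the three sensor readings into one motion-information record."""
    return MotionInfo(
        tilt_deg=read_tilt(),
        heading_deg=read_heading(),
        accel_ms2=read_acceleration(),
        timestamp=time.time(),
    )

# Example with stub callbacks standing in for real hardware sensors:
info = make_motion_info(lambda: 2.5, lambda: 180.0, lambda: 0.01)
```

Any subset of the three sensors could be used, matching the "at least one of" wording above.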
  • According to another aspect, a table device configured to communicate with the recognition device may be provided.
  • According to another aspect, a recognition method performed by a recognition device configured to communicate with a table device that senses contact with an arbitrary object may include: receiving motion information of a terminal device from the terminal device paired with the table device; upon detecting contact on the table device, determining whether the terminal device has contacted the table device based on the motion information of the terminal device; and determining, as a result of the determination, that the terminal device has contacted the table device.
  • According to another aspect, a computer-readable recording medium on which a program for performing the method is recorded may be provided.
  • According to another aspect, there may be a computer program stored in a medium and coupled to a computing device for performing the steps of: receiving motion information of a terminal device from the terminal device paired with a table device that detects contact with an arbitrary object; upon detecting contact on the table device, determining whether the terminal device has contacted the table device based on the motion information of the terminal device; and determining, as a result of the determination, that the terminal device has contacted the table device.
  • an embodiment of the present invention can propose a method and apparatus for recognizing a terminal device for a table device.
  • the terminal device can be recognized when the terminal device is placed on the table device by utilizing the existing table device as it is.
  • the information corresponding to the terminal device is displayed through the table device, thereby enabling efficient information transfer.
  • In addition, content can be displayed differently, or the displayed content can be changed, according to the movement of the terminal device or the direction in which the terminal device is recognized to have been placed on the table device, enabling efficient information delivery.
  • FIGS. 1 and 2 are schematic configuration diagrams of a system including a recognition device according to an embodiment of the present invention.
  • FIG. 3 is a block diagram illustrating a terminal device communicating with a recognition device according to an embodiment of the present invention.
  • FIG. 4 is a block diagram illustrating a recognition device according to an embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a recognition method according to an embodiment of the present invention.
  • FIGS. 6 to 10 are exemplary views for explaining a recognition method according to an embodiment of the present invention.
  • FIGS. 1 and 2 are schematic configuration diagrams illustrating a system including a recognition device 10 according to an embodiment of the present invention.
  • FIG. 3 is a block diagram showing the terminal device 20 that communicates with the recognition device 10 according to an embodiment of the present invention.
  • the recognition device 10 may communicate with the terminal device 20.
  • the recognition device 10 may be connected to the terminal device 20 by short-range wireless communication, for example, WiFi, but is not limited thereto.
  • communication between the recognition device 10 and the terminal device 20 may be performed according to any short-range communication method such as Bluetooth, Zigbee, infrared light, or the like.
  • the recognition device 10 may perform bidirectional communication with the terminal device 20, or may perform unidirectional communication for unilaterally receiving data from the terminal device 20.
  • The recognition device 10 communicates with the table device 30, which senses contact with an arbitrary object, and determines whether the terminal device 20 has contacted the table device.
  • Such a recognition device 10 may communicate with the table device 30 while located outside the table device 30, may be included in the table device 30, or may itself include the table device 30. FIG. 2 shows, for convenience of description, the case where the table device 30 includes the recognition device 10.
  • The table device 30 may be a touch table capable of sensing one or more touches, and includes a panel capable of detecting contact on the touch table.
  • The panel of the table device 30 may employ resistive sensing technology, capacitive sensing technology, surface acoustic wave sensing technology, or optical imaging sensing technology, and may be provided with an infrared sensing technology such as infrared-LED cell imaging, FTIR (Frustrated Total Internal Reflection), DI (Diffused Illumination), LLP (Laser Light Plane Illumination), DSI (Diffused Surface Illumination), or LED-LP (LED Light Plane).
  • The table device 30 may include a display unit capable of displaying the content provided by the recognition device 10 according to a presentation method determined by the recognition device 10.
  • the terminal device 20 may be any object that can be in contact with the table device 30.
  • the terminal device 20 may have various shapes and may be formed of various materials.
  • the terminal device 20 includes a sensing unit 210 and a state transmitter 220.
  • The sensing unit 210 is configured to generate the motion information by detecting the movement of the terminal device 20.
  • 'Motion information' is general information indicating the amount of change when the terminal device 20 moves, for example by rotation or translation. It may include tilt information, that is, information on the tilt of the terminal device 20, and acceleration information, that is, information on dynamic forces such as the acceleration, vibration, and shock of the terminal device 20.
  • The sensing unit 210 may include a tilt sensor that detects the tilt of the terminal device 20, and may use it to include the tilt of the terminal device 20 in the motion information.
  • The sensing unit 210 may include a compass sensor that detects the orientation of the terminal device 20, and may use it to include the orientation of the terminal device 20 in the motion information.
  • The sensing unit 210 may include an acceleration sensor that detects the acceleration of the terminal device 20, and may use it to include the acceleration of the terminal device 20 in the motion information.
  • the state transmitting unit 220 is configured to transmit the motion information of the terminal device 20 to the recognition device 10.
  • The state transmitter 220 may transmit the motion information of the terminal device 20 to the recognition device 10 periodically, or may transmit the updated motion information to the recognition device 10 whenever the motion information of the terminal device 20 is updated.
  • the state transmitter 220 may transmit the terminal identification information of the terminal device 20 together with the motion information of the terminal device 20.
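The on-update behavior of the state transmitter (sending the terminal identification information together with the motion information whenever it changes) can be sketched as below. The class and callback names are illustrative assumptions, not part of the disclosure.

```python
class StateTransmitter:
    """Sends motion information, tagged with the terminal ID, to the
    recognition device whenever the motion information changes."""

    def __init__(self, terminal_id, send):
        self.terminal_id = terminal_id  # identification info sent with each update
        self.send = send                # transport callback (e.g. a WiFi/BLE write)
        self._last = None

    def on_motion_update(self, motion):
        # Transmit only when the motion information actually changed.
        if motion != self._last:
            self._last = motion
            self.send({"terminal_id": self.terminal_id, "motion": motion})

# Simulated transport that just collects outgoing messages:
sent = []
tx = StateTransmitter("T-01", sent.append)
tx.on_motion_update({"tilt": 1.0})
tx.on_motion_update({"tilt": 1.0})  # unchanged: not re-sent
tx.on_motion_update({"tilt": 2.0})
```

A purely periodic variant would instead call `send` on a timer regardless of change; the patent allows either behavior.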
  • the terminal device 20 may further include a first pairing unit 230.
  • the first pairing unit 230 is a module for pairing with the recognition device 10 for communication with the recognition device 10.
  • The terminal device 20 can be paired with the recognition device 10 in various ways.
  • The first pairing unit 230 transmits a pairing request to the recognition device 10 when it determines that the recognition device 10 is nearby and there is no history of pairing with it, and may continue to request pairing until a response to the request is received.
  • The first pairing unit 230 may register terminal identification information allocated by the recognition device 10 as a result of pairing, and may delete the terminal identification information when the pairing ends.
  • the first pairing unit 230 may transmit a predetermined area (for example, a contact area when the terminal device is placed on the table device) to the recognition device.
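The pairing behavior just described (keep requesting until a response arrives, register the allocated terminal identification information, delete it when pairing ends) can be sketched as follows; the retry limit and function names are illustrative assumptions.

```python
class FirstPairingUnit:
    """Pairs the terminal with a nearby recognition device: keeps requesting
    until a response arrives, registers the allocated terminal ID, and
    deletes the ID when pairing ends."""

    def __init__(self, request_pairing, max_attempts=5):
        self.request_pairing = request_pairing  # returns allocated ID, or None
        self.max_attempts = max_attempts
        self.terminal_id = None

    def pair(self):
        for _ in range(self.max_attempts):
            response = self.request_pairing()
            if response is not None:
                self.terminal_id = response  # register the allocated terminal ID
                return True
        return False

    def unpair(self):
        self.terminal_id = None  # delete the ID when pairing ends

# Simulated link that answers on the third request:
answers = iter([None, None, "T-42"])
unit = FirstPairingUnit(lambda: next(answers))
assert unit.pair() and unit.terminal_id == "T-42"
```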
  • FIG. 4 is a block diagram showing a recognition device 10 according to an embodiment of the present invention.
  • the recognition apparatus 10 may include a second pairing unit 110, a state updating unit 120, a terminal identification unit 130, and a content providing unit 140.
  • The recognition apparatus 10 may include a communication unit (not shown) that enables communication among the internal components, that is, the second pairing unit 110, the state updating unit 120, the terminal identification unit 130, and the content providing unit 140, and also with external components.
  • The recognition apparatus 10 may include a storage unit (not shown) for storing data (for example, content) used in performing the recognition method according to an embodiment of the present invention, or may communicate with an externally located storage device (not shown), for example a database.
  • the second pairing unit 110 is a module for pairing with the terminal device 20 for communication with the terminal device 20.
  • The second pairing unit 110 may pair with the terminal device 20 by responding to a pairing request from the first pairing unit 230 of the terminal device 20.
  • the second pairing unit 110 may be paired with the terminal device 20 by receiving and registering the terminal identification information of the terminal device 20.
  • The second pairing unit 110 may also pair with the terminal device 20 by allocating and registering terminal identification information corresponding to the motion information.
  • the state update unit 120 may receive the motion information of the terminal device from the terminal device 20.
  • The state updater 120 may update the motion information of the terminal device each time it is received, or may accumulate the motion information of the terminal device for a predetermined period or a predetermined number of times.
  • The state updating unit 120 may receive motion information of the terminal device before the terminal device contacts the table device; while the terminal device is in contact with the table device (for example, while a touch point corresponding to the terminal device exists); and even after the contact is detected to have ended (for example, when, after a touch point corresponding to the terminal device has been generated, contact at that touch point is no longer detected for a period of time).
  • the terminal identification unit 130 may determine whether the contact is by the terminal device.
  • The terminal identification unit 130 may determine whether the terminal device has contacted the table device based on the motion information of the terminal device. For example, when the motion information of the terminal device 20 is not updated within a predetermined time from the point at which the touch on the table device 30 is detected (that is, when the terminal device is determined not to be moving), the terminal identification unit 130 may determine that the object in contact with the table device 30 is the terminal device 20.
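This timing-based check might look like the following sketch, where the quiet window is an assumed threshold: if the terminal's motion information has not been updated for that long at the moment a touch is detected, the terminal is judged to be at rest on the table.

```python
def is_terminal_contact(touch_time, last_motion_update_time, quiet_window=0.5):
    """Heuristic from the text: if the terminal's motion information stopped
    updating before a touch was detected (the device is at rest), treat the
    touching object as the terminal device.

    quiet_window is an assumed threshold in seconds, not a value from the patent.
    """
    return (touch_time - last_motion_update_time) >= quiet_window

# Touch at t=10.0 while the last motion update was at t=9.2: device is still.
assert is_terminal_contact(10.0, 9.2)
# Motion was still updating at t=9.9 (device moving): not judged as the terminal.
assert not is_terminal_contact(10.0, 9.9)
```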
  • The terminal identification unit 130 may also determine whether the terminal device has contacted the table device by comparing the area of contact on the table device with a predetermined area of the terminal device. For example, when contact on the table device 30 is detected, the terminal identification unit 130 calculates the contacted area based on the coordinates of the points where contact was detected; if that area is equal to the preset area of the terminal device, or within a predetermined error range of it, the terminal identification unit 130 may determine that the object in contact with the table device 30 is the terminal device 20.
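A minimal sketch of the area comparison, assuming the contacted area is approximated by the bounding box of the detected contact coordinates (the patent does not specify how the area is computed from the coordinates):

```python
def matches_terminal_area(contact_points, expected_area, tolerance=0.1):
    """Compute the contacted area from the detected contact coordinates
    (axis-aligned bounding box, an assumption) and compare it with the
    terminal's preset area within a relative error range."""
    xs = [p[0] for p in contact_points]
    ys = [p[1] for p in contact_points]
    area = (max(xs) - min(xs)) * (max(ys) - min(ys))
    return abs(area - expected_area) <= tolerance * expected_area

# Four corner touches of a 4x3 footprint against a preset area of 12:
assert matches_terminal_area([(0, 0), (4, 0), (0, 3), (4, 3)], 12.0)
# A much smaller contact patch is rejected:
assert not matches_terminal_area([(0, 0), (1, 0), (0, 1), (1, 1)], 12.0)
```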
  • The terminal identification unit 130 may set the touch point to match the terminal device; thus, for example, while the terminal device is in contact with the table device, the touch point may move along the movement path of the terminal device on the table device.
  • The 'touch point' may be all coordinates corresponding to the area the terminal device contacts, a predetermined number of coordinates corresponding to that area, or the coordinates of a selected point on the table device.
  • When the content providing unit 140 identifies the terminal device in contact with the table device, it provides content so that the content is displayed through the table device.
  • 'content' refers to various information that can be displayed through the display unit of the table device 30 and may be, for example, text, an image, a video, a hologram, and a sound.
  • The content providing unit 140 may provide content corresponding to the terminal device, or may provide content corresponding to the terminal device based on its motion information (the last motion information received immediately before the terminal device contacted the table device, or the last motion information received after the contact), so that the content is displayed through the display unit of the table device.
  • The content providing unit 140 may provide display information specifying how and at which point the content is to be displayed through the table device, so that the content can be displayed accordingly.
  • the display information may be setting information for displaying content radially based on the touch point of the terminal device.
  • Such display information may be preset by the recognition device, or may be determined according to the content, the type of content, or the motion information of the terminal device (the last motion information received immediately before the terminal device contacted the table device, or the last motion information received after the contact).
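One plausible reading of "display content radially based on the touch point" is to place content items evenly on a circle around that point; the sketch below is purely illustrative.

```python
import math

def radial_positions(touch_point, n_items, radius):
    """Place n content items evenly on a circle of the given radius around
    the terminal's touch point. Radius and item count are assumptions."""
    cx, cy = touch_point
    return [
        (cx + radius * math.cos(2 * math.pi * k / n_items),
         cy + radius * math.sin(2 * math.pi * k / n_items))
        for k in range(n_items)
    ]

positions = radial_positions((100, 100), 4, 50)
# Four items land at 0, 90, 180, and 270 degrees around (100, 100).
```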
  • When a plurality of terminal devices contact the table device, the content providing unit 140 may provide at least one of first content corresponding to each of the plurality of terminal devices and second content corresponding to the group of terminal devices, so that it can be displayed through the table device.
  • The content providing unit 140 may also provide display information for each of the first content and the second content. For example, as display information, the content providing unit 140 may have the first content corresponding to each terminal device displayed around its touch point, while the second content is displayed at a point where the first content of the respective terminal devices appears together.
  • the content providing unit 140 may control the content to no longer be displayed through the touch table.
  • FIG. 5 is a flowchart illustrating a recognition method according to an embodiment of the present invention.
  • the recognition method according to the embodiment shown in FIG. 5 includes steps processed in time series in the recognition device 10 shown in FIGS. 1 to 4. Therefore, although omitted below, the above descriptions of the recognition apparatus 10 shown in FIGS. 1 to 4 may be applied to the recognition method according to the embodiment shown in FIG. 5.
  • FIGS. 6 to 10 are exemplary views for explaining a recognition method according to an embodiment of the present invention.
  • the recognition device 10 may be paired with the terminal device 20.
  • If pairing with the terminal device 20 has already been completed, it is not necessary to go through a separate pairing process.
  • the recognition device 10 paired with the terminal device 20 as described above may receive motion information from the terminal device 20 (S510).
  • Before the terminal device 20 is placed on the touch table 30, the terminal device 20 may move as shown in FIGS. 6(a) and 6(b).
  • the recognition device 10 may receive the updated movement information.
  • the recognition device 10 may receive motion information from the terminal device 20.
  • the recognition device 10 may detect a contact on the table device 30 (S520).
  • the recognition device 10 may determine whether the object in contact with the table device is the terminal device 20 (S530).
  • The recognition device 10 may determine whether the terminal device has contacted the table device by comparing the contact area with a predetermined area of the terminal device. For example, it may determine that the object placed on the table device 30 is the terminal device only when the contact area of that object is within a predetermined error range of the preset area of the terminal device.
  • The recognition device 10 may then provide content so that the content is displayed through the display unit of the table device (S550).
  • the recognition device 10 may provide content corresponding to the terminal device 20, and the content may be provided through at least a partial area on the table device.
  • As shown in FIG. 7(a), the content 710 may be provided at an arbitrary position based on the touch point of the terminal device 20, or, as shown in FIG. 7(b), the content 720 may be provided radially around the touch point of the terminal device.
  • The recognition device 10 may provide content corresponding to the terminal device based on the motion information of the terminal device, to be displayed on the table device. That is, as shown in FIGS. 7(a) and 7(b), when the tilt and orientation of the terminal device 20 differ, the recognition device 10 may provide different content 710 and 720 to be displayed on the table device 30.
  • When a plurality of terminal devices contact the table device, the recognition device 10 may provide at least one of first content corresponding to each of the plurality of terminal devices and second content corresponding to the group of terminal devices, which can be displayed on the table device.
  • For example, when each terminal device corresponds to a consonant or vowel of Hangul, and it is determined that each terminal device has contacted the table device, the consonant or vowel corresponding to each terminal device may be provided as the first content.
  • Alternatively, when a plurality of terminal devices are in contact with the table device, the initial, medial, and final sounds may be determined in the order in which the terminal devices contacted the table device, and one letter corresponding to the group of terminal devices may be provided as the second content.
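The Hangul example can be made concrete with standard Unicode syllable composition: the first, second, and third terminals to touch supply the initial, medial, and (optionally) final sound, and the group's second content is the composed letter. The jamo tables follow Unicode ordering; treating contact order as composition order is the assumption taken from the text.

```python
CHOSEONG = "ㄱㄲㄴㄷㄸㄹㅁㅂㅃㅅㅆㅇㅈㅉㅊㅋㅌㅍㅎ"          # 19 initials
JUNGSEONG = "ㅏㅐㅑㅒㅓㅔㅕㅖㅗㅘㅙㅚㅛㅜㅝㅞㅟㅠㅡㅢㅣ"      # 21 medials
JONGSEONG = [""] + list("ㄱㄲㄳㄴㄵㄶㄷㄹㄺㄻㄼㄽㄾㄿㅀㅁㅂㅄㅅㅆㅇㅈㅊㅋㅌㅍㅎ")  # 28 finals incl. none

def compose_syllable(initial, medial, final=""):
    """Unicode Hangul composition: contact order gives the initial, medial,
    and optional final sound, yielding one letter as the second content."""
    code = (0xAC00
            + (CHOSEONG.index(initial) * 21 + JUNGSEONG.index(medial)) * 28
            + JONGSEONG.index(final))
    return chr(code)

# Terminals ㅎ, ㅏ, ㄴ placed on the table in that order compose the letter 한:
assert compose_syllable("ㅎ", "ㅏ", "ㄴ") == "한"
```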
  • As another example, when each terminal device corresponds to a different color, the color corresponding to each terminal device may be provided as the first content; when a plurality of terminal devices contact the table device, a color corresponding to the group of terminal devices may be provided as the second content, which may be, for example, a color generated by mixing the first contents of the terminal devices included in the group.
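A simple model for the color-mixing second content, assuming each terminal's first content is an RGB color and the group color is the per-channel average (the patent does not specify a mixing model):

```python
def mix_colors(rgb_colors):
    """Second content for a group of color terminals: average the RGB channels
    of the terminals' first contents (simple average mixing, an assumption)."""
    n = len(rgb_colors)
    return tuple(sum(c[i] for c in rgb_colors) // n for i in range(3))

# A red terminal and a blue terminal placed together yield a purple tone:
assert mix_colors([(255, 0, 0), (0, 0, 255)]) == (127, 0, 127)
```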
  • Likewise, when each terminal device corresponds to an English letter and it is determined that each terminal device has contacted the table device, the letter corresponding to each terminal device may be provided as the first content; when it is determined that a plurality of terminal devices have contacted the table device, one word corresponding to the group of terminal devices may be provided as the second content, which may be, for example, an English word formed by combining the first contents of the terminal devices included in the group.
  • Similarly, when each terminal device corresponds to a number or an operator, the number or operator corresponding to each terminal device may be provided as the first content, or content corresponding to the group of terminal devices may be provided as the second content, which may be, for example, a formula formed by combining the first contents of the terminal devices included in the group, or the calculation result of that formula.
  • As shown in FIG. 8, the first content 810 corresponding to the terminal device 20 and the first content 820 corresponding to the terminal device 20′ may each be displayed on the table device.
  • For example, if the content corresponding to the terminal device 20 is '3', the content corresponding to the terminal device 20′ is '4', and the content corresponding to the terminal device 20″ is '+', the second content corresponding to the group of terminal devices 20, 20′, and 20″ may be the result of the formula formed by combining the first contents of the terminal devices included in the group, and the second content 910 and 920 may be displayed through the table device 30.
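The '3 + 4' walkthrough can be sketched as below: the first contents are joined in contact order into a formula, and the second content is the formula together with its result. The token whitelist and the use of `eval` are illustrative shortcuts, not the disclosed method.

```python
def evaluate_group(tokens):
    """Second content for number/operator terminals: join the first contents
    in contact order into a formula and compute its result. A real system
    would parse and validate properly; eval is used here only to illustrate."""
    formula = " ".join(tokens)
    allowed = set("0123456789+-*/. ")
    if not set(formula) <= allowed:
        raise ValueError("unexpected token")
    return formula, eval(formula)

# Terminals '3', '+', '4' placed on the table in that order:
formula, result = evaluate_group(["3", "+", "4"])
assert formula == "3 + 4" and result == 7
```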
  • The recognition device 10 may provide at least one of first content corresponding to each of the plurality of terminal devices and second content corresponding to the group of terminal devices, together with display information specifying how it is to be displayed, so that it is displayed on the table device.
  • The first content 1010 and 1020 corresponding to each of the plurality of terminal devices 20 and 20′ may be displayed, and the second content 1030 corresponding to the group of terminal devices 20 and 20′ may also be displayed on the table device 30. The first content 1010 and 1020 and the second content 1030 may be displayed on the table device 30 according to display information specifying where the second content 1030 is to be displayed; for example, the second content 1030 may be displayed at the point where the first content 1010 and 1020 intersect.
  • the recognition method according to the embodiment described with reference to FIG. 5 may also be implemented in the form of a recording medium including instructions executable by a computer, such as a program module executed by the computer.
  • Computer readable media can be any available media that can be accessed by a computer and includes both volatile and nonvolatile media, removable and non-removable media.
  • computer readable media may include both computer storage media and communication media.
  • Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Communication media typically includes computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, or other transmission mechanism, and includes any information delivery media.
  • the recognition method according to an embodiment of the present invention may be implemented as a computer program (or computer program product) including instructions executable by a computer.
  • the computer program includes programmable machine instructions processed by the processor and may be implemented in a high-level programming language, an object-oriented programming language, an assembly language, or a machine language.
  • the computer program may also be recorded on tangible computer readable media (eg, memory, hard disks, magnetic / optical media or solid-state drives, etc.).
  • the recognition method may be implemented by executing the computer program as described above by the computing device.
  • the computing device may include at least a portion of a processor, a memory, a storage device, a high speed interface connected to the memory and a high speed expansion port, and a low speed interface connected to the low speed bus and the storage device.
  • Each of these components is connected to the others using various buses, and may be mounted on a common motherboard or in another suitable manner.
  • the processor may process instructions within the computing device, such as instructions stored in the memory or the storage device for displaying graphical information to provide a graphical user interface (GUI) on an external input/output device, such as a display connected to the high speed interface. In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories and memory types as appropriate.
  • the processor may also be implemented as a chipset consisting of chips comprising a plurality of independent analog and/or digital processors.
  • the memory also stores information within the computing device.
  • the memory may consist of a volatile memory unit or a collection thereof.
  • the memory may consist of a nonvolatile memory unit or a collection thereof.
  • the memory may also be other forms of computer readable media, such as, for example, magnetic or optical disks.
  • the storage device can provide a large amount of storage space to the computing device.
  • the storage device may be a computer readable medium, or a configuration including such a medium, and may include, for example, devices or other configurations within a storage area network (SAN), a floppy disk device, a hard disk device, an optical disk device, a tape device, a flash memory, or another similar semiconductor memory device or device array.
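The bullets above state that the recognition method may be distributed as a computer program executed by such a computing device. The application does not reproduce the program itself; the following Python sketch is purely illustrative, and the spike-threshold heuristic, the 15 m/s² constant, and all names are assumptions rather than the claimed method:

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    """One motion-information report from the terminal device (hypothetical shape)."""
    timestamp_ms: int
    accel_magnitude: float  # |acceleration| in m/s^2, gravity removed

# Assumed heuristic: a sharp acceleration spike suggests the terminal
# just struck a surface such as the table device.
CONTACT_SPIKE_THRESHOLD = 15.0

def looks_like_contact(samples: list[MotionSample]) -> bool:
    """Return True if the recent motion samples contain a spike consistent
    with the terminal device striking a surface."""
    return any(s.accel_magnitude >= CONTACT_SPIKE_THRESHOLD for s in samples)

quiet = [MotionSample(0, 0.2), MotionSample(20, 0.3)]
tap = quiet + [MotionSample(40, 22.5)]
print(looks_like_contact(quiet))  # False: no spike in the samples
print(looks_like_contact(tap))    # True: 22.5 exceeds the threshold
```

Packaged this way, the function is ordinary machine-executable program logic of the kind the bullets describe: it can be stored on the media above and run by the processor from memory.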

Abstract

The present invention relates to a method and a device for recognizing a terminal device with respect to a table device. According to a first aspect of the present invention, a recognition device configured to communicate with a table device that detects contact with an arbitrary object comprises: a status update unit configured to receive, from a terminal device paired with the table device, motion information of the terminal device; and a terminal recognition unit configured to determine whether the terminal device is in contact with the table device on the basis of the motion information of the terminal device.
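The two units named in the abstract can be sketched as follows. This is a hypothetical illustration only: the 200 ms matching window, the method names, and the spike-based motion summary are assumptions, not the claimed design. The status update unit records motion information reported by the paired terminal device, and the terminal recognition unit decides whether a touch detected by the table device was made by that terminal:

```python
class RecognitionDevice:
    """Hypothetical sketch of the abstract's recognition device."""

    # Assumed window: a table touch and a terminal motion spike closer
    # together than this are treated as the same physical contact.
    MATCH_WINDOW_MS = 200

    def __init__(self):
        # Timestamps of motion spikes reported by the paired terminal device.
        self._spike_times_ms = []

    def update_state(self, timestamp_ms, is_spike):
        """Status update unit: receive motion information from the terminal."""
        if is_spike:
            self._spike_times_ms.append(timestamp_ms)

    def is_terminal_contact(self, touch_timestamp_ms):
        """Terminal recognition unit: decide whether a touch detected by the
        table device coincides with motion reported by the terminal."""
        return any(abs(touch_timestamp_ms - t) <= self.MATCH_WINDOW_MS
                   for t in self._spike_times_ms)

device = RecognitionDevice()
device.update_state(1000, is_spike=True)   # terminal reported an impact
print(device.is_terminal_contact(1100))    # True: within 200 ms of the spike
print(device.is_terminal_contact(5000))    # False: no matching motion report
```

In practice the motion information could be raw sensor traces rather than pre-detected spikes; time-window correlation is only one plausible way to implement the determination.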
PCT/KR2015/009648 2014-11-03 2015-09-15 Procédé de reconnaissance et dispositif de reconnaissance WO2016072610A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201580000684.8A CN105900043A (zh) 2014-11-03 2015-09-15 识别方法及识别装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140151147A KR101533603B1 (ko) 2014-11-03 2014-11-03 인식방법 및 인식장치
KR10-2014-0151147 2014-11-03

Publications (1)

Publication Number Publication Date
WO2016072610A1 true WO2016072610A1 (fr) 2016-05-12

Family

ID=53789129

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/009648 WO2016072610A1 (fr) 2014-11-03 2015-09-15 Procédé de reconnaissance et dispositif de reconnaissance

Country Status (3)

Country Link
KR (1) KR101533603B1 (fr)
CN (1) CN105900043A (fr)
WO (1) WO2016072610A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102238697B1 (ko) * 2015-11-09 2021-04-09 에스케이텔레콤 주식회사 테이블탑 인터페이스 장치, 멀티 터치 객체 및 방법
KR101643968B1 (ko) * 2015-12-21 2016-08-01 (주)라온스퀘어 사물카드가 삽입되는 슬롯을 구비한 슬롯장치를 이용한 사물정보 제공 방법 및 시스템
KR101895022B1 (ko) * 2016-06-21 2018-09-10 한양대학교 에리카산학협력단 상판 디스플레이를 갖는 인트랙티브 테이블의 구동 방법
KR102226719B1 (ko) * 2018-12-17 2021-03-12 울산과학기술원 음악 재생기

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040020262A (ko) * 2002-08-30 2004-03-09 윤용상 펜 타입의 다기능 마우스 입력장치
KR20110063649A (ko) * 2008-09-24 2011-06-13 마이크로소프트 코포레이션 개체 감지 및 사용자 설정
WO2012053786A2 (fr) * 2010-10-20 2012-04-26 주식회사 애트랩 Appareil d'affichage et procédé de défilement destiné à l'appareil
KR101212364B1 (ko) * 2012-03-06 2012-12-13 한양대학교 산학협력단 단말기 연동 및 제어 시스템 및 이에 사용되는 사용자 단말기

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008092294A (ja) * 2006-10-02 2008-04-17 Ntt Docomo Inc 移動通信ネットワークシステム及び移動体端末装置のロック方法
US20090128513A1 (en) * 2007-11-20 2009-05-21 Samsung Electronics Co., Ltd Device identification method and apparatus, device information provision method and apparatus, and computer-readable recording mediums having recorded thereon programs for executing the device identification method and the device information provision method
JP2010157189A (ja) * 2009-01-05 2010-07-15 Sony Corp 情報処理装置、情報処理方法およびプログラム
KR101999119B1 (ko) * 2012-07-11 2019-07-12 삼성전자 주식회사 펜 입력 장치를 이용하는 입력 방법 및 그 단말

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KONG, YOUNG SIK ET AL.: "Development of the FishBowl Game Employing a Tabletop Tiled Display Coupling With Mobile Interfaces", JOURNAL OF KOREA GAME SOCIETY, vol. 10, no. 2, 30 April 2010 (2010-04-30), pages 57 - 65, XP055279123 *

Also Published As

Publication number Publication date
CN105900043A (zh) 2016-08-24
KR101533603B1 (ko) 2015-07-06

Similar Documents

Publication Publication Date Title
WO2015030303A1 (fr) Dispositif portatif affichant une image de réalité augmentée et son procédé de commande
WO2015122559A1 (fr) Dispositif d'affichage et procédé de commande associé
WO2016056703A1 (fr) Dispositif portable et son procédé de commande
WO2018026202A1 (fr) Dispositif de détection tactile pour déterminer des informations relatives à un stylet, procédé de commande associé et stylet
WO2016190634A1 (fr) Appareil de reconnaissance tactile et son procédé de commande
WO2014084633A1 (fr) Procédé d'affichage d'applications et dispositif électronique associé
WO2016060291A1 (fr) Dispositif portatif et procédé de commande associé
WO2016072610A1 (fr) Procédé de reconnaissance et dispositif de reconnaissance
WO2014123289A1 (fr) Dispositif numérique de reconnaissance d'un toucher sur deux côtés et son procédé de commande
WO2015190647A1 (fr) Dispositif d'affichage exécutant une fonction de flexion et son procédé de commande
WO2015174597A1 (fr) Dispositif d'affichage d'image à commande vocale et procédé de commande vocale pour dispositif d'affichage d'image
WO2015194709A1 (fr) Dispositif d'affichage portable et procédé de commande associé
EP3028447A1 (fr) Procédé et appareil de construction d'affichage multiécran
EP2740103A1 (fr) Appareil de reconnaissance de voie de trafic et procédé associé
WO2014148689A1 (fr) Dispositif d'affichage capturant du contenu numérique et son procédé de commande
WO2020017890A1 (fr) Système et procédé d'association 3d d'objets détectés
WO2016035940A1 (fr) Dispositif d'affichage et procédé de commande associé
WO2016080557A1 (fr) Dispositif pouvant être porté et son procédé de commande
CN107783669A (zh) 光标产生系统、方法及计算机程序产品
WO2013022159A1 (fr) Appareil de reconnaissance de voie de circulation et procédé associé
WO2014133258A1 (fr) Appareil de saisie avec un stylet et procédé de fonctionnement associé
WO2018117518A1 (fr) Appareil d'affichage et son procédé de commande
EP3186956A1 (fr) Dispositif d'affichage et procédé de commande pour celui-ci
WO2015064827A1 (fr) Étiquette tactile reconnaissable via un panneau tactile capacitif, procédé de reconnaissance d'informations s'y rapportant et procédé de fourniture d'informations l'utilisant
WO2014061859A1 (fr) Procédé capable d'analyser et d'afficher des commentaires, et appareil et système à cet effet

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15856560

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15856560

Country of ref document: EP

Kind code of ref document: A1