WO2015182967A1 - Matching system and method - Google Patents

Matching system and method

Info

Publication number
WO2015182967A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
information
detector
motion
user
Prior art date
Application number
PCT/KR2015/005263
Other languages
English (en)
Korean (ko)
Inventor
김석중
유영준
Original Assignee
(주)브이터치
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)브이터치
Publication of WO2015182967A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/025 Services making use of location information using location based information parameters
    • H04W 4/027 Services making use of location information using location based information parameters using movement velocity, acceleration information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 88/00 Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W 88/18 Service support devices; Network management devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0241 Advertisements
    • G06Q 30/0251 Targeted advertisements
    • G06Q 30/0261 Targeted advertisements based on user location
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0241 Advertisements
    • G06Q 30/0251 Targeted advertisements
    • G06Q 30/0269 Targeted advertisements based on user profile or attribute
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/60 Context-dependent security
    • H04W 12/61 Time-dependent
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/60 Context-dependent security
    • H04W 12/63 Location-dependent; Proximity-dependent
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/60 Context-dependent security
    • H04W 12/68 Gesture-dependent or behaviour-dependent
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/029 Location-based management or tracking services

Definitions

  • The present invention relates to a matching system and a matching method that specify the terminal in which a movement pattern is detected by comparing the movement pattern, position, and time of a user's body (mainly a hand, a finger, or the like) or of a terminal carried by the user, obtained through image capture, with the movement pattern, position, and time of the terminal measured by a device such as a motion sensor built into the terminal.
  • Wireless communication terminals are expanding to mobile computing devices such as smartphones, mobile phones, netbooks, tablet PCs, and e-book readers. On this basis, new service and application markets are being created, and numerous services and applications are being provided.
  • IoT (Internet of Things)
  • An object of the present invention is to provide a matching system and a matching method for specifying a terminal from which a movement pattern is detected.
  • A matching system includes a matching server and a terminal that can be connected to the matching server through a wireless communication network.
  • In one example, the matching server may include an image acquisition unit configured to acquire an image of the user's body or of a terminal carried by the user through image capturing; a position detector configured to calculate coordinate data indicating the position of the user's body or the carried terminal using the image acquired by the image acquisition unit; a motion detector configured to detect the user's movement pattern based on the coordinate data calculated by the position detector; and a viewpoint detector configured to detect the time at which that movement pattern is recognized.
  • The terminal may include a terminal position detector for detecting coordinate data indicating the current position of the terminal based on GPS information, IPS information, or Wi-Fi network or carrier base station location information; a terminal motion detector for detecting the user's movement pattern measured through at least one of an acceleration sensor, a gyroscope sensor, and a gravity sensor built into the terminal; and a terminal viewpoint detector for detecting the time at which the movement pattern detected by the terminal motion detector is recognized.
  • The matching server may further include a matching processor that specifies the terminal from which the movement pattern is detected by comparing the movement pattern, position, and time of the terminal received from the terminal position detector, terminal motion detector, and terminal viewpoint detector with the movement pattern, position, and time of the user's body or carried terminal measured by the matching server.
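For concreteness, the triple of quantities compared throughout this document (movement pattern, position, time) can be pictured as one small record per observation. The sketch below is an illustrative Python rendering only; the name MotionRecord and the field types are assumptions, not taken from the patent.

    from dataclasses import dataclass

    @dataclass
    class MotionRecord:
        # One observation of a moving object: the three quantities the
        # matching processor compares (hypothetical name, not from the patent).
        pattern: str        # a preset movement pattern, e.g. "shake"
        position: tuple     # (x, y) coordinate data of the object
        timestamp: float    # time the pattern was recognized, in epoch seconds

Both sides of the system produce such records: the matching server from images, the terminal from its built-in sensors; matching then reduces to comparing records.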
  • In another example, a matching system includes a matching server and a terminal that can be connected to the matching server through a wireless communication network, wherein the terminal includes a terminal position detector for detecting coordinate data indicating the current position of the terminal based on GPS information, IPS information, or Wi-Fi network or carrier base station location information; a terminal motion detector for detecting the user's movement pattern measured through at least one of an acceleration sensor, a gyroscope sensor, and a gravity sensor included in the terminal; a terminal viewpoint detector for detecting the time at which the movement pattern detected by the terminal motion detector is recognized; and a terminal matching processor for specifying the terminal from which the movement pattern is detected by comparing the movement pattern, position, and time of the terminal received from the terminal position detector, terminal motion detector, and terminal viewpoint detector with the movement pattern, position, and time of the user's body or carried terminal received from the matching server.
  • Here, the matching server includes an image acquisition unit for acquiring an image of the user's body or the terminal carried by the user through image capturing; a position detector that calculates coordinate data indicating the position of the user's body or the carried terminal using the image obtained from the image acquisition unit; a motion detector that detects the user's movement pattern based on the coordinate data calculated by the position detector; a viewpoint detector that detects the time at which the movement pattern detected by the motion detector is recognized; and a matching processor that specifies the terminal from which the movement pattern is detected by comparing the movement pattern, position, and time of the user's body received from the position detector, motion detector, and viewpoint detector with the movement pattern, position, and time of the terminal received from the terminal.
  • A matching method may include: generating, by a matching server, first motion information about at least one object included in a field of view (FOV); receiving, by the matching server, second motion information about the at least one object from terminals included in the FOV; and specifying, by the matching server, the object that moved among the at least one object based on the first motion information and the second motion information.
  • The matching method may further include calculating, by the matching server, respective coordinate data of the at least one object.
  • The first motion information may include the movement pattern information, position information, and movement time information of each of the at least one object, as measured by the matching server.
  • The second motion information may include the movement pattern information, position information, and movement time information of each of the at least one object, as measured by the terminal.
  • Another matching method may include: generating, by a matching server, first motion information about at least one object included in the FOV; determining whether the matching server directly specifies the moved object; when the matching server directly specifies the moved object, receiving, by the matching server, second motion information about the at least one object from terminals included in the FOV and specifying the object that moved among the at least one object based on the first motion information and the second motion information; and, when the matching server does not directly specify the moved object, transmitting the first motion information to the terminals.
  • This matching method may likewise further include calculating, by the matching server, respective coordinate data of the at least one object.
  • The first motion information may include the motion pattern information, location information, and motion time information of each of the at least one object, as measured by the matching server.
  • The second motion information may include the motion pattern information, location information, and motion time information of each of the at least one object, as measured by the terminal.
  • According to the present invention, object information can be provided immediately and conveniently to a portable terminal owned by the user, without the user's terminal information having to be stored in advance and without the user having to read or photograph an installed tag.
  • In addition, the place where object information is provided can be implemented anywhere, outdoors in places such as amusement parks, zoos, and botanical gardens as well as indoors in places such as museums, exhibition halls, and art galleries.
  • FIG. 2 is a block diagram showing an example of a matching server and a terminal forming a matching system.
  • FIG. 3 is a flowchart illustrating an operation of a matching server according to an example of a matching method.
  • FIG. 4 is a flowchart illustrating an operation of a matching server according to another example of a matching method.
  • FIG. 5 is a flowchart illustrating an operation of a terminal according to the matching method of FIG. 3.
  • FIG. 6 is a flowchart illustrating an operation of a terminal according to another example of a matching method.
  • FIG. 1 is a diagram illustrating an example of a matching system.
  • The matching system includes a matching server 100 and at least one terminal 200a, 200b, 200c.
  • FIG. 1 illustrates a case in which three terminals 200a, 200b, and 200c are located in the field of view (FOV) of the image acquisition unit 110 of the matching server 100.
  • However, the number of terminals located in the FOV is not limited to three.
  • The matching server 100 detects the movement of an object within the FOV and the time of that movement. By comparing this with the movement and movement time detected by each terminal, the moved object 200b can be specified. The user can therefore conveniently obtain object information remotely without having to approach the object, and object information can be delivered quickly and easily to the user's own portable terminal without a separate procedure for requesting it.
  • The matching system illustrated in FIG. 1 may be implemented anywhere a virtual touch device can be installed, outdoors in spaces such as amusement parks, zoos, and botanical gardens as well as indoors in spaces such as museums, exhibition halls, and art galleries.
  • FIG. 2 is a block diagram illustrating an example of a matching server and a terminal forming a matching system.
  • The matching system includes a matching server 100 and at least one terminal 200.
  • The matching server 100 and the terminal 200 are connected to each other through a communication network.
  • The matching server 100 includes an image acquisition unit 110, a position detector 120, a motion detector 130, a viewpoint detector 140, and a matching processor 150.
  • The terminal 200 includes a terminal position detector 210, a terminal motion detector 220, a terminal viewpoint detector 230, and a terminal matching processor 240.
  • First, the matching server 100 will be described in detail.
  • The image acquisition unit 110 is a kind of camera module, and acquires an image of the user's body or of a terminal carried by the user through image capturing.
  • The area photographed by the image acquisition unit will be referred to as the FOV.
  • The image acquisition unit 110 may include a camera with an imaging sensor such as a CCD or a CMOS.
  • The position detector 120 calculates coordinate data indicating the position of the user's body or the carried terminal by using the image acquired from the image acquisition unit 110.
  • The motion detector 130 detects the user's movement pattern based on the coordinate data calculated by the position detector 120. The detected movement pattern is a preset pattern, and a different function may be performed according to the shape of the pattern; a sketch of such a detection follows.
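As a rough illustration of how a preset pattern might be recognized from the coordinate data, the sketch below classifies a trajectory as a "shake" when the horizontal motion reverses direction often enough with sufficient amplitude. This is a minimal Python sketch; the function name, the thresholds, and the choice of a shake pattern are illustrative assumptions, not the patent's method.

    def detect_shake(trajectory, min_reversals=3, min_amplitude=20.0):
        # Classify a list of (x, y) positions as a "shake" when the horizontal
        # motion reverses direction at least `min_reversals` times and spans
        # at least `min_amplitude` units. All thresholds are illustrative.
        xs = [p[0] for p in trajectory]
        if len(xs) < 2 or max(xs) - min(xs) < min_amplitude:
            return False
        reversals, prev_dir = 0, 0
        for a, b in zip(xs, xs[1:]):
            d = (b > a) - (b < a)  # direction of this step: -1, 0, or +1
            if d != 0 and prev_dir != 0 and d != prev_dir:
                reversals += 1
            if d != 0:
                prev_dir = d
        return reversals >= min_reversals

For example, detect_shake([(0, 0), (30, 0), (0, 0), (30, 0), (0, 0)]) would report a shake under these thresholds, since the trajectory reverses direction three times over a 30-unit span.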
  • The viewpoint detector 140 detects the time at which the movement pattern detected by the motion detector 130 is recognized. Detecting, against the current time, the moment at which the movement pattern is recognized makes a more accurate time available for matching.
  • The matching processor 150 either specifies the terminal from which the movement pattern is detected, by comparing the movement pattern, position, and time of the user's body or carried terminal received from the position detector 120, the motion detector 130, and the viewpoint detector 140 with the movement pattern, position, and time transmitted from the terminal 200, or transmits the movement pattern, position, and time received from the position detector 120, the motion detector 130, and the viewpoint detector 140 to the terminal 200.
  • The matching processor 150 transfers the movement pattern, position, and time of the user's body to the terminal 200 when the terminal 200 is the subject that specifies the terminal from which the movement pattern is detected; when the matching server 100 is the specifying subject, the transmission to the terminal 200 is omitted.
  • The terminal position detector 210 detects coordinate data indicating the current location of the terminal based on GPS information, IPS information, or Wi-Fi network or carrier base station location information. The Wi-Fi network or carrier base station location may be derived from the Wi-Fi network or the carrier base station to which the terminal is currently connected.
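One plausible way to fold these sources together is a simple priority fallback. The sketch below is an assumption for illustration; the patent lists the sources but does not prescribe an order or an API.

    def current_position(gps_fix=None, ips_fix=None, wifi_fix=None, cell_fix=None):
        # Return the first available (x, y) fix in an assumed priority order:
        # GPS, then IPS, then the connected Wi-Fi network's location, then the
        # connected carrier base station's location.
        for fix in (gps_fix, ips_fix, wifi_fix, cell_fix):
            if fix is not None:
                return fix
        raise RuntimeError("no position source available")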
  • The terminal motion detector 220 detects the user's movement pattern measured through at least one of an acceleration sensor, a gyroscope sensor, and a gravity sensor included in the terminal. The movement of the terminal is measured when the user moves the terminal 200 in a preset pattern while holding it in a hand.
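On the terminal side, the same kind of decision might be made from raw accelerometer samples. A minimal sketch, assuming samples arrive as (t, ax, ay, az) tuples in m/s² and that the preset pattern is again a shake; the threshold, peak count, and function name are assumptions:

    import math

    def detect_shake_from_accel(samples, threshold=15.0, min_peaks=4):
        # `samples` is a list of (t, ax, ay, az) tuples. A shake is counted
        # when enough samples exceed `threshold` in acceleration magnitude;
        # the returned value doubles as the terminal viewpoint detector's
        # output: the time the pattern was recognized (None if no shake).
        peaks = [t for (t, ax, ay, az) in samples
                 if math.sqrt(ax * ax + ay * ay + az * az) > threshold]
        return peaks[-1] if len(peaks) >= min_peaks else None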
  • The terminal 200 includes not only a smartphone 200a but also wearable terminals such as a ring 200b, a watch 200c, or glasses. A wearable terminal spares the user the inconvenience of having to move while holding the terminal in a hand.
  • The terminal viewpoint detector 230 detects the time at which the movement pattern detected by the terminal motion detector 220 is recognized. Detecting, against the current time, the moment at which the movement pattern is recognized makes a more accurate time available for matching.
  • The terminal matching processor 240 either specifies the terminal from which the movement pattern is detected, by comparing the movement pattern, position, and time of the terminal 200 received from the terminal position detector 210, the terminal motion detector 220, and the terminal viewpoint detector 230 with the movement pattern, position, and time of the user's body or carried terminal received from the matching server 100, or transmits the movement pattern, position, and time of the terminal 200 received from those detectors to the matching server 100.
  • The terminal matching processor 240 transmits the movement pattern, position, and time of the terminal 200 to the matching server 100 when the matching server 100 is the subject that specifies the terminal from which the movement pattern is detected; when the terminal 200 is the specifying subject, the transmission to the matching server 100 is omitted.
  • FIG. 3 is a flowchart illustrating an operation of a matching server according to an example of a matching method.
  • In this example, the matching server is the subject that specifies the terminal from which the movement pattern is detected.
  • The matching server captures an image of the FOV (30). As described above, at least one object is included in the FOV, and the matching server calculates coordinate data of each object based on the captured image (31).
  • The matching server generates first motion information when movement occurs in the image of the FOV (32).
  • The first motion information includes the movement pattern information, position information, and movement time information of each object, determined based on the coordinate data of each object.
  • The matching server receives the second motion information from each terminal (33).
  • The second motion information includes the movement pattern information, position information, and movement time information of the corresponding terminal, as detected by each terminal.
  • The order of steps 32 and 33 may be reversed, or the two steps may be performed at the same time.
  • The matching server specifies the moved object based on the first motion information and the second motion information (34). That is, by comparing the movement pattern information, position information, and movement time information of each object in the FOV included in the first motion information with the movement pattern information, position information, and movement time information of each terminal included in the second motion information, the matching server can determine which object actually moved.
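A minimal sketch of this comparison step, building on the MotionRecord sketch above: the server keeps one record per observed object, each terminal reports one record for itself, and the terminal that agrees in pattern, position, and time within tolerances is taken as the moved object. The tolerance values, the scoring rule, and the function name are illustrative assumptions, not the patent's algorithm.

    def specify_moved_object(first_info, second_info,
                             max_distance=1.5, max_time_skew=0.5):
        # first_info:  {object_id: MotionRecord} measured by the matching server
        # second_info: {terminal_id: MotionRecord} reported by each terminal
        # Tolerances (distance units, seconds) are illustrative assumptions.
        best_id, best_score = None, float("inf")
        for obj_id, server_rec in first_info.items():
            for term_id, term_rec in second_info.items():
                if server_rec.pattern != term_rec.pattern:
                    continue
                dx = server_rec.position[0] - term_rec.position[0]
                dy = server_rec.position[1] - term_rec.position[1]
                dist = (dx * dx + dy * dy) ** 0.5
                skew = abs(server_rec.timestamp - term_rec.timestamp)
                if dist <= max_distance and skew <= max_time_skew:
                    score = dist + skew      # naive combined disagreement
                    if score < best_score:
                        best_id, best_score = term_id, score
        return best_id                       # None when nothing matches

Comparing the time along with the position is what disambiguates two terminals standing close together: only the one whose sensors registered the pattern at the moment the camera saw it can match.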
  • FIG. 4 shows an example in which the matching server may itself be the subject that specifies the terminal from which the movement pattern is detected, or may instead have the terminal perform the specification.
  • The matching server captures an image of the FOV (40).
  • The matching server calculates coordinate data of each object based on the captured image (41).
  • The matching server generates first motion information when motion occurs in the image of the FOV (42).
  • The first motion information includes the movement pattern information, position information, and movement time information of each object, determined based on the coordinate data of each object.
  • The matching server then determines whether to specify the moved object directly (43).
  • If the matching server specifies the object directly, it receives second motion information from each terminal (44).
  • The second motion information includes the movement pattern information, position information, and movement time information of the corresponding terminal, as detected by each terminal.
  • The matching server specifies the moved object based on the first motion information and the second motion information (45). That is, by comparing the movement pattern information, position information, and movement time information of each object in the FOV included in the first motion information with the movement pattern information, position information, and movement time information of each terminal included in the second motion information, the matching server can determine which object actually moved.
  • If the matching server instead has the terminals specify the object, the matching server transmits the first motion information to the respective terminals (46).
  • The transmission may be addressed to each terminal individually, broadcast to all terminals in the FOV, or carried out by any other applicable method.
  • FIG. 5 is a flowchart illustrating an operation of a terminal according to the matching method of FIG. 3.
  • Here, the matching server is the subject that directly specifies the object.
  • The terminal generates second motion information (50), and transmits the generated second motion information to the matching server (51).
  • The second motion information includes the movement pattern information, position information, and movement time information of the corresponding terminal.
  • FIG. 6 shows an example in which the terminal may itself be the subject that specifies the terminal from which the movement pattern is detected, or may instead have the matching server perform the specification.
  • The terminal first generates second motion information about itself (60).
  • If the terminal specifies the object directly, it receives the first motion information generated by the matching server (62).
  • The terminal then specifies the moved object based on the first motion information received from the matching server and the second motion information it generated itself (63). The terminal may identify itself as the moved object or, if another terminal moved, specify that terminal as the moved object.
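The terminal-side check is the mirror image of the server-side comparison: the terminal tests its own record against each server-observed object. A hedged sketch under the same assumptions (MotionRecord and the tolerance values) as above:

    def terminal_is_moved_object(own_record, first_info,
                                 max_distance=1.5, max_time_skew=0.5):
        # Return the id of the server-observed object that matches this
        # terminal's own MotionRecord, or None. Mirrors specify_moved_object();
        # the tolerance values are illustrative assumptions.
        for obj_id, server_rec in first_info.items():
            if server_rec.pattern != own_record.pattern:
                continue
            dx = server_rec.position[0] - own_record.position[0]
            dy = server_rec.position[1] - own_record.position[1]
            dist = (dx * dx + dy * dy) ** 0.5
            skew = abs(server_rec.timestamp - own_record.timestamp)
            if dist <= max_distance and skew <= max_time_skew:
                return obj_id
        return None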
  • If the terminal instead has the matching server specify the object, the terminal transmits the second motion information to the matching server (64).
  • The matching server then specifies the moved object (65).
  • The matching server may provide information on the specified object to at least one terminal.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • Finance (AREA)
  • Theoretical Computer Science (AREA)
  • Development Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • General Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Primary Health Care (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Telephone Function (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a matching system and method for specifying a terminal in which a movement pattern is detected, by comparing: a movement pattern, a position, and a time of a user's body (mainly the hands, fingers, and the like) or of a terminal carried by the user, obtained through image capture; with a movement pattern, a position, and a time of the terminal identified by a device such as a motion sensor built into the terminal.
PCT/KR2015/005263 2014-05-24 2015-05-26 Matching system and method WO2015182967A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140062668A KR101611898B1 (ko) 2014-05-24 2014-05-24 Matching system
KR10-2014-0062668 2014-05-24

Publications (1)

Publication Number Publication Date
WO2015182967A1 (fr) 2015-12-03

Family

ID=54699238

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/005263 WO2015182967A1 (fr) 2014-05-24 2015-05-26 Matching system and method

Country Status (2)

Country Link
KR (1) KR101611898B1 (fr)
WO (1) WO2015182967A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102463712B1 (ko) 2017-11-24 2022-11-08 현대자동차주식회사 Virtual touch recognition apparatus and method for correcting recognition error thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101228336B1 * (ko) 2011-04-01 2013-01-31 한국과학기술원 Method for providing a personalized service using a user behavior pattern of a mobile terminal, and mobile terminal therefor

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120024247A * 2010-09-06 2012-03-14 삼성전자주식회사 Method for operating a mobile device by recognizing a user's gesture and the mobile device thereof
KR20130085094A * 2012-01-19 2013-07-29 삼성전기주식회사 User interface device and method for providing a user interface
KR20130112188A * 2012-04-03 2013-10-14 모젼스랩(주) Method for synchronizing between a mobile terminal and an exhibition device using a motion recognition apparatus
KR20140060615A * 2012-11-12 2014-05-21 주식회사 브이터치 Remote operation apparatus using a virtual plane of an electronic device in a virtual touch device without a pointer

Also Published As

Publication number Publication date
KR101611898B1 (ko) 2016-04-12
KR20150135023A (ko) 2015-12-02

Similar Documents

Publication Publication Date Title
WO2015064992A1 Smart device performing LED-ID/RF communication through a camera, and system and method for providing location-based services using the same
WO2015014018A1 Indoor positioning and navigation method for a mobile terminal based on image recognition technology
WO2013162235A1 Apparatus for obtaining information on a virtual three-dimensional object without using a pointer
WO2016024797A1 Tracking system and tracking method using the same
WO2012173381A2 Location-based construction site management method and system
WO2011139070A2 Method and apparatus for recognizing the location of a user
WO2018151449A1 Electronic device and methods for determining an orientation of the device
WO2011139115A2 Method for accessing information on characters by using augmented reality, server, and computer-readable recording medium
EP3542208A1 Method for providing augmented reality content, and electronic device and system adapted to the method
EP3039476A1 Head mounted display device (HMD) and method for controlling the same
WO2013069963A1 Position compensation device using visible light communication and method thereof
WO2015102126A1 Method and system for managing an electronic album using face recognition technology
WO2016018067A1 Method and device for matching a sensor location with an event operation using a monitoring device
WO2019221340A1 Method and system for calculating spatial coordinates of a region of interest, and non-transitory computer-readable recording medium
WO2012086984A2 Method, device, and system for providing sensory information and sensing
US20180028861A1 Information processing device and information processing method
WO2018131852A1 Electronic device used for performing a video call, and computer-readable recording medium
WO2016060312A1 Security management method and device based on indoor position recognition
WO2015108401A1 Portable device and control method using a plurality of cameras
WO2015182967A1 Matching system and method
WO2017030233A1 Method for detecting position by a mobile computing device, and mobile computing device performing the same
WO2017135514A1 Three-dimensional camera for capturing images to provide virtual reality
WO2020235740A1 Image-based indoor positioning service system and method
WO2018066902A1 Consistent spherical photo and video orientation correction
WO2017142130A1 Image processing method for providing virtual reality, and virtual reality device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15800512

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15800512

Country of ref document: EP

Kind code of ref document: A1