WO2011042632A1 - Device for interaction with an augmented object (Dispositif d'interaction avec un objet augmenté) - Google Patents
- Publication number
- WO2011042632A1 (PCT/FR2010/051837)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- augmented
- user
- augmented object
- zone
- control interface
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
- G05B19/409—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/12—Arrangements for remote connection or disconnection of substations or of equipment thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4131—Peripherals receiving signals from specially adapted client devices home appliance, e.g. lighting, air conditioning system, metering devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/43615—Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/47205—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/32—Operator till task planning
- G05B2219/32014—Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40195—Tele-operation, computer assisted manual operation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/002—Special television systems not provided for by H04N7/007 - H04N7/18
Definitions
- the present invention relates to a new interaction paradigm, allowing a user to interact with at least one object of an environment, via a video link.
- the present invention overcomes these disadvantages by providing means for automatically associating an active object with control interfaces or means for defining such interfaces. This is made possible by means of an innovative concept of augmented object. Such a concept naturally finds its place in current developments around the Internet of Things.
- the subject of the invention is a device enabling a user, located in a user zone, to interact with at least one augmented object present in an environment, said augmented object being located in an object zone and comprising a physical object and an associated virtual object, said virtual object comprising at least one element defining at least one service provided by said augmented object, said augmented object being connectable via a first communication means, and said device comprising:
- a video capture means, located in the object zone, capable of taking a video image of at least part of said environment,
- a second communication means able to transmit said video image from the object zone to the user zone,
- a user terminal located in the user zone, able to display said video image in order to present it to the user,
- a first communication means able to establish a connection between the device and the identified augmented object, and to retrieve said definition elements of the services offered by said augmented object,
- control interface definition means able to define a control interface for the services of the identified augmented object in accordance with said service definition elements, and to present said control interface to the user via the user terminal,
- a third means of communication capable of transmitting said commands to the augmented object, for execution of said services by the physical object.
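The claim elements above can be sketched as a minimal data model. This is an illustrative sketch only: the class names, fields, and example addresses are hypothetical, since the patent deliberately leaves the implementation open.

```python
from dataclasses import dataclass, field

@dataclass
class Service:
    """Definition elements of one service offered by an augmented object."""
    name: str        # e.g. "turn on"
    typology: str    # e.g. "binary" for an on/off command
    address: str     # connection to activate in order to carry out the service

@dataclass
class VirtualObject:
    """Software counterpart of the physical object (computer-world object)."""
    network_address: str
    services: list = field(default_factory=list)

@dataclass
class AugmentedObject:
    """Association of a physical object and its virtual object."""
    physical_name: str
    virtual: VirtualObject

# Hypothetical example: the lamp of the description, offering two services.
lamp = AugmentedObject(
    physical_name="lamp",
    virtual=VirtualObject("10.0.0.42", [
        Service("turn on", "binary", "relay/1/on"),
        Service("turn off", "binary", "relay/1/off"),
    ]),
)
```

The point of the split is that the virtual object carries everything the device needs (network address, service definitions), so the device can remain generic with respect to the physical object.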
- the video capture means can be steered in orientation and / or in position
- the user terminal also comprises a control means, capable of enabling the user to carry out said steering by producing control commands, and the device further comprises a fifth communication means, adapted to transmit said control commands from the user zone to the object zone.
- the automatic identification means of the augmented object comprises means for recognizing the shape of the physical object.
- the automatic identification means of the augmented object comprises means for locating the physical object.
- the automatic identification means of the augmented object comprises an identifying mark disposed on or near the physical object.
- the interface definition means is able to use a command interface directly proposed by the augmented object.
- the interface definition means is capable of recovering from a storage means a control interface defined in accordance with the service definition elements.
- the interface definition means is able to display said control interface overlaid on the video image displayed on the user terminal.
- the means for reading the commands of said services in relation with said control interface, comprises a keyboard, a touch screen, and / or a pointing device.
- the first, second, third, fourth and fifth communication means comprise the Internet network.
- FIG. 1 presents a block diagram of a device according to the invention
- FIG. 2 illustrates an example of an object zone comprising an augmented object
- FIG. 3 shows an example of a superimposed display of a video image and a control interface for the augmented object presented in FIG. 2.
- a lamp is an object that can provide two services: a "turn on" service for switching the lamp on and a "turn off" service for switching it off.
- a media player is another object that can, for example, provide a "broadcast" service for playing multimedia content.
- a phone is an object that can offer a "call" service for telephone connection.
- a virtual object 7 is associated with the physical object 6.
- This virtual object 7 is an object of the computer world. It thus comprises purely software elements of data and / or program type, and may further comprise physical, typically electronic, management, interface or interconnection elements.
- Said virtual object 7 may comprise at least one element for defining and / or identifying the physical object 6.
- an augmented object 3 is uniquely defined in a given environment.
- the virtual object 7 associated with a physical object 6 comprises at least one definition element, defining the service or services offered by said augmented object 3, via the physical object 6.
- an augmented object can be connected via a first communication means 10, for example of the network type.
- the connection and the communication interface with said communication means 10 are provided by the virtual object 7 of the augmented object 3.
- a user 5 can, via a user terminal 12, connect to an augmented object 3 in order to dialogue and exchange information with it.
- the device 1 comprises a video capture means 8 capable of taking a video image 9 of at least part of the environment.
- This video capture means 8 is located in object zone 2.
- This video capture means 8 may comprise at least one video camera, for example of the webcam type.
- a second communication means 11 makes it possible to transmit said captured video image 9 from the object zone 2 to the user zone 4.
- the user 5 communicates with and controls the device 1 by means of a user terminal 12.
- This user terminal 12 is located in the user zone 4 and is able to display said captured and transmitted video image 9 in order to present it to the user 5.
- the user can advantageously, remotely, be visually informed, in real time, of what is happening in the object zone 2.
- the device 1 also comprises an automatic identification means 13 for an augmented object 3 when the physical object 6 associated with this augmented object 3 is present in the field of the video capture means 8.
- Said identification means 13 can operate according to different principles, some of which will be detailed for illustrative purposes. Identification can be done at any time, on demand or permanently. However, it is appropriate that an object 3 be identifiable at least when it is visible from the video capture means 8 and present in the video image 9 presented to the user 5. As will be detailed below, the identification makes it possible not only to recognize the object 3, but also to uniquely identify it. This identification thus provides a means of locating said object 3 in order to be able to connect with it. Thus, in the case of a communication network, the identification provides the unique network address of said augmented object 3, used to locate the object on the network and to connect to it.
- the device 1 also comprises a first communication means 10.
- This first communication means 10 makes it possible to establish a connection between the device 1 and the augmented object 3 thus identified, based on the identification element or elements provided by the means of identification 13.
- This connection established with the augmented object 3 makes it possible to interrogate the augmented object 3 in order to retrieve information relating to the augmented object 3 contained in a storage means associated with said object 3 or referenced by said object 3.
- the device connects to the identified augmented lamp and retrieves two services, "turn on" and "turn off", together with the definition elements of these services.
- the definition elements of the "turn on" service include, for example, its name: "turn on", its typology: binary command, as well as the elements necessary for its implementation: the address or physical connection to be activated in order to carry out the service of turning on the lamp.
- the device 1 further comprises a control interface definition means 14, which, from the service definition elements of an augmented object 3, can define a control interface 15 for the services of the augmented object 3 identified.
- This control interface 15 has a user-facing representation, for example graphical, which can be displayed on the user terminal 12 in order to be perceived by the user 5. Behind this representation, the control interface 15 comprises the connections necessary for the transmission of service commands to the augmented object 3.
- the control interface 15 is defined in accordance with the service definition elements.
- the device 1 also comprises a reading means 16 capable of reading the commands applied by the user 5.
- This reading means 16 is configured in accordance with the service definition elements extracted from the identified augmented object 3 and in relation with the user-facing representation of said control interface 15.
- the two services "turn on" and "turn off" will, for example, be presented on a control interface 15 comprising two buttons, each graphically represented and recognizable by its name.
- the reading means 16 is then able to detect an actuation by the user 5 of either of these two buttons. If the user actuates the button associated with the "turn on" service, the reading means 16 reads the corresponding "turn on" service command.
- the device further comprises a third communication means 17.
- This communication means is in charge of transmitting a command thus read by the reading means 16 to the augmented object 3. This transmission is carried out so that the augmented object 3 receives said command and executes the corresponding service on the physical object 6.
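The end-to-end cycle described so far (identification, service retrieval, interface presentation, command reading, transmission) can be sketched as follows. The five callables stand in for the numbered means of the device and are purely illustrative; the addresses and service names are invented.

```python
def interact(identify, connect, display, read_input, transmit, frame):
    """One interaction cycle of the device:
    identify the augmented object in the video image (means 13),
    connect and retrieve its service definitions (means 10),
    present a control interface built from them (means 14/15),
    read the user's command (means 16) and transmit it (means 17)."""
    address = identify(frame)
    services = connect(address)
    display(services)
    command = read_input(services)
    transmit(address, command)
    return address, command

# Stub example: a lamp whose "turn on" button is pressed.
sent = []
addr, cmd = interact(
    identify=lambda frame: "10.0.0.42",
    connect=lambda addr: ["turn on", "turn off"],
    display=lambda services: None,
    read_input=lambda services: services[0],
    transmit=lambda addr, command: sent.append((addr, command)),
    frame="<video image 9>",
)
```

Note how the cycle needs no prior knowledge of the object: everything after identification is driven by what the object itself declares.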
- the device 1 may advantageously comprise a sound capture means in the object zone 2, in geographical coincidence or not with the video capture means 8, in order to make a sound recording of the environment.
- a fourth communication means is then used to transmit said sound from the object zone 2 to the user zone 4, where it is reproduced by a sound reproduction means, of headphone or loudspeaker type, included in the user terminal 12, for the user 5.
- the video capture means 8 is controllable.
- a video sensor can be steerable in orientation, for example in elevation and azimuth.
- a video sensor can still be zoomed.
- a video sensor can still be controllable in position.
- This can be achieved with a mobile base, such as a controllable robot, the video sensor being mounted on said mobile base.
- This can also be achieved with a plurality of video sensors coupled to a controllable source selection means. It is further possible to combine these different embodiments.
- the user terminal 12 then comprises a control means.
- a large part of the operation of the device 1 relies on the identification of the augmented object 3.
- Several techniques can be applied interchangeably to obtain such an identification. This identification is necessary in that it makes it possible to uniquely designate the augmented object 3 in order to establish a connection with said augmented object 3.
- a shape recognition method must thus be supplemented by other means. It is, for instance, possible to proceed using a prior inventory.
- the lamp can thus be identified with certainty if there is only one lamp (or a lamp of this type / color, or in this environment / this room).
- an identification means 13 may employ, alone or in combination with another means such as shape recognition, a location of the physical object 6.
- the object is identified according to its location.
- the object is a lamp because it is in the left corner of the room.
- the lamp recognized by its shape is the lamp No. 3 because it is (the only lamp) located one meter above the ground.
- Such an approach requires a prior definition, for example in the form of a map, of the positions of the different objects, which reduces the genericity of the device.
- Another disadvantage is that it becomes difficult or impossible to identify an augmented object 3 as soon as it has been moved.
- an embodiment of an identification means 13 that requires no prior preparation of the mapping type, and that tolerates the always possible movement of the object 3, is therefore preferable.
- Such identification means 13 can be realized by equipping the physical object 6, associated with the augmented object 3 that is to be identified, with an identification mark 20.
- such a mark 20 is unique in order to serve as an identifier. It is arranged on or near the physical object 6, placed so that it can be seen or read according to its nature. Its characteristic signature (shape, color, coding, etc.) is recorded in a correspondence table, accessible from the device 1, associating each signature with a unique identifier of an augmented object (for example the network address of said augmented object 3).
- An alternative approach is to directly encode, in the identification mark 20, the identification elements of the augmented object 3 that allow the connection.
- the identification mark 20 may be a visual cue (pictogram, barcode, color code, etc.) advantageously visible by the video capture means 8, in order to be recognized by the device 1, for example by image analysis.
- a "coding" visual such as a barcode, advantageously makes it possible to directly include the network connection identifier in its code, or else an address or any other reference means making it possible to find it.
- such a visual cue does not require additional reading means, since it reuses the video capture means 8.
- a disadvantage of such a visual cue is that it is visible only from a limited area of space around a privileged direction.
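The two marker-resolution variants described above, a correspondence-table lookup and a "coding" mark that embeds the address in its payload, can be sketched as follows. The signature strings and addresses are invented for illustration; the patent does not prescribe a signature format.

```python
# Correspondence table associating a mark's characteristic signature
# (shape, color, coding, ...) with the network address of the augmented
# object. All entries are hypothetical.
MARK_TABLE = {
    "pictogram:red-triangle": "10.0.0.42",  # lamp
    "colorcode:blue-green":   "10.0.0.43",  # media player
}

def resolve_mark(signature):
    """Return the network address of the augmented object for a mark signature.

    A "coding" mark (e.g. a barcode) may embed the address directly in
    its payload, in which case no table lookup is needed.
    """
    if signature.startswith("barcode:addr="):
        return signature[len("barcode:addr="):]
    return MARK_TABLE[signature]
```

The "coding" variant avoids maintaining the table at all, at the cost of a larger mark; the table variant keeps the mark minimal but requires prior registration of each signature.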
- the identifying mark may also be a radio-frequency tag, also known as an RFID tag.
- the device 1 must then include a reader of such radio-frequency tags. This reader, located in the object zone 2, is able to read such a radio-frequency tag.
- Such a radiofrequency label is "coding" with the advantage previously mentioned.
- such a radiofrequency label does not need to be seen in order to be read and can be hidden / buried in the object 3.
- the reading of such a radio-frequency tag can be carried out from any direction in space.
- the interface definition means 14 extracts from the augmented object 3, the definition elements of the control interface 15 associated with the services provided by said augmented object 3.
- the control interface is defined by the object 3 at least in a typological way.
- the definition elements of the control interface 15 comprise at least the indication that each of the two services "turn on" and "turn off" is associated with a binary command.
- the object may further include more precise indications, such as the layout, appearance, or even the detailed graphic representation of a control interface provided by the augmented object 3.
- the interface definition means 14 uses directly and entirely the control interface proposed by the augmented object 3. This makes it possible to respect a graphic aesthetic specific to the augmented object 3.
- the interface definition means 14 uses the typological definition elements to construct a control interface compatible with the object 3 and its services, but defines a graphical appearance independent of that proposed by the augmented object 3.
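This second variant, building a generic interface from the typological elements alone, could look like the following sketch. The `range` typology and the widget vocabulary are assumptions added for illustration; the patent only names the binary case.

```python
def build_interface(service_defs):
    """Construct generic widgets from typological service definitions,
    ignoring any graphical appearance proposed by the object itself."""
    widgets = []
    for svc in service_defs:
        if svc["typology"] == "binary":
            widgets.append({"kind": "button", "label": svc["name"]})
        elif svc["typology"] == "range":  # assumed extension, e.g. a dimmer
            widgets.append({"kind": "slider", "label": svc["name"]})
        else:
            raise ValueError(f"unknown typology: {svc['typology']}")
    return widgets

# The lamp of the description: two binary services become two buttons.
ui = build_interface([
    {"name": "turn on",  "typology": "binary"},
    {"name": "turn off", "typology": "binary"},
])
```

Driving the interface from typology alone is what lets the device render controls for objects it has never seen before.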
- This ability to extract from the augmented object 3, automatically identified in a video image 9, the typological interface definition elements and all or part of their graphical representation, without any prior knowledge of the environment or objects, is very advantageous in that it allows implementation of the device 1 without prior definition, modeling of the environment, or configuration.
- an automatic and autonomous identification (without configuration or preparation) makes it possible to handle an unknown environment and to tolerate changes in said environment, for example the moving of objects 3.
- the definition means 14 is capable of displaying the control interface 15 on a display means advantageously included in the user terminal 12.
- This third communication means 17 always comprises a first connection between the user terminal 12 and the augmented object 3.
- This first connection, most often of network type, establishes a link between the user terminal 12 and the virtual object 7. The third communication means 17 can also comprise a second connection.
- This second link is a specific connection interface between the virtual object 7 and the physical object 6.
- the user 5, from his place of work (user zone 4), views the interior of the home (object zone 2) by means of a webcam 8 whose image is transmitted, via the Internet, to a personal computer (user terminal 12), preferably of standard type.
- Video image 9 shows a living room.
- the device automatically detects and, if necessary, graphically highlights the presence in this scene of three augmented objects: a lamp, a phone and a media player.
- the device selectively makes it possible (for example when passing a pointer over the object) to show or hide a control interface 15 for each of said three augmented objects.
- the parent suggests that their child view the last video clip of their favorite group, which the parent has just downloaded.
- the parent selects the media player object. This brings up a control interface proposing at least one "broadcast" service. The parent selects the video clip on the desktop of his terminal and drags it onto the control interface of the "broadcast" service. This triggers a download of said clip from the workplace to the home media player, which then plays said clip for the child to view.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Automation & Control Theory (AREA)
- Manufacturing & Machinery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
- Information Transfer Between Computers (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP10763807.4A EP2486701B1 (fr) | 2009-10-05 | 2010-09-03 | Dispositif d'interaction avec un objet augmente |
KR1020127008786A KR101470711B1 (ko) | 2009-10-05 | 2010-09-03 | 증강 객체와의 상호 작용을 위한 디바이스 |
CN201080044455.3A CN102577250B (zh) | 2009-10-05 | 2010-09-03 | 用于与增强对象进行交互的设备 |
JP2012532642A JP5799018B2 (ja) | 2009-10-05 | 2010-09-03 | 拡張物体との対話用デバイス |
ES10763807.4T ES2464127T3 (es) | 2009-10-05 | 2010-09-03 | Dispositivo de interacción con un objeto aumentado |
US13/499,691 US9063537B2 (en) | 2009-10-05 | 2010-09-03 | Device for interaction with an augmented object |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0956914 | 2009-10-05 | ||
FR0956914A FR2950989B1 (fr) | 2009-10-05 | 2009-10-05 | Dispositif d'interaction avec un objet augmente. |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011042632A1 true WO2011042632A1 (fr) | 2011-04-14 |
Family
ID=41719226
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/FR2010/051837 WO2011042632A1 (fr) | 2009-10-05 | 2010-09-03 | Dispositif d'interaction avec un objet augmente |
Country Status (8)
Country | Link |
---|---|
US (1) | US9063537B2 (fr) |
EP (1) | EP2486701B1 (fr) |
JP (1) | JP5799018B2 (fr) |
KR (1) | KR101470711B1 (fr) |
CN (1) | CN102577250B (fr) |
ES (1) | ES2464127T3 (fr) |
FR (1) | FR2950989B1 (fr) |
WO (1) | WO2011042632A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2634653A1 (fr) * | 2012-02-28 | 2013-09-04 | General Electric Company | Systèmes et procédés de configuration de machine |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9965564B2 (en) | 2011-07-26 | 2018-05-08 | Schneider Electric It Corporation | Apparatus and method of displaying hardware status using augmented reality |
US9514570B2 (en) | 2012-07-26 | 2016-12-06 | Qualcomm Incorporated | Augmentation of tangible objects as user interface controller |
US9639984B2 (en) * | 2013-06-03 | 2017-05-02 | Daqri, Llc | Data manipulation based on real world object manipulation |
CN103777851B (zh) * | 2014-02-26 | 2018-05-29 | 大国创新智能科技(东莞)有限公司 | 物联网视频交互方法和系统 |
US20160071319A1 (en) * | 2014-09-09 | 2016-03-10 | Schneider Electric It Corporation | Method to use augumented reality to function as hmi display |
KR102427328B1 (ko) * | 2014-10-17 | 2022-08-01 | 삼성전자주식회사 | 사물 인터넷 단말 및 그 동작 방법 |
KR102332752B1 (ko) * | 2014-11-24 | 2021-11-30 | 삼성전자주식회사 | 지도 서비스를 제공하는 전자 장치 및 방법 |
US9760744B1 (en) | 2016-06-01 | 2017-09-12 | International Business Machines Corporation | Physical interactive IDs (P2D) |
US11222081B2 (en) | 2017-11-27 | 2022-01-11 | Evoqua Water Technologies Llc | Off-line electronic documentation solutions |
KR102359601B1 (ko) * | 2019-11-29 | 2022-02-08 | 한국과학기술원 | 투명 평판을 이용한 영상 처리 방법 및 이를 수행하는 장치 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6463343B1 (en) | 1999-08-10 | 2002-10-08 | International Business Machines Corporation | System and method for controlling remote devices from a client computer using digital images |
WO2006009521A1 (fr) * | 2004-07-23 | 2006-01-26 | Agency For Science, Technology And Research | Systeme et procede pour generer des retransmissions differees pour transmission video |
WO2009113067A2 (fr) * | 2008-03-11 | 2009-09-17 | In-Dot Ltd. | Systèmes et procédés de communication |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4982438A (en) * | 1987-06-02 | 1991-01-01 | Hitachi, Ltd. | Apparatus and method for recognizing three-dimensional shape of object |
JP4178697B2 (ja) * | 1999-11-18 | 2008-11-12 | ソニー株式会社 | 携帯型情報処理端末、情報入出力システム及び情報入出力方法 |
GB2360356A (en) * | 2000-03-18 | 2001-09-19 | Rupert William Meldrum Curwen | Actuating an operation in accordance with location of a transponder |
JP2002044646A (ja) * | 2000-07-26 | 2002-02-08 | Kansai Electric Power Co Inc:The | Adslを利用した監視システム |
US6678413B1 (en) * | 2000-11-24 | 2004-01-13 | Yiqing Liang | System and method for object identification and behavior characterization using video analysis |
US7054645B1 (en) * | 2001-03-07 | 2006-05-30 | At&T Corp. | System for intelligent home controller and home monitor |
US7060407B2 (en) * | 2002-10-04 | 2006-06-13 | Ricoh Company, Limited | Image removing method, image removing device, and image forming apparatus |
US6998987B2 (en) * | 2003-02-26 | 2006-02-14 | Activseye, Inc. | Integrated RFID and video tracking system |
US8042049B2 (en) * | 2003-11-03 | 2011-10-18 | Openpeak Inc. | User interface for multi-device control |
JP2005063225A (ja) | 2003-08-15 | 2005-03-10 | Nippon Telegr & Teleph Corp <Ntt> | Interface method, device, and program using self-image display |
US7616782B2 (en) * | 2004-05-07 | 2009-11-10 | Intelliview Technologies Inc. | Mesh based frame processing and applications |
US8042048B2 (en) * | 2005-11-17 | 2011-10-18 | Att Knowledge Ventures, L.P. | System and method for home automation |
JP2009520316A (ja) * | 2005-12-19 | 2009-05-21 | Koninklijke Philips Electronics N.V. | Method and device for lighting control |
WO2008032225A2 (fr) * | 2006-03-21 | 2008-03-20 | Ranco Incorporated Of Delaware | Refrigeration control unit |
US20080144884A1 (en) * | 2006-07-20 | 2008-06-19 | Babak Habibi | System and method of aerial surveillance |
JP5132138B2 (ja) * | 2006-11-28 | 2013-01-30 | Canon Inc. | Position and orientation measurement method, and position and orientation measurement apparatus |
US8295543B2 (en) * | 2007-08-31 | 2012-10-23 | Lockheed Martin Corporation | Device and method for detecting targets in images based on user-defined classifiers |
2009
- 2009-10-05 FR FR0956914A patent/FR2950989B1/fr not_active Expired - Fee Related

2010
- 2010-09-03 EP EP10763807.4A patent/EP2486701B1/fr not_active Not-in-force
- 2010-09-03 ES ES10763807.4T patent/ES2464127T3/es active Active
- 2010-09-03 JP JP2012532642A patent/JP5799018B2/ja not_active Expired - Fee Related
- 2010-09-03 CN CN201080044455.3A patent/CN102577250B/zh not_active Expired - Fee Related
- 2010-09-03 KR KR1020127008786A patent/KR101470711B1/ko active IP Right Grant
- 2010-09-03 US US13/499,691 patent/US9063537B2/en not_active Expired - Fee Related
- 2010-09-03 WO PCT/FR2010/051837 patent/WO2011042632A1/fr active Application Filing
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2634653A1 (fr) * | 2012-02-28 | 2013-09-04 | General Electric Company | Systems and methods for machine configuration |
CN103295012A (zh) * | 2012-02-28 | 2013-09-11 | General Electric Company | System and method for machine configuration |
US8825824B2 (en) | 2012-02-28 | 2014-09-02 | General Electric Company | Systems and methods for machine configuration |
Also Published As
Publication number | Publication date |
---|---|
EP2486701B1 (fr) | 2014-04-09 |
EP2486701A1 (fr) | 2012-08-15 |
CN102577250B (zh) | 2014-12-03 |
FR2950989B1 (fr) | 2011-10-28 |
FR2950989A1 (fr) | 2011-04-08 |
US20120243743A1 (en) | 2012-09-27 |
US9063537B2 (en) | 2015-06-23 |
ES2464127T3 (es) | 2014-05-30 |
KR101470711B1 (ko) | 2014-12-08 |
JP5799018B2 (ja) | 2015-10-21 |
KR20120063508A (ko) | 2012-06-15 |
CN102577250A (zh) | 2012-07-11 |
JP2013506925A (ja) | 2013-02-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2486701B1 (fr) | Device for interacting with an augmented object | |
JP6562972B2 (ja) | Concierge device and method for registering, controlling and supporting consumer electronic devices | |
US10026229B1 (en) | Auxiliary device as augmented reality platform | |
TWI391831B (zh) | Pre-configuration method and system for a portable device | |
KR101803168B1 (ko) | Data manipulation based on real-world object manipulation | |
TWI423709B (zh) | Automatic interfacing between a master device and a target device | |
US20150199851A1 (en) | Interactivity With A Mixed Reality | |
TW201210286A (en) | System and method for managing a network of user-selectable devices | |
FR3000632A1 (fr) | Method for displaying data in a network, and associated mobile device | |
WO2017194777A1 (fr) | System for creating and deploying cross-platform applications | |
CN112261482B (zh) | Interactive video playback method, apparatus, device, and readable storage medium | |
CN114302221A (zh) | Virtual reality device and screen-casting media asset playback method | |
FR2898719A1 (fr) | Method for parameterizing an adaptable control interface, and associated adaptation system | |
WO2007012768A2 (fr) | Method for controlling an interface using a camera fitted to a communication terminal | |
JP2019114272A (ja) | Terminal device, system, information presentation method, and program | |
CN115129280A (zh) | Virtual reality device and screen-casting media asset playback method | |
FR2999847A1 (fr) | Method for activating a mobile device in a network, and associated display device and system | |
WO2023046902A1 (fr) | Human-machine interface system | |
FR3107390A1 (fr) | Device, system, and method for processing virtual reality data | |
EP2887704A1 (fr) | Method for interaction between a first digital object and at least one second digital object, and interaction system | |
WO2021044851A1 (fr) | Information processing device and method | |
FR3127597A1 (fr) | Human-machine interface system | |
US20230251778A1 (en) | Accessible virtual tour system and method | |
WO2020187950A1 (fr) | Device, system, and method for processing virtual reality data | |
US20200073967A1 (en) | Technique for saving metadata onto photographs |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080044455.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10763807 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010763807 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012532642 Country of ref document: JP Ref document number: 3074/CHENP/2012 Country of ref document: IN |
|
ENP | Entry into the national phase |
Ref document number: 20127008786 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13499691 Country of ref document: US |