WO2017125783A1 - Remote interactive augmented reality system and associated method - Google Patents

Remote interactive augmented reality system and associated method

Info

Publication number
WO2017125783A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
virtual
tridimensional
user interface
computer
Prior art date
Application number
PCT/IB2016/050250
Other languages
English (en)
Inventor
Maurizio NAGGIAR
Original Assignee
Tycoon Promotion Sagl
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tycoon Promotion Sagl filed Critical Tycoon Promotion Sagl
Priority to EP16704048.4A priority Critical patent/EP3405853A1/fr
Priority to US16/071,437 priority patent/US20210208763A1/en
Priority to CN201680079481.7A priority patent/CN108475118A/zh
Priority to PCT/IB2016/050250 priority patent/WO2017125783A1/fr
Publication of WO2017125783A1 publication Critical patent/WO2017125783A1/fr


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • G06Q30/0643Graphical representation of items or shoppers

Definitions

  • The present invention relates to the field of augmented reality and, in particular, concerns a remote interactive system of augmented reality.
  • The present invention furthermore concerns a method of interaction in augmented reality between two remote devices.
  • The tridimensional virtual space is typically defined on the basis of a tridimensional space.
  • The Applicant has noticed that virtual shops have an extremely limited aesthetic and emotional appeal to the customer, since their appeal can be increased only through the aesthetic aspect of the site. In other words, all virtual shops have the drawback of not offering the user an actual area within which he can move and make purchases.
  • The aim of the present invention is therefore to describe a remote interactive system of augmented virtual reality that solves the aforementioned drawbacks.
  • A second aim of the present invention is to describe a method of interaction in augmented reality between two remote devices that solves the aforementioned drawbacks.

Summary of the invention

  • The present invention aims to describe a method and a remote interactive system of augmented reality that allow the most emotional and realistic interaction possible for a user in a tridimensional virtual space or shop, wherein it is possible to make purchases as well as meet other people who are in fact located remotely from the user in the real environment, but who find themselves in the same circumscribed virtual environment in the form of a virtual figure or avatar.
  • An electronic system of interaction between a user and a tridimensional virtual space, comprising means for creating a virtual tridimensional space within which said user can move and/or move the point of view of a virtual subject, means for defining and moving a point of view of said virtual subject within said virtual tridimensional space, and further comprising:
  • at least one user interface electronic device which transmits and receives at least electronic data on the motion of said point of view with said at least one computer or electronic device or server or computer network;
  • said user interface electronic device comprises visualization means for said virtual tridimensional space and represents on said visualization means a real-time variation of said point of view;
  • said computer or electronic device or server or computer network stores a plurality of objects defined on a plurality of points relative to a reference system of said virtual tridimensional space;
  • said means for defining and moving said point of view store, on said at least one computer or electronic device or server or computer network, a variation or adaptation of a status of at least one of said plurality of objects according to a selection or adaptation command for at least one object of said plurality of objects issued by said user through said at least one user interface electronic device, and transmit towards said at least one user interface electronic device an adaptation code of said status of said selected object, wherein said adaptation code of said status causes a variation in the operation of said at least one user interface electronic device.
  • Said computer or electronic device or server or computer network is configured to provide in streaming, that is to say in real time, the electronic data that allow said tridimensional space to be shown on at least one external electronic device.
  • Said plurality of objects is a plurality of purchasable products, wherein for each purchasable product at least one adaptable price datum and a residual quantity datum are stored on said at least one computer or electronic device or server or computer network, and wherein, at the moment of said selection, electronic counting means are configured to decrease said residual quantity datum and store a new residual quantity datum according to said selection performed by said user by means of said at least one user interface electronic device.
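  • By way of illustration only, the stored product status described above can be sketched as a small server-side record; the class and function names below are hypothetical and merely mirror the requirement that a price datum and a residual quantity datum be stored and that the electronic counting means decrease the quantity on selection.

```python
from dataclasses import dataclass

@dataclass
class ProductStatus:
    """Hypothetical server-side status record for a purchasable object (220)."""
    object_id: int
    price: float            # adaptable price datum
    residual_quantity: int  # residual quantity datum held in the memory (120)

def register_selection(product: ProductStatus) -> ProductStatus:
    """Electronic counting means: decrease the residual quantity on selection
    and keep the new residual quantity datum in the record."""
    if product.residual_quantity <= 0:
        raise ValueError("no residual quantity left for this product")
    product.residual_quantity -= 1
    return product

# Example: a user selects object 220 through the user interface device (140)
status = ProductStatus(object_id=220, price=19.90, residual_quantity=3)
register_selection(status)
print(status.residual_quantity)  # -> 2
```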
  • Said plurality of objects is a plurality of parts of a confined environment.
  • Said status is a datum stored on said at least one computer or electronic device or server or computer network, comprising size data and/or graphic-appearance data of said object, and wherein said means for defining and moving said point of view, at the moment of the adaptation of said status, transmit data for adapting the graphic appearance of said selected object towards said at least one user interface electronic device.
  • Said system is configured so that said computer or electronic device or server or computer network automatically transmits towards said user interface electronic device a movie suitable to be perceived by the user as tridimensional and electronically associated with said at least one selected object, and wherein said user interface electronic device reproduces said tridimensional movie by adapting the point of view within said tridimensional movie according to motion data of said user interface electronic device relative to a tridimensional reference system of said user interface electronic device itself, and wherein said variation in the operation of said at least one user interface electronic device comprises switching from the representation of said virtual tridimensional space to the representation of said tridimensional movie.
  • Said user interface electronic device reproduces said tridimensional movie in real time.
  • Said point of view originates from a tridimensional reference of said virtual user positioned at a known distance from a fixed reference system of said virtual tridimensional space, said fixed reference being stored and automatically updated at each movement of said user interface electronic device.
  • Said point of view originates from a movable tridimensional reference system within said virtual tridimensional space, and said electronic computer or electronic computer network automatically updates the distance between said movable tridimensional reference system and said fixed tridimensional reference system, said movable reference being stored and automatically updated at each movement of said user interface electronic device.
  • Said user interface electronic device comprises a plurality of zenithal and azimuthal motion sensors, said sensors being configured to transmit, over a plurality of time intervals, motion data of said point of view within said virtual tridimensional space.
  • Within said virtual tridimensional space, said computer or electronic device or server or computer network calculates the motion of several virtual subjects simultaneously, each with its own point of view controlled by a respective real user provided with a respective user interface electronic device univocally electronically associated with its own virtual subject.
  • Said computer or electronic device or server or computer network calculates the communication of an audio and/or video data stream among several virtual subjects simultaneously.
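  • As a minimal sketch of how such a server might keep the univocal association between user interface electronic devices and virtual subjects, and track which subjects share an audio/video stream, one could imagine a structure like the one below; the names and fields are assumptions made for illustration, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualSubject:
    """One virtual subject (230) with its own point of view."""
    device_id: str                      # univocally associated device (140)
    zenith: float = 0.0                 # point-of-view orientation, radians
    azimuth: float = 0.0
    position: tuple = (0.0, 0.0, 0.0)   # origin of the movable triad X', Y', Z'

@dataclass
class VirtualSpace:
    subjects: dict = field(default_factory=dict)     # device_id -> VirtualSubject
    av_channels: list = field(default_factory=list)  # sets of devices sharing a stream

    def register(self, device_id: str) -> VirtualSubject:
        # one virtual subject per user interface device, univocally associated
        subject = VirtualSubject(device_id)
        self.subjects[device_id] = subject
        return subject

    def open_av_channel(self, *device_ids: str) -> None:
        # an audio/video stream computed among several subjects simultaneously
        self.av_channels.append(set(device_ids))

space = VirtualSpace()
space.register("device-A")
space.register("device-B")
space.open_av_channel("device-A", "device-B")
```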
  • - a step of storing in said memory at least one of a plurality of virtual subjects, capable of moving and/or adapting their own point of view of said virtual tridimensional space; - a subsequent step of electronically assigning each virtual subject of said plurality of virtual subjects to the electronic control of a respective remote user interface electronic device, wherein said user interface electronic device comprises visualization means for said virtual tridimensional space and represents on said visualization means a real-time variation of said point of view;
  • Said adaptable electronic status of said at least one object of said plurality of objects is a price or a residual quantity of products in at least one warehouse and, at the moment of said selection, electronic counting means perform a step of decreasing the residual quantity of the product within said warehouse and store a new residual quantity datum according to said selection performed by said user by means of said at least one remote user interface electronic device.
  • Said plurality of objects comprises, at least in part, a plurality of parts of a confined environment, and said status is a stored datum comprising size data and/or graphic-appearance data of said object, and means for defining and moving a point of view of said object, at the moment of the adaptation of said status, transmit data for adapting the graphic appearance of said selected object towards said at least one remote user interface electronic device.
  • Each virtual subject of said plurality of virtual subjects comprises a point of view generated by a tridimensional reference of said virtual user positioned at a known distance from a fixed reference system of said virtual tridimensional space; said tridimensional reference system of said virtual user is stored and automatically updated at each motion of said user interface electronic device.
  • Said point of view originates from a movable tridimensional reference system within said virtual tridimensional space, and said electronic computer or electronic computer network automatically updates the distance between said fixed tridimensional reference system and said movable tridimensional reference system, said movable tridimensional reference system being stored and automatically updated at each motion of said user interface electronic device.
  • Said method comprises transmitting, over a plurality of time intervals, motion data of said point of view within said virtual tridimensional space, said transmission originating from the remote user interface electronic device, which comprises a plurality of zenithal and azimuthal motion sensors.
  • Said computer or electronic device or server or computer network calculates the motion of several virtual subjects simultaneously, each with its own point of view controlled by a respective real user provided with a respective user interface electronic device univocally electronically associated with its own virtual subject.
  • Said computer or electronic device or server or computer network calculates the communication of an audio and/or video data stream among a plurality of virtual subjects simultaneously.
  • The data transmitted towards said user interface electronic device are preferably transmitted with a low latency, lower than 1 s.
  • Such a latency, when compatible with the requirements of the network and/or of the data transmission channel, helps to make the experience of the tridimensional virtual environment on the user interface electronic device as real and fluid as possible, in particular when a multiplicity of users interact simultaneously in the same tridimensional virtual environment.
  • The data transmitted towards said user interface electronic device are transmitted with channel encoding for the mitigation of burst errors.
  • FIG. 1 illustrates a tridimensional virtual space wherein a virtual subject (avatar) orients its point of view relative to a reference coordinate system;
  • FIG. 2 illustrates a plurality of user-wearable electronic devices, schematized for ease of representation, exchanging data with a computer or electronic device or server or computer network;
  • FIG. 3 illustrates an action of selection of an object contained in said tridimensional virtual space by said virtual subject;
  • FIG. 4 illustrates a schematic block diagram representing a selection process performed by said user.

Detailed description of the invention
  • The present invention concerns an electronic system 100 of interaction between a user and a tridimensional virtual space.
  • The system 100 comprises at least one electronic server computer 110 or, alternatively, an electronic computer network whose computers are electronically connected so as to exchange data with one another and, if needed, share the computational workload among themselves.
  • The electronic server computer 110 contains at least one memory 120 in which data defining a virtual space 200 of a tridimensional type are stored; this space is defined on a first triad of fixed axes X, Y, Z.
  • Within this space, objects 210, 220 are defined, some of which have parameters that are adaptable according to actions and electronic commands received from a remote user interface electronic device 140 provided to a user 300.
  • The transmission of said electronic commands from said remote user interface electronic device 140 towards the electronic server computer 110 takes place according to a known technique.
  • The objects 210, 220 present in the tridimensional virtual space 200 are therefore divided into "fixed" objects 210, that is to say objects that can represent, by way of non-limiting example, furniture, walls and furnishing accessories in general, and objects 220 which, as already stated, have parameters adaptable according to the actions and electronic commands received from said remote user interface electronic device 140.
  • Each object 210, 220 has a position defined in the tridimensional virtual space 200, and said position is defined by at least one triad of coordinates (x, y, z) centered on said first triad of axes X, Y, Z.
  • The at least one triad of coordinates (x, y, z) of each object 210, 220 is stored in advance in the memory 120 so as to create a sort of furnished tridimensional virtual space.
  • For each object 210, 220, an image of the object itself is furthermore stored in the memory 120.
  • Such an image can be either a traditional two-dimensional image or a tridimensional image, that is to say one in which the graphic appearance of the object recalls a photograph or model that appears different according to the direction from which the object itself is observed.
  • Within the tridimensional virtual space 200 there is a virtual subject 230 on which a second triad of axes X', Y', Z' is centered.
  • The second triad of axes X', Y', Z' defines a position of the virtual subject 230 and is also stored in the memory 120.
  • In a first case, the second triad of axes X', Y', Z' is fixed.
  • In this case, the virtual subject 230 cannot move inside the tridimensional virtual space 200.
  • The virtual subject 230 has its own direction of observation.
  • The direction of observation, represented in Figure 1 by the line K, is defined, for example, by a zenithal and an azimuthal angle centered on the second triad of axes X', Y', Z'.
  • The virtual subject 230 points along a direction defined by the line K and can "see" an object 210, 220, technically the first object along the line K, that is to say the object at the minimum distance from its position, expressed relative to the first triad of axes X, Y, Z.
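  • The pointing geometry just described lends itself to a short worked example: the zenithal and azimuthal angles give a unit direction for the line K, and the object "seen" is the stored object nearest to the subject among those lying close to that line. The tolerance value and function names below are illustrative assumptions; the patent does not prescribe a particular ray-casting formula.

```python
import math

def view_direction(zenith: float, azimuth: float) -> tuple:
    """Unit vector of the line K from zenithal/azimuthal angles (radians),
    centered on the subject's triad X', Y', Z'."""
    return (math.sin(zenith) * math.cos(azimuth),
            math.sin(zenith) * math.sin(azimuth),
            math.cos(zenith))

def pointed_object(origin, direction, objects, tolerance=0.25):
    """Return the id of the first object along K: the minimum distance along
    the ray among objects whose perpendicular distance to K is below tolerance."""
    best_id, best_t = None, math.inf
    for obj_id, (x, y, z) in objects.items():
        rel = (x - origin[0], y - origin[1], z - origin[2])
        t = sum(r * d for r, d in zip(rel, direction))        # projection on K
        if t < 0:
            continue                                          # behind the subject
        closest = tuple(o + t * d for o, d in zip(origin, direction))
        perp = math.dist((x, y, z), closest)                  # distance to the line K
        if perp <= tolerance and t < best_t:
            best_id, best_t = obj_id, t
    return best_id

# Objects 210, 220 stored as coordinates (x, y, z) in the first triad X, Y, Z
objects = {210: (0.0, 0.0, 5.0), 220: (0.1, 0.0, 2.0)}
k = view_direction(zenith=0.0, azimuth=0.0)          # looking along +Z
print(pointed_object((0.0, 0.0, 0.0), k, objects))   # -> 220, the nearer object along K
```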
  • In another case, the second triad of axes X', Y', Z' is movable.
  • In this case, the virtual subject 230 can move inside the tridimensional virtual space 200. The line K, representing the observation direction of the virtual subject 230, therefore moves rigidly with the position of the second triad of axes X', Y', Z'.
  • The position of the second triad of axes X', Y', Z' is therefore no longer statically stored in the memory 120.
  • Said position of the second triad of axes X', Y', Z' relative to the first triad of axes X, Y, Z is updated, by means of a new loading into the memory 120, each time the virtual subject 230 is made to move by the user 300 through said remote user interface electronic device 140.
  • In FIG. 3, a virtual subject 230 is shown pointing at a specific object 210 through the line K.
  • The virtual subject 230 frames an area 250 around the line K.
  • Said area can be conceived as a cone whose vertex lies at the origin of the line K.
  • The area 250 is transmitted as a time-variant electronic image by the electronic server computer 110 towards the remote user interface electronic device 140 of the user 300.
  • The remote user interface electronic device 140 of the user preferably comprises a screen for displaying the images coming at least from said electronic server computer 110, and at least one user-interface means, such as a button, pad, joystick or other, suitable for causing selection commands to be sent.
  • Said selection commands are transmitted by the user interface electronic device 140 towards the electronic server computer 110 by means of any known type of coding, preferably over an at least partially wireless transmission channel.
  • Said remote user interface electronic device 140 is advantageously realized in the form of spectacles with a visor, or as a portable electronic device of the smartphone type provided with a screen for displaying said tridimensional virtual environment.
  • Said portable electronic device of the smartphone type can be integrated in a helmet or head-wearable support. This advantageously allows for hands-free applications.
  • The remote user interface electronic device 140 can be a device made of either a single body or several separate bodies.
  • The remote user interface electronic device 140 in any case comprises at least one orientation and/or spatial-position sensor capable of causing zenithal and/or azimuthal rotation commands to be sent towards the electronic server computer 110 in real time and according to the motion of a part of the body of the user 300.
  • Said remote user interface electronic device 140 can be a device with a substantially helmet-shaped form and, more generally, takes the form of a device wearable on the head of a user 300 so that its screen is arranged in front of the eyes of the user. More preferably, even though not necessarily, the screen has such a size as to occupy substantially the entire field of vision of the user 300, so as to advantageously ensure greater immersion in the augmented reality.
  • Said remote user interface electronic device 140 can be a pair of spectacles with an integrated display, or a cellular phone, or an electronic device capable of creating a holographic image of said tridimensional virtual environment.
  • The remote user interface electronic device 140 transmits the data for adapting the observation direction of the line K towards the electronic server computer 110 in real time with the motion detected by the zenithal and/or azimuthal orientation sensor; likewise, in real time with the motion of the user 300, said remote user interface electronic device 140 receives from said electronic server computer 110 an electric feedback signal that preferably causes the image represented on the screen of the remote user interface electronic device 140 to be adapted according to the motion detected by the sensor.
  • The remote user interface electronic device 140 can have sensors of a known type suitable for detecting the inclination and the rotation of the head and translating them into a set of electric signals to be transmitted substantially in real time towards the electronic server computer 110.
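  • A possible shape for this command/feedback loop on the user interface electronic device 140 is sketched below. The JSON message layout, the TCP transport and the read_orientation/render_frame helpers are assumptions made for the example; the patent only requires that orientation data be sent out and that feedback adapting the displayed image come back substantially in real time.

```python
import json
import socket
import time

def run_orientation_loop(server_addr, read_orientation, render_frame, period_s=0.02):
    """Transmit zenithal/azimuthal samples to the server computer (110) and apply
    the feedback it returns to the device's screen, in (near) real time."""
    with socket.create_connection(server_addr) as sock:
        stream = sock.makefile("rw")
        while True:
            zenith, azimuth = read_orientation()         # from the device's sensors
            message = {"type": "orientation", "zenith": zenith,
                       "azimuth": azimuth, "t": time.time()}
            stream.write(json.dumps(message) + "\n")
            stream.flush()
            feedback = json.loads(stream.readline())     # server's feedback signal
            render_frame(feedback)                       # adapt the displayed image
            time.sleep(period_s)                         # roughly 50 updates per second
```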
  • The user 300 in real life moves in a tridimensional environment, and this motion can be thought of as generated starting from a third triad of axes X", Y", Z" centered on the user 300 himself.
  • The remote user interface electronic device 140 can furthermore have accelerometric sensors suitable for detecting not only a zenithal and azimuthal rotation of the user 300 relative to the third triad of axes X", Y", Z", but also an actual motion of the user, that is a motion in space of the third triad of axes X", Y", Z", which translates in real time - through the aforementioned transmission of the set of electric signals towards and from the electronic server computer 110 - into a motion of the second triad of axes X', Y', Z' relative to the first triad of axes X, Y, Z.
  • Figure 4 illustrates in detail the behavior of the system according to the present invention when the user 300, through the remote user interface electronic device 140, selects an object 210, 220 inside the tridimensional virtual space 200.
  • When the user has identified an object 220 with adaptable parameters of interest (for example a product of a virtual mall), he selects it by means of the user-interface means of his own remote user interface electronic device 140 (block 1001).
  • The selection is performed through user-interface means connected to the remote electronic device by means of a cabled or wireless connection, according to any known technique.
  • This selection translates into an adaptation command for the parameter of the object.
  • The adaptation of the parameter is, for example, a reduction of the warehouse quantity of the product, corresponding to a purchase (block 1005).
  • Before the parameter is adapted in the memory 120 according to the selection performed by the user, it is possible to configure the system so that the electronic server computer 110, after loading from the memory the object 220 pointed at by the line K (block 1003), sends a confirmation request message towards the remote user interface electronic device 140 that generated the request (block 1004), leaving once more to the user 300 the task of using the user-interface means of his remote user interface electronic device 140 to confirm the selection, thus sending towards said electronic server computer 110 a command confirming the selection of the object 220, which triggers the adaptation of the parameter and then an update of the status of the object 220, with the adapted parameter, in the memory 120.
  • The parameter to be adapted can be any type of datum, such as a price or a residual quantity datum.
  • The tridimensional virtual space 200 can be adapted according to the selection performed by the user. Therefore, the parameter adapted through the selection on the user-interface means can also be a parameter of the appearance/disappearance type for the object 220. This happens, for example, in the case of a purchase of the last product in stock in a virtual mall.
  • The feedback data preferably correspond - in case the object 220 is a product to be purchased - to a message confirming the introduction of said object 220 into an electronic shopping cart.
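  • Read as a whole, the flow of Figure 4 can be summarized by a small server-side handler such as the sketch below, which mirrors blocks 1001, 1003, 1004 and 1005 with hypothetical names; it is an illustration of the described sequence, not the patent's own implementation.

```python
def handle_selection(memory, pointed_id, confirm, notify):
    """Sketch of the selection flow of Figure 4 (blocks 1001-1005).
    `memory` maps object ids to status dicts; `confirm` and `notify` stand in
    for the round trips to the user interface electronic device (140)."""
    obj = memory.get(pointed_id)                   # 1003: load the pointed object
    if obj is None or obj["residual_quantity"] <= 0:
        return False
    if not confirm(obj):                           # 1004: confirmation request
        return False
    obj["residual_quantity"] -= 1                  # 1005: adapt the parameter (purchase)
    notify({"object": pointed_id, "status": "added_to_cart"})   # feedback message
    return True

# Example round trip, with the user granting the confirmation
memory = {220: {"price": 19.90, "residual_quantity": 1}}
handle_selection(memory, 220, confirm=lambda obj: True, notify=print)
print(memory[220]["residual_quantity"])  # -> 0: last item sold, the object may now disappear
```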
  • The sending of an adaptation command for the status of an object 220 triggers the start of the reproduction of a tridimensional movie on the remote user interface electronic device 140.
  • Said tridimensional movie is loaded in advance in the memory of the electronic server computer 110.
  • The electronic server computer 110 streams the movie towards said remote user interface electronic device 140, and the movie is reproduced on the screen of the latter, taking into account in real time the zenithal and azimuthal rotation of the user 300 relative to the third triad of axes X", Y", Z".
  • A virtual subject, in this case, can also point at other virtual subjects, so that the adaptation command of a status can refer not only to an object 220 whose parameters can be adapted, but also to a further virtual subject.
  • When the electronic computer 110 receives said command, it sends a request for adapting the status towards another virtual subject, and therefore retransmits a command towards a second remote user interface electronic device 140.
  • The electronic computer 110 therefore calculates in real time the positioning of a plurality of virtual subjects in a single environment, wherein each virtual subject is provided with its own point of view K, managed independently from those of the others, and is electronically univocally associated with a respective remote user interface electronic device 140 provided to one of the users 300 of the system.
  • The electronic communication among a plurality of virtual subjects corresponding to the same number of real subjects can, in a preferred and non-limiting embodiment, translate into the set-up of an audio/video communication channel at least between two remote user interface electronic devices 140 of two users who have decided to communicate with each other by means of a command for establishing an audio/video communication channel issued through their own remote electronic devices.
  • If the virtual environment is, for example, that of a supermarket, many users can meet together in the virtual environment and start a private communication with one another, interacting virtually even though they are in fact in different real environments.
  • With the system 100 according to the present invention it is possible, for example, to virtually meet for shopping together, even while being physically in different places which, above all, do not coincide with the supermarket itself.
  • The electronic communication among a plurality of virtual subjects in the system 100 according to the present invention advantageously takes place either by means of an electronic chat or through the transmission and reception of audio/video streams.
  • A first virtual subject 230 initially points, by means of the line K, at a second virtual subject 230.
  • The user of the first virtual subject, by means of his own remote user interface electronic device, transmits a command requesting the activation of an electronic communication towards the electronic computer 110.
  • The latter receives said command requesting the activation of said electronic communication and detects the direction pointed at by the line K of the first virtual subject, electronically identifying to which virtual subject 230 the request for electronic communication is addressed.
  • The electronic computer 110 transmits towards the second virtual subject 230 thus identified an electronic signal requesting the activation of an electronic communication with the first virtual subject 230 which, upon confirmation by the second virtual subject 230, causes the adaptation of the operating status of both remote user interface electronic devices 140 associated with the first and second virtual subjects 230, so as to activate on their screens an electronic chat through which the two real users can exchange text messages and/or images.
  • Similarly, for audio/video communication, the user of the first virtual subject, by means of his own remote user interface electronic device, transmits a command requesting the activation of an electronic communication towards the electronic computer 110.
  • The latter receives said command requesting the activation of said electronic communication and detects the direction pointed at by the line K of the first virtual subject, electronically identifying to which virtual subject 230 the request for electronic communication is addressed.
  • The electronic computer 110 transmits towards the second virtual subject 230 thus identified an electronic signal requesting the activation of an electronic communication with the first virtual subject 230 which, upon confirmation by the second virtual subject 230, causes the adaptation of the operating status of both remote user interface electronic devices 140 associated with the first and second virtual subjects 230, so as to activate a transmission of an audio and/or video communication stream between the users 300 associated with the first and second virtual subjects 230.
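  • The chat and audio/video activation sequence can likewise be sketched as a request/confirm handler on the electronic computer 110; the message flow below uses assumed names and callback-style stand-ins for the round trips to the devices 140.

```python
def request_communication(subjects, caller_id, pointed_id, ask_confirmation, open_channel):
    """Sketch of the activation sequence for a chat or audio/video channel.
    `subjects` maps virtual-subject ids to device ids; `ask_confirmation` and
    `open_channel` stand in for the exchanges with the devices (140)."""
    caller_device = subjects.get(caller_id)
    target_device = subjects.get(pointed_id)       # subject identified along the line K
    if caller_device is None or target_device is None:
        return False
    # electronic signal requesting activation, sent towards the pointed subject
    if not ask_confirmation(target_device, from_device=caller_device):
        return False
    # on confirmation, adapt the operating status of both devices: open the channel
    open_channel({caller_device, target_device})
    return True

# Example: subject 1 points at subject 2 along K, and subject 2 accepts
subjects = {1: "device-A", 2: "device-B"}
request_communication(subjects, 1, 2,
                      ask_confirmation=lambda device, from_device: True,
                      open_channel=print)          # -> {'device-A', 'device-B'}
```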
  • By streaming is meant a stream of data, generated by sensors and transducers arranged on the remote user interface electronic device 140, that is transmitted substantially in real time between the two or more remote user interface electronic devices 140 by passing through the electronic computer 110.
  • The transmission of the audio and/or video stream can also be performed among a plurality of subjects simultaneously.
  • A virtual subject 230 can first point, by means of the line K, at a second virtual subject 230, then at a third virtual subject 230, then at a fourth one, and so on, thereby creating a sequence of virtual subjects that are addressees of a simultaneous audio/video electronic communication.
  • The electronic computer receives said command requesting the activation of said electronic communication and detects the direction pointed at by the line K of the first virtual subject, electronically identifying to which virtual subject 230 the request for electronic communication is addressed. The electronic computer 110 then transmits towards the second, third, fourth, ... virtual subject 230 thus identified an electronic signal requesting the activation of an electronic communication with the first virtual subject 230 which, upon confirmation by the respective subjects, causes the adaptation of the operating status of all the remote user interface electronic devices 140 associated with the first, second, ... virtual subjects 230, so as to activate a transmission of an audio and/or video communication stream among the users 300 associated with the virtual subjects 230 involved in the request.
  • The tridimensional virtual space 200 can be represented in the form of a cockpit of a vehicle.
  • In this case there are objects 220 whose status can be adapted, and said objects are portions of said cockpit which can optionally be customized.
  • A non-exhaustive list of objects 220 comprises: steering wheels of adaptable form and/or material, colors and finishes and/or upholstery materials of the vehicle and/or of its seats, and the presence and/or absence of components such as navigators, openable roofs, etc.
  • The adaptable status stored in the memory of the electronic computer comprises one or more distinct parameters of presence, form and color.
  • Said command for adapting the status of the object 220 causes a new status and/or appearance of the object 220 to be loaded from the memory of the electronic computer and shown in the virtual environment, allowing the user to adapt the cockpit of the vehicle in real time and with a completely fluid tridimensional view, thus living a more vivid experience, whereas traditional vehicle configurators only allow an adaptation of the optional items and features of the cockpit in 2D, or in 3D manually rotatable, but not with a continuous adaptation of the viewing perspective according to the motion of the user interface electronic device.
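  • To make the cockpit-configurator case concrete, the adaptable status of such an object 220 could be held as a small record of presence, form and color parameters, with the adaptation command overwriting the stored values before the new appearance is shown in the virtual cockpit; the field names below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class CockpitPartStatus:
    """Hypothetical adaptable status of a cockpit portion (object 220)."""
    part: str        # e.g. "steering_wheel", "roof", "navigator"
    present: bool    # presence/absence of the component
    form: str        # shape variant
    color: str       # color or finish

def apply_adaptation(memory, part, **changes):
    """Adaptation command: load the stored status, adapt it and store it back,
    so that the new appearance can be shown in the virtual environment."""
    status = memory[part]
    for field_name, value in changes.items():
        setattr(status, field_name, value)
    memory[part] = status
    return status

memory = {"steering_wheel": CockpitPartStatus("steering_wheel", True, "round", "black")}
apply_adaptation(memory, "steering_wheel", color="red", form="flat_bottom")
```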
  • In another case, the tridimensional virtual environment is represented by the cockpit of an aircraft, for example a helicopter or a plane.
  • Said command for adapting the status of the object 220 is capable of adapting one or more technical functions of the aircraft, such as, by way of non-limiting example, throttle, flap, landing-gear, autopilot and transponder controls.
  • The system according to the present invention thus becomes a particular aid for the remote assistance, for example, of a flight crew, including a pilot in difficulty.
  • The tridimensional virtual environment can be the one captured in a real aircraft, wherein the images that appear on the user interface electronic device 140 are images captured in real time by a plurality of video cameras arranged within the aircraft itself.
  • The system according to the present invention thus advantageously allows true remote assistance to be provided to the pilot in case problems arise during the flight. In this last case, therefore, the system according to the present invention acts as a true electronic flight aid.
  • The latency in the transmission of the video streams from the plurality of video cameras towards the user interface electronic device 140 becomes particularly important, and shall therefore be reduced as much as possible.
  • By latency is meant the transmission delay between the time instant at which the images or the video stream are captured by the plurality of video cameras and the time instant at which said images or video stream are represented on the screen of the user interface electronic device 140.
  • Said latency is preferably lower than 1 s, and even more preferably lower than 100 ms. This advantageously allows an actual bidirectional command-response interface in real time, applicable also to real applications.
  • The signal transmitted towards the user interface electronic device 140 is protected by means of FEC techniques or of interleaving.
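  • As a hedged illustration of the interleaving option mentioned above, a simple block interleaver spreads a burst of channel errors across distant positions so that a forward-error-correcting code can repair them; the table depth and byte-level granularity are arbitrary choices made for this example and are not specified by the patent.

```python
def interleave(data: bytes, rows: int) -> bytes:
    """Write the payload row by row into a `rows`-deep table and read it out
    column by column; a short channel burst then lands on bytes from different rows."""
    cols = -(-len(data) // rows)                  # ceiling division
    padded = data.ljust(rows * cols, b"\x00")     # simple zero padding for the sketch
    table = [padded[r * cols:(r + 1) * cols] for r in range(rows)]
    return bytes(table[r][c] for c in range(cols) for r in range(rows))

def deinterleave(data: bytes, rows: int) -> bytes:
    """Inverse permutation, applied at the receiver before FEC decoding."""
    cols = len(data) // rows
    table = [data[c * rows:(c + 1) * rows] for c in range(cols)]
    return bytes(table[c][r] for r in range(rows) for c in range(cols))

payload = bytes(range(12))
assert deinterleave(interleave(payload, rows=3), rows=3) == payload
```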
  • The virtual tridimensional space can thus become an actual environment of interaction among a plurality of subjects, wherein each of them can hear, and make his voice or images heard by, the other subjects, a function typical of the real environment in which each of us can hear voices and see a plurality of people close to him, interactively cooperating.
  • The tridimensional virtual space therefore becomes that actual area wherein it is possible to make purchases.
  • The predetermined aims are thus achieved: obtaining a tridimensional virtual space wherein a plurality of users can purchase objects, or view products and/or services in a more engaging way, together with the possibility of organizing a true virtual space within which to construct an environment where a plurality of subjects can interact, also simultaneously.
  • With the system according to the present invention it is therefore possible to create a true virtual mall.

Abstract

The invention concerns an electronic system (100) of interaction between a user (300) and a tridimensional virtual space (200), comprising means for creating a virtual tridimensional space within which said user can move and/or move the point of view of a virtual subject (230), and means for defining and moving a point of view (K) of said virtual subject (230) within said virtual tridimensional space (200), and further comprising: at least one computer, electronic device, server or computer network (110) where said tridimensional space (200) is created and stored; and at least one user interface electronic device (140), which transmits and receives at least electronic data on the motion of said point of view (K) with said at least one computer, electronic device, server or computer network (110). Within said virtual tridimensional space (200), said computer, electronic device, server or computer network (110) stores a plurality of objects defined on a plurality of points relative to a reference system (X', Y', Z') of said virtual tridimensional space (200); and said means for defining and moving said point of view store, on said at least one computer, electronic device, server or computer network (110), a variation or adaptation of a status of at least one object of said plurality of objects (210, 220) according to a selection or adaptation command for at least one object of said plurality of objects (210, 220) performed by said user (300) through said at least one user interface electronic device (140), and transmit, towards said at least one user interface electronic device, an adaptation code of said status of said selected object (210, 220), said adaptation code of said status causing a variation in the operation of said at least one user interface electronic device (140).
PCT/IB2016/050250 2016-01-19 2016-01-19 Système interactif à distance de réalité augmentée et procédé associé WO2017125783A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP16704048.4A EP3405853A1 (fr) 2016-01-19 2016-01-19 Système interactif à distance de réalité augmentée et procédé associé
US16/071,437 US20210208763A1 (en) 2016-01-19 2016-01-19 Remote interactive system of augmented reality and associated method
CN201680079481.7A CN108475118A (zh) 2016-01-19 2016-01-19 增强现实的远程交互式系统及相关方法
PCT/IB2016/050250 WO2017125783A1 (fr) 2016-01-19 2016-01-19 Système interactif à distance de réalité augmentée et procédé associé

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2016/050250 WO2017125783A1 (fr) 2016-01-19 2016-01-19 Système interactif à distance de réalité augmentée et procédé associé

Publications (1)

Publication Number Publication Date
WO2017125783A1 true WO2017125783A1 (fr) 2017-07-27

Family

ID=55349887

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/050250 WO2017125783A1 (fr) 2016-01-19 2016-01-19 Système interactif à distance de réalité augmentée et procédé associé

Country Status (4)

Country Link
US (1) US20210208763A1 (fr)
EP (1) EP3405853A1 (fr)
CN (1) CN108475118A (fr)
WO (1) WO2017125783A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11321768B2 (en) * 2018-12-21 2022-05-03 Shopify Inc. Methods and systems for an e-commerce platform with augmented reality application for display of virtual objects

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6411266B1 (en) * 1993-08-23 2002-06-25 Francis J. Maguire, Jr. Apparatus and method for providing images of real and virtual objects in a head mounted display
US20150309705A1 (en) * 2014-04-28 2015-10-29 Invodo, Inc. System and method of three-dimensional virtual commerce environments

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9508146B2 (en) * 2012-10-31 2016-11-29 The Boeing Company Automated frame of reference calibration for augmented reality
CN104883556B (zh) * 2015-05-25 2017-08-29 深圳市虚拟现实科技有限公司 基于增强现实的三维显示方法和增强现实眼镜


Also Published As

Publication number Publication date
US20210208763A1 (en) 2021-07-08
EP3405853A1 (fr) 2018-11-28
CN108475118A (zh) 2018-08-31

Similar Documents

Publication Publication Date Title
US11484790B2 (en) Reality vs virtual reality racing
JP6524917B2 (ja) 画像表示装置及び画像表示方法
JP6642432B2 (ja) 情報処理装置及び情報処理方法、並びに画像表示システム
TWI540534B (zh) 虛擬導覽控制系統與方法
KR102516096B1 (ko) 정보 처리 시스템 및 정보 처리 방법
CN111716365B (zh) 基于自然行走的沉浸式远程交互系统及方法
US20210069894A1 (en) Remote control system, information processing method, and non-transitory computer-readable recording medium
US20190377474A1 (en) Systems and methods for a mixed reality user interface
WO2016079470A1 (fr) Système et procédé d'informations et de divertissement de réalité mélangée
WO2015095507A1 (fr) Système basé sur un emplacement pour partager un contenu de réalité augmentée
WO2017064926A1 (fr) Dispositif de traitement d'information et procédé de traitement d'information
KR102190072B1 (ko) 콘텐츠 발견
US11302051B2 (en) Systems configured to control digital characters utilizing real-time facial and/or body motion capture and methods of use thereof
US20210208763A1 (en) Remote interactive system of augmented reality and associated method
JP2020181264A (ja) 画像生成装置、画像表示システム、および情報提示方法
KR20180122869A (ko) 3차원 영상 처리 방법 및 장치
JP7038217B2 (ja) 少なくとも第1の人物と第2の人物との間でバーチャル会議を行う方法およびシステム
US20230092395A1 (en) Physical object integration with extended reality environments
US10940387B2 (en) Synchronized augmented reality gameplay across multiple gaming environments
WO2023032264A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
Inoue et al. Enhancing bodily expression and communication capacity of telexistence robot with augmented reality
WO2018216327A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
CN112383345A (zh) 一种分布式遥控装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16704048

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE