WO2012059781A1 - System and method for providing a virtual representation - Google Patents

System and method for providing a virtual representation

Info

Publication number
WO2012059781A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual space
space
source
constraints
virtual
Prior art date
Application number
PCT/IB2010/003237
Other languages
English (en)
Inventor
Maarten Aerts
Donny Tytgat
Sammy Lievens
Original Assignee
Alcatel Lucent
Priority date
Filing date
Publication date
Application filed by Alcatel Lucent filed Critical Alcatel Lucent
Priority to PCT/IB2010/003237 (published as WO2012059781A1)
Publication of WO2012059781A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • H04N7/157Conference systems defining a virtual conference space and using avatars or agents
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/57Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
    • A63F2300/572Communication between players during game play of non game information, e.g. e-mail, chat, file transfer, streaming of audio and streaming of video
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6623Methods for processing data by generating or executing the game program for rendering three dimensional images for animating a group of characters
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6684Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dynamically adapting its position to keep a game object in its viewing frustrum, e.g. for tracking a character or a ball
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00Indexing scheme for image generation or computer graphics
    • G06T2210/61Scene description

Definitions

  • This invention generally relates to virtual reality. More particularly, this invention relates to providing a virtual representation.
  • Typical applications include presenting a virtual space or environment that an individual can observe on a display screen.
  • A visual representation of a character or individual allows the person in the real world to interact virtually with what is happening in the virtual space.
  • Many video games have a virtual-world type of display in which a character in the game moves about within the virtual space based upon input to a game controller from an individual playing the game.
  • More complicated systems have been proposed, including virtual reality suits that have sensors for detecting motion of different portions of an individual's body. These systems allow actual, physical motion to be well represented by corresponding motion in the virtual environment.
  • A limitation is that there is typically no correspondence between physical objects in the real world and virtual objects in the virtual space. When an individual sits on a physical chair, for example, the representation of that individual in the virtual world typically will not be sitting on any particular object in the virtual world.
  • An exemplary method of providing a virtual representation that is consistent with the physical space being represented facilitates physically interacting with a virtual space.
  • The exemplary method includes determining a plurality of constraints that correspond to a plurality of limitations that exist in a source space.
  • A virtual space layout is determined from the constraints such that the virtual space layout satisfies each of the constraints.
  • A warping of source space coordinates to virtual space coordinates is determined such that the source space coordinates have corresponding virtual space coordinates in the determined virtual space layout while satisfying the determined constraints.
  • An exemplary system for providing a virtual representation includes at least one processor that is configured to determine a plurality of constraints that correspond to a plurality of limitations that exist in a source space.
  • The processor determines a virtual space layout from the constraints such that the virtual space layout satisfies each of the constraints.
  • The processor determines a warping of source space coordinates to virtual space coordinates such that the source space coordinates have corresponding virtual space coordinates in the determined virtual space layout while satisfying the determined constraints.
  • Figure 1 schematically illustrates a system and methodology for providing a virtual representation designed according to an embodiment of this invention.
  • Figure 2 schematically illustrates an example plurality of source spaces and a corresponding virtual space.
  • Figure 3 schematically illustrates one of the source spaces from the example of Figure 2 with corresponding source space coordinates.
  • Figure 4 schematically illustrates a portion of the virtual space from the example of Figure 2 including virtual space coordinates that correspond to the source space coordinates of Figure 3.
  • A disclosed example method and system for providing a virtual representation includes mapping a source space, such as a physical space, onto a virtual space to provide consistency between the spaces and facilitate physically interacting with the virtual world.
  • Objects in the physical environment have a corresponding representation in the virtual environment to provide a more consistent and realistic representation of the source space within the virtual space.
  • One feature of the disclosed example is that it provides a single virtual representation of more than one source space, so that individuals in different physical spaces can virtually interact with each other and with the virtual space even though they are physically located remotely from each other.
  • FIG. 1 schematically illustrates a system 20 for providing a virtual representation.
  • A teleconferencing application is considered within this description.
  • The discussed example involves more than one physical space within which participants in the teleconference are located.
  • Each space is considered a source space in this description as it is a source of information for configuring the virtual space within which all of the participants can be virtually represented.
  • The example system 20 of Figure 1 includes at least one camera 22 that obtains information regarding a physical source space.
  • The cameras 22 provide information regarding positions of walls, chairs, a table, people and other objects within each conference room.
  • The example of Figure 1 includes a video analysis module 24 that processes information from the cameras 22.
  • The video analysis module performs various functions in this example.
  • One task of the video analysis module 24 is to identify the physical structures, objects and people in each source space that will be represented by corresponding virtual representations in the eventual virtual space. Examples of such objects in a teleconferencing example include documents on a table and props used by a speaker. Such objects and people are the objects of interest that are visualized or virtually represented in the virtual space and can be referred to as being in an object of interest category.
  • Another task of the video analysis module 24 is to identify objects of another category: objects that place constraints on the virtual space based on the configuration of each source space.
  • Objects such as walls, large furniture, tables and doors fit into this category. These objects are recognized for this purpose without a concern for how they can be virtually represented in the virtual space. Rather, the information regarding such objects is used to develop constraints on the virtual space, and they can be considered to be in a constraint imposing category.
  • The objects of interest are important for virtual representation and, in some cases, are not used for developing constraints.
  • Some objects may fit in both categories. For example, more than one person cannot occupy the same virtual location. It follows that people fit within the objects of interest category (i.e., they are to be virtually represented) and the constraint imposing category (i.e., they place constraints on how the contents of the virtual space can be represented). Moveable chairs are another example of an object that fits in both categories.
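  • As an illustration only, the two categories described above could be modeled so that a single detected object can carry both roles. The sketch below is a hypothetical data model (names such as DetectedObject and Category are not from the patent):

```python
from dataclasses import dataclass
from enum import Flag, auto
from typing import Tuple


class Category(Flag):
    """Roles that an object detected by the video analysis module can play."""
    OBJECT_OF_INTEREST = auto()    # must be visualized in the virtual space
    CONSTRAINT_IMPOSING = auto()   # contributes limitations on the virtual layout


@dataclass
class DetectedObject:
    label: str
    position: Tuple[float, float, float]   # (x, y, z) in source-space coordinates
    categories: Category


# A wall only constrains the layout; a document is only visualized;
# a person (or a movable chair) plays both roles.
wall = DetectedObject("wall", (0.0, 0.0, 0.0), Category.CONSTRAINT_IMPOSING)
document = DetectedObject("document", (1.2, 0.8, 0.75), Category.OBJECT_OF_INTEREST)
person = DetectedObject("person", (2.0, 1.5, 0.0),
                        Category.OBJECT_OF_INTEREST | Category.CONSTRAINT_IMPOSING)

constraint_sources = [o for o in (wall, document, person)
                      if Category.CONSTRAINT_IMPOSING in o.categories]
```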
  • The information from the video analysis module 24 is provided to a scene composition module 26. Based on what is observed by the cameras 22 in a given source space, a scene that depicts the actual room is developed by the scene composition module 26. There are known techniques for using video analysis information to compose a scene that represents what is observed by the cameras 22, and such known techniques are used by the scene composition module 26 in this example.
  • The resulting scene composition in one example represents objects that are not used for imposing constraints on the virtual space in a description file, so that they are accounted for in the scene composition without necessarily having an impact on the way the virtual space is configured based on the constraints.
  • The objects that do impose constraints on the virtual space are part of the scene composition regarding the physical layout in the corresponding source space. Some objects that may be in both categories will be represented in the description file and used for generating the constraints on the virtual space.
  • A constraint generating module 30 receives information from the video analysis module 24.
  • The constraint generating module 30 determines a plurality of constraints that correspond to a plurality of limitations that exist in each source space (e.g., a physical conference room).
  • The constraint generating module 30 utilizes the information regarding the objects in the constraint imposing category from the video analysis module 24 to generate particular constraints that are useful for developing an actual virtual representation of that which is present or occurring in the source spaces.
  • One constraint is that each object in a source space is represented by only one corresponding virtual object in a virtual space.
  • The constraint generating module 30 would utilize this constraint and the information from the video analysis module 24 to establish an appropriate number of physical objects to be displayed in the virtual space, such as a number of individuals, a number of pieces of furniture and a number of objects hanging on the walls.
  • Another constraint in this example requires that only an occupied area in a source space can be warped to an occupied area in the virtual space.
  • An occupied area in this example is an area that is the location of a physical object that exclusively occupies that location.
  • A wall is one example of an object or structure that exclusively occupies the location in which it is situated.
  • Similarly, only an unoccupied area in the source space can be warped to an unoccupied area in the virtual space.
  • This constraint prevents the virtual representation from displaying an individual inside a wall or standing in the middle of a table, for example.
  • Information from the video analysis module 24 allows the constraint generating module 30 to recognize occupied space versus unoccupied space and to generate a corresponding constraint or set of constraints.
  • Another constraint in this example requires that objects that cannot occupy the same location in a source space cannot occupy the same location in the virtual space. This constraint is useful, for example, for preventing a virtual representation of an individual from being at the same location in the virtual space as a wall.
  • Another example constraint requires that a relationship between selected objects in each source space is represented by a corresponding relationship in the virtual space. This constraint allows for accurately representing an individual sitting on a chair. When an individual in a physical conference room is sitting on a chair near a table, this constraint facilitates having a virtual representation of that individual sitting on a virtual representation of that chair near a virtual representation of that table. This constraint prevents the individual from being displayed sitting on the table or sitting on nothing within the virtual space.
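  • To make the constraint examples concrete, the sketch below expresses them as penalty terms evaluated against a candidate placement of objects in the virtual space. The encoding (grid cells, distance thresholds, function names) is an assumption for illustration; the patent does not prescribe any particular encoding.

```python
from typing import Dict, Set, Tuple

Coord = Tuple[float, float]   # 2D for brevity; a vertical coordinate could be added


def occupancy_penalty(placement: Dict[str, Coord],
                      occupied_cells: Set[Tuple[int, int]],
                      cell_size: float = 0.5) -> float:
    """Penalize warping an object of interest into an area that is
    exclusively occupied in the virtual space (e.g., inside a wall)."""
    penalty = 0.0
    for _, (x, y) in placement.items():
        cell = (int(x // cell_size), int(y // cell_size))
        if cell in occupied_cells:
            penalty += 1.0
    return penalty


def double_occupancy_penalty(placement: Dict[str, Coord],
                             min_separation: float = 0.4) -> float:
    """Objects that cannot share a location in the source space
    must not share a location in the virtual space."""
    names = list(placement)
    penalty = 0.0
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            (x1, y1), (x2, y2) = placement[names[i]], placement[names[j]]
            if ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 < min_separation:
                penalty += 1.0
    return penalty


def relationship_penalty(placement: Dict[str, Coord],
                         pairs: Set[Tuple[str, str]],
                         max_distance: float = 0.6) -> float:
    """Preserve relationships such as 'person sits on chair near table':
    related objects must stay close together in the virtual space."""
    penalty = 0.0
    for a, b in pairs:
        (x1, y1), (x2, y2) = placement[a], placement[b]
        if ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 > max_distance:
            penalty += 1.0
    return penalty
```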
  • The constraints used by the constraint generating module 30 in one example are predetermined depending on the particular application.
  • Generally, the constraints will be like those described above, establishing limitations within the virtual space that facilitate accurately representing corresponding limitations from the physical or source space.
  • Each constraint is unique to the current conditions in each source space.
  • The constraints are continuously updated to address any changes in any of the source spaces, to provide a current virtual representation that is consistent with the current conditions in the real, physical world.
  • The generated constraints from the constraint generating module 30 are provided to an optimization module 32.
  • The constraints can be considered to establish an optimization problem, and the optimization module 32 solves that problem.
  • The constraints are continuously updated, and so is the optimization.
  • The optimization module 32 provides an optimal virtual space layout that includes object placement for furniture, walls and other items in the source space.
  • The optimization module 32 in this example uses coordinates of each source space from the video analysis module 24.
  • The optimization module determines a coordinate mapping to map the source space coordinates to corresponding virtual space coordinates.
  • The optimization solution provides a best fit between the virtual space coordinates and the coordinates of the source spaces, given the constraints provided to the optimization module 32.
  • A layout database 38 includes a plurality of parameterized room layouts that can be used to establish at least some of the virtual space characteristics.
  • The optimization module 32 in this example may select the room layout from the database 38 that best suits the optimization function. For example, the optimization module 32 evaluates the cost function for each of the layouts and selects the one that corresponds to the minimum global cost. At the same time, optimal parameters for the room (e.g., number of chairs, position of the table) may be determined.
  • Alternatively, a user may be provided with the ability to select a desired room layout from the database 38, in which case the optimization module uses that layout as a basis for developing the coordinate mapping to the virtual space.
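  • One way to read this layout-selection step is as a search over the parameterized layouts in the database for the minimum global cost, with an optional user-selected layout bypassing the search. The sketch below is a hypothetical arrangement; the cost evaluation and per-layout parameter optimization are assumed to be supplied by the optimization module.

```python
def select_layout(layout_database, constraints, user_choice=None):
    """Pick a parameterized room layout for the virtual space.

    layout_database: iterable of (layout, parameter_optimizer) pairs, where
        parameter_optimizer(layout, constraints) returns (parameters, cost).
    constraints: the constraint set produced by the constraint generating module.
    user_choice: optional (layout, parameter_optimizer) pair pre-selected by the
        user, which is then only parameter-optimized rather than compared
        against the whole database.
    """
    if user_choice is not None:
        layout, optimize = user_choice
        params, cost = optimize(layout, constraints)
        return layout, params, cost

    best = None
    for layout, optimize in layout_database:
        # e.g., optimal number of chairs and table position for this layout
        params, cost = optimize(layout, constraints)
        if best is None or cost < best[2]:
            best = (layout, params, cost)
    return best
```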
  • A warping module 34 receives information from the optimization module 32 and the scene composition module 26.
  • The warping module 34 translates the coordinates for the source spaces to coordinates in the virtual space.
  • The objects of interest in each source space are translated to a virtual space representation.
  • The coordinates of all items in the source space are warped to virtual coordinates of the virtual representations of the corresponding items in the virtual space.
  • The illustrated feedback between the optimization module 32 and the warping module 34 accommodates iterative, dynamic propagation of the virtual representation output.
  • The output at one time should be similar to the output at a subsequent time, and the disclosed example allows for smooth changes in the output.
  • The warping of a source space to the virtual space is dynamic, to preserve consistency and to avoid double occupancy of a single space while objects or persons move around within a source space. Additionally, the source space and virtual space may be altered during a session if, for example, another chair is brought into a room. Dynamically accommodating such changes in the source space provides a more consistent and realistic virtual representation.
  • The mapping from a source space (denoted by index i) to a destination virtual space (denoted by index j) can be formulated as x' = f(x, y, z), y' = g(x, y, z) and z' = h(x, y, z), for each source space i = 1, ..., S and each destination virtual space j = 1, ..., D, where:
  • S denotes the number of source spaces
  • D denotes the number of virtual spaces
  • x, y and z correspond to the three-dimensional coordinates of a physical object in a source space i and are mapped by functions f, g and h to the respective three-dimensional coordinates x', y' and z' of the virtual representation of that object in the virtual space j.
  • The functions f, g and h can be, for instance, continuous parameterized functions (e.g., piecewise spline surfaces) or discrete mappings (e.g., a lookup table) in combination with some interpolation method. For two-dimensional representations, an identity vertical mapping may be employed.
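  • As a sketch of the "discrete mapping plus interpolation" option, a lookup table holding the virtual-space coordinates of a regular source-space grid can be interpolated bilinearly, with the vertical coordinate passed through unchanged for the two-dimensional case. The grid construction at the end is invented purely for illustration; none of the names come from the patent.

```python
import numpy as np


def warp_point(x, y, grid_x, grid_y, cell_size):
    """Map a source-space point (x, y) to virtual-space coordinates (x', y').

    grid_x, grid_y: 2D arrays holding the virtual-space coordinates of the
    source-space grid nodes, i.e. a discrete sampling of the functions f and g.
    cell_size: spacing of the rectilinear source-space grid.
    The vertical coordinate z would pass through unchanged (identity mapping)
    for a two-dimensional representation.
    """
    # Locate the grid cell containing (x, y) and the fractional offsets within it.
    i = min(int(x / cell_size), grid_x.shape[0] - 2)
    j = min(int(y / cell_size), grid_x.shape[1] - 2)
    tx = x / cell_size - i
    ty = y / cell_size - j

    def bilerp(table):
        return ((1 - tx) * (1 - ty) * table[i, j] +
                tx * (1 - ty) * table[i + 1, j] +
                (1 - tx) * ty * table[i, j + 1] +
                tx * ty * table[i + 1, j + 1])

    return bilerp(grid_x), bilerp(grid_y)


# Example: a 5 m x 4 m room sampled every 0.5 m, with a mild sinusoidal bend
# standing in for the irregular virtual-space grid 140' of Figure 4.
cell = 0.5
xs, ys = np.meshgrid(np.arange(0, 5.5, cell), np.arange(0, 4.5, cell), indexing="ij")
grid_x = xs + 0.2 * np.sin(ys)          # f sampled on the grid
grid_y = ys + 0.1 * np.sin(xs)          # g sampled on the grid
print(warp_point(2.3, 1.7, grid_x, grid_y, cell))
```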
  • The disclosed example includes solving for these variables as different dynamic minimization problems for each destination virtual space.
  • The optimization function is the sum of deformation terms, consistency terms and constraint penalties in one example.
  • Deformation, for example, is a weighted, nonuniform constraint, with certain locations where the deformation must be very close to zero and other places where more deformation is allowed to occur.
  • The optimization is dynamic, so the deformation term also includes similarity over time.
  • The deformation term in this example provides a smooth representation of movement in the virtual space to correspond to actual movement in one of the source spaces. For example, it is useful to avoid abrupt changes in the position of the virtual representation of an individual or a chair in the virtual space when the corresponding individual or chair moves in the real world.
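  • A minimal sketch of how such an objective could be assembled from the terms named above, using sampled warp grids as the variables. The specific term definitions and weights are assumptions for illustration, not the patent's own formulation.

```python
import numpy as np


def deformation_energy(grid_x, grid_y, cell_size, stiffness):
    """Weighted, nonuniform deformation term: penalize deviation of the warped
    grid spacing from the rigid spacing, more heavily where stiffness is high."""
    dx = np.diff(grid_x, axis=0) - cell_size        # stretch along the x direction
    dy = np.diff(grid_y, axis=1) - cell_size        # stretch along the y direction
    return float(np.sum(stiffness[:-1, :] * dx ** 2) +
                 np.sum(stiffness[:, :-1] * dy ** 2))


def temporal_term(grid_x, grid_y, prev_x, prev_y):
    """Similarity over time: keep the warp close to the previous time step,
    so that the virtual representation changes smoothly."""
    return float(np.sum((grid_x - prev_x) ** 2 + (grid_y - prev_y) ** 2))


def consistency_term(grid_x, grid_y, anchors):
    """Consistency term: selected source-space grid nodes should land on their
    target virtual-space positions (e.g., a chair kept at the head of the table).
    anchors is a list of ((i, j), (target_x, target_y)) pairs."""
    return float(sum((grid_x[i, j] - tx) ** 2 + (grid_y[i, j] - ty) ** 2
                     for (i, j), (tx, ty) in anchors))


def optimization_cost(grid_x, grid_y, prev_x, prev_y, cell_size, stiffness, anchors,
                      constraint_penalties, w_deform=1.0, w_time=1.0,
                      w_consist=1.0, w_constraint=10.0):
    """Sum of deformation (including similarity over time), consistency and
    constraint-penalty terms. constraint_penalties is a list of callables
    evaluating the candidate warp grids."""
    cost = w_deform * deformation_energy(grid_x, grid_y, cell_size, stiffness)
    cost += w_time * temporal_term(grid_x, grid_y, prev_x, prev_y)
    cost += w_consist * consistency_term(grid_x, grid_y, anchors)
    cost += w_constraint * sum(p(grid_x, grid_y) for p in constraint_penalties)
    return cost
```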
  • The optimization module 32 also receives input such as the user preferences shown at 36. This allows a user to select scene layout design features such as coloring, lighting and other features that may be desired for a particular virtual representation.
  • The warping module 34 in this example also receives input from the scene composition module 26 for coordinating the virtual representation with the actual look of the source space.
  • A visualization or display module 40 generates display information for showing the virtual representation of the virtual space, which corresponds to the source spaces of interest, to one or more users.
  • In one example, the constraint generating module 30, the optimization module 32 and the warping module 34 comprise portions of at least one processor that is configured to perform the functions of each of those modules.
  • The configuration of the processor may comprise programming the processor, a specific hardware design for the processor, specific firmware that is part of the processor, or a combination of two or more of these.
  • In another example, the constraint generating module 30, the optimization module 32 and the warping module 34 are all embodied on a computer-readable storage medium and comprise programming that includes instructions directing a computer or processor to perform the functions of each of the modules.
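  • Read as software, the modules described above form a simple per-update pipeline. The sketch below shows one hypothetical arrangement of that flow; the parameter names stand in for the modules and are not names used by the patent.

```python
def provide_virtual_representation(camera_feeds, layout_database, user_preferences,
                                   analyze_video, compose_scene, generate_constraints,
                                   optimize_layout, warp, render):
    """One update cycle of the system of Figure 1, expressed as a composition of
    the module functions passed in as arguments."""
    analyses = [analyze_video(feed) for feed in camera_feeds]           # module 24, per source space
    scenes = [compose_scene(a) for a in analyses]                       # module 26
    constraints = [generate_constraints(a) for a in analyses]           # module 30
    layout, mapping = optimize_layout(constraints, layout_database,     # module 32
                                      user_preferences)
    virtual_scene = warp(scenes, mapping)                               # module 34
    return render(virtual_scene)                                        # module 40
```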
  • The example of Figure 1 also allows for additional input from other source spaces, shown at 44. Accordingly, the optimization module 32 and the warping module 34 are capable of receiving constraint information regarding a plurality of source spaces.
  • The resulting virtual space that is presented blends the multiple source spaces into a single virtual space. This allows the same view to be presented to the various participants on a teleconferencing call even though those participants are located in different rooms. There are other situations in which combining more than one source space into a single virtual space is also useful.
  • The teleconferencing example is provided for discussion purposes.
  • The constraint information and scene composition information from Figure 1 may be provided to other optimization and warping modules so that the source space information can be incorporated into another virtual space presentation. This is shown schematically at 46.
  • FIG. 2 schematically illustrates an example in which two source spaces that each comprise a conference room are presented as a single virtual space that allows participants in a teleconference, for example, to have a virtual experience of all being in the same room.
  • A first source space 50 is a conference room including physical walls 52, 54, 56 and 58. Each of the walls places a physical limitation on what occurs within the source space 50. Appropriate corresponding constraints provided by the constraint generating module 30 ensure that the walls 52-58 will be appropriately represented in a corresponding virtual space.
  • Within the room 50 there are stationary elements such as a table 60 and a whiteboard 62.
  • Another stationary element is a display 64 upon which a presentation of the virtual space can be observed by an individual 66 within the room 50.
  • This example also includes a plurality of chairs 68, 70, 72, 74 and 76. In this example, the chairs are positioned around the table 60 as illustrated.
  • The individual 66 cannot stand in the middle of the table 60 or walk all the way around it, because one end is against a wall.
  • The individual 66 also cannot stand or sit in the middle of any of the walls.
  • The individual 66 can move about the room, write something upon the whiteboard 62, for example, and sit on one of the chairs 68-76 that is available to that individual 66.
  • Another feature of the room 50 is a door 78. In this example, the door 78 interrupts the wall 56 to provide passage between the room 50 and an adjacent environment that is not going to be part of the virtual representation of the source space 50.
  • Cameras 22 are positioned at various locations within the room 50 to provide various perspectives on the room and to provide a complete representation of the contents of the room, for example.
  • The cameras 22 are examples of structures within the room 50 that are not considered relevant for purposes of providing a virtual representation of the room, so they are not included in the constraints, scene composition, optimization or warping that is used for providing the virtual representation in this example.
  • Another source space 80 comprises another conference room.
  • This example includes walls 82, 84, 86 and 88.
  • A door 90 provides access to the room 80 through a portion of the wall 88.
  • A table 92 and a whiteboard 96 are stationary objects within the room 80. Each of those represents space that is constantly occupied and cannot be occupied by an individual within the room, for example.
  • A display 98 is provided on the table 92 in this example. The display 98 provides the visual display of the virtual representation of the room 80 and, in this example, of the room 50.
  • Other objects in the room 80 include cameras 22 and chairs 100, 102 and 104.
  • A single virtual space 110 provides a virtual representation of the two source spaces 50 and 80.
  • The virtual space 110 provides a combined conference room presentation that allows any participant in a teleconference in either of the rooms 50 or 80 to have a virtual experience of being in a single room with all of the participants in the conference.
  • The actual layout of the virtual space 110 corresponds to the layouts of the source spaces 50 and 80 but is not an exact replication of either of them.
  • The optimization module 32 determined an optimized representation that allows both rooms to be presented together in a manner that provides some correspondence between the actual physical layout of each of the source spaces 50, 80 and the virtual space layout of the virtual space 110. Given that the two rooms have features that are not exactly compatible with each other, the optimization module 32 made some decisions regarding how to accommodate those differences.
  • A known optimization algorithm is used for this purpose in one example.
  • One example includes using a known graph cut optimization algorithm.
  • The optimization module 32 utilizes the constraint information from the constraint generating module 30 to determine the possible locations of the furniture within each room.
  • The optimization function applies a penalty if a chair position in the source space does not correspond to the same chair position in the virtual space.
  • Conversely, the optimization function applies a reward if a chair position in the physical space corresponds to the same chair position in the virtual space.
  • For example, the chair 68 is at the head of the table 60 in the source space 50.
  • The corresponding chair 68' in the virtual space 110 is at the head of a table 60' that represents the table 60 within the virtual space 110. If the virtual chair representation 68' were at a different location relative to the virtual representation of the table 60', that would result in a penalty in the optimization function.
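  • A toy sketch of this penalty/reward idea for relative placement (the tolerance value and function name are invented for illustration):

```python
def relative_placement_penalty(chair_xy, table_xy, chair_xy_virtual, table_xy_virtual,
                               tolerance=0.5):
    """Penalize a layout in which a chair's position relative to its table in the
    virtual space differs from its position relative to the table in the source
    space (e.g., chair 68 should stay at the head of table 60')."""
    source_offset = (chair_xy[0] - table_xy[0], chair_xy[1] - table_xy[1])
    virtual_offset = (chair_xy_virtual[0] - table_xy_virtual[0],
                      chair_xy_virtual[1] - table_xy_virtual[1])
    deviation = ((source_offset[0] - virtual_offset[0]) ** 2 +
                 (source_offset[1] - virtual_offset[1]) ** 2) ** 0.5
    # Reward (zero penalty) when the relative placement is preserved within tolerance.
    return 0.0 if deviation <= tolerance else deviation
```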
  • Each of the conference rooms has a display 64, 98.
  • The virtual space 110 does not include a representation of either display because the intent of the virtual representation is to provide a sense that all participants in the conference call are looking at each other rather than at a display screen.
  • The virtual space 110 includes a single whiteboard 120 that corresponds to the whiteboards 62 and 96 of the two source spaces.
  • The position of the whiteboard 120 relative to the other contents of the room 80 is more consistent than the position of the whiteboard 62 relative to the contents of the room 50.
  • This is a decision made by the optimization module 32 to provide some consistency between the perspective from which individuals look toward the whiteboard in the physical space and the way those same individuals are represented in the virtual space.
  • The warping performed by the warping module 34 is dynamically adjusted to accommodate motion within either of the source spaces.
  • The combination of the two source spaces can be appreciated by considering the space within the outline 130 in the virtual space 110. That portion of the virtual space corresponds to the mapping of the source space 50 into the virtual space 110.
  • The portion of the illustration to the right (according to the drawing) of the line 132 represents the information that corresponds to the mapping of the source space 80 into the virtual space 110.
  • The other space shown at 134 and the portion of the table shown at 136 correspond to a combined mapping of the two source spaces 50 and 80.
  • Figures 3 and 4 schematically illustrate a warping of coordinates from a source space to the virtual space.
  • Figure 3 represents the source space 50 including a coordinate grid schematically shown at 140.
  • Each location within the source space 50 has a corresponding coordinate in the virtual space 50' that is determined by the optimization module 32, for example.
  • The corresponding virtual space 50' in Figure 4 has a coordinate grid, schematically shown at 140', which is used by the warping module 34 for placing the representations within the virtual space 50' at coordinates that correspond to the coordinates of the corresponding physical objects in the source space 50.
  • The illustrated example shows the mapping, the source spaces and the virtual space in two dimensions.
  • For a three-dimensional representation, vertical direction warping would also be provided.
  • The grid or coordinate map 140 in the source space is rectilinear, while the grid of coordinates 140' in the virtual space 50' has an irregular shape including various curvatures and a variety of spacings between coordinate points.
  • The spacings of the coordinate points in the physical space 50 are consistent and regular across the entire grid.
  • The variations between the two coordinate maps allow for warping information regarding physical objects within the source space 50 into appropriate locations for the corresponding virtual representations within the virtual space 50'.
  • The warping may be dynamically adjusted.
  • The coordinate grid 140' would therefore have a different configuration in some circumstances, depending on the amount of adjustment to the warping.
  • The disclosed example demonstrates how input from different kinds of video analysis tools, such as room layout analysis, occupancy monitoring and object tracking, can be used for generating constraints that place limitations on how a corresponding virtual space may be presented. While the preceding description focused on physical source spaces, it is possible to have a virtual space as one or more of the source spaces. Appropriate information regarding various objects and locations within such a virtual space may be converted into corresponding virtual representations in the destination virtual space. A source space that is virtual does not need video analysis based on camera information, for example. Rather, the constraints from a virtual source space may be deduced directly from the scene composition information regarding that virtual source space.
  • The destination virtual space may be a predefined layout for which some parameters have to be optimized, may be developed entirely by the constraint generating module 30 and the optimization module 32, and may be altered during a session of presenting the virtual space to one or more users.
  • the preceding description is exemplary rather than limiting in nature. The scope of legal protection given to this invention can only be determined by studying the following claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An exemplary method of providing a virtual representation includes determining a plurality of constraints that correspond to a plurality of limitations that exist in a source space. A virtual space layout is determined from the constraints such that the virtual space layout satisfies each of the constraints. A warping of source space coordinates of interest to virtual space coordinates is determined such that each source space coordinate of interest has a corresponding virtual space coordinate in the determined virtual space layout while satisfying the determined constraints.
PCT/IB2010/003237 2010-11-03 2010-11-03 Système et procédé de fourniture d'une représentation virtuelle WO2012059781A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2010/003237 WO2012059781A1 (fr) 2010-11-03 2010-11-03 Système et procédé de fourniture d'une représentation virtuelle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2010/003237 WO2012059781A1 (fr) 2010-11-03 2010-11-03 Système et procédé de fourniture d'une représentation virtuelle

Publications (1)

Publication Number Publication Date
WO2012059781A1 true WO2012059781A1 (fr) 2012-05-10

Family

ID=43806837

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/003237 WO2012059781A1 (fr) 2010-11-03 2010-11-03 Système et procédé de fourniture d'une représentation virtuelle

Country Status (1)

Country Link
WO (1) WO2012059781A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016001418A1 (de) * 2016-02-05 2017-08-10 Audi Ag Verfahren zum Betreiben eines Virtual-Reality-Systems und Virtual-Reality-System
CN107071334A (zh) * 2016-12-24 2017-08-18 深圳市虚拟现实技术有限公司 基于虚拟现实技术的3d视频会议方法和设备
CN108370431A (zh) * 2015-12-11 2018-08-03 索尼公司 信息处理装置、信息处理方法和程序
EP3251342A4 (fr) * 2015-01-30 2018-09-12 Ent. Services Development Corporation LP Étalonnage d'espace virtuel

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030234859A1 (en) * 2002-06-21 2003-12-25 Thomas Malzbender Method and system for real-time video communication within a virtual environment
US20090033737A1 (en) * 2007-08-02 2009-02-05 Stuart Goose Method and System for Video Conferencing in a Virtual Environment
WO2009112967A1 (fr) * 2008-03-10 2009-09-17 Koninklijke Philips Electronics N.V. Procédé et appareil permettant la modification d'une image numérique
WO2009152769A1 (fr) * 2008-06-17 2009-12-23 深圳华为通信技术有限公司 Procédé, appareil et système de communication vidéo

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030234859A1 (en) * 2002-06-21 2003-12-25 Thomas Malzbender Method and system for real-time video communication within a virtual environment
US20090033737A1 (en) * 2007-08-02 2009-02-05 Stuart Goose Method and System for Video Conferencing in a Virtual Environment
WO2009112967A1 (fr) * 2008-03-10 2009-09-17 Koninklijke Philips Electronics N.V. Procédé et appareil permettant la modification d'une image numérique
WO2009152769A1 (fr) * 2008-06-17 2009-12-23 深圳华为通信技术有限公司 Procédé, appareil et système de communication vidéo
EP2299726A1 (fr) * 2008-06-17 2011-03-23 Huawei Device Co., Ltd. Procédé, appareil et système de communication vidéo

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PETER KAUFF, OLIVER SCHREER: "An Immersive 3D Video-Conferencing System using Shared Virtual Team User Environments", PROC. OF 4TH INTL. CONF. ON COLLABORATIVE VIRTUAL ENVIRONMENTS, CVE'02, 30 September 2002 (2002-09-30) - 2 October 2002 (2002-10-02), BONN, pages 105 - 112, XP002633305 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3251342A4 (fr) * 2015-01-30 2018-09-12 Ent. Services Development Corporation LP Étalonnage d'espace virtuel
CN108370431A (zh) * 2015-12-11 2018-08-03 索尼公司 信息处理装置、信息处理方法和程序
EP3389261A4 (fr) * 2015-12-11 2018-12-05 Sony Corporation Dispositif de traitement d'informations, procédé de traitement d'informations et programme
US10785447B2 (en) 2015-12-11 2020-09-22 Sony Corporation Information processing apparatus, information processing method, and program
DE102016001418A1 (de) * 2016-02-05 2017-08-10 Audi Ag Verfahren zum Betreiben eines Virtual-Reality-Systems und Virtual-Reality-System
DE102016001418B4 (de) 2016-02-05 2024-06-13 Audi Ag Verfahren zum Betreiben eines Virtual-Reality-Systems und Virtual-Reality-System
CN107071334A (zh) * 2016-12-24 2017-08-18 深圳市虚拟现实技术有限公司 基于虚拟现实技术的3d视频会议方法和设备

Similar Documents

Publication Publication Date Title
US11222469B1 (en) Virtual affordance sales tool
Wang et al. Mutual awareness in collaborative design: An Augmented Reality integrated telepresence system
Feuchtner et al. Extending the body for interaction with reality
Grønbæk et al. MirrorBlender: Supporting hybrid meetings with a malleable video-conferencing system
Bowman et al. 3d user interfaces: New directions and perspectives
US5590268A (en) System and method for evaluating a workspace represented by a three-dimensional model
Müller et al. A qualitative comparison between augmented and virtual reality collaboration with handheld devices
Liang et al. Functional workspace optimization via learning personal preferences from virtual experiences
Bellgardt et al. Utilizing immersive virtual reality in everydaywork
Goebbels et al. Design and evaluation of team work in distributed collaborative virtual environments
WO2012059781A1 (fr) Système et procédé de fourniture d'une représentation virtuelle
Stahl et al. Social telepresence robots: The role of gesture for collaboration over a distance
JP2007034415A (ja) 設計支援システム
Sugiura et al. An asymmetric collaborative system for architectural-scale space design
Lee et al. Interactive and situated guidelines to help users design a personal desk that fits their bodies
Johnson et al. Developing the paris: Using the cave to prototype a new vr display
Han et al. Foldable spaces: An overt redirection approach for natural walking in virtual reality
Müller et al. Collaborative remote laboratories in engineering education: Challenges and visions
JP2007328389A (ja) 仮想空間表示方法
KR20200008400A (ko) 사용자 생성 콘텐츠를 포함하는 가상 체험공간 제공 시스템
Fujita et al. Human-Workspace Interaction: prior research efforts and future challenges for supporting knowledge workers
US9195377B2 (en) Method and system for providing consistency between a virtual representation and corresponding physical spaces
Chen Collaboration in Multi-user Immersive Virtual Environment
Schwede et al. HoloR: Interactive mixed-reality rooms
Saulton et al. Egocentric biases in comparative volume judgments of rooms

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10809333

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10809333

Country of ref document: EP

Kind code of ref document: A1