WO2011010533A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number
- WO2011010533A1 (PCT/JP2010/061161)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- display unit
- cursor
- virtual object
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2004—Aligning objects, relative positioning of parts
Definitions
- The present invention relates to an information processing apparatus, an information processing method, and a program. More specifically, the present invention relates to an information processing apparatus, an information processing method, and a program that perform data processing using mixed reality (MR), in which real objects in the real world and an electronic display are fused.
- (a) A new physical display is prepared and connected to the computer operated by the user so that a plurality of displays can be used.
- (b) A virtual desktop is set up and used within a single display unit.
- In the former case (a), there is the problem that cost and space are required to add the display.
- In the latter case (b), there is the problem that accessing a part other than the part actually displayed on the display unit requires command input by user operation or operation of an icon displayed on a tray or the like.
- The present invention solves such problems by data processing using, for example, mixed reality (MR).
- Patent Document 1 Japanese Patent Laid-Open No. 2008-304268
- Patent Document 2 Japanese Patent Laid-Open No. 2008-304269
- The present invention provides an information processing apparatus, an information processing method, and a program that use MR (Mixed Reality) to enable data processing that effectively uses the space area outside a display unit, such as that of a PC, without being limited to the display unit.
- The first aspect of the present invention is an information processing apparatus including: a coordinate processing module that determines whether the position of a cursor, which is a position indicator displayed on a first display unit, is inside or outside the area of the first display unit, and that outputs cursor position information to a virtual object management unit when the position is outside the area;
- a camera that captures an image of real objects including the first display unit;
- a three-dimensional information analysis unit that analyzes the three-dimensional positions of the real objects included in the camera-captured image;
- a second display unit that displays the camera-captured image; and
- a virtual object management unit that generates a virtual object different from the real objects included in the camera-captured image, generates a composite image including the generated virtual object and the real objects, and displays the composite image on the second display unit.
- The virtual object management unit calculates the three-dimensional position of the cursor based on the cursor position information input from the coordinate processing module, and displays on the second display unit a composite image in which the cursor is set as a virtual object at the calculated position.
- Furthermore, in an embodiment of the information processing apparatus of the present invention, the apparatus includes an application execution unit that performs processing on a specified object designated by the position indicator.
- The application execution unit determines whether the specified object is inside or outside the area of the first display unit and, if it is outside the area, outputs the object position information to the virtual object management unit. The virtual object management unit calculates the three-dimensional position of the object based on the object position information input from the coordinate processing module, and displays on the second display unit a composite image in which the object is set as a virtual object at the calculated position.
- Furthermore, when the three-dimensional position of the object calculated based on the object position information input from the coordinate processing module includes the display area of the first display unit, the virtual object management unit displays on the second display unit a composite image from which the object region image overlapping the display area of the first display unit has been deleted.
- Furthermore, the information processing apparatus includes an object information acquisition unit that acquires image data of a real object designated by the cursor set as the virtual object, performs a data search based on the acquired image data, and thereby acquires object information.
- The object information acquisition unit performs a process of outputting the acquired object information as display data to the first display unit.
- The object information acquisition unit acquires the object information by accessing a database or a server that associates image data of real objects with object information, and by performing a search process based on the image data of the real object.
- Furthermore, the virtual object management unit calculates a plane including the display surface of the first display unit based on the three-dimensional position information of the constituent parts of the first display unit included in the camera-captured image, and calculates the three-dimensional position of the cursor so that the position of the cursor is set on that plane.
- Furthermore, the cursor is a mouse cursor moved by a mouse operation, and
- the coordinate processing module receives displacement information of the mouse cursor caused by the mouse operation and determines whether the position of the mouse cursor is inside or outside the area of the first display unit.
- Furthermore, the second aspect of the present invention is an information processing method executed in an information processing apparatus, including:
- a coordinate processing step in which a coordinate processing module determines whether the position of a cursor, which is a position indicator displayed on a first display unit, is inside or outside the area of the first display unit, and outputs cursor position information to a virtual object management unit when the position is outside the area;
- a shooting step in which a camera captures an image of real objects including the first display unit;
- a three-dimensional information analysis step in which a three-dimensional information analysis unit analyzes the three-dimensional positions of the real objects included in the camera-captured image; and
- a virtual object management step in which a virtual object management unit generates a virtual object different from the real objects included in the camera-captured image, generates a composite image including the generated virtual object and the real objects, and displays the composite image on a second display unit.
- The virtual object management step is a step of calculating the three-dimensional position of the cursor based on the cursor position information input from the coordinate processing module and displaying on the second display unit a composite image in which the cursor is set as a virtual object at the calculated position.
- Furthermore, the third aspect of the present invention is a program for causing an information processing apparatus to execute information processing. The program causes a coordinate processing module to determine whether the position of a cursor, which is a position indicator displayed on a first display unit, is inside or outside the area of the first display unit, and to output cursor position information to a virtual object management unit when the position is outside the area; causes a camera to capture an image of real objects including the first display unit; causes a three-dimensional information analysis unit to analyze the three-dimensional positions of the real objects included in the camera-captured image; and causes a virtual object management unit to generate a virtual object different from the real objects, generate a composite image including the generated virtual object and the real objects, and display the composite image on a second display unit.
- The virtual object management step of the program is a step of calculating the three-dimensional position of the cursor based on the cursor position information input from the coordinate processing module and displaying on the second display unit a composite image in which the cursor is set as a virtual object at the calculated position.
- The program of the present invention is, for example, a program that can be provided via a storage medium or a communication medium in a computer-readable format to an image processing apparatus or a computer system capable of executing various program codes.
- In this specification, a system is a logical set of a plurality of devices, and is not limited to one in which the devices of each configuration are housed in the same casing.
- According to the configuration of one embodiment of the present invention, a cursor or object that moves beyond the display unit of a PC or the like is displayed as a virtual object.
- For example, the display device of a PC or the like and the image of the area outside it are shown on the display of glasses worn by the user.
- The three-dimensional position of the cursor or object, which is assumed to have moved according to the user operation, is calculated, and the cursor or object is displayed as a virtual object at the calculated position.
- In addition, object information corresponding to the object designated by the cursor can be acquired and presented.
- The present invention thus enables data processing that effectively uses the space area outside a display unit (display) of a PC or the like, for example by processing using mixed reality (MR).
- FIG. 1 is a diagram showing an example of processing executed by the information processing apparatus of the present invention.
- FIG. 1 shows a display unit 10 of, for example, a PC operated by the user. Although the detailed configuration will be described later, the user operates the PC while wearing glasses that have a display for showing an image generated by a mixed reality (MR) generation device.
- The glasses are provided with a camera that captures the surrounding environment, and a composite image containing the virtual object generated by the mixed reality (MR) generation device is displayed on the display of the glasses.
- FIGS. 1A and 1B are images displayed on the display of the glasses worn by the user and observed by the user.
- a display unit 10 shown in FIG. 1A displays a mouse cursor 11a as a position indicator that moves corresponding to a mouse operated by the user.
- the user can move the mouse cursor 11a by operating the mouse.
- In a conventional configuration, the moving range of the mouse cursor is limited to the display area of the display unit 10.
- In the configuration of the present invention, the moving range of the mouse cursor is not limited to the display area of the display unit 10.
- For example, when the mouse cursor 11a is moved by the user's mouse operation along the movement line 12 shown in FIG. 1A, the mouse cursor can be moved to the space outside the display unit 10 as shown in FIG. 1B.
- the mouse cursor 11b shown in FIG. 1B is a virtual object generated by the mixed reality (MR) generation device.
- the user observes the mouse cursor 11b which is a virtual object displayed on the display of the glasses worn by the user.
- the mouse cursor 11 can be freely moved regardless of the inside or outside of the display unit 10.
- FIG. 2 is also a diagram illustrating an example of processing executed by the information processing apparatus of the present invention.
- FIG. 2 also shows a display unit 10 such as a PC operated by the user, as in FIG.
- a user wears glasses equipped with a display for displaying an image generated by a mixed reality (MR) generation device.
- FIGS. 2A and 2B are images displayed on the display of the glasses worn by the user and observed by the user.
- In FIG. 2A, the display unit 10 displays a mouse cursor 11a and an object 21a designated by the mouse cursor 11a.
- the object 21a is an object displayed on the display unit 10 by executing a clock display application in the PC.
- The user moves the mouse cursor 11a onto the object 21a by operating the mouse, performs a mouse operation that executes the object designation process, and then further moves the mouse cursor 11a along the movement line 22 shown in FIGS. 2A and 2B.
- By this operation, the object 21 designated by the mouse cursor can be moved to the space outside the display unit 10.
- An object 21b shown in FIG. 2B is a virtual object generated by the mixed reality (MR) generation device.
- the user observes the object 21b displayed on the display of the glasses worn by the user.
- FIG. 3 is also a diagram showing an example of processing executed by the information processing apparatus of the present invention.
- FIG. 3 also shows a display unit 10 such as a PC operated by the user, as in FIGS.
- a user wears glasses equipped with a display for displaying an image generated by a mixed reality (MR) generation device.
- FIGS. 3A and 3B are images displayed on the display of the glasses worn by the user and observed by the user.
- FIG. 3A shows the mouse cursor 11a, set outside the display unit 10 by the operation described above with reference to FIG. 1, and a real object 31a designated by the mouse cursor 11a.
- the object 31a is a real object that actually exists in the space.
- the object 31a is a jacket photo of a CD that is a disc storing music data.
- the user sets the mouse cursor 11a on the object 31a by the mouse operation, performs the mouse operation, and executes the object designation process.
- Information related to the object designated by this object designation process, that is, object information, is acquired from a database or a server.
- The acquired object information is displayed on the display unit 10. This corresponds to the object image 31b and the object information 31c shown in FIG. 3B.
- FIG. 4 is a diagram showing a configuration of an information processing apparatus according to an embodiment of the present invention that executes the above processing.
- a user 100 operates a PC (personal computer) 120 to perform various data processing.
- the PC 120 includes a mouse driver 121, a mouse coordinate processing module 122, a GUI unit 123, a communication unit 124, an application execution unit 125, a control unit 126, a memory 127, and a display unit 128.
- The PC 120 further has a mouse 129.
- the mouse driver 121 inputs position information and operation information that are input information from the mouse 129.
- the mouse coordinate processing module 122 determines the display position of the mouse cursor according to the position information of the mouse 129 input via the mouse driver 121.
- the display position of the mouse cursor is not limited to the display area of the display unit 128.
- the GUI unit 123 is a user interface that performs processing of input information from the user, processing of output information for the user, and the like.
- the communication unit 124 performs communication processing with the mixed reality (MR: Mixed Reality) generation device 130.
- the application execution unit 125 executes applications corresponding to various data processing executed in the PC 120.
- the control unit 126 performs overall control of processing executed in the PC 120.
- the memory 127 is a memory configured by a RAM, a ROM, etc. for storing programs, data processing parameters, and the like.
- the display unit 128 is a display unit configured by, for example, an LCD.
- User 100 is wearing glasses 141 having a display for displaying virtual objects.
- the glasses 141 are provided with a camera 142 that captures the surrounding environment.
- the glasses 141 and the camera 142 are connected to a mixed reality (MR) generator 130.
- On the display of the glasses 141, a real-world image that is the image captured by the camera 142 is displayed, and a virtual object generated by the mixed reality (MR) generation device 130 is displayed together with the real-world image.
- The user 100 is operating the PC (personal computer) 120, and the camera 142 captures an image of the PC (personal computer) 120 operated by the user 100. Therefore, on the display of the glasses 141, for example, an image including the display (display unit 128) of the PC (personal computer) 120 operated by the user 100 and various real objects around it is displayed as the real-world image. Further, the virtual object generated by the mixed reality (MR) generation device 130 is displayed superimposed on this real-world image. The direction of the camera 142 also changes according to the movement of the user 100.
- display data 150 as shown in FIG. 5 is displayed on the display of the glasses 141 worn by the user 100.
- the display data 150 shown in FIG. 5 is a composite image of a real object and a virtual object.
- the mixed reality (MR) generation device 130 includes a three-dimensional information analysis unit 131, a virtual object management module 132, a memory 133, and a communication unit 134.
- the three-dimensional information analysis unit 131 inputs a photographed image of the camera 142 worn by the user and performs a process of analyzing the three-dimensional position of the object included in the photographed image.
- This three-dimensional position analysis process is performed as a process to which, for example, SLAM (simultaneous localization and mapping) is applied.
- SLAM is a process of selecting a feature point from various real objects included in a captured image of a camera and detecting the position of the selected feature point and the position and orientation of the camera together.
- Patent Document 1 Japanese Patent Laid-Open No. 2008-304268
- Patent Document 2 Japanese Patent Laid-Open No. 2008-304269
- the three-dimensional information analysis unit 131 calculates the three-dimensional position of the real object included in the photographed image of the camera 142 worn by the user by applying the above SLAM, for example.
- the three-dimensional information analysis unit 131 is not limited to the above-described SLAM, and may be set to obtain the three-dimensional position of the object included in the camera captured image by other methods.
- the virtual object management module 132 manages virtual objects displayed on the display of the glasses 141 worn by the user.
- the virtual object is data stored in the memory 133.
- the display data 150 shown in FIG. 5 is displayed on the display of the glasses 141 worn by the user.
- the PC image 151 included in the display data 150 is a real image (real object) photographed by the camera 142.
- When the mouse cursor 152a displayed in the PC image 151 shown in FIG. 5 is moved outside the PC image 151, the mouse cursor 152b is displayed as a virtual object.
- the user 100 shown in FIG. 4 can observe the composite image of the real object and the virtual object shown in FIG. 5 on the display of the glasses 141, for example.
- a PC image 151 shown in FIG. 5 is a real object photographed by the camera 142.
- the mouse cursor 152a in the PC image 151 is information actually displayed in the PC image 151 and is a real object.
- In the following description, an object that exists in the real world and can be photographed by the camera 142 is referred to as a real object.
- the mouse cursor 152b outside the PC image 151 shown in FIG. 5 is not a real-world object (real object).
- the mouse cursor 152b is a virtual object generated by the mixed reality (MR) generation device 130.
- A processing sequence for performing such a mouse cursor display process will be described with reference to the flowchart shown in FIG. 6. It is assumed that the user is operating the mouse 129 connected to the PC 120 shown in FIG. 4. The operation information is input to the mouse driver 121. After this, the processing from step S101 onward in the flowchart shown in FIG. 6 is performed.
- The processes of steps S101 to S105 in the flowchart shown in FIG. 6 are processes of the PC 120 shown in FIG. 4.
- The processes of steps S106 to S109 are processes of the mixed reality (MR) generation device 130 shown in FIG. 4.
- In step S101, the mouse coordinate processing module 122 of the PC 120 inputs mouse displacement (dX, dY) information from the mouse driver 121.
- In step S102, the mouse coordinate processing module 122 calculates the updated mouse cursor position (XQ, YQ) from the previous mouse cursor position (XP, YP) and the mouse displacement (dX, dY) information.
- In step S103, the mouse coordinate processing module 122 determines whether or not the updated mouse cursor position (XQ, YQ) is outside the display unit. If the updated mouse cursor position (XQ, YQ) is within the display area, the process proceeds to step S104, and the normal mouse cursor display update processing within the PC is executed. If the updated mouse cursor position (XQ, YQ) is outside the display area, the process proceeds to step S105.
- In step S105, the mouse cursor position information (XQ, YQ) stored in the memory is transferred to the mixed reality (MR) generation device 130 via the communication unit.
- In this example, the data transferred from the PC 120 to the mixed reality (MR) generation device 130 is only the position information of the mouse cursor; it is set in advance on the mixed reality (MR) generation device 130 side that the transferred position information is the position information of the mouse cursor. When position information or the like for other objects is transferred, identification information of each object or drawing data of the object also needs to be transferred.
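- As a rough illustration of the PC-side processing of steps S101 to S105, the following Python sketch updates the cursor position from the mouse displacement, checks whether the new position is inside the display area, and either updates the normal cursor display or sends the position to the mixed reality (MR) generation device. It is only a hypothetical sketch; the class, function, and constant names are assumptions and do not come from the patent.

```python
# Minimal sketch of the PC-side mouse coordinate processing (steps S101-S105).
# DISPLAY_W/DISPLAY_H and the callback names are illustrative assumptions.

DISPLAY_W, DISPLAY_H = 1920, 1080  # display unit size in display-plane coordinates

class MouseCoordinateProcessingModule:
    def __init__(self, send_to_mr_device, update_local_cursor):
        self.x, self.y = 0.0, 0.0              # previous cursor position (XP, YP)
        self.send_to_mr_device = send_to_mr_device
        self.update_local_cursor = update_local_cursor

    def on_mouse_moved(self, dx, dy):
        # S101/S102: apply the displacement (dX, dY) reported by the mouse driver
        self.x += dx
        self.y += dy
        inside = 0 <= self.x < DISPLAY_W and 0 <= self.y < DISPLAY_H  # S103
        if inside:
            self.update_local_cursor(self.x, self.y)   # S104: normal cursor update
        else:
            # S105: the cursor left the display area; hand the position (XQ, YQ)
            # over to the mixed reality (MR) generation device
            self.send_to_mr_device({"cursor_pos": (self.x, self.y)})
```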
- Step S106 and subsequent steps are performed as a process of the mixed reality (MR) generation device 130.
- First, in step S106, the mixed reality (MR) generation device 130 stores the mouse cursor position information (XQ, YQ) transferred from the PC 120 in the memory 133 on the mixed reality (MR) generation device 130 side.
- When non-display data (mouse cursor drawing data) or its identifier has been received from the PC 120, the received data is also stored in the memory 133 on the mixed reality (MR) generation device 130 side.
- In step S107, the virtual object management module 132 of the mixed reality (MR) generation device 130 acquires the data stored in the memory 133, that is, the non-display data (mouse cursor drawing data) and the position information (XQ, YQ).
- In step S108, the virtual object management module 132 performs processing for converting the position information of the non-display data (mouse cursor) acquired from the memory 133 into the camera coordinate system corresponding to the camera-captured image acquired from the three-dimensional information analysis unit 131.
- the three-dimensional information analysis unit 131 acquires three-dimensional position information of the markers 201a to 201d at the four corners of the display unit 200 of the PC included in the camera photographed image.
- Marker 201a (xa, ya, za)
- Marker 201b (xb, yb, zb)
- Marker 201c (xc, yc, zc)
- Marker 201d (xd, yd, zd)
- This position information is position information in the camera coordinate system (x, y, z).
- The virtual object management module 132 calculates the plane of the display unit of the PC in the camera coordinate system based on the three-dimensional position information of the markers 201a to 201d, and determines, on the calculated plane, the setting position of the non-display data (mouse cursor drawing data) acquired from the PC 120. For this processing, a coordinate transformation of the position information (XQ, YQ), expressed in the display unit plane coordinate system (X, Y) acquired from the PC 120, is executed, and the display position (xq, yq, zq) of the mouse cursor 211q in the camera coordinate system (x, y, z) is calculated.
- The display position (xq, yq, zq) of the mouse cursor 211q is set to a position on the plane of the display surface formed by the markers 201a to 201d at the four corners of the display unit 200 shown in FIG. 7. First, the display surface formed by the markers 201a to 201d at the four corners of the display unit 200 is obtained. This display surface can be defined using any three of the four coordinates of the markers 201a to 201d at the four corners of the display unit 200.
- Marker 201a (xa, ya, za)
- Marker 201b (xb, yb, zb)
- Marker 201c (xc, yc, zc)
- For example, it can be defined using the coordinates of these three points.
- An xyz plane (a plane in the camera coordinate system (x, y, z)) passing through the display surface can be expressed as the following formula (formula 1) using the coordinates of these three points.
- (x-xa)(yb-ya)(zc-za)+(xb-xa)(yc-ya)(z-za)+(xc-xa)(y-ya)(zb-za)-(xc-xa)(yb-ya)(z-za)-(xb-xa)(y-ya)(zc-za)-(x-xa)(yc-ya)(zb-za) = 0 ... (Formula 1)
- The virtual object management module 132 converts the position information (XQ, YQ), expressed in the display unit plane coordinate system (X, Y) acquired from the PC 120, into position coordinates (xq, yq, zq) on the above xyz plane in the camera coordinate system (x, y, z).
- The coordinates to be obtained are the coordinate position (xq, yq, zq), in the camera coordinate system (x, y, z), of the mouse cursor 211q shown in FIG. 7.
- Marker 201a (xa, ya, za)
- Marker 201b (xb, yb, zb)
- Marker 201c (xc, yc, zc)
- the positions of the three points on the display unit plane coordinate system (X, Y) are as follows.
- Marker 201a (0,0)
- Marker 201b (XB, 0)
- Marker 201c (0, YC)
- From the correspondence between these two coordinate systems, (0-XQ)/(0-XB) = (xa-xq)/(xa-xb) and (0-YQ)/(0-YC) = (ya-yq)/(ya-yc), which give xq = xa - XQ(xa-xb)/XB ... (Formula 2) and yq = ya - YQ(ya-yc)/YC ... (Formula 3). zq is then derived by substituting the relational expressions (Formula 2) and (Formula 3) into the above formula (Formula 1). In this way, the position (xq, yq, zq) of the mouse cursor 211q is calculated.
- In this way, in step S108 shown in the flow of FIG. 6, the virtual object management module 132 converts the position information (XQ, YQ) of the non-display data acquired from the memory 133 into the position (xq, yq, zq) in the camera coordinate system corresponding to the camera-captured image acquired from the three-dimensional information analysis unit 131.
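- As a concrete illustration of the conversion in step S108, the following Python sketch computes the display-surface plane from the three corner markers and maps a cursor position (XQ, YQ) given in the display unit plane coordinate system to (xq, yq, zq) in the camera coordinate system, following Formulas 1 to 3 above. It is a simplified sketch under the simplifying assumption that xq and yq can be interpolated independently along the display edges; numpy and the example marker values are assumptions for illustration.

```python
import numpy as np

def cursor_to_camera_coords(XQ, YQ, XB, YC, m_a, m_b, m_c):
    """Map a cursor position (XQ, YQ) in the display-plane system to camera coordinates.

    m_a, m_b, m_c: camera-coordinate positions of markers 201a, 201b, 201c,
    which sit at (0, 0), (XB, 0) and (0, YC) in the display-plane system.
    """
    m_a, m_b, m_c = map(np.asarray, (m_a, m_b, m_c))

    # Formulas 2 and 3: interpolate along the two display edges
    xq = m_a[0] - XQ * (m_a[0] - m_b[0]) / XB
    yq = m_a[1] - YQ * (m_a[1] - m_c[1]) / YC

    # Formula 1 describes the plane through the three markers; it is equivalent to
    # n . (p - m_a) = 0 with n = (m_b - m_a) x (m_c - m_a).  Solve it for zq.
    n = np.cross(m_b - m_a, m_c - m_a)
    if abs(n[2]) < 1e-9:
        raise ValueError("display plane is (nearly) parallel to the camera z-axis")
    zq = m_a[2] - (n[0] * (xq - m_a[0]) + n[1] * (yq - m_a[1])) / n[2]
    return xq, yq, zq

# Example: cursor 280 units to the right of a 1920x1080 display whose corner
# markers are known in camera coordinates (all values are made up)
print(cursor_to_camera_coords(
    XQ=2200, YQ=300, XB=1920, YC=1080,
    m_a=(0.0, 0.0, 1.0), m_b=(0.5, 0.0, 1.1), m_c=(0.0, -0.3, 1.0)))
```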
- In step S109, the virtual object management module 132 places and displays the mouse cursor at the calculated coordinate position (xq, yq, zq) in the camera coordinate system.
- For drawing the cursor, the non-display data included in the data stored in the memory 133, that is, the non-display data (mouse cursor drawing data) transferred from the PC 120, is used.
- the display data 150 shown in FIG. 5 is displayed on the display of the glasses 141 worn by the user 100.
- the display data 150 shown in FIG. 5 is a composite image in which a PC image 151 as a real object and a mouse cursor 152b as a virtual object are displayed together.
- In this way, the virtual object management module 132 sets the display position of the virtual object in the space outside the PC display unit, as shown in FIG. 5. By such display processing, the user can extend the movement range of the mouse cursor to the space outside the PC display unit without being limited to the PC display unit, and can perform data processing using a larger work area.
- the process described with reference to the flowchart shown in FIG. 6 is performed every time the user operates the mouse 129 of the PC 120 and the position of the mouse cursor is changed.
- the mouse coordinate processing module 122 transmits update data to the mixed reality (MR) generation device 130 every time the position of the mouse cursor is changed.
- the mixed reality (MR) generation device 130 executes processing for changing the display position of the virtual object (mouse cursor) based on the updated data as real-time processing.
- Next, as a second embodiment, a configuration example for realizing the process of moving the object 21 to the space outside the display unit 10 as shown in FIG. 2B will be described.
- This embodiment is executed by an apparatus having the configuration shown in FIG. 4 as in the first embodiment.
- The user 100 operates the PC (personal computer) 120, and the camera 142 photographs the PC (personal computer) 120 operated by the user 100.
- Therefore, on the display of the glasses 141 worn by the user 100, for example, the display (display unit 128) of the PC (personal computer) 120 operated by the user 100 and various real objects around it are displayed as the real-world image.
- the virtual object generated by the mixed reality (MR) generation device 130 is displayed superimposed on the real world image.
- the direction of the camera 142 is also changed according to the movement of the user 100.
- display data 250 as shown in FIG. 8 is displayed on the display of the glasses 141 worn by the user 100.
- the display data 250 shown in FIG. 8 is a composite image of a real object and a virtual object.
- the PC image 251 included in the display data 250 is a real image (real object) photographed by the camera 142.
- FIG. 8 shows the processing when the user moves the mouse 129 of the PC 120 shown in FIG. 4.
- When the object 252a displayed in the PC image 251 shown in FIG. 8 is designated by the mouse cursor 271a and then moved outside the PC image 251, the object 252 and the mouse cursor 271 move together.
- As the movement continues, the object 252b and the mouse cursor 271b are displayed as virtual objects outside the PC image 251.
- the user 100 shown in FIG. 4 can observe the composite image of the real object and the virtual object shown in FIG. 8 on the display of the glasses 141, for example.
- a PC image 251 shown in FIG. 8 is a real object photographed by the camera 142.
- Both the object 252a and the mouse cursor 271a in the PC image 251 are information actually displayed in the PC image 251 and are real objects.
- the object 252b and the mouse cursor 271b outside the PC image 251 shown in FIG. 8 are not real world objects (real objects).
- the object 252b and the mouse cursor 271b are virtual objects generated by the mixed reality (MR) generation device 130.
- the display processing of the mouse cursor 271b as a virtual object is performed in this embodiment as well as in the first embodiment.
- The sequence for displaying the mouse cursor 271b is executed in the same sequence as described with reference to FIG. 6.
- In this embodiment, a display process for the object 252 designated by the mouse is further added.
- the flowchart shown in FIG. 9 is a flow for explaining only the display sequence of the mouse designation object. That is, when the display data 250 shown in FIG. 8 is generated and displayed, processing according to the flow shown in FIG. 6 and processing according to the flow shown in FIG. 9 are executed together.
- The processes of steps S201 to S204 in the flowchart shown in FIG. 9 are processes of the PC 120 shown in FIG. 4.
- The processes of steps S205 to S208 are processes of the mixed reality (MR) generation device 130 shown in FIG. 4.
- step S201 the object information designated by the mouse 129 of the PC 120 is stored in the memory 127 on the PC 120 side.
- the object information stored in the memory 127 includes drawing data and position information of the object.
- the position information is, for example, the coordinates of the center position serving as the reference of the object, or a plurality of position information that defines the contour.
- For example, the coordinate information of each of the four vertices P, Q, R, and S is stored in the memory as a component of the object information.
- The position information only needs to be position information from which the object can be drawn at a specific position, and it may be set so that only the coordinate information of one point, P, is stored in the memory instead of all four vertices P, Q, R, and S.
- This is because object drawing data indicating the shape of the object is also stored in the memory, so even if only the coordinates of one point, for example the point P, are stored in the memory as position information, the object drawing process (display process) can be started from P.
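- As a rough sketch of the object information held in the memory 127, the following Python code stores the drawing data together with either the four vertex coordinates P, Q, R, S or a single reference point. The field names are assumptions chosen for illustration, not names taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]  # display unit plane coordinates (X, Y)

@dataclass
class MouseDesignatedObject:
    drawing_data: bytes                       # e.g. the rendered bitmap of the clock object
    vertices: Optional[List[Point]] = None    # P, Q, R, S when the full outline is kept
    anchor: Optional[Point] = None            # alternatively, a single point such as P

# Either representation is enough to redraw the object at a specific position,
# because the drawing data itself carries the object's shape and size.
clock = MouseDesignatedObject(
    drawing_data=b"...",
    vertices=[(100.0, 80.0), (260.0, 80.0), (260.0, 200.0), (100.0, 200.0)],
)
```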
- In step S202, it is verified whether or not an area outside the display area has been generated in the mouse-designated object area as a result of the user moving the mouse 129 of the PC 120. This verification is performed by the application execution unit 125 of the PC 120 based on the destination mouse cursor position acquired by the mouse coordinate processing module 122 and the object shape.
- If the determination in step S202 is No, that is, if no area outside the display area has been generated in the mouse-designated object area, the process proceeds to step S203, and the application execution unit 125 of the PC 120 displays the mouse-designated object within the display unit of the PC 120.
- If the determination in step S202 is Yes, that is, if an area outside the display area has been generated in the mouse-designated object area, the process proceeds to step S204.
- This occurs, for example, when the object is moved to the position of the object 301b shown in FIG. 10 or to the position of the object 301c shown in FIG. 11.
- The objects 301b and 301c shown in FIGS. 10 and 11 are examples in which at least a part of the object is displayed as a virtual object on the display of the glasses worn by the user.
- In step S204, the data recorded in the memory (non-display data (object drawing data)) and the position information are transmitted to the mixed reality (MR) generation device 130.
- In the example of FIG. 10, the clock drawing data of the object 301b and the coordinate data of the four vertices P, Q, R, and S of the object 301b are acquired from the memory 127 on the PC 120 side and transmitted to the mixed reality (MR) generation device 130.
- The position information to be transferred is in the PC display unit plane coordinate system, as in the first embodiment; a simple sketch of this check and transfer follows the coordinate list below.
- P (XP, YP)
- Q (XQ, YQ)
- R (XR, YR)
- S (XS, YS)
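- The check in steps S202 to S204 can be sketched as a simple rectangle test: given the object's vertex coordinates after the move, decide whether any part of the object lies outside the display area and, if so, transmit the drawing data and the vertex coordinates. The following Python code is an illustrative assumption, not the patent's implementation; the display size and callback names are made up.

```python
DISPLAY_W, DISPLAY_H = 1920, 1080  # display area in display unit plane coordinates

def object_partially_outside(vertices):
    """S202: True if any vertex (P, Q, R, S) of the mouse-designated object
    lies outside the display area of the PC display unit."""
    return any(not (0 <= x < DISPLAY_W and 0 <= y < DISPLAY_H) for x, y in vertices)

def on_object_moved(vertices, drawing_data, send_to_mr_device, draw_on_pc_display):
    if object_partially_outside(vertices):
        # S204: hand the non-display data (object drawing data) and the
        # P, Q, R, S coordinates over to the mixed reality (MR) generation device
        send_to_mr_device({"drawing_data": drawing_data, "vertices": vertices})
    else:
        # S203: the object still fits entirely within the PC display unit
        draw_on_pc_display(drawing_data, vertices)
```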
- Step S205 and subsequent steps are performed as a process of the mixed reality (MR) generation device 130.
- First, in step S205, the mixed reality (MR) generation device 130 stores the data received from the PC 120, that is, the non-display data (object drawing data) and the position information (the coordinate information of P, Q, R, and S), in the memory 133 on the mixed reality (MR) generation device 130 side.
- In step S206, the virtual object management module 132 of the mixed reality (MR) generation device 130 acquires the data stored in the memory 133, that is, the non-display data (object drawing data) and the position information (the coordinate information of P, Q, R, and S).
- In step S207, the virtual object management module 132 performs processing to convert the position information P, Q, R, S acquired from the memory 133 into the camera coordinate system corresponding to the camera-captured image acquired from the three-dimensional information analysis unit 131.
- Each PQRS coordinate in the display unit plane coordinate system (X, Y) of the object 301b is converted into each coordinate in the camera coordinate system (x, y, z) as follows.
- P (XP, YP) ⁇ (xp, yp, zp)
- Q (XQ, YQ) ⁇ (xq, yq, zq)
- R (XR, YR) ⁇ (xr, yr, zr)
- S (XS, YS) ⁇ (xs, ys, zs)
- In this way, in step S207 shown in the flow of FIG. 9, the virtual object management module 132 converts the position information of the non-display data acquired from the memory 133 into positions in the camera coordinate system corresponding to the camera-captured image acquired from the three-dimensional information analysis unit 131.
- In step S208, the virtual object management module 132 acquires the non-display data (object drawing data) included in the data stored in the memory 133 and, as shown in FIG. 10, draws and displays the object at the converted coordinate positions in the camera coordinate system.
- the display data 250 shown in FIG. 8 is displayed on the display of the glasses 141 worn by the user 100.
- the display data 250 shown in FIG. 8 is a composite image in which a PC image 251 as a real object, an object 252b as a virtual object, and a mouse cursor 271b are displayed together.
- In this way, the virtual object management module 132 sets the display position of the virtual object in the space outside the PC display unit, as shown in FIG. 8. By such display processing, the user can display various objects in the space outside the PC display unit without being limited to the PC display unit, and can perform data processing using a larger work area.
- the process described with reference to the flowchart shown in FIG. 9 is performed each time the user operates the mouse 129 of the PC 120 and the position of the mouse cursor is changed.
- the application execution unit 125 transmits update data to the mixed reality (MR) generation device 130 every time the position of the mouse cursor is changed.
- the mixed reality (MR) generation device 130 executes a process of changing the display position of the virtual object (clock) based on the updated data as a real-time process.
- The determination in step S202 of the flowchart shown in FIG. 9 also becomes Yes, for example, at the position of the object 301c shown in FIG. 11. That is, the determination in step S202 is Yes when even a part of the mouse-designated object is outside the display area of the PC display unit.
- the position information of the PQRS shown in FIG. 11 is transferred from the PC 120 to the mixed reality (MR) generation device 130 as the position information of the object 301c.
- In this case, the object 301c may be displayed in a form that partially overlaps the display unit of the PC.
- In that case, the object portion of the area PUVS is not displayed as a virtual object but is displayed on the display unit of the PC.
- In that portion, the real object, that is, the camera-captured image itself, may be set to be displayed on the display of the glasses worn by the user.
- When performing this processing, the virtual object management module 132 of the mixed reality (MR) generation device 130 generates and displays, during the virtual object display processing, virtual object display data consisting only of the partial data outside the display unit shown in FIG. 11. That is, a process of making the PUVS partial data of the object drawing data received from the PC transparent is performed.
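- One way to realize this transparency processing is to clip the virtual object's drawing data against the display-unit rectangle and keep only the part outside it, so that the on-screen part remains the real camera-captured image. The Python sketch below, with assumed array layout and helper names, marks the overlapping pixels of the object bitmap as fully transparent; it is an illustration, not the patent's implementation.

```python
import numpy as np

def mask_overlap_with_display(rgba, obj_origin, display_rect):
    """Make the part of an RGBA object bitmap that overlaps the PC display area
    transparent, so that only the portion outside the display unit is drawn
    as a virtual object.

    rgba:         HxWx4 uint8 array (object drawing data)
    obj_origin:   (X, Y) of the bitmap's top-left corner, display-plane coords
    display_rect: (X0, Y0, X1, Y1) of the display unit in the same coords
    """
    h, w = rgba.shape[:2]
    ox, oy = obj_origin
    x0, y0, x1, y1 = display_rect
    # Intersection of the object rectangle with the display rectangle,
    # expressed in the bitmap's own pixel coordinates
    ix0, iy0 = int(max(x0 - ox, 0)), int(max(y0 - oy, 0))
    ix1, iy1 = int(min(x1 - ox, w)), int(min(y1 - oy, h))
    if ix0 < ix1 and iy0 < iy1:
        rgba = rgba.copy()
        rgba[iy0:iy1, ix0:ix1, 3] = 0  # alpha = 0 -> fully transparent
    return rgba
```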
- the mixed reality (MR) generation apparatus 130 includes a three-dimensional information analysis unit 131, a virtual object management module 132, a memory 133, a communication unit 134, an object information acquisition unit 135, and an object information database 136.
- The object information database 136 does not necessarily have to be set within the mixed reality (MR) generation device 130; it may be, for example, a network-connectable database that can be accessed via the communication unit of the mixed reality (MR) generation device 130.
- The three-dimensional information analysis unit 131, the virtual object management module 132, the memory 133, and the communication unit 134 have the same configurations as those described with reference to FIG. 4 in the first embodiment.
- the communication unit 134 performs communication with the external server 140 and the object information database 136 via the network.
- the object information acquisition unit 135 acquires images of various real objects from the captured images of the camera 142 worn by the user 100, executes a collation process with data stored in the object information database 136, and selects similar images. Then, object information associated with the selected image is acquired.
- the object information is various information such as the song title, genre, artist, and price of the CD.
- the object information is stored in the object information database 136 in association with the object image.
- the server 140 also holds information similar to the information stored in the object information database 136.
- the mixed reality (MR) generation device 130 transmits a captured image of the camera 142 worn by the user 100 or a real object image (for example, a CD jacket image) selected from the captured images to the server via the communication unit 134.
- the server extracts corresponding object information from the received image and provides it to the mixed reality (MR) generator 130.
- the mixed reality (MR) generation device 130 acquires the object information from the object information database 136 or the server 140 and provides the acquired information to the PC 120 together with the actual object image data captured by the camera 142.
- The PC 120 displays the acquired information on the display unit of the PC.
- display data 450 as shown in FIG. 13 is displayed on the display of the glasses 141 worn by the user 100.
- a PC image 451 included in the display data 450 is a real image (real object) photographed by the camera 142.
- the object 471a outside the PC image 451 is also a real object.
- the mouse cursor 480a is a virtual object.
- The object image 471b and the object information 471c displayed in the PC image 451 are data displayed on the display unit 128 by the application execution unit 125 of the PC 120. Therefore, the information other than the mouse cursor 480a in the display data 450 shown in FIG. 13 is also displayed on the display of the glasses 141 of the user 100 as an image, and is information that can be observed even by a user who is not wearing the glasses.
- the object image 471b and the object information 471c displayed in the PC image 451 are display data of the display unit of the PC 120, and anyone can observe them.
- the display process of the mouse cursor 480a as a virtual object is also performed in this embodiment, as in the first and second embodiments.
- The sequence for displaying the mouse cursor 480a is executed in the same sequence as described with reference to FIG. 6.
- a process for a real object designated by the mouse is further added.
- the flowchart shown in FIG. 14 is a flow for explaining only the processing sequence for this mouse designated object. That is, when the display data 450 shown in FIG. 13 is generated and displayed, the process according to the flow shown in FIG. 6 and the process according to the flow shown in FIG. 14 are executed together.
- The process of step S301 in the flowchart shown in FIG. 14 involves both the PC 120 and the mixed reality (MR) generation device 130 shown in FIG. 12, and the processes of steps S302 to S309 are processes of the mixed reality (MR) generation device 130 shown in FIG. 12.
- The process of step S310 is a process of the PC 120 shown in FIG. 12.
- As a premise of step S301, the process according to the flow shown in FIG. 6 described in the first embodiment has been executed, and the mouse cursor has been set in the area outside the display unit of the PC.
- In step S301, it is determined whether or not a real object has been designated by a mouse operation. If a real object has been designated, the process proceeds to step S302; if not, the process ends. The processing when a real object is designated is as follows. First, when mouse click operation information is input to the application execution unit 125 via the mouse driver 121 of the PC 120, the application execution unit 125 notifies the mixed reality (MR) generation device 130 of the mouse operation (click) information via the communication unit 124. The mixed reality (MR) generation device 130 receives the mouse operation information via the communication unit 134 and notifies the virtual object management module 132 of it.
- In step S302, the virtual object management module 132 determines whether the object area of the designated real object includes an area outside the PC display unit and is within the imaging range of the camera.
- The camera here is the camera 142 worn by the user 100. If the determination in step S302 is No, the process ends. If the determination in step S302 is Yes, the process proceeds to step S303.
- In step S303, the camera 142 worn by the user 100 captures an image including the mouse-designated object, and the captured image is stored in the memory. This process is performed under the control of the virtual object management module 132.
- the processes of the next steps S304 to S306 are processes when the object information is acquired from the object information database 136, and the processes of steps S307 to S308 are processes when the object information is acquired from the server 140. Any one of these processes may be performed, or a setting for executing both processes may be performed.
- In step S304, the object information database (DB) 136 is searched using the mouse-designated object image stored in the memory as a search key. This process is executed as a process of the object information acquisition unit 135.
- In the object information database (DB) 136, image data of various real objects and the object information of the objects corresponding to the image data are registered.
- For example, an image of a CD jacket is registered together with object information such as the song titles and the price of that CD.
- In step S305, the object information acquisition unit 135 searches the object information database (DB) 136. That is, it determines whether registered image data that matches or is similar to the mouse-designated object image is registered in the object information database (DB) 136. If no matching or similar registered image is extracted, the process ends. If a matching or similar registered image is extracted, the process proceeds to step S306.
- In step S306, the object information acquisition unit 135 acquires, from the object information database (DB) 136, the registration data corresponding to the registered image that matches or is similar to the mouse-designated object image, that is, the object image and the object information.
- On the other hand, in step S307, the object information acquisition unit 135 transmits the mouse-designated object image stored in the memory to the server 140 via the communication unit 134.
- In step S308, the object information acquisition unit 135 acquires from the server 140 the object image and the object information selected based on the registration information of the server.
- The server 140 performs the same processing as the object information acquisition unit 135: it searches the database of the server 140 using the image of the mouse-designated object as a search key and extracts the object information. Note that if the extraction fails, an error message is returned.
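- Steps S304 to S308 amount to an image-based lookup: the captured image of the mouse-designated object is compared with registered images, and the object information of the best match is returned. The following Python sketch uses a simple feature-vector similarity as a stand-in for whatever matching the object information database or server actually performs; all function names, field names, and data are assumptions for illustration.

```python
import numpy as np

# Assumed structure of a database record: a feature vector plus object information
DATABASE = [
    {"features": np.random.rand(128), "image": "cd_jacket_001.png",
     "info": {"title": "Example Album", "artist": "Example Artist", "price": "2,500 yen"}},
    # ... more registered real-object entries ...
]

def extract_features(image_id):
    """Stand-in for a real image descriptor (e.g. a learned embedding)."""
    rng = np.random.default_rng(abs(hash(image_id)) % (2 ** 32))
    return rng.random(128)

def lookup_object_info(query_image_id, threshold=0.9):
    """S304/S305: search for registered images similar to the query;
    S306: return the registered image and object information of the best match,
    or None when nothing is similar enough."""
    q = extract_features(query_image_id)
    best, best_score = None, -1.0
    for entry in DATABASE:
        f = entry["features"]
        score = float(np.dot(q, f) / (np.linalg.norm(q) * np.linalg.norm(f)))
        if score > best_score:
            best, best_score = entry, score
    return (best["image"], best["info"]) if best_score >= threshold else None
```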
- In step S309, the mixed reality (MR) generation device 130 transmits the object information and the object image data acquired from the server or the database to the PC 120.
- the object image data may be an object image acquired from a server or a database, or may be a captured image of the camera 142.
- the last step S310 is a process of the PC 120.
- In step S310, the data acquired from the mixed reality (MR) generation device 130 is displayed on the PC-side display unit by the processing of the PC-side application.
- the display data 450 shown in FIG. 13 is displayed on the display of the glasses 141 worn by the user 100.
- the object image 471b and the object information 471c displayed in the PC image 451 are data displayed on the display unit 128 by the application execution unit 125 of the PC 120.
- In the display data 450 shown in FIG. 13, everything other than the mouse cursor 480a is information that can also be observed by a user who is not wearing the glasses.
- the series of processes described in the specification can be executed by hardware, software, or a combined configuration of both.
- For example, the program in which the processing sequence is recorded can be installed in a memory in a computer incorporated in dedicated hardware and executed, or it can be installed and executed on a general-purpose computer capable of executing various kinds of processing.
- the program can be recorded in advance on a recording medium.
- the program can be received via a network such as a LAN (Local Area Network) or the Internet and installed on a recording medium such as a built-in hard disk.
- the various processes described in the specification are not only executed in time series according to the description, but may be executed in parallel or individually according to the processing capability of the apparatus that executes the processes or as necessary.
- the system is a logical set configuration of a plurality of devices, and the devices of each configuration are not limited to being in the same casing.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Architecture (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Optics & Photonics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
(a) A new physical display is prepared and connected to the computer operated by the user so that a plurality of displays can be used.
(b) A virtual desktop is set up and used within a single display unit.
However, in the former case (a), there is the problem that cost and space are required to add the display. In the latter case (b), there is the problem that accessing a part other than the part actually displayed on the display unit requires command input by user operation or operation of an icon displayed on a tray or the like.
a coordinate processing module that determines whether the position of a cursor, which is a position indicator displayed on a first display unit, is inside or outside the area of the first display unit, and that outputs cursor position information to a virtual object management unit when the position is outside the area;
a camera that captures an image of real objects including the first display unit;
a three-dimensional information analysis unit that analyzes the three-dimensional positions of the real objects included in the camera-captured image;
a second display unit that displays the camera-captured image; and
a virtual object management unit that generates a virtual object different from the real objects included in the camera-captured image, generates a composite image including the generated virtual object and the real objects, and displays the composite image on the second display unit,
wherein the virtual object management unit
calculates the three-dimensional position of the cursor based on the cursor position information input from the coordinate processing module, and displays on the second display unit a composite image in which the cursor is set as a virtual object at the calculated position; an information processing apparatus having this configuration is provided.
An information processing method executed in an information processing apparatus, including:
a coordinate processing step in which a coordinate processing module determines whether the position of a cursor, which is a position indicator displayed on a first display unit, is inside or outside the area of the first display unit, and outputs cursor position information to a virtual object management unit when the position is outside the area;
a shooting step in which a camera captures an image of real objects including the first display unit;
a three-dimensional information analysis step in which a three-dimensional information analysis unit analyzes the three-dimensional positions of the real objects included in the camera-captured image; and
a virtual object management step in which a virtual object management unit generates a virtual object different from the real objects included in the camera-captured image, generates a composite image including the generated virtual object and the real objects, and displays the composite image on a second display unit,
wherein the virtual object management step
is a step of calculating the three-dimensional position of the cursor based on the cursor position information input from the coordinate processing module and displaying on the second display unit a composite image in which the cursor is set as a virtual object at the calculated position; such an information processing method is provided.
A program for causing an information processing apparatus to execute information processing, including:
a coordinate processing step of causing a coordinate processing module to determine whether the position of a cursor, which is a position indicator displayed on a first display unit, is inside or outside the area of the first display unit, and to output cursor position information to a virtual object management unit when the position is outside the area;
a shooting step of causing a camera to capture an image of real objects including the first display unit;
a three-dimensional information analysis step of causing a three-dimensional information analysis unit to analyze the three-dimensional positions of the real objects included in the camera-captured image; and
a virtual object management step of causing a virtual object management unit to generate a virtual object different from the real objects included in the camera-captured image, generate a composite image including the generated virtual object and the real objects, and display the composite image on a second display unit,
wherein the virtual object management step
is a step of calculating the three-dimensional position of the cursor based on the cursor position information input from the coordinate processing module and displaying on the second display unit a composite image in which the cursor is set as a virtual object at the calculated position; such a program is provided.
1. Overview of the processing executed by the information processing apparatus of the present invention
2. Configuration and processing of a first embodiment of the information processing apparatus of the present invention
3. Configuration and processing of a second embodiment of the information processing apparatus of the present invention
4. Configuration and processing of a third embodiment of the information processing apparatus of the present invention
First, an overview of the processing executed by the information processing apparatus of the present invention will be described with reference to FIGS. 1 to 3. The present invention enables data processing that effectively uses the space area other than a display unit (display) of, for example, a PC, by processing using mixed reality (MR).
Next, as a first embodiment of the information processing apparatus of the present invention, the configuration and processing details of an apparatus that executes the processing described with reference to FIG. 1 will be described. As shown in FIG. 1, the first embodiment is a configuration example that realizes the process of moving the mouse cursor to the space outside the display unit 10 as shown in FIG. 1B by moving the mouse cursor 11a along the movement line 12 shown in FIG. 1A through the user's mouse operation.
The processes of steps S106 to S109 are processes of the mixed reality (MR) generation device 130 shown in FIG. 4.
In step S102, the mouse coordinate processing module 122 calculates the updated mouse cursor position (XQ, YQ) from the previous mouse cursor position (XP, YP) and the mouse displacement (dX, dY) information.
First, in step S106, the mixed reality (MR) generation device 130 stores the mouse cursor position information (XQ, YQ) transferred from the PC 120 in the memory 133 on the mixed reality (MR) generation device 130 side. When non-display data (mouse cursor drawing data) or its identifier has been received from the PC 120, the received data is also stored in the memory 133 on the mixed reality (MR) generation device 130 side.
Marker 201a = (xa, ya, za)
Marker 201b = (xb, yb, zb)
Marker 201c = (xc, yc, zc)
Marker 201d = (xd, yd, zd)
These pieces of position information have been acquired.
Note that this position information is position information in the camera coordinate system (x, y, z).
On the other hand, the mouse cursor position information (XQ, YQ) received from the PC 120 is in the PC display unit plane coordinate system; as shown in FIG. 7, it is position information on a coordinate plane in which, for example, the upper left corner of the display unit is the origin (X, Y) = (0, 0), the horizontal direction is X, and the vertical direction is Y.
The cursor position is set to a position on the plane of the display surface formed by the markers 201a to 201d at the four corners of the display unit 200 shown in FIG. 7. First, the display surface formed by the markers 201a to 201d at the four corners of the display unit 200 is obtained.
This display surface can be defined using any three of the four coordinates of the markers 201a to 201d at the four corners of the display unit 200. For example,
Marker 201a = (xa, ya, za)
Marker 201b = (xb, yb, zb)
Marker 201c = (xc, yc, zc)
it can be defined using the coordinates of the above three points.
(x-xa)(yb-ya)(zc-za)+(xb-xa)(yc-ya)(z-za)+(xc-xa)(y-ya)(zb-za)-(xc-xa)(yb-ya)(z-za)-(xb-xa)(y-ya)(zc-za)-(x-xa)(yc-ya)(zb-za) = 0 ... (Formula 1)
In addition, the positions of the above three points,
Marker 201a = (xa, ya, za)
Marker 201b = (xb, yb, zb)
Marker 201c = (xc, yc, zc),
in the display unit plane coordinate system (X, Y) are set as follows:
Marker 201a = (0, 0)
Marker 201b = (XB, 0)
Marker 201c = (0, YC)
The positional relationship among the following coordinates in the display unit plane coordinate system (X, Y), namely
Marker 201a = (0, 0)
Marker 201b = (XB, 0)
Marker 201c = (0, YC)
Position of the mouse cursor 211p = (XP, YP)
Position of the mouse cursor 211q = (XQ, YQ),
and the positional relationship among the following coordinates in the camera coordinate system (x, y, z), namely
Marker 201a = (xa, ya, za)
Marker 201b = (xb, yb, zb)
Marker 201c = (xc, yc, zc)
Position of the mouse cursor 211p = (xp, yp, zp)
Position of the mouse cursor 211q = (xq, yq, zq),
are the same.
(0-XQ)/(0-XB)=(xa-xq)/(xa-xb)
(0-YQ)/(0-YC)=(ya-yq)/(ya-yc)
From the above equations, the following relational expressions are derived:
xq = xa - XQ(xa - xb)/XB ... (Formula 2)
yq = ya - YQ(ya - yc)/YC ... (Formula 3)
Next, as a second embodiment of the information processing apparatus of the present invention, the configuration and processing details of an apparatus that executes the processing described earlier with reference to FIG. 2 will be described. As described with reference to FIG. 2, the second embodiment is a configuration example that realizes the process of moving the object 21 to the space outside the display unit 10 as shown in FIG. 2B, by designating the object 21 through the user's mouse operation and then moving the mouse cursor 11a along the movement line 22 shown in FIGS. 2A and 2B.
The processes of steps S205 to S208 are processes of the mixed reality (MR) generation device 130 shown in FIG. 4.
P=(XP,YP)
Q=(XQ,YQ)
R=(XR,YR)
S=(XS,YS)
These four vertex coordinates are transferred.
First, in step S205, the mixed reality (MR) generation device 130 stores the data received from the PC 120, that is, the non-display data (object drawing data) and the position information (the coordinate information of P, Q, R, and S), in the memory 133 on the mixed reality (MR) generation device 130 side.
The coordinates of P, Q, R, and S of the object 301b in the display unit plane coordinate system (X, Y) are converted into coordinates in the camera coordinate system (x, y, z) as follows.
P=(XP,YP)→(xp,yp,zp)
Q=(XQ,YQ)→(xq,yq,zq)
R=(XR,YR)→(xr,yr,zr)
S=(XS,YS)→(xs,ys,zs)
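Reusing the mapping sketched after Equation 3 (again only as an illustration of the described conversion), the four vertex coordinates of the designated object would be converted in the same way; the marker positions and display resolution in the example are made-up values.

```python
def convert_object_vertices(vertices_2d, XB, YC, marker_a, marker_b, marker_c):
    """Convert the display-plane vertices P, Q, R, S of a designated object into
    camera-coordinate vertices, using the same plane mapping as for the cursor."""
    return [display_to_camera(X, Y, XB, YC, marker_a, marker_b, marker_c)
            for (X, Y) in vertices_2d]

# Example: a 1280x1024 display whose corner markers 201a, 201b, 201c sit at
# illustrative camera coordinates; two of the object's vertices lie outside
# the display area, i.e. the object has been moved into the surrounding space.
vertices_3d = convert_object_vertices(
    [(1200, 100), (1400, 100), (1400, 300), (1200, 300)],  # P, Q, R, S
    XB=1280, YC=1024,
    marker_a=(0.0, 0.0, 1.0), marker_b=(0.4, 0.0, 1.0), marker_c=(0.0, 0.3, 1.0))
```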
Next, as a third embodiment of the information processing apparatus of the present invention, the configuration and processing of an apparatus that executes the processing previously described with reference to Fig. 3 will be described in detail. As described with reference to Fig. 3, the third embodiment is a configuration example that realizes processing in which the user designates, by operating the mouse, an object 31 located in the real space outside the PC display unit, whereby object information 31c is displayed as shown in Fig. 3(b).
In step S304, the object information database (DB) 136 is searched using, as a search key, the image of the mouse-designated object stored in the memory. This processing is executed as processing of the object information acquisition unit 135.
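A minimal sketch of such a database search is given below, assuming the object information database simply pairs reference images with object information and that a similarity score over image features decides the match; the feature extractor, threshold, and function names are assumptions of the sketch, not details taken from the patent.

```python
import numpy as np

def image_feature(image):
    """Placeholder feature extractor: a coarse grey-level histogram.
    A real system would use a more robust image descriptor."""
    hist, _ = np.histogram(image, bins=64, range=(0, 255), density=True)
    return hist

def search_object_info(query_image, object_db, threshold=0.9):
    """Return the object information whose reference image best matches the
    query image, or None if no entry is similar enough.

    object_db : list of (reference_image, object_info) pairs
    """
    query = image_feature(query_image)
    best_info, best_score = None, threshold
    for reference_image, object_info in object_db:
        ref = image_feature(reference_image)
        # Cosine similarity between the two feature vectors.
        score = float(np.dot(query, ref) /
                      (np.linalg.norm(query) * np.linalg.norm(ref) + 1e-12))
        if score > best_score:
            best_info, best_score = object_info, score
    return best_info
```

If no sufficiently similar entry is found locally, the same query could equally be directed to the server 140, in line with the database-or-server access described in claim 5.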
11 Mouse cursor
12 Movement line
21 Object
22 Movement line
31a, 31b Objects
31c Object information
100 User
120 PC (personal computer)
121 Mouse driver
122 Mouse coordinate processing module
123 GUI unit
124 Communication unit
125 Application execution unit
126 Control unit
127 Memory
128 Display unit
129 Mouse
130 Mixed reality (MR) generation device
131 Three-dimensional information analysis unit
132 Virtual object management module
133 Memory
134 Communication unit
135 Object information acquisition unit
136 Object information database
140 Server
141 Glasses
142 Camera
150 Display data
151 PC image
152 Mouse cursor
200 Display unit
201 Marker
211 Mouse cursor
250 Display data
251 PC image
252 Object
271 Mouse cursor
301 Object
450 Display data
451 PC image
471a, 471b Objects
471c Object information
Claims (9)
- An information processing apparatus comprising:
a coordinate processing module that determines whether the position of a cursor, which is a position indicator displayed on a first display unit, lies inside or outside the area of the first display unit, and outputs cursor position information to a virtual object management unit when the position lies outside the area;
a camera that captures an image consisting of real objects including the first display unit;
a three-dimensional information analysis unit that analyzes the three-dimensional positions of the real objects contained in the camera-captured image;
a second display unit that displays the camera-captured image; and
a virtual object management unit that generates a virtual object different from the real objects contained in the camera-captured image, generates a composite image containing the generated virtual object and the real objects, and displays the composite image on the second display unit,
wherein the virtual object management unit calculates a three-dimensional position of the cursor on the basis of the cursor position information input from the coordinate processing module, and displays on the second display unit a composite image in which the cursor is set as a virtual object at the calculated position.
- The information processing apparatus according to claim 1, further comprising an application execution unit that performs processing on a designated object designated by the position indicator,
wherein the application execution unit determines whether the designated object lies inside or outside the area of the first display unit and, when it lies outside the area, outputs object position information to the virtual object management unit, and
the virtual object management unit calculates a three-dimensional position of the object on the basis of the object position information input from the coordinate processing module, and displays on the second display unit a composite image in which the object is set as a virtual object at the calculated position.
- The information processing apparatus according to claim 2, wherein, when the three-dimensional position of the object calculated on the basis of the object position information input from the coordinate processing module includes the display area of the first display unit, the virtual object management unit displays on the second display unit a composite image from which the portion of the object image overlapping the display area of the first display unit has been erased.
- The information processing apparatus according to claim 1, further comprising an object information acquisition unit that acquires image data of a real object designated by the cursor set as the virtual object and performs a data search based on the acquired image data to acquire object information,
wherein the object information acquisition unit performs processing for outputting the acquired object information as display data for the first display unit.
- The information processing apparatus according to claim 4, wherein the object information acquisition unit accesses a database in which image data of real objects are associated with object information, or a server, and acquires the object information by search processing based on the image data of the real object.
- The information processing apparatus according to claim 1, wherein the virtual object management unit calculates a plane including the display surface of the first display unit on the basis of three-dimensional position information of constituent parts of the first display unit contained in the camera-captured image, and calculates the three-dimensional position of the cursor such that the cursor position is set on that plane.
- The information processing apparatus according to claim 1, wherein the cursor is a mouse cursor that is moved by mouse operation, and the coordinate processing module receives displacement information of the mouse cursor resulting from the mouse operation and determines whether the position of the mouse cursor lies inside or outside the area of the first display unit.
- An information processing method executed in an information processing apparatus, the method comprising:
a coordinate processing step in which a coordinate processing module determines whether the position of a cursor, which is a position indicator displayed on a first display unit, lies inside or outside the area of the first display unit, and outputs cursor position information to a virtual object management unit when the position lies outside the area;
an image capturing step in which a camera captures an image consisting of real objects including the first display unit;
a three-dimensional information analysis step in which a three-dimensional information analysis unit analyzes the three-dimensional positions of the real objects contained in the camera-captured image; and
a virtual object management step in which a virtual object management unit generates a virtual object different from the real objects contained in the camera-captured image, generates a composite image containing the generated virtual object and the real objects, and displays the composite image on a second display unit,
wherein the virtual object management step is a step of calculating a three-dimensional position of the cursor on the basis of the cursor position information input from the coordinate processing module and displaying, on the second display unit, a composite image in which the cursor is set as a virtual object at the calculated position.
- A program that causes an information processing apparatus to execute information processing, the program comprising:
a coordinate processing step of causing a coordinate processing module to determine whether the position of a cursor, which is a position indicator displayed on a first display unit, lies inside or outside the area of the first display unit, and to output cursor position information to a virtual object management unit when the position lies outside the area;
an image capturing step of causing a camera to capture an image consisting of real objects including the first display unit;
a three-dimensional information analysis step of causing a three-dimensional information analysis unit to analyze the three-dimensional positions of the real objects contained in the camera-captured image; and
a virtual object management step of causing a virtual object management unit to generate a virtual object different from the real objects contained in the camera-captured image, to generate a composite image containing the generated virtual object and the real objects, and to display the composite image on a second display unit,
wherein the virtual object management step is a step of causing a three-dimensional position of the cursor to be calculated on the basis of the cursor position information input from the coordinate processing module and causing a composite image in which the cursor is set as a virtual object at the calculated position to be displayed on the second display unit.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010800317740A CN102473068B (zh) | 2009-07-21 | 2010-06-30 | 信息处理器、处理方法和程序 |
BR112012000913A BR112012000913A2 (pt) | 2009-07-21 | 2010-06-30 | processador de informação, método de processamento de informação, e, programa. |
EP10802154A EP2458486A1 (en) | 2009-07-21 | 2010-06-30 | Information processing device, information processing method, and program |
US13/383,511 US8751969B2 (en) | 2009-07-21 | 2010-06-30 | Information processor, processing method and program for displaying a virtual image |
RU2012101245/08A RU2524836C2 (ru) | 2009-07-21 | 2010-06-30 | Информационный процессор, способ обработки и программа |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009170118A JP5263049B2 (ja) | 2009-07-21 | 2009-07-21 | 情報処理装置、および情報処理方法、並びにプログラム |
JP2009-170118 | 2009-07-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011010533A1 true WO2011010533A1 (ja) | 2011-01-27 |
Family
ID=43499009
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/061161 WO2011010533A1 (ja) | 2009-07-21 | 2010-06-30 | 情報処理装置、および情報処理方法、並びにプログラム |
Country Status (9)
Country | Link |
---|---|
US (1) | US8751969B2 (ja) |
EP (1) | EP2458486A1 (ja) |
JP (1) | JP5263049B2 (ja) |
KR (1) | KR20120069654A (ja) |
CN (1) | CN102473068B (ja) |
BR (1) | BR112012000913A2 (ja) |
RU (1) | RU2524836C2 (ja) |
TW (1) | TW201108037A (ja) |
WO (1) | WO2011010533A1 (ja) |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8594467B2 (en) * | 2008-12-19 | 2013-11-26 | Microsoft Corporation | Interactive virtual display system for ubiquitous devices |
JP2012212340A (ja) | 2011-03-31 | 2012-11-01 | Sony Corp | 情報処理装置、画像表示装置、および情報処理方法 |
JP2013016116A (ja) | 2011-07-06 | 2013-01-24 | Sony Corp | 情報処理装置、画像表示装置、および情報処理方法 |
JP5821526B2 (ja) * | 2011-10-27 | 2015-11-24 | ソニー株式会社 | 画像処理装置、画像処理方法及びプログラム |
JP6044079B2 (ja) | 2012-02-06 | 2016-12-14 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
JP2013174642A (ja) * | 2012-02-23 | 2013-09-05 | Toshiba Corp | 映像表示装置 |
JP5580855B2 (ja) * | 2012-06-12 | 2014-08-27 | 株式会社ソニー・コンピュータエンタテインメント | 障害物回避装置および障害物回避方法 |
US10523899B2 (en) * | 2013-06-26 | 2019-12-31 | Touchcast LLC | System and method for providing and interacting with coordinated presentations |
CN104683683A (zh) * | 2013-11-29 | 2015-06-03 | 英业达科技有限公司 | 拍摄影像的系统及其方法 |
JP6357843B2 (ja) * | 2014-04-10 | 2018-07-18 | 凸版印刷株式会社 | アプリケーション検査システム、アプリケーション検査装置及びアプリケーション検査プログラム |
JP6280435B2 (ja) * | 2014-04-28 | 2018-02-14 | 富士通コンポーネント株式会社 | プログラム、中継装置及び情報処理装置 |
US9904055B2 (en) | 2014-07-25 | 2018-02-27 | Microsoft Technology Licensing, Llc | Smart placement of virtual objects to stay in the field of view of a head mounted display |
US9865089B2 (en) | 2014-07-25 | 2018-01-09 | Microsoft Technology Licensing, Llc | Virtual reality environment with real world objects |
US10311638B2 (en) | 2014-07-25 | 2019-06-04 | Microsoft Technology Licensing, Llc | Anti-trip when immersed in a virtual reality environment |
US10451875B2 (en) | 2014-07-25 | 2019-10-22 | Microsoft Technology Licensing, Llc | Smart transparency for virtual objects |
US10416760B2 (en) | 2014-07-25 | 2019-09-17 | Microsoft Technology Licensing, Llc | Gaze-based object placement within a virtual reality environment |
US20160027214A1 (en) * | 2014-07-25 | 2016-01-28 | Robert Memmott | Mouse sharing between a desktop and a virtual world |
US9766460B2 (en) | 2014-07-25 | 2017-09-19 | Microsoft Technology Licensing, Llc | Ground plane adjustment in a virtual reality environment |
US9858720B2 (en) | 2014-07-25 | 2018-01-02 | Microsoft Technology Licensing, Llc | Three-dimensional mixed-reality viewport |
JP6197801B2 (ja) * | 2015-01-30 | 2017-09-20 | コニカミノルタ株式会社 | データ入力システム、データ入力装置、データ入力方法およびデータ入力プログラム |
WO2016157316A1 (ja) * | 2015-03-27 | 2016-10-06 | 富士通株式会社 | 表示方法、プログラム及び表示制御装置 |
US10936872B2 (en) | 2016-12-23 | 2021-03-02 | Realwear, Inc. | Hands-free contextually aware object interaction for wearable display |
US10437070B2 (en) | 2016-12-23 | 2019-10-08 | Realwear, Inc. | Interchangeable optics for a head-mounted display |
US11507216B2 (en) | 2016-12-23 | 2022-11-22 | Realwear, Inc. | Customizing user interfaces of binary applications |
US10393312B2 (en) | 2016-12-23 | 2019-08-27 | Realwear, Inc. | Articulating components for a head-mounted display |
US10620910B2 (en) | 2016-12-23 | 2020-04-14 | Realwear, Inc. | Hands-free navigation of touch-based operating systems |
US11099716B2 (en) | 2016-12-23 | 2021-08-24 | Realwear, Inc. | Context based content navigation for wearable display |
WO2018217470A1 (en) * | 2017-05-23 | 2018-11-29 | Pcms Holdings, Inc. | System and method for prioritizing ar information based on persistence of real-life objects in the user's view |
US10983663B2 (en) * | 2017-09-29 | 2021-04-20 | Apple Inc. | Displaying applications |
CN111433710A (zh) * | 2017-12-04 | 2020-07-17 | 索尼公司 | 信息处理装置、信息处理方法以及记录介质 |
US11049322B2 (en) * | 2018-06-18 | 2021-06-29 | Ptc Inc. | Transferring graphic objects between non-augmented reality and augmented reality media domains |
US11366514B2 (en) | 2018-09-28 | 2022-06-21 | Apple Inc. | Application placement based on head position |
US11644940B1 (en) | 2019-01-31 | 2023-05-09 | Splunk Inc. | Data visualization in an extended reality environment |
US11853533B1 (en) * | 2019-01-31 | 2023-12-26 | Splunk Inc. | Data visualization workspace in an extended reality environment |
US11055918B2 (en) * | 2019-03-15 | 2021-07-06 | Sony Interactive Entertainment Inc. | Virtual character inter-reality crossover |
WO2021061351A1 (en) * | 2019-09-26 | 2021-04-01 | Apple Inc. | Wearable electronic device presenting a computer-generated reality environment |
CN116360601A (zh) | 2019-09-27 | 2023-06-30 | 苹果公司 | 用于提供扩展现实环境的电子设备、存储介质和方法 |
JP2021140085A (ja) | 2020-03-06 | 2021-09-16 | 富士フイルムビジネスイノベーション株式会社 | 情報処理装置及びプログラム |
CN116438503A (zh) * | 2020-12-17 | 2023-07-14 | 三星电子株式会社 | 电子装置和电子装置的操作方法 |
JP2022098268A (ja) | 2020-12-21 | 2022-07-01 | 富士フイルムビジネスイノベーション株式会社 | 情報処理装置及びプログラム |
JP2024506630A (ja) | 2021-02-08 | 2024-02-14 | サイトフル コンピューターズ リミテッド | 生産性のためのエクステンデッド・リアリティ |
EP4288950A1 (en) | 2021-02-08 | 2023-12-13 | Sightful Computers Ltd | User interactions in extended reality |
EP4295314A1 (en) | 2021-02-08 | 2023-12-27 | Sightful Computers Ltd | Content sharing in extended reality |
WO2023009580A2 (en) | 2021-07-28 | 2023-02-02 | Multinarity Ltd | Using an extended reality appliance for productivity |
WO2023028571A1 (en) * | 2021-08-27 | 2023-03-02 | Chinook Labs Llc | System and method of augmented representation of an electronic device |
US11948263B1 (en) | 2023-03-14 | 2024-04-02 | Sightful Computers Ltd | Recording the complete physical and extended reality environments of a user |
US20230334795A1 (en) | 2022-01-25 | 2023-10-19 | Multinarity Ltd | Dual mode presentation of user interface elements |
US12099696B2 (en) | 2022-09-30 | 2024-09-24 | Sightful Computers Ltd | Displaying virtual content on moving vehicles |
US11822941B2 (en) * | 2023-08-28 | 2023-11-21 | International Business Machines Corporation | Mobile computing device projected visualization interaction |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5305435A (en) * | 1990-07-17 | 1994-04-19 | Hewlett-Packard Company | Computer windows management system and method for simulating off-screen document storage and retrieval |
US6909443B1 (en) * | 1999-04-06 | 2005-06-21 | Microsoft Corporation | Method and apparatus for providing a three-dimensional task gallery computer interface |
JP4178697B2 (ja) * | 1999-11-18 | 2008-11-12 | ソニー株式会社 | 携帯型情報処理端末、情報入出力システム及び情報入出力方法 |
FI20010958A0 (fi) * | 2001-05-08 | 2001-05-08 | Nokia Corp | Menetelmä ja järjestely laajennetun käyttöliittymän muodostamiseksi |
FI117488B (fi) * | 2001-05-16 | 2006-10-31 | Myorigo Sarl | Informaation selaus näytöllä |
US7369102B2 (en) * | 2003-03-04 | 2008-05-06 | Microsoft Corporation | System and method for navigating a graphical user interface on a smaller display |
JP2006172423A (ja) * | 2004-11-18 | 2006-06-29 | Canon Inc | 遠隔操作システム、遠隔操作装置、被操作装置、遠隔操作方法、コンピュータプログラム、記憶媒体 |
US7843470B2 (en) * | 2005-01-31 | 2010-11-30 | Canon Kabushiki Kaisha | System, image processing apparatus, and information processing method |
US20100164990A1 (en) * | 2005-08-15 | 2010-07-01 | Koninklijke Philips Electronics, N.V. | System, apparatus, and method for augmented reality glasses for end-user programming |
US20070052672A1 (en) * | 2005-09-08 | 2007-03-08 | Swisscom Mobile Ag | Communication device, system and method |
SE0601216L (sv) * | 2006-05-31 | 2007-12-01 | Abb Technology Ltd | Virtuell arbetsplats |
US8277316B2 (en) * | 2006-09-14 | 2012-10-02 | Nintendo Co., Ltd. | Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting |
JP4909851B2 (ja) * | 2007-09-25 | 2012-04-04 | 日立アプライアンス株式会社 | 洗濯乾燥機 |
US20090221368A1 (en) * | 2007-11-28 | 2009-09-03 | Ailive Inc., | Method and system for creating a shared game space for a networked game |
US20090237492A1 (en) * | 2008-03-18 | 2009-09-24 | Invism, Inc. | Enhanced stereoscopic immersive video recording and viewing |
US8176434B2 (en) * | 2008-05-12 | 2012-05-08 | Microsoft Corporation | Virtual desktop view scrolling |
- 2009-07-21 JP JP2009170118A patent/JP5263049B2/ja not_active Expired - Fee Related
- 2010-06-17 TW TW099119746A patent/TW201108037A/zh unknown
- 2010-06-30 KR KR1020127000896A patent/KR20120069654A/ko not_active Application Discontinuation
- 2010-06-30 US US13/383,511 patent/US8751969B2/en active Active
- 2010-06-30 BR BR112012000913A patent/BR112012000913A2/pt not_active IP Right Cessation
- 2010-06-30 CN CN2010800317740A patent/CN102473068B/zh not_active Expired - Fee Related
- 2010-06-30 RU RU2012101245/08A patent/RU2524836C2/ru not_active IP Right Cessation
- 2010-06-30 WO PCT/JP2010/061161 patent/WO2011010533A1/ja active Application Filing
- 2010-06-30 EP EP10802154A patent/EP2458486A1/en not_active Withdrawn
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000056896A (ja) * | 1998-08-13 | 2000-02-25 | Nec Corp | ポインティング装置 |
JP2006154902A (ja) * | 2004-11-25 | 2006-06-15 | Olympus Corp | 手書き画像表示システム及び空間手書き用携帯情報端末 |
JP2008304269A (ja) | 2007-06-06 | 2008-12-18 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
JP2008304268A (ja) | 2007-06-06 | 2008-12-18 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
WO2009072504A1 (ja) * | 2007-12-07 | 2009-06-11 | Sony Corporation | 制御装置、入力装置、制御システム、制御方法及びハンドヘルド装置 |
Non-Patent Citations (1)
Title |
---|
Andrew J. Davison, "Real-time simultaneous localisation and mapping with a single camera", Proceedings of the 9th International Conference on Computer Vision, 2003
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2795893A4 (en) * | 2011-12-20 | 2015-08-19 | Intel Corp | PRESENTATIONS OF AN ADVANCED REALITY BETWEEN SEVERAL EQUIPMENT |
KR101574099B1 (ko) | 2011-12-20 | 2015-12-03 | 인텔 코포레이션 | 다수의 장치에 걸친 증강 현실 표현 |
US9952820B2 (en) | 2011-12-20 | 2018-04-24 | Intel Corporation | Augmented reality representations across multiple devices |
Also Published As
Publication number | Publication date |
---|---|
JP5263049B2 (ja) | 2013-08-14 |
CN102473068A (zh) | 2012-05-23 |
US8751969B2 (en) | 2014-06-10 |
US20120124509A1 (en) | 2012-05-17 |
BR112012000913A2 (pt) | 2016-03-01 |
JP2011028309A (ja) | 2011-02-10 |
KR20120069654A (ko) | 2012-06-28 |
RU2524836C2 (ru) | 2014-08-10 |
CN102473068B (zh) | 2013-12-25 |
EP2458486A1 (en) | 2012-05-30 |
TW201108037A (en) | 2011-03-01 |
RU2012101245A (ru) | 2013-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5263049B2 (ja) | 情報処理装置、および情報処理方法、並びにプログラム | |
CN110073313B (zh) | 使用母设备和至少一个伴随设备与环境交互 | |
US10095458B2 (en) | Information processing apparatus, information processing method, non-transitory computer-readable storage medium, and system | |
JP5158006B2 (ja) | 情報処理装置、および情報処理方法、並びにプログラム | |
US9996982B2 (en) | Information processing device, authoring method, and program | |
CN103377487B (zh) | 信息处理设备、显示控制方法以及程序 | |
JP5991423B2 (ja) | 表示装置、表示方法、表示プログラムおよび位置設定システム | |
JP5807686B2 (ja) | 画像処理装置、画像処理方法及びプログラム | |
JP2016122392A (ja) | 情報処理装置、情報処理システム、その制御方法及びプログラム | |
US20140142900A1 (en) | Information processing apparatus, information processing method, and program | |
JP5472509B2 (ja) | 情報処理装置、および情報処理方法、並びに情報記録媒体 | |
Georgel et al. | Navigation tools for viewing augmented cad models | |
JP2020009008A (ja) | 情報処理システム及び情報処理システム用プログラム | |
JP5175794B2 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
JP7401245B2 (ja) | 画像合成装置、画像合成装置の制御方法およびプログラム | |
JP6732082B1 (ja) | 画像生成装置、画像生成方法、および画像生成プログラム | |
JP6575221B2 (ja) | 表示制御方法、情報処理装置及び表示制御プログラム | |
JP2008186207A (ja) | 動体シミュレーション向け地図処理装置 | |
JP2005115467A (ja) | 仮想オブジェクト操作プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080031774.0 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10802154 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010802154 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13383511 Country of ref document: US Ref document number: 413/CHENP/2012 Country of ref document: IN |
|
ENP | Entry into the national phase |
Ref document number: 20127000896 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012101245 Country of ref document: RU |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112012000913 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112012000913 Country of ref document: BR Kind code of ref document: A2 Effective date: 20120113 |