WO2024075544A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2024075544A1 (PCT/JP2023/034418)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- user
- controller
- control unit
- processing device
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/22—Setup operations, e.g. calibration, key configuration or button assignment
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/428—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
Definitions
- This disclosure relates to an information processing device, an information processing method, and a program.
- Abbreviations used below: XR (Extended Reality, also called Cross Reality), VR (Virtual Reality), AR (Augmented Reality), HMD (Head Mounted Display).
- Known devices for providing this XR to users include glasses- or hat-type HMDs (Head Mounted Displays).
- An HMD places a display and a camera in front of the user's eyes and displays virtual objects in the XR space, either as seen through the camera or after image processing, in the same way as real objects that exist in the real world.
- As a technology for reflecting user operations in the XR space, in the case of games, for example, a technology has been proposed that uses some kind of real object as a virtual game controller in the XR space (hereinafter, a "virtual controller") (see, for example, Patent Document 1). Also proposed is a technology that uses machine learning to learn the movements of the user's fingers relative to a real object serving as a virtual controller, thereby enhancing the operability of the virtual controller (see, for example, Patent Document 2).
- However, with the above-mentioned conventional technology, the user must prepare a real object to be used as a virtual controller. In addition, to enhance operability, the user must switch to a real object of a different shape and size for each game they want to play.
- This disclosure therefore proposes an information processing device, an information processing method, and a program that can further improve the convenience of operations in XR spaces.
- An information processing device according to an embodiment of the present disclosure includes a control unit that displays an object in an XR space so that the object can be operated by a user.
- The control unit displays, as the object, an execution screen of an app and a controller that operates the app, and performs control to operate the controller as a virtual controller that can be arbitrarily set without relying on an actual object in real space.
- FIG. 1 is a diagram illustrating an example of a schematic configuration of XR glasses according to an embodiment of the present disclosure.
- FIG. 2 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
- FIG. 3 is a schematic explanatory diagram (part 1) of an information processing method according to an embodiment of the present disclosure.
- FIG. 4 is a schematic explanatory diagram (part 2) of an information processing method according to an embodiment of the present disclosure.
- FIG. 5 is a block diagram showing an example configuration of the XR glasses according to an embodiment of the present disclosure.
- FIG. 6 is a block diagram illustrating a configuration example of a smartphone according to an embodiment of the present disclosure.
- FIG. 7 is a flowchart showing a processing procedure for setting a virtual controller.
- FIG. 8 is a diagram showing an example of an operation support process (part 1).
- FIG. 9 is a diagram showing an example of the operation support process (part 2).
- FIG. 10 is a block diagram illustrating a configuration example of a server device according to an embodiment of the present disclosure.
- FIG. 11 is a hardware configuration diagram illustrating an example of a computer that realizes the functions of a smartphone.
- XR space is a general term for various spaces in image processing technology that combine virtual and real spaces, such as the virtual reality space of VR, the augmented reality space of AR, the mixed reality space of MR (Mixed Reality), and the substitutional reality space of SR (Substitutional Reality).
- In the embodiment described below, the information processing device is mainly a smartphone 50.
- The application software (hereinafter, "app") running on the smartphone 50 is a game app, and an example will be given in which a user plays a game using a virtual controller while the game screen of this game app is displayed on an HMD.
- FIG. 1 is a diagram showing an example of a schematic configuration of the XR glasses 10 according to an embodiment of the present disclosure.
- The XR glasses 10 are realized as, for example, a glasses-type HMD worn on the head of the user U.
- The XR glasses 10 include a sensor unit 11 and a display unit 12.
- The sensor unit 11 is, for example, a group of sensors that acquire various sensor information related to the surroundings of the user U and the state of the user U.
- The example in FIG. 1 illustrates a stereo configuration with two sensor units 11, but the number of sensor units 11 is not limited.
- The sensor unit 11 functions as a camera for capturing images of the space in front of the user U's eyes and the user U's movements.
- The sensor unit 11 also functions as an IMU (Inertial Measurement Unit) that measures the acceleration and angular velocity of the XR glasses 10.
- Based on the sensor information of the sensor unit 11, which has these functions, the XR glasses 10 can recognize the direction and posture of the user U's head, the movements of the user U's fingers, and so on.
- The display unit 12 corresponds to the eyeglass lens portion located in front of the user U's eyes when the glasses are worn, and has optical transparency.
- The display unit 12 has a right-eye display 12a that displays an image for the right eye and a left-eye display 12b that displays an image for the left eye.
- The right-eye display 12a is positioned in front of the user U's right eye.
- The left-eye display 12b is positioned in front of the user U's left eye, as shown in FIG. 1.
- The display unit 12 does not necessarily have to be divided into a right-eye display 12a and a left-eye display 12b.
- An image for the right eye may be displayed on the right side of an integrally formed display, and an image for the left eye may be displayed on the left side of the display.
- By displaying an image for the right eye and an image for the left eye on the display unit 12, the XR glasses 10 allow the user U to see various objects in the XR space, including the virtual controller, in stereoscopic view in front of his or her line of sight.
- The shape of the XR glasses 10 is not limited to the example shown in FIG. 1.
- The XR glasses 10 may be a headband-type HMD that uses a band that goes around the entire side of the head or a band that passes over the top of the head, or a helmet-type HMD in which the visor portion corresponds to the display.
- <Configuration of information processing system> FIG. 2 is a diagram showing a configuration example of the information processing system 1 according to the embodiment of the present disclosure. As shown in FIG. 2, the information processing system 1 includes the XR glasses 10, a smartphone 50, and a server device 100.
- The smartphone 50 is a terminal device owned and used by the user U.
- The smartphone 50 corresponds to an example of an information processing device according to this embodiment.
- One or more game apps that the user U wishes to play are installed on the smartphone 50.
- When the user U selects a game app, the smartphone 50 launches and executes that app.
- The XR glasses 10 and the smartphone 50 are connected so that they can communicate with each other; as shown in FIG. 2, they are connected so as to communicate directly, for example via Bluetooth (registered trademark).
- The XR glasses 10 and the smartphone 50 work together through this communication, and the smartphone 50 mirrors the game screen of the running game app on the display unit 12 of the XR glasses 10.
- The server device 100 is realized, for example, as a cloud server.
- The server device 100 is a game server.
- The server device 100 is provided so as to be able to communicate with one or more smartphones 50 via a network N, such as the Internet or a mobile phone network.
- The server device 100 collects and analyzes, for example, the operation history of game apps and the usage history of virtual controllers from each smartphone 50, and manages them as history information and ranking information.
- The XR glasses 10 and the smartphone 50 are not limited to being directly connected to each other as shown in FIG. 2; they may also be connected via the network N.
- When using the existing technology described above, the user U must prepare a real object that he or she wants to use as a virtual controller. Furthermore, in order to further enhance operability, the user U must change the real object for each game he or she wants to play, to one with a shape and size appropriate for that game. In other words, the existing technology leaves room for improvement in terms of the convenience of operation in the XR space.
- Therefore, the smartphone 50 according to this embodiment displays an object in the XR space so that the user U can operate it.
- The smartphone 50 also displays an execution screen of an app and a controller for operating the app as the above-mentioned object, and performs control to operate the above-mentioned controller as a virtual controller that can be arbitrarily set without relying on an actual object in the real space.
- FIG. 3 is a schematic diagram (part 1) of an information processing method according to an embodiment of the present disclosure.
- FIG. 4 is a schematic diagram (part 2) of an information processing method according to an embodiment of the present disclosure.
- Specifically, the smartphone 50 displays a game screen G superimposed in the XR space displayed on the display unit 12 of the XR glasses 10, as shown in FIG. 3.
- The game screen G is a mirrored version of the game screen of the game app running on the smartphone 50.
- The smartphone 50 also displays the virtual controller VC1 superimposed in the XR space.
- For example, the smartphone 50 enlarges the group of operational components that are actually arranged on the game screen of the game app and displays them, superimposed in the XR space, as the virtual controller VC1 at a position within reach of the user U.
- When operated by the user U, the virtual controller VC1 is linked to the actual group of operational components. This allows the user U to play a game while operating the game app with the virtual controller VC1 within reach in the XR space, without having to prepare any real object.
- The user U can also move each of the operational components included in the virtual controller VC1 to any position. For example, when the smartphone 50 detects a movement in which the user U grabs and moves an operational component in the XR space, the smartphone 50 changes the arrangement of the corresponding component. This allows the user U to lay out the virtual controller VC1 in a preferred, easy-to-operate arrangement.
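- As an illustration of the kind of mapping involved, the following Python sketch projects an on-screen operational component into XR space and repositions it after a grab-and-move gesture. It is not from the patent; all names, units, and constants (OpPart, project_to_xr, the pixels-to-meters factor) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class OpPart:
    """An operational component on the game screen, in screen pixels."""
    name: str
    x: float
    y: float
    w: float
    h: float

def project_to_xr(part: OpPart, scale: float = 3.0, reach_z: float = 0.4):
    """Enlarge a screen-space part and place it within arm's reach in XR space.

    Returns (center_xyz, size_wh) in meters. The mapping constants are
    illustrative placeholders, not values from the patent.
    """
    px_to_m = 0.0005  # hypothetical pixels-to-meters conversion factor
    cx = (part.x + part.w / 2) * px_to_m * scale
    cy = -(part.y + part.h / 2) * px_to_m * scale
    return (cx, cy, reach_z), (part.w * px_to_m * scale, part.h * px_to_m * scale)

def move_part(layout: dict, name: str, new_center) -> None:
    """Rearrange one part when a grab-and-move gesture on it is detected."""
    _, size = layout[name]
    layout[name] = (new_center, size)
```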
- The smartphone 50 can also display a candidate list L of other virtual controllers superimposed in the XR space. This allows the user U to easily change the virtual controller by selecting one of the other virtual controllers from the candidate list L, in place of the virtual controller VC1 that the smartphone 50 has displayed as a default based on, for example, the past usage history for the same game.
- The virtual controllers displayed in the candidate list L are determined based on, for example, the track record of other users, the usage rate within the same genre, popularity, and so on. A specific example of this will be described later using FIG. 7.
- When the game app is of a type such as a racing game, the smartphone 50 selects a steering-wheel-shaped operation part according to that type, and displays it, superimposed in the XR space, as the virtual controller VC2 at a position within reach of the user U, as shown in FIG. 4.
- In FIG. 4, a portion of the virtual controller VC2 extends beyond the display unit 12, i.e., the field of view of the user U, but this can be resolved, for example, by the user U tilting his or her head downward so that the entire virtual controller VC2 is displayed.
- In this way, the user U can conveniently enjoy games such as racing games using a virtual controller VC2 that is easy to operate.
- As described above, the smartphone 50 according to this embodiment displays an object in the XR space so that the user U can operate it.
- The smartphone 50 also displays an execution screen of the app and a controller for operating the app as the above-mentioned object, and performs control to operate the above-mentioned controller as a virtual controller that can be arbitrarily set without relying on an actual object in the real space.
- Therefore, the information processing method according to this embodiment can further improve the convenience of operations in the XR space.
- The user U operates the virtual controllers VC1 and VC2 by pressing and rotating them with his or her fingers, and the responsiveness of the game app is expected to differ depending on the position of the fingers and the speed at which they are moved during operation. In particular, the position and movement of the fingers during operation are likely to reflect the habits of the user U. There may also be cases where the user U is unable to operate properly.
- Therefore, the smartphone 50 calculates, for example, the responsiveness of the game app to the operation of the user U as a match rate, and performs operation support processing based on the match rate.
- The match rate is a numerical value that indicates how well the position and movement of the user U's fingers match the operation area corresponding to the virtual controller when, for example, the movement of the user U's fingers estimated from image information is converted into an operation event for the game app.
- The match rate may also be called the coincidence rate.
- Depending on the match rate, the smartphone 50 can change the display mode of the virtual controller or calibrate the operation area. Specific examples of this point will be described later using FIG. 8 and FIG. 9.
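- As a minimal sketch of how such a match rate could be computed, assuming the operation area is a rectangle and finger positions are sampled from the image-based estimation, the following Python fragment counts the fraction of samples that land inside the area. The function name and data shapes are hypothetical, not from the patent.

```python
def match_rate(finger_positions, op_area):
    """Fraction of sampled finger positions falling inside the operation area.

    finger_positions: list of (x, y) samples estimated from image information.
    op_area: (x_min, y_min, x_max, y_max) rectangle assigned to the controller.
    """
    if not finger_positions:
        return 0.0
    x0, y0, x1, y1 = op_area
    hits = sum(1 for x, y in finger_positions if x0 <= x <= x1 and y0 <= y <= y1)
    return hits / len(finger_positions)

# Example: 3 of 4 samples fall inside the area, so the match rate is 0.75
print(match_rate([(1, 1), (2, 2), (3, 3), (9, 9)], (0, 0, 5, 5)))
```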
- FIG. 5 is a block diagram showing an example of the configuration of the XR glasses 10 according to an embodiment of the present disclosure. Note that FIG. 5, as well as FIG. 6 and FIG. 10 shown later, show only the components necessary for explaining the features of this embodiment, and descriptions of general components are omitted.
- Each component shown in FIG. 5, FIG. 6, and FIG. 10 is a functional concept and does not necessarily have to be physically configured as shown.
- The specific form of distribution and integration of each block is not limited to that shown, and all or part of it can be functionally or physically distributed and integrated in any unit depending on various loads, usage conditions, and the like.
- As shown in FIG. 5, the XR glasses 10 include the sensor unit 11, the display unit 12, a communication unit 13, a storage unit 14, and a control unit 15.
- The sensor unit 11 is a group of sensors that acquire various sensor information related to the situation around the user U and the state of the user U.
- The sensor unit 11 includes a camera 11a and an inertial sensor 11b.
- The camera 11a captures an image of a subject located in front of the user U (i.e., real space and actual objects located in real space).
- The subject located in front of the user U includes the user U's fingers.
- The inertial sensor 11b corresponds to the IMU mentioned above.
- The inertial sensor 11b includes an acceleration sensor and an angular velocity sensor, not shown.
- The inertial sensor 11b measures the acceleration and angular velocity of the XR glasses 10.
- The XR glasses 10 recognize the orientation, posture, and body movement of the user U's head and body, the direction of the field of view, the speed of viewpoint movement, and so on based on the sensor information of the inertial sensor 11b.
- The sensor unit 11 does not necessarily have to be mounted on the XR glasses 10.
- The sensor unit 11 may be, for example, a group of external sensors connected to the XR glasses 10 by wire or wirelessly.
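- The patent does not specify how the head orientation is derived from the IMU, but a complementary filter is one common way to fuse accelerometer and gyroscope readings into a pose estimate. The sketch below illustrates that general idea under stated assumptions; it is not the device's actual algorithm.

```python
import math

def complementary_filter(pitch, gyro_rate, accel, dt, alpha=0.98):
    """Fuse gyro and accelerometer readings into a pitch estimate (radians).

    pitch: previous pitch estimate; gyro_rate: angular velocity about the
    pitch axis (rad/s); accel: (ax, ay, az) in m/s^2; dt: sample interval (s).
    alpha weights the smooth gyro integration against the noisy but
    drift-free gravity-based estimate. Illustrative values only.
    """
    accel_pitch = math.atan2(accel[1], accel[2])  # pitch implied by gravity
    gyro_pitch = pitch + gyro_rate * dt           # integrate angular velocity
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```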
- The display unit 12 has already been described, so a detailed description is omitted here.
- The communication unit 13 is realized by a network adapter or the like.
- The communication unit 13 is connected to the smartphone 50 either directly, via short-range wireless communication such as Bluetooth (registered trademark) or a wired connection, or wirelessly via the network N, and transmits and receives various information to and from the smartphone 50.
- The storage unit 14 is realized by a storage device such as a RAM (Random Access Memory), a ROM (Read Only Memory), or a flash memory.
- The storage unit 14 stores linkage control information 14a and display model information 14b.
- The linkage control information 14a is information about the control performed when the XR glasses 10 link with the smartphone 50, and includes, for example, setting information about network settings.
- The display model information 14b is information about the display models and display positions of the various objects, including the virtual controller, displayed on the display unit 12.
- The storage unit 14 also stores various programs that run on the XR glasses 10.
- The control unit 15 corresponds to a so-called processor and is realized, for example, by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), or the like executing various programs stored in the storage unit 14 using the RAM as a working area.
- The control unit 15 can also be realized, for example, by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
- The control unit 15 has an acquisition unit 15a, a transmission unit 15b, a generation unit 15c, and a display control unit 15d, and realizes or executes the functions and actions of the information processing described below.
- The acquisition unit 15a acquires various sensor information from the sensor unit 11. For example, the acquisition unit 15a acquires imaging data captured by the camera 11a as image information. The acquisition unit 15a also acquires, via the communication unit 13, the game screen G transmitted from the smartphone 50 and notifications of the virtual controllers selected on the smartphone 50.
- The transmission unit 15b transmits the image information from the camera 11a acquired by the acquisition unit 15a to the smartphone 50 via the communication unit 13.
- The generation unit 15c generates a display screen including the various objects obtained from the smartphone 50, such as the game screen G and the virtual controller, based on the display model information 14b.
- The display control unit 15d performs display control to display the display screen generated by the generation unit 15c on the display unit 12 in accordance with the sensor information from the sensor unit 11.
- FIG. 6 is a block diagram showing a configuration example of the smartphone 50 according to an embodiment of the present disclosure.
- As shown in FIG. 6, the smartphone 50 includes an operation unit 51, a display unit 52, a communication unit 53, a storage unit 54, and a control unit 55.
- The operation unit 51 accepts various operations performed by the user U on the smartphone 50.
- The display unit 52 displays various display information for the user U on the smartphone 50.
- The operation unit 51 and the display unit 52 are realized integrally by, for example, a touch panel display.
- The communication unit 53 is realized by a network adapter or the like.
- The communication unit 53 is connected to the XR glasses 10 either directly, by short-range wireless communication such as Bluetooth (registered trademark) or a wired connection, or wirelessly via the network N, and transmits and receives various information to and from the XR glasses 10.
- The communication unit 53 is also connected to the server device 100 wirelessly via the network N, and transmits and receives various information to and from the server device 100.
- The storage unit 54 is realized by a storage device such as a RAM, a ROM, or a flash memory.
- The storage unit 54 stores app information 54a, linkage control information 54b, recognition model information 54c, and conversion model information 54d.
- The app information 54a is information that includes one or more game app programs, programs for apps other than game apps, programs relating to embodiments of the present disclosure, and various parameters used during the operation of these programs.
- The linkage control information 54b is information related to the control performed when the smartphone 50 links with the XR glasses 10, and includes, for example, setting information related to network settings.
- The recognition model information 54c is information including a recognition model used in the recognition process, described later, that the recognition unit 55e performs on image information from the XR glasses 10.
- The conversion model information 54d is information including a conversion model used in the operation conversion process executed by the operation conversion unit 55g described later. This conversion model may also be referred to as an operation conversion engine.
- The "operation conversion process" is a process for converting the movement of the user U's fingers, estimated from the recognition results of the image information from the XR glasses 10, into an operation event for the game app.
- The control unit 55 corresponds to a so-called processor and is realized, for example, by a CPU, an MPU, a GPU, or the like executing various programs stored in the storage unit 54 using the RAM as a working area.
- The control unit 55 can also be realized, for example, by an integrated circuit such as an ASIC or an FPGA.
- The control unit 55 has an app execution unit 55a, a setting unit 55b, a transmission unit 55c, an acquisition unit 55d, a recognition unit 55e, an estimation unit 55f, an operation conversion unit 55g, an operation control unit 55h, and an operation support processing unit 55i, and realizes or executes the functions and actions of the information processing described below.
- The app execution unit 55a reads the game app selected by the user U via the operation unit 51 from the app information 54a and executes it.
- The setting unit 55b sets the virtual controller to be displayed on the XR glasses 10 according to the game app selected by the user U.
- FIG. 7 is a flowchart showing the procedure for setting the virtual controller.
- In FIG. 7, the virtual controller is abbreviated as "V-con."
- As shown in FIG. 7, when the user U selects a game app (step S101), the setting unit 55b determines whether or not the user has used a virtual controller with that app in the past (step S102). If a virtual controller has been used in the past (step S102, Yes), the setting unit 55b sets the XR glasses 10 to display that virtual controller (step S103).
- If not (step S102, No), the setting unit 55b determines whether or not there is a history of other users using a virtual controller with the app (step S104).
- The history of other users is obtained, for example, as a result of an inquiry request made to the server device 100.
- Alternatively, the history of other users who have borrowed the smartphone 50 from the user U and used it may be stored in the storage unit 54.
- If there is such a history (step S104, Yes), the setting unit 55b sets the XR glasses 10 to display the virtual controllers that are frequently used by other users, based on that track record (step S105).
- If there is no such history (step S104, No), the setting unit 55b determines whether there is a virtual controller that is frequently used in the same genre as the game app selected in step S101 (step S106). Virtual controllers that are frequently used in the same genre are obtained, for example, by making an inquiry request to the server device 100.
- If there is a virtual controller that is frequently used in the same genre (step S106, Yes), the setting unit 55b sets the XR glasses 10 to display that virtual controller (step S107).
- If not (step S106, No), the setting unit 55b determines whether there is ranking information based on history (step S108).
- The ranking information based on history is obtained, for example, by making an inquiry request to the server device 100.
- If there is ranking information based on history (step S108, Yes), the setting unit 55b sets the XR glasses 10 to display multiple candidates for virtual controllers that are highly popular, based on the ranking information (step S109).
- If there is no ranking information based on history (step S108, No), the setting unit 55b sets the XR glasses 10 to display multiple preset candidates for virtual controllers (step S110). The setting unit 55b then ends the process of FIG. 7.
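- The branching of FIG. 7 amounts to a simple fallback chain. The Python sketch below mirrors steps S102 through S110 under the assumption that the relevant histories and rankings are available as plain lookups; the argument names and data shapes are hypothetical.

```python
def select_virtual_controller(app_id, genre, own_history, others_history,
                              genre_stats, ranking, presets):
    """Fallback chain mirroring steps S102-S110 of FIG. 7 (illustrative).

    own_history: app_id -> controller the user used before.
    others_history: app_id -> {controller: use count} from other users.
    genre_stats: genre -> {controller: use count} in the same genre.
    ranking: popularity-ordered controller list; presets: preset candidates.
    """
    if app_id in own_history:                       # S102 Yes -> S103
        return [own_history[app_id]]
    if app_id in others_history:                    # S104 Yes -> S105
        counts = others_history[app_id]
        return [max(counts, key=counts.get)]
    if genre in genre_stats:                        # S106 Yes -> S107
        counts = genre_stats[genre]
        return [max(counts, key=counts.get)]
    if ranking:                                     # S108 Yes -> S109
        return ranking[:3]                          # several popular candidates
    return presets                                  # S108 No -> S110
```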
- The transmission unit 55c notifies the XR glasses 10, via the communication unit 53, of the virtual controller selected by the setting unit 55b.
- The acquisition unit 55d acquires the image information transmitted from the XR glasses 10 via the communication unit 53.
- The recognition unit 55e performs image recognition based on the image information acquired by the acquisition unit 55d and the recognition model information 54c.
- The estimation unit 55f estimates the movement of the object recognized as the user U's fingers as a result of the image recognition by the recognition unit 55e.
- The operation conversion unit 55g executes the above-mentioned "operation conversion process" based on the movement estimated by the estimation unit 55f and the conversion model information 54d. If the virtual controller is a virtual keyboard, the operation conversion unit 55g converts the movement into a touch event. If the virtual controller is something other than a virtual keyboard, the operation conversion unit 55g converts the movement into a sensor event.
- In addition, the operation support processing unit 55i described below executes operation support processing.
- The operation conversion unit 55g also guides the user U to a position where the operation can be converted, by visually expressing the operation on the virtual controller.
- The operation control unit 55h notifies the app execution unit 55a of the various events that have been converted into operation events by the operation conversion unit 55g.
- The app execution unit 55a progresses the game app based on the notified events.
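- A minimal sketch of this conversion-and-notification path, assuming the estimated finger motion arrives as a plain dictionary, might look as follows; the event payloads and key names are hypothetical, not an API defined by the patent.

```python
def convert_operation(controller_type, motion):
    """Convert an estimated finger motion into an operation event.

    Mirrors the dispatch described above: a virtual keyboard yields a touch
    event, any other virtual controller yields a sensor event.
    motion: hypothetical dict such as {"contact": (x, y), "values": [...]}.
    """
    if controller_type == "virtual_keyboard":
        return {"type": "touch", "pos": motion["contact"]}
    return {"type": "sensor", "values": motion["values"]}

# The operation control unit would then pass the event on, for example:
event = convert_operation("virtual_keyboard", {"contact": (120, 340), "values": []})
# app_execution_unit.notify(event)  # hypothetical call progressing the game app
```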
- FIG. 8 is a diagram (part 1) showing an example of the operation support processing.
- FIG. 9 is a diagram (part 2) showing an example of the operation support processing.
- As described above, the operation conversion unit 55g guides the user U to a position where the operation can be converted, for example by visually expressing the operation on the virtual controller.
- Specifically, as shown in FIG. 8, the operation support processing unit 55i calculates in real time, for example for the virtual controller VC2, the match rate of the position of the user U's fingers relative to the virtual controller VC2.
- The operation support processing unit 55i then changes, in real time, the manner in which the virtual controller VC2 is displayed on the XR glasses 10 depending on the level of the calculated match rate. For example, as shown in FIG. 8, the operation support processing unit 55i changes the display in real time so that the lower the match rate, the lighter the color. At this time, the lower the match rate, the thinner the outline of the virtual controller VC2 may also be made. This can be said to reduce the visibility of the virtual controller VC2.
- Conversely, the operation support processing unit 55i changes the display on the XR glasses 10 in real time so that the higher the match rate, the darker the color. At this time, the higher the match rate, the thicker the outline of the virtual controller VC2 may also be made. This can be said to increase the visibility of the virtual controller VC2.
- As a result, the user U can recognize whether he or she is operating the virtual controller VC2 well, and, if not, can easily recognize the need to change the position of his or her fingers or the speed at which they are moved when operating the virtual controller VC2.
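- The color-and-outline behavior just described can be pictured as a simple mapping from the match rate to display parameters; the following sketch is illustrative, with made-up constants rather than values from the patent.

```python
def controller_display_style(match_rate: float) -> dict:
    """Map the current match rate (0.0-1.0) to a display style.

    Lower match rate: lighter color and thinner outline (lower visibility).
    Higher match rate: darker color and thicker outline (higher visibility).
    """
    alpha = 0.2 + 0.8 * match_rate        # color opacity, illustrative range
    outline_px = 1.0 + 3.0 * match_rate   # outline width in pixels, illustrative
    return {"alpha": alpha, "outline_px": outline_px}
```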
- The operation support processing unit 55i may also calibrate the operation area of the virtual controller based on the average match rate over one game.
- As shown in FIG. 9, the operation support processing unit 55i calibrates the operation area (the area surrounded by a dashed line around the virtual controller VC2 in the figure) according to the match rate for each game, so that the match rate is higher in the second game than in the first game, and higher in the third game than in the second game.
- In the example of FIG. 9, after the first game is completed, calibration is performed to narrow the operation area in the vertical direction in order to improve the match rate. After the second game is completed, calibration is performed to widen the operation area in the horizontal direction. Then, after the third game is completed, calibration is performed to slightly narrow only the left side of the operation area.
- As a result, the user U can easily operate the game well, which in turn can increase the user U's satisfaction with the services provided by the information processing system 1 according to this embodiment.
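- One way to picture such per-game calibration, assuming the finger positions observed during the previous game are retained, is to refit the operation area around where the fingers actually were; the sketch below is an assumption, not the patent's algorithm.

```python
def calibrate_operation_area(op_area, samples, margin=0.05):
    """Refit the operation area to the finger positions seen in the last game.

    op_area: (x_min, y_min, x_max, y_max); samples: (x, y) finger positions.
    Each edge moves toward the observed distribution plus a small margin, so
    the area may narrow vertically, widen horizontally, etc., between games.
    """
    if not samples:
        return op_area
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    return (min(xs) - margin, min(ys) - margin, max(xs) + margin, max(ys) + margin)
```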
- FIG. 10 is a block diagram showing a configuration example of the server device 100 according to an embodiment of the present disclosure.
- As shown in FIG. 10, the server device 100 includes a communication unit 101, a storage unit 102, and a control unit 103.
- The communication unit 101 is realized by a network adapter or the like.
- The communication unit 101 is connected to the smartphone 50 via the network N in a wired or wireless manner, and transmits and receives various information to and from the smartphone 50.
- The storage unit 102 is realized by a storage device such as a RAM, a ROM, a flash memory, or a hard disk drive.
- The storage unit 102 stores history information 102a and ranking information 102b.
- The history information 102a is information that accumulates, for each smartphone 50, the execution history of game apps that use virtual controllers, acquired as needed from one or more smartphones 50 by an acquisition unit 103a described later.
- The ranking information 102b is information including various rankings compiled by an aggregation unit 103b (described later) based on the history information 102a.
- The rankings referred to here include rankings of the popularity of the virtual controllers mentioned above, as well as of their usage rates.
- The ranking information 102b manages these various rankings by category, such as by user, by game app genre, or by game app type.
- The storage unit 102 also stores various programs that run on the server device 100.
- The control unit 103 corresponds to a so-called processor and is realized, for example, by a CPU, an MPU, a GPU, or the like executing various programs stored in the storage unit 102 using the RAM as a working area.
- The control unit 103 can also be realized, for example, by an integrated circuit such as an ASIC or an FPGA.
- The control unit 103 has the acquisition unit 103a, the aggregation unit 103b, an extraction unit 103c, and a transmission unit 103d, and realizes or executes the functions and actions of the information processing described below.
- The acquisition unit 103a acquires, at any time and via the communication unit 101, the execution history of game apps that use virtual controllers, for each smartphone 50, from one or more smartphones 50.
- The acquisition unit 103a accumulates the acquired execution history in the history information 102a.
- The acquisition unit 103a also receives inquiry requests for the history information 102a and the ranking information 102b from the smartphone 50 via the communication unit 101, and notifies the extraction unit 103c of each acquired inquiry request.
- The aggregation unit 103b compiles the various rankings mentioned above based on the history information 102a and stores the results in the ranking information 102b; a sketch of this aggregation follows the description of the remaining units.
- In response to an inquiry request notified by the acquisition unit 103a, the extraction unit 103c extracts data corresponding to the answer to the request from the history information 102a and the ranking information 102b.
- The transmission unit 103d transmits the data extracted by the extraction unit 103c to the smartphone 50 via the communication unit 101.
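- As a sketch of what the aggregation mentioned above might look like, assuming each execution history record carries the user, genre, and controller used, the following Python fragment tallies usage overall and per genre; the record format is hypothetical.

```python
from collections import Counter

def aggregate_rankings(history):
    """Tally virtual controller usage from accumulated execution histories.

    history: iterable of records like {"user": ..., "genre": ..., "controller": ...}.
    Returns overall usage counts and per-genre counts, mirroring the by-user /
    by-genre / by-app categorization described above.
    """
    overall = Counter()
    by_genre = {}
    for rec in history:
        overall[rec["controller"]] += 1
        by_genre.setdefault(rec["genre"], Counter())[rec["controller"]] += 1
    return overall, by_genre
```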
- In the embodiment described above, the user U selects the virtual controller to be used from among those prepared in advance, and can arbitrarily change the default display or the layout of the selected virtual controller; however, this is not a limitation.
- The user U may also be able to generate a new virtual controller.
- In this case, the smartphone 50 causes the XR glasses 10 to display, in the XR space on the display unit 12 shown in, for example, FIG. 3 or FIG. 4, an object corresponding to a virtual controller generation tool.
- Specifically, the smartphone 50 causes the XR glasses 10 to display, in the XR space, a palette tool that allows the user to arbitrarily specify, for example, the shape of the virtual controller and the operation parts to be placed on it.
- The palette tool may include a link tool that links each of the placed operation parts to an operation part on the actual game screen.
- In this way, the user U can create a virtual controller to his or her preference from scratch, without preparing a real object.
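- The output of such a generation tool might be a small declarative description of the controller, for example as below; the structure, field names, and identifiers are purely hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualControllerSpec:
    """A user-generated virtual controller: its shape, the operation parts
    placed on it, and the links from each part to the actual game screen."""
    shape: str                                  # e.g. "pad" or "wheel"
    parts: dict = field(default_factory=dict)   # part name -> XR-space placement
    links: dict = field(default_factory=dict)   # part name -> on-screen target

spec = VirtualControllerSpec(shape="pad")
spec.parts["fire"] = {"pos": (0.1, -0.2, 0.4), "size": (0.06, 0.06)}
spec.links["fire"] = "game_screen_button_03"    # hypothetical identifier
```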
- In the embodiment described above, the virtual controller is a virtual controller for a game app, but it may also be a virtual controller for an app other than a game app.
- In the embodiment described above, the smartphone 50 is the main entity that executes the information processing according to the embodiment of the present disclosure, but some or all of that information processing may be performed by the XR glasses 10 or the server device 100.
- When the XR glasses 10 perform the processing, the control unit 15 of the XR glasses 10 has processing units that correspond to, for example, some or all of the setting unit 55b, the recognition unit 55e, the estimation unit 55f, the operation conversion unit 55g, the operation control unit 55h, and the operation support processing unit 55i of the smartphone 50.
- The control unit 15 then exchanges the various information required with the smartphone 50 and the server device 100 as needed, and executes information processing such as the setting processing shown in FIG. 7, the operation conversion processing and operation control processing based on the image information described using FIG. 5, and the operation support processing shown in FIG. 8 and FIG. 9.
- Similarly, the control unit 103 of the server device 100 may have processing units that correspond to, for example, some or all of the setting unit 55b, the recognition unit 55e, the estimation unit 55f, the operation conversion unit 55g, the operation control unit 55h, and the operation support processing unit 55i of the smartphone 50.
- The control unit 103 then exchanges the various information required with the XR glasses 10 and the smartphone 50 as needed, and executes the setting process shown in FIG. 7, the operation conversion process and operation control process based on the image information described using FIG. 5, and the operation support process shown in FIG. 8 and FIG. 9.
- Each component of each device shown in the figures is a functional concept and does not necessarily have to be physically configured as shown.
- The specific form of distribution and integration of each device is not limited to that shown in the figures, and all or part of them can be functionally or physically distributed and integrated in any unit depending on various loads, usage conditions, and the like.
- FIG. 11 is a hardware configuration diagram showing an example of a computer 1000 that realizes the functions of the smartphone 50.
- The computer 1000 has a CPU 1100, a RAM 1200, a ROM 1300, a secondary storage device 1400, a communication interface 1500, and an input/output interface 1600. The parts of the computer 1000 are connected by a bus 1050.
- The CPU 1100 operates based on programs stored in the ROM 1300 or the secondary storage device 1400, and controls each part. For example, the CPU 1100 loads programs stored in the ROM 1300 or the secondary storage device 1400 into the RAM 1200 and executes processes corresponding to the various programs.
- The ROM 1300 stores boot programs such as the BIOS (Basic Input Output System) executed by the CPU 1100 when the computer 1000 starts up, as well as programs that depend on the hardware of the computer 1000.
- The secondary storage device 1400 is a computer-readable recording medium that non-transitorily records programs executed by the CPU 1100 and data used by such programs. Specifically, the secondary storage device 1400 is a recording medium that records the program related to the present disclosure, which is an example of program data 1450.
- The communication interface 1500 is an interface for connecting the computer 1000 to an external network 1550 (e.g., the Internet).
- Via the communication interface 1500, the CPU 1100 receives data from other devices and transmits data generated by the CPU 1100 to other devices.
- The input/output interface 1600 is an interface for connecting an input/output device 1650 and the computer 1000.
- The CPU 1100 receives data from an input device such as a keyboard or a mouse via the input/output interface 1600.
- The CPU 1100 also transmits data to an output device such as a display, a speaker, or a printer via the input/output interface 1600.
- The input/output interface 1600 may also function as a media interface that reads programs and the like recorded on a specific recording medium.
- Examples of such media include optical recording media such as DVDs (Digital Versatile Discs) and PDs (Phase change rewritable Disks), magneto-optical recording media such as MOs (Magneto-Optical disks), tape media, magnetic recording media, and semiconductor memories.
- For example, when the computer 1000 functions as the smartphone 50, the CPU 1100 of the computer 1000 executes a program loaded onto the RAM 1200 to realize the functions of the control unit 55.
- The secondary storage device 1400 also stores the program according to this disclosure and the data in the storage unit 54.
- The CPU 1100 reads the program data 1450 from the secondary storage device 1400 and executes it, but as another example, the CPU 1100 may obtain these programs from another device via the external network 1550.
- As described above, the smartphone 50 (corresponding to an example of an "information processing device") includes the control unit 55, which displays an object in the XR space so that the object can be operated by the user U.
- The control unit 55 also displays an execution screen of an app and a controller that operates the app as the object, and performs control to operate the controller as a virtual controller that can be arbitrarily set without relying on an actual object in the real space. This can further improve the convenience of operations in the XR space.
- The present technology can also be configured as follows.
- (1) An information processing device comprising: a control unit that displays an object in an XR space so that the object can be operated by a user, wherein the control unit displays, as the object, an execution screen of an app and a controller for operating the app, and performs control to operate the controller as a virtual controller that can be arbitrarily set without depending on a real object in real space.
- (2) The information processing device according to (1), wherein the control unit displays, as a default, the virtual controller that the user has previously used with the app, from among the virtual controllers prepared in advance.
- (3) The information processing device according to (1), wherein the control unit displays, as a default, a virtual controller that has been used by other users with the app and has a high usage rate, from among the virtual controllers prepared in advance.
- (4) The information processing device according to (1), wherein the control unit displays, as a default, the virtual controller that is highly used in apps of the same genre as the app, from among the virtual controllers prepared in advance.
- (5) The information processing device according to (1), wherein the control unit displays, as the object arbitrarily selectable by the user, a candidate list of other virtual controllers in addition to the virtual controller displayed as a default.
- (6) The information processing device according to (1), wherein the control unit displays, as the object selectable by the user, a list of a plurality of candidates having high popularity based on a history, from among the virtual controllers prepared in advance.
- (7) The information processing device according to (1), wherein the control unit displays, as the object, a generation tool that enables the user to newly generate any virtual controller.
- (8) The information processing device according to any one of (1) to (7), wherein the control unit recognizes the object corresponding to a user's hand based on image information obtained by capturing an image of the real space, converts the recognized movement of the object into an operation event for the virtual controller, and progresses the app by notifying the app of the operation event.
- (9) The information processing device according to (8), wherein the control unit calculates a matching rate of a position of the object with respect to a predetermined operation area corresponding to the virtual controller based on the movement of the object, and executes an operation assistance process for assisting the user in operating the virtual controller based on the matching rate.
- (10) The information processing device according to (9), wherein the control unit displays the virtual controller such that the higher the matching rate, the greater the visibility of the virtual controller, and the lower the matching rate, the less the visibility of the virtual controller.
- (11) The information processing device according to (9) or (10), wherein the control unit calibrates the operation area based on the past matching rate.
- (12) The information processing device according to any one of (1) to (11), wherein a head mounted display having a display unit for displaying the XR space is communicably connected, and the control unit executes the app.
- (13) The information processing device according to any one of (1) to (12), wherein the app is a game app.
- (14) An information processing method in which a computer, which is an information processing device that displays an object in an XR space so that the object can be operated by a user, displays, as the object, an execution screen of an app and a controller for operating the app, and performs control to operate the controller as a virtual controller that can be arbitrarily set without depending on a real object in real space.
- (15) A program for causing a computer, which is an information processing device that displays an object in an XR space so that the object can be operated by a user, to execute: displaying, as the object, an execution screen of an app and a controller for operating the app; and performing control to operate the controller as a virtual controller that can be arbitrarily set without depending on a real object in real space.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2024555713A (JPWO2024075544A1) | 2022-10-06 | 2023-09-22 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022161457 | 2022-10-06 | ||
JP2022-161457 | 2022-10-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024075544A1 (ja) | 2024-04-11 |
Family
ID=90608189
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/034418 WO2024075544A1 (ja) | Information processing device, information processing method, and program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2024075544A1 |
WO (1) | WO2024075544A1 (enrdf_load_stackoverflow) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016177658A * | 2015-03-20 | 2016-10-06 | Casio Computer Co., Ltd. | Virtual input device, input method, and program
JP2018143785A * | 2018-05-15 | 2018-09-20 | GREE, Inc. | Program and display system
JP2021501496A * | 2017-09-29 | 2021-01-14 | Sony Interactive Entertainment Inc. | Robot utility and interface device
WO2022180894A1 * | 2021-02-24 | 2022-09-01 | Vessk LLC | Tactile augmented information processing system, software, method, and recording medium
- 2023-09-22: JP national-phase application JP2024555713A (published as JPWO2024075544A1), status pending
- 2023-09-22: PCT application PCT/JP2023/034418 filed (published as WO2024075544A1)
Also Published As
Publication number | Publication date |
---|---|
JPWO2024075544A1 | 2024-04-11 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23874673; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 2024555713; Country of ref document: JP
| NENP | Non-entry into the national phase | Ref country code: DE