WO2023112838A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2023112838A1
WO2023112838A1 (PCT/JP2022/045377)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual object
user
sight
line
control unit
Prior art date
Application number
PCT/JP2022/045377
Other languages
English (en)
Japanese (ja)
Inventor
智仁 山崎
Original Assignee
株式会社NTTドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社NTTドコモ
Publication of WO2023112838A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to an information processing device.
  • MR (Mixed Reality) is a technology in which the real environment perceived by the user is augmented by a computer.
  • With this technology, for example, it is possible to precisely superimpose a virtual space on the real space that the user sees through MR glasses worn on the head.
  • Patent Literature 1 discloses an information processing device that displays a realistic virtual reality image while ensuring the safety of a user riding in a vehicle.
  • the information processing device includes AR goggles on which images of a plurality of virtual first objects are displayed.
  • The information processing device detects a second object that actually exists outside or inside the vehicle. Further, when it is determined that the image of the first object is superimposed on a visibility-ensuring area that affects the user's visibility of the second object, the information processing device, for example, moves the position of the image of the first object so that it is no longer superimposed on the visibility-ensuring area.
  • However, Patent Document 1 only defines the positional relationship between the virtual first object and the real second object; it was not possible to change the state of an object according to which object the user is paying attention to.
  • The problem to be solved by the present invention is, when the target of the user's attention shifts away from one of two virtual objects arranged in a virtual space on which the user had been focusing, to change the state of the other virtual object.
  • An information processing apparatus includes: an acquisition unit that acquires line-of-sight information indicating a user's line of sight; a display control unit that displays a first virtual object and a second virtual object at positions that do not overlap each other in a virtual space as seen from the user; and an operation control unit that transitions the second virtual object from inactive to active after a transition from a first state, in which the user's line of sight indicated by the line-of-sight information intersects the first virtual object and the first virtual object is active, to a second state in which the user's line of sight does not intersect the first virtual object.
  • According to the present invention, when the target of the user's attention shifts away from one of two virtual objects arranged in the virtual space on which the user had been focusing, it is possible to change the state of the other virtual object.
  • FIG. 2 is a perspective view showing the appearance of the MR glasses 20 according to the first embodiment.
  • FIG. 3 is a schematic diagram of a virtual space VS provided to a user U1 by using the MR glasses 20 according to the first embodiment.
  • FIG. 4 is a schematic diagram of a virtual space VS provided to a user U1 by using the MR glasses 20 according to the first embodiment.
  • FIG. 5 is a block diagram showing a configuration example of the MR glasses 20 according to the first embodiment.
  • FIG. 6 is a block diagram showing a configuration example of a terminal device 10 according to the first embodiment.
  • FIG. 7 is an explanatory diagram of an operation example of the display control unit 112 according to the first embodiment.
  • FIG. 8 is a block diagram showing a configuration example of a server 30 according to the first embodiment.
  • FIG. 9 is a flowchart showing operations of the terminal device 10 according to the first embodiment.
  • 1. First Embodiment
  • The following describes a terminal device 10 as an information processing device according to the first embodiment of the present invention.
  • FIG. 1 is a diagram showing the overall configuration of an information processing system 1 according to the first embodiment of the present invention.
  • the information processing system 1 is a system that uses MR technology to provide a virtual space to a user U1 wearing MR glasses 20, which will be described later.
  • the information processing system 1 includes a terminal device 10, MR glasses 20, and a server 30.
  • the terminal device 10 is an example of an information processing device.
  • the terminal device 10 and the server 30 are communicably connected to each other via a communication network NET.
  • the terminal device 10 and the MR glasses 20 are connected so as to be able to communicate with each other.
  • In FIG. 1, the terminal device 10 and the MR glasses 20 are shown as a pair of the terminal device 10-1 and the MR glasses 20-1, a pair of the terminal device 10-2 and the MR glasses 20-2, and a pair of the terminal device 10-3 and the MR glasses 20-3, for a total of three pairs.
  • the number of sets is merely an example, and the information processing system 1 can include any number of sets of the terminal device 10 and the MR glasses 20 .
  • the server 30 provides various data and cloud services to the terminal device 10 via the communication network NET.
  • the terminal device 10 displays virtual objects placed in the virtual space on the MR glasses 20 that the user wears on the head.
  • the virtual space is, for example, a celestial space.
  • the virtual objects are, for example, virtual objects representing data such as still images, moving images, 3DCG models, HTML files, and text files, and virtual objects representing applications. Examples of text files include memos, source codes, diaries, and recipes. Examples of applications include browsers, applications for using SNS, and applications for generating document files.
  • the terminal device 10 is preferably a mobile terminal device such as a smart phone and a tablet, for example.
  • the MR glasses 20 are a see-through wearable display worn on the user's head.
  • the MR glasses 20 are controlled by the terminal device 10 to display a virtual object on the display panel provided for each of the binocular lenses. Note that the MR glasses 20 are an example of a display device.
  • FIG. 2 is a perspective view showing the appearance of the MR glasses 20. As shown in FIG. 2, the MR glasses 20 have temples 91 and 92, a bridge 93, frames 94 and 95, and lenses 41L and 41R, like general spectacles.
  • An imaging device 26 is provided in the bridge 93 . The imaging device 26 images the outside world. The imaging device 26 also outputs imaging information indicating the captured image.
  • Each of the lenses 41L and 41R has a half mirror.
  • the frame 94 is provided with a liquid crystal panel or an organic EL panel for the left eye and an optical member for guiding light emitted from the display panel for the left eye to the lens 41L.
  • a liquid crystal panel or an organic EL panel is hereinafter generically referred to as a display panel.
  • the half mirror provided in the lens 41L transmits external light and guides it to the left eye, and reflects the light guided by the optical member to enter the left eye.
  • the frame 95 is provided with a right-eye display panel and an optical member that guides light emitted from the right-eye display panel to the lens 41R.
  • the half mirror provided in the lens 41R transmits external light and guides it to the right eye, and reflects the light guided by the optical member to enter the right eye.
  • the display 28 which will be described later, includes a lens 41L, a left-eye display panel, a left-eye optical member, and a lens 41R, a right-eye display panel, and a right-eye optical member.
  • the user can observe the image displayed by the display panel in a see-through state superimposed on the state of the outside world.
  • The image for the left eye is displayed on the display panel for the left eye, and the image for the right eye is displayed on the display panel for the right eye, thereby allowing the user U1 to view the displayed image three-dimensionally.
  • FIGS. 3 and 4 are schematic diagrams of the virtual space VS provided to the user U1 by using the MR glasses 20.
  • As shown in FIG. 3, in the virtual space VS, virtual objects VO1 to VO5 representing various contents such as browsers, cloud services, images, and moving images are arranged.
  • The user U1 walks around the public space while wearing the MR glasses 20, on which the virtual objects VO1 to VO5 arranged in the virtual space VS are displayed, and can thereby experience the virtual space VS as a private space within the public space.
  • the user U1 can act in the public space while receiving benefits brought by the virtual objects VO1 to VO5 placed in the virtual space VS.
  • As shown in FIG. 4, it is possible for a plurality of users U1 to U3 to share the virtual space VS.
  • In this case, the plurality of users U1 to U3 share one or a plurality of virtual objects VO, and communication among the users U1 to U3 becomes possible through the shared virtual objects VO.
  • Note that any two virtual objects VO out of the plurality of virtual objects VO1 to VO5 are displayed so as not to overlap each other in the virtual space VS as seen from the user U1. Details of the display method of the virtual object VO will be described later.
  • FIG. 5 is a block diagram showing a configuration example of the MR glasses 20.
  • the MR glasses 20 include a processing device 21 , a storage device 22 , a line-of-sight detection device 23 , a GPS device 24 , a motion detection device 25 , an imaging device 26 , a communication device 27 , a display 28 and a speaker 29 .
  • Each element of the MR glasses 20 is interconnected by one or more buses for communicating information.
  • the term "apparatus" in this specification may be replaced with another term such as a circuit, a device, or a unit.
  • the processing device 21 is a processor that controls the MR glasses 20 as a whole.
  • the processing device 21 is configured using, for example, one or more chips.
  • The processing device 21 is configured using, for example, a central processing unit (CPU) including an interface with peripheral devices, an arithmetic device, registers, and the like. Some or all of the functions of the processing device 21 may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array). The processing device 21 executes various processes in parallel or sequentially.
  • DSP Digital Signal Processor
  • ASIC Application Specific Integrated Circuit
  • PLD Programmable Logic Device
  • FPGA Field Programmable Gate Array
  • the storage device 22 is a recording medium that can be read and written by the processing device 21 .
  • the storage device 22 also stores a plurality of programs including the control program PR1 executed by the processing device 21 .
  • the line-of-sight detection device 23 detects the line of sight of the user U1. Any method may be used to detect the line of sight by the line of sight detection device 23 .
  • the line-of-sight detection device 23 may detect line-of-sight information based on, for example, the position of the inner corner of the eye and the position of the iris.
  • the line-of-sight detection device 23 also outputs line-of-sight information indicating the line-of-sight direction of the user U1 to the processing device 21, which will be described later, based on the detection result.
  • the line-of-sight information output to the processing device 21 is output to the terminal device 10 via the communication device 27 .
  • the GPS device 24 receives radio waves from multiple satellites.
  • the GPS device 24 also generates position information from the received radio waves.
  • the positional information indicates the position of the MR glasses 20 .
  • the location information may be in any format as long as the location can be specified.
  • the position information indicates the latitude and longitude of the MR glasses 20, for example.
  • location information is obtained from GPS device 24 .
  • the MR glasses 20 may acquire position information by any method.
  • the acquired position information is supplied to the processing device 21 .
  • the position information output to the processing device 21 is transmitted to the terminal device 10 via the communication device 27 .
  • the motion detection device 25 detects motion of the MR glasses 20 .
  • the motion detection device 25 corresponds to an inertial sensor such as an acceleration sensor that detects acceleration and a gyro sensor that detects angular acceleration.
  • the acceleration sensor detects acceleration in orthogonal X-, Y-, and Z-axes.
  • the gyro sensor detects angular acceleration around the X-, Y-, and Z-axes.
  • the motion detection device 25 can generate posture information indicating the posture of the MR glasses 20 based on the output information of the gyro sensor.
  • the motion detection device 25 supplies orientation information relating to the orientation of the MR glasses 20 to the processing device 21 .
  • the motion detection device 25 supplies motion information relating to the motion of the MR glasses 20 to the processing device 21 .
  • the motion information includes acceleration data indicating three-axis acceleration and angular acceleration data indicating three-axis angular acceleration.
  • the posture information and motion information supplied to the processing device 21 are transmitted to the terminal device 10 via the communication device 27 .
  • the imaging device 26 outputs imaging information obtained by imaging the outside world.
  • the imaging device 26 includes, for example, a lens, an imaging element, an amplifier, and an AD converter.
  • the light condensed through the lens is converted into an image pickup signal, which is an analog signal, by the image pickup device.
  • the amplifier amplifies the imaging signal and outputs it to the AD converter.
  • the AD converter converts the amplified imaging signal, which is an analog signal, into imaging information, which is a digital signal.
  • the converted imaging information is supplied to the processing device 21 .
  • the imaging information supplied to the processing device 21 is transmitted to the terminal device 10 via the communication device 27 .
  • the communication device 27 is hardware as a transmission/reception device for communicating with other devices.
  • the communication device 27 is also called a network device, a network controller, a network card, a communication module, etc., for example.
  • the communication device 27 may include a connector for wired connection and an interface circuit corresponding to the connector. Further, the communication device 27 may have a wireless communication interface. Products conforming to wired LAN, IEEE1394, and USB are examples of connectors and interface circuits for wired connection. Also, as a wireless communication interface, there are products conforming to wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 28 is a device that displays images.
  • the display 28 displays various images under the control of the processing device 21 .
  • the display 28 includes the lens 41L, the left-eye display panel, the left-eye optical member, and the lens 41R, the right-eye display panel, and the right-eye optical member, as described above.
  • Various display panels such as a liquid crystal display panel and an organic EL display panel are preferably used as the display panel.
  • the speaker 29 is a device that emits sound.
  • the speaker 29 emits various sounds under the control of the processing device 21 .
  • Audio data, which is a digital signal, is converted into an audio signal, the audio signal is amplified in amplitude by an amplifier (not shown), and the speaker 29 emits the sound indicated by the amplified audio signal.
  • the processing device 21 functions as an acquisition unit 211 and a display control unit 212, for example, by reading the control program PR1 from the storage device 22 and executing it.
  • the acquisition unit 211 acquires image information indicating an image displayed on the MR glasses 20 from the terminal device 10 . More specifically, the acquisition unit 211 acquires, for example, second image information, which is transmitted from the terminal device 10 and will be described later.
  • The acquisition unit 211 also acquires line-of-sight information input from the line-of-sight detection device 23, position information input from the GPS device 24, posture information and motion information input from the motion detection device 25, and imaging information input from the imaging device 26. After that, the acquisition unit 211 outputs the acquired line-of-sight information, position information, posture information, motion information, and imaging information to the communication device 27.
  • the display control unit 212 causes the display 28 to display an image indicated by the second image information based on the second image information acquired from the terminal device 10 by the acquisition unit 211 .
  • FIG. 6 is a block diagram showing a configuration example of the terminal device 10. As shown in FIG.
  • the terminal device 10 includes a processing device 11 , a storage device 12 , a communication device 13 , a display 14 , an input device 15 and an inertial sensor 16 . Elements of the terminal device 10 are interconnected by one or more buses for communicating information.
  • the processing device 11 is a processor that controls the terminal device 10 as a whole. Also, the processing device 11 is configured using, for example, a single chip or a plurality of chips. The processing unit 11 is configured using, for example, a central processing unit (CPU) including interfaces with peripheral devices, arithmetic units, registers, and the like. A part or all of the functions of the processing device 11 may be implemented by hardware such as DSP, ASIC, PLD, and FPGA. The processing device 11 executes various processes in parallel or sequentially.
  • CPU central processing unit
  • the storage device 12 is a recording medium readable and writable by the processing device 11 .
  • the storage device 12 also stores a plurality of programs including the control program PR2 executed by the processing device 11 .
  • the communication device 13 is hardware as a transmission/reception device for communicating with other devices.
  • the communication device 13 is also called a network device, a network controller, a network card, a communication module, etc., for example.
  • the communication device 13 may include a connector for wired connection and an interface circuit corresponding to the connector. Further, the communication device 13 may have a wireless communication interface.
  • Products conforming to wired LAN, IEEE1394, and USB are examples of connectors and interface circuits for wired connection.
  • a wireless communication interface there are products conforming to wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 14 is a device that displays images and character information.
  • the display 14 displays various images under the control of the processing device 11 .
  • various display panels such as a liquid crystal display panel and an organic EL (Electro Luminescence) display panel are preferably used as the display 14 .
  • the input device 15 accepts operations from the user U1 who wears the MR glasses 20 on his head.
  • the input device 15 includes a pointing device such as a keyboard, touch pad, touch panel, or mouse.
  • the input device 15 may also serve as the display 14 .
  • the inertial sensor 16 is a sensor that detects inertial force.
  • the inertial sensor 16 includes, for example, one or more of an acceleration sensor, an angular velocity sensor, and a gyro sensor.
  • the processing device 11 detects the orientation of the terminal device 10 based on the output information from the inertial sensor 16 . Further, the processing device 11 receives selection of the virtual object VO, input of characters, and input of instructions in the celestial sphere virtual space VS based on the orientation of the terminal device 10 .
  • the user U1 directs the central axis of the terminal device 10 toward a predetermined area of the virtual space VS, and operates the input device 15 to select the virtual object VO arranged in the predetermined area.
  • the user U1's operation on the input device 15 is, for example, a double tap. By operating the terminal device 10 in this way, the user U1 can select the virtual object VO without looking at the input device 15 of the terminal device 10 .
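  • The selection described above can be pictured as a ray cast along the central axis of the terminal device 10, confirmed by the double tap. The following sketch is an assumed illustration only; the ray-versus-object test, the hit_radius tolerance, and the function names are inventions for this example, not taken from the patent.

```python
import math

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _norm(a):
    return math.sqrt(_dot(a, a))

def pick_virtual_object(device_pos, device_axis, objects, hit_radius=0.3):
    """Return the virtual object whose position lies closest to the ray cast
    along the central axis of the terminal device 10, or None if no object
    is within `hit_radius` of the ray."""
    best, best_dist = None, hit_radius
    axis_len = _norm(device_axis)
    if axis_len == 0.0:
        return None
    d = tuple(c / axis_len for c in device_axis)
    for obj_id, pos in objects.items():
        v = _sub(pos, device_pos)
        t = _dot(v, d)
        if t <= 0.0:                       # object is behind the device
            continue
        closest = tuple(device_pos[i] + d[i] * t for i in range(3))
        dist = _norm(_sub(pos, closest))
        if dist < best_dist:
            best, best_dist = obj_id, dist
    return best

def on_double_tap(device_pos, device_axis, objects):
    """Select the virtual object the device is pointed at when the user U1
    double-taps the input device 15 (hypothetical event handler)."""
    return pick_virtual_object(device_pos, device_axis, objects)
```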
  • the processing device 11 functions as an acquisition unit 111, a display control unit 112, a determination unit 113, and an operation control unit 114 by reading the control program PR2 from the storage device 12 and executing it.
  • the acquisition unit 111 acquires line-of-sight information indicating the line of sight of the user U1. More specifically, the acquisition unit 111 acquires line-of-sight information output from the MR glasses 20 via the communication device 13 . In addition, the acquisition unit 111 acquires first image information, which is described below and indicates an image displayed on the MR glasses 20 , from the server 30 via the communication device 13 .
  • The display control unit 112 causes the first virtual object VO1 and the second virtual object VO2 to be displayed at positions that do not overlap each other in the virtual space VS as seen from the user U1.
  • the first image information is image information indicating images of the first virtual object VO1 itself and the second virtual object VO2 itself.
  • The display control unit 112 generates layout information such that the first virtual object VO1 and the second virtual object VO2 do not overlap each other from the viewpoint of the user U1 in the virtual space VS viewed through the MR glasses 20, and transmits second image information, which includes the first image information, to the MR glasses 20 via the communication device 13.
  • FIG. 7 is an explanatory diagram of how the display control unit 112 displays the first virtual object VO1 and the second virtual object VO2.
  • X, Y and Z axes are orthogonal to each other in the virtual space VS.
  • The X-axis extends in the horizontal direction of the user U1; as viewed from the user U1, the right direction along the X-axis is the positive direction and the left direction along the X-axis is the negative direction.
  • The Y-axis extends in the front-rear direction of the user U1; as viewed from the user U1, the forward direction along the Y-axis is the positive direction and the backward direction along the Y-axis is the negative direction.
  • a horizontal plane is formed by these X-axis and Y-axis.
  • the Z-axis is orthogonal to the XY plane and extends in the vertical direction of the user U1. Further, as viewed from the user U1, the upward direction along the Z axis is the positive direction, and the downward direction along the Z axis is the negative direction.
  • In FIG. 7, a cylinder Y is assumed as a region where the line of sight V of the user U1 can exist while the user U1 is viewing the first virtual object VO1.
  • The cylinder Y has a central axis L1 whose first end point C1 is the position (x0, y0, z0) of the pupil EL of the left eyeball or the pupil ER of the right eyeball of the user U1 and whose second end point C2 is the position (x1, y1, z1) of the first virtual object VO1.
  • the radius R1 of the bottom surface of the cylinder Y is the length that allows the bottom surface to contain the first virtual object VO1.
  • Strictly speaking, the position of the pupil EL of the left eyeball or the pupil ER of the right eyeball of the user U1 does not exactly match the position of the first end point C1, and the position (x1, y1, z1) of the first virtual object VO1 does not exactly match the position of the second end point C2; for simplicity of explanation, however, each of these pairs of positions is assumed to match.
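  • As an illustration only (not part of the patent), the test of whether the line of sight V stays inside the cylinder Y can be written as a point-to-axis distance check; the sketch below assumes the gaze is sampled at the depth of the first virtual object VO1, and the function and helper names are invented for this example.

```python
import math

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _norm(a):
    return math.sqrt(_dot(a, a))

def gaze_point_in_cylinder(eye, gaze_dir, vo1_pos, radius):
    """Return True if the user's gaze, sampled at the depth of the first
    virtual object VO1, lies inside the cylinder Y whose central axis L1
    runs from the eye (first end point C1) to VO1 (second end point C2)."""
    axis = _sub(vo1_pos, eye)               # central axis L1 (C1 -> C2)
    axis_len = _norm(axis)
    if axis_len == 0.0:
        return True
    g_len = _norm(gaze_dir)
    if g_len == 0.0:
        return False
    # Sample the gaze ray at the same distance from the eye as VO1.
    p = tuple(eye[i] + gaze_dir[i] / g_len * axis_len for i in range(3))
    # Perpendicular distance from the sampled point to the segment C1-C2.
    t = _dot(_sub(p, eye), axis) / (axis_len * axis_len)
    t = max(0.0, min(1.0, t))
    foot = tuple(eye[i] + axis[i] * t for i in range(3))
    return _norm(_sub(p, foot)) <= radius

# Example: eye at the origin, VO1 two metres ahead, bottom-face radius R1 = 0.5.
print(gaze_point_in_cylinder((0, 0, 0), (0, 1, 0), (0, 2, 0), 0.5))  # True
```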
  • the display control unit 112 displays the second virtual object VO2 outside the cylinder Y in an inactive state in the virtual space VS.
  • active means that the virtual object VO is in operation, or that the virtual object VO is selected by the user U1.
  • Inactive means that the operation of the virtual object VO has stopped, or that the virtual object VO has not been selected by the user U1.
  • When the virtual object VO corresponds to a moving-image application, "the virtual object VO is in operation" means that the moving image is being reproduced by the user U1's operation.
  • When the virtual object VO corresponds to an e-mail application, "the virtual object VO is in operation" means that an e-mail is being sent or received, or that a text file indicating the content of an e-mail is being opened, by the user U1's operation.
  • When the virtual object VO corresponds to a music application, "the virtual object VO is in operation" means that music is being emitted from the speaker 29 by the user U1's operation.
  • The display control unit 112 displays the second virtual object VO2 in the virtual space VS in the first state, in which the line of sight V of the user U1 intersects the first virtual object VO1 and the first virtual object VO1 is active. Specifically, in the first state, in which the entire line of sight V of the user U1 is positioned inside the cylinder Y and the first virtual object VO1 is active, the display control unit 112 displays the entire second virtual object VO2 outside the cylinder Y.
  • the display position of the second virtual object VO2 is an area within a predetermined distance from the first virtual object VO1. Specifically, the second virtual object VO2 is displayed within the field of view of the user U1 while the user U1 is viewing the first virtual object VO1. With this process, the terminal device 10 can call the attention of the user U1 to the second virtual object VO2.
  • the above "predetermined distance” is an example of the "first distance”.
  • The determination unit 113 determines whether the first state, in which the line of sight V of the user U1 intersects the first virtual object VO1 and the first virtual object VO1 is active, has transitioned to the second state, in which the line of sight V of the user U1 does not intersect the first virtual object VO1.
  • After the transition from the first state to the second state, the motion control unit 114 transitions the second virtual object VO2 from inactive to active. Specifically, in FIG. 7, after the line of sight V of the user U1 changes from being inside the cylinder Y to being outside the cylinder Y, the motion control unit 114 transitions the second virtual object VO2 from inactive to active.
  • The motion control unit 114 preferably transitions the second virtual object VO2 from inactive to active after the line of sight V of the user U1 has remained stationary for a predetermined period of time. Specifically, in the second state described above, the motion control unit 114 transitions the second virtual object VO2 from inactive to active on the condition that the change in the line of sight V of the user U1 per unit time remains within a predetermined range for a predetermined period of time. As a result, the terminal device 10 can transition the second virtual object VO2 from inactive to active after confirming that the point of the line of sight V of the user U1 is no longer wandering and is in an almost stationary state.
  • The above "predetermined range" is an example of the "second range".
  • the above "predetermined time” is an example of the "second time”.
  • the operation control unit 114 transitions the first virtual object VO1 from active to inactive after transitioning from the first state to the second state.
  • the terminal device 10 can switch which virtual object VO to activate, the first virtual object VO1 or the second virtual object VO2, according to the position of the line of sight V of the user U1.
  • the terminal device 10 can exclusively activate the first virtual object VO1 and the second virtual object VO2.
  • the state transition of the first virtual object VO1 and the state transition of the second virtual object VO2 may be executed simultaneously, or one may precede the other.
  • FIG. 8 is a block diagram showing a configuration example of the server 30.
  • the server 30 comprises a processing device 31 , a storage device 32 , a communication device 33 , a display 34 and an input device 35 .
  • Each element of server 30 is interconnected by one or more buses for communicating information.
  • the processing device 31 is a processor that controls the server 30 as a whole. Also, the processing device 31 is configured using, for example, a single chip or a plurality of chips. The processing unit 31 is configured using, for example, a central processing unit (CPU) including interfaces with peripheral devices, arithmetic units, registers, and the like. A part or all of the functions of the processing device 31 may be implemented by hardware such as DSP, ASIC, PLD, and FPGA. The processing device 31 executes various processes in parallel or sequentially.
  • CPU central processing unit
  • the storage device 32 is a recording medium readable and writable by the processing device 31 .
  • the storage device 32 also stores a plurality of programs including the control program PR3 executed by the processing device 31 .
  • the communication device 33 is hardware as a transmission/reception device for communicating with other devices.
  • the communication device 33 is also called a network device, a network controller, a network card, a communication module, or the like, for example.
  • the communication device 33 may include a connector for wired connection and an interface circuit corresponding to the connector. Further, the communication device 33 may have a wireless communication interface. Products conforming to wired LAN, IEEE1394, and USB are examples of connectors and interface circuits for wired connection. Also, as a wireless communication interface, there are products conforming to wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 34 is a device that displays images and character information.
  • the display 34 displays various images under the control of the processing device 31 .
  • various display panels such as a liquid crystal display panel and an organic EL display panel are preferably used as the display 34 .
  • the input device 35 is a device that accepts operations by the administrator of the information processing system 1 .
  • the input device 35 includes a pointing device such as a keyboard, touch pad, touch panel, or mouse.
  • the input device 35 may also serve as the display 34 .
  • the processing device 31 functions as an acquisition unit 311 and a generation unit 312, for example, by reading the control program PR3 from the storage device 32 and executing it.
  • the acquisition unit 311 acquires various data from the terminal device 10 using the communication device 33 .
  • the data includes, for example, data indicating the operation content for the virtual object VO, which is input to the terminal device 10 by the user U1 wearing the MR glasses 20 on the head.
  • the generation unit 312 generates first image information indicating an image displayed on the MR glasses 20 .
  • This first image information is transmitted to the terminal device 10 via the communication device 33 .
  • the display control unit 112 provided in the terminal device 10 displays the first virtual object VO1 and the second virtual object VO2 as seen from the user U1 based on the first image information received via the communication device 13. are displayed at positions that do not overlap each other in the virtual space VS.
  • In step S1, the processing device 11 functions as the display control unit 112.
  • the processing device 11 displays the first virtual object VO1 in the virtual space VS.
  • In step S2, the processing device 11 functions as the acquisition unit 111.
  • the processing device 11 acquires the line of sight V of the user U1.
  • In step S3, the processing device 11 functions as the display control unit 112.
  • the processing device 11 displays the second virtual object VO2 in an inactive state so that it does not overlap the first virtual object VO1 as seen from the user U1 in the virtual space VS. It is assumed that the first virtual object VO1 is active at the stage of step S3 at the latest.
  • In step S4, the processing device 11 functions as the determination unit 113.
  • The processing device 11 determines whether the state has changed from the first state, in which the line of sight V of the user U1 intersects the first virtual object VO1 and the first virtual object VO1 is active, to the second state, in which the line of sight V of the user U1 does not intersect the first virtual object VO1. If it is determined that the state has transitioned from the first state to the second state, that is, if the determination result of step S4 is affirmative, the processing device 11 executes the process of step S5. If it is not determined that the state has transitioned from the first state to the second state, that is, if the determination result of step S4 is negative, the processing device 11 executes the process of step S4 again.
  • In step S5, the processing device 11 functions as the operation control unit 114.
  • the processing device 11 transitions the second virtual object VO2 from inactive to active.
  • In step S6, the processing device 11 functions as the operation control unit 114.
  • The processing device 11 transitions the first virtual object VO1 from active to inactive. After that, the processing device 11 ends all the operations shown in FIG. 9.
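  • Read as a control loop, steps S1 to S6 of FIG. 9 amount to the following sketch; the display, gaze_source, and gaze_hits interfaces are placeholders invented for illustration, and only the ordering of the steps mirrors the flowchart.

```python
import time

def run_display_control(display, gaze_source, vo1, vo2, gaze_hits, period=0.05):
    """Minimal rendition of the flow of steps S1-S6 (FIG. 9).

    display     -- placeholder with a show(obj, active=...) method (stands in
                   for the MR glasses 20 driven by the display control unit 112)
    gaze_source -- callable returning the current line of sight V of user U1
    vo1, vo2    -- the first and second virtual objects
    gaze_hits   -- callable (gaze, obj) -> bool, e.g. the cylinder test above
    """
    display.show(vo1, active=True)        # S1: display VO1 (active)
    gaze = gaze_source()                  # S2: acquire the line of sight V
    display.show(vo2, active=False)       # S3: display VO2 inactive, placed so
                                          #     that it does not overlap VO1
    # S4: repeat until the first state (gaze on VO1, VO1 active) gives way to
    #     the second state (gaze no longer intersects VO1).
    while gaze_hits(gaze, vo1):
        time.sleep(period)
        gaze = gaze_source()
    display.show(vo2, active=True)        # S5: VO2 inactive -> active
    display.show(vo1, active=False)       # S6: VO1 active -> inactive
```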
  • the terminal device 10 as an information processing device includes the acquisition unit 111, the display control unit 112, and the operation control unit 114.
  • the acquisition unit 111 acquires line-of-sight information indicating the line-of-sight V of the user U1.
  • the display control unit 112 causes the first virtual object VO1 and the second virtual object VO2 to be displayed at positions not overlapping each other in the virtual space VS as seen from the user U1.
  • The motion control unit 114 transitions the second virtual object VO2 from inactive to active after the state transitions from the first state, in which the line of sight V of the user U1 indicated by the line-of-sight information intersects the first virtual object VO1 and the first virtual object VO1 is active, to the second state, in which the line of sight V of the user U1 does not intersect the first virtual object VO1.
  • the terminal device 10 can newly display the second virtual object VO2 so as not to overlap the previously displayed first virtual object VO1.
  • the terminal device 10 can switch which virtual object VO to activate between the first virtual object VO1 and the second virtual object VO2 according to the direction of the line of sight V of the user U1.
  • the operation control unit 114 transitions the first virtual object VO1 from active to inactive after transitioning from the first state to the second state.
  • As a result, the terminal device 10 can exclusively activate the first virtual object VO1 and the second virtual object VO2. For example, if the first virtual object VO1 is a first application and the second virtual object VO2 is a second application, the user U1 can stop the operation of the first application and run the second application by removing the line of sight V from the first virtual object VO1. That is, the user U1 can switch the application to be operated simply by removing the line of sight V from the first virtual object VO1, without inputting an instruction to the second virtual object VO2.
  • the display control unit 112 starts displaying the second virtual object VO2 in the virtual space VS in the first state.
  • the terminal device 10 can cause the second virtual object VO2 to appear in the first state.
  • the terminal device 10 can display a second application, which is the second virtual object VO2, while the first application, which is the first virtual object VO1, is running.
  • In the above-described second state, the operation control unit 114 transitions the second virtual object VO2 from inactive to active on the condition that the change per unit time of the line of sight V of the user U1 remains within the second range for the second time.
  • As a result, the terminal device 10 can transition the second virtual object VO2 from inactive to active after the point of the line of sight V of the user U1 is no longer wandering and is in a near-stationary state.
  • the display control unit 112 displays the second virtual object VO2 in the area within the first distance from the first virtual object VO1.
  • the terminal device 10 can increase the degree to which the user U1's attention can be drawn to the second virtual object VO2.
  • In the virtual space VS, the display control unit 112 displays the second virtual object VO2 outside the cylinder Y whose central axis L1 has, as its end points, the position of the pupil of the user U1 and the position of the first virtual object VO1.
  • the terminal device 10 can display the first virtual object VO1 and the second virtual object VO2 at positions that do not overlap each other in the virtual space VS as seen from the user U1.
  • the display control unit 112 displays the second virtual object VO2 in the virtual space VS in the first state described above.
  • the method of displaying the second virtual object VO2 is not limited to this method.
  • the display control unit 112 may display the second virtual object VO2 at a position that intersects the line of sight V of the user U1 after the line of sight V of the user U1 remains stationary for a predetermined time in the second state.
  • Specifically, the display control unit 112 displays the second virtual object VO2 at a position that intersects the line of sight V of the user U1 on the condition that the change per unit time of the line of sight V of the user U1 remains within a predetermined range for a predetermined time.
  • the terminal device 10 includes the display control section 112 and the operation control section 114 .
  • the display control unit 112 causes the first virtual object VO1 and the second virtual object VO2 to be displayed at positions that do not overlap each other in the virtual space VS as seen from the user U1.
  • The motion control unit 114 transitions the second virtual object VO2 from inactive to active after the state transitions from the first state, in which the line of sight V of the user U1 indicated by the line-of-sight information intersects the first virtual object VO1 and the first virtual object VO1 is active, to the second state, in which the line of sight V of the user U1 does not intersect the first virtual object VO1.
  • the server 30 may include components similar to the display control section 112 and the operation control section 114 .
  • In that case, it is preferable that the server 30 performs the same operations as the terminal device 10 described above.
  • the server 30 is an example of an information processing device.
  • the cylinder Y was assumed as the region where the line of sight V of the user U1 may exist while the user U1 is viewing the first virtual object VO1.
  • the shape of the area is not limited to a cylinder.
  • FIG. 10 is an explanatory diagram of an operation example of the display control unit 112 according to the third modification.
  • the shape of said region may be, for example, a cone N, as shown in FIG.
  • The cone N has a central axis L2 whose first end point C1 is the position (x0, y0, z0) of the pupil EL of the left eyeball or the pupil ER of the right eyeball of the user U1 and whose second end point C2 is the position (x1, y1, z1) of the first virtual object VO1.
  • the first end point C1 is the vertex of the cone N.
  • the radius R2 of the base of the cone N is the length that allows the base to contain the first virtual object VO1.
  • Strictly speaking, the position of the pupil EL of the left eyeball or the pupil ER of the right eyeball of the user U1 does not exactly match the position of the first end point C1, and the position (x1, y1, z1) of the first virtual object VO1 does not exactly match the position of the second end point C2; for simplicity of explanation, however, each of these pairs of positions is assumed to match.
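  • For the cone N, the corresponding containment test compares the angle between the sampled gaze point and the central axis L2 with the cone's half-angle. The sketch below uses the same gaze-sampling convention as the cylinder example above; its function name and structure are assumptions for illustration.

```python
import math

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _norm(a):
    return math.sqrt(_dot(a, a))

def gaze_point_in_cone(eye, gaze_dir, vo1_pos, base_radius):
    """Return True if the gaze, sampled at the depth of VO1, lies inside the
    cone N whose apex is the eye (C1), whose central axis L2 points towards
    VO1 (C2), and whose base of radius R2 contains the first virtual object."""
    axis = _sub(vo1_pos, eye)
    axis_len = _norm(axis)
    if axis_len == 0.0:
        return True
    g_len = _norm(gaze_dir)
    if g_len == 0.0:
        return False
    half_angle = math.atan2(base_radius, axis_len)   # apex half-angle of N
    # Sample the gaze ray at the depth of VO1, relative to the apex C1, so
    # only the angle against the axis L2 has to be checked.
    v = tuple(gaze_dir[i] / g_len * axis_len for i in range(3))
    cos_ang = max(-1.0, min(1.0, _dot(v, axis) / (_norm(v) * axis_len)))
    return math.acos(cos_ang) <= half_angle
```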
  • the terminal device 10 and the MR glasses 20 are implemented separately.
  • the method of realizing the terminal device 10 and the MR glasses 20 in the embodiment of the present invention is not limited to this.
  • the terminal device 10 and the MR glasses 20 may be realized within a single housing by providing the MR glasses 20 with the same functions as the terminal device 10 .
  • In the above embodiment, the information processing system 1 includes the MR glasses 20. However, instead of the MR glasses 20, any one of an HMD (Head Mounted Display), VR (Virtual Reality) glasses, and AR (Augmented Reality) glasses may be provided.
  • Alternatively, the information processing system 1 may include, instead of the MR glasses 20, either a normal smartphone or a tablet equipped with an imaging device.
  • the generation unit 312 provided in the server 30 generates the first image information indicating the image displayed on the MR glasses 20 .
  • the device that generates the first image information is not limited to the server 30 .
  • the terminal device 10 may generate the above first image information.
  • the information processing system 1 does not have to include the server 30 as an essential component.
  • The storage device 12, the storage device 22, and the storage device 32 have been described as including ROM and RAM, but they may be constituted by flexible disks, magneto-optical disks (e.g., compact discs, digital versatile discs, Blu-ray discs), smart cards, flash memory devices (e.g., cards, sticks, key drives), CD-ROMs (Compact Disc-ROMs), registers, removable disks, hard disks, floppy disks, magnetic strips, databases, servers, or other suitable storage media.
  • the program may be transmitted from a network via an electric communication line. Also, the program may be transmitted from the communication network NET via an electric communication line.
  • the information, signals, etc. described may be represented using any of a variety of different technologies.
  • Data, instructions, commands, information, signals, bits, symbols, chips, etc. may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination of these.
  • input/output information and the like may be stored in a specific location (for example, memory), or may be managed using a management table. Input/output information and the like can be overwritten, updated, or appended. The output information and the like may be deleted. The entered information and the like may be transmitted to another device.
  • the determination may be made by a value (0 or 1) represented using 1 bit, or by a true/false value (Boolean: true or false). Alternatively, it may be performed by numerical comparison (for example, comparison with a predetermined value).
  • each function illustrated in FIGS. 1 to 10 is realized by any combination of at least one of hardware and software.
  • The method of realizing each functional block is not particularly limited. That is, each functional block may be realized using one physically or logically coupled device, or using two or more physically or logically separated devices that are connected directly or indirectly (for example, by wire or wirelessly), and these plural devices may be used together.
  • a functional block may be implemented by combining software in the one device or the plurality of devices.
  • software, instructions, information, etc. may be transmitted and received via a transmission medium.
  • When software is transmitted from a website, a server, or another remote source using at least one of wired technology (coaxial cable, fiber-optic cable, twisted pair, digital subscriber line (DSL), etc.) and wireless technology (infrared, microwave, etc.), these wired and/or wireless technologies are included within the definition of transmission medium.
  • wired technology coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), etc.
  • wireless technology infrared, microwave, etc.
  • The terms "system" and "network" used in this disclosure are used interchangeably.
  • Information, parameters, etc. described in this disclosure may be expressed using absolute values, may be expressed using relative values from a predetermined value, or may be expressed using other corresponding information.
  • the terminal device 10 and the server 30 may be mobile stations (MS).
  • a mobile station is defined by those skilled in the art as subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless It may also be called a terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term. Also, in the present disclosure, terms such as “mobile station”, “user terminal”, “user equipment (UE)”, “terminal”, etc. may be used interchangeably.
  • The terms "connected" and "coupled," or any variation thereof, mean any direct or indirect connection or coupling between two or more elements, and include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. Couplings or connections between elements may be physical, logical, or a combination thereof. For example, "connection" may be replaced with "access."
  • As used in this disclosure, two elements can be considered to be "connected" or "coupled" to each other using at least one of one or more wires, cables, and printed electrical connections, and, as some non-limiting and non-exhaustive examples, using electromagnetic energy having wavelengths in the radio frequency domain, the microwave domain, and the optical (both visible and invisible) domain.
  • the phrase “based on” does not mean “based only on,” unless expressly specified otherwise. In other words, the phrase “based on” means both “based only on” and “based at least on.”
  • The terms "judging" and "determining" as used in this disclosure may encompass a wide variety of actions.
  • "Judging" and "determining" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (e.g., looking up in a table, a database, or another data structure), or ascertaining, as having "judged" or "determined".
  • "Judging" and "determining" may also include regarding receiving (e.g., receiving information), transmitting (e.g., transmitting information), input, output, or accessing (e.g., accessing data in a memory) as having "judged" or "determined".
  • "Judging" and "determining" may also include regarding resolving, selecting, choosing, establishing, comparing, and the like as having "judged" or "determined".
  • In other words, "judging" and "determining" may include regarding some action as having "judged" or "determined".
  • "Judgment (determination)" may be read as "assuming", "expecting", "considering", and the like.
  • Notification of predetermined information is not limited to being performed explicitly, and may be performed implicitly (for example, by not notifying the predetermined information).
  • Reference signs: 113: determination unit; 114: operation control unit; 211: acquisition unit; 212: display control unit; 311: acquisition unit; 312: generation unit; C1, C2: end points; L1, L2: central axes; PR1, PR2, PR3: control programs; R1, R2: radii; U1, U2, U3: users; VO: virtual object; VO1: first virtual object; VO2: second virtual object

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An information processing device comprises: an acquisition unit that acquires line-of-sight information indicating a user's line of sight; a display control unit that displays a first virtual object and a second virtual object at positions that do not overlap each other, as seen from the user, in a virtual space; and an operation control unit that causes the second virtual object to transition from an inactive state to an active state after a transition occurs from a first state, in which the user's line of sight indicated by the line-of-sight information intersects the first virtual object and the first virtual object is active, to a second state in which the user's line of sight does not intersect the first virtual object.
PCT/JP2022/045377 2021-12-16 2022-12-08 Dispositif de traitement d'informations WO2023112838A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021204072 2021-12-16
JP2021-204072 2021-12-16

Publications (1)

Publication Number Publication Date
WO2023112838A1 true WO2023112838A1 (fr) 2023-06-22

Family

ID=86774647

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/045377 WO2023112838A1 (fr) 2021-12-16 2022-12-08 Dispositif de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2023112838A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017126009A (ja) * 2016-01-15 2017-07-20 キヤノン株式会社 表示制御装置、表示制御方法、およびプログラム
JP2018088118A (ja) * 2016-11-29 2018-06-07 パイオニア株式会社 表示制御装置、制御方法、プログラム及び記憶媒体
WO2019181488A1 (fr) * 2018-03-20 2019-09-26 ソニー株式会社 Dispositif et procédé de traitement d'informations et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22907365

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023567751

Country of ref document: JP