WO2023204159A1 - Display control device - Google Patents

Display control device

Info

Publication number
WO2023204159A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
height
virtual object
display
display control
Prior art date
Application number
PCT/JP2023/015231
Other languages
English (en)
Japanese (ja)
Inventor
怜央 水田
有希 中村
充宏 後藤
康夫 森永
達哉 西崎
Original Assignee
株式会社Nttドコモ
Priority date
Filing date
Publication date
Application filed by 株式会社Nttドコモ
Publication of WO2023204159A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36: Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38: Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, with means for controlling the display position

Definitions

  • the present invention relates to a display control device.
  • the game device described in Patent Document 1 causes a display device to display an image in which a character appears from behind an object existing in the real space, based on an image showing the real space taken by a CMOS camera. Characters are generated by CG. Therefore, the game device provides the user with an augmented reality space in which a character is superimposed on the real space.
  • For example, a tree and a wall actually exist on the ground in the real space, and the display is controlled as if the character were standing between the real tree and the wall in the real space. That is, in conventional game devices, the position where a character is displayed is associated with the position of objects such as trees and walls in the real space.
  • an object of the present disclosure is to provide a display control device that adjusts the display position of a virtual object when displaying the virtual object on the ground in the real space in an augmented reality space or a mixed reality space.
  • a display control device according to the present disclosure is a display control device that controls a display device worn on a user's head, and includes: a calculation unit that calculates the height of the display device with respect to the ground in the real space; and a display control unit that causes the display device to display an augmented reality space or a mixed reality space in which a virtual object is superimposed on the real space, the virtual object being arranged on a plane corresponding to the ground. The display control unit arranges the virtual object at a position separated, from a position where an axis passing through the center of the display device and along the vertical direction intersects the plane, by a distance corresponding to the height of the display device indicated by height information.
  • when displaying a virtual object on the ground in the real space in an augmented reality space or a mixed reality space, the display position of the virtual object can be adjusted.
  • FIG. 1 is a block diagram showing the overall configuration of an information processing system 1.
  • FIG. 2 is a diagram showing an example of a virtual object VO existing in an augmented reality space or a mixed reality space.
  • FIG. 4 is a block diagram showing a configuration example of a terminal device 20-k.
  • FIG. 5 is a block diagram showing a configuration example of a management server 50.
  • FIG. 6 is a flowchart showing the contents of a display process in which the terminal device 20-k causes the XR glasses 30-k to display an augmented reality space or a mixed reality space including a virtual object VO.
  • FIG. 8 is a flowchart showing the contents of a display process in which the terminal device 20B-k causes the XR glasses 30-k to display an augmented reality space or a mixed reality space including a virtual object VO.
  • FIG. 9 is a block diagram showing a configuration example of a terminal device 20D-k.
  • FIG. 10 is a flowchart showing the contents of a display process in which the terminal device 20D-k causes the XR glasses 30-k to display an augmented reality space or a mixed reality space including a virtual object VO.
  • FIG. 11 is a block diagram showing a configuration example of a terminal device 20E-k.
  • FIG. 12 is a diagram showing how the virtual object VO is moved by the display control unit 215E.
  • FIG. 13 is a flowchart showing the details of a movement process in which the terminal device 20E-k moves the virtual object VO.
  • FIG. 1 is a block diagram showing the overall configuration of the information processing system 1.
  • the information processing system 1 includes user devices 10-1, 10-2, ...10-k, ...10-j, and a management server 50 that manages data related to an augmented reality space or a mixed reality space.
  • j is any integer greater than or equal to 1.
  • k is any integer from 1 to j.
  • the user device 10-k includes a terminal device 20-k and XR glasses 30-k.
  • the user devices 10-1, 10-2, . . . 10-k, . . . 10-j have the same configuration.
  • the terminal devices 20-1, 20-2, ...20-k, ...20-j have the same configuration.
  • the XR glasses 30-1, 30-2, ...30-k, ...30-j have the same configuration.
  • the information processing system 1 may include terminal devices with different configurations and XR glasses with different configurations.
  • the management server 50 and the user device 10-k are communicably connected to each other via the communication network NET. Further, the terminal device 20-k and the XR glasses 30-k are connected to be able to communicate with each other. Note that in FIG. 1, user U[k] uses user device 10-k. The same applies to users U[1], U[2], ...U[k-1], U[k+1], ...U[j].
  • the terminal device 20-k functions as a relay device that relays communication between the XR glasses 30-k and the management server 50. Further, the terminal device 20-k causes the XR glasses 30-k to display an augmented reality space or a mixed reality space including a virtual object.
  • the terminal device 20-k is configured by, for example, a smartphone or a tablet terminal.
  • the XR glasses 30-k are worn on the head of the user U[k].
  • the XR glasses 30-k are see-through type glasses that allow virtual objects to be viewed.
  • User U[k] visually recognizes the virtual object while viewing the real space through the XR glasses 30-k.
  • a virtual object is arranged in virtual space in correspondence with a position in real space.
  • the user U[k] recognizes an augmented reality space or a mixed reality space that is a combination of a real space and a virtual space.
  • FIG. 2 is an example of a virtual object VO existing in an augmented reality space or a mixed reality space that is recognized by the user U[k] using the XR glasses 30-k.
  • the X-axis, Y-axis, and Z-axis are orthogonal to each other.
  • the X-axis, Y-axis, and Z-axis are common to all the figures illustrated in the following description.
  • one direction along the X axis viewed from an arbitrary point is referred to as the X1 direction
  • a direction opposite to the X1 direction is referred to as the X2 direction.
  • the X-axis direction is a direction including both the X1 direction and the X2 direction.
  • mutually opposite directions along the Y-axis from an arbitrary point are expressed as the Y1 direction and the Y2 direction.
  • the Y-axis direction includes both the Y1 direction and the Y2 direction. Further, mutually opposite directions along the Z-axis from an arbitrary point are expressed as Z1 direction and Z2 direction.
  • the Z-axis direction includes both the Z1 direction and the Z2 direction.
  • the XY plane including the X axis and the Y axis corresponds to the ground parallel to the horizontal plane.
  • the Z-axis is an axis along the vertical direction.
  • the front of the user U[k] is defined as the Y1 direction
  • the direction behind the user U[k] is defined as the Y2 direction.
  • the right hand direction is defined as the X1 direction
  • the left hand direction is defined as the X2 direction
  • the direction of the legs of the user U[k] is defined as the Z1 direction
  • the direction of the top of the head is defined as the Z2 direction.
  • user U[k] wears XR glasses 30-k on his head.
  • the height of the XR glasses 30-k with respect to the ground G is H.
  • the virtual object VO is placed on the XY plane corresponding to the ground G.
  • the ground G is the ground in real space.
  • the XY plane is a plane in virtual space. That is, the virtual object VO is placed on a plane corresponding to the ground G.
  • the "plane corresponding to the ground G" refers to a plane in the virtual space that coincides with the ground G in the above-mentioned augmented reality space or mixed reality space.
  • if the virtual object VO is not placed on the plane corresponding to the ground G, the user U[k] visually recognizes the virtual object VO floating in the air.
  • the ground G seen from the user U[k] is a stable plane. Therefore, by displaying the virtual object VO on a plane corresponding to the ground G, the degree to which the user U[k] feels uncomfortable is reduced.
  • the virtual object VO is placed at a distance L from a position C where an axis parallel to the Z-axis passing through the center of the XR glasses 30-k intersects the XY plane corresponding to the ground G.
  • the distance between the center of gravity of the virtual object VO and the position C may be L.
  • the distance between the point of the virtual object VO that is closest to the position C and the position C may be L.
  • the distance L is set according to the height H. Specifically, the greater the height H, the greater the distance L.
  • the virtual object VO is no longer displayed closer to the user U[k] than a distance corresponding to the height H of the XR glasses 30-k. Therefore, the angle at which the user U[k] tilts his or her head downward can be kept small.
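As a rough illustration of the placement rule described above, the following sketch computes a placement point for the virtual object VO from the position of the XR glasses, the user's facing direction, and the height H. The proportionality constant `scale` and the use of NumPy are assumptions made for illustration; the embodiment only requires that a larger height H yields a larger distance L from the position C.

```python
import numpy as np

def place_virtual_object(glasses_pos, facing_dir, height_h, scale=1.2):
    """Return a placement point for the virtual object VO on the ground plane (Z = 0).

    glasses_pos : (x, y, z) position of the XR glasses 30-k in the virtual space.
    facing_dir  : vector of the user's facing direction (the Y1 direction).
    height_h    : height H of the glasses above the ground G.
    scale       : hypothetical proportionality constant; the embodiment only
                  requires that a larger H yields a larger distance L.
    """
    glasses_pos = np.asarray(glasses_pos, dtype=float)
    facing_dir = np.asarray(facing_dir, dtype=float)

    # Position C: the vertical axis through the center of the glasses
    # intersects the plane corresponding to the ground G (Z = 0).
    c = np.array([glasses_pos[0], glasses_pos[1], 0.0])

    # Distance L grows with the height H (here simply L = scale * H).
    distance_l = scale * height_h

    # Project the facing direction onto the ground plane so the VO
    # stays on the XY plane in front of the user.
    horizontal = np.array([facing_dir[0], facing_dir[1], 0.0])
    horizontal /= np.linalg.norm(horizontal)

    return c + distance_l * horizontal

# Example: glasses 1.6 m above the ground, user facing the Y1 direction.
print(place_virtual_object((0.0, 0.0, 1.6), (0.0, 1.0, 0.0), 1.6))
```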
  • FIG. 3 is a block diagram showing an example of the configuration of the XR glasses 30-k.
  • the XR glasses 30-k include a processing device 31, a storage device 32, a detection device 35, an imaging device 36, a communication device 37, and a display 38.
  • Each element included in the XR glasses 30-k is interconnected by one or more buses for communicating information.
  • the term "apparatus" in this specification may be replaced with other terms such as circuit, device, or unit.
  • the processing device 31 is a processor that controls the entire XR glasses 30-k.
  • the processing device 31 is configured using, for example, a single chip or a plurality of chips. Further, the processing device 31 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 31 may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array). The processing device 31 executes various processes in parallel or sequentially.
  • the storage device 32 is a recording medium that can be read and written by the processing device 31. Furthermore, the storage device 32 stores a plurality of programs including the control program PR1 executed by the processing device 31. The storage device 32 functions as a work area for the processing device 31.
  • the detection device 35 detects the state of the XR glasses 30-k.
  • the detection device 35 includes, for example, an inertial sensor such as an acceleration sensor that detects acceleration and a gyro sensor that detects angular acceleration, and a geomagnetic sensor that detects orientation.
  • the acceleration sensor detects acceleration on orthogonal X, Y, and Z axes.
  • the gyro sensor detects angular acceleration with the X-axis, Y-axis, and Z-axis as central axes of rotation.
  • the detection device 35 can generate posture information indicating the posture of the XR glasses 30-k based on the output information of the gyro sensor.
  • the motion information includes acceleration information indicating the acceleration of each of the three axes and angular acceleration information indicating the angular acceleration of the three axes.
  • the detection device 35 also outputs posture information regarding the posture of the XR glasses 30-k, movement information regarding the movement of the XR glasses 30-k, and orientation information regarding the orientation of the XR glasses 30-k to the processing device 31.
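The source only states that posture information is generated from the output of the gyro sensor. The following minimal sketch shows one common way this could be done, by accumulating angular-rate samples over time; the sample format and the simple Euler integration are assumptions, and a practical implementation would usually fuse the accelerometer and geomagnetic sensor as well.

```python
def integrate_gyro(samples, dt):
    """Accumulate gyro readings into a rough orientation estimate.

    samples : iterable of (wx, wy, wz) angular rates about the X, Y, Z axes
              in radians per second (assumed output format of the detection device 35).
    dt      : sampling interval in seconds.

    Returns accumulated rotation angles (roll, pitch, yaw) in radians.
    """
    roll = pitch = yaw = 0.0
    for wx, wy, wz in samples:
        roll += wx * dt    # rotation about the X axis
        pitch += wy * dt   # rotation about the Y axis
        yaw += wz * dt     # rotation about the Z axis
    return roll, pitch, yaw
```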
  • the imaging device 36 outputs a captured image obtained by capturing an image of the outside world.
  • the imaging device 36 includes, for example, a lens, an imaging element, an amplifier, and an AD converter.
  • the light collected through the lens is converted into an image signal, which is an analog signal, by an image sensor.
  • the amplifier amplifies the imaging signal and outputs it to the AD converter.
  • the AD converter converts the amplified imaging signal, which is an analog signal, into imaging information, which is a digital signal.
  • the converted imaging information is output to the processing device 31.
  • the captured image output to the processing device 31 is output to the terminal device 20-k via the communication device 37.
  • the communication device 37 is hardware as a transmitting/receiving device for communicating with other devices. Further, the communication device 37 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 37 may include a connector for wired connection and an interface circuit corresponding to the connector.
  • the communication device 37 may include a wireless communication interface. Examples of connectors and interface circuits for wired connections include products compliant with wired LAN, IEEE1394, and USB. Furthermore, examples of wireless communication interfaces include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 38 is a device that displays images.
  • the display 38 displays various images under the control of the processing device 31.
  • the processing device 31 functions as the communication control section 311 and the display control section 312 by reading the control program PR1 from the storage device 32 and executing the read control program PR1.
  • the communication control unit 311 uses the communication device 37 to transmit the posture information, the movement information, the direction information, and the captured image to the terminal device 20-k. Further, the communication control unit 311 causes the communication device 37 to receive image information indicating the virtual object VO itself and position information indicating the position of the virtual object VO in the augmented reality space or mixed reality space, which are transmitted from the terminal device 20-k.
  • the display control unit 312 causes the display 38 to display the augmented reality space or mixed reality space including the virtual object VO, using the image information indicating the virtual object VO itself and the position information indicating the position of the virtual object VO in the augmented reality space or mixed reality space, which are received by the communication device 37.
  • FIG. 4 is a block diagram showing an example of the configuration of the terminal device 20-k.
  • the terminal device 20-k includes a processing device 21, a storage device 22, an input device 23, a detection device 24, a communication device 25, and a display 26.
  • Each element included in the terminal device 20-k is interconnected by a single bus or multiple buses for communicating information.
  • the processing device 21 is a processor that controls the entire terminal device 20-k.
  • the processing device 21 is configured using, for example, a single chip or a plurality of chips. Further, the processing device 21 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 21 may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
  • the processing device 21 executes various processes in parallel or sequentially.
  • the storage device 22 is a recording medium that can be read and written by the processing device 21. Furthermore, the storage device 22 stores a plurality of programs including the control program PR2 executed by the processing device 21. The storage device 22 functions as a work area for the processing device 21. Further, the storage device 22 may store data used by a generation unit 214, which will be described later, to render the virtual object VO and the virtual space in which the virtual object VO exists.
  • the input device 23 accepts operations from user U[k].
  • the input device 23 includes a keyboard, a touch pad, a touch panel, or a pointing device such as a mouse.
  • the input device 23 may also serve as the display 26.
  • the detection device 24 detects the state of the terminal device 20-k.
  • the detection device 24 includes, for example, an inertial sensor such as an acceleration sensor that detects acceleration and a gyro sensor that detects angular acceleration, a geomagnetic sensor that detects orientation, and a positioning device such as a GPS device that detects the position of the terminal device 20-k in real space. The acceleration sensor detects acceleration on orthogonal X, Y, and Z axes.
  • the gyro sensor detects angular acceleration with the X-axis, Y-axis, and Z-axis as central axes of rotation.
  • the detection device 24 can generate posture information regarding the posture of the terminal device 20-k based on the output information of the gyro sensor.
  • the motion information includes acceleration information indicating the acceleration of each of the three axes and angular acceleration information indicating the angular acceleration of the three axes.
  • the detection device 24 also outputs, to the processing device 21, posture information regarding the posture of the terminal device 20-k, movement information regarding the movement of the terminal device 20-k, azimuth information regarding the orientation of the terminal device 20-k, and position information regarding the position of the terminal device 20-k.
  • the processing device 21 receives selection of the virtual object VO, input of characters, and input of instructions in the augmented reality space or mixed reality space based on the attitude of the terminal device 20-k. For example, when the user U[k] operates the input device 23 with the central axis of the terminal device 20-k directed toward a predetermined area in the mixed reality space, the virtual object VO placed in the predetermined area is selected.
  • the user U[k]'s operation on the input device 23 is, for example, a double tap. In this way, by operating the terminal device 20-k, the user U[k] can select the virtual object VO without looking at the input device 23 of the terminal device 20-k.
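One way to realize this kind of selection, sketched here under assumptions not stated in the source, is to cast a ray along the central axis of the terminal device 20-k and pick the first virtual object whose surrounding region it hits; each "predetermined area" is simplified to a bounding sphere.

```python
import numpy as np

def select_virtual_object(terminal_pos, central_axis, objects, max_dist=10.0):
    """Pick the virtual object whose bounding sphere the terminal's central axis hits first.

    terminal_pos : (x, y, z) of the terminal device 20-k in the shared space.
    central_axis : direction vector of the terminal's central axis.
    objects      : list of (name, center, radius) bounding spheres, one per VO
                   (a simplification of the "predetermined area" in the source).
    """
    origin = np.asarray(terminal_pos, dtype=float)
    d = np.asarray(central_axis, dtype=float)
    d /= np.linalg.norm(d)

    best = None
    best_t = max_dist
    for name, center, radius in objects:
        oc = np.asarray(center, dtype=float) - origin
        t = float(np.dot(oc, d))      # distance along the ray to the closest point
        if t < 0.0:
            continue                  # object is behind the terminal
        closest = origin + t * d
        if np.linalg.norm(closest - np.asarray(center, dtype=float)) <= radius and t < best_t:
            best, best_t = name, t
    return best

# Called, for example, when the double tap on the input device 23 is detected.
```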
  • the communication device 25 is hardware as a transmitting/receiving device for communicating with other devices. Further, the communication device 25 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 25 may include a connector for wired connection and an interface circuit corresponding to the connector.
  • the communication device 25 may include a wireless communication interface. Examples of connectors and interface circuits for wired connections include products compliant with wired LAN, IEEE1394, and USB. Furthermore, examples of wireless communication interfaces include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 26 is a device that displays images.
  • the display 26 displays various images under the control of the processing device 21.
  • the processing device 21 reads the control program PR2 from the storage device 22 and executes the read control program PR2, thereby functioning as a communication control unit 211, an acquisition unit 212, a calculation unit 213, a generation unit 214, and a display control unit 215.
  • the communication control unit 211 causes the communication device 25 to receive, from the XR glasses 30-k, posture information regarding the posture of the XR glasses 30-k, movement information regarding the movement of the XR glasses 30-k, azimuth information regarding the orientation of the XR glasses 30-k, and a captured image. Further, the communication control unit 211 causes the communication device 25 to receive, from the management server 50, data used by a generation unit 214 (described later) to render a virtual object VO and an augmented reality space or a mixed reality space including the virtual object VO.
  • the communication control unit 211 uses the communication device 25 to transmit, to the management server 50, the posture information regarding the posture of the XR glasses 30-k, the movement information regarding the movement of the XR glasses 30-k, the azimuth information regarding the azimuth of the XR glasses 30-k, and the captured image of the XR glasses 30-k, which are received from the XR glasses 30-k. Further, the communication control unit 211 uses the communication device 25 to transmit, to the management server 50, posture information regarding the posture of the terminal device 20-k, movement information regarding the movement of the terminal device 20-k, direction information regarding the direction of the terminal device 20-k, position information regarding the position of the terminal device 20-k, and operation information for the terminal device 20-k input by the user U[k] using the input device 23.
  • the acquisition unit 212 acquires height information indicating the height of user U[k]. For example, user U[k] uses the input device 23 to input his or her height into the terminal device 20-k. The acquisition unit 212 acquires height information indicating the height of user U[k] input by user U[k].
  • the calculation unit 213 calculates the height H of the XR glasses 30-k with respect to the ground G in the real space.
  • the generation unit 214 uses at least one of the data received from the management server 50 by the communication control unit 211 and the data stored in the storage device 22 to render the virtual object VO and the virtual space in which the virtual object VO exists.
  • the display control unit 215 causes the XR glasses 30-k to display the virtual object VO rendered by the generation unit 214 and the virtual space in which the virtual object VO exists. More specifically, the display control unit 215 uses the communication device 25 to transmit, to the XR glasses 30-k, image information indicating the virtual object VO, image information indicating the virtual space in which the virtual object VO exists, and position information indicating the position of the virtual object VO in the virtual space. As a result, a virtual space in which the virtual object VO is located at the position indicated by the position information, and an augmented reality space or mixed reality space in which the virtual space and the real space are superimposed, are displayed on the XR glasses 30-k.
  • the display control unit 215 arranges the virtual object VO on a plane corresponding to the ground G, as explained using FIG. 2.
  • the display control unit 215 also places the virtual object VO at a position separated, from a position C where an axis passing through the center of the XR glasses 30-k and parallel to the Z-axis intersects the plane corresponding to the ground G, by a distance L corresponding to the height H of the XR glasses 30-k.
  • since the display position of the virtual object VO matches the height of the ground G, the degree to which the user U[k] feels uncomfortable is reduced. Furthermore, when the user U[k] recognizes the virtual object VO using the XR glasses 30-k, the virtual object VO is not displayed closer to the user U[k] than a distance corresponding to the height H of the XR glasses 30-k. This eliminates the need for the user U[k] to tilt his or her head far downward. Furthermore, the virtual object VO is displayed at a position away from the user U[k] according to the height of the user U[k] who wears the XR glasses 30-k. As a result, the visibility of the virtual object VO for the user U[k] increases.
  • FIG. 5 is a block diagram showing an example of the configuration of the management server 50.
  • the management server 50 includes a processing device 51, a storage device 52, an input device 53, a communication device 54, and a display 55. Each element included in the management server 50 is interconnected by one or more buses for communicating information.
  • the processing device 51 is a processor that controls the entire management server 50.
  • the processing device 51 is configured using, for example, a single chip or a plurality of chips. Further, the processing device 51 is configured using, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, registers, and the like. Note that some or all of the functions of the processing device 51 may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
  • the processing device 51 executes various processes in parallel or sequentially.
  • the storage device 52 is a recording medium that can be read and written by the processing device 51. Furthermore, the storage device 52 stores a plurality of programs including the control program PR3 executed by the processing device 51. The storage device 52 functions as a work area for the processing device 51. The storage device 52 also stores data used by the terminal device 20-k to render the virtual object VO and the virtual space in which the virtual object VO exists.
  • the input device 53 accepts operations from the administrator of the management server 50.
  • the input device 53 includes a keyboard, a touch pad, a touch panel, or a pointing device such as a mouse.
  • the input device 53 may also serve as the display 55.
  • the communication device 54 is hardware as a transmitting/receiving device for communicating with other devices. Further, the communication device 54 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 54 may include a connector for wired connection.
  • the communication device 54 may include a wireless communication interface. Examples of connectors and interface circuits for wired connections include products compliant with wired LAN, IEEE1394, and USB. Furthermore, examples of wireless communication interfaces include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 55 is a device that displays images.
  • the display 55 displays various images under the control of the processing device 51.
  • the processing device 51 functions as the communication control unit 511 and the acquisition unit 512 by reading the control program PR3 from the storage device 52 and executing the read control program PR3.
  • the communication control unit 511 causes the communication device 54 to receive, from the terminal device 20-k, posture information regarding the posture of the XR glasses 30-k, movement information regarding the movement of the XR glasses 30-k, orientation information regarding the orientation of the XR glasses 30-k, and the captured image of the XR glasses 30-k.
  • further, the communication control unit 511 causes the communication device 54 to receive, from the terminal device 20-k, posture information regarding the posture of the terminal device 20-k, movement information regarding the movement of the terminal device 20-k, azimuth information regarding the orientation of the terminal device 20-k, position information regarding the position of the terminal device 20-k, and operation information on the terminal device 20-k by the user U[k].
  • the communication control unit 511 uses the communication device 54 to transmit, to the terminal device 20-k, the data acquired by the acquisition unit 512 described below, which is used by the terminal device 20-k to render the virtual object VO and the augmented reality space or mixed reality space including the virtual object VO.
  • the acquisition unit 512 acquires, from the storage device 52, data used by the terminal device 20-k to render the virtual object VO and the augmented reality space or mixed reality space including the virtual object VO. Specifically, the acquisition unit 512 acquires the above data according to the posture information regarding the posture of the XR glasses 30-k, the movement information regarding the movement of the XR glasses 30-k, the azimuth information regarding the azimuth of the XR glasses 30-k, the captured image of the XR glasses 30-k, the posture information regarding the posture of the terminal device 20-k, the movement information regarding the movement of the terminal device 20-k, the azimuth information regarding the azimuth of the terminal device 20-k, the position information regarding the position of the terminal device 20-k, and the operation information on the terminal device 20-k by the user U[k], which the communication control unit 511 causes the communication device 54 to receive.
  • FIG. 6 is a flowchart showing the contents of a display process in which the terminal device 20-k according to the first embodiment displays an augmented reality space or a mixed reality space including a virtual object VO on the XR glasses 30-k.
  • In step S10, the processing device 21 acquires height information indicating the height of the user U[k].
  • In step S11, the processing device 21 calculates the height H of the XR glasses 30-k as a display device based on the height of the user U[k].
  • In step S12, the processing device 21 causes the XR glasses 30-k to display the augmented reality space or mixed reality space including the virtual object VO.
  • the processing device 21 uses at least one of the data received from the management server 50 and the data stored in the storage device 22 to render the virtual object VO and the virtual space in which the virtual object VO exists. Further, the processing device 21 uses the communication device 25 to transmit image information indicating the virtual object VO, image information indicating the virtual space in which the virtual object VO exists, and position information indicating the position of the virtual object VO in the virtual space to the XR glasses 30-k. As a result, an augmented reality space or a mixed reality space in which the virtual object VO is superimposed on the real space is displayed on the XR glasses 30-k.
  • the processing device 21 places the virtual object VO on a plane corresponding to the ground G. Furthermore, the processing device 21 places the virtual object VO at a position separated, from a position C where an axis passing through the center of the XR glasses 30-k and parallel to the Z-axis intersects the plane corresponding to the ground G, by a distance L corresponding to the height H of the XR glasses 30-k.
  • the processing device 21 functions as the acquisition unit 212 in step S10. Furthermore, the processing device 21 functions as the calculation unit 213 in step S11. Furthermore, the processing device 21 functions as a generation unit 214 and a display control unit 215 in step S12.
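A condensed sketch of steps S10 to S12 is shown below, reusing the place_virtual_object helper sketched earlier. The source only says that H is calculated based on the height of the user U[k]; subtracting a fixed eye-level offset (EYE_OFFSET_M) is a hypothetical choice made here for illustration.

```python
EYE_OFFSET_M = 0.10   # hypothetical offset from the top of the head to eye level

def height_from_user_height(user_height_m, eye_offset_m=EYE_OFFSET_M):
    """Step S11: estimate the height H of the XR glasses 30-k from the user's height.

    The first embodiment only states that H is calculated based on the height of
    the user U[k]; subtracting a fixed eye-level offset is one simple assumption.
    """
    return user_height_m - eye_offset_m

def display_process(user_height_m, glasses_pos, facing_dir):
    """Steps S10 to S12 of FIG. 6, sketched with the helpers defined in this document."""
    h = height_from_user_height(user_height_m)                      # S10 + S11
    vo_position = place_virtual_object(glasses_pos, facing_dir, h)  # S12 (placement part)
    return h, vo_position
```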
  • the terminal device 20-k includes the calculation section 213 and the display control section 215.
  • the calculation unit 213 calculates the height H of the XR glasses 30-k as a display device with respect to the ground G in the real space.
  • the display control unit 215 causes the XR glasses 30-k to display an augmented reality space or a mixed reality space in which the virtual object VO is superimposed on the real space.
  • the augmented reality space or the mixed reality space includes a virtual object VO placed on a plane corresponding to the ground G.
  • the display control unit 215 places the virtual object VO at a position separated, from the position where the axis passing through the center of the XR glasses 30-k and along the vertical direction intersects the above-mentioned plane, by a distance L corresponding to the height H of the XR glasses 30-k indicated by the height information.
  • since the terminal device 20-k has the above configuration, when displaying the virtual object VO on the ground G in the real space in the augmented reality space or the mixed reality space, the display position of the virtual object VO can be matched to the height of the ground G. For example, when the user U[k] walks with the XR glasses 30-k worn on his or her head and the virtual object VO is displayed in front of the user U[k]'s eyes, objects in the real space may be blocked by the virtual object VO, and the danger associated with the user U[k]'s walking increases.
  • the terminal device 20-k according to the present embodiment can suppress such a risk by displaying the virtual object VO on the ground G in the real space. Furthermore, since the display position of the virtual object VO matches the height of the ground G, the degree to which the user U[k] feels uncomfortable is also reduced.
  • the virtual object VO is placed at a position separated, from the position where the vertical axis passing through the center of the XR glasses 30-k intersects the plane corresponding to the ground G, by a distance L corresponding to the height H of the XR glasses 30-k. As a result, the virtual object VO is not displayed closer to the user U[k] than a distance corresponding to the height H of the XR glasses 30-k, so the user U[k] only needs to tilt his or her head downward at a small angle.
  • the terminal device 20-k further includes an acquisition unit 212.
  • the acquisition unit 212 acquires height information indicating the height of user U[k].
  • the calculation unit 213 calculates the height H of the XR glasses 30-k as a display device based on the height of the user U[k] indicated by the height information.
  • the virtual object VO is displayed at a position away from the user U[k] according to the height of the user U[k] who wears the XR glasses 30-k. As a result, the visibility of the virtual object VO for the user U[k] increases.
  • the information processing system 1A includes user devices 10A-1, 10A-2, ...10A-k, ...10A-j instead of the user devices 10-1, 10-2, ...10-k, ...10-j provided in the information processing system 1.
  • the user device 10A-k includes a terminal device 20A-k and XR glasses 30-k.
  • the terminal device 20A-k includes a processing device 21A instead of the processing device 21 provided in the terminal device 20-k, and a storage device 22A instead of the storage device 22 provided in the terminal device 20-k. Note that the configuration of the terminal device 20A-k is the same as the configuration of the terminal device 20-k shown in FIG. 4, so its illustration is omitted.
  • the storage device 22A stores a control program PR2A instead of the control program PR2 stored in the storage device 22.
  • the processing device 21A reads the control program PR2A from the storage device 22A and executes the read control program PR2A, thereby functioning as the communication control unit 211, the acquisition unit 212, the generation unit 214, and the display control unit 215, which are the same as those of the terminal device 20-k according to the first embodiment, and as a calculation unit 213A.
  • the calculation unit 213A calculates the height H of the XR glasses 30-k based on the acceleration in the Z-axis direction output from the acceleration sensor included in the detection device 24 during a period in which the terminal device 20A-k is moved from the eye level of the user U[k] toward the ground G.
  • alternatively, the calculation unit 213A calculates height information indicating the height H of the XR glasses 30-k based on the acceleration in the Z-axis direction output from the acceleration sensor included in the detection device 24 during a period in which the terminal device 20A-k is moved from near the ground G to the eye level of the user U[k].
  • user U[k] lifts the terminal device 20A-k to his or her eye level and performs, on the input device 23 of the terminal device 20A-k, an operation indicating that calculation of the height information indicating the height H of the XR glasses 30-k is to be started. Thereafter, the user U[k] lowers the terminal device 20A-k to near the ground G. After lowering the terminal device 20A-k to near the height of the ground G, the user U[k] performs, on the input device 23 of the terminal device 20A-k, an operation indicating that calculation of the height information is to be ended.
  • the calculation unit 213A calculates the eye height of the user U[k] by integrating the acceleration in the Z-axis direction output from the acceleration sensor twice over the time elapsed since the terminal device 20A-k started descending.
  • the calculation unit 213A sets the calculated eye height of the user U[k] as the height H of the XR glasses 30-k.
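A minimal sketch of this double integration is shown below. It assumes the acceleration samples are gravity-compensated and evenly spaced; the source only specifies that the Z-axis acceleration is integrated twice over the descent.

```python
def eye_height_from_descent(accel_z_samples, dt):
    """Estimate the eye height of user U[k] from the terminal's descent.

    accel_z_samples : Z-axis accelerations (m/s^2) reported while the terminal
                      device 20A-k is lowered from eye level to near the ground G.
                      Assumed to be gravity-compensated linear acceleration.
    dt              : sampling interval in seconds.

    Double integration: acceleration -> velocity -> displacement. The magnitude
    of the total displacement is taken as the eye height (and hence the height H).
    """
    velocity = 0.0
    displacement = 0.0
    for a in accel_z_samples:
        velocity += a * dt             # first integration: acceleration -> velocity
        displacement += velocity * dt  # second integration: velocity -> displacement
    return abs(displacement)
```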
  • the terminal device 20A-k can easily calculate height information indicating the height H of the XR glasses 30-k necessary for calculating the placement position of the virtual object VO.
  • the terminal device 20A-k executes only steps S11 and S12 among the steps S10 to S12 executed by the terminal device 20-k according to the first embodiment shown in FIG. 6.
  • In step S11, the processing device 21A calculates height information indicating the height H of the XR glasses 30-k based on the acceleration in the Z-axis direction output from the acceleration sensor included in the detection device 24 during a period in which the terminal device 20A-k is moved from the eye level of the user U[k] toward the ground G. Alternatively, the processing device 21A may calculate the height information indicating the height H of the XR glasses 30-k based on the acceleration in the Z-axis direction output from the acceleration sensor included in the detection device 24 during a period in which the terminal device 20A-k is moved from the height of the ground G to the eye level of the user U[k].
  • the operation of the processing device 21A in step S12 is the same as the operation of the processing device 21 provided in the terminal device 20-k according to the first embodiment.
  • the processing device 21A functions as the calculation unit 213A in step S11. Furthermore, the processing device 21A functions as the generation unit 214 and the display control unit 215 in step S12.
  • the terminal device 20A-k further includes an acceleration sensor included in the detection device 24 and a calculation unit 213A.
  • the acceleration sensor detects acceleration in the vertical direction, that is, the Z-axis direction.
  • the calculation unit 213A calculates the above height based on the acceleration output from the acceleration sensor during a period in which the terminal device 20A-k is moved from the eye level of the user U[k] toward the ground G, or during a period in which the terminal device 20A-k is moved from near the ground G to the eye level of the user U[k].
  • since the terminal device 20A-k has the above configuration, it can easily calculate the height H of the XR glasses 30-k necessary for calculating the placement position of the virtual object VO.
  • Configuration of Third Embodiment 3-1-1 Overall Configuration
  • the overall configuration of the information processing system 1B according to the present embodiment is similar to the overall configuration of the information processing system 1 shown in FIG. 1, so its illustration is omitted.
  • the information processing system 1B includes user devices 10B-1, 10B-2, ...10B-k, ...10B-j instead of the user devices 10-1, 10-2, ...10-k, ...10-j provided in the information processing system 1.
  • the user device 10B-k includes a terminal device 20B-k and XR glasses 30-k.
  • the terminal device 20B-k includes a processing device 21B instead of the processing device 21 provided in the terminal device 20-k, and a storage device 22B instead of the storage device 22 provided in the terminal device 20-k. Note that the configuration of the terminal device 20B-k is the same as that of the terminal device 20-k shown in FIG. 4, so its illustration is omitted.
  • the storage device 22B stores a control program PR2B instead of the control program PR2 stored in the storage device 22.
  • the processing device 21B reads the control program PR2B from the storage device 22B and executes the read control program PR2B, thereby functioning as the communication control unit 211, the generation unit 214, and the display control unit 215, which are the same as those of the terminal device 20-k according to the first embodiment, and as an acquisition unit 212B and a calculation unit 213B.
  • the acquisition unit 212B acquires inclination information indicating the inclination of the XR glasses 30-k when the user U[k] visually recognizes, using the XR glasses 30-k, a position located at a reference distance from a position C where an axis passing through the center of the XR glasses 30-k and along the Z-axis direction intersects the ground G.
  • FIG. 7 is an explanatory diagram of the inclination θ of the XR glasses 30-k.
  • the user U[k] visually recognizes, using the XR glasses 30-k, a position P located at a reference distance M from a position C where an axis passing through the center of the XR glasses 30-k and along the Z-axis direction intersects the ground G.
  • the position P is, for example, a position that has been marked in advance before the user U[k] visually recognizes it.
  • the angle between the straight line S connecting the center of the XR glasses 30-k and the position P and the horizontal plane is defined as the inclination θ of the XR glasses 30-k.
  • the calculation unit 213B calculates the height H of the XR glasses 30-k using the reference distance M and the inclination θ indicated by the inclination information. Specifically, the height H of the XR glasses 30-k is M·tanθ.
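This relationship can be written directly as a small helper; the numeric example values are illustrative only.

```python
import math

def height_from_inclination(reference_distance_m, theta_rad):
    """Third embodiment: H = M * tan(theta).

    reference_distance_m : reference distance M from the position C to the marked position P.
    theta_rad            : downward inclination theta of the XR glasses 30-k when the
                           user looks at P, in radians.
    """
    return reference_distance_m * math.tan(theta_rad)

# Example: looking at a mark 2 m ahead with a 40-degree downward inclination.
print(height_from_inclination(2.0, math.radians(40.0)))  # about 1.68 m
```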
  • as a result, the terminal device 20B-k can calculate height information indicating the height H of the XR glasses 30-k necessary for calculating the placement position of the virtual object VO.
  • FIG. 8 is a flowchart showing the contents of a display process in which the terminal device 20B-k according to the third embodiment displays an augmented reality space or mixed reality space including a virtual object VO on the XR glasses 30-k.
  • In step S20, the processing device 21B acquires inclination information indicating the inclination θ of the XR glasses 30-k when the user U[k] visually recognizes, using the XR glasses 30-k, a position P located at a reference distance M from a position C where an axis passing through the center of the XR glasses 30-k and along the Z-axis direction intersects the ground G.
  • In step S21, the processing device 21B calculates the height H of the XR glasses 30-k using the reference distance M and the inclination θ indicated by the inclination information.
  • In step S22, the processing device 21B causes the XR glasses 30-k to display the augmented reality space or mixed reality space including the virtual object VO. Note that the process in step S22 is the same as the process in step S12 in the first embodiment, so detailed description thereof will be omitted.
  • the processing device 21B functions as the acquisition unit 212B in step S20. Furthermore, the processing device 21B functions as the calculation unit 213B in step S21. Furthermore, the processing device 21B functions as the generation unit 214 and the display control unit 215 in step S22.
  • the terminal device 20B-k includes the acquisition section 212B and the calculation section 213B.
  • the acquisition unit 212B acquires inclination information indicating the inclination θ of the XR glasses 30-k when the user visually recognizes, using the XR glasses 30-k, a position P located at a reference distance M from a position where an axis passing through the center of the XR glasses 30-k serving as a display device and along the Z-axis direction intersects the ground G.
  • the calculation unit 213B calculates the height H using the reference distance M and the inclination θ indicated by the inclination information.
  • as a result, the height H of the XR glasses 30-k necessary for calculating the placement position of the virtual object VO can be calculated without the user U[k] directly operating the terminal device 20B-k.
  • the information processing system 1C includes user devices 10C-1, 10C-2, ...10C-k, ...10C-j instead of the user devices 10-1, 10-2, ...10-k, ...10-j provided in the information processing system 1.
  • the user device 10C-k includes a terminal device 20C-k and XR glasses 30-k.
  • the terminal device 20C-k includes a processing device 21C in place of the processing device 21 provided in the terminal device 20-k, and a storage device 22C in place of the storage device 22. Note that the configuration of the terminal device 20C-k is the same as the configuration of the terminal device 20-k shown in FIG. 4, so its illustration is omitted.
  • the storage device 22C stores a control program PR2C instead of the control program PR2 stored in the storage device 22.
  • the processing device 21C reads the control program PR2C from the storage device 22C and executes the read control program PR2C, thereby functioning as the communication control unit 211, the generation unit 214, and the display control unit 215, which are the same as those of the terminal device 20-k according to the first embodiment, and as a calculation unit 213C.
  • the calculation unit 213C calculates the stride length of the user U[k] from the distance traveled by the user U[k] during the reference time and the number of steps taken by the user U[k] during the movement. Furthermore, the calculation unit 213C calculates the height H of the XR glasses 30-k according to a function that uses the stride length of the user U[k] as a parameter.
  • the detection device 24 provided in the terminal device 20-k includes a GPS device that detects the position of the terminal device 20-k.
  • the calculation unit 213C calculates the distance traveled by the user U[k] at the reference time based on changes in the position information detected by the GPS device.
  • the detection device 24 includes an acceleration sensor and has a function as a pedometer using the acceleration detected by the acceleration sensor.
  • the calculation unit 213C calculates the stride length of the user U[k] from the distance traveled by the user U[k] in the reference time and the number of steps taken by the user U[k] during the movement. Further, the calculation unit 213C estimates the height of the user U[k] based on the calculated stride length of the user U[k].
  • the calculation unit 213C calculates the height H of the XR glasses 30-k based on the estimated height of the user U[k]. For example, the calculation unit 213C calculates the height H of the XR glasses 30-k by subtracting the reference value from the estimated height of the user U[k].
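A sketch of this calculation is shown below. The specific function mapping stride length to body height (a fixed stride-to-height ratio) and the reference value subtracted from the height are assumptions made for illustration; the source only says that a function with the stride length as a parameter and a reference value are used.

```python
STRIDE_TO_HEIGHT_RATIO = 0.45   # hypothetical: stride length ~ 0.45 x body height
REFERENCE_VALUE_M = 0.10        # hypothetical reference value subtracted from the height

def height_from_walking(distance_m, steps, ratio=STRIDE_TO_HEIGHT_RATIO,
                        reference_value_m=REFERENCE_VALUE_M):
    """Fourth embodiment, sketched: estimate the height H from the user's gait.

    distance_m : distance traveled by user U[k] during the reference time (e.g. from GPS).
    steps      : number of steps counted during that movement (pedometer function).
    """
    stride = distance_m / steps                   # stride length of user U[k]
    estimated_height = stride / ratio             # function with the stride as parameter
    return estimated_height - reference_value_m   # height H of the XR glasses 30-k

# Example: 14 m covered in 18 steps.
print(height_from_walking(14.0, 18))
```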
  • as a result, the terminal device 20C-k can calculate height information indicating the height H of the XR glasses 30-k necessary for calculating the placement position of the virtual object VO.
  • the terminal device 20C-k executes only steps S11 and S12 among the steps S10 to S12 executed by the terminal device 20-k according to the first embodiment shown in FIG. 6.
  • In step S11, the processing device 21C calculates the stride length of the user U[k] from the distance traveled by the user U[k] during the reference time and the number of steps taken by the user U[k] during the movement. Furthermore, the calculation unit 213C calculates the height H of the XR glasses 30-k according to a function that uses the stride length of the user U[k] as a parameter.
  • the operation of the processing device 21C in step S12 is the same as the operation of the processing device 21 provided in the terminal device 20-k according to the first embodiment.
  • the processing device 21C functions as the calculation unit 213C in step S11. Furthermore, the processing device 21C functions as the generation unit 214 and the display control unit 215 in step S12.
  • the terminal device 20C-k includes the calculation unit 213C.
  • the calculation unit 213C calculates the stride length of the user U[k] from the distance traveled by the user U[k] during the reference time and the number of steps taken by the user U[k] during the movement. Further, the calculation unit 213C calculates the height H of the XR glasses 30-k according to a function that uses the stride length of the user U[k] as a parameter.
  • as a result, height information indicating the height H of the XR glasses 30-k necessary for calculating the placement position of the virtual object VO can be calculated without the user U[k] directly operating the terminal device 20C-k.
  • 5-1 Configuration of fifth embodiment 5-1-1: Overall configuration
  • the overall configuration of the information processing system 1D according to the present embodiment is similar to the overall configuration of the information processing system 1 shown in FIG. 1, so its illustration is omitted.
  • the information processing system 1D includes user devices 10D-1, 10D-2, ...10D-k, ...10D-j instead of the user devices 10-1, 10-2, ...10-k, ...10-j provided in the information processing system 1.
  • the user device 10D-k includes a terminal device 20D-k and XR glasses 30-k.
  • FIG. 9 is a block diagram showing an example of the configuration of the terminal device 20D-k.
  • the terminal device 20D-k includes a processing device 21D instead of the processing device 21 provided in the terminal device 20-k, a storage device 22D instead of the storage device 22, and a detection device 24D instead of the detection device 24. Further, the terminal device 20D-k further includes a sound emitting device 27 and a timer 28 in addition to the components included in the terminal device 20-k.
  • the sound emitting device 27 outputs sound.
  • the sound emitting device 27 is, for example, a speaker.
  • the detection device 24D has a function of detecting sound in addition to the functions of the detection device 24 according to the first embodiment.
  • the detection device 24D includes, for example, a microphone.
  • the detection device 24D detects the reflected sound of the sound output by the sound emitting device 27 reflected on the ground G.
  • the timer 28 measures the length of time from when the sound emitting device 27 emits a sound until the detecting device 24D detects the reflected sound.
  • the storage device 22D stores a control program PR2D instead of the control program PR2 stored in the storage device 22.
  • the processing device 21D reads the control program PR2D from the storage device 22D and executes the read control program PR2D, thereby functioning as the communication control unit 211, the generation unit 214, and the display control unit 215, which are the same as those of the terminal device 20-k according to the first embodiment, and as a calculation unit 213D.
  • the calculation unit 213D calculates the height H of the XR glasses 30-k based on the length of time from when the sound emitting device 27 outputs a sound until the detection device 24D detects the reflected sound, in a situation where the terminal device 20D-k is located at the eye level of the user U[k].
  • the user U[k] raises the terminal device 20D-k to his or her eye level and performs, on the input device 23 of the terminal device 20D-k, an operation instructing the sound emitting device 27 to output a sound.
  • the sound emitted from the sound emitting device 27 is reflected by the ground G.
  • the timer 28 measures the length of time from when the sound emitting device 27 emits a sound until the detecting device 24D detects the reflected sound.
  • the calculation unit 213D calculates the eye height of the user U[k] by multiplying the time measured by the timer 28 by the speed of sound and dividing the resulting value by 2.
  • the calculation unit 213D sets the calculated eye height of the user U[k] as the height H of the XR glasses 30-k.
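The round-trip computation can be sketched as follows; the speed-of-sound constant and the numeric example are illustrative.

```python
SPEED_OF_SOUND_M_S = 343.0   # approximate speed of sound in air at room temperature

def eye_height_from_echo(round_trip_time_s, speed=SPEED_OF_SOUND_M_S):
    """Fifth embodiment: height from the echo of the emitted sound.

    round_trip_time_s : time measured by the timer 28 from when the sound emitting
                        device 27 outputs the sound until the detection device 24D
                        detects the reflection from the ground G.
    The one-way distance is (speed * time) / 2, taken as the eye height of U[k]
    and hence as the height H of the XR glasses 30-k.
    """
    return speed * round_trip_time_s / 2.0

# Example: an echo returning after about 9.3 ms corresponds to roughly 1.6 m.
print(eye_height_from_echo(0.0093))
```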
  • the terminal device 20D-k can easily calculate the height H of the XR glasses 30-k necessary for calculating the placement position of the virtual object VO.
  • FIG. 10 is a flowchart showing the contents of a display process in which the terminal device 20D-k according to the fifth embodiment causes the XR glasses 30-k to display an augmented reality space or mixed reality space including a virtual object VO.
  • In step S30, the sound emitting device 27 outputs sound in a situation where the terminal device 20D-k is located at the eye level of the user U[k].
  • In step S31, the detection device 24D detects the sound reflected by the ground G.
  • In step S32, the processing device 21D calculates the height H of the XR glasses 30-k serving as a display device, based on the length of time from when the sound emitting device 27 outputs the sound until the detection device 24D detects the reflected sound.
  • In step S33, the processing device 21D causes the XR glasses 30-k to display the augmented reality space or mixed reality space including the virtual object VO. Note that the process in step S33 is the same as the process in step S12 in the first embodiment, so its detailed description is omitted.
  • the processing device 21D functions as the calculation unit 213D in step S32. Furthermore, the processing device 21D functions as the generation unit 214 and the display control unit 215 in step S33.
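  • For illustration only, the display process of FIG. 10 (steps S30 to S33) could be organized as in the following sketch; emit_sound, wait_for_reflection, and render_space are hypothetical placeholders for the sound emitting device 27, the detection device 24D with the timer 28, and the display control toward the XR glasses 30-k, and are not defined in this disclosure.

```python
# Hypothetical outline of steps S30-S33; the callables passed in are
# placeholders for device-specific operations and are not part of the disclosure.
import time
from typing import Callable

def display_process(emit_sound: Callable[[], None],
                    wait_for_reflection: Callable[[], None],
                    render_space: Callable[[float], None],
                    speed_of_sound: float = 343.0) -> None:
    start = time.monotonic()
    emit_sound()                        # S30: output sound while held at eye level
    wait_for_reflection()               # S31: detect the sound reflected by the ground G
    elapsed = time.monotonic() - start  # length of time measured by the timer 28
    height_h = elapsed * speed_of_sound / 2.0   # S32: height H of the XR glasses 30-k
    render_space(height_h)              # S33: display the AR/MR space including the VO
```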
  • the terminal device 20D-k includes the sound emitting device 27, the detection device 24D, and the calculation unit 213D.
  • the sound emitting device 27 outputs sound.
  • the detection device 24D detects the reflected sound of the sound output from the sound emitting device 27 reflected on the ground G.
  • The calculation unit 213D calculates the height H of the XR glasses 30-k based on the length of time from when the sound emitting device 27 outputs the sound until the detection device 24D detects the reflected sound.
  • Since the terminal device 20D-k has the above configuration, it can easily calculate the height H of the XR glasses 30-k necessary for calculating the placement position of the virtual object VO.
  • 6-1: Configuration of Sixth Embodiment 6-1-1: Overall Configuration
  • The overall configuration of the information processing system 1E according to the present embodiment is similar to that of the information processing system 1 shown in FIG., so its illustration is omitted.
  • The information processing system 1E includes user devices 10E-1, 10E-2, ..., 10E-k, ..., 10E-j instead of the user devices 10-1, 10-2, ..., 10-k, ..., 10-j provided in the information processing system 1.
  • the user device 10E-k includes a terminal device 20E-k and XR glasses 30-k.
  • FIG. 11 is a block diagram showing an example of the configuration of the terminal device 20E-k.
  • the terminal device 20E-k includes a processing device 21E in place of the processing device 21 provided in the terminal device 20-k, and a storage device 22E in place of the storage device 22.
  • the storage device 22E stores a control program PR2E instead of the control program PR2 stored in the storage device 22.
  • The processing device 21E reads the control program PR2E from the storage device 22E and executes it, thereby functioning as the communication control unit 211 and the acquisition unit 212, which are the same as those of the terminal device 20-k according to the first embodiment, and further as a display control unit 215E and a reception unit 216.
  • the reception unit 216 receives an instruction to move the virtual object VO from the user U[k].
  • the user U[k] inputs an instruction to move the virtual object VO from the input device 23.
  • The reception unit 216 accepts the instruction input from the input device 23.
  • the user U[k] makes a gesture to move the virtual object VO.
  • an imaging device (not shown) images the gesture of the user U[k].
  • The reception unit 216 receives the gesture of the user U[k] captured by the imaging device as an instruction to move the virtual object VO.
  • the display control unit 215E moves the virtual object VO within the plane corresponding to the ground G when the reception unit 216 receives the above instruction.
  • FIG. 12 is a diagram showing how the virtual object VO is moved by the display control unit 215E.
  • the display control unit 215E moves the virtual object VO from the area A1 to the area A2 based on the gesture of the user U[k], for example.
  • Area A1 and area A2 are both located in a plane N corresponding to the ground G.
  • The user U[k] points with his or her index finger at the virtual object VO, which is initially located in area A1.
  • The user U[k] then changes the direction in which the index finger points toward area A2.
  • the virtual object VO moves from area A1 to area A2.
  • the terminal device 20E-k is able to move the virtual object VO while ensuring high visibility of the virtual object VO.
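  • As one possible illustration of keeping the virtual object VO within the plane N corresponding to the ground G while it is moved, the sketch below intersects a pointing ray (for example, derived from the index-finger gesture of the user U[k]) with that plane; the coordinate convention (y-up, plane at height 0) and all names are assumptions and are not specified in this disclosure.

```python
# Illustrative sketch only: a y-up coordinate system with the plane N at y = 0
# is assumed; the disclosure does not prescribe a particular implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def move_within_ground_plane(ray_origin: Vec3, ray_dir: Vec3,
                             plane_height: float = 0.0) -> Optional[Vec3]:
    """Return the new placement of VO where the pointing ray meets the plane N.

    Returns None when the ray is parallel to the plane or points away from it,
    in which case the virtual object VO would be left where it is.
    """
    if abs(ray_dir.y) < 1e-6:            # ray (almost) parallel to the plane
        return None
    t = (plane_height - ray_origin.y) / ray_dir.y
    if t < 0:                            # ray points away from the plane
        return None
    return Vec3(ray_origin.x + t * ray_dir.x,
                plane_height,            # VO stays in the plane N (area A1 -> area A2)
                ray_origin.z + t * ray_dir.z)

# Example: pointing downward and forward from an eye height of 1.6 m.
print(move_within_ground_plane(Vec3(0.0, 1.6, 0.0), Vec3(0.3, -0.8, 0.5)))
```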
  • FIG. 13 is a flowchart showing the details of the movement process in which the terminal device 20E-k according to the sixth embodiment moves the virtual object VO.
  • In step S40, the processing device 21E receives an instruction to move the virtual object VO from the user U[k].
  • In step S41, the processing device 21E moves the virtual object VO within a plane corresponding to the ground G.
  • The processing device 21E functions as the reception unit 216 in step S40. Furthermore, the processing device 21E functions as the display control unit 215E in step S41.
  • The terminal device 20E-k includes the reception unit 216 and the display control unit 215E.
  • the reception unit 216 receives an instruction to move the virtual object VO from the user U[k].
  • the display control unit 215E moves the virtual object VO within a plane corresponding to the ground G when the reception unit 216 receives the above instruction.
  • Since the terminal device 20E-k has the above configuration, it is possible to move the virtual object VO while ensuring high visibility of the virtual object VO.
  • the user device 10-k includes a terminal device 20-k and XR glasses 30-k.
  • the terminal device 20-k functioned as a relay device that relays communication with the management server 50.
  • the present disclosure is not limited to an aspect in which the user device 10-k includes the terminal device 20-k and the XR glasses 30-k.
  • the XR glasses 30-k may have a function of communicating with the management server 50.
  • the user device 10-k may be XR glasses 30-k.
  • the terminal device 20-k may have the function of the XR glasses 30-k.
  • In that case, the user device 10-k is constituted by the terminal device 20-k.
  • The storage devices 22 to 22E, the storage device 32, and the storage device 52 are exemplified as ROM and RAM, but they may also be a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray disc), a smart card, a flash memory device (for example, a card, a stick, or a key drive), a CD-ROM (Compact Disc-ROM), a register, a removable disk, a hard disk, a floppy (trademark) disk, a magnetic strip, a database, a server, or any other suitable storage medium.
  • the program may also be transmitted from a network via a telecommunications line. Further, the program may be transmitted from the communication network NET via a telecommunications line.
  • the information, signals, etc. described may be represented using any of a variety of different technologies.
  • The data, instructions, commands, information, signals, bits, symbols, chips, etc. that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
  • The input/output information may be stored in a specific location (for example, memory) or may be managed using a management table. The input/output information may be overwritten, updated, or appended. The output information may be deleted. The input information may be transmitted to another device.
  • the determination may be made using a value expressed using 1 bit (0 or 1) or a truth value (Boolean: true or false).
  • the comparison may be performed by comparing numerical values (for example, comparing with a predetermined value).
  • each function illustrated in FIGS. 1 to 13 is realized by an arbitrary combination of at least one of hardware and software.
  • The method of implementing each functional block is not particularly limited. That is, each functional block may be realized using one physically or logically coupled device, or may be realized using two or more physically or logically separated devices that are connected directly or indirectly (for example, by wire or wirelessly).
  • the functional block may be realized by combining software with the one device or the plurality of devices.
  • The programs exemplified in the above-described embodiments should be broadly construed to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, etc., regardless of whether they are called software, firmware, middleware, microcode, hardware description language, or by other names.
  • software, instructions, information, etc. may be sent and received via a transmission medium.
  • For example, when software is transmitted from a website, server, or other remote source using wired technology (coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), etc.) and/or wireless technology (infrared, microwave, etc.), these wired and/or wireless technologies are included within the definition of a transmission medium.
  • The information, parameters, etc. described in this disclosure may be expressed using absolute values, may be expressed using relative values from a predetermined value, or may be expressed using other corresponding information.
  • the devices 20C-1 to 20C-j, the terminal devices 20D-1 to 20D-j, and the terminal devices 20E-1 to 20E-j may be mobile stations (MS).
  • A mobile station may also be referred to by a person skilled in the art as a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable terminology. Further, in the present disclosure, terms such as "mobile station," "user terminal," "user equipment (UE)," and "terminal" may be used interchangeably.
  • The terms "connected" and "coupled," or any variation thereof, refer to any direct or indirect connection or coupling between two or more elements, and may include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other.
  • the coupling or connection between elements may be a physical coupling or connection, a logical coupling or connection, or a combination thereof.
  • connection may be replaced with "access.”
  • When two elements are coupled, they can be considered to be "connected" or "coupled" to each other using one or more wires, cables, and/or printed electrical connections, as well as, as some non-limiting and non-exhaustive examples, electromagnetic energy having wavelengths in the radio frequency domain, the microwave domain, and the optical (both visible and invisible) domain.
  • The terms "judgment" and "determination" used in this disclosure may encompass a wide variety of operations.
  • "Judgment" and "determination" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, or searching (for example, searching in a table, a database, or another data structure), or ascertaining as having been "judged" or "determined."
  • "Judgment" and "determination" may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in memory) as having been "judged" or "determined."
  • "Judgment" and "determination" may also include regarding resolving, selecting, choosing, establishing, comparing, etc. as having been "judged" or "determined."
  • That is, "judgment" and "determination" may include regarding some action as having been "judged" or "determined."
  • "Judgment (determination)" may be read as "assuming," "expecting," "considering," etc.
  • Notification of prescribed information is not limited to being performed explicitly, and may also be performed implicitly (for example, by not notifying the prescribed information).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A display control device comprising: a calculation unit that calculates the height of a display device with reference to the ground surface in a real space; and a display control unit that causes the display device to display an augmented reality space or a mixed reality space in which a virtual object is superimposed on the real space. The augmented reality space or mixed reality space includes a virtual object arranged on a plane corresponding to the ground surface. The display control unit arranges the virtual object at a position separated, by a distance corresponding to the height of the display device indicated by the height information, from the position at which the plane intersects the axis that passes through the center of the display device and extends along the vertical direction.
PCT/JP2023/015231 2022-04-21 2023-04-14 Dispositif de commande d'affichage WO2023204159A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-070116 2022-04-21
JP2022070116 2022-04-21

Publications (1)

Publication Number Publication Date
WO2023204159A1 true WO2023204159A1 (fr) 2023-10-26

Family

ID=88419870

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/015231 WO2023204159A1 (fr) 2022-04-21 2023-04-14 Dispositif de commande d'affichage

Country Status (1)

Country Link
WO (1) WO2023204159A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020241189A1 (fr) * 2019-05-30 2020-12-03 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2021044745A1 (fr) * 2019-09-03 2021-03-11 ソニー株式会社 Dispositif de traitement d'affichage, procédé de traitement d'affichage et support d'enregistrement

Similar Documents

Publication Publication Date Title
US11670267B2 (en) Computer vision and mapping for audio applications
US20210405761A1 (en) Augmented reality experiences with object manipulation
US11995774B2 (en) Augmented reality experiences using speech and text captions
US11886633B2 (en) Virtual object display interface between a wearable device and a mobile device
US8291346B2 (en) 3D remote control system employing absolute and relative position detection
US11854147B2 (en) Augmented reality guidance that generates guidance markers
KR20160145976A (ko) 영상 공유 방법 및 이를 수행하는 전자 장치
US12028626B2 (en) Visual-inertial tracking using rolling shutter cameras
US20210405363A1 (en) Augmented reality experiences using social distancing
US20240061798A1 (en) Debug access of eyewear having multiple socs
US20230367118A1 (en) Augmented reality gaming using virtual eyewear beams
WO2023204159A1 (fr) Dispositif de commande d'affichage
WO2023223750A1 (fr) Dispositif d'affichage
WO2023199627A1 (fr) Dispositif de gestion d'image de guide
WO2023149256A1 (fr) Dispositif de commande d'affichage
WO2023112838A1 (fr) Dispositif de traitement d'informations
WO2023171341A1 (fr) Dispositif de commande d'affichage
WO2023149255A1 (fr) Dispositif de commande d'affichage
JP2024004019A (ja) 制御装置及び制御システム
WO2023145265A1 (fr) Dispositif de transmission de message, et dispositif de réception de message
WO2023149379A1 (fr) Dispositif de traitement d'informations
JP7365501B2 (ja) 情報処理装置
WO2023145890A1 (fr) Dispositif terminal
WO2023210195A1 (fr) Système d'identification
WO2023176317A1 (fr) Dispositif de commande d'affichage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23791807

Country of ref document: EP

Kind code of ref document: A1