WO2023145892A1 - Display control device and server - Google Patents

Display control device and server

Info

Publication number
WO2023145892A1
Authority
WO
WIPO (PCT)
Prior art keywords
display control
messages
display
user
virtual object
Prior art date
Application number
PCT/JP2023/002690
Other languages
English (en)
Japanese (ja)
Inventor
智仁 山崎
進 関野
Original Assignee
株式会社Nttドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Nttドコモ
Publication of WO2023145892A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/37 Details of the operation on graphic patterns
    • G09G 5/373 Details of the operation on graphic patterns for modifying the size of the graphic pattern
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/38 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position

Definitions

  • the present invention relates to a display control device and a server. More particularly, the present invention relates to a display control device and a server that cause a display device to display a virtual object corresponding to a message.
  • In XR technology, which includes VR (Virtual Reality) technology and AR (Augmented Reality) technology, a message indicated by a virtual object can be displayed in the virtual space shown on XR glasses that a user wears on his or her head.
  • Patent Literature 1 discloses a method and apparatus for an interactive virtual environment for communication. Specifically, Patent Literature 1 discloses a technique of displaying a virtual object representing a "doodle message" in a virtual space in which users can exchange information.
  • In the conventional technology, however, the virtual space displayed on the XR glasses worn on the user's head may become filled with a plurality of virtual objects, which obstructs the user's view and lowers convenience.
  • One aspect of the present invention is a display control device for displaying a virtual space including a virtual object on a display device worn on the head of a user, the display control device comprising: an acquisition unit for acquiring a plurality of messages; a generation unit for generating a plurality of individual objects corresponding to the plurality of messages on a one-to-one basis; and a display control unit for causing the display device to display a virtual object that is a collection of the plurality of individual objects,
  • wherein the display control unit, when the number of the plurality of messages is a first number, sets the size of the virtual object to a first size and sets the distance from the center of the virtual space to the center of the virtual object in the virtual space to a first distance, and, when the number of the plurality of messages is a second number larger than the first number, sets the size of the virtual object to a second size larger than the first size and sets the distance from the center of the virtual space to the center of the virtual object in the virtual space to a second distance longer than the first distance.
  • According to the present invention, when a virtual object corresponding to the number of messages is displayed in the virtual space, deterioration of the user's convenience can be suppressed even if the number of messages increases.
  • FIG. 1 is a diagram showing the overall configuration of an information processing system 1 according to the first embodiment.
  • FIG. 2 is a perspective view showing the appearance of XR glasses 20 according to the first embodiment.
  • FIG. 3 is a block diagram showing a configuration example of the XR glasses 20 according to the first embodiment.
  • FIG. 4 is a block diagram showing a configuration example of a terminal device 10 according to the first embodiment.
  • FIGS. 5 and 6 are explanatory diagrams showing an example of operations of a generation unit 113 and a display control unit 114.
  • FIG. 7 is a diagram showing an example of how a plurality of individual objects SO1 to SO10 are aligned.
  • FIG. 8 is a block diagram showing a configuration example of a server 30.
  • FIG. 9 is a table showing an example of a message database MD.
  • FIG. 10 is a flowchart showing the operation of the terminal device 10 according to the first embodiment.
  • FIG. 11 is a block diagram showing a configuration example of a terminal device 10A.
  • FIGS. 12 and 13 are explanatory diagrams of operation examples of a display control unit 114A and a reception unit 115.
  • FIG. 14 is a flowchart showing the operation of the terminal device 10A according to the second embodiment.
  • FIG. 15 is a block diagram showing a configuration example of a terminal device 10B.
  • FIG. 16 is a block diagram showing a configuration example of a server 30A.
  • FIG. 17 is a table showing an example of a location information database LD.
  • FIG. 18 is an explanatory diagram showing an example of operations of a determination unit 314, an extraction unit 315, and an output unit 312A.
  • FIG. 19 is a flowchart showing the operation of the server 30A.
  • FIG. 20 is an explanatory diagram of a virtual object VO9 generated when the terminal device 10B and the server 30A are combined.
  • 1: First Embodiment
  • A configuration of an information processing system 1 including a terminal device 10 as a display control device according to the first embodiment of the present invention will be described below with reference to FIGS. 1 to 10.
  • FIG. 1 is a diagram showing the overall configuration of an information processing system 1 according to the first embodiment of the present invention.
  • the information processing system 1 is a system that uses XR technology to provide a virtual space to a user U1 wearing XR glasses 20, which will be described later.
  • the information processing system 1 includes a terminal device 10, XR glasses 20, and a server 30.
  • the terminal device 10 is an example of a display control device.
  • the terminal device 10 and the server 30 are communicably connected to each other via a communication network NET.
  • the terminal device 10 and the XR glasses 20 are connected so as to be able to communicate with each other.
  • the suffix "-X" is used for the reference numerals.
  • X is an arbitrary integer of 1 or more.
  • the terminal device 10-1 and the XR glasses 20 are connected so as to be able to communicate with each other.
  • two terminal devices 10 and one XR glass 20 are shown in FIG. A glass 20 may be provided.
  • the user U1 uses a set of the terminal device 10-1 and the XR glasses 20.
  • the XR glasses 20 display a plurality of individual objects, which will be described later, corresponding to messages addressed to the user U1.
  • the message may include a message transmitted from the terminal device 10-1 to the terminal device 10-2.
  • the message may also include a message sent from another terminal device (not shown in FIG. 1) to the terminal device 10-1.
  • the message may be a message generated by the terminal device 10-1 itself.
  • the server 30 provides various data and cloud services to the terminal device 10 via the communication network NET.
  • the terminal device 10-1 displays virtual objects placed in the virtual space on the XR glasses 20 worn by the user on the head.
  • The virtual space is, for example, a celestial-sphere space.
  • the virtual objects are, for example, virtual objects representing data such as still images, moving images, 3DCG models, HTML files, and text files, and virtual objects representing applications. Examples of text files include memos, source codes, diaries, and recipes. Examples of applications include browsers, applications for using SNS, and applications for generating document files.
  • The terminal device 10 is preferably a mobile terminal device such as a smartphone or a tablet, for example.
  • the terminal device 10-1 is an example of a display control device.
  • the terminal device 10-2 is a device for user U2 to send a message to user U1.
  • the terminal device 10-2 may display a virtual object placed in the virtual space on the display 14 described later or XR glasses (not shown) connected to the terminal device 10-2.
  • the configuration of the terminal device 10-2 is basically the same as that of the terminal device 10-1.
  • The terminal device 10-2 is preferably a mobile terminal device such as a smartphone or a tablet, for example.
  • the XR glasses 20 are a see-through wearable display worn on the head of user U1.
  • the XR glasses 20 display a virtual object on the display panel provided for each of the binocular lenses under the control of the terminal device 10-1.
  • The XR glasses 20 are an example of a display device.
  • A case in which the XR glasses 20 are AR glasses will be described below. However, this is merely an example; the XR glasses 20 may be VR glasses or MR (Mixed Reality) glasses.
  • FIG. 2 is a perspective view showing the appearance of the XR glasses 20. As shown in FIG. 2, the XR glasses 20 have temples 91 and 92, a bridge 93, frames 94 and 95, and lenses 41L and 41R, like general eyeglasses.
  • An imaging device 26 is provided on the bridge 93 .
  • the imaging device 26 images the outside world.
  • the imaging device 26 also outputs imaging information indicating the captured image.
  • Each of the lenses 41L and 41R has a half mirror.
  • a frame 94 is provided with a liquid crystal panel or an organic EL panel for the left eye.
  • a liquid crystal panel or an organic EL panel is hereinafter generically referred to as a display panel.
  • the frame 94 is provided with an optical member that guides the light emitted from the display panel for the left eye to the lens 41L.
  • the half mirror provided in the lens 41L transmits external light and guides it to the left eye, and reflects the light guided by the optical member to enter the left eye.
  • the frame 95 is provided with a right-eye display panel and an optical member that guides light emitted from the right-eye display panel to the lens 41R.
  • the half mirror provided in the lens 41R transmits external light and guides it to the right eye, and reflects the light guided by the optical member to enter the right eye.
  • The display 28, which will be described later, includes the lens 41L, the left-eye display panel, and the left-eye optical member, as well as the lens 41R, the right-eye display panel, and the right-eye optical member.
  • With this configuration, the user U1 can observe the image displayed on the display panels in a see-through state in which the image is superimposed on the appearance of the outside world. Further, in the XR glasses 20, of the binocular images with parallax, the image for the left eye is displayed on the display panel for the left eye, and the image for the right eye is displayed on the display panel for the right eye. The user U1 can therefore perceive the displayed image as if it had depth and a stereoscopic effect.
  • FIG. 3 is a block diagram showing a configuration example of the XR glasses 20.
  • the XR glasses 20 include a processing device 21 , a storage device 22 , a line-of-sight detection device 23 , a GPS device 24 , a motion detection device 25 , an imaging device 26 , a communication device 27 and a display 28 .
  • Each element of the XR glasses 20 is interconnected by one or more buses for communicating information.
  • the term "apparatus" in this specification may be replaced with another term such as a circuit, a device, or a unit.
  • the processing device 21 is a processor that controls the XR glasses 20 as a whole.
  • the processing device 21 is configured using, for example, one or more chips.
  • The processing device 21 is configured using, for example, a central processing unit (CPU) including an interface with peripheral devices, an arithmetic device, registers, and the like. Some or all of the functions of the processing device 21 may be implemented by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
  • the processing device 21 executes various processes in parallel or sequentially.
  • the storage device 22 is a recording medium that can be read and written by the processing device 21 .
  • the storage device 22 also stores a plurality of programs including the control program PR1 executed by the processing device 21 .
  • the line-of-sight detection device 23 detects the line-of-sight of the user U1 and generates line-of-sight information indicating the detection result. Any method may be used to detect the line of sight by the line of sight detection device 23 .
  • the line-of-sight detection device 23 may detect line-of-sight information based on, for example, the position of the inner corner of the eye and the position of the iris.
  • the line-of-sight information indicates the line-of-sight direction of the user U1.
  • the line-of-sight detection device 23 supplies the line-of-sight information to the processing device 21, which will be described later.
  • the line-of-sight information supplied to the processing device 21 is transmitted to the terminal device 10 via the communication device 27 .
  • the GPS device 24 receives radio waves from multiple satellites.
  • the GPS device 24 also generates position information from the received radio waves.
  • the positional information indicates the position of the XR glasses 20 .
  • the location information may be in any format as long as the location can be specified.
  • the position information indicates the latitude and longitude of the XR glasses 20, for example.
  • location information is obtained from GPS device 24 .
  • the XR glasses 20 may acquire position information by any method.
  • the acquired position information is supplied to the processing device 21 .
  • the position information output to the processing device 21 is transmitted to the terminal device 10 via the communication device 27 .
  • the motion detection device 25 detects motion of the XR glasses 20 .
  • The motion detection device 25 is, for example, an inertial sensor such as an acceleration sensor that detects acceleration or a gyro sensor that detects angular acceleration.
  • the acceleration sensor detects acceleration in orthogonal X-, Y-, and Z-axes.
  • the gyro sensor detects angular acceleration around the X-, Y-, and Z-axes.
  • the motion detection device 25 can generate posture information indicating the posture of the XR glasses 20 based on the output information of the gyro sensor.
  • the motion information includes acceleration data indicating three-axis acceleration and angular acceleration data indicating three-axis angular acceleration.
  • the motion detection device 25 supplies posture information indicating the posture of the XR glasses 20 and motion information related to the motion of the XR glasses 20 to the processing device 21 .
  • the posture information and motion information supplied to the processing device 21 are transmitted to the terminal device 10 via the communication device 27 .
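  • The patent does not specify how the posture information is derived from the sensor outputs; the following is a minimal sketch, assuming the common approach of integrating the gyro output over each sampling interval. The function and variable names are illustrative assumptions, not identifiers from the patent.

```python
# Minimal sketch: deriving posture information for the XR glasses 20 from
# three-axis gyro output by integration (an assumed approach, not taken
# from the patent text).
import numpy as np

def update_posture(posture_rpy: np.ndarray,
                   angular_rate: np.ndarray,
                   dt: float) -> np.ndarray:
    """Integrate a three-axis angular rate [rad/s] over one sampling
    interval dt [s] to update a roll/pitch/yaw posture estimate [rad]."""
    return posture_rpy + angular_rate * dt

posture = np.zeros(3)  # initial posture of the XR glasses 20
# Example: a slow rotation about the Y axis, sampled at 100 Hz.
posture = update_posture(posture, np.array([0.0, 0.01, 0.0]), dt=0.01)
```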
  • the imaging device 26 outputs imaging information obtained by imaging the outside world.
  • the imaging device 26 includes, for example, a lens, an imaging element, an amplifier, and an AD converter.
  • the light condensed through the lens is converted into an image pickup signal, which is an analog signal, by the image pickup device.
  • the amplifier amplifies the imaging signal and outputs it to the AD converter.
  • the AD converter converts the amplified imaging signal, which is an analog signal, into imaging information, which is a digital signal.
  • the converted imaging information is supplied to the processing device 21 .
  • the imaging information supplied to the processing device 21 is transmitted to the terminal device 10 via the communication device 27 .
  • the communication device 27 is hardware as a transmission/reception device for communicating with other devices.
  • the communication device 27 is also called a network device, a network controller, a network card, a communication module, etc., for example.
  • the communication device 27 may include a connector for wired connection and an interface circuit corresponding to the connector. Further, the communication device 27 may have a wireless communication interface. Products conforming to wired LAN, IEEE1394, and USB are examples of connectors and interface circuits for wired connection. Also, as a wireless communication interface, there are products conforming to wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 28 is a device that displays images.
  • the display 28 displays various images under the control of the processing device 21 .
  • the display 28 includes the lens 41L, the left-eye display panel, the left-eye optical member, and the lens 41R, the right-eye display panel, and the right-eye optical member, as described above.
  • Various display panels such as a liquid crystal display panel and an organic EL display panel are preferably used as the display panel.
  • the processing device 21 functions as an acquisition unit 211 and a display control unit 212, for example, by reading the control program PR1 from the storage device 22 and executing it.
  • the acquisition unit 211 acquires image information indicating an image displayed on the XR glasses 20 from the terminal device 10-1.
  • The acquisition unit 211 also acquires the line-of-sight information supplied from the line-of-sight detection device 23, the position information supplied from the GPS device 24, the posture information and motion information supplied from the motion detection device 25, and the imaging information supplied from the imaging device 26. The acquisition unit 211 then supplies the acquired line-of-sight information, position information, posture information, motion information, and imaging information to the communication device 27.
  • Based on the image information acquired from the terminal device 10-1 by the acquisition unit 211, the display control unit 212 causes the display 28 to display the image indicated by the image information.
  • FIG. 4 is a block diagram showing a configuration example of the terminal device 10.
  • the terminal device 10 includes a processing device 11 , a storage device 12 , a communication device 13 , a display 14 , an input device 15 and an inertial sensor 16 . Elements of the terminal device 10 are interconnected by one or more buses for communicating information. As the configuration of the terminal device 10, the configuration of the terminal device 10-1 will be basically described below.
  • the processing device 11 is a processor that controls the terminal device 10 as a whole. Also, the processing device 11 is configured using, for example, a single chip or a plurality of chips. The processing unit 11 is configured using, for example, a central processing unit (CPU) including interfaces with peripheral devices, arithmetic units, registers, and the like. A part or all of the functions of the processing device 11 may be implemented by hardware such as DSP, ASIC, PLD, and FPGA. The processing device 11 executes various processes in parallel or sequentially.
  • the storage device 12 is a recording medium readable and writable by the processing device 11 .
  • the storage device 12 also stores a plurality of programs including the control program PR2 executed by the processing device 11 .
  • the communication device 13 is hardware as a transmission/reception device for communicating with other devices.
  • the communication device 13 is also called a network device, a network controller, a network card, a communication module, or the like, for example.
  • the communication device 13 may include a connector for wired connection and an interface circuit corresponding to the connector. Further, the communication device 13 may have a wireless communication interface. Products conforming to wired LAN, IEEE1394, and USB are examples of connectors and interface circuits for wired connection. Also, as a wireless communication interface, there are products conforming to wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 14 is a device that displays images and character information.
  • the display 14 displays various images under the control of the processing device 11 .
  • various display panels such as a liquid crystal display panel and an organic EL (Electro Luminescence) display panel are preferably used as the display 14 .
  • the display 14 may not be an essential component. In this case, the XR glasses 20 further have the same function as the display 14 .
  • the input device 15 accepts operations from the user U1 who wears the XR glasses 20 on his head.
  • the input device 15 includes a pointing device such as a keyboard, touch pad, touch panel, or mouse.
  • the input device 15 may also serve as the display 14 .
  • the inertial sensor 16 is a sensor that detects inertial force.
  • the inertial sensor 16 includes, for example, one or more of an acceleration sensor, an angular velocity sensor, and a gyro sensor.
  • the processing device 11 detects the orientation of the terminal device 10 based on the output information from the inertial sensor 16 . Further, the processing device 11 receives selection of the virtual object VO, input of characters, and input of instructions in the celestial sphere virtual space VS based on the orientation of the terminal device 10 .
  • the user U1 directs the central axis of the terminal device 10 toward a predetermined area of the virtual space VS, and operates the input device 15 to select the virtual object VO arranged in the predetermined area.
  • the user U1's operation on the input device 15 is, for example, a double tap. By operating the terminal device 10 in this way, the user U1 can select the virtual object VO without looking at the input device 15 of the terminal device 10 .
  • the terminal device 10 preferably has a GPS device similar to the GPS device 24 provided in the XR glasses 20.
  • the processing device 11 functions as an output unit 111, an acquisition unit 112, a generation unit 113, and a display control unit 114 by reading the control program PR2 from the storage device 12 and executing it.
  • the output unit 111 outputs a message created by the user U1 using the input device 15 to the server 30.
  • the message specifies the sender of the message, the receiver of the message, and the content of the message.
  • the content of the message includes at least one of text and images.
  • When a message addressed to the user U1 acquired by the acquisition unit 112-1, which will be described later, has been read, the output unit 111 transmits information indicating that the message has been read to the server 30.
  • The acquisition unit 112 acquires a plurality of messages addressed to the user U from the server 30. If the terminal device 10 is the terminal device 10-1 used by the user U1, who is the first user, the acquisition unit 112-1 acquires from the server 30 a plurality of messages addressed to the user U1.
  • the generating unit 113 generates multiple individual objects corresponding to the multiple messages acquired by the acquiring unit 112 on a one-to-one basis.
  • the display control unit 114 causes the XR glasses 20 as a display device to display the virtual object, which is a collection of multiple individual objects generated by the generation unit 113 .
  • user U1 can visually confirm the number of multiple messages.
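  • As a minimal sketch, the one-to-one correspondence performed by the generation unit 113 can be expressed as follows; the class and field names are assumptions for illustration only, not identifiers from the patent.

```python
# Minimal sketch: the generation unit 113 creates one individual object SO
# per acquired message (one-to-one correspondence).
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    receiver: str
    body: str
    read: bool = False     # corresponds to the read flag in the message database MD

@dataclass
class IndividualObject:
    message: Message       # the single message this object represents
    shape: str = "sphere"  # the individual objects are spheres in FIGS. 5 and 6

def generate_individual_objects(messages: list[Message]) -> list[IndividualObject]:
    return [IndividualObject(message=m) for m in messages]
```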
  • FIGS. 5 and 6 are explanatory diagrams showing an example of the operation of the generation unit 113 and the display control unit 114.
  • In the following description, it is assumed that the X, Y, and Z axes are orthogonal to each other in the virtual space VS.
  • the X-axis extends in the front-rear direction of user U1.
  • the forward direction along the X axis is the X1 direction
  • the backward direction along the X axis is the X2 direction.
  • the Y-axis extends in the horizontal direction of the user U1.
  • the right direction along the Y axis is the Y1 direction
  • The left direction along the Y axis is the Y2 direction.
  • a horizontal plane is formed by these X-axis and Y-axis.
  • the Z-axis is orthogonal to the XY plane and extends in the vertical direction of the user U1.
  • the downward direction along the Z axis is the Z1 direction
  • the upward direction along the Z axis is the Z2 direction.
  • the coordinates of the user U1 in the virtual space VS correspond to the position of the user U1 in the real space.
  • the position of the user U1 in the physical space is indicated by position information generated in the XR glasses 20 worn on the head of the user U1.
  • the virtual object VO1 and the individual objects SO1 to SO3 are, for example, spheres.
  • the image information used by the display control unit 114 to display the individual objects SO1 to SO3 on the XR glasses 20 may be information stored in the storage device 12.
  • the image information may be information acquired from the server 30 by the acquisition unit 112 .
  • individual objects SO1 to SO3 may correspond one-to-one to all messages addressed to user U1.
  • individual objects SO1 to SO3 may correspond one-to-one only to unread messages among all messages addressed to user U1.
  • one individual object may correspond to a plurality of read messages.
  • The viewing angle at which the user U1 visually recognizes the virtual object VO1 is θ1.
  • the display control unit 114 further increases the size of the virtual object VO1 as the number of messages acquired by the acquisition unit 112 increases. That is, the display control unit 114 increases the size of the virtual object VO1 as the number of individual objects SO included in the virtual object VO1 increases. Further, the display control unit 114 arranges the virtual object VO1 at a position farther from the user U1 in the virtual space VS as the number of the plurality of messages is larger. In other words, the display control unit 114 increases the distance from the center of the virtual space VS to the center of the virtual object VO1 as the number of messages increases.
  • When the number of the plurality of messages is a first number, the display control unit 114 sets the size of the virtual object VO1 to a first size and sets the distance from the center of the virtual space VS to the center of the virtual object VO1 to a first distance. Further, when the number of the plurality of messages is a second number larger than the first number, the display control unit 114 sets the size of the virtual object VO1 to a second size larger than the first size and sets the distance from the center of the virtual space VS to the center of the virtual object VO1 to a second distance longer than the first distance.
  • The viewing angle at which the user U1 visually recognizes the virtual object VO2 is θ2.
  • The viewing angle θ1 at which the user views the virtual object VO1 in FIG. 5 and the viewing angle θ2 at which the user views the virtual object VO2 in FIG. 6 are preferably equal.
  • In that case, the proportion of the field of view of the user U1 occupied by the virtual object VO2 is equal to the proportion occupied by the virtual object VO1. Even as the number of messages increases, the area of the virtual object VO2 in the field of view of the user U1 remains equal to the area of the virtual object VO1, so the user U1 does not feel that his or her field of view has narrowed.
  • However, the viewing angle θ1 and the viewing angle θ2 do not necessarily have to be equal. For example, the viewing angle at which the virtual object VO1 is visually recognized may gradually increase as the number of messages increases.
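  • A minimal sketch of this size-and-distance rule follows, assuming a spherical virtual object and arbitrary scaling constants (the patent gives no concrete values). Because the radius and the distance grow in proportion, the viewing angle 2·atan(radius/distance) stays constant, matching the preferred case θ1 = θ2.

```python
# Minimal sketch: grow the virtual object VO with the message count while
# moving it away from the center of the virtual space VS, keeping the
# viewing angle constant. Constants and the cube-root growth are assumptions.
import math

BASE_RADIUS = 0.10    # radius of VO for one message (assumed units)
BASE_DISTANCE = 1.0   # distance from the center of VS for one message

def layout_virtual_object(num_messages: int) -> tuple[float, float, float]:
    """Return (radius, distance, viewing_angle) for the aggregate object VO."""
    scale = num_messages ** (1 / 3)   # e.g. volume grows with the message count
    radius = BASE_RADIUS * scale      # "second size" > "first size"
    distance = BASE_DISTANCE * scale  # "second distance" > "first distance"
    viewing_angle = 2 * math.atan(radius / distance)  # constant: ratio is fixed
    return radius, distance, viewing_angle
```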
  • The display control unit 114 may also change the display mode of the virtual object VO1 according to the number of messages addressed to the user U1. For example, as the number of individual objects SO included in the virtual object VO1 increases, the display control unit 114 may change at least one of the color of the virtual object VO1 and the colors of the individual objects SO included in the virtual object VO1. Alternatively, the display control unit 114 may change the shape of the virtual object VO1 as the number of individual objects SO increases. For example, each time the number of individual objects SO reaches a predetermined number, the plurality of individual objects SO may be arranged so as to represent a specific character in the virtual object VO1.
  • FIG. 7 is a diagram showing an example of how a plurality of individual objects SO1 to SO10 are aligned. In the example shown in FIG. 7, in the virtual object VO3, the individual objects SO1 to SO10 are aligned to represent the letter "N".
  • With this configuration, the terminal device 10 can more strongly convey to the user U1 that the number of messages has increased.
  • the display control unit 114 may vary the display modes of the individual objects SO according to the transmission source devices corresponding to each of the plurality of messages addressed to the user U1. For example, the display control unit 114 may change at least one of the shape and color of the individual object SO according to the device that sent the message.
  • the user U1 can distinguish the plurality of individual objects SO according to the transmission source devices of the plurality of messages, simply by visually recognizing the plurality of individual objects SO.
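  • As a minimal sketch, varying the display mode per transmission source device might look like the following; the color palette and device IDs are assumptions.

```python
# Minimal sketch: choose each individual object SO's display mode from the
# message's transmission source device (colors and IDs are illustrative only).
SENDER_COLORS = {"terminal-10-2": "red", "terminal-10-3": "blue"}

def style_for_sender(sender_device_id: str) -> dict:
    # Unknown senders fall back to a default color; the shape could vary likewise.
    return {"color": SENDER_COLORS.get(sender_device_id, "gray"),
            "shape": "sphere"}
```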
  • FIG. 8 is a block diagram showing a configuration example of the server 30.
  • the server 30 comprises a processing device 31 , a storage device 32 , a communication device 33 , a display 34 and an input device 35 .
  • Each element of server 30 is interconnected by one or more buses for communicating information.
  • the processing device 31 is a processor that controls the server 30 as a whole. Also, the processing device 31 is configured using, for example, a single chip or a plurality of chips. The processing unit 31 is configured using, for example, a central processing unit (CPU) including interfaces with peripheral devices, arithmetic units, registers, and the like. A part or all of the functions of the processing device 31 may be realized by hardware such as DSP, ASIC, PLD, and FPGA. The processing device 31 executes various processes in parallel or sequentially.
  • the storage device 32 is a recording medium readable and writable by the processing device 31 .
  • the storage device 32 also stores a plurality of programs including the control program PR3 executed by the processing device 31 .
  • the storage device 32 stores a message database MD in which information related to messages transmitted and received between a plurality of users U is stored.
  • FIG. 9 is a table showing an example of the message database MD.
  • The acquisition unit 311 provided in the server 30 acquires, from the terminal device 10, messages transmitted and received between the users U. More specifically, the acquisition unit 311 acquires information indicating the sender of a message output from the terminal device 10, information indicating the recipient of the message, and information indicating the content of the message. This information is stored in the message database MD. Further, the acquisition unit 311 acquires information indicating that each message has been read by the user U. Based on this information, the message management unit 313, which will be described later, sets a flag in the message database MD indicating whether or not each message has been read. As an example, a flag with a value of "0" indicates that the message is unread.
  • a flag with a value of "1” indicates that the message has already been read.
  • "n" is an integer of 2 or more.
  • the communication device 33 is hardware as a transmission/reception device for communicating with other devices.
  • the communication device 33 is also called a network device, a network controller, a network card, a communication module, etc., for example.
  • the communication device 33 may include a connector for wired connection and an interface circuit corresponding to the connector. Further, the communication device 33 may have a wireless communication interface. Products conforming to wired LAN, IEEE1394, and USB are examples of connectors and interface circuits for wired connection. Also, as a wireless communication interface, there are products conforming to wireless LAN, Bluetooth (registered trademark), and the like.
  • the display 34 is a device that displays images and character information.
  • the display 34 displays various images under the control of the processing device 31 .
  • various display panels such as a liquid crystal display panel and an organic EL display panel are preferably used as the display 34 .
  • the input device 35 is a device that accepts operations by the administrator of the information processing system 1 .
  • the input device 35 includes a pointing device such as a keyboard, touch pad, touch panel, or mouse.
  • the input device 35 may also serve as the display 34 .
  • the processing device 31 functions as an acquisition unit 311, an output unit 312, and a message management unit 313, for example, by reading the control program PR3 from the storage device 32 and executing it.
  • the acquisition unit 311 acquires various data from the terminal device 10 via the communication device 33 .
  • the data includes, for example, data indicating the operation content for the virtual object VO, which is input to the terminal device 10 by the user U1 wearing the XR glasses 20 on the head.
  • the acquisition unit 311 acquires messages transmitted and received between users U. Furthermore, as an example, when a message addressed to user U1 has been read by user U1 in terminal device 10-1, acquisition unit 311 acquires information indicating that the message has been read.
  • the output unit 312 transmits image information indicating an image displayed on the XR glasses 20 to the terminal device 10 .
  • the image information may be stored in the storage device 32 .
  • the image information may be generated by a generating unit (not shown).
  • the output unit 312 transmits a message addressed to the user U1 stored in the message database MD to the terminal device 10-1.
  • The message management unit 313 manages the message database MD. As an example, when the acquisition unit 311 acquires information indicating that a message addressed to the user U1 has been read by the user U1 on the terminal device 10-1, the message management unit 313 changes the flag linked to that message from "0", indicating unread, to "1", indicating read.
  • FIG. 10 is a flowchart showing the operation of the terminal device 10 according to the first embodiment, especially the terminal device 10-1 used by the user U1. The operation of the terminal device 10-1 will be described below with reference to FIG. 10.
  • In step S1, the processing device 11-1 functions as the acquisition unit 112-1.
  • Processing device 11-1 acquires a plurality of messages addressed to user U1.
  • In step S2, the processing device 11-1 functions as the generation unit 113-1.
  • the processing device 11-1 generates a plurality of individual objects SO corresponding to the plurality of messages on a one-to-one basis.
  • In step S3, the processing device 11-1 functions as the display control unit 114-1.
  • the processing device 11-1 causes the XR glasses 20 as a display device to display a virtual object VO, which is an aggregate of a plurality of individual objects SO.
  • the processing device 11-1 increases the size of the virtual object VO as the number of messages acquired in step S1 increases.
  • the processing device 11-1 arranges the virtual object VO1 at a position farther from the user U1 in the virtual space VS as the number of the plurality of messages acquired in step S1 increases. After that, the processing device 11-1 executes the process of step S1.
  • As described above, the terminal device 10 as a display control device is a display control device for displaying a virtual space VS including a virtual object VO on the XR glasses 20, a display device worn on the head.
  • the terminal device 10 includes an acquisition unit 112 , a generation unit 113 and a display control unit 114 .
  • Acquisition unit 112 acquires a plurality of messages.
  • the generation unit 113 generates a plurality of individual objects SO corresponding to a plurality of messages on a one-to-one basis.
  • the display control unit 114 causes the XR glasses 20 as a display device to display a virtual object VO, which is an aggregate of a plurality of individual objects SO.
  • When the number of the plurality of messages is a first number, the display control unit 114 sets the size of the virtual object VO to a first size and sets the distance from the center of the virtual space VS to the center of the virtual object VO to a first distance. Furthermore, when the number of messages is a second number larger than the first number, the display control unit 114 sets the size of the virtual object VO to a second size larger than the first size and sets the distance from the center of the virtual space VS to the center of the virtual object VO to a second distance longer than the first distance.
  • Since the terminal device 10 has the above configuration, virtual objects VO corresponding to the number of messages can be displayed in the virtual space VS, and deterioration of the user's convenience can be suppressed even if the number of messages increases. Specifically, in the virtual space VS, as the number of individual objects SO corresponding to the number of messages increases, the terminal device 10 increases the size of the virtual object VO, which is a collection of the individual objects SO, and at the same time displays the virtual object VO at a position farther from the user U1. With this configuration, even if the number of individual objects SO increases on the display 28 of the XR glasses 20 worn on the head of the user U1, their viewability is ensured. As a result, deterioration in user convenience is suppressed.
  • the display control unit 114 changes the display mode of the virtual object VO according to the number of messages.
  • Since the terminal device 10 has the above configuration, when the number of messages increases, it can more strongly convey to the user U1 that the number of messages has increased.
  • the display control unit 114 makes the display modes of the plurality of individual objects SO different from each other according to the transmission source device corresponding to each of the plurality of messages.
  • Since the terminal device 10 has the above configuration, the user U1 can distinguish the plurality of individual objects SO according to the transmission source devices of the plurality of messages simply by visually recognizing them.
  • 2: Second Embodiment
  • A configuration of an information processing system 1A including a terminal device 10A as a display control device according to the second embodiment of the present invention will be described with reference to FIGS. 11 to 14.
  • The information processing system 1A according to the second embodiment of the present invention differs from the information processing system 1 according to the first embodiment in that it includes a terminal device 10A instead of the terminal device 10. Otherwise, the overall configuration of the information processing system 1A is the same as the overall configuration of the information processing system 1 according to the first embodiment shown in FIG. 1, so illustration and description thereof are omitted.
  • FIG. 11 is a block diagram showing a configuration example of the terminal device 10A. Unlike the terminal device 10, the terminal device 10A includes a processing device 11A instead of the processing device 11 and a storage device 12A instead of the storage device 12.
  • the storage device 12A stores the control program PR2A instead of the control program PR2.
  • the processing device 11A includes a display control unit 114A instead of the display control unit 114 included in the processing device 11.
  • the processing device 11A also includes a reception unit 115 and a determination unit 116 in addition to the components included in the processing device 11 .
  • the reception unit 115 receives an operation from the user U1 on the virtual object VO.
  • the accepting unit 115 may accept an operation using the input device 15 by the user U1.
  • the accepting unit 115 may accept an operation of the user U1 touching the virtual object VO in the virtual space VS.
  • the operation of selecting the virtual object VO in the virtual space VS is an example of the first operation.
  • the receiving unit 115 may also receive an operation by the user U1 to select one message from a list of multiple messages displayed in the virtual space VS, as will be described later. There is a one-to-one correspondence between these multiple messages and multiple individual objects SO.
  • One individual object SO is selected by the user U1 selecting one message from a list of a plurality of messages.
  • the accepting unit 115 may accept an operation of the user U1 touching one individual object SO in the virtual space VS.
  • the operation of selecting one individual object SO in the virtual space VS is an example of the second operation.
  • The display control unit 114A has the same functions as the display control unit 114. Further, when the operation by the user U1 is the first operation, the display control unit 114A displays a list of the plurality of messages in the virtual space VS. On the other hand, when the operation by the user U1 is the second operation, the display control unit 114A displays, in the virtual space VS, the content of the message corresponding to the one individual object SO designated by the second operation.
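  • A minimal sketch of this branching follows, with assumed function names and message fields.

```python
# Minimal sketch: the display control unit 114A branches on the operation
# received by the reception unit 115 (names are illustrative assumptions).
def show_message_list(messages: list[dict]) -> None:
    print("List L:", [m["title"] for m in messages])  # list of message titles

def show_message_content(message: dict) -> None:
    print("Message object MO1:", message["body"])     # text and/or images

def on_user_operation(operation: dict, messages: list[dict]) -> None:
    if operation["kind"] == "first":       # the virtual object VO was selected
        show_message_list(messages)
    elif operation["kind"] == "second":    # one individual object SO was selected
        show_message_content(messages[operation["index"]])
```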
  • FIGS. 12 and 13 are explanatory diagrams of operation examples of the display control unit 114A and the reception unit 115.
  • As shown in FIG. 12, when the user U1 performs an operation of selecting the virtual object VO2 using the terminal device 10-1, the display control unit 114A displays a list L of a plurality of messages in the virtual space VS. More specifically, the display control unit 114A displays a list of the titles of messages A to F as the list L in the virtual space VS. Messages A to F correspond one-to-one to the individual objects SO1 to SO6, respectively. Note that in FIG. 12 the list L is displayed to the left of the virtual object VO2 as seen from the user U1, but this display location is merely an example.
  • the list L may be displayed anywhere in the virtual space VS.
  • the list L is preferably displayed at a position that does not overlap the virtual object VO2 when viewed from the user U1. The same applies when the user U1 performs an operation of touching the virtual object VO7 in the virtual space VS.
  • the user U1 can visually recognize, in a list format, a plurality of messages corresponding one-to-one to the plurality of individual objects SO included in the virtual object VO.
  • When the user U1 performs an operation of selecting one message from the plurality of messages shown in the list L of FIG. 12, a message object MO1 indicating the contents of that message is displayed, as shown in FIG. 13.
  • the message object MO1 includes at least one of text and images.
  • the user U1 performs an operation to select "message F" corresponding to the individual object SO6 from the plurality of messages shown in the list L.
  • the user U1 performs an operation to touch the individual object SO6 in the virtual space VS.
  • the user U1 can visually recognize the specific contents of each of the multiple messages corresponding to the multiple individual objects SO included in the virtual object VO.
  • The determination unit 116 determines the importance of each of the plurality of messages. For example, the determination unit 116 may determine the importance by analyzing the contents of each of the messages. Alternatively, the determination unit 116 may determine the importance based on the transmission source device corresponding to each of the messages, or based on an operation of the user U1 using the input device 15. For example, as shown in FIG. 13, while the contents of the message corresponding to one individual object SO are displayed, the user U1 judges the importance of that message and inputs the judgment result using the input device 15. The determination unit 116 may then determine the importance based on the content of the input from the input device 15.
  • The display control unit 114A displays the individual objects SO corresponding to messages whose importance is equal to or greater than a predetermined value nearer to the user U1, that is, nearer to the center of the virtual space VS, than the individual objects SO corresponding to messages whose importance is less than the predetermined value. More generally, within the virtual object VO, the display control unit 114A may display an individual object SO closer to the user U1, that is, closer to the approximate center of the virtual space VS, the higher its importance.
  • the "predetermined value" described above is an example of the "first value”.
  • user U1 can preferentially check messages that are more important to him/herself.
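  • A minimal sketch of importance-based placement follows; the importance heuristic, the first value, and the distances are all assumptions (the patent also allows importance based on the sender or on explicit input by the user U1).

```python
# Minimal sketch: place individual objects SO for important messages nearer
# to the user U1 (the center of the virtual space VS).
FIRST_VALUE = 0.5                         # the "first value" threshold (assumed)
NEAR_DISTANCE, FAR_DISTANCE = 0.8, 1.5    # assumed distances from the center of VS

def importance_of(message: dict) -> float:
    # Placeholder content analysis by the determination unit 116.
    return 1.0 if "urgent" in message["body"].lower() else 0.2

def placement_distance(message: dict) -> float:
    near = importance_of(message) >= FIRST_VALUE
    return NEAR_DISTANCE if near else FAR_DISTANCE
```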
  • FIG. 14 is a flowchart showing the operation of the terminal device 10A according to the second embodiment, especially the terminal device 10A-1 used by the user U1. The operation of the terminal device 10A-1 will be described below with reference to FIG. 14.
  • In step S11, the processing device 11A-1 functions as the acquisition unit 112-1.
  • the processing device 11A-1 acquires a plurality of messages addressed to the user U1.
  • In step S12, the processing device 11A-1 functions as the generation unit 113-1.
  • the processing device 11A-1 generates a plurality of individual objects SO corresponding to the plurality of messages on a one-to-one basis.
  • In step S13, the processing device 11A-1 functions as the display control unit 114A-1.
  • the processing device 11A-1 causes the XR glasses 20 as a display device to display a virtual object VO, which is an aggregate of a plurality of individual objects SO.
  • the processing device 11A-1 increases the size of the virtual object VO as the number of messages acquired in step S11 increases. Also, the processing device 11A-1 arranges the virtual object VO1 at a position farther from the user U1 in the virtual space VS as the number of the plurality of messages acquired in step S11 increases.
  • In step S14, the processing device 11A-1 functions as the reception unit 115.
  • Processing device 11A-1 receives an operation from user U1. If the operation from the user U1 is the first operation on the virtual object VO, the processing device 11A-1 executes the process of step S15. If the operation from the user U1 is the second operation on the individual object SO, the processing device 11A-1 executes the process of step S16.
  • In step S15, the processing device 11A-1 functions as the display control unit 114A.
  • the processing device 11A-1 causes the list L of a plurality of messages to be displayed in the virtual space VS. After that, the processing device 11A-1 executes the process of step S14.
  • In step S16, the processing device 11A-1 functions as the display control unit 114A.
  • the processing device 11A-1 causes the contents of the message corresponding to one individual object SO to be displayed in the virtual space VS. After that, the processing device 11A-1 executes the process of step S11.
  • the terminal device 10A as a display control device further includes the reception unit 115 that receives an operation on the virtual object VO. If the above operation is the first operation, the display control unit 114 causes the list L of the plurality of messages to be displayed in the virtual space VS.
  • the user U1 can visually recognize, in a list format, a plurality of messages corresponding one-to-one to a plurality of individual objects SO included in the virtual object VO.
  • When the above operation is the second operation of designating one individual object SO among the plurality of individual objects SO, the display control unit 114 displays the content of the message corresponding to the one individual object SO in the virtual space VS.
  • the user U1 can visually recognize the specific contents of each of the multiple messages corresponding one-to-one to the multiple individual objects SO included in the virtual object VO.
  • the terminal device 10A as a display control device further includes the determination unit 116 that determines the importance of each of a plurality of messages.
  • The display control unit 114 displays the individual objects SO corresponding to messages whose importance is equal to or greater than the first value nearer to the user U1 than the individual objects SO corresponding to messages whose importance is less than the first value.
  • Since the terminal device 10A has the above configuration, the user U can preferentially check messages that are more important to him or her.
  • 3: Third Embodiment
  • 3-1: Configuration of Third Embodiment
  • 3-1-1: Overall Configuration
  • The information processing system 1B according to the third embodiment of the present invention differs from the information processing system 1 according to the first embodiment in that it includes a terminal device 10B instead of the terminal device 10. Otherwise, the overall configuration of the information processing system 1B is the same as the overall configuration of the information processing system 1 according to the first embodiment shown in FIG. 1, so illustration and description thereof are omitted.
  • FIG. 15 is a block diagram showing a configuration example of the terminal device 10B.
  • the terminal device 10B includes a processing device 11B instead of the processing device 11 and a storage device 12B instead of the storage device 12.
  • The terminal device 10B also includes a sound pickup device 17 in addition to the components included in the terminal device 10.
  • the sound pickup device 17 picks up the voice of the user U1 and converts the picked-up voice into an electric signal.
  • the sound collecting device 17 is specifically a microphone. An electrical signal converted from the voice by the sound collecting device 17 is output to the voice recognition unit 117, which will be described later.
  • the storage device 12B stores the control program PR2B instead of the control program PR2.
  • the processing device 11B includes an acquisition unit 112B instead of the acquisition unit 112.
  • the processing device 11B also includes a speech recognition unit 117 and a message generation unit 118 in addition to the components included in the processing device 11.
  • the voice recognition unit 117 recognizes the voice collected by the sound collection device 17. More specifically, the speech recognition unit 117 generates text by performing speech recognition based on the electrical signal acquired from the sound pickup device 17 .
  • the message generation unit 118 generates a message corresponding to the text generated by the speech recognition unit 117.
  • The acquisition unit 112B acquires the message generated by the message generation unit 118.
  • The plurality of messages acquired by the acquisition unit 112B may include, in addition to the message generated by the message generation unit 118, messages generated by a plurality of users U including the user U1.
  • the plurality of messages are generated by the terminal device 10B-1 as a display control device and one or more terminal devices 10B connected to the terminal device 10B-1 via the communication network NET.
  • Since the terminal device 10B has the above configuration, it can generate a message based on the voice uttered by the user U1 and generate an individual object SO based on the generated message.
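  • A minimal sketch of this pipeline follows. The recognizer is a stand-in: the patent does not name any specific speech recognition method or API.

```python
# Minimal sketch: sound pickup device 17 -> speech recognition unit 117 ->
# message generation unit 118 (function names are illustrative assumptions).
def recognize_speech(audio_signal: bytes) -> str:
    # Placeholder for the speech recognition unit 117; any recognizer that
    # turns the electrical signal from device 17 into text would do here.
    return "transcribed text"

def generate_message(sender: str, receiver: str, audio_signal: bytes) -> dict:
    text = recognize_speech(audio_signal)                           # unit 117
    return {"sender": sender, "receiver": receiver, "body": text}  # unit 118

message = generate_message("U1", "U1", b"...")  # then acquired by unit 112B
```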
  • the messages generated by the message generator 118 are included in the plurality of messages acquired in step S1.
  • The plurality of messages are generated by the terminal device 10B-1 as the display control device and by one or more terminal devices 10B connected to the terminal device 10B-1 via the communication network NET.
  • the user U1 can confirm messages generated by users U other than the user U1.
  • the terminal device 10B as a display control device further includes the sound pickup device 17, the speech recognition section 117, and the message generation section 118.
  • the sound pickup device 17 picks up the voice of the user U and outputs an electrical signal representing the voice.
  • the speech recognition unit 117 generates text based on the electrical signal output from the sound pickup device 17.
  • Message generator 118 generates a message corresponding to the text generated by speech recognizer 117 .
  • the plurality of messages includes messages generated by the message generator 118 .
  • Since the terminal device 10B has the above configuration, it can generate the individual object SO based on the voice uttered by the user U1.
  • 4-1: Configuration of Fourth Embodiment
  • 4-1-1: Overall Configuration
  • An information processing system 1C according to the fourth embodiment of the present invention differs from the information processing system 1 according to the first embodiment in that a terminal device 10C and a server 30A are provided instead of the terminal device 10 and the server 30, respectively. Otherwise, the overall configuration of the information processing system 1C is the same as the overall configuration of the information processing system 1 according to the first embodiment shown in FIG. 1, so illustration and description thereof are omitted.
  • the terminal device 10C includes a processing device 11C instead of the processing device 11 and a storage device 12C instead of the storage device 12.
  • the storage device 12C stores a control program PR2C instead of the control program PR2.
  • the processing device 11C includes an output section 111C instead of the output section 111.
  • Otherwise, the configuration of the terminal device 10C is the same as the configuration of the terminal device 10 according to the first embodiment shown in FIG. 4, so illustration and description thereof are omitted.
  • The output unit 111C has the same functions as the output unit 111. In addition, the output unit 111C outputs, to the server 30A, the device ID (identifier) of the terminal device 10, the user name of the user using the terminal device 10, and the location information acquired by the terminal device 10 from the XR glasses 20. For example, if the terminal device 10 is the terminal device 10-1 used by the user U1, who is the first user, the output unit 111C-1 outputs, to the server 30A, the device ID of the terminal device 10-1, the user name of the user U1 using the terminal device 10-1, and the position information generated by the XR glasses 20 worn on the head of the user U1.
  • the terminal device 10-1 is an example of a first display control device.
  • the output unit 111C also outputs coordinates indicating the display position in the virtual space VS of the virtual object VO displayed in the virtual space VS by the display control unit 114 to the server 30A.
  • a virtual space VS including a virtual object VO is displayed on the XR glasses 20 connected to the terminal device 10-1.
  • XR glasses 20 are an example of a first display device.
  • the virtual space VS displayed on the XR glasses 20 is an example of the first virtual space.
  • FIG. 16 is a block diagram showing a configuration example of the server 30A. Unlike the server 30, the server 30A includes a processing device 31A instead of the processing device 31 and a storage device 32A instead of the storage device 32.
  • the storage device 32A stores the control program PR3A instead of the control program PR3.
  • the storage device 32A also stores a location information database LD.
  • FIG. 17 is a table showing an example of the location information database LD.
  • The terminal device 10C outputs, to the server 30A, the device ID of the terminal device 10C, the user name of the user using the terminal device 10C, and the location information that the terminal device 10C acquired from the XR glasses 20 or generated itself.
  • the location information database LD stores these device IDs, user names, and location information.
  • In the location information database LD shown in FIG. 17, the device IDs, the user names "U1" to "UL", and the location information (x, y, z) acquired from the XR glasses 20 (for example, (xu1, yu1, zu1) for the user U1) are stored in a mutually linked state.
  • "L" is an integer of 2 or more.
  • the processing device 31A includes an acquisition unit 311A instead of the acquisition unit 311 and an output unit 312A instead of the output unit 312.
  • the processing device 31A also includes a determination unit 314 and an extraction unit 315 in addition to the constituent elements included in the processing device 31 .
  • the determination unit 314 determines whether or not the number of messages output by the output unit 312A to the terminal device 10C-1 is equal to or greater than a predetermined number.
  • The acquisition unit 311A acquires, from the terminal device 10C-1, the coordinates indicating the display position in the virtual space VS of the virtual object VO displayed in the virtual space VS by the display control unit 114-1. Note that the acquisition unit 311A may acquire the coordinates indicating the display position of the individual objects SO in addition to the coordinates indicating the display position of the virtual object VO.
  • The extraction unit 315 extracts, from among the users U who are permitted to share the virtual space VS with the user U1, users U other than the user U1 who are located within a predetermined distance from the display position of the virtual object VO in the virtual space VS.
  • the extracted other user U is an example of the second user.
  • When the number of messages is equal to or greater than the predetermined number, the second user is a user U who exists within a predetermined distance from the display position in the virtual space VS and who is permitted to share the virtual space VS. This "predetermined distance" preferably increases according to the number of messages output to the terminal device 10C-1 by the output unit 312A, as in the sketch below.
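  • The following sketch illustrates this extraction rule under stated assumptions: BASE_DISTANCE and PER_MESSAGE are invented tuning constants, and the user table is a stand-in for the location information database LD.

```python
import math

# Sketch of the extraction rule under stated assumptions: BASE_DISTANCE and
# PER_MESSAGE are invented tuning constants, and `users` is a stand-in for
# the location information database LD.

BASE_DISTANCE = 2.0  # threshold (m) when the message count is zero
PER_MESSAGE = 0.05   # growth (m) of the threshold per message

def predetermined_distance(message_count: int) -> float:
    """The distance threshold increases with the number of messages."""
    return BASE_DISTANCE + PER_MESSAGE * message_count

def extract_second_users(display_pos, users, message_count, first_user="U1"):
    """users maps a user name to ((x, y, z), shares_virtual_space)."""
    limit = predetermined_distance(message_count)
    return [
        name
        for name, (pos, shares) in users.items()
        if name != first_user and shares and math.dist(pos, display_pos) <= limit
    ]

users = {
    "U1": ((0.0, 0.0, 0.0), True),
    "U2": ((1.0, 0.0, 1.0), True),  # nearby and sharing -> extracted
    "U3": ((9.0, 0.0, 9.0), True),  # too far away -> not extracted
}
print(extract_second_users((0.5, 0.0, 0.5), users, message_count=30))  # ['U2']
```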
  • The output unit 312A supplies, to the terminal device 10C used by the user U extracted by the extraction unit 315, the same plurality of messages as the plurality of messages output to the terminal device 10C-1 and the coordinates indicating the display position of the virtual object VO in the virtual space VS.
  • the same multiple messages and the coordinates indicating the display position of the virtual object VO are examples of control information.
  • the terminal device 10C to which the output unit 312A transmits the control information is an example of a second display control device.
  • The XR glasses 20 connected to the terminal device 10C or the display 14 provided in the terminal device 10C is an example of a second display device.
  • the terminal device 10C that has obtained the same plurality of messages generates a plurality of individual objects SO corresponding to the same plurality of messages on a one-to-one basis, similar to the terminal device 10C-1.
  • The terminal device 10C displays the virtual space VS, in which the virtual object VO that is an aggregate of the plurality of individual objects SO is arranged at the display position obtained from the server 30A, on the XR glasses 20 connected to the terminal device 10C or on the display 14 provided in the terminal device 10C.
  • the virtual space VS displayed on the XR glasses 20 connected to the terminal device 10C or the display 14 provided in the terminal device 10C is an example of the second virtual space.
  • the same plurality of messages output from the output unit 312A and the coordinates indicating the display position of the virtual object VO are information for displaying the second virtual space including the virtual object VO on the second display device.
  • the output unit 312A outputs the coordinates indicating the display position of the individual object SO to the terminal device 10C.
  • the terminal device 10C displays the individual object SO at the display position in the virtual space VS obtained from the server 30A.
  • the other user U2 can also visually recognize the virtual object VO.
  • FIG. 18 is an explanatory diagram showing an example of operations of the determination unit 314, the extraction unit 315, and the output unit 312A. It is assumed that a virtual space VS including a virtual object VO4 is displayed on the XR glasses 20 worn on the head by the user U1. As the number of messages addressed to user U1 increases, the number of individual objects SO included in virtual object VO4 also increases. Also, as the number of individual objects SO included in the virtual object VO4 increases, the virtual object VO4 moves away from the user U1.
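  • As a hedged illustration of this behavior (the growth constants below are assumptions, not values from the disclosure), the size of the virtual object and its distance from the user U1 could each be made to increase with the number of individual objects SO:

```python
# Hedged illustration of FIG. 18's behavior: the growth constants below are
# assumptions, not values from the disclosure. Both the size of the virtual
# object VO4 and its distance from the user U1 grow with the number of
# individual objects SO it contains.

BASE_SIZE = 0.5     # initial radius (m) of the aggregate virtual object
SIZE_STEP = 0.02    # radius added per individual object SO
BASE_OFFSET = 1.0   # initial distance (m) from the user U1
OFFSET_STEP = 0.05  # distance added per individual object SO

def virtual_object_layout(num_individual_objects: int) -> tuple[float, float]:
    """Return (size, distance_from_user) for the aggregate virtual object."""
    size = BASE_SIZE + SIZE_STEP * num_individual_objects
    distance = BASE_OFFSET + OFFSET_STEP * num_individual_objects
    return size, distance

for n in (0, 20, 100):
    size, distance = virtual_object_layout(n)
    print(f"{n:3d} objects -> size {size:.2f} m, distance {distance:.2f} m")
```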
  • the output unit 312A transmits to the terminal device 10C-2 used by the user U2 the same plurality of messages that were transmitted to the terminal device 10C-1.
  • the acquiring unit 112 provided in the terminal device 10C-2 acquires the same plurality of messages and the coordinates indicating the display position of the virtual object VO4 in the virtual space VS from the server 30A.
  • The generation unit 113 provided in the terminal device 10C-2 then generates a plurality of individual objects SO corresponding to the same plurality of messages on a one-to-one basis.
  • FIG. 19 is a flow chart showing the operation of the server 30A according to the fourth embodiment. The operation of the server 30A will be described below with reference to FIG.
  • In step S21, the processing device 31A functions as the output unit 312A.
  • the processing device 31A transmits a plurality of messages to the terminal device 10C-1 used by the user U1.
  • the terminal device 10C-1 causes the XR glasses 20 to display the virtual object VO4 based on the plurality of messages.
  • In step S22, the processing device 31A functions as the determination unit 314.
  • the processing device 31A determines whether or not the number of messages sent to the terminal device 10C-1 is equal to or greater than a predetermined number.
  • the determination result is affirmative, that is, when the processing device 31A determines that the number of the plurality of messages is equal to or greater than the predetermined number, the processing device 31A executes the process of step S23.
  • the determination result is negative, that is, when the processing device 31A determines that the number of the plurality of messages is less than the predetermined number, the processing device 31A executes the process of step S21.
  • In step S23, the processing device 31A functions as the acquisition unit 311A.
  • the processing device 31A acquires, from the terminal device 10C-1, the coordinates indicating the display position in the virtual space VS of the virtual object VO4 displayed in the virtual space VS by the display control unit 114-1.
  • In step S24, the processing device 31A functions as the extraction unit 315.
  • the processing device 31A extracts users U other than the user U1 who exist within a predetermined distance from the display position of the virtual object VO4 in the virtual space VS from among the users U who share the virtual space VS with the user U1.
  • Here, it is assumed that the processing device 31A extracts the user U2.
  • In step S25, the processing device 31A functions as the output unit 312A.
  • The processing device 31A transmits, to the terminal device 10C-2 used by the user U2, the same plurality of messages as the plurality of messages output to the terminal device 10C-1 and the coordinates indicating the display position of the virtual object VO4 in the virtual space VS.
  • the terminal device 10C-2 that has obtained the same plurality of messages generates a plurality of individual objects SO corresponding to the same plurality of messages on a one-to-one basis, similar to the terminal device 10C-1.
  • the terminal device 10C-2 displays the virtual object VO4, which is an aggregate of the plurality of individual objects SO, at the display position in the virtual space VS obtained from the server 30A.
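  • A non-authoritative sketch of this server-side flow (steps S21 to S25) follows; PREDETERMINED_NUMBER and the stub functions are assumptions standing in for the output unit 312A, determination unit 314, acquisition unit 311A, and extraction unit 315.

```python
# Non-authoritative sketch of the server-side flow of FIG. 19 (steps S21-S25).
# PREDETERMINED_NUMBER, the stub functions, and the data shapes are
# assumptions; they stand in for the output unit 312A, determination unit
# 314, acquisition unit 311A, and extraction unit 315.

PREDETERMINED_NUMBER = 10

def send_messages(terminal_id, messages, display_pos=None):
    print(f"send {len(messages)} messages to {terminal_id}, pos={display_pos}")

def get_display_position(terminal_id):
    return (0.5, 0.0, 0.5)  # coordinates of VO4 reported by terminal 10C-1

def extract_nearby_users(display_pos):
    return ["10C-2"]  # terminals of users within the predetermined distance

def server_cycle(messages):
    # S21: the output unit 312A transmits the messages to terminal device 10C-1.
    send_messages("10C-1", messages)
    # S22: the determination unit 314 compares the message count with the threshold.
    if len(messages) < PREDETERMINED_NUMBER:
        return  # negative result: return to S21 on the next cycle
    # S23: the acquisition unit 311A acquires the display position of VO4.
    display_pos = get_display_position("10C-1")
    # S24: the extraction unit 315 extracts nearby users sharing the space.
    # S25: the output unit 312A sends the same messages and coordinates to them.
    for terminal_id in extract_nearby_users(display_pos):
        send_messages(terminal_id, messages, display_pos)

server_cycle([f"message {i}" for i in range(12)])
```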
  • The server 30A is a server that transmits a plurality of messages to the terminal devices 10 to 10C as the first display control device.
  • the XR glasses 20 as the first display device are worn on the head of the user U1 as the first user.
  • the server 30A includes an acquisition unit 311A and an output unit 312A.
  • The acquisition unit 311A acquires the display position of the virtual object VO4 in the first virtual space VS.
  • The output unit 312A transmits control information to the terminal devices 10 to 10C as second display control devices used by a second user who exists within a predetermined distance from the display position in the first virtual space VS and who is permitted to share the virtual space VS.
  • the terminal devices 10 to 10C as the second display control device display the second virtual space on the XR glasses 20 as the second display device worn on the head of the user U2 as the second user.
  • the above control information is information for displaying the virtual object VO4 in the second virtual space on the second display device.
  • Since the server 30A has the above configuration, when the number of individual objects SO included in the virtual object VO4 displayed on the XR glasses 20 is equal to or greater than the predetermined number, the other user U2 can also visually recognize the virtual object VO4.
  • FIG. 20 is an explanatory diagram of the virtual object VO5 generated when the terminal device 10B and the server 30A are combined.
  • For example, the terminal device 10B may generate a plurality of individual objects SO based on the cheers of individual spectators in a soccer stadium, and the XR glasses 20 as AR glasses may display, above the soccer stadium, the virtual object VO5 composed of the plurality of individual objects SO.
  • The virtual object VO5 is shared by a plurality of spectators wearing the XR glasses 20 as AR glasses on their heads, and the plurality of individual objects SO may be arranged in a representative shape. In this case, the virtual object VO5 and the virtual space VS, which are composed of individual objects SO corresponding one-to-one to a plurality of messages, are shared by a plurality of users U who gather at a predetermined location.
  • the predetermined place may be a venue for some event or a public facility such as a school.
  • multiple users U who share the virtual object VO5 and the virtual space VS may participate in the same event. For example, an e-sports tournament corresponds to the event.
  • The terminal devices 10 to 10C may also acquire, in addition to messages addressed to the user U1 and messages generated by the user U1 himself or herself, messages sent from a second user U to a third user U.
  • the terminal devices 10 to 10C each include a display control section 114 or a display control section 114A.
  • the servers 30 to 30A may be configured to include the display control unit 114 or the display control unit 114A.
  • the servers 30 to 30A may set coordinates indicating the display positions of the virtual object VO and the individual object SO in the virtual space VS.
  • The server 30A has a determination unit 314.
  • the terminal device 10C may be configured to include the determination unit 314 instead of the server 30A. Specifically, the terminal device 10C determines whether the number of acquired messages or the number of individual objects SO displayed on the XR glasses 20 is equal to or greater than a predetermined number, and sends the determination result to the server. It may be configured to output to 30A.
  • The display control unit 114 or the display control unit 114A provided in the terminal devices 10 to 10C may erase an individual object SO whose message contents have been read by the user U.
  • The determination unit 314 may determine whether the number of unread messages, rather than the number of the plurality of messages output to the terminal device 10C-1, is equal to or greater than the predetermined number.
  • the acquisition unit 112 provided in the terminal device 10 to the terminal device 10C acquires a plurality of messages from the server 30 or the server 30A.
  • the acquiring unit 112 may acquire only message IDs corresponding to each of the plurality of messages from the server 30 or server 30A.
  • In this case, the generation unit 113 generates individual objects SO that correspond one-to-one with the plurality of message IDs instead of the plurality of messages. When the user U opens an individual object SO for the first time, the acquisition unit 112 acquires the content of the corresponding message from the server 30 or the server 30A. After that, the display control unit 114 or the display control unit 114A may display the content of the message in the virtual space VS, as sketched below.
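  • The sketch below illustrates this message-ID variation; LazyIndividualObject and fetch_body are hypothetical names, not an API defined in the disclosure. The content of a message is fetched from the server only when its individual object is first opened and is cached thereafter.

```python
# Sketch of the variation described above: the terminal first acquires only
# message IDs, generates individual objects from them, and fetches a message
# body from the server only when its object is first opened. fetch_body is a
# hypothetical server call, not an API defined in the disclosure.

class LazyIndividualObject:
    def __init__(self, message_id: str, fetch_body):
        self.message_id = message_id
        self._fetch_body = fetch_body
        self._body = None  # content not yet downloaded

    def open(self) -> str:
        """On first open, acquire the content from the server; then reuse it."""
        if self._body is None:
            self._body = self._fetch_body(self.message_id)
        return self._body

server_store = {"m1": "hello", "m2": "meeting at 10"}
objects = [LazyIndividualObject(mid, server_store.get) for mid in server_store]
print(objects[0].open())  # content fetched from the server on first display
```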
  • the individual object SO corresponds to the message generated by the user U on a one-to-one basis.
  • the messages to which the individual object SO corresponds are not limited to messages generated by the user U.
  • the message may be a notification to user U generated by an application.
  • The server 30 or the server 30A outputs messages stored in the message database MD to the terminal devices 10 to 10C.
  • the method of outputting a message from server 30 or server 30A to terminal devices 10 to 10C is not limited to this.
  • the information processing system 1 according to the first embodiment to the information processing system 1C according to the fourth embodiment may further include a content server. More specifically, the server 30 or the server 30A may acquire content including a message for the user U from the content server, and output the acquired content to the terminal devices 10 to 10C.
  • the terminal devices 10 to 10C and the XR glasses 20 are implemented separately.
  • the method of realizing the terminal devices 10 to 10C and the XR glasses 20 in the embodiment of the present invention is not limited to this.
  • the XR glasses 20 may have the same functions as the terminal device 10.
  • the terminal devices 10 to 10C and the XR glasses 20 may be implemented within a single housing. The same applies to the information processing system 1A according to the second embodiment to the information processing system 1C according to the fourth embodiment.
  • the information processing system 1 according to the first embodiment to the information processing system 1C according to the fourth embodiment include, as an example, XR glasses 20 as AR glasses.
  • However, instead of the XR glasses 20, the information processing system 1 to the information processing system 1C may include an HMD (Head Mounted Display) employing VR (Virtual Reality) technology, an HMD employing MR (Mixed Reality) technology, or MR glasses employing MR technology.
  • Alternatively, the information processing system 1 to the information processing system 1C may include, instead of the XR glasses 20, an ordinary smartphone or tablet equipped with an imaging device.
  • These HMDs, MR glasses, smartphones, and tablets are examples of display devices.
  • The storage devices 12 to 12C, 22, and 32 to 32A have been described using ROM and RAM as examples, but each may instead be a flexible disk, a magneto-optical disk (e.g., a compact disc, a digital versatile disc, or a Blu-ray disc), a smart card, a flash memory device (e.g., a card, stick, or key drive), a CD-ROM (Compact Disc-ROM), a register, a removable disk, a hard disk, a floppy disk, a magnetic strip, a database, a server, or another suitable storage medium.
  • the program may be transmitted from the communication network NET via an electric communication line.
  • the information, signals, etc. described may be represented using any of a variety of different technologies.
  • Data, instructions, commands, information, signals, bits, symbols, chips, and the like may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination of these.
  • input/output information and the like may be stored in a specific location (for example, memory), or may be managed using a management table. Input/output information and the like can be overwritten, updated, or appended. The output information and the like may be deleted. The entered information and the like may be transmitted to another device.
  • the determination may be made by a value (0 or 1) represented using 1 bit, or by a true/false value (Boolean: true or false). Alternatively, it may be performed by numerical comparison (for example, comparison with a predetermined value).
  • each function illustrated in FIGS. 1 to 20 is implemented by any combination of at least one of hardware and software.
  • The method of realizing each functional block is not particularly limited. That is, each functional block may be realized using one physically or logically coupled device, or using two or more physically or logically separate devices connected directly or indirectly (for example, by wire or wirelessly).
  • a functional block may be implemented by combining software in the one device or the plurality of devices.
  • software, instructions, information, etc. may be transmitted and received via a transmission medium.
  • When software is transmitted from a website, server, or other remote source using at least one of wired technology (coaxial cable, fiber-optic cable, twisted pair, digital subscriber line (DSL), etc.) and wireless technology (infrared, microwave, etc.), at least one of these wired and wireless technologies is included within the definition of a transmission medium.
  • The terms "system" and "network" used in this disclosure are used interchangeably.
  • Information, parameters, and the like described in this disclosure may be expressed using absolute values, may be expressed using relative values from a predetermined value, or may be expressed using other corresponding information.
  • the terminal device 10 to terminal device 10C and the server 30 to server 30A may be mobile stations (MS).
  • A mobile station may also be referred to by those skilled in the art as a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term. In the present disclosure, terms such as "mobile station," "user terminal," "user equipment (UE)," and "terminal" may be used interchangeably.
  • The terms "connected" and "coupled," and any variations thereof, mean any direct or indirect connection or coupling between two or more elements, including the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination thereof. For example, "connection" may be replaced with "access."
  • As used in this disclosure, two elements can be considered to be "connected" or "coupled" to each other by using at least one of one or more wires, cables, and printed electrical connections and, as some non-limiting and non-exhaustive examples, by using electromagnetic energy having wavelengths in the radio frequency, microwave, and optical (both visible and invisible) regions.
  • the phrase “based on” does not mean “based only on,” unless expressly specified otherwise. In other words, the phrase “based on” means both “based only on” and “based at least on.”
  • The terms "judging" and "determining" as used in this disclosure may encompass a wide variety of actions.
  • "Judging" and "determining" can include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (e.g., looking up in a table, database, or other data structure) as having "judged" or "determined."
  • "Judging" and "determining" can include regarding receiving (e.g., receiving information), transmitting (e.g., transmitting information), input, output, or accessing (e.g., accessing data in memory) as having "judged" or "determined."
  • "Judging" and "determining" can include regarding resolving, selecting, choosing, establishing, comparing, and the like as having "judged" or "determined."
  • In other words, "judging" and "determining" can include regarding some action as having "judged" or "determined."
  • "Judging (determining)" may be replaced with "assuming," "expecting," "considering," and the like.
  • The phrase "A and B are different" may mean "A and B are different from each other" or "A and B are each different from C." Terms such as "separate" and "coupled" may also be interpreted in the same manner as "different."
  • Notification of predetermined information is not limited to explicit notification, and may be performed implicitly (for example, by not notifying the predetermined information).
  • ... reception unit, 116 ... determination unit, 117 ... speech recognition unit, 118 ... message generation unit, 311, 311A ... acquisition unit, 312, 312A ... output unit, 313 ... message management unit, 314 ... determination unit, 315 ... extraction unit, MO1 ... message object, PR1 to PR3A ... control programs, SO, SO1 to SO10 ... individual objects, U, U1 to U2 ... users, VO, VO1 to VO5 ... virtual objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display control device includes an acquisition unit that acquires a plurality of messages, a generation unit that generates a plurality of individual objects corresponding one-to-one to the plurality of messages, and a display control unit that causes a display to show a virtual object that is a collection of the plurality of individual objects, the display control unit increasing the size of the virtual object and increasing the distance from the center of a virtual space to the center of the virtual object in accordance with an increase in the number of the plurality of messages.
PCT/JP2023/002690 2022-01-31 2023-01-27 Display control device and server WO2023145892A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-013454 2022-01-31
JP2022013454 2022-01-31

Publications (1)

Publication Number Publication Date
WO2023145892A1 true WO2023145892A1 (fr) 2023-08-03

Family

ID=87471727

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/002690 WO2023145892A1 (fr) Display control device and server

Country Status (1)

Country Link
WO (1) WO2023145892A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012069097A (ja) * 2010-08-26 2012-04-05 Canon Inc Display method and display device for data search results, and program
JP2021099544A (ja) * 2019-12-19 2021-07-01 富士フイルムビジネスイノベーション株式会社 Information processing device and program

Similar Documents

Publication Publication Date Title
US11087728B1 (en) Computer vision and mapping for audio applications
AU2017308914B2 (en) Word flow annotation
CN115917498A (zh) Augmented reality experience using voice and text captions
CN105190484B (zh) Personal holographic billboard
CN107943275B (zh) Simulated environment display system and method
KR20160145976A (ko) Image sharing method and electronic device performing the same
CN105453011A (zh) Virtual object orientation and visualization
CN113647116A (zh) Head-mounted device generating binaural audio
EP4172740A1 (fr) Augmented reality glasses with text bubbles and translation
Starner Wearable computing
WO2023145892A1 (fr) Display control device and server
CN115735175A (zh) Eyewear with shareable gaze-responsive viewing
US20230161959A1 (en) Ring motion capture and message composition system
US20230217007A1 (en) Hyper-connected and synchronized ar glasses
WO2023149255A1 (fr) Display control device
WO2023145890A1 (fr) Terminal device
WO2023149256A1 (fr) Display control device
WO2023034021A1 (fr) Social connection through distributed and connected real-world objects
CN117616381A (zh) Voice-controlled settings and navigation
WO2023112838A1 (fr) Information processing device
WO2023162499A1 (fr) Display control device
WO2023145265A1 (fr) Message transmission device and message reception device
WO2023079875A1 (fr) Information processing device
WO2023145273A1 (fr) Display control device
JP2022022871A (ja) Processing device and immersion level derivation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23747102

Country of ref document: EP

Kind code of ref document: A1