US11906741B2 - Display control device, display control method, and non-transitory computer-readable medium storing program - Google Patents

Display control device, display control method, and non-transitory computer-readable medium storing program

Info

Publication number
US11906741B2
Authority
US
United States
Prior art keywords
display
operation input
input information
display control
comment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US17/289,286
Other languages
English (en)
Other versions
US20210389588A1 (en
Inventor
Yousuke Motohashi
Mayo TAKETA
Takashi Nonaka
Hiroaki Yanagisawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
NEC Solution Innovators Ltd
Original Assignee
NEC Corp
NEC Solution Innovators Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp, NEC Solution Innovators Ltd filed Critical NEC Corp
Publication of US20210389588A1 publication Critical patent/US20210389588A1/en
Assigned to NEC CORPORATION. Assignment of assignors' interest (see document for details). Assignors: MOTOHASHI, YOUSUKE; NONAKA, TAKASHI; TAKETA, Mayo; YANAGISAWA, HIROAKI
Assigned to NEC CORPORATION and NEC SOLUTION INNOVATORS, LTD. Corrective assignment to correct the omission of the second assignee's name previously recorded at reel 060956, frame 0202; assignor(s) hereby confirms the assignment. Assignors: MOTOHASHI, YOUSUKE; NONAKA, TAKASHI; TAKETA, Mayo; YANAGISAWA, HIROAKI
Application granted granted Critical
Publication of US11906741B2 publication Critical patent/US11906741B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/02Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances
    • G09B5/08Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
    • G09B5/10Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations all student stations being capable of presenting the same information simultaneously
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems

Definitions

  • the present invention relates to a display control device, a display control method, and a program.
  • a presenter and viewers are present in a presentation performed in a lecture, a conference, or the like.
  • a presenter gives an explanation about presentation materials displayed by a display device, and viewers listen to the explanation by the presenter while seeing the presentation materials displayed by this display device.
  • in a case where an electronic file of the presentation materials is sent to a mobile terminal of the viewer, the viewer is capable of browsing a page other than the page presently displayed on a screen of the display device by using the mobile terminal. That is, although the page of the presentation materials displayed on the screen of the display device proceeds in accordance with progress of the presentation regardless of the intention of the viewer, the viewer can separately browse a desired page by using the mobile terminal.
  • Patent Literature 1 discloses a technique of enlarging or shrinking an image displayed in a VR space based on a motion of the head of a user wearing a head mounted display (HMD).
  • Patent Literature 1 Japanese Unexamined Patent Application Publication No. 2016-024751
  • in this manner, a material is displayed on a terminal at hand of an individual viewer separately from a display device for all viewers, and each viewer can thereby check a content other than the content presently displayed on the screen of the display device.
  • however, because the viewer has to move his/her sight line to the terminal at hand, the viewer cannot simultaneously see the content displayed on the display device, which is arranged far in front of the viewer and is provided for all of the viewers. That is, in order to check both the content displayed in accordance with progress of the presentation and the content to which the viewer desires to refer, the viewer has to repeatedly make large changes to his/her sight line. This is because the displayed positions of the respective materials are fixed with respect to the position of the display device and the position of the viewer.
  • Patent Literature 1 only discloses changing the size of an image displayed in a VR space and does not discuss browsing of plural materials in a presentation.
  • one of the objects to be achieved by the example embodiments disclosed in the present specification is to provide a display control device, a display control method, and a program that can improve the convenience of a viewer of a presentation in browsing materials.
  • a first aspect provides a display control device including:
  • first display control means for performing control so as to display a page of a first material, the page being displayed in a real space and the first material being a presentation material used in a presentation by a presenter, in a first position in a virtual reality space;
  • second display control means for performing control so as to display a second material in a second position which is different from the first position in the virtual reality space; and
  • operation input information acquisition means for acquiring first operation input information, which is operation input information that is input by a viewer experiencing the virtual reality space and that is about display of the second material,
  • wherein the second display control means changes a display manner of the second material based on the first operation input information.
  • a second aspect provides a display control method including:
  • a third aspect provides a program causing a computer to execute:
  • a display control device, a display control method, and a program can be provided which can improve convenience of a viewer of a presentation in browsing materials.
  • FIG. 1 is a block diagram illustrating one example of a configuration of a display control device according to an outline of an example embodiment.
  • FIG. 2 is a schematic diagram illustrating one example of a configuration of a presentation system according to the example embodiment.
  • FIG. 3 is a block diagram illustrating one example of a function configuration of a display control device according to a first example embodiment.
  • FIG. 4 is a diagram illustrating an example of display state data stored in a display state storage unit according to the first example embodiment.
  • FIG. 5 is a schematic diagram illustrating one example of a picture to be displayed in a head mounted display according to the first example embodiment.
  • FIG. 6 is a schematic diagram illustrating another example of the picture to be displayed in the head mounted display according to the first example embodiment.
  • FIG. 7 is a block diagram illustrating one example of a hardware configuration of a display control device according to an example embodiment.
  • FIG. 8 is a flowchart illustrating one example of an action of a reference material display control unit according to the first example embodiment.
  • FIG. 9 is a block diagram illustrating one example of a function configuration of a display control device according to a second example embodiment.
  • FIG. 10 is a diagram illustrating an example of display state data stored in a display state storage unit according to the second example embodiment.
  • FIG. 11 is a schematic diagram illustrating one example of a picture to be displayed in a head mounted display according to the second example embodiment.
  • FIG. 12 is a flowchart illustrating one example of an action of a reference material explication display control unit according to the second example embodiment.
  • FIG. 13 is a block diagram illustrating one example of a function configuration of a display control device according to a third example embodiment.
  • FIG. 14 is a diagram illustrating an example of comment data stored in a comment storage unit according to the third example embodiment.
  • FIG. 15 is a schematic diagram illustrating one example of a picture to be displayed in a head mounted display according to the third example embodiment.
  • FIG. 16 is a flowchart illustrating one example of actions of a reference material explication display control unit and a comment control unit according to the third example embodiment.
  • FIG. 1 is a block diagram illustrating one example of a configuration of a display control device 1 according to an outline of an example embodiment. As illustrated in FIG. 1 , the display control device 1 has a first display control unit 2 , a second display control unit 3 , and an operation input information acquisition unit 4 .
  • the first display control unit 2 performs control so as to display a page of a first material (a presentation material used in a presentation by a presenter), the page being displayed in a real space, in a first position (in other words, a first display region) in a virtual reality space.
  • the first display control unit 2 performs control so as to display a page of a presentation material, which is presently displayed on a predetermined display device, in a first position in a virtual reality space in a head mounted display worn by a viewer.
  • a display content of the predetermined display device that is, the page of the presentation material, which is displayed on the display device, changes in accordance with progress of the presentation by the presenter, for example, by an operation by the presenter or his/her assistant.
  • a display content in the first position in the virtual reality space is synchronized with the display content of the above predetermined display device and thus proceeds in accordance with the progress of the presentation regardless of an intention of the viewer.
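As a sketch of this synchronization, the first display position could be driven entirely by page-turn events coming from the real display device, with no viewer override. All class, method, and field names below are illustrative assumptions, not taken from the patent:

```python
class FirstDisplayControl:
    """Mirrors the page shown on the real-space display device into a
    fixed first position of the virtual reality space (a sketch)."""

    def __init__(self, first_position):
        self.first_position = first_position  # e.g. (x, y, z) in VR space
        self.current_page = None

    def on_display_device_update(self, page_number):
        # Called whenever the presenter (or assistant) turns the page on
        # the real display device; the viewer cannot override this.
        self.current_page = page_number
        return {"position": self.first_position, "page": page_number}
```

The viewer's intention never enters this path, matching the statement that the first-position content proceeds with the presentation regardless of the viewer.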
  • the second display control unit 3 performs control so as to display a second material in a second position (in other words, a second display region) in the same virtual reality space as the above-described virtual reality space.
  • the second position (second display region) is different from the first position (first display region).
  • the second position (second display region) is positioned next to the first position (first display region). That is, the second position (second display region) is positioned within a predefined distance from the first position (first display region).
  • the second material is, for example, the same material as the presentation material displayed in the first position, but may be a material other than the presentation material.
  • the operation input information acquisition unit 4 acquires operation input information which is input by a viewer experiencing the virtual reality space and which is about display of the above second material. For example, the operation input information acquisition unit 4 acquires it by receiving the operation input information transmitted from the head mounted display.
  • the second display control unit 3 changes a display manner of the second material based on this acquired operation input information.
  • the second display control unit 3 may change pages of the second material as display objects based on this operation input information. Further, the second display control unit 3 may change a display size of the second material based on this operation input information. Further, the second display control unit 3 may change a display position of the second material based on this operation input information.
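A minimal sketch of these three kinds of display-manner changes (page, size, and position) for the second material, assuming the operation input information arrives as a simple dictionary; the class and field names are hypothetical:

```python
class SecondDisplayControl:
    """Sketch of the second display control means: the viewer can change
    the page, display size, and display position of the second material."""

    def __init__(self, position, size=1.0, page=1):
        self.position = position   # (x, y, z) in the virtual reality space
        self.size = size           # display magnification
        self.page = page           # page currently shown

    def apply(self, op):
        # `op` stands in for acquired operation input information.
        kind = op["kind"]
        if kind == "page":
            self.page = op["page"]
        elif kind == "size":
            self.size = op["size"]
        elif kind == "move":
            self.position = op["position"]
```

Each branch corresponds to one of the changes named above: display pages, display size, and display position.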
  • the display control device 1 simultaneously displays a page to which the presenter presently refers for a presentation and a desired page of a material to which the viewer presently desires to refer in the virtual reality space. Furthermore, the viewer is capable of changing a display manner of the material to which the viewer presently desires to refer.
  • the display position of the material can be set regardless of a viewing position of the viewer in the real space.
  • the display control device 1 can improve convenience of a viewer of a presentation in browsing materials. In particular, in a case where the second position (second display region) is set adjacent to the first position (first display region), the viewer can easily browse both materials.
  • FIG. 2 is a schematic diagram illustrating one example of a configuration of a presentation system 10 according to an example embodiment.
  • the presentation system 10 includes a display control device 100, head mounted displays 200_1, 200_2, . . . , 200_N, a terminal device 300, and a display device 400.
  • N denotes an integer that is one or greater.
  • in a case where the head mounted displays 200_1, 200_2, . . . , 200_N are mentioned without being particularly distinguished, they will be referred to as the head mounted display 200.
  • FIG. 2 illustrates a presenter 500 as a person who performs a presentation, and viewers 510, 520_1, 520_2, . . . , 520_N.
  • the viewer 510 is a viewer who attends a presentation while seeing a material displayed on the display device 400 .
  • the viewers 520_1, 520_2, . . . , 520_N are viewers who wear the head mounted display 200 and attend the presentation in a virtual reality space projected by the head mounted display 200.
  • in a case where the viewers 520_1, 520_2, . . . , 520_N are mentioned without being particularly distinguished, they will be referred to as the viewer 520.
  • the presentation system 10 is a system used for presentations such as lectures and conferences, for example.
  • the display control device 100 is a control device which controls display on the head mounted display 200 and corresponds to the display control device 1 in FIG. 1 .
  • the display control device 100 is connected with the head mounted display 200 and the terminal device 300 in a wired or wireless manner so as to be capable of communication. Note that details of the display control device 100 will be described later.
  • the head mounted display 200 is a display device which is worn on the head of the viewer 520 and causes the viewer 520 to experience a virtual reality space.
  • an arbitrary head mounted display may be used which is capable of displaying a virtual reality space.
  • the head mounted display 200 includes a display to be arranged in front of the eyes of the viewer 520 and a motion sensor detecting a motion of the head mounted display 200 (a motion of the head of the viewer 520 ).
  • this motion sensor is an inertial measurement unit (IMU) including, for example, an acceleration sensor or a gyro sensor. Accordingly, motions such as an orientation and an inclination of the head mounted display 200 are detected.
  • a picture of the virtual reality space is generated by the display control device 100, and the picture output from the display control device 100 is displayed on the display of the head mounted display 200.
  • the head mounted display 200 includes an input interface accepting an operation input by the viewer 520 .
  • This input interface may be built in the head mounted display 200 or may be connected with the head mounted display 200 .
  • an arbitrary input interface may be used which is capable of accepting the operation input by the viewer 520 .
  • the input interface may be an eye-tracking sensor (sight line tracking sensor) or may be a motion sensor detecting a gesture such as a motion of a hand or a finger of the viewer 520 .
  • the input interface may be an operation stick, a keyboard, a mouse, or the like.
  • in this case, the motion of the head of the viewer 520 (specifically, for example, a motion of the head of the viewer 520 in an up-down direction, or the like) serves as the operation input, and the above-described motion sensor detecting the motion of the head mounted display 200 may be used as the input interface.
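A sketch of how such a motion-sensor reading might be turned into a named gesture that can serve as operation input information; the gesture names and the 15-degree threshold are illustrative assumptions, not values from the patent:

```python
def classify_head_motion(pitch_delta, yaw_delta, threshold=15.0):
    """Map orientation changes (in degrees) reported by the IMU-style
    motion sensor to a named head gesture, or None if too small."""
    if abs(pitch_delta) >= threshold and abs(pitch_delta) >= abs(yaw_delta):
        # Up-down motion of the head dominates.
        return "nod_down" if pitch_delta > 0 else "nod_up"
    if abs(yaw_delta) >= threshold:
        # Left-right swing of the head dominates.
        return "swing_right" if yaw_delta > 0 else "swing_left"
    return None  # below threshold: not treated as an operation input
```

A downstream unit could then translate a gesture such as "swing_right" into a page-change instruction.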
  • the terminal device 300 is connected with the display device 400 and the display control device 100 in a wired or wireless manner so as to be capable of communication.
  • the terminal device 300 is an information processing device for outputting a presentation material to the display device 400 .
  • the terminal device 300 outputs an image of an arbitrary page of a presentation material to the display device 400 . Furthermore, the terminal device 300 includes an input interface such as a mouse and outputs to the display device 400 an image of an appropriate page corresponding to progress of a presentation in accordance with an operation input by the presenter 500 or his/her assistant or the like.
  • the display device 400 is connected with the terminal device 300 in a wired or wireless manner so as to be capable of communication.
  • the display device 400 is a display which displays an image received from the terminal device 300 .
  • the display device 400 may be a flat panel display such as a liquid crystal display or may be a projector which projects an image on a screen.
  • FIG. 3 is a block diagram illustrating one example of a function configuration of the display control device 100 according to the first example embodiment.
  • the display control device 100 has a space display control unit 101 , a material storage unit 102 , a display state storage unit 103 , an operation input information acquisition unit 104 , a display content acquisition unit 105 , a presenter material display control unit 106 , a reference material display control unit 107 , and a picture output unit 108 .
  • the space display control unit 101 performs control so as to display a virtual reality space in the head mounted display 200 .
  • the space display control unit 101 may change display of the virtual reality space in accordance with the motion of the head mounted display 200 which is detected by the head mounted display 200 . Further, the space display control unit 101 may change the display of the virtual reality space in accordance with an operation input by the viewer 520 .
  • the space display control unit 101 may perform control so as to display a menu described later in the virtual reality space.
  • the menu is displayed in each of the head mounted displays 200 , for example. Further, the space display control unit 101 may control display of the menu in accordance with the operation input by the viewer 520 .
  • the space display control unit 101 may perform control so as to display an avatar of the presenter 500 .
  • the display of this avatar may be changed while reflecting a gesture of the presenter 500 .
  • an actual gesture of the presenter 500 is acquired by a motion capture device such as a camera or a motion sensor, and acquired gesture information is input to the display control device 100 .
  • the space display control unit 101 controls the display so as to move the avatar in accordance with the gesture information.
  • the material storage unit 102 in advance stores an electronic file of a material to be displayed in the virtual reality space.
  • This electronic file may be a presentation material (a material to be displayed on the display device 400 ) to be used in a presentation by the presenter 500 , for example, or may be another material which is different from that.
  • the display state storage unit 103 stores display state data. Details of the display state data will be described later.
  • the operation input information acquisition unit 104 acquires operation input information as information which indicates an operation content input via the input interface of the head mounted display 200 .
  • the operation input information acquisition unit 104 acquires it by receiving the operation input information output from the head mounted display 200, for example.
  • the display content acquisition unit 105 acquires a present display content in the display device 400 .
  • the display content acquisition unit 105 acquires the present display content in the display device 400 by receiving it from the terminal device 300.
  • the display content acquisition unit 105 acquires content specifying information as information for specifying a display content (for example, a file name, a page number of a page being displayed, and so forth).
  • the display content acquisition unit 105 may acquire image data that the terminal device 300 outputs to the display device 400 instead of the content specifying information.
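The content specifying information could be modeled as a small record built from a message received from the terminal device; the field names and message format below are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class ContentSpecifyingInfo:
    """Information specifying the display device's current content
    (e.g. a file name and the page number being displayed)."""
    file_name: str
    page_number: int

def acquire_display_content(message):
    # The display content acquisition unit receives a message from the
    # terminal device and parses it into content specifying information.
    return ContentSpecifyingInfo(message["file"], message["page"])
```

The presenter material display control unit can then look up the named file in the material storage unit and render the specified page.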
  • the presenter material display control unit 106 corresponds to the first display control unit 2 in FIG. 1 .
  • the presenter material display control unit 106 performs control so as to display the present display content in the display device 400 in a predetermined coordinate position (the above-described first position) in the virtual reality space in accordance with information acquired by the display content acquisition unit 105 .
  • the presenter material display control unit 106 acquires the presentation material specified by the content specifying information from the material storage unit 102 and performs control so as to display the image of the page specified by the content specifying information in the virtual reality space.
  • in a case where the display content acquisition unit 105 acquires image data instead, the presenter material display control unit 106 performs control so as to display the image in the virtual reality space.
  • the reference material display control unit 107 corresponds to the second display control unit 3 in FIG. 1 .
  • the reference material display control unit 107 displays a material stored in the material storage unit 102 in the virtual reality space. Note that a material displayed by the reference material display control unit 107 will be referred to as the reference material in the following description.
  • the reference material corresponds to the above-described second material.
  • an initial display position (the above-described second position) of the reference material is a predetermined position adjacent to the presentation material displayed in the virtual reality space by the presenter material display control unit 106 .
  • the display position of the reference material can be changed by an operation by the viewer 520 .
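The initial adjacent placement might be computed from the first position and the presentation material's width; the layout rule (placing the reference material immediately to the right) and the parameter values below are illustrative assumptions:

```python
def initial_reference_position(first_pos, first_width, gap=0.5):
    """Place the reference material immediately to the right of the
    presentation material in the virtual reality space (a sketch)."""
    x, y, z = first_pos
    # Offset by the presentation material's width plus a small gap,
    # keeping the same height and depth.
    return (x + first_width + gap, y, z)
```

After this initial placement, the position would be overwritten whenever the viewer issues a move operation.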
  • the reference material display control unit 107 may display the presentation material or may display a different material from the presentation material.
  • in a case where the reference material is the presentation material, the viewer 520 can browse a desired page of the presentation material by an operation.
  • in a case where the reference material is a material other than the presentation material, the viewer 520 can browse a desired page of that material by an operation.
  • in either case, the viewer 520 can refer to a desired content regardless of the page to which the presenter 500 presently refers.
  • the reference material display control unit 107 acquires a material which is a display object as the reference material from the material storage unit 102 and performs control so as to display it in the virtual reality space.
  • the reference material display control unit 107 may acquire a predefined material other than the presentation material from the material storage unit 102, or may acquire a material selected by an operation by the viewer 520 from the material storage unit 102.
  • in the latter case, the reference material display control unit 107 selects a material based on operation input information indicating an operation input for selecting the reference material.
  • the reference material display control unit 107 changes a display manner of the reference material based on the operation input information about display of the reference material, which is acquired by the operation input information acquisition unit 104 .
  • in a case where the operation input information about display of the reference material is operation input information which gives an instruction to change the page to be displayed, the reference material display control unit 107 performs control so as to change the page of the reference material displayed in the head mounted display 200 in accordance with the operation input information. Accordingly, the viewer 520 wearing the head mounted display 200 can freely change the page to be browsed.
  • in a case where the operation input information about display of the reference material is operation input information which gives an instruction to change the display size of the reference material, the reference material display control unit 107 performs control so as to change the display size of the reference material displayed in the head mounted display 200 in accordance with the operation input information. Accordingly, the viewer 520 wearing the head mounted display 200 can freely change the display size of the reference material.
  • in a case where the operation input information about display of the reference material is operation input information which gives an instruction to change the display position of the reference material, the reference material display control unit 107 performs control so as to change the display position of the reference material displayed in the head mounted display 200 in accordance with the operation input information. Accordingly, the viewer 520 wearing the head mounted display 200 can freely change the display position of the reference material.
  • the operation input information is information corresponding to the operation input which is input by the viewer 520 via the input interface of the head mounted display 200 .
  • the operation input may be made by the sight line of the viewer 520 , may be made by a gesture by the head, a hand, or the like, and may be an operation by an operation stick, a keyboard, a mouse, or the like.
  • for example, the displayed page of the reference material may be changed in accordance with a motion of the head (or hand) of the viewer 520. That is, in a case where the operation input information acquisition unit 104 acquires operation input information indicating that the viewer 520 has swung the head (or hand) in a predetermined direction, the reference material display control unit 107 may move the presently displayed page of the reference material forward (or backward) by one page.
  • further, an operation content indicated by a menu item which corresponds to a pointed position detected by a pointing device such as a sight line tracking sensor, an operation stick, or a mouse, or an operation content indicated by a menu item which is designated by a keyboard, may be acquired as the operation input information.
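Menu selection by a pointed position could be sketched as a simple 2D hit test against the menu items displayed in the virtual reality space; the menu-item representation below is a hypothetical assumption:

```python
def menu_item_at(pointed, items):
    """Return the operation content of the menu item whose rectangle
    contains the pointed position, or None if nothing is hit."""
    px, py = pointed
    for item in items:
        x0, y0, x1, y1 = item["rect"]  # menu item bounds in menu coordinates
        if x0 <= px <= x1 and y0 <= py <= y1:
            return item["operation"]
    return None
```

The returned operation content (e.g. a page-change instruction) would then be handed to the operation input information acquisition unit.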
  • the operation input information acquisition unit 104 may acquire information indicating a predetermined operation, which is input by the viewer 520 , by an arbitrary method.
  • a display manner to be changed by the operation input information is not limited to the above-described changes in the display manner.
  • the reference material display control unit 107 may replicate display of the reference material in accordance with the operation input information.
  • the reference material display control unit 107 manages the display state data indicating a display state of the reference material in the virtual reality space and thereby realizes display corresponding to the operation input information.
  • FIG. 4 is a diagram illustrating an example of the display state data stored in the display state storage unit 103 .
  • the display state data are data representing the display state of the reference material which is presently displayed on each of the head mounted displays 200 .
  • the display state data include a state ID, a material ID, a user ID, coordinates, a magnification, and a page number.
  • the state ID is an identifier which identifies the display state data.
  • the material ID is an identifier which identifies the reference material being displayed, and the material ID specifies the reference material which is presently displayed.
  • the user ID is an identifier that identifies the head mounted display 200 (that is, the viewer 520 ) for which display is performed.
  • Coordinate data configuring the display state data indicate the display position of the reference material and are formed with first coordinates indicating the coordinates of the left upper end portion of the reference material in the virtual reality space and second coordinates indicating the coordinates of the right lower end portion. Each set of coordinates is expressed as (x coordinate, y coordinate, z coordinate) in the virtual reality space, which is a three-dimensional space.
  • the magnification indicates a present display magnification of the reference material.
  • the page number indicates which page of the reference material is displayed.
  • the reference material display control unit 107 specifies the page to be displayed anew based on the present page number indicated by the display state data. Then, when the display manner is changed, the reference material display control unit 107 updates the page number in the display state data.
  • the reference material display control unit 107 calculates a new magnification and a display position based on the present magnification and display position which are indicated by the display state data. Then, when the display manner is changed, the reference material display control unit 107 updates those pieces of information in the display state data.
  • the reference material display control unit 107 calculates a new display position based on the present display position indicated by the display state data. Then, when the display manner is changed, the reference material display control unit 107 updates the display position in the display state data.
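  • As an illustration of the display state data of FIG. 4 and of the updates described above, the following sketch models one display state record; the class name, field names, and update helpers are hypothetical, chosen only to mirror the fields described here (state ID, material ID, user ID, coordinates, magnification, and page number).

```python
from dataclasses import dataclass

@dataclass
class DisplayState:
    """One record of the display state data (FIG. 4): which reference
    material is shown, for which viewer, where, how large, which page."""
    state_id: str       # identifier of this display state record
    material_id: str    # identifies the reference material being displayed
    user_id: str        # identifies the head mounted display / viewer
    first: tuple        # (x, y, z) of the left upper end in the VR space
    second: tuple       # (x, y, z) of the right lower end in the VR space
    magnification: float
    page: int

    def turn_page(self, forward: bool = True) -> None:
        # The new page is computed from the present page number, and the
        # record is then updated, as described for the display control unit.
        self.page += 1 if forward else -1

    def zoom(self, factor: float) -> None:
        # The new magnification is derived from the present magnification.
        self.magnification *= factor

state = DisplayState("S0001", "D0001", "U0001",
                     (0.0, 2.0, 1.0), (1.0, 1.0, 1.0), 1.0, 3)
state.turn_page()
print(state.page)  # page moved forward by one
```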
  • the picture output unit 108 outputs a picture to the head mounted display 200 , the picture conforming to control by the space display control unit 101 , the presenter material display control unit 106 , and the reference material display control unit 107 .
  • Picture data transmitted from the picture output unit 108 are received by the head mounted display 200 and are displayed to the viewer 520 in the head mounted display 200 .
  • FIG. 5 is a schematic diagram illustrating one example of a picture to be displayed in the head mounted display 200 .
  • display 210 denotes the presentation material which is displayed by control by the presenter material display control unit 106 . That is, the display content of the display 210 is synchronized with the display content of the display device 400 . In other words, pages of the display 210 are changed in accordance with the progress of the presentation, regardless of an intention of the viewer 520 .
  • display 211 denotes the reference material (for example, a material with a material ID of D0001) which is displayed by control by the reference material display control unit 107 .
  • the display 211 represents an arbitrary page of the presentation material which is displayed as the display 210 , for example.
  • the initial display position of the reference material (the initial display position of the display 211 ) is a position adjacent to the display 210 .
  • Because the reference material display control unit 107 uses such a position as the initial display position, the viewer 520 can easily browse both the presentation material to which the presenter 500 is referring and a material to which the viewer 520 him/herself desires to refer.
  • display 212 is displayed in the virtual reality space.
  • the display 212 denotes a menu for the operation input about display of the reference material.
  • the viewer 520 designates a menu item 212 A with a pointing device, then designates an item in a submenu (not illustrated) which is thereafter displayed, and can thereby select the reference material to be displayed in the virtual reality space.
  • the viewer 520 designates a menu item 212 B or a menu item 212 C by the pointing device, and pages of the display 211 are thereby switched.
  • the viewer 520 designates a menu item 212 D by the pointing device, and the display position of the display 211 in the virtual reality space is thereby changed.
  • the reference material display control unit 107 may change the display position to a predetermined position or to a position designated by the pointing device.
  • the viewer 520 designates a menu item 212 E or a menu item 212 F by the pointing device, and the size of the display 211 is thereby changed.
  • the changed size may be a predetermined size or may be designated by the operation input.
  • the display of the menu which is illustrated in FIG. 5 is one example, and a menu with other contents may be displayed. Further, in a case where the operation input is performed by a gesture, for example, display of a menu may be omitted.
  • FIG. 6 is a schematic diagram illustrating another example of the picture to be displayed in the head mounted display 200 .
  • In FIG. 5, the display 211 displays an image of one page of the reference material.
  • In FIG. 6, by contrast, the display 211 displays an image of plural pages of the reference material.
  • the reference material display control unit 107 may simultaneously display plural pages of the reference material.
  • FIG. 7 is a block diagram illustrating one example of the hardware configuration of the display control device 100 .
  • the display control device 100 is configured as a server having a network interface 150 , a memory 151 , and a processor 152 , for example.
  • the network interface 150 is used for performing communication with the head mounted display 200 and the terminal device 300 .
  • the network interface 150 may include a network interface card (NIC), for example.
  • the memory 151 is configured with a combination of a volatile memory and a non-volatile memory, for example.
  • the memory 151 is used for storing software (computer programs) or the like including one or more instructions to be executed by the processor 152 .
  • Non-transitory computer-readable media include various types of tangible recording media (tangible storage media).
  • Examples of non-transitory computer readable media include magnetic recording media (for example, a flexible disk, a magnetic tape, and a hard disk drive), magneto-optical recording media (for example, a magneto-optical disk), a compact disc read-only memory (CD-ROM), a CD-R, a CD-R/W, and semiconductor memories (for example, a mask ROM, a programmable ROM (PROM), an erasable PROM (EPROM), a flash ROM, and a random-access memory (RAM)).
  • programs may be supplied to a computer by various types of transitory computer-readable media.
  • Examples of transitory computer-readable media include an electric signal, an optical signal, and an electromagnetic wave.
  • a transitory computer readable medium can supply a program to a computer via a wired communication path, such as an electric wire or an optical fiber, or via a wireless communication path.
  • the processor 152 may be a microprocessor, an MPU (microprocessor unit), a CPU (central processing unit), or the like, for example.
  • the processor 152 may include plural processors.
  • the processor 152 reads out and executes programs from the memory 151 and thereby performs processes of the space display control unit 101 , the operation input information acquisition unit 104 , the display content acquisition unit 105 , the presenter material display control unit 106 , the reference material display control unit 107 , and the picture output unit 108 .
  • the material storage unit 102 and the display state storage unit 103 of the display control device 100 are realized by the memory 151 or a storage device (not illustrated).
  • the display control device 100 functions as a computer.
  • the head mounted display 200 , the terminal device 300 , and the display device 400 may also have a hardware configuration as illustrated in FIG. 7 . That is, each of them may function as a computer, and its processor may execute various processes by executing programs.
  • FIG. 8 is a flowchart illustrating one example of an action of the reference material display control unit 107 .
  • FIG. 8 illustrates display control of the reference material for the head mounted display 200 worn by any one of the viewers 520 (here, referred to as user X). Note that the display control of the reference material is similarly performed for the head mounted displays 200 worn by the viewers 520 other than the user X. In the following, one example of the action of the reference material display control unit 107 will be described with reference to FIG. 8 .
  • In step S100, the reference material display control unit 107 checks whether or not the operation input information acquisition unit 104 has acquired the operation input information about the operation input from the user X. Note that the head mounted displays 200 and the viewers 520 wearing them are associated with each other in advance and managed in the display control device 100 . When the operation input information about the operation input from the user X is acquired (Yes in step S100), the process moves to step S101.
  • In step S101, the reference material display control unit 107 determines whether or not the acquired operation input information is the operation input information for selecting the reference material as the display object.
  • In step S102, the reference material display control unit 107 displays the designated material as the reference material. Note that plural reference materials may be displayed.
  • In a case where the acquired operation input information is not the operation input information for selecting the reference material as the display object (No in step S101), the process moves to step S103.
  • In step S103, the reference material display control unit 107 determines whether or not the acquired operation input information is the operation input information for giving an instruction to change the object pages to be displayed.
  • In step S104, the reference material display control unit 107 displays the page of the reference material which corresponds to the operation.
  • In a case where the acquired operation input information is not the operation input information for giving an instruction to change the object pages to be displayed (No in step S103), the process moves to step S105.
  • In step S105, the reference material display control unit 107 determines whether or not the acquired operation input information is the operation input information for giving an instruction to change the display position of the reference material.
  • In step S106, the reference material display control unit 107 displays the reference material in the position which corresponds to the operation.
  • In a case where the acquired operation input information is not the operation input information for giving an instruction to change the display position of the reference material (No in step S105), the process moves to step S107.
  • In step S107, the reference material display control unit 107 determines whether or not the acquired operation input information is the operation input information for giving an instruction to change the display size of the reference material.
  • In step S108, the reference material display control unit 107 displays the reference material at the magnification which corresponds to the operation.
  • In a case of No in step S107, the process returns to step S100.
  • the control illustrated in FIG. 8 is merely one example. Consequently, for example, the control may be performed in a processing order different from the processing order illustrated in FIG. 8 . Further, in the flowchart illustrated in FIG. 8 , changes of the displayed pages, of the display position, and of the display size are all possible as changes in the display manner of the reference material, but only one kind of display manner among those may be changeable. Further, the display manners of an arbitrary combination of two kinds may be changeable.
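  • The branching of FIG. 8 can be sketched as a dispatch over the kind of acquired operation input information; the dictionary keys and the `handle_operation_input` function below are hypothetical stand-ins for the determinations of steps S101, S103, S105, and S107, not part of the patent.

```python
def handle_operation_input(op, state):
    """One pass of the FIG. 8 loop: examine the kind of operation input
    information and apply the corresponding display change (steps S101-S108)."""
    kind = op.get("kind")
    if kind == "select_material":      # step S101 -> S102
        state["material_id"] = op["material_id"]
    elif kind == "change_page":        # step S103 -> S104
        state["page"] = op["page"]
    elif kind == "change_position":    # step S105 -> S106
        state["coordinates"] = op["coordinates"]
    elif kind == "change_size":        # step S107 -> S108
        state["magnification"] = op["magnification"]
    # otherwise (No in step S107): nothing changes; return to step S100
    return state

state = {"material_id": None, "page": 1,
         "coordinates": ((0, 0, 0), (1, 1, 1)), "magnification": 1.0}
handle_operation_input({"kind": "change_page", "page": 5}, state)
print(state["page"])  # the displayed page after the operation
```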
  • the presentation material and the reference material are simultaneously displayed in the virtual reality space independently of a viewing position of a viewer in the real space. Furthermore, the viewer 520 is capable of freely changing the display manner of the reference material. Thus, the display control device 100 can improve convenience of a viewer of a presentation in browsing materials.
  • the second example embodiment is different from the first example embodiment in the point that explication information explicating a content described in a material is further displayed in a virtual reality space.
  • FIG. 9 is a block diagram illustrating one example of a function configuration of a display control device 120 according to the second example embodiment.
  • the display control device 120 is different from the above-described display control device 100 in the point that the reference material display control unit 107 is replaced by a reference material explication display control unit 121 .
  • the picture output unit 108 outputs a picture to the head mounted display 200 , the picture conforming to control by the space display control unit 101 , the presenter material display control unit 106 , and the reference material explication display control unit 121 .
  • the display control device 120 also has the hardware configuration illustrated in FIG. 7 ; the processor 152 executes a program read out from the memory 151 , and a process of the reference material explication display control unit 121 is thereby realized.
  • configurations and actions common to the display control device 100 will not be described again; only the points different from the display control device 100 will be described.
  • the operation input information acquisition unit 104 in the present example embodiment not only acquires the operation input information about display of the reference material but also acquires the operation input information which is input by the viewer 520 experiencing the virtual reality space and designates an explication object in a content of the reference material.
  • the viewer 520 performs the operation input for designating a part, an explication of which he/she desires to see, in the reference material displayed in the virtual reality space.
  • the reference material explication display control unit 121 is different from the reference material display control unit 107 in the first example embodiment in the point that it further performs display control of the explication information in addition to display control of the reference material.
  • the reference material explication display control unit 121 displays the explication information as information explicating the explication object in the virtual reality space based on the operation input information designating the explication object in a content of the reference material.
  • the explication information is stored in the material storage unit 102 , for example. Each piece of the explication information is stored in the material storage unit 102 while being associated with the explication object in the reference material.
  • the reference material explication display control unit 121 reads out the explication information, which is associated with the explication object designated by the operation input information, from the material storage unit 102 and displays it in the virtual reality space.
  • the reference material explication display control unit 121 performs control so as to display the explication information in a position in the vicinity of the explication object, for example. Further, the reference material explication display control unit 121 may display the explication information by connecting the explication object and the explication information by a line, for example, such that the correspondence relationship between those becomes clear.
  • the display control device 120 can display the explication information at various explication levels.
  • an explication level means a level which indicates details of an explanation content indicated by the explication information or easiness of terms indicated by the explication information.
  • the operation input information acquisition unit 104 can also acquire the operation input information which is input by the viewer experiencing the virtual reality space and designates the explication level.
  • the reference material explication display control unit 121 performs control so as to display the explication information at a predetermined level when the operation input information designating the explication object is acquired, for example. Then, when the operation input information designating the explication level is acquired, the reference material explication display control unit 121 performs switching to display of the explication information at the level designated by the operation input information. In this case, one explication object is stored in the material storage unit 102 while being linked with plural pieces of explication information at different explication levels.
  • the reference material explication display control unit 121 manages the display state data, which indicate a display state of the reference material in the virtual reality space, and a display state of the explication information and thereby realizes display corresponding to the operation input information.
  • FIG. 10 is a diagram illustrating an example of display state data stored in the display state storage unit 103 according to the second example embodiment.
  • display state data with a state ID of S0001 and display state data with a state ID of S0002 are specific examples of the display state data indicating the display state of the reference material.
  • display state data with a state ID of S0008 are a specific example of the display state data indicating the display state of the explication information (material ID: D0001-a007).
  • the explication information with a material ID of D0001-a007 is the explication information about a reference material with a material ID of D0001, for example.
  • the display state data indicating the display state of the explication information include not only the state ID, the material ID, the user ID, the coordinates, the magnification, and the page number but also a level.
  • a level indicates the explication level of the explication information which is presently displayed.
  • the reference material explication display control unit 121 specifies the explication information to be displayed anew based on the present explication level indicated by the display state data. Then, when the display manner is changed, the reference material explication display control unit 121 updates the level in the display state data.
  • the display manner of the explication information may also be changed in accordance with the operation input information about the explication information, which is acquired by the operation input information acquisition unit 104 .
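  • A minimal sketch of the level-switching behavior described above, with a dictionary standing in for the material storage unit 102 , in which one explication object is linked with plural pieces of explication information at different explication levels; the object IDs, level numbers, and texts are all hypothetical.

```python
# Hypothetical in-memory stand-in for the material storage unit: each
# explication object is linked with explication information at plural levels.
EXPLICATIONS = {
    "D0001-a007": {
        1: "Short gloss in plain terms.",
        2: "More detailed explanation using technical terms.",
    },
}

DEFAULT_LEVEL = 1  # the predetermined level shown when none is designated

def show_explication(object_id, level=None):
    """Return the explication text to display: the predetermined level when
    no level is designated, otherwise the designated explication level."""
    levels = EXPLICATIONS[object_id]
    return levels[level if level is not None else DEFAULT_LEVEL]

print(show_explication("D0001-a007"))     # initial display at the default level
print(show_explication("D0001-a007", 2))  # after a level-switch operation input
```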
  • FIG. 11 is a schematic diagram illustrating one example of a picture to be displayed in the head mounted display 200 according to the second example embodiment.
  • display 210 denotes the presentation material which is displayed by control by the presenter material display control unit 106 .
  • display 211 denotes the reference material which is displayed by control by the reference material explication display control unit 121 .
  • display 220 denotes the explication information which is displayed by control by the reference material explication display control unit 121 . More specifically, the display 220 represents the explication information about an explication object 221 designated by the operation input.
  • the display position of the display 220 is calculated by the reference material explication display control unit 121 based on the display position indicated by the display state data about the reference material, for example. For example, the reference material explication display control unit 121 calculates a position, which does not overlap with other display (for example, the display 210 ) but is adjacent to the explication object, as the display position of the explication information. Further, the reference material explication display control unit 121 may calculate a position adjacent to the explication object as the display position of the explication information.
  • display 222 is displayed in the virtual reality space.
  • the display 222 denotes a menu for the operation input about display of the reference material and display of the explication information.
  • the viewer 520 designates a menu item 212 G with a pointing device and thereafter designates, with the pointing device, the position in which the explication object is displayed, that is, the position of the description whose explication is desired; the explication object is thereby designated.
  • a swipe action on the display 220 or the like by the viewer 520 is performed via a pointing device or the like, and the operation input for designating (changing) the explication level is thereby performed. Accordingly, the operation input information acquisition unit 104 acquires the operation input information designating (changing) the explication level.
  • the reference material explication display control unit 121 may display a predetermined icon around an object for which the explication information is capable of being provided. Then, this predetermined icon is designated by a pointing device, and the explication object may thereby be designated.
  • the predetermined icon is an icon clearly indicating that explication is capable of being referred to, for example.
  • the icon may be an icon imitating a question mark or an icon imitating a magnifying glass, but is not limited to those.
  • the operation input for designating the explication level may be performed by designating a menu item for designating the level by a pointing device.
  • the method of the operation input is not limited to any particular method.
  • FIG. 12 is a flowchart illustrating one example of an action of the reference material explication display control unit 121 .
  • FIG. 12 illustrates display control for the head mounted display 200 worn by the user X.
  • the display control is similarly performed for the head mounted displays 200 worn by the viewers 520 other than the user X.
  • FIG. 12 is different from the flowchart illustrated in FIG. 8 in the point that control about display of the explication information is added. Specifically, the flowchart illustrated in FIG. 12 is different from the flowchart illustrated in FIG. 8 in the point that steps S109 to S112 are added.
  • a description will be made about one example of the action of the reference material explication display control unit 121 along FIG. 12 . However, descriptions common to the descriptions about FIG. 8 will not be repeated.
  • In the flowchart illustrated in FIG. 8 , in a case of No in step S107, the process returns to step S100; however, in the flowchart illustrated in FIG. 12 , the process moves to step S109.
  • In step S109, the reference material explication display control unit 121 determines whether or not the operation input information acquired by the operation input information acquisition unit 104 is the operation input information designating the explication object in a content of the reference material.
  • In step S110, the reference material explication display control unit 121 displays the explication information corresponding to the designated explication object.
  • In a case where the acquired operation input information is not the operation input information designating the explication object in the content of the reference material (No in step S109), the process moves to step S111.
  • In step S111, the reference material explication display control unit 121 determines whether or not the operation input information acquired by the operation input information acquisition unit 104 is the operation input information designating the explication level.
  • In step S112, the reference material explication display control unit 121 switches the display to the explication information at the designated level.
  • In a case of No in step S111, the process returns to step S100.
  • control illustrated in FIG. 12 is merely one example. Consequently, for example, the control may be performed in different processing order from the processing order illustrated in FIG. 12 .
  • the explication information may be acquired from an external server (search server) via a network such as the Internet. That is, the reference material explication display control unit 121 may transmit a term as the explication object to an external server and acquire a search result by a search engine provided by this server as the explication information.
  • the viewer 520 can browse not only the reference material but also the explication information. Further, because those are displayed in the virtual reality space, in particular, the following advantages are provided.
  • When the reference material and the explication information are displayed in the real space on the display of a mobile terminal, plural windows have to be displayed within the size of the display of the mobile terminal. Thus, a measure is needed, such as displaying one window overlapping another or displaying the windows in small sizes such that they do not interfere with each other. This may result in display which is difficult to see.
  • By contrast, in the present example embodiment, the presentation material, the reference material, and the explication information are displayed in the virtual reality space.
  • the third example embodiment is different from the above-described example embodiments in the point that a comment on a material can be managed.
  • FIG. 13 is a block diagram illustrating one example of a function configuration of a display control device 130 according to the third example embodiment.
  • the display control device 130 is different from the above-described display control device 120 in the point that a comment control unit 131 and a comment storage unit 132 are added.
  • the picture output unit 108 in the present example embodiment outputs a picture to the head mounted display 200 , the picture conforming to control by the space display control unit 101 , the presenter material display control unit 106 , the reference material explication display control unit 121 , and the comment control unit 131 .
  • the display control device 130 has the display function of the explication information described in the second example embodiment, but it may be configured without that display function.
  • the display control device 130 also includes the hardware configuration illustrated in FIG. 7 , the processor 152 executes a program read out from the memory 151 , and a process of the comment control unit 131 is thereby realized. Further, the comment storage unit 132 is realized by the memory 151 or a storage device (not illustrated).
  • the operation input information acquisition unit 104 in the present example embodiment acquires not only the above-described operation input information but also the operation input information about a comment input by the viewer 520 experiencing the virtual reality space.
  • the comment may be a comment on the material set as the display object by the presenter material display control unit 106 (presentation material) or may be a comment on the reference material set as the display object by the reference material explication display control unit 121 .
  • the operation input information about the comment includes a position of a comment object in a material and a comment content.
  • the viewer 520 performs the operation inputs for designating a part, on which he/she desires to comment, in a material displayed in the virtual reality space and for inputting a comment.
  • the comment control unit 131 manages the position of the comment object and the comment content which are included in the operation input information about the comment while associating those together. Specifically, the comment control unit 131 stores comment data as data in which the position of the comment object and the comment content are associated together in the comment storage unit 132 and thereby manages the comment. As described above, in the present example embodiment, the viewer 520 can provide a comment on a material. Further, the content of the provided comment is managed while being associated with the position of the comment object. Consequently, it is possible to easily specify which content in the material the provided comment is about.
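  • A minimal sketch of how the comment control unit 131 might associate the position of the comment object with the comment content, with an in-memory list standing in for the comment storage unit 132 ; the record fields mirror the comment data described in this embodiment, and all names and sample values are hypothetical.

```python
import itertools

_ids = itertools.count(1)
COMMENTS = []  # in-memory stand-in for the comment storage unit 132

def store_comment(material_id, user_id, page, first, second, content):
    """Store comment data in which the position of the comment object
    (page number plus relative coordinates) is associated with the
    comment content and the inputting viewer."""
    record = {
        "comment_id": f"C{next(_ids):04d}",
        "material_id": material_id,
        "user_id": user_id,
        "page": page,
        "first": first,    # relative coordinates of the left upper end
        "second": second,  # relative coordinates of the right lower end
        "content": content,
    }
    COMMENTS.append(record)
    return record

rec = store_comment("D0001", "U0003", 2,
                    (0.1, 0.2, 0.0), (0.4, 0.3, 0.0),
                    "This definition seems to differ from page 1.")
print(rec["comment_id"])  # identifier assigned to the stored comment
```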
  • the comment control unit 131 performs control so as to display the comment content corresponding to the comment object in the virtual reality space while associating the comment content with the position of the comment object in the material.
  • the viewer 520 can also browse the comment corresponding to the material which is being displayed.
  • the comment control unit 131 may disclose the comments on a certain material to all of the viewers 520 ; however, in the present example embodiment, the comment control unit 131 discloses comments by the other viewers 520 only to the viewers 520 who satisfy a predetermined condition.
  • the display control device 130 is configured in the following manner.
  • The operation input information about the comment which is acquired by the operation input information acquisition unit 104 more specifically includes, in addition to the position of the comment object in the material and the comment content, identification information which identifies the viewer 520 who inputs the comment. The comment control unit 131 manages this identification information, the position of the comment object, and the comment content while associating them with one another. Furthermore, based on the identification information associated with the comment content, the comment control unit 131 controls whether or not the comment content is displayed when the material is displayed in the virtual reality space.
  • The comment control unit 131 performs control so as to display the content of a comment on the head mounted display 200 of a viewer 520 whose identification information is associated in advance with the identification information of the inputting person of the comment.
  • Conversely, the comment control unit 131 performs control so as not to display the content of the comment on the head mounted display 200 of a viewer 520 whose identification information is not associated in advance with the identification information of the inputting person of the comment.
  • For example, the comment control unit 131 performs control such that the comment can be shared by the viewers belonging to the same group as the inputting person, or to a group with higher authority than that group.
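The group-based sharing rule described above can be sketched as follows. This is a hypothetical illustration only: it assumes that group authority levels can be totally ordered as integers, and the group names and the ordering are not taken from the source.

```python
# Assumed mapping from group ID to an authority level; the names and the
# ordering are illustrative, not part of the patent text.
AUTHORITY = {"general": 0, "manager": 1, "officer": 2}

def is_comment_visible(viewer_group: str, author_group: str) -> bool:
    """A comment is shared with viewers in the same group as its author,
    or in a group with higher authority than the author's group."""
    return AUTHORITY[viewer_group] >= AUTHORITY[author_group]
```

Under this assumed ordering, a "manager" would see comments input by "general" viewers, but not the other way around.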
  • FIG. 14 is a diagram illustrating an example of comment data stored in the comment storage unit 132 .
  • The comment data include a comment ID, a material ID, a user ID, a group ID, a page number, coordinates, and the comment content.
  • The comment ID is an identifier which identifies the comment data.
  • The material ID is an identifier which identifies the material that is the object of the comment.
  • The user ID is an identifier which identifies the viewer 520 who input the comment.
  • The group ID is an identifier which identifies the group to which the user ID belongs, that is, the group to which the viewer 520 who input the comment belongs.
  • The page number indicates which page of the material the comment is about.
  • The coordinate data in the comment data indicate the position of the comment object on the page of the material and consist of first coordinates, which indicate the position of the upper-left end portion of the comment object, and second coordinates, which indicate the position of the lower-right end portion of the comment object.
  • The first coordinates and second coordinates in the comment data indicate relative positions in a case where the position of the upper-left end portion of the page is set as (0, 0, 0) and the position of the lower-right end portion of the page is set as (1, 1, 1). Consequently, the value of each axis of the first coordinates and second coordinates is in the range of 0 to 1 inclusive.
  • The comment content indicates the content of the input comment.
  • The comment control unit 131 generates the comment data illustrated in FIG. 14 based on the operation input information about the comment, which is acquired by the operation input information acquisition unit 104, and stores the comment data in the comment storage unit 132. Note that it is assumed that the correspondence relationship between the user ID and the group ID is managed in advance by a configuration, which is not illustrated, in the display control device 130. Note also that the material ID configuring the comment data may be acquired as the operation input information about the comment or may be acquired by referring to the display state data stored in the display state storage unit 103.
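The comment data of FIG. 14 and the coordinate normalization described above can be sketched roughly as follows. The field names and the three-axis page coordinate system are assumptions for illustration, not definitions taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class CommentData:
    """One record of the comment storage unit 132, mirroring FIG. 14."""
    comment_id: str
    material_id: str
    user_id: str
    group_id: str
    page_number: int
    first_coords: tuple   # upper-left end of the comment object, normalized
    second_coords: tuple  # lower-right end of the comment object, normalized
    content: str

def normalize(point, page_origin, page_extent):
    """Map an absolute page position to the (0, 0, 0)-(1, 1, 1) range, so
    that every axis value falls between 0 and 1 inclusive."""
    return tuple((p - o) / e for p, o, e in zip(point, page_origin, page_extent))
```

For example, a point halfway across a page whose extent is (10, 4, 1) maps to an axis value of 0.5 on that axis.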
  • The comment control unit 131 displays the comment content included in the comment data on the head mounted display 200 which is displaying the material of the material ID linked with the comment data.
  • In doing so, the comment control unit 131 controls whether or not the comment is displayed in accordance with the identification information linked with the comment data.
  • The comment control unit 131 may display not only the comment content but also the identification information of the inputting person of the comment. Further, in a case where the display control device 130 manages the identification information and a user name while linking them together in advance, the comment control unit 131 may display the comment content together with the name of the inputting person of the comment.
  • FIG. 15 is a schematic diagram illustrating one example of a picture to be displayed in the head mounted display 200 according to the third example embodiment.
  • The display 210 denotes the presentation material which is displayed under control by the presenter material display control unit 106.
  • The display 211 denotes the reference material which is displayed under control by the reference material explication display control unit 121.
  • The display 230 denotes the comment which is displayed under control by the comment control unit 131. Note that the display position of the display 230 is calculated by the comment control unit 131 based on, for example, the display position indicated by the display state data about the material that is the object of the comment. For example, the comment control unit 131 calculates a position adjacent to the comment object as the display position of the comment.
  • Further, the comment control unit 131 calculates a position which does not overlap with other displays (for example, the display 210) as the display position of the comment.
  • The comment control unit 131 may display the comment with a line connecting the comment object and the comment, for example, such that the correspondence relationship between them becomes clear.
  • The display 231 is displayed in the virtual reality space.
  • The display 231 denotes a menu for the operation input about the reference material, the explication information, and the comment.
  • The viewer 520 designates a menu item 212H with a pointing device, then designates a submenu (not illustrated) which is displayed thereafter, and can thereby designate the position of the comment object and input the comment.
  • Designation of the position of the comment object is performed by using a pointing device, for example.
  • An input of the comment is performed by using a keyboard, for example.
  • A comment to be input does not have to be a free character string but may be selected from choices of comments which are defined in advance. In this case, the comment content is input by selecting a predetermined choice with a pointing device.
  • The display 232 and the display 233 are displayed in the virtual reality space.
  • The display 232 and the display 233 are displayed under control by the comment control unit 131.
  • The display 232 and the display 233 are icons by which the viewer 520 selects whether or not the comment is displayed.
  • When display of the comment is selected, the comment control unit 131 performs control so as to display the comment.
  • When non-display of the comment is selected, the comment control unit 131 performs control so as not to display the comment.
  • FIG. 16 is a flowchart illustrating one example of actions of the reference material explication display control unit 121 and the comment control unit 131 . Similarly to FIG. 8 , FIG. 16 illustrates display control for the head mounted display 200 worn by the user X. The display control is similarly performed for the head mounted displays 200 worn by the viewers 520 other than the user X.
  • The flowchart illustrated in FIG. 16 is different from the flowchart illustrated in FIG. 12 in that control about display of the comment is added. Specifically, the flowchart illustrated in FIG. 16 is different from the flowchart illustrated in FIG. 12 in that step S113 and step S114 are added. In the following, a description will be made about one example of the actions along FIG. 16; however, descriptions common to the descriptions about FIG. 12 will not be repeated.
  • In the flowchart illustrated in FIG. 12, in a case of No in step S111, the process returns to step S100; however, in the flowchart illustrated in FIG. 16, the process moves to step S113.
  • In step S113, the comment control unit 131 determines whether or not the operation input information acquired by the operation input information acquisition unit 104 is the operation input information about an input of the comment.
  • In a case of Yes in step S113, in step S114, the comment control unit 131 generates the comment data to be stored in the comment storage unit 132 based on the acquired operation input information. Further, the comment control unit 131 performs control so as to display the input comment content in the virtual reality space. In this case, the comment control unit 131 displays the comment on the head mounted display 200 worn by each viewer 520 who is permitted to browse the input comment content.
  • In a case of No in step S113, the process returns to step S100.
  • Note that the control illustrated in FIG. 16 is merely one example. Consequently, for example, the control may be performed in a processing order different from the processing order illustrated in FIG. 16.
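The branch added in FIG. 16 (steps S113 and S114) might be summarized as a dispatch like the following. The `kind` labels and return values are purely illustrative assumptions, since the excerpt names only the comment branch explicitly.

```python
def handle_operation_input(info, comment_store):
    """Hypothetical dispatch mirroring FIG. 16; labels are illustrative."""
    if info.get("kind") == "reference_material":
        # branch handled by the reference material display control
        return "update_reference_display"
    if info.get("kind") == "explication":
        # branch handled by the explication display control
        return "show_explication"
    if info.get("kind") == "comment":
        # step S113: the input concerns a comment; step S114: generate and
        # store the comment data, then display it to permitted viewers
        comment_store.append(info["data"])
        return "display_comment"
    # any other input: return to the start of the loop (step S100)
    return "ignore"
```

Checking the comment case only after the other branches mirrors the flowchart, where step S113 is reached from the No branch of step S111.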
  • The third example embodiment has been described above.
  • In the present example embodiment, the content of the comment input by the viewer 520 is managed while being associated with the position of the comment object. Consequently, it is possible to easily specify which content in the material the input comment is about.
  • Further, the display control device 130 displays the comment in the virtual reality space. Consequently, because the viewer 520 can browse a comment by another viewer 520 during the progress of the presentation, the convenience of browsing a material can be further improved.
  • In addition, display linked with the comment object is also easily enabled.
  • A display control device comprising:
  • first display control means for performing control so as to display a page, which is being displayed in a real space, of a first material as a presentation material used in a presentation by a presenter, in a first position in a virtual reality space;
  • second display control means for performing control so as to display a second material in a second position which is different from the first position in the virtual reality space; and
  • operation input information acquisition means for acquiring first operation input information as operation input information which is input by a viewer experiencing the virtual reality space and which is about display of the second material, wherein
  • the second display control means changes a display manner of the second material based on the first operation input information.
  • the operation input information acquisition means further acquires second operation input information as operation input information which is input by a viewer experiencing the virtual reality space and which designates an explication object in a content of the second material, and
  • the second display control means further displays explication information as information explicating the explication object in the virtual reality space based on the second operation input information.
  • the operation input information acquisition means further acquires third operation input information as operation input information which is input by a viewer experiencing the virtual reality space and which designates an explication level, and
  • the second display control means performs switching to display of explication information at a level corresponding to the third operation input information.
  • the operation input information acquisition means further acquires fourth operation input information as operation input information which is input by a viewer experiencing the virtual reality space and which is about a comment on the first material or the second material,
  • the fourth operation input information includes a position of a comment object in a material and a comment content, and
  • the display control device further comprises comment control means for managing the position of the comment object and the comment content while associating the position of the comment object and the comment content together.
  • The display control device wherein, when the first material or the second material is displayed in the virtual reality space, the comment control means performs control so as to display the comment content in the virtual reality space while associating the comment content with the position of the comment object in the first material or the second material.
  • the fourth operation input information further includes identification information which identifies a viewer inputting a comment, and
  • the comment control means manages the identification information, the position of the comment object, and the comment content while associating the identification information, the position of the comment object, and the comment content together and performs control about whether or not the comment content is displayed when the first material or the second material is displayed in the virtual reality space based on the identification information associated with the comment content.
  • The display control device according to any one of Supplementary Notes 1 to 6, wherein the second display control means changes pages of the second material as display objects based on the first operation input information.
  • The display control device according to any one of Supplementary Notes 1 to 6, wherein the second display control means changes at least either one of a display size of the second material or a display position of the second material based on the first operation input information.
  • The display control device according to any one of Supplementary Notes 1 to 8, wherein the second material is the presentation material.
  • The display control device according to any one of Supplementary Notes 1 to 8, wherein the second material is a different material from the presentation material.
  • A presentation system comprising:
  • a display control device; and
  • a display configured to present a virtual reality space in accordance with control by the display control device, wherein
  • the display control device includes
  • the second display control means changes a display manner of the second material based on the first operation input information, and
  • the display displays the picture output by the picture output means of the display control device for the viewer.
  • A display control method comprising:
  • A non-transitory computer-readable medium storing a program for causing a computer to execute:


Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018208811 2018-11-06
JP2018-208811 2018-11-06
PCT/JP2019/042535 WO2020095784A1 (ja) 2019-10-30 Display control device, display control method, and non-transitory computer-readable medium storing program

Publications (2)

Publication Number Publication Date
US20210389588A1 US20210389588A1 (en) 2021-12-16
US11906741B2 true US11906741B2 (en) 2024-02-20

Family

ID=70611873

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/289,286 Active US11906741B2 (en) 2018-11-06 2019-10-30 Display control device, display control method, and non-transitory computer-readable medium storing program

Country Status (3)

Country Link
US (1) US11906741B2 (ja)
JP (1) JP7226836B2 (ja)
WO (1) WO2020095784A1 (ja)






