WO2022201936A1 - Display control device - Google Patents

Display control device

Info

Publication number
WO2022201936A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
user
control
detection unit
state
Application number
PCT/JP2022/005193
Other languages
French (fr)
Japanese (ja)
Inventor
怜央 水田
康夫 森永
望 松本
達哉 西▲崎▼
有希 中村
弘行 藤野
Original Assignee
株式会社NTTドコモ
Application filed by 株式会社NTTドコモ
Publication of WO2022201936A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/038 Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry

Definitions

  • The present invention relates to a display control device that controls display on a display.
  • Transmissive head-mounted displays have conventionally been used (see Patent Document 1: JP 2018-91882 A, for example).
  • A transmissive head-mounted display allows a user to see the real world while also referring to information displayed on the display.
  • An embodiment of the present invention has been made in view of the above, and an object thereof is to provide a display control device capable of appropriately performing display on a display according to the state of the user.
  • In order to achieve the above object, a display control device according to an embodiment of the present invention controls display on a transmissive display that is worn over a user's eyes and performs display according to operations from another user.
  • The display control device includes a detection unit that detects the state of the user wearing the display, and a control unit that controls operations related to the display from another user according to the state of the user detected by the detection unit.
  • In this display control device, operations related to the display from another user are controlled according to the state of the user.
  • For example, the display can be controlled according to the state of the user so that display resulting from another user's operation does not interfere with the user. Therefore, the display control device according to an embodiment of the present invention can appropriately perform display on the display according to the state of the user.
  • According to an embodiment of the present invention, operations from another user that relate to the display are controlled according to the state of the user, so display on the display can be performed appropriately according to the state of the user.
  • FIG. 1 is a diagram showing the functional configuration of a display that is a display control device according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of display on the display. FIG. 3 is a flowchart showing processing executed by the display. FIG. 4 is a diagram showing the hardware configuration of the display.
  • FIG. 1 shows the functional configuration of a display 10, which is a display control device according to this embodiment.
  • the display 10 is a transmissive display worn on the user's eye.
  • the display 10 is a glasses-type head-mounted display, that is, see-through glasses (smart glasses). Since the display 10 is of a transmissive type, the user wearing the display 10 can simultaneously see the display (image) on the display 10 and the scenery of the outside.
  • the display 10 has a display screen 100, and displays content 110 on a part of the display screen 100 as shown in FIGS. 2(a) and 2(b).
  • the user can see the outside scenery or the like from a portion of the display screen 100 where the content 110 is not displayed.
  • As for the portion of the display screen 100 where the content 110 is displayed, the user either can see the outside scenery through the content or cannot see it because the content blocks the view.
  • the display 10 may display according to the surrounding environment such as the scenery of the outside. That is, the display 10 may display virtual content by AR (Augmented Reality), such as AR glasses.
  • the display 10 displays according to the operation by another user.
  • For example, the display 10 may share its own display, e.g., a desktop screen, with another display; that is, the same content may be displayed on the display 10 and on the other display.
  • For example, the display 10 and the other display mentioned above (or a terminal connected to that display; the same applies below) are configured to communicate with each other through a communication network such as a mobile communication network, a wireless LAN, or the Internet.
  • The display is shared by mutually transmitting and receiving display-related information.
  • display sharing may occur via a device other than the display, such as a content management server that can communicate with the display.
  • a user of the display 10 and a user of another display can operate the shared display.
  • For example, the display position of one piece of content 110 can be moved as indicated by the arrow in FIG. 2(b).
  • the operation on the display is not limited to moving the content 110 and may be arbitrary.
  • In this way, when another user remotely operates the display, the display 10 performs display according to that remote operation.
  • the user of the display 10 and the user of another display can communicate while referring to the same information.
  • In the above example, two displays share the display, but three or more displays may share it.
  • the display 10 may perform display according to an operation from another user other than sharing the display.
  • As the display 10, a conventional display having the above functions can be used.
  • Some of the above-described functions of the display 10, and some of the functions according to the present embodiment described later, may be provided by an information processing device (for example, a smartphone) connected to a display device (for example, the above-described see-through glasses). That is, the display 10 according to the present embodiment may be realized as a combination of the display device and the information processing device.
  • As shown in FIG. 1, the display 10 includes a display unit 11, a detection unit 12, and a control unit 13. In addition, the display 10 may have functions other than those described above that conventional display devices, such as conventional see-through glasses, have.
  • the display unit 11 is a functional unit that displays on the display screen 100 provided in the display 10 .
  • the display unit 11 inputs display information to be displayed on the display screen 100 and displays it on the display screen 100 .
  • the display information may be, for example, content such as photos or characters, or may be information that can be changed in a browser or the like.
  • the display unit 11 receives an operation related to display and executes the operation.
  • An operation related to display is, for example, an operation of moving display information on the display screen 100 . Also, the operation related to the display may be another operation.
  • An operation related to display can be performed by the user wearing the display 10 or by another user.
  • When the user wearing the display 10 performs the operation, the operation related to the display is accepted by, for example, accepting the user's input operation on the display 10.
  • When the display is shared with another display, the display unit 11 transmits information indicating the operation to the other display.
  • When another user performs the operation, the operation related to the display is accepted by, for example, receiving information indicating the operation transmitted from the other display used by that user.
  • the display unit 11 can identify whether the received operation is from another user. For example, the information indicating the operation is given the identifier of the user who performed the operation.
  • the above function of the display unit 11 may be the same as the conventional function of displaying on the display. Further, as will be described later, the execution of operations related to display by the display unit 11 is controlled by the control unit 13 .
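Since the description above says that information indicating an operation carries the identifier of the user who performed it, the display unit 11 can tell local operations from remote ones by comparing identifiers. The following is a minimal sketch of that idea; the class name, fields, and the use of Python are illustrative assumptions, not part of the published embodiment.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass
class DisplayOperation:
    """An operation on the shared display, e.g. moving a piece of content 110."""
    user_id: str                           # identifier of the user who performed the operation
    kind: str                              # e.g. "move", "show", "hide"
    content_id: str                        # which piece of display information is affected
    target_position: Tuple[float, float]   # destination coordinates on the display screen 100


def is_remote_operation(op: DisplayOperation, own_user_id: str) -> bool:
    """Return True when the operation came from another user (a shared remote display)."""
    return op.user_id != own_user_id
```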
  • the detection unit 12 is a functional unit that detects the state of the user wearing the display 10 .
  • the detection unit 12 may detect the line of sight of the user as the state of the user wearing the display 10 . More specifically, the detection unit 12 may detect the focal length of the line of sight of the user. Further, the detection unit 12 may detect the direction of the user's line of sight. Alternatively, the detection unit 12 may detect the moving state of the user as the state of the user wearing the display 10 .
  • a sensor for detecting the state of the user may be provided on the display 10 as at least part of the detection unit 12, and the user's state may be detected using the sensor.
  • When detecting the user's line of sight, the display 10 is provided with a camera capable of imaging the user's eyeball.
  • the detection unit 12 detects the user's line of sight, more specifically, the focal length and direction of the line of sight from the image of the user's eyeball obtained by imaging with the camera.
  • the detection unit 12 detects the movement of the user's eyeball from the moving image of the user's eyeball, and detects the focal length and direction of the line of sight based on the movement.
  • the detection unit 12 detects, for example, the distance from the eyeball to the focal point as the focal length.
  • the detection unit 12 detects, for example, the position of the line of sight on the display screen 100 (the position of the intersection of the line of sight and the display screen 100, eg, the coordinates on the display screen 100) as the direction of the line of sight. Detecting the line of sight from the image may be performed in the same manner as in the conventional method.
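As a rough illustration of detecting "the position of the line of sight on the display screen 100" described above, the sketch below intersects a gaze ray (eye position plus gaze direction, e.g. estimated from eyeball images) with the plane of the display screen. The vector representation and function names are assumptions made for illustration; an actual implementation would rely on a calibrated eye tracker.

```python
import numpy as np


def gaze_point_on_screen(eye_pos, gaze_dir, screen_origin, screen_normal):
    """Return the 3D point where the gaze ray meets the display-screen plane,
    or None if the user is looking away from the screen.

    All arguments are 3-element numpy arrays; converting the returned point to
    2D coordinates on the display screen 100 would be a further projection step.
    """
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    denom = float(np.dot(gaze_dir, screen_normal))
    if abs(denom) < 1e-6:
        return None                      # gaze is parallel to the screen plane
    t = float(np.dot(screen_origin - eye_pos, screen_normal)) / denom
    if t < 0:
        return None                      # screen lies behind the gaze direction
    return eye_pos + t * gaze_dir        # intersection of the line of sight and the screen
```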
  • the detection unit 12 may detect whether or not the user is walking as the movement state of the user.
  • the display 10 is provided with an acceleration sensor, for example.
  • the detection unit 12 detects whether or not the user is walking from the information obtained by the acceleration sensor. Detection of whether or not the user is walking from the acceleration may be performed in the same manner as in the conventional method.
  • the display 10 is provided with a positioning function of its own device using GPS (Global Positioning System) or the like, and the detection unit 12 detects the position of the display 10 based on the information indicating the position of the display 10 obtained by the positioning function. Detection may be performed while (eg, walking) or stationary. This detection may also be performed in the same manner as in the conventional method.
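A minimal sketch of the accelerometer-based walking check mentioned above: the user is treated as walking when the variance of recent acceleration magnitudes exceeds a threshold. The window size and threshold are placeholder assumptions; a real implementation would use an established step-detection or activity-recognition method, as the description notes.

```python
import math
from collections import deque


class WalkingDetector:
    """Very rough walking detector based on the variance of acceleration magnitude."""

    def __init__(self, window_size: int = 50, variance_threshold: float = 0.5):
        self.samples = deque(maxlen=window_size)   # recent |acceleration| values
        self.variance_threshold = variance_threshold

    def add_sample(self, ax: float, ay: float, az: float) -> None:
        """Feed one acceleration sample from the sensor on the display 10."""
        self.samples.append(math.sqrt(ax * ax + ay * ay + az * az))

    def is_walking(self) -> bool:
        if len(self.samples) < self.samples.maxlen:
            return False                           # not enough data yet
        mean = sum(self.samples) / len(self.samples)
        variance = sum((s - mean) ** 2 for s in self.samples) / len(self.samples)
        return variance > self.variance_threshold
```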
  • the detection unit 12 continuously detects the state of the user, for example, at regular intervals.
  • the detection unit 12 outputs information indicating the detected state of the user to the control unit 13 each time of detection.
  • the detection unit 12 may detect at least one of the user states described above.
  • the detection unit 12 may detect the state of the user by a method other than the above-described method.
  • the detection unit 12 may detect user states other than those described above as long as they are useful for control by the control unit 13, which will be described later.
  • the control unit 13 is a functional unit that controls operations related to display on the display 10 by another user according to the state of the user detected by the detection unit 12 .
  • the control unit 13 may determine whether or not to perform control according to the focal length detected by the detection unit 12 .
  • the control unit 13 may set an object to be controlled according to the line-of-sight direction detected by the detection unit 12 .
  • the control unit 13 may perform control to prohibit another user from performing an operation related to the display of the display 10, or control to change the display related to the operation.
  • the control unit 13 may store an operation from another user to be controlled, and execute the stored operation after the control ends.
  • the control by the control unit 13 is, for example, to prevent an operation from another user from interfering with the user wearing the display 10 . Since the display 10 is transmissive, the user wearing the display 10 can also perform actions other than looking at the display 10 . For example, the user can view a bus timetable or walk while the display 10 is displaying.
  • For example, suppose that, as shown in FIG. 2(a), the user wearing the display 10 directs his or her line of sight toward the center of the display screen 100, where no content is displayed, and is looking at a real-world bus timetable that is not part of what the display 10 shows.
  • That is, the user wearing the display 10 is viewing the bus timetable while sharing the content with another user.
  • If, at that moment, another user's operation moves the content to the center of the display screen 100, that is, right in front of the user wearing the display 10, as shown in FIG. 2(b), the user's view is blocked and the action of looking at the bus timetable is disturbed. Likewise, if the content moves in this way while the user is walking, the user's view is blocked, the surroundings cannot be seen, and the situation becomes dangerous. Thus, when the user is performing a real-world action other than looking at the display 10, an operation from another user can interfere with that action.
  • Control by the control unit 13 is for preventing an operation from another user from interfering with the action of the user.
  • the control by the control unit 13 does not necessarily have to be performed for the above purposes, and may be performed for purposes other than the above.
  • the control unit 13 performs control as follows.
  • the control unit 13 receives information indicating the state of the user from the detection unit 12 .
  • the control unit 13 determines whether or not to control the operation related to the display of the display 10 from another user based on the information input from the detection unit 12 based on pre-stored determination criteria. For example, if the information input from the detection unit 12 indicates the focal length, the control unit 13 makes the above determination based on whether the focal length is within a preset range.
  • the above range is, for example, a range in which it can be determined that the user wearing the display 10 is focused on the display screen 100 of the display 10 , that is, the user is looking at the display screen 100 of the display 10 .
  • If the focal length is not within the above range, the control unit 13 determines that the user is not looking at the display screen 100 of the display 10 but is, for example, looking at (gazing at) something in the real world, and therefore determines that control is to be performed on operations from another user. If the focal length is within the above range, the control unit 13 determines that the user is looking at the display screen 100 of the display 10 and that control on operations from another user is not to be performed. This is because, as long as the user is looking at the display screen 100 of the display 10, an operation from another user is not considered to interfere with the user.
  • Similarly, if the information input from the detection unit 12 indicates that the user is walking, the control unit 13 determines that control is to be performed on operations from another user; if that information indicates that the user is not walking, the control unit 13 determines not to control operations from another user.
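The decision rule just described, i.e. control remote operations when the focal length indicates that the user is not looking at the display screen 100, or when the user is walking, can be summarized as below. The numeric range is a placeholder assumption; the embodiment only requires some preset range within which the user can be judged to be focused on the screen.

```python
from typing import Optional, Tuple


def should_control_remote_operations(
        focal_length_m: Optional[float],
        is_walking: Optional[bool],
        screen_focus_range: Tuple[float, float] = (0.3, 1.0)) -> bool:
    """Decide whether operations from another user should be controlled.

    focal_length_m: detected focal length of the user's gaze in meters (None if unknown)
    is_walking:     whether the user is detected to be walking (None if unknown)
    screen_focus_range: focal lengths judged to mean "looking at the display screen 100"
    """
    if is_walking:
        return True            # walking: remote display changes could be dangerous
    if focal_length_m is not None:
        lo, hi = screen_focus_range
        if not (lo <= focal_length_m <= hi):
            return True        # focused on the real world, not on the screen
    return False               # looking at the screen, or no evidence otherwise
```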
  • the control unit 13 controls the display unit 11 when it determines that it will control an operation from another user and the display unit 11 is not being controlled at that time.
  • the control is, for example, control that prohibits another user from performing an operation related to display on the display 10 .
  • That is, the control temporarily prevents operations from another user, such as moving or enlarging the display information displayed on the display 10, from being carried out.
  • The control may also prohibit only part of an operation. For example, if the operation is to move display information, the control may move the display information only slightly instead of moving it fully as the operation specifies.
  • the display unit 11 receives control from the control unit 13 and performs processing according to the control.
  • When the display unit 11 receives an operation related to the display of the display 10 while being controlled by the control unit 13, the display unit 11 identifies whether or not the operation is from another user.
  • If the operation is from another user, the display unit 11 prohibits the operation.
  • If the operation is from the user wearing the display 10, the display unit 11 performs it without prohibiting it.
  • The display unit 11 also outputs information indicating the prohibited operation, which is the target of the control, to the control unit 13.
  • the control unit 13 inputs and stores (accumulates) information indicating the operation to be controlled from the display unit 11 .
  • The control does not necessarily have to prohibit the operation, and may instead change the display related to the operation. For example, control may be performed to hide the display information to be moved, to make it translucent, or to display only its frame. Alternatively, the display information to be moved may be displayed at a preset position on the display 10, for example, at a corner of the field of view. Further, control other than the above may be performed as long as it serves the above-described purpose.
  • When the control unit 13 determines not to control operations from another user and the display unit 11 is under control at that time, the control unit 13 cancels the control.
  • When the control is canceled and ends, the control unit 13 causes the display unit 11 to execute the stored operations that were the target of the control. As a result, for example, display information whose movement had been prohibited is moved.
  • After that, the control unit 13 deletes the stored information indicating those operations.
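The store-and-replay behavior described above (operations from another user are withheld while control is active and executed once the control is canceled) could be kept in a simple buffer inside the control unit 13. The sketch below is an assumption about how this might look; display_unit.execute() stands in for whatever interface the display unit actually exposes.

```python
class OperationBuffer:
    """Holds remote operations received while control is active and replays them
    once the control is canceled, then discards the stored information."""

    def __init__(self):
        self._pending = []

    def store(self, operation) -> None:
        """Called while control is active: the operation is withheld for now."""
        self._pending.append(operation)

    def replay(self, display_unit) -> None:
        """Called after the control ends: execute the withheld operations in order."""
        for operation in self._pending:
            display_unit.execute(operation)   # assumed display-unit interface
        self._pending.clear()                 # delete the stored operation information
```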
  • In addition, the control unit 13 may set the target to be controlled from the information input from the detection unit 12, based on pre-stored criteria. For example, when the information input from the detection unit 12 indicates the direction of the line of sight, the control unit 13 sets the target to be controlled from that direction.
  • For example, the control unit 13 sets an area on the display screen 100 as the target to be controlled, based on the position of the line of sight on the display screen 100, which represents the direction of the line of sight. If an operation related to display from another user would move display information currently displayed outside the set area into the set area, or would display new display information inside the set area, control is performed on that operation. Likewise, if another user's display-related operation acts on display information already displayed in the set area, such as moving it, control is performed on that operation.
  • In this case, the control unit 13 notifies the display unit 11 of the set area, and operations related to display from another user are controlled according to that area.
  • When the display unit 11 receives an operation related to the display of the display 10 while being controlled, the display unit 11 identifies whether or not the operation is from another user.
  • If the operation is from another user and would move display information already displayed on the display screen 100 into the set area, or would display new display information in the set area, the display unit 11 prohibits the operation. Likewise, if the operation targets display information already displayed in the set area, the display unit 11 prohibits it.
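A sketch of the region-based control described above: an area around the detected gaze position on the display screen 100 is treated as the controlled target, and a remote operation is flagged when it would move content into that area, display new content there, or act on content already inside it. The rectangular region, the margin, and the operation fields are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class Region:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def protected_region(gaze_x: float, gaze_y: float, margin: float = 200.0) -> Region:
    """Area of the display screen 100 around the user's gaze position to be controlled."""
    return Region(gaze_x - margin, gaze_y - margin, gaze_x + margin, gaze_y + margin)


def operation_affects_region(op, region: Region,
                             current_position: Optional[Tuple[float, float]]) -> bool:
    """True if the remote operation targets the controlled area or content already in it."""
    dest_in_region = region.contains(*op.target_position)
    src_in_region = current_position is not None and region.contains(*current_position)
    return dest_in_region or src_in_region
```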
  • the control over the operation from another user may be similar to that described above. Also in this case, the storage (accumulation) of information indicating the operation to be controlled by the control unit 13 and the execution of the operation after the end of the control may be performed. The above is the function of the display 10 according to the present embodiment.
  • Next, the processing executed by the display 10 according to the present embodiment (the operation method performed by the display 10) will be described using the flowchart of FIG. 3.
  • This processing is performed when the display 10 is worn by the user and the display unit 11 performs display on the display 10 .
  • the display can be shared with another display, and an operation related to the display of the display 10 can be performed by another user.
  • First, the state of the user wearing the display 10 is detected by the detection unit 12 (S01). Subsequently, the control unit 13 determines whether or not to control operations related to display on the display 10 from another user, based on the state of the user (S02). If it is determined that control is to be performed and the display unit 11 is not being controlled at that time (YES in S02), the control unit 13 applies control to the display unit 11 for operations from another user (S03).
  • the control is, for example, control that prohibits the operation.
  • When the display unit 11 receives an operation from another user while the control is being performed, processing corresponding to the control is performed. For example, the operation from the other user is prohibited. Further, the control unit 13 stores the operation subject to the control.
  • When it is determined that control is not to be performed and the display unit 11 is being controlled at that time (NO in S02), the control applied by the control unit 13 to the display unit 11 for operations from another user is ended (S04). Further, when operations subject to control have been stored by the control unit 13, those operations are performed by the display unit 11 after the control ends (S05). The above control may also be performed by setting a target to be controlled according to the state of the user. The above processes (S01 to S05) are repeated while the display unit 11 is displaying. The above is the processing executed by the display 10 according to the present embodiment.
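Pulling the steps S01 to S05 of FIG. 3 together, the repeated processing could be organized as the loop below. The detector, controller, and display-unit interfaces are the illustrative sketches above, not a definitive implementation of the embodiment.

```python
def display_control_loop(detector, controller, display_unit):
    """Repeat S01 to S05 while the display unit 11 is displaying."""
    while display_unit.is_displaying():
        state = detector.detect_user_state()                       # S01: detect user state
        if controller.should_control(state):                       # S02: decide whether to control
            if not controller.control_active:
                controller.start_control(display_unit)             # S03: e.g. prohibit remote operations
        elif controller.control_active:
            controller.end_control(display_unit)                   # S04: cancel the control
            controller.replay_stored_operations(display_unit)      # S05: execute withheld operations
```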
  • an operation related to the display of the display 10 by another user is controlled according to the state of the user.
  • the display can be controlled according to the state of the user so that the display on the display 10 by another user's operation does not interfere with the user. Therefore, according to the present embodiment, it is possible to appropriately perform display on the display 10 according to the state of the user.
  • As described above, the line of sight of the user may be detected as the state of the user wearing the display 10. More specifically, the focal length of the user's line of sight may be detected, and whether or not to perform control may be determined according to the focal length. According to this configuration, for example, control can be performed when the user is not looking at the display screen 100 of the display 10, so the display on the display 10 can be prevented from interfering with the user or posing a danger to the user.
  • the direction of the line of sight of the user may be detected, and the target to be controlled according to the direction of the line of sight may be set. According to this configuration, for example, it is possible to control only a necessary target according to the direction of the user's line of sight, and display on the display 10 can be performed more appropriately.
  • Also, the movement state of the user, more specifically whether or not the user is walking, may be detected as the state of the user.
  • According to this configuration, control can be performed while the user is walking, so the display on the display 10 can be prevented from creating danger while the user is walking.
  • the state of the user wearing the display 10 is not limited to the above, and other states may be detected and used as long as they are useful for controlling the display.
  • control to prohibit the operation or control to change the display related to the operation may be performed. According to this configuration, it is possible to appropriately and reliably perform display on the display in accordance with the state of the user. However, control other than the above may be performed as long as it is a control for appropriately performing display on the display.
  • an operation from another user who is to be controlled may be stored, and the stored operation may be executed after the control ends.
  • According to this configuration, the display on the display 10 can continue to be shared with another display, and the user's convenience can be improved.
  • The detection unit 12 may detect the user's heartbeat or perspiration state in addition to, or instead of, the information described above. Specifically, the detection unit 12 may detect the user's heart rate, the user's amount of perspiration, or both. In this case, a sensor for detecting the heart rate or the amount of perspiration is attached to the user, and the display 10 acquires information indicating the user's heart rate or amount of perspiration from the sensor. Conventional sensors can be used for this purpose. In this case as well, the control unit 13 determines, in the same manner as described above, whether or not to control operations related to the display of the display 10 from another user, based on the information input from the detection unit 12 and on pre-stored determination criteria.
  • When the control unit 13 determines from the heart rate or the amount of perspiration that control is to be performed, it controls the operation from another user in the same manner as described above.
  • For example, the display method can be changed according to an athlete's heart rate or amount of perspiration, so that an instruction message from a coach can be prevented from obstructing the athlete's field of view.
  • Likewise, a remote operator's instructions can be displayed to a worker only when an abnormal state is detected from the worker's heart rate.
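For the heart-rate and perspiration variant described above, the same decision structure applies with different inputs. The thresholds below are placeholder assumptions; the embodiment only states that pre-stored criteria are used.

```python
from typing import Optional


def should_control_by_biometrics(heart_rate_bpm: Optional[float],
                                 perspiration_level: Optional[float],
                                 heart_rate_threshold: float = 120.0,
                                 perspiration_threshold: float = 0.8) -> bool:
    """Decide whether to control remote operations from biometric readings,
    e.g. withhold a coach's instruction message while an athlete's heart rate is high."""
    if heart_rate_bpm is not None and heart_rate_bpm > heart_rate_threshold:
        return True
    if perspiration_level is not None and perspiration_level > perspiration_threshold:
        return True
    return False
```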
  • In the above embodiment, the display control device has been described as the display 10, which itself has a display function, but the display control device does not necessarily have to have a display function.
  • The display control device may instead be connected to a transmissive display that is worn over the user's eyes and performs display according to operations from another user (that is, a display including the display unit 11), and may control display on that display.
  • Any device (system) may be used as long as it includes the detection unit 12 and the control unit 13 described above.
  • Each functional block may be implemented using one device that is physically or logically coupled, or using two or more devices that are physically or logically separated and connected directly or indirectly (for example, by wire or wirelessly).
  • a functional block may be implemented by combining software in the one device or the plurality of devices.
  • Functions include judging, determining, calculating, computing, processing, deriving, investigating, searching, confirming, receiving, transmitting, outputting, accessing, resolving, selecting, choosing, establishing, comparing, assuming, expecting, considering, broadcasting, notifying, communicating, forwarding, configuring, reconfiguring, allocating, mapping, and assigning, but are not limited to these.
  • For example, a functional block (component) responsible for transmission is called a transmitting unit or a transmitter.
  • In any case, as described above, the implementation method is not particularly limited.
  • the display 10 in one embodiment of the present disclosure may function as a computer that performs information processing of the present disclosure.
  • FIG. 4 is a diagram illustrating an example of a hardware configuration of display 10 according to an embodiment of the present disclosure.
  • the display 10 described above may be physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like.
  • the term "apparatus” can be read as a circuit, device, unit, or the like.
  • the hardware configuration of the display 10 may be configured to include one or more of each device shown in the drawing, or may be configured without some of the devices.
  • Each function of the display 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, so that the processor 1001 performs calculations and controls communication by the communication device 1004 and at least one of the reading and writing of data in the memory 1002 and the storage 1003.
  • the processor 1001 for example, operates an operating system and controls the entire computer.
  • the processor 1001 may be configured by a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, registers, and the like.
  • each function of display 10 described above may be implemented by processor 1001 .
  • the processor 1001 reads programs (program codes), software modules, data, etc. from at least one of the storage 1003 and the communication device 1004 to the memory 1002, and executes various processes according to them.
  • As the program, a program that causes a computer to execute at least part of the operations described in the above embodiment is used.
  • each function of display 10 may be implemented by a control program stored in memory 1002 and running on processor 1001 .
  • The processor 1001 may be implemented by one or more chips.
  • the program may be transmitted from a network via an electric communication line.
  • The memory 1002 is a computer-readable recording medium, and may be composed of at least one of, for example, a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), an EEPROM (Electrically Erasable Programmable ROM), and a RAM (Random Access Memory).
  • the memory 1002 may also be called a register, cache, main memory (main storage device), or the like.
  • the memory 1002 can store executable programs (program code), software modules, etc. for performing information processing according to an embodiment of the present disclosure.
  • The storage 1003 is a computer-readable recording medium, and may be composed of at least one of, for example, an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disc, a magneto-optical disc (for example, a compact disc, a digital versatile disc, or a Blu-ray disc), a smart card, a flash memory (for example, a card, stick, or key drive), a floppy disk, and a magnetic strip.
  • Storage 1003 may also be called an auxiliary storage device.
  • the storage medium included in display 10 may be, for example, a database, server, or other suitable medium including at least one of memory 1002 and storage 1003 .
  • the communication device 1004 is hardware (transmitting/receiving device) for communicating between computers via at least one of a wired network and a wireless network, and is also called a network device, a network controller, a network card, a communication module, or the like.
  • the input device 1005 is an input device (for example, keyboard, mouse, microphone, switch, button, sensor, etc.) that receives input from the outside.
  • the output device 1006 is an output device (eg, display, speaker, LED lamp, etc.) that outputs to the outside. Note that the input device 1005 and the output device 1006 may be integrated (for example, a touch panel).
  • Each device such as the processor 1001 and the memory 1002 is connected by a bus 1007 for communicating information.
  • the bus 1007 may be configured using a single bus, or may be configured using different buses between devices.
  • the display 10 includes hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), and an FPGA (Field Programmable Gate Array).
  • a part or all of each functional block may be implemented by the hardware.
  • processor 1001 may be implemented using at least one of these pieces of hardware.
  • Input/output information may be stored in a specific location (for example, memory) or managed using a management table. Input/output information and the like can be overwritten, updated, or appended. The output information and the like may be deleted. The entered information and the like may be transmitted to another device.
  • The determination may be made by a value represented by one bit (0 or 1), by a Boolean value (true or false), or by numerical comparison (for example, comparison with a predetermined value).
  • Notification of predetermined information is not limited to being performed explicitly, and may be performed implicitly (for example, by not notifying the predetermined information).
  • Software, whether referred to as software, firmware, middleware, microcode, hardware description language, or by any other name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, and the like.
  • software, instructions, information, etc. may be transmitted and received via a transmission medium.
  • For example, when software is transmitted from a website, server, or other remote source using at least one of wired technology (coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), etc.) and wireless technology (infrared, microwave, etc.), these wired and/or wireless technologies are included within the definition of transmission medium.
  • The terms "system" and "network" used in this disclosure are used interchangeably.
  • Information, parameters, and the like described in the present disclosure may be expressed using absolute values, relative values from a predetermined value, or other corresponding information.
  • The terms "judging" and "determining" used in this disclosure may encompass a wide variety of actions.
  • For example, "judging" and "determining" may include deeming that judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (for example, looking up in a table, database, or other data structure), or ascertaining has been "judged" or "determined".
  • Also, "judging" and "determining" may include deeming that receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in memory) has been "judged" or "determined".
  • Also, "judging" and "determining" may include deeming that resolving, selecting, choosing, establishing, comparing, and the like has been "judged" or "determined".
  • In other words, "judging" and "determining" may include deeming that some action has been "judged" or "determined".
  • Also, "judging (determining)" may be read as "assuming", "expecting", "considering", and the like.
  • The terms "connected" and "coupled", and any variations thereof, mean any direct or indirect connection or coupling between two or more elements, and can include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination thereof. For example, "connection" may be read as "access".
  • As used in this disclosure, two elements can be considered to be "connected" or "coupled" to each other using at least one of one or more wires, cables, and printed electrical connections, and, as some non-limiting and non-exhaustive examples, using electromagnetic energy having wavelengths in the radio frequency, microwave, and optical (both visible and invisible) regions.
  • any reference to elements using the "first,” “second,” etc. designations used in this disclosure does not generally limit the quantity or order of those elements. These designations may be used in this disclosure as a convenient method of distinguishing between two or more elements. Thus, reference to a first and second element does not imply that only two elements can be employed or that the first element must precede the second element in any way.
  • The phrase "A and B are different" may mean "A and B are different from each other."
  • The phrase may also mean that "A and B are each different from C."
  • Terms such as “separate,” “coupled,” etc. may also be interpreted in the same manner as “different.”
  • 10 display, 11... display unit, 12... detection unit, 13... control unit, 1001... processor, 1002... memory, 1003... storage, 1004... communication device, 1005... input device, 1006... output device, 1007... bus.

Abstract

The present invention is for appropriately performing display on a display in accordance with the state of a user. This display control device is for controlling display on a transmission-type display 10 which is worn on a portion of a user's eye and on which display is performed in accordance with an operation of another user, the display control device comprising: a detection unit 12 for detecting the state of the user wearing the display 10; and a control unit 13 for controlling the operation that is from the other user and that is related to display on the display 10, in accordance with the state of the user detected by the detection unit 12.

Description

Display control device
The present invention relates to a display control device that controls display on a display.

Transmissive head-mounted displays have conventionally been used (see Patent Document 1, for example). A transmissive head-mounted display allows a user to see the real world while also referring to information displayed on the display.
[Patent Document 1] JP 2018-91882 A
Conventionally, technology for sharing the display on one display with another user's display has been used. It is conceivable to share the display of a transmissive head-mounted display with another user as well. In this case, the display on the head-mounted display is remotely operated by the other user. If the user of the head-mounted display is doing something other than looking at the display, operations from the other user can interfere with the head-mounted display's user. For example, when the user of the head-mounted display is looking at something in the real world rather than at the display and the displayed content is moved in front of the user's eyes by another user's operation, it obstructs the user's action.
An embodiment of the present invention has been made in view of the above, and an object thereof is to provide a display control device capable of appropriately performing display on a display according to the state of the user.

In order to achieve the above object, a display control device according to an embodiment of the present invention controls display on a transmissive display that is worn over a user's eyes and performs display according to operations from another user, and includes a detection unit that detects the state of the user wearing the display, and a control unit that controls operations related to the display from the other user according to the state of the user detected by the detection unit.

In the display control device according to an embodiment of the present invention, operations related to the display from another user are controlled according to the state of the user. For example, the display can be controlled according to the state of the user so that display resulting from another user's operation does not interfere with the user. Therefore, the display control device according to the embodiment can appropriately perform display on the display according to the state of the user.

According to an embodiment of the present invention, operations from another user that relate to the display are controlled according to the state of the user, so display on the display can be performed appropriately according to the state of the user.

FIG. 1 is a diagram showing the functional configuration of a display that is a display control device according to an embodiment of the present invention. FIG. 2 is a diagram showing an example of display on the display. FIG. 3 is a flowchart showing processing executed by the display. FIG. 4 is a diagram showing the hardware configuration of the display.
 以下、図面と共に本発明に係る表示制御装置の実施形態について詳細に説明する。なお、図面の説明においては同一要素には同一符号を付し、重複する説明を省略する。 Hereinafter, an embodiment of the display control device according to the present invention will be described in detail along with the drawings. In the description of the drawings, the same elements are denoted by the same reference numerals, and overlapping descriptions are omitted.
 図1に本実施形態に係る表示制御装置であるディスプレイ10の機能構成を示す。表示制御装置であるディスプレイ10は、ディスプレイ10自身における表示を制御する。ディスプレイ10は、ユーザの眼の部分に装着される透過型のディスプレイである。例えば、ディスプレイ10は、眼鏡型のヘッドマウントディスプレイ、即ち、シースルーグラス(スマートグラス)である。ディスプレイ10は透過型であるため、ディスプレイ10を装着したユーザは、ディスプレイ10における表示(映像)と、外部の景色等とを同時に見ることができる。例えば、ディスプレイ10は、表示画面100を備えており、図2(a)及び図2(b)に示すように表示画面100の一部にコンテンツ110を表示する。ユーザは、表示画面100のコンテンツ110が表示されていない部分から外部の景色等を見ることができる。ユーザは、表示画面100のコンテンツ110が表示されている部分については、コンテンツ越しに外部の景色等を見ることができるか、コンテンツに阻害されて外部の景色等を見ることができない。 FIG. 1 shows the functional configuration of a display 10, which is a display control device according to this embodiment. The display 10, which is a display control device, controls display on the display 10 itself. The display 10 is a transmissive display worn on the user's eye. For example, the display 10 is a glasses-type head-mounted display, that is, see-through glasses (smart glasses). Since the display 10 is of a transmissive type, the user wearing the display 10 can simultaneously see the display (image) on the display 10 and the scenery of the outside. For example, the display 10 has a display screen 100, and displays content 110 on a part of the display screen 100 as shown in FIGS. 2(a) and 2(b). The user can see the outside scenery or the like from a portion of the display screen 100 where the content 110 is not displayed. As for the portion of the display screen 100 where the content 110 is displayed, the user can see the outside scenery through the content, or cannot see the outside scenery because of the content.
 ディスプレイ10は、外部の景色等の周囲の環境に応じた表示を行うものであってもよい。即ち、ディスプレイ10は、AR(Augmented Reality)によって仮想のコンテンツの表示を行うもの、例えば、ARグラスであってもよい。 The display 10 may display according to the surrounding environment such as the scenery of the outside. That is, the display 10 may display virtual content by AR (Augmented Reality), such as AR glasses.
 ディスプレイ10は、別のユーザからの操作に応じた表示を行う。例えば、ディスプレイ10は、ディスプレイ10自身における表示、例えば、デスクトップの画面を別のディスプレイと共有してもよい。即ち、ディスプレイ10と別のディスプレイとで同じ表示がなされてもよい。例えば、ディスプレイ10と上記の別のディスプレイ(あるいは、当該別のディスプレイに接続された端末、以下についても同様)とは、移動体通信網、無線LAN及びインターネット等の通信網によって互いに通信可能に構成されており、表示に係る情報を互いに送受信することで表示の共有を行う。あるいは、表示の共有は、ディスプレイ以外の装置、例えば、ディスプレイと通信可能なコンテンツ管理サーバを介して行われてもよい。 The display 10 displays according to the operation by another user. For example, the display 10 may share the display on the display 10 itself, eg, the screen of the desktop, with another display. That is, the same display may be made on the display 10 and another display. For example, the display 10 and the above-mentioned another display (or a terminal connected to the another display, the same applies to the following) are configured to be able to communicate with each other through a communication network such as a mobile communication network, a wireless LAN, and the Internet. Display is shared by mutually transmitting and receiving information related to display. Alternatively, display sharing may occur via a device other than the display, such as a content management server that can communicate with the display.
 ディスプレイ10のユーザと別のディスプレイのユーザとは、共有される表示に対して操作を行うことができる。例えば、図2(b)の矢印に示すようにコンテンツ110の一つの表示位置を移動させることができる。なお、表示に対する操作は、コンテンツ110の移動に限られず任意のものであってもよい。このようにディスプレイ10は、別のユーザから表示の遠隔操作があった場合、当該遠隔操作に応じた表示を行う。表示の共有を行うことでディスプレイ10のユーザと、別のディスプレイのユーザとの間で、同じ情報を参照した上でのコミュニケーションを行うことができる。なお、上記の例では、2台のディスプレイでの表示の共有の例を示したが、3台以上のディスプレイでの表示の共有が行われてもよい。また、ディスプレイ10は、表示の共有以外で別のユーザからの操作に応じた表示を行ってもよい。 A user of the display 10 and a user of another display can operate the shared display. For example, one display position of the content 110 can be moved as indicated by the arrow in FIG. 2(b). It should be noted that the operation on the display is not limited to moving the content 110 and may be arbitrary. In this way, when another user remotely operates display, the display 10 performs display according to the remote operation. By sharing the display, the user of the display 10 and the user of another display can communicate while referring to the same information. In the above example, two displays share the display, but three or more displays may share the display. Moreover, the display 10 may perform display according to an operation from another user other than sharing the display.
 ディスプレイ10としては、従来の上記の機能を有するディスプレイを用いることができる。また、ディスプレイ10の上述した機能及び後述する本実施形態に係る機能の一部は、表示装置(例えば、上述したシースルーグラス)に接続される情報処理装置(例えば、スマートフォン)が有していてもよい。即ち、表示装置と情報処理装置とを含んで本実施形態に係るディスプレイ10が実現されてもよい。 As the display 10, a conventional display having the above functions can be used. In addition, some of the above-described functions of the display 10 and functions according to the present embodiment described later may be included in an information processing device (for example, a smartphone) connected to a display device (for example, the above-described see-through glasses). good. That is, the display 10 according to the present embodiment may be realized by including the display device and the information processing device.
 引き続いて、本実施形態に係るディスプレイ10の機能を説明する。図1に示すようにディスプレイ10は、表示部11と、検出部12と、制御部13とを備えて構成される。また、ディスプレイ10は、上記以外にも従来のシースルーグラス等の従来の表示装置が備える機能を備えていてもよい。 Next, the functions of the display 10 according to this embodiment will be explained. As shown in FIG. 1, the display 10 includes a display section 11, a detection section 12, and a control section 13. As shown in FIG. In addition, the display 10 may have functions other than those described above that conventional display devices, such as conventional see-through glasses, have.
 表示部11は、ディスプレイ10に備えられる表示画面100での表示を行う機能部である。表示部11は、表示画面100に表示する表示情報を入力して、表示画面100に表示する。表示情報は、例えば、写真又は文字等のコンテンツであってもよいし、ブラウザ等の表示される内容が変更され得るものであってもよい。表示部11は、表示に係る操作を受け付けて当該操作を実行する。表示に係る操作は、例えば、表示画面100における表示情報を移動させる操作である。また、表示に係る操作は、それ以外の操作であってもよい。 The display unit 11 is a functional unit that displays on the display screen 100 provided in the display 10 . The display unit 11 inputs display information to be displayed on the display screen 100 and displays it on the display screen 100 . The display information may be, for example, content such as photos or characters, or may be information that can be changed in a browser or the like. The display unit 11 receives an operation related to display and executes the operation. An operation related to display is, for example, an operation of moving display information on the display screen 100 . Also, the operation related to the display may be another operation.
 表示に係る操作は、ディスプレイ10を装着するユーザ及び別のユーザから行われ得る。ディスプレイ10を装着するユーザが操作する場合には、表示に係る操作の受け付けは、例えば、ディスプレイ10に対するユーザの入力操作を受け付けることで行われる。別のディスプレイと表示を共有する場合には、表示部11は、操作を示す情報を別のディスプレイに送信する。 An operation related to display can be performed by the user wearing the display 10 or by another user. When the user who wears the display 10 performs the operation, the operation related to the display is accepted by accepting the user's input operation to the display 10 , for example. When sharing the display with another display, the display unit 11 transmits information indicating the operation to the another display.
 別のユーザが操作する場合には、表示に係る操作の受け付けは、例えば、別のユーザに用いられる別のディスプレイから送信される当該操作を示す情報を受信することで行われる。表示部11は、受け付けた操作が別のユーザからのものであるかを識別できるようになっている。例えば、操作を示す情報には、操作をしたユーザの識別子が付与されている。 When another user performs an operation, acceptance of the display-related operation is performed, for example, by receiving information indicating the operation transmitted from another display used by another user. The display unit 11 can identify whether the received operation is from another user. For example, the information indicating the operation is given the identifier of the user who performed the operation.
 表示部11による上記の機能は、ディスプレイに表示を行う従来の機能と同様のものでよい。また、後述するように表示部11による表示に係る操作の実行は、制御部13からの制御を受ける。 The above function of the display unit 11 may be the same as the conventional function of displaying on the display. Further, as will be described later, the execution of operations related to display by the display unit 11 is controlled by the control unit 13 .
 検出部12は、ディスプレイ10を装着するユーザの状態を検出する機能部である。検出部12は、ディスプレイ10を装着するユーザの状態として、当該ユーザの視線を検出してもよい。より具体的には、検出部12は、当該ユーザの視線の焦点距離を検出してもよい。また、検出部12は、当該ユーザの視線の方向を検出してもよい。あるいは、検出部12は、ディスプレイ10を装着するユーザの状態として、当該ユーザの移動状態を検出してもよい。 The detection unit 12 is a functional unit that detects the state of the user wearing the display 10 . The detection unit 12 may detect the line of sight of the user as the state of the user wearing the display 10 . More specifically, the detection unit 12 may detect the focal length of the line of sight of the user. Further, the detection unit 12 may detect the direction of the user's line of sight. Alternatively, the detection unit 12 may detect the moving state of the user as the state of the user wearing the display 10 .
 例えば、ディスプレイ10にユーザの状態を検出するためのセンサを検出部12の少なくとも一部として設けておき、センサを用いてユーザの状態を検出してもよい。ユーザの視線を検出する場合には、ディスプレイ10に、ユーザの眼球を撮像できるカメラを設けておく。検出部12は、カメラの撮像によって得られたユーザの眼球の画像から、ユーザの視線、より具体的には、視線の焦点距離及び方向を検出する。例えば、検出部12は、ユーザの眼球の動画像からユーザの眼球の動きを検出して、当該動きに基づいて視線の焦点距離及び方向を検出する。検出部12は、焦点距離として、例えば、眼球から焦点までの距離を検出する。検出部12は、視線の方向として、例えば、表示画面100における視線の位置(視線と表示画面100との交点の位置であり、例えば、表示画面100における座標)を検出する。画像からの視線の検出は、従来の方法と同様に行われればよい。 For example, a sensor for detecting the state of the user may be provided on the display 10 as at least part of the detection unit 12, and the user's state may be detected using the sensor. When detecting the line of sight of the user, the display 10 is provided with a camera capable of imaging the eyeball of the user. The detection unit 12 detects the user's line of sight, more specifically, the focal length and direction of the line of sight from the image of the user's eyeball obtained by imaging with the camera. For example, the detection unit 12 detects the movement of the user's eyeball from the moving image of the user's eyeball, and detects the focal length and direction of the line of sight based on the movement. The detection unit 12 detects, for example, the distance from the eyeball to the focal point as the focal length. The detection unit 12 detects, for example, the position of the line of sight on the display screen 100 (the position of the intersection of the line of sight and the display screen 100, eg, the coordinates on the display screen 100) as the direction of the line of sight. Detecting the line of sight from the image may be performed in the same manner as in the conventional method.
 検出部12は、ユーザの移動状態としては、ユーザが歩行しているか否かを検出してもよい。この場合には、ディスプレイ10に、例えば、加速度センサを設けておく。検出部12は、加速度センサによって得られた情報から、ユーザが歩行しているか否かを検出する。加速度からのユーザが歩行しているか否かの検出は、従来の方法と同様に行われればよい。あるいは、ディスプレイ10にGPS(グローバル・ポジショニング・システム)等による自装置の測位機能を設けておき、検出部12は、測位機能によって得られたディスプレイ10の位置を示す情報に基づいて、ユーザが移動中(例えば、歩行中)か静止中かの検出を行ってもよい。この検出も、従来の方法と同様に行われればよい。 The detection unit 12 may detect whether or not the user is walking as the movement state of the user. In this case, the display 10 is provided with an acceleration sensor, for example. The detection unit 12 detects whether or not the user is walking from the information obtained by the acceleration sensor. Detection of whether or not the user is walking from the acceleration may be performed in the same manner as in the conventional method. Alternatively, the display 10 is provided with a positioning function of its own device using GPS (Global Positioning System) or the like, and the detection unit 12 detects the position of the display 10 based on the information indicating the position of the display 10 obtained by the positioning function. Detection may be performed while (eg, walking) or stationary. This detection may also be performed in the same manner as in the conventional method.
 検出部12は、継続的に、例えば、一定時間毎にユーザの状態を検出する。検出部12は、検出したユーザの状態を示す情報を検出の度に制御部13に出力する。なお、検出部12は、上述したユーザの状態の少なくとも何れかについて検出を行えばよい。また、検出部12は、上述した方法以外でユーザの状態を検出してもよい。あるいは、検出部12は、後述する制御部13による制御に有用であれば、上述した状態以外のユーザの状態を検出してもよい。 The detection unit 12 continuously detects the state of the user, for example, at regular intervals. The detection unit 12 outputs information indicating the detected state of the user to the control unit 13 each time of detection. Note that the detection unit 12 may detect at least one of the user states described above. Also, the detection unit 12 may detect the state of the user by a method other than the above-described method. Alternatively, the detection unit 12 may detect user states other than those described above as long as they are useful for control by the control unit 13, which will be described later.
 制御部13は、検出部12によって検出されたユーザの状態に応じて、別のユーザからのディスプレイ10の表示に係る操作に対する制御を行う機能部である。制御部13は、検出部12によって検出された焦点距離に応じて、制御を行うか否かを判断してもよい。制御部13は、検出部12によって検出された視線の方向に応じて、制御を行う対象を設定してもよい。制御部13は、別のユーザからのディスプレイ10の表示に係る操作を禁止する制御、又は当該操作に係る表示を変更する制御を行ってもよい。制御部13は、制御対象となった別のユーザからの操作を記憶しておき、制御が終了した後に記憶した操作を実行させてもよい。 The control unit 13 is a functional unit that controls operations related to display on the display 10 by another user according to the state of the user detected by the detection unit 12 . The control unit 13 may determine whether or not to perform control according to the focal length detected by the detection unit 12 . The control unit 13 may set an object to be controlled according to the line-of-sight direction detected by the detection unit 12 . The control unit 13 may perform control to prohibit another user from performing an operation related to the display of the display 10, or control to change the display related to the operation. The control unit 13 may store an operation from another user to be controlled, and execute the stored operation after the control ends.
The control by the control unit 13 is intended, for example, to prevent operations from another user from interfering with the user wearing the display 10. Since the display 10 is transmissive, the user wearing it can also do things other than look at the display 10. For example, while the display 10 continues to show content, the user can look at a bus timetable or walk.
For example, suppose that, as shown in FIG. 2(a), the user wearing the display 10 directs his or her line of sight toward the center of the display screen 100, where no content is displayed, and is looking at a real-world bus timetable rather than at anything shown on the display 10. That is, the user wearing the display 10 is looking at the bus timetable while sharing content with another user. If, as shown in FIG. 2(b), an operation by the other user then moves the content to the center of the display screen 100, i.e., right in front of the user wearing the display 10, the user's view is blocked and the action of reading the bus timetable is disturbed. Likewise, if the content moves in this way while the user is walking, the user's view is blocked and the user can no longer see the surroundings, which is dangerous. Thus, when the user is performing a real-world action other than looking at the display 10, operations from another user can interfere with that action. The control by the control unit 13 is intended to prevent operations from another user from interfering with the user's action. However, the control by the control unit 13 does not necessarily have to be performed for this purpose and may be performed for other purposes. For example, the control unit 13 performs control as follows.
The control unit 13 receives information indicating the user's state from the detection unit 12. Based on pre-stored criteria, the control unit 13 determines from the information received from the detection unit 12 whether or not to control operations relating to the display on the display 10 from another user. For example, when the information received from the detection unit 12 indicates a focal length, the control unit 13 makes the above determination according to whether or not the focal length falls within a preset range. This range is, for example, a range within which it can be determined that the focus of the user wearing the display 10 is on the display screen 100, i.e., that the user is looking at the display screen 100 of the display 10.
When the focal length does not fall within this range, the control unit 13 determines that the user is not looking at the display screen 100 of the display 10 but is, for example, looking at (gazing at) something in the real world, and therefore determines that the control over operations from another user is to be performed. When the focal length falls within this range, the control unit 13 determines that the user is looking at the display screen 100 of the display 10 and therefore determines that the control over operations from another user is not to be performed. This is because, as long as the user is looking at the display screen 100 of the display 10, operations from another user are considered not to interfere with the user.
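In code form, the decision described in the two preceding paragraphs reduces to a simple range test. The numeric bounds below are placeholders for the "preset range" of the specification and are assumptions made only for this sketch:

```python
# Assumed range in which the wearer is considered focused on the display screen 100
SCREEN_FOCUS_RANGE_MM = (250.0, 450.0)

def should_restrict_remote_operations(focal_length_mm):
    """Return True when the wearer is NOT looking at the display screen,
    i.e. operations from another user should be controlled (restricted)."""
    low, high = SCREEN_FOCUS_RANGE_MM
    looking_at_screen = low <= focal_length_mm <= high
    return not looking_at_screen
```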
Alternatively, when the information received from the detection unit 12 indicates that the user is walking, the control unit 13 determines that the control over operations from another user is to be performed; when the information received from the detection unit 12 indicates that the user is not walking, the control unit 13 determines that the control over operations from another user is not to be performed.
When the control unit 13 determines that the control over operations from another user is to be performed and the display unit 11 is not under that control at that point, the control unit 13 performs the control on the display unit 11. The control is, for example, control that prohibits operations relating to the display on the display 10 from another user. Specifically, the control prohibits operations from another user that move, enlarge, or otherwise manipulate the display information shown on the display 10, i.e., it temporarily freezes the display information. The control may also prohibit only part of an operation. For example, for an operation that moves display information, the control may move the display information only slightly instead of moving it fully as operated.
Under the control from the control unit 13, the display unit 11 performs processing according to the control. When the display unit 11 receives an operation relating to the display on the display 10 while under the control of the control unit 13, it identifies whether or not the operation is from another user. If it identifies the operation as being from another user, the display unit 11 prohibits the operation. If it identifies the operation as not being from another user, the display unit 11 executes the operation without prohibiting it.
The display unit 11 outputs information indicating the operation subjected to the control to the control unit 13. The control unit 13 receives and stores (accumulates) the information indicating the controlled operation from the display unit 11.
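A minimal sketch of this gating behaviour is shown below. In the disclosure the display unit 11 reports the controlled operation to the control unit 13, which stores it; for brevity the sketch keeps the deferred operations in the same object. The class name, method names, and the print placeholder are assumptions.

```python
class DisplayUnit:
    """Sketch of how the display unit (11) might gate display operations
    while restriction ordered by the control unit (13) is in effect."""

    def __init__(self):
        self.restricted = False
        self.deferred_ops = []        # operations withheld during restriction

    def set_restricted(self, restricted):
        self.restricted = restricted

    def handle_operation(self, op, from_remote_user):
        # Operations by the wearer are always executed; only remote ones are gated.
        if self.restricted and from_remote_user:
            self.deferred_ops.append(op)   # remember it so it can run later
            return False                   # prohibited for now
        self.apply(op)
        return True

    def apply(self, op):
        print(f"applying operation: {op}")  # placeholder for actual rendering
```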
The control does not necessarily have to prohibit the operation; it may instead change the display relating to the operation. For example, control may be performed that hides the display information targeted by the move, makes it translucent, or displays only its frame. Alternatively, control may be performed that displays the display information targeted by the move at a preset position on the display 10, for example, at a corner of the field of view. Controls other than those described above may also be performed, as long as they serve the purpose described above.
When the control unit 13 determines that the control over operations from another user is not to be performed and the display unit 11 is under that control at that point, the control unit 13 releases the control on the display unit 11. When the control is released and ends, the control unit 13 causes the display unit 11 to execute the controlled operations stored up to that point. As a result, for example, a move of display information whose execution had been prohibited is carried out. Once the controlled operation has been executed, the control unit 13 deletes the stored information indicating that operation.
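Continuing the same sketch, releasing the control and replaying the stored operations could look as follows; release_restriction is an assumed helper operating on the DisplayUnit sketch above, not a function named in the disclosure.

```python
def release_restriction(display_unit):
    """Lift the restriction and execute the withheld remote operations in
    the order in which they arrived."""
    display_unit.set_restricted(False)
    pending, display_unit.deferred_ops = display_unit.deferred_ops, []
    for op in pending:
        display_unit.apply(op)
```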
The control unit 13 also sets the target of the control from the information received from the detection unit 12, based on pre-stored criteria. For example, when the information received from the detection unit 12 indicates the direction of the line of sight, the control unit 13 sets the target of the control from the direction of the line of sight. As the target of the control, the control unit 13 sets a region on the display screen 100 based on the position of the line of sight on the display screen 100, which corresponds to the direction of the line of sight. For example, when an operation relating to the display from another user moves display information already displayed outside the set region into the set region, or newly displays display information in the set region, the control over that operation is performed. Alternatively, when an operation relating to the display from another user is an operation on display information already displayed in the set region, such as moving that display information, the control over that operation is performed.
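As an illustration of such region-based control, a protected region could be derived from the gaze position and each remote operation tested against it. The rectangular shape of the region, the margin value, and the operation structure (a dict with a "target_position" entry) are assumptions made only for this sketch:

```python
def protected_region(gaze_xy, margin=100.0):
    """Rectangle around the gaze point on the screen (same units as the
    screen coordinates); margin is an illustrative value."""
    x, y = gaze_xy
    return (x - margin, y - margin, x + margin, y + margin)

def operation_targets_region(op, region):
    """True if the remote operation would move content into, create content
    in, or act on content inside the protected region."""
    x0, y0, x1, y1 = region
    tx, ty = op["target_position"]
    return x0 <= tx <= x1 and y0 <= ty <= y1
```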
The control unit 13 notifies the display unit 11 of the set region and performs the control over display-related operations from another user according to the region. When the display unit 11 receives the control from the control unit 13, it identifies whether or not an operation is from another user. If it identifies the operation as being from another user, and the display-related operation would move display information already shown on the display screen 100 into the set region or newly display information in the set region, the display unit 11 prohibits the operation. Likewise, if the display-related operation targets display information already displayed in the set region, the display unit 11 prohibits the operation. The control over operations from another user may otherwise be the same as described above. In this case as well, information indicating the operations subjected to the control by the control unit 13 may be stored (accumulated), and those operations may be executed after the control ends. The above are the functions of the display 10 according to the present embodiment.
Next, the processing executed by the display 10 according to the present embodiment (the operation method performed by the display 10) will be described with reference to the flowchart of FIG. 3. This processing is performed when the user wears the display 10 and the display unit 11 is performing display on the display 10. In addition, the display is shared with another display, and operations relating to the display on the display 10 can be performed by another user.
In this processing, the detection unit 12 detects the state of the user wearing the display 10 (S01). Subsequently, the control unit 13 determines from the user's state whether or not to control operations relating to the display on the display 10 from another user (S02). If it is determined that the control is to be performed and the display unit 11 is not under the control at that point (YES in S02), the control unit 13 performs the control on the display unit 11 over operations from another user (S03). The control is, for example, control that prohibits the operations. If the display unit 11 receives an operation from another user while the control is in effect, processing according to the control is performed; for example, the operation from the other user is prohibited. The controlled operation is also stored by the control unit 13.
If it is determined in S02 that the control is not to be performed and the display unit 11 is under the control at that point (NO in S02), the control from the control unit 13 on the display unit 11 over operations from another user is ended (S04). If controlled operations have been stored by the control unit 13, the display unit 11 executes those operations after the control ends (S05). The above control may also be performed with the target of the control set according to the user's state. The above processing (S01 to S05) is repeated while the display unit 11 is performing display. This concludes the processing executed by the display 10 according to the present embodiment.
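Put together, the repeated S01–S05 cycle can be summarised as the following loop. The detector and control_unit objects, their method names, the is_displaying() check, and the polling period are all assumptions; release_restriction is the helper sketched earlier.

```python
import time

def run_display_control_loop(detector, control_unit, display_unit, period_s=0.1):
    """Skeleton of the repeated processing of FIG. 3 (S01-S05)."""
    while display_unit.is_displaying():                    # assumed method
        state = detector.detect_user_state()               # S01
        if control_unit.should_restrict(state):            # S02: YES
            if not display_unit.restricted:
                display_unit.set_restricted(True)          # S03
        else:                                              # S02: NO
            if display_unit.restricted:
                release_restriction(display_unit)          # S04 + S05
        time.sleep(period_s)
```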
In the present embodiment, operations relating to the display on the display 10 from another user are controlled according to the user's state. For example, the display can be controlled according to the user's state so that display on the display 10 caused by another user's operation does not interfere with the user. Therefore, according to the present embodiment, display on the display 10 can be performed appropriately according to the user's state.
As in the present embodiment, the line of sight of the user wearing the display 10 may be detected as the user's state. More specifically, the focal length of the user's line of sight may be detected, and whether or not to perform the control may be determined according to the focal length. With this configuration, the control can be performed, for example, when the user is not looking at the display screen 100 of the display 10, which prevents the display on the display 10 from interfering with the user or creating danger for the user.
Alternatively, the direction of the user's line of sight may be detected, and the target of the control may be set according to the direction of the line of sight. With this configuration, for example, the control can be applied only to the targets required by the direction of the user's line of sight, and display on the display 10 can be performed more appropriately.
Also, as in the present embodiment, the movement state of the user wearing the display 10, more specifically whether or not the user is walking, may be detected as the user's state. With this configuration, the control can be performed, for example, while the user is walking, which prevents the display on the display 10 from creating danger while the user is walking. The state of the user wearing the display 10 is not limited to the above; other states may be detected and used as long as they are useful for controlling the display.
Also, as in the present embodiment, the control over operations from another user may be control that prohibits the operation or control that changes the display relating to the operation. With this configuration, display on the display can be performed appropriately and reliably according to the user's state. However, other controls may be performed as long as they achieve appropriate display on the display.
Also, as in the present embodiment, the controlled operations from another user may be stored and the stored operations may be executed after the control ends. With this configuration, for example, the display on the display 10 can be returned to a state shared with the other display after the control ends, which improves convenience for the user. However, the stored operations do not necessarily have to be executed after the control ends.
As the state of the user wearing the display 10, the detection unit 12 may detect, in addition to or instead of the information described above, a state relating to the user's heartbeat or perspiration. Specifically, the detection unit 12 may detect the user's heart rate, amount of perspiration, or both. In this case, a sensor that detects heart rate or a sensor that detects the amount of perspiration is attached to the user, and the display 10 acquires information indicating the user's heart rate or amount of perspiration from that sensor. Conventional sensors can be used. In this case as well, the control unit 13 determines, based on pre-stored criteria as described above, whether or not to control operations relating to the display on the display 10 from another user from the information received from the detection unit 12. For example, the control unit 13 determines that the control over operations from another user is to be performed when the heart rate or amount of perspiration indicated by the information received from the detection unit 12 is equal to or greater than a preset threshold, or equal to or less than a preset threshold.
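The threshold test itself is simple; the sketch below uses assumed limits in place of the "preset thresholds" of the specification and, for illustration, treats a reading at or above either limit as the trigger (the disclosure also allows the opposite direction, "equal to or less than"):

```python
# Assumed limits standing in for the preset thresholds of the specification
HEART_RATE_LIMIT_BPM = 150
PERSPIRATION_LIMIT = 0.8   # normalised sweating level, illustrative scale

def should_restrict_by_vitals(heart_rate_bpm=None, perspiration=None):
    """Restrict operations from another user when either vital sign crosses
    its threshold; None means the corresponding sensor is absent."""
    if heart_rate_bpm is not None and heart_rate_bpm >= HEART_RATE_LIMIT_BPM:
        return True
    if perspiration is not None and perspiration >= PERSPIRATION_LIMIT:
        return True
    return False
```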
With the above configuration, for example, in a scene where an athlete in the middle of exercise receives instructions from a coach on the worn display 10, the display method can be changed according to the athlete's heart rate or amount of perspiration. This keeps the coach's instruction message from obstructing the field of view, for example during the final dash just before the finish line of a sprint. Alternatively, when a worker wearing the display 10 is doing dangerous work at a height, a remote operator's instructions can be displayed only when an abnormality in the worker's condition is sensed from the worker's heart rate.
In the embodiment described above, the display control device has been described as the display 10, which has a display function, but the display control device does not necessarily have to have a display function. The display control device may be any device (system) that is connected to a transmissive display worn over the user's eyes and performing display according to operations from another user (i.e., a display including the display unit 11), that controls the display on that display, and that includes the detection unit 12 and the control unit 13 described above.
Note that the block diagram used in the description of the above embodiment shows blocks in units of functions. These functional blocks (components) are realized by any combination of at least one of hardware and software. The method of realizing each functional block is not particularly limited. That is, each functional block may be realized using one physically or logically combined device, or may be realized using two or more physically or logically separate devices that are connected directly or indirectly (for example, by wire or wirelessly). A functional block may be realized by combining the one device or the plurality of devices with software.
Functions include, but are not limited to, judging, determining, deciding, calculating, computing, processing, deriving, investigating, looking up (searching, inquiring), ascertaining, receiving, transmitting, outputting, accessing, resolving, selecting, choosing, establishing, comparing, assuming, expecting, considering, broadcasting, notifying, communicating, forwarding, configuring, reconfiguring, allocating (mapping), and assigning. For example, a functional block (component) that performs transmission is called a transmitting unit or a transmitter. In any case, as described above, the method of realization is not particularly limited.
For example, the display 10 in one embodiment of the present disclosure may function as a computer that performs the information processing of the present disclosure. FIG. 4 is a diagram showing an example of the hardware configuration of the display 10 according to one embodiment of the present disclosure. The display 10 described above may be physically configured as a computer device including a processor 1001, a memory 1002, a storage 1003, a communication device 1004, an input device 1005, an output device 1006, a bus 1007, and the like.
In the following description, the term "apparatus" can be read as a circuit, a device, a unit, or the like. The hardware configuration of the display 10 may include one or more of each of the devices shown in the figure, or may be configured without some of the devices.
Each function of the display 10 is realized by loading predetermined software (a program) onto hardware such as the processor 1001 and the memory 1002, whereby the processor 1001 performs computation and controls communication by the communication device 1004 and/or reading and writing of data in the memory 1002 and the storage 1003.
The processor 1001, for example, runs an operating system to control the entire computer. The processor 1001 may be configured as a central processing unit (CPU) including an interface with peripheral devices, a control device, an arithmetic device, registers, and the like. For example, each function of the display 10 described above may be realized by the processor 1001.
The processor 1001 also reads programs (program code), software modules, data, and the like from at least one of the storage 1003 and the communication device 1004 into the memory 1002 and executes various kinds of processing according to them. As the program, a program that causes a computer to execute at least some of the operations described in the above embodiment is used. For example, each function of the display 10 may be realized by a control program stored in the memory 1002 and operating on the processor 1001. Although the various kinds of processing described above have been explained as being executed by a single processor 1001, they may be executed simultaneously or sequentially by two or more processors 1001. The processor 1001 may be implemented with one or more chips. The program may be transmitted from a network via a telecommunication line.
The memory 1002 is a computer-readable recording medium and may be configured, for example, from at least one of ROM (Read Only Memory), EPROM (Erasable Programmable ROM), EEPROM (Electrically Erasable Programmable ROM), RAM (Random Access Memory), and the like. The memory 1002 may be called a register, a cache, a main memory (main storage device), or the like. The memory 1002 can store executable programs (program code), software modules, and the like for carrying out the information processing according to one embodiment of the present disclosure.
The storage 1003 is a computer-readable recording medium and may be configured, for example, from at least one of an optical disc such as a CD-ROM (Compact Disc ROM), a hard disk drive, a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory (for example, a card, a stick, or a key drive), a floppy (registered trademark) disk, a magnetic strip, and the like. The storage 1003 may be called an auxiliary storage device. The storage medium included in the display 10 may be, for example, a database, a server, or another appropriate medium including at least one of the memory 1002 and the storage 1003.
The communication device 1004 is hardware (a transmitting/receiving device) for communication between computers via at least one of a wired network and a wireless network, and is also called, for example, a network device, a network controller, a network card, or a communication module.
The input device 1005 is an input device that accepts input from the outside (for example, a keyboard, a mouse, a microphone, a switch, a button, or a sensor). The output device 1006 is an output device that performs output to the outside (for example, a display, a speaker, or an LED lamp). The input device 1005 and the output device 1006 may be integrated (for example, a touch panel).
The devices such as the processor 1001 and the memory 1002 are connected by a bus 1007 for communicating information. The bus 1007 may be configured using a single bus, or may be configured using different buses between different pairs of devices.
The display 10 may also include hardware such as a microprocessor, a digital signal processor (DSP), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array), and some or all of the functional blocks may be realized by that hardware. For example, the processor 1001 may be implemented using at least one of these pieces of hardware.
The order of the processing procedures, sequences, flowcharts, and the like of the aspects/embodiments described in the present disclosure may be changed as long as no contradiction arises. For example, the methods described in the present disclosure present the elements of the various steps in an exemplary order and are not limited to the specific order presented.
Input and output information and the like may be stored in a specific location (for example, a memory) or may be managed using a management table. Input and output information and the like can be overwritten, updated, or appended. Output information and the like may be deleted. Input information and the like may be transmitted to another device.
A determination may be made by a value expressed by one bit (0 or 1), by a Boolean value (true or false), or by a comparison of numerical values (for example, a comparison with a predetermined value).
The aspects/embodiments described in the present disclosure may be used alone, may be used in combination, or may be switched as they are carried out. Notification of predetermined information (for example, notification of "being X") is not limited to being performed explicitly, and may be performed implicitly (for example, by not notifying the predetermined information).
Although the present disclosure has been described in detail above, it is obvious to those skilled in the art that the present disclosure is not limited to the embodiments described herein. The present disclosure can be carried out as modified and changed aspects without departing from the spirit and scope of the present disclosure defined by the description of the claims. Accordingly, the description of the present disclosure is for the purpose of illustrative explanation and has no restrictive meaning with respect to the present disclosure.
Software, whether referred to as software, firmware, middleware, microcode, hardware description language, or by another name, should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, threads of execution, procedures, functions, and the like.
Software, instructions, information, and the like may also be transmitted and received via a transmission medium. For example, when software is transmitted from a website, a server, or another remote source using at least one of wired technologies (coaxial cable, optical fiber cable, twisted pair, digital subscriber line (DSL), and the like) and wireless technologies (infrared, microwave, and the like), at least one of these wired and wireless technologies is included within the definition of a transmission medium.
The terms "system" and "network" used in the present disclosure are used interchangeably.
The information, parameters, and the like described in the present disclosure may be expressed using absolute values, may be expressed using relative values from predetermined values, or may be expressed using other corresponding information.
The terms "determining" and "deciding" used in the present disclosure may encompass a wide variety of actions. "Determining" and "deciding" can include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up (searching, inquiring, for example looking up in a table, a database, or another data structure), or ascertaining as having "determined" or "decided". "Determining" and "deciding" can also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in a memory) as having "determined" or "decided". "Determining" and "deciding" can also include regarding resolving, selecting, choosing, establishing, comparing, and the like as having "determined" or "decided". In other words, "determining" and "deciding" can include regarding some action as having been "determined" or "decided". "Determining (deciding)" may also be read as "assuming", "expecting", "considering", or the like.
The terms "connected" and "coupled", and all variations thereof, mean any direct or indirect connection or coupling between two or more elements, and can include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination of these. For example, "connection" may be read as "access". As used in the present disclosure, two elements can be considered to be "connected" or "coupled" to each other using at least one of one or more electric wires, cables, and printed electrical connections, as well as, as some non-limiting and non-exhaustive examples, electromagnetic energy having wavelengths in the radio frequency region, the microwave region, and the light (both visible and invisible) region.
The phrase "based on" used in the present disclosure does not mean "based only on" unless specified otherwise. In other words, the phrase "based on" means both "based only on" and "based at least on".
Any reference to elements using designations such as "first" and "second" used in the present disclosure does not generally limit the quantity or order of those elements. These designations may be used in the present disclosure as a convenient way of distinguishing between two or more elements. Therefore, references to first and second elements do not mean that only two elements can be employed or that the first element must precede the second element in some way.
Where "include", "including", and variations thereof are used in the present disclosure, these terms are intended to be inclusive, like the term "comprising". Furthermore, the term "or" used in the present disclosure is intended not to be an exclusive OR.
In the present disclosure, where articles are added by translation, as with "a", "an", and "the" in English, the present disclosure may include the case where the nouns following these articles are plural.
In the present disclosure, the phrase "A and B are different" may mean "A and B are different from each other". The phrase may also mean "A and B are each different from C". Terms such as "separated" and "coupled" may also be interpreted in the same way as "different".
10: display; 11: display unit; 12: detection unit; 13: control unit; 1001: processor; 1002: memory; 1003: storage; 1004: communication device; 1005: input device; 1006: output device; 1007: bus.

Claims (8)

  1.  A display control device that controls display on a transmissive display that is worn over a user's eyes and performs display according to an operation from another user, the display control device comprising:
     a detection unit that detects a state of the user wearing the display; and
     a control unit that controls an operation relating to display on the display from another user, according to the state of the user detected by the detection unit.
  2.  The display control device according to claim 1, wherein the detection unit detects the user's line of sight as the state of the user wearing the display.
  3.  The display control device according to claim 2, wherein the detection unit detects a focal length of the user's line of sight as the state of the user wearing the display, and the control unit determines whether or not to perform the control according to the focal length detected by the detection unit.
  4.  The display control device according to claim 2 or 3, wherein the detection unit detects a direction of the user's line of sight as the state of the user wearing the display, and the control unit sets a target of the control according to the direction of the line of sight detected by the detection unit.
  5.  The display control device according to any one of claims 1 to 4, wherein the detection unit detects a movement state of the user as the state of the user wearing the display.
  6.  The display control device according to any one of claims 1 to 5, wherein the control unit performs control that prohibits an operation relating to display on the display from another user, or control that changes display relating to the operation.
  7.  The display control device according to any one of claims 1 to 6, wherein the control unit stores a controlled operation from another user and causes the stored operation to be executed after the control ends.
  8.  The display control device according to any one of claims 1 to 7, wherein the detection unit detects a state relating to the user's heartbeat or perspiration as the state of the user wearing the display.
PCT/JP2022/005193 2021-03-24 2022-02-09 Display control device WO2022201936A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021050062 2021-03-24
JP2021-050062 2021-03-24

Publications (1)

Publication Number Publication Date
WO2022201936A1 true WO2022201936A1 (en) 2022-09-29


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010061402A (en) * 2008-09-03 2010-03-18 Olympus Corp Information presentation system, program, and information storage medium
JP2016004402A (en) * 2014-06-17 2016-01-12 コニカミノルタ株式会社 Information display system having transmission type hmd and display control program
JP2016206469A (en) * 2015-04-24 2016-12-08 マツダ株式会社 Voice interaction system for vehicle
JP2016209233A (en) * 2015-05-07 2016-12-15 セイコーエプソン株式会社 Biological information processing system, server system, biological information processor and biological information processing method
JP2017049781A (en) * 2015-09-01 2017-03-09 株式会社東芝 Glasses-type wearable device, control method thereof, and information management server
JP2018124826A (en) * 2017-02-01 2018-08-09 株式会社コロプラ Information processing method, apparatus, and program for implementing that information processing method in computer
JP2020149399A (en) * 2019-03-14 2020-09-17 株式会社シーエスレポーターズ Method for providing virtual reality space

Legal Events

121 — EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22774737; Country of ref document: EP; Kind code of ref document: A1)
NENP — Non-entry into the national phase (Ref country code: DE)
122 — EP: PCT application non-entry in European phase (Ref document number: 22774737; Country of ref document: EP; Kind code of ref document: A1)
Kind code of ref document: A1