WO2023218751A1 - Display control device - Google Patents

Display control device

Info

Publication number
WO2023218751A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
display
virtual object
display device
walking speed
Prior art date
Application number
PCT/JP2023/009936
Other languages
French (fr)
Japanese (ja)
Inventor
怜央 水田
康夫 森永
達哉 西崎
充宏 後藤
有希 中村
Original Assignee
株式会社NTTドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社NTTドコモ
Publication of WO2023218751A1 publication Critical patent/WO2023218751A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to a display control device that controls the display of virtual objects.
  • Transmissive display devices are generally known that display virtual objects superimposed on real space, the virtual objects representing additional information, such as explanatory text, about objects that exist in the real-world view (hereinafter, real objects).
  • Transmissive display devices include HMD (Head Mounted Display) devices, AR (Augmented Reality) glasses, and MR (Mixed Reality) glasses, all of which display virtual objects overlaid on real space without blocking the user's field of view.
  • One way to use a transmissive display device is for a user to walk while looking at a virtual object displayed on the transmissive display device. However, in this case, too much of the user's attention (line of sight) is directed toward the virtual object, which is not preferable from the viewpoint of ensuring safety. Therefore, a technique has been proposed that limits the information displayed on a transmissive display device based on the level of movement of the user wearing the transmissive display device (for example, Patent Document 1: Japanese Patent Application Publication No. 2019-153347).
  • In the technique disclosed in Patent Document 1, the displayed information is limited according to the movement level (specifically, the moving speed) of the user of the transmissive display device, regardless of whether the user's attention is directed at the information displayed on the transmissive display device. The technique disclosed in Patent Document 1 therefore has a problem in that unnecessary restrictions may be placed on the display of information on the transmissive display device.
  • a virtual object display control device includes a calculation unit and a display control unit.
  • the calculation unit calculates the user's concentration level with respect to the virtual object displayed on the transparent display device attached to the user's head based on the user's walking speed.
  • the display control unit controls display of the virtual object on the transparent display device based on the degree of concentration.
  • the display of the virtual object is controlled based on the user's degree of concentration on the virtual object displayed on the display device, so unnecessary display restrictions can be avoided while the user is prevented from concentrating excessively on the virtual object.
  • FIG. 1 is a block diagram illustrating a configuration example of a display system 1 including a mobile device 10 according to an embodiment of the display control device of the present disclosure.
  • FIG. 2 is a diagram showing an example of an image of real space corresponding to the field of view of the user U.
  • FIG. 3 is a diagram illustrating an example of an image in which virtual objects are superimposed on real space.
  • FIG. 4 is a block diagram showing a configuration example of the mobile device 10.
  • FIG. 5 is a flowchart showing the flow of the display control method executed by the processing device 18 of the mobile device 10 according to the program PR1.
  • FIG. 6 is a block diagram showing a configuration example of the glasses-type display device 20.
  • FIG. 1 is a block diagram illustrating a configuration example of a display system 1 including a mobile device 10 according to an embodiment of the display control device of the present disclosure. As shown in FIG. 1, the display system 1 includes a glasses-type display device 20 in addition to the mobile device 10.
  • the glasses-type display device 20 is an example of a transmissive display device worn on the head of the user U.
  • the glasses-type display device 20 displays virtual objects that do not exist in real space without blocking the field of view of the user U wearing the glasses-type display device 20.
  • the glasses-type display device 20 includes an imaging device (camera).
  • the eyeglass-type display device 20 worn by the user U captures an image of the user U's viewing range (the viewing range of the eyeglass-type display device 20) using an imaging device.
  • the mobile device 10 is, for example, a smartphone.
  • the mobile device 10 is not limited to a smartphone, and may be, for example, a tablet or a notebook personal computer.
  • the mobile device 10 is worn on the user U's body together with the glasses-type display device 20.
  • the mobile device 10 is attached to the body of the user U by hanging from the neck using a strap or the like.
  • the mobile device 10 is connected to the eyeglass-type display device 20 by wire.
  • the mobile device 10 may be connected to the eyeglass-type display device 20 wirelessly.
  • the mobile device 10 acquires image data representing an image captured by the glasses-type display device 20 from the glasses-type display device 20.
  • the mobile device 10 communicates with the management device 30 via the communication line NW.
  • the mobile device 10 transmits the image data acquired from the glasses-type display device 20 to the management device 30.
  • the management device 30 is a server device that provides a self-location recognition service and a content management service in AR.
  • the self-location recognition service refers to a service that specifies the position of the eyeglass-type display device 20 in the global coordinate system based on an image captured by the imaging device of the eyeglass-type display device 20.
  • Specific methods for realizing the self-location recognition service include a method using an AR tag or a method using a distribution of feature points extracted from an image, such as SLAM (Simultaneous Localization and Mapping).
  • the content management service refers to a service that distributes information regarding virtual objects to the glasses-type display device 20.
  • the virtual object corresponds to a real object visible from the position of the glasses-type display device 20 in the global coordinate system.
  • each virtual object is associated with a position in the global coordinate system and corresponds to a real object that can be seen from the position.
  • the management device 30 stores in advance virtual object information and area information corresponding to each virtual object.
  • the virtual object information represents an image of the corresponding virtual object.
  • the area information indicates the position and size of the display area that displays the virtual object.
  • the management device 30 receives image data captured by the glasses-type display device 20 from the mobile device 10 via the communication line NW, and specifies the position of the glasses-type display device 20 based on the received image data. Then, the management device 30 transmits virtual object information corresponding to the specified position and area information corresponding to the specified position to the mobile device 10.
  • the mobile device 10 causes the glasses-type display device 20 to display an image of the virtual object according to the virtual object information and area information received from the management device 30. As a result, the virtual object appears superimposed on the real space in the eyes of the user U.
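  • as a concrete illustration of the exchange described above, the following Python sketch shows one way the mobile device 10 might send a captured frame to the management device 30 and receive virtual object information and area information in return. This is a minimal sketch under stated assumptions: the patent does not specify a transport protocol, and the endpoint URL and response layout here are hypothetical.

```python
import requests  # assumed transport; the patent does not name a protocol

MANAGEMENT_URL = "https://management.example.com/ar"  # hypothetical endpoint


def localize_and_fetch(image_bytes: bytes):
    """Send a frame captured by the glasses-type display device 20 and
    receive virtual object information and area information for the
    position recognized from that frame."""
    resp = requests.post(f"{MANAGEMENT_URL}/recognize",
                         files={"image": image_bytes})
    resp.raise_for_status()
    payload = resp.json()
    # The payload layout is an assumption; the patent only states that the
    # response carries virtual object information and area information.
    return payload["virtual_objects"], payload["areas"]
```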
  • the real space in this embodiment is a cityscape of a tourist spot.
  • the real object in this embodiment is, for example, a store in the cityscape.
  • FIG. 2 is a diagram illustrating an example of a real space image G1 corresponding to the user's U field of view.
  • the virtual object in this embodiment is an image that includes a character string of explanatory text regarding products etc. handled at the store.
  • FIG. 3 is a diagram illustrating an example of an image G2 that the user U views through the glasses-type display device 20. In the image G2 shown in FIG. 3, a virtual object displayed overlapping the cityscape of a tourist spot is drawn with a dotted line.
  • FIG. 4 is a block diagram showing a configuration example of the mobile device 10.
  • the mobile device 10 includes an input device 11, an output device 12, a communication device 14, a communication device 15, a storage device 17, a processing device 18, and a bus 19.
  • the components of the mobile device 10 are interconnected by a bus 19 for communicating information.
  • the bus 19 may be configured using a single bus, or may be configured using different buses for each device.
  • the input device 11 includes a touch panel.
  • the input device 11 may include a plurality of operation keys in addition to a touch panel.
  • the input device 11 may include a plurality of operation keys without including a touch panel.
  • the input device 11 receives operations performed by the user U.
  • the output device 12 includes a display panel. The touch panel of the input device 11 is stacked on the display panel of the output device 12.
  • the output device 12 displays various information.
  • the communication device 14 is a hardware device for communicating with the management device 30 via the communication line NW.
  • the communication device 14 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 14 transmits the image data given from the processing device 18 to the management device 30. Furthermore, the communication device 14 supplies virtual object information and area information received from the management device 30 to the processing device 18. Note that the communication device 14 may communicate with the management device 30 without using the communication line NW.
  • the communication device 15 is a hardware device for communicating with the eyeglass-type display device 20 by wire.
  • the communication device 15 supplies data received from the glasses-type display device 20 to the processing device 18.
  • the communication device 15 transmits image data provided from the processing device 18 to the glasses-type display device 20.
  • the communication device 15 may communicate with the glasses-type display device 20 wirelessly.
  • the storage device 17 is a recording medium that can be read by the processing device 18.
  • the storage device 17 includes, for example, nonvolatile memory and volatile memory.
  • Nonvolatile memories include, for example, ROM (Read Only Memory), EPROM (Erasable Programmable Read Only Memory), and EEPROM (Electrically Erasable Programmable Read Only Memory).
  • the volatile memory is, for example, RAM (Random Access Memory).
  • the storage device 17 stores in advance a program PR1 that causes the processing device 18 to execute the display control method of the present disclosure.
  • the processing device 18 includes one or more CPUs (Central Processing Units). One or more CPUs are an example of one or more processors. Each of the processor and CPU is an example of a computer.
  • the processing device 18 reads the program PR1 from the storage device 17.
  • the processing device 18 operating according to the program PR1 transmits the image data received from the glasses-type display device 20 using the communication device 15 to the management device 30 using the communication device 14.
  • the processing device 18 operating according to the program PR1 functions as a measurement unit 181, a determination unit 182, a calculation unit 183, and a display control unit 184 shown in FIG. 4. That is, the measurement unit 181, determination unit 182, calculation unit 183, and display control unit 184 in FIG. 4 are software modules realized by the processing device 18 operating according to software.
  • the measurement unit 181 measures the walking speed of the user U wearing the glasses-type display device 20.
  • the glasses-type display device 20 transmits acceleration data to the mobile device 10.
  • the acceleration data represents acceleration in the following three directions. The first is acceleration in the direction along the vertical axis (hereinafter referred to as the Z axis). The second is the acceleration along the axis (hereinafter referred to as the X axis) perpendicular to the vertical axis. The third is acceleration in a direction along an axis (hereinafter referred to as Y-axis) that is perpendicular to the vertical axis and perpendicular to the X-axis.
  • the acceleration data transmitted from the glasses-type display device 20 to the mobile device 10 is received by the communication device 15.
  • the measurement unit 181 measures the walking speed of the user U based on the acceleration data received by the communication device 15.
  • the measurement unit 181 first identifies the direction of gravitational acceleration (that is, the Z-axis direction) based on the acceleration data received by the communication device 15. The measurement unit 181 determines that the user U is walking when the acceleration in the Z-axis direction changes within a predetermined cycle. When it is determined that the user U is walking, the measurement unit 181 calculates the absolute value of the composite acceleration obtained by combining the acceleration in the X-axis direction and the acceleration in the Y-axis direction based on the acceleration data. The measurement unit 181 then calculates the user's walking speed by integrating the absolute value of the composite acceleration.
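  • as an illustration of the measurement described above, the following Python sketch detects walking from the periodicity of the Z-axis (gravity-direction) acceleration and then integrates the absolute value of the composite X/Y acceleration, as in the embodiment. It is a sketch only: the sampling rate, the step-period band, and all names are assumptions, and a practical implementation would need drift correction.

```python
import math

SAMPLE_RATE_HZ = 50          # assumed accelerometer sampling rate
STEP_PERIOD_S = (0.4, 1.0)   # assumed plausible walking-step period band


def is_walking(z_samples):
    """Heuristic walking detector: the Z-axis acceleration must oscillate
    with a roughly constant period inside the walking band."""
    mean_z = sum(z_samples) / len(z_samples)
    # Indices where the detrended Z signal crosses its mean going upward.
    crossings = [i for i in range(1, len(z_samples))
                 if z_samples[i - 1] < mean_z <= z_samples[i]]
    if len(crossings) < 2:
        return False
    periods = [(b - a) / SAMPLE_RATE_HZ
               for a, b in zip(crossings, crossings[1:])]
    avg_period = sum(periods) / len(periods)
    return STEP_PERIOD_S[0] <= avg_period <= STEP_PERIOD_S[1]


def walking_speed(x_samples, y_samples, z_samples):
    """Combine the X and Y accelerations, take the absolute value of the
    composite acceleration, and integrate it over time, as described in
    the embodiment. Returns 0.0 when the user is not walking."""
    if not is_walking(z_samples):
        return 0.0
    dt = 1.0 / SAMPLE_RATE_HZ
    return sum(math.hypot(ax, ay) * dt
               for ax, ay in zip(x_samples, y_samples))
```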
  • the determining unit 182 determines whether the walking speed measured by the measuring unit 181 is less than a first threshold (walking speed ⁇ first threshold).
  • the first threshold in this embodiment is, for example, an average value of walking speeds measured when a plurality of sample users walk while looking at a virtual object.
  • the first threshold value is not limited to the average walking speed of a plurality of sample users.
  • the first threshold value may be an actual value of the walking speed of the user U.
  • for example, a tutorial on the glasses-type display device 20 may be used to measure the walking speed of the user U. During execution of the tutorial, the actual walking speed measured while the user U walks while looking at a virtual object displayed on the glasses-type display device 20 is set as the first threshold.
  • the calculation unit 183 calculates the degree of concentration of the user U based on the walking speed of the user U.
  • the user's concentration level refers to the degree of user U's attention to the virtual object displayed on the glasses-type display device 20 (that is, the degree to which the user is concentrating on the virtual object).
  • the degree of concentration of the user U in this embodiment is, for example, a positive value, and the larger the value, the higher the degree of concentration of the user U is.
  • the reason why the degree of concentration of the user U with respect to the virtual object can be calculated based on the walking speed of the user U is as follows.
  • when the user U is hurrying toward a destination without paying attention to the virtual object displayed on the glasses-type display device 20, the user U tends to walk briskly, and the walking speed frequently exceeds the first threshold. On the other hand, when the user U is walking while paying attention to the virtual object, the user walks somewhat slowly, and the walking speed often falls below the first threshold. Since the degree of concentration of the user U on the virtual object displayed on the glasses-type display device 20 thus correlates with the walking speed, the degree of concentration of the user U can be calculated based on the walking speed of the user U.
  • the calculation unit 183 calculates the degree of concentration of the user U based on the difference between the walking speed measured by the measurement unit 181 and the first threshold. In this embodiment, the calculation unit 183 calculates a larger degree of concentration as this difference becomes larger. That is, the first concentration degree calculated when the difference between the measured walking speed and the first threshold is a first value is larger than the second concentration degree calculated when the difference is a second value smaller than the first value.
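  • a minimal sketch of this calculation, assuming a linear mapping (the embodiment only requires that a larger difference yield a larger degree of concentration; the linear form and the gain are assumptions):

```python
def concentration_degree(walking_speed: float,
                         first_threshold: float,
                         gain: float = 1.0) -> float:
    """Return a positive concentration degree that grows as the walking
    speed falls further below the first threshold."""
    diff = first_threshold - walking_speed
    return max(0.0, gain * diff)  # larger difference -> larger degree
```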
  • the display control unit 184 controls the display of the virtual object on the glasses-type display device 20.
  • the display control unit 184 performs different processing depending on whether the determination result by the determination unit 182 is positive or negative.
  • if the determination result by the determination unit 182 is negative, the display control unit 184 generates image data representing a superimposed image.
  • a superimposed image is an image in which an image indicated by virtual object information is arranged in an area indicated by area information.
  • the virtual object information and area information are received by the mobile device 10 from the management device 30. Then, the display control unit 184 transmits the image data to the glasses-type display device 20 and causes the glasses-type display device 20 to display the superimposed image.
  • the display control unit 184 controls the display of the virtual object based on the degree of concentration of the user U calculated by the calculation unit 183. More specifically, when the determination result by the determination unit 182 is affirmative, the display control unit 184 determines whether the concentration level of the user U calculated by the calculation unit 183 exceeds the second threshold (the user's concentration > second threshold). When the degree of concentration of the user U calculated by the calculation unit 183 is less than or equal to the second threshold, the display control unit 184 causes the glasses-type display device 20 to display the above-mentioned superimposed image. When the degree of concentration of the user U calculated by the calculation unit 183 is greater than the second threshold, the display control unit 184 stops generating and transmitting image data representing the superimposed image. As a result, no virtual object is displayed on the glasses-type display device 20.
  • the processing device 18 operating according to the program PR1 executes the display method shown in FIG. 5 every time it receives acceleration data from the communication device 15. As shown in FIG. 5, this display method includes each process from step SA110 to step SA150.
  • step SA110 the processing device 18 functions as the measurement unit 181.
  • step SA110 the processing device 18 measures the walking speed of the user U wearing the glasses-type display device 20 based on the acceleration data received by the communication device 15.
  • step SA120 the processing device 18 functions as the determination unit 182.
  • step SA120 the processing device 18 determines whether the walking speed measured in step SA110 is less than a first threshold. If the walking speed measured in step SA110 is equal to or greater than the first threshold, the determination result in step SA120 is "No" and the process in step SA130 is executed. If the walking speed measured in step SA110 is less than the first threshold, the determination result in step SA120 is "Yes” and the processes in step SA140 and step SA150 are executed.
  • step SA130 the processing device 18 functions as the display control unit 184.
  • step SA130 the processing device 18 generates image data representing the superimposed image. Then, the processing device 18 transmits the image data to the glasses-type display device 20 and causes the glasses-type display device 20 to display the superimposed image.
  • step SA140 the processing device 18 functions as the calculation unit 183.
  • step SA140 processing device 18 calculates the degree of concentration of user U based on the walking speed measured in step SA110.
  • step SA150 following step SA140, the processing device 18 functions as the display control unit 184.
  • step SA150 the processing device 18 controls the display of the virtual object on the glasses-type display device 20 based on the degree of concentration of the user U calculated in step SA140.
  • step SA150 the processing device 18 determines whether the degree of concentration of the user U calculated in step SA140 exceeds the second threshold. If the degree of concentration of the user U calculated in step SA140 is equal to or less than the second threshold, the processing device 18 causes the glasses-type display device 20 to display the superimposed image. If the degree of concentration of the user U calculated in step SA140 is greater than the second threshold, the processing device 18 stops displaying the superimposed image.
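  • putting steps SA110 to SA150 together, one pass of the display control method of FIG. 5 might look like the following sketch, which reuses the hypothetical helpers above; show_superimposed_image and hide_superimposed_image are placeholders for the image data generation and transmission performed by the display control unit 184.

```python
def on_acceleration_data(x_samples, y_samples, z_samples,
                         first_threshold, second_threshold):
    """One pass of the display control method (FIG. 5), executed each
    time acceleration data arrives from the glasses-type display device."""
    speed = walking_speed(x_samples, y_samples, z_samples)   # SA110
    if speed >= first_threshold:                             # SA120: "No"
        show_superimposed_image()                            # SA130
        return
    degree = concentration_degree(speed, first_threshold)    # SA140
    if degree <= second_threshold:                           # SA150
        show_superimposed_image()
    else:
        hide_superimposed_image()  # limit the virtual object display
```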
  • FIG. 6 is a block diagram showing a configuration example of the eyeglass-type display device 20.
  • the glasses-type display device 20 includes a display section 2a, a communication device 2b, an imaging device 2c, an acceleration sensor 2g, a storage device 2d, a processing device 2e, and a bus 2f.
  • the components of the glasses-type display device 20 (display section 2a, communication device 2b, imaging device 2c, acceleration sensor 2g, storage device 2d, and processing device 2e) are interconnected by a bus 2f for communicating information.
  • the bus 2f may be configured using a single bus, or may be configured using different buses for each element such as a device.
  • the display section 2a is of a transmissive type. Light in the field of view of the glasses-type display device 20 (display section 2a) passes through the display section 2a.
  • the display unit 2a displays the superimposed image under the control of the mobile device 10.
  • the display section 2a is located in front of the user's U left and right eyes.
  • to the user U wearing the glasses-type display device 20, the image of the virtual object appears superimposed on real space.
  • the display unit 2a includes a left eye lens, a left eye display panel, a left eye optical member, a right eye lens, a right eye display panel, and a right eye optical member.
  • the display panel for the left eye and the display panel for the right eye are, for example, a liquid crystal panel or an organic EL (Electro Luminescence) panel.
  • the display panel for the left eye displays a superimposed image represented by image data provided from the mobile device 10.
  • the left eye optical member is an optical member that guides light emitted from the left eye display panel to the left eye lens.
  • the display panel for the right eye displays a superimposed image represented by image data provided from the mobile device 10.
  • the right eye optical member is an optical member that guides light emitted from the right eye display panel to the right eye lens.
  • Each of the left eye lens and the right eye lens has a half mirror.
  • the half mirror included in the left eye lens guides the light representing the real space to the left eye of the user U by transmitting the light representing the real space. Further, the half mirror included in the left eye lens reflects the light guided by the left eye optical member to the user U's left eye.
  • the half mirror included in the right eye lens guides the light representing the real space to the right eye of the user U by transmitting the light representing the real space.
  • the half mirror included in the right eye lens reflects the light guided by the right eye optical member to the user U's right eye.
  • the communication device 2b is a hardware device for communicating with the mobile device 10 by wire.
  • the communication device 2b may communicate with the mobile device 10 wirelessly.
  • the glasses-type display device 20 has a glasses-shaped frame that supports the left-eye lens and the right-eye lens, and the imaging device 2c is provided on the bridge of the frame.
  • the imaging device 2c images the field of view of the imaging device 2c.
  • the imaging device 2c outputs image data representing the captured image to the processing device 2e.
  • the acceleration sensor 2g is a three-axis acceleration sensor that detects acceleration in each of the X-axis, Y-axis, and Z-axis directions at regular intervals.
  • the acceleration sensor 2g outputs acceleration data representing acceleration in the directions of the X, Y, and Z axes to the processing device 2e at regular intervals.
  • the storage device 2d is a recording medium that can be read by the processing device 2e.
  • the storage device 2d like the storage device 17, includes nonvolatile memory and volatile memory.
  • the storage device 2d stores a program PR2.
  • the processing device 2e includes one or more CPUs.
  • the processing device 2e reads the program PR2 from the storage device 2d.
  • the processing device 2e functions as an operation control unit 2e1 by executing the program PR2.
  • the operation control unit 2e1 controls the eyeglass-type display device 20.
  • the operation control unit 2e1 transmits the image data output from the imaging device 2c to the mobile device 10 using the communication device 2b. Further, the operation control unit 2e1 transmits acceleration data output from the acceleration sensor 2g to the mobile device 10 using the communication device 2b. Further, the operation control unit 2e1 supplies image data received from the mobile device 10 via the communication device 2b to the display unit 2a.
  • the display section 2a displays an image represented by the image data supplied from the operation control section 2e1.
  • the image data transmitted from the mobile device 10 to the glasses-type display device 20 represents the above-mentioned superimposed image. Since the superimposed image is displayed on the display unit 2a, the user U's eyes see an image of the real space on which the virtual object is superimposed.
  • when the determination result of step SA120 in the display control method executed by the mobile device 10 is "No", image data representing the superimposed image is transmitted from the mobile device 10 to the glasses-type display device 20, and the superimposed image is displayed on the display section 2a of the glasses-type display device 20. Therefore, the user U sees the real space with virtual objects superimposed on it, as in the image G2 shown in FIG. 3.
  • when the degree of concentration of the user U wearing the glasses-type display device 20 on the virtual object increases and the walking speed falls below the first threshold, the determination result of step SA120 becomes "Yes", and the degree of concentration of the user U is calculated in step SA140.
  • when the user U's concentration on the virtual object further increases and the user U's walking speed further slows, the concentration degree calculated in step SA140 exceeds the second threshold, and the superimposed image is no longer displayed. As a result, the virtual object no longer appears in the eyes of the user U, as in the image G1 shown in FIG. 2.
  • according to the present embodiment, the display of the virtual object is controlled according to the degree of concentration of the user U on the virtual object displayed on the glasses-type display device 20, so unnecessary display restrictions are avoided while the user U is prevented from concentrating excessively on the display of the virtual object.
  • B. Modifications
  • the present disclosure is not limited to the embodiments illustrated above. Specific aspects of the modification are as follows. Two or more aspects arbitrarily selected from the examples below may be combined.
  • B-1 Modification 1
  • the display control unit 184 in the embodiment described above limits the display of the virtual object by stopping the display of the virtual object when the degree of concentration of the user U calculated by the calculation unit 183 exceeds the second threshold. However, when the degree of concentration exceeds the second threshold, the display control unit 184 may instead cause the glasses-type display device 20 to display, in addition to the virtual object, a message warning against excessive concentration on the virtual object. In short, the display control unit 184 only needs to be able to control the display of the virtual object on the glasses-type display device 20 based on the degree of concentration of the user U calculated by the calculation unit 183.
  • the measurement unit 181 calculated the walking speed of the user U by integrating the combined acceleration obtained by combining the acceleration in the X-axis direction and the acceleration in the Y-axis direction.
  • the measurement unit 181 may identify the walking speed of the user U by using a table prepared in advance. Specifically, the table has multiple records. Each of the plurality of records includes a walking speed within a certain range and a periodic swing amplitude of acceleration in the Z-axis direction obtained when the user U walks at the walking speed.
  • the above table is stored in the storage device 17 in advance.
  • the measurement unit 181 receives acceleration data from the glasses-type display device 20, and specifies the walking speed of the user U by reading out from the table the walking speed that corresponds to the swing amplitude of the acceleration in the Z-axis direction represented by the acceleration data.
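  • a sketch of such a lookup, with illustrative records (the patent only states that each record pairs a walking speed with the periodic swing amplitude of the Z-axis acceleration; the ranges and speeds below are invented for illustration):

```python
# Hypothetical table: (min_amplitude, max_amplitude) -> walking speed (m/s).
SPEED_TABLE = [
    ((0.5, 1.0), 0.6),   # small Z-axis swing  -> slow walk
    ((1.0, 2.0), 1.0),   # medium swing        -> normal walk
    ((2.0, 3.5), 1.4),   # large swing         -> brisk walk
]


def speed_from_amplitude(z_amplitude: float):
    """Read out the walking speed whose record matches the observed
    periodic swing amplitude of the Z-axis acceleration."""
    for (lo, hi), speed in SPEED_TABLE:
        if lo <= z_amplitude < hi:
            return speed
    return None  # amplitude outside the table: treat as not walking
```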
  • the glasses-type display device 20 includes a three-axis acceleration sensor, and the measurement unit 181 identifies the walking speed of the user based on the output data of the acceleration sensor.
  • the measurement unit 181 may measure the walking speed of the user based on the moving image captured by the imaging device 2c.
  • the measurement unit 181 may identify the walking speed of the user U wearing the glasses-type display device 20 by referring to a table prepared in advance. Specifically, the table has multiple records.
  • Each of the plurality of records includes a walking speed within a certain range and a blur width in the Z-axis direction of a moving image obtained by capturing an image while the user U is walking at the walking speed.
  • the measurement unit 181 specifies the walking speed of the user U by reading from the table the walking speed corresponding to the blur width in the Z-axis direction of the moving image captured by the imaging device 2c. Furthermore, the measurement unit 181 may determine whether the user U is walking based on the vertical image blur width of each frame of the moving image captured by the imaging device 2c. When it is determined that the user U is walking, the walking speed may be determined based on the change in size, across frames, of an object whose size is known, such as a road sign or a vehicle license plate (see the sketch below). When the user's walking speed is specified based on the moving image captured by the imaging device 2c, the acceleration sensor can be omitted.
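  • the known-size approach can be sketched with the pinhole camera relation distance = focal_length x real_width / apparent_width: the object's distance is recovered in two frames and the change is divided by the frame interval. All parameter names are illustrative, and the pinhole model is an assumption; the embodiment only states that the change in size of a known-size object across frames is used.

```python
def speed_from_known_object(width_px_t1: float, width_px_t2: float,
                            real_width_m: float, focal_px: float,
                            dt_s: float) -> float:
    """Estimate walking speed from the apparent size change of an object
    of known real width (e.g., a road sign) between two frames."""
    d1 = focal_px * real_width_m / width_px_t1  # distance at time t1 (m)
    d2 = focal_px * real_width_m / width_px_t2  # distance at time t2 (m)
    return abs(d1 - d2) / dt_s                  # speed toward the object
```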
  • a GPS (Global Positioning System) receiver may be provided in the glasses-type display device 20 instead of an acceleration sensor, and the measurement unit 181 may identify the user's walking speed based on the output data of the GPS receiver. For example, the measurement unit 181 determines whether the user U is walking based on the image blur width in the Z-axis direction of each frame of the moving image captured by the imaging device 2c. When it is determined that the user U is walking, the measurement unit 181 calculates the moving distance of the user based on first position information acquired by the GPS receiver at time t1 and second position information acquired by the GPS receiver at time t2 after time t1, and sets the value obtained by dividing this moving distance by t2 - t1 as the walking speed.
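  • the GPS variant reduces to distance divided by elapsed time; below is a minimal sketch using the haversine great-circle distance between the two fixes (the choice of haversine is an assumption; the embodiment only specifies moving distance divided by t2 - t1):

```python
import math


def gps_walking_speed(lat1, lon1, t1, lat2, lon2, t2):
    """Walking speed from two GPS fixes: haversine distance / (t2 - t1).
    Coordinates in degrees, times in seconds; returns meters per second."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a)) / (t2 - t1)
```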
  • the program PR1 is stored in the storage device 17 of the mobile device 10, but the program PR1 may be manufactured or sold separately.
  • the program PR1 may be provided to a purchaser by distributing a computer-readable recording medium, such as a flash ROM, on which the program PR1 is written, or by distribution via a telecommunications line, for example by download.
  • the measurement unit 181, determination unit 182, calculation unit 183, and display control unit 184 in the above embodiment were software modules.
  • any one, a plurality, or all of the measurement section 181, the determination section 182, the calculation section 183, and the display control section 184 may be a hardware module.
  • Specific examples of the hardware module include DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit), PLD (Programmable Logic Device), FPGA (Field Programmable Gate Array), and the like.
  • the mobile device 10 included the measurement section 181, the determination section 182, the calculation section 183, and the display control section 184.
  • the measurement unit 181 may be provided in the glasses-type display device 20, and the data representing the walking speed may be supplied from the glasses-type display device 20 to the mobile device 10.
  • the determination unit 182 may be omitted.
  • the display control device that controls the display of the virtual object on the glasses-type display device 20 only needs to include the calculation section 183 and the display control section 184.
  • the storage device 17 and the storage device 2d are exemplified as ROM, RAM, etc., but each may be a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory device (for example, a card, stick, or key drive), a CD-ROM (Compact Disc-ROM), a register, a removable disk, a hard disk, a floppy (registered trademark) disk, a magnetic strip, a database, a server, or other suitable storage medium.
  • the information, signals, etc. described may be represented using any of a variety of different technologies.
  • for example, data, instructions, commands, information, signals, bits, symbols, chips, and the like may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, light fields or photons, or any combination thereof.
  • the input/output information may be stored in a specific location (for example, memory) or may be managed using a management table. Information etc. to be input/output may be overwritten, updated, or additionally written. The output information etc. may be deleted. The input information etc. may be transmitted to other devices.
  • the determination may be made based on a value represented by one bit (0 or 1), based on a Boolean value (true or false), or by numerical comparison (for example, comparison with a predetermined value).
  • each function illustrated in FIG. 4 is realized by an arbitrary combination of at least one of hardware and software.
  • the method for realizing each functional block is not particularly limited. That is, each functional block may be realized using one device that is physically or logically coupled, or using two or more devices that are physically or logically separated and connected directly or indirectly (for example, by wire or wirelessly).
  • the functional block may be realized by combining software with the one device or the plurality of devices.
  • the program exemplified in the embodiments described above, whether referred to as software, firmware, middleware, microcode, hardware description language, or any other name, should be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, and the like.
  • software, instructions, information, etc. may be sent and received via a transmission medium.
  • for example, when software is transmitted from a website, server, or other remote source using wired technology (coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), etc.) and/or wireless technology (infrared, microwave, etc.), these wired and/or wireless technologies are included within the definition of transmission medium.
  • the information, parameters, and the like described in this disclosure may be expressed using absolute values, relative values from a predetermined value, or other corresponding information.
  • the mobile device includes a mobile station (MS).
  • a mobile station may also be referred to by those skilled in the art as a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term. Further, in the present disclosure, terms such as "mobile station," "user terminal," "user equipment (UE)," and "terminal" may be used interchangeably.
  • the terms "connected" and "coupled," and any variations thereof, mean any direct or indirect connection or coupling between two or more elements, and may include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other.
  • the bonds or connections between elements may be physical, logical, or a combination thereof.
  • connection may be replaced with "access.”
  • as some non-limiting and non-exhaustive examples, two elements may be considered to be "connected" or "coupled" to each other by using one or more electrical wires, cables, and/or printed electrical connections, or by using electromagnetic energy having wavelengths in the radio frequency domain, the microwave domain, or the optical (both visible and invisible) domain.
  • the terms "determining" and "deciding" used in this disclosure may encompass a wide variety of operations.
  • "determining" and "deciding" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, and looking up (searching, inquiring) (for example, searching in a table, database, or other data structure), and ascertaining as having "determined" or "decided."
  • "determining" and "deciding" may include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, and accessing (for example, accessing data in a memory) as having "determined" or "decided."
  • "determining" and "deciding" may include regarding resolving, selecting, choosing, establishing, comparing, and the like as having "determined" or "decided."
  • "determining" and "deciding" may include regarding some action as having been "determined" or "decided."
  • "determining" ("deciding") may be read as "assuming," "expecting," "considering," and the like.
  • the display control device includes a calculation section 183 and a display control section 184.
  • the calculation unit 183 calculates the degree of concentration of the user U with respect to the virtual object displayed on the transparent display device worn on the head of the user U, based on the walking speed of the user.
  • the display control unit 184 controls the display of the virtual object on the transparent display device based on the user concentration level calculated by the calculation unit 183.
  • the mobile device 10 is an example of a display control device of the present disclosure.
  • the glasses-type display device 20 is an example of a transmissive display device in the present disclosure.
  • the display of the virtual object is controlled based on the user's degree of concentration on the virtual object displayed on the transparent display device, so unnecessary display restrictions can be avoided while It is possible to prevent the user from concentrating too much on displaying the virtual object.
  • the display control device may further include a measurement unit 181 that measures the walking speed of the user.
  • the display control device of the second aspect can control the display of the virtual object according to the concentration degree calculated based on the walking speed measured by the measurement unit 181.
  • the display control device may include a determination unit 182 that determines whether the walking speed of the user has fallen below a first threshold.
  • when the determination result by the determination unit 182 is affirmative (when the walking speed is less than the first threshold), the calculation unit 183 in the display control device according to the third aspect may calculate the user's degree of concentration based on the difference between the first threshold and the walking speed of the user.
  • the display control unit 184 in the display control device according to the third aspect may limit the display of the virtual object on the display device based on the user's degree of concentration calculated by the calculation unit 183.
  • the display control device of the third aspect can limit the display of the virtual object on the display device based on the user's concentration level calculated by the calculation unit 183 when the user's walking speed is less than the first threshold.
  • the display control unit 184 in the display control device according to the fourth aspect may limit the display of the virtual object on the transmissive display device when the user's degree of concentration calculated by the calculation unit 183 exceeds the second threshold.
  • the display control device of the fourth aspect can restrict the display of the virtual object on the transparent display device when the user's concentration level calculated based on the user's walking speed exceeds the second threshold.
  • SYMBOLS: 1...Display system, 10...Mobile device, 20...Glasses-type display device, 11...Input device, 12...Output device, 14, 15, 2b...Communication device, 17, 2d...Storage device, 18, 2e...Processing device, 181...Measurement unit, 182...Determination unit, 183...Calculation unit, 184...Display control unit, 19, 2f...Bus, 2a...Display section, 2g...Acceleration sensor, 2e1...Operation control unit, PR1, PR2...Program.

Abstract

A mobile device that is an aspect of a display control device according to the present disclosure communicates with a transmission-type display device mounted on the head of a user, thereby controlling the display of a virtual object on the transmission-type display device. The mobile device is provided with a calculation unit and a display control unit. The calculation unit calculates, on the basis of the walking speed of the user, the degree of concentration of the user on the virtual object displayed on the transmission-type display device. The display control unit controls display of the virtual object on the transmission-type display device on the basis of the degree of concentration of the user calculated by the calculation unit.

Description

表示制御装置display control device
 本発明は、仮想オブジェクトの表示を制御する表示制御装置に関する。 The present invention relates to a display control device that controls the display of virtual objects.
 実空間(real-world view)に存在する物体(以下、実オブジェクト)に関する説明文等の付加情報を表す仮想オブジェクトを実空間に重ねて表示する透過型表示装置が一般に知られている。透過型表表示装置の中には、HMD装置(Head Mounted Display)、AR(Augmented Reality)グラス及びMR(Mixed Reality)グラス等があり、ユーザの視界を遮蔽せずに実空間に重ねて仮想オブジェクトが表示される。透過型表示装置の使い方としては、透過型表示装置に表示された仮想オブジェクトを見ながらユーザが歩行することが挙げられる。しかし、このような使い方の場合、仮想オブジェクトにユーザの注意(視線)が向きすぎるので、安全確保の観点からは好ましくない。そこで、透過型表示装置を装着したユーザの動きのレベルに基づいて透過型表示装置に表示される情報を制限する技術が提案されている(例えば、特許文献1)。 Transparent display devices are generally known that display virtual objects that represent additional information such as explanatory text regarding objects that exist in real-world view (hereinafter referred to as real objects), superimposed on real space. Transparent display devices include HMD devices (Head Mounted Display), AR (Augmented Reality) glasses, and MR (Mixed Reality) glasses, which display virtual objects overlaid in real space without blocking the user's field of view. is displayed. One way to use a transmissive display device is for a user to walk while looking at a virtual object displayed on the transmissive display device. However, in this case, the user's attention (line of sight) is directed too much toward the virtual object, which is not preferable from the viewpoint of ensuring safety. Therefore, a technique has been proposed that limits the information displayed on a transmissive display device based on the level of movement of a user wearing the transmissive display device (for example, Patent Document 1).
特開2019-153347号公報Japanese Patent Application Publication No. 2019-153347
 特許文献1に開示の技術では、透過型表示装置に表示される情報にユーザの注意が向けられているか否かとは無関係に、表示する情報の制限が透過型表示装置のユーザの動きのレベル(具体的には、移動速度)に応じて行われる。従って、特許文献1に開示の技術には、透過型表示装置への情報の表示に不必要な制限が加えられる場合がある、という問題がある。 In the technology disclosed in Patent Document 1, regardless of whether the user's attention is directed to the information displayed on the transmissive display device, the display information is limited based on the movement level of the user of the transmissive display device ( Specifically, this is done according to the moving speed). Therefore, the technique disclosed in Patent Document 1 has a problem in that unnecessary restrictions may be added to the display of information on the transmissive display device.
 本開示の好適な態様に係る仮想オブジェクトの表示制御装置は、算出部と、表示制御部と、を備える。算出部は、ユーザの頭部に装着された透過型表示装置に表示される仮想オブジェクトに対する前記ユーザの集中度を、前記ユーザの歩行速度に基づいて算出する。表示制御部は、前記透過型表示装置への仮想オブジェクトの表示を、前記集中度に基づいて制御する。 A virtual object display control device according to a preferred aspect of the present disclosure includes a calculation unit and a display control unit. The calculation unit calculates the user's concentration level with respect to the virtual object displayed on the transparent display device attached to the user's head based on the user's walking speed. The display control unit controls display of the virtual object on the transparent display device based on the degree of concentration.
 本開示によれば、表示装置に表示される仮想オブジェクトに対するユーザの集中度に基づいて仮想オブジェクトの表示が制御されるので、不必要な表示制限を回避しつつ、ユーザが仮想オブジェクトの表示に過度に集中することを防ぐことができる。 According to the present disclosure, the display of the virtual object is controlled based on the user's concentration level with respect to the virtual object displayed on the display device, so that unnecessary display restrictions can be avoided and the user can avoid excessive display of the virtual object. It can prevent you from concentrating on
本開示の表示制御装置の一実施形態による携帯機器10を含む表示システム1の構成例を示すブロック図である。1 is a block diagram illustrating a configuration example of a display system 1 including a mobile device 10 according to an embodiment of a display control device of the present disclosure. ユーザUの視界に対応する実空間の画像の一例を示す図である。2 is a diagram showing an example of an image of real space corresponding to the field of view of the user U. FIG. 仮想オブジェクトが実空間に重ねられた画像の一例を示す図である。FIG. 2 is a diagram illustrating an example of an image in which virtual objects are superimposed on real space. 携帯機器10の構成例を示すブロック図である。1 is a block diagram showing a configuration example of a mobile device 10. FIG. 携帯機器10の処理装置18がプログラムPR1に従って実行する表示制御方法の流れを示すフローチャートである。3 is a flowchart showing the flow of a display control method executed by the processing device 18 of the mobile device 10 according to the program PR1. 眼鏡型表示装置20の構成例を示すブロック図である。2 is a block diagram showing a configuration example of a glasses-type display device 20. FIG.
(A.実施形態)
 図1は、本開示の表示制御装置の一実施形態による携帯機器10を含む表示システム1の構成例を示すブロック図である。図1に示されるように、表示システム1は、携帯機器10の他に、眼鏡型表示装置20を含む。
(A. Embodiment)
FIG. 1 is a block diagram illustrating a configuration example of a display system 1 including a mobile device 10 according to an embodiment of the display control device of the present disclosure. As shown in FIG. 1, the display system 1 includes a glasses-type display device 20 in addition to the mobile device 10.
 眼鏡型表示装置20は、ユーザUの頭部に装着される透過型表示装置の一例である。眼鏡型表示装置20は、眼鏡型表示装置20を装着したユーザUの視界を遮蔽せずに、実空間には存在しない仮想オブジェクトを表示する。眼鏡型表示装置20は、撮像装置(カメラ)を備える。ユーザUに装着された眼鏡型表示装置20は、ユーザUの視界範囲(眼鏡型表示装置20の視野範囲)を、撮像装置を用いて撮像する。 The glasses-type display device 20 is an example of a transmissive display device worn on the head of the user U. The glasses-type display device 20 displays virtual objects that do not exist in real space without blocking the field of view of the user U wearing the glasses-type display device 20. The glasses-type display device 20 includes an imaging device (camera). The eyeglass-type display device 20 worn by the user U captures an image of the user U's viewing range (the viewing range of the eyeglass-type display device 20) using an imaging device.
 携帯機器10は、例えばスマートフォンである。携帯機器10は、スマートフォンに限らず、例えば、タブレット又はノート型パーソナルコンピュータでもよい。携帯機器10は、眼鏡型表示装置20と共にユーザUの身体に装着される。携帯機器10は、ストラップ等を用いて首からぶら下げることでユーザUの身体に装着される。携帯機器10は、有線にて眼鏡型表示装置20に接続される。携帯機器10は、無線にて眼鏡型表示装置20と接続されてもよい。携帯機器10は、眼鏡型表示装置20により撮像された画像を表す画像データを眼鏡型表示装置20から取得する。 The mobile device 10 is, for example, a smartphone. The mobile device 10 is not limited to a smartphone, and may be, for example, a tablet or a notebook personal computer. The mobile device 10 is worn on the user U's body together with the glasses-type display device 20 . The mobile device 10 is attached to the body of the user U by hanging from the neck using a strap or the like. The mobile device 10 is connected to the eyeglass-type display device 20 by wire. The mobile device 10 may be connected to the eyeglass-type display device 20 wirelessly. The mobile device 10 acquires image data representing an image captured by the glasses-type display device 20 from the glasses-type display device 20 .
 また、携帯機器10は、通信回線NWを介して管理装置30と通信する。携帯機器10は、眼鏡型表示装置20から取得した画像データを管理装置30へ送信する。管理装置30は、ARにおける自己位置認識サービス及びコンテンツ管理サービスを提供するサーバ装置である。 Additionally, the mobile device 10 communicates with the management device 30 via the communication line NW. The mobile device 10 transmits the image data acquired from the glasses-type display device 20 to the management device 30. The management device 30 is a server device that provides a self-location recognition service and a content management service in AR.
 自己位置認識サービスとは、眼鏡型表示装置20の撮像装置により撮像された画像に基づいて、グローバル座標系における眼鏡型表示装置20の位置を特定するサービスのことをいう。自己位置認識サービスの具体的な実現方法は、ARタグを用いる方法、又はSLAM(Simultaneous Localization and Mapping)のように画像から抽出される特徴点の分布を利用する方法が挙げられる。コンテンツ管理サービスとは、仮想オブジェクトに関する情報を、眼鏡型表示装置20に配信するサービスのことをいう。仮想オブジェクトは、グローバル座標系における眼鏡型表示装置20の位置から見える実オブジェクトに対応している。 The self-location recognition service refers to a service that specifies the position of the eyeglass-type display device 20 in the global coordinate system based on an image captured by the imaging device of the eyeglass-type display device 20. Specific methods for realizing the self-location recognition service include a method using an AR tag or a method using a distribution of feature points extracted from an image, such as SLAM (Simultaneous Localization and Mapping). The content management service refers to a service that distributes information regarding virtual objects to the glasses-type display device 20. The virtual object corresponds to a real object visible from the position of the glasses-type display device 20 in the global coordinate system.
 本実施形態では、様々な仮想オブジェクトが用意されており、各仮想オブジェクトは、グローバル座標系における位置に対応付けて当該位置から見える実オブジェクトに対応している。管理装置30には、各仮想オブジェクトに対応する、仮想オブジェクト情報と、領域情報とが予め記憶されている。仮想オブジェクト情報は、該当する仮想オブジェクトの画像を表す。領域情報は、当該仮想オブジェクトを表示する表示領域の位置及び大きさを示す。管理装置30は、眼鏡型表示装置20によって撮像された画像データを、通信回線NWを介して携帯機器10から受信し、受信した画像データに基づいて眼鏡型表示装置20の位置を特定する。そして、管理装置30は、当該特定された位置に対応する仮想オブジェクト情報及び当該特定された位置に対応する領域情報を携帯機器10へ送信する。携帯機器10は、管理装置30から受信した仮想オブジェクト情報及び領域情報に従って、仮想オブジェクトの画像を眼鏡型表示装置20に表示させる。これにより、ユーザUの眼には、実空間に重ねて仮想オブジェクトが映る。 In this embodiment, various virtual objects are prepared, and each virtual object is associated with a position in the global coordinate system and corresponds to a real object that can be seen from the position. The management device 30 stores in advance virtual object information and area information corresponding to each virtual object. The virtual object information represents an image of the corresponding virtual object. The area information indicates the position and size of the display area that displays the virtual object. The management device 30 receives image data captured by the glasses-type display device 20 from the mobile device 10 via the communication line NW, and specifies the position of the glasses-type display device 20 based on the received image data. Then, the management device 30 transmits virtual object information corresponding to the specified position and area information corresponding to the specified position to the mobile device 10. The mobile device 10 causes the glasses-type display device 20 to display an image of the virtual object according to the virtual object information and area information received from the management device 30. As a result, the virtual object appears superimposed on the real space in the eyes of the user U.
 The real space in this embodiment is the streetscape of a tourist area, and the real objects are, for example, stores along that streetscape. FIG. 2 shows an example of an image G1 of the real space corresponding to the field of view of the user U. The virtual object in this embodiment is an image containing a text string describing, for example, the goods handled at a store. FIG. 3 shows an example of an image G2 seen by the user U through the glasses-type display device 20; in the image G2, the virtual objects displayed superimposed on the streetscape are drawn with dotted lines.
 FIG. 4 is a block diagram showing a configuration example of the mobile device 10. As shown in FIG. 4, the mobile device 10 includes an input device 11, an output device 12, a communication device 14, a communication device 15, a storage device 17, a processing device 18, and a bus 19. These components (the input device 11, output device 12, communication device 14, communication device 15, storage device 17, and processing device 18) are interconnected by the bus 19 for communicating information. The bus 19 may be configured as a single bus, or as different buses between different devices.
 The input device 11 includes a touch panel and receives operations performed by the user U. The input device 11 may include a plurality of operation keys in addition to, or instead of, the touch panel. The output device 12 includes a display panel and displays various kinds of information; the touch panel of the input device 11 is stacked on the display panel of the output device 12.
 The communication device 14 is a hardware device for communicating with the management device 30 via the communication line NW, and is also called, for example, a network device, a network controller, a network card, or a communication module. The communication device 14 transmits the image data supplied from the processing device 18 to the management device 30, and supplies the virtual object information and area information received from the management device 30 to the processing device 18. Note that the communication device 14 may communicate with the management device 30 without going through the communication line NW.
 The communication device 15 is a hardware device for wired communication with the glasses-type display device 20. The communication device 15 supplies data received from the glasses-type display device 20 to the processing device 18, and transmits image data supplied from the processing device 18 to the glasses-type display device 20. Note that the communication device 15 may communicate with the glasses-type display device 20 wirelessly.
 The storage device 17 is a recording medium readable by the processing device 18 and includes, for example, a nonvolatile memory and a volatile memory. The nonvolatile memory is, for example, a ROM (Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory). The volatile memory is, for example, a RAM (Random Access Memory). The storage device 17 stores in advance a program PR1 that causes the processing device 18 to execute the display control method of the present disclosure.
 The processing device 18 includes one or more CPUs (Central Processing Units). One or more CPUs are an example of one or more processors, and each of a processor and a CPU is an example of a computer. The processing device 18 reads the program PR1 from the storage device 17. Operating in accordance with the program PR1, the processing device 18 transmits the image data received from the glasses-type display device 20 via the communication device 15 to the management device 30 via the communication device 14. Operating in accordance with the program PR1, the processing device 18 also functions as the measurement unit 181, the determination unit 182, the calculation unit 183, and the display control unit 184 shown in FIG. 4. That is, the measurement unit 181, the determination unit 182, the calculation unit 183, and the display control unit 184 in FIG. 4 are software modules realized by operating the processing device 18 in accordance with software.
 The measurement unit 181 measures the walking speed of the user U wearing the glasses-type display device 20. In this embodiment, the glasses-type display device 20 transmits acceleration data to the mobile device 10. The acceleration data represents accelerations in the following three directions: first, the acceleration along the vertical axis (hereinafter, the Z axis); second, the acceleration along an axis orthogonal to the vertical axis (hereinafter, the X axis); and third, the acceleration along an axis orthogonal to both the vertical axis and the X axis (hereinafter, the Y axis). The acceleration data transmitted from the glasses-type display device 20 to the mobile device 10 is received by the communication device 15, and the measurement unit 181 measures the walking speed of the user U based on the received acceleration data.
 More specifically, the measurement unit 181 first identifies the direction of gravitational acceleration (that is, the direction of the Z axis) based on the acceleration data received by the communication device 15. The measurement unit 181 determines that the user U is walking when the acceleration in the Z-axis direction oscillates within a predetermined range of periods. When it determines that the user U is walking, the measurement unit 181 calculates, based on the acceleration data, the absolute value of the composite acceleration obtained by combining the acceleration in the X-axis direction and the acceleration in the Y-axis direction, and calculates the walking speed of the user U by integrating that absolute value.
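 The following is a minimal Python sketch of this measurement, offered for concreteness only. The sampling interval DT, the step-frequency band used to decide that the user is walking, and all function names are assumptions of this illustration; the embodiment specifies only the identification of the gravity axis, the periodicity check on the Z-axis acceleration, and the integration of the absolute value of the composite horizontal acceleration.

```python
import numpy as np

DT = 0.02  # assumed sampling interval of the acceleration sensor [s]

def estimate_walking_speed(samples: np.ndarray) -> float | None:
    """Return an estimated walking speed, or None when the user is not walking.

    `samples` is an (N, 3) array of X-, Y-, and Z-axis accelerations, where
    the Z axis is the vertical axis identified from the gravity component.
    """
    z = samples[:, 2]
    # Periodicity check: treat the user as walking when the vertical
    # acceleration oscillates at a step-like frequency (the 1.5-2.5 Hz
    # band here is an assumed stand-in for the "predetermined period").
    spectrum = np.abs(np.fft.rfft(z - z.mean()))
    freqs = np.fft.rfftfreq(len(z), d=DT)
    dominant = freqs[1:][np.argmax(spectrum[1:])]
    if not 1.5 <= dominant <= 2.5:
        return None
    # Combine the two horizontal components and integrate the absolute
    # value of the composite acceleration, as described in the text.
    horizontal = np.hypot(samples[:, 0], samples[:, 1])
    return float(np.sum(horizontal) * DT)  # speed estimate over the window
```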
 The determination unit 182 determines whether the walking speed measured by the measurement unit 181 is below a first threshold (walking speed < first threshold). The first threshold in this embodiment is, for example, the average of walking speeds measured when a plurality of sample users walked while looking at a virtual object. The first threshold is not limited to such an average, however, and may instead be a measured value of the walking speed of the user U himself or herself. In that case, the walking speed of the user U may be measured, for example, during a tutorial of the glasses-type display device 20: the walking speed measured while the user U walks looking at a virtual object displayed during the tutorial is set as the first threshold.
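 As a trivial illustration of the sample-average option, the first threshold could be derived as below; the walking speeds are hypothetical values in m/s, not measurements from the embodiment.

```python
# Hypothetical walking speeds of sample users who walked while viewing a
# virtual object; the first threshold is taken as their average.
SAMPLE_SPEEDS = [1.25, 1.10, 1.32, 1.18, 1.05]
FIRST_THRESHOLD = sum(SAMPLE_SPEEDS) / len(SAMPLE_SPEEDS)  # 1.18 m/s
```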
 The calculation unit 183 calculates the degree of concentration of the user U based on the walking speed of the user U. The degree of concentration refers to the degree of attention the user U pays to the virtual object displayed on the glasses-type display device 20 (that is, how strongly the user is concentrating on the virtual object). In this embodiment, the degree of concentration is, for example, a positive value, and a larger value indicates a higher degree of concentration. The reason the degree of concentration on the virtual object can be calculated from the walking speed of the user U is as follows.
 For example, when the user U is hurrying toward a destination without paying attention to the virtual object displayed on the glasses-type display device 20, the user U tends to walk briskly, and the walking speed often exceeds the first threshold. Conversely, when the user U walks while directing attention to the virtual object, the user tends to walk slowly, and the walking speed often falls below the first threshold. Because there is such a correlation between the walking speed and the degree of concentration on the virtual object displayed on the glasses-type display device 20, the degree of concentration of the user U can be calculated based on the walking speed of the user U.
 In this embodiment, when the determination result by the determination unit 182 is affirmative, that is, when the walking speed measured by the measurement unit 181 is below the first threshold, the calculation unit 183 calculates the degree of concentration of the user U based on the difference between the measured walking speed and the first threshold. The larger that difference, the larger the calculated degree of concentration. That is, a first degree of concentration calculated when the difference between the measured walking speed and the first threshold is a first value is larger than a second degree of concentration calculated when the difference is a second value smaller than the first value. This is because, as described above, the more attention is directed to the virtual object displayed on the glasses-type display device 20, the slower the walking speed of the user using the glasses-type display device 20 tends to become.
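 A sketch of this calculation, reusing FIRST_THRESHOLD from the earlier sketch, is shown below. The linear mapping and the gain K are assumptions of this illustration; the embodiment requires only that the value grow with the difference between the first threshold and the measured speed.

```python
K = 1.0  # assumed scaling gain

def concentration(walking_speed: float) -> float:
    """Degree of concentration: positive, and larger the further the
    walking speed falls below the first threshold."""
    diff = FIRST_THRESHOLD - walking_speed
    return K * diff if diff > 0.0 else 0.0
```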
 The display control unit 184 controls the display of the virtual object on the glasses-type display device 20, and executes different processing depending on whether the determination result by the determination unit 182 is affirmative or negative.
 When the determination result by the determination unit 182 is negative, the display control unit 184 generates image data representing a superimposed image. A superimposed image is an image in which the image indicated by the virtual object information is arranged in the area indicated by the area information; the virtual object information and the area information are received by the mobile device 10 from the management device 30. The display control unit 184 then transmits the image data to the glasses-type display device 20 and causes the glasses-type display device 20 to display the superimposed image.
 When the determination result by the determination unit 182 is affirmative, the display control unit 184 controls the display of the virtual object based on the degree of concentration of the user U calculated by the calculation unit 183. More specifically, the display control unit 184 determines whether the calculated degree of concentration exceeds a second threshold (degree of concentration > second threshold). When the degree of concentration is equal to or less than the second threshold, the display control unit 184 causes the glasses-type display device 20 to display the superimposed image described above. When the degree of concentration exceeds the second threshold, the display control unit 184 stops generating and transmitting the image data representing the superimposed image, with the result that no virtual object is displayed on the glasses-type display device 20.
 The processing device 18 operating in accordance with the program PR1 executes the display method shown in FIG. 5 every time acceleration data is received by the communication device 15. As shown in FIG. 5, this display method includes the processes of steps SA110 to SA150.
 In step SA110, the processing device 18 functions as the measurement unit 181 and measures, based on the acceleration data received by the communication device 15, the walking speed of the user U wearing the glasses-type display device 20.
 In step SA120, the processing device 18 functions as the determination unit 182 and determines whether the walking speed measured in step SA110 is below the first threshold. If the walking speed measured in step SA110 is equal to or greater than the first threshold, the determination result of step SA120 is "No" and the process of step SA130 is executed. If the walking speed measured in step SA110 is below the first threshold, the determination result of step SA120 is "Yes" and the processes of steps SA140 and SA150 are executed.
 In step SA130, the processing device 18 functions as the display control unit 184, generates image data representing the superimposed image, transmits the image data to the glasses-type display device 20, and causes the glasses-type display device 20 to display the superimposed image.
 In step SA140, the processing device 18 functions as the calculation unit 183 and calculates the degree of concentration of the user U based on the walking speed measured in step SA110. In step SA150, which follows step SA140, the processing device 18 functions as the display control unit 184 and controls the display of the virtual object on the glasses-type display device 20 based on the degree of concentration calculated in step SA140. More specifically, in step SA150, the processing device 18 determines whether the degree of concentration calculated in step SA140 exceeds the second threshold. If the degree of concentration is equal to or less than the second threshold, the processing device 18 causes the glasses-type display device 20 to display the superimposed image; if it exceeds the second threshold, the processing device 18 stops displaying the superimposed image.
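 Putting the pieces together, the per-update flow of steps SA110 to SA150 might look like the sketch below, reusing the functions from the earlier sketches. SECOND_THRESHOLD, the show/hide callables, and the handling of a non-walking user are illustrative assumptions, not names or behavior taken from the actual program PR1.

```python
SECOND_THRESHOLD = 0.5  # assumed second threshold for the concentration value

def on_acceleration_data(samples, show_overlay, hide_overlay):
    """One pass of steps SA110-SA150, run each time acceleration data arrives."""
    speed = estimate_walking_speed(samples)        # step SA110
    # A non-walking user is treated like a fast walker here (assumption).
    if speed is None or speed >= FIRST_THRESHOLD:  # step SA120: "No"
        show_overlay()                             # step SA130
        return
    level = concentration(speed)                   # step SA140
    if level > SECOND_THRESHOLD:                   # step SA150
        hide_overlay()  # stop generating/sending the superimposed image
    else:
        show_overlay()
```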
 FIG. 6 is a block diagram showing a configuration example of the glasses-type display device 20. The glasses-type display device 20 includes a display unit 2a, a communication device 2b, an imaging device 2c, an acceleration sensor 2g, a storage device 2d, a processing device 2e, and a bus 2f. These components (the display unit 2a, communication device 2b, imaging device 2c, acceleration sensor 2g, storage device 2d, and processing device 2e) are interconnected by the bus 2f for communicating information. The bus 2f may be configured as a single bus, or as different buses between different elements such as devices.
 The display unit 2a is of a transmissive type: light from the field of view of the glasses-type display device 20 (the display unit 2a) passes through the display unit 2a. Under the control of the mobile device 10, the display unit 2a displays the superimposed image. When the user U wears the glasses-type display device 20, the display unit 2a is positioned in front of the left and right eyes of the user U, so that the user U sees the image of the virtual object as if it appeared in the real space.
 More specifically, the display unit 2a includes a lens for the left eye, a display panel for the left eye, an optical member for the left eye, a lens for the right eye, a display panel for the right eye, and an optical member for the right eye. The display panels for the left and right eyes are, for example, liquid crystal panels or organic EL (Electro Luminescence) panels. The display panel for the left eye displays the superimposed image represented by the image data supplied from the mobile device 10, and the optical member for the left eye guides the light emitted from that display panel to the lens for the left eye. Similarly, the display panel for the right eye displays the superimposed image represented by the image data supplied from the mobile device 10, and the optical member for the right eye guides the light emitted from that display panel to the lens for the right eye.
 Each of the lens for the left eye and the lens for the right eye has a half mirror. The half mirror of the lens for the left eye transmits light representing the real space, thereby guiding that light to the left eye of the user U, and reflects the light guided by the optical member for the left eye toward the left eye of the user U. Likewise, the half mirror of the lens for the right eye transmits light representing the real space, thereby guiding that light to the right eye of the user U, and reflects the light guided by the optical member for the right eye toward the right eye of the user U.
 The communication device 2b is a hardware device for wired communication with the mobile device 10. The communication device 2b may instead communicate with the mobile device 10 wirelessly.
 The glasses-type display device 20 has a glasses-shaped frame that supports the lens for the left eye and the lens for the right eye, and the imaging device 2c is provided on the bridge of the frame. The imaging device 2c captures images of its field of view and outputs image data representing the captured images to the processing device 2e.
 The acceleration sensor 2g is a three-axis acceleration sensor that detects the accelerations in the X-axis, Y-axis, and Z-axis directions at a fixed period, and outputs acceleration data representing those accelerations to the processing device 2e at the fixed period.
 The storage device 2d is a recording medium readable by the processing device 2e and, like the storage device 17, includes a nonvolatile memory and a volatile memory. The storage device 2d stores a program PR2. The processing device 2e includes one or more CPUs, reads the program PR2 from the storage device 2d, and functions as an operation control unit 2e1 by executing the program PR2.
 The operation control unit 2e1 controls the glasses-type display device 20. The operation control unit 2e1 transmits the image data output from the imaging device 2c and the acceleration data output from the acceleration sensor 2g to the mobile device 10 via the communication device 2b. The operation control unit 2e1 also supplies the image data received from the mobile device 10 via the communication device 2b to the display unit 2a, and the display unit 2a displays the image represented by that image data. As described above, the image data transmitted from the mobile device 10 to the glasses-type display device 20 represents the superimposed image; since the superimposed image is displayed on the display unit 2a, the eyes of the user U see an image of the real space on which the virtual object is superimposed.
 For example, when the walking speed of the user U wearing the glasses-type display device 20 exceeds the first threshold, the determination result of step SA120 in the display control method executed by the mobile device 10 is "No" and the process of step SA130 is executed. As a result, the image data representing the superimposed image is transmitted from the mobile device 10 to the glasses-type display device 20, the superimposed image is displayed on the display unit 2a, and the eyes of the user U see the real space with the virtual objects superimposed on it, as in the image G2 shown in FIG. 3.
 In contrast, when the degree of concentration of the user U on the virtual object increases and the walking speed falls below the first threshold, the determination result of step SA120 becomes "Yes" and the processes of steps SA140 and SA150 are executed. Even while the walking speed is below the first threshold, as long as the degree of concentration calculated in step SA140 is equal to or less than the second threshold, the superimposed image is displayed on the display unit 2a, and the user U continues to see the real space with the virtual objects superimposed, as in the image G2 of FIG. 3. When the degree of concentration of the user U on the virtual object increases further and the walking speed becomes even slower, the degree of concentration calculated in step SA140 exceeds the second threshold and the superimposed image is no longer displayed. As a result, no virtual object appears to the eyes of the user U, as in the image G1 shown in FIG. 2.
 As described above, according to this embodiment, the display of the virtual object is controlled in accordance with the degree of concentration of the user U on the virtual object displayed on the glasses-type display device 20. It is therefore possible to prevent the user U from concentrating excessively on the display of the virtual object while avoiding unnecessary display restrictions.
(B: Modifications)
 The present disclosure is not limited to the embodiment illustrated above. Specific modifications are as follows, and two or more aspects arbitrarily selected from the following examples may be combined.
(B-1: Modification 1)
 The display control unit 184 in the above embodiment restricts the display of the virtual object by stopping its display when the degree of concentration of the user U calculated by the calculation unit 183 exceeds the second threshold. However, when the calculated degree of concentration exceeds the second threshold, the display control unit 184 may instead cause the glasses-type display device 20 to display, in addition to the virtual object, a message warning against excessive concentration on the virtual object. In short, it suffices that the display control unit 184 can control the display of the virtual object on the glasses-type display device 20 based on the degree of concentration of the user U calculated by the calculation unit 183.
(B-2: Modification 2)
 In the above embodiment, the measurement unit 181 calculates the walking speed of the user U by integrating the composite acceleration obtained by combining the accelerations in the X-axis and Y-axis directions. However, the measurement unit 181 may instead identify the walking speed of the user U by using a table prepared in advance. Specifically, the table has a plurality of records, each of which contains a walking speed within a certain range and the periodic swing amplitude of the acceleration in the Z-axis direction obtained when the user U walks at that walking speed. The table is stored in advance in the storage device 17. The measurement unit 181 receives the acceleration data from the glasses-type display device 20 and identifies the walking speed of the user U by reading from the table the walking speed corresponding to the swing amplitude of the Z-axis acceleration represented by the acceleration data, as sketched below.
(B-3: Modification 3)
 In the above embodiment, the glasses-type display device 20 includes a three-axis acceleration sensor, and the measurement unit 181 identifies the walking speed of the user based on the output data of that sensor. However, when moving images are captured by the imaging device 2c, the measurement unit 181 may measure the walking speed of the user based on those moving images. For example, the measurement unit 181 may identify the walking speed of the user U wearing the glasses-type display device 20 by referring to a table prepared in advance. Specifically, the table has a plurality of records, each of which contains a walking speed within a certain range and the Z-axis blur width of a moving image captured while the user U walks at that walking speed. The measurement unit 181 identifies the walking speed of the user U by reading from the table the walking speed corresponding to the Z-axis blur width of the moving image captured by the imaging device 2c. Alternatively, the measurement unit 181 may determine whether the user U is walking based on the vertical blur width of each frame of the moving image captured by the imaging device 2c; when the user U is determined to be walking, the walking speed may be identified from the change across frames in the apparent size of an object whose actual size is known, such as a road sign or a vehicle license plate, as sketched below. When the walking speed of the user is identified based on the moving images captured by the imaging device 2c, the acceleration sensor can be omitted.
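 One way to realize the known-size-object variant is the sketch below. The pinhole-camera model and the focal length are assumptions of this illustration; the embodiment does not name a particular geometric model.

```python
FOCAL_LENGTH_PX = 1400.0  # assumed focal length of the imaging device [pixels]

def speed_from_known_object(real_size_m: float,
                            size_px_t1: float, size_px_t2: float,
                            t1: float, t2: float) -> float:
    """Approach speed [m/s] toward an object of known size (e.g. a road sign).

    Pinhole model: distance = FOCAL_LENGTH_PX * real_size / apparent_size,
    so the change in distance between two frames gives the walking speed.
    """
    d1 = FOCAL_LENGTH_PX * real_size_m / size_px_t1  # distance at time t1
    d2 = FOCAL_LENGTH_PX * real_size_m / size_px_t2  # distance at time t2
    return (d1 - d2) / (t2 - t1)
```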
(B-4: Modification 4)
 A GPS (Global Positioning System) receiver may be provided in the glasses-type display device 20 in place of the acceleration sensor, and the measurement unit 181 may identify the walking speed of the user based on the output data of the GPS receiver. For example, the measurement unit 181 determines whether the user U is walking based on the Z-axis blur width of each frame of the moving image captured by the imaging device 2c. When the user U is determined to be walking, the measurement unit 181 calculates the distance traveled by the user from first position information acquired by the GPS receiver at a time t1 and second position information acquired by the GPS receiver at a time t2 later than t1, and sets the value obtained by dividing this distance by t2 - t1 as the walking speed, as in the sketch below.
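 The haversine great-circle distance used in this sketch is one common choice for the distance between two GPS fixes; the embodiment does not prescribe a particular formula.

```python
import math

def gps_walking_speed(lat1: float, lon1: float, t1: float,
                      lat2: float, lon2: float, t2: float) -> float:
    """Walking speed [m/s] from two (latitude, longitude, time) GPS fixes."""
    r = 6_371_000.0  # mean Earth radius [m]
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    distance = 2 * r * math.asin(math.sqrt(a))  # great-circle distance [m]
    return distance / (t2 - t1)
```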
(B-5: Modification 5)
 In the above embodiment, the program PR1 is stored in the storage device 17 of the mobile device 10, but the program PR1 may also be manufactured or sold on its own. When the program PR1 is sold, it may be provided to purchasers by distributing a computer-readable recording medium, such as a flash ROM, on which the program PR1 is written, or by distributing it as a download via a telecommunications line.
(B-6: Modification 6)
 The measurement unit 181, the determination unit 182, the calculation unit 183, and the display control unit 184 in the above embodiment are software modules. However, any one, several, or all of them may be hardware modules. Specific examples of such hardware modules include a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), and an FPGA (Field Programmable Gate Array).
(B-7: Modification 7)
 In the above embodiment, the mobile device 10 includes the measurement unit 181, the determination unit 182, the calculation unit 183, and the display control unit 184. However, the measurement unit 181 may instead be provided in the glasses-type display device 20, with data representing the walking speed supplied from the glasses-type display device 20 to the mobile device 10. The determination unit 182 may also be omitted. In short, it suffices that the display control device that controls the display of the virtual object on the glasses-type display device 20 includes the calculation unit 183 and the display control unit 184.
(C: Other)
(1) In the above embodiment, a ROM and a RAM are given as examples of the storage device 17 and the storage device 2d, but each of them may instead be a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory device (for example, a card, stick, or key drive), a CD-ROM (Compact Disc-ROM), a register, a removable disk, a hard disk, a floppy (registered trademark) disk, a magnetic strip, a database, a server, or any other suitable storage medium.
(2) In the above embodiment, the information, signals, and the like described may be represented using any of a variety of different technologies. For example, the data, instructions, commands, information, signals, bits, symbols, chips, and the like that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination of these.
(3) In the above embodiment, input and output information and the like may be stored in a specific location (for example, a memory) or managed using a management table. Input and output information and the like may be overwritten, updated, or appended to; output information and the like may be deleted; and input information and the like may be transmitted to another device.
(4) In the above embodiment, a determination may be made based on a value represented by one bit (0 or 1), based on a Boolean value (true or false), or based on a comparison of numerical values (for example, a comparison with a predetermined value).
(5) The order of the processing procedures, sequences, flowcharts, and the like illustrated in the above embodiment may be changed as long as no contradiction arises. For example, the methods described in the present disclosure present the elements of the various steps in an exemplary order and are not limited to the particular order presented.
(6) Each function illustrated in FIG. 4 is realized by an arbitrary combination of at least one of hardware and software, and the method of realizing each functional block is not particularly limited. That is, each functional block may be realized using one physically or logically coupled device, or using two or more physically or logically separate devices connected directly or indirectly (for example, by wire or wirelessly). A functional block may also be realized by combining software with the one device or the plurality of devices described above.
(7) The programs illustrated in the above embodiment should be interpreted broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, threads of execution, procedures, functions, and the like, regardless of whether they are called software, firmware, middleware, microcode, hardware description language, or by any other name.
 Software, instructions, information, and the like may also be transmitted and received via a transmission medium. For example, when software is transmitted from a website, server, or other remote source using at least one of wired technologies (such as coaxial cable, optical fiber cable, twisted pair, or digital subscriber line (DSL)) and wireless technologies (such as infrared or microwave), at least one of these wired and wireless technologies is included within the definition of a transmission medium.
(8) In each of the above embodiments, the terms "system" and "network" are used interchangeably.
(9) The information, parameters, and the like described in the present disclosure may be expressed using absolute values, using values relative to a predetermined value, or using other corresponding information.
(10) In the above embodiment, the mobile device may be a mobile station (MS). A mobile station may also be called, by those skilled in the art, a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term. In the present disclosure, terms such as "mobile station", "user terminal", "user equipment (UE)", and "terminal" may be used interchangeably.
(11) In the above embodiment, the terms "connected" and "coupled", and any variations thereof, mean any direct or indirect connection or coupling between two or more elements, and can include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination of these. For example, "connected" may be read as "accessed". As used in the present disclosure, two elements can be considered "connected" or "coupled" to each other by using at least one of one or more electrical wires, cables, and printed electrical connections, and, as some non-limiting and non-exhaustive examples, by using electromagnetic energy having wavelengths in the radio frequency, microwave, and optical (both visible and invisible) regions.
(12) In the above embodiment, the phrase "based on" does not mean "based only on" unless explicitly stated otherwise. In other words, "based on" means both "based only on" and "based at least on".
(13) The terms "determining" and "deciding" as used in the present disclosure may encompass a wide variety of operations. "Determining" and "deciding" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (for example, looking up in a table, database, or other data structure), or ascertaining, as "determining" or "deciding". They may also include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in a memory) as "determining" or "deciding", and regarding resolving, selecting, choosing, establishing, comparing, and the like as "determining" or "deciding". That is, "determining" and "deciding" may include regarding some operation as having been "determined" or "decided". "Determining (deciding)" may also be read as "assuming", "expecting", "considering", and the like.
(14) In the above embodiment, where "include", "including", and variations thereof are used, these terms are intended to be inclusive in the same way as the term "comprising". Furthermore, the term "or" as used in the present disclosure is not intended to mean an exclusive OR.
(15) In the present disclosure, where articles such as a, an, and the in English are added by translation, the present disclosure may include the case where the nouns following these articles are plural.
(16) In the present disclosure, the phrase "A and B are different" may mean "A and B are different from each other". The phrase may also mean "A and B are each different from C". Terms such as "separated" and "coupled" may be interpreted in the same way as "different".
(17) Each aspect and embodiment described in the present disclosure may be used alone, may be used in combination, or may be switched between as they are carried out. In addition, notification of predetermined information (for example, notification that "X holds") is not limited to being performed explicitly, and may be performed implicitly (for example, by not notifying the predetermined information).
(D: Aspects understood from the foregoing embodiment and modifications)
 Although the present disclosure has been described in detail above, it is clear to those skilled in the art that the present disclosure is not limited to the embodiments described herein. The present disclosure can be implemented in modified and altered forms without departing from the spirit and scope of the present disclosure as defined by the claims. Accordingly, the description of the present disclosure is for illustrative purposes and has no restrictive meaning with respect to the present disclosure. The following aspects can be understood from at least one of the embodiment and modifications described above.
 A display control device according to a first aspect includes a calculation unit 183 and a display control unit 184. The calculation unit 183 calculates, based on the walking speed of a user U, the degree of concentration of the user U on a virtual object displayed on a transmissive display device worn on the head of the user U. The display control unit 184 controls the display of the virtual object on the transmissive display device based on the degree of concentration calculated by the calculation unit 183. The mobile device 10 is an example of the display control device of the present disclosure, and the glasses-type display device 20 is an example of the transmissive display device of the present disclosure. According to the display control device of the first aspect, the display of the virtual object is controlled based on the degree of concentration of the user on the virtual object displayed on the transmissive display device, so that the user can be prevented from concentrating excessively on the display of the virtual object while unnecessary display restrictions are avoided.
 A display control device according to a second aspect (an example of the first aspect) may further include a measurement unit 181 that measures the walking speed of the user. The display control device of the second aspect can control the display of the virtual object in accordance with the degree of concentration calculated based on the walking speed measured by the measurement unit 181.
 A display control device according to a third aspect (an example of the first aspect) may include a determination unit 182 that determines whether the walking speed of the user has fallen below a first threshold. In the display control device according to the third aspect, when the determination result by the determination unit 182 is affirmative (when the walking speed is below the first threshold), the calculation unit 183 may calculate the degree of concentration of the user based on the difference between the first threshold and the walking speed of the user, and the display control unit 184 may restrict the display of the virtual object on the display device based on the degree of concentration calculated by the calculation unit 183. The display control device of the third aspect can restrict the display of the virtual object on the display device, based on the degree of concentration calculated by the calculation unit 183, when the walking speed of the user is below the first threshold.
 In a display control device according to a fourth aspect (an example of the first aspect), the display control unit 184 may restrict the display of the virtual object on the transmissive display device when the degree of concentration of the user calculated by the calculation unit 183 exceeds a second threshold. The display control device of the fourth aspect can restrict the display of the virtual object on the transmissive display device when the degree of concentration of the user, calculated based on the walking speed of the user, exceeds the second threshold.
 DESCRIPTION OF REFERENCE SIGNS: 1...display system; 10...mobile device; 20...glasses-type display device; 11...input device; 12...output device; 14, 15, 2b...communication device; 17, 2d...storage device; 18, 2e...processing device; 181...measurement unit; 182...determination unit; 183...calculation unit; 184...display control unit; 19, 2f...bus; 2a...display unit; 2g...acceleration sensor; 2e1...operation control unit; PR1, PR2...program.

Claims (4)

  1.  A display control device comprising:
     a calculation unit that calculates, based on a walking speed of a user, a degree of concentration of the user on a virtual object displayed on a transmissive display device worn on the head of the user; and
     a display control unit that controls display of the virtual object on the transmissive display device based on the degree of concentration of the user.
  2.  The display control device according to claim 1, further comprising a measurement unit that measures the walking speed of the user.
  3.  The display control device according to claim 1, further comprising a determination unit that determines whether the walking speed is below a first threshold,
     wherein the calculation unit calculates the degree of concentration of the user based on a difference between the walking speed and the first threshold when the walking speed is below the first threshold, and
     the display control unit restricts the display of the virtual object on the transmissive display device based on the degree of concentration of the user when the walking speed is below the first threshold.
  4.  The display control device according to claim 1, wherein the display control unit limits the display of the virtual object on the transmissive display device when the concentration level of the user exceeds a second threshold.
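Read together, claims 1 to 4 describe a measure-judge-calculate-control loop. The sketch below shows one way the claimed units could cooperate, reusing the hypothetical helpers defined after the third and fourth aspects above and stubbing the measurement unit 181 (a real device would derive the walking speed from the acceleration sensor 2g); it is an illustration, not the specification's implementation:

    import time


    def measure_walking_speed() -> float:
        """Measurement unit 181 (claim 2): stub returning the walking speed in m/s.

        A fixed value is returned here purely so the sketch is self-contained;
        a real device would compute this from accelerometer data.
        """
        return 0.3


    def display_control_loop(obj: VirtualObject, iterations: int = 5) -> None:
        """One pass per second through the pipeline of claims 1-4."""
        for _ in range(iterations):
            speed = measure_walking_speed()                     # claim 2
            if is_below_first_threshold(speed):                 # claim 3
                concentration = calculate_concentration(speed)  # claim 1
                control_display(concentration, obj)             # claims 3 and 4
            else:
                obj.show()  # no restriction while the user walks normally
            time.sleep(1.0)


    if __name__ == "__main__":
        display_control_loop(VirtualObject())

With the stubbed speed of 0.3 m/s and the assumed thresholds, the concentration evaluates to 0.625, which exceeds the assumed second threshold of 0.5, so the object is hidden on every pass.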
PCT/JP2023/009936 2022-05-11 2023-03-14 Display control device WO2023218751A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022078051 2022-05-11
JP2022-078051 2022-05-11

Publications (1)

Publication Number Publication Date
WO2023218751A1 true WO2023218751A1 (en) 2023-11-16

Family

ID=88729984

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/009936 WO2023218751A1 (en) 2022-05-11 2023-03-14 Display control device

Country Status (1)

Country Link
WO (1) WO2023218751A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015145544A1 (en) * 2014-03-24 2015-10-01 パイオニア株式会社 Display control device, control method, program, and storage medium
JP2017068595A (en) * 2015-09-30 2017-04-06 ソニー株式会社 Information processing device, information processing method, and program
JP2021165864A (en) * 2018-06-18 2021-10-14 ソニーグループ株式会社 Information processing device, information processing method, and program

Similar Documents

Publication Publication Date Title
US8558759B1 (en) Hand gestures to signify what is important
US10109110B2 (en) Reality augmentation to eliminate, or de-emphasize, selected portions of base image
US20130179303A1 (en) Method and apparatus for enabling real-time product and vendor identification
US9633477B2 (en) Wearable device and method of controlling therefor using location information
US11126848B2 (en) Information processing device, information processing method, and information processing program
KR20160145976A (en) Method for sharing images and electronic device performing thereof
US11282481B2 (en) Information processing device
US20160018643A1 (en) Wearable display device and control method thereof
US10761694B2 (en) Extended reality content exclusion
WO2019130991A1 (en) Information processing device
CN111527466A (en) Information processing apparatus, information processing method, and program
WO2019130708A1 (en) Information processing device, information processing method, and program
US11755747B2 (en) High throughput storage encryption
WO2023218751A1 (en) Display control device
US20180158242A1 (en) Information processing method and program for executing the information processing method on computer
US20220198794A1 (en) Related information output device
US20220365741A1 (en) Information terminal system, method, and storage medium
WO2023210195A1 (en) Identification system
WO2022251831A1 (en) Reducing light leakage via external gaze detection
US20240095348A1 (en) Instant detection of a homoglyph attack when reviewing code in an augmented reality display
WO2023223750A1 (en) Display device
WO2023204159A1 (en) Display control device
WO2023026798A1 (en) Display control device
WO2022201739A1 (en) Display control device
WO2022201936A1 (en) Display control device

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23803235

Country of ref document: EP

Kind code of ref document: A1