WO2023218751A1 - Display control device - Google Patents

Display control device

Info

Publication number
WO2023218751A1
WO2023218751A1 PCT/JP2023/009936 JP2023009936W
Authority
WO
WIPO (PCT)
Prior art keywords
user
display
virtual object
display device
walking speed
Prior art date
Application number
PCT/JP2023/009936
Other languages
English (en)
Japanese (ja)
Inventor
怜央 水田
康夫 森永
達哉 西崎
充宏 後藤
有希 中村
Original Assignee
株式会社NTTドコモ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社NTTドコモ filed Critical 株式会社NTTドコモ
Publication of WO2023218751A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to a display control device that controls the display of virtual objects.
  • Transmissive display devices are generally known that display virtual objects representing additional information, such as explanatory text about objects existing in the real-world view (hereinafter referred to as real objects), superimposed on real space.
  • Transmissive display devices include HMD (Head Mounted Display) devices, AR (Augmented Reality) glasses, and MR (Mixed Reality) glasses, which display virtual objects superimposed on real space without blocking the user's field of view.
  • One way to use a transmissive display device is for a user to walk while looking at a virtual object displayed on it. In this case, however, the user's attention (line of sight) is directed too much toward the virtual object, which is undesirable from the viewpoint of ensuring safety. A technique has therefore been proposed that limits the information displayed on a transmissive display device based on the movement level of the user wearing it (for example, Patent Document 1).
  • In the technique of Patent Document 1, however, the displayed information is limited based on the movement level of the user of the transmissive display device (specifically, according to the moving speed), regardless of whether the user's attention is directed to the displayed information. The technique disclosed in Patent Document 1 therefore has a problem in that unnecessary restrictions may be imposed on the display of information on the transmissive display device.
  • To solve this problem, a display control device according to the present disclosure includes a calculation unit and a display control unit.
  • the calculation unit calculates the user's concentration level with respect to the virtual object displayed on the transparent display device attached to the user's head based on the user's walking speed.
  • the display control unit controls display of the virtual object on the transparent display device based on the degree of concentration.
  • the display of the virtual object is controlled based on the user's degree of concentration on the virtual object displayed on the display device, so unnecessary display restrictions can be avoided and the user can be prevented from concentrating too much on the virtual object.
  • FIG. 1 is a block diagram illustrating a configuration example of a display system 1 including a mobile device 10 according to an embodiment of the display control device of the present disclosure.
  • FIG. 2 is a diagram showing an example of an image of real space corresponding to the field of view of the user U.
  • FIG. 3 is a diagram illustrating an example of an image in which virtual objects are superimposed on the real space.
  • FIG. 4 is a block diagram showing a configuration example of the mobile device 10.
  • FIG. 5 is a flowchart showing the flow of a display control method executed by the processing device 18 of the mobile device 10 according to the program PR1.
  • FIG. 6 is a block diagram showing a configuration example of the glasses-type display device 20.
  • FIG. 1 is a block diagram illustrating a configuration example of a display system 1 including a mobile device 10 according to an embodiment of the display control device of the present disclosure. As shown in FIG. 1, the display system 1 includes a glasses-type display device 20 in addition to the mobile device 10.
  • the glasses-type display device 20 is an example of a transmissive display device worn on the head of the user U.
  • the glasses-type display device 20 displays virtual objects that do not exist in real space without blocking the field of view of the user U wearing the glasses-type display device 20.
  • the glasses-type display device 20 includes an imaging device (camera).
  • the eyeglass-type display device 20 worn by the user U captures an image of the user U's viewing range (the viewing range of the eyeglass-type display device 20) using an imaging device.
  • the mobile device 10 is, for example, a smartphone.
  • the mobile device 10 is not limited to a smartphone, and may be, for example, a tablet or a notebook personal computer.
  • the mobile device 10 is worn on the user U's body together with the glasses-type display device 20.
  • the mobile device 10 is attached to the body of the user U by hanging from the neck using a strap or the like.
  • the mobile device 10 is connected to the eyeglass-type display device 20 by wire.
  • the mobile device 10 may be connected to the eyeglass-type display device 20 wirelessly.
  • the mobile device 10 acquires image data representing an image captured by the glasses-type display device 20 from the glasses-type display device 20.
  • the mobile device 10 communicates with the management device 30 via the communication line NW.
  • the mobile device 10 transmits the image data acquired from the glasses-type display device 20 to the management device 30.
  • the management device 30 is a server device that provides a self-location recognition service and a content management service in AR.
  • the self-location recognition service refers to a service that specifies the position of the eyeglass-type display device 20 in the global coordinate system based on an image captured by the imaging device of the eyeglass-type display device 20.
  • Specific methods for realizing the self-location recognition service include a method using an AR tag or a method using a distribution of feature points extracted from an image, such as SLAM (Simultaneous Localization and Mapping).
  • the content management service refers to a service that distributes information regarding virtual objects to the glasses-type display device 20.
  • the virtual object corresponds to a real object visible from the position of the glasses-type display device 20 in the global coordinate system.
  • each virtual object is associated with a position in the global coordinate system and corresponds to a real object that can be seen from the position.
  • the management device 30 stores in advance virtual object information and area information corresponding to each virtual object.
  • the virtual object information represents an image of the corresponding virtual object.
  • the area information indicates the position and size of the display area that displays the virtual object.
  • the management device 30 receives image data captured by the glasses-type display device 20 from the mobile device 10 via the communication line NW, and specifies the position of the glasses-type display device 20 based on the received image data. Then, the management device 30 transmits virtual object information corresponding to the specified position and area information corresponding to the specified position to the mobile device 10.
  • the mobile device 10 causes the glasses-type display device 20 to display an image of the virtual object according to the virtual object information and area information received from the management device 30. As a result, the virtual object appears superimposed on the real space in the eyes of the user U.
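  • As an illustration of this exchange, the virtual object information and area information might be modeled as below (a minimal Python sketch; the field names, the CONTENT_DB store, and the recognize_position stub are assumptions, since the patent says only that area information indicates the position and size of a display area and that the position is identified from a captured image).

```python
from dataclasses import dataclass

@dataclass
class AreaInfo:
    """Display area for one virtual object (hypothetical field names)."""
    x: float        # position of the display area
    y: float
    width: float    # size of the display area
    height: float

@dataclass
class VirtualObjectInfo:
    """Image of a virtual object plus the area in which to draw it."""
    image_png: bytes
    area: AreaInfo

# Hypothetical content store: position id -> virtual objects visible there.
CONTENT_DB: dict[str, list[VirtualObjectInfo]] = {}

def recognize_position(image_data: bytes) -> str:
    """Placeholder for the self-location recognition service (AR tags or
    SLAM-style feature matching; the patent names the techniques but not
    their implementation)."""
    return "unknown"

def handle_request(image_data: bytes) -> list[VirtualObjectInfo]:
    """Management-device flow: identify the position of the glasses from
    the captured image, then return the objects registered for it."""
    return CONTENT_DB.get(recognize_position(image_data), [])
```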
  • the real space in this embodiment is a cityscape of a tourist spot.
  • the real object in this embodiment is, for example, a store in the cityscape.
  • FIG. 2 is a diagram illustrating an example of a real space image G1 corresponding to the user's U field of view.
  • the virtual object in this embodiment is an image that includes a character string of explanatory text regarding products etc. handled at the store.
  • FIG. 3 is a diagram illustrating an example of an image G2 that the user U views through the glasses-type display device 20. In the image G2 shown in FIG. 3, a virtual object displayed overlapping the cityscape of a tourist spot is drawn with a dotted line.
  • FIG. 4 is a block diagram showing a configuration example of the mobile device 10.
  • the mobile device 10 includes an input device 11, an output device 12, a communication device 14, a communication device 15, a storage device 17, a processing device 18, and a bus 19.
  • the components of the mobile device 10 are interconnected by a bus 19 for communicating information.
  • the bus 19 may be configured using a single bus, or may be configured using different buses for each device.
  • the input device 11 includes a touch panel.
  • the input device 11 may include a plurality of operation keys in addition to a touch panel.
  • the input device 11 may include a plurality of operation keys without including a touch panel.
  • the input device 11 receives operations performed by the user U.
  • Output device 12 includes a display panel. A touch panel of the input device 11 is stacked on the display panel of the output device 12.
  • the output device 12 displays various information.
  • the communication device 14 is a hardware device for communicating with the management device 30 via the communication line NW.
  • the communication device 14 is also called, for example, a network device, a network controller, a network card, a communication module, or the like.
  • the communication device 14 transmits the image data given from the processing device 18 to the management device 30. Furthermore, the communication device 14 supplies virtual object information and area information received from the management device 30 to the processing device 18. Note that the communication device 14 may communicate with the management device 30 without using the communication line NW.
  • the communication device 15 is a hardware device for communicating with the eyeglass-type display device 20 by wire.
  • the communication device 15 supplies data received from the glasses-type display device 20 to the processing device 18.
  • the communication device 15 transmits image data provided from the processing device 18 to the glasses-type display device 20.
  • the communication device 15 may communicate with the glasses-type display device 20 wirelessly.
  • the storage device 17 is a recording medium that can be read by the processing device 18.
  • the storage device 17 includes, for example, nonvolatile memory and volatile memory.
  • Nonvolatile memories include, for example, ROM (Read Only Memory), EPROM (Erasable Programmable Read Only Memory), and EEPROM (Electrically Erasable Programmable Read Only Memory).
  • the volatile memory is, for example, RAM (Random Access Memory).
  • the storage device 17 stores in advance a program PR1 that causes the processing device 18 to execute the display control method of the present disclosure.
  • the processing device 18 includes one or more CPUs (Central Processing Units). One or more CPUs are an example of one or more processors. Each of the processor and CPU is an example of a computer.
  • the processing device 18 reads the program PR1 from the storage device 17.
  • the processing device 18 operating according to the program PR1 transmits the image data received from the glasses-type display device 20 using the communication device 15 to the management device 30 using the communication device 14.
  • the processing device 18 operating according to the program PR1 functions as a measurement unit 181, a determination unit 182, a calculation unit 183, and a display control unit 184 shown in FIG. 4. That is, the measurement unit 181, determination unit 182, calculation unit 183, and display control unit 184 in FIG. 4 are software modules realized by the processing device 18 operating according to software.
  • the measurement unit 181 measures the walking speed of the user U wearing the glasses-type display device 20.
  • the glasses-type display device 20 transmits acceleration data to the mobile device 10.
  • the acceleration data represents acceleration in the following three directions: first, acceleration along the vertical axis (hereinafter referred to as the Z-axis); second, acceleration along an axis perpendicular to the vertical axis (hereinafter referred to as the X-axis); and third, acceleration along an axis perpendicular to both the vertical axis and the X-axis (hereinafter referred to as the Y-axis).
  • the acceleration data transmitted from the glasses-type display device 20 to the mobile device 10 is received by the communication device 15.
  • the measurement unit 181 measures the walking speed of the user U based on the acceleration data received by the communication device 15.
  • the measurement unit 181 first identifies the direction of gravitational acceleration (that is, the Z-axis direction) based on the acceleration data received by the communication device 15. The measurement unit 181 determines that the user U is walking when the acceleration in the Z-axis direction changes at a predetermined period. When it is determined that the user U is walking, the measurement unit 181 combines the acceleration in the X-axis direction and the acceleration in the Y-axis direction indicated by the acceleration data and calculates the absolute value of the resulting composite acceleration. The measurement unit 181 then calculates the walking speed of the user U by integrating the absolute value of the composite acceleration.
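  • A minimal Python sketch of this measurement procedure follows (the 50 Hz sampling rate, the step-period bounds used for the walking test, and the zero-crossing period estimate are assumptions; the patent specifies only that walking is detected from a periodic change in Z-axis acceleration and that the absolute value of the composite X/Y acceleration is integrated).

```python
import math

SAMPLE_DT = 0.02          # assumed accelerometer sampling period, 50 Hz (s)
STEP_PERIOD = (0.3, 1.0)  # assumed plausible range of step periods (s)

def is_walking(z_accel: list[float]) -> bool:
    """Detect walking from a periodic change in Z-axis (gravity-axis)
    acceleration: count zero crossings of the detrended signal and check
    that the implied period is plausible (heuristic assumption)."""
    if len(z_accel) < 2:
        return False
    mean = sum(z_accel) / len(z_accel)
    detrended = [a - mean for a in z_accel]
    crossings = sum(1 for a, b in zip(detrended, detrended[1:]) if a * b < 0)
    if crossings == 0:
        return False
    period = 2 * len(z_accel) * SAMPLE_DT / crossings
    return STEP_PERIOD[0] <= period <= STEP_PERIOD[1]

def walking_speed(x_accel: list[float], y_accel: list[float],
                  z_accel: list[float]) -> float:
    """Integrate the magnitude of the composite X/Y (horizontal)
    acceleration over the window, as described for measurement unit 181."""
    if not is_walking(z_accel):
        return 0.0
    return sum(math.hypot(ax, ay) * SAMPLE_DT
               for ax, ay in zip(x_accel, y_accel))
```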
  • the determination unit 182 determines whether the walking speed measured by the measurement unit 181 is less than the first threshold (walking speed < first threshold).
  • the first threshold in this embodiment is, for example, an average value of walking speeds measured when a plurality of sample users walk while looking at a virtual object.
  • the first threshold value is not limited to the average walking speed of a plurality of sample users.
  • the first threshold value may be an actual value of the walking speed of the user U.
  • for example, a tutorial on the glasses-type display device 20 may be used to measure the walking speed of the user U. During execution of the tutorial, the actual walking speed measured while the user U walks while looking at a virtual object displayed on the glasses-type display device 20 is set as the first threshold.
  • the calculation unit 183 calculates the degree of concentration of the user U based on the walking speed of the user U.
  • the user's concentration level refers to the degree of user U's attention to the virtual object displayed on the glasses-type display device 20 (that is, the degree to which the user is concentrating on the virtual object).
  • the degree of concentration of the user U in this embodiment is, for example, a positive value, and the larger the value, the higher the degree of concentration of the user U is.
  • the reason why the degree of concentration of the user U with respect to the virtual object can be calculated based on the walking speed of the user U is as follows.
  • when the user U is hurrying to a destination without paying attention to the virtual object displayed on the glasses-type display device 20, the user U tends to walk briskly, and the walking speed frequently exceeds the first threshold. On the other hand, when the user U walks while paying attention to the virtual object, the user walks somewhat slowly, and the walking speed often falls below the first threshold. Since the degree of concentration of the user U on the virtual object displayed on the glasses-type display device 20 thus correlates with the walking speed, the degree of concentration of the user U can be calculated based on the walking speed of the user U.
  • the calculation unit 183 calculates the degree of concentration of the user U based on the difference between the walking speed measured by the measurement unit 181 and the first threshold. In this embodiment, the calculation unit 183 calculates a larger degree of concentration as this difference becomes larger. That is, a first degree of concentration calculated when the difference between the measured walking speed and the first threshold is a first value is larger than a second degree of concentration calculated when the difference is a second value smaller than the first value.
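  • The monotone relation between this difference and the degree of concentration could be realized, for example, as follows (a sketch; the linear form and the numeric reference speed are assumptions, since the patent requires only a positive value that grows with the difference).

```python
FIRST_THRESHOLD = 1.2  # assumed reference walking speed (m/s)

def concentration(speed: float,
                  first_threshold: float = FIRST_THRESHOLD) -> float:
    """Degree of concentration of the user U: positive, and larger as the
    measured walking speed falls further below the first threshold."""
    return max(0.0, first_threshold - speed)
```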
  • the display control unit 184 controls the display of the virtual object on the glasses-type display device 20.
  • the display control unit 184 performs different processing depending on whether the determination result by the determination unit 182 is positive or negative.
  • If the determination result by the determination unit 182 is negative, the display control unit 184 generates image data representing a superimposed image.
  • a superimposed image is an image in which an image indicated by virtual object information is arranged in an area indicated by area information.
  • the virtual object information and area information are received by the mobile device 10 from the management device 30. Then, the display control unit 184 transmits the image data to the glasses-type display device 20 and causes the glasses-type display device 20 to display the superimposed image.
  • the display control unit 184 controls the display of the virtual object based on the degree of concentration of the user U calculated by the calculation unit 183. More specifically, when the determination result by the determination unit 182 is affirmative, the display control unit 184 determines whether the concentration level of the user U calculated by the calculation unit 183 exceeds the second threshold (the user's concentration > second threshold). When the degree of concentration of the user U calculated by the calculation unit 183 is less than or equal to the second threshold, the display control unit 184 causes the glasses-type display device 20 to display the above-mentioned superimposed image. When the degree of concentration of the user U calculated by the calculation unit 183 is greater than the second threshold, the display control unit 184 stops generating and transmitting image data representing the superimposed image. As a result, no virtual object is displayed on the glasses-type display device 20.
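  • Putting the determination and the two thresholds together, the decision made by the display control unit 184 might look like this (continuing the sketch above; SECOND_THRESHOLD is an assumed value).

```python
SECOND_THRESHOLD = 0.5  # assumed concentration threshold

def should_display(speed: float) -> bool:
    """True if the superimposed image should be shown on the glasses."""
    if speed >= FIRST_THRESHOLD:   # determination unit 182 answers "No"
        return True                # display as usual, no restriction
    # Determination "Yes": compare the concentration calculated by the
    # calculation unit 183 against the second threshold.
    return concentration(speed) <= SECOND_THRESHOLD
```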
  • the processing device 18 operating according to the program PR1 executes the display method shown in FIG. 5 every time it receives acceleration data from the communication device 15. As shown in FIG. 5, this display method includes each process from step SA110 to step SA150.
  • step SA110 the processing device 18 functions as the measurement unit 181.
  • step SA110 the processing device 18 measures the walking speed of the user U wearing the glasses-type display device 20 based on the acceleration data received by the communication device 15.
  • step SA120 the processing device 18 functions as the determination unit 182.
  • step SA120 the processing device 18 determines whether the walking speed measured in step SA110 is less than a first threshold. If the walking speed measured in step SA110 is equal to or greater than the first threshold, the determination result in step SA120 is "No" and the process in step SA130 is executed. If the walking speed measured in step SA110 is less than the first threshold, the determination result in step SA120 is "Yes” and the processes in step SA140 and step SA150 are executed.
  • step SA130 the processing device 18 functions as the display control unit 184.
  • step SA130 the processing device 18 generates image data representing the superimposed image. Then, the processing device 18 transmits the image data to the glasses-type display device 20 and causes the glasses-type display device 20 to display the superimposed image.
  • step SA140 the processing device 18 functions as the calculation unit 183.
  • step SA140 processing device 18 calculates the degree of concentration of user U based on the walking speed measured in step SA110.
  • step SA150 following step SA140, the processing device 18 functions as the display control unit 184.
  • step SA150 the processing device 18 controls the display of the virtual object on the glasses-type display device 20 based on the degree of concentration of the user U calculated in step SA140.
  • step SA150 the processing device 18 determines whether the degree of concentration of the user U calculated in step SA140 exceeds the second threshold. If the degree of concentration of the user U calculated in step SA140 is equal to or less than the second threshold, the processing device 18 causes the glasses-type display device 20 to display the superimposed image. If the degree of concentration of the user U calculated in step SA140 is greater than the second threshold, the processing device 18 stops displaying the superimposed image.
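  • The flow of FIG. 5 (steps SA110 to SA150) can be summarized by wiring the earlier sketches together; send_superimposed_image is a hypothetical stand-in for generating and transmitting the image data.

```python
def send_superimposed_image(virtual_info, area_info) -> None:
    """Hypothetical stand-in: generate image data of the superimposed
    image and transmit it to the glasses-type display device 20."""
    ...

def on_acceleration_data(x_accel, y_accel, z_accel,
                         virtual_info, area_info) -> None:
    """One pass of FIG. 5, run each time acceleration data arrives."""
    speed = walking_speed(x_accel, y_accel, z_accel)      # SA110
    if speed >= FIRST_THRESHOLD:                          # SA120: "No"
        send_superimposed_image(virtual_info, area_info)  # SA130
        return
    c = concentration(speed)                              # SA140
    if c <= SECOND_THRESHOLD:                             # SA150
        send_superimposed_image(virtual_info, area_info)
    # Otherwise generation and transmission of the image data stops and
    # no virtual object is displayed.
```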
  • FIG. 6 is a block diagram showing a configuration example of the eyeglass-type display device 20.
  • the glasses-type display device 20 includes a display section 2a, a communication device 2b, an imaging device 2c, an acceleration sensor 2g, a storage device 2d, a processing device 2e, and a bus 2f.
  • the components of the glasses-type display device 20 (display section 2a, communication device 2b, imaging device 2c, acceleration sensor 2g, storage device 2d, and processing device 2e) are interconnected by a bus 2f for communicating information.
  • the bus 2f may be configured using a single bus, or may be configured using different buses for each element such as a device.
  • the display section 2a is of a transmissive type. Light in the field of view of the glasses-type display device 20 (display section 2a) passes through the display section 2a.
  • the display unit 2a displays the superimposed image under the control of the mobile device 10.
  • the display section 2a is located in front of the user's U left and right eyes.
  • the user U wearing the glasses-type display device 20 thus perceives the image of the virtual object as appearing in real space.
  • the display unit 2a includes a left eye lens, a left eye display panel, a left eye optical member, a right eye lens, a right eye display panel, and a right eye optical member.
  • the display panel for the left eye and the display panel for the right eye are, for example, a liquid crystal panel or an organic EL (Electro Luminescence) panel.
  • the display panel for the left eye displays a superimposed image represented by image data provided from the mobile device 10.
  • the left eye optical member is an optical member that guides light emitted from the left eye display panel to the left eye lens.
  • the display panel for the right eye displays a superimposed image represented by image data provided from the mobile device 10.
  • the right eye optical member is an optical member that guides light emitted from the right eye display panel to the right eye lens.
  • Each of the left eye lens and the right eye lens has a half mirror.
  • the half mirror included in the left eye lens guides the light representing the real space to the left eye of the user U by transmitting the light representing the real space. Further, the half mirror included in the left eye lens reflects the light guided by the left eye optical member to the user U's left eye.
  • the half mirror included in the right eye lens guides the light representing the real space to the right eye of the user U by transmitting the light representing the real space.
  • the half mirror included in the right eye lens reflects the light guided by the right eye optical member to the user U's right eye.
  • the communication device 2b is a hardware device for communicating with the mobile device 10 by wire.
  • the communication device 2b may communicate with the mobile device 10 wirelessly.
  • the glasses-type display device 20 has a glasses-shaped frame that supports the left-eye lens and the right-eye lens, and the imaging device 2c is provided on a bridge of the frame.
  • the imaging device 2c captures an image of its own field of view.
  • the imaging device 2c outputs image data representing the captured image to the processing device 2e.
  • the acceleration sensor 2g is a three-axis acceleration sensor that detects acceleration in each of the X-axis, Y-axis, and Z-axis directions at regular intervals.
  • the acceleration sensor 2g outputs acceleration data representing acceleration in the directions of the X, Y, and Z axes to the processing device 2e at regular intervals.
  • the storage device 2d is a recording medium that can be read by the processing device 2e.
  • the storage device 2d, like the storage device 17, includes nonvolatile memory and volatile memory.
  • the storage device 2d stores a program PR2.
  • the processing device 2e includes one or more CPUs.
  • the processing device 2e reads the program PR2 from the storage device 2d.
  • the processing device 2e functions as an operation control unit 2e1 by executing the program PR2.
  • the operation control unit 2e1 controls the eyeglass-type display device 20.
  • the operation control unit 2e1 transmits the image data output from the imaging device 2c to the mobile device 10 using the communication device 2b. Further, the operation control unit 2e1 transmits acceleration data output from the acceleration sensor 2g to the mobile device 10 using the communication device 2b. Further, the operation control unit 2e1 supplies image data received from the mobile device 10 via the communication device 2b to the display unit 2a.
  • the display section 2a displays an image represented by the image data supplied from the operation control section 2e1.
  • the image data transmitted from the mobile device 10 to the glasses-type display device 20 represents the above-mentioned superimposed image. Since the superimposed image is displayed on the display unit 2a, the user U's eyes see an image of the real space on which the virtual object is superimposed.
  • While the determination result of step SA120 in the display control method executed by the mobile device 10 is "No", image data representing the superimposed image is transmitted from the mobile device 10 to the glasses-type display device 20, and the superimposed image is displayed on the display section 2a of the glasses-type display device 20. The user U therefore sees a real space on which virtual objects are superimposed, as in the image G2 shown in FIG. 3.
  • When the degree of concentration of the user U wearing the glasses-type display device 20 on the virtual object increases and the walking speed falls below the first threshold, the determination result in step SA120 becomes "Yes", and the degree of concentration calculated in step SA140 is compared with the second threshold in step SA150.
  • When the degree of concentration of the user U on the virtual object further increases and the walking speed of the user U further slows down, the degree of concentration calculated in step SA140 exceeds the second threshold, and the superimposed image is no longer displayed. As a result, the virtual object does not appear in the eyes of the user U, as in the image G1 shown in FIG. 2.
  • as described above, the display of the virtual object is controlled according to the degree of concentration of the user U on the virtual object displayed on the glasses-type display device 20, so unnecessary display restrictions are avoided while the user U is prevented from concentrating too much on the virtual object.
  • B Modifications
  • the present disclosure is not limited to the embodiments illustrated above. Specific aspects of the modification are as follows. Two or more aspects arbitrarily selected from the examples below may be combined.
  • B-1 Modification 1
  • the display control unit 184 in the embodiment described above limits the display of the virtual object by stopping the display of the virtual object when the degree of concentration of the user U calculated by the calculation unit 183 exceeds the second threshold. However, when the degree of concentration exceeds the second threshold, the display control unit 184 may instead cause the glasses-type display device 20 to display, in addition to the virtual object, a message warning against excessive concentration on the virtual object. In short, the display control unit 184 only needs to control the display of the virtual object on the glasses-type display device 20 based on the degree of concentration of the user U calculated by the calculation unit 183.
  • the measurement unit 181 calculated the walking speed of the user U by integrating the combined acceleration obtained by combining the acceleration in the X-axis direction and the acceleration in the Y-axis direction.
  • the measurement unit 181 may identify the walking speed of the user U by using a table prepared in advance. Specifically, the table has multiple records. Each of the plurality of records includes a walking speed within a certain range and a periodic swing amplitude of acceleration in the Z-axis direction obtained when the user U walks at the walking speed.
  • the above table is stored in the storage device 17 in advance.
  • the measurement unit 181 receives acceleration data from the glasses-type display device 20, and specifies the walking speed of the user U by reading, from the table, the walking speed corresponding to the swing amplitude of acceleration in the Z-axis direction represented by the acceleration data.
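  • A sketch of this table lookup (the amplitude ranges and speeds in SPEED_TABLE are invented placeholder values; the patent specifies only records pairing a walking speed with the Z-axis swing amplitude observed at that speed).

```python
# Hypothetical lookup table: (Z-axis swing amplitude range, m/s^2) -> speed (m/s).
SPEED_TABLE = [
    ((0.5, 1.5), 0.6),
    ((1.5, 2.5), 1.0),
    ((2.5, 4.0), 1.4),
]

def speed_from_z_amplitude(z_accel: list[float]) -> float:
    """Read out the walking speed whose stored Z-axis swing amplitude
    matches the amplitude observed in the received acceleration data."""
    amplitude = max(z_accel) - min(z_accel)
    for (low, high), speed in SPEED_TABLE:
        if low <= amplitude < high:
            return speed
    return 0.0  # no record matched; treat as not walking
```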
  • the glasses-type display device 20 includes a three-axis acceleration sensor, and the measurement unit 181 identifies the walking speed of the user based on the output data of the acceleration sensor.
  • the measurement unit 181 may measure the walking speed of the user based on the moving image captured by the imaging device 2c.
  • the measurement unit 181 may identify the walking speed of the user U wearing the glasses-type display device 20 by referring to a table prepared in advance. Specifically, the table has multiple records.
  • Each of the plurality of records includes a walking speed within a certain range and a blur width in the Z-axis direction of a moving image obtained by capturing an image while the user U is walking at the walking speed.
  • the measurement unit 181 specifies the walking speed of the user U by reading, from the table, the walking speed corresponding to the shake width in the Z-axis direction of the moving image captured by the imaging device 2c. Alternatively, the measurement unit 181 may determine whether the user U is walking based on the vertical image blur width of each frame of the moving image captured by the imaging device 2c, and, when the user U is determined to be walking, determine the walking speed from the change in apparent size, across frames, of an object whose actual size is known, such as a road sign or a vehicle license plate. When the user's walking speed is specified based on the moving image captured by the imaging device 2c, the acceleration sensor can be omitted.
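  • The known-size-object variant could be sketched with a pinhole-camera model (the focal length and the approaching-object sign convention are assumptions; the patent says only that the speed is determined from the change in size of a known-size object across frames).

```python
FOCAL_LENGTH_PX = 1400.0  # assumed focal length of imaging device 2c (pixels)

def distance_to_object(real_size_m: float, pixel_size: float) -> float:
    """Pinhole-camera estimate of the distance to an object of known
    real size (e.g. a road sign) from its apparent size in pixels."""
    return FOCAL_LENGTH_PX * real_size_m / pixel_size

def speed_from_frames(real_size_m: float, px_t1: float, px_t2: float,
                      dt: float) -> float:
    """Walking speed from the change in apparent size of a known-size
    object between two frames dt seconds apart (object assumed ahead of
    the walking user, so the distance shrinks as the user approaches)."""
    d1 = distance_to_object(real_size_m, px_t1)
    d2 = distance_to_object(real_size_m, px_t2)
    return (d1 - d2) / dt
```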
  • a GPS (Global Positioning System) receiver may be provided in the glasses-type display device 20 instead of the acceleration sensor, and the measurement unit 181 may identify the user's walking speed based on the output data of the GPS receiver. For example, the measurement unit 181 determines whether the user U is walking based on the image blur width in the Z-axis direction of each frame of the moving image captured by the imaging device 2c. When the user U is determined to be walking, the measurement unit 181 calculates the moving distance of the user from first position information acquired by the GPS receiver at time t1 and second position information acquired by the GPS receiver at time t2 after time t1, and sets the value obtained by dividing this moving distance by t2 - t1 as the walking speed.
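  • A sketch of the GPS variant (the patent specifies only distance divided by t2 - t1; the haversine great-circle formula used here to turn two latitude/longitude fixes into a distance in metres is a standard choice, not something the patent names).

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius (m)
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def gps_walking_speed(fix1: tuple[float, float, float],
                      fix2: tuple[float, float, float]) -> float:
    """fix = (latitude, longitude, unix time); speed = distance / (t2 - t1)."""
    lat1, lon1, t1 = fix1
    lat2, lon2, t2 = fix2
    return haversine_m(lat1, lon1, lat2, lon2) / (t2 - t1)
```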
  • the program PR1 is stored in the storage device 17 of the mobile device 10, but the program PR1 may be manufactured or sold separately.
  • the program PR1 may be provided to a purchaser by distributing a computer-readable recording medium, such as a flash ROM, on which the program PR1 is written, or by distribution via a telecommunications line, for example by download.
  • the measurement unit 181, determination unit 182, calculation unit 183, and display control unit 184 in the above embodiment were software modules.
  • any one, a plurality, or all of the measurement section 181, the determination section 182, the calculation section 183, and the display control section 184 may be a hardware module.
  • Specific examples of the hardware module include DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit), PLD (Programmable Logic Device), FPGA (Field Programmable Gate Array), and the like.
  • the mobile device 10 included the measurement section 181, the determination section 182, the calculation section 183, and the display control section 184.
  • the measurement unit 181 may be provided in the glasses-type display device 20, and the data representing the walking speed may be supplied from the glasses-type display device 20 to the mobile device 10.
  • the determination unit 182 may be omitted.
  • the display control device that controls the display of the virtual object on the glasses-type display device 20 only needs to include the calculation section 183 and the display control section 184.
  • the storage device 17 and the storage device 2d are exemplified above as including ROM, RAM, and the like, but each may instead be a flexible disk, a magneto-optical disk (for example, a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory device (for example, a card, stick, or key drive), a CD-ROM (Compact Disc-ROM), a register, a removable disk, a hard disk, a floppy (registered trademark) disk, a magnetic strip, a database, a server, or another suitable storage medium.
  • the information, signals, etc. described may be represented using any of a variety of different technologies.
  • data, instructions, commands, information, signals, bits, symbols, chips, and the like referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, light fields or photons, or any combination thereof.
  • the input/output information may be stored in a specific location (for example, memory) or may be managed using a management table. Information etc. to be input/output may be overwritten, updated, or additionally written. The output information etc. may be deleted. The input information etc. may be transmitted to other devices.
  • the determination may be made based on a value represented by one bit (0 or 1), based on a Boolean value (true or false), or by numerical comparison (for example, comparison with a predetermined value).
  • each function illustrated in FIG. 4 is realized by an arbitrary combination of at least one of hardware and software.
  • the method for realizing each functional block is not particularly limited. That is, each functional block may be realized using one physically or logically coupled device, or using two or more physically or logically separated devices connected directly or indirectly (for example, by wire or wirelessly).
  • the functional block may be realized by combining software with the one device or the plurality of devices.
  • the programs exemplified in the embodiments described above, whether referred to as software, firmware, middleware, microcode, hardware description language, or by another name, should be broadly construed to mean instructions, instruction sets, code, code segments, program code, a program, a subprogram, a software module, an application, a software application, a software package, a routine, a subroutine, an object, an executable file, a thread of execution, a procedure, a function, or the like.
  • software, instructions, information, etc. may be sent and received via a transmission medium.
  • for example, when software is transmitted from a website, server, or other remote source using wired technology (coaxial cable, fiber-optic cable, twisted pair, digital subscriber line (DSL), etc.) and/or wireless technology (infrared, microwave, etc.), these wired and/or wireless technologies are included within the definition of a transmission medium.
  • the information, parameters, etc. described in this disclosure may be expressed using absolute values, relative values from a predetermined value, or other corresponding information. It may also be expressed as
  • the mobile device includes a mobile station (MS).
  • a mobile station may also be referred to by those skilled in the art as a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term. In the present disclosure, terms such as "mobile station," "user terminal," "user equipment (UE)," and "terminal" may be used interchangeably.
  • the terms "connected" and "coupled," and all variations thereof, refer to any direct or indirect connection or coupling between two or more elements, and may include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other.
  • the bonds or connections between elements may be physical, logical, or a combination thereof.
  • connection may be replaced with "access.”
  • where connected, two elements can be considered to be "connected" or "coupled" to each other using, as some non-limiting and non-exhaustive examples, one or more electrical wires, cables, and/or printed electrical connections, as well as electromagnetic energy having wavelengths in the radio-frequency, microwave, and optical (both visible and invisible) ranges.
  • the terms "judgment" and "decision" used in this disclosure may encompass a wide variety of operations.
  • "judgment" and "decision" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (for example, searching in a table, database, or other data structure), or ascertaining as having made a "judgment" or "decision".
  • "judgment" and "decision" may include regarding receiving (for example, receiving information), transmitting (for example, transmitting information), input, output, or accessing (for example, accessing data in memory) as having made a "judgment" or "decision".
  • "judgment" and "decision" may include regarding resolving, selecting, choosing, establishing, comparing, and the like as having made a "judgment" or "decision".
  • judgment and “decision” may include regarding some action as having been “judged” or “determined.”
  • "judgment" ("decision") may be read as "assuming", "expecting", "considering", and the like.
  • the display control device includes a calculation section 183 and a display control section 184.
  • the calculation unit 183 calculates the degree of concentration of the user U with respect to the virtual object displayed on the transparent display device worn on the head of the user U, based on the walking speed of the user.
  • the display control unit 184 controls the display of the virtual object on the transparent display device based on the user concentration level calculated by the calculation unit 183.
  • the mobile device 10 is an example of a display control device of the present disclosure.
  • the glasses-type display device 20 is an example of a transmissive display device in the present disclosure.
  • the display of the virtual object is controlled based on the user's degree of concentration on the virtual object displayed on the transparent display device, so unnecessary display restrictions can be avoided and the user can be prevented from concentrating too much on the virtual object.
  • the display control device may further include a measurement unit 181 that measures the walking speed of the user.
  • the display control device of the second aspect can control the display of the virtual object according to the concentration degree calculated based on the walking speed measured by the measurement unit 181.
  • the display control device may include a determination unit 182 that determines whether the walking speed of the user has fallen below a first threshold.
  • when the determination result by the determination unit 182 is affirmative (when the walking speed is less than the first threshold), the calculation unit 183 in the display control device according to the third aspect may calculate the difference between the first threshold and the walking speed of the user, and calculate the user's degree of concentration based on the difference.
  • the display control unit 184 in the display control device according to the third aspect may limit the display of the virtual object on the display device based on the user's degree of concentration calculated by the calculation unit 183.
  • the display control device of the third aspect can limit the display of the virtual object on the display device based on the user's concentration level calculated by the calculation unit 183 when the user's walking speed is less than the first threshold.
  • the display control unit 184 in the display control device according to the fourth aspect may restrict the display of the virtual object on the transmissive display device when the user's degree of concentration calculated by the calculation unit 183 exceeds the second threshold.
  • the display control device of the fourth aspect can restrict the display of the virtual object on the transparent display device when the user's concentration level calculated based on the user's walking speed exceeds the second threshold.
  • SYMBOLS: 1...display system, 10...mobile device, 20...glasses-type display device, 11...input device, 12...output device, 14, 15, 2b...communication device, 17, 2d...storage device, 18, 2e...processing device, 181...measurement unit, 182...determination unit, 183...calculation unit, 184...display control unit, 19, 2f...bus, 2a...display section, 2g...acceleration sensor, 2e1...operation control unit, PR1, PR2...program.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to the present disclosure, a mobile device, which is one aspect of a display control device, communicates with a transmissive display device mounted on a user's head, thereby controlling the display of a virtual object on the transmissive display device. The mobile device includes a calculation unit and a display control unit. The calculation unit calculates, on the basis of the user's walking speed, the user's degree of concentration on the virtual object displayed on the transmissive display device. The display control unit controls the display of the virtual object on the transmissive display device on the basis of the degree of concentration calculated by the calculation unit.
PCT/JP2023/009936 2022-05-11 2023-03-14 Display control device WO2023218751A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-078051 2022-05-11
JP2022078051 2022-05-11

Publications (1)

Publication Number Publication Date
WO2023218751A1 (fr)

Family

ID=88729984

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/009936 WO2023218751A1 (fr) Display control device

Country Status (1)

Country Link
WO (1) WO2023218751A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015145544A1 (fr) * 2014-03-24 2015-10-01 パイオニア株式会社 Display control device, control method, program, and storage medium
JP2017068595A (ja) * 2015-09-30 2017-04-06 ソニー株式会社 Information processing device, information processing method, and program
JP2021165864A (ja) * 2018-06-18 2021-10-14 ソニーグループ株式会社 Information processing device, information processing method, and program

Similar Documents

Publication Publication Date Title
US8558759B1 (en) Hand gestures to signify what is important
US9928655B1 (en) Predictive rendering of augmented reality content to overlay physical structures
US10109110B2 (en) Reality augmentation to eliminate, or de-emphasize, selected portions of base image
US20130179303A1 (en) Method and apparatus for enabling real-time product and vendor identification
US9633477B2 (en) Wearable device and method of controlling therefor using location information
US11126848B2 (en) Information processing device, information processing method, and information processing program
  • KR20160145976A Video sharing method and electronic device performing the same
US11282481B2 (en) Information processing device
US20160018643A1 (en) Wearable display device and control method thereof
US10761694B2 (en) Extended reality content exclusion
  • WO2019130991A1 Information processing device
  • CN111527466A Information processing device, information processing method, and program
  • JP2019114078A Information processing device, information processing method, and program
US20220207156A1 (en) High throughput storage encryption
  • WO2023218751A1 Display control device
US20180158242A1 (en) Information processing method and program for executing the information processing method on computer
US20220198794A1 (en) Related information output device
US20220365741A1 (en) Information terminal system, method, and storage medium
  • WO2023210195A1 Identification system
  • WO2022251831A1 Reduction of light leakage via external gaze detection
  • EP3438939A1 Information processing device, information processing method, and program
US20240095348A1 (en) Instant detection of a homoglyph attack when reviewing code in an augmented reality display
  • WO2023223750A1 Display device
  • WO2023204159A1 Display control device
  • WO2023026798A1 Display control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23803235

Country of ref document: EP

Kind code of ref document: A1