JP2018005005A - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
JP2018005005A
Authority
JP
Japan
Prior art keywords
real object
user
object
information processing
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2016132696A
Other languages
Japanese (ja)
Inventor
Hirotake Ichikawa
Naoyuki Sato
Seiji Suzuki
Masato Shimakawa
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Priority to JP2016132696A
Publication of JP2018005005A
Application status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36 Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/38 Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, with means for controlling the display position

Abstract

PROBLEM TO BE SOLVED: To provide an information processing device, an information processing method, and a program which can change a user's degree of recognition of a real object included in the user's visual field in a manner appropriate to the real object.

SOLUTION: The information processing device is provided with an output control unit which, on the basis of the result of determining whether or not a real object included in the user's visual field is a first real object, controls display by a display unit in the area between the user and the real object so as to change the user's degree of recognition of the real object.

SELECTED DRAWING: Figure 10

Description

  The present disclosure relates to an information processing apparatus, an information processing method, and a program.

  In recent years, various technologies related to augmented reality (AR) in which additional information is superimposed on the real world and presented to the user have been developed (see, for example, Patent Document 1 below).

  In addition, a technique for improving visibility when displaying AR content has also been proposed. For example, Patent Document 2 below describes a technique for controlling display of a display object on a transmissive display so that a user can visually recognize an actual object located behind the transmissive display through the transmissive display.

Patent Document 1: JP 2012-155654 A
Patent Document 2: Japanese Patent No. 5830987

  Incidentally, it is desirable that the degree of recognition of a real object included in the user's field of view vary depending on the type of the real object. However, Patent Document 2 does not disclose changing how a display object is displayed on the transmissive display according to the real object.

  Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of changing the user's degree of recognition of a real object adaptively to the real object included in the user's field of view.

  According to the present disclosure, there is provided an information processing apparatus including an output control unit that controls display by a display unit, based on a determination result of whether or not a real object included in a user's field of view is a first real object, such that the user's degree of recognition of the real object changes in a range between the user and the real object.

  Further, according to the present disclosure, there is provided an information processing method including controlling, by a processor, display by a display unit based on a determination result of whether or not a real object included in a user's field of view is a first real object, such that the user's degree of recognition of the real object changes in a range between the user and the real object.

  In addition, according to the present disclosure, there is provided a program for causing a computer to function as an output control unit that controls display by a display unit, based on a determination result of whether or not a real object included in a user's field of view is a first real object, such that the user's degree of recognition of the real object changes in a range between the user and the real object.

  As described above, according to the present disclosure, the user's degree of recognition of a real object can be changed adaptively to the real object included in the user's field of view. Note that the effects described here are not necessarily limiting, and any of the effects described in the present disclosure may be achieved.

FIG. 1 is an explanatory diagram showing a configuration example of an information processing system according to an embodiment of the present disclosure.
FIG. 2 is a schematic diagram showing how a user visually recognizes a real object and a virtual object through an AR glass 10.
FIG. 3 is another schematic diagram showing how a user visually recognizes a real object and a virtual object through the AR glass 10.
FIG. 4 is a diagram showing the real object and the virtual object visually recognized through the display unit 124 in the situations shown in FIGS. 2 and 3.
FIG. 5 is a functional block diagram showing a configuration example of the AR glass 10 according to the embodiment.
FIG. 6 is a diagram showing what the user sees through the AR glass 10 when a real object 30a is set to avoid concealment.
FIG. 7A is a diagram showing an example of the real object and the virtual object visually recognized through the display unit in the situation shown in FIG. 6.
FIG. 7B is a diagram showing another such example.
FIG. 7C is a diagram showing yet another such example.
FIG. 8 is a diagram showing an example of a position change allowable range set for a virtual object 32.
FIG. 9A is a diagram showing a display example of a virtual object on an evacuation route.
FIG. 9B is a diagram showing another display example of a virtual object on an evacuation route.
FIG. 10 is a flowchart showing an operation example according to the embodiment.
FIG. 11 is an explanatory diagram showing a hardware configuration example of the AR glass 10 according to the embodiment.

  Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the present specification and drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.

  In the present specification and drawings, a plurality of components having substantially the same functional configuration may be distinguished by appending different letters to the same reference numeral. For example, a plurality of components having substantially the same functional configuration are distinguished as necessary, such as the AR glass 10a and the AR glass 10b. However, when it is not necessary to particularly distinguish such components, only the same reference numeral is given. For example, the AR glass 10a and the AR glass 10b are simply referred to as the AR glass 10 when it is not necessary to distinguish between them.

Further, the detailed description of the invention proceeds in the following order.
1. Configuration of information processing system
2. Detailed description of embodiment
3. Hardware configuration
4. Modifications

<< 1. Configuration of information processing system >>
First, a configuration of an information processing system according to an embodiment of the present disclosure will be described with reference to FIG. As shown in FIG. 1, the information processing system includes an AR glass 10, a server 20, and a communication network 22.

<1-1. AR Glass 10>
The AR glass 10 is an example of the information processing apparatus according to the present disclosure. The AR glass 10 is a device that controls display of virtual objects associated in advance with positions in the real world. For example, the AR glass 10 first acquires, from the server 20 via the communication network 22, virtual objects located around the position of the AR glass 10 (for example, within a certain range in all directions) based on the position information of the AR glass 10. Then, based on the posture of the AR glass 10 (or the detection result of the user's line-of-sight direction), the AR glass 10 displays, on the display unit 124 described later, the virtual objects included in the user's field of view among the acquired virtual objects. For example, the AR glass 10 generates a right-eye image and a left-eye image based on the acquired virtual objects, displays the right-eye image on the right-eye display unit 124a, and displays the left-eye image on the left-eye display unit 124b. This allows the user to visually recognize a virtual stereoscopic image.

  Here, the virtual object is basically a 3D object, but is not limited to such an example, and may be a 2D object. Further, the display unit 124 of the AR glass 10 is configured by a transmissive display.

  FIGS. 2 and 3 are schematic diagrams showing how a user wearing the AR glass 10 visually recognizes a real object 30 and a virtual object 32. As shown in FIG. 2 (and FIG. 3), the user can simultaneously view, through the display unit 124, the virtual object 32 and those of the plurality of real objects 30 located in the real world that are included in the user's field of view 40. In the following description, "real object" includes not only a single real object but also a predetermined area in the real world (for example, an entire building, an intersection, a corridor, etc.).

  Note that the user's field of view 40 may be defined in various ways. For example, the field of view 40 may be estimated to be approximately the center of the area captured by a camera provided on the outside (front side) of the AR glass 10. Alternatively, the user's gaze direction may be estimated based on an eyeball image captured by a camera provided on the inside (rear side) of the AR glass 10, and a predetermined three-dimensional space corresponding to the gaze direction may be estimated to be the user's field of view 40. The three-dimensional shape of the field of view 40 may be determined as appropriate, but is preferably defined as a substantially conical shape.
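  For illustration (not part of the patent), the following is a minimal sketch of such a conical field-of-view test; the function name, the 45° half-angle, and the 30 m depth bound are assumed example values.

    import numpy as np

    def in_conical_field_of_view(point, eye_pos, gaze_dir,
                                 half_angle_deg=45.0, max_range=30.0):
        """Test whether a world-space point lies inside a cone whose apex is
        the user's eye and whose axis is the estimated gaze direction."""
        gaze = np.asarray(gaze_dir, dtype=float)
        gaze /= np.linalg.norm(gaze)
        to_point = np.asarray(point, dtype=float) - np.asarray(eye_pos, dtype=float)
        dist = np.linalg.norm(to_point)
        if dist == 0.0:
            return True                  # the apex itself counts as visible
        if dist > max_range:
            return False                 # beyond the cone's depth bound
        cos_angle = float(np.dot(to_point / dist, gaze))
        return cos_angle >= np.cos(np.radians(half_angle_deg))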

  Also, the relationship between the user's field of view 40 and the display area of the display unit 124 can be determined in various ways. For example, as shown in FIGS. 2 and 4, the relationship may be such that the region where the user's field of view 40a intersects the display area of the display unit 124 covers the entire display area. Alternatively, as illustrated in FIGS. 3 and 4, the relationship may be such that the region where the user's field of view 40b intersects the display area of the display unit 124 is smaller than the entire display area.

  FIG. 4 is a diagram illustrating an example of the real object 30 and the virtual object 32 that are visually recognized through the display unit 124 in the situation illustrated in FIGS. 2 and 3. In FIG. 4, it is assumed that the virtual object 32 is a non-transparent object.

  In the example shown in FIGS. 2 and 3, the virtual object 32 is located between the user and the real objects 30a and 30b. Therefore, as shown in FIG. 4, a part of each of the real object 30a and the real object 30b is hidden from the user by the virtual object 32.

  The AR glass 10 can communicate with the server 20 via the communication network 22.

<1-2. Server 20>
The server 20 is a device that stores virtual objects in association with real-world position information. Here, the real-world position information may be information including latitude and longitude, or may be floor-plan information of a predetermined building. When the server 20 receives a virtual object acquisition request from another device such as the AR glass 10, the server 20 transmits the virtual objects corresponding to the acquisition request to that device.
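For illustration (not part of the patent), a toy sketch of this role follows; the class and method names are assumptions, and a real service would use a spatial index keyed by latitude/longitude or floor-plan coordinates.

    import math

    class VirtualObjectStore:
        """Minimal stand-in for the server 20: virtual objects registered
        with a real-world position and returned by proximity queries."""

        def __init__(self):
            self._entries = []           # list of (position, object) pairs

        def register(self, position, obj):
            self._entries.append((tuple(position), obj))

        def fetch_around(self, center, radius):
            """Return every object within `radius` of `center` (both in the
            same world frame), mimicking the acquisition-request response."""
            return [obj for pos, obj in self._entries
                    if math.dist(pos, center) <= radius]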

<1-3. Communication network 22>
The communication network 22 is a wired or wireless transmission path for information transmitted from devices connected to the communication network 22. For example, the communication network 22 may include public networks such as a telephone network, the Internet, and satellite communication networks, various LANs (Local Area Networks) including Ethernet (registered trademark), and wide area networks (WANs). The communication network 22 may also include a dedicated network such as an IP-VPN (Internet Protocol-Virtual Private Network).

<1-4. Summary of issues>
The configuration of the information processing system according to this embodiment has been described above. Incidentally, particularly when the display unit 124 is a high-luminance display, a real object located in the field of view of the user wearing the AR glass 10 can, as described above, be hidden by a virtual object located between the real object and the user and become difficult to see. Therefore, various dangers may arise when the user acts while wearing the AR glass 10. For example, a real object approaching the user may hit the user because the user is unaware of its existence. Alternatively, the user may fail to notice a stationary real object and collide with it. Alternatively, the user may mistake a virtual traffic light for a real traffic light and drive dangerously.

  The AR glass 10 according to the present embodiment has been created in view of the above circumstances. The AR glass 10 according to the present embodiment controls display by the display unit 124, based on the determination result of whether or not a real object included in the user's field of view is a specific real object, so that the degree of recognition of the real object changes in the range between the user and the real object. For this reason, for example, when a virtual object exists between a specific real object and the user, the AR glass 10 can control the display of the virtual object so that the user can recognize the real object. As a result, safety when displaying virtual objects can be improved.

<< 2. Detailed Description of Embodiment >>
<2-1. Configuration>
The configuration of the information processing system according to this embodiment has been described above. Next, the configuration of the AR glass 10 according to the present embodiment will be described in detail. FIG. 5 is a functional block diagram showing a configuration example of the AR glass 10 according to the present embodiment. As illustrated in FIG. 5, the AR glass 10 includes a control unit 100, a communication unit 120, a sensor unit 122, a display unit 124, and a storage unit 126.

{2-1-1. Control unit 100}
The control unit 100 centrally controls the operation of the AR glass 10 using hardware such as a CPU (Central Processing Unit) 150 and a RAM (Random Access Memory) 154, described later, built into the AR glass 10. As illustrated in FIG. 5, the control unit 100 includes a virtual object acquisition unit 102, a real object determination unit 104, an overlap determination unit 106, and an output control unit 108.

{2-1-2. Virtual object acquisition unit 102}
The virtual object acquisition unit 102 acquires virtual objects to be displayed from the server 20 based on the position information of the AR glass 10 measured by the sensor unit 122 described later. For example, the virtual object acquisition unit 102 first transmits the position information measured by the sensor unit 122 to the server 20 and thereby acquires from the server 20 a plurality of virtual objects located around that position (for example, within a certain range in all directions). The virtual object acquisition unit 102 then extracts, from the plurality of received virtual objects, the virtual objects included in the user's field of view as the virtual objects to be displayed, based on the posture of the AR glass 10 or the user's line-of-sight direction measured by the sensor unit 122.
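For illustration (not part of the patent), the following sketch mimics this acquire-then-extract flow, reusing the in_conical_field_of_view test and the VirtualObjectStore sketched above; the VirtualObject fields are assumptions.

    from dataclasses import dataclass

    @dataclass
    class VirtualObject:
        object_id: int
        position: tuple                  # default world-frame position
        opacity: float = 1.0
        visible: bool = True

    def acquire_display_targets(server, device_pos, gaze_dir, radius=50.0):
        """Mimic the virtual object acquisition unit 102: fetch virtual
        objects around the device position from the server, then keep
        only those inside the user's field of view."""
        nearby = server.fetch_around(device_pos, radius)
        return [obj for obj in nearby
                if in_conical_field_of_view(obj.position, device_pos, gaze_dir)]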

{2-1-3. Real object determination unit 104}
The real object determination unit 104 determines whether or not each real object included in the user's field of view is set to avoid concealment. Here, a real object that is set to avoid concealment is an example of the first real object in the present disclosure, and a real object that is not set to avoid concealment is an example of the second real object in the present disclosure.

(2-1-3-1. Determination example)
As will be described in detail later, for example, concealment avoidance setting conditions may be registered in advance for each type of real object. In this case, the real object determination unit 104 first performs object recognition on each real object included in the user's field of view based on, for example, a captured image of the area in front of the user captured by the sensor unit 122. Then, for each real object in the user's field of view, the real object determination unit 104 determines whether or not the real object is a concealment avoidance target based on the concealment avoidance setting condition corresponding to the recognized object type.

  Note that the list of concealment avoidance setting conditions may be stored in the storage unit 126 or in the server 20. In the latter case, the real object determination unit 104 sends to the server 20 an inquiry as to whether or not each recognized real object is a concealment avoidance target and, from the answer, identifies whether or not each real object in the user's field of view is set to avoid concealment.

(2-1-3-2. Setting example of concealment avoidance target)
Here, the above-described setting of concealment avoidance for each type of real object will be described concretely. For example, public signs such as traffic lights, road signs, and construction-site signboards can be permanently set to avoid concealment. Concealment avoidance can also be permanently set for predetermined areas such as a pedestrian crossing (the entire space above it), an intersection (the entire space above it), or an evacuation route in a building such as a leisure facility. This allows the user to pass through, or drive a car through, these areas more safely.

  Other types of real objects can be set to avoid concealment dynamically based on predetermined criteria. Here, the predetermined criteria may include the positional relationship between the real object and the user. For example, when the user is located in front of a traffic light or an advertisement, these real objects are set to avoid concealment; when the user is located to the side of or behind them, they are not set to avoid concealment.

  The predetermined criteria may also include the distance between the real object and the user. In this case, when the distance between the real object and the user is equal to or smaller than a predetermined distance, the real object can be set to avoid concealment. Alternatively, the predetermined criteria may include the moving direction, speed, and/or acceleration of the real object. For example, when a real object (for example, a ball or a car) is approaching the user and its speed is equal to or higher than a predetermined threshold, the real object can be dynamically set to avoid concealment. Alternatively, the predetermined criteria may include the user's moving direction, speed, and/or acceleration. For example, when the user is moving toward a certain real object and the user's speed is equal to or higher than a predetermined threshold, the real object can be dynamically set to avoid concealment.
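  For illustration (not part of the patent), a sketch of such a dynamic decision based on two of these criteria (proximity, and closing speed) follows; the threshold values are assumptions.

    import numpy as np

    def needs_concealment_avoidance(obj_pos, obj_vel, user_pos, user_vel,
                                    dist_threshold=2.0, speed_threshold=3.0):
        """Decide dynamically whether a real object should be set to avoid
        concealment: it is close to the user, or closing on the user fast."""
        to_user = np.asarray(user_pos, float) - np.asarray(obj_pos, float)
        distance = float(np.linalg.norm(to_user))
        if distance <= dist_threshold:
            return True                  # the object is already close
        # Closing speed: positive when the object-user gap is shrinking.
        rel_vel = np.asarray(obj_vel, float) - np.asarray(user_vel, float)
        closing_speed = float(np.dot(rel_vel, to_user / distance))
        return closing_speed >= speed_threshold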

  Alternatively, the predetermined criteria may include a recognition result of another person's behavior. For example, when it is detected that another person is facing the user and speaking, that person can be set to avoid concealment.

  Alternatively, the predetermined criteria may include the state of the real object. For example, the predetermined criteria may include the temperature of the real object. In this case, when a predetermined temperature sensor detects that the temperature of the real object (for example, a kettle) is equal to or higher than a predetermined threshold, the real object can be set to avoid concealment. The predetermined threshold may be determined for each type of real object.

  Alternatively, the predetermined criteria may include the device state of a real object (for example, an electronic device). For example, a television receiver or a PC (Personal Computer) monitor can be set to avoid concealment only while its power is ON. Further, when it is detected that an electronic device in operation has failed, the electronic device can be set to avoid concealment.

  Alternatively, the predetermined criteria may include the generation of sound, light, or smoke from a real object. For example, when a predetermined sensor detects that smoke is being generated from a real object, the real object can be set to avoid concealment. Or, when the generation of a sound is detected, such as a clock, the opening and closing of a door at an entrance, a knock on a door, an entrance chime, a telephone, a kettle, the collision or fall of a real object, the timer of any of various electronic devices, or a fire alarm, the corresponding real object can be set to avoid concealment. Thereby, as will be described later, the output control unit 108 can hide a virtual object located between the sound source and the user. Accordingly, when the user looks in the direction from which the sound arrives, no virtual object is displayed on the line of sight toward the sound source, so the user can clearly perceive the sound source.

  Alternatively, the predetermined criteria may include the presence or absence of a contract and the billing status. For example, if it is registered in the server 20 that a contract regarding the display of an advertisement or a product has been concluded with the AR service operator, concealment avoidance can be set for the advertisement or product only during the contract period (or while billing continues).

{2-1-4. Overlap determination unit 106}
The overlap determination unit 106 determines whether or not there is an overlap between a real object determined by the real object determination unit 104 to be a concealment avoidance target among the real objects included in the user's field of view and a display-target virtual object acquired by the virtual object acquisition unit 102. For example, the overlap determination unit 106 first specifies distance information (a depth map) for all virtual objects to be displayed based on the position and posture of the AR glass 10 (or the detection result of the user's line-of-sight direction). Next, the overlap determination unit 106 specifies distance information (a depth map) for all real objects set to avoid concealment, based on the position and posture of the AR glass 10 (or the detection result of the user's line-of-sight direction). Then, by comparing the two pieces of distance information, the overlap determination unit 106 determines, for each real object set to avoid concealment, whether or not it overlaps a display-target virtual object. Specifically, for each real object set to avoid concealment, the overlap determination unit 106 determines whether or not a virtual object exists between the real object and the AR glass 10 and, if so, identifies all corresponding virtual objects.
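For illustration (not part of the patent), the following sketch performs this comparison on per-pixel depth and object-id buffers rendered from the user's viewpoint; the buffer layout is an assumed concrete form of the "distance information" described above.

    import numpy as np

    def find_occluding_virtual_objects(real_depth, real_ids,
                                       virtual_depth, virtual_ids):
        """Compare the two depth maps. Arrays of shape (H, W): depth maps
        hold np.inf where nothing is present, id maps hold -1 there.
        Returns {real object id: ids of virtual objects lying between
        that real object and the user}."""
        covered = (real_ids >= 0) & (virtual_ids >= 0) & (virtual_depth < real_depth)
        occluders = {}
        for r, v in zip(real_ids[covered], virtual_ids[covered]):
            occluders.setdefault(int(r), set()).add(int(v))
        return occluders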

{2-1-5. Output control unit 108}
(2-1-5-1. Control example 1)
Based on the determination result by the real object determination unit 104 and the determination result by the overlap determination unit 106, the output control unit 108 controls display by the display unit 124 so that the degree of recognition of the real object changes within a predetermined range located between the real object and the user. For example, the output control unit 108 controls display by the display unit 124 so that, among the real objects included in the user's field of view, a real object set to avoid concealment has a higher degree of recognition than a real object not set to avoid concealment. The predetermined range may be a range that is located between the real object and the user and does not include the real object.

- Changing the display attributes of the virtual object
As an example, only for real objects that are set to avoid concealment among the real objects included in the user's field of view, the output control unit 108 changes the display mode of the virtual object so that the visibility of the virtual object located in the predetermined range decreases. In this case, the output control unit 108 may hide all or part of the virtual object (for example, the portion overlapping the real object set to avoid concealment and its vicinity). Alternatively, the output control unit 108 may make the virtual object translucent, display only the outline of the virtual object (wire-frame display), or blink the virtual object at predetermined time intervals.
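For illustration (not part of the patent), a minimal sketch of these visibility-lowering modes follows; the renderer attributes set here (visible, opacity, wireframe, blink_period) are assumptions.

    from enum import Enum, auto

    class DisplayMode(Enum):
        DEFAULT = auto()
        HIDDEN = auto()
        TRANSLUCENT = auto()
        OUTLINE = auto()                 # wire-frame display
        BLINKING = auto()

    def lower_visibility(virtual_obj, mode, alpha=0.3, blink_period_s=0.5):
        """Apply one visibility-lowering mode to an occluding virtual object."""
        virtual_obj.visible = mode is not DisplayMode.HIDDEN
        virtual_obj.opacity = alpha if mode is DisplayMode.TRANSLUCENT else 1.0
        virtual_obj.wireframe = mode is DisplayMode.OUTLINE
        virtual_obj.blink_period = (blink_period_s
                                    if mode is DisplayMode.BLINKING else None)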

  Here, the above function will be described in more detail with reference to FIGS. 6 and 7. FIG. 6 is an explanatory diagram showing an example in which the real object 30a is set to avoid concealment. As shown in FIG. 6, a certain range around the real object 30a can be set as the concealment avoidance area 50. The concealment avoidance area 50 is an area (space) where concealment by a virtual object is avoided.

  In the situation illustrated in FIG. 6, for example, the output control unit 108 hides the virtual object 32 positioned between the concealment avoidance area 50 and the display unit 124. As a result, as shown in FIG. 7A, the entire real object 30a is visible to the user on the display unit 124 without being hidden by the virtual object 32. Alternatively, the output control unit 108 makes the virtual object 32 translucent. As a result, as shown in FIG. 7B, the entire real object 30a is visible to the user on the display unit 124 without being hidden by the virtual object 32.

--- Modification
When changing the display mode of a virtual object, the output control unit 108 may control the display so as to emphasize the change. For example, when switching to an outline-only display of the virtual object, the output control unit 108 may temporarily and gently flash the vicinity of the outline. Alternatively, when a real object set to avoid concealment is moving, the output control unit 108 may hide a virtual object located on the movement trajectory of the real object while displaying an animation of the virtual object exploding or collapsing, for example. Alternatively, the output control unit 108 may change the display mode of the virtual object while outputting a sound or generating a vibration on the AR glass 10 or another device carried by the user. According to these control examples, the user can be notified more emphatically that the display mode of the virtual object has changed.

- Changing the display position of the virtual object
Alternatively, the output control unit 108 may display the virtual object shifted from its default display position so that the real object set to avoid concealment does not overlap the virtual object. For example, the output control unit 108 shifts the virtual object so that it does not overlap the real object set to avoid concealment while keeping the amount of position change as small as possible.

  In the example illustrated in FIG. 6, the output control unit 108 shifts the display position of the virtual object 32 so that the virtual object 32 does not overlap the region where the concealment avoidance area 50 is visually recognized on the display unit 124. As a result, as shown in FIG. 7C, the entire real object 30a is visible to the user on the display unit 124 without being hidden by the virtual object 32.

  For each virtual object, an allowable change range (upper limit) for position, posture, and size may be set in advance. For example, when the default position of a certain virtual object is (x, y, z) = (10 m, 2 m, 1 m), the allowable position change range may be set to -50 cm to 50 cm in each direction, the allowable rotation range to -30° to 30° about each axis, and the allowable size range to 0.4 times to 1.0 times. Setting such allowable change ranges prevents a virtual object from being displayed greatly differently from its default display position.
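  For illustration (not part of the patent), the following sketch clamps a shifted display position to a per-axis allowable range, using the ±50 cm example above; extending it to rotation and scale limits is analogous.

    import numpy as np

    def clamp_shift(default_pos, desired_pos, max_offset_m=0.5):
        """Clamp a shifted display position so that each axis stays within
        the allowable position change range around the default position."""
        default_pos = np.asarray(default_pos, dtype=float)
        offset = np.asarray(desired_pos, dtype=float) - default_pos
        return default_pos + np.clip(offset, -max_offset_m, max_offset_m)

    # The avoidance logic asks for a 0.8 m shift along x, but the allowable
    # range caps the displacement at 0.5 m per axis:
    # clamp_shift((10.0, 2.0, 1.0), (10.8, 2.0, 1.0)) -> [10.5, 2.0, 1.0]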

  Here, the change allowable range may be defined in a world coordinate system, or may be defined in a user coordinate system (for example, a coordinate system based on the AR glass 10).

  Alternatively, the allowable change range may be defined in a vector space. FIG. 8 is an explanatory diagram showing an example in which the allowable change range 50 of a virtual object is defined in a vector space. For example, at each point within the allowable change range 50, an allowable rotation amount and expansion/contraction value for each axis can be set.

--- Modification 1
As a modification, the output control unit 108 may predict the user's future movement in advance and predetermine the time series of the virtual object's display positions based on the prediction result. This can reduce the load when updating the display of the virtual object.

--- Modification 2
In general, the positions of individual people (that is, real objects) can change continually, for example in a crowd. Therefore, as another modification, the output control unit 108 can predetermine the time series of the virtual object's display positions by predicting the movement of one or more real objects set to avoid concealment (path finding). This eliminates the need to recalculate the shifted position of the virtual object every time a surrounding person moves, reducing the load when updating the display of the virtual object.

--- Modification 3
As another modification, the output control unit 108 may change how the virtual object is shifted based on, for example, the predetermined criteria described above. For example, the output control unit 108 may shift a virtual object located between a real object and the display unit 124 at a higher speed, or by a greater distance, the faster the real object is moving toward the user. Alternatively, the output control unit 108 may shift a virtual object located between the real object and the display unit 124 at a higher speed, or by a greater distance, the higher the temperature of the real object set to avoid concealment. According to these control examples, the magnitude of the danger can be conveyed to the user.

- Displaying another virtual object
Alternatively, the output control unit 108 may newly display another virtual object related to the corresponding real object at a position that does not overlap the existing virtual object. For example, when the corresponding real object is a traffic light, the output control unit 108 may newly display a virtual traffic light at a position shifted from the virtual object displayed over the traffic light. Alternatively, the output control unit 108 may newly display information (text or an image) indicating the current lighting color of the traffic light, or may tint the entire display unit 124 lightly with the current lighting color of the traffic light.

(2-1-5-2. Control example 2)
The output control unit 108 may also dynamically change the display mode of a virtual object located between a real object set to avoid concealment and the display unit 124, for example based on the predetermined criteria described above. For example, the output control unit 108 dynamically changes the display mode of the virtual object located between the real object and the display unit 124 based on the change in the distance between the real object and the display unit 124. More specifically, as the distance between the real object and the display unit 124 decreases, the output control unit 108 may increase the transparency of the virtual object, gradually increase the mesh size of the virtual object, or gradually turn the virtual object into a wire frame. As an example, when the user is driving a car, the output control unit 108 may increase the transparency of a virtual object located between a traffic light ahead of the user and the display unit 124 as the distance between the traffic light and the display unit 124 decreases.
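For illustration (not part of the patent), a sketch of the distance-to-transparency mapping follows; the linear ramp and the near/far constants are assumptions.

    def opacity_from_distance(distance_m, near_m=1.0, far_m=10.0,
                              min_opacity=0.1):
        """Map the distance between the concealment-avoidance real object
        and the display unit to the occluding virtual object's opacity:
        fully opaque when far, increasingly transparent as it approaches."""
        if distance_m <= near_m:
            return min_opacity
        if distance_m >= far_m:
            return 1.0
        t = (distance_m - near_m) / (far_m - near_m)   # 0 at near, 1 at far
        return min_opacity + t * (1.0 - min_opacity)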

  Alternatively, the output control unit 108 may dynamically change the display mode of the virtual object located between the real object and the display unit 124 according to changes in the surrounding situation. For example, suppose an evacuation route in a predetermined facility such as a leisure facility is set to avoid concealment. In this case, during evacuation guidance, the output control unit 108 may control the display so that nothing is shown on the evacuation route, for example by hiding virtual objects that would be obstacles, or may newly display a virtual object for guidance. Alternatively, the output control unit 108 may always display virtual objects located on the evacuation route translucently.

  FIG. 9A is a diagram showing a display example of the virtual object 32 at normal times in a corridor that is an evacuation route. As shown in FIG. 9A, at normal times the output control unit 108 displays the virtual object 32a located on the evacuation route translucently, for example. FIG. 9B is a diagram showing a display example of the virtual object 32 during evacuation guidance in the same corridor. As shown in FIG. 9B, during evacuation guidance the output control unit 108 hides the virtual object 32a shown in FIG. 9A and newly displays a virtual object 32b for guidance. Note that the output control unit 108 may sequentially update the display position, tilt, or display content of the virtual object 32b in accordance with the user's movement so as to guide the evacuation.

  Alternatively, the output control unit 108 may change the display mode of one or more virtual objects located in the user's field of view according to the surrounding brightness. For example, when the surroundings are dark (such as at night or in cloudy weather), the output control unit 108 may enlarge the hidden portion of a virtual object located between a real object set to avoid concealment and the user, compared with when the surroundings are bright. Alternatively, in this case, the output control unit 108 may additionally make other virtual objects located around that virtual object translucent, or make all other virtual objects located within the user's field of view translucent.

(2-1-5-3. Control example 3)
When the real object set to avoid concealment is an advertisement, the output control unit 108 may dynamically control the display mode of a virtual object located between the advertisement and the display unit 124 based on the positional relationship between the user and the advertisement. For example, when the user is located in front of the advertisement, the output control unit 108 increases the transparency of the virtual object, and when the user is located to the side of or behind the advertisement, the output control unit 108 decreases the transparency of the virtual object. This keeps the advertisement recognizable while suppressing the loss of visibility of the virtual object as much as possible.

(2-1-5-4. Control example 4)
Alternatively, the output control unit 108 may change the display mode of a virtual object based on its display position on the display unit 124. For example, the output control unit 108 may make the transparency of the virtual object relatively higher the smaller the distance between the display position of the virtual object and the center of the display unit 124. Alternatively, the output control unit 108 may make the transparency of the virtual object relatively higher the smaller the distance between the display position of the virtual object and the user's gaze point on the display unit 124.
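For illustration (not part of the patent), a sketch of this screen-position-based control follows; the pixel coordinates, linear falloff, and constants are assumptions.

    import math

    def opacity_from_screen_position(obj_px, ref_px, falloff_px=400.0,
                                     min_opacity=0.2):
        """Make a virtual object more transparent the closer its on-screen
        position is to a reference point (the display center, or the
        user's gaze point), with positions given in pixels."""
        dist = math.hypot(obj_px[0] - ref_px[0], obj_px[1] - ref_px[1])
        t = min(dist / falloff_px, 1.0)  # 0 at the reference point
        return min_opacity + t * (1.0 - min_opacity)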

(2-1-5-5. Control example 5)
Further, the output control unit 108 can sequentially execute the display control described above every time the position and posture of the AR glass 10 (or the detection result of the user's line-of-sight direction) change.

(2-1-5-6. Control example 6)
Further, the output control unit 108 can dynamically change the display mode of a virtual object in accordance with a change in the determination result by the real object determination unit 104. For example, when the concealment avoidance setting for a certain real object is switched from OFF to ON, the output control unit 108 changes the display mode of the virtual object located between the display unit 124 and the real object from the default display mode, as described above. When the concealment avoidance setting for a certain real object is switched from ON to OFF, the output control unit 108 returns the display mode of the virtual object located between the display unit 124 and the real object to the default display mode.

  As a modification, when the user makes an input to restore the display mode of a virtual object after it has been changed from the default display mode, the output control unit 108 can return the display mode of the virtual object to the default. Further, based on history information, the output control unit 108 may identify virtual objects that the user has in the past permitted to return to the default display mode, and thereafter display the identified virtual objects in the default display mode.

{2-1-6. Communication unit 120}
The communication unit 120 transmits and receives information to and from other devices. For example, the communication unit 120 transmits an acquisition request for a virtual object located around the current position to the server 20 according to the control of the virtual object acquisition unit 102. Further, the communication unit 120 receives a virtual object from the server 20.

{2-1-7. Sensor unit 122}
The sensor unit 122 may include a positioning device that receives positioning signals from positioning satellites such as GPS (Global Positioning System) satellites and measures the current position. The sensor unit 122 may also include a range sensor.

  Furthermore, the sensor unit 122 includes, for example, a three-axis acceleration sensor, a gyroscope, a magnetic sensor, a camera, a depth sensor, and / or a microphone. For example, the sensor unit 122 measures the speed, acceleration, posture, orientation, or the like of the AR glass 10. The sensor unit 122 captures an image of the eyes of the user wearing the AR glass 10 or captures an image in front of the AR glass 10. In addition, the sensor unit 122 can detect an object positioned in front of the user and can detect a distance to the detected object.

{2-1-8. Display unit 124}
The display unit 124 displays an image according to the control of the output control unit 108. For example, the display unit 124 projects an image on at least a partial region (projection plane) of each of the left-eye lens and the right-eye lens. The left-eye lens and the right-eye lens can be formed of a transparent material such as resin or glass.

  As a modification, the display unit 124 may include a liquid crystal panel, and the transmittance of the liquid crystal panel may be controllable. Thereby, the display unit 124 can be controlled to be transparent or translucent.

{2-1-9. Storage unit 126}
The storage unit 126 stores various data and various software. For example, the storage unit 126 stores a list of concealment avoidance setting conditions for each type of real object.

<2-2. Operation>
The configuration according to this embodiment has been described above. Next, an example of the operation according to the present embodiment will be described with reference to FIG. FIG. 10 is a flowchart showing an operation example according to the present embodiment.

  As shown in FIG. 10, first, the virtual object acquisition unit 102 of the AR glass 10 acquires from the server 20 a plurality of virtual objects located around the current position (for example, within a certain range in all directions) based on the position information measured by the sensor unit 122. Then, based on the measurement result of the posture of the AR glass 10 (or the user's line of sight) by the sensor unit 122, the virtual object acquisition unit 102 extracts the virtual objects included in the user's field of view from the acquired virtual objects (S101).

  Subsequently, the real object determination unit 104 determines whether or not a real object set to avoid concealment exists among the real objects included in the user's field of view, based on, for example, the list of concealment avoidance setting conditions for each type of real object stored in the storage unit 126 (S103). If no real object set to avoid concealment exists in the user's field of view (S103: No), the AR glass 10 performs the process of S109 described later.

  On the other hand, when one or more real objects set to avoid concealment exist in the user's field of view (S103: Yes), the overlap determination unit 106 determines, for each of those real objects, whether or not at least one of the virtual objects acquired in S101 is located between the real object and the user (S105).

  If no virtual object exists between any of the corresponding real objects and the user (S105: No), the AR glass 10 performs the process of S109 described later.

  On the other hand, when one or more virtual objects are located between at least one of the corresponding real objects and the user (S105: Yes), the output control unit 108 changes the display mode of those virtual objects so that their visibility decreases. For example, the output control unit 108 hides the corresponding virtual object, makes it translucent, or shifts its display position so as not to overlap the corresponding real object (S107).

  Thereafter, the output control unit 108 causes the display unit 124 to display the virtual objects acquired in S101 (S109).
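  For illustration (not part of the patent), the following sketch wires the earlier sketches into one S101 to S109 pass; sensors and display stand in for the sensor unit 122 and display unit 124, and their methods (pose, real_objects_in_view, user_velocity, depth_buffers, render) are hypothetical.

    def update_frame(server, sensors, display):
        """One display-update pass following S101-S109 in FIG. 10."""
        device_pos, gaze_dir = sensors.pose()

        # S101: acquire virtual objects around the current position and
        # keep those inside the user's field of view.
        targets = acquire_display_targets(server, device_pos, gaze_dir)

        # S103: real objects in view that are set to avoid concealment.
        avoid = [r for r in sensors.real_objects_in_view()
                 if needs_concealment_avoidance(r.position, r.velocity,
                                                device_pos, sensors.user_velocity())]

        if avoid and targets:
            # S105: virtual objects located between those real objects and
            # the user, found by comparing the two depth maps.
            real_d, real_i, virt_d, virt_i = sensors.depth_buffers(avoid, targets)
            occluded_ids = set().union(
                *find_occluding_virtual_objects(real_d, real_i,
                                                virt_d, virt_i).values())
            # S107: lower the visibility of each occluding virtual object.
            for obj in targets:
                if obj.object_id in occluded_ids:
                    lower_visibility(obj, DisplayMode.TRANSLUCENT)

        # S109: display the (possibly modified) virtual objects.
        display.render(targets)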

<2-3. Effect>
As described above, according to the present embodiment, the AR glass 10 controls display by the display unit 124, based on the determination result of whether or not a real object included in the user's field of view is a real object set to avoid concealment, so that the degree of recognition of the real object changes in the range between the user and the real object. For this reason, the degree of recognition of a real object can be changed adaptively to the real object included in the user's field of view.

  For example, for the individual real objects included in the user's field of view, display by the display unit 124 is controlled so that a real object set to avoid concealment has a higher degree of recognition than a real object not set to avoid concealment. Accordingly, the user can visually recognize a real object that would otherwise be concealed by a virtual object, or can visually recognize a virtual object that substitutes for the real object. Since the user can thus recognize the existence of real objects set to avoid concealment, safety when the user acts while wearing the AR glass 10 can be improved.

<< 3. Hardware configuration >>
Next, the hardware configuration of the AR glass 10 according to the present embodiment will be described with reference to FIG. As shown in FIG. 11, the AR glass 10 includes a CPU 150, a ROM (Read Only Memory) 152, a RAM 154, a bus 156, an interface 158, an input device 160, an output device 162, a storage device 164, and a communication device 166.

  The CPU 150 functions as an arithmetic processing device and a control device, and controls the overall operation in the AR glass 10 according to various programs. In addition, the CPU 150 realizes the function of the control unit 100 in the AR glass 10. The CPU 150 is configured by a processor such as a microprocessor.

  The ROM 152 stores programs used by the CPU 150, control data such as calculation parameters, and the like.

  The RAM 154 temporarily stores a program executed by the CPU 150, for example.

  The bus 156 includes a CPU bus. The bus 156 connects the CPU 150, the ROM 152, and the RAM 154 to each other.

  The interface 158 connects the input device 160, the output device 162, the storage device 164, and the communication device 166 with the bus 156.

  The input device 160 includes, for example, input means for the user to input information, such as buttons, switches, levers, and a microphone, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 150.

  The output device 162 includes, for example, a display device such as a projector and an audio output device such as a speaker. The display device may be a liquid crystal display (LCD) device, an organic light emitting diode (OLED) device, or the like.

  The storage device 164 is a data storage device that functions as the storage unit 126. The storage device 164 includes, for example, a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, or a deletion device that deletes data recorded on the storage medium.

  The communication device 166 is a communication interface configured by, for example, a communication device for connecting to the communication network 22 or the like. The communication device 166 may be a wireless-LAN-compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wired communication device that performs wired communication. The communication device 166 functions as the communication unit 120.

<< 4. Modification >>
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field to which the present disclosure belongs can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these naturally also belong to the technical scope of the present disclosure.

<4-1. Modification 1>
For example, in the above-described embodiment, an example was described in which the AR glass 10 (the output control unit 108) changes the display mode for each virtual object individually. However, the present disclosure is not limited to this example, and the AR glass 10 may change the display modes of a plurality of virtual objects collectively. As an example, the output control unit 108 may hide, or make translucent, all virtual objects located between the user and any real object set to avoid concealment.

<4-2. Modification 2>
As another modification, the output control unit 108 may change how the display mode of a virtual object is changed depending on the type of the virtual object. For example, for a virtual object that prompts the user for a one-time confirmation, the output control unit 108 may change its display mode depending on whether or not it has been recognized that the user has viewed the virtual object. More specifically, when the user has not yet viewed the virtual object, the output control unit 108 displays the virtual object in the default display mode. After it is recognized that the user has viewed the virtual object, the output control unit 108 displays the virtual object in a simplified display format, such as hiding it or displaying only its outline.

<4-3. Modification 3>
As another modification, when the same virtual object is included in the fields of view of a plurality of users, the output control unit 108 may control display so that the display mode of the virtual object is the same for each of the plurality of users.

<4-4. Modification 4>
As another modification, the output control unit 108 may further control display so that the visibility of virtual objects located in the user's field of view decreases when the occurrence of a system error is detected. For example, when the AR glass 10 fails, when a SLAM error occurs on the AR glass 10, and/or when the server 20 goes down, the output control unit 108 may hide all virtual objects or make them translucent.

<4-5. Modification 5>
As another modification, the output control unit 108 may notify the user by sound or vibration (tactile stimulation) instead of, or in addition to, the display control of the virtual object described above. For example, suppose there is a traffic light in the user's field of view, the traffic light is set to avoid concealment, and a virtual object is located between the traffic light and the user. In this case, the output control unit 108 may cause a built-in speaker (not shown) to output a sound indicating the lighting color of the traffic light (for example, "currently red" or "changed to green"). Alternatively, a vibration pattern may be registered for each lighting color, and the output control unit 108 may vibrate another device carried by the user (a smartphone or smartwatch) or the AR glass 10 itself with the vibration pattern corresponding to the current lighting color of the traffic light.

<4-6. Modification 6>
In the above-described embodiment, an example was described in which the information processing apparatus in the present disclosure is the AR glass 10, but the present disclosure is not limited to this example. For example, when the server 20 includes all the components included in the control unit 100 described above, the information processing apparatus may be the server 20. In this case, the server 20 can control the display of virtual objects on the AR glass 10 by acquiring the position information and posture information of the AR glass 10 (and the detection result of the user's line-of-sight direction) from the AR glass 10. The information processing apparatus is not limited to the server 20, and may be another type of apparatus connectable to the communication network 22, such as a smartphone, a tablet terminal, a PC, or a game machine. Alternatively, the information processing apparatus may be a car.

  In the above-described embodiment, an example was described in which the display unit in the present disclosure is the display unit 124 of the AR glass 10, but the present disclosure is not limited to this example. For example, the display unit may be an optical see-through device such as a head-up display (for example, a vehicle windshield) or a desktop transparent display, or a video see-through device such as a video-transmission-type HMD (Head Mounted Display) or a tablet terminal. In the latter case, captured images of the area in front of the user can be sequentially displayed on the corresponding display.

  Alternatively, the display unit may be a 3D projector. In this case, this function can be realized by the 3D projector projection-mapping virtual objects onto a projection target while a sensor worn by the user or a sensor placed in the environment senses the user's field of view. The projection target may be a flat surface or a three-dimensional object.

<4-7. Modification 7>
In the above-described embodiment, an example was described in which the concealment avoidance target is a real object, but the target is not limited to this example. For example, a specific type of virtual object may be set in advance as a concealment avoidance target. As an example, important notification information from the system, a message display window of a chat service, or a message reception notification screen may be set as a concealment avoidance target. In this case, the AR glass 10 controls display so that a virtual object set to avoid concealment has a higher degree of recognition than a virtual object not set to avoid concealment. For example, the AR glass 10 may hide, make translucent, or shift the display position of a virtual object that is not set to avoid concealment and is located between a virtual object set to avoid concealment and the user.

  Alternatively, a priority may be registered in advance in a table for each type of virtual object. In this case, the AR glass 10 may hide another virtual object that is positioned between a given virtual object and the user and that has a lower priority than the given virtual object, make it translucent, or shift its display position. According to these display examples, a virtual object of high importance can be visually recognized by the user without being hidden by other virtual objects.
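As an illustration of this priority-table approach (all names, the example priorities, and the 0.3 alpha value are hypothetical), the occlusion handling among virtual objects might be sketched as:

    # Hypothetical sketch: lower-priority virtual objects that sit between the
    # user and a higher-priority virtual object are made translucent (they
    # could equally be hidden or have their display position shifted).
    from dataclasses import dataclass, field

    # Priority registered in advance per virtual-object type (higher = more important).
    PRIORITY_TABLE = {
        "system_notification": 100,   # e.g. important system notifications
        "chat_window": 90,            # e.g. a chat service's message window
        "decoration": 10,
    }

    @dataclass
    class VirtualObject:
        kind: str
        distance: float       # distance from the user along the line of sight
        region: tuple         # on-screen bounding box (x, y, w, h)
        alpha: float = 1.0
        priority: int = field(init=False)

        def __post_init__(self):
            self.priority = PRIORITY_TABLE.get(self.kind, 0)

    def overlaps(a, b):
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    def resolve_occlusions(objects):
        """Make an object translucent when it is nearer to the user than, and
        overlaps on screen with, a higher-priority object."""
        for occluder in objects:
            for target in objects:
                if (occluder is not target
                        and occluder.priority < target.priority
                        and occluder.distance < target.distance
                        and overlaps(occluder.region, target.region)):
                    occluder.alpha = 0.3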

<4-8. Modification 8>
In addition, the steps in the operation of the above-described embodiment do not necessarily have to be processed in the order described. For example, the steps may be processed in a different order as appropriate. The steps may also be processed in parallel or individually instead of in time series. Further, some of the described steps may be omitted, or another step may be added.

  In addition, it is also possible to provide a computer program for causing hardware such as the CPU 150, the ROM 152, and the RAM 154 to exhibit functions equivalent to those of the components of the AR glass 10 according to the above-described embodiment. A recording medium on which the computer program is recorded can also be provided.

  Further, the effects described in the present specification are merely illustrative or exemplary, and are not limiting. That is, the technology according to the present disclosure can exhibit other effects that are apparent to those skilled in the art from the description of the present specification, in addition to or instead of the above effects.

The following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing apparatus comprising: an output control unit that controls display by a display unit such that, based on a result of determining whether or not a real object included in a field of view of a user is a first real object, a degree of recognition of the real object by the user changes in a range between the user and the real object.
(2)
The information processing apparatus according to (1), wherein the output control unit controls display by the display unit such that the degree of recognition differs between a case where the real object is determined to be the first real object and a case where the real object is determined to be a second real object different from the first real object.
(3)
The information processing apparatus according to (2), wherein the output control unit controls display by the display unit such that, when it is determined that the real object is the first real object, the degree of recognition of the real object is higher than when the real object is determined to be the second real object.
(4)
The information processing apparatus according to (3), wherein the range between the user and the real object is a range having a predetermined shape located between the user and the real object, and the output control unit controls display by the display unit such that the degree of recognition of the real object is increased when it is determined that the real object is the first real object and at least a part of the real object is included in the range having the predetermined shape.
(5)
The information processing apparatus according to (3) or (4), wherein whether or not the real object is the first real object is determined based on a positional relationship between the real object and the user.
(6)
The information processing apparatus according to (5), wherein whether or not the real object is the first real object is determined based on a distance between the real object and the user.
(7)
The information processing apparatus according to (5) or (6), wherein whether or not the real object is the first real object is determined based on a direction of the real object with respect to the user.
(8)
The information processing apparatus according to any one of (3) to (7), wherein whether or not the real object is the first real object is determined based on a speed or acceleration of the real object.
(9)
The information processing apparatus according to any one of (3) to (8), wherein whether or not the real object is the first real object is determined based on a temperature of the real object.
(10)
The information processing apparatus according to any one of (3) to (9), wherein whether or not the real object is the first real object is determined based on a state of sound or light generation from the real object.
(11)
The information processing apparatus according to any one of (3) to (10), wherein the real object is an electronic device, and whether or not the real object is the first real object is determined based on a device state of the real object.
(12)
The information processing apparatus according to any one of (3) to (11), wherein whether or not the real object is the first real object is determined based on whether or not the real object is a predetermined type of object.
(13)
The information processing apparatus according to any one of (3) to (12), wherein, when it is determined that the real object is the first real object, the output control unit changes a display mode of a first virtual object located in the range between the user and the real object.
(14)
The information processing apparatus according to (13), wherein when the real object is determined to be the second real object, the output control unit does not change a display mode of the first virtual object.
(15)
The information processing apparatus according to (13) or (14), wherein, when it is determined that the real object is the first real object, the output control unit controls display such that a part or all of the first virtual object is hidden.
(16)
The information processing apparatus according to any one of (13) to (15), wherein, when it is determined that the real object is the first real object, the output control unit increases the transparency of the first virtual object.
(17)
The information processing apparatus according to any one of (13) to (16), wherein, when it is determined that the real object is the first real object, the output control unit changes a display position of the first virtual object such that a region where the real object is visually recognized on the display unit and the display position of the first virtual object do not overlap.
(18)
The information processing apparatus according to any one of (13) to (17), wherein, when it is determined that the real object is the first real object, the output control unit newly displays a second virtual object related to the real object at a display position different from that of the first virtual object.
(19)
An information processing method including: controlling, by a processor, display by a display unit such that, based on a result of determining whether or not a real object included in a field of view of a user is a first real object, a degree of recognition of the real object by the user changes in a range between the user and the real object.
(20)
A program for causing a computer to function as: an output control unit that controls display by a display unit such that, based on a result of determining whether or not a real object included in a field of view of a user is a first real object, a degree of recognition of the real object by the user changes in a range between the user and the real object.

10 AR glass, 20 server, 22 communication network, 30 real object, 32 virtual object, 100 control unit, 102 virtual object acquisition unit, 104 real object determination unit, 106 overlap determination unit, 108 output control unit, 120 communication unit, 122 sensor unit, 124 display unit, 126 storage unit

Claims (20)

  1. An information processing apparatus comprising: an output control unit configured to control display by a display unit such that, based on a result of determining whether or not a real object included in a field of view of a user is a first real object, a degree of recognition of the real object by the user changes in a range between the user and the real object.
  2.   The information processing apparatus according to claim 1, wherein the output control unit controls display by the display unit such that the degree of recognition differs between a case where the real object is determined to be the first real object and a case where the real object is determined to be a second real object different from the first real object.
  3.   The information processing apparatus according to claim 2, wherein the output control unit controls display by the display unit such that, when it is determined that the real object is the first real object, the degree of recognition of the real object is higher than when the real object is determined to be the second real object.
  4.   The information processing apparatus according to claim 3, wherein the range between the user and the real object is a range having a predetermined shape located between the user and the real object, and the output control unit controls display by the display unit such that the degree of recognition of the real object is increased when it is determined that the real object is the first real object and at least a part of the real object is included in the range having the predetermined shape.
  5.   The information processing apparatus according to claim 3, wherein whether or not the real object is the first real object is determined based on a positional relationship between the real object and the user.
  6.   The information processing apparatus according to claim 5, wherein whether or not the real object is the first real object is determined based on a distance between the real object and the user.
  7.   The information processing apparatus according to claim 5, wherein whether or not the real object is the first real object is determined based on an orientation of the real object with respect to the user.
  8.   The information processing apparatus according to claim 3, wherein whether or not the real object is the first real object is determined based on a speed or acceleration of the real object.
  9.   The information processing apparatus according to claim 3, wherein whether or not the real object is the first real object is determined based on a temperature of the real object.
  10.   The information processing apparatus according to claim 3, wherein whether or not the real object is the first real object is determined based on a state of sound or light generation from the real object.
  11.   The information processing apparatus according to claim 3, wherein the real object is an electronic device, and whether or not the real object is the first real object is determined based on a device state of the real object.
  12.   The information processing apparatus according to claim 3, wherein whether or not the real object is the first real object is determined based on whether or not the real object is a predetermined type of object.
  13.   The information processing apparatus according to claim 3, wherein, when it is determined that the real object is the first real object, the output control unit changes a display mode of a first virtual object located in the range between the user and the real object.
  14.   The information processing apparatus according to claim 13, wherein the output control unit does not change a display mode of the first virtual object when it is determined that the real object is the second real object.
  15.   The information processing apparatus according to claim 13 or 14, wherein, when it is determined that the real object is the first real object, the output control unit controls display such that a part or all of the first virtual object is hidden.
  16.   The information processing apparatus according to claim 13, wherein when it is determined that the real object is the first real object, the output control unit increases the transparency of the first virtual object.
  17.   The information processing apparatus according to claim 13, wherein, when it is determined that the real object is the first real object, the output control unit changes a display position of the first virtual object such that a region where the real object is visually recognized on the display unit and the display position of the first virtual object do not overlap.
  18.   The information processing apparatus according to claim 13, wherein, when it is determined that the real object is the first real object, the output control unit newly displays a second virtual object related to the real object at a display position different from that of the first virtual object.
  19. An information processing method comprising: controlling, by a processor, display by a display unit such that, based on a result of determining whether or not a real object included in a field of view of a user is a first real object, a degree of recognition of the real object by the user changes in a range between the user and the real object.
  20. A program for causing a computer to function as: an output control unit that controls display by a display unit such that, based on a result of determining whether or not a real object included in a field of view of a user is a first real object, a degree of recognition of the real object by the user changes in a range between the user and the real object.
JP2016132696A 2016-07-04 2016-07-04 Information processing device, information processing method, and program Pending JP2018005005A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016132696A JP2018005005A (en) 2016-07-04 2016-07-04 Information processing device, information processing method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016132696A JP2018005005A (en) 2016-07-04 2016-07-04 Information processing device, information processing method, and program
PCT/JP2017/013655 WO2018008210A1 (en) 2016-07-04 2017-03-31 Information processing device, information processing method, and program

Publications (1)

Publication Number Publication Date
JP2018005005A true JP2018005005A (en) 2018-01-11

Family

ID=60912445

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2016132696A Pending JP2018005005A (en) 2016-07-04 2016-07-04 Information processing device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP2018005005A (en)
WO (1) WO2018008210A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09330489A (en) * 1996-06-07 1997-12-22 Hitachi Eng Co Ltd Equipment monitoring method and its device
JP2005037181A (en) * 2003-07-17 2005-02-10 Pioneer Design Kk Navigation device, server, navigation system, and navigation method
JP5040061B2 (en) * 2004-12-15 2012-10-03 コニカミノルタホールディングス株式会社 Video display device and information providing system
JP2008083289A (en) * 2006-09-27 2008-04-10 Sony Computer Entertainment Inc Imaging display apparatus, and imaging display method
DE102013207528A1 (en) * 2013-04-25 2014-10-30 Bayerische Motoren Werke Aktiengesellschaft A method for interacting with an object displayed on a data goggle
US20150296324A1 (en) * 2014-04-11 2015-10-15 Mitsubishi Electric Research Laboratories, Inc. Method and Apparatus for Interacting Between Equipment and Mobile Devices
JP2016045814A (en) * 2014-08-25 2016-04-04 泰章 岩井 Virtual reality service providing system and virtual reality service providing method

Also Published As

Publication number Publication date
WO2018008210A1 (en) 2018-01-11

Similar Documents

Publication Publication Date Title
US9418481B2 (en) Visual overlay for augmenting reality
US9235051B2 (en) Multi-space connected virtual data objects
US9685001B2 (en) System and method for indicating a presence of supplemental information in augmented reality
US20130342564A1 (en) Configured virtual environments
JP2013092964A (en) Image processing device, image processing method, and program
JP6280134B2 (en) Helmet-based navigation notification method, apparatus, and computer program
US20130044128A1 (en) Context adaptive user interface for augmented reality display
RU2558620C2 (en) Device and method of user&#39;s input for control over displayed data
RU2638004C2 (en) Device for information processing, method for managing display and program
US20130241805A1 (en) Using Convergence Angle to Select Among Different UI Elements
US20110300876A1 (en) Method for guiding route using augmented reality and mobile terminal using the same
US10514758B2 (en) Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
US9857589B2 (en) Gesture registration device, gesture registration program, and gesture registration method
JP5909034B1 (en) User interface for head mounted display
TWI533162B (en) User interface for augmented reality enabled devices
CN106062614B (en) It is detected on the head of head-mounted display
KR102055967B1 (en) Head-mounted display resource management
WO2013121730A1 (en) Head-mounted display, program for controlling head-mounted display, and method of controlling head-mounted display
US9122053B2 (en) Realistic occlusion for a head mounted augmented reality display
JP5881263B2 (en) Display of sound status on wearable computer system
US9678342B2 (en) Information processing device, display control method, and program
US20140010391A1 Amplifying audio-visual data based on user's head orientation
JP2017513535A (en) Audio navigation support
US9501873B2 (en) Indicating out-of-view augmented reality images
KR20140130321A (en) Wearable electronic device and method for controlling the same