WO2014054210A2 - Information processing device, display control method, and program - Google Patents

Information processing device, display control method, and program

Info

Publication number
WO2014054210A2
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
display
action state
priority
Prior art date
Application number
PCT/JP2013/004916
Other languages
French (fr)
Other versions
WO2014054210A3 (en)
Inventor
Kazuyuki Yamamoto
Takuro Noda
Tetsuyuki Miyawaki
Kenji Suzuki
Original Assignee
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Priority to RU2015110684A
Priority to US14/407,722 (US9678342B2)
Priority to CN201380049986.5A (CN104685857B)
Priority to BR112015004100A (BR112015004100A2)
Priority to EP13756935.6A (EP2904767A2)
Priority to IN2386DEN2015 (IN2015DN02386A)
Publication of WO2014054210A2
Publication of WO2014054210A3
Priority to US15/598,547 (US10209516B2)

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72457User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M1/72472User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons wherein the items are sorted according to specific criteria, e.g. frequency of use
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • the present disclosure relates to an information processing device, a display control method, and a program encoded on a non-transitory computer readable medium.
  • the present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2012-219450 filed in the Japan Patent Office on October 1, 2012, the entire content of which is hereby incorporated by reference.
  • a user acquires information from a smartphone while commuting to work or school.
  • a navigation device provides information to a user who is driving a car or riding a bicycle.
  • a wearable device such as a head-mounted display (HMD) is able to provide information to a user even while the user is performing any given activity.
  • PTL 1 below proposes displaying different information on a screen of a wearable device depending on outside conditions, in order to provide the user with information that is contextually appropriate or interesting.
  • the outside conditions herein may include factors such as the environmental light level, temperature, humidity and barometric pressure, the distance to a photographic subject, whether or not a photographic subject is a living thing, as well as the current location and date/time.
  • the present invention broadly comprises an apparatus, method and a non-transitory computer readable medium encoded with a program for performing the method.
  • the apparatus includes a user action state obtaining circuit configured to obtain an action state of a user; and a display control circuit configured to control a display to modify display information based on the action state.
  • FIG. 1 is an explanatory diagram illustrating a first example of the exterior of an information processing device.
  • FIG. 2 is an explanatory diagram illustrating a second example of the exterior of an information processing device.
  • FIG. 3 is an explanatory diagram illustrating a third example of the exterior of an information processing device.
  • FIG. 4 is a block diagram illustrating an example of a hardware configuration of an information processing device according to an embodiment.
  • FIG. 5 is a block diagram illustrating an example of a logical function configuration of an information processing device according to an embodiment.
  • FIG. 6A is a flowchart illustrating a first example of a flow of an action recognition process.
  • FIG. 6B is a flowchart illustrating a second example of a flow of an action recognition process.
  • FIG. 6C is a flowchart illustrating a third example of a flow of an action recognition process.
  • FIG. 7 is an explanatory diagram for explaining a first example of setting priorities according to a user's action state.
  • FIG. 8 is an explanatory diagram for explaining a second example of setting priorities according to a user's action state.
  • FIG. 9A is an explanatory diagram for explaining a first technique for adjusting priorities.
  • FIG. 9B is an explanatory diagram for explaining a second technique for adjusting priorities.
  • FIG. 10 is a flowchart illustrating an example of the flow of a display control process according to an embodiment.
  • FIG. 11A is an explanatory diagram illustrating a first example of the display of items.
  • FIG. 11B is an explanatory diagram illustrating a second example of the display of items.
  • FIG. 11C is an explanatory diagram illustrating a third example of the display of items.
  • FIG. 11D is an explanatory diagram illustrating a fourth example of the display of items.
  • FIG. 11E is an explanatory diagram illustrating a fifth example of the display of items.
  • FIG. 12 is an explanatory diagram for explaining an example of linking an information processing device and an external device.
  • FIG. 13 is an explanatory diagram illustrating an example of item display when linking an information processing device and an external device.
  • FIG. 14 is an explanatory diagram for explaining display control based on recognition of an external device.
  • Technology according to the present disclosure is applicable to information processing devices in various forms, including portable devices such as a tablet personal computer (PC), mobile PC, smartphone, game console, or portable navigation devices (PND), as well as wearable devices such as a head-mounted device (HMD), for example.
  • the screens of these information processing devices may be arranged so as to be continually present in a user's visual field while the user engages in activity.
  • FIG. 1 is an explanatory diagram illustrating a first example of the exterior of an information processing device to which technology according to the present disclosure may be applied.
  • the information processing device 100a is a glasses-style wearable device worn on a user's head.
  • the information processing device 100a is equipped with a pair of screens SCa and SCb, a housing HS, an imaging lens LN, and a touch surface TS.
  • the screens SCa and SCb are see-through or non-see-through screens arranged in front of a user's left eye and right eye, respectively.
  • the housing HS includes a frame that supports the screens SCa and SCb, and what are called temples positioned on the sides of a user's head.
  • Various modules for information processing are stored inside the temples.
  • the imaging lens LN is arranged such that the optical axis is approximately parallel to the user's line of sight, and is used to capture images.
  • the touch surface TS is a surface that detects touches by the user, and is used in order for the information processing device 100a to receive user input.
  • FIG. 2 is an explanatory diagram illustrating a second example of the exterior of an information processing device to which technology according to the present disclosure may be applied.
  • the information processing device 100b is a mobile client carried by a user.
  • the information processing device 100b is equipped with a touch surface TS and a screen SC that are integrally constituted as a touch panel display, as well as buttons BT.
  • the screen SC is held in front of the user, and is continuously present in the user's visual field.
  • An imaging lens of the information processing device 100b may be provided on the back of the screen SC.
  • the touch surface TS and the buttons BT are used in order for the information processing device 100b to receive user input.
  • FIG. 3 is an explanatory diagram illustrating a third example of the exterior of an information processing device to which technology according to the present disclosure may be applied.
  • the information processing device 100c is a navigation device attached to a vehicle VH that a user rides.
  • the information processing device 100c may be a specialized navigation device, or a general-purpose device such as a smartphone with navigation functionality.
  • the information processing device 100c is equipped with a screen SC and buttons BT.
  • the screen SC is continuously present in the user's visual field while the user is riding the vehicle VH.
  • the buttons BT are used in order for the information processing device 100c to receive user input.
  • These information processing devices are carried by an active user, and provide the user with various information.
  • Various information such as advertising information, social information (such as social networking service (SNS) posts, blog posts, or email), news information, and traffic information may be provided, even while the user is engaged in activity.
  • the provision of information is controlled using priorities that are adaptively set according to user action recognition results, such that the user will reliably notice highly necessary information without the provision of information greatly hindering the user's activity.
  • FIG. 4 is a block diagram illustrating an example of a hardware configuration of an information processing device 100 according to an embodiment.
  • the information processing device 100 is equipped with an imaging unit 102, a sensor unit 104, an input unit 106, storage 108, a display 110, a communication unit 112, a bus 116, and a controller 118.
  • Imaging unit 102 is a camera module that captures images.
  • the imaging unit 102 includes a lens LN as illustrated by example in FIG. 1, a CCD, CMOS, or other image sensor, and an imaging circuit.
  • the imaging unit 102 generates a captured image depicting a real space.
  • a series of captured images generated by the imaging unit 102 may constitute video.
  • the sensor unit 104 may include various sensors such as a positioning sensor, an acceleration sensor, and a gyro sensor.
  • the positioning sensor may be, for example, a Global Positioning System (GPS) sensor that receives GPS signals and measures the latitude, longitude, and altitude of the device. Otherwise, the positioning sensor may be a sensor that executes positioning on the basis of the strengths of wireless signals received from wireless access points.
  • the acceleration sensor measures 3-axis acceleration imparted to the information processing device 100.
  • the gyro sensor measures the tilt angle of the information processing device 100.
  • the sensor unit 104 outputs sensor data indicating measurement results output from these sensors to the controller 118.
  • the input unit 106 is an input interface used in order for a user to operate the information processing device 100 or input information into the information processing device 100.
  • the input unit 106 receives user input via the touch surface TS exemplified in FIG. 1 or the buttons BT exemplified in FIGS. 2 and 3, for example.
  • the input unit 106 may also include other types of input interfaces, such as switches, dials, a keypad, a keyboard, or a speech input interface.
  • the input unit 106 may additionally include a gaze detection interface that detects the user's gaze direction.
  • the storage 108 is realized with a storage medium such as semiconductor memory or a hard disk, and stores programs and data used in processing by the information processing device 100.
  • Data stored by the storage 108 may include captured image data and sensor data, as well as data in a database to be described later, and a mapping table, for example. Note that some of the programs and data described in this specification may also be acquired from an external data source (such as a data server, network storage, or externally attached memory, for example), rather than being stored in the storage 108.
  • the display 110 is a display module that includes the pair of screens SCa and SCb exemplified in FIG. 1, the screen SC exemplified in FIG. 2, or the screen SC exemplified in FIG. 3, and a display circuit.
  • the display 110 displays on-screen output images generated by a display controller 180 described later.
  • the communication unit 112 is a communication interface that mediates communication between the information processing device 100 and another device.
  • the communication unit 112 supports an arbitrary wireless communication protocol or wired communication protocol, and establishes a communication connection with another device.
  • The bus 116 connects the imaging unit 102, the sensor unit 104, the input unit 106, the storage 108, the display 110, the communication unit 112, and the controller 118 to each other.
  • the controller 118 corresponds to a processor such as a central processing unit (CPU) or a digital signal processor (DSP).
  • the controller 118 causes various functions of the information processing device 100 described later to operate by executing a program stored in the storage 108 or another storage medium.
  • FIG. 5 is a block diagram illustrating an exemplary configuration of logical functions realized by the storage 108 and the controller 118 of the information processing device 100 illustrated in FIG. 4.
  • the information processing device 100 is equipped with an environment recognition unit 120, an action recognition unit 130, a user database (DB) 140, an information acquisition unit 150, a priority setting unit 160, an attention determination unit 170, and a display controller 180.
  • the environment recognition unit 120 recognizes the environment of the real space in which a user is active. For example, the environment recognition unit 120 may recognize the current date and time. The environment recognition unit 120 may also recognize an object appearing in a captured image input from the imaging unit 102 by using an established object recognition technology such as pattern matching. The environment recognition unit 120 may also recognize a person appearing in a captured image input from the imaging unit 102 by using an established facial image recognition technology. The environment recognition unit 120 outputs environment recognition results to the action recognition unit 130 and the information acquisition unit 150.
  • Action recognition unit 130 recognizes a user's action state. Typically, the action recognition unit 130 executes recognition of a user's action state while a display item is being displayed on-screen on the display 110. As an example, the action recognition unit 130 may recognize action states related to a user's movement speed. Action states related to movement speed may include at least one from among the states "sitting/standing still", "walking/running", and "riding a vehicle", for example. The action recognition unit 130 may also recognize action states related to the transportation being utilized by a user. Action states related to transportation may include at least one from among "riding a train", "riding a car", and "riding a bicycle", for example.
  • For example, Japanese Unexamined Patent Application Publication No. 2006-345269 describes a technique of recognizing the above action states on the basis of sensor data from an acceleration sensor and a gyro sensor.
  • the action recognition unit 130 may recognize a user's action state by using such a sensor-based technique or another established technique. Also, as a simpler technique, the action recognition unit 130 may also prompt a user to explicitly select the current action state via a user interface.
  • the recognition of action states by the action recognition unit 130 is not limited to the examples discussed above.
  • the action recognition unit 130 may also recognize more detailed action states by additionally using positioning data input from the positioning sensor.
  • the above action state "sitting/standing still” may be further differentiated into the two action states of "working” and “relaxing” by determining, on the basis of positioning data, whether a user is at a specific place such as an office or home.
  • the above action state "riding a train” may be recognized by determining, on the basis of positioning data, that a user is at a train station or on a train line.
  • the action recognition unit 130 may also recognize action states related to persons accompanying a user's activity (hereinafter designated companions).
  • the action recognition unit 130 may determine a companion from the results of person recognition by the environment recognition unit 120, for example. Otherwise, the action recognition unit 130 may also determine a companion by querying, via the communication unit 112, a location data server that manages the current locations of multiple users.
  • the action recognition unit 130 may also determine a companion by transmitting, to a nearby device, a request signal requesting identification information such as a user ID or a device ID, receiving a response signal to the request signal, and cross-referencing the received identification information against existing identification information registered in advance.
  • the above action state "sitting/standing still” may be further differentiated into the two action states of "working” and "relaxing" by determining whether a companion is a user's friend or family, or a coworker.
  • FIG. 6A is a flowchart illustrating a first example of a flow of an action recognition process.
  • the action recognition unit 130 acquires sensor data generated by the sensor unit 104 (step S30).
  • the sensor data acquired at this point may include positioning data indicating a current user position, as well as acceleration data indicating acceleration imparted to the information processing device 100.
  • the user's current velocity may also be computed from the timewise change in the user location or the integral of the acceleration.
  • the action recognition unit 130 determines whether or not the user has stopped for a specific period (such as from several seconds to several minutes, for example) (step S31).
  • “stopped” not only refers to when the user's velocity is exactly zero, but also encompasses situations in which the user's velocity falls below a predetermined first threshold value.
  • the action recognition unit 130 determines that the user's action state is a "sitting/standing still" state ST1 (step S37).
  • Meanwhile, in the case where the user has not stopped for the specific period, the action recognition unit 130 determines whether the user has been moving at a velocity exceeding a predefined second threshold value (which is larger than the above first threshold value, and may be 30 km/hr, for example) for a specific period (step S35). In the case where the user has been moving at a velocity exceeding the second threshold value for the specific period, the action recognition unit 130 determines that the user's action state is a "riding a vehicle" state ST2 (step S38). Otherwise, the action recognition unit 130 determines that the user's action state is a "walking/running" state ST3 (step S39).
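The threshold-based flow of FIG. 6A can be summarized in code. The following is a minimal sketch, assuming a pre-computed series of velocity samples over the observation period; the names and the first-threshold value are illustrative assumptions (only the 30 km/h example comes from the text above).

```python
from enum import Enum

class ActionState(Enum):
    SITTING_STANDING_STILL = "ST1"
    RIDING_VEHICLE = "ST2"
    WALKING_RUNNING = "ST3"

STOP_THRESHOLD_KMH = 1.0      # illustrative first threshold
VEHICLE_THRESHOLD_KMH = 30.0  # second threshold (e.g. 30 km/h per the example)

def recognize_action_state(velocities_kmh: list[float]) -> ActionState:
    """Classify the action state from velocities observed over a specific
    period (several seconds to several minutes), following FIG. 6A."""
    if all(v < STOP_THRESHOLD_KMH for v in velocities_kmh):
        return ActionState.SITTING_STANDING_STILL   # step S37
    if all(v > VEHICLE_THRESHOLD_KMH for v in velocities_kmh):
        return ActionState.RIDING_VEHICLE           # step S38
    return ActionState.WALKING_RUNNING              # step S39
```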
  • FIG. 6B is a flowchart illustrating a second example of a flow of an action recognition process.
  • the action recognition unit 130 acquires sensor data generated by the sensor unit 104 (step S30).
  • the sensor data acquired at this point may include positioning data and acceleration data, and a user's current velocity may also be computed.
  • the action recognition unit 130 determines whether or not the user has stopped for a specific period (step S31). In the case where the user has stopped for the specific period, the action recognition unit 130 additionally determines whether or not the place where the user is located is an office (step S32). Note that location data for an office (or some other workplace) where the user works is registered in the user DB 140 in advance. Then, in the case where the place where the user is located is an office, the action recognition unit 130 determines that the user's action state is a "working" state ST1a (step S37a). Meanwhile, in the case where the place where the user is located is not an office, the action recognition unit 130 determines that the user's action state is a "relaxing" state ST1b (step S37b).
  • Meanwhile, in the case where the user has not stopped for the specific period, the action recognition unit 130 determines whether the user has been moving at a velocity exceeding a predefined second threshold value for a specific period (step S35). In the case where the user has been moving at a velocity exceeding the second threshold value for the specific period, the action recognition unit 130 additionally determines whether or not the place where the user is located is a train station or on a train line (step S36). Then, in the case where the place where the user is located is a train station or on a train line, the action recognition unit 130 determines that the user's action state is a "riding a train" state ST2a (step S38a).
  • Otherwise, the action recognition unit 130 determines that the user's action state is a "riding a car" state ST2b (step S38b). Additionally, in the case where the user has not been moving at a velocity exceeding the second threshold value for the specific period, the action recognition unit 130 determines that the user's action state is a "walking/running" state ST3 (step S39).
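A minimal sketch of the place-based refinement in FIG. 6B, assuming that the registered office location and station/train-line map data have already been resolved into boolean predicates (the function name and string labels are illustrative, not taken from the disclosure).

```python
def refine_by_place(coarse_state: str, at_office: bool,
                    at_station_or_line: bool) -> str:
    """Refine the coarse states of FIG. 6A using positioning data (FIG. 6B).
    The place predicates are assumed to come from the office location
    registered in the user DB 140 and from map data."""
    if coarse_state == "sitting/standing still":
        return "working (ST1a)" if at_office else "relaxing (ST1b)"
    if coarse_state == "riding a vehicle":
        return "riding a train (ST2a)" if at_station_or_line else "riding a car (ST2b)"
    return "walking/running (ST3)"
```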
  • FIG. 6C is a flowchart illustrating a third example of a flow of an action recognition process.
  • the action recognition unit 130 acquires sensor data generated by the sensor unit 104 (step S30).
  • the sensor data acquired at this point may include positioning data and acceleration data, and a user's current velocity may also be computed.
  • the action recognition unit 130 determines whether or not the user has stopped for a specific period (step S31). In the case where the user has stopped for the specific period, the action recognition unit 130 acquires identification information for a nearby device by transmitting a query to a location data server, or by transmitting to a nearby device a request signal requesting identification information (step S33). Next, the action recognition unit 130 determines whether or not the user is together with a coworker by cross-referencing the identification information from the nearby device with existing identification information registered in the user DB 140 in advance (step S34). In the case where the user is together with a coworker, the action recognition unit 130 determines that the user's action state is a "working" state ST1a (step S37a).
  • Otherwise, in the case where the user is not together with a coworker, the action recognition unit 130 determines that the user's action state is a "relaxing" state ST1b (step S37b).
  • Meanwhile, in the case where the user has not stopped for the specific period, the action recognition unit 130 additionally determines whether the user has been moving at a velocity exceeding a predefined second threshold value for a specific period (step S35). In the case where the user has been moving at a velocity exceeding the second threshold value for the specific period, the action recognition unit 130 determines that the user's action state is a "riding a vehicle" state ST2 (step S38). Otherwise, the action recognition unit 130 determines that the user's action state is a "walking/running" state ST3 (step S39).
  • the action recognition unit 130 outputs an identifier for the user's action state recognized in this way to the information acquisition unit 150, the priority setting unit 160, and the display controller 180.
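The companion check of FIG. 6C (step S34) amounts to cross-referencing identifiers of nearby devices against identifiers registered in advance in the user DB 140. A minimal sketch with hypothetical function names follows; how the nearby identifiers are collected (location data server or request/response signals) is abstracted away.

```python
def is_with_coworker(nearby_device_ids: set[str],
                     registered_coworker_ids: set[str]) -> bool:
    """Cross-reference nearby device identifiers against coworker
    identifiers registered in advance (step S34)."""
    return bool(nearby_device_ids & registered_coworker_ids)

def classify_still_state(nearby_device_ids: set[str],
                         registered_coworker_ids: set[str]) -> str:
    # While the user has stopped, a coworker nearby maps to "working"
    # (ST1a); otherwise the state is "relaxing" (ST1b).
    if is_with_coworker(nearby_device_ids, registered_coworker_ids):
        return "working (ST1a)"
    return "relaxing (ST1b)"
```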
  • the user DB 140 is a database that stores data related to users of information processing devices 100.
  • the user DB 140 may also, for example, store location data for a user's office and home, as well as identification information for devices possessed by a user's coworkers, family, and friends in order to aid recognition of a user's action state.
  • the user DB 140 may also store login information for a data server utilized when the information acquisition unit 150 acquires information as discussed later.
  • the user DB 140 may also store preference information expressing a user's preferences. Preference information may be automatically collected from a user's content viewing/playback history or email history, or registered by the user him- or herself via a user interface, for example.
  • the information acquisition unit 150 acquires information to provide to a user.
  • the information acquisition unit 150 accesses a data server via the communication unit 112 and acquires information from the data server.
  • Information acquired by the information acquisition unit 150 may include information in various categories, such as advertising information, social information, news information, and traffic information.
  • the information acquisition unit 150 may also periodically acquire up-to-date information from a data server according to a fixed cycle, or acquire information from the data server in response to a trigger, such as the activation of an information-providing application.
  • the information acquisition unit 150 may also acquire information specific to a locality by using positioning data input from the sensor unit 104.
  • the information acquisition unit 150 may also acquire additional information associated with an object or person appearing in a captured image recognized by the environment recognition unit 120.
  • the additional information may include information such as the name and attributes of the object or person, a related message, or a related advertisement.
  • the information acquisition unit 150 outputs acquired information to the display controller 180.
  • Priority setting unit 160 determines priorities for multiple display items according to an action state recognized by the action recognition unit 130.
  • display items refer to individual pieces of information to provide to a user via a screen.
  • Each display item is categorized according to the type of information.
  • the priority setting unit 160 may, for example, set the priority of a display item belonging to a category associated with an action state recognized by the action recognition unit 130 higher than the priority of a display item belonging to another category.
  • news information may be associated with the "working" state ST1a, social information with the "relaxing" state ST1b, and traffic information with the "riding a vehicle” state ST2.
  • Such association patterns may be predefined using a mapping table 165 as exemplified in FIG. 7, for example. Association patterns may also be edited by a user via a user interface.
  • the priority setting unit 160 may also use different association patterns depending on the type of device displaying the display items.
  • FIG. 7 is an explanatory diagram for explaining a first example of setting priorities according to a user's action state.
  • FIG. 7 illustrates the contents of the mapping table 165 in the first example.
  • Each row of the mapping table expresses a particular category of information items.
  • Each column of the mapping table expresses a particular action state.
  • three types of action states are herein assumed to be potentially recognizable: a "sitting/standing still" state ST1, a "riding a vehicle” state ST2, and a "walking/running” state ST3, like the example in FIG. 6A.
  • priority values are numerical values in the range from 1 to 5. A smaller value represents a higher priority.
  • In the case where a user's action state is the "sitting/standing still" state ST1, the priority of advertising information is set to "3", the priorities of social information and news information are set to "1", and the priority of traffic information is set to "5".
  • In the case where a user's action state is the "riding a vehicle" state ST2, the priority of advertising information is set to "5", the priorities of social information and news information are set to "3", and the priority of traffic information is set to "1".
  • In the case where a user's action state is the "walking/running" state ST3, the priority of advertising information is set to "3", the priorities of social information and news information are set to "5", and the priority of traffic information is set to "1".
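The mapping table 165 of FIG. 7 can be represented as a simple lookup structure. The sketch below mirrors the example values for states ST1 to ST3; the dictionary layout and function name are illustrative assumptions, not part of the disclosure.

```python
# A possible in-memory form of the mapping table 165 (FIG. 7).
# Smaller numbers mean higher priority.
MAPPING_TABLE = {
    "advertising": {"ST1": 3, "ST2": 5, "ST3": 3},
    "social":      {"ST1": 1, "ST2": 3, "ST3": 5},
    "news":        {"ST1": 1, "ST2": 3, "ST3": 5},
    "traffic":     {"ST1": 5, "ST2": 1, "ST3": 1},
}

def base_priority(category: str, action_state: str) -> int:
    """Look up the base priority of a display-item category for the
    recognized action state (e.g. base_priority("traffic", "ST2") == 1)."""
    return MAPPING_TABLE[category][action_state]
```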
  • FIG. 8 is an explanatory diagram for explaining a second example of setting priorities according to a user's action state.
  • FIG. 8 illustrates the contents of the mapping table 165 in the second example.
  • Each row of the mapping table expresses a particular sub-category of information items.
  • Each column of the mapping table expresses a particular action state.
  • five types of action states are herein assumed to be potentially recognizable: a "working" state ST1a, a "relaxing" state ST1b, a "riding a train” state ST2a, a "riding a car” state ST2b, and a "walking/running” state ST3, like the example in FIG. 6B.
  • priority values are likewise numerical values in the range from 1 to 5.
  • a smaller value represents a higher priority.
  • In the case where a user's action state is the "working" state ST1a, the priority of music advertising among advertising information is set to "3", the priority of short posts among social information is set to "5", the priority of economic news among news information is set to "1", the priority of train schedule information among traffic information is set to "3", and the priority of navigation information among traffic information is set to "5".
  • In the case where a user's action state is the "riding a train" state ST2a, the priority of music advertising is set to "3", the priority of short posts is set to "3", the priority of economic news is set to "3", the priority of train schedule information is set to "1", and the priority of navigation information is set to "5".
  • In the case where a user's action state is the "walking/running" state ST3, the priority of music advertising is set to "5", the priority of short posts is set to "3", the priority of economic news is set to "3", the priority of train schedule information is set to "3", and the priority of navigation information is set to "3".
  • the priority setting unit 160 may also adjust the priorities set according to a user's action state, depending on a degree of attention for each display item.
  • the degree of attention for each display item may be determined by the attention determination unit 170 discussed later, according to parameters such as a user's preferences, a user's current location, or the number of times that display item has actually been viewed.
  • FIG. 9A is an explanatory diagram for explaining a first technique for adjusting priorities.
  • FIG. 9A illustrates three display items belonging to a category called “social information”, with item IDs of "IT11", “IT12”, and “IT13”, respectively, as well as two display items belonging to a category called “news information”, with item IDs of "IT21” and “IT22”, respectively.
  • the base priorities of the display items IT11, IT12, and IT13 are set to "3" according to the mapping table 165 discussed above, for example.
  • the base priorities of the display items IT21 and IT22 are also set to "3".
  • the priority setting unit 160 adjusts these base priorities (the priorities before adjustment) according to degrees of attention determined on the basis of a user's preferences.
  • a user's preferences may be expressed in a keywords list format, for example, in which keywords are extracted from a history of past content viewed or played back by the user, or from an email history.
  • Japanese Unexamined Patent Application Publication No. 2003-178075 describes technology that extracts a keyword list from a user's email history.
  • preference information 142 in a keyword list format includes two keywords: "car" and "camera".
  • at least one of these keywords is included in the information content of each of the display items IT11 and IT21.
  • the priority setting unit 160 adjusts the priority values set for the display items IT11 and IT21 to "2" by applying an offset "-1" to the base priorities "3".
  • a local area name, a facility name, or a shop name corresponding to a user's current location may also be used in order to adjust priorities.
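A minimal sketch of the preference-based adjustment of FIG. 9A, assuming that preference keywords have already been extracted into a list and that matching is done by simple substring search (the disclosure does not specify the matching method; the function name and default offset are illustrative).

```python
def adjust_by_preference(base_priority: float, item_text: str,
                         keywords: list[str], offset: float = -1.0) -> float:
    """Raise the priority (i.e. lower the numeric value) of a display item
    whose content matches one of the user's preference keywords (FIG. 9A)."""
    if any(kw.lower() in item_text.lower() for kw in keywords):
        return base_priority + offset
    return base_priority

# Example: an item mentioning "camera" with base priority 3 becomes 2.
# adjust_by_preference(3, "New camera announced", ["car", "camera"])  -> 2.0
```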
  • FIG. 9B is an explanatory diagram for explaining a second technique for adjusting priorities.
  • FIG. 9B again illustrates three display items IT11, IT12, and IT13 belonging to a category called "social information", as well as two display items IT21 and IT22 belonging to a category called "news information”.
  • the priority setting unit 160 adjusts the priorities of these display items according to degrees of attention determined on the basis of a user's gaze direction. For example, the degree of attention for each display item increases the more times the user's gaze is directed at that display item.
  • In the example in FIG. 9B, at a time T1 when the user's gaze is directed at the display item IT12, the priority value of the display item IT12 changes to "2.5" due to applying an offset "-0.5" to the base priority "3" of the display item IT12. Also, at a time T2 when the user's gaze is again directed at the display item IT12, the priority value of the display item IT12 changes to "2" due to further applying the offset "-0.5" to the priority of the display item IT12.
  • the attention determination unit 170 determines a degree of attention for each display item. For example, the attention determination unit 170 may determine that the degree of attention is high for a display item having information content with a high correlation to preference information which may be acquired from the user DB 140. The attention determination unit 170 may also determine that the degree of attention is high for a display item having information content with a high correlation to a current location indicated by positioning data input from the sensor unit 104. In addition, by detecting a user's gaze direction, the attention determination unit 170 may also determine a higher degree of attention for a display item that the user has actually viewed more times.
  • the length of time the user's gaze was directed at a display item may also be used to determine a degree of attention instead of the number of times the user viewed a display item.
  • the attention determination unit 170 outputs a degree of attention for each display item determined in this way to the priority setting unit 160.
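The gaze-based adjustment of FIG. 9B can be sketched as a small tracker that accumulates an offset of -0.5 per gaze hit, as in the example above; the class structure and the floor value of 1 (the highest priority) are assumptions for illustration.

```python
class GazeAttentionTracker:
    """Accumulate gaze hits per display item and derive an adjusted
    priority (FIG. 9B)."""

    def __init__(self, offset_per_hit: float = -0.5, floor: float = 1.0):
        self._hits: dict[str, int] = {}
        self.offset_per_hit = offset_per_hit
        self.floor = floor

    def on_gaze(self, item_id: str) -> None:
        # Called each time the detected gaze direction lands on an item.
        self._hits[item_id] = self._hits.get(item_id, 0) + 1

    def adjusted_priority(self, item_id: str, base_priority: float) -> float:
        adjusted = base_priority + self._hits.get(item_id, 0) * self.offset_per_hit
        return max(adjusted, self.floor)
```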
  • the display controller 180 controls the display of display items used to provide a user with information input from the information acquisition unit 150, according to priorities set by the priority setting unit 160. For example, the display controller 180 may cause the display 110 to highlight on-screen a display item set with comparatively high priority. More specifically, the display controller 180 may arrange a display item set with a higher priority closer to the center of the screen. The display controller 180 may also set a larger on-screen size for a display item set with a higher priority. The display controller 180 may also set a higher brightness, lower transparency, higher contrast, or higher sharpness for a display item set with a higher priority. The display controller 180 may also set the color of a display item set with a priority exceeding a threshold to a specific color. Additionally, in the case where the display 110 supports three-dimensional (3D) display, the display controller 180 may also set a shallow depth for a display item set with a higher priority. The display controller 180 may also display on-screen only display items set with comparatively high priorities.
  • the display controller 180 may also determine an object or person in a real space to be perceived by a user according to an action state recognized by the action recognition unit 130, and control the display of display items such that the determined object or person is not obscured by the display items.
  • objects or persons to be visually perceived by a user may include traffic signs and pedestrians while the user is driving a car or riding a bicycle.
  • objects to be perceived by the user may include the screen of an information device while the user is working.
  • the display controller 180 may control at least one of the on-screen position, size, shape, brightness, or transparency of display items such that an object or person to be perceived is not obscured by the display items, for example.
  • in the case where a see-through screen of the display 110 is provided with a filter capable of varying the transmittance of outside light, the display controller 180 is able to allow a user to clearly perceive display items by varying the transmittance rate of the filter.
  • the display controller 180 may set the filter transmittance to maximum, and maintain the maximum transmittance while the battery level of the information processing device 100 is below a specific threshold value.
  • In the case where a screen of the display 110 is a non-see-through screen, the display controller 180 generates an output image by superimposing images of display items respectively having determined display attributes onto a captured image, and outputs the generated output image to the display 110. Meanwhile, in the case where a screen of the display 110 is a see-through screen, the display controller 180 outputs individual images of display items respectively having determined display attributes to the display 110. Several examples of item display controlled by the display controller 180 will be additionally described later.
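One possible way to translate a priority value into the display attributes described above (on-screen position, size, transparency, depth). The formulas and pixel values are illustrative assumptions only; the disclosure does not prescribe any particular mapping.

```python
def display_attributes(priority: float, screen_width_px: int = 1280) -> dict:
    """Map a priority value (1 = highest, 5 = lowest) to illustrative
    display attributes: higher-priority items are larger, closer to the
    screen center, more opaque, and drawn at a shallower depth."""
    p = min(max(priority, 1.0), 5.0)
    weight = (5.0 - p) / 4.0          # 1.0 for priority 1, 0.0 for priority 5
    return {
        "size_px": int(80 + 120 * weight),
        "offset_from_center_px": int((1.0 - weight) * 0.4 * screen_width_px),
        "transparency": round(0.6 * (1.0 - weight), 2),
        "depth": round(1.0 - weight, 2),   # smaller = nearer to the viewer
    }
```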
  • FIG. 10 is a flowchart illustrating an example of the flow of a display control process which may be executed by an information processing device 100 according to an embodiment. The process illustrated in FIG. 10 may be periodically repeated according to a fixed cycle.
  • input data such as captured image data, sensor data, and date/time data is collected via the imaging unit 102, the sensor unit 104, and the communication unit 112 (step S110).
  • the environment recognition unit 120 uses the collected input data to recognize the environment of a real space in which a user is active (step S120).
  • the information acquisition unit 150 acquires information in various categories to provide to the user (step S130).
  • the information acquisition unit 150 may also acquire information specific to a locality, or acquire additional information associated with an object or person appearing in a captured image.
  • the action recognition unit 130 executes an action recognition process (step S140).
  • the action recognition process executed at this point may be any of the processes described using FIGS. 6A to 6C, or a process like that described in Japanese Unexamined Patent Application Publication No. 2006-345269.
  • the priority setting unit 160 sets priorities for display items that respectively express information acquired by the information acquisition unit 150 (step S150).
  • the priority setting unit 160 also adjusts the priorities of display items according to a degree of attention for each display item determined by the attention determination unit 170 (step S160).
  • the display controller 180 determines display attributes for display items according to the priorities set by the priority setting unit 160 (step S170).
  • the display attributes herein may be factors such as the position, size, shape, brightness, transparency, color, and depth of a display item, for example. Additionally, the display controller 180 causes each of the display items to be displayed on-screen with the determined display attributes (step S180).
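The overall cycle of FIG. 10 (steps S110 to S180) could be expressed as follows. Every collaborator object and method name here is a hypothetical stand-in for the corresponding unit described above, not an API defined by the disclosure.

```python
def display_control_cycle(imaging, sensors, recognizer, acquirer,
                          priority_setter, attention, display):
    """One iteration of the display control process of FIG. 10."""
    frame = imaging.capture()                                   # S110: collect input data
    sensor_data = sensors.read()
    environment = recognizer.recognize_environment(frame)       # S120: recognize environment
    items = acquirer.acquire(environment, sensor_data)          # S130: acquire information
    state = recognizer.recognize_action(sensor_data)            # S140: recognize action state
    for item in items:
        item.priority = priority_setter.base_priority(item, state)     # S150: set priorities
        item.priority += attention.offset(item)                        # S160: adjust by attention
        item.attributes = display.determine_attributes(item.priority)  # S170: determine attributes
    display.render(items)                                        # S180: display items
```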
  • FIGS. 11A to 11E illustrate five examples of item display by an information processing device 100.
  • the information processing device 100 is a wearable device exemplified in FIG. 1.
  • a user's action state is a "relaxing" state ST1b.
  • Four display items IT01, IT11, IT23, and IT31 are being displayed on-screen in the information processing device 100.
  • the category of the display item IT01 is advertising information, the category of the display item IT11 is social information, the category of the display item IT23 is news information, and the category of the display item IT31 is traffic information.
  • the display item IT11 belonging to social information associated with the "relaxing" state ST1b is displayed in the center of the screen at the largest size. Consequently, a user relaxing at home is able to easily notice a friend's short post indicated by the display item IT11.
  • the display item IT31 belonging to traffic information is displayed at the edge of the screen at a smaller size. Consequently, the user is not distracted by the presence of traffic information, which is not very important in the current action state.
  • a user's action state is a "riding a train" state ST2a.
  • Four display items IT01, IT11, IT23, and IT31 are being displayed on-screen in the information processing device 100.
  • the display item IT31 belonging to train schedule information associated with the "riding a train" state ST2a is displayed in the center of the screen with the shallowest depth (or in other words, farthest in front). Consequently, a user riding a train is able to reliably ascertain that a certain railway line has stopped by viewing the display item IT31.
  • the display item IT01 belonging to music advertising information is displayed with the deepest depth and at a small size. Consequently, the user is not distracted by the presence of music advertising information which is not very important.
  • a user's action state is a "relaxing" state ST1b.
  • a display item IT01 is displayed at the edge of the screen and at a small size.
  • the star illustrated in FIG. 11C represents the position of the user's gaze as detected by the input unit 106.
  • the user's gaze is directed at the display item IT01.
  • the attention determination unit 170 raises the degree of attention for the display item IT01 under the user's gaze.
  • the priority setting unit 160 accordingly raises the priority of the display item IT01 (that is, changes its priority to a smaller value).
  • the position of the display item IT01 approaches the center of the screen. Also, as a result of the user's continued gaze directed at the display item IT01, on the screen illustrated in the lower-right part of FIG. 11C, the position of the display item IT01 further approaches the center of the screen, and the size of the display item IT01 is enlarged.
  • By adjusting the priority in this way, a user is able to more clearly view information attracting his or her interest without performing a special operation.
  • a user's action state is a "riding a car" state ST2b.
  • Four display items IT01, IT11, IT23, and IT31 are being displayed on-screen in the information processing device 100.
  • a traffic sign TS exists in a real space appearing in the user's visual field.
  • the environment recognition unit 120 recognizes such a traffic sign TS in a captured image.
  • the display controller 180 arranges the display items IT01, IT11, IT23, and IT31 such that the recognized traffic sign TS is not obscured by these display items.
  • a user's action state is a "walking/running" state ST3.
  • Three display items IT01, IT23, and IT31 are being displayed on-screen in the information processing device 100.
  • the display item IT31, which contains information content correlated with the keyword "car" expressing the user's preferences, is displayed at a larger size than the other display items.
  • Also in this example, persons PE1 and PE2 appear in the real space in the user's visual field, and the display item IT01 is arranged on-screen so as to track the movement of these persons PE1 and PE2 in the user's visual field.
  • the position of persons may be recognized by the environment recognition unit 120 using a captured image, or be recognized by transmitting a query to an external data server.
  • These persons PE1 and PE2 are persons participating in the provision of advertising information registered in an advertising information server in advance, for example.
  • a reward may also be paid to the persons PE1 and PE2 by an advertising information service business as a result of the user viewing the display item IT01.
  • Such a system enables the realization of a new type of advertising information service that draws a user's interest.
  • the display item IT01 may also not be displayed at a position overlapping the persons.
  • FIG. 12 illustrates the information processing device 100a exemplified in FIG. 1, and an external device ED.
  • the external device ED is a mobile client such as a smartphone or a mobile PC.
  • the information processing device 100a wirelessly communicates with the external device ED using an arbitrary wireless communication protocol such as wireless local area network (LAN), Bluetooth (registered trademark), or Zigbee.
  • FIG. 13 is an explanatory diagram illustrating an example of item display when linking an information processing device and an external device.
  • a display item IT41 is displayed on-screen in the information processing device 100a.
  • the display item IT41 is an item that expresses additional information (such as a name and attributes of a person PE3, for example) associated with the person PE3 appearing in a captured image.
  • the person PE3 is recognized by the environment recognition unit 120.
  • the additional information may be acquired by the information acquisition unit 150 from a data server or a database inside the external device ED.
  • person recognition is a process that demands comparatively high processor performance, and may include processes such as cross-referencing facial images.
  • a database of additional information may occupy a comparatively large proportion of memory resources.
  • By implementing such processing and data on the external device ED rather than on the information processing device 100a, it becomes possible to realize the information processing device 100a as a low-cost, lightweight, and compact device.
  • a user is able to view the display item IT41 on a head-mounted display rather than a screen of the external device ED. Consequently, in a situation where a user is facing a person, the user is able to ascertain information related to that person, without making that person uncomfortable due to the user shifting his or her gaze to a screen on the external device ED.
  • processes other than person recognition may also be implemented on the external device ED.
  • FIG. 14 is an explanatory diagram for explaining display control based on recognition of an external device.
  • a display item IT41 is likewise displayed on-screen in the information processing device 100a.
  • the display item IT41 is an item that expresses additional information associated with a person PE3.
  • an external device ED exists in a real space appearing in the user's visual field.
  • the environment recognition unit 120 recognizes such an external device ED in a captured image.
  • the display controller 180 arranges the display item IT41 such that the recognized external device ED is not obscured by the display item IT41.
  • the display item IT41 has moved upward compared to FIG. 13. Consequently, in the case where a user is attempting to look at a screen on an external device ED, it is possible to avoid a situation in which that action is hindered by the display of an item.
  • a user's action state may include states related to the user's movement velocity. Consequently, it is possible to provide the user with information in a suitable format according to various user action states, such as a state in which the user is riding a vehicle, a state of walking, a state of running, a state of standing still, and a state of sitting.
  • a user's action state includes states related to transportation being utilized by the user. Consequently, it is possible to provide a user with relevant information in a safer format under conditions where providing information may affect safety, such as a state in which the user is driving a car or riding a bicycle, for example.
  • a user's action state includes states related to a person accompanying a user's activity. Consequently, it is possible to prioritize the provision of more desirable information to the user after suitably determining what kind of information, such as formal information and private information, for example, is desirable for the user at that time.
  • the priority of a display item may be adjusted on the basis of parameters such as the user's preferences, the user's current location, or the user's gaze direction. Consequently, it is possible to prioritize the provision of user-desired information to the user, even among information belonging to a shared category, for example.
  • a display item is displayed on a screen arranged so as to be continually present in a user's visual field while the user engages in activity.
  • displaying a varied assortment of information on-screen may divert the user's attention and hinder the user's activity, while also increasing the risk of the user overlooking highly necessary information.
  • information matched to the user's activity is relatively highlighted or displayed alone, thus effectively reducing the risk of the user overlooking highly necessary information without greatly hindering the user's activity.
  • each device described in this specification may be realized in any of software, hardware, and a combination of software and hardware.
  • Programs constituting software are stored in advance in a non-transitory medium provided internally or externally to each device, for example.
  • Each program is then loaded into random access memory (RAM) at runtime and executed by a processor such as a CPU, for example.
  • An apparatus including: a user action state obtaining circuit configured to obtain an action state of a user; and a display control circuit configured to control a display to modify display information based on the action state.
  • the display control circuit modifies attributes of the display information based on the action state.
  • the display control circuit modifies at least one of a font size and location of the display information based on the action state.
  • the priority setting circuit sets the priority of the plurality of categories of information including advertising information, social information, news information, and traffic information.
  • the user action state obtaining circuit obtains the action state of the user as one of working, relaxing, riding a train, riding a car, riding a bicycle, walking, and running.
  • the user action state obtaining circuit determines the action state of the user based on location data.
  • the priority setting circuit changes the priority of at least one category of information when the action state changes.
  • the priority setting circuit is configured to set the priority of the at least one category of information based on a number of times information in the at least one category of information is viewed by the user.
  • the display control circuit is configured to control the display to display other information received from an external source.
  • the display control circuit is configured to control the display to display the other information including advertising information.
  • An information processing device including: a recognition unit that recognizes an action state of a user while a display item is being displayed on a screen; a setting unit that sets priorities for a plurality of display items according to the action state recognized by the recognition unit; and a display controller that controls display of the display items according to the priorities set by the setting unit.
  • the information processing device according to any one of (1) to (3), wherein the action state includes a state related to a person accompanying activity of the user.
  • the information processing device further including: a determination unit that determines a degree of attention for each display item, wherein the setting unit adjusts the priorities of the plurality of display items according to the degree of attention determined by the determination unit.
  • the determination unit determines the degree of attention for each display item on the basis of a preference of the user.
  • the determination unit determines the degree of attention for each display item on the basis of a current location of the user.
  • the determination unit determines the degree of attention for each display item by detecting a gaze direction of the user.
  • each of the plurality of display items is categorized according to information type, and wherein the setting unit sets a priority of a display item belonging to a category associated with the action state recognized by the recognition unit higher than a priority of a display item belonging to another category.
  • the display controller highlights the display items set with the priorities, the priorities being relatively high.
  • the display controller displays only the display items set with the priorities on the screen, the priorities being relatively high.
  • the information processing device according to any one of (1) to (11), further including: a display that includes the screen arranged to continually enter a visual field of the user while the user is engaged in activity.
  • (13) The information processing device according to (12), wherein the display is a device worn by the user.
  • (14) The information processing device according to any one of (1) to (13), further including: an imaging unit that captures a real space, wherein the display controller determines an object or person in the real space to be visually perceived by the user according to the action state recognized by the recognition unit, and controls display of the display items in a manner that the determined object or person is not obscured by a display item.
  • the information processing device controls at least one of a position, a size, a shape, brightness, or transparency of each display item in a manner that the determined object or person is not obscured by the display item.
  • the display controller controls at least one of a position, a size, a shape, brightness, or transparency of each display item in a manner that the determined object or person is not obscured by the display item.
  • a display control method executed by a controller of an information processing device including: recognizing an action state of a user while a display item is being displayed on a screen; setting priorities for a plurality of display items according to the recognized action state; and controlling display of the display items according to the set priorities.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Environmental & Geological Engineering (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)
  • Automatic Cycles, And Cycles In General (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Control Of El Displays (AREA)
  • Telephone Function (AREA)

Abstract

An apparatus includes a user action state obtaining circuit configured to obtain an action state of a user; and a display control circuit configured to control a display to modify display information based on the action state.

Description

INFORMATION PROCESSING DEVICE, DISPLAY CONTROL METHOD, AND PROGRAM
The present disclosure relates to an information processing device, a display control method, and a program encoded on a non-transitory computer readable medium.
The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2012-219450 filed in the Japan Patent Office on October 1, 2012, the entire content of which is hereby incorporated by reference.
Recently, users are spending more time viewing information provided by information devices as a result of great improvements in information device portability. For example, a user acquires information from a smartphone while commuting to work or school. A navigation device provides information to a user who is driving a car or riding a bicycle. A wearable device such as a head-mounted display (HMD) is able to provide information to a user even while the user is performing any given activity. For example, PTL 1 below proposes displaying different information on a screen of a wearable device depending on outside conditions, in order to provide the user with information that is contextually appropriate or interesting. The outside conditions herein may include factors such as the environmental light level, temperature, humidity and barometric pressure, the distance to a photographic subject, whether or not a photographic subject is a living thing, as well as the current location and date/time.
JP 2008-83290A
Summary
The types of information provided to users from day to day cover a wide range. However, if much information is simultaneously provided to a user, there is a risk that the user will instead overlook highly necessary information. Particularly, it is difficult for a user to check a varied assortment of information one by one in the case of attempting to provide information while a user is engaged in some activity. Thus, it is desirable to provide information in a format enabling the user to reliably notice information that matches the activity.
The present invention broadly comprises an apparatus, method and a non-transitory computer readable medium encoded with a program for performing the method. In one embodiment, the apparatus includes a user action state obtaining circuit configured to obtain an action state of a user; and a display control circuit configured to control a display to modify display information based on the action state.
According to technology in accordance with the present disclosure, it is possible to reduce the risk of a user overlooking highly necessary information while the user is engaged in some activity.
FIG. 1 is an explanatory diagram illustrating a first example of the exterior of an information processing device.
FIG. 2 is an explanatory diagram illustrating a second example of the exterior of an information processing device.
FIG. 3 is an explanatory diagram illustrating a third example of the exterior of an information processing device.
FIG. 4 is a block diagram illustrating an example of a hardware configuration of an information processing device according to an embodiment.
FIG. 5 is a block diagram illustrating an example of a logical function configuration of an information processing device according to an embodiment.
FIG. 6A is a flowchart illustrating a first example of a flow of an action recognition process.
FIG. 6B is a flowchart illustrating a second example of a flow of an action recognition process.
FIG. 6C is a flowchart illustrating a third example of a flow of an action recognition process.
FIG. 7 is an explanatory diagram for explaining a first example of setting priorities according to a user's action state.
FIG. 8 is an explanatory diagram for explaining a second example of setting priorities according to a user's action state.
FIG. 9A is an explanatory diagram for explaining a first technique for adjusting priorities.
FIG. 9B is an explanatory diagram for explaining a second technique for adjusting priorities.
FIG. 10 is a flowchart illustrating an example of the flow of a display control process according to an embodiment.
FIG. 11A is an explanatory diagram illustrating a first example of the display of items.
FIG. 11B is an explanatory diagram illustrating a second example of the display of items.
FIG. 11C is an explanatory diagram illustrating a third example of the display of items.
FIG. 11D is an explanatory diagram illustrating a fourth example of the display of items.
FIG. 11E is an explanatory diagram illustrating a fifth example of the display of items.
FIG. 12 is an explanatory diagram for explaining an example of linking an information processing device and an external device.
FIG. 13 is an explanatory diagram illustrating an example of item display when linking an information processing device and an external device.
FIG. 14 is an explanatory diagram for explaining display control based on recognition of an external device.
Hereinafter, preferred embodiments of the present disclosure will be described in detail and with reference to the attached drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
The description will proceed in the following order.
1. Overview
2. Configuration of device according to embodiment
2-1. Hardware configuration
2-2. Functional configuration
2-3. Process flow
3. Display examples
4. Linking with external device
5. Conclusion
<1. Overview>
Technology according to the present disclosure is applicable to information processing devices in various forms, including portable devices such as a tablet personal computer (PC), mobile PC, smartphone, game console, or portable navigation device (PND), as well as wearable devices such as a head-mounted display (HMD), for example. The screens of these information processing devices may be arranged so as to be continually present in a user's visual field while the user engages in activity.
FIG. 1 is an explanatory diagram illustrating a first example of the exterior of an information processing device to which technology according to the present disclosure may be applied. In the first example, the information processing device 100a is a glasses-style wearable device worn on a user's head. The information processing device 100a is equipped with a pair of screens SCa and SCb, a housing HS, an imaging lens LN, and a touch surface TS. The screens SCa and SCb are see-through or non-see-through screens arranged in front of a user's left eye and right eye, respectively. The housing HS includes a frame that supports the screens SCa and SCb, and what are called temples positioned on the sides of a user's head. Various modules for information processing are stored inside the temples. The imaging lens LN is arranged such that the optical axis is approximately parallel to the user's line of sight, and is used to capture images. The touch surface TS is a surface that detects touches by the user, and is used in order for the information processing device 100a to receive user input.
FIG. 2 is an explanatory diagram illustrating a second example of the exterior of an information processing device to which technology according to the present disclosure may be applied. In the second example, the information processing device 100b is a mobile client carried by a user. The information processing device 100b is equipped with a touch surface TS and a screen SC that are integrally constituted as a touch panel display, as well as buttons BT. In a given usage scenario, the user holds the screen SC up in front of himself or herself, so the screen is continuously present in the user's visual field. An imaging lens of the information processing device 100b may be provided on the back of the screen SC. The touch surface TS and the buttons BT are used in order for the information processing device 100b to receive user input.
FIG. 3 is an explanatory diagram illustrating a third example of the exterior of an information processing device to which technology according to the present disclosure may be applied. In the third example, the information processing device 100c is a navigation device attached to a vehicle VH that a user rides. The information processing device 100c may be a specialized navigation device, or a general-purpose device such as a smartphone with navigation functionality. The information processing device 100c is equipped with a screen SC and buttons BT. The screen SC is continuously present in the user's visual field while the user is riding the vehicle VH. The buttons BT are used in order for the information processing device 100c to receive user input.
These information processing devices are carried by an active user, and provide the user with various information. Various information such as advertising information, social information (such as social networking service (SNS) posts, blog posts, or email), news information, and traffic information may be provided, even while the user is engaged in activity. However, if much information is simultaneously provided to a user, there is a risk that the user will instead overlook highly necessary information. Particularly, it is difficult for a user to check a varied assortment of information one by one in the case of attempting to provide information while the user is engaged in some activity. Accordingly, in the embodiments described in detail in the next section, the provision of information is controlled using priorities that are adaptively set according to user action recognition results, such that the user will reliably notice information inferred to be highly necessary to the user.
<2. Configuration of device according to embodiment>
In the following description, when the information processing devices 100a, 100b, and 100c are not being distinguished from each other, these devices will be collectively referred to as the information processing device 100 by omitting the trailing letters in the reference signs.
<2-1. Hardware configuration>
FIG. 4 is a block diagram illustrating an example of a hardware configuration of an information processing device 100 according to an embodiment. Referring to FIG. 4, the information processing device 100 is equipped with an imaging unit 102, a sensor unit 104, an input unit 106, storage 108, a display 110, a communication unit 112, a bus 116, and a controller 118.
(1) Imaging unit
The imaging unit 102 is a camera module that captures images. The imaging unit 102 includes a lens LN as illustrated by example in FIG. 1, a CCD, CMOS, or other image sensor, and an imaging circuit. The imaging unit 102 generates a captured image depicting a real space. A series of captured images generated by the imaging unit 102 may constitute video.
(2) Sensor unit
The sensor unit 104 may include various sensors such as a positioning sensor, an acceleration sensor, and a gyro sensor. The positioning sensor may be, for example, a Global Positioning System (GPS) sensor that receives GPS signals and measures the latitude, longitude, and altitude of the device. Otherwise, the positioning sensor may be a sensor that executes positioning on the basis of the strengths of wireless signals received from wireless access points. The acceleration sensor measures 3-axis acceleration imparted to the information processing device 100. The gyro sensor measures the tilt angle of the information processing device 100. The sensor unit 104 outputs sensor data indicating measurement results output from these sensors to the controller 118.
(3) Input unit
The input unit 106 is an input interface used in order for a user to operate the information processing device 100 or input information into the information processing device 100. The input unit 106 receives user input via the touch surface TS exemplified in FIG. 1 or the buttons BT exemplified in FIGS. 2 and 3, for example. Instead of (or in addition to) a touch sensor or buttons, the input unit 106 may also include other types of input interfaces, such as switches, dials, a keypad, a keyboard, or a speech input interface. The input unit 106 may additionally include a gaze detection interface that detects the user's gaze direction.
(4) Storage
The storage 108 is realized with a storage medium such as semiconductor memory or a hard disk, and stores programs and data used in processing by the information processing device 100. Data stored by the storage 108 may include captured image data and sensor data, as well as data in a database to be described later, and a mapping table, for example. Note that some of the programs and data described in this specification may also be acquired from an external data source (such as a data server, network storage, or externally attached memory, for example), rather than being stored in the storage 108.
(5) Display
The display 110 is a display module that includes the pair of screens SCa and SCb exemplified in FIG. 1, the screen SC exemplified in FIG. 2, or the screen SC exemplified in FIG. 3, and a display circuit. The display 110 displays on-screen output images generated by a display controller 180 described later.
(6) Communication unit
The communication unit 112 is a communication interface that mediates communication between the information processing device 100 and another device. The communication unit 112 supports an arbitrary wireless communication protocol or wired communication protocol, and establishes a communication connection with another device.
(7) Bus
The bus 116 connects the imaging unit 102, the sensor unit 104, the input unit 106, the storage 108, the display 110, the communication unit 112, and the controller 118 to each other.
(8) Controller
The controller 118 corresponds to a processor such as a central processing unit (CPU) or a digital signal processor (DSP). The controller 118 causes various functions of the information processing device 100 described later to operate by executing a program stored in the storage 108 or another storage medium.
<2-2. Functional configuration>
FIG. 5 is a block diagram illustrating an exemplary configuration of logical functions realized by the storage 108 and the controller 118 of the information processing device 100 illustrated in FIG. 4. Referring to FIG. 5, the information processing device 100 is equipped with an environment recognition unit 120, an action recognition unit 130, a user database (DB) 140, an information acquisition unit 150, a priority setting unit 160, an attention determination unit 170, and a display controller 180.
(1) Environment recognition unit
The environment recognition unit 120 recognizes the environment of the real space in which a user is active. For example, the environment recognition unit 120 may recognize the current date and time. The environment recognition unit 120 may also recognize an object appearing in a captured image input from the imaging unit 102 by using an established object recognition technology such as pattern matching. The environment recognition unit 120 may also recognize a person appearing in a captured image input from the imaging unit 102 by using an established facial image recognition technology. The environment recognition unit 120 outputs environment recognition results to the action recognition unit 130 and the information acquisition unit 150.
(2) Action recognition unit
The action recognition unit 130 recognizes a user's action state. Typically, the action recognition unit 130 executes recognition of a user's action state while a display item is being displayed on-screen on the display 110. As an example, the action recognition unit 130 may recognize action states related to a user's movement speed. Action states related to movement speed may include at least one from among the states "sitting/standing still", "walking/running", and "riding a vehicle", for example. The action recognition unit 130 may also recognize action states related to the transportation being utilized by a user. Action states related to transportation may include at least one from among "riding a train", "riding a car", and "riding a bicycle", for example. For example, Japanese Unexamined Patent Application Publication No. 2006-345269 describes a technique of recognizing the above action states on the basis of sensor data from an acceleration sensor and a gyro sensor. The action recognition unit 130 may recognize a user's action state by using such a sensor-based technique or another established technique. Also, as a simpler technique, the action recognition unit 130 may also prompt a user to explicitly select the current action state via a user interface.
The recognition of action states by the action recognition unit 130 is not limited to the examples discussed above. For example, the action recognition unit 130 may also recognize more detailed action states by additionally using positioning data input from the positioning sensor. The above action state "sitting/standing still" may be further differentiated into the two action states of "working" and "relaxing" by determining, on the basis of positioning data, whether a user is at a specific place such as an office or home. In addition, the above action state "riding a train" may be recognized by determining, on the basis of positioning data, that a user is at a train station or on a train line.
The action recognition unit 130 may also recognize action states related to persons accompanying a user's activity (hereinafter designated companions). The action recognition unit 130 may determine a companion from the results of person recognition by the environment recognition unit 120, for example. Otherwise, the action recognition unit 130 may also determine a companion by querying, via the communication unit 112, a location data server that manages the current locations of multiple users. The action recognition unit 130 may also determine a companion by transmitting, to a nearby device, a request signal requesting identification information such as a user ID or a device ID, receiving a response signal to the request signal, and cross-referencing the received identification information against existing identification information registered in advance. The above action state "sitting/standing still" may be further differentiated into the two action states of "working" and "relaxing" by determining whether a companion is a user's friend or family, or a coworker.
Hereinafter, three examples of action recognition processes which may be executed by the action recognition unit 130 will be described using flowcharts.
FIG. 6A is a flowchart illustrating a first example of a flow of an action recognition process. Referring to FIG. 6A, first, the action recognition unit 130 acquires sensor data generated by the sensor unit 104 (step S30). The sensor data acquired at this point may include positioning data indicating a current user position, as well as acceleration data indicating acceleration imparted to the information processing device 100. The user's current velocity may also be computed from the timewise change in the user location or the integral of the acceleration.
Next, the action recognition unit 130 determines whether or not the user has stopped for a specific period (such as from several seconds to several minutes, for example) (step S31). Herein, "stopped" not only refers to when the user's velocity is exactly zero, but also encompasses situations in which the user's velocity falls below a predetermined first threshold value. In the case where the user has stopped for the specific period, the action recognition unit 130 determines that the user's action state is a "sitting/standing still" state ST1 (step S37).
In the case of determining in step S31 that the user has not stopped, the action recognition unit 130 additionally determines whether the user has been moving at a velocity exceeding a predefined second threshold value (which is larger than the above first threshold value, and may be 30 km/hr, for example) for a specific period (step S35). In the case where the user has been moving at a velocity exceeding the second threshold value for the specific period, the action recognition unit 130 determines that the user's action state is a "riding a vehicle" state ST2 (step S38). Otherwise, the action recognition unit 130 determines that the user's action state is a "walking/running" state ST3 (step S39).
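As a rough illustration only, the threshold-based classification of FIG. 6A might be sketched as follows. The stop threshold value and the helper name are assumptions introduced here for clarity; the text only gives 30 km/hr as an example value for the second threshold.

```python
# Illustrative sketch of the FIG. 6A classification (not the disclosed implementation).
STOP_THRESHOLD_KMH = 3.0      # first threshold; the actual value is an assumption
VEHICLE_THRESHOLD_KMH = 30.0  # second threshold; 30 km/hr is given as an example in the text

def recognize_action_state(velocities_kmh):
    """Classify velocities observed over a specific period into ST1/ST2/ST3."""
    if all(v < STOP_THRESHOLD_KMH for v in velocities_kmh):
        return "ST1 (sitting/standing still)"   # user stopped for the specific period
    if all(v > VEHICLE_THRESHOLD_KMH for v in velocities_kmh):
        return "ST2 (riding a vehicle)"         # sustained high velocity
    return "ST3 (walking/running)"
```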
FIG. 6B is a flowchart illustrating a second example of a flow of an action recognition process. Referring to FIG. 6B, first, the action recognition unit 130 acquires sensor data generated by the sensor unit 104 (step S30). The sensor data acquired at this point may include positioning data and acceleration data, and a user's current velocity may also be computed.
Next, the action recognition unit 130 determines whether or not the user has stopped for a specific period (step S31). In the case where the user has stopped for the specific period, the action recognition unit 130 additionally determines whether or not the place where the user is located is an office (step S32). Note that location data for an office (or some other workplace) where the user works is registered in the user DB 140 in advance. Then, in the case where the place where the user is located is an office, the action recognition unit 130 determines that the user's action state is a "working" state ST1a (step S37a). Meanwhile, in the case where the place where the user is located is not an office, the action recognition unit 130 determines that the user's action state is a "relaxing" state ST1b (step S37b).
In the case of determining in step S31 that the user has not stopped, the action recognition unit 130 additionally determines whether the user has been moving at a velocity exceeding a predefined second threshold value for a specific period (step S35). In the case where the user has been moving at a velocity exceeding the second threshold value for the specific period, the action recognition unit 130 additionally determines whether or not the place where the user is located is a train station or on a train line (step S36). Then, in the case where the place where the user is located is a train station or on a train line, the action recognition unit 130 determines that the user's action state is a "riding a train" state ST2a (step S38a). Meanwhile, in the case where the place where the user is located is neither a train station nor on a train line, the action recognition unit 130 determines that the user's action state is a "riding a car" state ST2b (step S38b). Additionally, in the case where the user has not been moving at a velocity exceeding the second threshold value for the specific period, the action recognition unit 130 determines that the user's action state is a "walking/running" state ST3 (step S39).
FIG. 6C is a flowchart illustrating a third example of a flow of an action recognition process. Referring to FIG. 6C, first, the action recognition unit 130 acquires sensor data generated by the sensor unit 104 (step S30). The sensor data acquired at this point may include positioning data and acceleration data, and a user's current velocity may also be computed.
Next, the action recognition unit 130 determines whether or not the user has stopped for a specific period (step S31). In the case where the user has stopped for the specific period, the action recognition unit 130 acquires identification information for a nearby device by transmitting a query to a location data server, or by transmitting to a nearby device a request signal requesting identification information (step S33). Next, the action recognition unit 130 determines whether or not the user is together with a coworker by cross-referencing the identification information from the nearby device with existing identification information registered in the user DB 140 in advance (step S34). In the case where the user is together with a coworker, the action recognition unit 130 determines that the user's action state is a "working" state ST1a (step S37a). Meanwhile, in the case where the user is not together with a coworker (or is together with a friend or family member), the action recognition unit 130 determines that the user's action state is a "relaxing" state ST1b (step S37b).
In the case of determining in step S31 that the user has not stopped, the action recognition unit 130 additionally determines whether the user has been moving at a velocity exceeding a predefined second threshold value for a specific period (step S35). In the case where the user has been moving at a velocity exceeding the second threshold value for the specific period, the action recognition unit 130 determines that the user's action state is a "riding a vehicle" state ST2 (step S38). Otherwise, the action recognition unit 130 determines that the user's action state is a "walking/running" state ST3 (step S39).
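The companion-based branch of FIG. 6C (steps S33, S34, S37a, and S37b) could be pictured with the short sketch below. The identifier registries and the helper function name are assumptions introduced only for illustration, not part of the disclosed configuration.

```python
# Illustrative sketch: differentiate "sitting/standing still" into "working" or
# "relaxing" by cross-referencing identification information received from nearby
# devices against identification information registered in advance (assumed registries).

def classify_stationary_state(nearby_device_ids, coworker_ids, family_friend_ids):
    nearby = set(nearby_device_ids)
    if nearby & set(coworker_ids):
        return "ST1a (working)"       # a coworker accompanies the user
    if nearby & set(family_friend_ids):
        return "ST1b (relaxing)"      # a friend or family member accompanies the user
    return "ST1b (relaxing)"          # no registered companion detected
```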
The action recognition unit 130 outputs an identifier for the user's action state recognized in this way to the information acquisition unit 150, the priority setting unit 160, and the display controller 180.
(3) User DB
The user DB 140 is a database that stores data related to users of information processing devices 100. The user DB 140 may also, for example, store location data for a user's office and home, as well as identification information for devices possessed by a user's coworkers, family, and friends in order to aid recognition of a user's action state. The user DB 140 may also store login information for a data server utilized when the information acquisition unit 150 acquires information as discussed later. The user DB 140 may also store preference information expressing a user's preferences. Preference information may be automatically collected from a user's content viewing/playback history or email history, or registered by the user him- or herself via a user interface, for example.
(4) Information acquisition unit
The information acquisition unit 150 acquires information to provide to a user. For example, the information acquisition unit 150 accesses a data server via the communication unit 112 and acquires information from the data server. Information acquired by the information acquisition unit 150 may include information in various categories, such as advertising information, social information, news information, and traffic information. The information acquisition unit 150 may also periodically acquire up-to-date information from a data server according to a fixed cycle, or acquire information from the data server in response to a trigger, such as the activation of an information-providing application. The information acquisition unit 150 may also acquire information specific to a locality by using positioning data input from the sensor unit 104. The information acquisition unit 150 may also acquire additional information associated with an object or person appearing in a captured image recognized by the environment recognition unit 120. The additional information may include information such as the name and attributes of the object or person, a related message, or a related advertisement. The information acquisition unit 150 outputs acquired information to the display controller 180.
(5) Priority setting unit
The priority setting unit 160 determines priorities for multiple display items according to an action state recognized by the action recognition unit 130. In this specification, display items refer to individual pieces of information to provide to a user via a screen. Each display item is categorized according to the type of information. The priority setting unit 160 may, for example, set the priority of a display item belonging to a category associated with an action state recognized by the action recognition unit 130 higher than the priority of a display item belonging to another category. As an example, news information may be associated with the "working" state ST1a, social information with the "relaxing" state ST1b, and traffic information with the "riding a vehicle" state ST2. Such association patterns may be predefined using a mapping table 165 as exemplified in FIG. 7, for example. Association patterns may also be edited by a user via a user interface. The priority setting unit 160 may also use different association patterns depending on the type of device displaying the display items.
FIG. 7 is an explanatory diagram for explaining a first example of setting priorities according to a user's action state. FIG. 7 illustrates the contents of the mapping table 165 in the first example. Each row of the mapping table expresses a particular category of information items. Each column of the mapping table expresses a particular action state. Note that three types of action states are herein assumed to be potentially recognizable: a "sitting/standing still" state ST1, a "riding a vehicle" state ST2, and a "walking/running" state ST3, like the example in FIG. 6A. In the example in FIG. 7, priority values are numerical values in the range from 1 to 5. A smaller value represents a higher priority. In the case where a user's action state is the state ST1, the priority of advertising information is set to "3", the priorities of social information and news information are set to "1", and the priority of traffic information is set to "5". In the case where a user's action state is the state ST2, the priority of advertising information is set to "5", the priorities of social information and news information are set to "3", and the priority of traffic information is set to "1". In the case where a user's action state is the state ST3, the priority of advertising information is set to "3", the priorities of social information and news information are set to "5", and the priority of traffic information is set to "1".
FIG. 8 is an explanatory diagram for explaining a second example of setting priorities according to a user's action state. FIG. 8 illustrates the contents of the mapping table 165 in the second example. Each row of the mapping table expresses a particular sub-category of information items. Each column of the mapping table expresses a particular action state. Note that five types of action states are herein assumed to be potentially recognizable: a "working" state ST1a, a "relaxing" state ST1b, a "riding a train" state ST2a, a "riding a car" state ST2b, and a "walking/running" state ST3, like the example in FIG. 6B. In the example in FIG. 8, priority values are likewise numerical values in the range from 1 to 5. A smaller value represents a higher priority. In the case where a user's action state is the state ST1a, the priority of music advertising among advertising information is set to "3", the priority of short posts among social information is set to "5", the priority of economic news among news information is set to "1", the priority of train schedule information among traffic information is set to "3", and the priority of navigation information among traffic information is set to "5". In the case where a user's action state is the state ST1b, the priority of music advertising among advertising information is set to "3", the priority of short posts among social information is set to "1", the priority of economic news among news information is set to "3", the priority of train schedule information among traffic information is set to "3", and the priority of navigation information among traffic information is set to "5". In the case where a user's action state is the state ST2a, the priority of music advertising among advertising information is set to "3", the priority of short posts among social information is set to "3", the priority of economic news among news information is set to "3", the priority of train schedule information among traffic information is set to "1", and the priority of navigation information among traffic information is set to "5". In the case where a user's action state is the state ST2b, the priority of music advertising among advertising information is set to "5", the priority of short posts among social information is set to "5", the priority of economic news among news information is set to "3", the priority of train schedule information among traffic information is set to "5", and the priority of navigation information among traffic information is set to "1". In the case where a user's action state is the state ST3, the priority of music advertising among advertising information is set to "5", the priority of short posts among social information is set to "3", the priority of economic news among news information is set to "3", the priority of train schedule information among traffic information is set to "3", and the priority of navigation information among traffic information is set to "3".
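As a rough sketch, the coarser association pattern of FIG. 7 could be held in a simple lookup structure like the following, where a smaller value represents a higher priority. The category keys and the function name are illustrative assumptions.

```python
# Illustrative representation of the mapping table 165 in FIG. 7.
MAPPING_TABLE = {
    "advertising": {"ST1": 3, "ST2": 5, "ST3": 3},
    "social":      {"ST1": 1, "ST2": 3, "ST3": 5},
    "news":        {"ST1": 1, "ST2": 3, "ST3": 5},
    "traffic":     {"ST1": 5, "ST2": 1, "ST3": 1},
}

def base_priority(category, action_state):
    """Return the base priority of a display item's category for the recognized action state."""
    return MAPPING_TABLE[category][action_state]

# e.g. base_priority("traffic", "ST2") -> 1 while the user is riding a vehicle
```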
The priority setting unit 160 may also adjust the priorities set according to a user's action state depending on a degree of attention for each display item. The degree of attention for each display item may be determined by the attention determination unit 170 discussed later, according to parameters such as a user's preferences, a user's current location, or the number of times that display item has actually been viewed.
FIG. 9A is an explanatory diagram for explaining a first technique for adjusting priorities. FIG. 9A illustrates three display items belonging to a category called "social information", with item IDs of "IT11", "IT12", and "IT13", respectively, as well as two display items belonging to a category called "news information", with item IDs of "IT21" and "IT22", respectively. The base priorities of the display items IT11, IT12, and IT13 are set to "3" according to the mapping table 165 discussed above, for example. The base priorities of the display items IT21 and IT22 are also set to "3".
With the first technique, the priority setting unit 160 adjusts these base priorities (the priorities before adjustment) according to degrees of attention determined on the basis of a user's preferences. A user's preferences may be expressed in a keyword list format, for example, in which keywords are extracted from a history of content previously viewed or played back by the user, or from an email history. For example, Japanese Unexamined Patent Application Publication No. 2003-178075 describes technology that extracts a keyword list from a user's email history. In the example in FIG. 9A, preference information 142 in a keyword list format includes two keywords: "car" and "camera". At least one of these keywords is included in the information content of each of the display items IT11 and IT21. Consequently, the priority setting unit 160 adjusts the priority values set for the display items IT11 and IT21 to "2" by applying an offset of "-1" to the base priorities of "3". Instead of (or in addition to) keywords expressing a user's preferences, a local area name, a facility name, or a shop name corresponding to a user's current location may also be used in order to adjust priorities.
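A minimal sketch of this first adjustment technique, assuming a fixed offset of -1 and simple substring matching against the preference keywords:

```python
PREFERENCE_OFFSET = -1  # applied when an item's content matches a preference keyword (FIG. 9A)

def adjust_by_preference(base_priority, item_text, preference_keywords):
    """Raise the priority (i.e. lower the priority value) of items whose
    content contains at least one of the user's preference keywords."""
    if any(keyword.lower() in item_text.lower() for keyword in preference_keywords):
        return base_priority + PREFERENCE_OFFSET
    return base_priority

# e.g. adjust_by_preference(3, "New compact car announced", ["car", "camera"]) -> 2
```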
FIG. 9B is an explanatory diagram for explaining a second technique for adjusting priorities. FIG. 9B again illustrates three display items IT11, IT12, and IT13 belonging to a category called "social information", as well as two display items IT21 and IT22 belonging to a category called "news information". With the second technique, the priority setting unit 160 adjusts the priorities of these display items according to degrees of attention determined on the basis of a user's gaze direction. For example, the degree of attention for each display item increases the more times the user's gaze is directed at that display item. In the example in FIG. 9B, at a time T1 when the user's gaze is directed at the display item IT12, the priority value of the display item IT12 changes to "2.5" due to applying an offset of "-0.5" to the base priority "3". Then, at a time T2 when the user's gaze is again directed at the display item IT12, the priority value changes to "2" as the offset of "-0.5" is applied a second time.
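The second technique could likewise be sketched as follows. The lower bound on the priority value is an assumption, added only so that repeated gazes cannot push the value past the highest priority.

```python
GAZE_OFFSET = -0.5  # applied each time the user's gaze is detected on an item (FIG. 9B)

def adjust_by_gaze(base_priority, gaze_count, highest_priority=1.0):
    """Apply the gaze offset once per detected gaze, clamped at the highest priority (assumed floor)."""
    return max(highest_priority, base_priority + GAZE_OFFSET * gaze_count)

# e.g. a base priority of 3 becomes 2.5 after one gaze (time T1) and 2.0 after two (time T2)
```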
(6) Attention determination unit
The attention determination unit 170 determines a degree of attention for each display item. For example, the attention determination unit 170 may determine that the degree of attention is high for a display item having information content with a high correlation to preference information which may be acquired from the user DB 140. The attention determination unit 170 may also determine that the degree of attention is high for a display item having information content with a high correlation to a current location indicated by positioning data input from the sensor unit 104. In addition, by detecting a user's gaze direction, the attention determination unit 170 may also determine a higher degree of attention for a display item that the user has actually viewed more times. The length of time the user's gaze was directed at a display item may also be used to determine a degree of attention instead of the number of times the user viewed a display item. The attention determination unit 170 outputs a degree of attention for each display item determined in this way to the priority setting unit 160.
(7) Display controller
The display controller 180 controls the display of display items used to provide a user with information input from the information acquisition unit 150, according to priorities set by the priority setting unit 160. For example, the display controller 180 may cause the display 110 to highlight on-screen a display item set with comparatively high priority. More specifically, the display controller 180 may arrange a display item set with a higher priority closer to the center of the screen. The display controller 180 may also set a larger on-screen size for a display item set with a higher priority. The display controller 180 may also set a higher brightness, lower transparency, higher contrast, or higher sharpness for a display item set with a higher priority. The display controller 180 may also set the color of a display item set with a priority exceeding a threshold to a specific color. Additionally, in the case where the display 110 supports three-dimensional (3D) display, the display controller 180 may also set a shallow depth for a display item set with a higher priority. The display controller 180 may also display on-screen only display items set with comparatively high priorities.
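One way to picture the relationship between a priority value and the resulting display attributes is the sketch below. The particular scaling formula is an assumption, chosen only to show the intended direction of each attribute rather than the disclosed implementation.

```python
def display_attributes(priority):
    """Map a priority value (1 = highest, 5 = lowest) to illustrative display attributes:
    higher-priority items are drawn larger, closer to the screen center, more opaque, and nearer."""
    weight = max(0.0, min(1.0, (5.0 - priority) / 4.0))  # 1.0 for priority 1, 0.0 for priority 5
    return {
        "size_scale":   0.5 + 0.5 * weight,  # relative size, 50%..100%
        "center_pull":  weight,              # 0 = screen edge, 1 = screen center
        "transparency": 1.0 - weight,        # low-priority items fade out
        "depth":        1.0 - weight,        # shallower depth (nearer) for high priority
    }
```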
The display controller 180 may also determine an object or person in a real space to be perceived by a user according to an action state recognized by the action recognition unit 130, and control the display of display items such that the determined object or person is not obscured by the display items. For example, objects or persons to be visually perceived by a user may include traffic signs and pedestrians while the user is driving a car or riding a bicycle. In addition, objects to be perceived by the user may include the screen of an information device while the user is working. The display controller 180 may control at least one of the on-screen position, size, shape, brightness, or transparency of display items such that an object or person to be perceived is not obscured by the display items, for example.
Note that in the case where the screen of the display 110 includes a filter that transmits outside light according to a variable transmittance rate, the display controller 180 is able to allow a user to clearly perceive display items by varying the transmittance rate of the filter. However, if the battery level of the information processing device 100 reaches zero, the transmittance rate of the filter may become unchangeable. Consequently, the display controller 180 may set the filter transmittance to maximum, and maintain the maximum transmittance while the battery level of the information processing device 100 is below a specific threshold value. Thus, it is possible to preemptively avoid situations in which a user's actions are impeded because the transmittance is unchangeable with the screen in a dark state.
In the case where a screen of the display 110 is a non-see-through screen, the display controller 180 generates an output image by superimposing images of display items respectively having determined display attributes onto a captured image, and outputs the generated output image to the display 110. Meanwhile, in the case where a screen of the display 110 is a see-through screen, the display controller 180 outputs individual images of display items respectively having determined display attributes to the display 110. Several examples of item display controlled by the display controller 180 will be additionally described later.
<2-3. Process flow>
FIG. 10 is a flowchart illustrating an example of the flow of a display control process which may be executed by an information processing device 100 according to an embodiment. The process illustrated in FIG. 10 may be periodically repeated according to a fixed cycle.
Referring to FIG. 10, first, input data such as captured image data, sensor data, and date/time data is collected via the imaging unit 102, the sensor unit 104, and the communication unit 112 (step S110). The environment recognition unit 120 uses the collected input data to recognize the environment of a real space in which a user is active (step S120).
Next, the information acquisition unit 150 acquires information in various categories to provide to the user (step S130). The information acquisition unit 150 may also acquire information specific to a locality, or acquire additional information associated with an object or person appearing in a captured image.
Also, the action recognition unit 130 executes an action recognition process (step S140). The action recognition process executed at this point may be any of the processes described using FIGS. 6A to 6C, or a process like that described in Japanese Unexamined Patent Application Publication No. 2006-345269.
Next, according to an action state recognized by the action recognition unit 130, the priority setting unit 160 sets priorities for display items that respectively express information acquired by the information acquisition unit 150 (step S150). The priority setting unit 160 also adjusts the priorities of display items according to a degree of attention for each display item determined by the attention determination unit 170 (step S160).
Next, the display controller 180 determines display attributes for display items according to the priorities set by the priority setting unit 160 (step S170). The display attributes herein may be factors such as the position, size, shape, brightness, transparency, color, and depth of a display item, for example. Additionally, the display controller 180 causes each of the display items to be displayed on-screen with the determined display attributes (step S180).
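Putting the steps of FIG. 10 together, one periodic cycle might look roughly like this; the component objects and method names are assumptions used only to show the order of the steps.

```python
def display_control_cycle(environment, recognizer, acquirer, setter, attention, controller):
    """One iteration of the periodic display control process (steps S110 to S180)."""
    context = environment.recognize()                     # S110-S120: collect input, recognize environment
    items = acquirer.acquire(context)                     # S130: acquire information to provide
    state = recognizer.recognize_action_state()           # S140: action recognition process
    priorities = setter.set_priorities(items, state)      # S150: set priorities per action state
    priorities = setter.adjust(priorities, attention.degrees(items))  # S160: adjust by degree of attention
    attributes = {item: controller.attributes_for(p) for item, p in priorities.items()}  # S170
    controller.render(attributes)                         # S180: display items with determined attributes
```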
<3. Display examples>
FIGS. 11A to 11E illustrate five examples of item display by an information processing device 100. In these examples, the information processing device 100 is a wearable device exemplified in FIG. 1.
(1) First example
In the first example in FIG. 11A, a user's action state is a "relaxing" state ST1b. Four display items IT01, IT11, IT23, and IT31 are being displayed on-screen in the information processing device 100. The category of the display item IT01 is advertising information. The category of the display item IT11 is social information. The category of the display item IT23 is news information. The category of the display item IT31 is traffic information. In the first example, the display item IT11 belonging to social information associated with the "relaxing" state ST1b is displayed in the center of the screen at the largest size. Consequently, a user relaxing at home is able to easily notice a friend's short post indicated by the display item IT11. Meanwhile, the display item IT31 belonging to traffic information is displayed at the edge of the screen at a smaller size. Consequently, the user is not distracted by the presence of traffic information, which is not very important in the current action state.
(2) Second example
In the second example in FIG. 11B, a user's action state is a "riding a train" state ST2a. Four display items IT01, IT11, IT23, and IT31 are being displayed on-screen in the information processing device 100. In the second example, the display item IT31 belonging to train schedule information associated with the "riding a train" state ST2a is displayed in the center of the screen with the shallowest depth (or in other words, farthest in front). Consequently, a user riding a train is able to reliably ascertain that a certain railway line has stopped by viewing the display item IT31. Meanwhile, the display item IT01 belonging to music advertising information is displayed with the deepest depth and at a small size. Consequently, the user is not distracted by the presence of music advertising information which is not very important.
(3) Third example
In the third example in FIG. 11C, a user's action state is a "relaxing" state ST1b. On the screen illustrated in the upper part of FIG. 11C, a display item IT01 is displayed at the edge of the screen and at a small size. The star illustrated in FIG. 11C represents the position of the user's gaze as detected by the input unit 106. The user's gaze is directed at the display item IT01. In this case, the attention determination unit 170 raises the degree of attention for the display item IT01 under the user's gaze. Subsequently, the priority setting unit 160 raises the priority of the display item IT01 (that is, lowers its priority value). As a result, on the screen illustrated in the lower-left part of FIG. 11C, the position of the display item IT01 approaches the center of the screen. Also, as a result of the user's continued gaze directed at the display item IT01, on the screen illustrated in the lower-right part of FIG. 11C, the position of the display item IT01 further approaches the center of the screen, and the size of the display item IT01 is enlarged. By adjusting the priority in this way, a user is able to more clearly view information attracting his or her interest without performing a special operation.
(4) Fourth example
In the fourth example in FIG. 11D, a user's action state is a "riding a car" state ST2b. Four display items IT01, IT11, IT23, and IT31 are being displayed on-screen in the information processing device 100. Also, a traffic sign TS exists in a real space appearing in the user's visual field. The environment recognition unit 120 recognizes such a traffic sign TS in a captured image. The display controller 180 arranges the display items IT01, IT11, IT23, and IT31 such that the recognized traffic sign TS is not obscured by these display items. By displaying items in this way, it is possible to avoid situations where a user's appropriate actions (safely driving a car, for example) are hindered as a result of providing information on-screen.
(5) Fifth example
In the fifth example in FIG. 11E, a user's action state is a "walking/running" state ST3. Three display items IT01, IT23, and IT31 are being displayed on-screen in the information processing device 100. Among these display items, the display item IT31, which contains information content correlated with the keyword "car" expressing the user's preferences, is displayed at a larger size than the other display items.
In addition, persons PE1 and PE2 are walking in a real space appearing in the user's visual field. The display item IT01 is arranged on-screen so as to track the movement of these persons PE1 and PE2 in the user's visual field. The position of persons may be recognized by the environment recognition unit 120 using a captured image, or be recognized by transmitting a query to an external data server. These persons PE1 and PE2 are persons participating in the provision of advertising information registered in an advertising information server in advance, for example. A reward may also be paid to the persons PE1 and PE2 by an advertising information service business as a result of the user viewing the display item IT01. Such a system enables the realization of a new type of advertising information service that draws a user's interest. Note that in cases of low-precision position recognition, the display item IT01 may also not be displayed at a position overlapping the persons.
<4. Linking with external device>
The functionality of the information processing device 100 discussed above may also be realized by the linkage of multiple devices. FIG. 12 illustrates the information processing device 100a exemplified in FIG. 1, and an external device ED. The external device ED is a mobile client such as a smartphone or a mobile PC. The information processing device 100a wirelessly communicates with the external device ED using an arbitrary wireless communication protocol such as wireless local area network (LAN), Bluetooth (registered trademark), or Zigbee. In addition, one or more of the various logical functions of the information processing device 100a illustrated in FIG. 5 may be executed in the external device ED.
FIG. 13 is an explanatory diagram illustrating an example of item display when linking an information processing device and an external device. Referring to FIG. 13, a display item IT41 is displayed on-screen in the information processing device 100a. The display item IT41 is an item that expresses additional information (such as a name and attributes of a person PE3, for example) associated with the person PE3 appearing in a captured image. The person PE3 is recognized by the environment recognition unit 120. The additional information may be acquired by the information acquisition unit 150 from a data server or a database inside the external device ED. Typically, person recognition is a process that demands comparatively high processor performance, and may include processes such as cross-referencing facial images. A database of additional information may occupy a comparatively large proportion of memory resources. Consequently, by implementing the person recognition process or the database of additional information on the external device ED and linking the information processing device 100a with the external device ED, it becomes possible to realize the information processing device 100a as a low-cost, lightweight, and compact device. In addition, a user is able to view the display item IT41 on a head-mounted display rather than a screen of the external device ED. Consequently, in a situation where a user is facing a person, the user is able to ascertain information related to that person, without making that person uncomfortable due to the user shifting his or her gaze to a screen on the external device ED. Note that processes other than person recognition (such as object recognition, text recognition, action recognition, and priority setting, for example) may also be implemented on the external device ED.
FIG. 14 is an explanatory diagram illustrating display control based on recognition of an external device. In the example in FIG. 14, a display item IT41 is likewise displayed on-screen in the information processing device 100a. The display item IT41 is an item that expresses additional information associated with a person PE3. Also, an external device ED exists in a real space appearing in the user's visual field. The environment recognition unit 120 recognizes such an external device ED in a captured image. The display controller 180 arranges the display item IT41 such that the recognized external device ED is not obscured by the display item IT41. In FIG. 14, the display item IT41 has moved upward compared to FIG. 13. Consequently, in the case where a user is attempting to look at the screen of the external device ED, it is possible to avoid a situation in which that action is hindered by the display of an item.
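A minimal sketch of this repositioning, assuming simple axis-aligned bounding boxes (an assumption; the embodiment does not specify the geometry used), might look as follows.

```python
# Illustrative sketch (not the patent's algorithm) of the FIG. 14 behavior:
# if display item IT41 would cover the recognized external device ED, shift
# the item upward until the two rectangles no longer overlap.

from dataclasses import dataclass


@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def overlaps(self, other: "Rect") -> bool:
        return (self.x < other.x + other.w and other.x < self.x + self.w and
                self.y < other.y + other.h and other.y < self.y + self.h)


def avoid_obscuring(item: Rect, device: Rect, margin: int = 10) -> Rect:
    """Return a new position for the item that does not cover the device."""
    if not item.overlaps(device):
        return item
    # Move the item just above the device's bounding box (compare FIG. 13 vs. 14).
    return Rect(item.x, device.y - item.h - margin, item.w, item.h)


it41 = Rect(300, 250, 200, 80)
ed_on_screen = Rect(320, 260, 160, 240)
print(avoid_obscuring(it41, ed_on_screen))  # item shifted upward
```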
<5. Conclusion>
The foregoing thus describes embodiments of technology according to the present disclosure in detail using FIGS. 1 to 14. According to the foregoing embodiment, priorities are set for multiple display items according to a user's action state while the display items are being displayed on-screen, and the display of display items is controlled according to the set priorities. Consequently, a user is able to reliably notice information matched to the user's own actions without individually checking a varied assortment of information displayed on-screen.
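As a non-authoritative sketch of this overall flow (Python; the category names and priority values are illustrative assumptions, not values from the embodiment), the mapping from action state to per-category priorities and the resulting display filtering could be expressed as follows.

```python
# Hedged sketch of the control loop: priorities are set per information
# category from the recognized action state, and only relatively
# high-priority items are kept (or highlighted). Values are illustrative.

ACTION_STATE_PRIORITIES = {
    "riding_train": {"information": 3, "social": 3, "advertising": 2, "traffic": 1},
    "driving_car":  {"traffic": 3, "information": 1, "social": 0, "advertising": 0},
    "walking":      {"traffic": 2, "advertising": 2, "information": 1, "social": 1},
}


def select_items(action_state: str, items: list, threshold: int = 2) -> list:
    """Keep only items whose category priority is relatively high for this state."""
    priorities = ACTION_STATE_PRIORITIES.get(action_state, {})
    return [item for item in items
            if priorities.get(item["category"], 0) >= threshold]


items = [
    {"id": "IT11", "category": "traffic"},
    {"id": "IT12", "category": "social"},
    {"id": "IT13", "category": "advertising"},
]
print(select_items("driving_car", items))   # only the traffic item remains
```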
According to an embodiment, a user's action state may include states related to the user's movement velocity. Consequently, it is possible to provide the user with information in a suitable format according to various user action states, such as a state in which the user is riding a vehicle, a state of walking, a state of running, a state of standing still, and a state of sitting.
Also, according to an embodiment, a user's action state includes states related to transportation being utilized by the user. Consequently, it is possible to provide a user with relevant information in a safer format under conditions where providing information may affect safety, such as a state in which the user is driving a car or riding a bicycle, for example.
Also, according to an embodiment, a user's action state includes states related to a person accompanying the user's activity. Consequently, it is possible to prioritize the provision of more desirable information to the user after suitably determining what kind of information (for example, formal information or private information) is desirable for the user at that time.
Also, according to an embodiment, the priority of a display item may be adjusted on the basis of parameters such as the user's preferences, the user's current location, or the user's gaze direction. Consequently, it is possible to prioritize the provision of user-desired information to the user, even among information belonging to a shared category, for example.
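A hedged sketch of such adjustment (the weights below are illustrative assumptions, not values from the embodiment) could raise a base category priority whenever an item matches the user's preferences, current location, or gaze direction.

```python
# Illustrative attention-based adjustment: items in the same category can end
# up with different priorities depending on preference, location, and gaze.

def adjusted_priority(base: int,
                      matches_preference: bool,
                      near_current_location: bool,
                      in_gaze_direction: bool) -> int:
    bonus = 0
    if matches_preference:
        bonus += 1
    if near_current_location:
        bonus += 1
    if in_gaze_direction:
        bonus += 2   # direct gaze is treated as the strongest signal here
    return base + bonus


print(adjusted_priority(2, True, False, False))   # 3
print(adjusted_priority(2, False, False, True))   # 4
```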
Also, according to an embodiment, a display item is displayed on a screen arranged so as to be continually present in a user's visual field while the user engages in activity. In this case, displaying a varied assortment of information on-screen may divert the user's attention and hinder the user's activity, while also increasing the risk of the user overlooking highly necessary information. However, according to technology in accordance with the present disclosure, information matched to the user's activity is relatively highlighted or displayed alone, thus effectively reducing the risk of the user overlooking highly necessary information without greatly hindering the user's activity.
Note that the series of processes conducted by each device described in this specification may be realized in any of software, hardware, and a combination of software and hardware. Programs constituting software are stored in advance in a non-transitory medium provided internally or externally to each device, for example. Each program is then loaded into random access memory (RAM) at runtime and executed by a processor such as a CPU, for example.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Additionally, the present technology may also be configured as below.
(1) An apparatus including:
a user action state obtaining circuit configured to obtain an action state of a user; and
a display control circuit configured to control a display to modify display information based on the action state.
(2) The apparatus according to (1), wherein the display control circuit modifies attributes of the display information based on the action state.
(3) The apparatus according to (2), wherein the display circuit modifies at least one of a font size and location of the display information based on the action state.
(4) The apparatus according to (1) to (3), further comprising:
a priority setting circuit configured to set a priority of at least one category of information based on the action state.
(5) The apparatus according to (4), wherein the priority setting circuit sets the priority of a plurality of categories of information based on the action state.
(6) The apparatus according to (5), wherein the priority setting circuit sets the priority of the plurality of categories of information including advertising information, social information, information, and traffic information.
(7) The apparatus according to (1) to (6), wherein the user action state obtaining circuit obtains the action state of the user as one of working, relaxing, riding a train, riding a car, riding a bicycle, walking, and running.
(8) The apparatus according to (7), wherein the user action state obtaining circuit determines the action state of the user based on location data.
(9) The apparatus according to (7), wherein the user action state obtaining circuit determines the action state of the user based on data from an acceleration sensor and a gyro sensor.
(10) The apparatus according to (4), wherein the priority setting circuit changes the priority of at least one category of information when the action state changes.
(11) The apparatus according to (4), further comprising:
an attention determining circuit configured to determine a degree of attention the user pays to information,
wherein the priority setting circuit is configured to set the priority of the information based on the degree of attention.
(12) The apparatus according to (4), wherein the priority setting circuit is configured to set the priority of the at least one category of information based on a number of times information in the at least one category of information is viewed by the user.
(13) The apparatus according to (4), wherein the priority setting circuit is configured to set the priority of the at least one category of information based on a location of the user.
(14) The apparatus according to (1) to (13), wherein the display control circuit is configured to control the display to display other information received from an external source.
(15) The apparatus according to (14), wherein the display control circuit is configured to control the display to display the other information including advertising information.
(16) The apparatus according to (1) to (15), further comprising:
an eyeglass frame onto which is mounted the display control circuit and the user action state obtaining circuit;
a display mounted in the eyeglass frame and configured to display images generated by the display control circuit;
an imaging device mounted on the eyeglass frame and configured to generate images; and
an object recognition circuit configured to recognize objects in the images generated by the imaging device.
(17) The apparatus according to (16), wherein the display control circuit modifies display of the information such that the information does not overlap the objects in the images.
(18) The apparatus according to (16), wherein the display control circuit modifies display of the information such that the information is associated with one of the objects in the images.
(19) A method including:
obtaining an action state of a user; and
controlling a display, using a processor, to modify display information based on the action state.
(20) A non-transitory computer readable medium encoded with computer readable instructions that, when performed by a processor, cause the processor to perform the method according to (19).
Additionally, the present technology may also be configured as below.
(1)
An information processing device including:
a recognition unit that recognizes an action state of a user while a display item is being displayed on a screen;
a setting unit that sets priorities for a plurality of display items according to the action state recognized by the recognition unit; and
a display controller that controls display of the display items according to the priorities set by the setting unit.
(2)
The information processing device according to (1), wherein
the action state includes a state related to movement velocity of the user.
(3)
The information processing device according to (1) or (2), wherein
the action state includes a state related to transportation being utilized by the user.
(4)
The information processing device according to any one of (1) to (3), wherein
the action state includes a state related to a person accompanying activity of the user.
(5)
The information processing device according to any one of (1) to (4), further including:
a determination unit that determines a degree of attention for each display item,
wherein the setting unit adjusts the priorities of the plurality of display items according to the degree of attention determined by the determination unit.
(6)
The information processing device according to (5), wherein
the determination unit determines the degree of attention for each display item on the basis of a preference of the user.
(7)
The information processing device according to (5), wherein
the determination unit determines the degree of attention for each display item on the basis of a current location of the user.
(8)
The information processing device according to (5), wherein
the determination unit determines the degree of attention for each display item by detecting a gaze direction of the user.
(9)
The information processing device according to any one of (1) to (8),
wherein each of the plurality of display items is categorized according to information type, and
wherein the setting unit sets a priority of a display item belonging to a category associated with the action state recognized by the recognition unit higher than a priority of a display item belonging to another category.
(10)
The information processing device according to any one of (1) to (9), wherein
the display controller highlights the display items set with the priorities, the priorities being relatively high.
(11)
The information processing device according to any one of (1) to (9), wherein
the display controller displays only the display items set with the priorities on the screen, the priorities being relatively high.
(12)
The information processing device according to any one of (1) to (11), further including:
a display that includes the screen arranged to continually enter a visual field of the user while the user is engaged in activity.
(13)
The information processing device according to (12), wherein
the display is a device worn by the user.
(14)
The information processing device according to any one of (1) to (13), further including:
an imaging unit that captures a real space,
wherein the display controller determines an object or person in the real space to be visually perceived by the user according to the action state recognized by the recognition unit, and controls display of the display items in a manner that the determined object or person is not obscured by a display item.
(15)
The information processing device according to (14), wherein
the display controller controls at least one of a position, a size, a shape, brightness, or transparency of each display item in a manner that the determined object or person is not obscured by the display item.
(16)
The information processing device according to any one of (1) to (15),
wherein at least one of the plurality of display items is an item expressing additional information associated with an object or person appearing in a captured image, and
wherein the information processing device further includes
an acquisition unit that acquires the additional information from an external device.
(17)
The information processing device according to (16), further including:
a display, worn by the user, that includes the screen arranged to enter a visual field of the user;
wherein the external device is a mobile client that communicates with the information processing device.
(18)
The information processing device according to (16),
wherein the additional information is advertising information, and
wherein the external device is a data server that includes data associated with the person who participates in provision of the advertising information.
(19)
A display control method executed by a controller of an information processing device, including:
recognizing an action state of a user while a display item is being displayed on a screen;
setting priorities for a plurality of display items according to the recognized action state; and
controlling display of the display items according to the set priorities.
(20)
A program for causing a computer that controls an information processing device to function as:
a recognition unit that recognizes an action state of a user while a display item is being displayed on a screen;
a setting unit that sets priorities for a plurality of display items according to the action state recognized by the recognition unit; and
a display controller that controls display of the display items according to the priorities set by the setting unit.
100 (100a, 100b, 100c) information processing device
120 environment recognition unit
130 action recognition unit
150 information acquisition unit
160 priority setting unit
170 attention determination unit
180 display controller

Claims (20)

  1. An apparatus comprising:
    a user action state obtaining circuit configured to obtain an action state of a user; and
    a display control circuit configured to control a display to modify display information based on the action state.
  2. The apparatus according to claim 1, wherein the display control circuit modifies attributes of the display information based on the action state.
  3. The apparatus according to claim 2, wherein the display circuit modifies at least one of a font size and location of the display information based on the action state.
  4. The apparatus according to claim 1, further comprising:
    a priority setting circuit configured to set a priority of at least one category of information based on the action state.
  5. The apparatus according to claim 4, wherein the priority setting circuit sets the priority of a plurality of categories of information based on the action state.
  6. The apparatus according to claim 5, wherein the priority setting circuit sets the priority of the plurality of categories of information including advertising information, social information, information, and traffic information.
  7. The apparatus according to claim 1, wherein the user action state obtaining circuit obtains the action state of the user as one of working, relaxing, riding a train, riding a car, riding a bicycle, walking, and running.
  8. The apparatus according to claim 7, wherein the user action state obtaining circuit determines the action state of the user based on location data.
  9. The apparatus according to claim 7, wherein the user action state obtaining circuit determines the action state of the user based on data from an acceleration sensor and a gyro sensor.
  10. The apparatus according to claim 4, wherein the priority setting circuit changes the priority of at least one category of information when the action state changes.
  11. The apparatus according to claim 4, further comprising:
    an attention determining circuit configured to determine a degree of attention the user pays to information,
    wherein the priority setting circuit is configured to set the priority of the information based on the degree of attention.
  12. The apparatus according to claim 4, wherein the priority setting circuit is configured to set the priority of the at least one category of information based on a number of times information in the at least one category of information is viewed by the user.
  13. The apparatus according to claim 4, wherein the priority setting circuit is configured to set the priority of the at least one category of information based on a location of the user.
  14. The apparatus according to claim 1, wherein the display control circuit is configured to control the display to display other information received from an external source.
  15. The apparatus according to claim 14, wherein the display control circuit is configured to control the display to display the other information including advertising information.
  16. The apparatus according to claim 1, further comprising:
    an eyeglass frame onto which is mounted the display control circuit and the user action state obtaining circuit;
    a display mounted in the eyeglass frame and configured to display images generated by the display control circuit;
    an imaging device mounted on the eyeglass frame and configured to generate images; and
    an object recognition circuit configured to recognize objects in the images generated by the imaging device.
  17. The apparatus according to claim 16, wherein the display control circuit modifies display of the information such that the information does not overlap the objects in the images.
  18. The apparatus according to claim 16, wherein the display control circuit modifies display of the information such that the information is associated with one of the objects in the images.
  19. A method comprising:
    obtaining an action state of a user; and
    controlling a display, using a processor, to modify display information based on the action state.
  20. A non-transitory computer readable medium encoded with computer readable instructions that, when performed by a processor, cause the processor to perform the method according to claim 19.