US20170038892A1 - Control device, control method, and computer program

Control device, control method, and computer program

Info

Publication number
US20170038892A1
Authority
US
United States
Prior art keywords
processing system
information processing
menu
user
diagram illustrating
Prior art date
Legal status
Abandoned
Application number
US15/106,717
Other languages
English (en)
Inventor
Tetsuo Ikeda
Takayuki Sakamoto
Tomohiro Ishii
Atsushi IZUMIHARA
Masayuki Yamada
Yohei Fukuma
Yuzuru Kimura
Yasushi Okumura
Takashi Shibuya
Katsuya HYODO
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Priority date
Filing date
Publication date
Application filed by Sony Corp
Assigned to SONY CORPORATION. Assignment of assignors interest (see document for details). Assignors: OKUMURA, YASUSHI; HYODO, KATSUYA; SHIBUYA, TAKASHI; YAMADA, MASAYUKI; ISHII, TOMOHIRO; FUKUMA, YOHEI; KIMURA, YUZURU; IKEDA, TETSUO; IZUMIHARA, ATSUSHI; SAKAMOTO, TAKAYUKI
Publication of US20170038892A1

Classifications

    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/0425: Digitisers characterised by opto-electronic transducing means, using a single imaging device (e.g. a video camera) for tracking the absolute position of one or more objects with respect to an imaged reference surface, e.g. a display, projection screen, table or wall on which a computer-generated image is displayed or projected
    • G06F3/0428: Digitisers characterised by opto-electronic transducing means, sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane parallel to the (possibly virtual) touch surface
    • G06F3/0481: GUI techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. desktop elements such as windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06F3/04845: GUI techniques for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487: GUI techniques using specific features provided by the input device or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/04883: GUI techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886: GUI techniques using a touch-screen or digitiser, partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/167: Audio in a user interface, e.g. voice commands for navigating, audio feedback
    • G06F2203/04101: 2.5D-digitiser, i.e. a digitiser detecting the X/Y position of the input means (finger or stylus) also when it does not touch but is proximate to the interaction surface, and measuring its distance within a short range in the Z direction
    • G06F2203/04808: Several contacts: gestures triggering a specific function when the user establishes several simultaneous contacts with the surface, e.g. scrolling, zooming or right-click using several fingers or a combination of fingers and pen
    • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G03B17/54: Details of cameras or camera bodies, and accessories therefor, adapted for combination with a projector
    • G09G5/363: Graphics controllers
    • G09G5/377: Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • G09G2354/00: Aspects of interface with display user
    • H04N5/74: Projection arrangements for image reproduction, e.g. using eidophor
    • H04N9/3155: Modulator illumination systems for controlling the light source (colour projection devices)
    • H04N9/3179: Video signal processing for colour projection devices
    • H04N9/3182: Colour adjustment, e.g. white balance, shading or gamut
    • H04N9/3194: Testing of projection devices, including sensor feedback
    • H04R29/005: Monitoring or testing arrangements for microphone arrays
    • H04R29/008: Visual indication of individual signal levels
    • H04R2201/40: Arrangements for obtaining a desired directional characteristic by combining a number of identical transducers
    • H04R2430/01: Aspects of volume control, not necessarily automatic, in sound systems
    • H04S7/40: Visual indication of stereophonic sound image

Definitions

  • The present disclosure relates to a control device, a control method, and a computer program.
  • Devices displaying various kinds of information through manipulations on touch panels have become widespread.
  • The sizes of screens have also increased, and simultaneous manipulation by a plurality of users is being considered.
  • In addition, projectors have come into use as devices that display information.
  • Patent Literature 1 proposes a method of simultaneously displaying a plurality of windows when displaying information. Specifically, in a portion in which first and second windows are superimposed, the display information of the window on the rear side is displayed more faintly than the display information of the window on the front side, so that the display information of both windows can be viewed.
  • Patent Literature 1: JP H8-123652A
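  • Expressed in code, the overlap handling proposed in Patent Literature 1 amounts to compositing the rear window into the overlapping region with reduced opacity so that both windows remain legible. The following is a minimal sketch of that idea under simple assumptions (RGB arrays already placed in screen coordinates, a given overlap rectangle, and an illustrative alpha of 0.4); it is not the actual method of Patent Literature 1.

```python
import numpy as np

def composite_overlap(front, rear, overlap, rear_alpha=0.4):
    """Blend the rear window into the front window inside the overlap region.

    front, rear: HxWx3 float arrays holding the two windows in screen coordinates.
    overlap: (y0, y1, x0, x1) bounds of the superimposed portion.
    The rear window is drawn more faintly (lower alpha) than the front window,
    so the display information of both windows stays visible.
    """
    y0, y1, x0, x1 = overlap
    out = front.copy()
    out[y0:y1, x0:x1] = (rear_alpha * rear[y0:y1, x0:x1]
                         + (1.0 - rear_alpha) * front[y0:y1, x0:x1])
    return out
```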
  • According to the present disclosure, there is provided a control device including: a generation unit configured to generate control information for controlling an output unit such that an output associated with a predetermined condition is executed when a state on a display surface detected by a detection unit satisfies the condition.
  • According to the present disclosure, there is also provided a control method including: generating control information for controlling an output unit such that an output associated with a predetermined condition is executed when a detected state on a display surface satisfies the condition.
  • According to the present disclosure, there is also provided a computer program causing a computer to execute: generating control information for controlling an output unit such that an output associated with a predetermined condition is executed when a detected state on a display surface satisfies the condition.
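  • In code, the control summarized in these aspects reduces to evaluating predefined conditions against the state detected on the display surface and emitting the control information associated with any condition that is satisfied. The sketch below is a hypothetical illustration of that structure; the class names, the dictionary-based surface state, and the example rule are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class OutputRule:
    condition: Callable[[Dict], bool]  # predicate over the detected surface state
    output: Dict                       # control information associated with the condition

class GenerationUnit:
    """Generates control information when the detected state satisfies a condition."""

    def __init__(self, rules: List[OutputRule]):
        self.rules = rules

    def generate(self, surface_state: Dict) -> List[Dict]:
        # surface_state would come from the detection unit, e.g. touched points
        # or objects recognized on the display surface and their positions.
        return [rule.output for rule in self.rules if rule.condition(surface_state)]

# Example rule: play a chime through the speaker when an object is detected
# near the center of the display surface.
rules = [OutputRule(condition=lambda s: s.get("object_at_center", False),
                    output={"unit": "speaker", "action": "play", "clip": "chime"})]
controls = GenerationUnit(rules).generate({"object_at_center": True})
```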
  • FIG. 1 is an explanatory diagram illustrating an example of the configuration of an information processing system according to an embodiment of the present disclosure.
  • FIG. 2 is an explanatory diagram illustrating an example of the configuration of an information processing system according to an embodiment of the present disclosure.
  • FIG. 3 is an explanatory diagram illustrating an example of the configuration of an information processing system according to an embodiment of the present disclosure.
  • FIG. 4 is an explanatory diagram illustrating an example of the configuration of an information processing system according to an embodiment of the present disclosure.
  • FIG. 5 is an explanatory diagram illustrating an example of a functional configuration of the information processing system according to an embodiment of the present disclosure.
  • FIG. 6 is an explanatory diagram illustrating an example of a manipulation situation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 7 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 8 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 9 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 10 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 11 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 12 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 13 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 14 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 15 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 16 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 17 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 18 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 19 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 20 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 21 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 22 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 23 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 24 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 25 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 26 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 27 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 28 is an explanatory diagram illustrating a menu display control example in an information processing system 100 d.
  • FIG. 29 is an explanatory diagram illustrating a menu display control example in an information processing system 100 d.
  • FIG. 30 is an explanatory diagram illustrating a menu display control example in an information processing system 100 c.
  • FIG. 31 is an explanatory diagram illustrating a menu display control example in an information processing system 100 a.
  • FIG. 32 is a flowchart illustrating an example of an operation of a portable terminal linked to the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 33 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 34 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 35 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 36 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 37 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 38 is a flowchart illustrating a use example of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 39 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 40 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 41 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 42 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 43 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 44 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 45 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 46 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 47 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 48 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 49 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 50 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 51 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 52 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 53 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 54 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 55 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 56 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 57 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 58 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 59 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 60 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 61 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 62 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 63 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 64 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 65 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 66 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 67 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 68 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 69 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 70 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 71 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 72 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 73 is an explanatory diagram illustrating an example of a GUI of an application.
  • FIG. 74 is an explanatory diagram illustrating a user interface according to specific example 1.
  • FIG. 75 is an explanatory diagram illustrating a user interface according to specific example 1.
  • FIG. 76 is an explanatory diagram illustrating a user interface according to specific example 1.
  • FIG. 77 is an explanatory diagram illustrating a user interface according to specific example 2.
  • FIG. 78 is an explanatory diagram illustrating a user interface according to specific example 2.
  • FIG. 79 is an explanatory diagram illustrating a user interface according to specific example 2.
  • FIG. 80 is an explanatory diagram illustrating a user interface according to specific example 2.
  • FIG. 81 is an explanatory diagram illustrating a user interface according to specific example 2.
  • FIG. 82 is an explanatory diagram illustrating a user interface according to specific example 2.
  • FIG. 83 is an explanatory diagram illustrating a user interface according to specific example 2.
  • FIG. 84 is an explanatory diagram illustrating a user interface according to specific example 2.
  • FIG. 85 is an explanatory diagram illustrating a user interface according to specific example 2.
  • FIG. 86 is an explanatory diagram illustrating a user interface according to specific example 3.
  • FIG. 87 is an explanatory diagram illustrating a user interface according to specific example 3.
  • FIG. 88 is an explanatory diagram illustrating a user interface according to specific example 3.
  • FIG. 89 is an explanatory diagram illustrating a user interface according to specific example 3.
  • FIG. 90 is an explanatory diagram illustrating a user interface according to specific example 3.
  • FIG. 91 is an explanatory diagram illustrating a user interface according to specific example 3.
  • FIG. 92 is an explanatory diagram illustrating a user interface according to specific example 3.
  • FIG. 93 is an explanatory diagram illustrating a user interface according to specific example 3.
  • FIG. 94 is an explanatory diagram illustrating a user interface according to specific example 4.
  • FIG. 95 is an explanatory diagram illustrating a user interface according to specific example 5.
  • FIG. 96 is an explanatory diagram illustrating a user interface according to specific example 5.
  • FIG. 97 is an explanatory diagram illustrating a user interface according to specific example 5.
  • FIG. 98 is an explanatory diagram illustrating a user interface according to specific example 6.
  • FIG. 99 is an explanatory diagram illustrating a user interface according to specific example 6.
  • FIG. 100 is an explanatory diagram illustrating a user interface according to specific example 7.
  • FIG. 101 is an explanatory diagram illustrating a user interface according to specific example 7.
  • FIG. 102 is an explanatory diagram illustrating a user interface according to specific example 7.
  • FIG. 103 is an explanatory diagram illustrating a user interface according to specific example 7.
  • FIG. 104 is an explanatory diagram illustrating a user interface according to specific example 8.
  • FIG. 105 is an explanatory diagram illustrating a user interface according to specific example 8.
  • FIG. 106 is an explanatory diagram illustrating a user interface according to specific example 8.
  • FIG. 107 is an explanatory diagram illustrating a user interface according to specific example 8.
  • FIG. 108 is an explanatory diagram illustrating a user interface according to specific example 8.
  • FIG. 109 is an explanatory diagram illustrating a user interface according to specific example 8.
  • FIG. 110 is an explanatory diagram illustrating a user interface according to specific example 8.
  • FIG. 111 is an explanatory diagram illustrating a user interface according to specific example 9.
  • FIG. 112 is an explanatory diagram illustrating a user interface according to specific example 9.
  • FIG. 113 is an explanatory diagram illustrating a user interface according to specific example 9.
  • FIG. 114 is an explanatory diagram illustrating a user interface according to specific example 9.
  • FIG. 115 is an explanatory diagram illustrating a user interface according to specific example 10.
  • FIG. 116 is an explanatory diagram illustrating a user interface according to specific example 10.
  • FIG. 117 is an explanatory diagram illustrating a user interface according to specific example 10.
  • FIG. 118 is an explanatory diagram illustrating a user interface according to specific example 11.
  • FIG. 119 is an explanatory diagram illustrating a user interface according to specific example 12.
  • FIG. 120 is an explanatory diagram illustrating a user interface according to specific example 12.
  • FIG. 121 is an explanatory diagram illustrating a user interface according to specific example 12.
  • FIG. 122 is an explanatory diagram illustrating a user interface according to specific example 12.
  • FIG. 123 is an explanatory diagram illustrating a user interface according to specific example 12.
  • FIG. 124 is an explanatory diagram illustrating a user interface according to specific example 13.
  • FIG. 125 is an explanatory diagram illustrating a user interface according to specific example 13.
  • FIG. 126 is an explanatory diagram illustrating a user interface according to specific example 13.
  • FIG. 127 is an explanatory diagram illustrating a user interface according to specific example 13.
  • FIG. 128 is an explanatory diagram illustrating a specific example of a karuta card assistance application.
  • FIG. 129 is an explanatory diagram illustrating a specific example of a conversation assistance application.
  • FIG. 130 is an explanatory diagram illustrating a specific example of a projection surface tracking application.
  • FIG. 131 is an explanatory diagram illustrating a specific example of a projection surface tracking application.
  • FIG. 132 is an explanatory diagram illustrating a specific example of a projection surface tracking application.
  • FIG. 133 is an explanatory diagram illustrating a specific example of a projection surface tracking application.
  • FIG. 134 is an explanatory diagram illustrating a specific example of a meal assistance application.
  • FIG. 135 is an explanatory diagram illustrating another specific example of the meal assistance application.
  • FIG. 136 is an explanatory diagram illustrating a specific example of a motion effect application.
  • FIG. 137 is an explanatory diagram illustrating a specific example of the motion effect application.
  • FIG. 138 is an explanatory diagram illustrating a specific example of a lunch box preparation supporting application.
  • FIG. 139 is an explanatory diagram illustrating a specific example of user assistance by a daily assistance application.
  • FIG. 140 is an explanatory diagram illustrating a specific example of user assistance by a daily assistance application.
  • FIG. 141 is an explanatory diagram illustrating a specific example of user assistance by a daily assistance application.
  • FIG. 142 is an explanatory diagram illustrating a specific example of user assistance by a daily assistance application.
  • FIG. 143 is an explanatory diagram illustrating a specific example of user assistance by a daily assistance application.
  • FIG. 144 is an explanatory diagram illustrating a specific example of a dining table representation application.
  • FIG. 145 is an explanatory diagram illustrating a specific example of a food recommendation application.
  • FIG. 146 is an explanatory diagram illustrating a specific example of a tableware effect application.
  • FIG. 147 is an explanatory diagram illustrating a specific example of an inter-room linking application.
  • FIG. 148 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 149 is an explanatory diagram illustrating an example of an illumination map.
  • FIG. 150 is an explanatory diagram illustrating an example of an environment map.
  • FIG. 151 is an explanatory diagram illustrating an example of association between the illumination map and the environment map.
  • FIG. 152 is an explanatory diagram illustrating an example of an application illumination association table.
  • FIG. 153 is an explanatory diagram illustrating examples of values of the illumination map and the environment map.
  • FIG. 154 is an explanatory diagram illustrating examples of values of the illumination map and the environment map.
  • FIG. 155 is an explanatory diagram illustrating an example when outside light is reflected to the environment map.
  • FIG. 156 is an explanatory diagram illustrating examples of values of the illumination map and the environment map.
  • FIG. 157 is an explanatory diagram illustrating a specific example of an application.
  • FIG. 158 is an explanatory diagram illustrating a specific example of an application.
  • FIG. 159 is an explanatory diagram illustrating a specific example of an application.
  • FIG. 160 is an explanatory diagram illustrating a specific example of an application.
  • FIG. 161 is an explanatory diagram illustrating a specific example of an application.
  • FIG. 162 is an explanatory diagram illustrating a specific example of an application.
  • FIG. 163 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 164 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 165 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 166 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 167 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 168 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 169 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 170 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 171 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 172 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 173 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 174 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 175 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 176 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 177 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 178 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 179 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 180 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 181 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 182 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 183 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 184 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 185 is an explanatory diagram illustrating an example of visibility of a provoking function.
  • FIG. 186 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 187 is an explanatory diagram illustrating an example of a combination of triggers.
  • FIG. 188 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 189 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 190 is an explanatory diagram illustrating an example of a manipulation method and a mode of a window.
  • FIG. 191 is an explanatory diagram illustrating an example of a manipulation method and a mode of a window.
  • FIG. 192 is an explanatory diagram illustrating an example of a manipulation method and a mode of a window.
  • FIG. 193 is an explanatory diagram illustrating an example of a manipulation method and a mode of a window.
  • FIG. 194 is an explanatory diagram illustrating an example of a manipulation of a window.
  • FIG. 195 is an explanatory diagram illustrating manipulations by a user.
  • FIG. 196 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 197 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 198 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 199 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 200 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 201 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 202 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 203 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 204 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 205 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 206 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 207 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 208 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 209 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 210 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 211 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 212 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 213 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 214 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 215 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 216 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 217 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 218 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 219 is an explanatory diagram illustrating an example of a manipulation on a window by the user.
  • FIG. 220 is an explanatory diagram illustrating a display control example when a window interferes with a real object placed on a projection surface.
  • FIG. 221 is an explanatory diagram illustrating a display example of information.
  • FIG. 222 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 223 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 224 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 225 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 226 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 227 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 228 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 229 is an explanatory diagram illustrating an example of a GUI.
  • FIG. 230 is an explanatory diagram illustrating an overview of a user interface.
  • FIG. 231 is an explanatory diagram illustrating an overview of a user interface.
  • FIG. 232 is an explanatory diagram illustrating an example of a user position estimation function.
  • FIG. 233 is an explanatory diagram illustrating an example of a user position estimation function.
  • FIG. 234 is an explanatory diagram illustrating an example of a user position estimation function.
  • FIG. 235 is an explanatory diagram illustrating an example of a user position estimation function.
  • FIG. 236 is an explanatory diagram illustrating an example of a user position estimation function.
  • FIG. 237 is an explanatory diagram illustrating an example of a user position estimation function.
  • FIG. 238 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 239 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 240 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 241 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 242 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 243 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 244 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 245 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 246 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 247 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 248 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 249 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 250 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 251 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 252 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 253 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 254 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 255 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 256 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 257 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 258 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 259 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 260 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 261 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 262 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 263 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 264 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 265 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 266 is a flowchart illustrating an example of the flow of a display control process executed in the information processing system.
  • FIG. 267 is an explanatory diagram illustrating an overview of a user interface.
  • FIG. 268 is a block diagram illustrating an example of a logical configuration of the information processing system.
  • FIG. 269 is an explanatory diagram illustrating an example of a real object recognition function.
  • FIG. 270 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 271 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 272 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 273 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 274 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 275 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 276 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 277 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 278 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 279 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 280 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 281 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 282 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 283 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 284 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 285 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 286 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 287 is an explanatory diagram illustrating an example of a display region decision process.
  • FIG. 288 is a flowchart illustrating an example of the flow of a display control process executed in the information processing system.
  • FIG. 289 is a flowchart illustrating an example of the flow of a display region decision process executed in the information processing system.
  • FIG. 290 is a flowchart illustrating an example of the flow of a display region decision process executed in the information processing system.
  • FIG. 291 is a block diagram illustrating an example of a logical configuration of the information processing system.
  • FIG. 292 is an explanatory diagram illustrating a process of calculating a projection magnification.
  • FIG. 293 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 294 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 295 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 296 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 297 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 298 is a flowchart illustrating an example of the flow of the display control process executed in the information processing system.
  • FIG. 299 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 300 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 301 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 302 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 303 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 304 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 305 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 306 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 307 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 308 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 309 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 310 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 311 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 312 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 313 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 314 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 315 is an explanatory diagram illustrating an example of a user interface.
  • FIG. 316 is a flowchart illustrating an example of the flow of a display control process executed in the information processing system.
  • FIG. 317 is a flowchart illustrating an example of the flow of a display control process executed in the information processing system.
  • FIG. 318 is an explanatory diagram illustrating a hardware configuration example.
  • FIG. 1 is an explanatory diagram illustrating an example of the configuration of the information processing system according to the embodiment of the present disclosure.
  • An example of the configuration of the information processing system according to the embodiment of the present disclosure will be described with reference to FIG. 1.
  • An information processing system 100 a is configured to include an input unit 110 a and an output unit 130 a.
  • The information processing system 100 a according to the embodiment of the present disclosure illustrated in FIG. 1 is a system that displays information on the top surface of a table 140 a and allows a user using the information processing system 100 a to manipulate the information displayed on the table 140 a.
  • A scheme of displaying information on the top surface of the table 140 a as in FIG. 1 is also referred to as a “projection type.”
  • The input unit 110 a is a device that inputs the manipulation content of the user using the information processing system 100 a or the shape or design of an object placed on the table 140 a.
  • The input unit 110 a is provided above the table 140 a so as to be suspended from a ceiling. That is, the input unit 110 a is provided separately from the table 140 a, which is the target on which information is displayed.
  • As the input unit 110 a, for example, a camera that images the table 140 a using one lens, a stereo camera that can image the table 140 a using two lenses and record information in a depth direction, or a microphone that collects sounds uttered by the user using the information processing system 100 a or environmental sounds of the environment in which the information processing system 100 a is placed can be used.
  • When a camera that images the table 140 a using one lens is used as the input unit 110 a, the information processing system 100 a can detect an object placed on the table 140 a by analyzing an image captured by the camera.
  • When the stereo camera is used as the input unit 110 a , a visible light camera or an infrared camera can be used as the stereo camera. By using the stereo camera as the input unit 110 a , the input unit 110 a can acquire depth information.
  • the information processing system 100 a can detect, for example, a hand or an object placed on the table 140 a .
  • the information processing system 100 a can detect touch or approach of a hand of the user to the table 140 a or can detect separation of the hand from the table 140 a .
  • a user touching or approaching an information display surface with a manipulator such as a hand is also collectively referred to simply as “touch.”
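  • As a rough illustration of how touch or approach could be derived from the depth information mentioned above, the following sketch compares the depth of a detected fingertip with the known depth of the table surface. The threshold values and the assumption that depth is measured in millimeters from the camera are illustrative and not taken from the disclosure.

```python
# Minimal sketch: classifying touch/approach from depth data obtained by a
# stereo camera, assuming the depth of the empty table surface was recorded
# beforehand and that all depths are measured in millimeters from the camera.

TABLE_DEPTH_MM = 1200.0       # assumed distance from the camera to the table top
TOUCH_THRESHOLD_MM = 10.0     # a fingertip within 10 mm of the surface counts as touch
APPROACH_THRESHOLD_MM = 50.0  # within 50 mm counts as approach

def classify_hand_state(fingertip_depth_mm: float) -> str:
    """Return 'touch', 'approach', or 'separate' for a detected fingertip."""
    height_above_table = TABLE_DEPTH_MM - fingertip_depth_mm
    if height_above_table <= TOUCH_THRESHOLD_MM:
        return "touch"
    if height_above_table <= APPROACH_THRESHOLD_MM:
        return "approach"
    return "separate"

if __name__ == "__main__":
    for depth in (1195.0, 1160.0, 1050.0):
        print(depth, classify_hand_state(depth))
```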
  • a microphone array collecting a sound in a specific direction can be used as the microphone.
  • the information processing system 100 a may adjust a sound collection direction of the microphone array to any direction.
  • a manipulation by the user may be detected by a touch panel that detects touch of a finger of the user.
  • examples of the user manipulation which can be acquired by the input unit 110 a can include a stylus manipulation on an information display surface and a gesture manipulation on a camera.
  • the output unit 130 a is a device that displays information on the table 140 a according to information regarding manipulation content input through the input unit 110 a by the user using the information processing system 100 a , content of information output by the output unit 130 a , or the shape or design of an object placed on the table 140 a or that outputs a sound.
  • a projector or a speaker is used as the output unit 130 a .
  • the output unit 130 a is provided above the table 140 a to be suspended from a ceiling.
  • When the output unit 130 a is configured of a projector, the output unit 130 a projects information onto the top surface of the table 140 a .
  • When the output unit 130 a is configured of a speaker, the output unit 130 a outputs a sound based on a sound signal.
  • the number of speakers may be one or plural.
  • the information processing system 100 a may limit the speakers outputting sounds or adjust a sound output direction.
  • the output unit 130 a may include an illumination device.
  • the information processing system 100 a may control an on or off state or the like of the illumination device based on content of information input through the input unit 110 a.
  • the user using the information processing system 100 a can place his or her finger or the like on the table 140 a to manipulate information displayed on the table 140 a by the output unit 130 a .
  • the user using the information processing system 100 a can place an object on the table 140 a , cause the input unit 110 a to recognize the object, and execute various manipulations on the recognized object.
  • another device may be connected to the information processing system 100 a .
  • an illumination device illuminating the table 140 a may be connected to the information processing system 100 a .
  • the information processing system 100 a can control a lighting state of the illumination device according to a state of an information display surface.
  • FIGS. 2 to 4 are explanatory diagrams illustrating examples of other new forms of information processing systems according to embodiments of the present disclosure.
  • FIG. 2 is an explanatory diagram illustrating an example of the configuration of an information processing system 100 b according to an embodiment of the present disclosure.
  • the information processing system 100 b is configured to display information on the top surface of the table 140 b by causing the output unit 130 a to radiate information from the lower side of the table 140 b . That is, in the information processing system 100 b illustrated in FIG. 2 , the information display surface is the top surface of the table 140 b .
  • the surface of the table 140 b is formed of a transparent material such as a glass plate or a transparent plastic plate.
  • a scheme of displaying information on the top surface of the table 140 b as in FIG. 2 is also referred to as a “rear projection type.”
  • a configuration in which an input unit 110 b is provided on the front surface of the table 140 b is illustrated.
  • the input unit 110 b may be provided below the table 140 b to be separated from the table 140 b.
  • FIG. 3 is an explanatory diagram illustrating an example of the configuration of an information processing system 100 c according to an embodiment of the present disclosure.
  • FIG. 3 illustrates a state in which a touch panel type display is placed on a table.
  • an input unit 110 c and an output unit 130 c can be configured as a touch panel type display. That is, in the information processing system 100 c illustrated in FIG. 3 , an information display surface is the touch panel type display.
  • a camera detecting the position of a user may be provided above the touch panel type display, as in the information processing system 100 a illustrated in FIG. 1 .
  • FIG. 4 is an explanatory diagram illustrating an example of the configuration of an information processing system 100 d according to an embodiment of the present disclosure.
  • FIG. 4 illustrates a state in which a flat panel type display is placed on a table. That is, in the information processing system 100 d illustrated in FIG. 4 , an information display surface is a flat panel type display. In this way, in the case of the flat panel type display, an input unit 110 d and an output unit 130 d can be configured as a flat panel type display. In the flat panel type display illustrated in FIG. 4 , a touch panel may be provided.
  • the configuration of the information processing system 100 a in which the input unit 110 a and the output unit 130 a are provided above the table 140 a , that is, the configuration in which the input unit 110 a and the output unit 130 a are provided to be separated from the information display surface, will be described as an example.
  • the information processing system 100 a , the input unit 110 a , and the output unit 130 a will also be referred to simply as the information processing system 100 , the input unit 110 , and the output unit 130 .
  • FIG. 5 is an explanatory diagram illustrating an example of the functional configuration of the information processing system according to the embodiment of the present disclosure.
  • the example of the functional configuration of the information processing system according to the embodiment of the present disclosure will be described with reference to FIG. 5 .
  • the information processing system 100 is configured to include an input unit 110 , a control unit 120 , and an output unit 130 .
  • the input unit 110 inputs manipulation content on the information processing system 100 from a user using the information processing system 100 or the shape or design of an object placed on a surface (for example, the table 140 a illustrated in FIG. 1 ) through which information is output by the output unit 130 .
  • the manipulation content on the information processing system 100 from the user using the information processing system 100 includes manipulation content on a GUI output to an information display surface by the information processing system 100 .
  • Information regarding the shape or design of an object or the manipulation content on the information processing system 100 input by the input unit 110 is transmitted to the control unit 120 .
  • the input unit 110 can be configured of, for example, a camera configured of one lens, a stereo camera configured of two lenses, or a microphone.
  • the control unit 120 executes control on each unit of the information processing system 100 .
  • the control unit 120 generates information to be output from the output unit 130 using information input by the input unit 110 .
  • the control unit 120 is configured to include a detection unit 121 and an output control unit 122 .
  • the detection unit 121 executes a process of detecting manipulation content on the information processing system 100 by the user using the information processing system 100 , content of information output by the output unit 130 , or the shape or design of an object placed on a surface (for example, the table 140 a illustrated in FIG. 1 ) through which information is output by the output unit 130 .
  • the content detected by the detection unit 121 is transmitted to the output control unit 122 .
  • the output control unit 122 executes control such that the information output from the output unit 130 is generated.
  • the information generated by the output control unit 122 is transmitted to the output unit 130 .
  • coordinates on the information display surface are calibrated in advance to match touch coordinates of a manipulator such as a hand of the user on the display surface, and thus the detection unit 121 can detect the portion of a GUI which is touched by the manipulator such as a hand of the user.
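  • The calibration described above can be pictured as a precomputed mapping from camera coordinates to display coordinates followed by a hit test against the GUI layout. The 3×3 homography matrix and the GUI rectangles below are illustrative assumptions, not values from the disclosure.

```python
# Minimal sketch: mapping a touch point detected in camera coordinates onto
# display coordinates via a precomputed calibration (here a 3x3 homography),
# then hit-testing the mapped point against GUI element rectangles.

from typing import Optional

# Assumed calibration result; in practice it would be estimated from pairs of
# corresponding camera/display points collected during calibration.
HOMOGRAPHY = [
    [1.02, 0.01, -15.0],
    [0.00, 1.01, -8.0],
    [0.00, 0.00, 1.0],
]

GUI_ELEMENTS = {  # element name -> (x, y, width, height) in display coordinates
    "menu_button_1100": (20, 20, 60, 60),
    "window_close": (900, 20, 40, 40),
}

def camera_to_display(x: float, y: float) -> tuple:
    h = HOMOGRAPHY
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)

def hit_test(x: float, y: float) -> Optional[str]:
    """Return the name of the GUI element touched at camera point (x, y), if any."""
    dx, dy = camera_to_display(x, y)
    for name, (ex, ey, ew, eh) in GUI_ELEMENTS.items():
        if ex <= dx <= ex + ew and ey <= dy <= ey + eh:
            return name
    return None

if __name__ == "__main__":
    print(hit_test(45.0, 50.0))  # maps near the menu button rectangle
```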
  • the control unit 120 may be configured of, for example, a central processing unit (CPU).
  • When the control unit 120 is configured of a device such as a CPU, the device can be configured of an electronic circuit.
  • the control unit 120 may have a communication function of executing wireless communication with another device or a function of controlling an operation of another device connected to the information processing system 100 , for example, an illumination device.
  • the output unit 130 outputs information according to information regarding manipulation content input through the input unit 110 by the user using the information processing system 100 , content of information output by the output unit 130 , and the shape or design of an object placed on a surface (for example, the table 140 a illustrated in FIG. 1 ) through which information is output by the output unit 130 .
  • the output unit 130 outputs information based on information generated by the output control unit 122 .
  • the information output by the output unit 130 includes information to be displayed on the information display surface, a sound to be output from a speaker (not illustrated), or the like.
  • the information processing system 100 illustrated in FIG. 5 may be configured as a single device or may be configured partially or entirely of another device.
  • the control unit 120 may be included in a device such as a server connected to the input unit 110 and the output unit 130 via a network or the like.
  • the control unit 120 executes a process on the information from the input unit 110 , and information to be output by the output unit 130 is transmitted from the device such as a server to the output unit 130 via the network or the like.
  • FIG. 6 is an explanatory diagram illustrating an example of a manipulation situation of the information processing system 100 according to an embodiment of the present disclosure.
  • the information processing system 100 according to the embodiment of the present disclosure is, for example, a system configured for a plurality of users to independently execute applications on the same screen displayed on the table 140 a .
  • a graphical user interface (GUI) of an application illustrated in FIG. 6 is generated by the output control unit 122 and is output by the output unit 130 .
  • Reference numerals 1100 illustrated in FIG. 6 denote menu buttons used to manipulate an application.
  • the information processing system 100 acquires manipulation content from the user on the GUI of an application output to the information display surface by the output unit 130 using the input unit 110 .
  • the information processing system 100 allows a user to touch the display surface with a manipulator such as his or her hand or move the manipulator with which he or she is touching the display surface on the display surface and receives a manipulation on the GUI of the application output to the information display surface by the output unit 130 .
  • FIGS. 7 and 8 are explanatory diagrams illustrating examples of GUIs of applications displayed by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 7 illustrates an example of a GUI in which buttons are disposed in a fan form centering on a corner (a left corner in the example of FIG. 7 ) of a window of an application.
  • FIG. 8 illustrates an example of a GUI in which buttons are disposed along one side (a lower side in the example of FIG. 8 ) of a window of an application.
  • Reference numeral 1100 illustrated in FIG. 7 denotes a menu button used to manipulate the application.
  • Reference numeral 1110 illustrated in FIG. 7 denotes a menu button group displayed when the user touches the menu button denoted by reference numeral 1100 or displayed initially, and used to manipulate the application.
  • Reference numeral 1100 ′ illustrated in FIG. 8 denotes a menu button used to manipulate the application.
  • Reference numeral 1110 ′ illustrated in FIG. 8 denotes a menu button group displayed when the user touches the menu button denoted by reference numeral 1100 ′ or displayed initially and used to manipulate the application.
  • the information processing system 100 tracks the manipulation from the user and displays the menu button group 1110 so that the menu button group 1110 is rotated about the menu button 1100 .
  • FIGS. 9 and 10 are explanatory diagrams illustrating examples of GUIs of applications displayed by the information processing system 100 according to an embodiment of the present disclosure and are explanatory diagrams illustrating states in which a plurality of windows are displayed.
  • FIGS. 9 and 10 illustrate forms in which global menus which are menus used for users to activate applications executed by the information processing system 100 and local menus which are menus used for the users to manipulate the activated applications are displayed.
  • FIG. 9 illustrates a display example of a button format in which the global menus and the local menus are displayed in fan forms.
  • FIG. 10 illustrates a display example in which the global menus and the local menus are displayed in bar forms.
  • In FIG. 9 , the global menus are a menu button 1100 a and a menu button group 1110 a , and the local menus are menu buttons 1100 b and menu button groups 1110 b .
  • In FIG. 10 , the global menus are a menu button 1100 ′ and a menu button group 1110 ′.
  • FIGS. 11 to 13 are explanatory diagrams illustrating examples of GUIs of applications displayed by the information processing system 100 according to an embodiment of the present disclosure.
  • In FIG. 11 , reference numeral A denotes an example of a button icon 1101 used to activate a camera application, reference numeral B denotes an example of a button icon 1101 used to read data managed by the information processing system 100 , and reference numeral C denotes an example of a button icon 1101 representing a folder.
  • FIG. 12 is an explanatory diagram illustrating a display example when the user of the information processing system 100 selects the button icon 1101 used to activate the camera application denoted by reference numeral A in FIG. 11 .
  • FIG. 13 is an explanatory diagram illustrating a display example when the user of the information processing system 100 selects the menu button 1100 . In the example illustrated in FIG. 13 , when the menu button 1100 is selected by the user, the menu button group 1110 used to execute a function belonging to the menu button 1100 is displayed.
  • various problems may occur according to the position of another window or a state of the information display surface, for example, a state of an object placed on the table 140 a illustrated in FIG. 1 .
  • For example, when the position of a window protrudes outside a display region, there may be an unselectable menu.
  • When a window is covered with another window and the user does not execute a manipulation of moving the window to the forefront, there may be an unselectable menu.
  • In the information processing system 100 according to the embodiment of the present disclosure, the user can manipulate the menu from various directions.
  • the user may be located away from the menu according to the position or direction of the user, and thus it may be hard for the user to touch the menu.
  • a menu may overlap a location in which an object is placed on the table 140 a , and thus the user may not manipulate the menu.
  • the information processing system 100 detects the position of another window or the state of the information display surface and controls the position of a menu based on the detection result. Specifically, the information processing system 100 according to the embodiment of the present disclosure detects, for example, a state of an object placed on the table 140 a illustrated in FIG. 1 or the table 140 b illustrated in FIG. 2 and controls the position of a menu based on the detection result. By detecting the state of the information display surface and controlling the position of a menu based on the detection result, the information processing system 100 according to the embodiment of the present disclosure can display information appropriately and efficiently according to an environment in which information is displayed.
  • a method of controlling the position of a menu by the information processing system 100 according to the embodiment of the present disclosure will be described.
  • FIGS. 14 to 16 are flowcharts illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIGS. 14 to 16 illustrate an example of an operation of the information processing system 100 when the information processing system 100 detects the position of another window or a state of the information display surface and controls the position of a menu based on the detection result.
  • the example of the operation of the information processing system 100 according to the embodiment of the present disclosure will be described with reference to FIGS. 14 to 16 .
  • When the user of the information processing system 100 executes a predetermined manipulation to display a menu, the information processing system 100 sets a menu movement destination at which the menu is displayed to a current menu position (step S 1001 ).
  • the process of step S 1001 is executed by, for example, the output control unit 122 .
  • the information processing system 100 subsequently determines whether the state of a window displayed according to the manipulation executed by the user is related to the menu position (step S 1002 ). This determination is executed by, for example, the detection unit 121 .
  • Specifically, in step S 1002 , the information processing system 100 determines whether the window is maximized. When the window is maximized, the information processing system 100 determines that the state of the window is related to the menu position. The fact that the window is maximized means that the window is displayed in the maximum range which can be displayed by the output unit 130 .
  • When it is determined in step S 1002 that the state of the window displayed according to a manipulation executed by the user is related to the menu position (Yes in step S 1002 ), that is, when the window is maximized, the information processing system 100 subsequently executes a process of applying an offset to the menu position set in step S 1001 according to the state of the window (step S 1003 ). That is, the information processing system 100 assigns the offset to the menu position set in step S 1001 so that the menu position comes near the inside of the window by a predetermined amount.
  • the process of step S 1003 is executed by, for example, the output control unit 122 .
  • FIG. 17 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure and illustrates a state in which a window of the application displayed by the information processing system 100 is maximized. Specifically, since the window is in a maximized state, the process of moving the position of the menu button 1100 to the inside of the window by a predetermined amount is executed in step S 1003 by the information processing system 100 , as illustrated in FIG. 17 .
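  • A minimal sketch of the offset of step S 1003 follows, assuming the menu button is anchored to a window corner; the margin value is an arbitrary illustrative choice, not a value from the disclosure.

```python
# Minimal sketch of step S1003: when the window is maximized, push the menu
# position inward from the window corner by a fixed margin so that the menu
# button does not sit at the very edge of the displayable area.

INWARD_MARGIN = 40  # assumed offset in pixels

def offset_menu_position(menu_x, menu_y, window, maximized):
    """window = (x, y, width, height); returns the adjusted menu position."""
    if not maximized:
        return menu_x, menu_y
    wx, wy, ww, wh = window
    center_x, center_y = wx + ww / 2, wy + wh / 2
    # Move the menu toward the window center by the margin on each axis.
    new_x = menu_x + INWARD_MARGIN if menu_x < center_x else menu_x - INWARD_MARGIN
    new_y = menu_y + INWARD_MARGIN if menu_y < center_y else menu_y - INWARD_MARGIN
    return new_x, new_y

if __name__ == "__main__":
    print(offset_menu_position(0, 0, (0, 0, 1920, 1080), maximized=True))  # (40, 40)
```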
  • Conversely, when it is determined in step S 1002 that the state of the window displayed according to a manipulation executed by the user is not related to the menu position (No in step S 1002 ), that is, when the window is not maximized, the information processing system 100 skips the process of step S 1003 .
  • the information processing system 100 determines whether the menu movement destination set in step S 1001 is inside a screen, that is, inside a screen which can be displayed by the output unit 130 (step S 1004 ). This determination is executed by, for example, the detection unit 121 .
  • FIG. 18 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure and is an explanatory diagram illustrating whether a menu is inside a screen.
  • a circle indicated by a broken line is an example of the menu movement destination when the menu movement destination of the menu button 1100 is outside the screen, that is, outside the screen which can be displayed by the output unit 130 .
  • a circle indicated by a solid line is an example of the menu movement destination when the menu movement destination of the menu button 1100 is inside the screen, that is, inside the screen which can be displayed by the output unit 130 .
  • When the menu movement destination set in step S 1001 is inside the screen (Yes in step S 1004 ), the information processing system 100 subsequently determines whether the menu movement destination set in step S 1001 is covered with another window displayed by the information processing system 100 (step S 1005 ). This determination is executed by, for example, the detection unit 121 .
  • FIG. 19 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure and is an explanatory diagram illustrating whether a menu button 1100 is covered with another window displayed by the information processing system 100 .
  • a circle indicated by a broken line is an example of the menu movement destination when the menu button 1100 is covered with the other window.
  • a circle indicated by a solid line is an example of the menu movement destination when the menu button 1100 is not covered with the other window.
  • When the menu movement destination set in step S 1001 is not covered with another window (No in step S 1005 ), the information processing system 100 subsequently determines whether the menu movement destination set in step S 1001 is located at a proper position according to the position of the user or a manipulation direction of the user (step S 1006 ). Specifically, the information processing system 100 determines whether the menu movement destination set in step S 1001 is located at the proper position according to the position of the user or the manipulation direction of the user by comparing the menu movement destination set in step S 1001 with the position of the user or the manipulation direction of the user. This determination is executed by, for example, the detection unit 121 .
  • FIG. 20 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure and is an explanatory diagram illustrating whether the menu movement destination set in step S 1001 is located at a proper position according to the position of the user or the manipulation direction of the user.
  • a circle indicated by a broken line is an example of a case in which the movement destination of the menu button 1100 is not located at the proper position according to the position of the user or the manipulation direction (a direction from the lower side to the upper side of a screen) of the user since the movement destination of the menu button 1100 is away from the position of the user.
  • a circle indicated by a solid line is an example of a case in which the movement destination of the menu button 1100 is located at the proper position according to the position of the user or the manipulation direction (a direction from the lower side to the upper side of the screen) of the user since the movement destination of the menu button 1100 is close to the position of the user.
  • When it is determined in step S 1006 that the menu movement destination set in step S 1001 is located at the proper position according to the position of the user or the manipulation direction of the user (Yes in step S 1006 ), the information processing system 100 subsequently determines whether the menu movement destination set in step S 1001 interferes with an object placed on the information display surface displayed by the information processing system 100 (step S 1007 ). This determination is executed by, for example, the detection unit 121 .
  • An example of the information display surface displayed by the information processing system 100 includes the top surface of the table 140 a illustrated in FIG. 1 .
  • the fact that the menu movement destination interferes with the object means that the menu movement destination overlaps at least a part of the object.
  • FIG. 21 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure and is an explanatory diagram illustrating whether the movement destination of the menu button 1100 set in step S 1001 interferes with the object 1200 placed on the information display surface.
  • a circle indicated by a broken line is an example of a case in which the movement destination of the menu button 1100 interferes with the object 1200 placed on the information display surface.
  • a circle indicated by a solid line is an example of a case in which the movement destination of the menu button 1100 does not interfere with the object 1200 placed on the information display surface.
  • the detection unit 121 may determine that the movement destination of the menu button 1100 uniformly interferes with the object 1200 when the movement destination of the menu button 1100 overlaps the object 1200 placed on the information display surface as in the circle indicated by the broken line. The detection unit 121 may determine that the menu movement destination does not interfere with the object placed on the information display surface when the movement destination of the menu button 1100 overlaps the object 1200 placed on the information display surface and the movement destination of the menu button 1100 is located on the flat surface of the object 1200 .
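  • The checks of steps S 1004 to S 1007 described above can be pictured as a single validation routine. The data structures below (rectangles for windows and objects, a point for the user, a distance threshold standing in for the reachability judgment of step S 1006 ) are assumptions made for illustration; the flat-top exception follows the description of FIG. 21 .

```python
# Minimal sketch of the checks in steps S1004-S1007 for one menu movement
# destination. Rectangles are (x, y, width, height); positions are (x, y).

def inside(point, rect):
    x, y = point
    rx, ry, rw, rh = rect
    return rx <= x <= rx + rw and ry <= y <= ry + rh

def candidate_is_valid(candidate, screen, other_windows, user_pos,
                       objects, max_user_distance=600.0):
    # Step S1004: the destination must be inside the displayable screen.
    if not inside(candidate, screen):
        return False
    # Step S1005: the destination must not be covered by another window.
    if any(inside(candidate, w) for w in other_windows):
        return False
    # Step S1006: the destination should be reachable from the user's position
    # (approximated here by a simple distance threshold).
    dx, dy = candidate[0] - user_pos[0], candidate[1] - user_pos[1]
    if (dx * dx + dy * dy) ** 0.5 > max_user_distance:
        return False
    # Step S1007: the destination must not interfere with a real object,
    # except when it falls on an object's flat top surface.
    for rect, has_flat_top in objects:
        if inside(candidate, rect) and not has_flat_top:
            return False
    return True

if __name__ == "__main__":
    screen = (0, 0, 1920, 1080)
    print(candidate_is_valid((100, 900), screen, other_windows=[],
                             user_pos=(200, 1050),
                             objects=[((80, 880, 200, 150), True)]))
```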
  • When the menu movement destination set in step S 1001 does not interfere with the object placed on the information display surface (Yes in step S 1007 ), the information processing system 100 moves a menu called by the user to the menu movement destination set in step S 1001 (step S 1008 ).
  • the process of step S 1008 is executed by, for example, the output control unit 122 .
  • Conversely, when the menu movement destination set in step S 1001 is determined to be unsuitable in any of steps S 1004 to S 1007 , the information processing system 100 subsequently determines whether all of the menu movement destinations have been examined (step S 1009 ).
  • the determination of whether all of the menu movement destinations are examined is executed by, for example, the detection unit 121 .
  • When not all of the menu movement destinations have been examined (No in step S 1009 ), the information processing system 100 subsequently determines whether the position of the user is confirmed (step S 1010 ).
  • the determination of whether the position of the user is confirmed is executed by, for example, the detection unit 121 .
  • In step S 1010 , it is determined whether the position of the user is confirmed through, for example, recognition of the body, face, head, or the like of the user by a camera or recognition of the direction of a sound by a microphone.
  • When the position of the user is confirmed (Yes in step S 1010 ), the information processing system 100 subsequently sets the menu movement destination to an unexamined position closest to the position of the user (step S 1011 ).
  • the process of step S 1011 is executed by, for example, the output control unit 122 .
  • the information processing system 100 subsequently executes the determinations of the foregoing steps S 1004 to S 1007 again.
  • When the menu has the button format illustrated in FIG. 7 , the menu movement destination is set to the unexamined position closest to the position of the user among the four corners of the window. When the menu has the bar format illustrated in FIG. 8 , the menu movement destination is set to the unexamined position closest to the position of the user among the four sides of the window.
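  • The choice of candidate positions described above (window corners for the button format, window sides for the bar format) might be enumerated and ordered as in the following sketch; ordering the unexamined candidates by distance to the user corresponds to step S 1011 . The coordinate conventions are assumptions for illustration.

```python
# Minimal sketch: enumerating menu movement candidates for a window and
# ordering them by distance to the user (step S1011). The window is given as
# (x, y, width, height); side candidates are represented by side midpoints.

from math import hypot

def corner_candidates(window):
    x, y, w, h = window
    return [(x, y), (x + w, y), (x, y + h), (x + w, y + h)]

def side_candidates(window):
    x, y, w, h = window
    return [(x + w / 2, y), (x + w / 2, y + h), (x, y + h / 2), (x + w, y + h / 2)]

def ordered_candidates(window, user_pos, menu_format="button", examined=()):
    """Return unexamined candidates, closest to the user first."""
    if menu_format == "button":
        candidates = corner_candidates(window)   # four corners (FIG. 7 style)
    else:
        candidates = side_candidates(window)     # four sides (FIG. 8 style)
    remaining = [c for c in candidates if c not in set(examined)]
    return sorted(remaining,
                  key=lambda c: hypot(c[0] - user_pos[0], c[1] - user_pos[1]))

if __name__ == "__main__":
    window = (100, 100, 800, 500)
    print(ordered_candidates(window, user_pos=(150, 700))[0])  # lower left corner
```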
  • FIG. 22 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • a circle indicated by a solid line is an example of a movement destination (initial menu position) of the menu button 1100 set in the process of step S 1001 and circles indicated by broken lines are examples of movement destination candidates of the menu button 1100 .
  • In the example illustrated in FIG. 22 , the corner closest to the position of the user is the lower left corner of the window, the second closest corner is the lower right corner of the window, and the third closest corner (excluding the initial menu position) is the upper right corner of the window. Accordingly, the information processing system 100 first sets the movement destination of the menu button 1100 to the lower left corner of the window, which is the unexamined position closest to the position of the user.
  • When the position of the user is not confirmed (No in step S 1010 ), the information processing system 100 subsequently determines whether an object frequently used by the user is recognized on the information display surface (step S 1012 ).
  • the recognition of the object frequently used by the user on the information display surface is executed by, for example, the detection unit 121 .
  • the object frequently used by the user may be any object such as a mobile phone, a smartphone, a tablet terminal, a key, a book, a newspaper, a magazine, tableware, or a toy.
  • the information processing system 100 may determine whether there is an object frequently used by the user by recognizing an object placed on the information display surface and comparing the object recognized in advance to the object placed on the information display surface at a time point at which the menu is to be displayed.
  • the information processing system 100 can store a history of objects placed on the information display surface by maintaining information acquired by the input unit 110 . It is needless to say that the history of the objects placed on the information display surface may be stored in another device connected to the information processing system 100 via a network or the like.
  • the information processing system 100 may determine, for example, whether the object is placed on the information display surface with more than a predetermined frequency or may determine, for example, whether the object is an object registered as the object frequently used by the user.
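  • The frequency check described above could be realized, for example, by keeping a placement count per recognized object alongside an explicit registration list; the identifiers, counts, and threshold below are illustrative assumptions.

```python
# Minimal sketch: deciding whether a recognized object counts as "frequently
# used" based on a stored placement history and an explicit registration list.

from collections import Counter

PLACEMENT_HISTORY = Counter({"smartphone_A": 12, "coffee_cup": 3})  # assumed history
REGISTERED_FREQUENT_OBJECTS = {"tablet_B"}                          # assumed registrations
FREQUENCY_THRESHOLD = 5                                             # assumed threshold

def is_frequently_used(object_id: str) -> bool:
    if object_id in REGISTERED_FREQUENT_OBJECTS:
        return True
    return PLACEMENT_HISTORY[object_id] >= FREQUENCY_THRESHOLD

if __name__ == "__main__":
    print(is_frequently_used("smartphone_A"), is_frequently_used("coffee_cup"))
```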
  • When it is determined in the foregoing step S 1012 that the object frequently used by the user is recognized on the information display surface (Yes in step S 1012 ), the information processing system 100 subsequently sets the menu movement destination to the position which is the closest to the position of the object frequently used by the user and is not examined (step S 1013 ).
  • the process of step S 1013 is executed by, for example, the output control unit 122 .
  • FIG. 23 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • a circle indicated by a solid line is an example of a movement destination (initial menu position) of the menu button 1100 set in the process of step S 1001 and circles indicated by broken lines are examples of movement destination candidates of the menu button 1100 .
  • In the example illustrated in FIG. 23 , the corner closest to the position of the object frequently used by the user is the lower left corner of the window, the second closest corner is the lower right corner of the window, and the third closest corner (excluding the initial menu position) is the upper right corner of the window. Accordingly, the information processing system 100 first sets the movement destination of the menu button 1100 to the lower left corner of the window, which is the unexamined position closest to the position of the object frequently used by the user.
  • When it is determined in step S 1012 that the object frequently used by the user is not recognized on the information display surface (No in step S 1012 ), the information processing system 100 subsequently determines whether the menu movement destination can be decided using a manipulation history of the user (step S 1014 ). Whether the menu movement destination can be decided using the manipulation history of the user is determined by, for example, the detection unit 121 .
  • the information processing system 100 can store the manipulation history of the user by maintaining information regarding user manipulations acquired by the input unit 110 . It is needless to say that the manipulation history of the user may be stored in another device connected to the information processing system 100 via a network or the like.
  • When it is determined in the foregoing step S 1014 that the menu movement destination can be decided using the manipulation history of the user (Yes in step S 1014 ), the information processing system 100 subsequently sets the menu movement destination to an unexamined position which is frequently manipulated by the user (step S 1015 ).
  • the process of step S 1015 is executed by, for example, the output control unit 122 .
  • FIG. 24 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • a circle indicated by a solid line is an example of the movement destination (initial menu position) of the menu button 1100 set in the process of step S 1001 .
  • Circles indicated by broken lines are examples of candidates of the movement destination of the menu button 1100 .
  • In the example illustrated in FIG. 24 , the position most frequently manipulated by the user is the lower right corner of the window, the position second most frequently manipulated by the user is the lower left corner of the window, and the position third most frequently manipulated by the user (excluding the initial menu position) is the upper right corner of the window. Accordingly, the information processing system 100 first sets the movement destination of the menu button 1100 to the lower right corner of the window, which is the unexamined position most frequently manipulated by the user.
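  • Step S 1015 can be pictured as picking, from the unexamined corners, the one with the highest count in the stored manipulation history; the per-corner counts below are illustrative assumptions.

```python
# Minimal sketch of step S1015: choose the unexamined window corner that the
# user has manipulated most often according to the stored manipulation history.

MANIPULATION_COUNTS = {   # assumed per-corner manipulation history
    "lower_right": 42,
    "lower_left": 30,
    "upper_right": 12,
    "upper_left": 4,
}

def most_frequent_unexamined(examined: set) -> str:
    remaining = {k: v for k, v in MANIPULATION_COUNTS.items() if k not in examined}
    if not remaining:
        raise ValueError("all corners have been examined")
    return max(remaining, key=remaining.get)

if __name__ == "__main__":
    print(most_frequent_unexamined(examined={"upper_left"}))  # "lower_right"
```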
  • Conversely, when it is determined in the foregoing step S 1014 that the menu movement destination cannot be decided using the manipulation history of the user (No in step S 1014 ), the information processing system 100 subsequently sets the menu movement destination to the unexamined position closest to the original menu position (step S 1016 ).
  • the process of step S 1016 is executed by, for example, the output control unit 122 .
  • FIG. 25 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • a circle indicated by a solid line is an example of the movement destination (initial menu position) of the menu button 1100 set in the process of step S 1001 .
  • Circles indicated by broken lines are examples of candidates of the menu movement destination.
  • In the example illustrated in FIG. 25 , the corner closest to the initial menu position is the lower left corner of the window, the second closest corner is the upper left corner of the window, and the third closest corner is the lower right corner of the window. Accordingly, in the example illustrated in FIG. 25 , the information processing system 100 first sets the movement destination of the menu button 1100 to the lower left corner of the window, which is the unexamined position closest to the original menu position.
  • When all of the menu movement destinations have been examined (Yes in step S 1009 ), the information processing system 100 subsequently determines whether there is a position to which the menu can be moved at any position inside the window displayed by the information processing system 100 (step S 1017 ).
  • the process of step S 1017 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1017 that there is a position to which the menu can be moved inside the window (Yes in step S 1017 ), the information processing system 100 sets the menu movement destination to a position which is inside the window displayed on the screen and closest to the initial position, even though the position does not satisfy the above-described conditions (step S 1018 ).
  • the process of step S 1018 is executed by, for example, the output control unit 122 .
  • FIG. 26 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 26 illustrates a setting example of the movement destination of the menu button 1100 when none of the four corners of the window is suitable for the menu movement destination. That is, FIG. 26 illustrates a state in which the upper left corner and the lower left corner of the window protrude outside the screen, the upper right corner interferes with an object placed on the information display surface, and the lower right corner is covered with another window.
  • the information processing system 100 decides a certain position closest to the initial position (the lower left corner of the window in the example of FIG. 26 ) inside the window displayed on the screen as the movement destination of the menu button 1100 and sets the movement destination of the menu button 1100 to this position.
  • Conversely, when it is determined in the foregoing step S 1017 that there is no position to which the menu can be moved inside the window (No in step S 1017 ), the information processing system 100 subsequently determines whether there is only one window inside the screen (step S 1019 ).
  • the process of step S 1019 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1019 that there is only one window inside the screen (Yes in step S 1019 ), the information processing system 100 sets the menu movement destination to a position which is outside the window displayed on the screen and closest to the initial position, even though the position does not satisfy the above-described conditions, since there is no concern of confusion with a menu of another window (step S 1020 ).
  • the process of step S 1020 is executed by, for example, the output control unit 122 .
  • Conversely, when it is determined in the foregoing step S 1019 that there are a plurality of windows inside the screen (No in step S 1019 ), the information processing system 100 ends the process without changing the menu movement destination.
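  • Putting the preceding steps together, the overall control of FIGS. 14 to 16 could be organized as the following loop, which tries candidate destinations in the priority order described above (position of the user, position of a frequently used object, manipulation history, distance from the original position) and hands control back to the caller when every candidate has been examined, corresponding to the fallbacks of steps S 1017 to S 1020 . The helper predicates are placeholders standing in for the checks sketched earlier, not part of the disclosure.

```python
# Minimal sketch of the overall flow of FIGS. 14-16: examine candidate menu
# destinations in priority order and report failure when all are examined so
# that the caller can apply the fallbacks of steps S1017-S1020.

def choose_menu_destination(initial_pos, candidates, is_valid,
                            user_pos=None, frequent_object_pos=None,
                            manipulation_rank=None):
    def distance(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    examined = []
    destination = initial_pos                      # step S1001
    while True:
        if is_valid(destination):                  # steps S1004-S1007
            return destination                     # step S1008
        examined.append(destination)
        remaining = [c for c in candidates if c not in examined]
        if not remaining:                          # step S1009: all examined
            return None                            # caller falls back to S1017-S1020
        if user_pos is not None:                   # steps S1010-S1011
            destination = min(remaining, key=lambda c: distance(c, user_pos))
        elif frequent_object_pos is not None:      # steps S1012-S1013
            destination = min(remaining, key=lambda c: distance(c, frequent_object_pos))
        elif manipulation_rank is not None:        # steps S1014-S1015
            destination = max(remaining, key=lambda c: manipulation_rank.get(c, 0))
        else:                                      # step S1016
            destination = min(remaining, key=lambda c: distance(c, initial_pos))

if __name__ == "__main__":
    corners = [(0, 0), (800, 0), (0, 600), (800, 600)]
    is_free = lambda p: p != (0, 0)                # pretend the initial corner is blocked
    print(choose_menu_destination((0, 0), corners, is_free, user_pos=(100, 650)))
```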
  • FIG. 27 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 27 illustrates a setting example of the movement destination of the menu button 1100 when only one window is displayed on the screen and the entire window displayed by the information processing system 100 is put on an object 1220 .
  • the information processing system 100 decides a certain position closest to the initial position (the lower left corner of the window in the example of FIG. 27 ) outside the window displayed on the screen as the movement destination of the menu button 1100 and sets the movement destination of the menu button 1100 to this position.
  • FIG. 28 is an explanatory diagram illustrating a menu display control example in the information processing system 100 d in which a screen is perpendicular to the ground.
  • FIG. 28 illustrates a state in which four windows are displayed on the screen and, by executing the above-described series of processes, the windows are displayed so that the menu bars subordinate to the windows do not overlap each other in any of the windows.
  • FIG. 29 is an explanatory diagram illustrating a menu display control example in the information processing system 100 d in which a screen is perpendicular to the ground, as in FIG. 28 .
  • FIG. 29 illustrates a state in which four windows are displayed on the screen and, by the above-described series of processes, the windows are displayed so that the menu bars subordinate to the windows do not overlap each other in any of the windows.
  • FIG. 30 is an explanatory diagram illustrating a menu display control example in the information processing system 100 c in which a display is placed on a table for a manipulation.
  • FIG. 30 illustrates an example in which display control is executed such that the menu button 1100 is automatically moved to a position close to a user (assumed to be user Y) facing a certain user (assumed to be user X) so that user Y can easily manipulate the menu button 1100 when user Y stretches out his or her hand to manipulate a window manipulated by user X.
  • the information processing system 100 c can detect the position in which the user is located with respect to a screen and a direction in which the user executes a manipulation. Accordingly, by executing the above-described series of processes, the information processing system 100 c can execute display control such that the menu button 1100 is automatically moved to a position close to user Y so that user Y can easily manipulate the menu button 1100 .
  • FIG. 31 is an explanatory diagram illustrating a menu display control example in the information processing system 100 a which projects a menu or a window to a table and allows the projected menu or window to be manipulated.
  • FIG. 31 exemplifies a case in which the information processing system 100 a is used on a dining table. In a location such as a dining table on which objects are likely to be placed, a case in which a menu is projected onto a real object is likely to increase. Thus, a case in which the user cannot directly touch the menu, or a case in which touching the menu places a large psychological burden on the user, easily occurs.
  • FIG. 31 illustrates a state in which a piece of cake 1201 or a cup of coffee 1202 is placed on a surface on which information is displayed by the information processing system 100 a.
  • When the user moves a real object on the dining table onto which a menu is projected to a location onto which the menu is not projected, the user can manipulate the projected menu.
  • Likewise, when the user executes a manipulation of moving the projected window to a position at which the menu is not projected onto a real object, the user can manipulate the projected menu.
  • However, either manipulation places a large burden on the user.
  • the information processing system 100 a automatically changes the display position of the menu button 1100 so that the display position does not overlap the position of a real object (the piece of cake 1201 or the cup of coffee 1202 ) on the dining table, as in FIG. 31 .
  • the information processing system 100 a can reduce the manipulation burden on the user by automatically changing the display position of the menu button 1100 so that the display position does not overlap the position of the real object on the dining table.
  • the information processing system 100 can detect the position of another window or the state of the information display surface, for example, the state of an object placed on the table 140 a illustrated in FIG. 1 , and can execute control such that the position of the menu is moved to a proper position based on the detection result.
  • the information processing system 100 executes the above-described series of processes so that the user can manipulate the menu without necessarily executing a step of moving the position of the window or moving the real object placed on the information display surface. Accordingly, the information processing system 100 according to the embodiment of the present disclosure executes the above-described series of processes, and thus the number of steps and a time until the user executes an intended manipulation are reduced.
  • the information processing system 100 executes the above-described series of processes, and thus it is possible to reduce effort in the user manipulation on a window pushed outside the screen in a GUI in which there is a possibility of the window frequently moving outside the screen and which includes the window which can be omnidirectionally manipulated. Since the effort of the user manipulation on the window pushed outside the screen is reduced, the information processing system 100 according to the embodiment of the present disclosure enables the user to use the screen broadly.
  • the information processing system 100 controls the display position of the menu such that the menu can be viewed normally, and thus it is possible to obtain the advantage that the user can easily specify an intended application.
  • Even in the form in which the information processing system 100 according to the embodiment of the present disclosure projects a screen as illustrated in FIG. 1 , a manipulation is not hindered by a real object. Accordingly, it is possible to obtain the advantage that the information processing system 100 according to the embodiment of the present disclosure can reduce the burden on the user of having to move a real object, move the position of a window, or avoid placing a real object in the projected location in order to execute a manipulation.
  • the information processing system 100 can be linked to a portable terminal such as a smartphone on the table.
  • For example, when the portable terminal placed on the table is recognized, the information processing system 100 can identify the recognized portable terminal and be linked to the identified portable terminal.
  • However, in some cases the information processing system 100 may not be able to determine which portable terminal it should be linked to.
  • the information processing system 100 capable of easily specifying a portable terminal to be linked even when a plurality of users own portable terminals that are substantially the same and place the portable terminals on a table simultaneously and individually will be described.
  • the detection unit 121 identifies a portable terminal to be linked using an image recognition technology and detects the position and posture of the identified portable terminal and a distance from the input unit 110 . Accordingly, the information processing system 100 according to the embodiment of the present disclosure needs feature amount data necessary to identify the portable terminal.
  • In addition, the portable terminal to be recognized needs to have image data registered in the information processing system 100 in advance.
  • To that end, the following techniques are considered. For example, there is a method in which the owner of each portable terminal selects a preferred image and causes the information processing system 100 to recognize this image in advance. After the image is recognized in advance, the owner of the portable terminal causes his or her portable terminal to display the image recognized in advance and causes the information processing system 100 to recognize the image. In this way, the information processing system 100 and the portable terminal can be linked.
  • As another method, the information processing system 100 is caused to recognize a screen, such as a lock screen or a home screen, generated by the system of the portable terminal as a recognition target image.
  • When the screen generated by the system of the portable terminal is used, the screen may be recognized through a dedicated application, or the user may capture the screen by himself or herself and cause the information processing system 100 to recognize the captured image.
  • FIGS. 32 and 33 are flowcharts illustrating examples of operations of the information processing system 100 according to an embodiment of the present disclosure and a portable terminal linked to the information processing system 100 .
  • FIG. 32 illustrates the example of the operation of the portable terminal linked to the information processing system 100
  • FIG. 33 illustrates the example of the operation of the information processing system 100 .
  • The portable terminal is assumed to have registered an image in the information processing system 100 in advance.
  • Hereinafter, the examples of the operations of the information processing system 100 according to the embodiment of the present disclosure and the portable terminal linked to the information processing system 100 will be described with reference to FIGS. 32 and 33 .
  • the portable terminal linked to the information processing system 100 displays a recognition screen for causing the information processing system 100 to recognize the portable terminal according to a predetermined manipulation from the user (step S 1101 ).
  • the information processing system 100 causes a mode to proceed to a mode of recognizing the portable terminal according to a predetermined manipulation from the user (hereinafter also referred to as a “recognition mode”) (step S 1111 ).
  • the user places the portable terminal displaying the recognition screen for causing the information processing system 100 to recognize the portable terminal in the foregoing step S 1101 inside a recognizable area for causing the information processing system 100 to recognize the portable terminal (step S 1102 ).
  • As the recognizable area, any region can be set by the information processing system 100 .
  • For example, the entire area in which information is projected onto the table may be set as the recognizable area, or a predetermined partial region may be set as the recognizable area.
  • In that case, the information processing system 100 may output, from the output unit 130 , display that allows the user to comprehend the recognizable area.
  • the information processing system 100 subsequently retrieves a recognition image registered in the information processing system 100 (step S 1112 ).
  • the process of retrieving the recognition image is executed by, for example, the detection unit 121 .
  • the information processing system 100 may start the retrieval process of step S 1112 when the portable terminal displaying an image recognition screen is placed in the recognizable area, or may start the retrieval process before the portable terminal is placed in the recognizable area.
  • the information processing system 100 determines whether the registered image is discovered through the retrieval process of the foregoing step S 1112 (step S 1113 ). This determination process is executed by, for example, the detection unit 121 . When it is determined in step S 1113 that the registered image is not discovered (No in step S 1113 ), the information processing system 100 subsequently determines whether a given time has passed after the retrieval process starts (step S 1114 ). This determination process is executed by, for example, the detection unit 121 . When it is determined in step S 1114 that the given time has passed and the registered image is not discovered (Yes in step S 1114 ), the information processing system 100 ends the process and exits the recognition mode. Conversely, when it is determined in step S 1114 that the given time has not passed (No in step S 1114 ), the retrieval process of step S 1112 is executed again.
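  • Steps S 1112 to S 1114 amount to repeatedly searching the captured input for one of the registered recognition images until a time limit expires. The sketch below assumes hypothetical capture_frame and find_registered_image callables supplied by the surrounding system; the timeout value is also an assumption.

```python
# Minimal sketch of steps S1112-S1114: keep retrieving the registered image
# until it is discovered or a given time elapses. capture_frame() and
# find_registered_image() are hypothetical placeholders for the camera input
# and the image-matching routine of the surrounding system.

import time

RECOGNITION_TIMEOUT_SEC = 30.0   # assumed "given time"

def run_recognition_mode(capture_frame, find_registered_image):
    """Return the ID of the discovered registered image, or None on timeout."""
    started = time.monotonic()
    while True:
        frame = capture_frame()
        image_id = find_registered_image(frame)                   # step S1112
        if image_id is not None:                                  # step S1113
            return image_id
        if time.monotonic() - started > RECOGNITION_TIMEOUT_SEC:  # step S1114
            return None                                           # exit recognition mode
        time.sleep(0.1)                                           # avoid busy-waiting

if __name__ == "__main__":
    frames = iter([None, None, "registered_image_42"])
    print(run_recognition_mode(lambda: next(frames), lambda f: f))
```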
  • Conversely, when it is determined in step S 1113 that the registered image is discovered (Yes in step S 1113 ), the information processing system 100 subsequently displays an effect indicating that the registered image is discovered (step S 1115 ).
  • the display process of step S 1115 is executed by, for example, the output control unit 122 .
  • Any effect may be used as the effect indicating that the registered image is discovered.
  • the information processing system 100 executes, for example, display showing ripples spreading from the location in which the portable terminal is placed.
  • However, when the effect overlaps the image displayed by the portable terminal, the recognition process in the information processing system 100 may be affected. Therefore, the information processing system 100 preferably outputs the effect indicating that the registered image is discovered so that the effect does not overlap the portable terminal.
  • FIG. 34 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 34 illustrates an example of the effect indicating that the registered image is discovered and displayed by the information processing system 100 .
  • the information processing system 100 may execute display showing ripples indicated by reference numeral 1301 spreading from the location in which the portable terminal 1300 is placed.
  • Since the recognition in the information processing system 100 is affected by how the image displayed by the portable terminal appears to the input unit 110 , the user of the portable terminal may adjust the luminance of the display of the portable terminal so that the image can be easily recognized by the information processing system 100 .
  • the information processing system 100 subsequently determines whether an application currently executed in the information processing system 100 is an application for which it is necessary to continuously recognize the image (step S 1116 ). This determination process is executed by, for example, the detection unit 121 .
  • An example of the application for which it is necessary to continuously recognize the image includes an application for which it is necessary to continuously display information by tracking the recognized image.
  • When it is determined in the foregoing step S 1116 that the application currently executed in the information processing system 100 is not the application for which it is necessary to continuously recognize the image (No in step S 1116 ), it is not necessary for the portable terminal to remain in the recognizable area. Therefore, the information processing system 100 subsequently displays information prompting the user to remove the recognized portable terminal from the recognizable area (step S 1117 ).
  • The display process of step S 1117 is executed by, for example, the output control unit 122 . Any information may be used as the information prompting the user to remove the portable terminal. However, when the information prompting the user to remove the portable terminal overlaps the image displayed by the portable terminal, the recognition process in the information processing system 100 is affected. Therefore, the information processing system 100 preferably outputs the information prompting the user to remove the portable terminal so that the information does not overlap the portable terminal.
  • FIG. 35 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 35 illustrates an example of information displayed by the information processing system 100 after an image displayed by the portable terminal is recognized.
  • the information processing system 100 displays the information prompting the user to remove the portable terminal 1300 , for example, as indicated by reference numeral 1302 in FIG. 35 .
  • the information processing system 100 can cause the user of the portable terminal 1300 to be aware of the necessity to remove the portable terminal 1300 from the recognizable area by displaying the information illustrated in FIG. 35 .
  • the information processing system 100 preferably outputs the information prompting the user to remove the portable terminal so that the information does not overlap the portable terminal.
  • the information processing system 100 determines whether the image registered in the information processing system 100 disappears from the inside of the screen (the inside of the recognizable area) (step S 1118 ).
  • the determination process is executed by, for example, the detection unit 121 .
  • When it is determined in step S 1118 that the registered image has not disappeared from the recognizable area (No in step S 1118 ), the information processing system 100 continuously displays the information displayed in step S 1117 .
  • When it is determined in step S 1118 that the registered image has disappeared from the recognizable area (Yes in step S 1118 ), the information processing system 100 stops the image recognition process (step S 1119 ).
  • Conversely, when it is determined in the foregoing step S 1116 that the currently executed application is an application for which it is necessary to continuously recognize the image (Yes in step S 1116 ), the information processing system 100 skips the processes of the foregoing steps S 1117 to S 1119 .
  • The information processing system 100 subsequently records the ID of the image discovered in the foregoing step S 1113 (step S 1120 ).
  • the process of step S 1120 is executed by, for example, the detection unit 121 .
  • the information processing system 100 performs matching of the ID of the image and starts a communication process with the portable terminal displaying the image (step S 1121 ).
  • the communication between the information processing system 100 and the portable terminal is executed through, for example, the Internet, Wi-Fi, or Bluetooth (registered trademark).
  • the information processing system 100 records the position, the posture, and the size of the image discovered in the foregoing step S 1113 (step S 1122 ).
  • the process of step S 1122 is executed by, for example, the detection unit 121 .
  • the information processing system 100 executes display indicating a connection state with the portable terminal on the information display surface using information regarding the position, the posture, and the size of the image discovered in the foregoing step S 1113 (step S 1123 ).
  • the display process of step S 1123 is executed by, for example, the output control unit 122 .
  • the display indicating the connection state with the portable terminal in step S 1123 is also referred to as a “connection mark” below.
  • the information processing system 100 may display, for example, the same image as the recognition screen displayed by the recognized portable terminal as the connection mark. By displaying the same image as the recognition screen displayed by the recognized portable terminal as the connection mark, the information processing system 100 can easily allow the user to comprehend which connection mark corresponds to which portable terminal.
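  • As a non-limiting reference, the following Python sketch outlines the flow of steps S 1116 to S 1123 described above; the ImagePose class and all helper methods on the system object (show_removal_prompt, start_communication, display_connection_mark, and so on) are hypothetical names introduced only for illustration.

```python
from dataclasses import dataclass

@dataclass
class ImagePose:
    position: tuple   # (x, y) on the information display surface
    posture: float    # rotation angle, in degrees
    size: tuple       # (width, height)

def handle_recognized_image(system, app, image_id, pose: ImagePose):
    # Step S1116: does the running application need continuous recognition?
    if not app.needs_continuous_recognition:
        # Step S1117: prompt the user to remove the terminal, placed so that
        # the prompt does not overlap the terminal itself.
        system.show_removal_prompt(avoid_region=pose)
        # Step S1118: wait until the registered image leaves the recognizable area.
        system.wait_until_image_disappears(image_id)
        # Step S1119: stop the image recognition process.
        system.stop_image_recognition()
    # Step S1120: record the ID of the discovered image.
    system.record_image_id(image_id)
    # Step S1121: match the ID and start communication with the portable
    # terminal that displayed the image (for example over Wi-Fi or Bluetooth).
    terminal = system.start_communication(image_id)
    # Step S1122: record the position, posture and size of the discovered image.
    system.record_pose(image_id, pose)
    # Step S1123: display the "connection mark" at that position and posture,
    # for example using the same image the terminal displayed for recognition.
    system.display_connection_mark(terminal, pose)
    return terminal
```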
  • FIG. 36 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 36 illustrates an example of a state in which the information processing system 100 displays the connection mark on the display surface.
  • Reference numeral 1303 in FIG. 36 denotes an example of a connection mark indicating the connection state between the information processing system 100 and the portable terminal 1300 illustrated in FIGS. 34 and 35 .
  • By displaying the connection mark on the display surface, the information processing system 100 according to the embodiment of the present disclosure can present to the user the linkage with the portable terminal owned by the user, and can offer the connection mark displayed on the screen as a data exchange interface.
  • The connection mark 1303 illustrated in FIG. 36 can be used as an interface for extracting data from the portable terminal and copying data to the portable terminal.
  • The connection mark 1303 is displayed on the display surface based on the position, the posture, and the size of the image displayed on the display of the portable terminal at the time of recognition. Accordingly, the connection mark 1303 is naturally displayed near the hand of the user who placed the portable terminal, in a direction in which that user can easily touch it, and thus an effect of improving the convenience of the device linkage for a plurality of users or a plurality of terminals is obtained.
  • The connection state between the information processing system 100 and the portable terminal may be released through an active connection releasing manipulation from the user, or may be automatically released when no manipulation is executed on the portable terminal or the connection mark for a given time.
  • When the connection state is released, the information processing system 100 may eliminate the connection mark displayed in the foregoing step S 1123 .
  • The information processing system 100 can present the end of the connection state to the user by eliminating the connection mark displayed in the foregoing step S 1123 .
  • the information processing system 100 can offer the user various experiences by executing the above-described series of processes and displaying the connection mark on the information display surface.
  • examples of the experiences offered to the user through the display of the connection mark by the information processing system 100 will be described.
  • FIG. 37 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 37 illustrates an example of a GUI of an application executed by the information processing system 100 when image data stored in the portable terminal is shared by displaying connection marks.
  • FIG. 37 illustrates a state in which connection marks 1303 and 1304 are displayed on the information display surface by the information processing system 100 .
  • the information processing system 100 acquires image data from image folders or the like of the portable terminals corresponding to the connection marks 1303 and 1304 and displays images acquired from the image folders around the connection marks 1303 and 1304 .
  • The images displayed around the connection marks 1303 and 1304 are displayed such that the users can execute drag manipulations on them.
  • The information processing system 100 outputs icons or other information indicating copying to the information processing system 100 and allows the users to drag the displayed image data onto that information, so that the image data maintained in the portable terminals can be copied to the information processing system 100 through a simple user manipulation.
  • When the connection marks 1303 and 1304 are displayed on the information display surface, as illustrated in FIG. 37 , the user is allowed, for example, to drag the image data stored in the portable terminal corresponding to the connection mark 1303 to the connection mark 1304 , so that the image data can be copied between the portable terminals via the information processing system 100 . Accordingly, the information processing system 100 can copy the image data maintained by one portable terminal to another portable terminal through a simple user manipulation.
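  • The following is a minimal sketch, under the assumption of a hypothetical drop handler, of how a drag-and-drop onto a connection mark could be mapped to a copy between the linked terminals; the names connection_mark_at, fetch, and store are illustrative and are not defined in the present disclosure.

```python
def on_item_dropped(system, item, drop_position):
    # `item` is a thumbnail (image, jacket image, contact card, app icon, ...)
    # that was displayed around a connection mark and dragged by the user.
    source_terminal = item.source_terminal
    target_mark = system.connection_mark_at(drop_position)
    if target_mark is None:
        # Dropped on the display surface (or on a copy icon): store the data
        # in the information processing system itself.
        system.copy_item_to_local_storage(item)
    elif target_mark.terminal is not source_terminal:
        # Dropped on another terminal's connection mark: relay the data from
        # the source terminal to the target terminal via the system.
        data = source_terminal.fetch(item.identifier)
        target_mark.terminal.store(data)
```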
  • FIG. 38 is an explanatory diagram illustrating a use example of the information processing system 100 and illustrates a form in which the user images a photo using a portable terminal linked to the information processing system 100 .
  • the information processing system 100 can also realize an application by which the photo imaged by the portable terminal is displayed around the connection mark 1303 .
  • the information processing system 100 may display the photo in association with, for example, an effect in which the photo appears from the connection mark 1303 .
  • The information processing system 100 can clearly express which portable terminal captured the photo by displaying the photo imaged by the portable terminal in association with such an effect.
  • FIG. 39 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 39 illustrates an example of a GUI of an application executed by the information processing system 100 when music data stored in the portable terminal is shared by displaying connection marks.
  • FIG. 39 illustrates a state in which connection marks 1303 and 1304 are displayed by the information processing system 100 .
  • the information processing system 100 acquires music data from music folders or the like of the portable terminals corresponding to the connection marks 1303 and 1304 and displays jacket images of the music data acquired from the music folders around the connection marks 1303 and 1304 .
  • The jacket images displayed around the connection marks 1303 and 1304 are displayed such that the users can execute drag manipulations on them.
  • The information processing system 100 outputs icons or other information indicating copying to the information processing system 100 and allows the users to drag the displayed jacket images onto that information, so that the music data maintained in the portable terminals can be copied to the information processing system 100 through a simple user manipulation.
  • When the connection marks 1303 and 1304 are displayed, as illustrated in FIG. 39 , the user is allowed, for example, to drag the jacket image of the music data stored in the portable terminal corresponding to the connection mark 1303 to the connection mark 1304 , so that the music data can be copied between the portable terminals via the information processing system 100 . Accordingly, the information processing system 100 can copy the music data maintained by one portable terminal to another portable terminal through a simple user manipulation.
  • FIG. 39 illustrates a state in which the information processing system 100 displays an interface for reproducing the music data.
  • the information processing system 100 can execute a process of reproducing the music data corresponding to the jacket image or generating a playlist.
  • the information processing system 100 can share various kinds of data with the portable terminal linked to the information processing system 100 in addition to the image data or the music data.
  • the information processing system 100 can enable, for example, websites or bookmarks of browsers displayed by the portable terminal linked to the information processing system 100 to be shared, as in the above-described GUI.
  • the information processing system 100 can also offer a manipulation of dragging a predetermined menu button of a browser executed by the information processing system 100 to the connection mark.
  • FIG. 40 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 40 illustrates an example of a GUI of an application executed by the information processing system 100 when contact address data stored in the portable terminal is shared by displaying connection marks.
  • FIG. 40 illustrates a state in which connection marks 1303 and 1304 are displayed on the information display surface by the information processing system 100 .
  • The information processing system 100 acquires contact address data from the portable terminals corresponding to the connection marks 1303 and 1304 displayed on the information display surface and displays images representing the contact address data acquired from the portable terminals around the connection marks 1303 and 1304 .
  • The images displayed around the connection marks 1303 and 1304 are displayed such that the users can execute drag manipulations on them.
  • The information processing system 100 outputs icons or other information indicating copying to the information processing system 100 and allows the users to drag the displayed images onto that information, so that the contact address data maintained in the portable terminals can be copied to the information processing system 100 through a simple user manipulation.
  • When the information processing system 100 displays the connection marks 1303 and 1304 on the information display surface, as illustrated in FIG. 40 , the user is allowed, for example, to drag the contact address data stored in the portable terminal corresponding to the connection mark 1303 to the connection mark 1304 , so that the contact address data can be copied between the portable terminals via the information processing system 100 . Accordingly, the information processing system 100 can copy the contact address data maintained by one portable terminal to another portable terminal through a simple user manipulation.
  • the portable terminal linked to the information processing system 100 can add functions by installing various applications.
  • the information processing system 100 can also realize a GUI in which an application can be given and received between the portable terminals by displaying the connection marks through the above-described processes.
  • FIG. 41 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 41 illustrates an example of a GUI of an application executed by the information processing system 100 when an application stored in the portable terminal is displayed by displaying connection marks.
  • FIG. 41 illustrates a state in which the connection marks 1303 and 1304 are displayed on the information display surface by the information processing system 100 .
  • the information processing system 100 acquires information regarding the applications installed in the portable terminal from the portable terminals corresponding to the connection marks 1303 and 1304 and displays the information as icons or other information around the connection marks 1303 and 1304 , as illustrated in FIG. 41 .
  • FIG. 41 illustrates a state in which a plurality of applications are installed in the portable terminal corresponding to the connection mark 1303 , but no application is installed in the portable terminal corresponding to the connection mark 1304 .
  • the user of the portable terminal corresponding to the connection mark 1304 finds a preferred application among the applications installed in the portable terminal corresponding to the connection mark 1303 , and drags an icon of the application to the connection mark 1304 . Through the drag manipulation, a process of downloading and installing the application is automatically executed in the portable terminal corresponding to the connection mark 1304 .
  • By recognizing the image displayed by the portable terminal, the information processing system 100 can acquire the position, the posture, the size, and the like of the portable terminal and can then be linked to the portable terminal to execute communication with it, even when a dedicated application is not activated on the portable terminal.
  • the information processing system 100 causes the portable terminal to display any image and registers the displayed image before the device linkage with the portable terminal.
  • the information processing system 100 according to the embodiment of the present disclosure can make image selection more fun for the user through such an image registration process.
  • The information processing system 100 can allow the users to easily recognize to which user each connection mark belongs by continuously displaying the image as the connection mark on the screen.
  • The information processing system 100 causes the portable terminal to display any image and registers the displayed image before the device linkage with the portable terminal. Therefore, even when there are a plurality of portable terminals of substantially the same kind, the portable terminals can be uniquely identified by proper use of the recognition images. However, when a plurality of users have the same kind of device, there is a possibility of the users incidentally selecting substantially the same image as the recognition image. Accordingly, the information processing system 100 according to the embodiment of the present disclosure may not be linked to the portable terminal unless the portable terminal registers the selected recognition image in the information processing system 100 . By causing the portable terminal to register the selected recognition image, the information processing system 100 can determine whether the selected image duplicates an image that has already been registered.
  • The information processing system 100 may cause the portable terminals to be linked to select recognition images and to register those recognition images in the information processing system 100 so that the recognition images of all of the portable terminals are unique.
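  • A minimal sketch of such a registration check is shown below; the fingerprinting here is an exact-match hash used only for illustration, whereas an actual system would likely need a comparison that also treats substantially the same images as duplicates.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    # Simplistic stand-in: an exact-match hash. A real system would more
    # likely use a perceptual comparison so that "substantially the same"
    # images also collide.
    return hashlib.sha256(image_bytes).hexdigest()

class RecognitionImageRegistry:
    def __init__(self):
        self._registered = {}  # fingerprint -> terminal id

    def register(self, terminal_id: str, image_bytes: bytes) -> bool:
        key = image_fingerprint(image_bytes)
        owner = self._registered.get(key)
        if owner is not None and owner != terminal_id:
            # Another terminal already registered the same image, so the
            # linkage is refused and a different image must be selected.
            return False
        self._registered[key] = terminal_id
        return True
```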
  • the information processing system 100 can receive manipulations on a menu in various directions from a plurality of users, for example, as illustrated in FIG. 6 .
  • However, while one user is using the menu, the other users may not be able to use that menu.
  • Therefore, the information processing system 100 is configured to receive the manipulations of the users as will be described below, so that an improvement in operability and convenience is achieved when manipulations on the menu are received from a plurality of users in various directions.
  • FIG. 42 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 42 illustrates the example of the operation of the information processing system 100 according to the embodiment of the present disclosure when the users execute drag manipulations on the menu button group 1110 or the menu button 1100 illustrated in FIG. 6 and the like.
  • the example of the operation of the information processing system 100 according to the embodiment of the present disclosure will be described with reference to FIG. 42 .
  • the drag manipulation on the menu button 1100 or the menu button group 1110 is also simply referred to as a drag manipulation (on the menu) in some cases.
  • When the information processing system 100 detects that the user executes a drag manipulation on a menu displayed by the information processing system 100 (step S 1201 ), the information processing system 100 determines whether the menu is pressed at one point and dragged at another point in the drag manipulation (step S 1202 ).
  • the processes of the foregoing steps S 1201 and 1202 are executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1202 that the manipulation of pressing the menu at one point and dragging the menu at another point is executed (Yes in step S 1202 ), the information processing system 100 generates a copy of the dragged menu (step S 1203 ).
  • the generation of the copy of the menu in step S 1203 is executed by, for example, the output control unit 122 .
  • FIG. 45 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 45 illustrates an example of the generation of the copy of the menu in the foregoing step S 1203 .
  • a drag manipulation is assumed to be executed by the user pressing one menu button (B) in the menu button group 1110 displayed on the information display surface with his or her left forefinger and dragging the same menu button (B) with his or her right forefinger.
  • the information processing system 100 executes a process of generating a copy menu button 1111 of the menu button (B) according to the manipulation executed by the user.
  • Conversely, when it is determined in the foregoing step S 1202 that the manipulation of pressing the menu at one point and dragging the menu at another point is not executed (No in step S 1202 ), the information processing system 100 subsequently determines whether the menu is pressed at two points and dragged at one point in the drag manipulation detected in the foregoing step S 1201 (step S 1204 ).
  • the process of step S 1204 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1204 that the manipulation of pressing the menu at two points and dragging the menu at one point is executed (Yes in step S 1204 ), the information processing system 100 subsequently determines whether the menu is a folder menu indicating a folder (step S 1205 ). The process of step S 1205 is executed by, for example, the detection unit 121 . When it is determined in step S 1205 that the dragged menu is not the folder menu (No in step S 1205 ), the information processing system 100 generates a copy of the dragged menu, as in the case of the manipulation of pressing the menu at one point and dragging the menu at another point (step S 1203 ).
  • Conversely, when it is determined in step S 1205 that the dragged menu is the folder menu (Yes in step S 1205 ), the information processing system 100 generates a shortcut to the menu (the folder menu) (step S 1206 ).
  • the generation of the shortcut to the menu in step S 1206 is executed by, for example, the output control unit 122 .
  • Here, the shortcut refers to a menu which functions as a reference to another menu and has no substance of its own.
  • FIG. 46 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 46 illustrates an example of the generation of the shortcut of the menu in the foregoing step S 1206 .
  • a drag manipulation is assumed to be executed by the user pressing one menu button (B) in the menu button group 1110 with his or her left forefinger and middle finger at two points and dragging the same menu button (B) with his or her right forefinger.
  • the information processing system 100 executes a process of generating a shortcut button 1112 of the menu button (B) according to the manipulation executed by the user.
  • the information processing system 100 may generate the copy menu button 1111 of the menu button (B) illustrated in FIG. 45 and the shortcut button 1112 illustrated in FIG. 46 so that the appearances of the copy menu button and the shortcut button are different.
  • For example, the copy menu button 1111 is displayed as a simple circle and the shortcut button 1112 is displayed as a double circle.
  • the shapes of the copy menu button 1111 and the shortcut button 1112 are not limited to the related examples, but the information processing system 100 may set different appearances through different colors as well as the different shapes.
  • a difference between generation of a copy of the menu and generation of a shortcut of the menu will be described.
  • In the case of a copy, even when a menu is subsequently added to one side menu, the information processing system 100 does not add the added menu to the other side menu (for example, a menu of a copy destination).
  • In the case of a shortcut, when a menu is subsequently added to one side menu, the information processing system 100 also adds the added menu to the other side menu (for example, a menu of a shortcut destination).
  • FIGS. 47 and 48 are explanatory diagrams illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure, and illustrate an example of the GUI when a copy of a menu is generated based on a manipulation from the user. Even when a copy of a menu is generated through a manipulation from the user as in FIG. 47 and a new menu (G) is subsequently added to one side menu as in FIG. 48 , the menu (G) is not added to the other side menu.
  • FIGS. 49 and 50 are explanatory diagrams illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure and an example of the GUI when a shortcut of a menu is generated based on a manipulation from the user.
  • As illustrated in FIGS. 49 and 50 , when a shortcut of a menu (indicated by a broken line) is generated based on a manipulation from the user as in FIG. 49 and a new menu (G) is subsequently added to one side menu as in FIG. 50 , the new menu (G) is also added to the other side menu.
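  • The following short sketch contrasts the two behaviors, assuming simple hypothetical Menu and Shortcut classes: a copy is an independent snapshot, while a shortcut is a reference with no substance of its own.

```python
class Menu:
    def __init__(self, name, items=None):
        self.name = name
        self.items = list(items or [])

def make_copy(menu: Menu) -> Menu:
    # An independent snapshot: later additions on either side are not shared.
    return Menu(menu.name + " (copy)", menu.items)

class Shortcut:
    # A reference with no substance of its own: it exposes the target's items.
    def __init__(self, target: Menu):
        self.target = target

    @property
    def items(self):
        return self.target.items

menu = Menu("B", ["A", "B"])
copy_b, shortcut_b = make_copy(menu), Shortcut(menu)
menu.items.append("G")
assert "G" not in copy_b.items   # behaviour of FIGS. 47 and 48
assert "G" in shortcut_b.items   # behaviour of FIGS. 49 and 50
```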
  • Conversely, when it is determined in the foregoing step S 1204 that the manipulation of pressing the menu at two points and dragging the menu at one point is not executed (No in step S 1204 ), the information processing system 100 subsequently determines whether the angle formed by the row of the menu and the drag direction of the menu is equal to or greater than a prescribed value (step S 1207 ).
  • the process of step S 1207 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1207 that the angle formed by the row of the menu and the drag direction is equal to or greater than the prescribed value (Yes in step S 1207 ), the information processing system 100 subsequently determines whether the dragged menu is a menu separable from the menu button group (step S 1208 ).
  • the process of step S 1208 is executed by, for example, the detection unit 121 .
  • When it is determined in step S 1208 that the dragged menu is not a menu separable from the menu button group (No in step S 1208 ), the information processing system 100 generates a copy of the dragged menu (step S 1203 ).
  • Conversely, when it is determined in step S 1208 that the dragged menu is the separable menu (Yes in step S 1208 ), the information processing system 100 separates the menu from the menu button group (step S 1209 ).
  • the process of separating the menu from the menu button group in step S 1209 is executed by, for example, the output control unit 122 .
  • FIG. 51 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 51 illustrates an example of a drag manipulation on the menu by the user.
  • In FIG. 51 , reference numeral 1401 denotes the radial direction of the menu button group 1110 disposed in an arc shape, reference numeral 1402 denotes the direction in which the menu button is dragged by the user, and reference numeral 1403 denotes the circumferential direction of the menu button group 1110 disposed in an arc shape.
  • When the angle formed by the drag direction 1402 and the circumferential direction 1403 is equal to or greater than the prescribed value, the information processing system 100 separates the dragged menu button from the menu button group 1110 .
  • FIG. 52 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 52 illustrates an example of a drag manipulation on the menu by the user.
  • the information processing system 100 separates the dragged menu button from the menu button group 1110 so that the menu button is independent as a menu button 1111 .
  • When it is determined in the foregoing step S 1207 that the angle formed by the row of the menu and the drag direction is less than the prescribed value (No in step S 1207 ), the information processing system 100 executes the drag manipulation from the user as a normal behavior (step S 1210 ).
  • the process of step S 1210 is executed by, for example, the output control unit 122 .
  • the normal behavior is, for example, a behavior in which the menu button 1100 is moved to track a manipulation from the user or a behavior in which the menu button group 1110 tracks a manipulation from the user and is rotated about the menu button 1100 .
  • the information processing system 100 can allow the user to copy the menu, generate the shortcut of the menu, or separate the menu through a simple manipulation by executing the above-described operation according to content of a drag manipulation on the menu button by the user.
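  • The decision flow of FIG. 42 can be summarized by the following sketch; the threshold SEPARATION_ANGLE_DEG and the attributes of the drag and menu objects are assumptions for illustration and do not correspond to prescribed values of the present disclosure.

```python
SEPARATION_ANGLE_DEG = 45  # stand-in for the "prescribed value" of step S1207

def on_menu_dragged(system, menu, drag):
    if drag.pressed_points == 1 and drag.dragged_points == 1:
        # Step S1202 -> S1203: pressed at one point, dragged at another point.
        system.create_copy(menu)
    elif drag.pressed_points == 2 and drag.dragged_points == 1:
        # Step S1204 -> S1205: folder menus yield shortcuts, others yield copies.
        if menu.is_folder:
            system.create_shortcut(menu)    # step S1206
        else:
            system.create_copy(menu)        # step S1203
    else:
        # Step S1207: compare the drag direction with the row (circumferential
        # direction) of the menu button group.
        angle = abs(drag.direction_deg - menu.row_direction_deg) % 180.0
        angle = min(angle, 180.0 - angle)
        if angle >= SEPARATION_ANGLE_DEG:
            if menu.separable:                      # step S1208
                system.separate_from_group(menu)    # step S1209
            else:
                system.create_copy(menu)            # step S1203
        else:
            system.track_drag(menu, drag)           # step S1210: normal behaviour
```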
  • FIGS. 43 and 44 are flowcharts illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIGS. 43 and 44 illustrate the example of the operation of the information processing system 100 according to the embodiment of the present disclosure when the user executes a drag manipulation on the menu button 1100 or the menu button group 1110 illustrated in FIG. 6 and the like and the user subsequently executes a drop manipulation.
  • the example of the operation of the information processing system 100 according to the embodiment of the present disclosure will be described with reference to FIGS. 43 and 44 .
  • the drop manipulation on the menu button 1100 or the menu button group 1110 is also simply referred to as a drop manipulation (on the menu) in some cases.
  • When the information processing system 100 detects that the user executes a drop manipulation on a menu displayed by the information processing system 100 (step S 1211 ), the information processing system 100 determines whether the distance dragged by the user is equal to or less than a prescribed distance (step S 1212 ).
  • the processes of the foregoing steps S 1211 and 1212 are executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1212 that the dragged distance is equal to or less than the prescribed distance (Yes in step S 1212 ), the information processing system 100 executes the function assigned to the dropped menu (step S 1213 ).
  • the functions assigned to the menu are, for example, various functions such as activation of an application, display of a website, display of image data, and reproduction of music data and are not limited to specific functions.
  • Conversely, when it is determined in the foregoing step S 1212 that the dragged distance is greater than the prescribed distance (No in step S 1212 ), the information processing system 100 subsequently determines whether the menu is dropped on another menu located at a distance equal to or less than a prescribed distance (step S 1214 ).
  • the determination of step S 1214 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1214 that the menu is dropped on another menu located at a distance equal to or less than the prescribed distance (Yes in step S 1214 ), the information processing system 100 subsequently determines whether the drop destination menu is a menu which accepts the drop (step S 1215 ).
  • The determination of step S 1215 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1215 that the drop destination menu is a menu that accepts the drop (Yes in step S 1215 ), the information processing system 100 subsequently determines whether the drop destination menu is a folder menu (step S 1216 ). The determination of step S 1216 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1216 that the drop destination menu is the folder menu (Yes in step S 1216 ), the information processing system 100 subsequently adds the dropped menu to the menu (subordinate menu) in the lower hierarchy of the drop destination (step S 1218 ).
  • the addition process of step S 1218 is executed by, for example, the output control unit 122 .
  • Conversely, when it is determined in the foregoing step S 1216 that the drop destination menu is not the folder menu (No in step S 1216 ), the information processing system 100 subsequently determines whether the item corresponding to the menu dropped by the user is handleable by the drop destination menu (step S 1217 ).
  • the determination of step S 1217 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1217 that the item dropped by the user is handleable by the drop destination menu (Yes in step S 1217 ), the information processing system 100 subsequently delivers information linked to the menu dropped by the user to the menu receiving the drop (step S 1219 ).
  • the process of step S 1219 is executed by, for example, the output control unit 122 .
  • Conversely, when it is determined in step S 1217 that the item dropped by the user is not handleable by the drop destination menu (No in step S 1217 ), the information processing system 100 subsequently executes a process of generating a new menu having the menu of the drop source and the menu of the drop destination as subordinate components (step S 1220 ).
  • the process of step S 1220 is executed by, for example, the output control unit 122 .
  • When it is determined in the foregoing step S 1214 that the menu is not dropped on another menu located at a distance equal to or less than the prescribed distance (No in step S 1214 ), the information processing system 100 subsequently determines whether the menu is dropped within a distance equal to or less than the prescribed distance from another menu which is being pressed at one point (step S 1221 ). The determination of step S 1221 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1221 that the menu is dropped within the distance equal to or less than the prescribed distance from the other menu being pressed at one point (Yes in step S 1221 ), the information processing system 100 subsequently determines whether the dropped menu and the other menu can be merged (step S 1222 ).
  • the determination of step S 1222 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1222 that the dropped menu and the other menu can be merged (Yes in step S 1222 ), the information processing system 100 subsequently executes a process of merging the subordinate menu of the dropped menu and the subordinate menu of the other menu (step S 1223 ).
  • the process of step S 1223 is executed by, for example, the output control unit 122 .
  • Conversely, when it is determined in the foregoing step S 1222 that the dropped menu and the other menu cannot be merged (No in step S 1222 ), the information processing system 100 subsequently executes a process of returning the dropped menu to the position before the drag (step S 1226 ).
  • the process of step S 1226 is executed by, for example, the output control unit 122 .
  • When it is determined in the foregoing step S 1221 that the menu is not dropped within the distance equal to or less than the prescribed distance from another menu being pressed at one point (No in step S 1221 ), the information processing system 100 subsequently determines whether the menu is dropped on a location within a fixed distance from each of two menus of the same hierarchy (step S 1224 ).
  • the determination of step S 1224 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1224 that the menu is dropped on a location within the fixed distance from each of the two menus of the same hierarchy (Yes in step S 1224 ), the information processing system 100 subsequently executes a process of inserting the dragged and dropped menu between the two menus (step S 1225 ).
  • the process of step S 1225 is executed by, for example, the output control unit 122 .
  • Conversely, when it is determined in the foregoing step S 1224 that the menu is not dropped on a location within the fixed distance from each of the two menus of the same hierarchy (No in step S 1224 ), the information processing system 100 subsequently determines whether the menu is dragged at a speed equal to or greater than a fixed speed until the menu is dropped (step S 1227 ).
  • the determination of step S 1227 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1227 that the menu is dragged at a speed equal to or greater than the fixed speed until the menu is dropped (Yes in step S 1227 ), the information processing system 100 subsequently determines whether the dropped menu can be deleted (step S 1228 ).
  • the determination of step S 1228 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1228 that the dropped menu can be deleted (Yes in step S 1228 ), the information processing system 100 subsequently executes a process of deleting the dragged menu (step S 1230 ).
  • the process of step S 1230 is executed by, for example, the output control unit 122 .
  • Conversely, when it is determined in the foregoing step S 1228 that the dropped menu cannot be deleted (No in step S 1228 ), the process of returning the dropped menu to the position before the drag is executed (step S 1226 ).
  • The process of step S 1226 is executed by, for example, the output control unit 122 .
  • When it is determined in the foregoing step S 1227 that the menu is dragged at a speed less than the fixed speed until the menu is dropped (No in step S 1227 ), the information processing system 100 subsequently determines whether the drop location is outside the screen (step S 1229 ).
  • the determination of step S 1229 is executed by, for example, the detection unit 121 .
  • When it is determined in the foregoing step S 1229 that the drop location is outside the screen (Yes in step S 1229 ), the information processing system 100 subsequently determines whether the dropped menu can be deleted, as in the foregoing step S 1228 . Conversely, when it is determined in the foregoing step S 1229 that the drop location is not outside the screen (No in step S 1229 ), the information processing system 100 subsequently executes a process of moving the menu to the drop location (step S 1231 ). The process of step S 1231 is executed by, for example, the output control unit 122 .
  • the information processing system 100 can change the state of the menu dropped by the user according to the drop location, the speed of the drag, and the like by executing the above-described series of processes.
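  • The drop handling of FIGS. 43 and 44 can be summarized by the following sketch; the thresholds and the helper methods on the system object are assumptions for illustration, and the behavior for a drop rejected in step S 1215 is likewise assumed.

```python
TAP_DISTANCE = 10      # step S1212: short drags are treated as activation
DELETE_SPEED = 1000    # step S1227: a menu "thrown" this fast may be deleted

def on_menu_dropped(system, menu, drop):
    if drop.drag_distance <= TAP_DISTANCE:
        system.execute_menu_function(menu)                 # step S1213
    elif drop.target_menu is not None:                     # step S1214
        target = drop.target_menu
        if not target.accepts_drop:                        # step S1215 (No branch: assumed)
            system.return_to_original_position(menu)
        elif target.is_folder:
            target.add_child(menu)                         # step S1218
        elif target.can_handle(menu):                      # step S1217
            target.receive(menu.linked_information)        # step S1219
        else:
            system.create_menu_containing(menu, target)    # step S1220
    elif drop.nearby_pressed_menu is not None:             # step S1221
        other = drop.nearby_pressed_menu
        if system.can_merge(menu, other):                  # step S1222
            system.merge_subordinate_menus(menu, other)    # step S1223
        else:
            system.return_to_original_position(menu)       # step S1226
    elif drop.between_two_sibling_menus:                   # step S1224
        system.insert_between(menu, *drop.sibling_menus)   # step S1225
    elif drop.drag_speed >= DELETE_SPEED or drop.outside_screen:  # steps S1227, S1229
        if menu.deletable:                                 # step S1228
            system.delete(menu)                            # step S1230
        else:
            system.return_to_original_position(menu)       # step S1226
    else:
        system.move_to(menu, drop.position)                # step S1231
```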
  • FIG. 53 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 53 illustrates an example of a drop manipulation on the menu by the user and the example of the GUI when the menu is dragged at a speed equal to or greater than a fixed speed.
  • When the information processing system 100 detects that the user drags the menu button 1111 at a speed equal to or greater than the fixed speed and then drops it, the information processing system 100 executes a process of deleting the menu button 1111 .
  • Consequently, the information processing system 100 can provide the user with a GUI that is intuitively easy to understand.
  • FIG. 54 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 54 illustrates an example of a drop manipulation on the menu by the user and a state in which the user executes a manipulation of dropping the menu button 1111 into a trash menu 1112 .
  • When the information processing system 100 detects that the user drops the menu button 1111 into the trash menu 1112 , the information processing system 100 executes a process of deleting the menu button 1111 .
  • FIG. 55 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 55 illustrates an example of a drop manipulation on the menu by the user and a state in which the user executes a manipulation of moving and dropping the menu button 1111 outside the screen.
  • When the information processing system 100 detects that the user executes the manipulation of moving and dropping the menu button 1111 outside the screen, the information processing system 100 executes the process of deleting the menu button 1111 .
  • FIG. 56 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 56 illustrates an example of a drop manipulation on the menu by the user and a state in which the user executes a manipulation of inserting the menu button into the menu button group 1110 .
  • When the user drags the menu button 1111 onto the arc shape in which the menu button group 1110 is displayed, the information processing system 100 generates a gap in the arc for inserting the menu button 1111 . Then, when the user drops the menu button 1111 in that gap, the information processing system 100 displays the menu button 1111 as part of the arc.
  • In this way as well, the information processing system 100 can provide the user with a GUI that is intuitively easy to understand.
  • FIG. 57 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 57 illustrates an example of a drop manipulation on the menu by the user and illustrates a state in which the user executes a manipulation of dropping the menu button on a menu button in the menu button group 1110 .
  • When the user drags a menu button toward a menu button in the menu button group 1110 , the information processing system 100 displays the menu button on which the user intends to drop the dragged menu button in an enlarged manner, as illustrated in FIG. 57 .
  • When the user drops the menu button there, the information processing system 100 displays the dropped menu button so that it is added to the subordinate menu of the drop destination, as illustrated in FIG. 57 .
  • The information processing system 100 may display the dropped menu button added to the end of the subordinate menu of the drop destination menu, or may display it added at the position closest to the drop destination menu.
  • FIG. 58 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 58 illustrates an example of a drop manipulation on the menu by the user and illustrates a state in which the user executes a manipulation of dropping the menu button on a menu button in the menu button group 1110 .
  • When the user drags a menu button toward a menu button in the menu button group 1110 , the information processing system 100 displays the menu button on which the user intends to drop the dragged menu button in an enlarged manner, as illustrated in FIG. 58 .
  • When the user drops the menu button there, the information processing system 100 newly displays a menu button of a folder menu at the location in which the menu button of the drop destination has been displayed until then, and displays the dropped menu button and the menu button of the drop destination as a subordinate menu of the folder menu, as illustrated in FIG. 58 . That is, the information processing system 100 executes a process of merging the dropped menu button and the menu button of the drop destination.
  • FIG. 59 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 59 illustrates an example of a drop manipulation on the menu by the user and a state in which the user executes a manipulation of dropping another menu button 1111 on a certain menu button while pressing the menu button in the menu button group 1110 with his or her finger.
  • When the information processing system 100 detects such a manipulation, it displays the menu buttons so that the two menu buttons are joined, as illustrated in FIG. 59 .
  • The information processing system 100 may leave the other menu buttons disposed in the menu button group 1110 displayed without change, or may rearrange the other menu buttons along the arc shape of the menu button group 1110 .
  • FIG. 60 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 60 illustrates a merging example of the menus in the case in which the user drops another menu button on a certain menu button while pressing that menu button with his or her finger.
  • When the two menus to be merged include the same subordinate menus, the information processing system 100 may generate only one instance of each common subordinate menu rather than two. For example, FIG. 60 illustrates an example in which, when a menu including subordinate menus A, B, D, F, and G and a menu including subordinate menus A, C, D, E, and F are merged, the subordinate menus A, D, and F common to both menus are included as single menus.
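  • A minimal sketch of such a merge, which keeps the common subordinate menus only once while preserving order, is shown below.

```python
def merge_subordinate_menus(left, right):
    # Concatenate the two subordinate menu lists, keeping each menu only once.
    merged, seen = [], set()
    for item in list(left) + list(right):
        if item not in seen:
            merged.append(item)
            seen.add(item)
    return merged

assert merge_subordinate_menus(
    ["A", "B", "D", "F", "G"],
    ["A", "C", "D", "E", "F"],
) == ["A", "B", "D", "F", "G", "C", "E"]
```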
  • FIG. 61 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 61 illustrates a generation example of a new menu according to a drag manipulation and a drop manipulation from the user in an application displayed by the information processing system 100 .
  • The information processing system 100 generates a copy of the menu button 1111 according to a user manipulation of dragging a certain menu button 1111 in the menu button group 1110 with one finger while pressing the menu button 1111 with another finger.
  • FIG. 61 illustrates an example in which copies of menus D and E in the menu button group 1110 are generated based on a user manipulation.
  • the information processing system 100 generates a new menu button 1100 and a menu button group 1110 according to a user manipulation of merging copies of the menus D and E.
  • the information processing system 100 adds the dropped menu button 1111 to the newly generated menu button group 1110 , as illustrated in FIG. 61 .
  • By receiving the drag manipulation and the drop manipulation from the user, the information processing system 100 according to the embodiment of the present disclosure can ensure ease of customization of the menu. In addition, by receiving these manipulations, the information processing system 100 according to the embodiment of the present disclosure can allow a plurality of users to simultaneously use the same menu.
  • FIG. 62 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 62 illustrates a generation example of a shortcut button according to a drag manipulation and a drop manipulation from a user in an application displayed by the information processing system 100 .
  • Ordinarily, the user has to select menu buttons in sequence, in the order of the menu button 1100 , the menu button group 1110 , and the menu button group 1120 , through three manipulations to reach a menu button in the menu button group 1120 in the second hierarchy counted from the menu button 1100 .
  • the information processing system 100 generates a shortcut button based on the user manipulation, as described above, so that the user can reach the menu button in the menu button group 1120 through one manipulation.
  • the information processing system 100 can allow any user to reach the menu through one manipulation.
  • the information processing system 100 can allow, for example, family members to generate a common menu or can allow the family members to generate separate menus.
  • FIG. 63 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 63 illustrates a generation example of menus for family members according to a drag manipulation and a drop manipulation from the user in the application displayed by the information processing system 100 .
  • FIG. 63 illustrates a state in which a father menu button, a mother menu button, and a child menu button are included in the menu button group 1120 .
  • When the menu button group 1120 is displayed in this way and the user executes a manipulation of generating shortcut buttons of the father menu button, the mother menu button, and the child menu button, the information processing system 100 generates shortcut buttons to those menu buttons.
  • When the shortcut buttons to the menu buttons are generated, the father, the mother, and the child can each call up and customize their own menus so that they can easily use the menus, merely by touching the corresponding shortcut buttons with manipulators such as their fingers.
  • the customization of the menus displayed through manipulations on the shortcut buttons is also reflected in the original menus.
  • the information processing system 100 can allow the users to generate, for example, bookmarks of websites easily and intuitively by generating the copies of the menus, as described above.
  • FIG. 64 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 64 illustrates a generation example of a bookmark menu according to a drag manipulation and a drop manipulation from the user in an application displayed by the information processing system 100 .
  • FIG. 64 illustrates a state in which the menu button 1100 and the menu button groups 1110 and 1120 are displayed by the information processing system 100 .
  • FIG. 64 illustrates a state in which a web browser 1140 displaying a web site on the Internet is displayed as an example of an application by the information processing system 100 .
  • FIG. 64 illustrates a state in which a menu button 1141 for connection to a currently displayed web page is displayed in the bottom left of the web browser 1140 by the information processing system 100 .
  • When the information processing system 100 detects that the user executes a manipulation of generating a copy on the menu button 1141 , the information processing system 100 generates a copy of the menu button 1141 according to the user manipulation.
  • the copy of the menu button 1141 can function as a bookmark of the web page.
  • the generated copy of the menu button 1141 is added to the menu button group 1120 through, for example, a manipulation from the user.
  • the information processing system 100 according to the embodiment of the present disclosure can allow the user to generate, for example, a bookmark of a web page intuitively and easily by offering such a manipulation to the user.
  • the information processing system 100 according to the embodiment of the present disclosure can collect bookmarks of a plurality of web pages in one menu button group through a simple manipulation by offering such a manipulation to the user.
  • the information processing system 100 is configured to receive menu manipulations from a plurality of users, and thus a situation in which the same application or similar applications are activated by a plurality of users and are executed simultaneously can occur.
  • When the same application or similar applications are executed simultaneously by a plurality of users, a situation in which it is difficult to comprehend who activated which application may occur.
  • the information processing system 100 according to the embodiment of the present disclosure supplies a structure capable of binding menus with applications and releasing the binding through a simple user manipulation.
  • FIG. 65 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 65 illustrates an example of a GUI when binding of a menu and an application displayed by the information processing system 100 is released through a manipulation from the user.
  • the information processing system 100 may display one menu button in the menu button group 1120 and the menu button 1141 of the web browser 1140 by binding the menu buttons.
  • the binding of the one menu button in the menu button group 1120 and the menu button 1141 of the web browser 1140 is indicated by a broken line, but the display of the binding in the present disclosure is not limited to the related example.
  • When the information processing system 100 detects that the user executes a predetermined manipulation, for example, a manipulation of cutting the displayed binding at a speed equal to or greater than a predetermined speed, the information processing system 100 executes a process of releasing the binding of the one menu button in the menu button group 1120 and the menu button 1141 of the web browser 1140 .
  • When the binding is released, the information processing system 100 executes a process of closing the web browser 1140 .
  • In closing the web browser 1140 , the information processing system 100 may execute a display process of gradually thinning out the display of the web browser 1140 and finally removing it, as illustrated in FIG. 65 .
  • the information processing system 100 can bind the menu with the application and can release the binding through a simple user manipulation.
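  • A minimal sketch of releasing a binding in response to a fast cutting gesture is shown below; the speed threshold and the fade-out helper are assumptions for illustration.

```python
CUT_SPEED_THRESHOLD = 800  # hypothetical threshold, in pixels per second

def on_stroke_over_binding(system, binding, stroke):
    # The stroke must actually cross the line drawn between the menu button
    # and the window of the bound application.
    if not stroke.crosses(binding.line):
        return
    if stroke.speed >= CUT_SPEED_THRESHOLD:
        system.release_binding(binding)
        # As in FIG. 65, the bound application window may then be closed by
        # gradually thinning it out before removing it from the display.
        system.fade_out_and_close(binding.application_window)
```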
  • the information processing system 100 can offer various other manipulations to the user.
  • The projection type information processing system 100 a illustrated in FIG. 1 displays information on a display surface with a large area, such as the table 140 a , and thus the user may wish to bring an application located away from him or her close to his or her hand.
  • In such a case, the information processing system 100 may execute a display process of moving the window of the application according to the user manipulation.
  • FIG. 66 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 66 illustrates an example of a GUI when the user manipulates the binding of the menu and the application displayed by the information processing system 100 .
  • FIG. 66 illustrates a state displayed by the information processing system 100 so that one menu button in the menu button group 1120 is bound up with the menu button 1141 of the web browser 1140 .
  • When the user manipulates the binding, the information processing system 100 executes a display process of bringing the web browser 1140 close to the menu button group 1120 according to the detected manipulation.
  • the information processing system 100 can improve convenience of the user manipulation.
  • In FIG. 66 , the example of the case in which the user executes the manipulation of bringing the window of the application close to his or her hand is illustrated.
  • Conversely, the information processing system 100 can also execute a display process of keeping the window of the application away from the menu button.
  • In this way, the information processing system 100 can allow the user to manipulate an application at a location away from the window of the application.
  • FIG. 67 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 67 illustrates an example of a GUI when the user executes a manipulation in the state in which the menu and the application displayed by the information processing system 100 are bound up together.
  • FIG. 67 illustrates a state displayed by the information processing system 100 so that one menu button in the menu button group 1120 is bound up with the menu button 1141 of the web browser 1140 .
  • In the menu button group 1120 , menus for manipulating the web browser 1140 are assumed to be arranged.
  • When the user manipulates these menus, the information processing system 100 executes a process (for example, a process of opening a bookmarked web page, returning to the previous page, moving to the next page, or closing the web browser) on the web browser 1140 according to the manipulation.
  • a process for example, a process of opening a web page with a bookmark, returning to a previous page, moving to a next page, or closing the web browser
  • the information processing system 100 can improve the convenience of the user manipulation.
  • The projection type information processing system 100 a illustrated in FIG. 1 displays information on a display surface with a large area, such as the table 140 a . Therefore, even when the application is displayed at a position far from the user, a remote manipulation is possible through the binding, and thus it is possible to improve the convenience of the user manipulation.
  • the web browser 1140 is illustrated as the application, but the application which can be remotely manipulated is not limited to the related example.
  • For example, when the application is music reproduction software, adjustment of the volume or skipping, fast-forwarding, or rewinding of music can be remotely manipulated by binding the menu button with the application.
  • Likewise, when the application is moving-image reproduction software, adjustment of the volume or skipping, fast-forwarding, or rewinding of a moving image can be remotely manipulated by binding the menu button with the application.
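  • The following sketch illustrates, under assumed command names and a hypothetical Application protocol, how a manipulation on a bound menu button could be dispatched to the far-away application as a remote command.

```python
from typing import Protocol

class Application(Protocol):
    def handle_command(self, command: str) -> None: ...

class Binding:
    """Associates a menu button near the user with a far-away application window."""

    def __init__(self, menu_button_id: str, application: Application):
        self.menu_button_id = menu_button_id
        self.application = application

    def on_menu_selected(self, command: str) -> None:
        # For example "volume_up", "skip", "fast_forward" or "rewind" for a
        # music or moving-image player; "back", "forward" or "open_bookmark"
        # for a web browser.
        self.application.handle_command(command)
```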
  • the information processing system 100 displaying the GUI for displaying the menu button groups 1110 and 1120 using the menu button 1100 as a starting point has been described.
  • the starting point of the menu button groups 1110 and 1120 is not limited to the menu button 1100 .
  • the information processing system 100 may display the menu button groups 1110 and 1120 using a mobile phone, a smartphone, or another portable terminal owned by the user as a starting point.
  • the examples in which the information processing system 100 and the portable terminal are linked have been described.
  • the information processing system 100 can also display the menu button groups 1110 and 1120 using the portable terminal linked to the information processing system 100 as a starting point.
  • FIG. 68 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 68 illustrates an example of a GUI when the portable terminal 1310 linked to the information processing system 100 is used as the starting point of the display of the menu button groups 1110 and 1120 .
  • the information processing system 100 may display the menu button groups 1110 and 1120 around the portable terminal 1310 , as illustrated in FIG. 68 .
  • the user can edit the layout of the menu button groups 1110 and 1120 at home, store the layout in the portable terminal 1310 , bring the portable terminal 1310 to his or her friend's home, and display the menu button groups 1110 and 1120 that he or she edited at home using the information processing system 100 at his or her friend's home.
  • the information processing system 100 can allow each user, for example, each family member, to generate each different menu by generating the shortcut button, as illustrated in FIG. 63 .
  • the information processing system 100 according to the embodiment of the present disclosure may bind an account of each user with the menu that he or she generated in this way.
  • FIG. 69 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 69 illustrates a state in which the account of each user is bound up with the menu that he or she generated.
  • FIG. 69 illustrates windows 1150 of applications used by a father and a mother and menu buttons 1151 for manipulating the applications.
  • the information processing system 100 can store cookies of a web browser, record login information of a web page, or manage an access history of a web page for each user by binding the account of each user with the menu that he or she generated.
  • a form in which the information processing system 100 according to the embodiment of the present disclosure is simultaneously used by a plurality of users can be assumed. Accordingly, when the menu customized for each user is generated, as described above, a situation in which a certain user uses the menu of another user can occur. When the menu of the user is not locked, anyone can simply use the menu of the user.
  • the information processing system 100 supplies a structure in which the menu is not usable when authentication is not gained.
  • An authentication scheme may be a password scheme or may be a device authentication scheme using the portable terminal used by the user.
  • a structure in which access to the menu is authenticated in accordance with the device authentication scheme will be described.
  • FIG. 70 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 70 illustrates an example of a GUI when access to the menu is authenticated in accordance with the device authentication scheme using the portable terminal used by the user.
  • the information processing system 100 allows the menu to be locked with a key so that access to the menu used by a father is not permitted when authentication is not gained using the portable terminal used by the father.
  • the information processing system 100 executes control such that no response is made in the state in which the menu used by the father is locked with the key even when the user selects the menu (for example, even when the user selecting the menu is the father himself).
  • when the information processing system 100 detects that the portable terminal used by the father is placed near the menu used by the father, the information processing system 100 recognizes the portable terminal.
  • when the information processing system 100 recognizes that the portable terminal is the portable terminal of the father, the key to the menu used by the father is released.
  • the information processing system 100 may recognize the portable terminal through the above-described image recognition or may recognize the portable terminal through near field communication (NFC), Wi-Fi communication, Bluetooth (registered trademark) communication, or the like.
  • the information processing system 100 can restrict the access to the menu by an unauthenticated user.
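  • purely as an illustration (not the patent's implementation), the unlock decision described above can be expressed as follows; the menu and terminal identifiers are hypothetical, and how a nearby terminal is recognized (image recognition, NFC, Wi-Fi, or Bluetooth) is outside this sketch.

```python
# Minimal sketch of the device-authentication unlock described above.
# Identifiers are hypothetical placeholders.
REGISTERED_TERMINALS = {"father_menu": "terminal_father_001"}  # menu -> owner's terminal


def is_menu_unlocked(menu_id, nearby_terminal_ids):
    """The menu is unlocked only while the owner's terminal is recognized near it."""
    owner_terminal = REGISTERED_TERMINALS.get(menu_id)
    return owner_terminal is not None and owner_terminal in nearby_terminal_ids


def handle_menu_selection(menu_id, nearby_terminal_ids):
    # While locked, no response is made even if the selecting user is the owner.
    if not is_menu_unlocked(menu_id, nearby_terminal_ids):
        return "no_response"
    return "open_menu"


print(handle_menu_selection("father_menu", []))                       # no_response
print(handle_menu_selection("father_menu", ["terminal_father_001"]))  # open_menu
```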
  • the information processing system 100 can display the menu button using the portable terminal as the starting point, as described above.
  • the information processing system 100 according to the embodiment of the present disclosure supplies a structure for controlling authority over the portable terminal in accordance with the menu button displayed using the portable terminal as the starting point.
  • FIG. 71 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 71 illustrates an example of a GUI when authority over the portable terminal is controlled in accordance with the menu button at the time of display of the menu button using the portable terminal as the starting point.
  • FIG. 71 illustrates a state in which the menu button groups 1110 and 1120 are displayed using portable terminals 1310 and 1320 as starting points.
  • FIG. 71 illustrates a state in which authority over each of the portable terminals 1310 and 1320 is displayed in the menu button group 1120 .
  • the authority over the portable terminals 1310 and 1320 is, for example, manipulation authority for an application, manipulation authority for a device remotely manipulated from the portable terminal, and payment authority at the time of payment of a price using the portable terminal.
  • the user executes a manipulation of copying certain authority (for example, payment authority for a price corresponding to 1000 yen) from the menu button group 1120 displayed using the portable terminal 1310 as the starting point to the menu button group 1120 displayed using the portable terminal 1320 as the starting point.
  • the information processing system 100 executes a process of copying the authority maintained in the portable terminal 1310 to the portable terminal 1320 according to the user manipulation.
  • the information processing system 100 can transfer the authority in the portable terminal through a simple manipulation.
  • the information processing system 100 supplies a function of delivering data to an application based on a manipulation of dragging and dropping a menu button onto the window of the application.
  • FIG. 72 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 72 illustrates an example of a GUI when data is delivered to the application based on a manipulation of dragging and dropping a menu button onto the window of the application.
  • FIG. 72 illustrates a state in which there is a menu button corresponding to a bookmark for access to a web page in the menu button group 1120 .
  • the information processing system 100 controls an operation of the web browser 1140 so that the access to the web page corresponding to the dropped menu button is gained according to the user manipulation. In this way, by transferring data to the application according to the drag and drop manipulations by the user, the information processing system 100 according to the embodiment of the present disclosure can provide an intuitive manipulation to the user.
  • FIG. 72 illustrates an example in which the application is a web browser, but the application which is the data transfer target is not limited to this example.
  • for example, when the application is an image display application and the user drops a menu button indicating image data onto it, the information processing system 100 executes a process of displaying the image data with the image display application.
  • when the application is a music reproduction application and the user drops a menu button indicating music data onto it, the information processing system 100 executes a process of reproducing the music data with the music reproduction application.
  • when the application is a moving-image reproduction application and the user drops a menu button indicating moving-image data onto the moving-image reproduction application, the information processing system 100 executes a process of reproducing the moving-image data with the moving-image reproduction application.
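  • the data delivery by drag and drop described above can be sketched, for illustration only, as a dispatch from (application type, data type) pairs to handlers; the type names and handler actions below are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of delivering dropped data to an application window.
def deliver_drop(app_type, data_type, payload):
    handlers = {
        ("web_browser", "bookmark"): lambda url: "navigate to " + url,
        ("image_viewer", "image"): lambda path: "display " + path,
        ("music_player", "music"): lambda path: "play " + path,
        ("video_player", "video"): lambda path: "play " + path,
    }
    handler = handlers.get((app_type, data_type))
    # Unsupported combinations are simply ignored rather than raising an error.
    return handler(payload) if handler else "ignored"


print(deliver_drop("web_browser", "bookmark", "https://example.com"))  # navigate to ...
print(deliver_drop("music_player", "music", "song.mp3"))               # play song.mp3
```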
  • in FIG. 72 , the example in which the information processing system 100 transfers the data to the application according to the drop of the menu button on the window of the application has been described.
  • next, an example in which the information processing system 100 executes a function corresponding to a dropped menu button according to the drop of another menu button on a menu button supplying any function will be described.
  • FIG. 73 is an explanatory diagram illustrating an example of a GUI of an application displayed on the information display surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 73 illustrates an example of a GUI when the function corresponding to the dropped menu button is executed based on a manipulation of dragging and dropping another menu button onto the menu button supplying any function.
  • FIG. 73 illustrates a state in which there is a menu button for posting to, for example, a social networking service (SNS) in the menu button group 1120 .
  • the user is assumed to drop a menu button indicating, for example, image data on a menu button in the menu button group 1120 .
  • the information processing system 100 executes a process of posting the image data to the SNS according to the drop manipulation from the user.
  • the information processing system 100 can offer an intuitive manipulation to the user.
  • the function executed by each menu button of the menu button group 1120 in FIG. 73 is not limited to posting to the SNS.
  • various functions such as transmission of data to a partner with a registered contact address and transmission of data to a linked device can be considered as the function executed by each menu button of the menu button group 1120 .
  • hereinafter, specific examples of user interfaces (UIs) supplied by the information processing system 100 will be described.
  • in the projection type information processing system 100 a , a casing in which the projector and the camera are provided above the table 140 a is also referred to as a body.
  • the table 140 a is also referred to as a projection surface (display surface) to which an image is projected by the projector.
  • the information processing system 100 supplies a semicircular menu rotated according to the shape of a manipulation object.
  • when a menu is displayed regardless of the shape of a manipulation object, display of the menu may overlap a hand, and thus visibility deteriorates in some cases. Accordingly, the information processing system 100 according to the present specific example displays a menu in a region other than a region in which it would overlap a manipulation object.
  • FIGS. 74 to 76 are explanatory diagrams for describing a user interface according to specific example 1.
  • the information processing system 100 displays a menu 2002 in which icons (menu items) are disposed in a semicircular shape according to the shape of a finger with which the user touches a menu button 2001 displayed on the table 140 a functioning as a display surface.
  • the information processing system 100 displays the semicircular menu 2002 spreading right and left centering on the direction of the finger so that the icons do not overlap the finger.
  • the information processing system 100 causes the text display of an icon to directly face the user according to the direction of the finger.
  • a menu button 2003 in which items are disposed in a semicircular shape according to the shape of a finger is similarly displayed.
  • the detection unit 121 first detects a manipulation object overlapping the display surface.
  • the manipulation object may be a part of the body of the user such as a finger or a hand, may be any object such as a manipulation stick to be manipulated by the user, or may be a robot arm or the like.
  • the detection unit 121 detects the shape, an orientation of a longer side, an orientation of a shorter side, and a height of the manipulation object overlapping the display surface based on depth information obtained by a stereo camera.
  • the detection unit 121 may detect a direction in which the finger points. In the example illustrated in FIG. 74 , the detection unit 121 detects a finger or a hand touching the menu button 2001 or 2003 overlapping the table 140 a as a manipulation object and detects a direction in which the finger points.
  • the output control unit 122 controls the output unit 130 such that a menu with a circular shape in which a region overlapping the manipulation object detected by the detection unit 121 is omitted (a semicircular shape) is displayed on the display surface. For example, in the example illustrated in FIG. 74 , the output control unit 122 displays the menu 2002 in which no item is disposed in the region overlapping the finger touching the menu 2001 .
  • the output control unit 122 may increase or decrease at least one of the number of icons displayed or the display sizes of the icons according to the size of the region in which the manipulation object overlaps the display surface. For example, the output control unit 122 controls the output unit 130 such that the number of displayed icons increases or decreases according to the size of the hand touching the menu. Specifically, as illustrated in FIG. 75 , the output control unit 122 displays 9 icons when the user extends one finger and touches the menu button, and displays 7 icons when the user spreads his or her fingers without bending any fingers and touches the menu button.
  • the output control unit 122 may adjust the icon display up to the upper limit of the size and the number of icons which the user can touch without difficulty according to the thicknesses of the fingers. Specifically, when the fingers are slim, the output control unit 122 decreases the sizes of the icons and displays more icons. The output control unit 122 increases the sizes of the icons and reduces the number of icons when the fingers are thicker. The output control unit 122 may increase or decrease the radius of the circle in which the icons are disposed. For example, when the display sizes are fixed, the number of icons displayed increases or decreases according to an increase or decrease in the radius. When the number of icons displayed is fixed, the display sizes of the icons increase or decrease according to the increase or decrease in the radius.
  • since the output control unit 122 executes such display control such that the icons do not overlap the finger of the user, the user can easily comprehend the entire menu. Further, an erroneous operation caused by an icon whose display overlaps a finger being unintentionally touched with the finger can be avoided.
  • the output control unit 122 can display the icons utilizing the available area as much as possible by controlling the menu display in accordance with the size of the hand. The adjustment of the number of icons displayed can be particularly effective when the total number of icons exceeds the number that can be displayed, that is, when not all of the icons can be displayed.
  • the output control unit 122 may control the direction of the menu to be displayed according to the direction of the manipulation object. Specifically, the output control unit 122 may control the output unit such that the menu in which the items are disposed based on the orientation of the longer side of the manipulation object detected by the detection unit 121 is displayed on the display surface. For example, in the example illustrated in FIG. 74 , the output control unit 122 displays the semicircular menu 2002 spreading right and left centering on the direction of the finger based on the direction of the finger touching the menu button 2001 . Further, the output control unit 122 estimates the direction of the user to whom the finger belongs based on the direction of the finger touching the menu button 2001 and displays the menu 2002 so that the text display and the menu disposition directly face the user. By executing such display control, the output control unit 122 can supply a menu directly facing the user no matter from which direction the user touches the table 140 a .
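  • as a rough illustration of the layout just described, the following sketch places icons on the half of a circle that opens away from the detected finger; the coordinate system, angle convention, and parameter names are assumptions and not the disclosed implementation.

```python
# A minimal sketch: icons are spread over the semicircle opposite the finger so
# that none of them is drawn in the region overlapping the hand.
import math


def semicircular_layout(center, radius, finger_angle_deg, n_icons):
    """Return (x, y) icon positions fanned around the direction the finger points."""
    cx, cy = center
    start = finger_angle_deg - 90.0                      # leave the half toward the hand empty
    step = 180.0 / (n_icons - 1) if n_icons > 1 else 0.0
    positions = []
    for i in range(n_icons):
        a = math.radians(start + i * step)
        positions.append((cx + radius * math.cos(a), cy + radius * math.sin(a)))
    return positions


# A finger pointing "up" (90 degrees) yields icons fanned out above the button.
for x, y in semicircular_layout((0.0, 0.0), 60.0, 90.0, 7):
    print(round(x, 1), round(y, 1))
```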
  • the information processing system 100 can also identify an individual according to the detection result by the detection unit 121 and execute an individualized output according to a manipulation history of the identified individual or the like.
  • the information processing system 100 can specify the individual according to the thicknesses of fingers. Therefore, for example, even in a state in which the user logs in to the information processing system 100 used at home with a family sharing account, the individualized output can be output to the logged-in family members without orienting a camera toward the face of the user and identifying the user through face recognition.
  • since the information processing system 100 is of a projection type, the user can be supplied with an individualized output even without looking up.
  • the information processing system 100 can output the individualized output even in an environment, such as a bar, in which an unspecified large number of users execute touches. However, for example, when the user executes a touch while wearing gloves on a snowy mountain, a case in which the thicknesses of the fingers change even for the same individual is considered.
  • the present specific example is a form in which input and output are optimized for the user by estimating a direction in which the user is located from a direction in which a hand or a finger is observed when the positions of a camera, a microphone, a projector, and a speaker are known.
  • the information processing system 100 estimates the position of the user and executes optimized input and output at the position of the user.
  • FIGS. 77 to 85 are explanatory diagrams for describing a user interface according to specific example 2.
  • the detection unit 121 controls directivity of the microphone functioning as the input unit 110 and orients the directivity of the microphone toward the mouth of the user.
  • the detection unit 121 controls the directivity using a microphone array in which a plurality of microphones are combined.
  • FIG. 77 illustrates an example of directivity formed by a microphone array including four microphones.
  • the detection unit 121 can exclude everyday noise from directions outside of the directivity.
  • the microphones can be caused to have more narrowed directivity.
  • FIG. 78 illustrates a control example of the directivity according to the position of a finger touching the display surface.
  • the detection unit 121 estimates the position of the user according to the position of the finger of the user touching the display surface and orients the directivity of the microphone toward the estimated position of the user. Even when the user is far from the microphone, the detection unit 121 can acquire a clear user sound from which everyday noise is excluded by controlling the directivity of the microphone in this way, and thus can execute sound recognition, for example.
  • as illustrated in FIG. 78 , the information processing system 100 can acquire a clear user sound from which everyday noise is excluded even at a distance of about 100 cm.
  • the detection unit 121 first detects the manipulation object.
  • the detection unit 121 detects the shape, the direction of the longer side, the direction of the shorter side, and the height of the manipulation object overlapping the display surface.
  • the detection unit 121 may detect a direction pointed by the finger.
  • the detection unit 121 functions as an estimation unit that estimates a direction in which the user manipulating the manipulation object is located based on the detected manipulation object.
  • the detection unit 121 detects a hand or a finger of the user as a manipulation object and estimates the direction in which the user is located based on the position and direction of the hand or the finger.
  • a specific estimation method will be described.
  • the detection unit 121 estimates that the user is located in an arrow direction on a straight line connecting the position of an icon projected to the projection surface and the position of an icon touched with a finger.
  • This scheme is effective when a premise that the icon to be manipulated is near a hand of the user is established. For example, when the user can press and hold an icon and subsequently drag the icon to freely move the icon near his or her hand, this premise is considered to be established.
  • the detection unit 121 can estimate the direction of the user even when the detection of the direction of the finger fails.
  • the detection unit 121 estimates that the user is located in the arrow direction which is opposite to the direction of the finger touching the icon. This scheme is effective when the detection unit 121 successfully detects the direction of the finger.
  • the detection unit 121 estimates that the user is located between the right and left hands based on the positions and the directions of the hands placed on the table 140 a functioning as the projection surface and the user is located in the arrow direction. Specifically, the detection unit 121 estimates that an average direction of directions of two hands detected from the same side is the direction in which the user is located. This scheme is effective when the detection unit 121 successfully detects the directions of the hands.
  • the detection unit 121 estimates the direction in which the user is located from the directions of the fingers manipulating an application.
  • the detection unit 121 may estimate the current direction in which the user is located by estimating and storing the directions of the fingers manipulating a plurality of applications from the time of activation and integrating the stored directions of the fingers.
  • the detection unit 121 may estimate the direction in which the user is located by integrating and calculating the directions of the fingers manipulating another application in the directions of the fingers manipulating a certain application.
  • the detection unit 121 estimates the direction in which the user is located. Then, the detection unit 121 controls the input unit 110 such that an input having directivity in the estimated direction in which the user is located is executed. For example, in the example described above with reference to FIGS. 79 to 83 , the detection unit 121 controls the microphone such that the beamforming is executed to acquire a sound in the estimated direction in which the user is located. The detection unit 121 may execute the beamforming process on a sound output from each microphone. Accordingly, the detection unit 121 can acquire a clear user sound from which everyday noise is excluded. Additionally, the detection unit 121 may control the direction of a camera functioning as the input unit 110 to acquire an image in the estimated direction in which the user is located.
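  • purely as an illustration of orienting microphone directivity toward the estimated user direction, the following is a simplified delay-and-sum beamformer; the linear array geometry, sample rate, and steering convention are assumptions, and the actual system may use any beamforming method.

```python
# Simplified delay-and-sum beamforming sketch for a hypothetical linear microphone
# array; it merely illustrates steering directivity toward an estimated user angle.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
SAMPLE_RATE = 16000      # Hz


def delay_and_sum(signals, mic_positions_m, user_angle_rad):
    """signals: (n_mics, n_samples); mic_positions_m: per-mic positions along a line (m)."""
    # Plane-wave arrival delay of each microphone relative to the array origin.
    delays_s = mic_positions_m * np.cos(user_angle_rad) / SPEED_OF_SOUND
    delays_samples = np.round(delays_s * SAMPLE_RATE).astype(int)
    out = np.zeros(signals.shape[1])
    for sig, d in zip(signals, delays_samples):
        out += np.roll(sig, -d)      # align each channel toward the steering direction
    return out / len(signals)


# Four microphones 5 cm apart, steered toward a user estimated at 60 degrees.
mics = np.array([0.00, 0.05, 0.10, 0.15])
signals = np.random.randn(4, SAMPLE_RATE)
enhanced = delay_and_sum(signals, mics, np.deg2rad(60.0))
print(enhanced.shape)
```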
  • the output control unit 122 controls the output unit 130 such that an output having directivity in the direction which is estimated by the detection unit 121 and in which the user is located is executed.
  • the output control unit 122 controls a speaker functioning as the output unit 130 such that a channel is configured to output a sound in the direction in which the user is located.
  • the output control unit 122 may control the output unit 130 such that an image is output, for example, in the estimated direction in which the user is located so that the image directly faces the user.
  • FIG. 84 is a diagram illustrating a positional relation between the body and a user and a simple configuration example when the body of the projection type information processing system 100 is viewed from below.
  • speakers are formed in four corners of the body.
  • “R” indicates a right channel
  • “L” indicates a left channel.
  • in a normal case, the output control unit 122 controls the speakers such that sounds are output with a normal channel configuration.
  • when the position of the user is estimated, the output control unit 122 controls the speakers such that sounds are output in a channel configuration specialized for the estimated user position.
  • the information processing system 100 can reproduce content for which a channel configuration is designed on the assumption of use of a home television according to an intention of a content generator. Additionally, according to the present specific example, the information processing system 100 can also display the application such as a web browser so that the application directly faces the user after the sound recognition is completed.
  • a photo application will be described as an example of an application in which the information processing system 100 according to the present specific example is used.
  • the application outputs various visual effects and sound effects using position information and installation azimuth information regarding the body and the table 140 a when the positions of the speakers and the table 140 a viewed from the body are known.
  • the position information can be acquired by, for example, the Global Positioning System (GPS) and the installation azimuth information can be acquired by, for example, a geomagnetic sensor.
  • the application acquires a photographing position from exchangeable image file format (Exif) information incidental to a photo and estimates an azimuth of the photographing position viewed from the table 140 a .
  • the application displays an animation in which photos slide onto a projection surface in the estimated azimuth or produces a sound effect from a speaker at a corresponding position.
  • a sound image may be localized at the corresponding position.
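  • the azimuth estimation mentioned above can be illustrated with the standard initial-bearing calculation between two coordinates; the sample latitudes and longitudes below are hypothetical, and parsing of the Exif position is omitted.

```python
# Sketch of estimating the azimuth of a photographing position as seen from the
# table, using the standard initial-bearing formula; coordinates are hypothetical.
import math


def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 (the table) to point 2 (the photo), in degrees from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0


# Table in Tokyo, photo taken in Osaka: the photo would slide in from roughly the west.
print(round(bearing_deg(35.68, 139.69, 34.69, 135.50), 1))
```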
  • the application may be applied to, for example, conference rooms so that conference documents are mirrored to be shared on the table 140 a installed in a plurality of conference rooms at remote locations or dialogs may be realized between the conference rooms.
  • the photos displayed on the projection surface 140 a by the photo application can be manipulated and browsed simultaneously by many people in many directions.
  • the users at four sides of the projection surface 140 a can simultaneously select photos in many directions and move their positions while changing the directions, edit the photos, or add new photos.
  • the present specific example is a form in which the state of the projection surface and the state of an application during activation are managed and an illuminator (a light) is controlled as necessary.
  • a projected image may become unclear due to the brightness of the illuminator. For this reason, a person executes an action of turning off the illuminator in a room or turning off only the illuminator in the vicinity of the projection surface.
  • the information processing system 100 controls the illuminator such that an image projected by the projector is clearly displayed, and thus such effort by the person can be reduced.
  • the information processing system 100 according to the present specific example is of a projection type, and it is assumed that a controllable illumination unit is integrated with the body or that a separate illumination unit can be remotely adjusted.
  • the illumination unit is assumed to change a radiation range and a radiation direction.
  • FIGS. 86 to 93 are explanatory diagrams illustrating user interfaces according to specific example 3.
  • the information processing system 100 includes a camera 2011 , a projector 2012 that functions as a projection unit, and a plurality of illumination units 2010 arrayed in a matrix form.
  • the detection unit 121 detects the state of a projection surface based on an image captured by the camera 2011 oriented toward the projection surface and acquires state information obtained by detecting the state of the projection surface.
  • the state of the projection surface includes, for example, the brightness of the projection surface, a contrast ratio of an image projected to the projection surface, presence or absence of an object on the projection surface, the kind, disposition, and size of the object on the projection surface, and a combination of objects.
  • the output control unit 122 controls the illumination units 2010 such that the projection surface is irradiated with an amount of light (illumination intensity) according to the state detected by the detection unit 121 and an image projected to the projection surface by the projector 2012 .
  • the output control unit 122 controls the illumination unit 2010 such that the projection surface is irradiated with a large amount of light (brightly).
  • the output control unit 122 controls the illumination unit 2010 such that the projection surface is irradiated with a small amount of light (darkly).
  • the illuminator may be turned off so that an amount of light is zero.
  • the output control unit 122 may control the illumination unit 2010 such that the projection surface is irradiated with an amount of light by which a contrast ratio of the projected image is equal to or greater than a threshold value by increasing or decreasing an amount of light with reference to a detection result of the contrast ratio of the projected image by the detection unit 121 .
  • the output control unit 122 can cause brightness and visibility to be compatible.
  • the detection unit 121 detects the placed object as the state of the projection surface, and the output control unit 122 controls the illumination units 2010 such that the vicinity of the detected object is irradiated with a large amount of light and another portion used as the projection surface is irradiated with a small amount of light.
  • the output control unit 122 can control the illumination units 2010 based on a kind of object placed on the table 140 a . For example, when the object placed on the projection surface is detected by the detection unit 121 , the output control unit 122 may control the illumination units 2010 such that the projection surface is irradiated with an amount of light according to whether the detected object is an illuminant that emits light or a reflector (which does not emit light) reflecting light. Specifically, the output control unit 122 controls the illumination units 2010 such that the projection surface is irradiated with a small amount of light when the object is an illuminant and such that the projection surface is irradiated with a large amount of light when the object is a reflector.
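  • the illumination decision described in this specific example can be summarized, for illustration only, by the following rule; the threshold value, the discrete light levels, and the category names are assumptions.

```python
# Hedged sketch of the illumination control rule: emitters are lit dimly,
# reflectors brightly, and an on-going projection is dimmed until its contrast
# is sufficient. Thresholds and level names are illustrative assumptions.
def decide_light_amount(projecting, contrast_ratio, object_kind):
    """object_kind: None (no object), 'emitter' (e.g. smartphone), or 'reflector' (e.g. plate)."""
    if object_kind == "emitter":
        return "small"   # an emitting object stays detectable even on a dark surface
    if object_kind == "reflector":
        return "large"   # a reflector needs ambient light to be detected and tracked
    if projecting and contrast_ratio < 2.0:
        return "small"   # darken the surface until the projected image is legible
    return "large"


print(decide_light_amount(projecting=True, contrast_ratio=1.2, object_kind=None))         # small
print(decide_light_amount(projecting=False, contrast_ratio=0.0, object_kind="reflector"))  # large
```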
  • illumination control when a smartphone is placed as an illuminant and a plate is placed as a reflector on the table 140 a will be described with reference to FIGS. 88 and 89 .
  • the output control unit 122 first controls the illumination units 2010 such that the projection surface is irradiated with a small amount of light.
  • the detection unit 121 detects the object. In this case, in order for the detection unit to recognize the object, the output control unit 122 controls the illumination units 2010 such that the projection surface is irradiated with a large amount of light regardless of whether the object is an illuminant or a reflector. Subsequently, the detection unit 121 recognizes the object placed on the table 140 a , for example, by inquiring of a server about an image obtained by imaging the projection surface.
  • when the smartphone is recognized, the output control unit 122 controls the illumination units 2010 such that the projection surface is irradiated with the original small amount of light, as illustrated in the right drawing of FIG. 88 , since the display unit of the smartphone is a light-emitting illuminant. Since the smartphone is an illuminant, the detection unit 121 can continuously detect and track the smartphone even when the projection surface is darkened. Since the amount of light is small, the light radiated from the illumination units 2010 is prevented from being reflected (highlighted) on the display unit of the smartphone. Accordingly, the detection unit 121 can easily detect the smartphone.
  • when the plate is recognized, the output control unit 122 controls the illumination units 2010 such that the projection surface is irradiated with a large amount of light, as illustrated in the left drawing of FIG. 89 , since the plate is a reflector, not an illuminant. Accordingly, the detection unit 121 can continuously detect and track the plate.
  • the output control unit 122 may control the illumination units 2010 such that only a region in which the plate is located is bright and other regions are dark, as illustrated in the middle drawing of FIG. 89 . Accordingly, for example, the projected image is projected to be visible in the other regions.
  • as illustrated in the right drawing of FIG. 89 , the output control unit 122 may control the projector 2012 such that the illumination units 2010 are turned off and then a spotlight slightly larger than the shape of the object is shone.
  • the spotlight from the projector can be realized, for example, by projecting a bright white image to a predetermined region. Accordingly, the detection unit 121 can easily detect the plate and can continuously detect and track the plate when the plate is moved within the range of the spotlight.
  • the detection unit 121 may detect whether the object is an illuminant or a reflector with reference to an image captured by the camera 2011 while adjusting the amount of light by the illumination units 2010 . For example, the detection unit 121 may recognize that the object is an illuminant when the object can be detected even in a dark state, and may recognize that the object is a reflector when the object can first be detected in a bright state. According to such a scheme, the detection unit 121 can identify even an object unregistered in the server as an illuminant or a reflector without inquiring of the server.
  • the information processing system 100 can improve search precision (detection precision) of the object through control of an illumination area.
  • the detection unit 121 controls the illumination units 2010 such that the entire table 140 a is illuminated immediately after the search start of the object (marker).
  • the marker is assumed to be a reflector.
  • when the marker is detected, the detection unit 121 controls the illumination units 2010 such that the illumination range is gradually narrowed to a region occupied by the marker. Accordingly, the information processing system 100 can explicitly give the user feedback on whether the marker placed on the table 140 a is recognized and how much the presence range of the marker has been narrowed.
  • the detection unit 121 can detect and track the marker. At this time, the detection unit 121 may control the illumination units 2010 such that the illumination range is broadened and then narrowed again. As illustrated in the left drawing of FIG. 91 , when the marker deviates from the illumination range, the detection unit 121 can fail to detect the marker. In this case, as shown in the right drawing of FIG. 91 , the detection unit 121 may broaden the illumination range and search for the marker again.
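  • the coarse-to-fine illumination control during the marker search can be sketched as below; the detection callback, shrink factor, and iteration limit are assumptions introduced only to make the loop self-contained.

```python
# Sketch of narrowing the illumination range around a detected marker and
# broadening it again when the marker is lost. `detect_marker(region)` is a
# hypothetical callback returning the marker center when visible, else None.
def search_marker(detect_marker, table, shrink=0.7, min_size=5.0, max_steps=50):
    lit = table                              # (center_x, center_y, width, height)
    center = detect_marker(lit)
    if center is None:
        return None                          # the marker is not on the table at all
    for _ in range(max_steps):
        if lit[2] <= min_size:
            break
        narrowed = (center[0], center[1], lit[2] * shrink, lit[3] * shrink)
        found = detect_marker(narrowed)
        if found is None:                    # marker deviated from the lit region:
            lit = table                      # broaden the illumination and search again
            center = detect_marker(lit)
            if center is None:
                return None
            continue
        lit, center = narrowed, found        # keep narrowing around the marker
    return lit                               # final lit region approximates the marker position


# Static marker at (30, 40) on a 200 x 150 table, for illustration.
print(search_marker(lambda region: (30.0, 40.0), (100.0, 75.0, 200.0, 150.0)))
```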
  • the information processing system 100 can improve the search precision of the object through control of the amount of light.
  • the detection unit 121 adjusts the amount of light of the illumination units 2010 according to a material of the marker. For example, when the marker is formed of a glossy material such as glass or plastic and a highlight occurs, it may be difficult to detect the marker from the image captured by the camera 2011 . Therefore, the detection unit 121 controls the illumination units 2010 such that the projection surface is irradiated with a small amount of light by which the marker can be detected. On the other hand, for example, when the marker is formed of a glossless (matte) material such as cloth, paper, or wood, the marker can be easily detected when an environment is bright.
  • the detection unit 121 controls the illumination units 2010 such that the projection surface is irradiated with as large an amount of light as possible.
  • the detection unit 121 can determine the material of the marker, for example, with reference to information indicating the material of the marker registered in the server. As illustrated in FIG. 92 , the detection unit 121 can search for the marker while repeatedly changing the strength of illumination. In this case, the detection unit 121 can detect the marker of any of various materials under an illumination environment proper for the material without inquiring of the server.
  • the information processing system 100 can improve search precision of the object by linking the illumination units 2010 and the camera 2011 .
  • FIG. 93 illustrates an example in which the object is detected under a dim environment and time flows from the left side to the right side.
  • the detection unit 121 controls the illumination units 2010 such that the illumination units 2010 emit light in synchronization with a photographing interval of the camera 2011 . That is, the detection unit 121 causes the illumination units 2010 to function as a strobe light (electronic flash) attached to a general camera.
  • the camera 2011 can image reflected light of light emitted from the illumination units 2010 even in a dim environment, and thus the detection precision of the detection unit 121 is improved.
  • for example, imaging and light emission can be executed at intervals of 20 milliseconds. Since light is emitted at intervals which may not be recognized by a person, the object can be detected by the detection unit 121 even while the environment remains dim despite the momentary brightness.
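  • as an illustration of synchronizing the light emission with the imaging interval, a sketch follows; `flash()` and `capture()` stand in for hypothetical device calls, and the 20 millisecond interval is taken from the example above.

```python
# Sketch of firing the illumination only during each exposure so the room stays
# dim; the device callbacks are hypothetical placeholders.
import time


def capture_with_strobe(flash, capture, n_frames, interval_s=0.020):
    frames = []
    for _ in range(n_frames):
        flash(True)               # emit light only for the duration of the exposure
        frames.append(capture())
        flash(False)
        time.sleep(interval_s)    # brief flashes are hard for a person to notice
    return frames


# Dummy callbacks for illustration.
print(capture_with_strobe(lambda on: None, lambda: "frame", n_frames=3))
```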
  • the information processing system 100 can project a clear image to the projection surface by adjusting the illumination intensity and the illumination range by the illumination units according to the state of the projection surface.
  • the information processing system 100 can suppress an influence on an environment in which an entire room unintentionally becomes dark or bright by adjusting the illumination range so that a necessary spot is irradiated.
  • the information processing system 100 can improve recognition precision of the object placed on the table 140 a by adjusting the illumination intensity and the illumination range of the illumination units.
  • the present specific example is a form in which an excess of the number of recognizable manipulation objects is fed back. For example, when there is no feedback even when the table 140 a is touched with a finger, the user may not discern whether the touch has failed to be recognized, whether a UI has failed to respond despite the touch being recognized, or whether he or she has failed to execute a manipulation.
  • when the number of manipulation objects exceeds the recognizable upper limit, the information processing system 100 fails to detect a touch corresponding to the excess, and thus it is difficult to give the user feedback.
  • the information processing system 100 defines a number obtained by subtracting 1 from a computationally recognizable upper limit of the manipulation object as a recognizable upper limit based on specifications.
  • the computationally recognizable upper limit means an upper limit of manipulation objects which can be detected by the detection unit 121 . That is, one buffer is provided and the recognizable upper limit based on specifications is defined. Of course, the number of buffers may be any number other than 1.
  • the detection unit 121 detects a touched manipulation object on the table 140 a .
  • FIG. 94 is an explanatory diagram illustrating a user interface according to specific example 4.
  • FIG. 94 illustrates a feedback example given by the information processing system 100 when the computationally recognizable upper limit of the manipulation objects is 4.
  • the recognizable upper limit based on specifications is 3, which is obtained by subtracting 1 from 4. Therefore, as illustrated in FIG. 94 , when the number of fingers executing a touch manipulation is 1 to 3, the output control unit 122 controls the output unit 130 such that a sound effect indicating that touch detection is successful is output. As illustrated in FIG. 94 , when the number of fingers executing a touch manipulation becomes 4, 4 exceeds 3, which is the recognizable upper limit based on specifications. Therefore, the output control unit 122 controls the output unit 130 such that a sound effect indicating that the touch detection has failed is output.
  • the output control unit 122 may give feedback indicating that a finger is recognizable or unrecognizable, for example, at a timing at which the finger enters a view angle of the camera before the finger touches the table. Further, the output control unit 122 may give, for example, feedback indicating that a touch may not be detected when hands are clasped and the touch may not be available. In addition to the recognizable upper limit of the detection unit 121 , the output control unit 122 may give, for example, feedback according to a recognizable upper limit defined in an application in which fingers are used one by one in a two-player game.
  • when the recognizable upper limit is 4 and the number of fingers touching the table is 6, two fingers are unrecognizable; in this case, feedback is given in preference to the top left side for scanning convenience.
  • when a slot within the recognizable upper limit becomes free again, recognition can likewise become available in preference to the top left side.
  • the preferential position is not limited to the top left side, but any position may be preferred according to product design.
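  • the feedback rule of this specific example can be expressed, for illustration, as follows; the sound-effect names are placeholders, and the limits follow the numbers used in the example above.

```python
# Sketch of the recognizable-upper-limit feedback: one detection slot is kept in
# reserve so that the touch exceeding the usable limit can still be detected and
# answered with a failure sound. Sound names are placeholders.
COMPUTATIONAL_LIMIT = 4
SPEC_LIMIT = COMPUTATIONAL_LIMIT - 1     # recognizable upper limit based on specifications


def feedback_for_touches(n_touches):
    sounds = []
    for i in range(1, n_touches + 1):
        if i <= SPEC_LIMIT:
            sounds.append("success_tone")   # touch is accepted
        elif i <= COMPUTATIONAL_LIMIT:
            sounds.append("failure_tone")   # still detected, but over the usable limit
        else:
            sounds.append("no_feedback")    # beyond detection: nothing can be reported
    return sounds


print(feedback_for_touches(5))
# ['success_tone', 'success_tone', 'success_tone', 'failure_tone', 'no_feedback']
```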
  • the present specific example is a form in which a manipulation mode is changed according to a hand with which no manipulation is executed.
  • the user can manipulate an image, text, an application, or the like projected to the table 140 a with his or her finger.
  • the information processing system 100 changes the manipulation mode based on a recognition result of a hand with which no manipulation is executed.
  • the detection unit 121 detects one pair of hands of the user. For example, two hands detected on the same side are detected as the one pair of hands by the detection unit 121 .
  • the output control unit 122 controls the output unit 130 such that an output is executed to cause one hand belonging to the one pair of hands detected by the detection unit 121 to function as an action point.
  • the output control unit 122 expresses an interaction of a scroll or the like according to a touched position by causing the right hand touching the table 140 a with a finger to function as an action point to manipulate an application projected to the table 140 a .
  • the output control unit 122 controls the output unit 130 such that an output is executed to cause the other hand to function as a switcher which switches the classification of an action at the action point according to the shape of that other hand.
  • the output control unit 122 switches the manipulation mode of a manipulation by the right hand according to the shape of the left hand.
  • FIGS. 95 to 97 are explanatory diagrams illustrating a user interface according to specific example 5.
  • the detection unit 121 recognizes the shape of the left hand based on a captured image and the output control unit 122 switches the manipulation mode to “paperweight mode.”
  • the output control unit 122 controls the output unit 130 such that text is drawn at a point touched by the right hand.
  • the detection unit 121 recognizes the shape of the left hand based on a captured image and the output control unit 122 switches the manipulation mode to “scissors mode.”
  • the output control unit 122 controls the output unit 130 such that an expression in which a projected image is cut out is executed at a point touched by the right hand.
  • the detection unit 121 recognizes the shape of the left hand based on a captured image and the output control unit 122 switches the manipulation mode to “normal mode.”
  • the output control unit 122 controls the output unit 130 such that a normal manipulation such as tapping or dragging is executed at a point touched by the right hand.
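  • a minimal sketch of the switcher behavior follows; because the text does not state which left-hand shape selects which mode, the shape-to-mode mapping below is an explicit placeholder.

```python
# Hedged sketch of switching the manipulation mode with the non-manipulating hand.
from typing import Optional

HAND_SHAPE_TO_MODE = {
    "shape_a": "paperweight",   # text is drawn at the touched point
    "shape_b": "scissors",      # the projected image is cut out at the touched point
    "shape_c": "normal",        # ordinary tap/drag manipulation
}


class ModeSwitcher:
    def __init__(self):
        self.mode = "normal"

    def update(self, left_hand_shape: Optional[str]) -> str:
        # The current mode is kept when the switcher hand leaves the table (shape is None).
        if left_hand_shape in HAND_SHAPE_TO_MODE:
            self.mode = HAND_SHAPE_TO_MODE[left_hand_shape]
        return self.mode


switcher = ModeSwitcher()
print(switcher.update("shape_b"))   # scissors
print(switcher.update(None))        # scissors (maintained while the left hand is away)
```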
  • as illustrated in FIG. 96 , the output control unit 122 may control the output unit 130 so that a different menu is output according to the shape of the hand detected by the detection unit 121 .
  • the output control unit 122 may control the output unit 130 such that a file list of different media is output according to the shape of the hand detected by the detection unit 121 .
  • as illustrated in FIG. 97 , when the left hand is in the shape of the rock of the rock-paper-scissors game, a list of music files is output.
  • when the left hand is in a different shape, a list of web files is output.
  • the output control unit 122 may explicitly give the user feedback on the manipulation mode by projecting display indicating the current manipulation mode to one of the right hand functioning as the action point and the left hand functioning as the switcher. For example, the output control unit 122 controls the output unit 130 such that a scissors mark is projected to a fingernail or the back of the hand when the manipulation mode is the scissors mode.
  • the output control unit 122 may switch the classification of the action at the action point according to the shape of the right hand functioning as the action point. For example, the output control unit 122 may control the output unit 130 such that a fine line is drawn when one finger of the right hand is spread and a thick line is drawn when two fingers of the right hand are spread.
  • the output control unit 122 may maintain the manipulation mode even when the left hand functioning as the switcher is taken off the table 140 a and the recognition by the detection unit 121 fails. For example, when the left hand of the user is in the scissors shape and the manipulation mode is switched to the scissors mode, the output control unit 122 may maintain the scissors mode even if the user subsequently pulls back his or her left hand.
  • the user can switch the manipulation mode with the hand with which no manipulation is executed. Therefore, the user can seamlessly switch the manipulation mode without interruption of his or her current task, and thus continuous work is possible. Since the user can intuitively switch the manipulation mode, a learning cost related to the switching of the manipulation mode is low.
  • the present specific example is a form in which constituent elements such as a camera and a projector are formed in units of modules and replacement is possible for each module according to necessity by enabling connection by a standardized interface.
  • when the information processing system 100 is formed as an integrated product, methods of extending functions other than replacing the entire information processing system 100 may become difficult. Accordingly, in the information processing system 100 according to the present specific example, constituent elements can be modularized and module units can be exchanged.
  • a CPU, a camera, a projector, an LED light, a microphone, a speaker, and the like included in the information processing system 100 are stored in standardized modules.
  • Such constituent elements may be individually stored or a plurality of constituent elements may be combined and stored in one module.
  • a module storing the CPU, the projector, and the camera may be comprehended as a core module and a module storing the other constituent elements may be comprehended as a sub-module.
  • Mutual communication and power feeding can be achieved by connecting the modules via a common interface and all of the connected modules can function as the information processing system 100 . It is also possible for only the core module to function as the information processing system 100 .
  • the interface may be realized through wireless communication, may be realized through wired communication, or may be connected physically by terminals.
  • FIGS. 98 and 99 are explanatory diagrams illustrating user interfaces according to specific example 6.
  • FIG. 98 illustrates a simple configuration example of the body of the information processing system 100 according to the present specific example.
  • a module storing the projector (Projector) is disposed in the lowest layer
  • a module storing a control substrate (MAIN PCB) on which the CPU is mounted is disposed in the middle layer
  • a module storing a speaker (Speaker) is disposed in the highest layer in a cylindrical container.
  • stereo cameras (Stereo Camera) connected to the control substrate are disposed on the right and left sides of the cylindrical container.
  • Such modules may be connected by, for example, common terminals.
  • when there is a gap between the module storing the speaker and the module storing the control substrate, an improvement in sound quality and a heat radiation effect are expected. It is possible to realize the projection type information processing system 100 by connecting the speaker side to a ceiling and orienting the projector side toward the floor.
  • the modules may be disposed to overlap in the vertical direction, as illustrated in FIG. 98 , may be arranged in a straight line in the horizontal direction, may be arranged on a flat surface, or may be disposed to be individually separated at any position.
  • sub-modules of an illuminator and a speaker may be connected arbitrarily.
  • the information processing system 100 may be formed as an illumination device with a shape suspended from a ceiling, may be formed as a floor lamp type illumination device, or may be formed as a desk lamp type illumination device.
  • the core module may recognize a positional relation between the core module and the illuminators by specifying light-emitting positions through image recognition while causing the illuminators to sequentially emit light. Accordingly, the core module can cause the illuminators provided at proper positions to selectively emit light according to the state of the projection surface or the like. Additionally, the core module may notify the user that an expiration date is approaching by recording an installation date of the illuminators and projecting a message, “These lights will soon expire,” for example. When a speaker with a broad range is fitted, the core module may output a sound mainly using this speaker and may use another speaker for balance adjustment.
  • since each module can be replaced individually rather than replacing the entire product, the replacement cost of the product is suppressed and resources are saved. According to the present specific example, it is possible to easily realize the extension of a function by replacing a module. For example, the user can improve performance such as the processing capability of the CPU, the resolution of the camera, and the recognition precision by substituting the core module. The user can enjoy a design variation of the abundantly developed speakers and illuminators, for example, by substituting a sub-module.
  • the present specific example is a form in which display of screens is synchronized when a plurality of screens of the same application are displayed.
  • applications are displayed in a single direction.
  • the information processing system 100 displays a plurality of screens of the same application and switches between synchronization (mirroring) and non-synchronization (releasing of the mirroring) of the screens as necessary.
  • the detection unit 121 detects a manipulation object.
  • the output control unit 122 controls the output unit 130 such that at least two screens are displayed on the display surface according to the manipulation object detected by the detection unit 121 .
  • the output control unit 122 displays the screens of the same application based on the directions of fingers detected by the detection unit 121 so that the screens directly face the plurality of users surrounding the table 140 a .
  • the output control unit 122 controls the output unit 130 such that display is executed to similarly reflect a manipulation on one screen with the manipulation object detected by the detection unit 121 on the other screen. For example, when one user scrolls the screen with his or her finger, the screen displayed for the other user is scrolled similarly.
  • FIGS. 100 to 103 are explanatory diagrams illustrating user interfaces according to specific example 7.
  • FIG. 100 illustrates a normal state in which a web browser is displayed on one screen.
  • FIG. 101 illustrates a synchronous display state in which web browsers are synchronously displayed on two screens.
  • the output control unit 122 reflects a manipulation on one screen on the other screen while the screens are synchronized. For example, as illustrated in FIG. 101 , when one user clicks the screen to transition the screen, the output control unit 122 controls the output unit 130 such that the screen transitions similarly so that the same spot is clicked on the other screen.
  • FIG. 101 illustrates an example in which a display surface is divided into two upper and lower surfaces for synchronous display. As illustrated in FIG. 102 , the output control unit 122 may control the output unit 130 such that the display surface is divided into two right and left surfaces that are synchronously displayed.
  • the output control unit 122 may control the output unit 130 such that display in which each user individually executes a scroll manipulation on his or her screen is executed.
  • the output control unit 122 unifies the displays of another screen (slave) in accordance with the display of one of the screens serving as a master.
  • the output control unit 122 can, for example, synchronously display text entry, input of a marker in a map application, etc. on all screens. Additionally, for example, when a plurality of users browse a certain entire web page, the output control unit 122 may display the positions of regions displayed by other users in rectangular forms or may display the directions of the regions with arrows.
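  • for illustration only, the synchronization behavior can be sketched as mirroring manipulation events across screens while synchronized and unifying slaves with the master when rejoining; the event representation and class names are assumptions.

```python
# Sketch of master/slave screen synchronization: while synchronized, an event on
# any screen is mirrored to every screen; rejoining unifies slaves with the master.
class SyncedScreens:
    def __init__(self, master, slaves):
        self.master = master
        self.screens = {master: []}
        self.screens.update({s: [] for s in slaves})
        self.synchronized = True

    def apply(self, screen_id, event):
        if self.synchronized:
            for log in self.screens.values():       # mirror the manipulation everywhere
                log.append(event)
        else:
            self.screens[screen_id].append(event)   # asynchronous: only this screen reacts

    def rejoin(self):
        self.synchronized = True
        for sid in self.screens:                    # slaves are unified with the master's display
            if sid != self.master:
                self.screens[sid] = list(self.screens[self.master])


screens = SyncedScreens("master", ["slave"])
screens.apply("slave", "scroll_down")
print(screens.screens["master"])   # ['scroll_down'] mirrored from the slave's manipulation
```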
  • FIGS. 100 to 103 illustrate examples in which the screens are displayed in contact states, but the screens may each be separated.
  • the output control unit 122 may divide (branch) one screen into two screens or may unify (join) two screens into one screen.
  • the output control unit 122 may display the plurality of branched screens synchronously or asynchronously.
  • the output control unit 122 may rejoin the plurality of branched screens. In this case, the output control unit 122 displays one screen serving as a master as the joined screen.
  • the output control unit 122 may set the screen first selected to be joined as a master and set the other screen as a slave. At this time, the output control unit 122 may display a dialog “Would you like to join?” on another screen and set a screen on which the user agrees to join as a slave.
  • the output control unit 122 may display a screen displayed originally by a slave as the screen of the slave. For example, an example in which the output control unit 122 synchronizes a master displaying web page “A” with a slave displaying web page “B” to display one web page “A” is assumed. Thereafter, when the web pages are branched again, the output control unit 122 may cause the master to display web page “A” and cause the slave to display web page “B.”
  • the output control unit 122 may control the output unit 130 such that the branching and the joining are executed by a user's selection of a menu item detected by the detection unit 121 . Additionally, the output control unit 122 may execute the branching when the detection unit 121 detects an operation of dragging and moving fingers touching the one screen right and left. Additionally, the output control unit 122 may execute the branching when the detection unit 121 detects an operation in which two users touch the screen and pull the one screen apart to the right and left as if cutting it. In contrast, the output control unit 122 may execute the joining when the detection unit 121 detects an operation of moving fingers touching two screens so that the fingers overlap.
  • the detection unit 121 may distinguish a manipulation indicating the branching and the joining from manipulations such as pinch-in and pinch-out according to the number of fingers or the like.
  • the output control unit 122 may permit only the master to decide the joining or non-joining, may permit only the slave to decide the joining, or may permit all of the screens including the slave to decide the joining.
  • Since a plurality of screens can be displayed in directions according to the positions of the users, it is possible to realize high visibility from different directions.
  • Since the plurality of screens can be switched between synchronous and asynchronous display as necessary, it is possible to realize extemporaneous display and manipulations according to the state of an application.
  • When the screens are synchronized, a manipulation by another person is fed back, so the user can easily recognize which manipulation the other person executes and how the application operates.
  • the present specific example is a form in which a subject on the table 140 a is recorded and is reproduced with an original size.
  • the subject on the table 140 a is, for example, an object such as a picture or a photo placed on the table 140 a , or an image projected to the table 140 a .
  • the information processing system 100 images a subject on the table 140 a at a certain time point and causes a projector (projection unit) to project the captured image so that the subject is subsequently displayed with a real size on the table 140 a .
  • the information processing system 100 stores a projection distance between the projector and the table 140 a and a projection view angle of the projector at a recording time point and changes (calibrates) a projection size according to the projection distance and the projection view angle at a reproduction time point.
  • the control unit 120 executes an adjustment process of matching the imaged size obtained by the camera and the projected size output by the projector. Specifically, the control unit 120 functions as an adjustment unit executing adjustment so that the projected size of the subject matches the real size of the subject when an image obtained by imaging the subject on the table 140 a by the camera is projected to the table 140 a by the projector.
  • the control unit 120 executes, for example, position alignment of 4 points on the table 140 a and executes homography conversion as the adjustment process.
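  • As an illustration of the adjustment process above, the following is a minimal sketch of the 4-point alignment and homography conversion, assuming OpenCV; the point coordinates, resolution, and function names are hypothetical and not taken from the specification.

```python
import numpy as np
import cv2

# Four reference points of the table 140a as they appear in the camera image (pixels).
camera_pts = np.float32([[102, 87], [1180, 95], [1175, 689], [98, 680]])
# The same four points expressed in projector coordinates (pixels of the projected frame).
projector_pts = np.float32([[0, 0], [1280, 0], [1280, 720], [0, 720]])

# Homography mapping camera coordinates to projector coordinates (the "homography conversion").
H = cv2.getPerspectiveTransform(camera_pts, projector_pts)

def to_projector(captured_image, size=(1280, 720)):
    """Warp a captured image of the subject so that it can be re-projected
    at the same position and with the same size on the table."""
    return cv2.warpPerspective(captured_image, H, size)
```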
  • the information processing system 100 can capture a picture placed on the table 140 a and project the picture with the same size at a later date.
  • When the settings of the projector, such as the projection distance or the projection view angle, are changed, the projected size of the image projected by the projector also changes. Accordingly, the information processing system 100 realizes projection of the subject with the original size by storing the captured image and the setting information in association with each other and adjusting the projected size according to a change in the setting information.
  • the control unit 120 first stores a captured image obtained by the camera (first imaging unit) imaging the subject on the table 140 a (first projection surface) and the setting information of the projector.
  • the control unit 120 may function as a storage unit that stores the setting information.
  • the setting information is information that includes information indicating a projection distance which is a distance between the projector and the projection surface.
  • the projection distance may be information indicating a distance between the projector and the subject.
  • the setting information may further include information indicating a projection view angle which is a view angle of the projector.
  • FIGS. 104 to 110 are explanatory diagrams illustrating user interfaces according to specific example 8.
  • the detection unit 121 controls the input unit 110 such that the placed picture is captured to acquire a captured image.
  • the control unit 120 stores a projection distance of 100 cm and a projection view angle of 60° as setting information in addition to the acquired captured image.
  • the picture is projected with the same size as the real size when the projector projects the captured image.
  • the output control unit 122 compares the stored setting information to the setting information after the change and controls the projector such that the expansion or reduction display is executed. For example, as illustrated in the middle drawing of FIG. 104 , when the projection distance is changed from 100 cm to 50 cm, the output control unit 122 controls the projector such that the picture is expanded to twice the size of the normal size to be projected. Accordingly, the picture is reproduced with the real size.
  • the output control unit 122 compares the stored setting information to the setting information after the change and controls the projector such that expansion or reduction display is executed. For example, as illustrated in the lower drawing of FIG. 104 , when the projection view angle is changed from 60° to 120°, the output control unit 122 controls the projector such that the picture is reduced to 0.5 times the normal size to be projected. Accordingly, the picture is reproduced with the real size.
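  • The rescaling in the two examples above follows a simple proportional rule (100 cm to 50 cm gives 2 times, 60° to 120° gives 0.5 times). A minimal sketch of that rule is shown below; the function name is hypothetical, and a stricter geometric model would use tan(view angle / 2) instead of the angle itself.

```python
def reproduction_scale(stored_distance_cm, current_distance_cm,
                       stored_view_angle_deg, current_view_angle_deg):
    """Digital scale factor applied to the stored captured image so that the
    subject keeps its real size, following the proportional rule of the
    examples above."""
    return ((stored_distance_cm / current_distance_cm)
            * (stored_view_angle_deg / current_view_angle_deg))

# Projection distance changed from 100 cm to 50 cm -> expand to twice the size.
print(reproduction_scale(100, 50, 60, 60))    # 2.0
# Projection view angle changed from 60 degrees to 120 degrees -> reduce to 0.5 times.
print(reproduction_scale(100, 100, 60, 120))  # 0.5
```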
  • In FIG. 104 , the example in which the setting of the projector is changed in the same home (the same information processing system 100 ) has been described. However, even when an image is transmitted to a different home, the same calibration may be executed. A change in the environment can be absorbed through the calibration.
  • the information processing system 100 may provoke the calibration function at a timing at which the user presses a switch on the table.
  • the information processing system 100 may provoke the calibration function at a timing at which the table 140 a is changed due to cleaning, rearrangement, or the like.
  • the information processing system 100 may automatically provoke the calibration function.
  • the information processing system 100 may provoke the calibration function at a timing at which a height (projection distance) is changed by an elevation function.
  • the elevation function is realized by, for example, an elevation device operated by a motor.
  • the information processing system 100 may automatically adjust the height using the elevation function so that a distance at which the projection surface can be used as broadly as possible is obtained.
  • the output control unit 122 controls the projector such that a predetermined pattern is projected and the detection unit 121 controls the camera such that a projected image is captured.
  • the control unit 120 adjusts the height using the elevation function so that a height is achieved at which the projected predetermined pattern entirely falls on the table 140 a in the captured image.
  • When the projected pattern protrudes from the table 140 a , it can be determined that the distance to the projection surface is too far.
  • the control unit 120 adjusts the height so that an optimum distance in which the projected pattern is pictured across the entire area of the table 140 a is set.
  • the control unit 120 may realize more highly reliable adjustment by measuring a distance from the size of the pattern included in the captured image in conjunction with a projection distance acquired from a depth sensor (stereo camera).
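  • A minimal sketch of this height adjustment is shown below. It assumes that the captured image has already been analyzed into two flags (the pattern fills the table / the pattern protrudes from the table) and that the pattern-based and depth-based distance estimates are fused by a simple average; the step size, the weighting, and the function names are assumptions.

```python
def height_adjustment_step(pattern_fills_table, pattern_protrudes, step_cm=1.0):
    """One step of the elevation adjustment: return a signed height
    correction in centimeters (0 means the optimum distance is reached)."""
    if pattern_protrudes:
        return -step_cm   # the distance is too far; move closer to the projection surface
    if not pattern_fills_table:
        return +step_cm   # the pattern does not cover the table; move farther away
    return 0.0

def fused_projection_distance(distance_from_pattern_cm, distance_from_depth_cm):
    """Combine the distance estimated from the size of the projected pattern in
    the captured image with the distance reported by the depth sensor."""
    return 0.5 * (distance_from_pattern_cm + distance_from_depth_cm)
```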
  • the information processing system 100 may execute the calibration during elevation of the elevation function.
  • the control unit 120 may control the projector such that a projection size is constant during elevation.
  • the control unit 120 may control the projector such that an image is projected with the original projection size after completion of the elevation.
  • the information processing system 100 can execute the elevation while adjusting the projection size based on a change in the projection distance during the elevation, so that the user can execute a manipulation and browsing even during the elevation.
  • the device such as the stereo camera or the projector included in the information processing system 100 is suspended from a ceiling.
  • the device included in the information processing system 100 can be moved to any location by the user.
  • When the device is moved, vibration occurs, and the time necessary for the vibration to converge largely depends on the material of the shaft from which the device is suspended.
  • Exterior examples of a steel shaft, a carbon fiber reinforced plastic (FRP) shaft, and carbon shafts containing power lines, viewed from the side, are illustrated in FIG. 110 .
  • reference numeral 1410 denotes the exterior example of the steel shaft
  • reference numeral 1420 denotes the exterior example of the carbon FRP shaft
  • reference numeral 1430 denotes the exterior examples of the carbon shafts containing power lines.
  • Two carbon shafts containing power lines can be configured as one pair of carbon shafts.
  • A convergence time that posed no problem in actual use was confirmed when the device included in the information processing system 100 was suspended using the carbon shaft. Further, by using two carbon shafts containing power lines, a convergence time that posed no problem in actual use was confirmed even with a material having a small outer diameter.
  • As a fiber used for the material of the shafts, for example, glass, aramid, boron, bamboo, hemp, polyethylene terephthalate (PET), polyethylene (PE), or polypropylene (PP) can be used in addition to carbon.
  • the present specific example is a form in which an application activation location is automatically selected.
  • When an application is activated at a predetermined location as normal, an object placed on the table 140 a may be an obstacle, and thus it may be difficult to display the entire application screen. Even when the application screen is intended to be moved, an object placed on the table 140 a may be an obstacle, and thus it may be difficult to move the application screen.
  • the information processing system 100 recognizes an object on the table 140 a at the time of activation of an application, searches for a position satisfying constraint conditions (display conditions) set for each application, and displays an application screen.
  • the detection unit 121 first detects an object on the table 140 a by acquiring depth information. Then, the output control unit 122 controls the output unit 130 such that an image is displayed in a region other than a region overlapping the object detected by the detection unit 121 . Accordingly, an application screen is displayed in a region in which there is no object.
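  • One possible reading of this placement search is sketched below: candidate positions on the projection surface are scanned, positions overlapping a detected object or smaller than the required screen size are rejected, and the remaining candidate closest to the estimated user position is chosen. The grid step, the occupancy representation, and all names are assumptions.

```python
import itertools

def find_display_position(occupancy, app_w, app_h, user_pos, step=40):
    """Search for a top-left corner (x, y) of an (app_w x app_h) region that
    does not overlap any detected object and is closest to the user.

    occupancy: 2D boolean grid built from depth information, True where an
               object was detected, indexed as occupancy[y][x].
    Returns the best (x, y) or None when no free region exists."""
    height, width = len(occupancy), len(occupancy[0])
    best, best_dist = None, float("inf")
    for y, x in itertools.product(range(0, height - app_h, step),
                                  range(0, width - app_w, step)):
        # Reject candidates that overlap a detected object (sampled at the grid step).
        if any(occupancy[yy][xx]
               for yy in range(y, y + app_h, step)
               for xx in range(x, x + app_w, step)):
            continue
        cx, cy = x + app_w / 2, y + app_h / 2
        dist = (cx - user_pos[0]) ** 2 + (cy - user_pos[1]) ** 2
        if dist < best_dist:
            best, best_dist = (x, y), dist
    return best
```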
  • FIGS. 111 to 114 are explanatory diagrams illustrating user interfaces according to specific example 9.
  • a web browser 2021 and a music player 2022 are displayed in regions other than regions overlapping objects on the table 140 a .
  • the output control unit 122 controls the output unit 130 such that the applications are displayed on a flat surface on which a minimum size defined for each application is satisfied.
  • the detection unit 121 functions as an estimation unit that estimates the position of the user based on the position and the direction of a hand or a finger.
  • the output control unit 122 controls the output unit 130 such that an image is displayed at a position corresponding to the estimated position of the user according to display conditions set as a relation with the user. For example, when displaying the image near the position of the user is set as a display condition, the output control unit 122 controls the output unit 130 such that the application screen is displayed near the position of the user.
  • the display conditions regarding the display position may be comprehended as setting of the weight working on the application screen.
  • the display conditions of the application illustrated in FIG. 111 are assumed to be as follows.
  • the size of the projection surface is assumed to be 1920×1080.
  • the output control unit 122 displays the web browser 2021 and the music player 2022 at positions close to the user and on flat surfaces satisfying the minimum sizes, as illustrated in FIG. 111 .
  • the output control unit 122 may display an image according to the positional relation of the moved object on the table 140 a at a position at which the display conditions are more matched. For example, in the example illustrated in FIG. 111 , when the object on the table 140 a is moved, the output control unit 122 may search for a position closer to the user and move the application screen.
  • the output control unit 122 may control the output unit 130 such that an image is displayed at a position according to the display conditions set as a relation with an object on the table 140 a . For example, when the image is set such that the image is displayed adjacent to an end (edge) of the object on the table 140 a , the output control unit 122 controls the output unit 130 such that an application screen is displayed adjacent to an object detected by the detection unit 121 .
  • the display conditions of the application illustrated in FIG. 112 are assumed to be as follows.
  • the output control unit 122 can express a stream flow by arranging and displaying the brook application 2023 along the edge to weave between objects on the table 140 a.
  • the output control unit 122 may control the output unit 130 such that an image is displayed according to display conditions set as a relation with the table 140 a (projection surface).
  • the display conditions of an application illustrated in FIG. 113 are assumed to be as follows.
  • the output control unit 122 displays the candle application 2024 in the middle of the table 140 a.
  • the output control unit 122 may directly project an image at a default position without avoiding objects. As illustrated in the middle drawing of FIG. 114 , the output control unit 122 may transmit regions overlapping the objects while directly projecting images at the default positions. As illustrated in the lower drawing of FIG. 114 , the output control unit 122 may execute warning display indicating an obstructive object. The warning display may be, for example, a message prompting the user to move the object. Additionally, the detection unit 121 may detect that the user has finished eating according to a comparison result obtained by comparing a captured image at the time of food supply to a current captured image. Then, the output control unit 122 may display a message prompting the user to preferentially remove finished plates as warning display.
  • the warning display may be, for example, a message prompting the user to move the object.
  • According to the present specific example, by automatically detecting a flat surface that is a proper display region for each application and displaying the application there, it is possible to execute optimum display without requiring the user to execute a manipulation. According to the present specific example, by dynamically searching for display regions satisfying the display conditions, such as the minimum size and the weight defined for each application, it is possible to automatically execute the optimum display.
  • the present specific example is a form in which control of sound output is executed so that a sound is audible from a sound source displayed on the table 140 a .
  • a video is projected to the projection surface (the table 140 a ) located therebelow and a sound is produced from the body located thereabove. Therefore, a sense of unity between the video and the sound is lost when a distance between the body and the table 140 a is far.
  • the information processing system 100 causes a sound to be reflected from the projection surface by a directional speaker so that the sound is oriented toward the user.
  • the information processing system 100 yields the sense of unity between the video and the sound by changing a position from which the sound is reflected in conformity with a manipulation and the position of the user according to the characteristics of an application.
  • the detection unit 121 functions as an estimation unit that estimates the position of the user based on the position and the direction of a hand or a finger.
  • the output control unit 122 controls the speaker such that a sound output for an image displayed on the display surface is reflected to reach the position of the user estimated by the estimation unit.
  • the information processing system 100 includes a plurality of directional speakers and is assumed to be able to control the direction and the directional range of each speaker.
  • the output control unit 122 selects the speaker installed at a position at which a reflected sound can reach the user at the time of production of the sound toward the application screen based on a positional relation between the position of the user and an application display position and controls the speaker such that the sound is produced.
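  • The speaker selection can be sketched as follows, assuming a flat, horizontal projection surface and 2D table-plane coordinates: the speaker is chosen whose sound, aimed at the application screen and specularly reflected there, continues most directly toward the estimated position of the user. The geometric simplification and all names are assumptions.

```python
import math

def select_speaker(speakers_xy, screen_xy, user_xy):
    """Pick the directional speaker whose reflected sound best reaches the user."""
    def angle_between(v1, v2):
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        n1, n2 = math.hypot(*v1), math.hypot(*v2)
        if n1 == 0 or n2 == 0:
            return math.pi
        return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

    best_index, best_angle = None, math.pi
    for i, spk in enumerate(speakers_xy):
        incoming = (screen_xy[0] - spk[0], screen_xy[1] - spk[1])          # speaker -> screen
        outgoing = (user_xy[0] - screen_xy[0], user_xy[1] - screen_xy[1])  # screen -> user
        a = angle_between(incoming, outgoing)
        if a < best_angle:
            best_index, best_angle = i, a
    return best_index
```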
  • FIGS. 115 to 117 are explanatory diagrams illustrating a user interface according to specific example 10.
  • the output control unit 122 may control the speaker such that the sound is reflected toward the middle of the application screen and the sound can reach the user.
  • the output control unit 122 may control the speaker such that sounds of the corresponding channels are reflected to the left and right sides of the application screen to reach the left and right ears of the user.
  • the output control unit 122 may control the speaker such that a sound is reflected to, for example, a position (for example, a link) clicked by the user and the sound reaches the user.
  • the output control unit 122 may control the speaker such that a sound image is localized to the position of an application to be displayed.
  • the information processing system 100 may emit only a sound of the application used by each user to the user.
  • the information processing system 100 may reproduce a sound in the native language of each user for that user.
  • the information processing system 100 may emit a sound to the front side of the application screen, that is, in a direction in which a user executing the manipulation is normally located.
  • By controlling the position of the sound source according to the display position of the application, it is possible to provide the user with a sense of sound similar to sound produced from the application screen itself.
  • By changing the position of the reflection according to a manipulation from the user, the sense of unity between a video and a sound can be yielded even when there is no prior information regarding a sound source, such as in a web browser.
  • By adjusting the LR channel configuration according to the position of the user, it is possible to yield a sense of presence as if the user were viewing a home television.
  • the present specific example is a form in which a pre-set function is provoked when a specific condition is satisfied on the table 140 a .
  • a condition in which a function of an application is provoked can normally be set only by a vendor supplying the application.
  • Therefore, in some cases, a desired function is not provoked by any behavior other than those defined in the application.
  • the information processing system 100 according to the present specific example is configured such that a function to be provoked and a provoking condition can be freely set by the user.
  • the user generates a program in which a condition regarding the state on a display surface is associated with an output instruction.
  • the information processing system 100 receiving the program executes an output based on a corresponding output instruction when the state on the display surface satisfies a condition defined by the program.
  • Examples of the condition regarding the state on the display surface include placement of a specific object on the table 140 a , a temperature on the table 140 a , and a change in depth information.
  • the condition regarding the state on the display surface is also referred to as an output condition.
  • the detection unit 121 recognizes a manipulation object such as a finger touching the table 140 a and detects programming by the user. Then, the control unit 120 stores a program in which an output instruction is associated with the output condition based on a detection result of the programming obtained by the detection unit 121 .
  • the control unit 120 may function as a storage unit that stores the program.
  • FIG. 118 is an explanatory diagram illustrating a user interface according to specific example 11.
  • a program by which a temperature on the table 140 a is set as an output condition is considered, for example.
  • a region in which the condition determination is executed can also be set.
  • a program by which a region in which a cake is placed is set as a condition determination region 2031 , the temperature equal to or less than 30 degrees is set as an output condition, and an instruction to turn on an illuminator is set as an output instruction is considered.
  • a program by which a pattern on the table 140 a is set as an output condition is considered. For example, as illustrated in FIG.
  • the information processing system 100 may receive programming through a manipulation of touching the projection surface or may receive a program from an external device through wired or wireless communication.
  • the detection unit 121 detects, for example, an object on the table 140 a and the pattern, temperature, humidity, or the like of the surface of the object as the state on the table 140 a .
  • the output control unit 122 controls the output unit 130 such that an output according to the output instruction stored in association with the output condition is executed.
  • the output control unit 122 controls the illuminator such that the illuminator is turned on when a temperature sensor acquires the fact that the temperature of the condition determination region 2031 is equal to or less than 30 degrees.
  • the output control unit 122 controls the projector such that a programmed video effect is displayed when the fact that a card placed in the condition determination region 2032 is a joker card is detected from the pattern shown in a captured image.
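  • A minimal sketch of how such user-defined programs could be stored and evaluated is shown below, using the two examples above (a temperature condition for region 2031 and a card-pattern condition for region 2032). The data structure, region identifiers, and state keys are assumptions.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class UserProgram:
    """A user-defined pair of an output condition and an output instruction,
    scoped to a condition determination region on the table."""
    region_id: str
    condition: Callable[[Dict], bool]   # evaluated against the sensed state of the region
    instruction: Callable[[], None]     # executed when the condition is satisfied

programs: List[UserProgram] = [
    UserProgram("2031",
                lambda state: state.get("temperature_c", 100) <= 30,
                lambda: print("turn on the illuminator")),
    UserProgram("2032",
                lambda state: state.get("card") == "joker",
                lambda: print("play the programmed video effect")),
]

def evaluate(states_by_region: Dict[str, Dict]) -> None:
    """Run every stored program against the latest sensed state of its region."""
    for program in programs:
        if program.condition(states_by_region.get(program.region_id, {})):
            program.instruction()

evaluate({"2031": {"temperature_c": 28}, "2032": {"card": "joker"}})
```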
  • the program will be further exemplified. For example, a program in which the temperature is displayed around milk for a baby and the user is notified when the temperature becomes that of human skin, such as 36 degrees to 37 degrees, is considered. Further, a program in which a birthday cake is monitored and, when the candles are blown out and the temperature sharply drops, an illuminator is automatically turned on and photos are taken is considered. Furthermore, a program displaying news when a black drink (assumed to be a cup of coffee) is placed in front of the user at a morning hour is considered.
  • the output control unit 122 may execute display, such as rectangular display indicated by a broken line, indicating that some program is executed in a programmed region, that is, a condition determination region. Of course, such display can be set not to be executed when presence of a program is desired to be concealed for the purpose of surprise.
  • a change in depth can be set as an output condition.
  • the detection unit 121 detects an object located on the table 140 a based on depth information, and the output control unit 122 controls the output unit 130 such that an output is executed according to an output instruction stored in association when the detected state of the object satisfies the output condition.
  • the fact that a player has a losing hand in mah-jong can be detected based on a change in the depth information.
  • a program recognizing a role, which is a state of a player's hand, based on a captured image and automatically calculating scores when the player has a losing hand in mah-jong is considered.
  • the fact that the cover of a cake is removed and the content of the cake appears can also be detected based on a change in depth information. Accordingly, a program reproducing a birthday song, for example, when the cover of the box of a birthday cake placed in the middle of the table 140 a is removed is considered.
  • a program displaying an effect when, for example, a piece of a real object stops in a specific frame is considered.
  • the present specific example is a form in which it is determined to whom an object placed on the table 140 a belongs. According to use of an application, it may be necessary to be able to determine to whom the object placed on the table 140 a belongs. Accordingly, in the present specific example, a hand placing an object on the table 140 a is detected and it is determined to whom the object belongs by associating the detected hand with the object. It can also be comprehended that the user owns the object which belongs to the user.
  • the detection unit 121 detects that a manipulation object and an object entering a predetermined region in a contact state are separated. For example, based on depth information, the detection unit 121 detects that a hand holding an object enters the table 140 a and the hand is separated from the object.
  • FIGS. 119 to 123 are explanatory diagrams illustrating user interfaces according to specific example 12.
  • the detection unit 121 detects a closed curve adjoined to a side of the table 140 a as a hand based on depth information. As illustrated in FIG. 119 , when the hands holding objects enter the table 140 a and the objects are placed on the table 140 a , the detection unit 121 detects that the hands are separated from the objects. Specifically, based on the depth information, the detection unit 121 detects that closed curves not adjoined to sides of the table 140 a appear. At this time, the detection unit 121 detects the closed curves not adjoined to the sides of the table 140 a as objects separated from the hands.
  • the detection unit 121 functions as a recognition unit that recognizes the detected manipulation objects and the objects separated from the manipulation objects in association therewith. Specifically, based on the depth information, the detection unit 121 recognizes the hands indicated by the closed curves adjoined to the sides of the table 140 a and the objects indicated by the curved lines which are separated from the closed curves and are not adjoined to the sides of the table 140 a in association therewith. When the objects are separated, the detection unit 121 may recognize hands located at positions closest to the objects in association with the objects. The detection unit 121 recognizes the hands associated with the objects as destinations to which the objects belong. For example, as illustrated in FIG.
  • the detection unit 121 pairs the objects placed on the table 140 a and the hands holding the objects.
  • the detection unit 121 may recognize the destinations to which the objects belong using the sides of the table 140 a to which the closed curves indicating the hands are adjoined, that is, the directions in which the hands extend. Accordingly, as illustrated in FIG. 119 , even when only the objects are detected on the table 140 a , the detection unit 121 can recognize to whom the objects belong.
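  • A minimal sketch of this association is shown below: hands are closed curves adjoined to a side of the table, objects are closed curves that are not, and each object is associated with the nearest hand and with the table side from which that hand extends. The dictionary layout and side labels are assumptions.

```python
import math

def associate_objects_with_hands(hands, objects):
    """Associate each detected object with the nearest hand and record the
    table side from which that hand extends as the belonging destination.

    hands:   list of dicts such as {"centroid": (x, y), "side": "north"}.
    objects: list of dicts such as {"centroid": (x, y)}.
    Returns a list of (object_index, side) pairs."""
    ownership = []
    for index, obj in enumerate(objects):
        if not hands:
            break
        nearest = min(hands, key=lambda h: math.dist(h["centroid"], obj["centroid"]))
        ownership.append((index, nearest["side"]))
    return ownership

hands = [{"centroid": (100, 20), "side": "north"},
         {"centroid": (400, 580), "side": "south"}]
objects = [{"centroid": (120, 80)}, {"centroid": (380, 500)}]
print(associate_objects_with_hands(hands, objects))  # [(0, 'north'), (1, 'south')]
```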
  • the present specific example can be applied to, for example, a roulette game.
  • the detection unit 121 detects the values of chips placed on the table 140 a using a pattern recognized from a captured image and a height recognized from the depth information, and detects the users betting with the chips. Then, the output control unit 122 controls the output unit 130 such that an obtainable amount of chips is displayed at the hand of the user having won the bet. Accordingly, the user can bring the chips from the pool in person with reference to the displayed amount of chips. Since a dealer is not necessary, all members can participate in the game.
  • the present specific example can also be applied to, for example, a board game.
  • the detection unit 121 detects a user spinning a roulette wheel on the table 140 a based on the direction in which the hand extends. Then, the output control unit 122 controls the output unit 130 such that display of a move of the user spinning the roulette wheel is executed automatically according to a roulette number.
  • the output control unit 122 may execute warning display when the user attempts to spin the roulette wheel out of turn.
  • the output control unit 122 may execute a warning when the user has an object which the user should not have. For example, as illustrated in the upper drawing of FIG. 120 , the output control unit 122 may execute warning display when the user attempts to take a cup which does not belong to him or her and may execute display to guide the user to a cup which belongs to him or her. As illustrated in the lower drawing of FIG. 120 , when the user attempts to take an eating utensil from the inside, the output control unit 122 may execute display to guide the user to use an eating utensil on the outside. Additionally, for example, when grilling meat, the detection unit 121 may recognize who puts the meat down and the output control unit 122 may execute warning display for a user reaching for meat that another person is grilling.
  • the detection unit 121 may detect transition in ownership (belonging destination). As a rule of the transition, for example, “first victory,” in which the ownership is fixed to the person who first touches an object, and “final victory,” in which the ownership transitions to the person who touches the object last, are considered. Additionally, as a rule of the transition, “user selection,” in which the ownership transitions according to selection of a user, is considered. For example, as illustrated in FIG. 121 , when it is detected that user B attempts to touch an object that user A has first touched and has ownership of, the output control unit 122 may execute display in which transition or non-transition of the ownership can be selected at the hand of user A, who is the owner.
  • “handover” in which the ownership transitions when an object is handed is considered.
  • the detection unit 121 may transition the ownership from user X to user Y.
  • the detection unit 121 may set the ownership according to division of an object.
  • the detection unit 121 detects one closed curve as one object and detects division of an object when it is detected that two or more closed curves appear from the one closed curve.
  • the detection unit 121 detects division of an object when coins stacked in a plurality of layers for betting collapse.
  • the detection unit 121 may set the ownership of an object after division to an owner before the division when the object in which the owner has already been decided is divided into a plurality of portions.
  • the belonging of the object placed on the table 140 a can be identified and can be treated as attribute information in an application, a game, or the like. Accordingly, for example, the output control unit 122 can execute an output to support game progress according to the ownership of the object.
  • the output control unit 122 can visualize belonging information in the real world by suggesting information indicating the ownership by the user.
  • the present specific example is a form in which a window projected to the table 140 a can freely be manipulated.
  • the window projected to the table 140 a is preferably moved, rotated, expanded, or reduced according to intentions of the users.
  • user interfaces for receiving manipulations on the window, such as movement of the window, are supplied.
  • FIGS. 124 to 127 are explanatory diagrams illustrating user interfaces according to specific example 13.
  • the information processing system 100 projects a window indicating a plurality of different application screens, such as a calendar, a moving image, and a map, to the table 140 a .
  • Any of the projected applications can be selected by the user.
  • Since the plurality of different application screens are projected simultaneously, the user can simultaneously and freely check various kinds of information using the table 140 a , for example, during a busy morning.
  • When the detection unit 121 detects that the user touches a rotation button in the menu illustrated in the left drawing of FIG. 124 , the output control unit 122 controls the output unit 130 such that the entire window is rotated in a facing direction of the table 140 a , as illustrated in the right drawing of FIG. 124 .
  • the detection unit 121 may comprehend the outer circumference of the window as a handle for a window manipulation and detect a user manipulation on the handle to realize the window manipulation. For example, when the detection unit 121 detects that the handle is touched, the information processing system 100 switches the manipulation mode to a window manipulation mode.
  • the window manipulation mode is a manipulation mode in which a user manipulation is detected as a window manipulation.
  • a manipulation mode in which a user manipulation is detected as a manipulation on an application of a scroll or the like is also referred to as a normal mode.
  • the output control unit 122 controls the output unit 130 such that a handle 2041 is displayed as display indicating that the manipulation mode is the window manipulation mode, as illustrated in the left drawing of FIG. 125 .
  • the user can move the window with his or her finger by dragging, pinching, or rotating the handle 2041 while touching the handle 2041 .
  • When the detection unit 121 detects that the user drags the handle 2041 while touching it, the output control unit 122 moves the window in the direction in which the handle 2041 is dragged, as illustrated in the right drawing of FIG. 125 .
  • the information processing system 100 may switch the manipulation mode from the window manipulation mode to the normal mode.
  • the detection unit 121 may switch the manipulation mode from the normal mode to the window manipulation mode. For example, as illustrated in the left drawing of FIG. 126 , when the detection unit 121 detects that the finger is dragged from the outside of the window to the inside of the window, the output control unit 122 switches the manipulation mode to the window manipulation mode. Therefore, as illustrated in the right drawing of FIG. 126 , the output control unit 122 moves the window in a direction in which the dragging is executed.
  • the detection unit 121 may switch the manipulation mode from the normal mode to the window manipulation mode in accordance with the number of fingers touching the window. For example, when the detection unit 121 detects that two fingers are touching the window, the manipulation mode may be switched to the window manipulation mode. Specifically, as illustrated in FIG. 127 , when the detection unit 121 detects that the handle is rotated with two fingers, the output control unit 122 rotates the display of the window. When the detection unit 121 detects that dragging is executed with two fingers, the output control unit 122 may move the window in a direction in which the dragging is executed. Additionally, when the detection unit 121 detects that two fingers spread out, the output control unit 122 may expand and display the window.
  • According to the present specific example, the user can freely manipulate the window, and thus usability is improved.
  • a karuta card assistance application is an application that assists in a karuta card game in which karuta cards arranged on the table 140 a are used.
  • the karuta card assistance application has a reading-phrase automatic read-aloud function, an answer display function, and a hint supply function.
  • the reading-phrase automatic read-aloud function is a function of causing the output unit 130 to sequentially sound and output reading phrases registered in advance.
  • the answer display function is a function of recognizing each karuta card from a captured image and generating effect display when the karuta card of an answer overlaps a hand of the user.
  • As the hint supply function, when an answer is not presented despite the elapse of a predetermined time from the read-aloud of the reading phrase, display indicating a hint range including the karuta card of the answer may be projected by the output unit 130 , and the hint range may be further narrowed as time elapses, as illustrated in FIG. 128 .
  • In such a configuration, a reader is not necessary in the karuta card game and smooth progress can be supported.
  • a conversation assistance application is an application that supports an excitement atmosphere during conversation of users.
  • the conversation assistance application can execute sound recognition on conversation of the users, extract keywords through syntax analysis on text from the conversation, and cause the output unit 130 to project an image corresponding to the keywords.
  • the description will be made more specifically with reference to FIG. 129 .
  • FIG. 129 is an explanatory diagram illustrating a specific example of the conversation assistance application. For example, a case in which two users X and Y have a conversation as follows will be assumed.
  • the conversation assistance application extracts, for example, “Japan,” “airplane,” and “Mt. Fuji” as keywords from the conversation of the two users and causes the output unit 130 to project a map of Japan, an airplane, and Mt. Fuji, as illustrated in FIG. 129 .
  • In such a configuration, the sense of presence of the users can be improved and the atmosphere of the conversation can be enlivened.
  • a projection surface tracking application is an application that executes proper projection according to a state of the projection surface.
  • the projection surface tracking application corrects and projects a projected image so that the projected image is displayed to directly face the user according to a state of the projection surface, such as inclination of the projection surface or unevenness on the projection surface.
  • FIGS. 130 to 133 are explanatory diagrams illustrating specific examples of the projection surface tracking application.
  • the projection surface tracking application projects a recipe to the projection surface obliquely erected on a work table so that the recipe can be viewed to directly face the user.
  • the user can search for a recipe from a web browser projected to the flat surface on the work table and obliquely erect the projection surface on the work table when the user makes food, so that the user can stand at a sink and view the recipe while making the food.
  • the user can view the projected image from a free position.
  • the projection surface tracking application can also detect a user manipulation according to the state of the projection surface.
  • the projection surface tracking application detects a maximum flat surface as the projection surface at a specific timing such as the time of calibration and activation of a product and detects a user manipulation based on a difference in a height between the projection surface and the finger.
  • the projection surface tracking application detects a user manipulation by detecting whether a finger is touching the projection surface according to a distance between the finger of the user and the flat surface.
  • the projection surface tracking application may detect a user manipulation by detecting a local difference between the finger of the user and the solid object. For example, as illustrated in FIG.
  • the projection surface tracking application detects a user manipulation based on a distance between a hemispherical local flat surface and a finger of the user. As illustrated in FIG. 130 , when the projection surface is inclined, the projection surface tracking application may detect a user manipulation according to a distance between the finger of the user and the inclined projection surface.
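  • A minimal sketch of this touch detection is shown below: the finger is treated as touching when its height above the local projection surface, taken from a depth map of the surface captured at calibration time, falls below a threshold. The threshold value, the window size, and the function names are assumptions.

```python
def is_touching(finger_depth_mm, surface_depth_mm, threshold_mm=10.0):
    """Treat the finger as touching when its height above the local
    projection surface is below the threshold."""
    return (surface_depth_mm - finger_depth_mm) < threshold_mm

def local_surface_depth(surface_depth_map, x, y, window=5):
    """Estimate the local surface depth around (x, y) from the depth map of
    the projection surface captured at calibration time, so that inclined or
    uneven surfaces are handled the same way as a flat one.
    surface_depth_map is indexed as surface_depth_map[y][x] in millimeters."""
    ys = range(max(0, y - window), min(len(surface_depth_map), y + window + 1))
    xs = range(max(0, x - window), min(len(surface_depth_map[0]), x + window + 1))
    samples = [surface_depth_map[yy][xx] for yy in ys for xx in xs]
    return sum(samples) / len(samples)
```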
  • the projection surface tracking application can realize individual display using a mirror.
  • a peripheral device including a mirror and a screen illustrated in FIG. 133 is used.
  • an image projected to the flat surface may be viewed by neighboring people.
  • the projection surface tracking application can execute display in which only a user directly facing the screen is set as a target by reflecting projected light from the mirror and forming an image on the screen installed in front of the user.
  • the projection surface tracking application can also simultaneously realize display dedicated for all the users and individual display such as a hand and cards on the table in a card game.
  • the projection surface tracking application can detect a user manipulation using a distance between the screen and a finger acquired in the X axis direction and the Y axis direction rather than the Z axis direction.
  • a meal assistance application is an application that supports progress of a meal of the user. For example, the meal assistance application recognizes how much food remains on a dish, i.e., a progress status of a meal, by storing an empty state of the dish on which the food is put in advance and comparing the empty dish and a current dish.
  • the meal assistance application can cause the output unit 130 to project a pattern, a message, or the like according to the progress status of the meal.
  • the description will be made more specifically with reference to FIGS. 134 and 135 .
  • FIG. 134 is an explanatory diagram illustrating a specific example of the meal assistance application.
  • the meal assistance application may cause the output unit 130 to project display such as running of a train around a dish after the meal is finished.
  • the meal assistance application may cause the output unit 130 to project display of producing a design of a luncheon mat as the meal progresses.
  • FIG. 135 is an explanatory diagram illustrating another specific example of the meal assistance application.
  • the meal assistance application recognizes a progress status of a meal by detecting that the form of the bottom surface of a plate concealed by food is exposed according to the progress of the meal.
  • the meal assistance application may cause the output unit 130 to project a message according to the progress status of the meal. For example, when the progress of the meal is less than 20%, a message “Put your hands together and eat!” can be output. When the progress of the meal is equal to or greater than 20% and less than 40%, a message “Chew well” can be output. When the progress of the meal is equal to or greater than 40% and less than 70%, a message “Just a little left! Do your best” can be output. When the progress of the meal is 100%, a message “Great! You ate everything” can be output.
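  • The mapping from the recognized progress status to the projected message can be sketched as follows; the progress value is assumed to be the percentage of the meal judged finished by the comparison described above, and the range from 70% up to completion is left empty because no message is given for it above.

```python
def meal_message(progress_percent: float) -> str:
    """Return the message projected for a given meal progress, following the
    thresholds listed above."""
    if progress_percent < 20:
        return "Put your hands together and eat!"
    if progress_percent < 40:
        return "Chew well"
    if progress_percent < 70:
        return "Just a little left! Do your best"
    if progress_percent >= 100:
        return "Great! You ate everything"
    return ""  # 70% or more but not yet finished: no message is defined above

print(meal_message(15))   # Put your hands together and eat!
print(meal_message(100))  # Great! You ate everything
```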
  • the meal assistance application can support the progress of the meal of the user by causing the output unit 130 to project a pattern, a message, or the like according to the progress status of the meal and improving motivation for the meal of the user.
  • a motion effect application can cause the output unit 130 to project an animation as if a picture were moving based on the picture placed on the table 140 a .
  • the motion effect application causes the output unit 130 to project an animation and a sound as if the picture drawn by the user were moving.
  • the motion effect application may cause the output unit 130 to output a different sound whenever the user puts a picture on the table 140 a .
  • a method of generating an animation will be described specifically with reference to FIG. 137 .
  • FIG. 137 is an explanatory diagram illustrating a specific example of the motion effect application.
  • the motion effect application visually erases a picture placed on the table 140 a by recognizing the shape and color of the picture from a captured image of the picture placed on the table 140 a and causing the output unit 130 to project light with the same color as the picture. Then, the motion effect application generates an animation based on the recognized shape of the picture and causes the output unit 130 to project the animation. In such a configuration, it is possible to offer the user a sense as if a picture such as a simple scribble were starting to move.
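  • The recognition step can be sketched as follows, assuming OpenCV 4 and a dark drawing on light paper (the threshold value is an assumption): the mask and contour give the shape used for the animation, and the mean color gives the color of the light projected over the picture, as described above.

```python
import numpy as np
import cv2

def extract_drawing(captured_bgr):
    """Return (mask, mean_color, contour) of a drawing on a light background."""
    gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea) if contours else None
    mean_color = cv2.mean(captured_bgr, mask=mask)[:3]
    return mask, mean_color, contour

def same_color_layer(captured_bgr, mask, mean_color):
    """Projection layer that overlays the drawing region with light of the
    same color as the picture."""
    layer = np.zeros_like(captured_bgr)
    layer[mask > 0] = mean_color
    return layer
```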
  • a lunch box preparation supporting application is an application that supports the user in expressing various patterns with food ingredients. For example, when a target image is designated by the user, the lunch box preparation supporting application analyzes a color structure of the target image and specifies food ingredients, amounts, and an arrangement to express the target image as a pattern based on the analysis result.
  • the lunch box preparation supporting application causes the output unit 130 to project guide display for guiding specified food ingredients, amounts, and arrangement.
  • the user can generate a lunch box expressing a pattern imitating the target image by arranging the food ingredients according to the guide display.
  • the description will be made more specifically with reference to FIG. 138 .
  • FIG. 138 is an explanatory diagram illustrating a specific example of the lunch box preparation supporting application.
  • the lunch box preparation supporting application recognizes the character image as a target image and specifies the food ingredients, amounts, and arrangement for expressing the character image as a pattern.
  • the lunch box preparation supporting application causes the output unit 130 to project guide display illustrated in two drawings of FIG. 138 based on the specified food ingredients, amounts, and arrangement.
  • the user can generate a lunch box expressing the pattern of the character as illustrated in the right drawing of FIG. 138 by disposing the food ingredients according to the guide display.
  • the example in which the package on which the target image is formed is placed on the table 140 a has been described above as the method of designating the target image.
  • the method of designating the target image is not limited to the example.
  • the user can also designate an image included in a website output to the table 140 a by the output unit 130 as a target image.
  • the character image has been described above as an example of the target image.
  • the target image may be an image of a vehicle, a landscape, a map, a toy, or the like.
  • a daily assistance application is an application that supports a behavior, such as learning, a hobby, and work, done every day by the user.
  • the daily assistance application can support a behavior done by the user by causing the output unit 130 to project useful information display for the user to an object in the real space.
  • the description will be made more specifically with reference to FIGS. 139 to 143 .
  • FIGS. 139 to 143 are explanatory diagrams illustrating specific examples of user assistance by the daily assistance application.
  • the daily assistance application can cause the output unit 130 to project sample text to an envelope.
  • the user can write text neatly by following the projected text with, for example, a brush pen.
  • the sample text can be freely designated by the user.
  • the daily assistance application can cause the output unit 130 to project information display indicating how to use knives or forks on the table 140 a on which knives or forks are arranged and display indicating food on a plate.
  • the user can acquire table manners by using the knives and forks according to the projected information display.
  • the daily assistance application can cause the output unit 130 to project information display indicating a sample picture, a drawing method, and the like to the table 140 a on which a sketch book, paints, a brush, and the like are prepared. The user can complete a good picture by drawing a picture with the brush and the drawing tool according to the projected information display.
  • the daily assistance application can cause the output unit 130 to project auxiliary lines for equally dividing a whole cake. The user can obtain pieces of cake with the same size by cutting the whole cake along the projected auxiliary lines.
  • the daily assistance application may specify proper auxiliary lines based on the number of equal divisions designated by the user and the shape and size of the whole cake.
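  • For a round whole cake, the auxiliary lines can be sketched as radial cuts at equal angles from the detected center, as shown below; the assumption of a round cake and the coordinate values are illustrative only.

```python
import math

def cake_auxiliary_lines(center, radius, divisions):
    """Return the end points of auxiliary lines dividing a round cake into
    the requested number of equal pieces."""
    cx, cy = center
    lines = []
    for i in range(divisions):
        angle = 2 * math.pi * i / divisions
        end = (cx + radius * math.cos(angle), cy + radius * math.sin(angle))
        lines.append(((cx, cy), end))
    return lines

# Six equal pieces of a cake detected at (480, 320) with a 150-pixel radius.
for start, end in cake_auxiliary_lines((480, 320), 150, 6):
    print(start, end)
```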
  • the daily assistance application can execute automatic answer marking from a captured image of a medium such as a print or a book on which answers to problems are written by the user and cause the output unit 130 to project a marking result.
  • the user can confirm correct and incorrect answers, scores, and the like based on the projected marking result.
  • the daily assistance application may execute automatic marking by recognizing identification information formed on the medium and comparing solutions stored in association with the identification information to the answers of the user.
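  • The marking step can be sketched as a lookup of the stored solutions by the recognized identification information followed by a per-problem comparison, as shown below; the identification string, the solutions, and the scoring rule are hypothetical, and recognition of the handwritten answers themselves is outside this sketch.

```python
# Solutions stored in association with the identification information on the medium.
SOLUTIONS = {
    "drill-0042": {1: "12", 2: "x + 3", 3: "Tokyo"},
}

def mark_answers(medium_id, recognized_answers):
    """Compare the answers recognized from the captured image with the stored
    solutions and return per-problem results and a score."""
    solutions = SOLUTIONS.get(medium_id, {})
    results = {number: recognized_answers.get(number) == solution
               for number, solution in solutions.items()}
    score = 100 * sum(results.values()) // max(1, len(results))
    return results, score

results, score = mark_answers("drill-0042", {1: "12", 2: "x + 2", 3: "Tokyo"})
print(results, score)  # {1: True, 2: False, 3: True} 66
```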
  • a dining table representation application is an application that executes colorizing for dining table representation.
  • the dining table representation application can recognize an object on a dining table from a captured image of the dining table and cause the output unit 130 to project display according to the object to the object.
  • the description will be made specifically with reference to FIG. 144 .
  • FIG. 144 is an explanatory diagram illustrating a specific example of the dining table representation application.
  • the dining table representation application may recognize an object on a dining table based on, for example, a distance image of the dining table, cause the output unit 130 to project, for example, display of “Happy Birthday!” to a plate, and cause the output unit 130 to project star display to the table, as illustrated in FIG. 144 .
  • the food recommendation application is an application that recommends food to the user.
  • the food recommendation application can recognize food on a dining table from a captured image of the dining table, calculate nutrient balance of the food based on a recognition result, and recommend food supplementing deficient nutrients.
  • the food recommendation application may recognize that vitamins, dietary fiber, and the like are deficient and cause the output unit 130 to project display recommending Caesar salad which can supplement nutrients such as vitamins, dietary fiber, and the like and an image of the Caesar salad.
  • the food recommendation application may recommend additional food according to progress of a meal by recognizing an amount of remaining beverage contained in a cup, an amount of remaining food, or the like using depth information. In such a configuration, a well balanced meal for the user is supported and improvements in sales due to additional orders in a restaurant or the like are expected.
  • a mood representation application is an application that causes the output unit 130 to project a presentation according to food.
  • the mood representation application can recognize food on a dining table from a captured image of a dining table and cause the output unit 130 to project an image of an object having affinity for food.
  • the mood representation application may cause the output unit 130 to project images of pine, bamboo, and plum.
  • the mood representation application may cause the output unit 130 to project an image of a riverbed or a brook. In such a configuration, it is possible to improve the mood of the dining table according to food.
  • a tableware effect application is an application that generates an effect according to placement of tableware on a dining table.
  • the tableware effect application may recognize classification of tableware placed on a dining table and cause the output unit 130 to output a display effect and a sound according to the classification.
  • the description will be made more specifically with reference to FIG. 146 .
  • FIG. 146 is an explanatory diagram illustrating a specific example of the tableware effect application.
  • the tableware effect application may cause the output unit 130 to project display of ripples spreading from the cup and cause the output unit 130 to output a sound according to the cup.
  • a sound to be output differs according to a type of tableware.
  • a sound to be output by the output unit 130 may be changed by the tableware effect application.
  • the tableware effect application may cause the output unit 130 to output a predetermined sound.
  • the tableware effect application can provide a new type of enjoyment on the dining table by generating an effect according to the placement of the tableware on the dining table.
  • An inter-room linking application is an application that shares and links an application used by the user between rooms when the information processing systems 100 are installed in the plurality of rooms.
  • the information processing system 100 installed in a user's room acquires information regarding the application used in a living room by the user and enables the user to use the application continuously from the use in the living room after the user moves to the user's room.
  • the description will be made more specifically with reference to FIG. 147 .
  • FIG. 147 is an explanatory diagram illustrating a specific example of the inter-room linking application.
  • the inter-room linking application stores history information of use times and use locations (projection locations) of each application and suggests the stored history information to the user.
  • In “All Apps” illustrated in FIG. 147 , all of the applications installed in the information processing system 100 projecting the screen are listed.
  • In “Your Apps,” applications that the user is currently frequently using on the projection surface are listed.
  • In “Recent Apps,” applications recently used by the user, including on the information processing systems 100 installed in the other rooms, are listed.
  • the user can activate a desired application by selecting an icon from the screen illustrated in FIG. 147 .
  • the inter-room linking application can support selection of the application by the user by projecting such a screen.
  • the inter-room linking application can share the history information of the user between the rooms.
  • the inter-room linking application continuously supplies an application used in another room, with reference to the history information stored in that room, even after the user moves out of the room. For example, when the user selects a cooking recipe with the information processing system 100 installed in the living room and then moves to the kitchen, the selected cooking recipe is projected by the information processing system 100 installed in the kitchen. Accordingly, the user can continue to use the same application even after moving to another room.
  • the video of the projector becomes unclear due to environmental factors such as brightness of an illuminator or outside light in some cases.
  • the user is forced to execute a task of darkening the entire room or turning off only an illuminator near the projection surface in order to clarify the video, and thus convenience for the user is impaired.
  • Accordingly, an illumination control process will be described in which the state of the application or the projection surface is acquired and an illuminator is automatically controlled according to the status of the projection surface, so that the video of the projector is displayed clearly on the projection surface and the user need not execute a task of adjusting the illumination.
  • FIG. 148 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • the example of the operation of the information processing system 100 according to the embodiment of the present disclosure will be described with reference to FIG. 148 .
  • a case in which the information processing system 100 a is of the projection type illustrated in FIG. 1 and the output unit 130 a includes an illumination device will be described.
  • the description will be made assuming that the illumination device is different from a projector involving display of information and is a device that has a simple illumination function without involvement in display of information.
  • an illumination controllable unit of the illumination device is set as a map.
  • the number of cells is decided according to the form of the illumination device, for example, the number of illuminators. Numerical values of each cell indicate illumination intensity, 0 indicates no illumination output, and 100 indicates the maximum value of an illumination output.
  • FIG. 149 is an explanatory diagram illustrating examples of the illumination maps.
  • FIG. 149 illustrates 3 patterns.
  • FIG. 149 illustrates examples of illumination maps 3011 when only 1 illuminator is installed in the illumination device 3010 , when 4 illuminators are installed in a square arrangement, and when 8 illuminators are arranged in a rectangular form.
  • the number of cells of the illumination map is 1 when only 1 illuminator is installed in the illumination device 3010 .
  • the number of cells of the illumination map is 4.
  • the number of cells of the illumination map is 8.
  • the arrangement patterns of the illuminators or the patterns of the cells of the illumination maps in the illumination devices 3010 are not limited to these examples.
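  • A minimal sketch of the illumination map as a grid of cells holding output levels from 0 (no illumination output) to 100 (maximum output) follows; the IlluminationMap class name is an assumption, and the 2×4 shape mirrors the 8-illuminator example of FIG. 149 .

```python
class IlluminationMap:
    """Grid of illumination cells; each value is an output level from 0 to 100."""

    def __init__(self, rows: int, cols: int, initial: int = 100):
        self.rows, self.cols = rows, cols
        self.cells = [[initial for _ in range(cols)] for _ in range(rows)]

    def set_cell(self, row: int, col: int, value: int) -> None:
        self.cells[row][col] = max(0, min(100, value))  # clamp to the 0-100 range

# The three patterns of FIG. 149: 1 illuminator, 4 in a square, 8 in a rectangle.
single = IlluminationMap(1, 1)
square = IlluminationMap(2, 2)
rect = IlluminationMap(2, 4)
print(len(rect.cells) * len(rect.cells[0]))  # -> 8 cells
```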
  • an obtainable unit of ambient light is set as a map.
  • the number of cells is decided according to the obtainable unit of ambient light.
  • the obtainable unit of ambient light defining the number of cells of the environment map is, for example, a disposition location of an illuminance sensor or a pixel value of an imaging device that images a projection surface.
  • Numerical values of each cell of the environment map are normalized to numerical values of each cell of the illumination map. By normalizing the numerical values of each cell of the environment map to the numerical values of each cell of the illumination map, the illuminance of ambient light and brightness of an illuminator are associated.
  • the information processing system 100 adopts various methods as methods of obtaining the illuminance of ambient light.
  • the illuminance of ambient light may be calculated from a pixel value of an image obtained by imaging the projection surface using the camera of the input unit 110 , or an illuminance sensor may be provided in the input unit 110 to obtain the illuminance of ambient light from a value of the illuminance sensor.
  • FIG. 150 is an explanatory diagram illustrating examples of environment maps.
  • FIG. 150 illustrates 3 patterns.
  • FIG. 150 illustrates examples of environment maps 3021 when only 1 illuminance sensor is installed in an illuminance sensor 3020 acquiring ambient light, when 4 illuminance sensors are installed in a square arrangement in the illuminance sensor 3020 , and when 3 sensors are arranged in the same direction in the illuminance sensor 3020 .
  • the number of cells of the environment map is 1 when only 1 sensor is installed in the illuminance sensor 3020 .
  • the number of cells of the environment map is 4 when 4 sensors are installed in a square arrangement in the illuminance sensor 3020 .
  • the number of cells of the environment map is 3 when 3 sensors are arranged in the same direction in the illuminance sensor 3020 .
  • the arrangement patterns of the sensors in the illuminance sensors 3020 or the patterns of the cells of the environment map are not limited to these examples.
  • FIG. 151 is an explanatory diagram illustrating an example of association between the illumination map and the environment map.
  • FIG. 151 illustrates an example of the association between the illumination map and the environment map in a configuration of the illumination device 3010 in which 8 illuminators are installed to be arranged in the rectangular form and the illuminance sensor 3020 in which 3 sensors are installed to be arranged in the same direction.
  • the number of cells of the environment map 3021 is 3 and the cells are defined as A00, A01, and A02.
  • the illumination map and the environment map are associated.
  • the number of cells is converted from 3 to 8.
  • the cells after the conversion are defined as B00 to B03 and B10 to B13.
  • a conversion formula at the time of the conversion can be defined as follows, for example.
  • the information processing system 100 can control the illumination device 3010 based on the value of each cell of the environment map after the conversion by adapting the number of cells of the environment map to the number of cells of the illumination map in this way.
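  • The conversion formula itself is not reproduced in this excerpt, so the sketch below only shows one plausible way to adapt a 3-cell environment map (A00 to A02) to the 8-cell illumination map (B00 to B13), using linear interpolation along the row and duplicating the result for both rows; treat it as an assumption, not the disclosed formula.

```python
def expand_environment_map(a, out_cols=4, out_rows=2):
    """Resample a 1x3 environment map [A00, A01, A02] to out_rows x out_cols cells.

    Linear interpolation along the row, duplicated for each output row; this is
    an illustrative assumption, not the conversion formula of the disclosure.
    """
    n = len(a)
    row = []
    for j in range(out_cols):
        # Position of output cell j expressed in input-cell coordinates.
        x = j * (n - 1) / (out_cols - 1)
        i = min(int(x), n - 2)
        frac = x - i
        row.append((1 - frac) * a[i] + frac * a[i + 1])
    return [row[:] for _ in range(out_rows)]

# Ambient-light values already normalized to the 0-100 illumination scale.
env = [30, 60, 90]                   # A00, A01, A02
print(expand_environment_map(env))   # two rows B00..B03 / B10..B13
```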
  • in step S 3001 , the information processing system 100 sets a value of the system setting in each cell of the illumination map.
  • the process of step S 3001 can be executed by, for example, the control unit 120 .
  • in step S 3001 , the brightness of the illumination device 3010 set by the user of the information processing system 100 is set in each cell of the illumination map in the range of 0 to 100, for example, at a starting time point of the process illustrated in FIG. 148 .
  • in step S 3002 , the information processing system 100 subsequently acquires an execution state of a managed application and determines whether there are unprocessed applications.
  • the determination of step S 3002 can be executed by, for example, the control unit 120 .
  • When it is determined in the foregoing step S 3002 that there are unprocessed applications (Yes in step S 3002 ), the information processing system 100 subsequently acquires the application drawn on the innermost side among the unprocessed applications (step S 3003 ). By executing the process from the innermost application, the information processing system 100 can reflect the value of the older application in the illumination map.
  • in step S 3004 , the information processing system 100 subsequently confirms whether the corresponding application is defined in an application illumination association table.
  • the process of step S 3004 can be executed by, for example, the control unit 120 .
  • the application illumination association table is a table in which an application to be executed by the information processing system 100 is associated with brightness (brightness realized in accordance with illumination light of the illumination device 3010 with ambient light) at the time of execution of the application.
  • FIG. 152 is an explanatory diagram illustrating an example of the application illumination association table.
  • the information processing system 100 can be aware of the brightness of the illumination device 3010 when a processing target application is executed. For example, for a “cinema moving image player,” it can be understood from the application illumination association table illustrated in FIG. 152 that the brightness is set to 0 to represent immersion in a movie.
  • the application illumination association table may have information regarding whether to notify the user when each application controls the brightness of the illumination device 3010 . For example, it can be understood from the application illumination association table illustrated in FIG. 152 that notification to the user is not necessary when the “cinema moving image player” controls the brightness of the illumination device 3010 and notification to the user is necessary when a “child moving image player” controls the brightness of the illumination device 3010 .
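  • A minimal sketch of the application illumination association table as a lookup structure follows; the brightness values (0, 80, 20) and the notification flags are taken from the description and the later examples, while the dictionary layout and Python names are assumptions.

```python
# Each entry: target brightness (0-100, realized by illumination light plus
# ambient light) and whether the application must be notified when that
# brightness cannot be realized.
APP_ILLUMINATION_TABLE = {
    "cinema moving image player": {"brightness": 0,  "notify": False},
    "child moving image player":  {"brightness": 80, "notify": True},
    "candle application":         {"brightness": 20, "notify": True},   # per the later example
}

def lookup(app_name):
    """Return the table entry, or None when the application is not defined (No in step S3004)."""
    return APP_ILLUMINATION_TABLE.get(app_name)

print(lookup("cinema moving image player"))
print(lookup("undefined app"))  # -> None, the application is skipped
```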
  • When it is determined in the foregoing step S 3004 that the corresponding application is not defined in the application illumination association table (No in step S 3004 ), the information processing system 100 assumes that the application is processed and returns the process to the foregoing step S 3002 . Conversely, when it is determined in the foregoing step S 3004 that the corresponding application is defined in the application illumination association table (Yes in step S 3004 ), the information processing system 100 acquires the value of the brightness of the application from the application illumination association table (step S 3005 ). The process of step S 3005 can be executed by, for example, the control unit 120 .
  • in step S 3006 , the information processing system 100 subsequently acquires a display area of the application in the display region of the projector.
  • the process of step S 3006 can be executed by, for example, the control unit 120 .
  • the information processing system 100 may not set the application as a processing target.
  • the information processing system 100 subsequently sets the value acquired from the application illumination association table in the cell of the illumination map corresponding to the display area acquired in the foregoing step S 3006 (step S 3007 ).
  • the process of step S 3007 can be executed by, for example, the control unit 120 .
  • the application is assumed to be processed and the process returns to the process of the foregoing step S 3002 .
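  • The loop of steps S 3002 to S 3007 (walking the applications from the innermost one, looking each up in the association table, and writing its brightness into the illumination-map cells covered by its display area) could be sketched as below; representing a display area as a set of cell coordinates and the function name are simplifying assumptions.

```python
def fill_illumination_map(illum_map, apps, table):
    """apps: list of (name, display_area) ordered from the innermost (oldest) application.

    display_area is a set of (row, col) cells of the illumination map covered by
    the application's display region (an illustrative simplification).
    """
    for name, display_area in apps:          # innermost first, so newer apps overwrite older values
        entry = table.get(name)
        if entry is None:                    # not defined in the table (No in step S3004)
            continue
        for row, col in display_area:        # step S3007: write the application's brightness
            illum_map[row][col] = entry["brightness"]
    return illum_map

illum_map = [[100] * 4 for _ in range(2)]    # step S3001: system setting in every cell
apps = [("cinema moving image player", {(0, 0), (0, 1)}),
        ("child moving image player", {(1, 2), (1, 3)})]
table = {"cinema moving image player": {"brightness": 0},
         "child moving image player": {"brightness": 80}}
print(fill_illumination_map(illum_map, apps, table))
```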
  • When it is determined in the foregoing step S 3002 that there is no unprocessed application (No in step S 3002 ), the information processing system 100 subsequently acquires the ambient light and sets a value in the environment map (step S 3008 ).
  • the process of step S 3008 can be executed by, for example, the control unit 120 .
  • the information processing system 100 normalizes the value of the ambient light (or the illuminance of the illumination device 3010 ) to the value of an illumination output (brightness) at the time of the process of setting the value in the environment map in step S 3008 .
  • the illuminators of the illumination device 3010 which output light to some extent (0% to 100%) and the degrees of illuminance under the illuminators may be associated in advance at the time of factory shipment.
  • the information processing system 100 subsequently executes association to determine how much of an influence the environment map has on the range of the illumination map (step S 3009 ).
  • the process of step S 3009 can be executed by, for example, the control unit 120 .
  • the association between the illumination map and the environment map is executed by causing the number of cells of the illumination map to match the number of cells of the environment map, as described above.
  • the association between the illumination map and the environment map may be executed in advance at the time of factory shipment.
  • in step S 3010 , the information processing system 100 subsequently determines whether there is an unprocessed cell of the illumination map.
  • the process of step S 3010 can be executed by, for example, the control unit 120 .
  • the order of the processes on the illumination map may begin, for example, from lower numbers assigned to the cells of the illumination map.
  • When it is determined in the foregoing step S 3010 that there is an unprocessed cell (Yes in step S 3010 ), the information processing system 100 compares the value of the processing target cell of the illumination map to the value of the ambient light corresponding to the processing target cell (the value of the corresponding cell of the environment map). Then, the information processing system 100 determines whether the value of the ambient light corresponding to the processing target cell of the illumination map is equal to or less than the value of the processing target cell of the illumination map (step S 3011 ).
  • the process of step S 3011 can be executed by, for example, the control unit 120 .
  • When it is determined in the foregoing step S 3011 that the value of the ambient light corresponding to the processing target cell of the illumination map is equal to or less than the value of the processing target cell of the illumination map (Yes in step S 3011 ), it means that the brightness necessary for the application is not achieved by the ambient light alone. Therefore, the information processing system 100 sets a value obtained by subtracting the value of the ambient light from the value of the processing target cell of the illumination map as a new illumination value of the cell (step S 3012 ).
  • the process of step S 3012 can be executed by, for example, the output control unit 122 .
  • Conversely, when it is determined in the foregoing step S 3011 that the value of the ambient light corresponding to the processing target cell of the illumination map is greater than the value of the processing target cell of the illumination map (No in step S 3011 ), it means that the brightness necessary for the application is exceeded by the ambient light alone. Therefore, the information processing system 100 sets 0 as the value of the processing target cell of the illumination map (step S 3013 ).
  • the process of step S 3013 can be executed by, for example, the output control unit 122 .
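  • The per-cell rule of steps S 3011 to S 3013 (subtract the ambient-light value when it is at or below the required brightness, otherwise drive the cell to 0) can be sketched as follows; the function name and the flat row lists are assumptions.

```python
def resolve_cell(required, ambient):
    """Return the illumination output for one cell (steps S3011 to S3013).

    required: brightness the application needs in this cell (0-100).
    ambient:  ambient-light value of the corresponding environment-map cell,
              already normalized to the same 0-100 scale.
    """
    if ambient <= required:          # Yes in step S3011: ambient light alone is not enough
        return required - ambient    # step S3012: make up the difference with the illuminator
    return 0                         # step S3013: ambient light already exceeds the requirement

# One row of cells: required brightness vs. measured ambient light.
required_row = [80, 80, 20, 0]
ambient_row = [30, 90, 50, 10]
print([resolve_cell(r, a) for r, a in zip(required_row, ambient_row)])  # -> [50, 0, 0, 0]
```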
  • After step S 3013 , the information processing system 100 subsequently executes the notification only for the application for which the notification is necessary when the brightness necessary for the application is exceeded by the ambient light alone. Therefore, the information processing system 100 determines whether the notification is set for the processing target application in the application illumination association table (step S 3014 ).
  • the process of step S 3014 can be executed by, for example, the output control unit 122 .
  • When it is determined in the foregoing step S 3014 that the notification is set for the processing target application in the application illumination association table (Yes in step S 3014 ), the information processing system 100 subsequently gives the notification to the processing target application (step S 3015 ).
  • the process of step S 3015 can be executed by, for example, the output control unit 122 .
  • the application receiving the notification in the foregoing step S 3015 executes a notification process of displaying a message or outputting a sound, for example, “Surroundings are too bright. Please darken environment” or “Light is brighter than expected and recognition precision may deteriorate.”
  • Conversely, when it is determined in the foregoing step S 3014 that the notification is not set for the processing target application in the application illumination association table (No in step S 3014 ), the information processing system 100 returns to the process of the foregoing step S 3010 , assuming that the processing target cell of the illumination map is processed.
  • When it is determined in the foregoing step S 3010 that there is no unprocessed cell (No in step S 3010 ), the information processing system 100 subsequently sets the value of the illumination map set in the above-described series of processes in the output unit 130 (step S 3016 ).
  • the process of step S 3016 can be executed by, for example, the output control unit 122 .
  • the output unit 130 controls the brightness of each illuminator of the illumination device 3010 based on the value of the illumination map.
  • the information processing system 100 can execute the illumination control according to the application to be executed and the status of the projection surface such as the brightness of the ambient light by executing the above-described series of operations.
  • the information processing system 100 according to the embodiment of the present disclosure can optimize the illumination according to the purpose of the user by executing the illumination control according to the status of the projection surface.
  • the process of acquiring the ambient light in step S 3008 of the flowchart illustrated in FIG. 148 may be executed once at the time of activation of the application in the information processing system 100 or may be executed periodically during the activation of the information processing system 100 .
  • the illumination control according to the application to be executed and the status of the projection surface such as the brightness of the ambient light will be described exemplifying specific applications.
  • FIG. 153 is an explanatory diagram illustrating examples of values of the illumination map 3011 and the environment map 3021 ′ when the information processing system 100 executes the cinema moving image player.
  • the values of the cells of the illumination map 3011 and the environment map 3021 ′ when the cinema moving image player is executed are assumed to be acquired, as illustrated in (1) of FIG. 153 .
  • the brightness of the cinema moving image player is defined as 0. Accordingly, the information processing system 100 realizes the brightness of the cinema moving image player defined in the application illumination association table by turning off the illuminator of the illumination device 3010 .
  • the values of the cells of the illumination map 3011 and the environment map 3021 ′ when the illuminator of the illumination device 3010 is turned off become, for example, like those illustrated in (2) of FIG. 153 .
  • FIG. 154 is an explanatory diagram illustrating examples of the illumination map 3011 and the environment map 3021 ′ when the information processing system 100 executes the child moving image player.
  • the values of the cells of the illumination map 3011 and the environment map 3021 ′ when the child moving image player is executed are assumed to be acquired, as illustrated in (1) of FIG. 154 .
  • the brightness of the child moving image player is defined as 80. Accordingly, the information processing system 100 realizes the brightness of the child moving image player defined in the application illumination association table by turning on the illuminator of the illumination device 3010 .
  • the values of the cells of the illumination map 3011 and the environment map 3021 ′ when the illuminator of the illumination device 3010 is turned on become, for example, like those illustrated in (2) of FIG. 154 .
  • When the illuminator of the illumination device 3010 is turned on and the brightness becomes the brightness defined in the application illumination association table, the information processing system 100 notifies the child moving image player that the brightness has become the brightness defined in the application illumination association table. The notified child moving image player determines that a safe environment in which a child can view content has been prepared and starts reproducing the content.
  • the notification is defined as “necessary” for the child moving image player. Accordingly, if the brightness is deficient even with the control of the illumination device 3010 , for example, the information processing system 100 notifies the child moving image player that the brightness is deficient. Then, the notified child moving image player prompts the user to switch the light on in a room by displaying “Brighten surroundings” or the like.
  • FIG. 155 is an explanatory diagram illustrating an example in which the outside light is reflected in the environment map 3021 ′.
  • FIG. 155 illustrates an example in which the outside light has an influence on the upper left portion of the environment map 3021 ′ and values increase.
  • a candle application is an application which projects a candle video to the projection surface such as a table.
  • FIG. 156 is an explanatory diagram illustrating examples of the illumination map 3011 and the environment map 3021 ′ when the information processing system 100 executes the candle application. The values of the cells of the illumination map 3011 and the environment map 3021 ′ when the candle application is executed are assumed to be acquired, as illustrated in (1) of FIG. 156 .
  • the brightness of the candle application is defined as 20. Accordingly, the information processing system 100 realizes the brightness of the candle application defined in the application illumination association table by turning off the illuminator of the illumination device 3010 .
  • the values of the cells of the illumination map 3011 and the environment map 3021 ′ when the illuminator of the illumination device 3010 is turned off become, for example, like those illustrated in (2) of FIG. 156 .
  • the notification is defined as “necessary” for the candle application.
  • the information processing system 100 notifies the candle application that the brightness is excessive.
  • the candle application notified by the information processing system 100 prompts the user to close a curtain, for example, and darken the ambient light by displaying, for example, “Surroundings are too bright. Please darken environment.”
  • a projection mapping application is an application that aims to project an image to a wall surface of a room or an object.
  • the user installs, for example, a hemisphere in a mirror state on the projection surface so that an image output from the projector of the output unit 130 is reflected from the hemisphere.
  • the brightness of the projection mapping application is defined as 0. Accordingly, the information processing system 100 realizes the brightness of the projection mapping application defined in the application illumination association table by turning off the illuminator of the illumination device 3010 .
  • As the projection mapping application, for example, there are a planetarium application projecting an image of a starry sky to a wall surface of a room and a revolving lantern application realizing a revolving lantern by projecting an image to Japanese paper installed on the projection surface.
  • the planetarium application and the revolving lantern application will be exemplified as the projection mapping applications in the description.
  • FIG. 157 is an explanatory diagram illustrating a specific example of the application and an explanatory diagram illustrating the planetarium application.
  • an example of an image output from the projector of the output unit 130 is illustrated.
  • a form in which an image output from the projector of the output unit 130 is reflected from a hemisphere 3110 having a surface in a mirror state and installed on the projection surface is illustrated.
  • a state in which the image output from the projector of the output unit 130 is reflected from the hemisphere 3110 and is projected to a wall surface of a room is illustrated.
  • FIG. 158 is an explanatory diagram illustrating a specific example of the application and an explanatory diagram illustrating the revolving lantern application.
  • In FIG. 158 , an example of an image output from the projector of the output unit 130 is illustrated.
  • In FIG. 158 , a form in which the image output from the projector of the output unit 130 is reflected from the hemisphere 3110 installed on the projection surface and having the surface in the mirror state, as in FIG. 157 , and is projected to Japanese paper 3120 is illustrated.
  • a globe application is an application that aims to express a globe by projecting an image to a hemisphere installed on the projection surface.
  • the user installs, for example, a hemisphere with a matte surface on the projection surface so that an image output from the projector of the output unit 130 is projected to the hemisphere.
  • FIG. 159 is an explanatory diagram illustrating a specific example of the application and an explanatory diagram illustrating the globe application.
  • an example of an image output from the projector of the output unit 130 is illustrated.
  • a form in which the image output from the projector of the output unit 130 is projected to the hemisphere 3120 installed on the projection surface and having a matte surface is illustrated.
  • a screen recognition application of a mobile terminal is an application which recognizes a screen of a mobile terminal installed on the projection surface with the camera of the input unit 110 and executes a process according to the recognized screen.
  • When the screen of the mobile terminal is recognized and the illuminator of the illumination device 3010 is turned off and darkened, the screen can be easily recognized with the camera of the input unit 110 .
  • the screen may be whitened depending on a dynamic range of the camera of the input unit 110 .
  • Therefore, the illuminator of the illumination device 3010 is not turned off completely, but is set to be dark for the purpose of reducing highlights, as illustrated in the application illumination association table of FIG. 152 .
  • FIG. 160 is an explanatory diagram illustrating a specific example of the application and an explanatory diagram illustrating the screen recognition application of the mobile terminal.
  • FIG. 160 illustrates a form in which a screen of a mobile terminal 3130 is recognized with the camera of the input unit 110 and information according to the screen of the mobile terminal 3130 is projected by the projector of the output unit 130 .
  • a food package recognition application is an application which recognizes the surface of a food package installed on the projection surface with the camera of the input unit 110 and executes a process according to the recognized food package.
  • When the food package is recognized and the illuminator of the illumination device 3010 is turned on and brightened, the food package, which is a reflector, is easily recognized with the camera of the input unit 110 .
  • FIG. 161 is an explanatory diagram illustrating a specific example of the application and an explanatory diagram illustrating the food package recognition application.
  • FIG. 161 illustrates a form in which the surfaces of food packages 3140 and 3150 are recognized with the camera of the input unit 110 and information according to the screens of the food packages 3140 and 3150 is projected by the projector of the output unit 130 .
  • a general object recognition application is an application which recognizes the surface of an object installed on the projection surface with the camera of the input unit 110 and executes a process according to the recognized object.
  • the illuminator of the illumination device 3010 is brightened to about half because it is not known in advance which object is placed on the projection surface. By brightening the illuminator of the illumination device 3010 to about half, the surface of an object is easily recognized with the camera of the input unit 110 .
  • FIG. 162 is an explanatory diagram illustrating a specific example of the application and an explanatory diagram illustrating the general object recognition application.
  • FIG. 162 illustrates a form in which the surfaces of general objects 3160 and 3170 are recognized with the camera of the input unit 110 and information according to screens of the general objects 3160 and 3170 is projected by the projector of the output unit 130 .
  • the information processing system 100 can optimize the illumination for each purpose of the user by executing the illumination control according to a use status of the projection surface.
  • the information processing system 100 according to the embodiment can adjust illumination of only a necessary spot of the projection surface by executing the illumination control using the environment map and the illumination map divided as the cells. By adjusting the illumination of only a necessary spot of the projection surface, the plurality of users can execute different tasks on content projected to the projection surface by the projector of the output unit 130 without stress.
  • the information processing system 100 can clarify a video by detecting a portion in which an application is executed on the projection surface and adjusting the illumination.
  • the input unit 110 detects a portion on the projection surface in which eating is taking place and the brightness of the illuminator of the illumination device 3010 is adjusted, and thus it is possible to prevent the area in which eating is taking place and its neighborhood from being darkened.
  • the information processing system 100 executes the illumination control when the general object recognition application is executed, and thus it is possible to improve recognition precision.
  • the information processing system 100 according to the embodiment changes a control method for the illumination device 3010 according to a recognition target object, and thus it is possible to improve the recognition precision of the object placed on the projection surface.
  • the information processing system 100 may control the brightness based on meta information or attribute information of content to be output from the output unit 130 . For example, when the attribute information of the content to be output is set as an animation for children, the information processing system 100 controls the illumination device 3010 to brighten. When the attribute information is set as a movie for adults, the information processing system 100 may control the brightness of the illuminator of the illumination device 3010 to darken.
  • the information processing system 100 can change the brightness of individual content even for the same application by controlling the brightness based on meta information or attribute information of content to be output in this way.
  • When the brightness of the illuminator of the illumination device 3010 is controlled, the information processing system 100 may execute the control so that the target brightness is reached immediately. Alternatively, the information processing system 100 may execute the control through gradual brightening or darkening such that the target brightness is ultimately achieved.
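  • As one way to realize the gradual control mentioned above, the sketch below steps the brightness toward a target value over a fixed number of steps instead of jumping immediately; the step count and the function name are assumptions.

```python
def ramp(current, target, steps=10):
    """Yield intermediate brightness values so the target is reached gradually."""
    for i in range(1, steps + 1):
        yield round(current + (target - current) * i / steps)

# Example: darken from 80 to 0 for a movie for adults.
print(list(ramp(80, 0)))  # -> [72, 64, 56, 48, 40, 32, 24, 16, 8, 0]
```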
  • a condition for provoking a function of an application can normally be set by only a vendor supplying the application.
  • a function is not provoked or executed in a behavior defined in an application in some cases.
  • the information processing system 100 is configured to allow the user to freely set a function to be provoked and a provoking condition.
  • the user is allowed to freely set a function to be provoked and a provoking condition, so that various representations can be expressed.
  • the information processing system 100 allows the user to use such interactions as chances to provoke functions.
  • the information processing system 100 delivers data to be used at the time of provoking of functions to processes (actions) to be executed at the time of provoking of the functions according to occurrence of the interactions.
  • Examples of the interactions to be used as provoking conditions (triggers) of the functions by the information processing system 100 according to the embodiment are as follows.
  • the interactions are not limited to the following interactions:
  • As data to be delivered to an action when a pattern of a sound is detected, for example, there is ID information for identifying the detected pattern.
  • As data to be delivered to an action when a change in volume, a time for which a sound continues, or a direction from which a sound emanates is detected, for example, there are the volume, the sound time, and the sound direction.
  • Examples of the triggers when an AR marker generated by the user is recognized are as follows:
  • As data to be delivered to an action when an AR marker is discovered, for example, there are ID information of the discovered marker, discovered coordinates, a discovered posture, a discovered size, and a time at which the AR marker is discovered.
  • As data to be delivered to an action when an AR marker is lost, for example, there are ID information of the lost marker, coordinates at which the marker was last seen, a posture at which the marker was last seen, a size in which the marker was last seen, and a time at which the marker is lost.
  • As data to be delivered to an action when a mass is recognized, for example, there are a location in which the mass is discovered, an area of the mass, a cubic volume of the mass, and a time at which the mass is recognized.
  • As data to be delivered to an action when the disposition of an object on the table surface is recognized, for example, there are a location in which the disposition is changed, a time at which the disposition is changed, and a change amount (of area or cubic volume).
  • As data to be delivered to an action when the change from the standard flat surface is detected, for example, there are an exterior, a location, an area, and a cubic volume of an object changed from the standard state and a date on which the object is first placed.
  • As data to be delivered to an action when a motion is detected, for example, there are activeness of the motion, coordinates or an area of a region in which the motion is mainly done, and a date on which the motion is done.
  • the activeness of the motion is, for example, an index obtained by multiplying an area in which the motion is done by a speed of the motion.
  • Examples of the triggers when a hand state on the table surface is recognized are as follows:
  • As data to be delivered to an action when the connection of the device is recognized, for example, there are ID information of the recognized device, the position of the device, the posture of the device, and the size of the device.
  • Examples of the triggers when arrival of a predetermined time is recognized are as follows:
  • As data to be delivered to an action when a designated time arrives, for example, there is time information.
  • As data to be delivered to an action when elapse of a predetermined time is detected, for example, there are a starting time, an elapsed time, and a current time.
  • As data to be delivered to an action when the change in temperature is detected, for example, there are a temperature change amount, an area in which the temperature is changed, an absolute temperature, and a date on which the change in temperature is detected.
  • As data to be delivered to an action when the change in the concentration of carbon dioxide is detected, for example, there are a change amount of the concentration of carbon dioxide, an absolute concentration, and a date on which the change in the concentration is detected.
  • As data to be delivered to an action when a smell is detected, for example, there are a detection amount and a date on which the smell is detected.
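  • A minimal sketch of how a trigger occurrence and the data it delivers to an action might be represented is given below; the event names and field names are assumptions that merely summarize the examples listed above.

```python
# Each trigger occurrence is represented as an event name plus the data that is
# delivered to the bound action (field names are illustrative).
def make_event(kind, **data):
    return {"kind": kind, "data": data}

sound_event = make_event("sound_pattern_recognized",
                         pattern_id="pat-001", volume=62, duration_s=1.8, direction_deg=45)
marker_event = make_event("ar_marker_discovered",
                          marker_id="marker-07", coordinates=(120, 340),
                          posture_deg=15, size=0.12, discovered_at="2014-12-01T07:00:00")
timer_event = make_event("designated_time_arrived", time="07:00")

for ev in (sound_event, marker_event, timer_event):
    print(ev["kind"], ev["data"])
```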
  • As an action when projection of a video or an image is executed, for example, there is projection of a visual effect to the projection surface.
  • As the display to be projected to the projection surface, for example, there are display of a visual effect registered in advance (an explosion effect, a glittering effect, or the like) and display of a visual effect generated by the user.
  • An example of the visual effect may be an effect recorded in advance on the table surface, an effect drawn based on a movement trajectory of a hand or a finger on the table surface by the user, or an effect obtained by using an illustration drawn by a paint application or an image searched for and discovered on the Internet by the user.
  • As an action when reproduction of music is executed, for example, there is an action of reproducing a sound or music. Specifically, for example, there is an action of outputting a sound effect registered in advance or an action of outputting favorite music registered by the user.
  • As a use of data delivered from the trigger when an action of reproducing a sound or music is executed, for example, there is reflection of an increase or decrease in the given data in the loudness of a sound.
  • As actions when an application is activated, for example, there are activation of a general application, activation of an application designating an argument, and activation of a plurality of applications.
  • As the general application, for example, there is an application manipulating a television or an application displaying a clock.
  • As the activation of the application designating an argument, for example, there is activation of a browser designating a URL.
  • As the activation of the plurality of applications, for example, there is reproduction of the positions, window sizes, and inclinations of the plurality of stored applications.
  • As an action of imaging a photo, for example, there are an action of imaging the entire projection surface and an action of imaging a part of the projection surface.
  • As a use of data delivered from the trigger when an action of imaging a photo is executed, for example, there is an action of imaging a predetermined range centering on a recognized hand.
  • As actions when illumination is adjusted or the brightness of a projected image is adjusted, for example, there are adjustment of the brightness and an action of turning off an illuminator.
  • In the adjustment of the brightness, for example, the illumination is brightened or darkened, or a starting point and an ending point are designated and movement is executed between the points in a certain time.
  • As the use of data delivered from the trigger when the action of adjusting the illumination is executed, for example, there are reflection of a delivered value in the brightness and adjustment of the brightness according to ambient light.
  • As actions when the volume is adjusted, for example, there are adjustment of the volume and muting of the volume.
  • In the adjustment of the volume, for example, a sound is increased or decreased, or a starting point and an ending point are designated and movement is executed between the points in a certain time.
  • As a use of data delivered from the trigger when the action of adjusting the volume is executed, for example, there are reflection of a delivered value in the volume and adjustment of the volume according to surrounding volume.
  • As an action when an alert is displayed, for example, there is display (projection) of an alert message.
  • As the display of the alert message, for example, there is an output of an alert message "Manipulation may not be executed with that hand" around a newly recognized hand when the number of recognized hands exceeds a threshold value.
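  • A compact sketch of how actions such as projecting a visual effect, reproducing a sound, or displaying an alert could consume the data delivered from a trigger is given below; the registry, the binding table, and the way the sound action reflects the delivered volume are assumptions.

```python
# Action callbacks receive the data delivered from the trigger (illustrative).
def project_effect(data):
    print("projecting effect at", data.get("coordinates", "projection surface"))

def play_sound(data):
    # Reflect an increase or decrease in the delivered value in the loudness.
    loudness = min(100, data.get("volume", 50))
    print("playing registered sound at loudness", loudness)

def show_alert(data):
    print("Manipulation may not be executed with that hand")

ACTIONS = {"effect": project_effect, "sound": play_sound, "alert": show_alert}

# Binding a trigger kind to an action (the binding itself is covered later,
# around FIG. 189).
BINDINGS = {"ar_marker_discovered": "effect", "sound_pattern_recognized": "sound"}

event = {"kind": "sound_pattern_recognized", "data": {"volume": 62}}
ACTIONS[BINDINGS[event["kind"]]](event["data"])
```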
  • FIGS. 163 to 171 are explanatory diagrams illustrating an example of a GUI which is output to the projection surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIGS. 163 to 171 illustrate an example of a GUI when the user is allowed to set the provoking condition and an example of a GUI 3200 when a pattern of music is registered as the provoking condition.
  • When the user is allowed to set the provoking condition, the information processing system 100 according to the embodiment of the present disclosure first outputs the GUI 3200 for allowing the user to select a channel which is to be used as the provoking condition, as illustrated in FIG. 163 .
  • a sound, a marker, and an object on a desk surface (the surface of the table 140 a ) are shown as the channels.
  • what is used as the channel when the user is allowed to set the provoking condition is not limited to the related example.
  • FIG. 164 illustrates a pattern of the sound, the volume of the sound, and a time for which the sound continues as triggers.
  • the trigger is not limited to the related example.
  • When the user selects the pattern of the sound as the trigger, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to record the pattern of the sound.
  • When the user is ready to record, he or she touches a recording button illustrated in FIG. 165 .
  • the information processing system 100 starts the recording according to a manipulation from the user on the recording button. While the pattern of the sound is recorded, the information processing system 100 outputs the GUI 3200 illustrated in FIG. 166 .
  • When the recording of the sound is completed, the information processing system 100 subsequently outputs the GUI 3200 illustrated in FIG. 167 .
  • the GUI 3200 illustrated in FIG. 167 is a GUI for allowing the user to decide whether to complete the recording and whether to use the recorded sound as the trigger.
  • the information processing system 100 outputs the GUI 3200 illustrated in FIG. 168 .
  • the GUI 3200 illustrated in FIG. 168 is output as a GUI for allowing the user to reproduce the recorded sound.
  • When the user reproduces the recorded sound and the information processing system 100 recognizes the sound reproduced by the user as the pattern of the sound, the information processing system 100 outputs the GUI 3200 illustrated in FIG. 169 .
  • the GUI 3200 illustrated in FIG. 169 is a GUI indicating that the information processing system 100 recognizes the pattern of the sound.
  • Conversely, when the sound reproduced by the user is not recognized as the pattern of the sound, the information processing system 100 outputs the GUI 3200 illustrated in FIG. 170 .
  • the GUI 3200 illustrated in FIG. 170 is a GUI indicating that the information processing system 100 did not recognize the pattern of the sound reproduced by the user.
  • the GUI 3200 illustrated in FIG. 171 is a GUI indicating that the pattern of the sound registered by the user is registered as a trigger.
  • the information processing system 100 can allow the user to simply set the function to be provoked and the provoking condition by outputting the GUI illustrated in FIGS. 163 to 171 .
  • The example of the GUI 3200 output by the information processing system 100 when the sound produced by the user is registered as the trigger has been described above.
  • an example of a GUI output by the information processing system 100 when a marker placed on the projection surface by the user is registered as a trigger will be described.
  • FIGS. 172 to 179 are explanatory diagrams illustrating an example of a GUI which is output to the projection surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIGS. 172 to 179 illustrate an example of a GUI when the user is allowed to set the provoking condition and an example of the GUI 3200 when the fact that the object is placed on the table surface is registered as a provoking condition.
  • When the user is allowed to set the provoking condition, the information processing system 100 according to the embodiment of the present disclosure first outputs the GUI 3200 for allowing the user to select a channel which is to be used as the provoking condition, as illustrated in FIG. 172 .
  • a sound, a marker, and an object on a desk surface (the surface of the table 140 a ) are shown as the channels.
  • what is used as the channel is not limited to the related example.
  • FIG. 173 illustrates mass recognition, disposition of an object, and flat surface recognition as triggers.
  • what is used as the trigger is not limited to the related example.
  • the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to designate an area in which the recognition is executed, as illustrated in FIG. 174 .
  • the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to designate the provoking condition of the function, as illustrated in FIG. 175 .
  • FIG. 175 illustrates the GUI 3200 for allowing the user to designate whether the function is provoked when an object is placed on the desk surface or the function is provoked when the object is removed from the desk surface.
  • FIG. 176 is an explanatory diagram illustrating an example of the GUI 3200 for allowing the user to designate the area in which the recognition is executed.
  • the information processing system 100 can allow the user to set the area in which the object is recognized and which triggers the provoking of the function by outputting the GUI 3200 illustrated in FIG. 176 .
  • the area in which the object is recognized can be moved, expanded, reduced, rotated, and deformed through a manipulation by the user.
  • When the user sets the area in which the object is recognized and which triggers the provoking of the function, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to actually place the object in the area set by the user and recognizing the object, as illustrated in FIG. 177 .
  • the information processing system 100 recognizes the object placed in the area with the camera of the input unit 110 .
  • the information processing system 100 outputs the GUI 3200 indicating that the object was recognized, as illustrated in FIG. 178 .
  • When the information processing system 100 recognizes the object placed in the area in which the object is recognized, the information processing system 100 outputs the GUI 3200 indicating that the object placed on the desk surface by the user is registered as a trigger, as illustrated in FIG. 179 .
  • FIGS. 180 to 184 are explanatory diagrams illustrating an example of a GUI which is output to the projection surface by the information processing system 100 according to an embodiment of the present disclosure.
  • FIGS. 180 to 184 illustrate an example of a GUI when the user is allowed to set a function to be provoked and an example of the GUI 3200 when reproduction of a video is registered as the function to be provoked.
  • When the user is allowed to set the function to be provoked, the information processing system 100 according to the embodiment of the present disclosure first outputs the GUI 3200 for allowing the user to select a channel which is used as the function to be provoked, as illustrated in FIG. 180 .
  • the GUI 3200 illustrated in FIG. 180 shows 3 kinds of channels: pictures/videos, sound/music, and applications. Of course, what is used as a channel is not limited to the related example.
  • the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to select an action (function to be provoked), as illustrated in FIGS. 181 and 182 .
  • FIG. 182 illustrates the GUI 3200 for allowing the user to select one of selection of an effect from an effect library, recording of the desk surface, and a picture drawn by a paint function using the desk surface, as the picture or video to be reproduced. What has been described may be combined to be used with the picture or video to be reproduced. For example, when a reproduction time is assumed to be 10 seconds, the information processing system 100 may allow the user to execute setting so that the effect is reproduced for 5 seconds and the picture drawn by the paint function is reproduced for 5 seconds.
  • When the user sets the action, the information processing system 100 subsequently outputs the GUI 3200 for allowing the user to confirm the action set by the user, as illustrated in FIG. 183 .
  • the user confirms the set action.
  • the user informs the information processing system 100 that he or she has confirmed the set action.
  • the information processing system 100 outputs the GUI 3200 indicating that the registration of the action is completed, as illustrated in FIG. 184 .
  • a function set up as a surprise is preferably concealed so that others do not notice the function.
  • FIG. 185 is an explanatory diagram illustrating an example of the visibility of the function to be provoked by the information processing system 100 .
  • (1) illustrates a form in which an icon is placed for each function.
  • (2) illustrates a form in which the registered functions are collectively disposed.
  • (3) illustrates a form in which only an area in which the trigger is generated is displayed.
  • (4) illustrates a form in which the function to be provoked is completely concealed.
  • the information processing system 100 can allow the user to register various provoking conditions of the functions. However, a case in which the user tries to assign the same condition to other functions is also considered. In that case, when the provoking condition that the user tries to register considerably resembles a previously registered condition, the information processing system 100 may reject the registration of the provoking condition.
  • the information processing system 100 may display an indication reporting that the similarity is high or output a GUI prompting the user to register the provoking condition again or cancel the provoking condition.
  • FIG. 186 is an explanatory diagram illustrating an example of a GUI output by the information processing system 100 and illustrates an example of a GUI output when the provoking condition that the user tries to register considerably resembles a previously registered condition and the registration is rejected.
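  • One way such a resemblance check could work for a registered sound-pattern trigger is sketched below, comparing simple feature vectors against a threshold; the feature representation, the cosine similarity, and the threshold value are assumptions.

```python
def too_similar(new_features, registered_features, threshold=0.9):
    """Return True when the new provoking condition resembles a registered one.

    Features are illustrative fixed-length vectors; similarity is cosine-based.
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0
    return any(cosine(new_features, f) >= threshold for f in registered_features)

registered = [[0.9, 0.1, 0.3], [0.2, 0.8, 0.1]]
print(too_similar([0.88, 0.12, 0.31], registered))  # -> True: registration is rejected
print(too_similar([0.1, 0.1, 0.9], registered))     # -> False: registration is accepted
```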
  • the information processing system 100 may generate a new trigger by combining a plurality of triggers registered by the user.
  • FIG. 187 is an explanatory diagram illustrating an example of combination of the triggers.
  • FIG. 187 illustrates a form in which a trigger “At 7 in the morning” and a trigger “When an arm of the user is swinging in a designated area” are combined.
  • FIG. 188 is an explanatory diagram illustrating an example of a GUI output by the information processing system 100 when the user is allowed to generate a new trigger by combining a plurality of triggers.
  • the triggers are drawn in circles and the user executes a manipulation (for example, a drag and drop manipulation) of overlapping one of the triggers on the other.
  • the information processing system 100 generates a new trigger by combining the two triggers.
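  • A minimal sketch of combining two registered triggers into one trigger that fires only when both conditions hold, in the spirit of the "At 7 in the morning" plus "arm swinging in a designated area" example, follows; the predicate-based representation is an assumption.

```python
def combine_triggers(*predicates):
    """Return a new trigger that fires only when every combined trigger fires."""
    return lambda state: all(p(state) for p in predicates)

at_seven = lambda state: state.get("time") == "07:00"
arm_swinging = lambda state: state.get("arm_swinging_in_area", False)

morning_wave = combine_triggers(at_seven, arm_swinging)

print(morning_wave({"time": "07:00", "arm_swinging_in_area": True}))   # -> True
print(morning_wave({"time": "07:00", "arm_swinging_in_area": False}))  # -> False
```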
  • FIG. 189 is an explanatory diagram illustrating an example of a GUI output by the information processing system 100 and illustrates an example of a GUI when the function set by the user to be provoked is bound up with the provoking condition.
  • the function (action) to be provoked and the provoking condition (trigger) are drawn in circles and the user executes a manipulation (for example, a drag and drop manipulation) of overlapping one of the trigger and the action on the other.
  • the information processing system 100 maintains content of the binding of the trigger and the action.
  • the information processing system 100 allows the user to freely set the function to be provoked and the provoking condition, so that the user can set a program freely and simply in addition to a program by a vendor supplying an application. Accordingly, the provoking of the functions suitable for detailed circumstances on the table 140 a is realized.
  • the information processing system 100 allows the user to freely set the function to be provoked and the provoking condition, so that interactions executed every day can be used as chances to provoke the functions by the user. Thus, it is possible to adapt experiences to a daily life of the user.
  • An image, text, and other content can be displayed in the window displayed by the information processing system 100 .
  • Various kinds of content can be displayed in the window, and thus there may be cases in which not all of the content can be displayed in the region of the window.
  • the user browses the content by executing a manipulation of scrolling, moving, expanding, or reducing the content, and the information processing system 100 has to distinguish the manipulation on the content from a manipulation on the window in which the content is displayed. This is because, when the manipulation on the content is not correctly distinguished from the manipulation on the window, the manipulation on the window may be executed instead of the manipulation on the content despite the fact that the user intends to manipulate the content.
  • FIG. 190 is an explanatory diagram illustrating examples of a manipulation method and a mode of a window 3300 displayed by the information processing system 100 .
  • the information processing system 100 operates in a content manipulation mode for the window.
  • the information processing system 100 operates in a window manipulation mode for the window.
  • FIG. 191 is an explanatory diagram illustrating examples of a manipulation method and a mode of a window 3300 displayed by the information processing system 100 .
  • the information processing system 100 can allow the user to scroll, rotate, expand or reduce (scale) the content in the window 3300 in the content manipulation mode.
  • the information processing system 100 can allow the user to move, rotate, or scale the window 3300 in the window manipulation mode.
  • the information processing system 100 may execute display (for example, changing the color of the entire window 3300 ) indicating that the window 3300 enters the window manipulation mode.
  • FIGS. 192 and 193 are explanatory diagrams illustrating examples of a manipulation method and a mode of the window 3300 displayed by the information processing system 100 .
  • the information processing system 100 provides, for example, an outside frame around the window 3300 .
  • a process may be executed on the content.
  • an operation may be executed on the window 3300 .
  • FIG. 194 is an explanatory diagram illustrating an example of a manipulation on the window and an explanatory diagram illustrating the movement concept of the window in which the content is scrolled and the window in which the content is not scrolled.
  • (1) illustrates a window in which the content is not scrolled and a manipulation other than a manipulation (for example, a tap manipulation or a scaling manipulation for the content) by the user on the content is referred to as a manipulation on the window.
  • (2) illustrates a window in which the content is scrolled.
  • a manipulation with one hand by the user is assumed to be a manipulation on the content and a manipulation with both hands is assumed to be a manipulation on the window.
  • the outside frame is displayed around the window for a predetermined time at the time of the manipulation on the content, and a manipulation on the outside frame is assumed to be a manipulation on the window.
  • FIG. 195 is an explanatory diagram illustrating a manipulation by the user.
  • FIG. 195 illustrates examples of a moving manipulation of the user moving the content or the window with one hand or two hands, a rotating manipulation of the user rotating the content or the window with one hand or two hands, and a scaling manipulation of the user scaling the content or the window with one hand or two hands.
  • FIG. 196 is a flowchart illustrating an example of an operation of the information processing system 100 according to an embodiment of the present disclosure.
  • FIG. 196 illustrates an example of an operation of the information processing system 100 when a manipulation is executed on a window which is output by the information processing system 100 and in which content is displayed.
  • When the information processing system 100 detects a touch manipulation on the window by the user (step S 3301 ), the information processing system 100 subsequently determines whether the manipulation is a moving manipulation (step S 3302 ).
  • the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a manipulation in the window (step S 3303 ).
  • the information processing system 100 subsequently determines whether the user manipulation is a manipulation with two hands or the content displayed in the window is the content which is not scrolled (step S 3304 ).
  • When the user manipulation is the manipulation with two hands or the content displayed in the window is the content which is not scrolled (Yes in step S 3304 ), the information processing system 100 subsequently executes a process of moving a manipulation target window (step S 3305 ). Conversely, when the user manipulation is a manipulation with one hand or the content displayed in the window is the content which is scrolled (No in step S 3304 ), the information processing system 100 subsequently executes a process of scrolling the content displayed in the manipulation target window (step S 3306 ).
  • the information processing system 100 subsequently executes a process of moving the manipulation target window (step S 3305 ).
  • When it is determined in the foregoing step S 3302 that the touch manipulation on the window by the user is not the moving manipulation (No in step S 3302 ), the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a rotating manipulation (step S 3307 ).
  • the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a manipulation in the window (step S 3308 ).
  • the information processing system 100 subsequently determines whether the user manipulation is a manipulation with two hands or the content displayed in the window is the content which is not rotated in the window (step S 3309 ).
  • When the user manipulation is the manipulation with two hands or the content displayed in the window is the content which is not rotated in the window (Yes in step S 3309 ), the information processing system 100 subsequently executes a process of rotating the manipulation target window (step S 3310 ). Conversely, when the user manipulation is the manipulation with one hand or the content displayed in the window is the content which is rotated (No in step S 3309 ), the information processing system 100 subsequently executes a process of rotating the content displayed in the manipulation target window (step S 3311 ).
  • the information processing system 100 subsequently executes a process of rotating the manipulation target window (step S 3310 ).
  • When it is determined in the foregoing step S 3307 that the touch manipulation on the window by the user is not the rotating manipulation (No in step S 3307 ), the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a scaling manipulation (step S 3312 ).
  • the information processing system 100 subsequently determines whether the touch manipulation on the window by the user is a manipulation in the window (step S 3313 ).
  • the information processing system 100 subsequently determines whether the user manipulation is a manipulation with two hands or the content displayed in the window is the content which is not scaled in the window (step S 3314 ).
  • When the user manipulation is the manipulation with two hands or the content displayed in the window is the content which is not scaled in the window (Yes in step S 3314 ), the information processing system 100 subsequently executes a process of scaling the manipulation target window (step S 3315 ). Conversely, when the user manipulation is the manipulation with one hand or the content displayed in the window is the content which is scaled (No in step S 3314 ), the information processing system 100 subsequently executes a process of scaling the content displayed in the manipulation target window (step S 3316 ).
  • the information processing system 100 subsequently executes a process of scaling the manipulation target window (step S 3315 ).
  • When it is determined in the foregoing step S 3312 that the touch manipulation on the window by the user is not the scaling manipulation (No in step S 3312 ), the information processing system 100 subsequently executes handling according to the application which is being executed in response to the user manipulation (step S 3317 ). An example of a touch manipulation on the window by the user that is none of moving, rotating, or scaling is a tap manipulation by the user.
  • the information processing system 100 may execute a process (for example, displaying an image, reproducing a video, or activating another application) on content which is a tap manipulation target.
  • the moving manipulation, the rotating manipulation, and the scaling manipulation by the user can be executed simultaneously.
  • the information processing system 100 may determine which of the moving manipulation, the rotating manipulation, and the scaling manipulation is closest to the manipulation executed by the user.
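  • The decision flow of FIG. 196 can be summarized by the minimal Python sketch below. It is only a sketch under assumptions: the attribute names (kind, inside_window, two_hands), the content flags, and the handler methods are hypothetical, and the branches for touches that fall outside the window are omitted.

    def handle_touch(manipulation, window):
        # Dispatch a touch manipulation following FIG. 196 (steps S3301 to S3317).
        # `manipulation` is assumed to expose: kind ("move" | "rotate" | "scale" | other),
        # inside_window (bool) and two_hands (bool); `window.content` is assumed to
        # expose scrollable / rotatable / scalable flags. All names are illustrative.
        content = window.content
        if manipulation.kind == "move":                                   # step S3302
            if manipulation.inside_window:                                # step S3303
                if manipulation.two_hands or not content.scrollable:      # step S3304
                    window.move(manipulation)                             # step S3305
                else:
                    content.scroll(manipulation)                          # step S3306
        elif manipulation.kind == "rotate":                               # step S3307
            if manipulation.inside_window:                                # step S3308
                if manipulation.two_hands or not content.rotatable:       # step S3309
                    window.rotate(manipulation)                           # step S3310
                else:
                    content.rotate(manipulation)                          # step S3311
        elif manipulation.kind == "scale":                                # step S3312
            if manipulation.inside_window:                                # step S3313
                if manipulation.two_hands or not content.scalable:        # step S3314
                    window.scale(manipulation)                            # step S3315
                else:
                    content.scale(manipulation)                           # step S3316
        else:
            window.application.handle(manipulation)                       # step S3317, e.g. a tap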
  • FIG. 197 is an explanatory diagram illustrating an example of a manipulation on a window by the user and an example of a manipulation when content which is not scrolled is displayed in the window.
  • the information processing system 100 executes display control such that the entire window 3300 is moved.
  • FIG. 198 is an explanatory diagram illustrating an example of a manipulation on a window by the user and an example of a manipulation when content which is not scrolled is displayed in the window.
  • the information processing system 100 executes display control such that the entire window 3300 is rotated.
  • the information processing system 100 executes display control such that the entire window 3300 is scaled.
  • FIG. 199 is an explanatory diagram illustrating an example of a manipulation on the window by the user and an example of a manipulation when the content which is scrolled is displayed in the window.
  • the information processing system 100 executes display control such that the entire window 3300 is rotated.
  • the information processing system 100 executes display control such that the entire window 3300 is scaled.
  • the information processing system 100 may distinguish display control on a window from display control on content even for the same rotating or scaling by detecting whether two fingers are fingers of the same hand or different hands.
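  • One simple way to make the distinction above is sketched below; it assumes that hand regions detected by the input camera are available as bounding rectangles and merely checks whether both touch points fall inside the same region. The data structures are illustrative, not the patent's actual detection method.

    def touches_from_same_hand(touch_a, touch_b, hand_regions):
        # Return True if both touch points lie inside one detected hand region.
        # `hand_regions` is assumed to be a list of (x_min, y_min, x_max, y_max)
        # rectangles produced by a hand-detection stage.
        def region_of(touch):
            for index, (x0, y0, x1, y1) in enumerate(hand_regions):
                if x0 <= touch[0] <= x1 and y0 <= touch[1] <= y1:
                    return index
            return None

        region_a, region_b = region_of(touch_a), region_of(touch_b)
        return region_a is not None and region_a == region_b

    # Two fingers of the same hand -> rotate/scale the content;
    # fingers of different hands -> rotate/scale the window.
    # touches_from_same_hand((120, 80), (140, 95), [(100, 60, 180, 140)])  -> True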
  • FIG. 200 is an explanatory diagram illustrating an example of a manipulation on the window by the user and an example of a manipulation when the content which is scrolled is displayed in the window.
  • the information processing system 100 executes display control such that the content is rotated.
  • the information processing system 100 executes display control such that the entire window 3300 is rotated.
  • the information processing system 100 executes display control such that the content is scaled.
  • the information processing system 100 executes display control such that the entire window 3300 is scaled.
  • in some cases, the content comes to be scrolled in the window as a result of rotating or scaling the content.
  • examples of an application capable of displaying such a window include an application for drawing an illustration and an application for displaying a map.
  • the information processing system 100 may allow the rotated or scaled window to transition to a process for a window in which the content is scrolled.
  • FIG. 201 is an explanatory diagram illustrating an example of a manipulation on the window by the user.
  • (1) of FIG. 201 illustrates an example of a manipulation when the content which is not scrolled is displayed in the window and the user executes an expanding manipulation on the content.
  • a scroll margin is generated, as illustrated in (2) of FIG. 201 . Accordingly, when the content that is not scrolled is displayed in the window and an expanding manipulation is executed on the content by the user, the information processing system 100 executes display control such that the window becomes a window in which the content which is scrolled is displayed.
  • a scroll margin is generated, as illustrated in (3) of FIG. 201 . Accordingly, when the content which is not scrolled is displayed in the window and a rotating manipulation is executed on the content by the user, the information processing system 100 executes display control such that the window becomes a window in which the content which is scrolled is displayed.
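  • The transition illustrated in FIG. 201 can be thought of as checking, after each scaling or rotation of the content, whether the content's bounding box now exceeds the window so that a scroll margin appears; a minimal sketch using an axis-aligned bounding box (an illustrative helper, not the patent's method) follows.

    import math

    def has_scroll_margin(content_w, content_h, scale, angle_deg, win_w, win_h):
        # Return True if the scaled and rotated content no longer fits inside the
        # window, i.e. a scroll margin is generated and the window should switch to
        # the behavior for content which is scrolled. The axis-aligned bounding box
        # of the rotated content is used as a simple approximation.
        angle = math.radians(angle_deg)
        w, h = content_w * scale, content_h * scale
        bbox_w = abs(w * math.cos(angle)) + abs(h * math.sin(angle))
        bbox_h = abs(w * math.sin(angle)) + abs(h * math.cos(angle))
        return bbox_w > win_w or bbox_h > win_h

    # Expanding 800x600 content by 1.5x inside an 800x600 window:
    # has_scroll_margin(800, 600, 1.5, 0, 800, 600)   -> True
    # Rotating it by 30 degrees without scaling also produces a margin:
    # has_scroll_margin(800, 600, 1.0, 30, 800, 600)  -> True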
  • FIG. 202 is an explanatory diagram illustrating an example of a manipulation on the window by the user.
  • (1) illustrates a display control example of the window by the information processing system 100 when the user touches a predetermined position of the window and executes a direct moving manipulation.
  • the information processing system 100 may execute display control such that the touch position on the window remains fixed and the window is moved while being rotated until the outer product (cross product) of the direction in which the user drags the window and the direction from the center of the window to the touch position of the user becomes 0.
  • the user can rotate the window 180 degrees by first dragging the right side (or the left side) of the window to the lower right (or lower left) and rotating the window a predetermined amount, dragging the upper side of the rotated window to the upper right (or upper left) and rotating the window a predetermined amount, and dragging the right side (or the left side) of the window to the lower right (or lower left) again and rotating the window a predetermined amount.
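  • The rotation rule above (keep the touched point pinned and rotate until the cross product of the drag direction and the center-to-touch direction becomes 0, i.e. the two directions become parallel) can be sketched as a per-frame update; the step clamp and vector handling below are assumptions rather than the patent's implementation.

    import math

    def rotation_step(center, touch, drag_dir, max_step_deg=5.0):
        # Return a small signed rotation (in degrees) that turns the window toward
        # the state in which the vector from the window center to the touch point is
        # parallel to the drag direction; the 2D cross product of the two directions
        # is 0 exactly when that state is reached.
        rx, ry = touch[0] - center[0], touch[1] - center[1]   # center -> touch point
        dx, dy = drag_dir
        cross = rx * dy - ry * dx
        dot = rx * dx + ry * dy
        misalignment = math.degrees(math.atan2(cross, dot))   # signed angle between them
        return max(-max_step_deg, min(max_step_deg, misalignment))

    # Called every frame while the user drags: rotate the window by rotation_step(...)
    # about the touch point and translate it so the touched point stays under the finger.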
  • FIG. 203 is an explanatory diagram illustrating an example of a manipulation on the window by the user.
  • (1) illustrates a state in which the content which is scrolled is displayed in the window 3300 .
  • the information processing system 100 allows the manipulation to operate on the content displayed in the window, as illustrated in (2) of FIG. 203 .
  • the information processing system 100 executes display control such that the content is scrolled inside the window 3300 .
  • the information processing system 100 executes display control such that the content is scaled inside the window 3300 .
  • the information processing system 100 executes display control such that the content is rotated inside the window 3300 .
  • the information processing system 100 displays a window handle for manipulating the window 3300 around the window 3300 , as illustrated in (3) of FIG. 203 .
  • the information processing system 100 executes display control such that the window 3300 is moved according to the moving manipulation by the user.
  • FIG. 204 is an explanatory diagram illustrating an example of a manipulation on the window by the user and an example of a manipulation when the content which is scrolled is displayed in the window.
  • (1) illustrates a display control example by the information processing system 100 when the content which is scrolled is displayed in the window and the user executes a rotating manipulation on the content. That is, when the user executes a rotating manipulation with one hand, the information processing system 100 executes display control such that the content in the window is rotated according to the rotating manipulation by the user.
  • (2) illustrates a display control example by the information processing system 100 when the content which is scrolled is displayed in the window and the user executes a rotating manipulation on the window. That is, when the user executes a rotating manipulation with both hands, the information processing system 100 executes display control such that the entire window is rotated according to the rotating manipulation by the user.
  • (3) illustrates a display control example by the information processing system 100 when the content which is scrolled is displayed in the window and the user executes a scaling manipulation on the content. That is, when the user executes a scaling manipulation with one hand, the information processing system 100 executes display control such that the content in the window is scaled according to the scaling manipulation by the user.
  • FIG. 204 (4) illustrates a display control example by the information processing system 100 when the content which is scrolled is displayed in the window and the user executes a scaling manipulation on the window. That is, when the user executes a scaling manipulation with both hands, the information processing system 100 executes display control such that the entire window is scaled according to the scaling manipulation by the user.
  • FIG. 205 is an explanatory diagram illustrating an example of a manipulation on the window by the user.
  • the information processing system 100 may execute display control such that the window is moved after the window is rotated in a direction in which the window is dragged by the user.
  • the rotation direction of the window may be an incident direction of a finger or may be a movement direction of the window.
  • the information processing system 100 may execute display control such that an outside frame is provided around the window and the window is moved while the window is rotated according to a moving manipulation on the outside frame.
  • FIG. 206 is an explanatory diagram illustrating an example of a manipulation on the window by the user.
  • the information processing system 100 may execute display control such that the window is moved after the window is rotated in a direction in which the window is dragged by the user.
  • the information processing system 100 may execute display control such that the window is moved while the window is rotated by a special gesture by the user. For example, the information processing system 100 may execute display control on the window by assuming that a manipulation other than a tap manipulation on the content is a manipulation on the window.
  • As described above, the information processing system 100 may execute display control on the window by assuming that a manipulation with one hand on a window in which the content is scrolled is a manipulation on the content and that a manipulation on the window with both hands is a manipulation on the window. Alternatively, it may execute display control on the window by displaying an outside frame around the window only for a predetermined time at the time of a manipulation and assuming that a manipulation on the outside frame is a manipulation on the window.
  • FIG. 207 is an explanatory diagram illustrating an example of a manipulation on the window by the user.
  • the information processing system 100 may execute display control such that the window 3300 is returned inside the screen (display region) by a reaction according to the amount by which the window 3300 protrudes outside the screen (display region), as in (2) of FIG. 207.
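  • The reaction according to the amount of protrusion behaves like a spring pulling the window back into the display region; a minimal per-frame sketch under that assumption (the rectangle representation and the spring factor k are illustrative) follows.

    def springback_offset(win_rect, screen_rect, k=0.2):
        # Compute a per-frame (dx, dy) that pushes the window back toward the screen
        # with a reaction proportional to how far it protrudes. Rectangles are
        # (left, top, right, bottom) tuples; k is an illustrative spring factor.
        wl, wt, wr, wb = win_rect
        sl, st, sr, sb = screen_rect
        dx = dy = 0.0
        if wl < sl:                  # protruding beyond the left edge -> push right
            dx = k * (sl - wl)
        elif wr > sr:                # protruding beyond the right edge -> push left
            dx = -k * (wr - sr)
        if wt < st:                  # protruding beyond the top edge -> push down
            dy = k * (st - wt)
        elif wb > sb:                # protruding beyond the bottom edge -> push up
            dy = -k * (wb - sb)
        return dx, dy

    # Applied each frame after the user releases the window, the offset shrinks as
    # the window returns inside the display region.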
  • FIG. 208 is an explanatory diagram illustrating an example of a manipulation on the window by the user.
  • the information processing system 100 may execute display control such that the window 3300 is closed or the window 3300 is minimized, as in FIG. 208 .

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Otolaryngology (AREA)
  • Computer Hardware Design (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • Projection Apparatus (AREA)
  • Electrophonic Musical Instruments (AREA)
  • Stored Programmes (AREA)
  • Digital Computer Display Output (AREA)
  • Light Sources And Details Of Projection-Printing Devices (AREA)
  • Circuit For Audible Band Transducer (AREA)
  • Arrangement Of Elements, Cooling, Sealing, Or The Like Of Lighting Devices (AREA)
  • Transforming Electric Information Into Light Information (AREA)
US15/106,717 2013-12-27 2014-09-02 Control device, control method, and computer program Abandoned US20170038892A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2013273369 2013-12-27
JP2013-273369 2013-12-27
PCT/JP2014/073107 WO2015098190A1 (ja) 2013-12-27 2014-09-02 制御装置、制御方法及びコンピュータプログラム

Publications (1)

Publication Number Publication Date
US20170038892A1 true US20170038892A1 (en) 2017-02-09

Family

ID=53478073

Family Applications (4)

Application Number Title Priority Date Filing Date
US15/106,679 Abandoned US20170039030A1 (en) 2013-12-27 2014-09-02 Display control device, display control method, and program
US15/106,641 Active US11146771B2 (en) 2013-12-27 2014-09-02 Display control device, display control method, and program
US15/106,717 Abandoned US20170038892A1 (en) 2013-12-27 2014-09-02 Control device, control method, and computer program
US15/106,540 Abandoned US20170041581A1 (en) 2013-12-27 2014-09-02 Control device, control method, and computer program

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US15/106,679 Abandoned US20170039030A1 (en) 2013-12-27 2014-09-02 Display control device, display control method, and program
US15/106,641 Active US11146771B2 (en) 2013-12-27 2014-09-02 Display control device, display control method, and program

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/106,540 Abandoned US20170041581A1 (en) 2013-12-27 2014-09-02 Control device, control method, and computer program

Country Status (9)

Country Link
US (4) US20170039030A1 (ru)
EP (4) EP3089012A4 (ru)
JP (5) JPWO2015098190A1 (ru)
KR (4) KR20160102411A (ru)
CN (4) CN105850116A (ru)
BR (2) BR112016014491A2 (ru)
RU (4) RU2016124467A (ru)
TW (4) TW201531917A (ru)
WO (4) WO2015098190A1 (ru)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170234680A1 (en) * 2016-02-17 2017-08-17 Boe Technology Group Co., Ltd. Backlight source flatness detection system and backlight source flatness detection method
US20180011586A1 (en) * 2016-07-07 2018-01-11 Samsung Display Co., Ltd. Multi-touch display panel and method of controlling the same
US20180196503A1 (en) * 2015-09-18 2018-07-12 Sony Corporation Information processing device, information processing method, and program
US20200202150A1 (en) * 2018-12-21 2020-06-25 Toyota Jidosha Kabushiki Kaisha Control device, vehicle, image display system, and image display method
US10809870B2 (en) 2017-02-09 2020-10-20 Sony Corporation Information processing apparatus and information processing method
US10942616B2 (en) 2018-12-28 2021-03-09 Beijing Xiaomi Mobile Software Co., Ltd. Multimedia resource management method and apparatus, and storage medium
EP3852070A4 (en) * 2018-09-11 2021-11-24 Sony Group Corporation INFORMATION PROCESSING DEVICE, DRAWING CONTROL METHOD AND RECORDING MEDIUM WITH THE PROGRAM RECORDED ON IT

Families Citing this family (118)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5942840B2 (ja) * 2012-12-21 2016-06-29 ソニー株式会社 表示制御システム及び記録媒体
USD852828S1 (en) * 2013-04-05 2019-07-02 Thales Avionics, Inc. Display screen or portion thereof with graphical user interface
USD738889S1 (en) * 2013-06-09 2015-09-15 Apple Inc. Display screen or portion thereof with animated graphical user interface
KR102271853B1 (ko) * 2014-03-21 2021-07-01 삼성전자주식회사 전자 장치, 영상 처리 방법, 및 컴퓨터 판독가능 기록매체
CN110067971B (zh) * 2014-12-26 2021-08-27 麦克赛尔株式会社 照明装置
US11106314B2 (en) * 2015-04-21 2021-08-31 Dell Products L.P. Continuous calibration of an information handling system projected user interface
KR20160125783A (ko) * 2015-04-22 2016-11-01 삼성전자주식회사 컨텐츠를 표시하기 위한 방법 및 전자 장치
USD795917S1 (en) 2015-05-17 2017-08-29 Google Inc. Display screen with an animated graphical user interface
JP6213613B2 (ja) * 2015-05-25 2017-10-18 キヤノンマーケティングジャパン株式会社 情報処理装置、その制御方法、及びプログラム、並びに、情報処理システム、その制御方法、及びプログラム
WO2017006426A1 (ja) * 2015-07-07 2017-01-12 日立マクセル株式会社 表示システム、ウェアラブルデバイス、および映像表示装置
EP3343339A4 (en) * 2015-08-24 2019-03-27 Sony Corporation INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING PROCESS AND PROGRAM
WO2017056147A1 (ja) 2015-09-28 2017-04-06 日立マクセル株式会社 照明装置
JP6650612B2 (ja) * 2015-10-06 2020-02-19 パナソニックIpマネジメント株式会社 照明制御装置及び照明システム
US10645309B2 (en) * 2015-11-06 2020-05-05 Intel Corporation Systems, methods, and apparatuses for implementing maximum likelihood image binarization in a coded light range camera
CN112911197A (zh) 2015-11-24 2021-06-04 麦克赛尔株式会社 影像信息处理系统、移动通信终端和影像信息处理方法
JP2017102290A (ja) * 2015-12-02 2017-06-08 株式会社 オルタステクノロジー 投影表示装置
JP6645151B2 (ja) * 2015-12-07 2020-02-12 富士通株式会社 投影装置、投影方法及び投影用コンピュータプログラム
US20170185364A1 (en) * 2015-12-28 2017-06-29 Lunatech, Llc Methods and Systems For a Dual Function Multimedia Device
USD793440S1 (en) * 2016-01-26 2017-08-01 Google Inc. Display screen with transitional graphical user interface
US10129510B2 (en) * 2016-02-18 2018-11-13 Samsung Electronics Co., Ltd. Initiating human-machine interaction based on visual attention
JP2017146927A (ja) * 2016-02-19 2017-08-24 ソニーモバイルコミュニケーションズ株式会社 制御装置、制御方法及びプログラム
KR101773772B1 (ko) * 2016-03-17 2017-09-04 주식회사 아보네 그림자 출력장치를 이용한 주문 관리 시스템
KR102462409B1 (ko) 2016-05-16 2022-11-02 센센 네트웍스 그룹 피티와이 엘티디 자동화된 테이블 게임 활동 인식을 위한 시스템 및 방법
US10481863B2 (en) 2016-07-06 2019-11-19 Baidu Usa Llc Systems and methods for improved user interface
USD817337S1 (en) * 2016-07-07 2018-05-08 Baidu Usa Llc Display screen or portion thereof with graphical user interface
USD815110S1 (en) * 2016-07-07 2018-04-10 Baidu Usa Llc Display screen or portion thereof with graphical user interface
USD812635S1 (en) 2016-07-07 2018-03-13 Baidu Usa Llc. Display screen or portion thereof with graphical user interface
KR20180017796A (ko) 2016-08-11 2018-02-21 주식회사 엘지화학 황-탄소 복합체, 이의 제조방법 및 이를 포함하는 리튬-황 전지
JP6980990B2 (ja) * 2016-08-31 2021-12-15 ソニーグループ株式会社 情報処理システム、情報処理方法、およびプログラム
JP6556680B2 (ja) * 2016-09-23 2019-08-07 日本電信電話株式会社 映像生成装置、映像生成方法、およびプログラム
WO2018096804A1 (ja) * 2016-11-25 2018-05-31 ソニー株式会社 情報処理装置、情報処理方法及びプログラム
CN106730827B (zh) 2016-12-06 2018-10-19 腾讯科技(深圳)有限公司 一种对象显示的方法以及终端设备
KR20190099207A (ko) 2016-12-27 2019-08-26 소니 주식회사 정보 처리 장치, 정보 처리 방법 및 컴퓨터 프로그램
WO2018123475A1 (ja) * 2016-12-27 2018-07-05 ソニー株式会社 情報処理装置、情報処理方法及びコンピュータプログラム
JP2018106535A (ja) * 2016-12-27 2018-07-05 ソニー株式会社 情報処理装置、情報処理方法及びコンピュータプログラム
JP2018128979A (ja) * 2017-02-10 2018-08-16 パナソニックIpマネジメント株式会社 厨房支援システム
WO2018150756A1 (ja) * 2017-02-15 2018-08-23 ソニー株式会社 情報処理装置、情報処理方法及び記憶媒体
JP6903935B2 (ja) * 2017-02-17 2021-07-14 ソニーグループ株式会社 情報処理システム、情報処理方法、およびプログラム
JP6769346B2 (ja) * 2017-03-02 2020-10-14 富士通株式会社 制御プログラム、制御方法、および情報処理端末
CN108570820B (zh) * 2017-03-07 2021-12-14 青岛胶南海尔洗衣机有限公司 一种应用于家用电器的显示装置及洗衣机
CN108570823B (zh) * 2017-03-07 2021-09-17 青岛胶南海尔洗衣机有限公司 一种应用于家用电器的显示装置及洗衣机
US20180260105A1 (en) 2017-03-08 2018-09-13 Nanning Fugui Precision Industrial Co., Ltd. Method for displaying sub-screen and device using the same
USD834611S1 (en) * 2017-03-14 2018-11-27 Facebook, Inc. Display panel of a programmed computer system with a graphical user interface
USD854560S1 (en) * 2017-03-17 2019-07-23 Health Management Systems, Inc. Display screen with animated graphical user interface
US10511818B2 (en) * 2017-03-29 2019-12-17 Intel Corporation Context aware projection
WO2018198156A1 (ja) * 2017-04-24 2018-11-01 三菱電機株式会社 報知制御装置および報知制御方法
JP6463527B2 (ja) * 2017-05-12 2019-02-06 キヤノン株式会社 情報処理装置およびその制御方法、並びにプログラム
US10466889B2 (en) 2017-05-16 2019-11-05 Apple Inc. Devices, methods, and graphical user interfaces for accessing notifications
JP2018205478A (ja) * 2017-06-02 2018-12-27 セイコーエプソン株式会社 表示装置および表示装置の制御方法
US11082669B2 (en) * 2017-06-19 2021-08-03 Sony Corporation Image projection apparatus, image processing apparatus, image projection method, image processing method, and image projection system
CN107469351B (zh) * 2017-06-20 2021-02-09 网易(杭州)网络有限公司 游戏画面显示方法及装置、存储介质、电子设备
CN107277476B (zh) * 2017-07-20 2023-05-12 苏州名雅科技有限责任公司 一种适合在旅游景点供儿童互动体验的多媒体设备
CN107231516A (zh) * 2017-07-27 2017-10-03 重庆万建电子工程有限责任公司重庆第分公司 带刮灰结构的半球型摄像机
US11417135B2 (en) * 2017-08-23 2022-08-16 Sony Corporation Information processing apparatus, information processing method, and program
USD859429S1 (en) * 2017-08-30 2019-09-10 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD857710S1 (en) * 2017-08-30 2019-08-27 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
KR20200041877A (ko) * 2017-08-31 2020-04-22 소니 주식회사 정보 처리 장치, 정보 처리 방법 및 프로그램
WO2019058521A1 (ja) * 2017-09-22 2019-03-28 popIn株式会社 プロジェクタおよびプロジェクタシステム
USD857722S1 (en) * 2017-10-24 2019-08-27 Atlas Biomed Group Limited Display screen with graphical user interface
KR101972134B1 (ko) * 2017-10-31 2019-05-03 화남전자 주식회사 건설장비의 자동 캘리브레이션을 위한 변형 방지 삼각형 마커
US11144153B2 (en) * 2017-12-07 2021-10-12 Elliptic Laboratories As User interface with acoustic proximity and position sensing arrangements
USD841047S1 (en) * 2017-12-11 2019-02-19 Citrix Systems, Inc. Display screen or portion thereof with transitional graphical user interface
CN109920320B (zh) * 2017-12-12 2020-12-25 成都金都超星天文设备有限公司 一种操控天象仪进行天象演示的方法及控制系统
KR102007661B1 (ko) * 2017-12-15 2019-08-06 원광대학교산학협력단 원격주문 제어시스템 및 그의 제어방법
JP6693495B2 (ja) * 2017-12-15 2020-05-13 ソニー株式会社 情報処理装置、情報処理方法及び記録媒体
USD868800S1 (en) * 2018-01-05 2019-12-03 Google Llc Display screen or portion thereof with graphical user interface
JP7124336B2 (ja) * 2018-02-23 2022-08-24 京セラドキュメントソリューションズ株式会社 表示制御装置
JP7067608B2 (ja) * 2018-03-02 2022-05-16 日本電気株式会社 情報処理システム
FR3079049B1 (fr) * 2018-03-13 2021-10-15 Immersion Procede de manipulation de contenant et contenu d'une information numerique
WO2019190511A1 (en) * 2018-03-28 2019-10-03 Rovi Guides, Inc. Systems and methods to provide media asset recommendations based on positioning of internet connected objects on an network-connected surface
CN108525285A (zh) * 2018-03-31 2018-09-14 千本樱科技成都有限公司 一种麻将自动算牌器及方法
CN108613151A (zh) * 2018-05-25 2018-10-02 讯飞幻境(北京)科技有限公司 一种智能台灯的控制系统
US10433086B1 (en) 2018-06-25 2019-10-01 Biamp Systems, LLC Microphone array with automated adaptive beam tracking
US10694285B2 (en) 2018-06-25 2020-06-23 Biamp Systems, LLC Microphone array with automated adaptive beam tracking
US10210882B1 (en) * 2018-06-25 2019-02-19 Biamp Systems, LLC Microphone array with automated adaptive beam tracking
WO2020018592A1 (en) 2018-07-17 2020-01-23 Methodical Mind, Llc. Graphical user interface system
US20220036076A1 (en) * 2018-10-01 2022-02-03 Sony Corporation Information processing apparatus, information processing method, and recording medium
TWI699543B (zh) * 2018-11-09 2020-07-21 國立雲林科技大學 主動式打火機聲源辨識系統
WO2020105299A1 (ja) 2018-11-21 2020-05-28 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
US10884614B1 (en) * 2018-11-30 2021-01-05 Zoox, Inc. Actuation interface
CN109656439A (zh) * 2018-12-17 2019-04-19 北京小米移动软件有限公司 快捷操作面板的显示方法、装置及存储介质
KR102653252B1 (ko) * 2019-02-21 2024-04-01 삼성전자 주식회사 외부 객체의 정보에 기반하여 시각화된 인공 지능 서비스를 제공하는 전자 장치 및 전자 장치의 동작 방법
JP7188176B2 (ja) * 2019-02-25 2022-12-13 セイコーエプソン株式会社 プロジェクター、画像表示システム及び画像表示システムの制御方法
TWI739069B (zh) * 2019-03-04 2021-09-11 仁寶電腦工業股份有限公司 遊戲裝置與辨識遊戲裝置的方法
US11234088B2 (en) 2019-04-16 2022-01-25 Biamp Systems, LLC Centrally controlling communication at a venue
US10743105B1 (en) * 2019-05-31 2020-08-11 Microsoft Technology Licensing, Llc Sending audio to various channels using application location information
TWI714163B (zh) * 2019-07-11 2020-12-21 緯創資通股份有限公司 控制投影內容的方法及電子裝置
JP7363163B2 (ja) * 2019-07-26 2023-10-18 日本電気株式会社 監視装置、監視方法、および、プログラム、並びに、監視システム
JP7285481B2 (ja) * 2019-07-30 2023-06-02 パナソニックIpマネジメント株式会社 照明システム、及び、制御装置
USD953347S1 (en) * 2019-09-02 2022-05-31 Huawei Technologies Co., Ltd. Electronic display for a wearable device presenting a graphical user interface
JP7363235B2 (ja) * 2019-09-10 2023-10-18 富士フイルムビジネスイノベーション株式会社 情報処理装置及び情報処理プログラム
JP7319888B2 (ja) * 2019-10-16 2023-08-02 日本たばこ産業株式会社 情報処理装置、プログラム及び情報処理システム
US11809662B2 (en) 2020-03-04 2023-11-07 Abusizz Ag Interactive display apparatus and method for operating the same
DK202070608A1 (en) 2020-03-10 2021-11-16 Apple Inc Devices, methods, and graphical user interfaces for interacting with user interface objects corresponding to applications
EP3933560A1 (en) * 2020-06-30 2022-01-05 Spotify AB Methods and systems for providing animated visual feedback for voice commands
US11481938B2 (en) * 2020-10-02 2022-10-25 Adobe Inc. Adaptable drawing guides
US11741517B1 (en) 2020-10-26 2023-08-29 Wells Fargo Bank, N.A. Smart table system for document management
US11429957B1 (en) 2020-10-26 2022-08-30 Wells Fargo Bank, N.A. Smart table assisted financial health
US11397956B1 (en) 2020-10-26 2022-07-26 Wells Fargo Bank, N.A. Two way screen mirroring using a smart table
US11727483B1 (en) 2020-10-26 2023-08-15 Wells Fargo Bank, N.A. Smart table assisted financial health
US11740853B1 (en) 2020-10-26 2023-08-29 Wells Fargo Bank, N.A. Smart table system utilizing extended reality
US11457730B1 (en) 2020-10-26 2022-10-04 Wells Fargo Bank, N.A. Tactile input device for a touch screen
USD973068S1 (en) * 2020-11-10 2022-12-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
JP7112627B2 (ja) * 2020-11-30 2022-08-04 ソニーグループ株式会社 制御装置、制御方法及びプログラム
JP7321196B2 (ja) * 2021-01-05 2023-08-04 三菱電機株式会社 情報処理装置、情報処理方法、および情報処理プログラム
JP7243748B2 (ja) * 2021-02-03 2023-03-22 セイコーエプソン株式会社 設定方法、及びプログラム
US20220291795A1 (en) * 2021-03-09 2022-09-15 Lenovo (Singapore) Pte. Ltd. Projecting interfaces on a surface
WO2022239152A1 (ja) * 2021-05-12 2022-11-17 日本電信電話株式会社 情報提示装置、情報提示方法、及びプログラム
US11545118B2 (en) * 2021-05-19 2023-01-03 Business Objects Software Ltd. Display of out-of-window status indicators in a virtual shelf of a diagram window
CN113568591B (zh) * 2021-06-15 2023-06-20 青岛海尔科技有限公司 智能设备的控制方法及控制装置、智能设备、智能餐桌
KR102364497B1 (ko) * 2021-10-20 2022-02-18 (주)유비더스시스템 인테리어 매질을 이용한 정전기식 터치 스크린 시스템
CN113873421B (zh) * 2021-12-01 2022-03-22 杭州当贝网络科技有限公司 一种基于投屏设备实现天空声音效的方法和系统
KR102443108B1 (ko) * 2022-05-02 2022-09-14 경상남도 (교육청) 스마트 홈 암호 찾기 게임을 수행하기 위한 스마트 홈 암호 찾기 게임 장치
KR102446676B1 (ko) * 2022-05-02 2022-09-26 (주) 아하 AI와 IoT 기능을 융합하여 음성인식과 원격제어를 처리하는 스마트테이블 및 그 동작방법
US11842028B2 (en) 2022-05-06 2023-12-12 Apple Inc. Devices, methods, and graphical user interfaces for updating a session region
EP4273676A1 (en) 2022-05-06 2023-11-08 Apple Inc. Devices, methods, and graphical user interfaces for updating a session region
TWI823789B (zh) * 2022-05-27 2023-11-21 陳泰生 電動麻將桌收牌啟閉裝置
JP7280453B1 (ja) * 2023-02-07 2023-05-23 株式会社サン・コンピュータ 麻雀点数計算システム、麻雀点数計算方法、および麻雀点数計算プログラム

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110031003A1 (en) * 2008-04-10 2011-02-10 Rogers Corporation Circuit materials with improved bond, method of manufacture thereof, and articles formed therefrom
US20110294433A1 (en) * 2010-05-28 2011-12-01 Sony Corporation Information processing apparatus, information processing system, and program
US20120284619A1 (en) * 2009-12-23 2012-11-08 Nokia Corporation Apparatus

Family Cites Families (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05107638A (ja) * 1991-10-18 1993-04-30 Fujitsu General Ltd 投射型表示装置
JPH08123652A (ja) 1994-10-26 1996-05-17 Sony Corp 表示装置
US7259747B2 (en) * 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
JP2004233845A (ja) * 2003-01-31 2004-08-19 Toshiba Corp 情報処理装置および表示輝度制御方法
US7370284B2 (en) * 2003-11-18 2008-05-06 Laszlo Systems, Inc. User interface for displaying multiple applications
US7394459B2 (en) * 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
JP3899375B2 (ja) 2004-09-03 2007-03-28 国立大学法人北陸先端科学技術大学院大学 表示制御装置
JP4664665B2 (ja) * 2004-12-22 2011-04-06 オリンパスイメージング株式会社 デジタルプラットフォーム装置
JP2006287735A (ja) * 2005-04-01 2006-10-19 Fuji Photo Film Co Ltd 画像音声記録装置及び集音方向調整方法
JP2007036482A (ja) * 2005-07-25 2007-02-08 Nippon Telegr & Teleph Corp <Ntt> 情報投影表示装置およびプログラム
US7911444B2 (en) * 2005-08-31 2011-03-22 Microsoft Corporation Input method for surface of interactive display
JP2007282077A (ja) * 2006-04-11 2007-10-25 Seiko Epson Corp 放送受信装置および放送受信方法
US7552402B2 (en) * 2006-06-22 2009-06-23 Microsoft Corporation Interface orientation using shadows
JP2008033049A (ja) * 2006-07-28 2008-02-14 Ricoh Co Ltd 対象物指示装置
DE602006003096D1 (de) * 2006-08-04 2008-11-20 Harman Becker Automotive Sys Methode und System zur Verarbeitung von Stimmkommandos in einer Fahrzeugumgebung
JP5464786B2 (ja) * 2006-12-21 2014-04-09 キヤノン株式会社 情報処理装置、制御方法、及び制御プログラム
JP2009025921A (ja) * 2007-07-17 2009-02-05 Canon Inc 情報処理装置及びその制御方法、プログラム、記録媒体
US8427424B2 (en) * 2008-09-30 2013-04-23 Microsoft Corporation Using physical objects in conjunction with an interactive surface
JP2010113455A (ja) * 2008-11-05 2010-05-20 Toshiba Corp 画像処理装置および画像処理プログラム
JP2011040227A (ja) * 2009-08-07 2011-02-24 Sharp Corp 照明装置、照明システム及び照明装置の制御方法
JP4687820B2 (ja) 2009-11-09 2011-05-25 ソニー株式会社 情報入力装置及び情報入力方法
JP2013518383A (ja) 2010-01-27 2013-05-20 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ ビデオ照明システムの制御方法
JP5648298B2 (ja) * 2010-03-15 2015-01-07 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
JP2011227199A (ja) * 2010-04-16 2011-11-10 Nec Casio Mobile Communications Ltd 雑音抑圧装置、雑音抑圧方法及びプログラム
JP5516102B2 (ja) * 2010-06-11 2014-06-11 セイコーエプソン株式会社 光学式位置検出装置、電子機器及び表示装置
JP2012058704A (ja) * 2010-09-13 2012-03-22 Sanyo Electric Co Ltd 録音装置、録音条件設定方法および録音条件設定プログラム
US8682030B2 (en) * 2010-09-24 2014-03-25 Microsoft Corporation Interactive display
US8502816B2 (en) * 2010-12-02 2013-08-06 Microsoft Corporation Tabletop display providing multiple views to users
KR101818024B1 (ko) * 2011-03-29 2018-01-12 퀄컴 인코포레이티드 각각의 사용자의 시점에 대해 공유된 디지털 인터페이스들의 렌더링을 위한 시스템
US8643703B1 (en) * 2011-03-30 2014-02-04 Amazon Technologies, Inc. Viewer tracking image display
US20130057587A1 (en) * 2011-09-01 2013-03-07 Microsoft Corporation Arranging tiles
WO2013056157A1 (en) * 2011-10-13 2013-04-18 Autodesk, Inc. Proximity-aware multi-touch tabletop
JP5957893B2 (ja) * 2012-01-13 2016-07-27 ソニー株式会社 情報処理装置及び情報処理方法、並びにコンピューター・プログラム
US9223415B1 (en) * 2012-01-17 2015-12-29 Amazon Technologies, Inc. Managing resource usage for task performance
JP2013182624A (ja) * 2012-03-01 2013-09-12 Toshiba Corp 情報処理装置、スケジュール表示プログラム
JP6028351B2 (ja) * 2012-03-16 2016-11-16 ソニー株式会社 制御装置、電子機器、制御方法、及びプログラム
JP2013214259A (ja) * 2012-04-04 2013-10-17 Sharp Corp 表示装置
JP6217058B2 (ja) 2012-05-09 2017-10-25 セイコーエプソン株式会社 画像表示システム
US8837780B2 (en) * 2012-06-22 2014-09-16 Hewlett-Packard Development Company, L.P. Gesture based human interfaces
US9285893B2 (en) * 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
CN103092344A (zh) 2013-01-11 2013-05-08 深圳市金立通信设备有限公司 一种控制终端屏幕画面旋转的方法及终端
WO2014123224A1 (ja) * 2013-02-08 2014-08-14 株式会社ニコン 電子制御装置、制御方法、及び制御プログラム
TWI553544B (zh) * 2013-02-27 2016-10-11 緯創資通股份有限公司 電子裝置及影像調整方法
US9384751B2 (en) * 2013-05-06 2016-07-05 Honeywell International Inc. User authentication of voice controlled devices
KR102070623B1 (ko) * 2013-07-09 2020-01-29 삼성전자 주식회사 비트 라인 등화 회로

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110031003A1 (en) * 2008-04-10 2011-02-10 Rogers Corporation Circuit materials with improved bond, method of manufacture thereof, and articles formed therefrom
US20120284619A1 (en) * 2009-12-23 2012-11-08 Nokia Corporation Apparatus
US20110294433A1 (en) * 2010-05-28 2011-12-01 Sony Corporation Information processing apparatus, information processing system, and program

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180196503A1 (en) * 2015-09-18 2018-07-12 Sony Corporation Information processing device, information processing method, and program
US10564712B2 (en) * 2015-09-18 2020-02-18 Sony Corporation Information processing device, information processing method, and program
US20170234680A1 (en) * 2016-02-17 2017-08-17 Boe Technology Group Co., Ltd. Backlight source flatness detection system and backlight source flatness detection method
US9885565B2 (en) * 2016-02-17 2018-02-06 Boe Technology Group Co., Ltd. Backlight source flatness detection system and backlight source flatness detection method
US20180011586A1 (en) * 2016-07-07 2018-01-11 Samsung Display Co., Ltd. Multi-touch display panel and method of controlling the same
US10558288B2 (en) * 2016-07-07 2020-02-11 Samsung Display Co., Ltd. Multi-touch display panel and method of controlling the same
US10809870B2 (en) 2017-02-09 2020-10-20 Sony Corporation Information processing apparatus and information processing method
EP3852070A4 (en) * 2018-09-11 2021-11-24 Sony Group Corporation INFORMATION PROCESSING DEVICE, DRAWING CONTROL METHOD AND RECORDING MEDIUM WITH THE PROGRAM RECORDED ON IT
US20200202150A1 (en) * 2018-12-21 2020-06-25 Toyota Jidosha Kabushiki Kaisha Control device, vehicle, image display system, and image display method
US11468692B2 (en) * 2018-12-21 2022-10-11 Toyota Jidosha Kabushiki Kaisha Control device, vehicle, image display system, and image display method
US10942616B2 (en) 2018-12-28 2021-03-09 Beijing Xiaomi Mobile Software Co., Ltd. Multimedia resource management method and apparatus, and storage medium

Also Published As

Publication number Publication date
US20170031530A1 (en) 2017-02-02
EP3089446A4 (en) 2017-10-25
JP2019185812A (ja) 2019-10-24
WO2015098187A1 (ja) 2015-07-02
RU2016124467A (ru) 2017-12-25
JPWO2015098188A1 (ja) 2017-03-23
TW201525772A (zh) 2015-07-01
JP6428641B2 (ja) 2018-11-28
TW201531917A (zh) 2015-08-16
JPWO2015098187A1 (ja) 2017-03-23
EP3089012A1 (en) 2016-11-02
TW201525835A (zh) 2015-07-01
RU2016124466A (ru) 2017-12-25
JPWO2015098189A1 (ja) 2017-03-23
EP3089011A4 (en) 2017-11-15
CN105830005A (zh) 2016-08-03
TW201528112A (zh) 2015-07-16
EP3089011A1 (en) 2016-11-02
EP3089013A4 (en) 2017-11-22
EP3089012A4 (en) 2017-08-02
EP3089446A1 (en) 2016-11-02
RU2016124465A (ru) 2017-12-25
CN106134190A (zh) 2016-11-16
JPWO2015098190A1 (ja) 2017-03-23
KR20160102412A (ko) 2016-08-30
KR20160102411A (ko) 2016-08-30
US20170041581A1 (en) 2017-02-09
CN106104455A (zh) 2016-11-09
WO2015098190A1 (ja) 2015-07-02
RU2016124468A (ru) 2017-12-25
BR112016014515A2 (pt) 2017-08-08
US11146771B2 (en) 2021-10-12
JP6555129B2 (ja) 2019-08-07
BR112016014491A2 (pt) 2017-08-08
JP6795061B2 (ja) 2020-12-02
WO2015098188A1 (ja) 2015-07-02
WO2015098189A1 (ja) 2015-07-02
EP3089013A1 (en) 2016-11-02
KR20160102179A (ko) 2016-08-29
CN105850116A (zh) 2016-08-10
KR20160102180A (ko) 2016-08-29
US20170039030A1 (en) 2017-02-09

Similar Documents

Publication Publication Date Title
US11146771B2 (en) Display control device, display control method, and program
US10685059B2 (en) Portable electronic device and method for generating a summary of video data
CN106257392B (zh) 用于导航媒体内容的设备、方法和图形用户界面
CN102801850B (zh) 移动终端及其控制方法
CN104182168B (zh) 移动终端及其控制方法
US11500533B2 (en) Mobile terminal for displaying a preview image to be captured by a camera and control method therefor
CN105556937B (zh) 移动终端及其控制方法
CN102667701A (zh) 在触摸屏用户接口上修改命令的方法
CN106453817B (zh) 移动终端及其控制方法
KR20180006137A (ko) 단말기 및 그 제어 방법
KR20160058438A (ko) 이동 단말기 및 그 제어 방법
Centers Take Control of iOS 17 and iPadOS 17
KR20160097698A (ko) 이동 단말기 및 그 제어방법

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:IKEDA, TETSUO;SAKAMOTO, TAKAYUKI;ISHII, TOMOHIRO;AND OTHERS;SIGNING DATES FROM 20160606 TO 20160626;REEL/FRAME:039101/0139

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION