WO2015049931A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number
- WO2015049931A1 (PCT/JP2014/071802)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- information
- window
- control unit
- information processing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1698—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/16—Use of wireless transmission of display information
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/20—Details of the management of multiple sources of image data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4122—Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4363—Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Patent Document 1 discloses a technique for setting management authority for each window.
- Patent Document 2 discloses a technique for setting window management authority in accordance with a window display position.
- Patent Document 3 discloses a technique for detecting the position of a user and determining the display position of a window corresponding to the user based on the detection result.
- an information processing apparatus is provided that includes a communication unit that receives, from another information processing apparatus, display position designation information indicating the display position of a window, and a control unit that performs control so that the window is displayed at the display position indicated by the display position designation information within the display area of the display unit.
- an information processing apparatus is provided that includes a control unit that generates display position designation information indicating the display position of a window, and a communication unit that transmits the display position designation information to another information processing apparatus capable of displaying the window.
- an information processing method is provided in which display position designation information indicating the display position of a window is received from another information processing apparatus, and the window is displayed at the display position indicated by the display position designation information within the display area of the display unit.
- a program is provided that realizes a communication function for receiving, from another information processing apparatus, display position designation information indicating the display position of a window, and a control function for performing control so that the window is displayed at the display position indicated by the display position designation information within the display area of the display unit.
- the information processing apparatus can receive display position designation information from another information processing apparatus and display a window at the display position indicated by the display position designation information.
- the display position of the window to be displayed can be designated using another information processing apparatus.
- the effects of the technology according to the present disclosure are not limited to the effects described here.
- the technology according to the present disclosure may have any of the effects described in this specification, or other effects.
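The display position designation information exchanged in these claims can be illustrated with a minimal serialization sketch. The JSON shape and the field names below are assumptions for illustration only; the disclosure does not specify any wire format.

```python
import json

def make_designation(window_id: str, region: str) -> str:
    # Hypothetical wire format: the patent only requires that the message
    # indicate a window's display position, not how it is encoded.
    return json.dumps({
        "type": "display_position_designation",
        "window_id": window_id,   # illustrative field, not from the patent
        "region": region,         # e.g. "left" or "right" edge of the display
    })

def handle_designation(message: str) -> str:
    # Receiver side: recover the designated region so the control unit
    # can open the window at that display position.
    payload = json.loads(message)
    if payload.get("type") != "display_position_designation":
        raise ValueError("not a designation message")
    return payload["region"]
```

A round trip through these two helpers corresponds to the transmitting apparatus generating the designation and the receiving apparatus extracting the designated display position.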
- FIG. 3 is a block diagram illustrating a configuration of an information processing apparatus (another information processing apparatus) according to the first embodiment of the present disclosure.
- The remaining figures are: a hardware configuration diagram of the information processing apparatus according to the embodiment; a block diagram showing the configuration of the display device (information processing apparatus) according to the embodiment; a hardware configuration diagram of the display device according to the embodiment; and flowcharts showing the procedure of processing by the information processing system.
- 1. First embodiment (an example of an information processing system including a display device and an information processing device)
- 1-1. Overall configuration
- 1-2. Configuration of information processing apparatus
- 1-3. Configuration of display device
- 1-4. Examples of processing by the information processing system (first through eleventh processing examples, 1-4-1 to 1-4-11)
- 2. Second embodiment (an example in which a head-mounted display is added to the first embodiment)
- 2-1. Overall configuration
- 2-2. Configuration of head-mounted display
- 2-3. Server configuration
- 2-4. Examples of processing by the information processing system (first through sixth processing examples, 2-4-1 to 2-4-6)
- the information processing system includes one or a plurality of information processing devices 10 and a display device 20.
- the information processing apparatus 10 is not necessarily required in some of the processing examples described later; in such cases, the information processing system may omit the information processing apparatus 10.
- these information processing apparatuses 10 may be carried by different users.
- the information processing apparatus 10 communicates with the display apparatus 20 in response to an input operation by the user.
- the information processing apparatus 10 is preferably an apparatus carried by a user, such as a smartphone, a smart tablet, or a mobile phone, but is not limited thereto; it may be any apparatus capable of realizing the processing examples described later.
- the display device 20 communicates with the information processing device 10 and displays various images accordingly.
- the display device 20 may display a plurality of windows, and these windows can be operated by different users. Details will be described in the processing examples below. Since the display device 20 displays a plurality of windows, it preferably has a high resolution.
- the display device 20 may have a high resolution and a large screen; for example, an entire wall of a room may serve as the display device 20.
- “throw” means that the information displayed on the information processing apparatus 10 is displayed on the display apparatus 20. Further, “catch” means that the information displayed on the display device 20 is displayed on the information processing device 10. “Right” and “left” mean directions viewed from the user.
- the information processing apparatus 10 includes a display unit 11, an operation unit 12, a detection unit 13, a communication unit 14, a storage unit 15, a control unit 16, and an audio output unit 17.
- the display unit 11 displays various images under the control of the control unit 16.
- the operation unit 12 receives an input operation by a user.
- the operation unit 12 outputs input operation information to the control unit 16.
- the detection unit 13 detects the posture of the information processing apparatus 10 and outputs detection information regarding the detection result to the control unit 16.
- the communication unit 14 communicates with the display device 20 and outputs information obtained thereby to the control unit 16.
- the communication unit 14 may communicate with other information processing apparatuses via a network.
- the storage unit 15 stores various information, for example, a program for causing the information processing apparatus 10 to realize the display unit 11, the operation unit 12, the detection unit 13, the communication unit 14, the storage unit 15, the control unit 16, and the audio output unit 17, as well as various image information.
- the control unit 16 controls the entire information processing apparatus 10 and performs processing shown in each processing example described later.
- the audio output unit 17 outputs audio information under the control of the control unit 16.
- the information processing apparatus 10 has the hardware configuration illustrated in FIG. 2, and this hardware configuration realizes the display unit 11, the operation unit 12, the detection unit 13, the communication unit 14, the storage unit 15, the control unit 16, and the audio output unit 17.
- the information processing apparatus 10 includes a display 11a, an operation device 12a, a sensor 13a, a communication device 14a, a nonvolatile memory 15a, a RAM 15b, a CPU 16a, and an audio output device 17a as hardware configurations.
- the display 11a displays various image information.
- the operation device 12a receives an input operation by a user.
- the operation device 12a is preferably a touch panel, but may be a hard key or the like.
- the sensor 13a detects the posture of the information processing apparatus 10 and the like. Specific examples of the sensor 13a include a gyro sensor and an acceleration sensor.
- the communication device 14a communicates with the display device 20.
- the nonvolatile memory 15a stores various programs and image information.
- the programs include a program for causing the information processing apparatus 10 to realize the display unit 11, the operation unit 12, the detection unit 13, the communication unit 14, the storage unit 15, the control unit 16, and the audio output unit 17.
- the RAM 15b is a work area for the CPU 16a.
- the CPU 16a reads and executes the programs stored in the nonvolatile memory 15a; the display unit 11, the operation unit 12, the detection unit 13, the communication unit 14, the storage unit 15, and the control unit 16 are thereby realized. That is, the CPU 16a can be the substantial operating subject of the information processing apparatus 10.
- the audio output device 17a is a device that outputs audio information, such as a speaker and headphones.
- the display device 20 includes a display unit 21, an audio output unit 22, an imaging unit 23 (detection unit), a communication unit 24, a storage unit 25, a control unit 26, and an audio detection unit 27.
- the display unit 21 displays various images under the control of the control unit 26.
- the audio output unit 22 outputs audio information.
- the audio output unit 22 may output directional audio information.
- the imaging unit 23 images a user who views the display device 20 and outputs a captured image obtained thereby to the control unit 26.
- the communication unit 24 communicates with the information processing apparatus 10 and outputs information obtained thereby to the control unit 26.
- the communication unit 24 may communicate with other information processing apparatuses via a network.
- the storage unit 25 stores various information, for example, a program for causing the display device 20 to realize the display unit 21, the audio output unit 22, the imaging unit 23, the communication unit 24, the storage unit 25, the control unit 26, and the audio detection unit 27, as well as various image information.
- the control unit 26 controls the entire display device 20 and performs processing shown in each processing example described later.
- the voice detection unit 27 detects voice information, for example, a user's voice, and outputs it to the control unit 26.
- the display device 20 has the hardware configuration shown in FIG. 4, and this hardware configuration realizes the display unit 21, the audio output unit 22, the imaging unit 23, the communication unit 24, the storage unit 25, the control unit 26, and the voice detection unit 27.
- the display device 20 includes a display panel 21a, a speaker 22a, an imaging device 23a, a communication device 24a, a nonvolatile memory 25a, a RAM 25b, a CPU 26a, and a microphone 27a as hardware configurations.
- the display panel 21a displays various image information.
- the speaker 22a outputs audio information.
- the speaker 22a may output directional audio information.
- the imaging device 23a generates a captured image by performing imaging.
- the communication device 24a communicates with the information processing device 10.
- the nonvolatile memory 25a stores various programs and image information.
- the programs include a program for causing the display device 20 to realize the display unit 21, the audio output unit 22, the imaging unit 23, the communication unit 24, the storage unit 25, the control unit 26, and the audio detection unit 27.
- the RAM 25b is a work area for the CPU 26a.
- the CPU 26a reads and executes the programs stored in the nonvolatile memory 25a; the display unit 21, the audio output unit 22, the imaging unit 23, the communication unit 24, the storage unit 25, and the control unit 26 are thereby realized.
- that is, the CPU 26a can be the substantial operating subject of the display device 20.
- the microphone 27a detects audio information.
- a set top box that is separate from the display device 20 may be prepared, and the control unit 26 may be realized by the set top box.
- the set top box has a hardware configuration necessary for realizing the control unit 26 (and a communication unit that communicates with the display device 20).
- the control unit 26 may be realized by (a part of) the information processing apparatus 10.
- an information processing server capable of communicating with at least the display device 20 via a network may be prepared, and the control unit 26 may be realized by the information processing server.
- the information processing server has a hardware configuration necessary for realizing the control unit 26 (and a communication unit that communicates with the display device 20).
- the information processing apparatus 10 is a so-called smartphone. Therefore, the operation unit 12 described above is realized by a so-called touch panel.
- the user causes the display device 20 to display information displayed on the information processing device 10 by performing a throw operation.
- the information processing device 10 mirrors the information being displayed on the display device 20.
- the information processing apparatus 10 performs processing according to the flowchart illustrated in FIG. 5, and the display device 20 performs processing according to the flowchart illustrated in FIG. 6.
- in step S10, the control unit 16 displays image information on the display unit 11 and waits until the user performs a throw operation.
- the image information is image information to be displayed on the display device 20, that is, throw image information.
- the operation unit 12 outputs the throw operation information to the control unit 16 when a throw operation is performed.
- when the throw operation information is given, that is, when the throw operation is detected, the control unit 16 proceeds to step S20.
- a display example is shown in FIG.
- the control unit 16 displays the throw button 100 and the throw image display window 120, in which a web page is displayed as the throw image information.
- the throw button 100 is a button for the user to perform a throw operation. That is, the user performs a throw operation by tapping the throw button 100.
- the throw operation is not limited to this example; any gesture operation (for example, an operation of flicking the display unit 11 from the lower end toward the upper end) may serve as the throw operation. In this case, the throw button 100 may be displayed or may be omitted.
- the throw image display window 120 displays the throw image information.
- address image information 120a, a web page 120b, and an update button 120c are displayed in the throw image display window 120.
- the address image information 120a indicates the address of the currently displayed web page 120b.
- the update button 120c is a button for updating the web page 120b to the latest information.
- when the update button 120c is tapped, the control unit 16 acquires the latest web page 120b and displays it in the throw image display window 120.
- the control unit 16 may display an image other than the web page together with the throw button 100.
- in step S20, the control unit 16 outputs, to the communication unit 14, throw request information indicating that the information being displayed by the information processing device 10 is to be displayed on the display device 20.
- the communication unit 14 transmits the throw request information to the display device 20.
- in response, the display device 20 transmits to the information processing device 10 displayable area information regarding the display area (display position) in which the throw image information can be displayed, that is, the displayable area. Details will be described later.
- in step S30, the communication unit 14 receives the displayable area information and outputs it to the control unit 16.
- in step S40, the control unit 16 generates a display area dialog based on the displayable area information and displays it on the display unit 11.
- the display area dialog is a dialog for allowing the user to select a display area of the window.
- a window is displayed in the display area selected by the user, and information being displayed by the information processing apparatus 10 is displayed in this window.
- a display example is shown in FIG. In this example, the control unit 16 grays out the throw button 100, the address image information 120a, the web page 120b, and the update button 120c, and displays the display area dialog 130.
- the display area dialog 130 includes a display device image 140, display area selection buttons 150 and 160, and a cancel button 170.
- the display device image 140 is a schematic (deformed) image of the display device 20.
- the display area selection buttons 150 and 160 are buttons for allowing the user to select a display area, and are displayed in a portion corresponding to the displayable area in the display device image 140. That is, the display area selection buttons 150 and 160 indicate displayable areas. In this example, the left end and the right end of the display unit 21 are displayable areas.
- the display area selection button 150 indicates the leftmost displayable area
- the display area selection button 160 indicates the rightmost displayable area. Similarly, for example, when the upper and lower ends of the display unit 21 are displayable areas, display area selection buttons are displayed at the upper and lower ends of the display device image 140.
- the display area dialog is not limited to this example.
- the display area dialog may display a list of character information indicating the position of the displayable area. In this case, each line in the list becomes a display area selection button.
- in step S50, the control unit 16 stands by until the user performs a display area selection operation, that is, an operation for selecting a display area.
- the operation unit 12 outputs display area selection operation information to the control unit 16.
- when the display area selection operation information is given, that is, when the display area selection operation is detected, the control unit 16 proceeds to step S60.
- examples of the display area selection operation include an operation of tapping the display area selection button described above.
- in step S60, the control unit 16 recognizes the display area selected by the user based on the display area selection operation, and generates selection area information indicating the selected display area.
- the control unit 16 outputs the selected area information to the communication unit 14.
- the communication unit 14 transmits the selection area information to the display device 20. In response to this, the display device 20 displays a window in the displayable area indicated by the selection area information. Details will be described later.
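Steps S20 through S60 form a small request-and-select handshake between the information processing apparatus 10 and the display device 20. The following is a minimal sketch, assuming simple callable stand-ins for the communication unit 14 (`send`, `receive`) and for the user's tap in the display area dialog (`choose_region`); none of these names or message shapes come from the disclosure.

```python
from typing import Callable, List

def throw_handshake(send: Callable[[dict], None],
                    receive: Callable[[], dict],
                    choose_region: Callable[[List[str]], str]) -> str:
    # Step S20: ask the display device to accept thrown content.
    send({"type": "throw_request"})
    # Step S30: receive the displayable area information.
    reply = receive()
    regions = reply["displayable_areas"]          # e.g. ["left", "right"]
    # Steps S40-S50: the user picks a region in the display area dialog.
    selected = choose_region(regions)
    # Step S60: report the choice back as selection area information.
    send({"type": "selected_area", "region": selected})
    return selected
```

With the callbacks wired to a real communication unit and dialog, the return value is the display area in which the display device will open the throw image display window.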
- in step S70, the control unit 16 outputs the image information being displayed on the display unit 11, that is, the throw image information, to the communication unit 14, and the communication unit 14 transmits the throw image information to the display device 20 by streaming.
- the display device 20 receives the throw image information transmitted by streaming, and displays the throw image information in the throw image display window.
- the transmission form is not limited to this example.
- the control unit 16 also displays the throw image information during streaming transmission on the display unit 11. That is, the control unit 16 also mirrors the throw image information displayed on the display device 20 on the display unit 11.
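Step S70's dual behavior, streaming each throw-image frame to the display device while mirroring it on the display unit 11, can be sketched as follows; `transmit` and `show_locally` are assumed callbacks, not names from the disclosure.

```python
from typing import Callable, Iterable

def stream_and_mirror(frames: Iterable[bytes],
                      transmit: Callable[[bytes], None],
                      show_locally: Callable[[bytes], None]) -> int:
    # Each throw-image frame goes to the display device *and* remains
    # visible on the information processing apparatus 10 (mirroring).
    count = 0
    for frame in frames:
        transmit(frame)
        show_locally(frame)
        count += 1
    return count
```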
- a display example is shown in FIG. This display example is displayed when the leftmost display area is selected in the display example of step S40. That is, the control unit 16 displays the indicator 200, the display area switching button 210, the return button 220, and the window 230.
- the indicator 200 is character information indicating that it is being thrown.
- the display area switching button 210 is a button for switching the display area of the throw image information.
- as many display area switching buttons 210 as there are displayable areas may be displayed. In this example, the display area switching button 210 indicates the rightmost displayable area.
- when the display area switching button 210 is tapped, the communication unit 14 transmits switching request information to the display device 20.
- the display device 20 switches the display area of the window based on the switching request information.
- the return button 220 is a button for ending streaming transmission to the display device 20.
- the window 230 displays the throw image information during streaming transmission. For example, the web page described above is displayed.
- in step S80, the control unit 16 determines whether or not the user has performed an end operation. For example, the control unit 16 may regard a tap on the return button 220 as an end operation, or may regard some gesture operation as an end operation.
- the operation unit 12 outputs end operation information to the control unit 16 when the end operation is performed. When the end operation information is given, that is, when the end operation is detected, the control unit 16 proceeds to step S90. When the end operation is not detected, the control unit 16 returns to step S70.
- in step S90, the control unit 16 performs a disconnection process.
- the control unit 16 outputs to the communication unit 14 end notification information indicating that streaming transmission is to end.
- the communication unit 14 transmits end notification information to the display device 20. Thereafter, the control unit 16 disconnects communication with the display device 20.
- in step S100, the communication unit 24 of the display device 20 receives the throw request information and outputs it to the control unit 26.
- the control unit 26 may display some image information (hereinafter also referred to as “base image information”) on the display unit 21 while waiting for the throw request information.
- a display example is shown in FIG. In this example, the control unit 26 displays the image information 170 as the base image information on the display unit 21.
- in step S110, the control unit 26 determines a window displayable area.
- the method for determining the window displayable area is not particularly limited.
- the displayable area may be set in advance or may be arbitrarily set by the user.
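Because the determination policy of step S110 is deliberately left open, a sketch can simply fall back to a preset default when the user has configured nothing. The region names are illustrative assumptions; "left" and "right" match the display example in which the left and right ends of the display unit 21 are displayable areas.

```python
from typing import List, Optional

def determine_displayable_areas(user_setting: Optional[List[str]] = None) -> List[str]:
    # The disclosure allows the displayable areas to be preset or set
    # arbitrarily by the user; return the user's configuration when
    # present, otherwise a preset default.
    if user_setting:
        return user_setting
    return ["left", "right"]
```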
- in step S120, the control unit 26 generates displayable area information related to the displayable area and outputs it to the communication unit 24.
- the communication unit 24 transmits displayable area information to the information processing apparatus 10.
- in step S130, the communication unit 24 receives the selection area information and outputs it to the control unit 26.
- in step S140, the control unit 26 opens a new window, that is, a throw image display window, in the displayable area indicated by the selection area information.
- in step S150, the communication unit 24 receives the throw image information by streaming and outputs it to the control unit 26.
- in step S160, the control unit 26 displays the throw image information in the throw image display window.
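On the display device side, steps S130 through S160, together with the end notification sent when streaming ends, amount to a small receive loop. The following is a sketch under assumed dictionary-shaped messages, with callables standing in for the communication unit 24 (`receive`) and the window handling of the control unit 26 (`open_window`, `draw`); these names are illustrative, not from the disclosure.

```python
from typing import Callable

def receive_throw(receive: Callable[[], dict],
                  open_window: Callable[[str], object],
                  draw: Callable[[object, dict], None]) -> int:
    # Step S130: the selection area information arrives first.
    selection = receive()
    # Step S140: open the throw image display window in that region.
    window = open_window(selection["region"])
    # Steps S150-S160: render streamed frames until the end notification.
    frames = 0
    while True:
        message = receive()
        if message.get("type") == "end_notification":
            break
        draw(window, message)
        frames += 1
    return frames
```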
- a display example is shown in FIG. This display example is displayed when the leftmost display area is selected in the display example of step S40. That is, the control unit 26 displays a base image display window 171, a throw image display window 180, various information display windows 190, and a cursor 193.
- the base image display window 171 includes a base image display area 171a and a base image indicator 171b.
- Base image information 171c is displayed in the base image display area 171a.
- the indicator 171b indicates information for identifying the base image information such as a title of the base image information.
- the aspect ratio of the base image display area 171a matches the aspect ratio of the display unit 21. That is, the control unit 26 reduces the display area of the base image information toward the right, and displays the throw image display window 180 in the display area thus freed. Further, since the aspect ratio of the base image display area 171a matches that of the display unit 21, the display area below the base image display area 171a is left empty. The control unit 26 therefore displays the various information display window 190 in this empty display area. Of course, the aspect ratio of the base image display area 171a may differ from that of the display unit 21.
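The layout described above can be sketched as a simple partition of the screen: shrinking the base image display area while preserving the display's aspect ratio frees a vertical strip (assumed here on the left, matching the example in which the leftmost display area is selected) for the throw image display window, and a strip below the base area for the various information display window. The function name and the 1920x1080 display size are illustrative assumptions, not part of the specification.

```python
# Minimal layout sketch; sizes and names are illustrative assumptions.

def layout(display_w, display_h, base_w):
    """Partition the screen into throw, base, and info regions.

    The base image display area keeps the display's aspect ratio, so
    shrinking its width to base_w also shrinks its height. This frees a
    vertical strip on the left (throw image display window) and a strip
    below the base area (various information display window).
    Rectangles are (x, y, width, height).
    """
    base_h = base_w * display_h / display_w            # preserve aspect ratio
    throw_x = display_w - base_w                       # boundary line position
    throw_area = (0, 0, throw_x, display_h)
    base_area = (throw_x, 0, base_w, base_h)
    info_area = (throw_x, base_h, base_w, display_h - base_h)
    return throw_area, base_area, info_area

throw, base, info = layout(1920, 1080, 1280)
```

In this sketch the boundary between the throw window and the base window sits at `throw_x`, which is the quantity the reduction and enlargement buttons would later adjust.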
- the throw image display window 180 is a window for displaying a throw image.
- the throw image display window 180 includes an indicator 181, a throw image display area 182, a display area switching button 184, a reduction button 185, an enlargement button 186, and a cursor 193.
- the indicator 181 indicates information for identifying the information processing apparatus 10 that is transmitting streaming image information to the display apparatus 20, for example, the name of the owner of the information processing apparatus 10.
- the throw image display area 182 displays the throw image information.
- address image information 182a and a web page 182b are displayed in the throw image display area 182.
- the address image information 182a indicates the address of the web page 182b.
- the display area switching button 184 is a button for switching the display area of the throw image, that is, the display area of the window for displaying the throw image.
- one display area switching button 184 may be displayed for each displayable area.
- the displayable area switching button 210 indicates the rightmost displayable area.
- the reduction button 185 is a button for reducing the throw image display window.
- the enlargement button 186 is a button for enlarging the throw image display window.
- in the various information display window 190, information related to the base image information, the throw image information, and the like is displayed.
- the various information display window 190 includes advertisement information 191 and operation buttons 192.
- the advertisement information 191 is arbitrary advertisement information, but may be advertisement information related to, for example, throw image information or base image information.
- the operation button 192 is a button for the user to perform various operations.
- the operation button 192 may include a user interface for performing voice input or the like.
- Cursor 193 moves based on an input operation by the user.
- the input operation for moving the cursor 193 may be performed using the remote controller for the display device 20.
- alternatively, the cursor may be moved using the information processing apparatus 10.
- the control unit 26 moves the cursor 193 based on the operation information given from the remote controller. Then, when the user presses the display area switching button 184, the control unit 26 moves the display area of the throw image display window 180 to the right end.
- an operation of pressing the display area switching button 184 includes an operation of pressing the determination button of the remote controller in a state where the cursor 193 exists on the display area switching button 184.
- the same applies to the operations for selecting the reduction button 185, the enlargement button 186, and the operation button 192.
- when the reduction button 185 is pressed, the control unit 26 reduces the throw image display window 180. Specifically, the control unit 26 shifts the boundary line 183 between the throw image display window 180 and the base image display window 171 to the left.
- when the enlargement button 186 is pressed, the control unit 26 enlarges the throw image display window 180. Specifically, the control unit 26 shifts the boundary line 183 between the throw image display window 180 and the base image display window 171 to the right.
- when an operation button 192 is pressed, the control unit 26 performs processing according to the type of the operation button 192. For example, when the "Full screen" button is pressed, the control unit 26 displays the throw image information in full screen. When the "Close" button is pressed, the control unit 26 closes the throw image display window. In addition, when the "Home" button is pressed, the control unit 26 displays a predetermined initial image as the base image.
- the type of the operation button 192 is not limited to the above.
- when switching request information is received, the control unit 26 moves the throw image display window to the display area indicated by the switching request information.
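The button handling above can be sketched as a small dispatcher. The per-press shift amount, the dictionary-based window state, and the internal button labels are illustrative assumptions; only the "Full screen", "Close", and "Home" labels come from the text.

```python
# Hedged sketch of the window-operation buttons; STEP and the state
# fields are illustrative assumptions, not part of the specification.

STEP = 80  # hypothetical pixels that boundary line 183 shifts per press

def on_button(win, button):
    """Apply one window-operation button to a throw-window state dict."""
    if button == "reduce":              # reduction button 185
        win["boundary_x"] -= STEP       # shift boundary line 183 to the left
    elif button == "enlarge":           # enlargement button 186
        win["boundary_x"] += STEP       # shift boundary line 183 to the right
    elif button == "Full screen":       # show throw image in full screen
        win["full_screen"] = True
    elif button == "Close":             # close the throw image display window
        win["open"] = False
    elif button == "Home":              # show a predetermined initial image
        win["base_image"] = "initial"
    return win
```

With the throw window on the left of boundary line 183, shifting the boundary left narrows the throw window and shifting it right widens it, matching the reduce and enlarge behavior in the text.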
- in step S170, the control unit 26 determines whether or not end notification information has been received. If it is determined that the end notification information has been received, the control unit 26 proceeds to step S180; otherwise, it proceeds to step S140. In step S180, the control unit 26 performs a disconnection process. For example, the control unit 26 disconnects communication with the information processing apparatus 10.
- in the first processing example, only one information processing apparatus 10 is shown, but a plurality of information processing apparatuses 10 may exist. That is, a plurality of users may each display the image information of their information processing apparatus 10 on the display device 20.
- the information processing apparatus 10 can designate a display position of a window displayed on the display apparatus 20, that is, a display screen of a throw image. Therefore, the user can easily display a window at a desired position in the display unit 21 using his / her information processing apparatus 10.
- the information processing device 10 can display a display area selection dialog, for example. Therefore, the user can select a desired display area more easily.
- the second processing example is an example in which the information processing apparatus 10 is used as a remote controller.
- in steps S190 to S240 shown in FIG. 7, the information processing apparatus 10 performs the same processing as in steps S10 to S60 shown in FIG.
- in step S250, the control unit 16 generates context information.
- the context information includes information related to the throw image information.
- for example, the context information includes information related to the application being executed in the information processing apparatus 10 (an application ID and the like), the location of the throw image information (for example, a URL), the display state of the current throw image information (display position, playback position, and the like), and the operation history.
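The context information above can be sketched as a simple record. The field names, the example application name, and the URL are illustrative assumptions; the specification only fixes which categories of information are carried.

```python
# Minimal sketch of the context information; field names and example
# values are illustrative assumptions, not defined by the specification.

def build_context_info(app_id, url, display_pos, playback_pos, history):
    """Bundle the state a display device needs to recreate a throw image."""
    return {
        "application_id": app_id,        # application being executed
        "location": url,                 # where the throw image lives
        "display_state": {
            "display_position": display_pos,
            "playback_position": playback_pos,
        },
        "operation_history": list(history),
    }

ctx = build_context_info("browser", "http://example.com", (0, 0), 12.5, ["open"])
```

The display device can then launch the application identified by `application_id`, fetch the content from `location`, and restore the recorded display state, as in step S350 below.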
- in step S260, the control unit 16 receives an input operation from the user.
- This input operation includes an input operation performed on the throw image information displayed on the window of the display device 20 by the user.
- Examples of the input operation include an operation using a touch panel, an operation of moving the information processing apparatus 10 itself (hereinafter also referred to as “remote pointing operation”), and the like.
- Examples of operations using the touch panel include a drag operation, a tap operation, a pinch-in operation, a pinch-out operation, and the like.
- control unit 16 recognizes the input operation performed by the user based on the detection information given from the detection unit 13 and the operation information given from the operation unit 12. Then, the control unit 16 outputs remote operation information regarding the recognized input operation to the communication unit 14, and the communication unit 14 transmits the remote operation information to the display device 20.
- the display device 20 performs processing based on the remote operation information. Details will be described later.
- in step S270, the control unit 16 determines whether or not end notification information has been received. If it is determined that the end notification information has been received, the control unit 16 proceeds to step S280; otherwise, it proceeds to step S260. In step S280, the control unit 16 performs a disconnection process. For example, the control unit 16 disconnects communication with the display device 20.
- in steps S290 to S330 shown in FIG. 8, the display device 20 performs the same processing as in steps S100 to S140 shown in FIG.
- in step S340, the communication unit 24 receives the context information and outputs it to the control unit 26.
- in step S350, the control unit 26 performs processing based on the context information. Specifically, the control unit 26 executes the application indicated by the context information, and acquires the throw image information from the location indicated by the context information. Then, the control unit 26 displays the throw image information in the throw image display window.
- in step S360, the communication unit 24 receives the remote operation information and outputs it to the control unit 26.
- in step S370, the control unit 26 performs processing based on the remote operation information. For example, the control unit 26 scrolls the throw image information based on a drag operation. Further, the control unit 26 performs a determination process (for example, a process of pressing the above-described buttons 184, 185, 186, and 192) based on a tap operation. Further, the control unit 26 reduces the throw image display window based on a pinch-in operation, and enlarges the throw image display window based on a pinch-out operation. Moreover, the control unit 26 moves the cursor (for example, the cursor 193) based on a remote pointing operation.
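The mapping in step S370 can be sketched as a dispatcher from remote operations to window actions. The operation names, the state dictionary, and the scale factors are illustrative assumptions.

```python
# Hedged sketch of the remote-operation dispatch in step S370; operation
# names and handler effects are illustrative assumptions.

def handle_remote_operation(state, op):
    """Map a remote operation from the information processing apparatus
    to an action on the throw image display window."""
    kind = op["type"]
    if kind == "drag":                  # scroll the throw image information
        dx, dy = op["delta"]
        sx, sy = state["scroll"]
        state["scroll"] = (sx + dx, sy + dy)
    elif kind == "tap":                 # determination: press the target button
        state["pressed"] = op.get("target")
    elif kind == "pinch_in":            # reduce the throw image display window
        state["scale"] *= 0.9
    elif kind == "pinch_out":           # enlarge the throw image display window
        state["scale"] *= 1.1
    elif kind == "remote_pointing":     # move the cursor (e.g. cursor 193)
        state["cursor"] = op["position"]
    return state
```

Each branch corresponds to one of the input operations listed for step S260: drag, tap, pinch-in, pinch-out, and the remote pointing operation.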
- in step S380, the control unit 26 determines whether or not the user has performed an end operation. For example, the control unit 26 may determine that an operation of pressing the "Close" button is an end operation. If the end operation is detected, the control unit 26 proceeds to step S390; otherwise, it returns to step S350.
- in step S390, the control unit 26 performs a disconnection process.
- the control unit 26 outputs end notification information indicating that the display of the throw image information is ended to the communication unit 24.
- the communication unit 24 transmits end notification information to the information processing apparatus 10. Thereafter, the control unit 26 disconnects communication with the information processing apparatus 10.
- in the second processing example, only one information processing apparatus 10 is shown, but a plurality of information processing apparatuses 10 may exist. That is, a plurality of users may each display the image information of their information processing apparatus 10 on the display device 20.
- the same effect as the first processing example can be obtained.
- the user can more easily operate the throw image information.
- the throw image information may also be displayed on the display unit 11 in the second processing example.
- the third processing example corresponds to a modification of the first processing example and the second processing example.
- the control unit 26 determines the display state of the throw image display window based on the usage state (vertical placement, horizontal placement, diagonal placement, etc.) of the information processing apparatus 10.
- the control unit 16 determines the usage state of the information processing apparatus 10 based on the detection information given from the detection unit 13. Then, the control unit 16 outputs usage state information regarding the usage state to the communication unit 14.
- the communication unit 14 transmits usage state information to the display device 20.
- the communication unit 24 of the display device 20 receives the usage state information and outputs it to the control unit 26.
- the control unit 26 determines the display state of the throw image display window based on the usage state information. For example, the control unit 26 gives the throw image display window a vertically elongated shape when the usage state is vertical placement, and a horizontally elongated shape when the usage state is horizontal placement. In addition, when the usage state is diagonal placement, the control unit 26 gives the throw image display window an obliquely elongated shape.
- processing examples are shown in FIGS. 19 and 20. That is, when the information processing apparatus 10 is placed horizontally as shown in FIG. 19, the control unit 26 displays the throw image display window 180 in a horizontally elongated shape as shown in FIG. 20. In this case, the control unit 26 may display the throw image display window 180 across the entire lower end of the display unit 21. Further, the control unit 26 may display the throw image display window 180 so that it rises from the lower end of the display unit 21.
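The usage-state classification above can be sketched as a mapping from a device angle to a window shape. The roll-angle representation and the threshold values are illustrative assumptions; the specification only distinguishes vertical, horizontal, and diagonal placement.

```python
# Hedged sketch of the third processing example's orientation mapping;
# the angle input and thresholds are illustrative assumptions.

def window_shape(roll_degrees):
    """Classify a hypothetical roll angle of the information processing
    apparatus and return the matching throw-window shape."""
    a = roll_degrees % 180           # orientation is symmetric under 180 deg
    if a < 20 or a > 160:
        return "vertical"            # vertical placement -> vertically long
    if 70 < a < 110:
        return "horizontal"          # horizontal placement -> horizontally long
    return "oblique"                 # diagonal placement -> obliquely long
```

In practice the usage state would be derived from the detection unit 13 (e.g. accelerometer readings) on the information processing apparatus 10 and sent to the display device 20 as usage state information.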
- the display device 20 since the display device 20 determines the display state of the window based on the usage state of the information processing device 10, the user can display the window in a desired display state.
- the fourth processing example corresponds to a modification of the first processing example and the second processing example.
- the control unit 26 displays the throw image display window based on the position and/or orientation of the information processing apparatus 10 in space. More specifically, the control unit 26 displays the throw image display window at the intersection of the display unit 21 and a straight line that passes through the center of the information processing apparatus 10 and extends in the longitudinal direction of the information processing apparatus 10.
- the control unit 16 of the information processing apparatus 10 outputs remote pointing operation information related to the remote pointing operation to the communication unit 14.
- the communication unit 14 transmits the remote pointing operation information to the display device 20.
- the communication unit 24 of the display device 20 receives the remote pointing operation information and outputs it to the control unit 26.
- the control unit 26 determines the display position of the throw image display window based on the remote pointing operation information. For example, the control unit 26 determines the display position of the throw image display window so that the above-described intersection lies somewhere on the throw image display window.
- a display example is shown in FIG.
- the user A moves the information processing apparatus 10.
- the control unit 26 determines the display position of the throw image display window 180 so that the intersection P2 between the display unit 21 and the straight line P1, which passes through the center of the information processing apparatus 10 and extends in the longitudinal direction of the information processing apparatus 10, is located at the upper-left end of the throw image display window 180. For example, when the intersection P2 moves from the position P2a to the position P2b, the control unit 26 moves the throw image display window 180 from the position 180a to the position 180b.
- alternatively, the control unit 26 may divide the display unit 21 into a plurality of small areas and display the throw image display window in the small area that includes the above-described intersection.
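The pointing geometry above can be sketched as a ray-plane intersection followed by snapping to a small area. The coordinate convention (display in the plane z = 0) and the grid size are illustrative assumptions.

```python
# Hedged sketch of the fourth processing example's pointing geometry;
# coordinates and the small-area grid are illustrative assumptions.

def intersect_display(device_pos, device_dir):
    """Intersect a ray from the device with the display plane z = 0.

    device_pos: (x, y, z) center of the information processing apparatus.
    device_dir: (dx, dy, dz) its longitudinal direction.
    Returns the (x, y) intersection, or None if the ray misses the plane.
    """
    px, py, pz = device_pos
    dx, dy, dz = device_dir
    if dz == 0:
        return None                  # ray parallel to the display plane
    t = -pz / dz
    if t < 0:
        return None                  # device points away from the display
    return (px + t * dx, py + t * dy)

def snap_to_small_area(point, cols=3, rows=2, w=1920, h=1080):
    """Return the (col, row) small area of the display containing point."""
    x, y = point
    return (min(int(x // (w / cols)), cols - 1),
            min(int(y // (h / rows)), rows - 1))
```

The intersection corresponds to point P2 in the example; `snap_to_small_area` models the variant in which the display unit 21 is divided into small areas and the window is shown in the area containing the intersection.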
- the user can display the throw image display window at a desired position.
- in step S400 illustrated in FIG. 9, the control unit 16 waits until the user performs a catch trigger operation.
- examples of the trigger operation include an operation of flicking the display unit 11 from the upper end toward the lower end.
- the control unit 16 may display a catch button on the display unit 11. Then, the control unit 16 may determine that an operation in which the user taps the catch button is a trigger operation.
- in step S410, the control unit 16 outputs catch request information requesting display content information to the communication unit 14.
- the display content information is information indicating the current display content of the display device 20.
- the display content information indicates a window that can be a target of catch by the user, that is, a display area (display position) of a catchable window.
- the communication unit 14 transmits catch request information to the display device 20.
- the display device 20 transmits the display content information to the information processing device 10. Details will be described later.
- in step S420, the communication unit 14 receives the display content information and outputs it to the control unit 16.
- in step S430, the control unit 16 displays the catchable windows based on the display content information. Specifically, the control unit 16 generates a catch target window selection dialog based on the display content information, and displays the catch target window selection dialog on the display unit 11.
- the catch target window selection dialog is a dialog for allowing the user to select a catch target window.
- the image information in the catch target window selected by the user is displayed on the display unit 11 of the information processing apparatus 10.
- a display example is shown in FIG.
- the control unit 16 displays a catch target window selection dialog 250 and an indicator 260.
- the control unit 16 may display a cancel button.
- the catch target window selection dialog 250 includes a display device image 250a and catch target window selection buttons 251 to 254.
- the display device image 250a is a stylized image of the display device 20.
- the catch target window selection buttons 251 to 254 are buttons for allowing the user to select a catch target window, and are displayed in a portion corresponding to the display position of the catchable window in the display device image 250a.
- the catch target window selection buttons 251 to 254 indicate catchable windows.
- four catchable windows are displayed on the display unit 21.
- a catch target window selection button 251 indicates a catchable window displayed at the upper left corner of the display unit 21.
- the indicator 260 is character information that prompts the user to select a catch target window.
- the catch target window selection dialog is not limited to this example.
- character information indicating the display position of a catchable window may be displayed as a list. In this case, each row in the list becomes a catch target window selection button.
- in step S440, the control unit 16 waits until the user performs a window designation operation, that is, an operation of selecting a catch target window.
- the operation unit 12 outputs window designation operation information to the control unit 16 when a window designation operation is performed.
- when the window designation operation information is given, that is, when the window designation operation is detected, the control unit 16 proceeds to step S450.
- examples of the window designation operation include an operation of tapping the above-described catch target window selection button.
- in step S450, the control unit 16 recognizes the catch target window selected by the user based on the window designation operation. Then, the control unit 16 generates window designation information related to the catch target window selected by the user.
- examples of the window designation information include an ID (window ID) for identifying the catch target window, information indicating the display area of the catch target window, and the like.
- the control unit 16 may include all of these pieces of information in the window designation information, or may include only one of them.
- the control unit 16 outputs window designation information to the communication unit 14.
- the communication unit 14 transmits window designation information to the display device 20. In response to this, the display device 20 transmits image information being displayed in the catch target window, that is, window display information to the information processing device 10.
- in step S460, the communication unit 14 receives the window display information and outputs it to the control unit 16.
- in step S470, the control unit 16 displays the window display information.
- in step S480 illustrated in FIG. 10, the communication unit 24 of the display device 20 receives the catch request information and outputs it to the control unit 26.
- in step S490, the control unit 26 generates display content information. For example, when the catchable windows 241 to 244 shown in FIG. 23 are displayed, the control unit 26 generates display content information indicating the display areas of the catchable windows 241 to 244.
- the control unit 26 may assign a window ID to each catch target window, and include this window ID in the information regarding the display area of the catchable window. Further, the control unit 26 may use only the window ID as display content information.
- the control unit 26 outputs the display content information to the communication unit 24, and the communication unit 24 transmits the display content information to the information processing apparatus 10.
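The display content information generated in step S490 can be sketched as follows. The window ID strings, the rectangle format, and the `ids_only` switch (modeling the variant in which only window IDs are sent) are illustrative assumptions.

```python
# Minimal sketch of generating display content information; window IDs
# and the rectangle format are illustrative assumptions.

def build_display_content_info(windows, ids_only=False):
    """Describe every catchable window currently shown on the display.

    windows maps a window ID to its display area (x, y, w, h).
    When ids_only is True, only the window IDs are reported, modeling
    the variant where the window ID alone serves as the content info.
    """
    if ids_only:
        return {"windows": sorted(windows)}
    return {"windows": [{"window_id": wid, "display_area": area}
                        for wid, area in sorted(windows.items())]}

info = build_display_content_info({"w241": (0, 0, 960, 540),
                                   "w242": (960, 0, 960, 540)})
```

The information processing apparatus 10 can build the catch target window selection dialog directly from this structure, placing one selection button per reported display area.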
- in step S500, the communication unit 24 receives the window designation information and outputs it to the control unit 26.
- in step S510, the control unit 26 recognizes the catch target window based on the window designation information, and acquires the image information being displayed in the catch target window, that is, the window display information. Then, the control unit 26 outputs the window display information to the communication unit 24.
- the communication unit 24 transmits the window display information to the information processing apparatus 10.
- in step S520, the control unit 26 deletes the catch target window from the display unit 21.
- the control unit 26 may leave the catch target window.
- for example, the display device 20 displays the catchable windows 241 to 244 shown in FIG. In this case, the information processing apparatus 10 displays the catch target window selection dialog 250 shown in FIG.
- when the user selects a catch target window, the information processing apparatus 10 transmits window designation information to that effect to the display device 20.
- the display device 20 acquires the image information being displayed in the catch target window as window display information and transmits it to the information processing device 10.
- the information processing apparatus 10 displays window display information 260 (that is, image information displayed in the catchable window 243). Further, the display device 20 deletes the catchable window 243.
- the display device 20 may determine the catch target window by the following processing. That is, the imaging unit 23 images the user and outputs the captured image obtained thereby to the control unit 26. Meanwhile, the user performs a gesture operation for designating one of the catchable windows (for example, an operation of pointing at the desired catchable window). Based on the captured image, the control unit 26 determines that the catchable window designated by the user is the catch target window.
- the control unit 26 may cause the user to select the destination of the window display information by the following processing. That is, the control unit 26 displays a list image indicating a list of destinations on the display unit 21. The user performs a gesture operation for selecting one of the listed destinations (for example, an operation of pointing at one of the destinations, or an operation of dragging and dropping a catchable window). Note that the drag-and-drop operation is performed, for example, by pointing at a catchable window and then moving the finger to a position indicating one of the destinations. The control unit 26 determines the destination selected by the user based on the captured image, and transmits the window display information to that destination.
- the destinations displayed in a list may include an information processing device (such as a stationary PC) other than the information processing device 10.
- the user may designate the catch target window and the destination of the window display information by voice information.
- the voice detection unit 27 outputs the voice information to the control unit 26, and the control unit 26 recognizes the catch target window and the destination of the window display information based on the voice information.
- when the image information displayed in any one of a plurality of windows is requested by the information processing apparatus 10, the display device 20 transmits the image information displayed in that window to the information processing apparatus 10.
- therefore, the user of the information processing apparatus 10 can easily acquire desired image information.
- since the display device 20 performs control to delete that window, the user can easily grasp that the image information displayed in the window has been acquired.
- the sixth processing example is processing for determining the display position of the window according to the position of the user.
- the information processing system does not necessarily have the information processing apparatus 10.
- one or more users are present in front of the display device 20 (within the imaging range of the imaging unit 23).
- the voice detection unit 27 detects voice instruction information from the user and outputs it to the control unit 26.
- the voice instruction information includes at least voice information for instructing display of a window.
- the voice instruction information may include voice information indicating the display content of the window.
- in step S540, the imaging unit 23 images the user and outputs the captured image obtained thereby to the control unit 26.
- the control unit 26 specifies the position of the user who issued the voice instruction information based on the captured image.
- in step S550, the control unit 26 determines the position of the user who issued the voice instruction information.
- for example, the control unit 26 classifies the imaging range of the imaging unit 23 into right, center, and left as viewed from the user, and determines which of these classifications the user's position falls into.
- the classification is not limited to this example; for example, it may be further subdivided. If it is determined that the user is on the right side, the control unit 26 proceeds to step S560; if the user is on the left side, the control unit 26 proceeds to step S570; and if the user is in the center, the process proceeds to step S580.
- in step S560, the control unit 26 displays a window on the right side (right end) of the display unit 21.
- the control unit 26 displays the display content specified by the user in the window.
- in step S570, the control unit 26 displays a window on the left side (left end) of the display unit 21.
- the control unit 26 displays the display content specified by the user in the window.
- in step S580, the control unit 26 displays a prompt.
- the prompt is information that prompts the user to specify the display position of the window. A specific example will be described later.
- in step S590, the voice detection unit 27 detects voice instruction information from the user and outputs the voice instruction information to the control unit 26.
- the voice instruction information here includes at least voice information for indicating the display position of the window.
- in step S600, the control unit 26 determines whether or not the voice instruction information indicates the right. If it is determined that the voice instruction information indicates the right, the process proceeds to step S610; if the voice instruction information indicates the left, the process proceeds to step S620.
- in step S610, the control unit 26 displays a window on the right side (right end) of the display unit 21.
- the control unit 26 displays the display content specified by the user in the window.
- in step S620, the control unit 26 displays a window on the left side (left end) of the display unit 21.
- the control unit 26 displays the display content specified by the user in the window.
- the user may designate the display position of the window by a gesture operation (pointing, flicking, etc.).
- the control unit 26 determines the display position of the window based on the captured image. That is, the control unit 26 determines the user who performed the gesture operation, and determines the display position of the window based on the result.
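The position-based placement above can be sketched as a two-step decision: classify the speaking user's horizontal position within the camera frame, then either place the window at that end of the display or fall back to a prompt when the user is in the center. The pixel-based thresholds are illustrative assumptions, and the sketch ignores camera mirroring.

```python
# Hedged sketch of the sixth processing example; thresholds are
# illustrative assumptions, and camera mirroring is not modeled.

def classify_position(user_x, frame_width):
    """Return 'left', 'center', or 'right' for a user at pixel user_x."""
    third = frame_width / 3
    if user_x < third:
        return "left"
    if user_x < 2 * third:
        return "center"
    return "right"

def place_window(user_x, frame_width):
    """Decide where to display the window, or prompt when ambiguous."""
    side = classify_position(user_x, frame_width)
    if side == "center":
        return "prompt"          # show prompts such as 281/282, await voice
    return side                  # display the window at the left or right end
```

The "prompt" branch corresponds to step S580, after which the voice instruction information from step S590 resolves the side.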
- the display device 20 (control unit 26) displays arbitrary image information, for example, image information 170.
- for example, three users A to C are viewing the image information 170.
- the users A to C are included in the imaging range of the imaging unit 23. Users A to C may or may not have the information processing apparatus 10.
- the display device 20 displays the window 270 at the left end of the display unit 21 as shown in FIG.
- the window 270 includes an indicator 271 indicating the display content of the window 270 and an image display area 272 in which a map image is displayed.
- the display device 20 also reduces the image information 170 by fixing the aspect ratio, and displays an indicator 171 on the upper side of the image information 170.
- the indicator 171 indicates information for identifying the image information 170.
- the display device 20 displays prompts 281 and 282 on both ends of the display unit 21 as shown in FIG.
- the prompt 281 is displayed at the right end of the display unit 21 and the character “R” is drawn.
- the prompt 282 is displayed at the left end of the display unit 21 and the character “L” is drawn.
- the display device 20 can prompt the user to specify the display position of the window.
- the display device 20 displays the window 290 on the left end of the display unit 21 as illustrated in FIG. 28.
- the window 290 includes an indicator 291 indicating the display contents of the window 290 and an image display area 292 in which a map image is displayed.
- the display device 20 also reduces the image information 170 by fixing the aspect ratio, and displays an indicator 171 on the upper side of the image information 170.
- the indicator 171 indicates information for identifying the image information 170.
- each user can easily display a desired window at a position easy to see.
- the seventh processing example corresponds to a modification of the first and second processing examples.
- the control unit 26 displays the same image as in the first and second processing examples.
- a display example is shown in FIG.
- the control unit 26 displays a base image display window 171 and a throw image display window 180.
- the various information display window 190 and the cursor 193 are omitted, but it is needless to say that these images may be displayed.
- the users A and B are viewing the display device 20. User B may not have the information processing apparatus 10.
- the base image display window 171 includes a base image display area 171a and a base image indicator 171b.
- Base image information 171c is displayed in the base image display area 171a.
- the indicator 171b indicates information for identifying the base image information such as a title of the base image information. The detailed contents are the same as in FIG.
- the throw image display window 180 is a window for displaying a throw image.
- the throw image display window 180 includes an indicator 181 and a throw image display area 182.
- the specific contents are the same as in FIG.
- the display area switching button 184, the reduction button 185, the enlargement button 186, and the cursor 193 are omitted, but it goes without saying that these images may be displayed.
- the throw image information (that is, the address image information 182a and the web page 182b) transmitted from the information processing apparatus 10 of the user A (or obtained from the context information) is displayed in the throw image display area 182.
- the control unit 26 may display another window, for example, the window described in the seventh processing example, instead of the throw image display window.
- control unit 16 of the information processing apparatus 10 outputs audio information S1 corresponding to the throw image information (or the other window described above).
- the control unit 26 outputs display content information indicating the display content of the display unit 21 to the communication unit 24, and the communication unit 24 transmits the display content information to the information processing apparatus 10 of the user A.
- the communication unit 14 of the information processing apparatus 10 receives the display content information and outputs it to the control unit 16.
- the control unit 16 displays the display content information on the display unit 11.
- a display example is as shown in FIG. That is, in this example, the control unit 16 displays the display content information 300.
- the display content information 300 is an image obtained by transforming (for example, scaling down) the display content of the display device 20.
- the display content information 300 includes a window selection button 310 corresponding to the throw image display window and a window selection button 320 corresponding to the base image display window 171.
- when the user taps one of the window selection buttons, the control unit 16 outputs selection window information indicating the window selection button tapped (selected) by the user to the communication unit 14.
- the communication unit 14 transmits the selected window information to the display device 20.
- the communication unit 24 of the display device 20 outputs the selected window information to the control unit 26.
- based on the selected window information, the control unit 26 recognizes the window selected by the user and acquires the audio information corresponding to that window.
- the control unit 26 may acquire the audio information from the network or may acquire it from the storage unit 25.
- the control unit 26 outputs audio information to the communication unit 24, and the communication unit 24 transmits the audio information to the information processing apparatus 10.
- the communication unit 14 of the information processing apparatus 10 receives the audio information and outputs it to the control unit 16.
- the control unit 16 causes the audio output unit 17 to output audio information.
- the control unit 16 may acquire the audio information from the network instead of acquiring it from the display device 20. Further, audio information may be stored in the storage unit 15 in advance, and the control unit 16 may acquire the audio information from the storage unit 15.
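The selection-and-routing flow above can be sketched as follows; the window identifiers, audio URIs, and function names are illustrative assumptions, not elements of the disclosure:

```python
# Hypothetical sketch of the window-selection-to-audio flow. The window
# identifiers and audio URIs below are assumptions, not names from the
# description.

AUDIO_SOURCES = {                       # network-side audio information
    "throw_image_window": "audio/throw_image.aac",
    "base_image_window": "audio/base_image.aac",
}

def handle_window_selection(selected_window_id, local_cache=None):
    """Resolve the audio information for the tapped window selection
    button, preferring a local cache over a network fetch."""
    if local_cache and selected_window_id in local_cache:
        return local_cache[selected_window_id]    # from the storage unit
    return AUDIO_SOURCES.get(selected_window_id)  # from the network
```

The fallback order mirrors the description: the audio information may be read from the storage unit 25 (or 15) if already stored, or otherwise acquired from the network.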
- the information processing apparatus 10 needs to include an imaging unit. That is, the user images the display unit 21 using the imaging unit of the information processing apparatus 10. The imaging unit outputs a captured image obtained by the imaging to the control unit 16. The control unit 16 displays the captured image on the display unit 11 as display content information.
- the display device 20 performs a process according to the flowchart shown in FIG. In this case, it is preferable that the audio output unit 22 can output directional audio information.
- in step S630, the imaging unit 23 images the user B and outputs a captured image obtained thereby to the control unit 26.
- the control unit 26 detects the viewpoint of the user B based on the captured image.
- in step S640, based on the detection result, the control unit 26 determines whether the user B has been looking at a window with corresponding audio for a predetermined time.
- the window with corresponding audio is either the base image display window 171 or the throw image display window 180.
- if the control unit 26 determines that the user B has been looking at such a window for the predetermined time, the process proceeds to step S650; otherwise, the process is terminated.
- in step S650, the control unit 26 outputs the audio information corresponding to the window being viewed by the user B from the audio output unit 22. That is, the audio information is switched. For example, when the viewpoint (line of sight X1) of the user B is facing the base image display window 171, the control unit 26 outputs the audio information S2 corresponding to the base image display window 171.
- the control unit 26 may output the audio information from the audio output unit 22 toward the user B. Further, the control unit 26 may output the audio information corresponding to the window that the user B is currently viewing at a higher volume and the other audio information at a lower volume. Further, the control unit 26 may fade out the audio information that was being output before the switch and fade in the audio information that is newly output.
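A minimal sketch of the gaze-driven switching in steps S630 to S650 might look like the following; the dwell time, the class name, and the window and audio identifiers are assumptions made for illustration:

```python
DWELL_SECONDS = 2.0  # the "predetermined time"; the value is an assumption

class GazeAudioSwitcher:
    """Switch the active audio track to the window the viewer has been
    looking at for at least DWELL_SECONDS (steps S630 to S650)."""

    def __init__(self, window_audio):
        self.window_audio = window_audio   # window id -> audio track
        self.active = None                 # currently output audio track
        self._gazed_window = None
        self._gaze_start = 0.0

    def update(self, gazed_window, now):
        """Feed one gaze observation; return the track to output."""
        if gazed_window != self._gazed_window:
            # gaze moved to another target: restart the dwell timer,
            # keep the current audio for now
            self._gazed_window = gazed_window
            self._gaze_start = now
            return self.active
        track = self.window_audio.get(gazed_window)
        if (track is not None
                and now - self._gaze_start >= DWELL_SECONDS
                and track != self.active):
            # a real device would fade out the old track and fade in
            # the new one, possibly directed toward the viewer
            self.active = track
        return self.active
```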
- each user can more easily hear the voice information desired by the user.
- the information processing system may not include the information processing apparatus 10.
- one or more users are viewing the display device 20.
- in step S660, the imaging unit 23 images the user and outputs a captured image obtained thereby to the control unit 26.
- in step S670, the control unit 26 performs face detection on the captured image to determine whether or not a user is present in the imaging range. If the control unit 26 determines that a user exists within the imaging range, the process proceeds to step S680. If the control unit 26 determines that no user exists within the imaging range, the process ends.
- in step S680, the control unit 26 specifies the position of the user in the left-right direction based on the result of the face detection.
- in step S690, the control unit 26 specifies the position of the user in the front-rear direction (the distance from the display device 20 to the user) based on the face detection result, specifically, the size of the face.
- in step S700, the control unit 26 determines the display position of the window in the left-right direction based on the position of the user in the left-right direction. For example, the control unit 26 may specify the position of the window in the left-right direction so that it matches the position of the user in the left-right direction.
- in step S710, the control unit 26 determines the display size of the window based on the position of the user in the front-rear direction. For example, the control unit 26 may increase the display size of the window as the user moves away from the display device 20. Note that the control unit 26 may adjust the size of the image information displayed in the window instead of adjusting the display size of the window. For example, the control unit 26 may enlarge the image information as the user moves away from the display device 20.
- in step S720, the control unit 26 determines the display position of the window in the height direction (the distance from the lower end of the display unit 21) based on the position of the user in the front-rear direction. For example, the control unit 26 may specify the position of the window in the height direction so that the window is arranged at the height of the user's face.
- in step S730, the control unit 26 displays the window with the position and display size determined above.
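The placement rules of steps S660 to S730 can be sketched as below; the screen geometry, the reference face width, and the linear size model are illustrative assumptions, since the description states only the qualitative rules:

```python
# Sketch of steps S660-S730: place and size a window from a detected face.
# All constants and the linear distance model are assumptions.

SCREEN_W, SCREEN_H = 1920, 1080
REFERENCE_FACE_W = 200          # face width (px) at a reference distance

def layout_window(face_x, face_y, face_w, base_size=(480, 270)):
    """Return (x, y, w, h) for the window.

    - S700: horizontal position tracks the user's horizontal position.
    - S690/S710: a smaller detected face means a more distant user,
      so the window is enlarged in proportion.
    - S720: vertical position follows the height of the user's face.
    """
    scale = REFERENCE_FACE_W / max(face_w, 1)       # distance proxy (S690)
    w = int(base_size[0] * scale)
    h = int(base_size[1] * scale)
    x = max(0, min(face_x - w // 2, SCREEN_W - w))  # center on user (S700)
    y = max(0, min(face_y - h // 2, SCREEN_H - h))  # face height (S720)
    return x, y, w, h
```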
- the users A and B are viewing the display device 20.
- the display device 20 detects the faces of the users A and B by performing face detection, and determines, based on the detection result, the display position and display size of the window 295 corresponding to the user A and of the window 300 corresponding to the user B. Then, the display device 20 displays the windows 295 and 300 at the determined display positions and sizes. For example, the display device 20 displays the windows 295 and 300 such that the window 295 is arranged ahead of the user A's line of sight P3 and the window 300 is arranged ahead of the user B's line of sight P4.
- the control unit 26 displays image information designated by the users A and B in the windows 295 and 300, respectively.
- the control unit 26 displays the throw image information transmitted from the information processing apparatus 10 possessed by each of the users A and B.
- the storage unit 25 stores user identification information in which identification information (information processing apparatus ID) of the information processing apparatus 10 and user identification information are associated with each other. Examples of the user identification information include the user's name and face image.
- identification information of the information processing apparatus 10 is added to the throw image information transmitted from the information processing apparatus 10. Then, based on the identification information of the information processing apparatus 10 added to the throw image information and the user specifying information stored in the storage unit 25, the control unit 26 decides in which window the throw image information is displayed.
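A minimal sketch of this lookup, with assumed apparatus IDs, user names, and window labels:

```python
# Route incoming throw image information to the right window via the
# user specifying information. The dict layout and IDs are assumptions.

USER_SPECIFYING_INFO = {          # apparatus ID -> user identification
    "device-A": "user A",
    "device-B": "user B",
}
USER_WINDOWS = {                  # user -> window assigned to that user
    "user A": "window 295",
    "user B": "window 300",
}

def route_throw_image(throw_image):
    """Decide which window displays the incoming throw image information."""
    user = USER_SPECIFYING_INFO.get(throw_image["apparatus_id"])
    return USER_WINDOWS.get(user)
```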
- when a user moves in the left-right direction, the control unit 26 may change the display position of the window in the left-right direction accordingly.
- control unit 26 displays image information in windows 295 and 300.
- control unit 26 displays a window 310 between the windows 295 and 300, and also displays image information in this window 310.
- when the users A and B exchange positions, the control unit 26 switches the display positions of the windows 295 and 300. Specifically, the control unit 26 moves the display position of the window 295 in the direction of the arrow P7 while moving the display position of the window 300 in the direction of the arrow P8, thereby switching the display positions of the windows 295 and 300. More generally, when the position of a user changes, the control unit 26 performs the processing from step S660 to step S730 again. Thereby, even if the positions of the users A and B change, the control unit 26 can display the windows 295 and 300 ahead of the lines of sight of the users A and B.
- the control unit 26 may display the window whose display position has been changed in a manner different from the other portions (that is, in an emphasized manner).
- the control unit 26 may display the boundary portion of the window in a manner different from other portions for a predetermined time after the change of the display position of the window.
- examples of such display modes include displaying the boundary of the window in a color more conspicuous than the other portions, displaying the vicinity of the window boundary with a shining effect, and blinking the window.
- control unit 26 may display an indicator indicating the user corresponding to the window in the window (or in the vicinity of the window) for a predetermined time after changing the display position of the window.
- examples of such an indicator include an icon showing the user's face image, character information indicating the user's name, a live video obtained by imaging the user, and the like. The live video may be displayed as a through image.
- the control unit 26 may display the indicator in a part of the window, or may display the image information and the indicator in the window in an α-blended manner. Thereby, even if a user looks away from the display device 20 while moving, the user can easily find the window corresponding to the user upon looking back at the display device 20.
- the control unit 26 may move the window continuously to the changed display position, or may delete the window once and then display it at the changed display position. In the former case, the control unit 26 may display the window transparently until the window reaches the new display position (or for a predetermined time after the window reaches the new display position). Thereby, even if the window overlaps other image information while moving, the control unit 26 can suppress a decrease in the visibility of that other image information.
- since the display device 20 determines the display position of the window associated with the information processing device 10 based on the position of the user of the information processing device 10, the user can more easily view the desired window.
- when the position of the user changes, the display device 20 changes the display position of the window associated with the information processing device 10, so that the user can more easily view the desired window.
- the ninth processing example corresponds to a modification of the first and second processing examples.
- the control unit 26 displays the same image as in the first and second processing examples.
- a display example is shown in FIG.
- the control unit 26 displays a base image display window 171 and a throw image display window 180.
- the various information display window 190 and the cursor 193 are omitted, but it is needless to say that these images may be displayed.
- the user A is viewing the display device 20. Further, the user A possesses the information processing device 10, and the information processing device 10 transmits the throw image information to the display device 20.
- the base image display window 171 includes a base image display area 171a and a base image indicator 171b.
- Base image information 171c is displayed in the base image display area 171a.
- the indicator 171b indicates information for identifying the base image information such as a title of the base image information. The detailed contents are the same as in FIG.
- the throw image display window 180 is a window for displaying a throw image.
- the throw image display window 180 includes an indicator 181 and a throw image display area 182.
- the specific contents are the same as in FIG.
- the display area switching button 184, the reduction button 185, the enlargement button 186, and the cursor 193 are omitted, but it goes without saying that these images may be displayed.
- the throw image information (that is, the address image information 182a and the web page 182b) transmitted from the information processing apparatus 10 of the user A (or obtained from the context information) is displayed in the throw image display area 182.
- the control unit 26 may display another window, for example, the window described in the seventh processing example, instead of the throw image display window.
- the imaging unit 23 captures the user A and outputs the captured image to the control unit 26.
- the control unit 26 recognizes the user A by performing face detection from the captured image.
- when the user A moves toward the edge of the imaging range, the control unit 26 reduces the throw image display window 180 according to the movement amount of the user A. Then, the control unit 26 deletes the throw image display window 180 when the user A leaves the imaging range.
- specifically, the control unit 26 moves the boundary line 183 between the throw image display window 180 and the base image display window 171 in the direction of the arrow P10 according to the movement amount of the user A.
- the control unit 26 also reduces the throw image information in the horizontal direction in accordance with the movement of the boundary line 183. That is, the control unit 26 reduces the throw image display window 180 in the horizontal direction.
- the control unit 26 causes the boundary line 183 to reach the left end of the display unit 21 when the user A leaves the imaging range. As a result, the control unit 26 erases the throw image display window 180.
- the control unit 26 may delete the throw image display window 180 at a timing later than the timing when the user A leaves the imaging range.
- the control unit 26 may also erase the throw image display window 180 by the following methods. For example, the control unit 26 may delete the throw image display window 180 by moving it outside the screen. The control unit 26 may reduce the throw image display window 180 in the vertical direction. Further, the control unit 26 may delete the throw image display window 180 by increasing its transparency. Further, the control unit 26 may let the user set whether or not to perform the above processing.
- when the user A returns to the imaging range, the control unit 26 redisplays the throw image display window 180 according to the movement amount of the user A. Then, when the entire user A (for example, the entire face) enters the imaging range, the control unit 26 returns the throw image display window 180 to its original size.
- specifically, the control unit 26 moves the boundary line 183 between the throw image display window 180 and the base image display window 171 in the direction of the arrow P12 according to the movement amount of the user A. As the boundary line 183 moves, the control unit 26 also enlarges the throw image information in the horizontal direction. That is, the control unit 26 enlarges the throw image display window 180 in the horizontal direction. Then, the control unit 26 causes the boundary line 183 to reach its original position when the entire user A has entered the imaging range. As a result, the control unit 26 redisplays the throw image display window 180.
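The boundary-line behaviour above can be sketched as follows; the screen width, the resting position of the boundary line 183, and the linear gain relating user movement to pixels are all assumptions for illustration:

```python
# The left edge of the base image window (boundary line 183) tracks the
# user's horizontal movement toward the edge of the imaging range,
# shrinking the throw image display window and restoring it on return.
# The linear mapping from movement to pixels is an assumption.

SCREEN_W = 1920
ORIGINAL_BOUNDARY_X = 640        # assumed resting position of line 183

def boundary_position(user_offset, gain=2.0):
    """user_offset: how far (px in the captured image) the user has moved
    toward the edge of the imaging range; 0 means fully in frame."""
    x = ORIGINAL_BOUNDARY_X - gain * user_offset
    return max(0, min(ORIGINAL_BOUNDARY_X, int(x)))

def throw_window_width(user_offset):
    # the throw image display window spans from x=0 to the boundary line
    return boundary_position(user_offset)
```

When the offset is large enough the width reaches 0 and the window is effectively erased; as the user moves back, the same mapping restores the original width.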
- the control unit 26 may return the throw image display window 180 to its original size at a timing later than the timing when the entire user A enters the imaging range.
- the control unit 26 may redisplay the throw image display window 180 by the following methods. For example, the control unit 26 may redisplay the throw image display window 180 by moving it from outside the screen into the screen. Further, the control unit 26 may redisplay the throw image display window 180 by enlarging it in the vertical direction. Further, the control unit 26 may redisplay the throw image display window 180 by reducing its transparency. Further, the control unit 26 may let the user set whether or not to perform the above processing.
- the control unit 26 may pause the reproduction of the throw image information while the throw image display window 180 is erased.
- the control unit 26 may also temporarily stop the streaming of the base image information.
- the control unit 26 may not perform the above-described redisplay when the time during which the user A is away (that is, out of the imaging range) is longer than a predetermined time. Further, the control unit 26 may determine the display position of the throw image display window 180 based on the position where the user A is stationary within the imaging range.
- the specific processing content may be the same as in the eighth processing example.
- since the display device 20 displays the window only while the user is within the imaging range, other users, for example, are not annoyed by the window while that user is absent.
- operation authority is set for a different user for each display area of the display unit 21.
- the users A and B are viewing the display device 20.
- the information processing system may or may not have the information processing apparatus 10.
- the storage unit 25 may store user specifying information in which the identification information of the information processing apparatus 10 is associated with the user identification information (for example, a face image).
- thereby, when remote operation information is given from the information processing apparatus 10 of any user, the control unit 26 can recognize from which user the remote operation information was transmitted.
- the imaging unit 23 images the user and outputs a captured image obtained thereby to the control unit 26.
- the control unit 26 detects the users A and B by performing face detection from the captured image, and specifies the positions of the users A and B in the left-right direction. Then, the control unit 26 gives the user A the operation authority for the display area 330 in front of the user A, and gives the user B the operation authority for the display area 340 in front of the user B.
- the control unit 26 grants the operating authority of the display area 350 between the display areas 330 and 340 to both the users A and B.
- the display area classification and the operation authority granting method are not limited to this example.
- the control unit 26 may arbitrarily divide the display area and give the operation authority to different users for each display area.
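A simplified sketch of per-region operation authority follows; the region width and positions are assumptions, and a shared middle region such as the display area 350 is omitted for brevity:

```python
# Build a map from display regions to the users allowed to operate them,
# placing one region in front of each detected user. Region boundaries
# and the screen width are illustrative assumptions.

def build_authority_map(user_positions, screen_w=1920, region_w=600):
    """user_positions: user name -> horizontal face position (px).
    Returns {(left, right): {users}} for each per-user region."""
    regions = {}
    for user, x in user_positions.items():
        left = max(0, min(x - region_w // 2, screen_w - region_w))
        regions[(left, left + region_w)] = {user}
    return regions

def may_operate(regions, user, x):
    """True if the user holds operation authority at horizontal pos x."""
    return any(user in users and lo <= x < hi
               for (lo, hi), users in regions.items())
```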
- the control unit 26 displays the windows 330a and 330b in the display area 330 in response to a window display instruction operation by the user A, and displays the window 340a in the display area 340 in response to an input operation by the user B.
- examples of the window display instruction operation by the user include operations similar to those in the first and second processing examples (for example, an operation of tapping the throw button 100), various gesture operations, voice operations (operations instructed by speaking), and the like.
- the control unit 26 can determine which user has performed the gesture operation or the voice operation based on the captured image.
- control unit 26 performs processing related to the windows 330a and 330b (for example, display of image information, switching of image information, and the like) based on the window operation by the user A. Further, the control unit 26 performs processing related to the window 340a (for example, display of image information, switching of image information, and the like) based on the window operation by the user B.
- examples of the window operation by the user include, in addition to the same operations as in the above-described second processing example (for example, an operation of moving the cursor 193 to a desired position in the window and tapping the information processing apparatus 10), various gesture operations, voice operations, and the like.
- control unit 26 moves the windows 330a and 330b based on the window movement instruction operation by the user A, and moves the window 340a based on the window movement instruction operation by the user B.
- examples of the window movement instruction operation by the user include operations similar to those in the above-described second processing example (for example, an operation of dragging and dropping a window using the cursor 193), various gesture operations, voice operations, and the like.
- for example, when the window 330b is moved into the display area 350, the control unit 26 sets the window 330b as a window 350a that can be operated by both the users A and B.
- control unit 26 grants the operation authority of the window displayed in the display area to the user having the operation authority of the display area.
- in this way, the display device 20 divides the display area of the display unit 21 into a plurality of display areas (small regions) and associates each of the plurality of display areas with each of the plurality of information processing devices 10.
- the operation authority is set for a different user for each window of the display unit 21.
- the users A, B, and C are viewing the display device 20; in the example illustrated in FIG. 37, the users A and B are viewing the display device 20.
- the information processing system may or may not include the information processing apparatus 10.
- the storage unit 25 may store user identification information in which the identification information of the information processing apparatus 10 and the user identification information are associated with each other. Examples of the user identification information include the user's name and face image.
- thereby, when remote operation information is given from the information processing apparatus 10 of any user, the control unit 26 can recognize from which user the remote operation information was transmitted.
- the imaging unit 23 images the user and outputs a captured image obtained thereby to the control unit 26.
- the control unit 26 detects the users A, B, and C by performing face detection from the captured image.
- the control unit 26 displays windows 360 to 390, and grants operation authority to different users for each window. For example, the control unit 26 grants the operating authority of the window 360 to the user A, grants the operating authority of the window 370 to the user C, and grants the operating authority of the window 380 to the users A and B. The control unit 26 grants the operation authority of the window 390 to the user B. Then, the control unit 26 displays indicators 361 to 391 at the ends of the windows 360 to 390. Indicators 361 to 391 indicate users who have operating authority for windows 360 to 390.
- the operation authority of the window 380 is granted to the users A and B (that is, the window 380 is shared by the users A and B).
- in this case, the control unit 26 may grant the operation authority only to the user A in the initial state.
- the control unit 26 may then also grant the operation authority of the window 380 to the user B in response to an operation by the user A.
- the control unit 26 displays a list of users who share the operation authority based on a gesture operation by the user A (for example, a pointing operation on the window 380).
- the users to be listed may be limited to, for example, users shown in the captured image.
- when the user A selects a user from the list, the control unit 26 may share the window 380 between the user A and the selected user.
- the control unit 26 associates the windows 360 to 390 with the users A, B, and C.
- further, the control unit 26 determines the window to be operated by remote operation information based on the remote operation information transmitted from the information processing apparatus 10 and the user specifying information stored in the storage unit 25. In this way, the control unit 26 associates the windows 360 to 390 with the users A, B, and C and with their information processing apparatuses 10.
- the control unit 26 performs processing related to the windows 360 and 380 (for example, display of image information, switching of image information, and the like) based on window operations by the user A. Further, the control unit 26 performs processing related to the windows 380 and 390 (for example, display of image information, switching of image information, and the like) based on window operations by the user B. Further, the control unit 26 performs processing related to the window 370 based on window operations by the user C.
- the specific contents of the window operation are the same as in the tenth processing example. Further, the control unit 26 moves the windows for which the users A, B, and C have the operation authority based on the window movement instruction operation by the users A, B, and C. The specific processing is the same as in the tenth processing example.
- control unit 26 grants each user the authority to operate the window displayed on the display unit 21.
- when the user A leaves the imaging range, the control unit 26 releases the operation authority of the user A. That is, the control unit 26 releases the association between the information processing apparatus 10 of the user A and the window. For example, the control unit 26 cancels the operation authority of the user A for the windows 360 and 380.
- the control unit 26 then grants the operation authority of such a window to all users existing in the imaging range.
- the control unit 26 grants the operation authority of the window 360 to the users B and C. That is, the control unit 26 associates the window 360 with the information processing apparatus 10 of a user other than the user A.
- the control unit 26 changes the display contents of the indicators 361 to 391 according to the above processing. For example, the control unit 26 displays “Free” on the indicator 361, that is, character information indicating that the operation authority is given to the users B and C. Further, the control unit 26 displays the identification information of the user B on the indicator 381. Note that the control unit 26 may delete a window, such as the window 360, for which no user having the operation authority exists.
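The release-and-reassignment behaviour above can be sketched as follows; the data layout and names are assumptions:

```python
# When a user leaves the imaging range, their association is released
# and their ownerless windows are offered to every user still in frame.

def release_and_reassign(window_owners, departed_user, users_in_frame):
    """window_owners: window id -> set of users with operation authority.
    Mutates the map in place and returns the ids of windows left with
    no owner at all (candidates for deletion)."""
    orphaned = []
    for window, owners in window_owners.items():
        owners.discard(departed_user)
        if not owners:
            if users_in_frame:
                owners.update(users_in_frame)   # e.g. window 360 -> B, C
            else:
                orphaned.append(window)          # may be deleted
    return orphaned
```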
- the display device 20 performs control to display a plurality of windows, and associates each of the plurality of windows with each of the plurality of information processing devices 10. Therefore, each user can operate the window associated with his / her information processing apparatus 10, thereby reducing the possibility of confusion with the operation.
- the display apparatus 20 cancels the association between the window associated with the information processing apparatus 10 and the information processing apparatus 10. Therefore, when a certain user leaves the imaging range, the possibility that the window associated with the user cannot be operated is reduced.
- the display device 20 associates the window that has been released from the association with the information processing device 10 with the information processing device 10 other than the information processing device 10. Therefore, when a user leaves the imaging range, a window associated with the user can be operated by another user.
- the display device 20 deletes the window associated with the information processing device 10. Therefore, the possibility of user confusion is reduced.
- the processing examples described above may be combined as appropriate.
- the window of the eleventh processing example may include a throw image display window specified by the user in the first processing example.
- the information processing system includes an information processing device 10, a display device 20, and a head mounted display (hereinafter also referred to as “HMD”) 30.
- the information processing apparatus 10, the display apparatus 20, and the HMD 30 all communicate with various servers (for example, the translation server 40, the caption server 50, and the ACR server 60).
- the HMD 30 includes a display unit 31, an external imaging unit 32, an internal imaging unit 33, an operation unit 34, a detection unit 35, a communication unit (acquisition unit) 36, a storage unit 37, a control unit 38, and an audio output unit 39.
- the display unit 31 displays various images under the control of the control unit 38.
- the display unit 31 is a transmission type, and when the user wears the HMD 30, the user can visually recognize, in addition to the image information displayed on the display unit 31, various objects existing behind the display unit 31.
- since there are two display units 31, substantially the same image information is displayed on each of the two display units 31.
- the image information may be stereoscopically displayed by displaying image information with parallax on these display units 31.
- the external imaging unit 32 images the front of the HMD 30 and outputs a captured image obtained thereby to the control unit 38.
- the imaging range of the external imaging unit 32 may be adjusted to be substantially the same as the user's visual field.
- the external imaging unit 32 may be capable of recording.
- the internal imaging unit 33 images the viewpoint of the user wearing the HMD 30 and outputs a captured image obtained thereby to the control unit 38.
- the operation unit 34 receives an input operation by the user and outputs operation information obtained thereby to the control unit 38.
- the detection unit 35 detects the attitude of the HMD 30 and outputs detection information related to the detection result to the control unit 38.
- the communication unit 36 communicates with the information processing apparatus 10 and various servers. The communication unit 36 may communicate with devices other than these, for example, the display device 20.
- The storage unit 37 stores a program for causing the HMD 30 to realize the display unit 31, the external imaging unit 32, the internal imaging unit 33, the operation unit 34, the detection unit 35, the communication unit 36, the storage unit 37, the control unit 38, and the audio output unit 39.
- the control unit 38 controls the entire HMD 30 and performs processing shown in each processing example described later.
- the audio output unit 39 outputs various audio information under the control of the control unit 38.
- The HMD 30 has the hardware configuration shown in FIG. 39, and the display unit 31, the external imaging unit 32, the internal imaging unit 33, the operation unit 34, the detection unit 35, the communication unit 36, the storage unit 37, the control unit 38, and the audio output unit 39 are realized by this hardware configuration.
- That is, the HMD 30 includes, as its hardware configuration, a display 31a, an external imaging device 32a, an internal imaging device 33a, an operation device 34a, a sensor 35a, a communication device 36a, a nonvolatile memory 37a, a RAM 37b, a CPU 38a, and an audio output device 39a.
- The display 31a displays various image information.
- the external imaging device 32a images the surroundings of the HMD 30.
- the internal imaging device 33a images the viewpoint of the user wearing the HMD 30.
- the operation device 34a receives an input operation by the user.
- the operation device 34a is preferably a touch panel (touch sensor), but may be a hard key or the like.
- the sensor 35a detects the attitude of the HMD 30 and the like. Specific examples of the sensor 35a include a gyro sensor and an acceleration sensor.
- the communication device 36a communicates with the information processing device 10 and various servers.
- the nonvolatile memory 37a stores various programs and image information.
- The programs include a program for causing the HMD 30 to realize the display unit 31, the external imaging unit 32, the internal imaging unit 33, the operation unit 34, the detection unit 35, the communication unit 36, the storage unit 37, the control unit 38, and the audio output unit 39.
- the RAM 37b is a work area for the CPU 38a.
- the CPU 38a reads and executes the program stored in the nonvolatile memory 37a.
- That is, the display unit 31, the external imaging unit 32, the internal imaging unit 33, the operation unit 34, the detection unit 35, the communication unit 36, the storage unit 37, the control unit 38, and the audio output unit 39 are realized by the CPU 38a reading out and executing the program stored in the nonvolatile memory 37a.
- the audio output device 39a is a device that outputs audio information, such as a speaker and headphones.
- the translation server 40 includes a communication unit 41, a storage unit 42, and a control unit 43, as shown in FIG.
- The communication unit 41 communicates with the information processing device 10, the display device 20, the HMD 30, and the like.
- The storage unit 42 stores a program for realizing the communication unit 41, the storage unit 42, and the control unit 43 in the translation server 40, and various pieces of translation information (for example, information in which expressions with the same meaning in different languages, such as English and Japanese, are associated with each other).
- the control unit 43 controls the entire translation server 40 and performs processing shown in each processing example described later.
- the translation server 40 has a hardware configuration shown in FIG. 41, and the communication unit 41, the storage unit 42, and the control unit 43 are realized by these hardware configurations. That is, the translation server 40 includes a communication device 41a, a nonvolatile memory 42a, a RAM 42b, an external storage medium 42c, and a CPU 43a as hardware configurations.
- the communication device 41a communicates with the information processing device 10, the display device 20, the HMD 30, and the like.
- the nonvolatile memory 42a stores various programs and the like.
- the program includes a program for causing the translation server 40 to realize the communication unit 41, the storage unit 42, and the control unit 43.
- the RAM 42b is a work area for the CPU 43a.
- the external storage medium 42c stores translation information and the like.
- the CPU 43a reads and executes the program stored in the nonvolatile memory 42a. Therefore, the communication unit 41, the storage unit 42, and the control unit 43 are realized by the CPU 43a reading and executing the program stored in the nonvolatile memory 42a. That is, the CPU 43a can be a substantial operating subject of the translation server 40.
- the caption server 50 includes a communication unit 51, a storage unit 52, and a control unit 53.
- the communication unit 51 communicates with the information processing device 10, the display device 20, the HMD 30, and the like.
- The storage unit 52 stores a program for realizing the communication unit 51, the storage unit 52, and the control unit 53 in the caption server 50, and subtitle specifying information (information in which content IDs are associated with subtitle information (related information)).
- the control unit 53 controls the entire caption server 50 and performs processing shown in each processing example described later.
- the subtitle server 50 has a hardware configuration shown in FIG. 41, and the communication unit 51, the storage unit 52, and the control unit 53 are realized by these hardware configurations. That is, the caption server 50 includes a communication device 51a, a nonvolatile memory 52a, a RAM 52b, an external storage medium 52c, and a CPU 53a as hardware configurations.
- the communication device 51a communicates with the information processing device 10, the display device 20, the HMD 30, and the like.
- the nonvolatile memory 52a stores various programs and the like.
- the program includes a program for causing the caption server 50 to realize the communication unit 51, the storage unit 52, and the control unit 53.
- the RAM 52b serves as a work area for the CPU 53a.
- the external storage medium 52c stores subtitle specifying information and the like.
- the CPU 53a reads and executes the program stored in the nonvolatile memory 52a. Therefore, the communication unit 51, the storage unit 52, and the control unit 53 are realized by the CPU 53a reading and executing the program stored in the nonvolatile memory 52a. That is, the CPU 53a can be a substantial operating subject of the caption server 50.
- the ACR (Automatic content recognition) server 60 includes a communication unit 61, a storage unit 62, and a control unit 63.
- the communication unit 61 communicates with the information processing device 10, the display device 20, the HMD 30, and the like.
- The storage unit 62 stores a program for causing the ACR server 60 to realize the communication unit 61, the storage unit 62, and the control unit 63, and content ID specifying information (information in which image information, audio information, and content IDs are associated with each other).
- the storage unit 62 may store additional information specifying information (information in which a content ID and additional information are associated).
- The additional information may be, for example, performer information, related content, an advertisement, or a score (evaluation). When the image information is a sports program, the additional information may be player information or the like. In short, the additional information is information (such as performer information) that explains the content of the image information.
- the control unit 63 controls the entire ACR server 60 and performs processing shown in each processing example described later.
- the ACR server 60 has a hardware configuration shown in FIG. 41, and the communication unit 61, the storage unit 62, and the control unit 63 are realized by these hardware configurations. That is, the ACR server 60 includes a communication device 61a, a nonvolatile memory 62a, a RAM 62b, an external storage medium 62c, and a CPU 63a as hardware configurations.
- the communication device 61a communicates with the information processing device 10, the display device 20, the HMD 30, and the like.
- the nonvolatile memory 62a stores various programs.
- the program includes a program for causing the ACR server 60 to realize the communication unit 61, the storage unit 62, and the control unit 63.
- the RAM 62b serves as a work area for the CPU 63a.
- the external storage medium 62c stores content ID specifying information and the like.
- the CPU 63a reads and executes the program stored in the nonvolatile memory 62a. Therefore, the communication unit 61, the storage unit 62, and the control unit 63 are realized by the CPU 63a reading and executing the program stored in the nonvolatile memory 62a. That is, the CPU 63a can be a substantial operating subject of the ACR server 60.
- In the first to fourth processing examples, caption information is displayed on the display unit 31 when the user wearing the HMD 30 is viewing the display device 20.
- the external imaging unit 32 of the HMD 30 images the display unit 21 of the display device 20 and outputs a captured image obtained thereby to the control unit 38.
- In step S750, the control unit 38 specifies the positions of the display device 20 and each window in the captured image.
- In step S760, the control unit 38 acquires subtitle information (subtitle text information and related information) for each window.
- the control unit 38 may acquire not only caption information but also various kinds of additional information (information describing the content of image information). Specific processing contents will be described later.
- In step S770, the control unit 38 determines the display position of the caption information based on the positions of the display device 20 and each window in the captured image. Specifically, the control unit 38 determines the display position of the caption information so that the caption information is superimposed on the image information to which it corresponds.
- In step S780, the control unit 38 performs subtitle information display control.
- The control unit 38 may display the subtitle information while adjusting its font size, the position of its frame or balloon (relative to the window), its shape, its color, the presence and manner of animation, and so on. The same applies to the additional information.
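As a rough illustration of the position determination in step S770 (superimposing a caption on the window it corresponds to), the sketch below assumes a hypothetical `Rect` type for a detected window and simply centres the caption near the window's bottom edge with a small margin; the actual placement policy is left open by the description above.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # top-left x of the detected window, in display coordinates
    y: float  # top-left y
    w: float  # width
    h: float  # height

def caption_position(window: Rect, caption_w: float, caption_h: float,
                     margin: float = 8.0) -> tuple:
    """Place the caption centred near the bottom edge of the window so
    that it is superimposed on the corresponding image information."""
    x = window.x + (window.w - caption_w) / 2.0
    y = window.y + window.h - caption_h - margin
    return (x, y)
```

The same computation would be repeated for each window detected in step S750.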
- The HMD 30 may acquire the caption information by any one of the processes shown in FIGS. 46 to 48.
- In step S950 shown in FIG. 46, the control unit 26 of the display device 20 (TV) transmits caption information (Japanese) corresponding to the currently displayed image information to the translation server 40.
- In step S960, the control unit 43 of the translation server 40 acquires caption information (English or another foreign language) corresponding to the caption information (Japanese) and transmits it to the display device 20.
- In step S965, the control unit 26 of the display device 20 transmits the caption information (English or another foreign language) to the HMD 30.
- In step S970 shown in FIG. 47, the control unit 26 of the display device 20 (TV) transmits a content ID corresponding to the currently displayed image information to the information processing device 10 (smartphone).
- In step S980, the control unit 16 of the information processing apparatus 10 transmits the content ID to the caption server 50.
- In step S990, the control unit 53 of the caption server 50 transmits caption information corresponding to the content ID to the information processing apparatus 10.
- In step S1000, the control unit 16 of the information processing apparatus 10 transmits the caption information to the HMD 30.
- the external imaging unit 32 of the HMD 30 acquires a captured image (video) by imaging the display unit 21 (TV screen) of the display device 20.
- the external imaging unit 32 may record audio information (sound data) output from the display device 20 instead of acquiring a captured image (video).
- the control unit 38 may acquire audio information from the display device 20.
- the external imaging unit 32 outputs the captured image or audio information to the control unit 38.
- In step S1010, the control unit 38 transmits the captured image or audio information to the ACR server 60.
- In step S1020, the control unit 63 of the ACR server 60 transmits the content ID corresponding to the captured image or audio information to the HMD 30.
- In step S1030, the control unit 38 of the HMD 30 transmits the content ID to the caption server 50.
- In step S1040, the control unit 53 of the caption server 50 transmits the caption information corresponding to the content ID to the HMD 30.
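The ACR-based acquisition flow (steps S1010 to S1040) can be sketched as two lookups: the ACR server resolves captured media to a content ID, and the caption server resolves the content ID to caption information. `ACR_DB`, `CAPTION_DB`, and `fingerprint` below are invented stand-ins for those server-side databases, not part of the original description.

```python
# Hypothetical stand-ins for the ACR server's recognition database and
# the caption server's caption database.
ACR_DB = {"frame-hash-123": "content-42"}       # captured image/audio -> content ID
CAPTION_DB = {"content-42": "Hello, world."}    # content ID -> caption information

def fingerprint(captured_media):
    # A real implementation would compute an audio/video fingerprint of the
    # captured media; here the captured media already serves as its own key.
    return captured_media

def acquire_caption(captured_media):
    """Resolve captured media to caption information, mirroring steps
    S1010/S1020 (ACR lookup) and S1030/S1040 (caption lookup)."""
    content_id = ACR_DB.get(fingerprint(captured_media))
    if content_id is None:
        return None
    return CAPTION_DB.get(content_id)
```

The actual recognition and transport between HMD 30 and the servers are, of course, network operations rather than in-process dictionary lookups.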
- the display device 20 displays windows 400 and 410.
- the window 400 includes an indicator 401 and an image information display area 402.
- the indicator 401 displays information for identifying the content being displayed in the image information display area 402.
- Image information is displayed in the image information display area 402.
- the window 410 includes an indicator 411 and an image information display area 412.
- the indicator 411 displays information for identifying the content being displayed in the image information display area 412.
- Image information is displayed in the image information display area 412.
- The HMD 30 displays the caption information 420 corresponding to the window 400 at a position superimposed on the window 400 in the display unit 31, and displays the caption information 430 corresponding to the window 410 at a position superimposed on the window 410.
- The display device 20 can transmit related information related to the image information being displayed in a window, that is, caption information, to the HMD 30. Therefore, the HMD 30 can display the related information, for example, the caption information.
- Since the subtitle information is displayed on the HMD 30, more personal subtitle information (visible only to the user wearing the HMD 30) can be displayed.
- The first processing example is particularly suitable when, for example, a large number of windows are displayed on the display device 20 and many users are viewing different windows (that is, different image information). If these users wear HMDs 30, the subtitle information for the window that each user is viewing is displayed on that user's HMD 30, so each user can more easily grasp the subtitle information that he or she wants to know.
- In step S790 shown in FIG. 43, the external imaging unit 32 of the HMD 30 images the display unit 21 of the display device 20 and outputs the resulting external captured image to the control unit 38.
- the internal imaging unit 33 images the viewpoint of the user wearing the HMD 30 and outputs the internal captured image obtained thereby to the control unit 38.
- In step S800, the control unit 38 specifies the positions of the display device 20 and each window in the external captured image.
- In step S810, the control unit 38 specifies the user's viewpoint position (gaze direction) from the internal captured image.
- In step S820, the control unit 38 specifies the window corresponding to the user's viewpoint position, that is, the window being viewed by the user, based on the results of steps S800 and S810.
- In step S830, the control unit 38 acquires subtitle information (subtitle text information and related information) of each window; the specific processing is as described above. The control unit 38 may acquire not only caption information but also various kinds of additional information (information describing the content of the image information); the specific processing is described later. In step S840, the control unit 38 performs subtitle information display control.
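Steps S800 to S820 amount to a hit test of the gaze point against the detected window rectangles. The sketch below assumes each window is an axis-aligned rectangle (x, y, width, height) in the captured-image coordinate system; this representation, and the dictionary of window IDs, are illustrative assumptions.

```python
def window_at_gaze(gaze, windows):
    """Return the ID of the window containing the user's viewpoint
    position, or None if the gaze falls outside every window
    (corresponding to steps S800-S820)."""
    gx, gy = gaze
    for win_id, (x, y, w, h) in windows.items():
        if x <= gx < x + w and y <= gy < y + h:
            return win_id
    return None
```

The returned ID would then be used in step S830 to select whose subtitle information to display.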
- In FIG. 50, the display device 20 displays windows 400 and 410.
- the HMD 30 acquires subtitle information 420 corresponding to the window 400.
- the HMD 30 displays the caption information 420 corresponding to the window 400 on the display unit 31.
- the HMD 30 acquires subtitle information 430 corresponding to the window 410.
- the HMD 30 displays the caption information 430 corresponding to the window 410 on the display unit 31.
- each user can more easily grasp subtitle information that the user wants to know.
- In step S850 shown in FIG. 44, the external imaging unit 32 of the HMD 30 images the display unit 21 of the display device 20 and outputs the resulting captured image to the control unit 38.
- In step S860, the control unit 38 specifies the positions of the display device 20 and each window in the captured image.
- In step S870, the control unit 38 specifies the window closest to the center of the user's field of view based on the captured image.
- In step S880, the control unit 38 acquires the subtitle information (subtitle text information and related information) of the window at the center of the user's field of view based on the results of steps S860 and S870.
- Specific processing contents are as described above.
- the control unit 38 may acquire not only caption information but also various kinds of additional information (information describing the content of image information). Specific processing contents will be described later.
- In step S890, the control unit 38 performs subtitle information display control.
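Step S870 can be sketched as choosing the window whose centre is nearest to the centre of the user's field of view. The (x, y, width, height) rectangle representation and the window-ID dictionary are illustrative assumptions, not part of the original description.

```python
import math

def nearest_window_to_center(view_center, windows):
    """Return the ID of the window whose centre is closest to the centre
    of the user's field of view (corresponding to step S870)."""
    cx, cy = view_center
    def distance(rect):
        x, y, w, h = rect
        return math.hypot(x + w / 2 - cx, y + h / 2 - cy)
    return min(windows, key=lambda win_id: distance(windows[win_id]))
```

Unlike the gaze hit test of the second processing example, this always selects some window, even when the view centre lies between windows.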
- In FIG. 50, the display device 20 displays windows 400 and 410.
- the HMD 30 acquires subtitle information 420 corresponding to the window 400.
- the HMD 30 displays the caption information 420 corresponding to the window 400 on the display unit 31.
- the HMD 30 acquires the caption information 430 corresponding to the window 410.
- the HMD 30 displays the caption information 430 corresponding to the window 410 on the display unit 31.
- each user can more easily grasp the caption information that the user wants to know.
- In step S900 shown in FIG. 45, the external imaging unit 32 of the HMD 30 images the display unit 21 of the display device 20 and outputs the resulting captured image to the control unit 38.
- In step S910, the control unit 38 specifies the positions of the display device 20 and each window in the captured image.
- In step S920, the control unit 38 acquires the subtitle information (subtitle text information and related information) and additional information for each window.
- the method for acquiring caption information is as described above.
- the control unit 38 may acquire the additional information together with the content ID according to the flow shown in FIG.
- In step S930, the control unit 38 determines the display positions of the caption information and the additional information based on the positions of the display device 20 and each window in the captured image. Specifically, the control unit 38 determines the display positions so that the caption information and the additional information are superimposed on the image information to which they correspond. In step S940, the control unit 38 performs display control of the caption information.
- the display device 20 displays windows 400 and 410.
- a user wearing the HMD 30 visually recognizes the windows 400 and 410.
- the HMD 30 may display subtitle information 460 and 480 having speech balloons, sound effect information 470 corresponding to sound effects, and additional information 490 so as to be superimposed on windows corresponding thereto.
- Sound effect information is a type of additional information.
- The caption information 460 and the sound effect information 470 correspond to the window 400, while the caption information 480 and the additional information 490 correspond to the window 410.
- the control unit 38 may display subtitle information or the like at a position that overlaps with the display device 20 or a position that does not overlap.
- each user can more easily grasp subtitle information that the user wants to know.
- When the window specified by the user in the first and second processing examples (throw) or the sixth processing example (voice or gesture operation) of the first embodiment is displayed, the control unit 38 may perform the following processing: it may continue to display the subtitle information and additional information corresponding to that window.
- Among the plurality of windows, the control unit 38 of the HMD 30 may output the audio information of at least one window from the audio output unit 39. Further, the control unit 38 may display the subtitle information of at least one of the remaining windows on the display unit 31.
- The window whose audio information the display device 20 outputs may be set in advance, or it may be a window in which no caption is displayed or a window not being viewed by the user. The user may arbitrarily select which of the first to fourth processing examples is performed.
- In the fifth processing example, the display unit that displays the caption information is determined based on the usage state of the HMD 30. Note that the same processing may be performed for the additional information.
- the external imaging unit 32 of the HMD 30 images the front (the user's field of view) of the HMD 30 and outputs a captured image obtained thereby to the control unit 38.
- the control unit 38 determines whether the user is viewing the display device 20 based on the captured image. When it is determined that the user is viewing the display device 20, the control unit 38 transmits subtitle information display request information to the display device 20.
- the control unit 26 of the display device 20 acquires subtitle information according to the flow shown in FIG. 46 and displays it on the display unit 21.
- the control unit 38 acquires the caption information by the method described above and displays it on the display unit 31.
- Processing examples are shown in FIGS. 55 and 56.
- The HMD 30 displays caption information 500 (corresponding to the image information being displayed on the display device 20) on the display unit 31.
- Alternatively, the control unit 26 of the display device 20 displays the caption information 500 on the display unit 21.
- The control unit 38 may change the output method of the caption information and the audio information depending on whether or not the user wearing the HMD 30 is viewing the display device 20. For example, when the user is viewing the display device 20, the control unit 38 may output audio information in a first language (for example, English) and output subtitle information in a second language (for example, Japanese). When the user is not viewing the display device 20, the control unit 38 may output audio information in the second language (for example, Japanese) and output subtitle information in the first language (for example, English).
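The language swap described above can be sketched as follows; the language codes and the returned dictionary shape are illustrative assumptions.

```python
def select_output_languages(viewing_display, first_lang="en", second_lang="ja"):
    """Swap the audio/subtitle languages depending on whether the user
    wearing the HMD is looking at the display device."""
    if viewing_display:
        # Viewing the display: audio in the first language,
        # subtitles in the second language.
        return {"audio": first_lang, "subtitles": second_lang}
    # Not viewing the display: the assignment is reversed.
    return {"audio": second_lang, "subtitles": first_lang}
```

In practice the `viewing_display` flag would come from the control unit's analysis of the captured image described above.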
- the output location may be determined in the same manner as in the above-described fifth processing example. Note that the output location may be the display device 20 or the information processing device 10. In this case, the control units 16 and 26 perform output control. In this case, the information processing system may not have the HMD 30.
- The control unit 38 may display the caption information on the display unit 31 when the HMD 30 is worn by the user, or may display the caption information on the display unit 21 when the user removes the HMD 30.
- According to the fifth processing example, it is possible to provide information that is estimated to be more necessary for the user.
- the fifth processing example may be combined with the first to fourth processing examples.
- This table may be stored in any one of the information processing apparatus 10, the display apparatus 20, and the HMD 30, for example.
- the control unit 38 determines an output destination.
- The control unit 38 determines the output destinations of the audio information and the subtitle information according to the table.
- This table may further include correspondences with other information, for example, the genre of the image information, the language type of the subtitle information (Japanese, English, etc.), and the volume of the subtitle information (simple or full).
- the subtitle server 50 stores simple subtitle information and full subtitle information.
- the control unit 38 determines the output destination of the audio information and the caption information according to the table, and determines the language type and volume of the caption information. Then, the control unit 38 acquires caption information of the determined type and volume (a specific acquisition method is based on FIGS. 46 to 48), and outputs the caption information from the determined output destination. In addition, when a plurality of users wear the HMD 30, the control unit 38 may display different types of caption information for each HMD 30.
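As an illustration of this table-driven selection, the sketch below uses an invented routing table keyed by genre; the actual table contents, storage location, and field names are not specified beyond the correspondences described above.

```python
# Hypothetical routing table: per image-information genre, where audio and
# subtitles are output, plus subtitle language type and volume (simple/full).
OUTPUT_TABLE = {
    "news":   {"audio": "TV",  "subtitles": "HMD", "lang": "ja", "volume": "full"},
    "sports": {"audio": "HMD", "subtitles": "TV",  "lang": "en", "volume": "simple"},
}
DEFAULT_ROUTE = {"audio": "TV", "subtitles": "HMD", "lang": "ja", "volume": "simple"}

def route_output(genre):
    """Look up the output destinations and subtitle type for a genre,
    falling back to a default route for genres not in the table."""
    return OUTPUT_TABLE.get(genre, DEFAULT_ROUTE)
```

With such a table, different HMDs 30 could also be given different `lang`/`volume` entries to show different caption types per user, as noted above.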
- the display mode of the sixth processing example is also applicable to the first to fifth processing examples.
- the first and second embodiments may have any of the effects described in the present specification or other effects.
- (1) An information processing apparatus comprising: a communication unit that receives, from another information processing apparatus, display position designation information indicating the display position of a window; and a control unit that performs control to display the window at the display position indicated by the display position designation information within the display area of a display unit.
- (2) The information processing apparatus according to (1), wherein the control unit performs control to transmit information related to a displayable position of the window to the other information processing apparatus.
- (4) The information processing apparatus according to any one of (1) to (3), wherein the control unit performs control to display a plurality of windows and associates each of the plurality of windows with each of a plurality of other information processing apparatuses.
- (5) The information processing apparatus according to (4), wherein the control unit determines the display position of the window associated with the other information processing apparatus based on the position of the user of the other information processing apparatus.
- (6) The information processing apparatus according to (5), wherein the control unit changes the display position of the window associated with the other information processing apparatus when the position of the user of the other information processing apparatus changes.
- (7) The information processing apparatus according to (4), further comprising a detection unit that detects the position of the user of the other information processing apparatus, wherein, in response to determining that the position of the user of the other information processing apparatus is not within a predetermined range, the control unit cancels the association between the window associated with the other information processing apparatus and the other information processing apparatus.
- (8) The information processing apparatus according to (7), wherein the control unit associates the window whose association with the other information processing apparatus has been canceled with an information processing apparatus other than the other information processing apparatus.
- (9) The information processing apparatus according to (7), wherein the control unit deletes the window associated with the other information processing apparatus when the detection unit determines that the position of the user of the other information processing apparatus is not within a predetermined range.
- (10) The control unit divides the display area into a plurality of small areas and associates each of the plurality of small areas with each of a plurality of other information processing apparatuses.
- The control unit performs control to display a plurality of windows.
- (11) The information processing apparatus according to any one of (1) to (10), wherein the control unit performs control to transmit the image information displayed in the window to the other information processing apparatus.
- (12) The control unit performs control to delete the one window.
- (13) The information processing apparatus according to any one of (1) to (12), wherein the control unit transmits related information related to the image information displayed in the window to the other information processing apparatus.
- An information processing apparatus comprising: a control unit that generates display position designation information indicating the display position of a window; and a communication unit that transmits the display position designation information to another information processing apparatus capable of displaying the window.
Description
1. First embodiment (an example of an information processing system including a display device and an information processing device)
1-1. Overall configuration
1-2. Configuration of the information processing device
1-3. Configuration of the display device
1-4. Examples of processing by the information processing system
1-4-1. First processing example
1-4-2. Second processing example
1-4-3. Third processing example
1-4-4. Fourth processing example
1-4-5. Fifth processing example
1-4-6. Sixth processing example
1-4-7. Seventh processing example
1-4-8. Eighth processing example
1-4-9. Ninth processing example
1-4-10. Tenth processing example
1-4-11. Eleventh processing example
2. Second embodiment (an example in which a head mounted display is added to the first embodiment)
2-1. Overall configuration
2-2. Configuration of the head mounted display
2-3. Configuration of the servers
2-4. Examples of processing by the information processing system
2-4-1. First processing example
2-4-2. Second processing example
2-4-3. Third processing example
2-4-4. Fourth processing example
2-4-5. Fifth processing example
2-4-6. Sixth processing example
(1-1. Overall configuration)
First, the overall configuration of the information processing system according to the first embodiment will be described. The information processing system includes one or more information processing devices 10 and a display device 20. Note that the information processing system does not necessarily have to include the information processing device 10: in some of the processing examples described later, the information processing device 10 is not required, and in those cases the information processing system may omit it. When the information processing system includes a plurality of information processing devices 10, these may be carried by different users.
Next, the configuration of the information processing device 10 according to the first embodiment will be described with reference to FIGS. 1 and 2. As shown in FIG. 1, the information processing device 10 includes a display unit 11, an operation unit 12, a detection unit 13, a communication unit 14, a storage unit 15, a control unit 16, and an audio output unit 17.
Next, the configuration of the display device 20 according to the first embodiment will be described with reference to FIGS. 3 and 4. As shown in FIG. 3, the display device 20 includes a display unit 21, an audio output unit 22, an imaging unit 23 (detection unit), a communication unit 24, a storage unit 25, a control unit 26, and an audio detection unit 27.
Processing examples by the information processing system are described below. In the following description, the information processing device 10 is assumed to be a so-called smartphone, so the operation unit 12 described above is realized by a so-called touch panel.
Next, the first processing example by the information processing system will be described. In the first processing example, the user performs a throw operation to cause information displayed on the information processing device 10 to be displayed on the display device 20. The information processing device 10 mirror-displays the information being displayed by the display device 20.
In step S10, the control unit 16 displays image information on the display unit 11 while waiting for the user to perform a throw operation. This image information becomes the image information displayed on the display device 20, that is, the throw image information. When a throw operation is performed, the operation unit 12 outputs throw operation information to the control unit 16. When the throw operation information is given, that is, when the throw operation is detected, the control unit 16 proceeds to step S20.
In step S100, the communication unit 24 of the display device 20 receives the throw request information and outputs it to the control unit 26. While waiting for the throw request information, the control unit 26 may display some image information (hereinafter also referred to as "base image information") on the display unit 21. A display example is shown in FIG. 17: the control unit 26 displays image information 170 on the display unit 21 as the base image information.
Next, the second processing example will be described with reference to FIGS. 7 and 8. The second processing example uses the information processing device 10 as a remote controller.
In steps S190 to S240 shown in FIG. 7, the information processing device 10 performs the same processing as in steps S10 to S60 shown in FIG. 5.
In steps S290 to S330 shown in FIG. 8, the display device 20 performs the same processing as in steps S100 to S140 shown in FIG. 6.
Next, the third processing example will be described with reference to FIGS. 19 and 20. The third processing example is a modification of the first and second processing examples. In the third processing example, the control unit 26 determines the display state of the throw-image display window based on the usage state (portrait, landscape, diagonal, etc.) of the information processing device 10.
Next, the fourth processing example will be described with reference to FIG. 21. The fourth processing example is also a modification of the first and second processing examples. In the fourth processing example, the control unit 26 displays the throw-image display window based on the position and/or attitude of the information processing device 10 in space. More specifically, the control unit 26 displays the throw-image display window at the intersection of the display unit 21 and a straight line that passes through the center of the information processing device 10 and extends in the longitudinal direction of the information processing device 10.
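The intersection computation of this fourth processing example can be sketched as a simple line-plane intersection. Placing the display unit 21 in the plane z = 0 and representing the device's longitudinal axis as a direction vector are assumptions made purely for illustration.

```python
def axis_display_intersection(device_center, axis_dir):
    """Intersect the line through the device's center along its
    longitudinal axis with the display plane z = 0 (assumed geometry).
    Returns the (x, y) point where the throw-image display window would
    be shown, or None if the axis is parallel to the display plane."""
    px, py, pz = device_center
    dx, dy, dz = axis_dir
    if dz == 0:
        return None  # axis never crosses the display plane
    t = -pz / dz  # parameter where the line reaches z = 0
    return (px + t * dx, py + t * dy)
```

A real implementation would additionally need the display's position and orientation in space, obtained for example from the imaging unit 23.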
次に、図9、図10、図22~図24に基づいて、第5処理例について説明する。第5処理例は、いわゆるキャッチ(catch)を行うものである。
図9に示すステップS400において、制御部16は、ユーザがcatchのトリガ操作を行うまで待機する。ここで、トリガ操作としては、例えば表示部11上を上端から下端に向けてフリックする操作等が挙げられる。また、制御部16は、表示部11にcatchボタンを表示してもよい。そして、制御部16は、ユーザがcatchボタンをタップする操作をトリガ操作と判定してもよい。
図10に示すステップS480において、表示装置20の通信部24は、catchリクエスト情報受信し、制御部26に出力する。
つぎに、図11、図25~図28に基づいて、第6処理例について説明する。第6処理例は、ユーザの位置に応じてウインドウの表示位置を決定する処理である。この第6処理例では、情報処理システムは、必ずしも情報処理装置10を有していなくてもよい。また、第6処理例では、表示装置20の前(撮像部23による撮像範囲)に1または複数のユーザが存在する。
図11に示すステップS530において、音声検出部27は、ユーザによる音声指示情報を検出し、制御部26に出力する。ここで、音声指示情報には、少なくともウインドウの表示を指示する旨の音声情報が含まれる。音声指示情報には、ウインドウの表示内容を示す音声情報が含まれていてもよい。
つぎに、図12及び図29に基づいて、第7処理例について説明する。第7処理例は、第1及び第2処理例の変形例に相当する。
つぎに、図13及び図30に基づいて、第8処理例について説明する。この第8処理例においても、情報処理システムは、情報処理装置10を有していなくてもよい。また、第8処理例では、1または複数のユーザが表示装置20を視認している。
つぎに、図33に基づいて、第9処理例について説明する。第9処理例は、第1及び第2処理例の変形例に相当する。
次に、図35に基づいて、第10処理例について説明する。第10処理例では、表示部21の表示エリア毎に異なるユーザに操作権限が設定される。なお、図35に示す例では、ユーザA、Bが表示装置20を視認している。第10の処理例では、情報処理システムは、情報処理装置10を有していても有していなくてもよい。情報処理システムが情報処理装置10を有する場合、すなわち各ユーザが異なる情報処理装置10を有する場合、記憶部25は、情報処理装置10の識別情報とユーザの識別情報(例えば顔画像)とが関連付けられたユーザ特定用情報が記憶されていてもよい。これにより、制御部26は、いずれかのユーザの情報処理装置10から遠隔操作情報が与えられた場合、その遠隔操作情報がどのユーザから発せられたのかを認識することができる。
Next, an eleventh processing example will be described with reference to FIGS. 36 and 37. In the eleventh processing example, operation authority is assigned to a different user for each window of the display unit 21. In the example shown in FIG. 36, users A, B, and C are viewing the display device 20; in the example shown in FIG. 37, users A and B are viewing the display device 20. In the eleventh processing example, the information processing system may or may not include information processing devices 10. When the information processing system includes information processing devices 10, i.e., when each user has a different information processing device 10, the storage unit 25 may store user-identifying information in which the identification information of each information processing device 10 is associated with identification information of its user, such as the user's name or face image. This allows the control unit 26, when remote operation information is given from any user's information processing device 10, to recognize which user issued that remote operation information.
Next, a second embodiment will be described.
First, the overall configuration will be described with reference to FIGS. 38 to 41 and 49. The information processing system according to the second embodiment includes the information processing device 10, the display device 20, and a head-mounted display (hereinafter also referred to as the "HMD") 30. The information processing device 10, the display device 20, and the HMD 30 each communicate with various servers (e.g., a translation server 40, a subtitle server 50, and an ACR server 60).
The configurations of the information processing device 10 and the display device 20 are the same as in the first embodiment, so only the configuration of the HMD 30 is described here.
Next, the configurations of the translation server 40, the subtitle server 50, and the ACR server 60 used in the second embodiment will be described with reference to FIGS. 40 and 41.
…is information that describes the content of the image information (e.g., performer information). The control unit 63 controls the ACR server 60 as a whole and also performs the processing shown in each of the processing examples described later.
(2-4-1. First processing example)
Next, a first processing example will be described with reference to FIGS. 42 and 50 to 54. In the first through fourth processing examples, subtitle information is displayed on the display unit 31 while a user wearing the HMD 30 is viewing the display device 20.
Next, a second processing example will be described with reference to FIGS. 43, 50, 52, and 53. In step S790 shown in FIG. 43, the external imaging unit 32 of the HMD 30 images the display unit 21 of the display device 20 and outputs the resulting external captured image to the control unit 38. Meanwhile, the internal imaging unit 33 images the viewpoint of the user wearing the HMD 30 and outputs the resulting internal captured image to the control unit 38.
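Combining the two images from step S790 comes down to one test: does the gaze point, estimated from the internal captured image, fall inside the region of the display unit 21 detected in the external captured image? A sketch assuming both are already expressed in external-image pixel coordinates (that shared coordinate frame is an assumption):

```python
def gaze_on_screen(gaze_point, screen_bbox):
    """Decide whether the user's estimated gaze point falls inside the
    bounding box of display unit 21 as detected in the external captured
    image. Both arguments are assumed to be in external-image pixels.
    gaze_point: (x, y); screen_bbox: (left, top, right, bottom)."""
    x, y = gaze_point
    left, top, right, bottom = screen_bbox
    return left <= x <= right and top <= y <= bottom

assert gaze_on_screen((400, 300), (200, 150, 600, 450))       # looking at TV
assert not gaze_on_screen((50, 300), (200, 150, 600, 450))    # looking away
```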
Next, a third processing example will be described with reference to FIGS. 44, 50, 52, and 53. In step S850 shown in FIG. 44, the external imaging unit 32 of the HMD 30 images the display unit 21 of the display device 20 and outputs the resulting captured image to the control unit 38.
Next, a fourth processing example will be described with reference to FIGS. 45, 50, and 54. In step S900 shown in FIG. 45, the external imaging unit 32 of the HMD 30 images the display unit 21 of the display device 20 and outputs the resulting captured image to the control unit 38.
Next, a fifth processing example will be described with reference to FIGS. 55 and 56. In the fifth processing example, the display unit on which subtitle information is displayed is determined based on the usage state of the HMD 30. The same processing may also be applied to additional information.
Next, a sixth processing example will be described. In the sixth processing example, the output destinations of the audio information and the image information are changed for each piece of image information (or for each genre of image information).
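This per-genre routing can be sketched as a lookup table mapping a genre to a (video destination, audio destination) pair. The table contents and device names below are illustrative assumptions, not routings given in the specification:

```python
def output_route(genre, routes, default=("display-20", "display-20")):
    """Choose (video destination, audio destination) per genre of image
    information, as in the sixth processing example of the second
    embodiment. Table contents and device names are assumptions."""
    return routes.get(genre, default)

routes = {
    "movie": ("display-20", "hmd-30"),   # video on the TV, audio on the HMD
    "news":  ("hmd-30", "hmd-30"),       # keep both private to the HMD
}
assert output_route("movie", routes) == ("display-20", "hmd-30")
assert output_route("sports", routes) == ("display-20", "display-20")
```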
(1)
An information processing device including:
a communication unit that receives, from another information processing device, display-position designation information indicating a display position of a window; and
a control unit that performs control to display the window at the display position indicated by the display-position designation information within a display area of a display unit.
(2)
The information processing device according to (1), wherein the control unit performs control to transmit information on displayable positions of the window to the other information processing device.
(3)
The information processing device according to (1) or (2), wherein the control unit determines a display state of the window based on a usage state of the other information processing device.
(4)
The information processing device according to any one of (1) to (3), wherein there are a plurality of the other information processing devices, and
the control unit performs control to display a plurality of windows and associates each of the plurality of windows with a respective one of the plurality of other information processing devices.
(5)
The information processing device according to (4), including a detection unit that detects a position of a user of the other information processing device,
wherein the control unit determines a display position of the window associated with the other information processing device based on the position of the user of the other information processing device.
(6)
The information processing device according to (5), wherein, when the position of the user of the other information processing device changes, the control unit changes the display position of the window associated with the other information processing device.
(7)
The information processing device according to (4), including a detection unit that detects a position of a user of the other information processing device,
wherein the control unit cancels the association between the other information processing device and the window associated with it in response to a determination that the position of the user of the other information processing device is not within a predetermined range.
(8)
The information processing device according to (7), wherein the control unit associates the window whose association with the other information processing device has been cancelled with an information processing device other than the other information processing device.
(9)
The information processing device according to (7), wherein the control unit erases the window associated with the other information processing device in response to the detection unit determining that the position of the user of the other information processing device is not within the predetermined range.
(10)
The information processing device according to (4), wherein the control unit divides the display area into a plurality of subregions and associates each of the plurality of subregions with a respective one of the plurality of other information processing devices.
(11)
The information processing device according to any one of (1) to (10), wherein the control unit performs control to display a plurality of windows and, when image information displayed in any one of the plurality of windows is requested by the other information processing device, performs control to transmit the image information displayed in the one window to the other information processing device.
(12)
The information processing device according to (11), wherein the control unit performs control to erase the one window.
(13)
The information processing device according to any one of (1) to (12), wherein the control unit transmits, to the other information processing device, related information relating to the image information being displayed in the window.
(14)
An information processing device including:
a control unit that generates display-position designation information indicating a display position of a window; and
a communication unit that transmits the display-position designation information to another information processing device capable of displaying the window.
(15)
An information processing method including:
receiving, from another information processing device, display-position designation information indicating a display position of a window; and
performing control to display a window at the display position indicated by the display-position designation information within a display area of a display unit.
(16)
A program causing a computer to realize:
a communication function of receiving, from another information processing device, display-position designation information indicating a display position of a window; and
a control function of performing control to display a window at the display position indicated by the display-position designation information within a display area of a display unit.
11 Display unit
12 Operation unit
13 Detection unit
14 Communication unit
15 Storage unit
16 Control unit
17 Audio output unit
20 Display device
21 Display unit
22 Audio output unit
23 Imaging unit
24 Communication unit
25 Storage unit
26 Control unit
27 Audio detection unit
Claims (16)
- An information processing device comprising: a communication unit that receives, from another information processing device, display-position designation information indicating a display position of a window; and a control unit that performs control to display the window at the display position indicated by the display-position designation information within a display area of a display unit.
- The information processing device according to claim 1, wherein the control unit performs control to transmit information on displayable positions of the window to the other information processing device.
- The information processing device according to claim 1, wherein the control unit determines a display state of the window based on a usage state of the other information processing device.
- The information processing device according to claim 1, wherein there are a plurality of the other information processing devices, and the control unit performs control to display a plurality of windows and associates each of the plurality of windows with a respective one of the plurality of other information processing devices.
- The information processing device according to claim 4, comprising a detection unit that detects a position of a user of the other information processing device, wherein the control unit determines a display position of the window associated with the other information processing device based on the position of the user of the other information processing device.
- The information processing device according to claim 5, wherein, when the position of the user of the other information processing device changes, the control unit changes the display position of the window associated with the other information processing device.
- The information processing device according to claim 4, comprising a detection unit that detects a position of a user of the other information processing device, wherein the control unit cancels the association between the other information processing device and the window associated with it in response to a determination that the position of the user of the other information processing device is not within a predetermined range.
- The information processing device according to claim 7, wherein the control unit associates the window whose association with the other information processing device has been cancelled with an information processing device other than the other information processing device.
- The information processing device according to claim 7, wherein the control unit erases the window associated with the other information processing device in response to the detection unit determining that the position of the user of the other information processing device is not within the predetermined range.
- The information processing device according to claim 4, wherein the control unit divides the display area into a plurality of subregions and associates each of the plurality of subregions with a respective one of the plurality of other information processing devices.
- The information processing device according to claim 1, wherein the control unit performs control to display a plurality of windows and, when image information displayed in any one of the plurality of windows is requested by the other information processing device, performs control to transmit the image information displayed in the one window to the other information processing device.
- The information processing device according to claim 11, wherein the control unit performs control to erase the one window.
- The information processing device according to claim 1, wherein the control unit transmits, to the other information processing device, related information relating to the image information being displayed in the window.
- An information processing device comprising: a control unit that generates display-position designation information indicating a display position of a window; and a communication unit that transmits the display-position designation information to another information processing device capable of displaying the window.
- An information processing method comprising: receiving, from another information processing device, display-position designation information indicating a display position of a window; and performing control to display a window at the display position indicated by the display-position designation information within a display area of a display unit.
- A program causing a computer to realize: a communication function of receiving, from another information processing device, display-position designation information indicating a display position of a window; and a control function of performing control to display a window at the display position indicated by the display-position designation information within a display area of a display unit.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/025,154 US10545623B2 (en) | 2013-10-04 | 2014-08-20 | Information processing device and information processing method to coordinate with a plurality of information processing devices |
JP2015540422A JP6455435B2 (ja) | 2013-10-04 | 2014-08-20 | 情報処理装置、情報処理方法、及びプログラム |
EP14851279.1A EP3054378B1 (en) | 2013-10-04 | 2014-08-20 | Information processing device, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-208828 | 2013-10-04 | ||
JP2013208828 | 2013-10-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015049931A1 true WO2015049931A1 (ja) | 2015-04-09 |
Family
ID=52778527
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/071802 WO2015049931A1 (ja) | 2013-10-04 | 2014-08-20 | 情報処理装置、情報処理方法、及びプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US10545623B2 (ja) |
EP (1) | EP3054378B1 (ja) |
JP (2) | JP6455435B2 (ja) |
WO (1) | WO2015049931A1 (ja) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11030778B2 (en) * | 2014-03-31 | 2021-06-08 | Healthy.Io Ltd. | Methods and apparatus for enhancing color vision and quantifying color interpretation |
JP6412778B2 (ja) * | 2014-11-19 | 2018-10-24 | 東芝映像ソリューション株式会社 | 映像装置、方法、およびプログラム |
KR102306536B1 (ko) * | 2015-04-01 | 2021-09-29 | 삼성전자주식회사 | 위젯 제공 시스템 및 방법 |
KR102389038B1 (ko) * | 2015-09-02 | 2022-04-21 | 엘지전자 주식회사 | 전자 기기 및 전자 기기의 제어 방법 |
US10937460B2 (en) * | 2016-06-09 | 2021-03-02 | Apple Inc. | Media files and protocols supporting runtime dependent tracks |
US11076112B2 (en) * | 2016-09-30 | 2021-07-27 | Lenovo (Singapore) Pte. Ltd. | Systems and methods to present closed captioning using augmented reality |
US10331394B1 (en) * | 2017-12-21 | 2019-06-25 | Logmein, Inc. | Manipulating shared screen content |
CN113542825B (zh) * | 2020-04-20 | 2022-10-11 | 华为技术有限公司 | 投屏显示方法、系统、终端设备和存储介质 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004021595A (ja) * | 2002-06-17 | 2004-01-22 | Mitsubishi Electric Corp | 会議支援協調作業システム |
JP2006065558A (ja) | 2004-08-26 | 2006-03-09 | Canon Inc | 入力表示装置 |
JP2006215531A (ja) * | 2005-01-06 | 2006-08-17 | Canon Inc | 情報処理装置、方法、記憶媒体並びプログラム |
JP2007148350A (ja) * | 2005-10-28 | 2007-06-14 | Sharp Corp | 画像出力装置、画像表示装置、画像出力用通信システム、画像一覧表示システム、プログラム、記録媒体、及び画像出力方法 |
WO2009125481A1 (ja) * | 2008-04-10 | 2009-10-15 | パイオニア株式会社 | 画面表示システム及び画面表示プログラム |
JP2010026327A (ja) | 2008-07-22 | 2010-02-04 | Canon Inc | 表示装置の制御装置、制御方法、及びコンピュータプログラム |
JP2013145451A (ja) | 2012-01-13 | 2013-07-25 | Sony Corp | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム |
JP2013179553A (ja) * | 2012-01-30 | 2013-09-09 | Sharp Corp | 画面分割表示システム及び画面分割表示方法 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3524187B2 (ja) * | 1994-12-28 | 2004-05-10 | キヤノン株式会社 | 共有ウィンドウ操作権管理システムおよびその制御方法 |
JP4171123B2 (ja) * | 1998-12-15 | 2008-10-22 | 富士通株式会社 | 端末操作装置 |
JP2004318121A (ja) * | 2003-04-04 | 2004-11-11 | Canon Inc | 表示制御装置及び表示システム及びtv装置 |
US20060150108A1 (en) * | 2005-01-06 | 2006-07-06 | Canon Kabushiki Kaisha | Information processing device, information processing method, storage medium, and program |
JP4827626B2 (ja) * | 2006-06-14 | 2011-11-30 | キヤノン株式会社 | 被制御機器、遠隔制御システムおよび遠隔制御システムの制御方法、プログラム |
JP5093884B2 (ja) * | 2007-04-17 | 2012-12-12 | シャープ株式会社 | 表示制御装置及び表示制御プログラム |
JP5248225B2 (ja) * | 2008-07-11 | 2013-07-31 | 富士フイルム株式会社 | コンテンツ表示装置、コンテンツ表示方法およびプログラム |
JP2010177848A (ja) * | 2009-01-28 | 2010-08-12 | Sharp Corp | テレビ装置、pc装置、テレビ装置とpc装置とからなる表示システム |
JP5375338B2 (ja) * | 2009-05-29 | 2013-12-25 | セイコーエプソン株式会社 | 画像表示システム、画像表示装置、画像表示方法、画像供給装置、およびプログラム |
US9344510B2 (en) * | 2009-07-03 | 2016-05-17 | International Business Machines Corporation | Pushing content from a local device to a remote display |
US9201627B2 (en) * | 2010-01-05 | 2015-12-01 | Rovi Guides, Inc. | Systems and methods for transferring content between user equipment and a wireless communications device |
US8789131B2 (en) * | 2010-05-14 | 2014-07-22 | Lg Electronics Inc. | Electronic device and method of sharing contents thereof with other devices |
JP2012019413A (ja) * | 2010-07-08 | 2012-01-26 | Toshiba Corp | 表示システム、端末装置、表示装置、及びプログラム |
JP5516882B2 (ja) * | 2010-07-29 | 2014-06-11 | セイコーエプソン株式会社 | プログラム、情報記憶媒体、端末装置、表示システムおよび画像生成方法 |
WO2012102416A1 (en) * | 2011-01-24 | 2012-08-02 | Lg Electronics Inc. | Data sharing between smart devices |
WO2014115387A1 (ja) * | 2013-01-28 | 2014-07-31 | ソニー株式会社 | 情報処理装置、情報処理方法およびプログラム |
US8938558B2 (en) * | 2013-03-04 | 2015-01-20 | Microsoft Corporation | Modifying functionality based on distances between devices |
US9063631B2 (en) * | 2013-03-15 | 2015-06-23 | Chad Dustin TILLMAN | System and method for cooperative sharing of resources of an environment |
US20140282103A1 (en) * | 2013-03-16 | 2014-09-18 | Jerry Alan Crandall | Data sharing |
JP2014232228A (ja) * | 2013-05-29 | 2014-12-11 | ソニー株式会社 | 情報処理装置、および情報処理システム |
-
2014
- 2014-08-20 EP EP14851279.1A patent/EP3054378B1/en active Active
- 2014-08-20 JP JP2015540422A patent/JP6455435B2/ja active Active
- 2014-08-20 US US15/025,154 patent/US10545623B2/en not_active Expired - Fee Related
- 2014-08-20 WO PCT/JP2014/071802 patent/WO2015049931A1/ja active Application Filing
-
2018
- 2018-12-11 JP JP2018231808A patent/JP6638804B2/ja active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3054378A4 |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108475184A (zh) * | 2016-02-16 | 2018-08-31 | 三星电子株式会社 | 电子设备及其应用数据显示方法 |
WO2019235135A1 (ja) * | 2018-06-07 | 2019-12-12 | ソニー株式会社 | タスク対応情報の表示位置を変更する情報処理装置 |
JP2020061670A (ja) * | 2018-10-11 | 2020-04-16 | 日本放送協会 | 字幕表示装置及びそのプログラム |
JP7153522B2 (ja) | 2018-10-11 | 2022-10-14 | 日本放送協会 | 字幕表示装置及びそのプログラム |
JP7450906B2 (ja) | 2019-10-08 | 2024-03-18 | 株式会社セイビ堂 | 作業現場用の表示システム、情報伝達方法 |
WO2022239152A1 (ja) * | 2021-05-12 | 2022-11-17 | 日本電信電話株式会社 | 情報提示装置、情報提示方法、及びプログラム |
CN113590251A (zh) * | 2021-08-05 | 2021-11-02 | 四川艺海智能科技有限公司 | 单屏多窗口的数字化互动展示系统及方法 |
CN113590251B (zh) * | 2021-08-05 | 2024-04-12 | 四川艺海智能科技有限公司 | 单屏多窗口的数字化互动展示系统及方法 |
WO2023157241A1 (ja) * | 2022-02-18 | 2023-08-24 | 任天堂株式会社 | システム、ポータブル電子機器、処理方法、およびプログラム |
Also Published As
Publication number | Publication date |
---|---|
JP6455435B2 (ja) | 2019-01-23 |
US20160231872A1 (en) | 2016-08-11 |
JP2019083023A (ja) | 2019-05-30 |
JP6638804B2 (ja) | 2020-01-29 |
EP3054378B1 (en) | 2022-11-02 |
US10545623B2 (en) | 2020-01-28 |
JPWO2015049931A1 (ja) | 2017-03-09 |
EP3054378A4 (en) | 2017-06-14 |
EP3054378A1 (en) | 2016-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6638804B2 (ja) | 表示装置 | |
JP5942978B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP5859969B2 (ja) | 車載情報システム、車載装置、情報端末 | |
KR102266901B1 (ko) | 디스플레이 장치 및 디스플레이 방법 | |
EP2884378B1 (en) | Method of displaying pointing information and device for performing the method | |
WO2017187708A1 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
US20210345017A1 (en) | Methods, systems, and media for presenting interactive elements within video content | |
KR20140085931A (ko) | 손 움직임 감지를 통한 방송 서비스 제공 방법 및 장치 | |
JP6492419B2 (ja) | 頭部装着型表示装置、頭部装着型表示装置を制御する方法、コンピュータープログラム、画像表示システム、および、情報処理装置 | |
EP3024220A2 (en) | Display apparatus and display method | |
WO2017104089A1 (ja) | ヘッドマウントディスプレイ連携表示システム、及び、表示装置とヘッドマウントディスプレイとを含むシステム、及び、その表示装置 | |
CN115175004A (zh) | 用于视频播放的方法、装置、可穿戴设备及电子设备 | |
KR20160097868A (ko) | 디스플레이 장치 및 디스플레이 방법 | |
KR101396821B1 (ko) | 지도 표시 변경 장치 및 방법 | |
WO2019138682A1 (ja) | 情報処理装置、情報処理方法及びプログラム | |
JP7433618B1 (ja) | 情報処理システム、出力方法およびプログラム | |
JP7214685B2 (ja) | システム、情報処理方法およびプログラム | |
AU2022201740B2 (en) | Display device and operating method thereof | |
EP4345817A1 (en) | Display device and operating method thereof | |
US9774812B1 (en) | Image processing terminal and method for providing a service based on sensing levels of a key in a remote controller | |
JP6719276B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP2024008632A (ja) | 情報処理システム、表示方法、プログラム、記録情報作成システム | |
KR20240044008A (ko) | 디스플레이 장치 | |
JP2018190474A (ja) | 情報処理装置、情報処理方法およびプログラム | |
JP2020076997A (ja) | ヘッドマウントディスプレイ装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14851279 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015540422 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2014851279 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014851279 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15025154 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |