WO2013105443A1 - Information processing apparatus, information processing method, and computer program - Google Patents
Information processing apparatus, information processing method, and computer program
- Publication number
- WO2013105443A1 (PCT/JP2012/083751)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- unit
- screen
- display
- information processing
- Prior art date
Links
- information processing: title, claims, description (188)
- computer program: title, claims, description (24)
- processing method: title, claims, description (20)
- method: claims, abstract, description (89)
- process: claims, abstract, description (64)
- transition: claims, abstract, description (34)
- processing: claims, description (133)
- detection method: claims, description (128)
- calculation method: description (67)
- diagram: description (59)
- optimization: description (52)
- transmission: description (41)
- normalization: description (37)
- communication: description (33)
- engineering process: description (23)
- function: description (23)
- analysis method: description (21)
- integration: description (21)
- storage: description (15)
- approach: description (13)
- mechanism: description (12)
- locomotion: description (10)
- calculation algorithm: description (7)
- illumination: description (7)
- response: description (7)
- expression: description (5)
- action: description (4)
- gravity: description (4)
- interaction: description (4)
- decrease: description (3)
- effect: description (3)
- benefit: description (2)
- modification: description (2)
- panning: description (2)
- body shape: description (1)
- change: description (1)
- complementary: description (1)
- design: description (1)
- impaired: description (1)
- optical: description (1)
- spreading: description (1)
- substitution: description (1)
- visual: description (1)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/60—Rotation of whole images or parts thereof
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/161—Indexing scheme relating to constructional details of the monitor
- G06F2200/1614—Image rotation following screen orientation, e.g. switching from landscape to portrait mode
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/32—Image data format
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0464—Positioning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0492—Change of orientation of the displayed image, e.g. upside-down, mirrored
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/14—Solving problems related to the presentation of information to be displayed
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
Definitions
- The technology disclosed in this specification relates to an information processing apparatus, an information processing method, and a computer program for a device whose display screen also serves as an input unit, such as a touch panel. In particular, it relates to an information processing apparatus, an information processing method, and a computer program in which a large screen is shared by a plurality of users who perform collaborative work through operation of the touch panel.
- In recent years, tablet terminals whose display screens also serve as input units, such as touch panels, have been spreading rapidly. Because a tablet terminal presents widgets or a desktop as its interface and its operation method is easy to understand visually, users can handle it more easily than a personal computer operated through a keyboard or mouse.
- For example, there is a touch-sensitive device that reads data belonging to a touch input from a multipoint sensing device, such as a multipoint touch screen, and identifies multipoint gestures based on the data from that device (see, for example, Patent Document 1).
- Each operated object is, for example, playback content such as a moving image or still image, or an e-mail or message received from another user.
- To display a desired operated object directly facing himself or herself, a user must rotate the tablet terminal body. With an A4- or A5-size tablet terminal, such rotation is easy, but with a large screen of several tens of inches, rotating the terminal every time an individual user operates an operated object is troublesome.
- There has also been proposed a tablet terminal that, when a proximity sensor detects where a user is present along a side edge of the terminal, identifies the region between the user's right and left arms and maps that region to the user's touch-point region (see, for example, Non-Patent Document 1).
- When a tablet terminal detects a plurality of users, it can set an individual operation right for each operated object, or prohibit additional user participation in advance. This prevents other users from performing operations, such as rotating it to face themselves, on an operated object currently operated by a certain user.
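- As a minimal sketch of such a per-object operation right (the names and structure below are hypothetical; the patent describes the behavior, not an implementation), the idea reduces to recording a right holder on each operated object and refusing operations from anyone else:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OperatedObject:
    """An on-screen object whose operation right can be held by one user."""
    name: str
    owner: Optional[str] = None   # user currently holding the operation right
    locked: bool = False          # True if additional participation is prohibited in advance

    def try_rotate(self, user: str) -> bool:
        """Permit rotation only for the right holder; an unowned, unlocked
        object grants its operation right to the first user who touches it."""
        if self.owner is None and not self.locked:
            self.owner = user
            return True
        return self.owner == user

obj = OperatedObject("photo#1")
assert obj.try_rotate("userA")        # user A acquires the operation right
assert not obj.try_rotate("userB")    # user B cannot rotate the object to face himself
```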
- If the GUI displayed on the terminal screen is constant regardless of the user's distance from the screen and the user's state, problems arise: the information presented on the screen is too fine to read even though the user is far away, or too little information is presented even though the user is nearby.
- Likewise, if the input means for operating the terminal is constant regardless of the user's distance and state, inconveniences arise: the user cannot operate the terminal without a remote control even when standing near it, or must come close to the terminal just to operate the touch panel.
- An object of the technology disclosed in this specification is to provide an excellent information processing apparatus, information processing method, and computer program that have a large screen shared by a plurality of users, and that allow those users to suitably perform collaborative work through operation of the touch panel.
- Another object of the technology disclosed in this specification is to provide an excellent information processing apparatus, information processing method, and computer program that remain convenient for the user to operate regardless of the user's position and state.
- A further object of the technology disclosed in this specification is to make it possible to display an image of an object on the screen at an appropriate size, regardless of the actual object's size and the screen's size and resolution.
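- That objective reduces to converting physical dimensions to pixels using the monitor's physical size and resolution. A minimal sketch, assuming those monitor specifications are available to the apparatus (the helper and numbers below are illustrative only):

```python
def real_size_to_pixels(object_mm: float, screen_px: int, screen_mm: float) -> int:
    """Pixels needed so that an object of object_mm renders at its physical
    size on a monitor with screen_mm physical extent and screen_px resolution."""
    return round(object_mm * screen_px / screen_mm)

# A 250 mm-tall object keeps its physical size on two different monitors:
print(real_size_to_pixels(250, screen_px=2160, screen_mm=810))  # larger panel: 667 px
print(real_size_to_pixels(250, screen_px=1080, screen_mm=390))  # smaller panel: 692 px
```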
- A further object of the technology disclosed in this specification is to provide an excellent information processing apparatus, information processing method, and computer program capable of simultaneously displaying video content from a plurality of sources on the screen, side by side or superimposed.
- A further object of the technology disclosed in this specification is to provide an excellent information processing apparatus, information processing method, and computer program capable of optimally adjusting the display form of video content at an arbitrary rotation angle of the main body and during the transition to that angle.
- The display form determination unit of the information processing apparatus determines among three display forms: a display form in which the entire video content is displayed so that no part of it is cut off at any rotation angle, a display form in which the area of interest within the video content is maximized at each rotation angle, and a display form in which the video content is rotated so that no invalid area appears.
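- The first and third of these display forms reduce to a geometric fit: either the rotated content's bounding box must fit entirely within the screen, or the rotated content must cover the screen so that no invalid area remains. The following Python sketch of the two zoom factors is derived from that qualitative description, not from the patent's own formulas (compare the zoom-ratio relationship of FIG. 45):

```python
import math

def zoom_factors(theta_deg: float, w: float, h: float, W: float, H: float):
    """Zoom for w x h content rotated by theta_deg on a W x H screen.

    contain: largest zoom at which no part of the content is cut off
             (the rotated bounding box still fits inside the screen).
    cover:   smallest zoom at which no invalid area remains
             (the rotated content covers the whole screen).
    """
    c = abs(math.cos(math.radians(theta_deg)))
    s = abs(math.sin(math.radians(theta_deg)))
    contain = min(W / (w * c + h * s), H / (w * s + h * c))
    cover = max((W * c + H * s) / w, (W * s + H * c) / h)
    return contain, cover

# 16:9 content on a 16:9 screen tilted by 45 degrees:
contain, cover = zoom_factors(45, 16, 9, 16, 9)
print(round(contain, 2), round(cover, 2))  # 0.51 (letterboxed) and 1.96 (cropped)
```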
- The display form determination unit of the information processing apparatus described in claim 1 is configured to determine the display form at an arbitrary rotation angle of the screen, and during its transition process, based on attribute information of the video content.
- For protected video content, the display form determination unit of the information processing apparatus described in claim 1 determines a display form in which no part of the video content is cut off at any rotation angle.
- The invention according to claim 5 of the present application is an information processing method comprising: a rotation angle detecting step of detecting the rotation angle of a screen that displays video content; and a display form determining step of determining the display form of the video content at an arbitrary rotation angle of the screen or during its transition process.
- The invention according to claim 6 of the present application is a computer program written in a computer-readable format so as to cause a computer to function as: a display unit that displays video content on a screen; a rotation angle detection unit that detects the rotation angle of the screen; a display form determination unit that determines the display form of the video content at an arbitrary rotation angle of the screen or during its transition process; and an image processing unit that performs image processing according to the display form determined by the display form determination unit, so that the video content fits the screen tilted by the rotation angle detected by the rotation angle detection unit.
- The computer program according to claim 6 of the present application defines a computer program described in a computer-readable format so as to realize predetermined processing on a computer. In other words, by installing the computer program according to claim 6 on a computer, cooperative action is exerted on the computer, and the same effects as those of the information processing apparatus according to claim 1 of the present application can be obtained.
- According to the technology disclosed in this specification, an excellent information processing apparatus, information processing method, and computer program can be provided that have a large screen shared by a plurality of users, and that allow those users to suitably perform collaborative work through operation of the touch panel.
- Further, according to the technology disclosed in this specification, an excellent information processing apparatus, information processing method, and computer program can be provided that optimize the displayed GUI and the input means according to the user's position and state, and that are therefore convenient for the user.
- Further, according to the technology disclosed in this specification, an information processing apparatus, information processing method, and computer program can be provided in which an image of an object is always displayed on the screen at an appropriate size, regardless of the actual object's size and the screen's size and resolution.
- Further, according to the technology disclosed in this specification, an excellent information processing apparatus, information processing method, and computer program can be provided that can optimally adjust the display form of video content at an arbitrary rotation angle and during its transition process.
- FIG. 1 is a diagram illustrating an example (Wall) of a usage pattern of an information processing apparatus 100 having a large screen.
- FIG. 2 is a diagram showing another example (Tabletop) of the usage pattern of the information processing apparatus 100 having a large screen.
- FIG. 3A is a diagram illustrating another example of a usage pattern of the information processing apparatus 100 having a large screen.
- FIG. 3B is a diagram illustrating another example of a usage pattern of the information processing apparatus 100 having a large screen.
- FIG. 3C is a diagram illustrating another example of a usage pattern of the information processing apparatus 100 having a large screen.
- FIG. 4 is a diagram schematically illustrating a functional configuration of the information processing apparatus 100.
- FIG. 5 is a diagram illustrating an internal configuration of the input interface unit 110.
- FIG. 6 is a diagram showing an internal configuration of the output interface unit 130.
- FIG. 7 is a diagram showing an internal configuration for the calculation unit 120 to process the operated object.
- FIG. 8 is a diagram showing a state where a user occupation area is set on the screen.
- FIG. 9A is a diagram showing a state in which the operated objects #1 to #6 face random directions before user-occupied area A is set.
- FIG. 9B is a diagram illustrating a state in which the operated objects #1 to #6 have switched to the orientation facing user A because user A's user-occupied area A has been set.
- FIG. 10 is a diagram showing a state in which, upon detecting the presence of user B in addition to user A, a user-occupied area B for user B and a common area are additionally set in the screen.
- FIG. 11 is a diagram showing a state in which, upon detecting the presence of user D in addition to users A and B, a user-occupied area D for user D and a common area are additionally set in the screen.
- FIG. 12 is a diagram showing a state in which, upon detecting the presence of user C in addition to users A, B, and D, a user-occupied area C for user C and a common area are additionally set in the screen.
- FIG. 13A is a diagram exemplifying an area division pattern for dividing a user-occupied area for each user on the screen according to the shape and size of the screen and the number of users.
- FIG. 13B is a diagram exemplifying an area division pattern for dividing the user occupation area for each user on the screen in accordance with the shape and size of the screen and the number of users.
- FIG. 13C is a diagram exemplifying an area division pattern for dividing the user occupation area for each user on the screen according to the shape and size of the screen and the number of users.
- FIG. 13D is a diagram exemplifying an area division pattern for dividing the user occupation area for each user on the screen according to the shape and size of the screen and the number of users.
- FIG. 13E is a diagram exemplifying an area division pattern for dividing a user occupation area for each user on the screen according to the shape and size of the screen and the number of users.
- FIG. 14 is a flowchart showing the processing procedure of the monitor area division executed by the monitor area dividing unit 710.
- FIG. 15 is a diagram illustrating a state in which an operated object dragged or thrown into a user-occupied area automatically rotates to the orientation facing that user.
- FIG. 16A is a diagram illustrating a state in which an operated object in a newly occupied user occupation area automatically rotates in a direction facing the user.
- FIG. 16B is a diagram illustrating a state in which an object to be operated in a newly occupied user occupation area automatically rotates in a direction facing the user.
- FIG. 17 is a flowchart showing the procedure of the optimization process for the operated object executed by the object optimization processing unit 720.
- FIG. 18 is a diagram illustrating a state in which the rotation direction is controlled according to the position where the user touches the operated object.
- FIG. 19 is a diagram illustrating a state in which the rotation direction is controlled according to the position where the user touches the operated object.
- FIG. 20 is a diagram illustrating an example of an interaction in which an operated object is exchanged between the information processing apparatus 100 and a user's possession terminal.
- FIG. 21 is a flowchart showing a processing procedure of device cooperation data transmission / reception executed by the device cooperation data transmission / reception unit 730.
- FIG. 22 is a diagram illustrating a state in which an operated object is moved between user-occupied areas, and a state in which the operated object is duplicated.
- FIG. 23 is a diagram illustrating an internal configuration for the calculation unit 120 to perform optimization processing according to the distance of the user.
- FIG. 24A is a table summarizing the optimization processing of the GUI display performed by the display GUI optimization unit 2310 according to the user position and user state.
- FIG. 24B is a diagram illustrating screen transition of the information processing apparatus 100 according to the user position and the user state.
- FIG. 24C is a diagram illustrating screen transition of the information processing apparatus 100 according to the user position and the user state.
- FIG. 24D is a diagram illustrating a screen transition of the information processing apparatus 100 according to the user position and the user state.
- FIG. 24E is a diagram illustrating screen transition of the information processing apparatus 100 according to the user position and the user state.
- FIG. 25A is a diagram showing a display example of a screen that performs automatic zapping by randomly displaying various operated objects.
- FIG. 25B is a diagram showing a display example of a screen in which the display positions and sizes of the plurality of auto-zapped operated objects change from moment to moment.
- FIG. 26 is a diagram showing a screen display example in a state where the user is watching TV but not operating.
- FIG. 27A is a diagram showing a screen display example in a state where the user is operating the TV.
- FIG. 27B is a diagram showing a screen display example in a state where the user is operating the TV.
- FIG. 28 is a table summarizing the input means optimization processing performed by the input unit optimization unit 2320 according to the user position and user state.
- FIG. 29 is a table summarizing the distance detection method switching processing performed by the distance detection method switching unit 2330 according to the user's position.
- FIG. 30 is a diagram for explaining a problem of the conventional object display system.
- FIG. 31 is a diagram for explaining the problems of the conventional object display system.
- FIG. 32 is a diagram showing an internal configuration for the calculation unit 120 to perform real size display processing of an object according to monitor performance.
- FIG. 33 is a diagram illustrating an example in which images of the same object are displayed in real size on screens with different monitor specifications.
- FIG. 34 is a diagram illustrating an example in which images of two objects having different real sizes are displayed on the same screen while maintaining the correct size relationship.
- FIG. 35 is a diagram showing an example of displaying an object image in real size.
- FIG. 36 is a diagram illustrating an example in which an object image displayed in real size is rotated or posture-converted.
- FIG. 37A is a diagram illustrating a state in which real size information of a subject is estimated.
- FIG. 37B is a diagram illustrating a state in which the real size display process of the operated object is performed based on the estimated real size information of the subject.
- FIG. 38A is a diagram showing a state in which face sizes and positions vary among users who are in video chat.
- FIG. 38B is a diagram illustrating a state in which face sizes and positions are aligned among users who are in video chatting by a normalization process between a plurality of images.
- FIG. 39A is a diagram showing a state in which the size and position of the user's figure displayed in parallel with the screen and the instructor's figure are not aligned.
- FIG. 39B is a diagram showing a state in which the size and position of the user's figure displayed in parallel with the screen and the instructor's figure are aligned by normalization processing between a plurality of images.
- FIG. 39C is a diagram illustrating a state in which a user's figure normalized by a normalization process between a plurality of images is displayed superimposed on an instructor's figure.
- FIG. 40A is a diagram showing a state in which a sample image of a product is not overlaid at an appropriate place, nor in a correct size relationship, with respect to the user's own image.
- FIG. 40B is a diagram illustrating a state in which, by normalization processing between a plurality of images, the sample image of the product is displayed at an appropriate place and in a correct size relationship with the user's own image.
- FIG. 41 is a diagram illustrating an internal configuration for the calculation unit 120 to perform image normalization processing.
- FIG. 42 is a diagram showing a display form in which the entire area of the video content is displayed so that no part of the content is cut off at any rotation angle.
- FIG. 43 is a diagram showing a display form in which the attention area in the video content is maximized at each rotation angle.
- FIG. 44 is a diagram showing a display form in which video content is rotated so that there is no invalid area.
- FIG. 45 is a diagram showing the relationship of the zoom ratio of the video content to the rotation position in each of the display forms shown in FIGS. 42 to 44.
- FIG. 46 is a flowchart illustrating a processing procedure for controlling the display form of the video content in the calculation unit 120 when the information processing apparatus 100 is rotated.
- FIG. 47 is a diagram showing an internal configuration for the calculation unit 120 to perform a process of adjusting the display form of the video content in an arbitrary rotation angle of the information processing apparatus 100 main body and its transition process.
- The information processing apparatus 100 has a large screen, and its main usage patterns are assumed to be "Wall", in which it is hung on a wall as shown in FIG. 1, and "Tabletop", in which it is placed on a desk as shown in FIG. 2.
- In the Wall state, the information processing apparatus 100 is attached to the wall surface in a rotatable and detachable state by, for example, a rotation/attachment mechanism unit 180. The rotation/attachment mechanism unit 180 also serves as the electrical contact between the information processing apparatus 100 and the outside; a power cable and a network cable (neither shown) are connected to the information processing apparatus 100 via the rotation/attachment mechanism unit 180.
- the information processing apparatus 100 can receive driving power from a commercial AC power source and can also access various servers on the Internet.
- the information processing apparatus 100 includes a distance sensor, a proximity sensor, and a touch sensor, and can grasp the position (distance, azimuth) of the user facing the screen.
- When a user is detected, a ripple-shaped detection indicator (described later) or an illumination expression indicating the detection state can be displayed on the screen to give the user visual feedback.
- The information processing apparatus 100 automatically selects the optimum interaction according to the user's position. For example, it automatically selects or adjusts the GUI (Graphical User Interface) display, such as the framework and information density of operated objects, according to the user's position. It can also automatically select among a plurality of input means, such as touching the screen, proximity, hand gestures, a remote control, and indirect operation based on the user's state, according to the user's position or the distance to the user.
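- As an illustration only, such distance-dependent selection can be reduced to thresholding the measured user distance. The tiers and numbers below are hypothetical; the patent specifies the behavior, not concrete values:

```python
def select_interaction(distance_m: float) -> dict:
    """Choose GUI density and candidate input means from the user's distance."""
    if distance_m < 0.5:      # within arm's reach of the screen
        return {"gui": "fine detail, many objects", "input": ["touch", "proximity"]}
    if distance_m < 3.0:      # across the room
        return {"gui": "medium density", "input": ["hand gesture", "remote control"]}
    return {"gui": "coarse, large items", "input": ["remote control", "user-state detection"]}

print(select_interaction(0.3))  # close enough for the touch panel
print(select_interaction(5.0))  # far away: large display items, indirect operation
```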
- the information processing apparatus 100 includes one or more cameras, and can recognize not only the position of the user but also a person, an object, and a device from a captured image of the camera.
- The information processing apparatus 100 also includes an ultra-short-range communication unit, and can directly and naturally transmit and receive data to and from a device held by a user who has come within ultra-short range.
- On the large screen in the Wall state, operated objects to be operated by the user are defined.
- Each operated object has a specific display area for a functional module such as an arbitrary Internet site, application, or widget, and includes moving images, still images, and text content.
- The operated objects include received TV broadcast content, playback content from recording media, streaming video acquired via a network, moving images and still images captured from other devices such as the user's mobile terminal, and so on.
- For example, three screens with an aspect ratio of 16:9 can be arranged in the vertical direction.
- Three types of content #1 to #3, such as broadcast content received simultaneously from different broadcast stations, playback content from a recording medium, and streaming video on a network, can then be displayed one above another.
- When the user operates the screen in the up-and-down direction with, for example, a fingertip, the content scrolls up and down as shown in FIG. 3B.
- When the user operates in the left-and-right direction with a fingertip at any of the three levels, the screen at that level scrolls horizontally, as shown in FIG. 3C.
- In the Tabletop state, the information processing apparatus 100 is placed directly on a desk.
- In the Wall state, the rotation/attachment mechanism unit 180 also serves as an electrical contact (as described above), whereas no such contact point is available when the apparatus is installed on a table as shown in FIG. 2. Therefore, in the illustrated Tabletop state, the information processing apparatus 100 may be configured to operate from a built-in battery, without an external power source.
- For example, if the information processing apparatus 100 includes a wireless communication unit corresponding to the mobile-station function of a wireless LAN (Local Area Network), and the rotation/attachment mechanism unit 180 includes a wireless communication unit corresponding to the access-point function of the wireless LAN, the information processing apparatus 100 can access various servers on the Internet through wireless communication with the rotation/attachment mechanism unit 180 as an access point, even in the Tabletop state.
- On the large screen in the Tabletop state as well, operated objects are defined; each has a specific display area for a functional module such as an arbitrary Internet site, application, or widget, and includes moving images, still images, and text content.
- The information processing apparatus 100 includes a proximity sensor at each of the four side edges of the large screen that detects the presence and state of users. As above, a user who has approached the large screen may also be photographed with a camera for person recognition.
- The ultra-short-range communication unit detects whether a user whose presence has been detected possesses a device such as a mobile terminal, and detects data transmission/reception requests from terminals possessed by the user.
- When the presence of a user is detected, a ripple-shaped detection indicator or an illumination expression (described later) indicating the detection state is displayed on the screen to give visual feedback.
- When the information processing apparatus 100 detects the presence of a user with a proximity sensor or the like, it uses the detection result for UI control. Detecting not only the user's presence or absence but also the positions of the torso, hands, feet, head, and so on enables more detailed UI control.
- The information processing apparatus 100 also includes an ultra-short-range communication unit, and can directly and naturally transmit and receive data to and from devices owned by a user who has come within ultra-short range (as described above).
- The information processing apparatus 100 sets, within the large screen, a user-occupied area for each user and a common area shared among the users, according to the detected arrangement of the users. It then detects each user's touch-sensor input in the user-occupied areas and the common area.
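- A simplified sketch of this division, and of routing touch input by area, follows. The actual apparatus consults a device database and an area pattern database for its division patterns (see FIGS. 13A to 13E), so the fixed strip layout below is an assumption:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def divide_screen(W: float, H: float, users: dict) -> dict:
    """Give each detected user a strip along his or her screen edge;
    the rest of the screen is treated as the common area."""
    depth = 0.25  # occupied-strip depth as a fraction of the screen (assumed)
    areas = {}
    for name, edge in users.items():
        if edge == "bottom":
            areas[name] = Rect(0, H * (1 - depth), W, H * depth)
        elif edge == "top":
            areas[name] = Rect(0, 0, W, H * depth)
        elif edge == "left":
            areas[name] = Rect(0, 0, W * depth, H)
        else:  # "right"
            areas[name] = Rect(W * (1 - depth), 0, W * depth, H)
    return areas

def route_touch(areas: dict, px: float, py: float) -> str:
    """Report whose user-occupied area a touch falls in, or the common area."""
    for name, rect in areas.items():
        if rect.contains(px, py):
            return f"user-occupied area of {name}"
    return "common area"

areas = divide_screen(1920, 1080, {"A": "bottom", "B": "right"})
print(route_touch(areas, 960, 1000))  # user-occupied area of A
print(route_touch(areas, 960, 540))   # common area
```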
- The shape of the screen and the region division pattern are not limited to rectangles; they can also be applied to squares, circles, and even solid shapes such as a cone.
- Because the screen of the information processing apparatus 100 is large, in the Tabletop state there is enough room for a plurality of users to perform touch input simultaneously, realizing comfortable and efficient simultaneous operation by a plurality of users.
- The operation right for an operated object placed in a user-occupied area is given to the corresponding user.
- When a user moves an operated object into his or her own user-occupied area, the operation right for that object also moves to that user.
- The display automatically changes so that the operated object faces that user directly.
- When an operated object is moved between user-occupied areas, it moves with a physically natural motion according to the touch position at which the moving operation is performed. In addition, by pulling a single operated object between them, users can divide or duplicate it.
- FIG. 4 schematically shows a functional configuration of the information processing apparatus 100.
- The information processing apparatus 100 includes an input interface unit 110 that takes in information signals from the outside; a calculation unit 120 that performs calculation processing for controlling the display screen based on the input information signals; an output interface unit 130 that outputs the calculation results to the outside; a large-capacity storage unit 140 such as a hard disk drive (HDD); a communication unit 150 that connects to external networks; a power supply unit 160 that handles drive power; and a TV tuner unit 170.
- the storage unit 140 stores various processing algorithms executed by the calculation unit 120 and various databases used for calculation processing by the calculation unit 120.
- The main functions of the input interface unit 110 are detecting the presence of users, detecting touch operations on the screen (that is, the touch panel) by a detected user, detecting devices such as mobile terminals possessed by users, and receiving data transmitted from such devices.
- FIG. 5 shows an internal configuration of the input interface unit 110.
- the remote control receiving unit 501 receives a remote control signal from a remote control or a mobile terminal.
- the signal analysis unit 502 demodulates and decodes the received remote control signal to obtain a remote control command.
- the camera unit 503 adopts a monocular type, or one or both of a binocular type and an active type.
- the camera unit 503 includes an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) or a CCD (Charge Coupled Device).
- the camera unit 503 includes camera control units such as pan, tilt, and zoom.
- The camera unit 503 notifies the calculation unit 120 of camera information such as pan, tilt, and zoom, and its pan, tilt, and zoom can be controlled according to camera control information from the calculation unit 120.
- The image recognition unit 504 performs recognition processing on images captured by the camera unit 503. Specifically, it recognizes gestures by detecting the movement of the user's face and hands using background differencing, recognizes user faces contained in the captured image, recognizes human bodies, and recognizes the distance to the user.
- the microphone unit 505 inputs sound and conversations made by the user.
- the voice recognition unit 506 recognizes the input voice signal.
- the distance sensor 507 is made of, for example, PSD (Position Sensitive Detector), and detects a signal returned from the user or other object.
- the signal analysis unit 508 analyzes the detection signal and measures the distance to the user or the object.
- a pyroelectric sensor, a simple camera, or the like can be used for the distance sensor 507.
- the distance sensor 507 constantly monitors whether or not a user exists within a radius of, for example, 5 to 10 meters from the information processing apparatus 100. For this reason, it is preferable to use a sensor element with low power consumption for the distance sensor 507.
- the touch detection unit 509 includes a touch sensor or the like superimposed on the screen, and outputs a detection signal from a location where the user's fingertip touches the screen.
- the signal analysis unit 510 analyzes the detection signal to obtain position information.
- The proximity sensors 511 are installed at each of the four side edges of the large screen and detect, for example capacitively, that a user's body has come close to the screen.
- the signal analysis unit 512 analyzes the detection signal.
- the ultra short-range communication unit 513 receives a non-contact communication signal from a device owned by the user, for example, by NFC (Near Field Communication).
- the signal analysis unit 514 demodulates and decodes the received signal to obtain received data.
- the 3-axis sensor unit 515 is composed of a gyro and the like, and detects the posture of the information processing apparatus 100 around each axis.
- a GPS (Global Positioning System) receiving unit 516 receives a signal from a GPS satellite.
- the signal analysis unit 517 analyzes signals from the triaxial sensor unit 515 and the GPS reception unit 516 to obtain position information and posture information of the information processing apparatus 100.
- the input interface integration unit 520 integrates the information signal input and passes it to the calculation unit 120. In addition, the input interface integration unit 520 integrates the analysis results of the signal analysis units 508, 510, 512, and 514, acquires position information of users around the information processing apparatus 100, and passes the information to the calculation unit 120.
- The main functions of the calculation unit 120 are calculation processing, such as generating the UI screen based on the user detection results from the input interface unit 110, the screen touch detection results, and the data received from devices owned by users, and outputting the calculation results to the output interface unit 130.
- the calculation unit 120 can realize calculation processing for each application by loading and executing an application program installed in the storage unit 140, for example. The functional configuration of the calculation unit 120 corresponding to each application will be described later.
- the main functions of the output interface unit 130 are UI display on the screen based on the calculation result of the calculation unit 120 and data transmission to the device owned by the user.
- FIG. 6 shows an internal configuration of the output interface unit 130.
- the output interface integration unit 610 integrates and handles information output based on calculation results such as monitor division processing, object optimization processing, and device cooperation data transmission / reception processing by the calculation unit 120.
- The output interface integration unit 610 instructs the content display unit 601 to output the video and audio of received TV broadcast content, of content reproduced from a recording medium such as a Blu-ray disc, and the like.
- the output interface integration unit 610 instructs the GUI display unit 602 to display the GUI such as the operated object on the display unit 603.
- the output interface integration unit 610 instructs the illumination display unit 605 to display the illumination display indicating the detection state from the illumination unit 606.
- the output interface integration unit 610 instructs the ultra short-range communication unit 513 to transmit data to the device owned by the user by non-contact communication.
- The information processing apparatus 100 can detect users through recognition of images captured by the camera unit 503 and through detection signals from the distance sensor 507, the touch detection unit 509, the proximity sensor 511, the ultra-short-range communication unit 513, and so on. A detected user's identity can further be established by face recognition on the captured image or by recognition of the user's device via the ultra-short-range communication unit 513. An identified user can log in to the information processing apparatus 100; of course, the login account can be limited to specific users. Depending on the user's position and state, the information processing apparatus 100 can also accept operations from the user through the distance sensor 507, the touch detection unit 509, and the proximity sensor 511.
- the information processing apparatus 100 is connected to an external network through the communication unit 150.
- the connection form with the external network may be wired or wireless.
- the information processing apparatus 100 can communicate with a mobile terminal such as a smartphone possessed by a user and other devices such as a tablet terminal through the communication unit 150.
- a so-called “3-screen” can be configured by combining three types of devices, that is, the information processing device 100, the mobile terminal, and the tablet terminal.
- the information processing apparatus 100 can provide a UI that links the three screens on a larger screen than the other two screens.
- a cloud server or the like is installed on the external network, and the three screens can benefit from cloud computing through the information processing apparatus 100, such as using the computing power of the cloud server.
- the information processing apparatus 100 can be operated simultaneously by a plurality of users on a large screen.
- Proximity sensors 511 that detect the presence and state of users are provided at each of the four side edges of the large screen, and a user-occupied area and a common area are set within the screen according to the arrangement of the detected users.
- the main function of the calculation unit 120 when realizing this application is to optimize the operated object based on the user detection result by the input interface unit 110, the touch detection result of the screen, and the received data from the device possessed by the user.
- FIG. 7 shows an internal configuration for the calculation unit 120 to process the operated object.
- The calculation unit 120 includes a monitor area dividing unit 710, an object optimization processing unit 720, and a device cooperation data transmission/reception unit 730.
- When the monitor area dividing unit 710 obtains user position information from the input interface integration unit 520, it refers to a device database 711 concerning the screen shape and sensor arrangement, and to an area pattern database 712, both stored in the storage unit 140, and sets the above-described user-occupied areas and common area within the screen. The monitor area dividing unit 710 then passes the set area information to the object optimization processing unit 720 and the device cooperation data transmission/reception unit 730. Details of the monitor area division processing procedure will be described later.
- The object optimization processing unit 720 receives, from the input interface integration unit 520, information on the operations users perform on the operated objects on the screen. In accordance with an optimization processing algorithm 721 loaded from the storage unit 140, it then processes the operated object that the user operated, such as rotating, moving, displaying, dividing, or duplicating it, and outputs the optimized operated object to the screen of the display unit 603. Details of the optimization processing for operated objects will be described later.
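- The facing-the-user behavior can be sketched as snapping an object's rotation to the angle associated with the screen edge where the user stands, turning through the smaller arc. The edge-to-angle mapping and function below are assumptions, not the patent's algorithm:

```python
# Rotation (degrees) at which an operated object directly faces the user
# standing at each screen edge (0 = facing the bottom edge; mapping assumed).
EDGE_TO_ANGLE = {"bottom": 0, "right": 90, "top": 180, "left": 270}

def face_user(object_angle: float, user_edge: str) -> float:
    """Rotate an operated object to face the user detected at user_edge,
    turning through the smaller arc for a natural-looking motion."""
    target = EDGE_TO_ANGLE[user_edge]
    delta = (target - object_angle + 180) % 360 - 180  # shortest signed turn
    return (object_angle + delta) % 360

print(face_user(30.0, "bottom"))  # 0.0: the object turns back by 30 degrees
print(face_user(200.0, "right"))  # 90.0: turns by -110 degrees, not +250
```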
- The device cooperation data transmission/reception unit 730 receives, from the input interface integration unit 520, the position information of a user and of the device the user possesses, along with the data to be exchanged with that device. It then performs data transmission/reception processing in cooperation with the user's device, in accordance with a transmission/reception processing algorithm 731 loaded from the storage unit 140.
- The object optimization processing unit 720 also performs optimization processing of the operated object in connection with such data transmission/reception. For example, it rotates, moves, displays, or duplicates the operated object related to the transmitted or received data as part of the cooperation with the user's device, and outputs the optimized operated object to the screen of the display unit 603. Details of the optimization processing of operated objects associated with device cooperation will be described later.
- monitor area division mainly assumes a usage form in which the information processing apparatus 100 is shared among a plurality of users in the Tabletop state, but it may of course also be applied to a usage form in which the apparatus is shared among a plurality of users in the Wall state.
- when the monitor area dividing unit 710 detects the presence of a user through the input interface integration unit 520, it allocates that user's occupied area in the screen.
- FIG. 8 shows a state in which the monitor area dividing unit 710 has detected the presence of user A based on the detection signal of the proximity sensor 511 (or the distance sensor 507) installed at a screen side edge, and has set the user-occupied area A in the screen. When only one user's presence has been detected, the entire screen may be set as that user's occupied area, as shown.
- then, based on the position information of user A obtained through the input interface integration unit 520, the object optimization processing unit 720 switches the direction of each operated object in the user-occupied area A so as to face the user.
- FIG. 9A shows a state where the operated objects # 1 to # 6 are randomly oriented before the user occupation area A is set. In the figure, it should be understood that the inclination of each character “# 1”, “# 2”... In the object represents the direction of the object.
- FIG. 9B shows a state in which, since the user-occupied area A of user A has been set, all the operated objects # 1 to # 6 in this area have been switched to the direction facing user A.
- as described above, when only user A is present, the entire screen may be set as the user-occupied area A of user A.
- FIG. 10 shows a state in which the presence of user B has additionally been detected at an adjacent screen side edge from the detection signal of the proximity sensor 511 or the distance sensor 507, and the user-occupied area B and a common area have been added and set in the screen. Based on the position information of each of users A and B, the user-occupied area A shrinks toward the place where user A is present, and the user-occupied area B appears near the place where user B is present. Further, when the presence of user B is newly detected, a ripple-shaped detection indicator is displayed in the user-occupied area B. All areas of the screen other than the user-occupied areas A and B are the common area.
- the user-occupied area B may be newly set in the screen as user B approaches the information processing apparatus 100, and then validated when any operated object in the area B is first touched. Although not shown in FIG. 10, each operated object in the area that becomes the new user-occupied area B is switched to the direction facing user B when the user-occupied area B is set, or when it becomes valid.
- when the presence of user D is further detected at another screen side edge, the monitor area dividing unit 710 adds and sets the user-occupied area D in the screen near the place where user D is present.
- a ripple-shaped detection indicator is displayed to express that the presence of the user D is newly detected.
- FIG. 12 shows a state in which, in addition to users A, B, and D, the presence of user C has been detected at the remaining screen side edge, and the monitor area dividing unit 710 has added and set the user-occupied area C in the screen near the place where user C is present.
- a ripple-shaped detection indicator is displayed to express that the presence of the user C is newly detected.
- the area division pattern also depends on the shape of the screen and on the number and arrangement of the users whose presence has been detected.
- the area division pattern database 712 stores information on area division patterns according to the screen shape and size and the number of users.
- the device database 711 stores information on the shape and size of the screen used by the information processing apparatus 100.
- then, the monitor area dividing unit 710 reads the shape and size of the screen from the device database 711 and queries the area division pattern database 712 for the corresponding area division pattern.
- FIGS. 13A to 13E illustrate area division patterns that divide a user-occupied area for each user on the screen according to the shape and size of the screen and the number of users.
- FIG. 14 shows the processing procedure of the monitor area division executed by the monitor area dividing unit 710 in the form of a flowchart.
- the monitor area dividing unit 710 first checks whether there is a user near the screen based on the signal analysis result of the detection signal of the proximity sensor 511 or the distance sensor 507 (step S1401).
- when it is detected that a user exists (Yes in step S1401), the monitor area dividing unit 710 subsequently acquires the number of existing users (step S1402), and further acquires the position where each user is present (step S1403).
- the processing in steps S1401 to S1403 is realized based on the user position information passed from the input interface integration unit 520.
- next, the monitor area dividing unit 710 inquires of the device database 711 to acquire device information such as the screen shape of the display unit 603 used in the information processing apparatus 100 and the arrangement of the proximity sensors 511, and refers to the area division pattern database 712 for the area division pattern corresponding to this device information and the user position information (step S1404).
- the monitor area dividing unit 710 sets the user-occupied area and common area of each user on the screen according to the obtained area dividing pattern (step S1405), and ends this processing routine.
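- To make steps S1401 to S1405 concrete, the following is a minimal Python sketch of the monitor area division routine. The User record, the pattern_db lookup, and the pattern object's methods are hypothetical stand-ins for the device database 711 and the area division pattern database 712, not the actual implementation.

    from dataclasses import dataclass

    @dataclass
    class User:
        user_id: str
        side: str  # screen side edge where presence was detected, e.g. "left"

    def divide_monitor_area(users, screen_shape, pattern_db):
        """Sketch of the monitor area division procedure of FIG. 14."""
        if not users:                        # step S1401: is any user present?
            return None
        count = len(users)                   # step S1402: number of users
        # step S1403 is implicit: each User carries the side where it was found
        # step S1404: look up the division pattern for this screen and count
        pattern = pattern_db[(screen_shape, count)]
        # step S1405: one occupied area per user; everything else is common
        occupied = {u.user_id: pattern.area_for(u.side) for u in users}
        common = pattern.remaining_area(list(occupied.values()))
        return occupied, common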
- when information on an operation performed by the user on an operated object on the screen is input through the input interface integration unit 520, the object optimization processing unit 720 performs display processing such as rotating, moving, displaying, dividing, or copying the operated object in accordance with the user operation.
- the process of rotating, moving, displaying, dividing, or copying an object to be operated in accordance with an operation such as dragging or throwing by a user is similar to a GUI operation on a computer desktop screen.
- a user-occupied area and a common area are set on the screen, and the object optimization processing unit 720 optimally processes the display according to the area where the operated object exists.
- a typical example of the optimum process is a process of switching the direction of each operated object in the user-occupied area so as to face the user.
- when the operated object # 1 in the common area is dragged or thrown toward the user-occupied area A of user A, and a part or the center coordinates of the object enters the user-occupied area A, the object optimization processing unit 720 automatically rotates the object in the direction facing user A.
- FIG. 15 also shows a state in which the operated object # 2 in the user-occupied area B of user B is dragged or thrown toward the user-occupied area A of user A and, when a part of the object or its center coordinates enters the user-occupied area A, the object is automatically rotated in the direction facing user A.
- also, when the user-occupied area B is newly set near user B on the screen, the object optimization processing unit 720 may immediately and automatically rotate the operated object # 3 in that area in the direction facing user B (see FIG. 16B).
- alternatively, the user-occupied area B may first be set and then validated when any operated object in the user-occupied area B is touched. In this case, when the user-occupied area B becomes valid, all the operated objects in the user-occupied area B may be simultaneously rotated so as to face user B.
- the object optimization processing unit 720 can perform the optimal processing of the operated object based on the user operation information acquired through the input interface integration unit 520 and the region information passed from the monitor region dividing unit 710.
- FIG. 17 shows a flowchart of the procedure of the optimum process for the operated object executed by the object optimization processing unit 720.
- when the object optimization processing unit 720 receives the position information of the operated object operated by the user from the input interface integration unit 520, and acquires the divided monitor area information from the monitor area dividing unit 710, it confirms in which area the operated object is located (step S1701).
- when the operated object is in a user-occupied area, the object optimization processing unit 720 checks whether or not the operated object is facing the user of that area (step S1702).
- when the operated object is not facing the user (No in step S1702), the object optimization processing unit 720 performs processing to rotate the operated object so as to face the user within the user-occupied area (step S1703).
- at this time, the rotation direction may be controlled according to the position where the user touches the operated object.
- FIG. 18 shows how, when the user touches the right side of the centroid position of the operated object and drags or throws it, the operated object rotates clockwise around the centroid position upon entering the user-occupied area and comes to face the user.
- FIG. 19 shows how, as a result of the user touching the left side of the centroid position of the operated object and dragging or throwing it, the operated object rotates counterclockwise around the centroid position and comes to face the user.
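- The procedure of FIG. 17, together with the rotation-direction rule of FIGS. 18 and 19, can be sketched as follows. The object, area, and user structures are hypothetical; only the control flow (steps S1701 to S1703) and the left/right-of-centroid rule come from the text.

    def optimize_operated_object(obj, areas, touch_x=None):
        """Sketch of the optimization of FIG. 17 with the rotation rule of
        FIGS. 18 and 19."""
        area = areas.locate(obj.position)          # step S1701: which area?
        if area.owner is None:                     # common area: leave as is
            return
        if obj.facing == area.owner:               # step S1702: already facing?
            return
        # step S1703: rotate to face the user; the touch position relative to
        # the centroid decides the turning direction (FIGS. 18 and 19).
        if touch_x is not None and touch_x > obj.centroid_x:
            obj.rotate_to_face(area.owner, direction="clockwise")
        else:
            obj.rotate_to_face(area.owner, direction="counterclockwise")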
- the information processing apparatus 100 can communicate with other devices such as a mobile terminal possessed by the user through the communication unit 150.
- for example, in the background of the user performing a touch operation on the screen, or performing an action such as bringing a possessed terminal close to the information processing apparatus 100, an operated object can be exchanged between the information processing apparatus 100 and the corresponding terminal through the communication unit 150.
- FIG. 20 illustrates an example of an interaction in which an object to be operated is exchanged between the information processing apparatus 100 and the user's possession terminal.
- in the illustrated example, a UI expression is used in which the operated object appears from the vicinity of the terminal possessed by user A and flows into the user-occupied area A.
- the information processing apparatus 100 can detect that the user's possessed terminal has approached the vicinity of the user-occupied area A based on the signal analysis result of the detection signal of the ultra-short-range communication unit 513 and the recognition result of the user's captured image by the camera unit 503.
- further, the device cooperation data transmission / reception unit 730 may specify, from the context between user A and the information processing apparatus 100 (or the interaction between user A and another user via the information processing apparatus 100), whether there is data to be transmitted to the terminal and what the transmitted data is.
- when there is data to be transmitted, the device cooperation data transmission / reception unit 730 executes, in the background of the action of the user's possessed terminal approaching the vicinity of the user-occupied area A, transmission / reception of the data that constitutes the entity of the operated object, such as moving images, still images, and text content, between the information processing apparatus 100 and the corresponding terminal.
- FIG. 20 illustrates a UI expression in which the operated object flows from the terminal owned by the user A to the corresponding user occupation area A.
- FIG. 21 shows a processing procedure of device cooperation data transmission / reception executed by the device cooperation data transmission / reception unit 730 in the form of a flowchart.
- the processing by the device cooperation data transmission / reception unit 730 starts when a terminal possessed by a certain user approaches the vicinity of his / her own user occupation area based on the signal analysis result of the detection signal by the ultra short-range communication unit 513.
- the device cooperation data transmission / reception unit 730 checks whether there is a terminal of the user who performs communication based on the signal analysis result of the detection signal by the ultra-short-range communication unit 513 (step S2101).
- when such a terminal exists (Yes in step S2101), the device cooperation data transmission / reception unit 730 determines the position of the terminal based on the signal analysis result of the detection signal of the ultra-short-range communication unit 513 (step S2102).
- the device cooperation data transmission / reception unit 730 checks whether there is data to be transmitted / received to / from the user possessing terminal (step S2103).
- when there is data to be transmitted or received (Yes in step S2103), the device cooperation data transmission / reception unit 730 displays the UI of the operated object according to the position of the terminal, in accordance with the transmission / reception processing algorithm 731 (see FIG. 20). In addition, in the background of this UI display, the device cooperation data transmission / reception unit 730 transmits / receives the data that is the substance of the operated object to / from the terminal (step S2104).
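- The flow of FIG. 21 reduces to a short routine; in this sketch nfc, ui, and transport are hypothetical interfaces standing in for the ultra-short-range communication unit 513, the display GUI, and the transmission / reception processing algorithm 731.

    def device_cooperation_step(nfc, ui, transport):
        """Sketch of the device cooperation data transmission / reception
        procedure of FIG. 21."""
        terminal = nfc.find_terminal()             # step S2101: any terminal?
        if terminal is None:
            return
        position = nfc.locate(terminal)            # step S2102: where is it?
        if not transport.has_pending_data(terminal):   # step S2103: data to move?
            return
        # UI expression of FIG. 20: the object flows in from the terminal side,
        # while the actual data moves in the background (step S2104).
        ui.show_object_flowing_from(position)
        transport.exchange(terminal)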
- the operated object acquired by the information processing apparatus 100 from the user's possession terminal is arranged in the user occupation area of the corresponding user. Furthermore, when exchanging data between users, an operation of moving the operated object between the user occupied areas may be performed.
- FIG. 22 shows a state in which the user A copies the operated object held by the user B in the user occupied area B to the user occupied area A. Alternatively, the operated object may be divided instead of being copied.
- when the operated object replicated on the screen is content such as a moving image or a still image, another piece of independent data is simply created. When the duplicated operated object is an application window, another window of that application is created, allowing joint operation between the user who originally holds the operated object and the copy-destination user.
- the information processing apparatus 100 includes the distance sensor 507 and the proximity sensor 511 and, for example when used hung on a wall as shown in the figures, can detect the distance from the main body of the apparatus 100, that is, from the screen, to the user.
- the information processing apparatus 100 also includes the touch detection unit 509, the proximity sensor 511, the camera unit 503, and the remote control receiving unit 501, and can therefore provide the user with a plurality of input means, such as touching the screen, proximity, gestures using the hands and the like, remote control, and indirect operation based on the user state.
- each input means is suited to operation at a certain distance from the main body of the information processing apparatus 100, that is, from the screen, to the user. For example, a user within a range of 50 cm from the main body of the information processing apparatus 100 can reliably operate the operated object by directly touching the screen.
- when the user is within 2 m of the main body of the information processing apparatus 100, he or she is too far away to touch the screen directly, but the image captured by the camera unit 503 can be recognized and processed to accurately capture the movement of the face and hands, so gesture input is possible.
- a user who is 2 m or more away from the main body of the information processing apparatus 100 can still perform remote control operations, because the remote control signal arrives reliably even though the accuracy of image recognition decreases.
- in addition, the optimum GUI display, such as the framework and information density of the operated objects displayed on the screen, also changes according to the distance to the user.
- in this embodiment, the information processing apparatus 100 automatically selects among the plurality of input means according to the position of the user, that is, the distance to the user, and automatically adjusts the GUI display according to the position of the user, which improves user convenience.
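- The distance thresholds given above (about 50 cm for touch, about 2 m for gesture recognition, farther for remote control) suggest a simple selection rule; the cut-off values in this sketch are taken from the examples in the text and are illustrative only.

    def select_input_means(distance_m):
        """Sketch: choose usable input means from the user's distance."""
        means = ["remote_control"]      # remote control works at any distance
        if distance_m <= 2.0:
            means.append("gesture")     # camera image recognition is reliable
        if distance_m <= 0.5:
            means.append("touch")       # the screen is within arm's reach
        return means

    # e.g. select_input_means(1.2) -> ["remote_control", "gesture"]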
- FIG. 23 shows an internal configuration for the calculation unit 120 to perform optimization processing according to the distance of the user.
- the calculation unit 120 includes a display GUI optimization unit 2310, an input means optimization unit 2320, and a distance detection method switching unit 2330.
- the display GUI optimizing unit 2310 optimizes the optimal GUI display such as the framework and information density of the operated object displayed on the screen of the display unit 603 according to the user position and the user state.
- the position of the user is obtained by the distance detection method switched by the distance detection method switching unit 2330.
- when the user's position comes close, personal authentication becomes possible through face recognition of the image captured by the camera unit 503, proximity communication with the user's possessed terminal, and the like.
- the user's state is specified based on image recognition of the captured image of the camera unit 503 and signal analysis of the distance sensor 507.
- the state of the user is roughly divided into two states, “there is a user (presence)” and “there is no user (absence)”.
- the “user is present” state is classified into two states: “user is watching TV (screen of display unit 603) (viewing)” and “user is not watching TV (non-viewing)”.
- the state where “the user is watching TV” is subdivided into two states, “the user is operating the TV (during operation)” and “the user is not operating the TV (no operation)”.
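- This three-level classification can be written down directly; the state labels below follow the table of FIG. 24A, and the boolean inputs stand in for the image recognition and sensor results.

    def classify_user_state(present, watching, operating):
        """Sketch of the user-state classification used by the display GUI
        optimization unit 2310 (cf. FIG. 24A)."""
        if not present:
            return "absence"
        if not watching:
            return "presence / non-viewing"
        return "viewing / operating" if operating else "viewing / no operation"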
- the display GUI optimizing unit 2310 refers to the device input means database in the storage unit 140 when determining the user state. Further, when optimizing the display GUI according to the determined user position and user state, the GUI display (framework / density) database and the content database in the storage unit 140 are referred to.
- FIG. 24A summarizes the display GUI optimization processing according to the user position and the user state by the display GUI optimization unit 2310 in a table.
- FIGS. 24B to 24E show screen transitions of the information processing apparatus 100 according to the user position and the user state.
- when the user is absent, the display GUI optimization unit 2310 stops the screen display of the display unit 603 and waits until the presence of a user is detected (see FIG. 24B).
- when a user is present but is not watching the TV, the display GUI optimizing unit 2310 selects "auto zapping" as the optimum display GUI (see FIG. 24C). Auto zapping displays various operated objects at random, in order to attract the user's interest and lead the user to watch the TV.
- the objects used for zapping, such as TV broadcast program content received by the TV tuner unit 170, network content acquired over the Internet via the communication unit 150, and mails and messages from other users, are a plurality of operated objects that the display GUI optimization unit 2310 selects based on the content database.
- FIG. 25A shows an example of a display GUI that is auto-zapping.
- the display GUI optimization unit 2310 may, as illustrated, change the position and size (that is, the degree of exposure) of each operated object displayed on the screen from moment to moment so as to work on the user's interest.
- further, when the user can be personally identified, the display GUI optimization unit 2310 may use the recognized personal information in selecting the objects for auto zapping.
- when the user is watching the TV but not operating it, the display GUI optimization unit 2310 likewise selects "auto zapping" as the optimum display GUI (see FIG. 24D). In this case, however, a plurality of operated objects selected based on the content database are regularly arranged, for example in columns as illustrated, so that the user can easily check each one.
- here too, when the user can be personally identified, the display GUI optimization unit 2310 may use the recognized personal information in selecting the objects for auto zapping.
- also, the display GUI optimization unit 2310 may control the information density of the display GUI according to the position of the user, for example reducing the GUI information density when the user is far away and increasing it as the user approaches.
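- One plausible density rule, sketched below, interpolates the number of displayed operated objects between a far and a near distance; the 0.5-5 m range and the linear interpolation are assumptions for illustration, not values from the text.

    def gui_object_count(distance_m, min_items=4, max_items=20):
        """Sketch: fewer, larger objects when the user is far away, more
        detail as the user approaches."""
        near, far = 0.5, 5.0                       # assumed working range (m)
        d = min(max(distance_m, near), far)
        closeness = (far - d) / (far - near)       # 1.0 when near, 0.0 when far
        return round(min_items + closeness * (max_items - min_items))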
- when operating the information processing apparatus 100, the user uses the input means optimized by the input means optimization unit 2320, which is described later.
- the optimum input means in this state are, for example, transmission of a remote control signal to the remote control receiving unit 501, gestures toward the camera unit 503, contact with the touch panel detected by the touch detection unit 509, voice input to the microphone 505, and proximity input to the proximity sensor 511.
- the display GUI optimizing unit 2310 displays the operated objects in columns as the optimum display GUI, and scrolling and selection of the operated objects can be performed in accordance with the user operation, as illustrated.
- a cursor is displayed at the position on the screen designated via the input means. Since an operated object without the cursor is considered not to be attracting the user's attention, its brightness level is lowered, as indicated by the diagonal lines in the figure, to express the contrast with the operated object being attended to. Alternatively, the cursor may be placed on the operated object # 3 that the user is touching with a fingertip. Also, when the user selects the operated object on which the cursor is placed, the operated object may be displayed in full screen (or enlarged to the maximum possible size); in FIG. 27B, the selected operated object # 3 is enlarged and displayed.
- the input means optimization unit 2320 optimizes the input means for the user to operate on the information processing apparatus 100 according to the user position and the user state.
- the position of the user is obtained by the distance detection method switched by the distance detection method switching unit 2330.
- when the user's position comes close, personal authentication becomes possible through face recognition of the image captured by the camera unit 503, proximity communication with the user's possessed terminal, and the like.
- the user's state is specified based on image recognition of the captured image of the camera unit 503 and signal analysis of the distance sensor 507.
- the input means optimization unit 2320 refers to the device input means database in the storage unit 140 when determining the user state.
- FIG. 28 is a table summarizing the input means optimization processing performed by the input means optimization unit 2320 according to the user position and the user state.
- the input means optimizing unit 2320 optimizes each input means mainly according to the position of the user.
- the input means includes, for example, remote control input to the remote control receiving unit 501, gesture input to the camera unit 503, touch input detected by the touch detection unit 509, voice input to the microphone 505, proximity input to the proximity sensor 511, and the like.
- the remote control receiving unit 501 is activated over all user positions (that is, almost always) and is waiting to receive a remote control signal.
- the input means optimization unit 2320 turns on gesture input to the camera unit 503 when the user position is in the range of several tens of centimeters to several meters.
- touch on the touch panel superimposed on the screen of the display unit 603 is limited to the range that the user's hand can reach. Therefore, the distance detection method switching unit 2330 turns on the distance detection function of the touch detection unit 509 when the user position is within several tens of centimeters. Further, since the proximity sensor 511 can detect the user up to several tens of centimeters away even without contact, the distance detection method switching unit 2330 turns on the distance detection function of the proximity sensor 511 at user positions farther away than touch input.
- likewise, the input means optimization unit 2320 turns on gesture input to the camera unit 503 when the user position is within a range of up to several meters.
- the distance detection method switching unit 2330 performs a process of switching the method in which the information processing apparatus 100 detects the distance to the user and the position of the user according to the position of the user.
- the distance detection method switching unit 2330 refers to the coverage database for each detection method in the storage unit 140 when determining the user state.
- FIG. 29 is a table summarizing the distance detection method switching process according to the user's position by the distance detection method switching unit 2330.
- the distance sensor 507 includes a simple sensor element with low power consumption, such as a PSD sensor, a pyroelectric sensor, or a simple camera. In order to constantly monitor whether or not a user exists within a radius of, for example, 5 to 10 meters from the information processing apparatus 100, the distance detection method switching unit 2330 always turns on the distance sensor 507.
- when the camera unit 503 adopts a monocular system, the image recognition unit 504 performs user motion recognition, face recognition, human body recognition, and the like based on background difference.
- since sufficient recognition accuracy is obtained from the captured image when the user position is in the range of 70 centimeters to 6 meters, the distance detection method switching unit 2330 turns on the recognition (distance detection) function of the image recognition unit 504 within this range.
- in other configurations, the image recognition unit 504 obtains sufficient recognition accuracy in a slightly closer range, from 60 centimeters to 5 meters, and the distance detection method switching unit 2330 turns on the recognition (distance detection) function of the image recognition unit 504 within that range of user positions.
- the distance detection method switching unit 2330 may turn off the camera unit 503 and the image recognition unit 504 when the user comes too close.
- touch on the touch panel superimposed on the screen of the display unit 603 is limited to the range that the user's hand can reach. Therefore, the distance detection method switching unit 2330 turns on the distance detection function of the touch detection unit 509 when the user position is within several tens of centimeters. Further, since the proximity sensor 511 can detect the user up to several tens of centimeters away even without contact, the distance detection method switching unit 2330 turns on its distance detection function at user positions farther away than touch input.
- a distance detection method that covers distant positions, beyond several meters or ten meters, serves to confirm the existence of the user and must always be turned on; it is therefore preferable to use a device with low power consumption for it.
- conversely, a distance detection method that covers short distances, within about 1 meter, can obtain high-density information and realize recognition functions such as face recognition and human body recognition, but the recognition processing consumes considerable power; it is therefore preferable to turn the function off at distances where sufficient recognition accuracy cannot be obtained.
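- Summarizing FIG. 29 as code: the low-power distance sensor stays on permanently, while the power-hungry detectors are enabled only inside the ranges where they are accurate. The numeric ranges below follow the examples in the text; treat them as illustrative.

    def active_detection_methods(distance_m):
        """Sketch of power-aware distance detection switching (cf. FIG. 29)."""
        active = ["distance_sensor"]        # low power, always on (5-10 m watch)
        if 0.7 <= distance_m <= 6.0:
            active.append("camera_image_recognition")   # accurate in this band
        if distance_m <= 0.5:
            active.append("proximity_sensor")   # several tens of centimeters
        if distance_m <= 0.3:
            active.append("touch_panel")        # only within arm's reach
        return active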
- when an image of an actual object is displayed on the screen without considering its real size information, the size of the displayed object varies according to the size and resolution (dpi) of the screen.
- for example, the width a′ of a bag having a real width of a centimeters displayed on a 32-inch monitor differs from its width a″ displayed on a 50-inch monitor (a ≠ a′ ≠ a″) (see FIG. 30).
- also, when images of a plurality of objects are displayed simultaneously on the same monitor screen, the size relationship between the objects cannot be displayed correctly unless the real size information of each object is taken into consideration. For example, if a bag with a width of a centimeters and a pouch with a width of b centimeters are displayed simultaneously on the same monitor screen, the bag is displayed with width a′ while the pouch is displayed with width b′, and the magnitude relationship between the two is not correctly reproduced (a : b ≠ a′ : b′) (see FIG. 31).
- in contrast, the information processing apparatus 100 manages the real size information of the object to be displayed together with the screen size and resolution (pixel pitch) of the display unit 603, so that even if the object or the screen size changes, the image of the object is always displayed at its real size on the screen.
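- The arithmetic behind real-size display is simple: the required number of pixels is the object's real size divided by the monitor's pixel pitch. A minimal sketch with a worked example (monitor widths computed for 16:9 full-HD panels):

    def real_size_pixels(real_width_mm, screen_width_mm, horizontal_pixels):
        """Pixels needed so that an object appears at its real width."""
        pixel_pitch_mm = screen_width_mm / horizontal_pixels
        return real_width_mm / pixel_pitch_mm

    # A 300 mm wide bag on a 32-inch full-HD monitor (width ~708 mm, 1920 px)
    # needs ~813 px; on a 50-inch full-HD monitor (width ~1107 mm) only ~520 px.
    # The pixel count differs, but the physical width on screen stays 300 mm.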
- FIG. 32 shows an internal configuration for the calculation unit 120 to perform real size display processing of an object according to the monitor performance.
- the calculation unit 120 includes a real size display unit 3210, a real size estimation unit 3220, and a real size extension unit 3230. It is also assumed that at least one of these functional blocks may be realized on a cloud server connected via the communication unit 150.
- when images of a plurality of objects are displayed simultaneously on the same monitor screen, the real size display unit 3210 takes the real size information of each object into consideration and always displays the objects at their real size, in accordance with the screen size and resolution (pixel pitch) of the display unit 603.
- the real size display unit 3210 correctly displays the magnitude relationship between objects when simultaneously displaying images of a plurality of objects on the screen of the display unit 603.
- the real size display unit 3210 reads monitor specifications such as the screen size and resolution (pixel pitch) of the display unit 603 from the storage unit 140. In addition, the real size display unit 3210 acquires a monitor state such as a screen orientation and a tilt of the display unit 603 from the rotation / attachment mechanism unit 180.
- the real size display unit 3210 reads the image of the object to be displayed from the object image database in the storage unit 140, and also reads the real size information of the object from the object real size database.
- the object image database and the object real size database may instead reside on a database server connected via the communication unit 150.
- in this way, the information processing apparatus 100 realizes real-size display of objects and displays a plurality of sample images with the correct size relationship, so the user can try products accurately and the possibility of erroneous product selection is reduced.
- the real size estimation unit 3220 estimates the real size of an object, such as a person photographed by the camera unit 503, whose real size information cannot be obtained even by referring to the object real size database.
- for example, when the object whose real size is to be estimated is the user's face, the real size estimation unit 3220 estimates the user's real size based on user face data, such as the size, age, and orientation of the face obtained by the image recognition unit 504 from the image captured by the camera unit 503, and on the user's position obtained by the distance detection method switched to by the distance detection method switching unit 2330.
- the estimated real size information of the user serves as feedback to the real size display unit 3210 and is stored in, for example, an object image database.
- the real size information estimated from the user's face data is then used by the real size display unit 3210 for subsequent real-size display according to the monitor performance.
- for example, when a baby is photographed as a subject, the real size estimation unit 3220 estimates its real size based on the face data. Thereafter, even if the user attempts to enlarge the operated object by a touch operation or the like, the object is not enlarged beyond the real size of the subject, as shown in FIG. 37B; that is, the image of the baby is not enlarged unnaturally, and the reality of the image is maintained.
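- A common way to estimate a subject's real size from face data and distance is the pinhole-camera relation below; this is an illustrative model consistent with the description, not necessarily the estimator used by the real size estimation unit 3220.

    def estimate_real_width_mm(width_px, distance_mm, focal_length_px):
        """Pinhole model: real width = image width * distance / focal length,
        with the focal length expressed in pixels."""
        return width_px * distance_mm / focal_length_px

    # Example: a face spanning 100 px at 1.5 m with a 1000 px focal length
    # is estimated at 100 * 1500 / 1000 = 150 mm, a plausible face width.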
- also, when video content is displayed in parallel or superimposed with other images, the content video is normalized based on the estimated real size, so that a harmonious parallel or superimposed display can be realized.
- the real size extension unit 3230 extends the real-size display of objects realized on the screen of the display unit 603 by the real size display unit 3210 to 3D, that is, to include the depth direction.
- when 3D display is performed by the binocular method, or by the light ray reproduction method applied only in the horizontal direction, the desired effect is obtained only at the viewing position assumed at the time of 3D video generation.
- with the omnidirectional light ray reproduction method, the real size display can be observed from any position.
- the real size extension unit 3230 also detects the user's viewpoint position and corrects the 3D image for that position, in both the binocular method and the horizontal-only light ray reproduction method, so that the same real-size display can be obtained from any position.
- for such viewpoint-position correction, see, for example, JP-A-2002-300602, JP-A-2005-149127, and JP-A-2005-142957, which have already been assigned to the present applicant.
- regarding simultaneous display of image groups: in the information processing apparatus 100, video content from a plurality of sources may be displayed simultaneously on the same screen, in parallel or superimposed form.
- for example, when the user takes a lesson such as yoga, the instructor's video and the user's own image captured by the camera unit 503 can be displayed at the same time, and when trying a product, the sample image of the product and the user's image captured by the camera unit 503 can be displayed superimposed.
- if the magnitude relationship between the simultaneously displayed images is not correct, the user cannot properly use the displayed video. For example, if face sizes and positions vary between users in a video chat (FIG. 38A), the sense of facing the chat partner is impaired and the conversation does not flow. Also, if the size and position of the user's figure and the instructor's figure are not aligned (FIG. 39A), it is difficult for the user to discern the difference between his or her movement and the instructor's movement, the points to correct or improve are not apparent, and the lesson becomes less effective.
- in contrast, the information processing apparatus 100 normalizes the images, using information such as image scale and corresponding regions, when displaying video content from a plurality of sources in parallel or superimposed form.
- in the normalization, digital image processing such as digital zoom is applied to still images, moving images, and other digital image data, or optical control such as panning, tilting, and zooming is performed on the actual camera.
- image normalization can be easily realized using information such as face size, age, and orientation obtained by face recognition, and information such as body shape and size obtained by person recognition.
- further, mirroring or rotation processing may be automatically performed on one image to facilitate the correspondence with the other image.
- FIG. 38B shows a state in which face sizes and positions are aligned among users who are chatting video by the normalization processing between a plurality of images.
- FIG. 39B shows a state in which the size and position of the user's figure and the instructor's figure displayed in parallel on the screen are aligned by the normalization process between a plurality of images.
- similarly, by the normalization processing between multiple images, the sample image of a product is displayed at an appropriate place, with the correct size relationship, relative to the video of the user, who poses as if holding the product.
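- The core of the inter-image normalization can be sketched as a scale-and-shift that makes a recognized feature (a face, or a whole body) match in size and position across the two images. The image API (scale, translate) here is hypothetical:

    def normalize_to_reference(image_b, feat_a, feat_b):
        """Sketch: align image B to image A using one recognized feature.
        feat_a, feat_b are (center_x, center_y, width_px) of the feature
        (e.g. a face) in images A and B."""
        scale = feat_a[2] / feat_b[2]              # equalize feature size
        image_b = image_b.scale(scale)
        dx = feat_a[0] - feat_b[0] * scale         # then equalize position
        dy = feat_a[1] - feat_b[1] * scale
        return image_b.translate(dx, dy)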
- FIG. 41 shows an internal configuration for the calculation unit 120 to perform image normalization processing.
- the calculation unit 120 includes an inter-image normalization processing unit 4110, a face normalization processing unit 4120, and a real size expansion unit 4130.
- it is assumed that at least one of the functional blocks among the inter-image normalization processing unit 4110, the face normalization processing unit 4120, and the real size extension unit 4130 may be realized on a cloud server connected via the communication unit 150.
- the inter-image normalization processing unit 4110 performs normalization processing so that the magnitude relationship between the user's image and other objects is correctly displayed across a plurality of images.
- the inter-image normalization processing unit 4110 inputs the user image captured by the camera unit 503 through the input interface integration unit 520. At this time, camera information such as pan, tilt, and zoom of the camera unit 503 at the time of shooting by the user is also acquired.
- also, the inter-image normalization processing unit 4110 obtains the image of the other object to be displayed in parallel or superimposed with the user's image, and acquires from the image database the parallel / superimposition pattern according to which the user's image and the image of the other object are arranged.
- the image database may exist in the storage unit 140 or may exist on a database server accessed through the communication unit 150.
- according to the normalization algorithm, the inter-image normalization processing unit 4110 performs image processing such as enlargement, rotation, and mirroring on the user's image so that its magnitude relationship and posture relative to the other objects are correct, or generates camera control information for controlling the camera unit 503, such as panning, tilting, and zooming, so as to capture an appropriate image of the user.
- through the processing of the inter-image normalization processing unit 4110, the user's image is displayed so that its magnitude relationship with the images of the other objects is correct, as shown for example in FIG. 40B.
- the face normalization processing unit 4120 performs normalization so that the face image of the user captured by the camera unit 503 is displayed in the correct magnitude relationship with a face image in another operated object (for example, the instructor's face in an image reproduced from a recording medium, or the face of the other user in a video chat).
- the face normalization processing unit 4120 inputs the user image captured by the camera unit 503 through the input interface integration unit 520. At this time, camera information such as pan, tilt, and zoom of the camera unit 503 at the time of shooting by the user is also acquired. In addition, the face normalization processing unit 4120 acquires a face image in another operated object to be displayed in parallel or superimposed with the captured user image through the storage unit 140 or the communication unit 150.
- then, the face normalization processing unit 4120 performs image processing such as enlargement, rotation, and mirroring on the user's image so that the size relationship between the face images becomes correct, or generates camera control information for controlling the camera unit 503, such as pan, tilt, and zoom, so as to capture an appropriate image of the user.
- the real size extension unit 4130 extends the parallel or superimposed display of a plurality of images, realized on the screen of the display unit 603 by the inter-image normalization processing unit 4110 and the face normalization processing unit 4120, to 3D, that is, to include the depth direction.
- when 3D display is performed by the binocular method, or by the light ray reproduction method applied only in the horizontal direction, the desired effect is obtained only at the viewing position assumed at the time of 3D video generation.
- with the omnidirectional light ray reproduction method, the real size display can be observed from any position.
- the real size extension unit 4130 also detects the user's viewpoint position and corrects the 3D image for that position, in both the binocular method and the horizontal-only light ray reproduction method, so that the same display can be obtained from any position.
- for such viewpoint-position correction, see, for example, JP-A-2002-300602, JP-A-2005-149127, and JP-A-2005-142957, which have already been assigned to the present applicant.
- the information processing apparatus 100 main body is attached in a rotatable and detachable state on the wall surface by, for example, the rotation / attachment mechanism unit 180.
- when the main body of the information processing apparatus 100 is rotated while it is powered on, that is, while video content is displayed on the display unit 603, the displayed content is rotated so that the user can observe it in the correct posture.
- as display forms of the video content at an arbitrary rotation angle of the screen and during the transition process, there are three: (1) a display form in which the entire area of the video content is displayed at every rotation angle so that no part of it is cut off, (2) a display form that maximizes the attention area in the content, and (3) a display form that rotates the video content so that no invalid area appears.
- FIG. 42 illustrates the display form in which the entire area of the video content is displayed, so that no part of the content is cut off at any rotation angle, while the information processing apparatus 100 (the screen) is rotated 90 degrees counterclockwise. As shown, when horizontally long video content displayed on a landscape screen is rotated 90 degrees counterclockwise into portrait orientation, the video content is reduced, and invalid areas, shown in black, appear on the screen. The video content becomes smallest partway through the transition of the screen from landscape to portrait.
- if part of the video content is cut off, the identity of the video content as a copyrighted work may be lost. The display form shown in FIG. 42 always guarantees the identity of the content as a copyrighted work, at any rotation angle and throughout the transition process, and can therefore be said to be a display form suitable for protected content.
- FIG. 43 illustrates the display form in which the attention area in the video content is maximized at each rotation angle while the information processing apparatus 100 (the screen) is rotated 90 degrees counterclockwise.
- in the illustrated example, the area containing the subject surrounded by the dotted line in the video content is set as the attention area, and the attention area is displayed at its maximum at each rotation angle. Since the attention area is vertically long, rotating from landscape to portrait enlarges the video content, and during the transition the attention area expands to its maximum along the diagonal of the screen. In this display form as well, invalid areas, shown in black, appear on the screen during the transition from landscape to portrait.
- FIG. 44 illustrates a display form in which video content is rotated so that there is no invalid area while the information processing apparatus 100 (screen) is rotated 90 degrees counterclockwise.
- FIG. 45 shows the relationship of the zoom ratio of the video content to the rotation position in each of the display forms shown in FIGS. 42 to 44.
- in the display form shown in FIG. 42, in which no part of the video content is cut off at any rotation angle, the content can be protected, but large invalid areas are generated during the transition.
- also, since the video is reduced during the transition, there is a concern that the user may feel it unnatural.
- in the display form shown in FIG. 43, which maximizes the attention area in the video content at each rotation angle, the attention area is displayed smoothly during the transition in which the screen rotates, but invalid areas still occur.
- in the display form shown in FIG. 44, which rotates the video content so that no invalid area appears, no invalid area occurs during the transition, but the video content is greatly enlarged partway through, which may give an unnatural impression to the observing user.
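- The zoom curve of FIG. 45 for the no-invalid-area form can be reproduced under a simple assumption: the content always circumscribes the screen rectangle, and content and screen coincide at 0 degrees. The required zoom for a w x h screen at rotation angle theta is then:

    import math

    def cover_scale(w, h, theta_rad):
        """Smallest zoom letting a w x h image, rotated by theta, still cover
        the whole w x h screen (no invalid area). 1.0 at 0 and 180 degrees,
        peaking partway through the transition."""
        c, s = abs(math.cos(theta_rad)), abs(math.sin(theta_rad))
        return max((w * c + h * s) / w, (w * s + h * c) / h)

    # For a 16:9 screen: cover_scale(16, 9, math.pi / 4) is about 1.96 at
    # 45 degrees, and 16/9 (about 1.78) at 90 degrees.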
- FIG. 46 shows, in the form of a flowchart, a processing procedure for controlling the display form of the video content in the arithmetic unit 120 when the information processing apparatus 100 (screen of the display unit 603) is rotated.
- this processing procedure starts, for example, in response to the rotation / attachment mechanism unit 180 detecting that the main body of the information processing apparatus 100 is rotating, or the three-axis sensor 515 detecting a change in the rotational position of the main body.
- when the information processing apparatus 100 (the screen of the display unit 603) is rotated, the calculation unit 120 first acquires the attribute information of the video content displayed on the screen (step S4601), and then checks whether the video content displayed on the screen is content protected by copyright or the like (step S4602).
- when the video content displayed on the screen is protected content (Yes in step S4602), the calculation unit 120 selects the display form shown in FIG. 42, in which the entire area of the video content is displayed so that no part of it is cut off at any rotation angle (step S4603).
- when the video content displayed on the screen is not protected by copyright or the like (No in step S4602), the calculation unit 120 subsequently checks whether there is a display form designated by the user (step S4604).
- when the user has selected the display form that shows the entire area of the video content, the process proceeds to step S4603. When the user has selected the display form that maximizes the attention area, the process proceeds to step S4605. When the user has selected the display form with no invalid area, the process proceeds to step S4606. When the user has not selected any display form, the default display form is selected from the above three.
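- The decision flow of FIG. 46 (steps S4601 to S4606) condenses into a few lines; content.is_protected and the form names below are hypothetical stand-ins for the attribute check and the three display forms.

    def choose_display_form(content, user_choice=None, default="full_area"):
        """Sketch of FIG. 46. Returns "full_area" (FIG. 42),
        "max_attention_area" (FIG. 43), or "no_invalid_area" (FIG. 44)."""
        # steps S4601-S4602: protected content always keeps its whole area
        if content.is_protected:
            return "full_area"                     # step S4603
        # step S4604: honor an explicit user designation if there is one
        if user_choice in ("full_area", "max_attention_area", "no_invalid_area"):
            return user_choice                     # steps S4603/S4605/S4606
        return default                             # fall back to the default form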
- FIG. 47 shows an internal configuration for the calculation unit 120 to perform processing for adjusting the display form of the video content in an arbitrary rotation angle of the information processing apparatus 100 main body and its transition process.
- the calculation unit 120 includes a display form determination unit 4710, a rotation position input unit 4720, and an image processing unit 4730, and adjusts the display form of the video content reproduced from the received TV broadcast or media.
- the display form determination unit 4710 determines the display form when rotating the video content in the arbitrary rotation angle of the information processing apparatus 100 main body and the transition process thereof according to the processing procedure shown in FIG.
- the rotation position input unit 4720 inputs, via the input interface integration unit 520, the rotational position of the main body of the information processing apparatus 100 (or of the screen of the display unit 603) obtained by the rotation / attachment mechanism unit 180 and the three-axis sensor 515.
- the image processing unit 4730 processes the image according to the display form determined by the display form determination unit 4710, so that the video content reproduced from the received TV broadcast or from media fits the screen of the display unit 603 tilted by the rotation angle input to the rotation position input unit 4720.
- the technology disclosed in the present specification can also be configured as follows.
- (101) An information processing apparatus comprising: a display unit; a user detection unit that detects a user existing around the display unit; and a calculation unit that processes an operated object displayed on the display unit in response to detection of the user by the user detection unit.
- (102) The information processing apparatus according to (101), wherein the user detection unit includes a proximity sensor disposed at each of the four side edges of the screen of the display unit and detects a user existing near each side edge.
- (103) The information processing apparatus according to (101), wherein the calculation unit sets, in the screen of the display unit, a user-occupied area for each detected user and a common area shared among the users, according to the arrangement of the users detected by the user detection unit.
- the calculation unit rotates the operated object according to the position operated by the user relative to the centroid position of the operated object.
- the calculation unit displays a detection indicator indicating that a user is newly detected when setting the user-occupied area of the user newly detected by the user detection unit in the screen of the display unit.
- the information processing apparatus according to (104) further including a data transmission / reception unit that transmits / receives data to / from a terminal owned by the user.
- the data transmission / reception unit performs data transmission / reception processing with a terminal possessed by a user detected by the user detection unit, and the calculation unit obtains an operated object corresponding to the data received from the terminal possessed by the user.
- (113) The information processing apparatus according to (112), wherein the calculation unit displays a copy of the operated object, created as separate data, in the user-occupied area of the movement destination.
- An information processing apparatus comprising: a display unit; a user position detection unit that detects the position of a user with respect to the display unit; a user state detection unit that detects the state of the user with respect to the display screen of the display unit; and a calculation unit that controls the GUI displayed on the display unit according to the user position detected by the user position detection unit and the user state detected by the user state detection unit.
- the calculation unit controls the framework or information density of one or more operated objects to be operated by the user displayed on the screen of the display unit according to the position of the user and the state of the user.
- the calculation unit controls the framework of the operated objects displayed on the screen according to whether or not a user is present.
- An information processing apparatus comprising: a display unit; one or more input means for a user to operate an operated object displayed on the screen of the display unit; a user position detection unit that detects the position of the user with respect to the display unit; a user state detection unit that detects the state of the user with respect to the display screen of the display unit; and a calculation unit that optimizes the input means according to the user position detected by the user position detection unit and the user state detected by the user state detection unit.
- the information processing apparatus wherein the calculation unit controls optimization of the input means according to whether or not the user is viewing the screen of the display unit. (209) The information processing apparatus according to (207), wherein the calculation unit optimizes the input means according to the position of the user detected by the user position detection unit while the user is viewing the screen of the display unit.
- (210) An information processing apparatus comprising: a display unit; a user position detection unit that detects the position of the user with respect to the display unit; a plurality of distance detection methods for detecting the distance from the screen of the display unit to the user; and a calculation unit that controls switching among the distance detection methods according to the user position detected by the user position detection unit.
- An information processing method comprising: a user position detection step of detecting the position of a user with respect to a display screen; a user state detection step of detecting the state of the user with respect to the display screen; and a calculation step of controlling the GUI displayed on the display screen according to the user position detected in the user position detection step and the user state detected in the user state detection step.
- An information processing method comprising: a user position detection step of detecting the position of a user with respect to a display screen; a user state detection step of detecting the state of the user with respect to the display screen; and a calculation step of optimizing one or more input means for the user to operate an operated object displayed on the display screen, according to the user position detected in the user position detection step and the user state detected in the user state detection step.
- (215) An information processing method comprising: a user position detection step of detecting the position of the user with respect to the display screen; and a calculation step of controlling switching among a plurality of distance detection methods for detecting the distance from the display screen to the user, according to the user position detected in the user position detection step.
- A computer program written in a computer-readable format so as to cause a computer to function as: a display unit; a user position detection unit that detects the position of a user with respect to the display unit; a user state detection unit that detects the state of the user with respect to the display screen of the display unit; and a calculation unit that controls the GUI displayed on the display unit according to the user position detected by the user position detection unit and the user state detected by the user state detection unit.
- (217) A computer program written in a computer-readable format so as to cause a computer to function as: a display unit; one or more input means for a user to operate an operated object displayed on the screen of the display unit; a user position detection unit that detects the position of the user with respect to the display unit; a user state detection unit that detects the state of the user with respect to the display screen of the display unit; and a calculation unit that optimizes the input means according to the user position detected by the user position detection unit and the user state detected by the user state detection unit.
- (218) A computer program written in a computer-readable format so as to cause a computer to function as: a display unit; a user position detection unit that detects the position of the user with respect to the display unit; a plurality of distance detection methods for detecting the distance from the screen of the display unit to the user; and a calculation unit that controls switching among the distance detection methods according to the user position detected by the user position detection unit.
- An information processing apparatus comprising: a display unit; an object image acquisition unit that acquires an image of an object to be displayed on the screen of the display unit; a real size acquisition unit that acquires information on the real size of the object displayed on the screen of the display unit; and a calculation unit that processes the image of the object based on the real size of the object acquired by the real size acquisition unit.
- The information processing apparatus according to (301), further comprising a display performance acquisition unit that acquires information on display performance including the screen size and resolution of the display unit, wherein the calculation unit performs processing so that the image of the object is displayed at its real size on the screen of the display unit, based on the real size of the object acquired by the real size acquisition unit and the display performance acquired by the display performance acquisition unit.
- the calculation unit, when simultaneously displaying images of a plurality of objects acquired by the object image acquisition unit on the screen of the display unit, displays the magnitude relationship between the images of the plurality of objects correctly.
- The information processing apparatus according to (301), further comprising: a camera unit; an image recognition unit that recognizes the user's face included in the image captured by the camera unit and acquires face data; a distance detection unit that detects the distance to the user; and a real size estimation unit that estimates the real size of the user's face based on the user's face data and the distance to the user.
- An information processing method comprising: an object image acquisition step of acquiring an image of an object to be displayed on a screen; a real size acquisition step of acquiring information on the real size of the object displayed on the screen; and a calculation step of processing the image of the object based on the real size of the object acquired in the real size acquisition step.
- a display unit an object image acquisition unit that acquires an image of an object to be displayed on the screen of the display unit, a real size acquisition unit that acquires information on a real size of the object displayed on the screen of the display unit, and the real A computer program written in a computer-readable format for causing a computer to function as an arithmetic unit that processes an image of the object based on the real size of the object acquired by a size acquisition unit.
- (401) An information processing apparatus comprising: a camera unit; a display unit; and a computation unit that normalizes the user's image captured by the camera unit when displaying it on the screen of the display unit.
- (402) The information processing apparatus according to (401), further comprising: an object image acquisition unit that acquires an image of an object to be displayed on the screen of the display unit; and a parallel/superimposition pattern acquisition unit that acquires a parallel/superimposition pattern for displaying the user's image and the object's image side by side or superimposed on the screen of the display unit, wherein the computation unit normalizes the user's image and the object so that their size relationship and positions are correct, and displays the normalized user's image and object side by side or superimposed according to the acquired parallel/superimposition pattern.
- (403) The information processing apparatus according to (402), wherein the computation unit controls the camera unit so as to normalize the user's image captured by the camera unit.
- (404) The information processing apparatus according to (401), further comprising: a user face data acquisition unit that acquires face data of the user captured by the camera unit; and an in-object face data acquisition unit that acquires face data within an object displayed on the screen of the display unit, wherein the computation unit performs normalization so that the size relationship and positions of the user's face data and the face data within the object are correct.
- (405) The information processing apparatus according to (404), wherein the computation unit controls the camera unit so as to normalize the user's image captured by the camera unit.
- (406) An information processing method comprising: an object image acquisition step of acquiring an image of an object to be displayed on a screen; a parallel/superimposition pattern acquisition step of acquiring a parallel/superimposition pattern for displaying a user's image captured by a camera unit and the object's image side by side or superimposed on the screen; a normalization step of normalizing the user's image and the object so that their size relationship and positions are correct; and an image processing step of displaying the normalized user's image and object side by side or superimposed according to the acquired parallel/superimposition pattern.
- (407) An information processing method comprising: a user face data acquisition step of acquiring face data of a user captured by a camera unit; an in-object face data acquisition step of acquiring face data within an object displayed on a screen; and a normalization step of normalizing so that the size relationship and positions of the user's face data and the face data within the object are correct.
- (408) A computer program written in a computer-readable format so as to cause a computer to function as: a camera unit; a display unit; and a computation unit that normalizes the user's image captured by the camera unit when displaying it on the screen of the display unit.
- (501) An information processing apparatus comprising: a display unit that displays video content on a screen; a rotation angle detection unit that detects the rotation angle of the screen; a display form determination unit that determines the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and an image processing unit that processes the video content in accordance with the display form determined by the display form determination unit so that the video content fits the screen tilted by the rotation angle detected by the rotation angle detection unit.
- (502) The information processing apparatus according to (501), wherein the display form determination unit selects from among a plurality of display forms including three display forms: a display form in which the video content is never cut off at any rotation angle; a display form in which the content of interest within the video content is maximized at each rotation angle; and a display form in which the video content is rotated so that no invalid area remains.
- (503) The information processing apparatus according to (501), wherein the display form determination unit determines the display form at an arbitrary rotation angle of the screen and in its transition process based on attribute information of the video content.
- (504) The information processing apparatus according to (501), wherein the display form determination unit determines, for protected video content, a display form in which the video content is never cut off at any rotation angle.
- (505) An information processing method comprising: a rotation angle detection step of detecting the rotation angle of a screen that displays video content; a display form determination step of determining the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and an image processing step of processing the video content in accordance with the display form determined in the display form determination step so that the video content fits the screen tilted by the rotation angle detected in the rotation angle detection step.
- (506) A computer program written in a computer-readable format so as to cause a computer to function as: a display unit that displays video content on a screen; a rotation angle detection unit that detects the rotation angle of the screen; a display form determination unit that determines the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and an image processing unit that processes the video content in accordance with the display form determined by the display form determination unit so that the video content fits the screen tilted by the rotation angle detected by the rotation angle detection unit.
- The information processing apparatus 100 to which the technology disclosed in this specification is applied has been described above, focusing on an embodiment that assumes a TV receiver having a large screen. However, the gist of the technology disclosed in this specification is not limited to this.
- The technology disclosed in this specification can be applied in the same way to information processing apparatuses other than TV receivers, such as personal computers and tablet terminals, and to information processing apparatuses having small screen sizes.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Information Transfer Between Computers (AREA)
- Transforming Electric Information Into Light Information (AREA)
Abstract
Description
An information processing apparatus comprising:
a display unit that displays video content on a screen;
a rotation angle detection unit that detects the rotation angle of the screen;
a display form determination unit that determines the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and
an image processing unit that processes the video content in accordance with the display form determined by the display form determination unit so that the video content fits the screen tilted by the rotation angle detected by the rotation angle detection unit.
An information processing method comprising:
a rotation angle detection step of detecting the rotation angle of a screen that displays video content;
a display form determination step of determining the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and
an image processing step of processing the video content in accordance with the display form determined in the display form determination step so that the video content fits the screen tilted by the rotation angle detected in the rotation angle detection step.
A computer program written in a computer-readable format so as to cause a computer to function as:
a display unit that displays video content on a screen;
a rotation angle detection unit that detects the rotation angle of the screen;
a display form determination unit that determines the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and
an image processing unit that processes the video content in accordance with the display form determined by the display form determination unit so that the video content fits the screen tilted by the rotation angle detected by the rotation angle detection unit.
The information processing apparatus 100 according to this embodiment has a large screen, and its main usage forms are assumed to be "Wall", hung on a wall as shown in Fig. 1, and "Tabletop", installed on a table as shown in Fig. 2.
The information processing apparatus 100 allows simultaneous operation by multiple users on the large screen. Specifically, proximity sensors 511 that detect the presence or state of a user are provided at each of the four side edges of the large screen, and by setting user-occupied areas and a common area within the screen according to the arrangement of the users, comfortable and efficient simultaneous operation by multiple users becomes possible.
The information processing apparatus 100 includes a distance sensor 507 and proximity sensors 511, and can detect the distance from the body of the apparatus 100, that is, from the screen, to the user, for example during wall-hung use as shown in Figs. 1 and 3.
A conventional object display system displays the image of a real object on the screen without considering its real-size information. As a result, the displayed size of the object varies with the size and resolution (dpi) of the screen. For example, the width a' of a bag of width a centimeters displayed on a 32-inch monitor differs from its width a'' when displayed on a 50-inch monitor (a ≠ a' ≠ a'') (see Fig. 30).
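The correction needed here follows directly from this observation: dividing the physical size by 2.54 converts centimeters to inches, and multiplying by the monitor's pixel density (dpi) gives pixels. Below is a minimal Python sketch of this conversion; the function name and the dpi figures (approximate values for 32-inch and 50-inch 1920x1080 panels) are illustrative, not taken from the specification:

```python
def real_size_scale(real_width_cm: float,
                    image_width_px: int,
                    screen_dpi: float) -> float:
    """Scale factor that makes an object of known physical width
    appear at that width on a screen of the given pixel density,
    assuming the object spans the full width of its source image."""
    target_px = (real_width_cm / 2.54) * screen_dpi  # cm -> inch -> px
    return target_px / image_width_px

# A bag a = 40 cm wide, photographed 1200 px across, on two FHD panels:
for name, dpi in [("32-inch", 68.8), ("50-inch", 44.1)]:
    s = real_size_scale(40.0, 1200, dpi)
    print(f"{name}: draw the image {1200 * s:.0f} px wide")
```

With this scaling applied, the bag occupies more pixels on the denser 32-inch panel and fewer on the 50-inch panel, but measures 40 cm on both.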
In a display system, video content from multiple sources may be displayed on the same screen at the same time, side by side or superimposed. Examples include: (1) video chat between multiple users; (2) during a lesson such as yoga, simultaneously displaying the instructor's video played back from a recording medium such as a DVD (or streamed over a network) and the user's own video captured by the camera unit 503; and (3) in online shopping, superimposing a sample image of a product on the user's own video captured by the camera unit 503 for fitting.
As already described, the body of the information processing apparatus 100 according to this embodiment is attached to the wall surface by, for example, the rotation/attachment mechanism unit 180 in a rotatable and detachable state. When the body is rotated while the apparatus 100 is powered on, that is, while operated objects are displayed on the display unit 603, the operated objects are rotated accordingly so that the user can observe them in the correct orientation.
The technology disclosed in this specification can also be configured as follows.
(101) An information processing apparatus comprising: a display unit; a user detection unit that detects a user present around the display unit; and a computation unit that processes an operated object to be displayed on the display unit in response to the user detection unit detecting a user.
(102) The information processing apparatus according to (101), wherein the user detection unit has proximity sensors arranged at each of the four side edges of the screen of the display unit and detects a user present near each side edge.
(103) The information processing apparatus according to (101), wherein the computation unit sets, within the screen of the display unit, a user-occupied area for each detected user and a common area shared among users, according to the arrangement of the users detected by the user detection unit (a sketch of such a partition follows this enumeration).
(104) The information processing apparatus according to (103), wherein the computation unit displays, on the screen of the display unit, one or more operated objects to be operated by the users.
(105) The information processing apparatus according to (104), wherein the computation unit optimizes an operated object within a user-occupied area.
(106) The information processing apparatus according to (104), wherein the computation unit rotates an operated object within a user-occupied area so that it directly faces the corresponding user.
(107) The information processing apparatus according to (104), wherein the computation unit rotates an operated object that has moved into a user-occupied area from the common area or from another user's occupied area so that it directly faces the corresponding user.
(108) The information processing apparatus according to (107), wherein, when a user drags an operated object from one area to another, the computation unit controls the direction in which the operated object is rotated according to the position the user operated relative to the position of the operated object's center of gravity.
(109) The information processing apparatus according to (103), wherein the computation unit displays a detection indicator indicating that a new user has been detected when setting, within the screen of the display unit, the user-occupied area of a user newly detected by the user detection unit.
(110) The information processing apparatus according to (104), further comprising a data transmission/reception unit that transmits and receives data to and from a terminal carried by a user.
(111) The information processing apparatus according to (110), wherein the data transmission/reception unit performs data transmission/reception processing with a terminal carried by a user detected by the user detection unit, and the computation unit causes an operated object corresponding to data received from the user's terminal to appear in the corresponding user-occupied area.
(112) The information processing apparatus according to (104), wherein the computation unit duplicates or divides an operated object into the destination user-occupied area in response to the operated object being moved between the user-occupied areas of different users.
(113) The information processing apparatus according to (112), wherein the computation unit displays, in the destination user-occupied area, a copy of the operated object created as separate data.
(114) The information processing apparatus according to (112), wherein the computation unit displays, in the destination user-occupied area, a copy of the operated object serving as another window of an application that users can operate jointly.
(115) An information processing method comprising: a user detection step of detecting a user present in the vicinity; and a computing step of processing an operated object to be displayed in response to detecting the user in the user detection step.
(116) A computer program written in a computer-readable format so as to cause a computer to function as: a display unit; a user detection unit that detects a user present around the display unit; and a computation unit that processes an operated object to be displayed on the display unit in response to the user detection unit detecting a user.
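As a concrete illustration of the area setting in configuration (103), here is a minimal Python sketch under the simplifying assumption that each detected user stands at one of the four side edges and receives a fixed-depth strip along that edge; the names, the strip depth, and the handling of corner overlaps are all hypothetical, not taken from the specification:

```python
from dataclasses import dataclass

@dataclass
class Region:
    x: int
    y: int
    w: int
    h: int

def partition(screen_w: int, screen_h: int,
              edges: list[str], strip: int = 200) -> dict:
    """Assign each user detected at a side edge ('top', 'bottom',
    'left', 'right') an occupied strip of depth `strip` px along
    that edge; whatever remains in the middle is the common area.
    Overlaps between strips at the corners are ignored here."""
    occupied = {}
    x0, y0, x1, y1 = 0, 0, screen_w, screen_h
    for edge in edges:
        if edge == "top":
            occupied[edge] = Region(0, 0, screen_w, strip)
            y0 = max(y0, strip)
        elif edge == "bottom":
            occupied[edge] = Region(0, screen_h - strip, screen_w, strip)
            y1 = min(y1, screen_h - strip)
        elif edge == "left":
            occupied[edge] = Region(0, 0, strip, screen_h)
            x0 = max(x0, strip)
        elif edge == "right":
            occupied[edge] = Region(screen_w - strip, 0, strip, screen_h)
            x1 = min(x1, screen_w - strip)
    return {"occupied": occupied, "common": Region(x0, y0, x1 - x0, y1 - y0)}

# Two users, one at the bottom edge and one at the left edge:
print(partition(3840, 2160, ["bottom", "left"]))
```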
(201) An information processing apparatus comprising: a display unit; a user position detection unit that detects the position of a user with respect to the display unit; a user state detection unit that detects the state of the user with respect to the display screen of the display unit; and a computation unit that controls the GUI displayed on the display unit according to the user position detected by the user position detection unit and the user state detected by the user state detection unit.
(202) The information processing apparatus according to (201), wherein the computation unit controls, according to the user's position and state, the framework or the information density of one or more operated objects that are displayed on the screen of the display unit as targets of user operation.
(203) The information processing apparatus according to (201), wherein the computation unit controls the framework of the operated objects displayed on the screen according to whether or not the user is watching the screen of the display unit.
(204) The information processing apparatus according to (201), wherein the computation unit controls the information density of the operated objects displayed on the screen of the display unit according to the user's position.
(205) The information processing apparatus according to (201), wherein the computation unit controls the selection of the operated objects displayed on the screen of the display unit according to whether or not the user is at a position where personal authentication is possible.
(206) The information processing apparatus according to (201), further comprising one or more input means with which the user operates the operated objects displayed on the screen of the display unit, wherein the computation unit controls the framework of the operated objects displayed on the screen according to whether or not the user is operating an operated object with the input means.
(207) An information processing apparatus comprising: a display unit; one or more input means with which a user operates operated objects displayed on the screen of the display unit; a user position detection unit that detects the position of the user with respect to the display unit; a user state detection unit that detects the state of the user with respect to the display screen of the display unit; and a computation unit that optimizes the input means according to the user position detected by the user position detection unit and the user state detected by the user state detection unit.
(208) The information processing apparatus according to (207), wherein the computation unit controls the optimization of the input means according to whether or not the user is watching the screen of the display unit.
(209) The information processing apparatus according to (207), wherein the computation unit optimizes the input means according to the user position detected by the user position detection unit while the user is watching the screen of the display unit.
(210) An information processing apparatus comprising: a display unit; a user position detection unit that detects the position of a user with respect to the display unit; a plurality of distance detection methods for detecting the distance from the screen of the display unit to the user; and a computation unit that controls switching among the distance detection methods according to the user position detected by the user position detection unit.
(211) The information processing apparatus according to (210), wherein the computation unit keeps a distance detection method that detects the distance to a far-away user enabled at all times.
(212) The information processing apparatus according to (210), wherein the computation unit enables a distance detection method that detects the distance to a nearby user and also serves as recognition processing only within the distance range where sufficient recognition accuracy is obtained (a sketch of this switching policy follows this enumeration).
(213) An information processing method comprising: a user position detecting step of detecting the position of a user with respect to a display screen; a user state detecting step of detecting the state of the user with respect to the display screen; and a computing step of controlling the GUI displayed on the display screen according to the user position detected in the user position detecting step and the user state detected in the user state detecting step.
(214) An information processing method comprising: a user position detecting step of detecting the position of a user with respect to a display screen; a user state detecting step of detecting the state of the user with respect to the display screen; and a computing step of optimizing, according to the user position detected in the user position detecting step and the user state detected in the user state detecting step, one or more input means with which the user operates an operated object displayed on the display screen.
(215) An information processing method comprising: a user position detecting step of detecting the position of a user with respect to a display screen; and a computing step of controlling, according to the user position detected in the user position detecting step, switching among a plurality of distance detection methods for detecting the distance from the display screen to the user.
(216) A computer program written in a computer-readable format so as to cause a computer to function as: a display unit; a user position detection unit that detects the position of a user with respect to the display unit; a user state detection unit that detects the state of the user with respect to the display screen of the display unit; and a computation unit that controls the GUI displayed on the display unit according to the user position detected by the user position detection unit and the user state detected by the user state detection unit.
(217) A computer program written in a computer-readable format so as to cause a computer to function as: a display unit; one or more input means with which a user operates operated objects displayed on the screen of the display unit; a user position detection unit that detects the position of the user with respect to the display unit; a user state detection unit that detects the state of the user with respect to the display screen of the display unit; and a computation unit that optimizes the input means according to the user position detected by the user position detection unit and the user state detected by the user state detection unit.
(218) A computer program written in a computer-readable format so as to cause a computer to function as: a display unit; a user position detection unit that detects the position of a user with respect to the display unit; a plurality of distance detection methods for detecting the distance from the screen of the display unit to the user; and a computation unit that controls switching among the distance detection methods according to the user position detected by the user position detection unit.
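Configurations (210)-(212) amount to a small power/accuracy policy over the available distance detection methods. A minimal Python sketch follows; the 3 m camera range is an assumed figure for illustration, not one given in the specification:

```python
def select_detectors(distance_m: float | None) -> dict[str, bool]:
    """Power policy over two distance detection methods: the
    far-range distance sensor stays on at all times (configuration
    (211)); the camera pipeline, which doubles as recognition
    processing but is accurate only close up, runs only inside its
    useful range (configuration (212))."""
    CAMERA_MAX_RANGE_M = 3.0  # assumed accuracy limit, not from the patent
    return {
        "distance_sensor": True,
        "camera_recognition": (distance_m is not None
                               and distance_m <= CAMERA_MAX_RANGE_M),
    }

# No user detected yet, a user at 6 m, and a user at 1.5 m:
for d in (None, 6.0, 1.5):
    print(d, select_detectors(d))
```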
(301) An information processing apparatus comprising: a display unit; an object image acquisition unit that acquires an image of an object to be displayed on the screen of the display unit; a real-size acquisition unit that acquires information on the real size of the object displayed on the screen of the display unit; and a computation unit that processes the image of the object based on the real size of the object acquired by the real-size acquisition unit.
(302) The information processing apparatus according to (301), further comprising a display performance acquisition unit that acquires information on display performance including the screen size and resolution of the display unit, wherein the computation unit performs processing so that the image of the object is displayed at its real size on the screen of the display unit, based on the real size of the object acquired by the real-size acquisition unit and the display performance acquired by the display performance acquisition unit.
(303) The information processing apparatus according to (301), wherein, when images of a plurality of objects acquired by the object image acquisition unit are displayed simultaneously on the screen of the display unit, the computation unit processes the images of the plurality of objects so that their mutual size relationships are displayed correctly.
(304) The information processing apparatus according to (301), further comprising a camera unit and a real-size estimation unit that estimates the real size of an object included in an image captured by the camera unit.
(305) The information processing apparatus according to (301), further comprising: a camera unit; an image recognition unit that recognizes the face of a user included in an image captured by the camera unit and acquires face data; a distance detection unit that detects the distance to the user; and a real-size estimation unit that estimates the real size of the user's face based on the user's face data and the distance to the user (one standard way of computing such an estimate is sketched after this enumeration).
(306) An information processing method comprising: an object image acquisition step of acquiring an image of an object to be displayed on a screen; a real-size acquisition step of acquiring information on the real size of the object displayed on the screen; and a computing step of processing the image of the object based on the real size of the object acquired in the real-size acquisition step.
(307) A computer program written in a computer-readable format so as to cause a computer to function as: a display unit; an object image acquisition unit that acquires an image of an object to be displayed on the screen of the display unit; a real-size acquisition unit that acquires information on the real size of the object displayed on the screen of the display unit; and a computation unit that processes the image of the object based on the real size of the object acquired by the real-size acquisition unit.
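Configuration (305) combines a face's size in pixels with the measured distance to the user. Under a standard pinhole-camera model (an assumption; the specification does not fix the estimation model), the estimate is real width = pixel width × distance ÷ focal length expressed in pixels. A minimal sketch with hypothetical names and values:

```python
def face_real_width_cm(face_width_px: float,
                       distance_cm: float,
                       focal_length_px: float) -> float:
    """Pinhole-camera estimate: real width = pixel width * distance
    / focal length, with the focal length expressed in pixels."""
    return face_width_px * distance_cm / focal_length_px

# A face 180 px wide seen at 200 cm by a camera with f = 2400 px:
print(f"{face_real_width_cm(180, 200, 2400):.1f} cm")  # -> 15.0 cm
```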
(401) An information processing apparatus comprising: a camera unit; a display unit; and a computation unit that normalizes the user's image captured by the camera unit when displaying it on the screen of the display unit.
(402) The information processing apparatus according to (401), further comprising: an object image acquisition unit that acquires an image of an object to be displayed on the screen of the display unit; and a parallel/superimposition pattern acquisition unit that acquires a parallel/superimposition pattern for displaying the user's image and the object's image side by side or superimposed on the screen of the display unit, wherein the computation unit normalizes the user's image and the object so that their size relationship and positions are correct, and displays the normalized user's image and object side by side or superimposed according to the acquired parallel/superimposition pattern.
(403) The information processing apparatus according to (402), wherein the computation unit controls the camera unit so as to normalize the user's image captured by the camera unit.
(404) The information processing apparatus according to (401), further comprising: a user face data acquisition unit that acquires face data of the user captured by the camera unit; and an in-object face data acquisition unit that acquires face data within an object displayed on the screen of the display unit, wherein the computation unit performs normalization so that the size relationship and positions of the user's face data and the face data within the object are correct (a sketch of this normalization follows this enumeration).
(405) The information processing apparatus according to (404), wherein the computation unit controls the camera unit so as to normalize the user's image captured by the camera unit.
(406) An information processing method comprising: an object image acquisition step of acquiring an image of an object to be displayed on a screen; a parallel/superimposition pattern acquisition step of acquiring a parallel/superimposition pattern for displaying a user's image captured by a camera unit and the object's image side by side or superimposed on the screen; a normalization step of normalizing the user's image and the object so that their size relationship and positions are correct; and an image processing step of displaying the normalized user's image and object side by side or superimposed according to the acquired parallel/superimposition pattern.
(407) An information processing method comprising: a user face data acquisition step of acquiring face data of a user captured by a camera unit; an in-object face data acquisition step of acquiring face data within an object displayed on a screen; and a normalization step of normalizing so that the size relationship and positions of the user's face data and the face data within the object are correct.
(408) A computer program written in a computer-readable format so as to cause a computer to function as: a camera unit; a display unit; and a computation unit that normalizes the user's image captured by the camera unit when displaying it on the screen of the display unit.
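The face-based normalization of configurations (404)-(407) reduces to a similarity transform: scale the camera image so the two faces match in width, then translate so the face centers coincide. A minimal Python sketch with hypothetical names (one simplification under the assumption that matching face width suffices; the specification does not fix the computation):

```python
from dataclasses import dataclass

@dataclass
class FaceBox:
    cx: float  # face center x in its own image
    cy: float  # face center y
    w: float   # face width in pixels

def normalize_user_image(user: FaceBox, obj: FaceBox):
    """Return (scale, dx, dy): scale the camera image so the user's
    face matches the width of the face in the object image, then
    translate so the two face centers coincide when superimposed."""
    scale = obj.w / user.w
    dx = obj.cx - user.cx * scale
    dy = obj.cy - user.cy * scale
    return scale, dx, dy

# User's face is 240 px wide; the instructor's face is 120 px wide:
print(normalize_user_image(FaceBox(640, 360, 240), FaceBox(480, 270, 120)))
# -> (0.5, 160.0, 90.0)
```

Per (403) and (405), the same correction could instead be applied optically, by panning or zooming the camera unit rather than transforming the captured image.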
(501) An information processing apparatus comprising: a display unit that displays video content on a screen; a rotation angle detection unit that detects the rotation angle of the screen; a display form determination unit that determines the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and an image processing unit that processes the video content in accordance with the display form determined by the display form determination unit so that the video content fits the screen tilted by the rotation angle detected by the rotation angle detection unit.
(502) The information processing apparatus according to (501), wherein the display form determination unit selects from among a plurality of display forms including three display forms: a display form in which the video content is never cut off at any rotation angle; a display form in which the content of interest within the video content is maximized at each rotation angle; and a display form in which the video content is rotated so that no invalid area remains (the geometry of the first and third forms is sketched after this enumeration).
(503) The information processing apparatus according to (501), wherein the display form determination unit determines the display form at an arbitrary rotation angle of the screen and in its transition process based on attribute information of the video content.
(504) The information processing apparatus according to (501), wherein the display form determination unit determines, for protected video content, a display form in which the video content is never cut off at any rotation angle.
(505) An information processing method comprising: a rotation angle detection step of detecting the rotation angle of a screen that displays video content; a display form determination step of determining the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and an image processing step of processing the video content in accordance with the display form determined in the display form determination step so that the video content fits the screen tilted by the rotation angle detected in the rotation angle detection step.
(506) A computer program written in a computer-readable format so as to cause a computer to function as: a display unit that displays video content on a screen; a rotation angle detection unit that detects the rotation angle of the screen; a display form determination unit that determines the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and an image processing unit that processes the video content in accordance with the display form determined by the display form determination unit so that the video content fits the screen tilted by the rotation angle detected by the rotation angle detection unit.
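Of the three display forms in configuration (502), the first ("never cut off") and third ("no invalid area") are purely geometric, so their scale factors can be written down directly; the second depends on content metadata identifying the region of interest and is omitted here. A minimal Python sketch of the two geometric forms (names are hypothetical):

```python
import math

def scale_for_rotation(w: float, h: float, W: float, H: float,
                       theta_deg: float, mode: str) -> float:
    """Scale for content (w x h) on a screen (W x H) rotated by
    theta: 'fit' keeps the content entirely visible at any angle;
    'cover' leaves no invalid (blank) area on the screen."""
    t = math.radians(theta_deg)
    c, s = abs(math.cos(t)), abs(math.sin(t))
    if mode == "fit":    # rotated bounding box must fit inside the screen
        return min(W / (w * c + h * s), H / (w * s + h * c))
    if mode == "cover":  # rotated content must contain the whole screen
        return max((W * c + H * s) / w, (W * s + H * c) / h)
    raise ValueError(mode)

# A 16:9 frame on a 16:9 screen, sampled during the transition:
for angle in (0, 45, 90):
    print(angle,
          round(scale_for_rotation(1920, 1080, 1920, 1080, angle, "fit"), 3),
          round(scale_for_rotation(1920, 1080, 1920, 1080, angle, "cover"), 3))
```

At 0 degrees both modes return 1; at 90 degrees 'fit' shrinks the frame to 0.5625 while 'cover' enlarges it to about 1.778, which matches the protected-content requirement of (504) being served by 'fit'.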
100…Information processing apparatus
110…Input interface unit
120…Computation unit
130…Output interface unit
140…Storage unit
150…Communication unit
160…Power supply unit
170…TV tuner unit
180…Rotation/attachment mechanism unit
501…Remote control receiving unit, 502…Signal analysis unit
503…Camera unit, 504…Image recognition unit
505…Microphone unit, 506…Speech recognition unit
507…Distance sensor, 508…Signal analysis unit
509…Touch detection unit, 510…Signal analysis unit
511…Proximity sensor, 512…Signal analysis unit
513…Very-short-range communication unit, 514…Signal analysis unit
515…Three-axis sensor unit, 516…GPS receiving unit, 517…Signal analysis unit
520…Input interface integration unit
601…Content display unit, 602…GUI display unit
603…Display unit, 604…Speaker unit
605…Illumination display unit, 606…Illumination unit
710…Monitor area division unit
711…Device database, 712…Area pattern database
720…Object optimization processing unit, 721…Optimization processing algorithm
730…Device cooperation data transmission/reception unit, 731…Transmission/reception processing algorithm
2310…Display GUI optimization unit, 2320…Input means optimization unit
2330…Distance detection method switching unit
3210…Real-size display unit, 3220…Real-size estimation unit
3230…Real-size extension unit
4110…Inter-image normalization processing unit, 4120…Face normalization processing unit
4130…Real-size extension unit
4710…Display form determination unit, 4720…Rotation position input unit
4730…Image processing unit
Claims (6)
- An information processing apparatus comprising:
a display unit that displays video content on a screen;
a rotation angle detection unit that detects the rotation angle of the screen;
a display form determination unit that determines the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and
an image processing unit that processes the video content in accordance with the display form determined by the display form determination unit so that the video content fits the screen tilted by the rotation angle detected by the rotation angle detection unit.
- The information processing apparatus according to claim 1, wherein the display form determination unit selects from among a plurality of display forms including three display forms: a display form in which the video content is never cut off at any rotation angle; a display form in which the content of interest within the video content is maximized at each rotation angle; and a display form in which the video content is rotated so that no invalid area remains.
- The information processing apparatus according to claim 1, wherein the display form determination unit determines the display form at an arbitrary rotation angle of the screen and in its transition process based on attribute information of the video content.
- The information processing apparatus according to claim 1, wherein the display form determination unit determines, for protected video content, a display form in which the video content is never cut off at any rotation angle.
- An information processing method comprising:
a rotation angle detection step of detecting the rotation angle of a screen that displays video content;
a display form determination step of determining the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and
an image processing step of processing the video content in accordance with the display form determined in the display form determination step so that the video content fits the screen tilted by the rotation angle detected in the rotation angle detection step.
- A computer program written in a computer-readable format so as to cause a computer to function as:
a display unit that displays video content on a screen;
a rotation angle detection unit that detects the rotation angle of the screen;
a display form determination unit that determines the display form of the video content at an arbitrary rotation angle of the screen and in its transition process; and
an image processing unit that processes the video content in accordance with the display form determined by the display form determination unit so that the video content fits the screen tilted by the rotation angle detected by the rotation angle detection unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12865035.5A EP2804372A4 (en) | 2012-01-13 | 2012-12-26 | DATA PROCESSING DEVICE, METHOD FOR DATA PROCESSING, AND COMPUTER PROGRAM |
CN201280066346.0A CN104040463B (zh) | 2012-01-13 | 2012-12-26 | 信息处理设备和信息处理方法、以及计算机程序 |
US14/369,834 US9317899B2 (en) | 2012-01-13 | 2012-12-26 | Information processing apparatus and information processing method, and computer program |
JP2013553244A JP6257329B2 (ja) | 2012-01-13 | 2012-12-26 | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012005438 | 2012-01-13 | ||
JP2012-005438 | 2012-01-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013105443A1 true WO2013105443A1 (ja) | 2013-07-18 |
Family
ID=48781400
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/083751 WO2013105443A1 (ja) | 2012-01-13 | 2012-12-26 | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US9317899B2 (ja) |
EP (1) | EP2804372A4 (ja) |
JP (1) | JP6257329B2 (ja) |
CN (1) | CN104040463B (ja) |
WO (1) | WO2013105443A1 (ja) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103699303A (zh) * | 2013-12-27 | 2014-04-02 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
WO2014083953A1 (ja) * | 2012-11-27 | 2014-06-05 | ソニー株式会社 | 表示装置及び表示方法、並びにコンピューター・プログラム |
CN104077028A (zh) * | 2014-05-28 | 2014-10-01 | 天津三星通信技术研究有限公司 | 在电子设备中控制显示项目的设备和方法 |
CN104423799A (zh) * | 2013-08-23 | 2015-03-18 | 夏普株式会社 | 接口装置和接口方法 |
CN104461335A (zh) * | 2013-09-25 | 2015-03-25 | 联想(北京)有限公司 | 一种数据处理方法及电子设备 |
JP2015082026A (ja) * | 2013-10-23 | 2015-04-27 | シャープ株式会社 | 表示装置、表示システム、および表示方法 |
WO2015058670A1 (zh) * | 2013-10-21 | 2015-04-30 | 中国移动通信集团公司 | 一种显示内容处理方法及设备 |
CN104750404A (zh) * | 2013-12-31 | 2015-07-01 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
JP2015170248A (ja) * | 2014-03-10 | 2015-09-28 | 富士ゼロックス株式会社 | 表示制御装置及びプログラム |
WO2017017737A1 (ja) * | 2015-07-24 | 2017-02-02 | 富士通株式会社 | 表示方法、モニタ結果出力方法、情報処理装置および表示プログラム |
JP6169298B1 (ja) * | 2017-02-16 | 2017-07-26 | 京セラ株式会社 | 電子機器及び制御方法 |
JP2018018515A (ja) * | 2017-06-27 | 2018-02-01 | 京セラ株式会社 | 電子機器及び制御方法 |
JP2018139148A (ja) * | 2018-05-30 | 2018-09-06 | 京セラ株式会社 | 電子機器及び制御方法 |
JP2019020737A (ja) * | 2018-09-06 | 2019-02-07 | シャープ株式会社 | 表示装置、表示システム、および表示方法 |
WO2021180223A1 (zh) * | 2020-03-13 | 2021-09-16 | 海信视像科技股份有限公司 | 一种显示方法及显示设备 |
US11415980B2 (en) * | 2016-07-29 | 2022-08-16 | Nec Solution Innovators, Ltd. | Moving object operation system, operation signal transmission system, moving object operation method, program, and recording medium |
US11467572B2 (en) | 2016-07-29 | 2022-10-11 | NEC Solution Innovations, Ltd. | Moving object operation system, operation signal transmission system, moving object operation method, program, and recording medium |
US11659148B2 (en) | 2021-01-25 | 2023-05-23 | Seiko Epson Corporation | Method for controlling display apparatus, and display apparatus |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5598232B2 (ja) * | 2010-10-04 | 2014-10-01 | ソニー株式会社 | 情報処理装置、情報処理システムおよび情報処理方法 |
KR101755956B1 (ko) | 2011-08-04 | 2017-07-07 | 이베이 인크. | 사용자 논평 시스템 및 방법 |
WO2014006757A1 (ja) * | 2012-07-06 | 2014-01-09 | Necディスプレイソリューションズ株式会社 | 表示装置、表示装置の制御方法 |
US9165535B2 (en) * | 2012-09-27 | 2015-10-20 | Google Inc. | System and method for determining a zoom factor of content displayed on a display device |
US9159116B2 (en) | 2013-02-13 | 2015-10-13 | Google Inc. | Adaptive screen interfaces based on viewing distance |
US9870582B2 (en) | 2013-03-14 | 2018-01-16 | Mcmaster-Carr Supply Company | System and method for browsing a product catalog and for dynamically generated product paths |
US20150201236A1 (en) * | 2014-01-15 | 2015-07-16 | Khalifa Al Remeithi | Display Proximity Control Device |
KR20160028272A (ko) * | 2014-09-03 | 2016-03-11 | 삼성전자주식회사 | 디스플레이 장치 및 그 제어 방법 |
JP6412778B2 (ja) * | 2014-11-19 | 2018-10-24 | 東芝映像ソリューション株式会社 | 映像装置、方法、およびプログラム |
US9734553B1 (en) | 2014-12-31 | 2017-08-15 | Ebay Inc. | Generating and displaying an actual sized interactive object |
EP3249934A4 (en) * | 2015-01-21 | 2018-12-05 | Sony Corporation | Information processing apparatus, communication system, information processing method and program |
US9927892B2 (en) * | 2015-03-27 | 2018-03-27 | International Business Machines Corporation | Multiple touch selection control |
US20160365117A1 (en) * | 2015-06-11 | 2016-12-15 | Martin Paul Boliek | Method for processing captured video data based on capture device orientation |
CN104867479B (zh) * | 2015-06-12 | 2017-05-17 | 京东方科技集团股份有限公司 | 拼接显示装置屏幕亮度的调节装置和方法 |
TWI547177B (zh) * | 2015-08-11 | 2016-08-21 | 晶睿通訊股份有限公司 | 視角切換方法及其攝影機 |
CN105204644A (zh) | 2015-09-28 | 2015-12-30 | 北京京东方多媒体科技有限公司 | 虚拟试衣系统及方法 |
CN105373001A (zh) * | 2015-10-29 | 2016-03-02 | 小米科技有限责任公司 | 电子设备的控制方法及装置 |
WO2017151136A1 (en) * | 2016-03-03 | 2017-09-08 | Hewlett-Packard Development Company, L.P. | Input axis rotations |
EP3473388A4 (en) * | 2016-06-16 | 2020-01-29 | Shenzhen Royole Technologies Co., Ltd. | METHOD AND DEVICE FOR MULTIPLE-USER INTERACTION AND CHAPERON ROBOTS |
US10514769B2 (en) * | 2016-10-16 | 2019-12-24 | Dell Products, L.P. | Volumetric tracking for orthogonal displays in an electronic collaboration setting |
KR20180050052A (ko) * | 2016-11-04 | 2018-05-14 | 삼성전자주식회사 | 디스플레이 장치 및 그 제어 방법 |
JP6903935B2 (ja) * | 2017-02-17 | 2021-07-14 | ソニーグループ株式会社 | 情報処理システム、情報処理方法、およびプログラム |
KR102373510B1 (ko) * | 2017-08-11 | 2022-03-11 | 삼성전자주식회사 | 디스플레이를 회전함에 따라 컨텐츠를 시각화 하는 디스플레이 장치 및 이의 제어 방법 |
US11402981B2 (en) | 2017-08-11 | 2022-08-02 | Samsung Electronics Co., Ltd. | Display device for visualizing contents as the display is rotated and control method thereof |
WO2019045144A1 (ko) * | 2017-08-31 | 2019-03-07 | (주)레벨소프트 | 의료용 항법 장치를 위한 의료 영상 처리 장치 및 의료 영상 처리 방법 |
CN107656789A (zh) * | 2017-09-27 | 2018-02-02 | 惠州Tcl移动通信有限公司 | 一种多角度界面显示的方法、存储介质及智能终端 |
FR3074334B1 (fr) * | 2017-11-27 | 2019-12-20 | Safran Electronics & Defense | Table tactile de preparation de mission |
US10867506B2 (en) * | 2018-03-16 | 2020-12-15 | Sean Michael Siembab | Surrounding intelligent motion sensor with adaptive recognition |
FR3079048B1 (fr) * | 2018-03-19 | 2021-11-19 | Fabian Humbert | Procede d’interaction entre d’une part au moins un utilisateur et/ou un premier dispositif electronique et d’autre part un second dispositif electronique |
JP2021519482A (ja) * | 2018-03-27 | 2021-08-10 | ヴィゼット インコーポレイテッド | マルチ画面ディスプレイと操作(interaction)のためのシステムと方法 |
US11132122B2 (en) * | 2019-04-11 | 2021-09-28 | Ricoh Company, Ltd. | Handwriting input apparatus, handwriting input method, and non-transitory recording medium |
CN110869927B (zh) * | 2019-05-20 | 2023-10-17 | 创新先进技术有限公司 | 基于隐藏式版权信息的版权保护 |
CN110264942A (zh) * | 2019-06-19 | 2019-09-20 | 深圳市洲明科技股份有限公司 | Led显示屏的画面控制方法、装置及存储介质 |
CN113573118B (zh) * | 2020-04-28 | 2022-04-29 | 海信视像科技股份有限公司 | 一种视频画面旋转方法及显示设备 |
CN111885406A (zh) * | 2020-07-30 | 2020-11-03 | 深圳创维-Rgb电子有限公司 | 智能电视控制方法、装置、可旋转电视和可读存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0792938A (ja) * | 1993-09-22 | 1995-04-07 | Hitachi Ltd | 案内装置 |
JP2002300602A (ja) | 2001-04-02 | 2002-10-11 | Sony Corp | 窓状撮像表示装置及びそれを使う双方向通信方法 |
JP2004094410A (ja) * | 2002-08-29 | 2004-03-25 | Kyocera Communication Systems Co Ltd | コンテンツ作成プログラム、コンテンツ表示プログラム、記録媒体、コンテンツ作成方法およびコンテンツ表示方法 |
JP2005142957A (ja) | 2003-11-07 | 2005-06-02 | Sony Corp | 撮像装置及び方法、撮像システム |
JP2005149127A (ja) | 2003-11-14 | 2005-06-09 | Sony Corp | 撮像表示装置及び方法、画像送受信システム |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0792939A (ja) * | 1993-08-25 | 1995-04-07 | Konica Corp | 画像再生装置 |
JPH07322168A (ja) * | 1994-05-27 | 1995-12-08 | Hitachi Ltd | テレビジョン受像機 |
JPH0898135A (ja) * | 1994-09-21 | 1996-04-12 | Hitachi Ltd | ワイドテレビ信号記録再生装置 |
JPH11196397A (ja) * | 1997-12-26 | 1999-07-21 | Canon Inc | 表示装置及び通信システム |
WO2001075794A2 (en) * | 2000-04-05 | 2001-10-11 | Sony United Kingdom Limited | Identifying material |
US7844076B2 (en) * | 2003-06-26 | 2010-11-30 | Fotonation Vision Limited | Digital image processing using face detection and skin tone information |
US7574016B2 (en) * | 2003-06-26 | 2009-08-11 | Fotonation Vision Limited | Digital image processing using face detection information |
US7720924B2 (en) * | 2003-12-12 | 2010-05-18 | Syniverse Icx Corporation | System providing methodology for the restoration of original media quality in messaging environments |
WO2006020305A2 (en) | 2004-07-30 | 2006-02-23 | Apple Computer, Inc. | Gestures for touch sensitive input devices |
JP2006245726A (ja) * | 2005-03-01 | 2006-09-14 | Fuji Photo Film Co Ltd | デジタルカメラ |
JP2006261785A (ja) * | 2005-03-15 | 2006-09-28 | Pioneer Electronic Corp | 消費電力量制御装置、電子機器 |
JP2006259358A (ja) * | 2005-03-17 | 2006-09-28 | Clarion Co Ltd | 画像表示装置、画像表示方法、及び、画像表示プログラム |
US7587671B2 (en) * | 2005-05-17 | 2009-09-08 | Palm, Inc. | Image repositioning, storage and retrieval |
KR100784542B1 (ko) * | 2005-10-20 | 2007-12-11 | 엘지전자 주식회사 | 면접촉 축회전 이동통신단말기 |
WO2007113905A1 (ja) * | 2006-04-05 | 2007-10-11 | Panasonic Corporation | 携帯端末装置及び表示切替方法 |
JP2008211652A (ja) * | 2007-02-27 | 2008-09-11 | Pioneer Electronic Corp | テレビ放送受信装置及びその消費電力低減方法 |
US8896632B2 (en) * | 2008-09-12 | 2014-11-25 | Qualcomm Incorporated | Orienting displayed elements relative to a user |
JP2010128140A (ja) * | 2008-11-27 | 2010-06-10 | Casio Hitachi Mobile Communications Co Ltd | 端末装置及びプログラム |
JP2012100159A (ja) * | 2010-11-04 | 2012-05-24 | Sharp Corp | 映像表示装置,テレビジョン受像機 |
KR101752698B1 (ko) * | 2011-01-06 | 2017-07-04 | 삼성전자주식회사 | 촬영 장치 및 그 촬영 방법 |
-
2012
- 2012-12-26 EP EP12865035.5A patent/EP2804372A4/en not_active Ceased
- 2012-12-26 JP JP2013553244A patent/JP6257329B2/ja not_active Expired - Fee Related
- 2012-12-26 US US14/369,834 patent/US9317899B2/en not_active Expired - Fee Related
- 2012-12-26 WO PCT/JP2012/083751 patent/WO2013105443A1/ja active Application Filing
- 2012-12-26 CN CN201280066346.0A patent/CN104040463B/zh not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0792938A (ja) * | 1993-09-22 | 1995-04-07 | Hitachi Ltd | 案内装置 |
JP2002300602A (ja) | 2001-04-02 | 2002-10-11 | Sony Corp | 窓状撮像表示装置及びそれを使う双方向通信方法 |
JP2004094410A (ja) * | 2002-08-29 | 2004-03-25 | Kyocera Communication Systems Co Ltd | コンテンツ作成プログラム、コンテンツ表示プログラム、記録媒体、コンテンツ作成方法およびコンテンツ表示方法 |
JP2005142957A (ja) | 2003-11-07 | 2005-06-02 | Sony Corp | 撮像装置及び方法、撮像システム |
JP2005149127A (ja) | 2003-11-14 | 2005-06-09 | Sony Corp | 撮像表示装置及び方法、画像送受信システム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2804372A4 |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014083953A1 (ja) * | 2012-11-27 | 2014-06-05 | ソニー株式会社 | 表示装置及び表示方法、並びにコンピューター・プログラム |
CN104423799A (zh) * | 2013-08-23 | 2015-03-18 | 夏普株式会社 | 接口装置和接口方法 |
CN104461335A (zh) * | 2013-09-25 | 2015-03-25 | 联想(北京)有限公司 | 一种数据处理方法及电子设备 |
WO2015058670A1 (zh) * | 2013-10-21 | 2015-04-30 | 中国移动通信集团公司 | 一种显示内容处理方法及设备 |
JP2015082026A (ja) * | 2013-10-23 | 2015-04-27 | シャープ株式会社 | 表示装置、表示システム、および表示方法 |
CN103699303A (zh) * | 2013-12-27 | 2014-04-02 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
CN104750404B (zh) * | 2013-12-31 | 2018-11-09 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
CN104750404A (zh) * | 2013-12-31 | 2015-07-01 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
JP2015170248A (ja) * | 2014-03-10 | 2015-09-28 | 富士ゼロックス株式会社 | 表示制御装置及びプログラム |
CN104077028A (zh) * | 2014-05-28 | 2014-10-01 | 天津三星通信技术研究有限公司 | 在电子设备中控制显示项目的设备和方法 |
WO2017017737A1 (ja) * | 2015-07-24 | 2017-02-02 | 富士通株式会社 | 表示方法、モニタ結果出力方法、情報処理装置および表示プログラム |
US11467572B2 (en) | 2016-07-29 | 2022-10-11 | NEC Solution Innovations, Ltd. | Moving object operation system, operation signal transmission system, moving object operation method, program, and recording medium |
US11415980B2 (en) * | 2016-07-29 | 2022-08-16 | Nec Solution Innovators, Ltd. | Moving object operation system, operation signal transmission system, moving object operation method, program, and recording medium |
JP6169298B1 (ja) * | 2017-02-16 | 2017-07-26 | 京セラ株式会社 | 電子機器及び制御方法 |
JP2018018496A (ja) * | 2017-02-16 | 2018-02-01 | 京セラ株式会社 | 電子機器及び制御方法 |
JP2018018515A (ja) * | 2017-06-27 | 2018-02-01 | 京セラ株式会社 | 電子機器及び制御方法 |
JP2018139148A (ja) * | 2018-05-30 | 2018-09-06 | 京セラ株式会社 | 電子機器及び制御方法 |
JP2019020737A (ja) * | 2018-09-06 | 2019-02-07 | シャープ株式会社 | 表示装置、表示システム、および表示方法 |
WO2021180223A1 (zh) * | 2020-03-13 | 2021-09-16 | 海信视像科技股份有限公司 | 一种显示方法及显示设备 |
US11659148B2 (en) | 2021-01-25 | 2023-05-23 | Seiko Epson Corporation | Method for controlling display apparatus, and display apparatus |
Also Published As
Publication number | Publication date |
---|---|
CN104040463B (zh) | 2017-02-22 |
JPWO2013105443A1 (ja) | 2015-05-11 |
US9317899B2 (en) | 2016-04-19 |
EP2804372A4 (en) | 2015-09-16 |
US20140354695A1 (en) | 2014-12-04 |
EP2804372A1 (en) | 2014-11-19 |
CN104040463A (zh) | 2014-09-10 |
JP6257329B2 (ja) | 2018-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6257329B2 (ja) | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム | |
JP6196017B2 (ja) | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム | |
JP5957892B2 (ja) | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム | |
JP5957893B2 (ja) | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム | |
JP6200270B2 (ja) | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム | |
JP2013145463A (ja) | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム | |
US20210337260A1 (en) | Display apparatus and remote operation control apparatus | |
JP6382721B2 (ja) | 表示装置及び表示方法、並びにコンピューター・プログラム | |
JP6058978B2 (ja) | 画像処理装置及び画像処理方法、撮影装置、並びにコンピューター・プログラム | |
JP6093074B2 (ja) | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム | |
US20230386162A1 (en) | Virtual action center based on segmented video feed for a video communication session | |
US11950030B2 (en) | Electronic apparatus and method of controlling the same, and recording medium | |
US20230388445A1 (en) | Non-mirrored preview of text based demonstration object in mirrored mobile webcam image | |
TW201122706A (en) | Front projection system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12865035 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013553244 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14369834 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012865035 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |