US20140306915A1 - Information processing apparatus, program, and control method - Google Patents
- Publication number
- US20140306915A1 (U.S. application Ser. No. 14/317,488)
- Authority
- US
- United States
- Prior art keywords
- processing apparatus
- information processing
- controller
- image
- inclination
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04802—3D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user
Definitions
- a second object associated with the first object is displayed on the screen in accordance with an inclination detected by a sensor.
- FIG. 8 is a graph showing an expression used for achieving processing of removing an unintentional hand movement and threshold value processing at a limit angle, and showing a relationship between a rotation angle of the casing and an object rotation angle of a UI object;
- FIG. 19 is a flowchart showing processing of an information processing apparatus according to still another embodiment of the present disclosure.
- FIG. 22 is a diagram showing display states on a screen in the case where the processing shown in FIG. 21 is executed;
- three-dimensional display objects 2 having a cubic shape are displayed on the left side area of the screen 1 .
- the three-dimensional display objects 2 are displayed in a rotational manner in accordance with a rotation angle of the casing 10 .
- the plurality of three-dimensional display objects 2 are located along a y-axis direction.
- the three-dimensional display objects 2 each include a front surface icon 2 a (first object) on a front surface (first surface), and a side surface icon 2 b (second object) on a side surface (second surface).
- the front surface icon 2 a and the side surface icon 2 b are assumed to be a music icon and a moving image icon, respectively, which are related to a common album.
- the front surface icon 2 a and the side surface icon 2 b are assumed to be icons having mutual association. Accordingly, the user can easily select content mutually associated.
- the first image (image including selection items) is a track selection image 22 and the second image (image after selection and determination) is a track reproduction image 23 .
- the combination of the first image and the second image is not limited to the above.
- the first image may be a moving-image selection image
- the second image may be a moving-image reproduction image.
- the reproduction of the moving image may be started in the case where the rotation angle of the casing 10 is a first angle or more.
- examples of the combination of the first image and the second image include a case where the first image is a selection image of an application program and the second image is an image of a window displayed by the application program.
- the embodiment of the present disclosure can be applied to any cases.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
An information processing apparatus includes a display, a sensor, and a controller. The display has a screen. The sensor is configured to detect an inclination. The controller is configured to display a first object on the screen and display a second object associated with the first object on the screen in accordance with the inclination detected by the sensor.
Description
- The present disclosure relates to an information processing apparatus including a display and a touch panel, a program, and a control method.
- From the past, there has been widely known an information processing apparatus including a display such as an LCD (Liquid Crystal Display), and a touch panel that detects a contact position on a screen of the display (see, for example, Japanese Patent Application Laid-open No. 2005-09441).
- When making an input operation on an information processing apparatus including a display and a touch panel, a user touches a screen of the display with a finger, a stylus, or the like or slides a finger, a stylus, or the like thereon, thus making an input operation on the information processing apparatus.
- It is desirable to provide an information processing apparatus adopting, as an unprecedented new input system, an input system using a combination of an input operation made by inclining the information processing apparatus and an input operation made via a touch panel.
- According to an embodiment of the present disclosure, there is provided an information processing apparatus including a display, a sensor, and a controller.
- The display has a screen.
- The sensor is configured to detect an inclination.
- The controller is configured to display a first object on the screen and display a second object associated with the first object on the screen in accordance with the inclination detected by the sensor.
- In the information processing apparatus, when a user inclines the information processing apparatus, the second object is displayed on the screen in accordance with the inclination.
- The information processing apparatus may further include a touch panel configured to detect contact.
- In this case, the second object may be an object operable with the touch panel.
- In the information processing apparatus, the controller may switch availability of an operation of the second object using the touch panel in accordance with the inclination.
- Accordingly, the user inclines the information processing apparatus to display the second object on the screen, and makes an input operation with a finger or the like via the touch panel to operate the second object.
- In the information processing apparatus, the first object may be an object operable with the touch panel.
- In the information processing apparatus, the controller may switch a first state where the first object is operable with the touch panel and a second state where the second object is operable with the touch panel in accordance with the inclination.
- In the information processing apparatus, when an input operation is made with use of the touch panel within the same display area on the screen, an object to be operated can be differentiated depending on an angle of the inclination of the information processing apparatus. Accordingly, the limited screen can be effectively utilized.
- In the information processing apparatus, the controller may display in a rotational manner a three-dimensional display object having a first surface serving as the first object and a second surface serving as the second object in accordance with the inclination, to thereby display the second object on the screen in accordance with the inclination.
- In the information processing apparatus, when the user inclines the information processing apparatus, the three-dimensional display object is rotated in accordance with the inclined angle. Accordingly, the operation becomes intuitive.
- In the information processing apparatus, the controller may control display such that the three-dimensional display object is hardly rotated in a case where an angle of the inclination is less than a predetermined value, and such that a rotation speed of the three-dimensional display object becomes higher as the inclination increases in a case where the angle of the inclination is the predetermined value or more.
- In the information processing apparatus, in the case where an angle of the inclination of the information processing apparatus is less than a predetermined threshold value, the three-dimensional display object is hardly rotated and accordingly the three-dimensional display object can be prevented from being rotated by an unintentional hand movement or the like. On the other hand, in the case where the angle of the inclination of the information processing apparatus is the predetermined threshold value or more, the rotation speed of the three-dimensional display object becomes higher as the inclination increases. Accordingly, in the case where the user expresses his/her intention to rotate the three-dimensional display object and then rotates the casing, the three-dimensional display object can be appropriately rotated in accordance with the user's intention of rotation.
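The dead-zone and limit-angle behavior described above can be sketched as a mapping from the casing rotation angle to the UI-object rotation angle. The patent does not give the expression plotted in FIG. 8, so the quadratic curve, the function name, and the default angles below are illustrative assumptions, not the disclosed formula:

```python
def object_rotation_angle(casing_angle_deg, dead_zone_deg=5.0,
                          limit_deg=30.0, max_object_deg=90.0):
    """Map the casing rotation angle to a UI-object rotation angle.

    Below the dead zone the object barely rotates (hand-movement removal);
    between the dead zone and the limit angle the response grows
    super-linearly, so the object's rotation speed increases with the
    inclination; at the limit angle and beyond the object angle is clamped
    (threshold value processing).  The quadratic shape is an assumption.
    """
    sign = 1.0 if casing_angle_deg >= 0 else -1.0
    a = abs(casing_angle_deg)
    if a < dead_zone_deg:
        return 0.0                       # treat as unintentional hand movement
    if a >= limit_deg:
        return sign * max_object_deg     # clamp at the limit angle
    # Normalized progress through the active band, squared so that the
    # response steepens as the inclination grows.
    t = (a - dead_zone_deg) / (limit_deg - dead_zone_deg)
    return sign * max_object_deg * t * t
```

With these example values, a 2-degree wobble produces no rotation at all, while anything past the limit angle pins the object at its fully rotated pose.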
- In the information processing apparatus, the controller may determine a position of a virtual camera in accordance with the inclination, and change a background image of the three-dimensional display object in accordance with the position of the virtual camera.
- Accordingly, when the user inclines the information processing apparatus, the three-dimensional display object is displayed in a rotational manner, and the background image of the three-dimensional display object is changed in accordance with the position of the virtual camera. Accordingly, the feeling of rotating the three-dimensional display object can be improved.
- In the information processing apparatus, the controller may move the first object and the second object in directions different from each other in accordance with the inclination, to thereby display the second object on the screen in accordance with the inclination.
- In the information processing apparatus, when the user inclines the information processing apparatus, the first object and the second object are moved in parallel to each other on the screen in accordance with the inclined angle. Also in such a case, the operation becomes intuitive. For example, the user can obtain the feeling of opening a door (first object).
- In the information processing apparatus, the controller may move the first object while rotating the first object.
- Accordingly, the feeling of opening a door is improved.
- In the information processing apparatus, the first object may be an image displayed when content is reproduced.
- In this case, the second object may be an object for operating a reproduction position of the content.
- In this case, the controller may change, in a case where the second object is operated with the touch panel, the reproduction position in accordance with a change amount of a contact position of the touch panel.
- In the information processing apparatus, the controller may change a ratio of a change amount of the reproduction position of the content to the change amount of the contact position of the touch panel, in accordance with the inclination.
- Accordingly, by inclining the information processing apparatus, the user can optionally change the ratio of the change amount of the reproduction position of the content to the change amount of the contact position of the touch panel.
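A minimal sketch of this inclination-dependent seek ratio follows; the function name, units, and constants are assumptions for illustration (the patent does not specify values), but the structure matches the text: the playback change per pixel of finger travel scales with the inclination:

```python
def seek_delta(drag_px, tilt_deg, base_sec_per_px=0.05, gain_per_deg=0.02):
    """Change in reproduction position (seconds) for a touch-panel drag.

    The ratio of playback change to contact-position change grows with
    the casing inclination, so a larger tilt gives coarser seeking.
    """
    ratio = base_sec_per_px * (1.0 + gain_per_deg * abs(tilt_deg))
    return drag_px * ratio
```

A 100-pixel drag therefore seeks further when the apparatus is tilted than when it is held flat.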
- In the information processing apparatus, the controller may simultaneously execute a plurality of application programs.
- In this case, the first object may be an image displayed by one of the plurality of application programs.
- In this case, the second object may be an image for selecting one of the plurality of application programs.
- Accordingly, by inclining the information processing apparatus and making an input via the touch panel, the user can select an optional application.
- In the information processing apparatus, the first object may be an image indicating content.
- In this case, the second object may be an icon for deleting content.
- Accordingly, in the case where the user does not intend to delete content, the content can be prevented from being deleted mistakenly.
- In the information processing apparatus, the controller may be capable of updating a reference point serving as a reference of the inclination.
- In the information processing apparatus, the controller may determine whether an angle of the inclination is a predetermined threshold value or more, and update the reference point when the angle of the inclination is the predetermined threshold value or more.
- Accordingly, in the case where an operation position of the information processing apparatus is changed, the change of the operation position can be appropriately supported.
- In the information processing apparatus, the controller may determine whether the contact with the touch panel is not detected for a predetermined period of time or more and update, in the case where the contact is not detected for the predetermined period of time or more, a position of the information processing apparatus at that time as the reference point.
- Accordingly, in the case where the operation position of the information processing apparatus is changed, the change of the operation position can be appropriately supported.
- In the information processing apparatus, the controller may determine whether a change amount of the inclination within a predetermined period of time is less than a predetermined threshold value and update, in the case where the change amount is less than the predetermined threshold value, a position of the information processing apparatus at that time as the reference point.
- Accordingly, in the case where the operation position of the information processing apparatus is changed, the change of the operation position can be appropriately supported.
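The three update conditions above (a large rotation from the reference point, no touch-panel contact for a period, and little inclination change within a period) can be sketched as one class. The class name, method names, and threshold values are assumptions based on the examples given in the text, not the patent's implementation:

```python
class ReferencePointUpdater:
    """Sketch of the reference-point update conditions described above."""

    ANGLE_THRESHOLD_DEG = 90.0   # large posture change (e.g. user lies down)
    IDLE_TIMEOUT_SEC = 2.5       # no touch-panel contact for this long
    STILL_WINDOW_SEC = 2.5       # window over which angle_change_deg is measured
    STILL_THRESHOLD_DEG = 5.0    # "barely moved" within that window

    def __init__(self, initial_angle_deg=0.0):
        self.reference_deg = initial_angle_deg

    def maybe_update(self, angle_deg, last_touch_sec, angle_change_deg, now_sec):
        """Re-center the reference point when any condition holds.

        Returns True when the current posture became the new reference.
        """
        rotation = angle_deg - self.reference_deg
        if (abs(rotation) >= self.ANGLE_THRESHOLD_DEG
                or now_sec - last_touch_sec >= self.IDLE_TIMEOUT_SEC
                or abs(angle_change_deg) < self.STILL_THRESHOLD_DEG):
            self.reference_deg = angle_deg   # current posture is the new zero
            return True
        return False
```

Any one condition is sufficient, which matches the flowchart structure: each check that succeeds leads to the same update step.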
- According to an embodiment of the present disclosure, there is provided a program causing an information processing apparatus to execute displaying a first object on a screen of a display.
- Further, a second object associated with the first object is displayed on the screen in accordance with an inclination detected by a sensor.
- According to an embodiment of the present disclosure, there is provided a control method including displaying a first object on a screen of a display.
- A second object associated with the first object is displayed on the screen in accordance with an inclination detected by a sensor.
- As described above, according to one of the embodiments of the present disclosure, it is possible to provide an information processing apparatus adopting, as an unprecedented new input system, an input system using a combination of an input operation made by inclining the information processing apparatus and an input operation made via a touch panel.
- These and other objects, features and advantages of the present disclosure will become more apparent in light of the following detailed description of best mode embodiments thereof, as illustrated in the accompanying drawings.
- FIG. 1 is a front view showing an information processing apparatus according to an embodiment of the present disclosure;
- FIG. 2 is a block diagram showing an electrical configuration of the information processing apparatus;
- FIG. 3 is a flowchart showing processing of the information processing apparatus;
- FIG. 4 is a diagram showing display states on a screen in the case where the processing shown in FIG. 3 is executed;
- FIG. 5 is a diagram showing display states on the screen in the case where the processing shown in FIG. 3 is executed;
- FIG. 6 is a diagram for explaining the principle used for displaying in a rotational manner objects constituting respective surfaces of a three-dimensional display object or objects such as an album title and an artist name;
- FIG. 7 is a flowchart showing processing when a controller calculates an object rotation angle of a UI (User Interface) object based on a rotation angle of a casing;
- FIG. 8 is a graph showing an expression used for achieving processing of removing an unintentional hand movement and threshold value processing at a limit angle, and showing a relationship between a rotation angle of the casing and an object rotation angle of a UI object;
- FIG. 9 is a diagram showing an example of a case where a background image of the three-dimensional display object is changed in accordance with the rotation angle of the casing from the reference point;
- FIG. 10 is a flowchart showing processing of an information processing apparatus according to another embodiment of the present disclosure;
- FIG. 11 is a diagram showing display states on a screen in the case where the processing shown in FIG. 10 is executed;
- FIG. 12 is a diagram for explaining movement processing for a two-dimensional display object or a UI object such as a hidden icon;
- FIG. 13 is a flowchart showing processing of an information processing apparatus according to still another embodiment of the present disclosure;
- FIG. 14 is a diagram showing display states on a screen in the case where the processing shown in FIG. 13 is executed;
- FIG. 15 is a diagram showing an example of a case where, when the casing is rotated by a critical angle or more, a mail delete icon is displayed on a screen;
- FIG. 16 is a diagram showing an example of a case where, when the casing is rotated by a critical angle or more, an icon for operating a reproduction position of content such as music content is displayed;
- FIG. 17 is a flowchart showing processing of an information processing apparatus according to still another embodiment of the present disclosure;
- FIG. 18 is a diagram showing display states on a screen in the case where the processing shown in FIG. 17 is executed;
- FIG. 19 is a flowchart showing processing of an information processing apparatus according to still another embodiment of the present disclosure;
- FIG. 20 is a diagram showing display states on a screen in the case where the processing shown in FIG. 19 is executed;
- FIG. 21 is a flowchart showing processing of an information processing apparatus according to still another embodiment of the present disclosure;
- FIG. 22 is a diagram showing display states on a screen in the case where the processing shown in FIG. 21 is executed;
- FIG. 23 is a diagram showing an example of a case where, when the user slides a finger or the like at a position of the screen on which a three-dimensional display object is displayed, the three-dimensional display object is rotated; and
- FIG. 24 is a diagram showing an example of a case where an information processing apparatus does not include a display, and a display is separately provided.

Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
-
FIG. 1 is a front view showing aninformation processing apparatus 100 according to a first embodiment of the present disclosure. - As shown in
FIG. 1 , theinformation processing apparatus 100 includes a plate-like casing 10 that is thin in a z-axis direction. Inside thecasing 10, adisplay 11 including ascreen 1 is arranged. On thedisplay 11, atouch panel 12 that detects a contact position of a user's finger or a stylus is arranged. A receiver (not shown) is provided in the vicinity of an upper end portion of thecasing 10 on the front side. A mouthpiece (not shown) is provided in the vicinity of a lower end portion of thecasing 10 on the front side. - The
display 11 is constituted of, for example, a liquid crystal display or an EL (Electro-Luminescence) display. Examples of thetouch panel 12 include aresistive touch panel 12 and acapacitive touch panel 12, but thetouch panel 12 may have any touch panel system. -
FIG. 2 is a block diagram showing an electrical configuration of theinformation processing apparatus 100. - As shown in
FIG. 2 , theinformation processing apparatus 100 includes, in addition to thedisplay 11 andtouch panel 12 described above, a sensor 13, acontroller 14, acommunication unit 15, anantenna 16, aspeaker 17, amicrophone 10, aRAM 19, and aflash memory 20. - The sensor 13 is a sensor that detects the inclination of the information processing apparatus (casing). Examples of the sensor 13 include motion sensors such as angular velocity sensors (for example, vibrating gyro sensor, rotary top gyro sensor), acceleration sensors (for example, piezo-resistive type, piezoelectric type, capacitive type), and angular sensors (for example, geomagnetic sensor). The sensor 13 may be a combination of at least two or more motion sensors such as an angular velocity sensor, an acceleration sensor, and an angular sensor.
- As the motion sensor, a configuration in which a rotation angle of the
casing 10 is detected by one axis, two axes, or three axes is used. The motion sensor may have any configuration, but at least the motion sensor is configured to detect a rotation in the same rotation direction (in this embodiment, about y axis) as the rotation direction of a three-dimensional display object 2 to be described later (seeFIGS. 4 and 5 ). - It should be noted that in the description of the first embodiment, the sensor 13 is described as a motion sensor for triaxial detection for convenience.
- The
communication unit 15 executes the processing such as frequency conversion of radio waves transmitted or received by theantenna 16, modulation, and demodulation. Theantenna 16 transmits or receives radio waves for call or radio waves for packet communication of e-mails, Web data, or the like. - The
speaker 17 includes a D/A (digital/analog) converter, an amplifier, or the like. Thespeaker 17 executes D/A conversion processing and amplification processing with respect to audio data for call, which is input from thecontroller 14, and outputs audio via the receiver (not shown). - The
microphone 18 includes an A/D (analog/digital) converter or the like. Themicrophone 18 converts analog audio data input by a user via the mouthpiece into digital audio data, and outputs the digital audio data to thecontroller 14. The digital audio data output to thecontroller 14 is encoded and then transmitted via thecommunication unit 15 and theantenna 16. - The RAM 19 (Random Access Memory) is a volatile memory used as a work area of the
controller 14. TheRAM 19 temporarily stores various programs and various types of data used for processing of thecontroller 14. - The
flash memory 20 is a nonvolatile memory in which various programs and various types of data necessary for processing of thecontroller 14 are stored. - The
controller 14 is constituted of a CPU (Central Processing Unit) or the like. Thecontroller 14 collectively controls units of theinformation processing apparatus 100, and executes various computations based on various programs. - [Description on Operation]
- Next, the processing of the
information processing apparatus 100 according to the first embodiment will be described.FIG. 3 is a flowchart showing the processing of theinformation processing apparatus 100.FIGS. 4 and 5 are diagrams each showing display states on a screen in the case where the processing shown inFIG. 3 is executed, -
FIG. 4 shows display states on the screen when viewed from the perspective of a user, andFIG. 5 shows display states on the screen when thescreen 1 is viewed from the front side. - As shown in the center and the right part of
FIG. 4 andFIG. 5 , on the left side area of thescreen 1, three-dimensional display objects 2 having a cubic shape are displayed. The three-dimensional display objects 2 are displayed in a rotational manner in accordance with a rotation angle of thecasing 10. The plurality of three-dimensional display objects 2 are located along a y-axis direction. The three-dimensional display objects 2 each include afront surface icon 2 a (first object) on a front surface (first surface), and aside surface icon 2 b (second object) on a side surface (second surface). - In the first embodiment, the
front surface icon 2 a is an icon of an album, and theside surface icon 2 b is an icon of a moving image of a track included in the album. The front surface icon has an image of an album jacket or the like. The side surface icon has an image such as a still image of the moving image. - On the right side of each three-
dimensional display object 2, an album title and an artist name are displayed (seeFIG. 5 ). - With reference to
FIG. 3 , thecontroller 14 determines whether an image displayed on the screen is moved to another image (Step 101). For example, as shown in the left part ofFIG. 4 , in the state where a home image is displayed on the screen, a user touches a position where a specific icon is displayed on the screen. Then, thecontroller 14 moves the display on the screen from the home image to an image as shown in the center ofFIG. 4 and the left part ofFIG. 5 . - When the image is moved (YES of Step 101), the
controller 14 updates a reference point, with an angle of thecasing 10 at that time as a reference (Step 105). The reference point is a reference angle of the rotation angle of thecasing 10. - In the case where the reference point is updated, since the position of the
casing 10 at that time is a reference point, the rotation angle of thecasing 10 at that time is zero. An image displayed on the screen when the rotation angle from the reference point is zero will be hereinafter referred to as a reference image (see center ofFIG. 4 and left part ofFIG. 5 ). - In the reference image, the
controller 14 controls the display on the screen such that the front surface of the three-dimensional display object 2 having a cubic shape faces to the front side of thescreen 1. In other words, in the case where the rotation angle from the reference point is zero, thecontroller 14 controls the display on the screen such that thefront surface icon 2 a (icon of album) arranged on the front side of the three-dimensional display object 2 faces to the front side of thescreen 1. It should be noted that in the case where the rotation angle is zero, thecontroller 14 also displays characters such as an album title and an artist name displayed on the right side of the three-dimensional display object 2 so as to be parallel to the front side of thescreen 1. - In the case where the determination in
Step 101 is negative (NO of Step 101), thecontroller 14 proceeds to thenext Step 102. InStep 102, thecontroller 14 determines whether the rotation angle from the reference point is changed by a predetermined threshold value (for example, about ±90 degrees about y axis, x axis, and z axis) or more. - In the case where the rotation angle from the reference point is a predetermined threshold value or more (YES of Step 102), the
controller 14 updates the reference point (Step 105) and displays the reference image (see center ofFIG. 4 , and left part ofFIG. 5 ) on the screen. - As described above, in the case where the rotation angle from the reference point is a predetermined threshold value or more, the reference point is, updated. As a result, when an operation position of the
information processing apparatus 100 is largely changed, for example, when a user lies down, the change of the operation position can be appropriately supported. - In the case where the rotation angle from the reference point is less than a threshold value (NO of Step 102), the
controller 14 proceeds to the next Step 103. In Step 103, thecontroller 14 determines whether an operation with use of thetouch panel 12 has been absent for a predetermined period of time (for example, about two seconds to three seconds) or more based on an output from thetouch panel 12. - In the case where an operation with use of the
touch panel 12 has been absent for a predetermined period of time or more (YES of Step 103), the reference point is updated using the position of the casing 10 at that time (Step 105). Accordingly, when the operation position of the information processing apparatus 100 is changed, the change of the operation position can be appropriately supported. - In the case where an operation with use of the
touch panel 12 has been made in the predetermined period of time (NO of Step 103), the controller 14 proceeds to the next Step 104. In Step 104, the controller 14 determines whether a change amount of the rotation angle is less than a predetermined threshold value (for example, about ±5 degrees about y axis, x axis, and z axis) in a predetermined period of time (for example, about two seconds to three seconds). - In the case where a change amount of the rotation angle in a predetermined period of time is less than a predetermined threshold value (YES of Step 104), the
controller 14 updates the reference point using the position of the casing 10 at that time and displays a reference image on the screen. Accordingly, when the operation position of the information processing apparatus 100 is changed, the change of the operation position can be appropriately supported. - In the case where a change amount of the rotation angle in a predetermined period of time is a predetermined threshold value or more (NO of Step 104), the
controller 14 proceeds to the next Step 106. In Step 106, the controller 14 calculates a rotation angle of the casing 10 from the reference point based on an output from the sensor 13 (motion sensor). In this case, the controller 14 calculates a rotation angle of the casing 10 about the y axis. - Next, the
controller 14 determines whether the rotation angle of the casing 10 is a limit angle (for example, about 20 degrees to 45 degrees) or more (Step 107). In the case where the rotation angle is less than a limit angle (NO of Step 107), the controller 14 controls the display such that the three-dimensional display object 2 is rotated in accordance with the rotation angle (Step 108) (see FIGS. 4 and 5 ). The controller 14 may also display in a rotational manner characters such as an album title and an artist name in accordance with the rotation angle. -
FIG. 6 is a diagram for explaining the principle used for displaying in a rotational manner objects constituting the respective surfaces of the three-dimensional display object 2 or objects such as an album title and an artist name. It should be noted that in the following description, objects constituting the respective surfaces of the three-dimensional display object 2 or objects such as an album title and an artist name, which are rotated or moved in accordance with a rotation operation of the casing 10, will be hereinafter referred to as a UI (User Interface) object 4. - In Step 108, based on the rotation angle of the
casing 10, the controller 14 calculates a rotation angle of a UI object 4 constituting a surface of the three-dimensional display object 2 or a UI object 4 such as a name of an album (hereinafter, referred to as object rotation angle). Then, the controller 14 displays each UI object 4 in a rotational manner about each axis based on the calculated object rotation angle. Accordingly, the three-dimensional display object 2, the album title, or the like is rotated on the screen in accordance with the rotation angle. - Upon display of the three-dimensional display object in a rotational manner, the
side surface icon 2 b (icon of a moving image of a track included in the album) hidden in the reference image gradually emerges on the screen as the rotation angle from the reference point increases. - Upon display of the three-
dimensional display object 2 in a rotational manner, the controller 14 then determines whether contact of a user's finger or the like is detected with the touch panel 12 (Step 109). In the case where contact is not detected with the touch panel 12 (NO of Step 109), the controller 14 returns to Step 101. - On the other hand, in the case where contact is detected with the touch panel 12 (YES of Step 109), the
controller 14 determines where the contact position is located among rectangular divisional areas 3 divided for each album (see undulating line in FIG. 5 ). Then, a command for a front surface icon 2 a (icon of album) displayed at a position corresponding to that divisional area 3 is issued (Step 110). - In other words, in the case where the rotation angle of the
casing 10 is less than a limit angle, a state where a front surface icon 2 a can be selected and determined (first state) is set. If contact is detected with the touch panel 12 at that time, a command for the front surface icon 2 a is issued. - Upon issue of a command for the
front surface icon 2 a (icon of album), for example, the display on the screen is moved to a selection image of tracks included in the album. - In the case where the rotation angle is a limit angle or more in Step 107 (YES of Step 107), the
controller 14 stops rotation of the three-dimensional display object 2 (Step 111). In this way, in the case where the rotation angle is a limit angle or more, the rotation of the three-dimensional display object 2 is stopped, and accordingly a user can stably operate the three-dimensional display object 2. It should be noted that when the rotation of the three-dimensional display object 2 is stopped, characters such as an album title and an artist name also stop being rotated. - Next, the
controller 14 determines whether contact of a user's finger or the like is detected with the touch panel 12 (Step 112). In the case where contact is not detected with the touch panel 12 (NO of Step 112), the controller 14 returns to Step 101. - On the other hand, in the case where contact is detected with the touch panel 12 (YES of Step 112), the
controller 14 determines where the contact position is located among the rectangular divisional areas 3 divided for each album (see undulating line in FIG. 5 ). Then, a command for a side surface icon 2 b (icon of moving image) displayed at a position corresponding to that divisional area 3 is issued (Step 113). - In other words, in the case where the rotation angle of the
casing 10 is a limit angle or more, a state where a side surface icon 2 b can be selected and determined (second state) is set. If contact is detected with the touch panel 12 at that time, a command for the side surface icon 2 b is executed. - In this embodiment, the
controller 14 switches between the state where the front surface icon 2 a can be selected and determined (first state) and the state where the side surface icon 2 b can be selected and determined (second state) with the limit angle as a boundary. - In Step 113, upon issue of the command for the
side surface icon 2 b (icon of moving image), for example, the display on the screen is moved to a reproduction image of a moving image (moving image of a track included in the album). - [Action Etc.]
- In the
information processing apparatus 100, in the case where input operations such as touch and tap are made with use of the touch panel 12 within the same display area on the screen, an icon to be selected and determined can be differentiated depending on an angle of rotation, which means that different icons are arranged in the same display area. Accordingly, a limited screen area can be used efficiently. - Further, in this embodiment, the icon of the album is arranged on the front surface of the three-
dimensional display object 2, and the icon of the moving image is displayed on the side surface thereof. Accordingly, the user can select an optional album or an optional moving image by a rotation operation of the casing 10 and an input operation to the touch panel 12. - Here, in related art, a selection image of albums and a selection image of moving images are generally separated. Therefore, for example, in the case where display is moved from the album selection image to the moving-image selection image or the like, it has been necessary to move from the album selection image to another image such as a home image once, and then move to the moving-image selection image. Therefore, in related art, many touch operations or much screen moving have been necessary at a time of album selection and moving image selection.
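The selection logic of Steps 107 to 113 above, in which the limit angle switches between the first state (front surface icon 2 a selectable) and the second state (side surface icon 2 b selectable), can be sketched as follows. This is a hypothetical illustration, not code from the disclosure; the names `Album` and `icon_for_touch` and the value of `LIMIT_ANGLE_DEG` are assumptions.

```python
# Hedged sketch of Steps 107 to 113: which icon a touch in a divisional area 3
# selects depends on whether the casing rotation angle has reached the limit angle.
from dataclasses import dataclass

LIMIT_ANGLE_DEG = 30.0  # limit angle (the disclosure gives about 20 to 45 degrees)

@dataclass
class Album:
    front_icon: str  # front surface icon 2 a (icon of album)
    side_icon: str   # side surface icon 2 b (icon of a moving image of a track)

def icon_for_touch(albums, divisional_area_index, rotation_angle_deg):
    """Return the icon whose command is issued for a touch in one divisional area 3."""
    album = albums[divisional_area_index]
    if abs(rotation_angle_deg) < LIMIT_ANGLE_DEG:
        return album.front_icon   # first state: front surface icon 2 a is selectable
    return album.side_icon        # second state: side surface icon 2 b is selectable

albums = [Album("album-0", "movie-0"), Album("album-1", "movie-1")]
print(icon_for_touch(albums, 1, 10.0))  # below the limit angle -> album-1
print(icon_for_touch(albums, 1, 40.0))  # at or above the limit angle -> movie-1
```

Touches outside any divisional area, multi-axis rotation, and the actual command dispatch are omitted from this sketch.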
- On the other hand, in this embodiment, the user can select an optional album or an optional moving image by a rotation operation of the
casing 10 and an input operation to the touch panel 12 as described above. In this way, in this embodiment, many touch operations, screen moving, and the like are not necessary at a time of album selection and moving image selection, for example. Therefore, a load on the user can be relieved. - Further, in this embodiment, the
front surface icon 2 a and the side surface icon 2 b are assumed to be a music icon and a moving image icon, respectively, which are related to a common album. Specifically, the front surface icon 2 a and the side surface icon 2 b are assumed to be icons having mutual association. Accordingly, the user can easily select content mutually associated. - As described above, the
controller 14 calculates an object rotation angle of a UI object 4 (an object constituting a surface of the three-dimensional display object 2, or the like) based on the rotation angle of the casing 10 (see Step 108 of FIG. 3 , and FIG. 6 ). Then, the controller 14 displays each UI object 4 in a rotational manner based on the calculated object rotation angle, thus displaying the three-dimensional display object 2 or the like in a rotational manner. - Hereinafter, an example of a method of calculating the object rotation angle of a
UI object 4 based on the rotation angle of the casing 10 will be described. -
FIG. 7 is a flowchart showing processing when the controller 14 calculates an object rotation angle of a UI object 4 based on a rotation angle of the casing 10. - As shown in
FIG. 7 , the controller 14 acquires the rotation angle of the casing 10 from the sensor 13, and executes processing of removing sensor noise from the rotation angle of the casing 10 (output of sensor 13) (Step 201). In this case, the controller 14 executes processing of averaging the last rotation angle of the casing 10 and the current rotation angle of the casing 10 (low pass filter), thus executing the processing of removing sensor noise. - Next, the
controller 14 executes processing of removing an unintentional hand movement and threshold value processing at a limit angle (Step 203, Step 204). -
FIG. 8 is a graph showing an expression used for achieving processing of removing an unintentional hand movement and threshold value processing at a limit angle, and showing a relationship between a rotation angle of the casing 10 and an object rotation angle of the UI object 4. - Expression (1) below is used in
FIG. 8 :
- θi = a·tan(b·θd) (1)
- θd: rotation angle of the casing 10 (with sensor noise removed)
- θi: object rotation angle
- a, b: arbitrary constants
- It should be noted that in the case where the rotation angle of the
casing 10 is a limit angle or more, the object rotation angle is set to be constant by the threshold value processing. - As shown in
FIG. 8 , in the case where the rotation angle of the casing 10 is less than a certain value, the UI object 4 is hardly rotated by Expression (1) described above. Accordingly, the three-dimensional display object 2 or the like can be prevented from being rotated by an unintentional hand movement or the like. - Further, as shown in
FIG. 8 , in the case where the rotation angle of the casing 10 is a certain value or more, the object rotation angle increases as the rotation angle increases. In this case, the inclination of the graph is sharp as compared to the case where the rotation angle of the casing 10 and the object rotation angle are equal to each other (see broken line of FIG. 8 ). Accordingly, in the case where a user expresses his/her intention to rotate the three-dimensional display object 2 or the like and then rotates the casing 10, the three-dimensional display object 2 can be appropriately rotated in accordance with the user's intention of rotation. - Further, in the case where the rotation angle of the
casing 10 is a limit angle or more, the object rotation angle is set to be constant and the three-dimensional display object 2 or the like stops being rotated. Accordingly, as described above, the user can stably operate the three-dimensional display object 2 or the like. - Here, it is also possible to structure a background image of the three-
dimensional display object 2 so as to change in accordance with the rotation angle of the casing 10 from the reference point. -
FIG. 9 is a diagram showing an example of a case where a background image of the three-dimensional display object 2 is changed in accordance with the rotation angle of the casing 10 from the reference point. - In the example shown in
FIG. 9 , the background image of the three-dimensional display object 2 is a sphere. It should be noted that in FIG. 9 , the three-dimensional display object 2 is transparent for easy viewing of drawings. - In this case, the
controller 14 determines the position of a virtual camera based on the rotation angle of the casing 10 from the reference point (see Step 108 of FIG. 3 ). Then, the controller 14 only has to control the display such that the background image (sphere) of the three-dimensional display object 2 is changed in accordance with the position of the virtual camera. - In this case, since the background image of the three-
dimensional display object 2 is changed in accordance with the rotation angle, the feeling of rotating the three-dimensional display object 2 can be improved. - In the example described above, the
front surface icon 2 a is a music icon and the side surface icon 2 b is a moving image icon. However, the combination of the front surface icon 2 a and the side surface icon 2 b is not limited to the example described above. For example, both the front surface icon 2 a and the side surface icon 2 b may be music icons or may be moving image icons. Alternatively, the front surface icon 2 a may be a moving image icon and the side surface icon 2 b may be a music icon. Alternatively, at least one of the front surface icon 2 a and the side surface icon 2 b may be an icon of a still image (photo or the like). - In those cases, the
front surface icon 2 a and the side surface icon 2 b may have mutual association. - An example in which the
front surface icon 2 a and the side surface icon 2 b are associated with each other will be described. For example, in the case where both the front surface icon 2 a and the side surface icon 2 b are music icons, the front surface icon 2 a and the side surface icon 2 b are music icons of a common artist or a common field (pops, jazz, etc.). - Further, for example, in the case where both the
front surface icon 2 a and the side surface icon 2 b are moving image icons, the front surface icon 2 a and the side surface icon 2 b are moving image icons of a movie, a television program, or the like in a common series. Further, for example, in the case where the front surface icon 2 a is a moving image icon and the side surface icon 2 b is a music icon, the front surface icon 2 a is a moving image icon of a movie or a television program, and the side surface icon 2 b is an icon of music (sound track) used in the movie or the television program. - Examples in which the
front surface icon 2 a and the side surface icon 2 b are mutually associated include a case where the side surface icon 2 b is a context menu of the front surface icon 2 a. - In the example described above, the
side surface icon 2 b is arranged on the right side surface of the three-dimensional display object 2, but it may be arranged on another surface such as a left side surface, a top surface, or a bottom surface of the three-dimensional display object 2. - In the example described above, the plurality of three-dimensional display objects 2 are displayed on the screen, but one three-
dimensional display object 2 may be provided. - In the example described above, the three-
dimensional display object 2 having a cubic shape has been described as an example of the three-dimensional display object 2. However, the shape of the three-dimensional display object 2 is not limited to the cubic shape. For example, the three-dimensional display object 2 may have a rectangular parallelepiped shape elongated in one direction. - The three-
dimensional display object 2 may not be a hexahedron. Typically, the three-dimensional display object 2 may have any configuration as long as it is a polyhedron having more surfaces than a tetrahedron. It should be noted that the three-dimensional display object 2 may be a sphere as long as it is divided into multiple surfaces. It should be noted that also in those cases, a front surface icon 2 a is arranged on a front surface (first surface) of the three-dimensional display object 2 having an optional shape, and another icon is arranged on a surface (second surface) other than the front surface. - In the example described above, the three-
dimensional display object 2 is rotated about the y axis in accordance with the rotation angle of the casing 10 about the y axis. However, it is of course possible to structure the three-dimensional display object 2 so as to be rotated about the x axis in accordance with the rotation angle of the casing 10 about the x axis. It should be noted that a combination thereof is also possible. The direction in which the casing 10 is rotated is not particularly limited, which is also applied to embodiments to be described later. - Next, a second embodiment of the present disclosure will be described. It should be noted that in the description of the second embodiment and subsequent embodiments, members or the like having the same structures and functions as those in the first embodiment are simply described or not described, and different points from those of the first embodiment will mainly be described.
-
FIG. 10 is a flowchart showing processing of an information processing apparatus 100 according to the second embodiment. FIG. 11 is a diagram showing display states on a screen in the case where the processing shown in FIG. 10 is executed. - In Steps 301 to 305 shown in
FIG. 10 , the same processing as that in Steps 101 to 105 shown in FIG. 3 is executed. - In the case where the determination is positive in Steps 301 to 304 (YES of Steps 301 to 304), the
controller 14 updates the reference point (Step 305). Upon update of the reference point, the position of the casing 10 at that time becomes the reference, and the rotation angle of the casing 10 is reset to zero. - In the case where the rotation angle of the
casing 10 is zero, the controller 14 displays a reference image on the screen. In the second embodiment, a reference image is an image shown in the left part of FIG. 11 . - In the left part of
FIG. 11 , it is assumed that the reference image is a music reproduction image. A user performs touch and tap operations with a finger or the like at positions on a lower area of the screen 1, on which rewind, pause, and fast forward icons are displayed, to thereby control reproduction, stop, and the like of music. - At the center of the reference image, a two-dimensional display object (hereinafter, two-dimensional display object 5) moved on the screen in accordance with the rotation of the
casing 10 is displayed. The two-dimensional display object 5 (first object) has an image of a jacket of a track to be reproduced. - In Step 306, the
controller 14 calculates a rotation angle (about y axis) of the casing 10 from the reference point based on an output of the sensor 13. Next, the controller 14 determines whether the rotation angle of the casing 10 is a limit angle (for example, about 20 degrees to 45 degrees) or more (Step 307). - In the case where the rotation angle is less than a limit angle (NO of Step 307), the
controller 14 controls the display such that the two-dimensional display object 5 is moved on the screen in accordance with the rotation angle (Step 308). It should be noted that in this case, the controller 14 controls the display such that the two-dimensional display object 5 is moved while being rotated (see broken line of right part of FIG. 11 ). - Accordingly, as shown in the right part of
FIG. 11 , a hidden icon 6 (second object) hidden behind the two-dimensional display object 5 gradually emerges on the screen as the rotation angle from the reference point increases. - Further, in this case, the
controller 14 controls display such that the hidden icon 6 is also moved on the screen in accordance with the rotation angle of the casing 10, similarly to the two-dimensional display object 5. In this case, the hidden icon 6 is moved in the opposite direction of the movement direction of the two-dimensional display object 5. -
FIG. 12 is a diagram for explaining movement processing for the two-dimensional display object 5 or the UI object 4 such as the hidden icon 6. - As shown in
FIG. 12 , two UI objects 4 are moved in parallel to each other and in the opposite directions in accordance with the rotation angle of the casing 10, and accordingly it is possible for the user to recognize a changed perspective. - The
hidden icon 6 is typically an icon having association with a track to be reproduced in a reference image. For example, the hidden icon 6 is an icon of a track by the same artist as the track to be reproduced, or an icon of a track in the same field (pops, jazz, etc.) as the track to be reproduced. Alternatively, the hidden icon 6 may be an icon of a moving image or a still image, or the like associated with the track to be reproduced. - The
controller 14 calculates movement amounts of the two-dimensional display object 5 and the hidden icon 6 in Step 308 based on the rotation angle of the casing 10. Then, the controller 14 only has to move the two-dimensional display object 5 and the hidden icon 6 on the screen based on the calculated movement amounts. - In the case where the rotation angle of the
casing 10 is a limit angle or more (YES of Step 307), the controller 14 stops movement of the two-dimensional display object 5. In this case, the movement of the hidden icon 6 may be stopped or may not be stopped. - Next, the
controller 14 determines whether the hidden icon 6 is selected with the touch panel 12 (Step 310). In the case where the hidden icon 6 is not selected (NO of Step 310), the controller 14 returns to Step 301. On the other hand, in the case where the hidden icon 6 is selected (YES of Step 310), a command for that hidden icon 6 is issued (Step 311). - [Action Etc.]
- In the second embodiment, when the user rotates the
casing 10, the two-dimensional display object 5 is moved on the screen in accordance with that rotation operation. Then, the hidden icon 6 hidden behind the two-dimensional display object emerges on the screen in accordance with the rotation operation, or hides behind the two-dimensional display object 5. Accordingly, the user can cause the hidden icon 6 to emerge on the screen or hide behind the two-dimensional display object 5 with the feeling of opening or closing a door (two-dimensional display object 5), at a time when the casing 10 is rotated. - Further, in this embodiment, the two-
dimensional display object 5 is moved while being rotated at a time of the movement. Accordingly, the feeling of opening or closing a door (two-dimensional display object 5) becomes even more intuitive. - Further, in this embodiment, the user can select an icon (hidden icon 6) without moving to another image when a track is reproduced. Accordingly, a load on the user can be relieved.
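The movement of Steps 307 to 309 above, where the two-dimensional display object 5 and the hidden icon 6 travel in parallel but opposite directions and stop once the limit angle is reached, can be sketched as below. The pixel gain and function name are assumptions for illustration, not values from the disclosure.

```python
# Hedged sketch of Steps 307 to 309: offsets grow with the casing rotation angle,
# the hidden icon 6 moving opposite to the two-dimensional display object 5,
# and movement stops once the limit angle is reached.
LIMIT_ANGLE_DEG = 30.0   # limit angle (about 20 to 45 degrees in the disclosure)
PIXELS_PER_DEGREE = 6.0  # illustrative movement gain, not a value from the disclosure

def movement_offsets(rotation_angle_deg):
    """Return horizontal offsets in pixels for (object 5, hidden icon 6)."""
    clamped = max(-LIMIT_ANGLE_DEG, min(LIMIT_ANGLE_DEG, rotation_angle_deg))
    offset = clamped * PIXELS_PER_DEGREE
    return offset, -offset  # opposite directions give the changed perspective of FIG. 12

print(movement_offsets(10.0))  # (60.0, -60.0)
print(movement_offsets(50.0))  # clamped at the limit angle: (180.0, -180.0)
```

The rotation applied to object 5 while it moves, and the option of letting the hidden icon 6 keep moving past the limit angle, are left out of this sketch.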
- In the above description, the reference image is a music reproduction image, and the two-
dimensional display object 5 is an image of a jacket of a track to be reproduced. However, the reference image may be a moving-image reproduction image. In this case, the two-dimensional display object 5 is a moving image to be reproduced in the moving-image reproduction image. - In this case, the
hidden icon 6 is typically an icon having association with that moving image. For example, in the case where the moving image is a moving image of a movie or a television program, the hidden icon 6 is an icon of a moving image of another movie or television program, or the like in a common series. Alternatively, in this case, the hidden icon 6 may be an icon of music (sound track) used in another movie or television program, or the like in a common series. - Next, a third embodiment of the present disclosure will be described. It should be noted that in an information processing apparatus according to the third embodiment, a controller can simultaneously execute a plurality of application programs (multitask). Examples of the application programs include a mail program, a browsing program, a music reproduction program, a moving image reproduction program, and a telephone directory management program, but the application programs are not limited thereto.
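Anticipating the flow shown in FIG. 13 below, the page group 8 of this embodiment can be modeled as a list of concurrently running application programs that only becomes visible once the casing rotation reaches the critical angle. All names and the threshold value here are assumptions for illustration, not code from the disclosure.

```python
# Hedged model of the third embodiment: below the critical angle the reference
# image stays unchanged; at or beyond it, pages 7 (one per running application
# program) are shown, and selecting a page executes that program.
CRITICAL_ANGLE_DEG = 30.0  # critical angle (about 20 to 45 degrees)

RUNNING_PROGRAMS = [
    "mail program",
    "browsing program",
    "music reproduction program",
]

def visible_pages(rotation_angle_deg):
    """Return the pages 7 of the page group 8 shown for a given rotation angle."""
    if abs(rotation_angle_deg) < CRITICAL_ANGLE_DEG:
        return []  # display remains in the reference image
    return list(RUNNING_PROGRAMS)

def select_page(rotation_angle_deg, page_index):
    """Return the application program executed when a page 7 is selected."""
    pages = visible_pages(rotation_angle_deg)
    return pages[page_index] if pages else None

print(visible_pages(10.0))   # []
print(select_page(40.0, 2))  # music reproduction program
```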
-
FIG. 13 is a flowchart showing processing of an information processing apparatus 100 according to the third embodiment. -
FIG. 14 is a diagram showing display states on a screen in the case where the processing shown in FIG. 13 is executed. - In Steps 401 to 405 shown in
FIG. 13 , the same processing as that in Steps 101 to 105 shown in FIG. 3 is executed. - In the case where the determination is positive in Steps 401 to 404 (YES of Steps 401 to 404), the
controller 14 updates the reference point (Step 405). In this case, the reference image is displayed on the screen (see left part of FIG. 14 ). In the third embodiment, the reference image (first object) may have any configuration. The reference image is, for example, an image displayed by one of the plurality of application programs. - In Step 406, the
controller 14 calculates a rotation angle (about y axis) of the casing 10 from the reference point based on an output from the sensor 13. Next, the controller 14 determines whether the rotation angle of the casing 10 is a critical angle (for example, about 20 degrees to 45 degrees) or more (Step 407). - In the case where the rotation angle is less than a critical angle (NO of Step 407), the
controller 14 returns to Step 401. In the third embodiment, unlike the embodiments described above, the display on the screen is not changed when the rotation angle is less than a critical angle, and the display remains in the reference image. - On the other hand, in the case where the rotation angle is a critical angle or more (YES of Step 407), the
controller 14 displays an image for selecting one application program from the plurality of application programs (second object) (Step 408) (see right part of FIG. 14 ). This image includes a page group 8 of application programs. - Pages 7 included in the
page group 8 correspond to application programs such as a mail program, a browsing program, a music reproduction program, a moving image reproduction program, and a telephone directory management program. The pages 7 may include images of windows currently being opened. - Upon display of the
page group 8, the controller 14 determines whether a page 7 is selected with the touch panel 12 (Step 409). Methods of selecting a page 7 include a method performed by a user sliding a finger on the touch panel 12 to move the focus and selecting an optional page by releasing the finger. It should be noted that any selection method for the pages 7 may be used. - When determining that a page 7 is selected with the touch panel 12 (YES of Step 409), the
controller 14 executes an application corresponding to the selected page 7 (Step 410). - In the third embodiment, the user can rotate the
casing 10 to cause the page group 8 of the application programs to emerge on the screen, and select an optional application program. Accordingly, the user does not need to repeat screen moving or the like in order to execute another application. Therefore, a load on the user can be relieved. - Next, a fourth embodiment of the present disclosure will be described.
- In the fourth embodiment, in the case where the
casing 10 is rotated by a critical angle or more, a mail delete icon 9 is displayed on the screen. -
FIG. 15 is a diagram showing an example of a case where when the casing 10 is rotated by a critical angle or more, the mail delete icon 9 is displayed on a screen. - In the fourth embodiment, it is assumed that the reference image is an image of an in-box (first object) (see left part of
FIG. 15 ). - The
controller 14 determines whether the rotation angle (about y axis) of the casing 10 from the reference point is a critical angle (about 20 degrees to 45 degrees) or more. In the case where the rotation angle is less than a critical angle, the mail delete icon 9 (second object) is not displayed on the screen (see left part of FIG. 15 ). On the other hand, in the case where the rotation angle is a critical angle or more, the controller 14 displays the mail delete icon 9 on the screen. Upon display of the mail delete icon 9, the mail delete icon 9 enters a selectable state. When detecting an input made by the user via the touch panel 12, the controller 14 executes a command of a mail delete icon 9 displayed at a position corresponding to the contact position, and deletes a mail corresponding thereto. - Here, there is a case where an icon for deleting contents such as the mail delete
icon 9 is mistakenly selected, and the resulting deletion is irreversible. On the other hand, in the information processing apparatus 100 according to the fourth embodiment, an icon 9 for deleting contents is displayed on the screen for the first time when the casing 10 is rotated by a critical angle or more, and enters an operable state. Accordingly, it is possible to prevent the user from mistakenly deleting content when the user does not intend to delete content. - It should be noted that in the above description, the mail delete
icon 9 emerges on the screen for the first time when the casing 10 is rotated by a critical angle or more, but the configuration is not limited thereto. For example, a configuration in which the mail delete icon 9 emerges on the screen while rotating in accordance with the rotation angle (for example, one rotation) is also conceived. Alternatively, there is conceived a case where the mail delete icon 9 emerges on the screen while the color thereof gradually becomes dark in accordance with the rotation angle. - In the above description, the mail delete icon 9 is described as an icon for deleting contents, but icons for deleting contents may be icons for deleting music contents, moving image contents, still image contents, or the like.
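The fourth embodiment's safeguard, in which the delete icon only appears and becomes operable after a deliberate rotation past the critical angle, can be sketched as follows; the function names and threshold value are illustrative assumptions, not code from the disclosure.

```python
# Hedged sketch of the fourth embodiment: the mail delete icon 9 is hidden below
# the critical angle, so a stray tap cannot trigger an irreversible deletion.
CRITICAL_ANGLE_DEG = 30.0  # critical angle (about 20 to 45 degrees)

def delete_icon_selectable(rotation_angle_deg):
    """The mail delete icon 9 is displayed and selectable only at or beyond the angle."""
    return abs(rotation_angle_deg) >= CRITICAL_ANGLE_DEG

def handle_tap(rotation_angle_deg, tap_on_delete_icon):
    """Return True when a tap actually deletes the corresponding mail."""
    return tap_on_delete_icon and delete_icon_selectable(rotation_angle_deg)

print(handle_tap(5.0, True))   # False: icon not shown, an accidental tap deletes nothing
print(handle_tap(40.0, True))  # True: deliberate rotation plus a tap deletes the mail
```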
- Next, a fifth embodiment of the present disclosure will be described. In the fifth embodiment, in the case where the
casing 10 is rotated by a critical angle or more, an icon 21 for operating a reproduction position of content such as music content is displayed. -
FIG. 16 is a diagram showing an example of a case where when the casing 10 is rotated by a critical angle or more, an icon 21 for operating a reproduction position of content such as music content (hereinafter, reproduction position control icon 21) is displayed. - In the fifth embodiment, it is assumed that the reference image is a music reproduction image (first object) (see left part of
FIG. 16 ). A user performs touch and tap operations with a finger or the like at positions on a lower area of the screen 1, on which rewind, pause, and fast forward icons are displayed, to thereby control reproduction, stop, and the like of music. - The
controller 14 determines whether the rotation angle (about x axis) of the casing 10 from the reference point is a critical angle (about 20 degrees to 45 degrees) or more. In the case where the rotation angle is a critical angle or more, the controller 14 displays a reproduction position control icon 21 (second object) at an upper portion of the screen 1 (see center and right part of FIG. 16 ). Upon display of the reproduction position control icon 21 on the screen, the reproduction position control icon 21 enters an operable state. In the case where the reproduction position control icon 21 is operated with the touch panel 12, the controller 14 changes a reproduction position of a track in accordance with a change amount of a contact position on the touch panel 12. The user touches a position at which the reproduction position control icon 21 is displayed and slides a finger in a lateral direction, thus optionally selecting a reproduction position of the track. - Further, in the case where the rotation angle is a critical angle or more, the
controller 14 changes a ratio of the change amount of the reproduction position of the track to the change amount of the contact position on the touch panel 12, in accordance with the rotation angle (see center and right part of FIG. 16). For example, in the case where the rotation angle is small, the controller 14 reduces the ratio (center of FIG. 16), and in the case where the rotation angle is large, the controller 14 increases the ratio (right part of FIG. 16). Accordingly, even in the case where the user slides a finger on the touch panel 12 in a lateral direction at the same speed, the change amount of the reproduction position becomes small (low speed) in the center of FIG. 16, and the change amount of the reproduction position becomes large (high speed) in the right part of FIG. 16. - In the example shown in
FIG. 16, by rotating the casing 10, the user can optionally change the ratio (speed) of the change amount of the reproduction position of the track to the change amount of the contact position on the touch panel 12. - In the above description, the reproduction
position control icon 21 is described as an icon for operating a reproduction position of music contents, but the reproduction position control icon 21 may be an icon for operating a reproduction position of moving image contents. - Next, a sixth embodiment of the present disclosure will be described.
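The ratio control of the fifth embodiment above (a larger rotation angle yields a faster change of the reproduction position for the same finger movement) can be sketched as follows. This is a minimal illustration; the angle range, the ratio bounds, and the pixel-to-seconds scale are assumed values, not values taken from the disclosure.

```python
def scrub_ratio(rotation_deg, critical_deg=30.0, max_deg=60.0,
                min_ratio=0.2, max_ratio=2.0):
    """Ratio of reproduction-position change to finger movement.

    Grows linearly from min_ratio at the critical angle to max_ratio at
    max_deg, clamped outside that range.
    """
    t = max(0.0, min(1.0, (rotation_deg - critical_deg) / (max_deg - critical_deg)))
    return min_ratio + (max_ratio - min_ratio) * t

def new_playback_position(position_s, finger_dx_px, rotation_deg,
                          seconds_per_px=0.05):
    """Apply a lateral slide on the touch panel to the playback position."""
    return position_s + finger_dx_px * seconds_per_px * scrub_ratio(rotation_deg)
```

The same 100-pixel slide thus moves the playback position ten times farther at the steepest assumed tilt than at the critical angle.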
- The sixth embodiment is different from the embodiments described above in that when the user rotates the
casing 10 with a finger or the like in contact with the touch panel 12, the display or the like on the screen is changed. Therefore, that point will mainly be described. -
FIG. 17 is a flowchart showing processing of an information processing apparatus 100 according to the sixth embodiment. FIG. 18 is a diagram showing display states on a screen in the case where the processing shown in FIG. 17 is executed. - The
controller 14 determines whether contact of a user's finger or the like is detected with the touch panel 12 (Step 501). In the case where contact is not detected, the controller 14 returns to Step 501 and determines whether contact of a user's finger or the like is detected with the touch panel 12 again. - In the case where contact is not detected, the display states or the like on the screen are not changed and an image as shown in the left end part of
FIG. 18 is displayed on the screen. - In the left end part of
FIG. 18 , a track selection image 22 (first image) including a plurality of track titles (selection items) is displayed. It should be noted that a state where an image (first image) including selection items such as track titles is displayed on the entire screen will be hereinafter referred to as a first display mode. - In the case where contact is detected with the touch panel 12 (YES of Step 501), the
controller 14 proceeds to the next Step 502. In Step 502, the controller 14 determines whether a rotatable selection item such as a track title is displayed at a position on the screen that corresponds to the contact position of the touch panel 12. - In the case where a rotatable selection item (track title) is not displayed at the contact position (NO of Step 502), the
controller 14 returns to Step 501. - On the other hand, in the case where a rotatable selection item (track title) is displayed at the contact position (YES of Step 502), a position of the
casing 10 at a time when contact is detected is recorded as a reference point (Step 503). - Next, the
controller 14 calculates a rotation angle (about y axis) of the casing 10 from the reference point based on an output from the sensor 13 (motion sensor) (Step 504). - Upon calculation of the rotation angle of the
casing 10, the track selection image 22 is displayed in a rotational manner in accordance with the calculated rotation angle (Step 505) (see center left part and center right part of FIG. 18). Further, in this case, the controller 14 displays a track reproduction image 23 (second image) in a rotational manner in accordance with the calculated rotation angle (see center left part and center right part of FIG. 18). The track reproduction image 23 is a track reproduction image 23 of a track selected on the track selection image 22, and has an image of a jacket or the like of the track (see right end part of FIG. 18). - In Step 505, the
controller 14 calculates rotation angles of the track selection image 22 and the track reproduction image 23 (UI objects 4) based on the rotation angle of the casing 10, and displays the respective images in a rotational manner about the respective axes (see FIG. 6). The track selection image 22 is displayed in a rotational manner with a left end of the screen 1 as a center axis of the rotation. On the other hand, the track reproduction image 23 is displayed in a rotational manner with a right end of the screen 1 as a center axis of the rotation. - It should be noted that in the following description, a state where the UI objects 4 such as the
track selection image 22 and the track reproduction image 23 are displayed in a rotational manner in accordance with the rotation angle of the casing 10 (may be displayed with movement) is referred to as a second display mode. In the second display mode, as the rotation angle of the casing 10 becomes large, the track reproduction image 23 gradually emerges on the screen and the track selection image 22 gradually disappears from the screen. - Upon display of the
track selection image 22 and the track reproduction image 23 in a rotational manner, the controller 14 then determines whether the rotation angle of the casing 10 is a first angle (about 0 degrees to 10 degrees) or more (Step 506). In the case where the rotation angle of the casing 10 is a first angle or more (YES of Step 506), the controller 14 reproduces the selected track (Step 507), and proceeds to Step 508. On the other hand, in the case where the rotation angle of the casing 10 is less than a first angle (NO of Step 506), the controller 14 does not reproduce the track and proceeds to the next Step 508. - In Step 508, the
controller 14 determines whether contact with the touch panel 12 is released based on the output from the touch panel 12. In the case where contact with the touch panel 12 is not released (NO of Step 508), the controller 14 calculates a rotation angle of the casing 10 again (Step 504) and displays the respective images 22 and 23 in a rotational manner in accordance with the calculated rotation angle (Step 505). In other words, during a period of time from the detection of the contact with the touch panel 12 to the release of the contact with the touch panel 12, the controller 14 displays the respective images 22 and 23 in a rotational manner in accordance with the rotation angle of the casing 10. - On the other hand, in the case where contact with the
touch panel 12 is released (YES of Step 508), the controller 14 determines whether the rotation angle of the casing 10 at a time when contact is released is a second angle (about 20 degrees to 45 degrees) or more (Step 509). - In the case where the rotation angle of the
casing 10 at a time when contact is released is less than a second angle (NO of Step 509), the controller 14 displays the track selection image 22 on the screen (Step 510) (center left part of FIG. 18 to left end part of FIG. 18) (second display mode to first display mode). In other words, in the case where the rotation angle of the casing 10 at a time when contact is released is less than a second angle, the selection and determination of a track selected on the track selection image 22 is cancelled and the track selection image 22 is displayed on the screen again. Upon display of the track selection image 22 on the screen, the controller 14 returns to Step 501 again and determines whether the touch panel 12 is touched. - On the other hand, in the case where the rotation angle of the
casing 10 at a time when contact is released is a second angle or more (YES of Step 509), the controller 14 displays the track reproduction image 23 on the screen (Step 511) (center right part of FIG. 18 to right end part of FIG. 18) (second display mode to third display mode). In other words, in the case where the rotation angle of the casing 10 at a time when contact is released is a second angle or more, a track selected on the track selection image 22 is determined and a track reproduction image 23 of the track is displayed on the screen. It should be noted that a state where the track reproduction image 23 (image after selection and determination) is displayed on the entire screen 1 will be hereinafter referred to as a third display mode. - In the state where the
track reproduction image 23 is displayed (third display mode), a user performs touch and tap operations with a finger or the like at positions on a lower area of the screen 1, on which rewind, pause, and fast forward icons are displayed, to thereby control reproduction, stop, and the like of music. It should be noted that in this embodiment, at a time when the second display mode is switched to the third display mode (center right part of FIG. 18 to right end part of FIG. 18), the reproduction of the track has been started (see Step 507). - [Action Etc.]
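The flow of FIG. 17 can be summarized as a small state machine over the three display modes. The sketch below is an illustration under assumed angle values (reproduction starting at 10 degrees, commitment at 30 degrees); the class and attribute names are hypothetical, not the disclosed implementation.

```python
FIRST_ANGLE = 10.0    # assumed angle at which reproduction starts (Step 506)
SECOND_ANGLE = 30.0   # assumed angle at which the selection is committed (Step 509)

class TrackPicker:
    """Touch-and-rotate track selection over the three display modes."""

    def __init__(self):
        self.mode = "selection"   # first display mode (track selection image)
        self.reference = None
        self.playing = False

    def touch_down(self, on_track_title, casing_angle):
        # Steps 501-503: record the reference point when a title is touched.
        if self.mode == "selection" and on_track_title:
            self.reference = casing_angle
            self.mode = "rotating"            # second display mode

    def rotate(self, casing_angle):
        # Steps 504-507: start reproduction once the first angle is exceeded.
        if self.mode == "rotating" and casing_angle - self.reference >= FIRST_ANGLE:
            self.playing = True

    def touch_up(self, casing_angle):
        # Steps 508-511: commit or cancel depending on the release angle.
        if self.mode != "rotating":
            return
        if casing_angle - self.reference >= SECOND_ANGLE:
            self.mode = "reproduction"        # third display mode
        else:
            self.mode = "selection"           # back to the first display mode
            self.playing = False
        self.reference = None
```

Releasing below the second angle cancels back to the selection image; releasing at or above it commits to the reproduction image with playback already running.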
- Through the processing shown in
FIG. 17, the user touches with a finger an area of the track selection image 22 where an optional selection item (track title) is displayed, and rotates the casing 10 with the finger being in contact with the area, and accordingly a track reproduction image 23 of the track can be gradually caused to emerge on the screen. The user visually recognizes the track reproduction image 23 (including an image of a jacket or the like of the track), and accordingly can grasp details or the like of the track. It should be noted that in this embodiment, in the case where the rotation angle of the casing 10 is the first angle or more, reproduction of the track is started, with the result that details of the track selected on the track selection image 22 can be grasped more easily. - Further, if the user releases the finger from the
touch panel 12 in a state where the rotation angle of the casing 10 is less than the second angle, it is possible to cancel the selection item (track title) selected on the track selection image 22 and display the track selection image 22 again on the screen (center left part of FIG. 18 to left end part of FIG. 18). Accordingly, in the case where the track selected on the track selection image 22 is not a desired track, the user can release the finger from the touch panel 12 to display the track selection image 22 again on the screen, and look for a desired track quickly. - On the other hand, in the case where a track selected on the selection image is a desired track, the user rotates the
casing 10 to a large degree (second angle or more) and releases the finger from the touch panel 12, to thereby select and determine the track and cause the track reproduction image 23 to be displayed on the screen (center right part of FIG. 18 to right end part of FIG. 18). - In the case where a specific position of the
track reproduction image 23 is touched in the third display mode (right end part of FIG. 18), the third display mode may be switched to the second display mode (center left part and center right part of FIG. 18). - In the above description, in the second display mode, the
track selection image 22 and the track reproduction image 23 are displayed in a rotational manner in accordance with the rotation angle. However, the track selection image 22 and the track reproduction image 23 may be moved in parallel to each other and in the opposite directions in accordance with the rotation angle in the second display mode (see FIG. 12). Alternatively, there is conceived a case where as the rotation angle becomes large, the color of the track selection image 22 becomes light and the color of the track reproduction image 23 becomes dark. - In the above description, in the second display mode, the
track selection image 22 and the track reproduction image 23 are displayed in a rotational manner in accordance with the rotation angle of the casing 10 about the y axis, with an axis of the y-axis direction as a center axis. On the other hand, the track selection image 22 and the track reproduction image 23 can be configured to be displayed in a rotational manner in accordance with the rotation angle of the casing 10 about the x axis, with an axis of the x-axis direction as a center axis. Alternatively, a combination of the above is also possible. In this case, an image after selection and determination (second image) can be differentiated in accordance with a rotation direction. - In the above description, the first image (image including selection items) is a
track selection image 22 and the second image (image after selection and determination) is a track reproduction image 23. However, the combination of the first image and the second image is not limited to the above. For example, the first image may be a moving-image selection image, and the second image may be a moving-image reproduction image. In this case, the reproduction of the moving image may be started in the case where the rotation angle of the casing 10 is a first angle or more. Alternatively, examples of the combination of the first image and the second image include a case where the first image is a selection image of an application program and the second image is an image of a window displayed by the application program. Typically, if the first image is an image including selection items and the second image is an image obtained after the selection item is selected and determined, the embodiment of the present disclosure can be applied to any cases. - Next, a seventh embodiment of the present disclosure will be described.
-
FIG. 19 is a flowchart showing processing of an information processing apparatus 100 according to the seventh embodiment. FIG. 20 is a diagram showing display states on a screen in the case where the processing shown in FIG. 19 is executed. - As shown in
FIG. 20, in the seventh embodiment, a first image is assumed to be a text image 24 of a foreign language such as English (image including selection items of character information) (see left end part of FIG. 20). The text image 24 of foreign language includes selection items (selection items of character information) such as words and idioms. - Further, in the seventh embodiment, a second image is assumed to be a
translation image 25 of a selected word, idiom, or the like (image including information relating to character information) (see right end part of FIG. 20). - It should be noted that in
FIG. 20, a case is assumed where a word of “optimization” is selected as a selection item, and a translation image 25 of “optimization” emerges as a second image on the screen in accordance with the rotation angle. - The
controller 14 determines whether the touch panel 12 is touched (Step 601), and if contact is detected (YES of Step 601), the controller 14 determines whether a selection item such as a word is displayed at a contact position of the touch panel 12 (Step 602). - In the case where a selection item such as a word is displayed at the contact position (YES of Step 602), a position of the
casing 10 at that time is recorded as a reference point (Step 603), and the rotation angle of the casing 10 is calculated (Step 604). Next, the controller 14 displays the text image 24 of foreign language and the translation image 25 in a rotational manner in accordance with the rotation angle of the casing 10 (Step 605). - During a period of time from the detection of the contact with the
touch panel 12 to the release of the contact, the controller 14 displays the text image 24 of foreign language and the translation image 25 in a rotational manner in accordance with the rotation angle of the casing 10 (loop of NO in Steps 604 to 606) (center left part and center right part of FIG. 20) (second display mode). - In the case where contact with the
touch panel 12 is released (YES of Step 606), the controller 14 determines whether the rotation angle of the casing 10 at a time when contact is released is a predetermined angle (for example, about 20 degrees to 45 degrees) or more (Step 607). In the case where the rotation angle is less than the above-mentioned angle, the controller 14 displays the text image 24 of foreign language on the screen (Step 608) (center left part of FIG. 20 to left end part of FIG. 20) (second display mode to first display mode). On the other hand, in the case of the above-mentioned angle or more, the controller 14 displays the translation image 25 on the screen (Step 609) (center right part of FIG. 20 to right end part of FIG. 20) (second display mode to third display mode). - In the seventh embodiment, the user touches with a finger an area of the
text image 24 of foreign language, in which a word that the user does not know is displayed, and rotates the casing 10 with the finger being in contact with the area. Accordingly, the user can cause a translation image 25 of the word to emerge on the screen. The user visually recognizes the translation image 25, and accordingly can check the meaning of the word. When the check is ended, the user only has to release the finger from the touch panel 12 in a state where the rotation angle of the casing 10 is smaller than the above-mentioned angle. Accordingly, the text image 24 of foreign language can be displayed again. On the other hand, in the case where the user intends to check the meaning of the word in detail, the user only has to release the finger from the touch panel 12 in a state where the casing 10 is rotated to a large degree (by the above-mentioned angle or more). Accordingly, the translation image 25 is displayed on the screen and the meaning of the word can be checked in detail. - In the case where a specific position of the
translation image 25 is touched in the third display mode (right end part of FIG. 20), the third display mode may be switched to the second display mode. - In the above description, the first image is the
text image 24 of foreign language, and the second image is the translation image 25. However, the first image can be a search image on the web (image including a search word), and the second image can be an image of a search result (image including information on the search word). - Next, an eighth embodiment of the present disclosure will be described.
-
FIG. 21 is a flowchart showing processing of an information processing apparatus 100 according to an eighth embodiment. FIG. 22 is a diagram showing display states on a screen in the case where the processing shown in FIG. 21 is executed. - The
controller 14 determines whether the touch panel 12 is touched (Step 701). In the case where contact is detected (YES of Step 701), the controller 14 determines whether a three-dimensional display object 26 is displayed at a position on the screen that corresponds to the contact position (Step 702). - In the case where a three-
dimensional display object 26 is displayed (YES of Step 702), the controller 14 records a position of the casing 10 at a time when contact is detected, as a reference point (Step 703). Next, the controller 14 calculates the rotation angle of the casing 10 after the detection of the contact (Step 704), and displays the three-dimensional display object 26 in a rotational manner in accordance with the rotation angle of the casing 10 (Step 705). - During a period of time from the detection of the contact with the
touch panel 12 to the release of the contact, the controller 14 displays the three-dimensional display object 26 in a rotational manner in accordance with the rotation angle of the casing 10 (loop of NO in Steps 704 to 706). - In the case where contact with the
touch panel 12 is released (YES of Step 706), the controller 14 returns to Step 701 again and determines whether contact with the touch panel 12 is detected. - Through the processing shown in
FIG. 21, as shown in FIG. 22, the user repeats an operation of rotating the casing 10 while touching the touch panel 12 and an operation of rotating the casing 10 in the opposite direction without touching the touch panel 12, to thereby display a side surface, a rear surface, or the like of the three-dimensional display object 26. - It should be noted that in
FIG. 22, an example is shown in which when the casing 10 is rotated about the y axis, the three-dimensional display object 26 is rotated about the y axis. However, when the casing 10 is rotated about the x axis, the three-dimensional display object 26 may be rotated about the x axis. It should be noted that the three-dimensional display object 26 may be displayed in a rotational manner about both the x axis and the y axis. - In
FIG. 22, the three-dimensional display object 26 is a hexahedron. However, the shape of the three-dimensional display object 26 may be another polyhedron such as a tetrahedron, or another shape such as a sphere. The shape of the three-dimensional display object 26 is not particularly limited. - In the case where the three-
dimensional display object 26 has a plurality of surfaces, icons may be assigned to the respective surfaces. In this case, a selectable icon is switched in accordance with the angle of the casing 10 (see first embodiment described above). Accordingly, the user rotates the casing 10 with the finger or the like being in contact with the touch panel 12 to rotate the three-dimensional display object 26, to thereby cause an optional icon to be displayed on the screen for selection. - In the case where the three-
dimensional display object 26 includes a plurality of icons, those icons may be mutually associated. - It should be noted that when icons are assigned to the respective surfaces, an optional icon is selected and determined by an operation such as a tap operation. Accordingly, when the user touches a display position of the three-
dimensional display object 26 in order to rotate the three-dimensional display object 26, an icon can be prevented from being unintentionally selected and determined by the user. - In the case where the user slides a finger or the like at a position of the screen on which the three-
dimensional display object 26 is displayed, the three-dimensional display object 26 may be configured to be rotated. -
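The two ways of rotating the three-dimensional display object described above, rotating the casing while touching its display position and sliding a finger across it, can be sketched together as follows. The class name and the degrees-per-pixel sensitivity are assumed values for illustration only.

```python
DEG_PER_PX = 0.5   # assumed sensitivity of the slide gesture

class ObjectRotator:
    """Accumulates the object's yaw from either input path."""

    def __init__(self):
        self.object_yaw = 0.0
        self._touch_ref = None    # casing angle recorded at touch down

    def touch_down(self, casing_yaw):
        self._touch_ref = casing_yaw          # reference point (Step 703)

    def casing_rotated(self, casing_yaw):
        # Rotate the object only while the touch panel is held (Steps 704-705).
        if self._touch_ref is not None:
            self.object_yaw += casing_yaw - self._touch_ref
            self._touch_ref = casing_yaw

    def finger_slid(self, dx_px):
        # Slide gesture: rotate by the change amount of the contact position.
        if self._touch_ref is not None:
            self.object_yaw += dx_px * DEG_PER_PX

    def touch_up(self):
        self._touch_ref = None                # rotating back now has no effect
```

Because casing rotation is ignored while the finger is released, repeating touch-rotate-release-rotate-back ratchets the object around, exposing its side and rear surfaces.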
FIG. 23 is a diagram showing an example of a case where when the user slides a finger or the like at a position of the screen on which the three-dimensional display object 26 is displayed, the three-dimensional display object 26 is rotated. - As shown in the left end part and the center left part of
FIG. 23, the user touches with a finger or the like a position where the three-dimensional display object 26 is displayed, and rotates the casing 10 while maintaining this state. Then, the three-dimensional display object 26 is displayed in a rotational manner on the screen (Steps 701 to 706). - In addition, as shown in the center right part and the right end part of
FIG. 23, when the user slides the finger or the like at a position of the screen on which the three-dimensional display object 26 is displayed, the three-dimensional display object 26 is rotated. In this case, the controller 14 only has to display the three-dimensional display object 26 in a rotational manner based on information of the change amount of the contact position from the touch panel 12. Accordingly, the user can rotate the three-dimensional display object 26 by rotating the casing 10 while touching the touch panel 12, or can rotate the three-dimensional display object 26 by sliding a finger at a position where the three-dimensional display object 26 is displayed. Accordingly, a surface such as a rear surface of the three-dimensional display object 26 can be displayed on the screen with ease. - In the description of
FIG. 23, when the user rotates the casing 10 while touching a position at which the three-dimensional display object 26 is displayed, the three-dimensional display object 26 is displayed in a rotational manner, but the present disclosure is not limited thereto. The example shown in FIG. 22 can also be applied to a configuration in which the reference point is automatically updated, which has been described in the first to fifth embodiments. - In the embodiments described above, the motion sensor has been exemplified as the sensor 13 that detects the inclination of the information processing apparatus. However, the sensor 13 that detects the inclination of the information processing apparatus is not limited to the motion sensor. For example, for the sensor 13, image sensors such as a CCD (Charge Coupled Device) sensor and a CMOS (Complementary Metal Oxide Semiconductor) sensor may be used. In this case, an image sensor is provided on the front side of the
casing 10. - In this case, the
controller 14 determines a position or an angle of the face of the user based on an image captured by the image sensor. The controller 14 can detect the inclination of the information processing apparatus with respect to the face of the user based on a change of the position of the face within the image or a change of the angle of the face. - It should be noted that in the case where the image sensor is used as the sensor 13 in the first to fifth embodiments, the reference point is obtained when the face of the user is positioned in front of the
casing 10. - In recent years, there is a case where an image sensor 13 such as a CCD sensor 13 is arranged on the front side of the
casing 10 for the purpose of video chat or the like. The image sensor 13 may be effectively used as a sensor 13 for calculating the rotation angle. In this case, cost reduction is achieved. - Alternatively, the sensor 13 may be a combination of a motion sensor and an image sensor. In this case, the position accuracy of a reference point, calculation accuracy of a rotation angle, or the like can be improved.
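The image-sensor approach described above can be sketched with a simple pinhole-camera approximation: the horizontal shift of the detected face in the captured image, relative to the reference point recorded when the face is in front of the casing, is converted to an inclination angle. The focal length is an assumed value, and a real implementation would obtain the face position from a face detector and the camera's calibration data.

```python
import math

def inclination_from_face_shift(face_dx_px, focal_length_px=600.0):
    """Approximate the casing's rotation (degrees) relative to the user's face.

    face_dx_px is how far the face center has shifted horizontally in the
    image since the reference point was recorded.
    """
    return math.degrees(math.atan2(face_dx_px, focal_length_px))
```

With the assumed 600-pixel focal length, a face shift equal to the focal length corresponds to a 45-degree inclination.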
- There is conceived a case where the information processing apparatus does not include a display and a display is separately provided.
-
FIG. 24 is a diagram showing an example of a case where an information processing apparatus 101 does not include a display 11, and a display is separately provided. - The
information processing apparatus 101 shown in FIG. 24 does not include a display 11. A UI such as a three-dimensional display object 27 is displayed on a screen of a display apparatus 102 such as a liquid crystal display apparatus or a television apparatus. In this case, when a user rotates the casing 10 while touching the touch panel 12, the three-dimensional display object 27 is displayed in a rotational manner on the screen. - The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-199819 filed in the Japan Patent Office on Sep. 7, 2010, the entire content of which is hereby incorporated by reference.
- It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Claims (20)
1. An information processing apparatus, comprising:
a display having a screen;
a sensor configured to detect an inclination; and
a controller configured to display a first object on the screen and display a second object associated with the first object on the screen in accordance with the inclination detected by the sensor.
2. The information processing apparatus according to claim 1 , further comprising a touch panel configured to detect contact, wherein
the second object is an object operable with the touch panel.
3. The information processing apparatus according to claim 2 , wherein
the controller switches availability of an operation of the second object using the touch panel in accordance with the inclination.
4. The information processing apparatus according to claim 3 , wherein
the first object is an object operable with the touch panel.
5. The information processing apparatus according to claim 4 , wherein
the controller switches a first state where the first object is operable with the touch panel and a second state where the second object is operable with the touch panel in accordance with the inclination.
6. The information processing apparatus according to claim 5 , wherein
the controller displays in a rotational manner a three-dimensional display object having a first surface serving as the first object and a second surface serving as the second object in accordance with the inclination, to thereby display the second object on the screen in accordance with the inclination.
7. The information processing apparatus according to claim 6 , wherein
the controller controls display such that the three-dimensional display object is hardly rotated in a case where an angle of the inclination is less than a predetermined value, and such that a rotation speed of the three-dimensional display object becomes higher as the inclination increases in a case where the angle of the inclination is the predetermined value or more.
8. The information processing apparatus according to claim 6 , wherein
the controller determines a position of a virtual camera in accordance with the inclination, and changes a background image of the three-dimensional display object in accordance with the position of the virtual camera.
9. The information processing apparatus according to claim 3 , wherein
the controller moves the first object and the second object in directions different from each other in accordance with the inclination, to thereby display the second object on the screen in accordance with the inclination.
10. The information processing apparatus according to claim 9 , wherein
the controller moves the first object while rotating the first object.
11. The information processing apparatus according to claim 3 , wherein
the first object is an image displayed when content is reproduced,
the second object is an object for operating a reproduction position of the content, and
the controller changes, in a case where the second object is operated with the touch panel, the reproduction position in accordance with a change amount of a contact position of the touch panel.
12. The information processing apparatus according to claim 11 , wherein
the controller changes a ratio of a change amount of the reproduction position of the content to the change amount of the contact position of the touch panel, in accordance with the inclination.
13. The information processing apparatus according to claim 3 , wherein
the controller can simultaneously execute a plurality of application programs,
the first object is an image displayed by one of the plurality of application programs, and
the second object is an image for selecting one of the plurality of application programs.
14. The information processing apparatus according to claim 3 , wherein
the first object is an image indicating content, and
the second object is an icon for deleting content.
15. The information processing apparatus according to claim 1 , wherein
the controller can update a reference point serving as a reference of the inclination.
16. The information processing apparatus according to claim 15 , wherein
the controller determines whether an angle of the inclination is a predetermined threshold value or more, and updates the reference point when the angle of the inclination is the predetermined threshold value or more.
17. The information processing apparatus according to claim 15 , wherein
the controller determines whether the contact with the touch panel is not detected for a predetermined period of time or more and updates, in the case where the contact is not detected for the predetermined period of time or more, a position of the information processing apparatus at that time as the reference point.
18. The information processing apparatus according to claim 15 , wherein
the controller determines whether a change amount of the inclination is less than a predetermined threshold value within a predetermined period of time and updates, in the case where an angle of the inclination is less than the predetermined threshold value, a position of the information processing apparatus at that time as the reference point.
19. A program causing an information processing apparatus to execute:
displaying a first object on a screen of a display; and
displaying a second object associated with the first object on the screen in accordance with an inclination detected by a sensor.
20. A control method, comprising:
displaying a first object on a screen of a display; and
displaying a second object associated with the first object on the screen in accordance with an inclination detected by a sensor.
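The program of claim 19 and the method of claim 20 reduce to the same two steps. A minimal sketch, assuming a list-based screen buffer and a hypothetical 15-degree visibility threshold (the claims say only that the second object is displayed "in accordance with" the inclination detected by the sensor):

```python
def render(frame, tilt_deg, threshold_deg=15.0):
    """Draw the first object unconditionally; reveal the associated
    second object in accordance with the detected inclination.

    `frame` stands in for the display's screen buffer, and the
    threshold is an assumed policy; the claims do not fix how the
    inclination maps to the second object's display.
    """
    objects = ["first_object"]              # always displayed (step 1)
    if abs(tilt_deg) >= threshold_deg:      # sensor-driven reveal (step 2)
        objects.append("second_object")
    frame.extend(objects)
    return frame
```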
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/317,488 US20140306915A1 (en) | 2010-09-07 | 2014-06-27 | Information processing apparatus, program, and control method |
US15/429,400 US10088916B2 (en) | 2010-09-07 | 2017-02-10 | Information processing apparatus, program, and control method |
US15/710,088 US10120462B2 (en) | 2010-09-07 | 2017-09-20 | Information processing apparatus, program, and control method |
US16/168,318 US10635191B2 (en) | 2010-09-07 | 2018-10-23 | Information processing apparatus, program, and control method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010199819A JP5664036B2 (en) | 2010-09-07 | 2010-09-07 | Information processing apparatus, program, and control method |
JP2010-199819 | 2010-09-07 ||
US13/212,362 US8786636B2 (en) | 2010-09-07 | 2011-08-18 | Information processing apparatus, program, and control method |
US14/317,488 US20140306915A1 (en) | 2010-09-07 | 2014-06-27 | Information processing apparatus, program, and control method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/212,362 Continuation US8786636B2 (en) | 2010-09-07 | 2011-08-18 | Information processing apparatus, program, and control method |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/429,400 Continuation US10088916B2 (en) | 2010-09-07 | 2017-02-10 | Information processing apparatus, program, and control method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140306915A1 (en) | 2014-10-16 |
Family
ID=44674366
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/212,362 Active 2032-01-19 US8786636B2 (en) | 2010-09-07 | 2011-08-18 | Information processing apparatus, program, and control method |
US14/317,488 Abandoned US20140306915A1 (en) | 2010-09-07 | 2014-06-27 | Information processing apparatus, program, and control method |
US15/429,400 Active US10088916B2 (en) | 2010-09-07 | 2017-02-10 | Information processing apparatus, program, and control method |
US15/710,088 Active US10120462B2 (en) | 2010-09-07 | 2017-09-20 | Information processing apparatus, program, and control method |
US16/168,318 Active US10635191B2 (en) | 2010-09-07 | 2018-10-23 | Information processing apparatus, program, and control method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/212,362 Active 2032-01-19 US8786636B2 (en) | 2010-09-07 | 2011-08-18 | Information processing apparatus, program, and control method |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/429,400 Active US10088916B2 (en) | 2010-09-07 | 2017-02-10 | Information processing apparatus, program, and control method |
US15/710,088 Active US10120462B2 (en) | 2010-09-07 | 2017-09-20 | Information processing apparatus, program, and control method |
US16/168,318 Active US10635191B2 (en) | 2010-09-07 | 2018-10-23 | Information processing apparatus, program, and control method |
Country Status (6)
Country | Link |
---|---|
US (5) | US8786636B2 (en) |
EP (2) | EP3543825B1 (en) |
JP (1) | JP5664036B2 (en) |
CN (2) | CN202548818U (en) |
BR (1) | BRPI1103785A2 (en) |
TW (1) | TWI459284B (en) |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5664036B2 (en) * | 2010-09-07 | 2015-02-04 | Sony Corporation | Information processing apparatus, program, and control method |
JP6207023B2 (en) * | 2011-05-09 | 2017-10-04 | Koninklijke Philips N.V. | Rotate objects on the screen |
US9792017B1 (en) | 2011-07-12 | 2017-10-17 | Domo, Inc. | Automatic creation of drill paths |
US9202297B1 (en) * | 2011-07-12 | 2015-12-01 | Domo, Inc. | Dynamic expansion of data visualizations |
JP5586545B2 (en) * | 2011-09-09 | 2014-09-10 | Nintendo Co., Ltd. | Game system, portable game device, information processor control method, and information processor control program |
WO2013067392A1 (en) * | 2011-11-02 | 2013-05-10 | Hendricks Investment Holdings, Llc | Device navigation icon and system, and method of use thereof |
KR101905038B1 (en) * | 2011-11-16 | 2018-10-08 | Samsung Electronics Co., Ltd. | Apparatus having a touch screen under multiple applications environment and method for controlling thereof |
US10642444B2 (en) * | 2011-12-28 | 2020-05-05 | Panasonic Intellectual Property Management Co., Ltd. | Image display control device, and image display control method |
JP5938987B2 (en) * | 2012-03-28 | 2016-06-22 | Sony Corporation | Information processing apparatus, information processing method, and program |
KR101233955B1 (en) * | 2012-05-17 | 2013-02-15 | 권오형 | Device and method for user-centric configuration of icon in main screen |
KR20130136174A (en) * | 2012-06-04 | 2013-12-12 | 삼성전자주식회사 | Method for providing graphical user interface and apparatus for the same |
US9429997B2 (en) * | 2012-06-12 | 2016-08-30 | Apple Inc. | Electronic device with wrapped display |
CN102799361A (en) * | 2012-06-21 | 2012-11-28 | Huawei Device Co., Ltd. | Method for calling application object out and mobile terminal |
JP5966665B2 (en) * | 2012-06-26 | 2016-08-10 | Sony Corporation | Information processing apparatus, information processing method, and recording medium |
CN102750102A (en) * | 2012-06-28 | 2012-10-24 | Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. | Mobile terminal and operation control element position replacing method |
US9196219B1 (en) | 2012-07-18 | 2015-11-24 | Amazon Technologies, Inc. | Custom color spectrum for skin detection |
CN102750107A (en) * | 2012-08-02 | 2012-10-24 | Shenzhen Jingwei Technology Co., Ltd. | Single-hand operation method of large-screen handheld electronic device and device |
CN103631469B (en) * | 2012-08-21 | 2016-10-05 | Lenovo (Beijing) Co., Ltd. | Icon display processing method, device, and electronic device |
US9697649B1 (en) | 2012-09-04 | 2017-07-04 | Amazon Technologies, Inc. | Controlling access to a device |
US9218114B1 (en) * | 2012-09-04 | 2015-12-22 | Amazon Technologies, Inc. | Providing time-dependent items |
US9167404B1 (en) | 2012-09-25 | 2015-10-20 | Amazon Technologies, Inc. | Anticipating data use in a wireless device |
KR20150086367A (en) * | 2012-11-19 | 2015-07-27 | Wikipad, Inc. | Virtual multiple sided virtual rotatable user interface icon queue |
TWI540492B (en) * | 2012-12-20 | 2016-07-01 | MStar Semiconductor, Inc. | Electronic device and electronic device controlling method |
CN103135987A (en) * | 2013-02-22 | 2013-06-05 | Beijing Xiaomi Technology Co., Ltd. | Dynamic icon display method and device |
US9671926B2 (en) | 2013-02-22 | 2017-06-06 | Xiaomi Inc. | Method and terminal device for displaying dynamic icon |
US9595140B2 (en) * | 2013-03-15 | 2017-03-14 | Bosch Automotive Service Solutions Inc. | Graphical user interface with search function |
JP6171452B2 (en) * | 2013-03-25 | 2017-08-02 | Seiko Epson Corporation | Image processing apparatus, projector, and image processing method |
US9319589B2 (en) | 2013-05-31 | 2016-04-19 | Sony Corporation | Device and method for capturing images and selecting a desired image by tilting the device |
US10210841B1 (en) * | 2013-07-19 | 2019-02-19 | Yelp Inc. | Pull-to-view image user interface feature |
KR20150018264A (en) * | 2013-08-09 | 2015-02-23 | LG Electronics Inc. | Wearable glass-type device and control method thereof |
US9355094B2 (en) * | 2013-08-14 | 2016-05-31 | Google Inc. | Motion responsive user interface for realtime language translation |
KR20150101915A (en) * | 2014-02-27 | 2015-09-04 | Samsung Electronics Co., Ltd. | Method for displaying 3 dimension graphic user interface screen and device for performing the same |
JP6125455B2 (en) * | 2014-03-26 | 2017-05-10 | Kyocera Document Solutions Inc. | Image forming system, portable terminal, and image display program |
CN103970500B (en) * | 2014-03-31 | 2017-03-29 | Xiaomi Technology Co., Ltd. | Picture display method and device |
US9619016B2 (en) | 2014-03-31 | 2017-04-11 | Xiaomi Inc. | Method and device for displaying wallpaper image on screen |
US10178291B2 (en) * | 2014-07-23 | 2019-01-08 | Orcam Technologies Ltd. | Obtaining information from an environment of a user of a wearable camera system |
EP3002666A1 (en) | 2014-10-02 | 2016-04-06 | Huawei Technologies Co., Ltd. | Interaction method for user interfaces |
CN115048007B (en) * | 2014-12-31 | 2024-05-07 | Advanced New Technologies Co., Ltd. | Device and method for adjusting interface operation icon distribution range and touch screen device |
WO2016138620A1 (en) | 2015-03-02 | 2016-09-09 | Huawei Technologies Co., Ltd. | Method for displaying desktop icons and mobile terminal |
JP5981591B1 (en) * | 2015-03-17 | 2016-08-31 | Colopl, Inc. | Computer program and computer system for controlling object operations in an immersive virtual space |
CN106293029B (en) * | 2015-05-30 | 2020-12-08 | Shenzhen Futaihong Precision Industry Co., Ltd. | Portable electronic device and camera module control method thereof |
CN105915986A (en) * | 2016-04-13 | 2016-08-31 | Shenzhen TCL Digital Technology Co., Ltd. | Intelligent television and icon control method thereof |
JP6508122B2 (en) * | 2016-05-11 | 2019-05-08 | Kyocera Document Solutions Inc. | Operation input device, portable terminal and operation input method |
JP6903935B2 (en) * | 2017-02-17 | 2021-07-14 | Sony Group Corporation | Information processing systems, information processing methods, and programs |
US11474838B2 (en) * | 2018-10-31 | 2022-10-18 | Verizon Patent And Licensing Inc. | Modifying content interface based upon level of activity |
JP7307568B2 (en) | 2019-03-20 | 2023-07-12 | Nintendo Co., Ltd. | Image display system, image display program, display control device, and image display method |
JP2020053088A (en) * | 2019-12-11 | 2020-04-02 | Canon Inc. | Control device, control method, and program |
CN111142722A (en) * | 2019-12-20 | 2020-05-12 | Gree Electric Appliances, Inc. of Zhuhai | Method, device, terminal equipment and storage medium for displaying application content |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070124699A1 (en) * | 2005-11-15 | 2007-05-31 | Microsoft Corporation | Three-dimensional active file explorer |
US20100088639A1 (en) * | 2008-10-08 | 2010-04-08 | Research In Motion Limited | Method and handheld electronic device having a graphical user interface which arranges icons dynamically |
US20100093400A1 (en) * | 2008-10-10 | 2010-04-15 | Lg Electronics Inc. | Mobile terminal and display method thereof |
US20100125816A1 (en) * | 2008-11-20 | 2010-05-20 | Bezos Jeffrey P | Movement recognition as input mechanism |
US20120026098A1 (en) * | 2010-07-30 | 2012-02-02 | Research In Motion Limited | Portable electronic device having tabletop mode |
US8543415B2 (en) * | 2008-11-26 | 2013-09-24 | General Electric Company | Mobile medical device image and series navigation |
US8847992B2 (en) * | 2008-08-22 | 2014-09-30 | Google Inc. | Navigation in a three dimensional environment using an orientation of a mobile device |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10232757A (en) * | 1997-02-19 | 1998-09-02 | Sharp Corp | Media selector |
US20020067378A1 (en) * | 2000-12-04 | 2002-06-06 | International Business Machines Corporation | Computer controlled user interactive display interfaces with three-dimensional control buttons |
JP2002175139A (en) * | 2000-12-07 | 2002-06-21 | Sony Corp | Information processor, menu display method and program storage medium |
US6798429B2 (en) * | 2001-03-29 | 2004-09-28 | Intel Corporation | Intuitive mobile device interface to virtual spaces |
US7187389B2 (en) * | 2001-04-12 | 2007-03-06 | International Business Machines Corporation | System and method for simultaneous display of multiple object categories |
JP2004164375A (en) * | 2002-11-14 | 2004-06-10 | Canon Inc | Information processor |
JP3977780B2 (en) | 2003-06-20 | 2007-09-19 | Hitachi, Ltd. | Gas turbine |
US20050114791A1 (en) * | 2003-11-20 | 2005-05-26 | International Business Machines Corporation | Cueing mechanism that indicates a display is able to be scrolled |
KR100608589B1 (en) * | 2004-07-24 | 2006-08-03 | Samsung Electronics Co., Ltd. | Three dimensional motion graphic user interface and method and apparatus for providing this user interface |
US20070028187A1 (en) * | 2005-08-01 | 2007-02-01 | Goro Katsuyama | Apparatus and method for performing display processing, and computer program product |
KR100746008B1 (en) * | 2005-10-31 | 2007-08-06 | Samsung Electronics Co., Ltd. | Three dimensional motion graphic user interface, apparatus and method for providing the user interface |
US8068121B2 (en) * | 2007-06-29 | 2011-11-29 | Microsoft Corporation | Manipulation of graphical objects on a display or a proxy device |
JP2009017486A (en) * | 2007-07-09 | 2009-01-22 | Victor Co Of Japan Ltd | Content reproducing device |
US20090187862A1 (en) * | 2008-01-22 | 2009-07-23 | Sony Corporation | Method and apparatus for the intuitive browsing of content |
JP4767990B2 (en) * | 2008-04-14 | 2011-09-07 | Square Enix Co., Ltd. | Game device and program |
US9582049B2 (en) * | 2008-04-17 | 2017-02-28 | Lg Electronics Inc. | Method and device for controlling user interface based on user's gesture |
WO2010012097A1 (en) * | 2008-07-29 | 2010-02-04 | Bce Inc. | An integrated media player/navigation control tool |
KR101602363B1 (en) * | 2008-09-11 | 2016-03-10 | LG Electronics Inc. | Controlling method of 3 dimension user interface switchover and mobile terminal using the same |
CN102203850A (en) * | 2008-09-12 | 2011-09-28 | 格斯图尔泰克公司 | Orienting displayed elements relative to a user |
US8645871B2 (en) * | 2008-11-21 | 2014-02-04 | Microsoft Corporation | Tiltable user interface |
US8279184B2 (en) * | 2009-01-27 | 2012-10-02 | Research In Motion Limited | Electronic device including a touchscreen and method |
JP5251590B2 (en) | 2009-02-24 | 2013-07-31 | Panasonic Corporation | Flat speaker |
KR101558207B1 (en) * | 2009-09-14 | 2015-10-07 | LG Electronics Inc. | Mobile Terminal And Method Of Setting Items Using The Same |
KR20110037657A (en) * | 2009-10-07 | 2011-04-13 | Samsung Electronics Co., Ltd. | Method for providing GUI by using motion and display apparatus applying the same |
US10528221B2 (en) * | 2009-12-31 | 2020-01-07 | International Business Machines Corporation | Gravity menus for hand-held devices |
US8947355B1 (en) | 2010-03-25 | 2015-02-03 | Amazon Technologies, Inc. | Motion-based character selection |
US8581905B2 (en) | 2010-04-08 | 2013-11-12 | Disney Enterprises, Inc. | Interactive three dimensional displays on handheld devices |
KR101340797B1 (en) * | 2010-09-01 | 2013-12-11 | Pantech Co., Ltd. | Portable Apparatus and Method for Displaying 3D Object |
JP5664036B2 (en) * | 2010-09-07 | 2015-02-04 | Sony Corporation | Information processing apparatus, program, and control method |
US8910081B2 (en) * | 2011-04-11 | 2014-12-09 | Microsoft Corporation | Push notifications for updating multiple dynamic icon panels |
2010
- 2010-09-07 JP JP2010199819A patent/JP5664036B2/en active Active

2011
- 2011-08-18 US US13/212,362 patent/US8786636B2/en active Active
- 2011-08-24 TW TW100130286A patent/TWI459284B/en not_active IP Right Cessation
- 2011-08-30 BR BRPI1103785-7A patent/BRPI1103785A2/en not_active IP Right Cessation
- 2011-08-31 CN CN2011203291388U patent/CN202548818U/en not_active Expired - Lifetime
- 2011-08-31 EP EP19168274.9A patent/EP3543825B1/en active Active
- 2011-08-31 EP EP11179465.7A patent/EP2426574B1/en active Active
- 2011-08-31 CN CN2011102599466A patent/CN102446007A/en active Pending

2014
- 2014-06-27 US US14/317,488 patent/US20140306915A1/en not_active Abandoned

2017
- 2017-02-10 US US15/429,400 patent/US10088916B2/en active Active
- 2017-09-20 US US15/710,088 patent/US10120462B2/en active Active

2018
- 2018-10-23 US US16/168,318 patent/US10635191B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20120056878A1 (en) | 2012-03-08 |
TWI459284B (en) | 2014-11-01 |
BRPI1103785A2 (en) | 2013-01-15 |
US8786636B2 (en) | 2014-07-22 |
US10120462B2 (en) | 2018-11-06 |
US20180011554A1 (en) | 2018-01-11 |
US10088916B2 (en) | 2018-10-02 |
CN202548818U (en) | 2012-11-21 |
US20170153716A1 (en) | 2017-06-01 |
EP3543825A1 (en) | 2019-09-25 |
US20190056802A1 (en) | 2019-02-21 |
JP2012058900A (en) | 2012-03-22 |
EP2426574A2 (en) | 2012-03-07 |
CN102446007A (en) | 2012-05-09 |
EP2426574B1 (en) | 2019-04-10 |
US10635191B2 (en) | 2020-04-28 |
JP5664036B2 (en) | 2015-02-04 |
TW201232387A (en) | 2012-08-01 |
EP2426574A3 (en) | 2015-03-18 |
EP3543825B1 (en) | 2022-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10635191B2 (en) | Information processing apparatus, program, and control method | |
US9958971B2 (en) | Information processing apparatus, program, and control method | |
US20210081560A1 (en) | Device, method, and graphical user interface for accessing an application in a locked device | |
JP5951781B2 (en) | Multidimensional interface | |
US9367279B2 (en) | Display device and method of controlling therefor | |
US20110001628A1 (en) | Map information display device, map information display method and program | |
KR20230007515A (en) | Method and system for processing detected gestures on a display screen of a foldable device | |
KR20100061259A (en) | Input device for portable device and method thereof | |
JP2016081302A (en) | Display control apparatus, control method thereof, program, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIYAZAWA, YUSUKE;SUZUKI, SEIJI;OKUMURA, YASUSHI;SIGNING DATES FROM 20140702 TO 20140710;REEL/FRAME:033314/0626 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |