WO2012090920A1 - Content display device, content display method, and recording medium - Google Patents

Content display device, content display method, and recording medium Download PDF

Info

Publication number
WO2012090920A1
Authority
WO
WIPO (PCT)
Prior art keywords
content
display
pop
amount
housing
Prior art date
Application number
PCT/JP2011/080033
Other languages
French (fr)
Japanese (ja)
Inventor
樹利 杉山
Original Assignee
Sharp Corporation (シャープ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corporation
Publication of WO2012090920A1 publication Critical patent/WO2012090920A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals

Definitions

  • the present invention relates to a content display device, a content display method, and a recording medium, and more particularly, to a content display device, a content display method, and a recording medium for displaying stereoscopic content.
  • Patent Document 1 (Japanese Patent Application Laid-Open No. 2007-019666) discloses a technique for changing the stereoscopic image (multi-viewpoint image) to be displayed based on the observer's gaze direction.
  • the present invention has been conceived in view of such circumstances, and its purpose is to change the display content of the three-dimensional content in accordance with the movement of the observer's body.
  • a content display device includes a housing, a display device, a memory for storing stereoscopic content, a control unit for displaying the stereoscopic content stored in the memory on the display device, and a detection unit for detecting the inclination of the housing in a first direction.
  • when the detection unit detects a tilt in one orientation of the first direction, the control unit changes the display so as to increase the pop-out amount of the stereoscopic content.
  • when the detection unit detects a tilt in the other orientation, different from the one orientation of the first direction, the control unit changes the display so as to decrease the pop-out amount of the stereoscopic content.
  • the detection unit detects the amount of inclination of the housing in the first direction. Further, the control unit changes the pop-out amount of the three-dimensional content by an amount corresponding to the tilt amount.
  • control unit changes the pop-out amount of the three-dimensional content on condition that the amount of inclination exceeds a specific amount.
  • the detection unit detects the inclination of the housing in the second direction intersecting with the first direction.
  • the control unit changes the display target of the stereoscopic content on the display device along the second direction.
  • the stereoscopic content display method is a stereoscopic content display method executed in a content display device including a housing, a display device, and a memory for storing the stereoscopic content.
  • the display method includes a step of detecting the inclination of the housing in the first direction, and a step of changing the display so as to increase the pop-out amount of the stereoscopic content when a tilt in one orientation of the first direction is detected, and so as to decrease the pop-out amount when a tilt in the other orientation, different from the one orientation, is detected.
  • a recording medium stores a computer-readable content display program for causing a computer having a housing, a display device, and a memory for storing stereoscopic content to function as the content display device.
  • the program stored on the recording medium causes the computer to execute a step of detecting the inclination of the housing in the first direction, and a step of changing the display so as to increase the pop-out amount of the stereoscopic content when a tilt in one orientation of the first direction is detected, and so as to decrease the pop-out amount when a tilt in the other orientation is detected.
  • the content display device changes the pop-out amount of the stereoscopic content displayed on the display device according to the inclination of the housing in the first direction.
  • the display content of the stereoscopic content displayed on the display device can be substantially changed according to the movement of the body of the observer having the housing.
  • FIGS. 5 to 7 are diagrams for explaining the relationship between the direction of movement of the housing and the pop-out amount of an object in the stereoscopic content in the mobile phone of FIG. 1.
  • FIG. 8 is a flowchart of the image pop-out amount adjustment processing executed by the CPU (Central Processing Unit) of FIG. 3.
  • FIG. 9 is a diagram schematically illustrating an example of the relationship between the tilt of the housing and the pop-out amount of an object of the stereoscopic content in the mobile phone of FIG. 1.
  • FIG. 10 is a diagram for explaining the acceleration detection directions of the acceleration sensor in the mobile phone of FIG. 1.
  • FIG. 1 is a diagram illustrating an appearance of a mobile phone 100 that is an example of a content display device.
  • the mobile phone 100 has a display 101.
  • the mobile phone 100 will be described as a representative example of the “content display device”.
  • the content display device may be another information device having a display, such as a PND (Personal Navigation Device), a PDA (Personal Data Assistance), a game machine, an electronic dictionary, an electronic book reader device, or a personal computer.
  • the mobile phone 100 displays three-dimensional content (including a 3D (three-dimensional) still image and a 3D moving image, hereinafter also referred to as “3D content” as appropriate) in addition to the conventional planar content. That is, the mobile phone 100 can display the object as if the object is protruding from the display 101 of the mobile phone 100.
  • methods that use dedicated glasses include the liquid crystal active shutter glasses method and the deflector plate method.
  • examples of methods that do not use dedicated glasses include a parallax barrier method and a lenticular method.
  • the 3D content display method according to the present embodiment may be any method, and is not limited to the above method.
  • the mobile phone 100 is covered with a casing 150 at its outer shell.
  • the housing 150 includes a display 101 on its main surface.
  • the housing 150 also has, on its main surface, a plurality of buttons constituting the operation unit 104, a speaker 108 that outputs sound, and a microphone (hereinafter referred to as a mic) 107.
  • An operation switch 104A that constitutes a part of the operation unit 104 is provided on a side surface of the housing 150.
  • the operation switch 104A includes a member accommodated in the housing 150 when pressed.
  • FIG. 2 shows a state in which the housing 150 is held in the left hand of the user (the viewer of the stereoscopic content displayed on the display 101) and the operation switch 104A is pressed by the user and accommodated in the housing 150.
  • FIG. 3 is a diagram illustrating a hardware configuration of the mobile phone 100 according to the present embodiment.
  • the mobile phone 100 includes a display 101, a touch sensor 102, an operation unit 104, a communication interface 105, an acceleration sensor 106, a microphone 107, a speaker 108, a memory interface 109, a CPU 110, a memory 111, and a RAM (Random Access Memory) 112.
  • the touch sensor 102 is provided on the display 101.
  • the display 101 and the touch sensor 102 constitute a touch panel 103.
  • the touch sensor 102 receives an input of information by a touch operation from the outside.
  • the display 101 displays various information by being controlled by the CPU 110.
  • the touch sensor 102 detects a touch operation with a user's finger or stylus pen, and inputs the coordinates where the touch operation is performed to the CPU 110.
  • the touch sensor 102 detects a contact area with the touch sensor 102 when a touch operation is performed, and inputs the contact area to the CPU 110.
  • the touch sensor 102 detects a pressure when the touch sensor 102 is pressed in a touch operation on the touch sensor 102 and inputs the pressure to the CPU 110.
  • the operation unit 104 includes a plurality of buttons such as the operation switch 104A, and the touch sensor 102. However, the operation unit 104 may be configured by only the operation switch 104A, or the operation switch 104A may be provided in the mobile phone 100 as a software button displayed on the touch panel 103.
  • the communication interface 105 is controlled by the CPU 110 to perform data communication with an external terminal or server via a network.
  • the CPU 110 receives content data and programs from an external server via the communication interface 105.
  • the microphone 107 receives sound and generates a sound signal.
  • the microphone 107 inputs an audio signal to the CPU 110.
  • the speaker 108 outputs various information (for example, voice message, beep sound, etc.) by being controlled by the CPU 110.
  • the memory interface 109 reads data from the recording medium 200, which can be attached to and detached from the housing 150, and inputs the data to the CPU 110; it also stores data from the CPU 110 on the recording medium 200.
  • the recording medium 200 is, for example, a CD-ROM (Compact Disc-Read Only Memory), DVD-ROM (Digital Versatile Disk-Read Only Memory), USB (Universal Serial Bus) memory, memory card, FD (Flexible Disk), hard disk, magnetic tape, cassette tape, MO (Magneto-Optical Disc), MD (Mini Disc), IC (Integrated Circuit) card (excluding memory cards), optical card, mask ROM, EPROM, or EEPROM (Electronically Erasable Programmable Read-Only Memory), that is, a medium that stores the program in a nonvolatile manner.
  • the memory 111 is realized by various RAMs, ROMs (Read-Only Memory), hard disks, and the like.
  • alternatively, the memory 111 may be a nonvolatile storage medium used via a reading interface, such as a USB memory, CD-ROM, DVD-ROM, memory card, FD, hard disk, magnetic tape, cassette tape, MO, MD, IC card (memory card), optical card, mask ROM, EPROM, or EEPROM.
  • the memory 111 stores a control program executed by the CPU 110, 3D content data, and the like.
  • the 3D content data includes data for displaying a 3D still image and data for displaying a 3D moving image.
  • CPU 110 executes various programs stored in memory 111.
  • the process in the mobile phone 100 (for example, the process shown in the flowchart of FIG. 8) is realized by each hardware and software executed by the CPU 110.
  • Such software may be stored in the memory 111 in advance, or may be stored in a recording medium and distributed as a program product. Alternatively, the software may be provided as a program product that can be downloaded by an information provider connected to a network.
  • Such software is read from the recording medium by using a reading device (not shown), or downloaded by using the communication interface 105 and temporarily stored in the memory 111.
  • the CPU 110 stores the software in the form of an executable program in the memory 111 and then executes the program.
  • the program includes not only a program that can be directly executed by a processor such as the CPU 110 but also a program in a source program format, a compressed program, and an encrypted program.
  • the RAM 112 functions as a work area when the CPU 110 executes a program.
  • the acceleration sensor 106 detects the acceleration applied to the housing 150 and may be housed inside the housing 150 or attached to the outside of the housing 150. The acceleration sensor 106 detects acceleration in, for example, three axial directions.
  • FIG. 10 is a diagram for explaining the acceleration detection direction of the acceleration sensor 106.
  • as shown in FIG. 10, the acceleration sensor 106 detects acceleration in three axial directions, including the left-right direction (X-axis direction) and the depth direction (Z-axis direction) of the housing 150.
  • FIG. 4 is a functional block diagram of the mobile phone 100.
  • the mobile phone 100 includes a central processing unit 191, a pop-out amount acquisition unit 192, a display information generation unit 193, an attitude detection unit 194, and a communication unit 195 as its functions.
  • the central processing unit 191 controls the operation of the mobile phone 100 as a whole, and is realized by the CPU 110 executing a program, for example.
  • the pop-out amount acquisition unit 192 acquires the pop-out amount with which each 3D content item (each object) to be displayed on the display 101 is to be displayed, based on the state of the mobile phone 100, and is realized, for example, by the CPU 110 executing a program.
  • the display information generation unit 193 generates the image data to be displayed on the display 101, and is realized, for example, by the CPU 110 executing a program.
  • the display information generation unit 193 may be realized by a dedicated circuit such as a graphic board.
  • the posture detection unit 194 detects the inclination of the housing 150 and is realized by the acceleration sensor 106, for example.
  • the communication unit 195 executes processing for communicating with other devices, and is realized by the communication interface 105, for example.
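  • as a rough illustration of this functional decomposition, the following Python sketch models the blocks as plain classes; all class and method names are hypothetical and chosen only for readability, not taken from the patent.

```python
# Minimal sketch of the functional blocks of mobile phone 100 (names are illustrative).

class PostureDetectionUnit:
    """Corresponds to block 194: reports the tilt of the housing (e.g. from an accelerometer)."""
    def __init__(self, accel_sensor):
        self.accel_sensor = accel_sensor

    def read_tilt(self):
        # Returns (x, y, z) acceleration, from which the tilt of housing 150 is derived.
        return self.accel_sensor.read()


class PopOutAmountAcquisitionUnit:
    """Corresponds to block 192: maps the device state (tilt) to a pop-out amount."""
    def acquire(self, tilt_z):
        # Placeholder mapping; the real device looks the value up in memory 111.
        return -tilt_z


class DisplayInformationGenerationUnit:
    """Corresponds to block 193: generates image data (e.g. a stereo pair) for display 101."""
    def generate(self, content, pop_out_amount):
        return {"content": content, "pop_out": pop_out_amount}


class CentralProcessingUnit:
    """Corresponds to block 191: coordinates the other blocks."""
    def __init__(self, posture, acquisition, generation):
        self.posture, self.acquisition, self.generation = posture, acquisition, generation

    def update_display(self, content):
        _, _, z = self.posture.read_tilt()
        amount = self.acquisition.acquire(z)
        return self.generation.generate(content, amount)
```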
  • the amount of pop-out of the three-dimensional content (object) displayed on the display 101 can be controlled by changing the inclination of the housing 150.
  • FIG. 5A shows a state in which the user holds the mobile phone 100 by hand and views the stereoscopic display content displayed on the display 101.
  • the pop-out amount of the three-dimensional content displayed on the display 101 at this time is schematically shown in FIG.
  • FIG. 5B schematically shows the degree of pop-out from the display surface 901 of the display 101 when the object 900 of the three-dimensional content is displayed.
  • the user's eyes looking at the display 101 are schematically shown as E.
  • FIG. 6A shows a state in which the mobile phone 100 is tilted in the direction of arrow A12 from the state shown in FIG. 5A.
  • that is, when the surface on which the display 101 is provided is regarded as the front of the housing 150, the state shown in FIG. 6A corresponds to a state in which the upper end of the housing 150 is tilted rearward.
  • in this case, the display of the display 101 is controlled so that the pop-out amount of the displayed stereoscopic content decreases. This is based on the detection of backward acceleration (acceleration on the + side in the Z-axis direction in FIG. 10) in the Z-axis direction by the acceleration sensor 106.
  • specifically, the pop-out amount of the stereoscopic content shown in FIG. 5B is changed as shown in FIG. 6B.
  • in FIG. 6B, compared to the state shown in FIG. 5B, the amount of protrusion of the object 900 with respect to the display surface 901 is changed in a direction away from the user's eyes E.
  • FIG. 7A shows a state in which the mobile phone 100 is tilted in the direction of arrow A11 from the state shown in FIG. 5A.
  • that is, when the surface on which the display 101 is provided is regarded as the front of the housing 150, the state shown in FIG. 7A corresponds to a state in which the upper end of the housing 150 is tilted forward.
  • in this case, the display of the display 101 is controlled so that the pop-out amount of the displayed stereoscopic content increases. This is based on the detection of forward acceleration (acceleration on the - side in the Z-axis direction in FIG. 10) in the Z-axis direction by the acceleration sensor 106.
  • specifically, the pop-out amount of the stereoscopic content shown in FIG. 5B is changed as shown in FIG. 7B.
  • in FIG. 7B, compared to the state shown in FIG. 5B, the amount of protrusion of the object 900 with respect to the display surface 901 is changed in a direction approaching the user's eyes E.
  • FIG. 8 is a flowchart of processing (image pop-out amount adjustment processing) performed by the CPU 110 for adjusting the pop-out amount of the stereoscopic content displayed on the display 101.
  • the CPU 110 executes an image pop-out amount adjustment process in the background when an application for displaying the three-dimensional content on the display 101 is executed, thereby adjusting the pop-out amount of the three-dimensional content.
  • CPU 110 first initializes acceleration sensor 106 in step S10, and advances the process to step S20.
  • step S20 the CPU 110 determines whether or not the operation switch 104A is turned on. If it is determined that the operation switch 104A is turned on, the process proceeds to step S30. If it is determined that the operation switch 104A is not turned on, the process returns to step S10.
  • the operation switch 104A being turned on refers to a state in which the operation switch 104A is pressed and accommodated in the housing 150, as described with reference to FIG. 2. The CPU 110 determines whether or not the operation switch 104A is turned on, for example, by detecting the energization state of a predetermined portion or the presence or absence of a potential change in an electric circuit constituting the mobile phone 100.
  • step S30 the CPU 110 determines whether or not the acceleration sensor 106 has changed in the acceleration detection state in the Z-axis direction (see FIG. 10) after the operation switch 104A is turned on in step S20. If it is determined that the change has occurred, the process proceeds to step S40, and if the change is not detected, the process returns to step S20.
  • in step S40, the CPU 110 acquires the pop-out amount of the stereoscopic content according to the inclination of the housing 150, generates image data so that the stereoscopic content displayed on the display 101 at that time is displayed with that pop-out amount, and displays the image data on the display 101.
  • the inclination of the casing 150 is determined based on the detected value of the acceleration in the Z-axis direction of the acceleration sensor 106 detected in step S30.
  • the memory 111 stores the pop-up amount of the three-dimensional content in association with the inclination of the housing 150.
  • for example, in association with a tilt in the direction of arrow A12 in FIG. 5A (that is, a positive acceleration detected in the Z-axis direction), the memory 111 stores a pop-out amount smaller than that of the initial state (FIG. 5B), as shown in FIG. 6B.
  • in association with a tilt in the direction of arrow A11 in FIG. 5A (that is, a negative acceleration detected in the Z-axis direction), the memory 111 stores a pop-out amount larger than that of the initial state (FIG. 5B), as shown in FIG. 7B.
  • after updating the display of the display 101 in step S40 so that the pop-out amount of the stereoscopic content changes, the CPU 110 advances the process to step S50.
  • step S50 the CPU 110 determines whether or not the operation switch 104A is turned off. If it is determined that the operation switch 104A is not turned off, the process returns to step S30. If it is determined that the operation switch 104A is turned off, the process returns to step S10.
  • the operation switch 104A being turned off refers to, for example, a state in which the pressing of the operation switch 104A has been released and it is no longer accommodated in the housing 150.
  • CPU 110 determines whether or not operation switch 104A is turned off by detecting the energization state of a predetermined portion or the presence or absence of a potential change in the electric circuit in mobile phone 100.
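  • a compact way to see the control flow of FIG. 8 is as a polling loop. The sketch below is a hedged Python rendering of steps S10 to S50; the sensor, switch, and display objects and their method names are assumptions introduced only for illustration, not the actual firmware interface.

```python
import time

def pop_out_adjustment_loop(accel_sensor, operation_switch, display, lookup_pop_out):
    """Illustrative rendering of steps S10-S50 in FIG. 8 (not the actual implementation)."""
    while True:
        accel_sensor.initialize()                      # S10: initialize the acceleration sensor
        # S20: wait until the operation switch 104A is pressed (accommodated in the housing)
        while not operation_switch.is_on():
            time.sleep(0.01)
        reference_z = accel_sensor.read_z()            # zero-tilt reference taken at switch-on
        while operation_switch.is_on():                # S50: leave this loop when the switch is released
            z = accel_sensor.read_z()
            if z != reference_z:                       # S30: Z-axis detection state has changed
                tilt = z - reference_z
                pop_out = lookup_pop_out(tilt)         # S40: pop-out amount associated with this tilt
                display.show_with_pop_out(pop_out)     # S40: regenerate and display the image data
            time.sleep(0.01)
        # switch turned off -> return to S10
```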
  • as described above, when the tilt of the casing 150 in the Z-axis direction changes during the period in which the operation switch 104A is ON (pressed), the pop-out amount of the displayed stereoscopic content is changed according to the inclination.
  • note that such a change in the pop-out amount of the stereoscopic content is not limited to the period during which the operation switch 104A is ON (pressed).
  • the pop-out amount of the stereoscopic content may be changed according to the inclination of the casing 150 in the Z-axis direction without requiring the special operation of pressing the operation switch 104A.
  • in that case, the processes in steps S20 and S50 in FIG. 8 are omitted. Specifically, after the acceleration sensor 106 is initialized in step S10, the state of the acceleration sensor 106 is monitored in step S30, and when it is determined that the state has changed, the display of the content on the display 101 is updated in step S40 so that the content is displayed with the pop-out amount corresponding to the change.
  • the form of the part that is operated as a condition for changing the pop-out amount of the three-dimensional content is not limited to a button shape like the operation switch 104A.
  • for example, a small hole that can be covered with a human finger may be provided on the side surface of the housing 150 where the operation switch 104A is provided, and the pop-out amount of the stereoscopic content may be adjusted according to the inclination of the housing 150 only during the period in which the small hole is covered with the user's finger or the like. That is, the process of step S20 may be changed so that it is determined whether the small hole is covered with the user's finger or the like, and if it is covered, the process proceeds to step S30.
  • likewise, the process of step S50 may be changed so that it is detected whether the small hole is covered with the user's finger or the like, and the process returns to step S10 when it is not covered.
  • the CPU 110 determines whether the small hole is covered with a finger or the like based on, for example, whether light entering from the outside of the housing 150 through the small hole can be detected. Specifically, a light receiving element that receives light entering through the small hole is provided inside the housing 150, and the CPU 110 determines that the small hole is covered with a finger or the like when the amount of light received by the light receiving element is less than a predetermined amount, and that it is not covered when the amount is equal to or greater than the predetermined amount.
  • the memory 111 stores the pop-out amount of the stereoscopic content in association with each of the positive acceleration values and the negative acceleration values detected in the Z-axis direction.
  • the pop-out amount stored in the memory 111 may be stored in multiple stages not only depending on whether it is positive or negative but also corresponding to the magnitude of the acceleration value.
  • in this way, the mobile phone 100 can control the pop-out amount of the stereoscopic content displayed on the display 101 not only according to whether the housing 150 is tilted, but also according to the magnitude of the tilt (the tilt angle and/or the magnitude of the force applied to the housing 150 when tilting it).
  • the acceleration value and the pop-out amount may be associated by a function or the like.
  • when the inclination of the casing 150 is within a specific range, the pop-out amount of the three-dimensional content is not changed.
  • the change amount of the pop-out amount of the three-dimensional content with respect to the inclination of the casing 150 is indicated by a line L1.
  • here, the amount of change refers to the amount by which the pop-out amount changes from the pop-out amount displayed at that time (for example, at the time when acceleration is detected in step S30 in FIG. 8).
  • the tilt of the casing refers to the tilt in the Z-axis direction, and zero tilt is referenced to the detection output of the acceleration sensor 106 at the time when the operation switch 104A is turned on in step S20.
  • while the tilt remains within the specific range, the pop-out amount of the three-dimensional content is not changed, so that it is possible to avoid changing the pop-out amount in response to a slight shaking of the housing 150 when the user does not intend to change it.
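  • the behavior sketched by line L1 (no change inside a small tilt range, then a change that grows with the tilt) can be written as a simple function, as in the hedged sketch below; the threshold and gain values are arbitrary placeholders, not values from the patent.

```python
def pop_out_change(tilt, dead_zone=0.05, gain=10.0):
    """Change in pop-out amount for a given Z-axis tilt (illustrative, cf. line L1).

    tilt > 0 (housing tilted backward, arrow A12) -> pop-out decreases;
    tilt < 0 (housing tilted forward, arrow A11)  -> pop-out increases.
    """
    if abs(tilt) <= dead_zone:
        return 0.0                       # slight shaking of the housing is ignored
    effective = abs(tilt) - dead_zone    # only the part beyond the threshold counts
    return -gain * effective if tilt > 0 else gain * effective

# Example: a tilt well inside the dead zone produces no change.
assert pop_out_change(0.01) == 0.0
```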
  • in the embodiment described above, the pop-out amount of the three-dimensional content displayed on the display 101 is changed according to the change in the inclination of the housing 150 in the depth direction, as described with reference to FIGS. 5A to 7A.
  • in addition to this, the display position of the three-dimensional content may be changed according to the tilt in the left-right direction.
  • in the following modification, the display mode of the three-dimensional content is also changed when the housing is tilted in the left-right direction.
  • FIG. 12 is a diagram schematically illustrating an example of the display mode of the stereoscopic content on the display 101 when the housing 150 is in the state illustrated in FIG. 11E. In FIG. 12, the stereoscopic content 181 is displayed at approximately the center of the display 101 in the left-right direction.
  • FIG. 13A is a diagram showing an example of the display mode of the stereoscopic content on the display 101 when the housing 150 is tilted to the left, as shown in FIG. 11F, from the state shown in FIG. 11E.
  • in FIG. 13A, the display position of the stereoscopic content shown in FIG. 12 is indicated by a broken line 182 for reference.
  • in FIG. 13A, the stereoscopic content 181 is displayed to the left of the position shown in FIG. 12.
  • FIG. 13B is a diagram showing an example of the display mode of the stereoscopic content on the display 101 when the housing 150 is tilted to the right, as shown in FIG. 11D, from the state shown in FIG. 11E.
  • in FIG. 13B, the display position of the stereoscopic content shown in FIG. 12 is indicated by a broken line 182 for reference.
  • in FIG. 13B, the stereoscopic content 181 is displayed to the right of the position shown in FIG. 12.
  • the CPU 110 may change the display position of the stereoscopic content by changing the position (location) of the area used for displaying the stereoscopic content in the display area of the display 101. That is, when an application for displaying stereoscopic content on the display 101 is executed, the CPU 110 adjusts the movement amount of the stereoscopic content in the background according to the amount of inclination of the casing 150 in the left-right direction. Thus, the position (place) of the area used for displaying the stereoscopic content in the display area of the display 101 is changed.
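  • one plausible way to implement the shift just described is to offset the drawing region horizontally in proportion to the left-right (X-axis) tilt. The sketch below assumes a pixel coordinate system, a sign convention for the tilt, and an invented scaling constant; it is an illustration, not the patent's implementation.

```python
def shifted_region(screen_width, content_width, tilt_x, pixels_per_unit_tilt=200):
    """Return the left x-coordinate of the area used to draw the stereoscopic content.

    Assumed convention: tilt_x < 0 (housing tilted to the left, FIG. 11F) moves the
    content to the left; tilt_x > 0 (tilted to the right, FIG. 11D) moves it to the right.
    """
    centered_left = (screen_width - content_width) // 2
    offset = int(tilt_x * pixels_per_unit_tilt)
    left = centered_left + offset
    # Keep the drawing area inside the display area of display 101.
    return max(0, min(left, screen_width - content_width))
```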
  • FIG. 18 shows content 1201, which is an example of three-dimensional content.
  • the content 1201 is a part of a larger content 1200 (FIG. 19).
  • when the acceleration sensor 106 housed in the housing 150 detects acceleration in the Z-axis direction while the content 1201, which is a part of the content 1200, is displayed on the display 101, the range displayed on the display 101 is changed.
  • more specifically, when the housing 150 is tilted toward the back side from the state shown in FIG. 5A, as shown in FIG. 6A, the content displayed on the display 101 is changed to the content 1211.
  • FIG. 21 shows the position of the content 1211 in the content 1200.
  • FIG. 23 shows the position of the content 1212 in the content 1200.
  • the content 1211 is a display range located on the near side of the content 1201 in the content 1200.
  • the content 1212 is a display range located on the far side of the content 1201 in the content 1200.
  • a broken line DL represents a line where the surface represented by the contents 1201, 1211, and 1212 and the display 101 intersect.
  • the amount of change in the display range on the content 1200 may be determined according to the amount of tilt detected by the acceleration sensor 106. Further, when determining the change amount, it is preferable not to change the display range while the inclination is relatively small, as described above.
  • as an example of such content, a map displayed in three dimensions can be cited.
  • when tags are attached to the content and an application for displaying the content is activated, an image of a range centered on the object corresponding to a tag specified by default is displayed on the display 101.
  • the content includes a plurality of objects, and further includes tags corresponding to each of one or more objects among the plurality of objects.
  • the plurality of tags are stored in association with the arrangement of the objects corresponding to each tag on the map.
  • when the housing 150 is tilted in the direction of arrow A12, the designated tag is changed to the tag arranged at the next position in the direction associated with the direction of arrow A12 in the content. Then, the display on the display 101 is updated so as to be centered on the object corresponding to the changed tag.
  • tags are included for object 1281 and object 1282, and the direction corresponding to the direction of arrow A12 in content 1200 is the direction indicated by arrow A120.
  • a range 1291 centered on the object 1281 is displayed on the display 101.
  • when the housing 150 is tilted in the direction of arrow A12 (FIG. 5A), the range displayed on the display 101 is changed from the range 1291 to the range 1292.
  • a range 1292 is a range centered on the object 1282.
  • the arrangement associated with the tag of the object 1282 is positioned next to the arrangement associated with the tag of the object 1281 in the direction of the arrow A120.
  • that is, when the housing 150 is tilted in the direction of the double arrow A10, the center of the display range may be moved along the direction of the double arrow A81 of the map, and the object that is currently the center of the display may be changed to the tagged object arranged next on the D2 side.
  • when it is detected in step S30 that the housing 150 is tilted in the direction of arrow A11, the range displayed on the display 101 in the map content 801 may be changed so as to be centered on the next tagged object on the D1 side of the double arrow A81.
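  • this tag-based navigation can be pictured as walking an ordered list of tagged objects: a tilt in one orientation advances to the next tag along the associated direction, and a tilt in the other orientation steps back. The sketch below uses a minimal, assumed data model (a plain ordered list of tag names), not the patent's actual storage format.

```python
def next_tag_index(current_index, tags, tilt_direction):
    """Pick the tag whose object becomes the new display center.

    tags is assumed to be ordered along the direction associated with arrow A12
    (arrow A120 on the map). tilt_direction is "A12" (tilt backward) to advance
    and "A11" (tilt forward) to go back; other values leave the selection unchanged.
    """
    if tilt_direction == "A12":
        return min(current_index + 1, len(tags) - 1)
    if tilt_direction == "A11":
        return max(current_index - 1, 0)
    return current_index

# Example with the objects of the description: starting at object 1281, an A12 tilt
# selects object 1282, so the display range changes from range 1291 to range 1292.
tags = ["object_1281", "object_1282"]
assert tags[next_tag_index(0, tags, "A12")] == "object_1282"
```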
  • in the embodiment described above, the pop-out amount of the three-dimensional content displayed on the display 101 is changed according to the change in the inclination of the housing 150 in the depth direction (front-rear direction), as described with reference to FIGS. 5A to 7A.
  • the content display mode may be changed according to the tilt in the left-right direction.
  • the display mode of the three-dimensional content is also changed when tilted in the left-right direction.
  • FIG. 18 shows content 1201 as an example of three-dimensional content.
  • the content 1201 is a part of a larger content 1200 (FIG. 19).
  • when the content 1201, which is a part of the content 1200, is displayed on the display 101 in the state of FIG. 11E (FIG. 18) and the user tilts the housing 150 clockwise as shown in FIG. 11D, the content displayed on the display 101 is changed to the content 1202.
  • the content 1202 is a display range located on the left side of the content 1201 in the content 1200.
  • similarly, when the user tilts the housing 150 counterclockwise as shown in FIG. 11F from the state in which the content 1201 is displayed, the content displayed on the display 101 is changed to the content 1203. As shown in FIG. 27, the content 1203 is a display range located on the right side of the content 1201 in the content 1200.
  • a broken line DL represents a line where the surface represented by the contents 1201 to 1203 and the display 101 intersect.
  • in this modification, while displaying the stereoscopic content, the mobile phone 100 operates in a mode for changing the pop-out amount and in a mode for moving the display range of the content left and right, in accordance with the inclination of the housing 150.
  • FIG. 17 shows a flowchart of an example of the image pop-out amount adjustment process executed in the present modification.
  • the image pop-out amount adjustment processing of the present modification corresponds to a process in which steps S60 to S80 are added to the image pop-out amount adjustment processing shown in FIG. 8. The contents will be specifically described below.
  • CPU 110 first initializes acceleration sensor 106 in step S10 and advances the process to step S20.
  • step S20 the CPU 110 determines whether or not the operation switch 104A is turned on. If it is determined that the operation switch 104A is turned on, the process proceeds to step S30. If it is determined that the operation switch 104A is not turned on, the process proceeds to step S60.
  • step S30 the CPU 110 determines whether or not the acceleration sensor 106 has changed in the acceleration detection state in the X-axis direction or the Z-axis direction (see FIG. 10) after the operation switch 104A is turned on in step S20. If it is determined that there is a change, the process proceeds to step S40. If no change is detected, the process returns to step S20.
  • step S40 the CPU 110 acquires the pop-up amount of the three-dimensional content according to the inclination of the casing 150, and generates image data so that the three-dimensional content displayed on the display 101 at that time is displayed with the pop-out amount.
  • the display 101 is updated so that the image data is displayed on the display 101, and the process proceeds to step S50.
  • step S50 the CPU 110 determines whether or not the operation switch 104A is turned off. If it is determined that the operation switch 104A is not turned off, the process returns to step S30. If it is determined that the operation switch 104A is turned off, the process returns to step S10.
  • step S60 CPU 110 determines whether or not there has been a change in the acceleration detection state in the X-axis direction or the Z-axis direction (see FIG. 10) in acceleration sensor 106 since operation switch 104A was turned on in step S20. If it is determined that there is a change, the process proceeds to step S70. If no change is detected, the process returns to step S20.
  • step S70 the CPU 110 updates the display on the display 101 to move the cross section in the manner described with reference to FIGS. 12, 13A, and 13B according to the inclination of the casing 150, and then proceeds to step S80. Proceed with the process.
  • step S80 the CPU 110 determines whether or not the operation switch 104A is turned on. If it is determined that the operation switch 104A is turned on, the process returns to step S10, and if it is determined that it is not turned on, the process returns to step S60.
  • in this modification, the pop-out amount of the content is controlled according to the inclination of the housing 150 when the operation switch 104A is turned on, and the position of the portion (cross section) of the content to be displayed is controlled according to the inclination of the housing 150 when the operation switch 104A is not turned on.
  • note that this relationship may be reversed: when the operation switch 104A is turned on, the position of the portion to be displayed in the content may be controlled according to the tilt of the housing 150, and when the operation switch 104A is not turned on, the pop-out amount of the content may be controlled according to the inclination.
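  • in other words, the state of the operation switch selects which of the two behaviors the tilt drives. A hedged Python sketch of that dispatch, following the flow of FIG. 17 with assumed object and method names, is shown below.

```python
def handle_tilt(operation_switch_on, tilt_x, tilt_z, controller):
    """Dispatch a detected tilt to one of the two modes of this modification (cf. FIG. 17).

    controller is an assumed object exposing the two update operations.
    """
    if operation_switch_on:
        # Switch ON (steps S30-S50): the tilt changes the pop-out amount of the content.
        controller.update_pop_out_amount(tilt_z)
    else:
        # Switch OFF (steps S60-S80): the tilt moves the displayed portion of the content.
        controller.move_display_range(tilt_x, tilt_z)
```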
  • the pop-out amount of the three-dimensional content is changed for the entire display content according to the inclination of the housing 150.
  • the pop-up amount in the display may be determined for each object included in the content.
  • FIG. 14 is a flowchart of a subroutine for determining the pop-out amount in step S40 (see FIG. 8).
  • CPU 110 obtains the type of the Nth object included in the stereoscopic content in step S41.
  • the object type is acquired based on, for example, the file name of the object.
  • in step S42, the CPU 110 acquires a pop-out amount change mode according to the type of the object, and advances the process to step S43.
  • the memory 111 stores, for each type of object, a change mode indicating, for example, whether the pop-out amount is to be increased or decreased when the casing 150 is tilted to one side (the + side in the Z-axis direction).
  • in step S42, the change mode corresponding to the object type acquired in step S41 is acquired.
  • step S43 the CPU 110 obtains the pop-out amount corresponding to the amount of change in the inclination of the casing 150, and advances the process to step S44.
  • the memory 111 stores a pop-out amount corresponding to the amount of change in inclination.
  • the CPU 110 acquires the amount of change in the tilt of the housing 150 based on the detection output of the acceleration sensor 106. Then, the pop-out amount corresponding to the acquired amount of change in inclination is acquired based on the above-described stored contents.
  • in step S44, the CPU 110 determines whether or not the pop-out amount has been determined in step S43 for all the objects included in the stereoscopic content that is the display target of the display 101. If so, each object is displayed on the display 101 with the determined pop-out amount, and the process returns to the process of FIG. 8. On the other hand, if it is determined that the pop-out amounts of all objects have not yet been determined, N is incremented by 1 in step S45, and the process returns to step S41.
  • N is initialized to 1 at the start of the process of FIG. 14.
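  • the per-object loop of FIG. 14 amounts to: for each object, look up a change mode by object type, scale it by the tilt change, and only then redraw. The following sketch assumes a simple dictionary keyed by object type; the type names and values are illustrative only, not taken from the patent.

```python
# Assumed change-mode table (memory 111): +1 means the pop-out amount increases when the
# housing is tilted to the + side of the Z axis, -1 means it decreases.
CHANGE_MODE_BY_TYPE = {"character": +1, "text": -1}

def per_object_pop_out(objects, tilt_change, base_step=1.0):
    """Determine a pop-out amount change per object (cf. steps S41-S45 of FIG. 14)."""
    amounts = {}
    for obj in objects:                                   # N = 1, 2, ... (S45 increments N)
        obj_type = obj["type"]                            # S41: e.g. derived from the file name
        mode = CHANGE_MODE_BY_TYPE.get(obj_type, +1)      # S42: change mode for this type
        amounts[obj["name"]] = mode * base_step * tilt_change   # S43: amount for this tilt change
    return amounts                                        # S44: all objects done -> display them

# Example: the character pops out more while the text recedes for the same tilt change.
print(per_object_pop_out([{"name": "hero", "type": "character"},
                          {"name": "caption", "type": "text"}], tilt_change=2.0))
```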
  • for example, when the content includes a plurality of characters, the display on the display 101 is changed so that the pop-out amount is increased for some characters and decreased for the remaining characters.
  • alternatively, the pop-out amounts of the respective characters may be changed in the opposite directions.
  • as another example, when the content includes text and a character, the display on the display 101 is changed so that the pop-out amount is increased for the character and decreased for the text.
  • the pop-out amounts of the character and the text may also be changed in the opposite directions. Further, before the casing 150 is tilted, the text and the character may be displayed so as to be positioned on the same plane in the content, and when the casing 150 is tilted, the pop-out amount of each object may be controlled so that the pop-out amounts of the text and the character (or of a plurality of types of characters) change as described above.
  • for a single object, the change mode of the pop-out amount may be adjusted for each portion. That is, for example, when the housing 150 is tilted in the direction of arrow A11 in FIG. 5A, the display on the display 101 may be changed so that the upper body of the character protrudes more and the lower body of the character protrudes less.
  • the inclination of the housing 150 is detected by the acceleration sensor 106, but the detection of the inclination of the housing in the content display device of the present invention is not limited to this.
  • the mobile phone 100 includes a camera 106A instead of the acceleration sensor 106, and can detect the tilt of the housing 150 by processing an image captured by the camera 106A.
  • FIG. 16 is a functional block diagram of the mobile phone 100 of the present modification.
  • an image processing unit 196 is provided instead of the posture detection unit 194 (see FIG. 4).
  • the image processing unit 196 detects the tilt of the housing 150 by processing a plurality of images taken at different timings by the camera 106A, and is realized, for example, by the CPU 110 executing a predetermined program, or it may be realized by a dedicated circuit.
  • the image processing unit 196 detects the overlapping portion of two images taken at different timings by using, for example, a normalized cross-correlation method, and determines how the two images are arranged in space.
  • from this arrangement, the direction of movement of the camera 106A between the two timings is calculated.
  • the calculated movement direction is regarded as the movement direction of the housing 150.
  • any other calculation method can be employed as long as the direction of movement of the housing 150 is calculated based on the captured image.
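  • as a concrete but simplified illustration of such image-based detection, the sketch below estimates the shift between two grayscale frames by maximizing a normalized cross-correlation over a small set of candidate offsets. It uses only NumPy and is one assumed implementation, not the algorithm actually used in the patent.

```python
import numpy as np

def estimate_shift(prev_frame, next_frame, max_shift=8):
    """Estimate the candidate offset (dy, dx) that best aligns two grayscale frames.

    The offset with the highest normalized cross-correlation of the overlapping
    regions is returned; the direction of apparent image motion, and hence the
    movement direction of housing 150, can be derived from it.
    """
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Overlapping regions of the two frames for this candidate offset.
            a = prev_frame[max(0, dy):prev_frame.shape[0] + min(0, dy),
                           max(0, dx):prev_frame.shape[1] + min(0, dx)]
            b = next_frame[max(0, -dy):next_frame.shape[0] + min(0, -dy),
                           max(0, -dx):next_frame.shape[1] + min(0, -dx)]
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            if denom == 0:
                continue
            score = (a * b).sum() / denom          # normalized cross-correlation
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift
```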
  • the display 101 which is an example of the display device is provided integrally with the housing 150, but the present invention is not limited to this.
  • the pop-out amount of the three-dimensional content on the display 101 may be changed according to the inclination of the casing 150 that is configured separately from the display 101.
  • 100 mobile phone, 101 display, 102 touch sensor, 103 touch panel, 104 operation unit, 104A operation switch, 106 acceleration sensor, 106A camera, 107 microphone, 108 speaker, 109 memory interface, 150 housing, 191 central processing unit, 192 pop-out amount acquisition unit, 193 display information generation unit, 194 attitude detection unit, 195 communication unit, 196 image processing unit, 200 recording medium, 900 object, 1200 content.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

Provided is a content display device (100), wherein, when it is detected that a casing (150) has inclined in one orientation of a first direction, the nature of the display is changed such that the degree to which stereoscopic content exits the plane increases. Conversely, when it is detected that the casing (150) has inclined in another orientation of the first direction, the nature of the display is changed such that the degree to which stereoscopic content exits the plane decreases.

Description

Content display device, content display method, and recording medium
The present invention relates to a content display device, a content display method, and a recording medium, and more particularly to a content display device, a content display method, and a recording medium for displaying stereoscopic content.
Conventionally, various techniques have been disclosed for displaying stereoscopic content.
For example, Patent Document 1 (Japanese Patent Application Laid-Open No. 2007-019666) discloses a technique that detects the vertical gaze direction of an observer of stereoscopic content and changes the stereoscopic image (multi-viewpoint image) to be displayed based on that gaze direction, in order to cope with the observer's horizontal and vertical parallax when the observer moves his or her body. Thereby, even if the observer of the stereoscopic content changes the gaze direction, the observer can naturally observe the stereoscopic content.
JP 2007-019666 A
In recent years, stereoscopic content has been adopted in many kinds of content such as games, and various attempts have been made to reflect the movement of the observer's body when displaying stereoscopic content. Among these, techniques that do not involve a substantial change in display content, such as allowing the observer to observe the content naturally as disclosed in Patent Document 1, are effective as displays that reflect the movement of the observer. However, as content display technology advances rapidly, it is expected that the display content will be substantially changed by the observer's movement so that the display of stereoscopic content becomes richer in variety.
The present invention has been conceived in view of such circumstances, and its object is to change the display content of stereoscopic content in accordance with the movement of the observer's body.
A content display device according to the present invention includes a housing, a display device, a memory for storing stereoscopic content, a control unit for displaying the stereoscopic content stored in the memory on the display device, and a detection unit for detecting the inclination of the housing in a first direction. When the detection unit detects a tilt in one orientation of the first direction, the control unit changes the display so as to increase the pop-out amount of the stereoscopic content; when it detects a tilt in the other orientation, different from the one orientation of the first direction, the control unit changes the display so as to decrease the pop-out amount of the stereoscopic content.
Preferably, the detection unit detects the amount of inclination of the housing in the first direction, and the control unit changes the pop-out amount of the stereoscopic content by an amount corresponding to the amount of inclination.
Preferably, the control unit changes the pop-out amount of the stereoscopic content on condition that the amount of inclination exceeds a specific amount.
Preferably, the detection unit detects the inclination of the housing in a second direction intersecting the first direction, and when the detection unit detects an inclination in the second direction, the control unit changes the portion of the stereoscopic content to be displayed on the display device along the second direction.
A stereoscopic content display method according to the present invention is executed in a content display device including a housing, a display device, and a memory for storing stereoscopic content. The display method includes a step of detecting the inclination of the housing in a first direction, and a step of changing the display so as to increase the pop-out amount of the stereoscopic content when a tilt in one orientation of the first direction is detected, and so as to decrease the pop-out amount when a tilt in the other orientation, different from the one orientation, is detected.
A recording medium according to the present invention stores a computer-readable content display program for causing a computer having a housing, a display device, and a memory for storing stereoscopic content to function as the content display device. The program stored on the recording medium causes the computer to execute a step of detecting the inclination of the housing in a first direction, and a step of changing the display so as to increase the pop-out amount of the stereoscopic content when a tilt in one orientation of the first direction is detected and to decrease it when a tilt in the other orientation, different from the one orientation, is detected.
According to the present invention, the content display device changes the pop-out amount of the stereoscopic content displayed on the display device according to the inclination of the housing in the first direction.
Thereby, the display content of the stereoscopic content displayed on the display device can be substantially changed according to the movement of the body of the observer holding the housing.
Brief descriptions of the drawings are as follows:
- External appearance of a mobile phone that is one embodiment of the content display device.
- State in which the housing of the content display device is held in the user's left hand and the operation switch is pressed by the user and accommodated in the housing.
- Hardware configuration of the mobile phone of FIG. 1.
- Functional block diagram of the mobile phone of FIG. 1.
- Diagrams for explaining the relationship between the direction of movement of the housing and the pop-out amount of an object in the stereoscopic content in the mobile phone of FIG. 1.
- Flowchart of the image pop-out amount adjustment processing executed by the CPU (Central Processing Unit) of FIG. 3.
- Diagram schematically showing an example of the relationship between the tilt of the housing and the pop-out amount of an object of the stereoscopic content in the mobile phone of FIG. 1.
- Diagram for explaining the acceleration detection directions of the acceleration sensor in the mobile phone of FIG. 1.
- Diagram for explaining the directions of movement of the housing in the mobile phone of FIG. 1.
- Diagrams for explaining modification (1) of the mobile phone of FIG. 1.
- Diagram for explaining modification (5) of the mobile phone of FIG. 1.
- Diagrams for explaining modification (6) of the mobile phone of FIG. 1.
- Diagram for explaining modification (4) of the mobile phone of FIG. 1.
- Diagrams for explaining modification (2) of the mobile phone of FIG. 1.
- Diagrams for explaining modification (3) of the mobile phone of FIG. 1.
- Diagram for explaining modification (2) of the mobile phone of FIG. 1.
 Embodiments of the present invention are described below with reference to the drawings. In the following description, identical parts are given identical reference numerals; their names and functions are also the same. Detailed description of them is therefore not repeated.
 <External configuration of the content display device>
 FIG. 1 shows the external appearance of a mobile phone 100, which is an example of the content display device. The mobile phone 100 has a display 101.
 In the present embodiment, the mobile phone 100 is described as a representative example of the "content display device." However, the display device may be any other information device that has a display, such as a PND (Personal Navigation Device), a PDA (Personal Data Assistance), a game machine, an electronic dictionary, an electronic book reader device, or a personal computer.
 In addition to conventional planar content, the mobile phone 100 displays stereoscopic content (including 3D (three-dimensional) still images and 3D moving images; hereinafter also referred to as "3D content" where appropriate). That is, the mobile phone 100 can display an object so that the object appears to pop out of its display 101.
 Many schemes for displaying still images and moving images in 3D have been proposed. Schemes that use dedicated glasses include, for example, the liquid crystal active shutter glasses scheme and the polarizing plate scheme; schemes that do not require dedicated glasses include the parallax barrier scheme and the lenticular scheme. The 3D content display scheme according to the present embodiment may be any scheme and is not limited to those listed above.
 The exterior of the mobile phone 100 is covered by a housing 150. The housing 150 has the display 101 on its main surface. The main surface of the housing 150 also carries a plurality of buttons constituting an operation unit 104, a speaker 108 that outputs sound, and a microphone (hereinafter, "mic") 107. An operation switch 104A, which constitutes part of the operation unit 104, is provided on a side surface of the housing 150.
 The operation switch 104A includes a member that is retracted into the housing 150 when pressed. FIG. 2 shows a state in which the housing 150 is held in the left hand of a user (an observer of the stereoscopic content displayed on the display 101) and the operation switch 104A has been pressed by the user and is retracted into the housing 150.
 <Hardware configuration of the content display device>
 FIG. 3 shows the hardware configuration of the mobile phone 100 according to the present embodiment.
 Referring to FIG. 3, the mobile phone 100 includes the display 101, a touch sensor 102, the operation unit 104, a communication interface 105, an acceleration sensor 106, the mic 107, the speaker 108, a memory interface 109, a CPU 110, a memory 111, and a RAM (Random Access Memory) 112.
 In the mobile phone 100, the touch sensor 102 is provided on the display 101, and the display 101 and the touch sensor 102 together constitute a touch panel 103. The touch sensor 102 accepts input of information through touch operations performed from outside.
 The display 101 displays various kinds of information under the control of the CPU 110. The touch sensor 102 detects a touch operation made with a user's finger or a stylus pen and inputs the coordinates at which the touch operation was performed to the CPU 110. The touch sensor 102 also detects the contact area of the touch operation and inputs it to the CPU 110, and it detects the pressure with which it is pressed during a touch operation and inputs that pressure to the CPU 110.
 The operation unit 104 includes a plurality of buttons, such as the operation switch 104A, and the touch sensor 102. However, the operation unit 104 may consist of the operation switch 104A alone, and the operation switch 104A may be provided in the mobile phone 100 as a software button displayed on the touch panel 103.
 The communication interface 105, under the control of the CPU 110, performs data communication with external terminals and servers via a network. The CPU 110 receives, for example, content data and programs from an external server via the communication interface 105.
 The mic 107 receives sound, generates an audio signal, and inputs the signal to the CPU 110. The speaker 108, under the control of the CPU 110, outputs various information (for example, voice messages and beeps).
 The memory interface 109 reads data from a recording medium 200 that can be attached to and detached from the housing 150 and inputs the data to the CPU 110, and it stores data from the CPU 110 on the recording medium 200.
 Examples of the recording medium 200 include media that store programs in a nonvolatile manner, such as a CD-ROM (Compact Disc - Read Only Memory), a DVD-ROM (Digital Versatile Disk - Read Only Memory), a USB (Universal Serial Bus) memory, a memory card, an FD (Flexible Disk), a hard disk, magnetic tape, a cassette tape, an MO (Magnetic Optical Disc), an MD (Mini Disc), an IC (Integrated Circuit) card (excluding memory cards), an optical card, a mask ROM, an EPROM, and an EEPROM (Electronically Erasable Programmable Read-Only Memory).
 The memory 111 is realized by various kinds of RAM, a ROM (Read-Only Memory), a hard disk, or the like. The memory 111 may also be realized by a medium that stores data in a nonvolatile manner and is used via a reading interface, such as a USB memory, CD-ROM, DVD-ROM, memory card, FD, hard disk, magnetic tape, cassette tape, MO, MD, IC card (excluding memory cards), optical card, mask ROM, EPROM, or EEPROM.
 The memory 111 stores the control program executed by the CPU 110, 3D content data, and the like. The 3D content data includes data for displaying 3D still images and data for displaying 3D moving images.
 The CPU 110 executes the various programs stored in the memory 111. The processing in the mobile phone 100 (for example, the processing shown in the flowchart of FIG. 8) is realized by the hardware and by software executed by the CPU 110. Such software may be stored in the memory 111 in advance, may be stored on a recording medium and distributed as a program product, or may be provided as a program product that can be downloaded from an information provider connected to a network.
 Such software is read from the recording medium by a reading device (not shown) or downloaded via the communication interface 105 and is temporarily stored in the memory 111. The CPU 110 stores the software in the memory 111 in the form of an executable program and then executes the program.
 The term "program" here includes not only programs that can be executed directly by a processor such as the CPU 110 but also programs in source form, compressed programs, and encrypted programs.
 The RAM 112 functions as a work area when the CPU 110 executes a program.
 The acceleration sensor 106 detects acceleration applied to the housing 150. It may be housed inside the housing 150 or attached to the outside of the housing 150. The acceleration sensor 106 detects acceleration in, for example, three axial directions. FIG. 10 is a diagram for explaining the acceleration detection directions of the acceleration sensor 106.
 In the mobile phone 100, as shown in FIG. 10, when the longitudinal direction of the housing 150 is taken as the Y-axis direction and the short-side direction of the housing 150 as the X-axis direction, the acceleration sensor 106 detects acceleration in three axial directions: the X-axis direction, the Y-axis direction, and the depth direction (Z-axis direction).
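 As a purely illustrative aid (not part of the embodiment), the sketch below shows one way such a three-axis sample could be represented and its Z-axis (depth-direction) component classified; the sample format, sign convention, and dead-band value are assumptions made for the sketch.

    from dataclasses import dataclass

    @dataclass
    class AccelSample:
        # Acceleration along the housing axes of FIG. 10 (arbitrary units).
        x: float  # short-side direction
        y: float  # longitudinal direction
        z: float  # depth direction; + is taken here as the backward direction

    def z_tilt_direction(sample: AccelSample, dead_band: float = 0.05) -> str:
        """Classify the depth-direction (Z-axis) component of one sample."""
        if sample.z > dead_band:
            return "backward"  # tilt in the direction of arrow A12
        if sample.z < -dead_band:
            return "forward"   # tilt in the direction of arrow A11
        return "none"

    # Example: a sample dominated by a positive Z component reads as "backward".
    print(z_tilt_direction(AccelSample(x=0.0, y=0.0, z=0.3)))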
 <Functions of the content display device>
 FIG. 4 is a functional block diagram of the mobile phone 100.
 The mobile phone 100 includes, as its functions, a central processing unit 191, a pop-out amount acquisition unit 192, a display information generation unit 193, an attitude detection unit 194, and a communication unit 195.
 The central processing unit 191 controls the overall operation of the mobile phone 100 and is realized, for example, by the CPU 110 executing a program.
 The pop-out amount acquisition unit 192 acquires, based on the state of the mobile phone 100, the pop-out amount with which (each object of) the 3D content shown on the display 101 should be displayed. It is realized, for example, by the CPU 110 executing a program.
 The display information generation unit 193 generates the image data to be displayed on the display 101 and is realized, for example, by the CPU 110 executing a program. The display information generation unit 193 may instead be realized by a dedicated circuit such as a graphics board.
 The attitude detection unit 194 detects the inclination of the housing 150 and is realized, for example, by the acceleration sensor 106.
 The communication unit 195 executes processing for communicating with other devices and is realized, for example, by the communication interface 105.
 <Display of stereoscopic content on the content display device>
 In the mobile phone 100, the pop-out amount of (an object of) the stereoscopic content shown on the display 101 can be controlled by changing the inclination of the housing 150.
 This is described in more detail with reference to FIGS. 5 to 7.
 FIG. 5(A) shows a state in which the user holds the mobile phone 100 by hand and is viewing the stereoscopic content displayed on the display 101. The pop-out amount of the stereoscopic content at this time is shown schematically in FIG. 5(B), which illustrates how far the object 900 of the stereoscopic content appears to protrude from the display surface 901 of the display 101 when it is displayed. In FIG. 5(B), the eye of the user looking at the display 101 is shown schematically as E.
 FIG. 6(A) shows a state in which the mobile phone 100 has been tilted in the direction of arrow A12 from the state shown in FIG. 5(A).
 The state shown in FIG. 6(A) corresponds to the housing 150 being tilted, from the state of FIG. 5(A), in the direction of arrow A12, that is, with the upper end of the housing 150 tilted backward when the surface on which the display 101 is provided is regarded as the front of the housing 150.
 When the housing 150 is tilted in this way, the display on the display 101 is controlled so that the pop-out amount of the displayed stereoscopic content decreases. This is based on a backward acceleration (acceleration toward the + side of the Z-axis direction in FIG. 10) being detected by the acceleration sensor 106 in the Z-axis direction.
 Thus, when the mobile phone 100 is tilted in the direction of arrow A12 from the state of FIG. 5(A), the pop-out amount of the stereoscopic content shown in FIG. 5(B) is changed as shown in FIG. 6(B). In FIG. 6(B), compared with the state shown in FIG. 5(B), the pop-out amount of the object 900 has been changed so that the object moves away from the user's eye E relative to the display surface 901.
 FIG. 7(A) shows a state in which the mobile phone 100 has been tilted in the direction of arrow A11 from the state shown in FIG. 5(A).
 The state shown in FIG. 7(A) corresponds to the housing 150 being tilted, from the state of FIG. 5(A), in the direction of arrow A11, that is, with the upper end of the housing 150 tilted forward when the surface on which the display 101 is provided is regarded as the front of the housing 150.
 When the housing 150 is tilted in this way, the display on the display 101 is controlled so that the pop-out amount of the displayed stereoscopic content increases. This is based on a forward acceleration (acceleration toward the - side of the Z-axis direction in FIG. 10) being detected by the acceleration sensor 106 in the Z-axis direction.
 Thus, when the mobile phone 100 is tilted in the direction of arrow A11 from the state of FIG. 5(A), the pop-out amount of the stereoscopic content shown in FIG. 5(B) is changed as shown in FIG. 7(B). In FIG. 7(B), compared with the state shown in FIG. 5(B), the pop-out amount of the object 900 has been changed so that the object approaches the user's eye E relative to the display surface 901.
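 A minimal sketch of the relationship just described, assuming an arbitrary pop-out scale, step size, and clamping range (none of which come from the embodiment): a positive Z-axis acceleration (backward tilt, arrow A12) decreases the pop-out amount, and a negative one (forward tilt, arrow A11) increases it.

    def adjust_pop_out(current: float, z_acceleration: float, step: float = 1.0,
                       lower: float = 0.0, upper: float = 10.0) -> float:
        """Return a new pop-out amount from the sign of the Z-axis acceleration.

        A positive value (backward tilt, arrow A12) reduces the pop-out amount;
        a negative value (forward tilt, arrow A11) increases it. Step size and
        clamping limits are illustrative assumptions.
        """
        if z_acceleration > 0:
            current -= step
        elif z_acceleration < 0:
            current += step
        return max(lower, min(upper, current))

    # Tilting forward from a pop-out amount of 3.0 yields 4.0 in this sketch.
    print(adjust_pop_out(3.0, z_acceleration=-0.2))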
 <Display control of stereoscopic content on the content display device>
 FIG. 8 is a flowchart of the processing executed by the CPU 110 to adjust the pop-out amount of the stereoscopic content displayed on the display 101 (image pop-out amount adjustment processing).
 While an application that displays stereoscopic content on the display 101 is running, the CPU 110 executes the image pop-out amount adjustment processing in the background and thereby adjusts the pop-out amount of the stereoscopic content.
 Referring to FIG. 8, in the image pop-out amount adjustment processing the CPU 110 first initializes the acceleration sensor 106 in step S10 and then advances to step S20.
 In step S20, the CPU 110 determines whether the operation switch 104A has been turned ON. If it determines that the switch has been turned ON, the processing proceeds to step S30; otherwise, the processing returns to step S10.
 Here, the operation switch 104A being turned ON refers to the state in which the operation switch 104A has been pressed and retracted into the housing 150, as described with reference to FIG. 2. The CPU 110 determines whether the operation switch 104A has been turned ON by, for example, detecting the energization state of, or a change in potential at, a predetermined portion of the electric circuit constituting the mobile phone 100.
 In step S30, the CPU 110 determines whether the acceleration detected by the acceleration sensor 106 in the Z-axis direction (see FIG. 10) has changed since the operation switch 104A was turned ON in step S20. If it determines that there has been a change, the processing proceeds to step S40; if no change is detected, the processing returns to step S20.
 In step S40, the CPU 110 acquires the pop-out amount of the stereoscopic content corresponding to the inclination of the housing 150, generates image data so that the stereoscopic content currently displayed on the display 101 is shown with that pop-out amount, and displays the image data on the display 101. Here, "the inclination of the housing 150" is determined from the Z-axis acceleration value detected by the acceleration sensor 106 in step S30. The memory 111 stores pop-out amounts of stereoscopic content in association with inclinations of the housing 150.
 For example, the memory 111 stores, in association with a tilt in the direction of arrow A12 in FIG. 5(A) (that is, a positive acceleration value detected in the Z-axis direction), a pop-out amount that is smaller than in the initial state (FIG. 5(B)), as shown in FIG. 6(B), and, in association with a tilt in the direction of arrow A11 in FIG. 5(A) (that is, a negative acceleration value detected in the Z-axis direction), a pop-out amount that is larger than in the initial state (FIG. 5(B)), as shown in FIG. 7(B).
 After updating the display on the display 101 in step S40 so that the pop-out amount of the stereoscopic content changes, the CPU 110 advances the processing to step S50.
 In step S50, the CPU 110 determines whether the operation switch 104A has been turned OFF. If it determines that the switch has not been turned OFF, the processing returns to step S30; if it determines that the switch has been turned OFF, the processing returns to step S10.
 Here, the operation switch 104A being turned OFF refers to, for example, the state in which the operation switch 104A is no longer pressed, as shown in FIG. 1. The CPU 110 determines whether the operation switch 104A has been turned OFF by, for example, detecting the energization state of, or a change in potential at, a predetermined portion of the electric circuit in the mobile phone 100.
 In the image pop-out amount adjustment processing described above, while the operation switch 104A is ON (pressed), a change in the inclination of the housing 150 in the Z-axis direction (see FIG. 10) causes the pop-out amount of the stereoscopic content displayed on the display 101 to be changed according to that inclination.
 Such a change of the pop-out amount of the stereoscopic content is not limited to periods during which the operation switch 104A is ON (pressed). That is, while stereoscopic content is displayed on the display 101, the pop-out amount may be changed according to the inclination of the housing 150 in the Z-axis direction without requiring the special operation of pressing the operation switch 104A. In that case, the processing of steps S20 and S50 in FIG. 8 is omitted: after the acceleration sensor 106 is initialized in step S10, its state is monitored in step S30, and when the state is determined to have changed, the display of the content on the display 101 is updated in step S40 so that the content is shown with the pop-out amount corresponding to the change.
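 The control flow of FIG. 8, including the variant just described in which steps S20 and S50 are omitted, could be simulated roughly as follows. The device interface (reset_accelerometer, switch_is_on, read_z_acceleration, update_pop_out) and the stub used to run the sketch are hypothetical names introduced here and are not taken from the embodiment.

    class StubDevice:
        """Trivial stand-in so the sketch can run without hardware (hypothetical)."""
        def __init__(self, z_readings):
            self._readings = iter(z_readings)
        def reset_accelerometer(self):           # corresponds to step S10
            pass
        def switch_is_on(self) -> bool:          # corresponds to steps S20 / S50
            return True
        def read_z_acceleration(self) -> float:
            return next(self._readings, 0.0)
        def update_pop_out(self, delta: float):  # corresponds to step S40
            print(f"pop-out change: {delta:+.2f}")

    def pop_out_adjustment_loop(device, require_switch: bool = True,
                                threshold: float = 0.05, iterations: int = 5) -> None:
        """Simulated rendering of the FIG. 8 flow (steps S10 to S50).

        With require_switch=False the switch checks (S20, S50) are skipped,
        matching the variant in which the pop-out amount follows the tilt
        whenever stereoscopic content is being displayed.
        """
        device.reset_accelerometer()                          # S10
        baseline = device.read_z_acceleration()
        for _ in range(iterations):                           # bounded loop for the sketch
            if require_switch and not device.switch_is_on():  # S20: not ON, back to S10
                device.reset_accelerometer()
                baseline = device.read_z_acceleration()
                continue
            z = device.read_z_acceleration()
            if abs(z - baseline) > threshold:                 # S30: detection state changed?
                device.update_pop_out(z - baseline)           # S40: redraw the content
            if require_switch and not device.switch_is_on():  # S50: switch released
                device.reset_accelerometer()                  # back to S10
                baseline = device.read_z_acceleration()

    pop_out_adjustment_loop(StubDevice([0.0, 0.0, 0.3, 0.3, -0.2]))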
 The part operated as a condition for changing the pop-out amount of the stereoscopic content is not limited to a button-shaped part such as the operation switch 104A. For example, a small hole that can be covered with a finger may be provided in the side surface of the housing 150, at a location such as where the operation switch 104A is provided, and the pop-out amount of the stereoscopic content may be adjusted according to the inclination of the housing 150 only while the hole is covered with a finger or the like. That is, the processing of step S20 may be changed so that it detects whether the hole is covered with the user's finger or the like and proceeds to step S30 if it is covered, and step S50 may be changed so that it detects whether the hole is uncovered and returns the processing to step S10 if it is. The CPU 110 determines whether the hole is covered based on, for example, whether light entering from outside the housing 150 through the hole can be detected. Specifically, a light-receiving element that receives light incident through the hole is provided inside the housing 150; when the amount of light received by the element falls below a predetermined amount, the CPU 110 determines that the hole is covered with a finger or the like, and when it is equal to or greater than the predetermined amount, the CPU 110 determines that the hole is not covered.
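 The covered/uncovered decision for the small hole reduces to a threshold comparison on the light-receiving element's output; a trivial sketch, with the unit and threshold value assumed:

    def hole_is_covered(light_level: float, threshold: float = 10.0) -> bool:
        """True when the light-receiving element behind the hole reads less than
        the predetermined amount, i.e. the hole is treated as covered.
        The unit and the threshold value are illustrative assumptions."""
        return light_level < threshold

    # A dark reading (2.0) counts as covered; a bright reading (50.0) does not.
    print(hole_is_covered(2.0), hole_is_covered(50.0))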
 In the present embodiment, as described above, the memory 111 stores pop-out amounts of the stereoscopic content in association with the positive and negative acceleration values detected in the Z-axis direction.
 The pop-out amounts stored in the memory 111 may be stored not only according to the sign of the acceleration but also in multiple steps according to the magnitude of the acceleration value. This allows the mobile phone 100 to control the pop-out amount of the stereoscopic content displayed on the display 101 according not only to the direction in which the housing 150 is tilted but also to how strongly it is tilted (the size of the tilt angle and/or the magnitude of the force applied to the housing 150 when it is tilted).
 Furthermore, the memory 111 may associate acceleration values with pop-out amounts by means of a function or the like. In this case, as shown for example in FIG. 9, it is preferable that the pop-out amount of the stereoscopic content not be changed when the inclination (the absolute value of the detected acceleration) is small.
 In FIG. 9, the change in the pop-out amount of the stereoscopic content with respect to the inclination of the housing 150 is indicated by a line L1. The amount of change means the change in pop-out amount relative to the pop-out amount displayed at that point in time (for example, at the time the acceleration is detected in step S30 of FIG. 8). The inclination of the housing means the inclination in the Z-axis direction, and an inclination of zero is referenced to the detection output of the acceleration sensor 106 at the time the operation switch 104A is turned ON in step S20.
 By not changing the pop-out amount of the stereoscopic content when the inclination of the housing 150 is small in this way (when the acceleration value detected by the acceleration sensor 106 is smaller than a specific value), it is possible to prevent the pop-out amount from being changed by slight shaking of the housing 150 when the user does not intend to change it.
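 A relationship of the kind drawn as line L1 in FIG. 9 could be expressed, for example, as a piecewise function with a dead zone. The zone width and gain below are assumptions, and the sign convention follows the behavior described above (a positive, backward Z-axis tilt reduces the pop-out amount).

    def pop_out_change(tilt: float, dead_zone: float = 0.1, gain: float = 5.0) -> float:
        """Map a Z-axis tilt value to a change in the pop-out amount.

        Tilts whose absolute value is at or below dead_zone produce no change,
        so slight hand shake leaves the display alone; larger tilts change the
        pop-out amount in proportion to how far they exceed the dead zone, with
        a positive (backward) tilt reducing it.
        """
        if abs(tilt) <= dead_zone:
            return 0.0
        sign = 1.0 if tilt > 0 else -1.0
        return -sign * gain * (abs(tilt) - dead_zone)

    # A small wobble (0.05) is ignored; a clear backward tilt (0.3) gives -1.0.
    print(pop_out_change(0.05), pop_out_change(0.3))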
 <Modification (1)>
 In the embodiment described above, the pop-out amount of the stereoscopic content displayed on the display 101 is changed in response to a change in the inclination of the housing 150 in the depth direction, as described with reference to FIGS. 5(A) to 7(A).
 In the mobile phone 100, when the housing 150 is tilted in the left-right direction, the display position of the stereoscopic content may additionally be changed according to that left-right inclination.
 That is, in the mobile phone 100, in addition to the pop-out amount of the stereoscopic content being changed when the housing 150 is tilted in the depth direction as shown in FIGS. 11(A) to 11(C), the display mode of the stereoscopic content is also changed when the housing is tilted in the left-right direction as shown in FIGS. 11(D) to 11(F).
 The change in the pop-out amount of the stereoscopic content according to the change in inclination in the depth direction shown in FIGS. 11(A) to 11(C) is as described with reference to FIGS. 5(B) to 7(B).
 On the other hand, when the housing 150 is tilted in the left-right direction, that is, tilted in the direction of double-headed arrow A20 from the state shown in FIG. 11(E) into the state shown in FIG. 11(D) or FIG. 11(F), the display position of the displayed stereoscopic content moves to the left or right.
 The change, in the left-right direction, of the position at which the stereoscopic content is displayed on the display 101 is further described with reference to FIGS. 12, 13A, and 13B.
 FIG. 12 schematically shows an example of how the stereoscopic content is displayed on the display 101 when the housing 150 is in the state shown in FIG. 11(E). In FIG. 12, the stereoscopic content 181 is displayed at approximately the center of the display 101 in the left-right direction.
 FIG. 13A schematically shows an example of how the stereoscopic content is displayed on the display 101 when the housing 150 has been tilted to the left, from the state shown in FIG. 11(E), into the state shown in FIG. 11(F). In FIG. 13A, the display position of the stereoscopic content in FIG. 12 is indicated by a broken line 182 for reference. In FIG. 13A, the stereoscopic content 181 is displayed to the left of the position shown in FIG. 12.
 FIG. 13B schematically shows an example of how the stereoscopic content is displayed on the display 101 when the housing 150 has been tilted to the right, from the state shown in FIG. 11(E), into the state shown in FIG. 11(D). In FIG. 13B, the display position of the stereoscopic content in FIG. 12 is indicated by a broken line 182 for reference. In FIG. 13B, the stereoscopic content 181 is displayed to the right of the position shown in FIG. 12.
 In this modification, the CPU 110 may change the display position of the stereoscopic content by changing the position (location) of the area used for displaying the stereoscopic content within the display area of the display 101. That is, while an application for displaying stereoscopic content on the display 101 is running, the CPU 110 executes, in the background, processing for adjusting the amount by which the stereoscopic content is moved according to the amount of left-right inclination of the housing 150, and thereby changes the position (location) of the area used for displaying the stereoscopic content within the display area of the display 101.
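 As one possible sketch of this modification (the pixel dimensions, gain, and sign convention are assumptions, not values from the embodiment), the left edge of the drawing region could be computed from the left-right tilt as follows:

    def content_left_edge(x_tilt: float, display_width: int = 480,
                          content_width: int = 240, gain_px: float = 200.0) -> int:
        """Return the left edge (in pixels) of the region used to draw the content.

        With no tilt the content sits at the horizontal centre of the display;
        a tilt shifts the drawing region in that direction (here + is taken as a
        tilt to the right). The result is clamped to keep the content on screen.
        """
        centre = (display_width - content_width) // 2
        left = centre + int(gain_px * x_tilt)
        return max(0, min(display_width - content_width, left))

    # No tilt -> centred at 120; a right tilt of 0.4 -> shifted to 200.
    print(content_left_edge(0.0), content_left_edge(0.4))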
 <Modification (2)>
 In this modification, as in modification (1), when the housing 150 is tilted, the cross section of the stereoscopic content that is to be displayed on the display 101 is changed to a position corresponding to the direction of the tilt.
 The change of the range of the stereoscopic content displayed on the display 101 according to the inclination of the housing 150 in the front-rear direction is described with reference to FIGS. 18 to 23.
 FIG. 18 shows content 1201, which is an example of stereoscopic content. The content 1201 is part of a larger piece of content 1200 (FIG. 19).
 In this modification, when the acceleration sensor 106 detects acceleration in the Z-axis direction while the content 1201, which is part of the content 1200, is displayed on the display 101, the range of the stereoscopic content displayed on the display 101 is changed. More specifically, when the housing 150 is tilted backward from the state shown in FIG. 5(A) to the state shown in FIG. 6(A), the content displayed on the display 101 changes to content 1211, as shown in FIG. 20. FIG. 21 shows the position of the content 1211 within the content 1200.
 On the other hand, when the housing 150 is tilted forward from the state shown in FIG. 5(A) to the state shown in FIG. 7(A), the content displayed on the display 101 changes to content 1212, as shown in FIG. 22. FIG. 23 shows the position of the content 1212 within the content 1200.
 As shown in FIG. 21, the content 1211 is a display range located on the near side of the content 1201 within the content 1200, and, as shown in FIG. 23, the content 1212 is a display range located on the far side of the content 1201 within the content 1200. In FIGS. 18 to 23, the broken line DL represents the line at which the plane represented by the content 1201, 1211, or 1212 intersects the display 101.
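 One way to picture this selection of a display range is as an index into an ordered list of cross sections, from the near side to the far side; the sketch below uses this assumed representation, with labels chosen merely to mirror FIGS. 18 to 23.

    def select_cross_section(slices, index: int, z_tilt: float, dead_zone: float = 0.1):
        """Pick which cross section of the larger content to display.

        slices is ordered from the near side to the far side. A backward tilt
        (positive z_tilt) steps toward the near side and a forward tilt
        (negative z_tilt) toward the far side; tilts inside the dead zone leave
        the selection unchanged. The list representation is an assumption.
        """
        if z_tilt > dead_zone:
            index = max(0, index - 1)                # toward the near side
        elif z_tilt < -dead_zone:
            index = min(len(slices) - 1, index + 1)  # toward the far side
        return index, slices[index]

    slices = ["content_1211 (near)", "content_1201", "content_1212 (far)"]
    # A backward tilt from the middle cross section selects the near one.
    print(select_cross_section(slices, index=1, z_tilt=0.3))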
 Also in this modification, the amount by which the display range within the content 1200 is changed may be determined according to the amount of inclination detected by the acceleration sensor 106. When such a change amount is determined, it is preferable, as described with reference to FIG. 9, not to change the display range while the inclination remains small.
 One example of the content 1200 of FIG. 19 is a map displayed in three dimensions. When tags are attached to the map for individual objects (buildings and the like), control can be considered in which the tags are used to change the displayed content when the housing 150 is tilted in the direction of arrow A12 in FIG. 5(A). Specifically, when the application that displays the content is started, an image of a range centered on the object corresponding to a tag designated by default is displayed on the display 101. The content includes a plurality of objects and further includes tags corresponding to one or more of those objects. The tags are stored in association with the positions, on the map, of the objects to which they correspond. When the housing is tilted in the direction of arrow A12, the designated tag is changed to the tag placed at the next position in the direction that, within the content, is associated with the direction of arrow A12, and the display on the display 101 is updated so that it is centered on the object corresponding to the newly designated tag.
 Specifically, as shown in FIG. 28 for example, assume that the content 1200 includes tags for an object 1281 and an object 1282, that the direction within the content 1200 corresponding to the direction of arrow A12 is the direction indicated by arrow A120, and that a range 1291 centered on the object 1281 is displayed on the display 101. If the housing 150 is tilted in the direction of arrow A12 (FIG. 5(A)) at this point, the range displayed on the display 101 changes from the range 1291 to a range 1292. The range 1292 is centered on the object 1282, and the position associated with the tag of the object 1282 is the next one, in the direction of arrow A120, after the position associated with the tag of the object 1281.
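 The tag-based re-centering could be sketched as choosing, among the tagged objects, the nearest one lying further along the direction of arrow A120; the data structure and the coordinate values below are assumptions for illustration.

    def next_tagged_object(tag_positions, current_name):
        """Re-centre the display on the next tagged object along arrow A120.

        tag_positions maps an object name to its coordinate along that direction;
        the layout and the example values are assumptions. If no tag lies further
        along, the current object stays at the centre.
        """
        current_pos = tag_positions[current_name]
        ahead = [(pos, name) for name, pos in tag_positions.items() if pos > current_pos]
        return min(ahead)[1] if ahead else current_name

    tags = {"object_1281": 0.0, "object_1282": 10.0}
    # A tilt in the direction of arrow A12 re-centres the display on object_1282.
    print(next_tagged_object(tags, "object_1281"))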
 When inclinations of the housing 150 in a plurality of directions are detected (for example, as described with reference to FIG. 11), the center of the display range of the map may be changed, among the tagged objects arranged in the direction of double-headed arrow A81, from the object currently at the center of the display to the tagged object arranged next on the D2 side. In this case, when it is detected in step S30 that the housing has been tilted in the direction of arrow A11 in FIG. 5(A), the plane containing the start point displayed on the display 101 in the map content 801 is changed toward the next object on the D1 side of double-headed arrow A81.
 <Modification (3)>
 In the embodiment described above, the pop-out amount of the stereoscopic content displayed on the display 101 is changed in response to a change in the inclination of the housing 150 in the depth direction (front-rear direction), as described with reference to FIGS. 5(A) to 7(A).
 In the mobile phone 100, when the housing 150 is tilted in the left-right direction, the display mode of the content may additionally be changed according to that left-right inclination.
 That is, in the mobile phone 100, in addition to the pop-out amount of the stereoscopic content being changed when the housing 150 is tilted in the depth direction as shown in FIGS. 11(A) to 11(C), the display mode of the stereoscopic content is also changed when the housing is tilted in the left-right direction as shown in FIGS. 11(D) to 11(F).
 The change in the pop-out amount of the stereoscopic content according to the change in inclination in the depth direction shown in FIGS. 11(A) to 11(C) is as described with reference to FIGS. 5(B) to 7(B).
 On the other hand, when the housing 150 is tilted in the left-right direction, that is, tilted in the direction of double-headed arrow A20 from the state shown in FIG. 11(E) into the state shown in FIG. 11(D) or FIG. 11(F), the range of the stereoscopic content displayed on the display 101 is changed. The change, in the left-right direction, of the range of the stereoscopic content displayed on the display 101 is further described with reference to FIGS. 18, 19, and 24 to 27.
 As described above, FIG. 18 shows the content 1201, an example of stereoscopic content, and the content 1201 is part of the larger content 1200 (FIG. 19).
 When the content 1201, which is part of the content 1200, is displayed on the display 101 in the state of FIG. 11(E) (FIG. 18) and the user tilts the housing 150 clockwise into the state shown in FIG. 11(D), the content displayed on the display 101 changes to content 1202, as shown in FIG. 24. As shown in FIG. 25, the content 1202 is a display range located to the left of the content 1201 within the content 1200.
 When the content 1201, which is part of the content 1200, is displayed on the display 101 in the state of FIG. 11(E) (FIG. 18) and the user tilts the housing 150 counterclockwise into the state shown in FIG. 11(F), the content displayed on the display 101 changes to content 1203, as shown in FIG. 26. As shown in FIG. 27, the content 1203 is a display range located to the right of the content 1201 within the content 1200.
 In FIGS. 18, 19, and 24 to 27, the broken line DL represents the line at which the plane represented by the content 1201 to 1203 intersects the display 101.
 As described above, in this modification, when the housing 150 is tilted in the left-right direction, the range of the stereoscopic content displayed on the display 101 is changed to a position corresponding to the direction of the tilt.
 <Modification (4)>
 In modification (2) described above, when the housing 150 is tilted in the front-rear direction (FIGS. 11(A) to 11(C)), the pop-out amount of the content displayed on the display 101 is changed (FIGS. 5 to 7), and when it is tilted in the left-right direction (FIGS. 11(D) to 11(F)), the cross section of the content to be displayed on the display 101 is moved in the left-right direction. In modification (3), when the housing 150 is tilted in the front-rear direction, the cross section of the content to be displayed on the display 101 is moved in the front-rear direction.
 In this modification, while displaying stereoscopic content, the mobile phone 100 operates in a mode that moves the display range within the content according to the inclination of the housing 150, a mode that changes the pop-out amount, and a mode that moves the content left and right.
 FIG. 17 is a flowchart of an example of the image pop-out amount adjustment processing executed in this modification. This processing corresponds to the image pop-out amount adjustment processing of FIG. 8 with steps S60 to S80 added. Its content is described below.
 Referring to FIG. 17, in the image pop-out amount adjustment processing of this modification, the CPU 110 first initializes the acceleration sensor 106 in step S10 and then advances to step S20.
 In step S20, the CPU 110 determines whether the operation switch 104A has been turned ON. If it determines that the switch has been turned ON, the processing proceeds to step S30; otherwise, the processing proceeds to step S60.
 In step S30, the CPU 110 determines whether the acceleration detected by the acceleration sensor 106 in the X-axis or Z-axis direction (see FIG. 10) has changed since the operation switch 104A was turned ON in step S20. If it determines that there has been a change, the processing proceeds to step S40; if no change is detected, the processing returns to step S20.
 In step S40, the CPU 110 acquires the pop-out amount of the stereoscopic content corresponding to the inclination of the housing 150, generates image data so that the stereoscopic content currently displayed on the display 101 is shown with that pop-out amount, updates the display 101 to show the image data, and advances the processing to step S50.
 In step S50, the CPU 110 determines whether the operation switch 104A has been turned OFF. If it determines that the switch has not been turned OFF, the processing returns to step S30; if it determines that the switch has been turned OFF, the processing returns to step S10.
 Meanwhile, in step S60, the CPU 110 determines whether the acceleration detected by the acceleration sensor 106 in the X-axis or Z-axis direction (see FIG. 10) has changed since the determination in step S20. If it determines that there has been a change, the processing proceeds to step S70; if no change is detected, the processing returns to step S20.
 In step S70, the CPU 110 updates the display on the display 101 so that the cross section is moved, according to the inclination of the housing 150, in the manner described with reference to FIGS. 12, 13A, and 13B, and then advances the processing to step S80.
 In step S80, the CPU 110 determines whether the operation switch 104A has been turned ON. If it determines that the switch has been turned ON, the processing returns to step S10; otherwise, the processing returns to step S60.
 As described above, in this modification, the pop-out amount of the content is controlled according to the inclination of the housing 150 while the operation switch 104A is ON, and the position of the cross section of the content to be displayed is controlled according to the inclination of the housing 150 while the operation switch 104A is not ON. Alternatively, the position of the cross section to be displayed may be controlled according to the inclination of the housing 150 while the operation switch 104A is ON, and the pop-out amount of the content may be controlled according to the inclination of the housing 150 while the operation switch 104A is not ON.
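 A condensed sketch of this two-mode behavior of FIG. 17 follows (with a flag for the alternative assignment mentioned above); the return values are just labels for the sketch and are not part of the embodiment.

    def handle_tilt(switch_on: bool, tilt_delta: float, swap_roles: bool = False) -> str:
        """Decide which display update a detected tilt drives, per the FIG. 17 flow.

        By default the pop-out amount is adjusted while the switch is ON
        (steps S30 to S50) and the displayed cross section is moved while it is
        OFF (steps S60 to S80); swap_roles models the alternative assignment.
        """
        adjust_pop_out = switch_on if not swap_roles else not switch_on
        if adjust_pop_out:
            return f"change pop-out amount by {tilt_delta:+.2f}"
        return f"move displayed cross section by {tilt_delta:+.2f}"

    print(handle_tilt(switch_on=True, tilt_delta=0.2))
    print(handle_tilt(switch_on=False, tilt_delta=0.2))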
 <Modification (5)>
 In the embodiment described above, the pop-out amount and similar display parameters of the stereoscopic content are changed for the displayed content as a whole according to the inclination of the housing 150.
 In this modification, the pop-out amount used for display may be determined for each object included in the content.
 FIG. 14 is a flowchart of a subroutine for determining the pop-out amount in step S40 (see FIG. 8).
 Referring to FIG. 14, in this modification the CPU 110 acquires, in step S41, the type of the N-th object included in the stereoscopic content. The type of an object is obtained based on, for example, the file name of the object.
 Next, in step S42, the CPU 110 acquires the manner in which the pop-out amount is to be changed according to the type of the object, and advances the processing to step S43. The memory 111 stores, for each object type, information that determines, for example, whether the pop-out amount is increased or decreased when the housing 150 is tilted to one side (the + side in the Z-axis direction). In step S42, the CPU 110 acquires which of these change manners applies to the object type acquired in step S41.
 In step S43, the CPU 110 acquires the pop-out amount corresponding to the amount of change in the inclination of the housing 150, and advances the processing to step S44.
 The memory 111 stores pop-out amounts corresponding to amounts of change in inclination. The CPU 110 obtains the amount of change in the inclination of the housing 150 from the detection output of the acceleration sensor 106 and then acquires, from the stored information described above, the pop-out amount corresponding to that amount of change.
 In step S44, the CPU 110 determines whether a pop-out amount has been determined in step S43 for all the objects included in the stereoscopic content being displayed on the display 101. If so, it displays each object on the display 101 with the determined pop-out amount and returns the processing to the flow of FIG. 8. If the pop-out amounts of all the objects have not yet been determined, it increments N by 1 in step S45 and returns the processing to step S41.
 In this modification, N is initialized to 1 at the start of the processing of FIG. 14.
 In the modification described above, when the housing 150 is tilted in a certain direction, the display is updated so that the pop-out amount of some of the objects included in the stereoscopic content increases while the pop-out amount of the other objects decreases.
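 The per-object loop of FIG. 14 (steps S41 to S45) could be sketched as follows; the object-type table, the data layout, and the example values are assumptions made for illustration.

    # How each object type responds to a tilt toward the + side of the Z axis;
    # the table contents are illustrative assumptions.
    CHANGE_DIRECTION = {"character": +1.0, "text": -1.0}

    def per_object_pop_out(objects, tilt_change: float, step: float = 1.0):
        """Determine a new pop-out amount for every object in the content
        (a simplified form of the loop of steps S41 to S45 in FIG. 14).

        objects is a list of (name, object_type, current_pop_out) tuples; the
        names and the data layout are assumptions for the sketch.
        """
        updated = []
        for name, object_type, pop_out in objects:               # N = 1, 2, ...
            direction = CHANGE_DIRECTION.get(object_type, 0.0)   # S41 / S42
            delta = direction * step * tilt_change                # S43
            updated.append((name, pop_out + delta))
        return updated                                             # displayed in S44

    scene = [("hero", "character", 2.0), ("caption", "text", 2.0)]
    # A tilt of +1 pushes the character forward and the text back in this sketch.
    print(per_object_pop_out(scene, tilt_change=1.0))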
 これにより、たとえば、立体コンテンツに複数のキャラクタが含まれる場合、筐体150がある向きに傾けられると、一部のキャラクタについては飛び出し量が多くなるように、残りのキャラクタについては飛び出し量が少なくなるように、ディスプレイ101の表示が変更される。なお、筐体150が逆の向きに傾けられると、それぞれのキャラクタの飛び出し量が逆向きに変更されても良い。 Thereby, for example, when a plurality of characters are included in the stereoscopic content, when the casing 150 is tilted in a certain direction, the pop-out amount is small for the remaining characters so that the pop-out amount is increased for some characters. Thus, the display on the display 101 is changed. Note that when the casing 150 is tilted in the opposite direction, the pop-out amount of each character may be changed in the opposite direction.
 他の例としては、立体コンテンツにテキストとキャラクタが含まれる場合、筐体150がある向きに傾けられると、キャラクタについては飛び出し量が多くなるように、テキストについては飛び出し量が少なくなるように、ディスプレイ101の表示が変更される。 As another example, when text and characters are included in the three-dimensional content, when the casing 150 is tilted in a certain direction, the pop-out amount is increased for the character, and the pop-out amount is decreased for the text. The display on the display 101 is changed.
 When the housing 150 is tilted in the opposite direction, the pop-out amounts of the character and the text may each be changed in the opposite direction. Before the housing 150 is tilted, the text and the character may be displayed so as to lie on the same plane in the content, and when the housing 150 is tilted, the pop-out amount of each object may be controlled so that the pop-out amounts of the text and the character (or of the plural types of characters) are changed as described above.
 Even for a single character, the change mode of the pop-out amount may be adjusted for each part of the character. For example, when the housing 150 is tilted in the direction of the arrow A11 in FIG. 5(A), the display on the display 101 may be changed so that the pop-out amount increases for the upper body of the character and decreases for the lower body.
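 As a minimal illustration of this per-part adjustment, the change mode could be keyed by an (object type, part) pair rather than by object type alone; the keys, values, and function name below are hypothetical and only sketch the idea.

```python
# Hypothetical per-part change modes: when the housing is tilted as in arrow
# A11 of FIG. 5(A), the character's upper body pops out more, the lower body less.
CHANGE_MODE_BY_PART = {
    ("character", "upper_body"): +1,
    ("character", "lower_body"): -1,
}

def part_pop_out(obj_type: str, part: str, base_amount: float) -> float:
    """Pop-out offset for one part of an object; 0 keeps its current depth."""
    return CHANGE_MODE_BY_PART.get((obj_type, part), 0) * base_amount

print(part_pop_out("character", "upper_body", 4.0))  # +4.0
print(part_pop_out("character", "lower_body", 4.0))  # -4.0
```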
 <Modification (6)>
 In the embodiment described above, the tilt of the housing 150 is detected by the acceleration sensor 106, but the detection of the tilt of the housing in the content display device of the present invention is not limited to this. For example, as shown in FIG. 15, the mobile phone 100 may include a camera 106A instead of the acceleration sensor 106 and detect the tilt of the housing 150 by processing images captured by the camera 106A.
 FIG. 16 is a functional block diagram of the mobile phone 100 according to this modification.
 Referring to FIG. 16, in this modification an image processing unit 196 is provided instead of the posture detection unit 194 (see FIG. 4).
 The image processing unit 196 detects the tilt of the housing 150 by processing a plurality of images captured by the camera 106A at different timings. It is realized, for example, by the CPU 110 executing a predetermined program, but it may instead be realized by a dedicated circuit.
 For example, the image processing unit 196 detects, by the normalized cross-correlation method, the overlapping portion of two images captured at two different timings, and thereby calculates the direction in which the camera 106A moved between the two timings when the two images are arranged in space. In this modification, the calculated direction of movement is taken as the direction of movement of the housing 150.
 In this modification, any other calculation method may be employed as long as the direction of movement of the housing 150 is calculated based on the captured images.
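 One common way to realize the normalized cross-correlation step is template matching between consecutive frames. The sketch below, using OpenCV, is only an illustration of that general technique under assumed patch sizes and sign conventions; it is not the patent's implementation.

```python
# Estimate how the camera moved between two grayscale frames by locating,
# with normalized cross-correlation, where a central patch of the first
# frame reappears in the second frame. Patch size and signs are assumptions.
import cv2
import numpy as np

def estimate_motion_direction(frame1_gray: np.ndarray, frame2_gray: np.ndarray):
    """Return an approximate (dx, dy) movement of the camera between frames."""
    h, w = frame1_gray.shape
    y0, x0 = h // 4, w // 4
    patch = frame1_gray[y0:y0 + h // 2, x0:x0 + w // 2]  # central template
    # Normalized cross-correlation of the patch against the second frame.
    response = cv2.matchTemplate(frame2_gray, patch, cv2.TM_CCORR_NORMED)
    _, _, _, best = cv2.minMaxLoc(response)  # best = (x, y) of the highest score
    scene_dx, scene_dy = best[0] - x0, best[1] - y0
    # The camera moves roughly opposite to the apparent shift of the scene.
    return -scene_dx, -scene_dy
```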
 <Other Modifications>
 In the embodiment and the modifications described above, the display 101, which is an example of the display device, is provided integrally with the housing 150, but the present invention is not limited to this. The pop-out amount of the stereoscopic content on the display 101 may be changed according to the tilt of a housing 150 configured separately from the display 101.
 The embodiment disclosed herein should be considered in all respects as illustrative and not restrictive. The scope of the present invention is defined by the terms of the claims rather than by the description above, and is intended to include all modifications within the meaning and scope equivalent to the terms of the claims.
 100 mobile phone, 101 display, 102 touch sensor, 103 touch panel, 104 operation unit, 104A operation switch, 106 acceleration sensor, 106A camera, 107 microphone, 108 speaker, 109 memory interface, 150 housing, 191 central processing unit, 192 pop-out amount acquisition unit, 193 display information generation unit, 194 posture detection unit, 195 communication unit, 196 image processing unit, 200 recording medium, 900 object, 1200 content.

Claims (6)

  1.  A content display device (100) comprising:
     a housing (150);
     a display device (101);
     a memory (111) for storing stereoscopic content;
     a control unit (101) for causing the display device to display the stereoscopic content stored in the memory; and
     a detection unit (106, 106A) for detecting a tilt of the housing in a first direction,
     wherein the control unit changes the pop-out amount of the stereoscopic content so as to increase it when the detection unit detects a tilt in one orientation of the first direction, and changes the pop-out amount of the stereoscopic content so as to decrease it when the detection unit detects a tilt in the other orientation of the first direction, different from the one orientation.
  2.  The content display device according to claim 1, wherein
     the detection unit detects an amount of tilt of the housing in the first direction, and
     the control unit changes the pop-out amount of the stereoscopic content by an amount corresponding to the amount of tilt.
  3.  The content display device according to claim 2, wherein the control unit changes the pop-out amount of the stereoscopic content on the condition that the amount of tilt exceeds a specific amount.
  4.  The content display device according to any one of claims 1 to 3, wherein
     the detection unit detects a tilt of the housing in a second direction intersecting the first direction, and
     when the detection unit detects a tilt in the second direction, the control unit changes the portion of the stereoscopic content to be displayed on the display device along the second direction.
  5.  A stereoscopic content display method executed in a content display device including a housing, a display device, and a memory for storing stereoscopic content, the method comprising:
     detecting a tilt of the housing in a first direction; and
     changing the pop-out amount of the stereoscopic content so as to increase it when a tilt in one orientation of the first direction is detected, and changing the pop-out amount of the stereoscopic content so as to decrease it when a tilt in the other orientation of the first direction, different from the one orientation, is detected.
  6.  A recording medium storing a computer-readable content display program for causing a computer including a housing, a display device, and a memory for storing stereoscopic content to function as a content display device,
     the program causing the computer to execute:
     detecting a tilt of the housing in a first direction; and
     changing the pop-out amount of the stereoscopic content so as to increase it when a tilt in one orientation of the first direction is detected, and changing the pop-out amount of the stereoscopic content so as to decrease it when a tilt in the other orientation of the first direction, different from the one orientation, is detected.
PCT/JP2011/080033 2010-12-27 2011-12-26 Content display device, content display method, and recording medium WO2012090920A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010289971A JP2012138776A (en) 2010-12-27 2010-12-27 Content display device, content display method, and program for content display
JP2010-289971 2010-12-27

Publications (1)

Publication Number Publication Date
WO2012090920A1 true WO2012090920A1 (en) 2012-07-05

Family

ID=46383023

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/080033 WO2012090920A1 (en) 2010-12-27 2011-12-26 Content display device, content display method, and recording medium

Country Status (2)

Country Link
JP (1) JP2012138776A (en)
WO (1) WO2012090920A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5689707B2 (en) * 2011-02-15 2015-03-25 任天堂株式会社 Display control program, display control device, display control system, and display control method
JP5745497B2 (en) * 2012-12-04 2015-07-08 任天堂株式会社 Display system, display control apparatus, information processing program, and display method
US10659755B2 (en) * 2015-08-03 2020-05-19 Sony Corporation Information processing device, information processing method, and program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009084213A1 (en) * 2007-12-28 2009-07-09 Capcom Co., Ltd. Computer, program, and storage medium
JP2010257160A (en) * 2009-04-23 2010-11-11 Nec Casio Mobile Communications Ltd Terminal equipment, display method, and program

Also Published As

Publication number Publication date
JP2012138776A (en) 2012-07-19

Similar Documents

Publication Publication Date Title
US11934228B2 (en) Method for providing image using foldable display and electronic device for supporting the same
CN108710525B (en) Map display method, device, equipment and storage medium in virtual scene
US9507428B2 (en) Electronic device, control method, and control program
JP5739671B2 (en) Information processing program, information processing apparatus, information processing system, and information processing method
US9501204B2 (en) Display device
US11683470B2 (en) Determining inter-pupillary distance
EP1570886A1 (en) Game apparatus and recording medium storing a game program
JP2014135086A (en) Three dimensional user interface effects on display by using properties of motion
CN112044065B (en) Virtual resource display method, device, equipment and storage medium
EP2902998A1 (en) Display device, control system, and control programme
WO2012133226A1 (en) Electronic apparatus, control method, and control program
CN112245912B (en) Sound prompting method, device, equipment and storage medium in virtual scene
JP2013176529A (en) Apparatus and method for changing position of virtual camera based on changed game state
WO2012090920A1 (en) Content display device, content display method, and recording medium
US9619048B2 (en) Display device
JP2012252592A (en) Display control program, display control apparatus, display control method, and display control system
CN112717409A (en) Virtual vehicle control method and device, computer equipment and storage medium
JP6788294B2 (en) Display control device and program
JP6788295B2 (en) Display control device and program
JP7210153B2 (en) ELECTRONIC DEVICE, ELECTRONIC DEVICE CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM
JP6126195B2 (en) Display device
JP2016186727A (en) Video display device, video display method, and program
JP2015109092A (en) Display device
JP2012185273A (en) Information display device, display control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11852600

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11852600

Country of ref document: EP

Kind code of ref document: A1