JP2010026638A - Mobile type image display device, control method thereof, program, and information storage medium - Google Patents


Info

Publication number
JP2010026638A
Authority
JP
Japan
Prior art keywords
display screen
plurality
user
finger
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2008184858A
Other languages
Japanese (ja)
Other versions
JP2010026638A5 (en)
JP5205157B2 (en)
Inventor
Shinichi Hirata
Hiroshi Osawa
Original Assignee
Sony Computer Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc
Priority to JP2008184858A
Priority claimed from PCT/JP2009/056004 (WO2010007813A1)
Publication of JP2010026638A
Publication of JP2010026638A5
Application granted
Publication of JP5205157B2
Application status: Active
Anticipated expiration

Abstract

PROBLEM TO BE SOLVED: To provide a mobile image display device that enables a user to easily perform various operation inputs.

SOLUTION: The mobile image display device is provided with a substantially rectangular display screen and a plurality of touch sensors arranged along at least two of the sides defining the circumference of the display screen, each detecting a position touched by a user's finger. The device changes the image displayed on the screen in accordance with the combination of the finger positions detected by the plurality of touch sensors.

COPYRIGHT: (C)2010, JPO&INPIT

Description

  The present invention relates to a portable image display device carried by a user, a control method thereof, a program, and an information storage medium.

In recent years, various image display devices that can be carried around by a user and that have a display screen for presenting various kinds of information, such as portable game machines, cellular phones, and personal digital assistants (PDAs), have appeared (for example, see Patent Document 1). Such a portable image display device executes various types of information processing in response to the user's operation inputs and displays the results on its display screen.
US Patent Application Publication No. 2007/0202956

  The portable image display device described above preferably includes a user interface that allows the user to easily perform various operation inputs.

  The present invention has been made in view of the above circumstances, and one of its objects is to provide a portable image display device that allows a user to easily perform various operation inputs, as well as a control method thereof, a program, and an information storage medium.

  A portable image display device according to the present invention for solving the above problems comprises: a substantially rectangular display screen; a plurality of touch sensors provided along each of at least two sides constituting the outer periphery of the display screen, each detecting a position touched by a user's finger; and display image control means for changing an image displayed on the display screen in accordance with a combination of the positions of a plurality of fingers detected by the plurality of touch sensors.

  In the portable image display device, the display image control means may change the image displayed on the display screen according to a combination of the movements of the plurality of finger positions detected by the plurality of touch sensors.

  Further, the display image control means may change the image displayed on the display screen according to a combination of the movement directions of the plurality of finger positions detected by the plurality of touch sensors.

  The portable image display device may also include a substantially rectangular flat-plate-shaped housing, with the display screen provided on a surface of the housing and each of the plurality of touch sensors provided on a side surface of the housing.

  A control method of a portable image display device according to the present invention is a method for controlling a portable image display device comprising a substantially rectangular display screen and a plurality of touch sensors provided along each of at least two sides constituting the outer periphery of the display screen, each detecting a position touched by a user's finger. The method includes a step of changing an image displayed on the display screen in accordance with a combination of the positions of a plurality of fingers detected by each of the plurality of touch sensors.

  A program according to the present invention causes a computer to function as a portable image display device comprising a substantially rectangular display screen and a plurality of touch sensors provided along each of at least two sides constituting the outer periphery of the display screen, each detecting a position touched by a user's finger. The program causes the computer to function as display image control means for changing an image displayed on the display screen in accordance with a combination of the positions of a plurality of fingers detected by each of the plurality of touch sensors. This program may be stored in a computer-readable information storage medium.

  Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.

  FIG. 1A and FIG. 1B are perspective views showing the appearance of a portable image display device 1 according to an embodiment of the present invention. FIG. 1A shows the portable image display device 1 as seen from the front side, and FIG. 1B shows the same device as seen from the back side.

  As shown in these drawings, the housing 10 of the portable image display device 1 has a substantially rectangular flat-plate shape as a whole, with a first display screen 12a provided on its front surface and a second display screen 12b on its back surface. Each of these display screens has a substantially rectangular shape and is arranged so as to occupy most of the front or back surface of the housing 10. The first display screen 12a and the second display screen 12b may be any of various devices capable of displaying images, such as liquid crystal display panels and organic EL display panels. In addition, a touch panel capable of detecting the position touched by a user's finger or the like may be overlaid on these display screens.

  Furthermore, in the portable image display device 1 according to the present embodiment, four touch sensors are provided along the four sides constituting the outer periphery of each display screen. These touch sensors are arranged on the four side surfaces of the housing 10, and each has a linear detection region. By being arranged on the side surfaces of the housing 10 in this way, each touch sensor runs along both one side of the outer periphery of the first display screen 12a on the front surface of the housing 10 and one side of the outer periphery of the second display screen 12b on the back surface.

  Specifically, in the present embodiment, a first touch sensor 14a and a third touch sensor 14c are arranged on the side surfaces along the long-side direction (left-right direction) of the first display screen 12a and the second display screen 12b, and a second touch sensor 14b and a fourth touch sensor 14d are arranged on the side surfaces along the short-side direction (vertical direction) of those screens.

  These touch sensors are used to detect positions touched by the fingers of the user operating the portable image display device 1. Each touch sensor may be of any of various types, such as capacitive, pressure-sensitive, or optical. Each touch sensor may detect not only the position touched by the user's finger but also the strength with which the finger presses the sensor, for example by detecting the contact area or pressure of the finger. When these touch sensors detect the positions of the user's fingers, the portable image display device 1 can accept the finger positions along the outer periphery of the first display screen 12a or the second display screen 12b as operation inputs.

  A first far-infrared sensor 16a is provided on the front surface of the housing 10 adjacent to the first display screen 12a, and a second far-infrared sensor 16b is provided on the back surface of the housing 10 adjacent to the second display screen 12b. These far-infrared sensors detect far-infrared rays emitted from a heat source, allowing the portable image display device 1 to detect whether a user is present in front of each sensor. Instead of, or in addition to, the far-infrared sensors, the portable image display device 1 may include other devices capable of detecting the position of the user, such as a CCD camera.

  Further, an acceleration sensor 18 and a gyro sensor 20 are disposed inside the housing 10 of the portable image display device 1. The acceleration sensor 18 is a three-axis acceleration sensor and detects the acceleration produced along each of three reference axes (X axis, Y axis, and Z axis) set with respect to the housing 10. Here, the three reference axes are substantially orthogonal to each other, with the X axis set along the long-side direction of the rectangular surface of the housing 10, the Y axis along its short-side direction, and the Z axis along the thickness direction of the housing 10. When the acceleration sensor 18 detects the acceleration that gravity produces along each reference axis, the portable image display device 1 can detect its own posture (that is, the inclination of the housing 10 with respect to the vertical direction in which gravity acts).
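
  As a non-limiting illustration of this posture calculation (the patent itself contains no code; the function name and units below are assumptions), the inclination of the housing can be estimated from the gravity components measured along the three reference axes:

```python
import math

def tilt_from_gravity(ax: float, ay: float, az: float):
    """Estimate the housing posture from the gravity components (m/s^2)
    measured by the three-axis acceleration sensor along the X, Y, and Z
    reference axes defined above.

    Returns (pitch, roll) in radians: pitch is the inclination of the
    long-side (X) axis from the horizontal, roll that of the short-side
    (Y) axis.
    """
    pitch = math.atan2(ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, math.sqrt(ax * ax + az * az))
    return pitch, roll
```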

  The gyro sensor 20 detects the angular velocity of rotation about the gyro reference axis (here, the Z axis), and outputs an electrical signal corresponding to the detected angular velocity. For example, the gyro sensor 20 may be a piezoelectric vibration gyro. By integrating the angular velocity detected by the gyro sensor 20, the portable image display device 1 can calculate the rotation angle of the housing 10 with respect to the Z axis.
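
  The rotation-angle calculation mentioned here can be sketched as follows; the fixed sampling period is an assumption for illustration:

```python
def rotation_angle(angular_velocities, dt=0.01):
    """Integrate angular-velocity samples (rad/s, one every dt seconds)
    from the gyro sensor to obtain the accumulated rotation angle (rad)
    of the housing about the Z axis."""
    angle = 0.0
    for omega in angular_velocities:
        angle += omega * dt  # rectangular (Euler) integration
    return angle
```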

  Although not shown in FIG. 1, the portable image display device 1 may include, in addition to the touch sensors, various operation members for receiving the user's operation inputs, such as buttons and switches, on its front surface, back surface, side surfaces, and so on.

  FIG. 2 is a block diagram showing the internal configuration of the portable image display device 1. As shown in the figure, the portable image display device 1 includes a control unit 30, a storage unit 32, and an image processing unit 34. The control unit 30 is, for example, a CPU, and executes various types of information processing according to programs stored in the storage unit 32. The storage unit 32 is, for example, a memory element such as a RAM or a ROM, a disk device, or the like, and stores the programs executed by the control unit 30 and various data. The storage unit 32 also functions as a work memory for the control unit 30.

  The image processing unit 34 includes, for example, a GPU and frame buffer memories, and draws the images to be displayed on the first display screen 12a and the second display screen 12b in accordance with instructions output from the control unit 30. As a specific example, the image processing unit 34 includes two frame buffer memories corresponding to the first display screen 12a and the second display screen 12b, respectively, and the GPU writes an image to each of the two frame buffer memories at predetermined time intervals according to instructions from the control unit 30. The images written in these frame buffer memories are converted into video signals at predetermined timings and displayed on the corresponding display screens.

  In the present embodiment, the control unit 30 executes various processes based on the detection results of the touch sensors 14a to 14d, the far-infrared sensors 16a and 16b, the acceleration sensor 18, and the gyro sensor 20. In particular, the control unit 30 determines the content of the user's instruction operation based on the finger positions detected by the touch sensors 14a to 14d, executes processing according to that instruction, and displays the processing result on the display screen to present it to the user.

  Here, the display screen viewed by the user may be either the first display screen 12a or the second display screen 12b, depending on circumstances. The portable image display device 1 therefore identifies the display screen assumed to be mainly viewed by the user (hereinafter referred to as the main display screen) and displays the main image on it. On the display screen on the opposite side (hereinafter referred to as the sub display screen), auxiliary information related to the image on the main display screen may be displayed, or another image generated by a program different from the one generating the image on the main display screen may be displayed. The sub display screen may also display no image until it is switched to become the main display screen by a user operation or the like.

  Note that which of the first display screen 12a and the second display screen 12b is used as the main display screen may be switched according to, for example, information indicating on which side of the housing 10 the user is present, as detected by the first far-infrared sensor 16a and the second far-infrared sensor 16b, or according to the attitude of the housing 10 detected by the acceleration sensor 18, or the like. Alternatively, it may be switched according to an explicit instruction operation by the user.

  Hereinafter, specific examples of the processing executed by the control unit 30 according to the detection results of the touch sensors in the present embodiment will be described. In the present embodiment, the control unit 30 executes processing that changes the image displayed on the main display screen according to a combination of the positions of the user's fingers detected by the touch sensors 14a to 14d. To execute such processing, the portable image display device 1 functionally includes a detection position/information acquisition unit 40, an operation type determination unit 42, and a display image control processing unit 44, as shown in FIG. 3. These functions are realized by the control unit 30 executing a program stored in the storage unit 32. This program may be provided stored in any of various computer-readable information storage media, such as an optical disc or a memory card, or may be provided via a communication network such as the Internet.

  The detection position/information acquisition unit 40 acquires the coordinate values that the touch sensors 14a to 14d output upon detecting the positions of the user's fingers. In the present embodiment, when each of the touch sensors 14a to 14d detects a position touched by a user's finger, it outputs a coordinate value indicating that position at predetermined time intervals. The coordinate value output by each touch sensor is a one-dimensional value indicating a position within its linear detection region. The detection position/information acquisition unit 40 acquires the coordinate values output by each of the four touch sensors at the predetermined time intervals, so that up to four coordinate value sequences, each indicating the movement (change over time) of a finger position, are obtained corresponding to the four touch sensors. The detection position/information acquisition unit 40 may acquire not only the coordinate value indicating each detected position but also a pressure value indicating the strength with which the user presses that position. In that case, the detection position/information acquisition unit 40 may adopt, as coordinate values indicating positions actually touched by the user's fingers, only those whose pressure value is equal to or greater than a predetermined threshold.
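
  A minimal sketch of this acquisition step follows, assuming four sensors sampled at a fixed interval and a normalized pressure scale; all names and the threshold value are illustrative:

```python
PRESSURE_THRESHOLD = 0.2  # hypothetical cutoff on a normalized pressure scale

class DetectionAcquisition:
    """Builds one coordinate value sequence per touch sensor."""

    def __init__(self, num_sensors: int = 4):
        self.sequences = [[] for _ in range(num_sensors)]

    def sample(self, readings):
        """readings: per-sensor (coordinate, pressure) tuple, or None when
        nothing is touching. A coordinate extends its sensor's sequence only
        if the accompanying pressure reaches the threshold."""
        for i, reading in enumerate(readings):
            if reading is None:
                continue
            coordinate, pressure = reading
            if pressure >= PRESSURE_THRESHOLD:
                self.sequences[i].append(coordinate)
```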

  The operation type determination unit 42 determines the type of operation performed by the user based on the plurality of coordinate value sequences, indicating the movements of the user's finger positions, acquired by the detection position/information acquisition unit 40. For example, for each of the four touch sensors, the operation type determination unit 42 first determines whether the user's finger remained in contact over a predetermined period and whether it moved by a predetermined amount or more within that period. When there are a plurality of touch sensors on which the user's finger is determined to have moved by the predetermined amount or more, the operation type determination unit 42 determines the type of operation according to the combination of those touch sensors and the directions of finger movement (that is, whether the coordinate values in each coordinate value sequence increased or decreased). In this way, the operation type can be determined according to the combination of the movements (here, the movement directions) of the user's finger positions detected by the plurality of touch sensors.
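
  The per-sensor movement judgment can be sketched as follows; the duration and travel thresholds are illustrative assumptions:

```python
MIN_SAMPLES = 10   # hypothetical "predetermined period", in samples
MIN_TRAVEL = 0.05  # hypothetical "predetermined amount" of movement

def movement_direction(sequence):
    """Classify one coordinate value sequence as '+' (coordinates increased),
    '-' (decreased), or None (no sustained movement)."""
    if len(sequence) < MIN_SAMPLES:
        return None
    travel = sequence[-1] - sequence[0]
    if abs(travel) < MIN_TRAVEL:
        return None
    return '+' if travel > 0 else '-'
```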

  The display image control processing unit 44 changes the image displayed on the main display screen according to the operation type determined by the operation type determination unit 42. Specifically, it executes image processing that, for example, enlarges or reduces the image displayed on the screen, or scrolls (translates) it in the horizontal or vertical direction.

  Hereinafter, specific examples of how the image changes according to the type of user operation will be described with reference to FIGS. 4 to 9. In these figures, the first display screen 12a is the main display screen, and the touch sensors, which are actually arranged on the side surfaces of the housing 10, are schematically shown side by side along the four sides constituting the outer periphery of the first display screen 12a. The movement of the user's fingers is indicated by an arrow on each touch sensor.

  FIG. 4 shows an example in which the control unit 30 scrolls the image displayed on the first display screen 12a in the left-right direction in accordance with a user operation. In the example of this figure, the first touch sensor 14a and the third touch sensor 14c, both arranged along the left-right direction of the display screen, detect movement of the user's fingers in the left direction (that is, the X-axis negative direction) as viewed toward the display screen. In this case, the control unit 30 scrolls the image displayed on the first display screen 12a in the same left direction in which the user's fingers moved. Conversely, when the first touch sensor 14a and the third touch sensor 14c detect movement of the user's fingers in the right direction (X-axis positive direction) as viewed toward the display screen, the control unit 30 scrolls the image displayed on the first display screen 12a to the right.

  FIG. 5 shows an example in which the image displayed on the first display screen 12a is scrolled up and down. As shown in this figure, when the second touch sensor 14b and the fourth touch sensor 14d, both arranged along the vertical direction of the display screen, detect movement of the user's fingers in the upward direction (Y-axis positive direction) as viewed toward the display screen, the control unit 30 scrolls the image displayed on the first display screen 12a upward. When downward movement of the user's fingers (Y-axis negative direction) is detected, the control unit 30 scrolls the display image downward.

  FIG. 6 shows an example in which the user instructs the device to reduce the image. In the example of this figure, the first touch sensor 14a detects movement of the user's finger in the X-axis positive direction, and the second touch sensor 14b detects movement of the user's finger in the Y-axis negative direction. That is, two touch sensors adjacent to each other each detect movement of a finger in a direction approaching the other touch sensor. In this case, the control unit 30 executes a process of reducing the image displayed on the first display screen 12a. The image reduction process may similarly be executed when the second touch sensor 14b and the third touch sensor 14c detect finger movements in the Y-axis positive and X-axis positive directions, respectively, when the third touch sensor 14c and the fourth touch sensor 14d detect finger movements in the X-axis negative and Y-axis positive directions, respectively, or when the fourth touch sensor 14d and the first touch sensor 14a detect finger movements in the Y-axis negative and X-axis negative directions, respectively.

  Conversely, when two adjacent touch sensors each detect movement of a user's finger in a direction away from the other touch sensor, the control unit 30 executes a process of enlarging the image displayed on the first display screen 12a. FIG. 7 shows an example of such an image enlargement instruction by the user, in which the fourth touch sensor 14d detects movement of the user's finger in the Y-axis positive direction and the first touch sensor 14a detects movement in the X-axis positive direction. When such a combination of finger movements, opposite in direction to the image reduction instruction described above, is detected, the control unit 30 executes a process of enlarging the image displayed on the first display screen 12a.

  FIGS. 8 and 9 show examples of finger movements used to instruct rotation. In the example of FIG. 8, the first touch sensor 14a and the third touch sensor 14c, which face each other, detect movements of the user's fingers in mutually opposite directions: the first touch sensor 14a detects movement in the X-axis positive direction and the third touch sensor 14c in the X-axis negative direction. In response, the control unit 30 rotates the image displayed on the first display screen 12a counterclockwise. Conversely, when the first touch sensor 14a detects movement of the user's finger in the X-axis negative direction and the third touch sensor 14c in the X-axis positive direction, the control unit 30 rotates the display image clockwise.

  FIG. 9 shows an example in which the second touch sensor 14b and the fourth touch sensor 14d, which likewise face each other, detect movements of the user's fingers in mutually opposite directions. In the example of this figure, the second touch sensor 14b detects movement of the user's finger in the Y-axis positive direction and the fourth touch sensor 14d in the Y-axis negative direction, and the control unit 30 accordingly rotates the display image counterclockwise. If the second touch sensor 14b detects movement in the Y-axis negative direction and the fourth touch sensor 14d in the Y-axis positive direction, the control unit 30 conversely rotates the display image clockwise.
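
  The combinations of FIGS. 4 to 9 can be summarized in one direction table. The sketch below restates them under the axis conventions given earlier (sensors 0 to 3 correspond to 14a to 14d, and '+'/'-' to increasing/decreasing coordinates along each sensor's axis); it is an illustrative reconstruction, not the claimed implementation:

```python
def classify_operation(directions):
    """directions: mapping from sensor index (0=14a .. 3=14d) to '+'/'-'
    for the sensors judged to have moved; returns an operation name."""
    d = directions.get
    if d(0) == '-' and d(2) == '-':
        return 'scroll_left'    # FIG. 4
    if d(0) == '+' and d(2) == '+':
        return 'scroll_right'
    if d(1) == '+' and d(3) == '+':
        return 'scroll_up'      # FIG. 5
    if d(1) == '-' and d(3) == '-':
        return 'scroll_down'
    if d(0) == '+' and d(1) == '-':
        return 'zoom_out'       # FIG. 6 (other adjacent pairs analogous)
    if d(3) == '+' and d(0) == '+':
        return 'zoom_in'        # FIG. 7 (other adjacent pairs analogous)
    if d(0) == '+' and d(2) == '-':
        return 'rotate_ccw'     # FIG. 8
    if d(0) == '-' and d(2) == '+':
        return 'rotate_cw'
    if d(1) == '+' and d(3) == '-':
        return 'rotate_ccw'     # FIG. 9
    if d(1) == '-' and d(3) == '+':
        return 'rotate_cw'
    return None  # no recognized combination
```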

  In the above examples, it is assumed that the first display screen 12a is the main display screen, but the control unit 30 executes the same processing when the second display screen 12b is selected as the main display screen, allowing the user to perform operation inputs that change the image displayed on that screen. In this case, however, the control unit 30 changes the correspondence between the movement directions of the finger positions detected by each touch sensor and the scroll and rotation directions of the image, depending on which display screen is selected as the main display screen and on the orientation in which the image is displayed on it. The image can thereby be changed in accordance with the direction in which the user actually moves his or her fingers.

  In the above examples, the control unit 30 may change the image displayed on the first display screen 12a according not only to the movement direction of the user's fingers detected by each touch sensor but also to the movement amount. For example, the scroll amount and scroll speed, the enlargement/reduction ratio and speed, and the rotation amount and speed used when changing the image may be varied according to the amount of finger movement. Further, although in the above examples the control unit 30 scrolls, enlarges/reduces, or rotates the entire image displayed on the main display screen, it may instead change the image on the main display screen by moving, enlarging, reducing, or rotating only a part of the image or an operation target object arranged within it, according to the movement of the user's fingers.

  In the description so far, the display image is changed according to a combination of the movements of a plurality of finger positions. However, the present invention is not limited to this; the control unit 30 may execute various processes that change the display image in response to, for example, an operation in which the user simply touches a plurality of predetermined positions with his or her fingers.

  Further, in the description so far, each of the touch sensors 14a to 14d detects only one position touched by a user's finger. However, the present invention is not limited to this, and each of the touch sensors 14a to 14d may be of a multipoint detection type capable of detecting the positions of a plurality of fingers touching it at the same time. In this case, the control unit 30 may change the image displayed on the main display screen according to a combination of the positions of a plurality of fingers detected by a single touch sensor.

  Specifically, when one touch sensor detects the positions of a plurality of fingers, the detection position/information acquisition unit 40 acquires the plurality of coordinate values output by that sensor. Then, using information such as the distances between the acquired coordinate values (that is, the order of the positions they indicate within the detection region) and the finger positions detected in the previous sample, it associates each of the acquired coordinate values with the coordinate value, among those output by the touch sensor last time, that is estimated to indicate a position touched by the same finger. In this way, the detection position/information acquisition unit 40 can acquire, from the output of a single touch sensor, a plurality of coordinate value sequences each indicating the movement of one of the user's fingers. By using these coordinate value sequences in the same manner as the sequences acquired from the plurality of touch sensors in the examples described above, the operation type determination unit 42 determines the type of operation performed by the user.
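
  One way to perform this association, assuming nothing beyond what is described above, is a greedy nearest-neighbor match against the previous sample; the jump limit is an illustrative assumption:

```python
MAX_JUMP = 0.1  # hypothetical limit on how far one finger moves per sample

def associate_fingers(previous, current):
    """Match each coordinate in the current sample to the nearest unused
    coordinate from the previous sample, treating the pair as the same
    finger. Returns (previous_index_or_None, coordinate) pairs; unmatched
    coordinates start new sequences."""
    unused = set(range(len(previous)))
    pairs = []
    for coordinate in sorted(current):
        best, best_distance = None, MAX_JUMP
        for i in unused:
            distance = abs(previous[i] - coordinate)
            if distance < best_distance:
                best, best_distance = i, distance
        if best is not None:
            unused.discard(best)
        pairs.append((best, coordinate))
    return pairs
```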

  FIG. 10 shows an example of finger movements detected by such a multipoint detection type touch sensor. In the example of this figure, the first touch sensor 14a detects two of the user's fingers moving toward each other. In this case, as in the example of FIG. 6, the display image control processing unit 44 executes a process of reducing the display image. Conversely, when movement of the two fingers away from each other is detected, the display image control processing unit 44 executes a process of enlarging the display image. Not only the first touch sensor 14a but also the other three touch sensors may likewise reduce or enlarge the display image upon detecting the same kind of finger movement.

  When the touch sensors are of the multipoint detection type as described above, the coordinate value sequences acquired from the output of each touch sensor may be a mixture of sequences whose position does not change and sequences whose position changes with time. In such a case, the operation type determination unit 42 may judge that a coordinate value sequence whose position does not change corresponds to a position where the user's hand touches the sensor simply from gripping the housing 10, and may exclude such sequences from the operation type determination. In this way, even when a touch sensor detects finger or palm contact not intended by the user, only operations in which the user deliberately moves a finger on a touch sensor can be identified as user operations on the plurality of touch sensors, as illustrated in FIGS. 4 to 9.
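
  The exclusion of stationary contacts can be sketched as a simple test; the tolerance value is an illustrative assumption:

```python
GRIP_TOLERANCE = 0.01  # hypothetical maximum drift of a gripping contact

def is_grip_contact(sequence):
    """A sequence that never strays more than GRIP_TOLERANCE from its first
    coordinate is treated as a hand gripping the housing and is excluded
    from the operation type determination."""
    return bool(sequence) and all(
        abs(c - sequence[0]) <= GRIP_TOLERANCE for c in sequence)
```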

  Further, even when a single touch sensor merely detects the movement of one of the user's fingers, the control unit 30 may execute various processes according to the detected finger position. For example, the control unit 30 may execute the same scrolling processes as in the examples of FIGS. 4 and 5, scrolling the display image up, down, left, or right when finger movement is detected by any one touch sensor. The control unit 30 may also execute various processes when a single touch sensor detects that the user's finger has simply touched a predetermined position. As an example, when the portable image display device 1 includes a camera device and it is detected that the user has pressed, with more than a predetermined strength, the position where the shutter button would be located on a typical camera, the control unit 30 may execute an image capturing process with the camera device.

  Here, an example of the processing executed by the operation type determination unit 42 on the plurality of coordinate value sequences output by the detection position/information acquisition unit 40, in order to determine the finger movements actually performed by the user, will be described. In the following example, the touch sensors 14a to 14d are of the multipoint detection type, and the detection position/information acquisition unit 40 can output a plurality of coordinate value sequences for each touch sensor.

  First, the operation type determination unit 42 applies to each coordinate value sequence a low-pass filter that passes only frequency components at or below a predetermined cutoff frequency (for example, about 5 Hz). This removes from the coordinate value sequences, which indicate the detected positions of the user's fingers, the relatively high-frequency signal components caused by hand tremor and the like.
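
  As an illustration of this step, a single-pole IIR low-pass filter with the 5 Hz cutoff mentioned above might look as follows; the 60 Hz sampling rate is an assumption:

```python
import math

def low_pass(sequence, cutoff_hz=5.0, sample_hz=60.0):
    """Single-pole IIR low-pass filter over one coordinate value sequence,
    attenuating components above cutoff_hz (hand tremor and the like)."""
    if not sequence:
        return []
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_hz)
    filtered = [sequence[0]]
    for x in sequence[1:]:
        filtered.append(filtered[-1] + alpha * (x - filtered[-1]))
    return filtered
```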

  Subsequently, the operation type determination unit 42 rearranges the low-pass-filtered coordinate value sequences according to a predetermined criterion. The ordering is such that the higher a coordinate value sequence ranks, the more likely it is to have been produced by a finger that the user placed on the touch sensor with the intention of performing an operation input. Specifically, the operation type determination unit 42 calculates an index value for each coordinate value sequence according to conditions such as those listed below, and sorts the sequences in order of the calculated index values. As one condition, a higher index value is calculated for a sequence whose constituent coordinate values were detected with higher pressure values; this lowers the rank of sequences produced when the user's finger touches a touch sensor unintentionally. A higher index value is also calculated for a sequence whose detected positions remain stably at roughly the same position or change stably in one direction; conversely, a sequence whose position changes extremely rapidly in a short time, or whose direction of change reverses repeatedly within a short time, may be the result of erroneous detection and is therefore given a low index value. In addition, a higher index value is calculated for a sequence whose detected positions have no detected positions of other sequences in their vicinity (that is, a sequence isolated from the others); when the detected positions of several sequences crowd into a relatively narrow range, those sequences are assumed to result from the user gripping the portable image display device 1, so their index values can be lowered. Further, the number of coordinate values constituting each sequence (that is, the period during which the user's finger remained in contact) may also be used in calculating the index value.
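
  The scoring described above can be sketched as follows; the weights and the way the four conditions are combined are arbitrary illustrative choices, not values taken from the embodiment:

```python
def index_value(sequence, pressures, other_last_positions):
    """Score one coordinate value sequence from the four conditions above:
    detection pressure, stability of movement, isolation from the other
    sequences, and contact duration."""
    score = sum(pressures) / len(pressures)                  # pressure
    reversals = sum(1 for a, b, c in zip(sequence, sequence[1:], sequence[2:])
                    if (b - a) * (c - b) < 0)
    score -= 0.1 * reversals                                 # stability
    nearest = min((abs(sequence[-1] - p) for p in other_last_positions),
                  default=1.0)
    score += 0.5 * min(nearest, 1.0)                         # isolation
    score += 0.01 * len(sequence)                            # duration
    return score

# Sequences are then sorted in descending order of index_value(...).
```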

  The operation type determination unit 42 then evaluates the coordinate value sequences in the rearranged order and determines the type of operation performed by the user by judging whether each sequence corresponds to a predetermined operation pattern. In this way, the operation type is determined by preferentially examining the sequences assumed to result from the user intentionally touching a touch sensor with a finger. A priority order may also be determined in advance among the operation types to be judged. That is, the portable image display device 1 stores in advance a table associating a priority with each of a plurality of operation patterns to be judged, and determines, for each operation pattern in descending order of priority, whether there is a coordinate value sequence corresponding to that pattern; whether each sequence corresponds to a pattern is judged in the rearranged order described above. As a specific example of this pattern order, in descending order of priority: an operation by the movement of two or more of the user's fingers, an operation by the movement of one finger, an operation in which one finger touches a predetermined position, and an operation in which one finger moves away from a predetermined position. When the operation types are judged in this order, the pattern first found to have a corresponding coordinate value sequence is determined to be the operation input actually performed by the user; operations with a low possibility of erroneous detection are thereby given priority. When there are several kinds of operations by the movement of two or more fingers, by the movement of one finger, and so on, a priority order may also be determined among those patterns. Furthermore, the priority order may be changed according to the content of the processing being executed by the application program; depending on the content (state) of that processing, the priority of an operation that the user is unlikely to perform, or of an operation whose erroneous determination should be avoided (such as an operation that finalizes the processing content), can be lowered.
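
  A sketch of this priority-ordered matching follows; the pattern tests are deliberately simplified stand-ins for the judgments described above:

```python
MIN_TRAVEL = 0.05  # hypothetical movement threshold, as before

def moved(sequence):
    return len(sequence) >= 2 and abs(sequence[-1] - sequence[0]) >= MIN_TRAVEL

# Operation patterns in descending order of priority, as described above.
PATTERNS = [
    ('multi_finger_move', lambda seqs: sum(map(moved, seqs)) >= 2),
    ('single_finger_move', lambda seqs: sum(map(moved, seqs)) == 1),
    ('touch_at_position', lambda seqs: any(s and not moved(s) for s in seqs)),
]

def determine_operation(ranked_sequences):
    """Scan the patterns in priority order over the sequences (already sorted
    by index value); the first matching pattern is taken as the operation the
    user actually performed."""
    for name, test in PATTERNS:
        if test(ranked_sequences):
            return name
    return None
```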

  When judging whether each coordinate value sequence corresponds to a predetermined operation pattern, the operation type determination unit 42 may add, as a condition, that the detected position changes by an amount exceeding a given play amount. This avoids judging that the user performed an operation of moving a finger in a predetermined direction when the finger merely brushed against a touch sensor slightly and unintentionally. The play amount may be a different value for each type of operation pattern; for example, it may differ between an operation pattern involving touch sensors facing each other, as illustrated in FIG. 5, and an operation pattern involving touch sensors adjacent to each other, as illustrated in FIG. 6.
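
  The play-amount condition can be expressed as follows; the per-pattern values are illustrative assumptions:

```python
PLAY_AMOUNT = {
    'facing_pair': 0.08,    # e.g. the FIG. 5 pattern (hypothetical value)
    'adjacent_pair': 0.04,  # e.g. the FIG. 6 pattern (hypothetical value)
}

def exceeds_play(sequence, pattern_kind):
    """Count a sequence as movement for a given pattern only when its total
    travel exceeds that pattern's play amount."""
    return (len(sequence) >= 2 and
            abs(sequence[-1] - sequence[0]) > PLAY_AMOUNT[pattern_kind])
```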

  According to the present embodiment described above, the image displayed on the display screen is changed according to the combination of the finger positions detected by the plurality of touch sensors provided along the outer periphery of the display screen. The user can thus easily and intuitively perform operation inputs on the displayed image simply by touching, with his or her fingers, the positions on the touch sensors that correspond to the display screen. Moreover, by detecting the positions of a plurality of fingers with a plurality of touch sensors and changing the image according to their combination, the portable image display device 1 according to the present embodiment can accept a wide variety of operation inputs from the user. Furthermore, since the touch sensors are provided along the outer periphery of the display screen, the display screen is not hidden by the user's fingers while these sensors are being touched and operated, making it difficult for the viewability of the screen to be impaired.

  The embodiments of the present invention are not limited to those described above.

  For example, when a touch sensor is provided on each side surface of the housing 10 of the portable image display device 1, each side surface may be formed in a concave shape, as viewed from a direction parallel to both the surface on which the display screen is provided and that side surface, with the touch sensor arranged linearly along the bottom face of the recess. FIG. 11 shows the cross-sectional shape of a portable image display device with such concave side surfaces. The device shown in this figure has the same structure as that shown in FIG. 1, except that its side surfaces are formed in a concave shape with a touch sensor arranged on each; the figure shows the device cut by a plane perpendicular to the display screens and passing through the first touch sensor 14a and the third touch sensor 14c. Arranging the touch sensors along the bottom faces of the recesses in this way allows the user to bring a finger into contact with a touch sensor when intentionally operating it, as shown in the figure, while preventing fingers or palms from touching the sensors unintentionally when the user merely holds the housing.

  Further, in the examples described above, each touch sensor is provided on a side surface of the housing 10; however, the touch sensors may instead be provided on the same surface as the display screens, adjacent to the sides constituting their outer peripheries. Moreover, touch sensors need not be provided along all four sides of the rectangle. As long as touch sensors are provided along each of at least two sides, for example one long side and one short side, the control unit 30 can change the image displayed on the display screen according to the combination of the finger positions detected by these sensors, and the user can intuitively perform various operations on the display screen, such as enlargement and reduction.

FIG. 1 is a perspective view showing the appearance of a portable image display device according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the internal configuration of the portable image display device according to the embodiment.
FIG. 3 is a functional block diagram showing an example of the functions of the portable image display device according to the embodiment.
FIG. 4 is an explanatory diagram showing an example of a user operation on the touch sensors.
FIG. 5 is an explanatory diagram showing another example of a user operation on the touch sensors.
FIG. 6 is an explanatory diagram showing another example of a user operation on the touch sensors.
FIG. 7 is an explanatory diagram showing another example of a user operation on the touch sensors.
FIG. 8 is an explanatory diagram showing another example of a user operation on the touch sensors.
FIG. 9 is an explanatory diagram showing another example of a user operation on the touch sensors.
FIG. 10 is an explanatory diagram showing another example of a user operation on the touch sensors.
FIG. 11 is a diagram showing the cross-sectional shape of another portable image display device according to an embodiment of the present invention.

Explanation of symbols

  1 portable image display device, 10 housing, 12a first display screen, 12b second display screen, 14a to 14d touch sensors, 16a and 16b far-infrared sensors, 18 acceleration sensor, 20 gyro sensor, 30 control unit, 32 storage unit, 34 image processing unit, 40 detection position/information acquisition unit, 42 operation type determination unit, 44 display image control processing unit.

Claims (7)

  1. A portable image display device comprising:
    a substantially rectangular display screen;
    a plurality of touch sensors provided along each of at least two sides constituting the outer periphery of the display screen, each detecting a position touched by a user's finger; and
    display image control means for changing an image displayed on the display screen in accordance with a combination of the positions of a plurality of fingers detected by each of the plurality of touch sensors.
  2. The portable image display device according to claim 1,
    The display image control means changes the image displayed on the display screen according to a combination of movements of the plurality of finger positions detected by the plurality of touch sensors.
  3. The portable image display device according to claim 2,
    The display image control means changes the image displayed on the display screen in accordance with a combination of the movement directions of the plurality of finger positions detected by the plurality of touch sensors.
  4. The portable image display device according to any one of claims 1 to 3,
    The portable image display device includes a substantially rectangular flat plate-shaped housing,
    The display screen is provided on a surface of the housing,
    Each of the plurality of touch sensors is provided on a side surface of the housing.
  5. A method for controlling a portable image display device comprising a substantially rectangular display screen and a plurality of touch sensors provided along each of at least two sides constituting the outer periphery of the display screen, each detecting a position touched by a user's finger, the method comprising:
    changing an image displayed on the display screen in accordance with a combination of the positions of a plurality of fingers detected by each of the plurality of touch sensors.
  6. A program for causing a computer to function as a portable image display device comprising a substantially rectangular display screen and a plurality of touch sensors provided along each of at least two sides constituting the outer periphery of the display screen, each detecting a position touched by a user's finger, the program causing the computer to function as:
    display image control means for changing an image displayed on the display screen in accordance with a combination of the positions of a plurality of fingers detected by each of the plurality of touch sensors.
  7.   A computer-readable information storage medium storing the program according to claim 6.
JP2008184858A 2008-07-16 2008-07-16 Portable image display device, control method thereof, program, and information storage medium Active JP5205157B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008184858A JP5205157B2 (en) 2008-07-16 2008-07-16 Portable image display device, control method thereof, program, and information storage medium

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2008184858A JP5205157B2 (en) 2008-07-16 2008-07-16 Portable image display device, control method thereof, program, and information storage medium
PCT/JP2009/056004 WO2010007813A1 (en) 2008-07-16 2009-03-25 Mobile type image display device, method for controlling the same and information memory medium
US13/003,751 US8854320B2 (en) 2008-07-16 2009-03-25 Mobile type image display device, method for controlling the same and information memory medium

Publications (3)

Publication Number Publication Date
JP2010026638A (en) 2010-02-04
JP2010026638A5 JP2010026638A5 (en) 2011-09-01
JP5205157B2 JP5205157B2 (en) 2013-06-05

Family

ID=41732440

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008184858A Active JP5205157B2 (en) 2008-07-16 2008-07-16 Portable image display device, control method thereof, program, and information storage medium

Country Status (1)

Country Link
JP (1) JP5205157B2 (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11143604A (en) * 1997-11-05 1999-05-28 Nec Corp Portable terminal equipment
JP2002354311A (en) * 2001-05-28 2002-12-06 Fuji Photo Film Co Ltd Portable electronic equipment
JP2004206206A (en) * 2002-12-24 2004-07-22 Casio Comput Co Ltd Input device and electronic device
JP2004213451A (en) * 2003-01-07 2004-07-29 Matsushita Electric Ind Co Ltd Information processor and frame
JP2005322087A (en) * 2004-05-10 2005-11-17 Sharp Corp Display
JP2006079499A (en) * 2004-09-13 2006-03-23 Fujitsu Component Ltd Display device having touch input function
WO2006096501A1 (en) * 2005-03-04 2006-09-14 Apple Computer, Inc. Hand held electronic device with multiple touch sensing devices
JP2007094708A (en) * 2005-09-28 2007-04-12 Kddi Corp Information terminal device
JP2007207275A (en) * 2007-04-27 2007-08-16 Sony Corp Information processor and information processing method
JP2009099067A (en) * 2007-10-18 2009-05-07 Sharp Corp Portable electronic equipment, and operation control method of portable electronic equipment

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010231653A (en) * 2009-03-27 2010-10-14 Softbank Mobile Corp Display device, display method, and program
JP2010245843A (en) * 2009-04-06 2010-10-28 Canon Inc Image display device
JP2011048717A (en) * 2009-08-28 2011-03-10 Shin Etsu Polymer Co Ltd Electronic equipment with variable resistance device, and computer program
JP2011086035A (en) * 2009-10-14 2011-04-28 Nec Corp Portable device, and image display control method, and device therefor
JP2011145829A (en) * 2010-01-13 2011-07-28 Buffalo Inc Operation input device
KR101119835B1 (en) * 2010-03-25 2012-02-28 이노디지털 주식회사 Remote controller having user interface of touch pad
JP2011209881A (en) * 2010-03-29 2011-10-20 Ntt Docomo Inc Mobile terminal
US9986643B2 (en) 2010-05-07 2018-05-29 Microchip Technology Germany Gmbh Circuit board for display and display module with display and circuit board
JP2013528831A (en) * 2010-05-07 2013-07-11 マイクロチップ テクノロジー ジャーマニー ツー ゲーエムベーハー ウント コンパニー カーゲー Circuit board for display and display module having display and circuit board
JP2012128668A (en) * 2010-12-15 2012-07-05 Nikon Corp Electronic device
JP2012145867A (en) * 2011-01-14 2012-08-02 Nikon Corp Electronic equipment
WO2012096313A1 (en) * 2011-01-14 2012-07-19 株式会社ニコン Electronic device
JP2012174247A (en) * 2011-02-24 2012-09-10 Kyocera Corp Mobile electronic device, contact operation control method, and contact operation control program
JP2012174246A (en) * 2011-02-24 2012-09-10 Kyocera Corp Mobile electronic device, contact operation control method, and contact operation control program
JP2012174252A (en) * 2011-02-24 2012-09-10 Kyocera Corp Electronic device, contact operation control program, and contact operation control method
JP2012174250A (en) * 2011-02-24 2012-09-10 Kyocera Corp Mobile electronic device, contact operation control method, and contact operation control program
JP2012194810A (en) * 2011-03-16 2012-10-11 Kyocera Corp Portable electronic apparatus, method for controlling contact operation, and contact operation control program
JP2012243204A (en) * 2011-05-23 2012-12-10 Nikon Corp Electronic device and image display method
KR101356743B1 (en) 2011-06-28 2014-02-03 (주)메이벨솔루션 Tochtoc terminal
JP2013025418A (en) * 2011-07-15 2013-02-04 Lenovo Singapore Pte Ltd Input device
JP2013073446A (en) * 2011-09-28 2013-04-22 Kyocera Corp Portable electronic device
JP2014528137A (en) * 2011-09-30 2014-10-23 インテル コーポレイション Mobile devices that eliminate unintentional touch sensor contact
US9317156B2 (en) 2011-09-30 2016-04-19 Intel Corporation Mobile device rejection of unintentional touch sensor contact
US10001871B2 (en) 2011-09-30 2018-06-19 Intel Corporation Mobile device rejection of unintentional touch sensor contact
JP2013117925A (en) * 2011-12-05 2013-06-13 Denso Corp Input device
US9541993B2 (en) 2011-12-30 2017-01-10 Intel Corporation Mobile device operation using grip intensity
US9507473B2 (en) 2012-02-29 2016-11-29 Zte Corporation Method for processing touch operation and mobile terminal
JP2015512101A (en) * 2012-02-29 2015-04-23 ゼットティーイー コーポレーションZte Corporation Touch operation processing method and mobile terminal
JP2013235359A (en) * 2012-05-08 2013-11-21 Tokai Rika Co Ltd Information processor and input device
JP2016501358A (en) * 2012-11-13 2016-01-18 ウーテーアー・エス・アー・マニファクチュール・オロロジェール・スイス Electronic watch startup mode
JP2014123197A (en) * 2012-12-20 2014-07-03 Casio Comput Co Ltd Input device, its input operation method and control program, and electronic apparatus
JPWO2014132882A1 (en) * 2013-02-27 2017-02-02 日本電気株式会社 Terminal device, information display method and program
WO2014132882A1 (en) * 2013-02-27 2014-09-04 Necカシオモバイルコミュニケーションズ株式会社 Terminal device, information display method and recording medium
JP2014191460A (en) * 2013-03-26 2014-10-06 Fujifilm Corp Portable type image display terminal and actuation method of the same
JP2015005182A (en) * 2013-06-21 2015-01-08 カシオ計算機株式会社 Input device, input method, program and electronic apparatus
JP2017033111A (en) * 2015-07-29 2017-02-09 東芝テック株式会社 Order reception system and management unit

Also Published As

Publication number Publication date
JP5205157B2 (en) 2013-06-05

Similar Documents

Publication Publication Date Title
KR101985159B1 (en) Systems and methods for using textures in graphical user interface widgets
US6400376B1 (en) Display control for hand-held data processing device
US9639258B2 (en) Manipulation of list on a multi-touch display
EP2732357B1 (en) Methods and systems for a virtual input device
US8674948B2 (en) Methods of interfacing with multi-point input devices and multi-point input systems employing interfacing techniques
KR101234968B1 (en) Touchpad Combined With A Display And Having Proximity And Touch Sensing Capabilities
US9244544B2 (en) User interface device with touch pad enabling original image to be displayed in reduction within touch-input screen, and input-action processing method and program
US9176542B2 (en) Accelerometer-based touchscreen user interface
US9069386B2 (en) Gesture recognition device, method, program, and computer-readable medium upon which program is stored
US8860672B2 (en) User interface with z-axis interaction
US10198854B2 (en) Manipulation of 3-dimensional graphical objects for view in a multi-touch display
US10042546B2 (en) Systems and methods to present multiple frames on a touch screen
US9733752B2 (en) Mobile terminal and control method thereof
JP5446624B2 (en) Information display device, information display method, and program
JP2010140321A (en) Information processing apparatus, information processing method, and program
US20120260220A1 (en) Portable electronic device having gesture recognition and a method for controlling the same
EP3511806A1 (en) Method and apparatus for displaying a picture on a portable device
US20100103136A1 (en) Image display device, image display method, and program product
US9507431B2 (en) Viewing images with tilt-control on a hand-held device
US20110012848A1 (en) Methods and apparatus for operating a multi-object touch handheld device with touch sensitive display
JP6009454B2 (en) Enhanced interpretation of input events that occur when interacting with a computing device that uses the motion of the computing device
US7271795B2 (en) Intuitive mobile device interface to virtual spaces
US8441441B2 (en) User interface for mobile devices
KR20100027976A (en) Gesture and motion-based navigation and interaction with three-dimensional virtual content on a mobile device
JP2009157908A (en) Information display terminal, information display method, and program

Legal Events

Date Code Title Description
A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A712

Effective date: 20101124

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20101203

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20110714

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110714

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120918

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20121025

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130212

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20130218

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20160222

Year of fee payment: 3

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250