JP2013145449A - Information terminal device - Google Patents

Info

Publication number: JP2013145449A
Application number: JP2012005239A
Authority: JP (Japan)
Prior art keywords: image data, touch panel, touch, display surface, position
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Japanese (ja)
Inventors: Shinichi Kanjiya (閑治谷 進一), Shinichi Aokura (青倉 伸一)
Original assignee: Sharp Corp (シャープ株式会社)
Application filed by Sharp Corp
Priority to JP2012005239A
Publication of JP2013145449A
Abstract

PROBLEM TO BE SOLVED: To provide an information terminal device that is easy to operate and facilitates information management, without requiring a plurality of operations to store desired image data.

SOLUTION: An information terminal device 100 includes a display part 1 having a display surface 2, a touch panel 10 superimposed on the display surface 2, an input detection part 22 for detecting the position at which the touch panel 10 is touched, a specifying means, a storage part 30, and a control part 20. The specifying means specifies at least a partial range of an image displayed on the display surface 2 as a specific range. While the specific range is specified, when the input detection part 22 detects a touch at a position included in the range of the touch panel 10 corresponding to the specific range, the control part 20 stores the image data of the image displayed within the specific range in the storage part 30 as segmentation image data.

Description

  The present invention relates to an information terminal device.

  An information terminal device such as a smartphone is operated by touching the display surface with a user's finger or a touch pen via a touch panel.

  For example, Patent Document 1 discloses a portable terminal that includes an LCD (Liquid Crystal Display) monitor provided with a touch panel, on which a text display area for displaying Japanese, a conversion area, and a boundary line separating the two areas are set. According to the mobile terminal disclosed in Patent Document 1, the user can select part or all of a Japanese character string by a touch operation; when the selected character string is touched, slid past the boundary line into the conversion area, and released, the Japanese character string is translated into English, and the English character string resulting from the translation is displayed in the conversion area.

  Further, Patent Document 2 discloses a display device that includes a touch-panel display surface. When an image displayed on the display surface is slid in one direction on the display surface, the image is stored in a storage unit; when a slide operation is performed in the other direction on the display surface, an image stored in the storage unit is displayed on the display surface.

  Patent Document 3 discloses a portable terminal in which character input keys and a received mail are displayed on an LCD monitor, and a character string in the received mail can be arbitrarily selected by an operation on a touch-panel input device. According to the portable terminal disclosed in Patent Document 3, information related to the selected character string is temporarily stored in association with an arbitrary character input key, and when a touch-and-slide operation is performed from the character input key used for temporary storage to the body of a forwarded mail, the temporarily stored related information can be inserted into the body of the forwarded mail.

JP 2010-191495 A
JP 2010-123016 A
JP 2010-33450 A

  In a conventional information terminal device, in order to save the image data of a screen displayed on the display surface, all of the image data of the displayed image must be saved. To save only part of the image data of the displayed image, the area to be saved must be selected on the screen, the selected image must be cut out, and the cut-out image must then be saved as image data. As described above, partially saving the image data of a displayed image involves many processing steps and increases the number of operations the user must perform, so operating the information terminal device becomes complicated.

  Patent Documents 1 to 3 do not describe a method that can facilitate partial storage of the image data of a displayed image.

  An object of the present invention is to provide an information terminal device that does not require a plurality of operations to partially store the image data of a displayed image, and that is easy to operate and facilitates information management.

The present invention comprises: a display unit having a display surface on which an image is displayed;
a touch panel arranged over the display surface;
touch detection means for detecting a position where the touch panel is touched;
designation means for designating at least a partial range of the image displayed on the display surface as a designated range;
storage means for storing image data; and
control means for storing, when the touch detection means detects that a position included in the range on the touch panel corresponding to the designated range is touched in a state where the designated range is designated, the image data of the image displayed in the designated range in the storage means as cut-out image data.

In the present invention, the touch detection means detects a touch start position and a touch end position on the touch panel when the touch panel is continuously touched, and
the designation means designates the designated range based on the touch start position and the touch end position detected by the touch detection means.

In the present invention, when the touch detection means detects that the inside of the range on the touch panel corresponding to the designated range is touched in a state where the designated range is designated, the control means stores the cut-out image data in the storage means and creates an icon for displaying the cut-out image data stored in the storage means on the display surface.

In the present invention, when the touch detection means detects that the range on the touch panel corresponding to the designated range is touched in a state where the designated range is designated, the control means stores, in addition to the cut-out image data, information related to the cut-out image data in the storage means.

  According to the present invention, an information terminal device includes: a display unit having a display surface on which an image is displayed; a touch panel disposed on the display surface; touch detection means for detecting a position where the touch panel is touched; designation means; storage means for storing image data; and control means.

  The designation means designates at least a partial range of the image displayed on the display surface as the designated range. When it is detected that a position included in the range on the touch panel corresponding to the designated range is touched in a state where the designated range is designated, the control means stores the image data of the image displayed in the designated range in the storage means as cut-out image data.

  As described above, the image data of the image within the designated range is stored by only one operation, in which a position included in the range on the touch panel corresponding to the designated range is touched. The information terminal device therefore does not need to perform a plurality of operations to partially store the image data of a displayed image, and is easy to operate.

  According to the invention, the touch detection means detects a touch start position and a touch end position with respect to the touch panel when the touch panel is continuously touched. The designation means designates the designated range based on the touch start position and the touch end position detected by the touch detection means.

  Thus, by designating the designated range, the range of the image to be stored can be designated by a simple operation.

  According to the invention, when the touch detection means detects that the inside of the range on the touch panel corresponding to the designated range is touched in a state where the designated range is designated, the control means stores the cut-out image data and creates an icon for displaying the cut-out image data stored in the storage means on the display surface.

  By creating such an icon, the cut-out image data can be displayed on the display surface with a simple operation, and information management can be facilitated.

  Further, according to the present invention, when the touch detection means detects that the range on the touch panel corresponding to the designated range is touched in a state where the designated range is designated, the control means stores, in addition to the cut-out image data, information related to the cut-out image data in the storage means, so the cut-out image data can be used more effectively.

FIG. 1 is a block diagram showing the configuration of an information terminal device 100 according to an embodiment of the present invention.
FIG. 2 is a flowchart showing the procedure of the image data storage process.
FIG. 3 is a diagram showing the display surface 2 on which an image is displayed.
FIG. 4 is a diagram showing the display surface 2 at the time a double click is detected.
FIG. 5 is a diagram showing the display surface 2 on which the designated range frame 42 is displayed.
FIG. 6 is a flowchart showing the procedure of the iconization process.
FIG. 7 is a diagram showing the display surface 2 on which the save menu image is displayed.
FIG. 8 is a diagram showing the display surface 2 on which the icon image 45 is displayed.
FIG. 9 is a flowchart showing the procedure of the image data reference process.
FIG. 10 is a diagram showing the display surface 2 on which cut-out image data is displayed.
FIG. 11 is a diagram showing the display surface 2 on which the URL information image 46 is displayed.
FIG. 12 is an enlarged view of the cut-out image data.

1. Information terminal device
FIG. 1 is a block diagram showing the configuration of an information terminal device 100 according to an embodiment of the present invention. The information terminal device 100 includes a display unit 1, a touch panel 10, a control unit 20, and a storage unit 30.

<Display section>
The display unit 1, which is display means, includes a display surface 2 and a backlight device that illuminates the display surface 2 from its back. The display surface 2 is formed by a transmissive liquid crystal panel having liquid crystal elements, and the liquid crystal panel is formed in a flat plate shape.

<Touch panel>
The touch panel 10 is integrated with the display unit 1 so as to overlap the display surface 2. The touch panel 10 is a device for taking in an input from a user's finger or a touch pen, and can simultaneously acquire a plurality of inputs on the touch panel 10.

<Control unit>
The control unit 20, which is control means, controls the entire information terminal device 100 according to a program (not shown). For this purpose, the control unit 20 is electrically connected to each of the display unit 1, the touch panel 10, and a storage unit 30 described later. The control unit 20 includes a display control unit 21, an input detection unit 22, and a time measurement unit 23.

  The display control unit 21 displays an image on the display surface 2. The display control unit 21 includes a graphic drawing unit 21a, an icon display unit 21b, a position determination unit 21c, and a line drawing unit 21d.

  The graphic drawing unit 21a displays on the display surface 2 a graphic drawn when a user's finger, a touch pen, or the like touches the panel.

  The icon display unit 21b creates an icon for displaying information stored in the storage unit 30 on the display surface 2, and displays the icon on the display surface 2.

The position determination unit 21c determines a position for displaying an image or the like on the display surface 2, and displays the image at the position.
The line drawing unit 21d displays a designated range frame, which will be described later, on the display surface 2.

  The input detection unit 22, which is touch detection means, detects an input to the touch panel 10 from a user's finger or a touch pen. The input detection unit 22 includes a position detection unit 22a, a contact detection unit 22b, a command detection unit 22c, and a contact number detection unit 22d.

  The position detection unit 22a detects the position where the touch panel 10 is touched with a user's finger or a touch pen. When the touch panel 10 is touched continuously with a user's finger or a touch pen, the position detection unit 22a can detect the touch start position, at which the continuous touch on the touch panel 10 begins, and the touch end position, at which the continuous touch ends. A "continuous touch" refers either to a plurality of touches performed within a relatively short time or to a single touch that remains in contact with the touch panel 10 for a relatively long time.

  The contact detection unit 22b detects that the touch panel 10 has been touched. Further, the contact detection unit 22b judges that dragging has started when the touch panel 10 is touched continuously for a predetermined time or longer, and judges that the drag has ended when the touch on the touch panel 10 is no longer detected.
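The drag detection described above can be sketched as a small state machine: a touch held beyond a threshold duration is judged to be a drag, and the drag ends when the touch is no longer detected. The sketch below is illustrative only; the class, method names, and the 0.2 s threshold are assumptions, since the patent does not specify an implementation.

```python
DRAG_THRESHOLD_S = 0.2  # assumed threshold; the patent gives no concrete value


class ContactDetector:
    """Hypothetical model of contact detection unit 22b."""

    def __init__(self, threshold_s=DRAG_THRESHOLD_S):
        self.threshold_s = threshold_s
        self.touch_down_at = None  # time the current touch began, or None
        self.dragging = False

    def on_touch_down(self, now):
        self.touch_down_at = now
        self.dragging = False

    def on_touch_move(self, now):
        # A touch held continuously past the threshold is judged to be a drag.
        if self.touch_down_at is not None and not self.dragging:
            if now - self.touch_down_at >= self.threshold_s:
                self.dragging = True
        return self.dragging

    def on_touch_up(self, now):
        # When the touch is no longer detected, any drag in progress ends.
        was_dragging = self.dragging
        self.dragging = False
        self.touch_down_at = None
        return was_dragging  # True if a drag just finished
```

A touch lifted before the threshold elapses is thus treated as a click rather than a drag.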

  The command detection unit 22c detects commands such as “save” and “do not save”.

  The contact number detection unit 22d detects how many times the touch panel 10 is touched with a user's finger or a touch pen within a relatively short time (for example, 0.5 to 1 second). The contact number detection unit 22d can thereby determine whether the operation performed on the touch panel 10 is a single click or a double click.

  The time measuring unit 23 has a timer function and can measure the time elapsed since the touch panel 10 was touched.
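The contact-number detection can be sketched by counting touches that fall inside a short sliding window, as the description suggests. This is a minimal sketch under assumed names; the 0.5 s window is the lower bound of the range given above.

```python
CLICK_WINDOW_S = 0.5  # lower bound of the 0.5-1 s window in the description


class ContactCounter:
    """Hypothetical model of contact number detection unit 22d."""

    def __init__(self, window_s=CLICK_WINDOW_S):
        self.window_s = window_s
        self.touch_times = []

    def on_touch(self, now):
        # Record the touch and drop touches that left the sliding window.
        self.touch_times.append(now)
        self.touch_times = [t for t in self.touch_times
                            if now - t <= self.window_s]
        return len(self.touch_times)

    def classify(self, now):
        # Two or more touches inside the window count as a double click.
        count = len([t for t in self.touch_times if now - t <= self.window_s])
        if count >= 2:
            return "double_click"
        if count == 1:
            return "single_click"
        return "none"
```

The same counter can serve step S1 of the storage process (detecting a double click) and step S21 of the reference process (distinguishing a single click from a double click).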

<Storage unit>
The storage unit 30 stores information related to various operations performed on the touch panel 10. The storage unit 30 includes an input coordinate storage unit 31, an input information storage unit 32, an icon information storage unit 33, a drawing information storage unit 34, an image information storage unit 35 serving as storage means, an icon display information storage unit 36, a flag storage unit 37, and a save setting unit 38.

The input coordinate storage unit 31 stores coordinate information of a position where the touch panel 10 is touched.
The input information storage unit 32 stores various information input to the touch panel 10.

  The icon information storage unit 33 stores a command for displaying information related to the icon on the display surface 2 from the icon.

The drawing information storage unit 34 stores drawing information to be displayed on the display surface 2.
The image information storage unit 35 stores image data. The image information storage unit 35 can store cut-out image data described later.

  The icon display information storage unit 36 stores information related to icons: the icon file name, URL (Uniform Resource Locator) information of the image data, date/time information indicating when the icon was created, and coordinate position information indicating the position of the cut-out image data on the display surface 2.
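The record kept per icon can be sketched as a plain data structure holding the four kinds of information listed above. The field names and the example values are assumptions for illustration only.

```python
from dataclasses import dataclass


@dataclass
class IconDisplayInfo:
    """Hypothetical record held by icon display information storage unit 36."""
    file_name: str     # icon file name
    url: str           # URL information of the page the image was cut from
    created_at: str    # date/time information of icon creation
    top_left: tuple    # (x, y) of the cut-out image on the display surface 2
    bottom_right: tuple  # (x, y) of the opposite corner


# Illustrative entry with made-up values.
info = IconDisplayInfo("20120113_1030", "http://example.com/page",
                       "2012-01-13 10:30", (10, 20), (110, 220))
```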

The flag storage unit 37 stores flags necessary for various processes.
The save setting unit 38 stores the settings used in various processes.

2. Image data storage process and iconization process
Using the information terminal device 100 described above, an image data storage process and an iconization process for turning image data into an icon are performed. Before these processes, either auto setting or manual setting can be selected as a setting required for the iconization process. With the auto setting, the file name of the icon to be created is automatically set, during the iconization process, to the date and time at which the process is performed. With the manual setting, the user inputs the file name of the icon to be created during the iconization process. The selected setting is stored in the save setting unit 38.

The image data storage process and the iconization process will now be described.
FIG. 2 is a flowchart showing the procedure of the image data storage process. FIG. 3 is a diagram illustrating the display surface 2 on which an image is displayed. FIG. 4 is a diagram showing the display surface 2 at the time when a double click is detected. FIG. 5 is a diagram showing the display surface 2 on which the designated range frame 42 is displayed.

  In the image data storage process, when the display control unit 21 displays an image on the display surface 2, the process proceeds to step S1. In the present embodiment, a URL is input to the information terminal device 100 and an image of a web page is displayed on the display surface 2.

  The image displayed on the display surface 2 consists of areas in which various applications are executed by a single click or double click (W-click) via the touch panel 10, and the region outside those areas, in which no application is executed. Examples of such areas include various icons.

  In step S1, the input detection unit 22 determines whether or not the touch panel 10 is double-clicked.

  Whether or not the touch panel 10 has been double-clicked is determined by the contact number detection unit 22d detecting how many times the touch panel is touched within a predetermined range within a predetermined time. If the number of contacts detected by the contact number detection unit 22d is two, it is determined that the touch panel 10 has been double-clicked.

  Step S1 is repeated until it is determined that the touch panel 10 is double-clicked. If it is determined that the touch panel 10 is double-clicked, the process proceeds to step S2.

  In step S2, the position detection unit 22a determines whether or not the detected position (the position double-clicked via the touch panel 10) is outside the areas. If it is outside the areas, the process proceeds to step S3. If it is not outside the areas (that is, if it is inside an area), the process returns to step S1. Since this flowchart shows the procedure of the image data storage process, procedures not directly related to the image data storage process are omitted; if it is determined in this step that an area was double-clicked, the application related to that area is executed before the process returns to step S1.

  In step S3, the time measuring unit 23 sets a timer, and starts measuring the time from when the double click is detected in step S2. In this embodiment, the timer is set for 2 seconds.

  In step S4, the time measuring unit 23 determines whether or not 2 seconds have elapsed since the timer was set in step S3. That is, it is determined whether or not a timeout has occurred. If timed out, the process returns to step S1. If not timed out, the process proceeds to step S5.
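The timer pattern in steps S3-S4 (set a 2-second timer, then repeatedly check for timeout) reappears in steps S10 and S13. A minimal sketch of the time measuring unit 23, with assumed names and an injectable clock so the logic can be exercised deterministically:

```python
import time


class Timer:
    """Hypothetical model of the timer function of time measuring unit 23."""

    def __init__(self):
        self.deadline = None

    def set(self, seconds, now=None):
        # Start measuring; timeout occurs `seconds` after `now`.
        now = time.monotonic() if now is None else now
        self.deadline = now + seconds

    def timed_out(self, now=None):
        # True once the set duration has elapsed (steps S4 and S13).
        now = time.monotonic() if now is None else now
        return self.deadline is not None and now >= self.deadline

    def clear(self):
        # Release the timer (steps S7 and S17).
        self.deadline = None
```

For example, `set(2.0)` followed by polling `timed_out()` mirrors the 2-second window the embodiment describes.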

  In step S5, the contact detection unit 22b determines whether dragging has started on the touch panel 10.

  If dragging has started, the process proceeds to step S6, where the control unit 20 stores in the input coordinate storage unit 31 the coordinates of the double-clicked position (touch start position) 40 on the display surface 2 detected by the position detection unit 22a in step S2, as shown in FIG. 4. If dragging has not started, the process returns to step S4.

  In step S7, the time measuring unit 23 releases the timer set in step S3 (timer clear).

  In step S8, the line drawing unit 21d displays the designated range frame 42 from the touch start position 40 to the position on the display surface 2 corresponding to the current position 43 on the touch panel 10 touched by the user's finger 41.

  The designated range frame 42 is a rectangular frame whose diagonal is the straight line connecting the touch start position 40 and the current position 43 on the touch panel 10 touched by the user's finger 41, and whose sides are vertical or horizontal with respect to the sides of the display surface 2. In this embodiment, the color of the frame is red so that it can be easily recognized. While the user's finger 41 is in contact at the position 43, the designated range frame 42 has a size in which the straight line connecting the touch start position 40 and the position 43 is the diagonal, as shown by the broken line in FIG. 5. If the user's finger 41 moves from the position 43 toward the position 43', the diagonal becomes longer and the rectangle grows with the movement.
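The geometry of the frame can be sketched as follows: the touch start position and the current drag position form the diagonal of an axis-aligned rectangle, normalized so the frame is valid whichever direction the user drags. The function name and coordinate convention are assumptions.

```python
def designated_range(start, current):
    """Return ((left, top), (right, bottom)) of the frame whose diagonal
    connects `start` and `current`, with sides parallel to the display
    surface edges (a sketch of how frame 42 could be computed)."""
    (x0, y0), (x1, y1) = start, current
    return ((min(x0, x1), min(y0, y1)), (max(x0, x1), max(y0, y1)))
```

Recomputing this on every drag-move event reproduces the growing rectangle described above as the finger moves from position 43 toward 43'.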

  At this time, when the finger 41 being slid at the current position 43 on the touch panel 10 for dragging reaches the lower end of the display surface 2, the image is automatically scrolled. It is therefore possible to enclose, with the designated range frame 42, an image covering a range wider than the display surface 2.

  In step S9, the contact detection unit 22b determines whether the drag has ended. If it is determined that the drag has ended, the process proceeds to step S10.

  In step S10, the time measuring unit 23 sets a timer, and starts measuring the time from the point in time when it is determined that the drag has ended in step S9. In this embodiment, the timer is set for 2 seconds.

  In step S11, the position detection unit 22a detects the position (touch end position) 43' on the touch panel 10 where the drag ended, and the control unit 20 stores the coordinate position of the touch end position 43' on the touch panel 10 in the input coordinate storage unit 31 as coordinate position data.

  The designated range frame 42 is determined by the touch start position 40 and the touch end position 43'. The control unit 20 corresponds to the designation means, and the area on the display surface 2 defined by the designated range frame 42 corresponds to the designated range. When the designated range frame 42 is fixed in this way, the range of the image to be stored can be designated by a simple operation.

  The determined designated range frame 42 can be narrowed or widened by a drag operation. To narrow the designated range frame 42, a side of the designated range frame 42 is touched via the touch panel 10 as the drag start position and dragged toward the opposite side. To widen the designated range frame 42, a side of the designated range frame 42 is touched as the drag start position and dragged away from the opposite side.

  In step S12, the control unit 20 cuts out the image displayed in the designated range frame 42 as cut-out image data and temporarily stores it in the input information storage unit 32.

  In step S13, the time measuring unit 23 determines whether or not 2 seconds have elapsed since the timer was set in step S10, that is, whether or not a timeout has occurred. If a timeout has occurred, the line drawing unit 21d erases the designated range frame 42 from the display surface 2, the cut-out image data temporarily stored in the input information storage unit 32 is deleted, and the process returns to step S1. If no timeout has occurred, the process proceeds to step S14.

  In step S14, the contact detection unit 22b determines whether or not the touch panel 10 has been touched. If it is determined that the touch panel 10 has been touched, the process proceeds to step S15; otherwise, the process returns to step S13. The touch operation at this time may be any operation in which the finger 41 or the like touches the touch panel 10: for example, an operation of flicking the image with the finger 41 toward any side of the display surface 2 (sliding the finger 41 toward one of the sides of the display surface 2 at a predetermined speed or more), a single click, a double click, and the like.

  In step S15, the position detection unit 22a detects the touched position on the touch panel 10, and the input detection unit 22 determines whether or not the detected position is included in the range on the touch panel 10 corresponding to the designated range frame 42. If the detected position is included in that range, the cut-out image data temporarily stored in the input information storage unit 32 is stored in the image information storage unit 35, and the temporarily stored copy in the input information storage unit 32 is deleted. Further, the coordinate position data of the touch start position 40 and the touch end position 43' stored in the input coordinate storage unit 31 is stored in the icon display information storage unit 36 as the coordinate position information of the cut-out image data, and the touch start and touch end coordinate data in the input coordinate storage unit 31 is deleted. Thereafter, the process proceeds to step S16.
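The decision in step S15 amounts to a point-in-rectangle hit test followed by moving the temporarily held data into permanent storage. A hedged sketch, in which plain dicts stand in for the storage units and all names are illustrative:

```python
def inside(pos, frame):
    """Is `pos` within the rectangle `frame` = ((left, top), (right, bottom))?"""
    (left, top), (right, bottom) = frame
    x, y = pos
    return left <= x <= right and top <= y <= bottom


def commit_if_inside(pos, frame, temp_store, image_store, icon_info_store):
    """Sketch of step S15: commit the temporary cut-out if the touch landed
    inside the range corresponding to the designated range frame."""
    if inside(pos, frame):
        # Move the cut-out from temporary to permanent storage and record
        # its coordinate position information for the icon.
        image_store["cutout"] = temp_store.pop("cutout")
        icon_info_store["coords"] = frame
        return True
    # Touched outside the frame: discard the temporary data (back to step S1).
    temp_store.pop("cutout", None)
    return False
```

The single touch thus both confirms the selection and triggers storage, which is the one-operation behavior the embodiment emphasizes.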

  If the detected position is not included in the range on the touch panel 10 corresponding to the designated range frame 42, that is, if it is included in the range of the touch panel 10 corresponding to the outside of the designated range frame 42, the line drawing unit 21d erases the designated range frame 42 from the display surface 2, the cut-out image data temporarily stored in the input information storage unit 32 is deleted, and the process returns to step S1.

  In step S16, the icon display unit 21b performs the iconization process, which will be described later. In step S17, the timer set in step S10 is released (timer clear). This completes the image data storage process.

  Thus, in the image data storage process, the image in the designated range frame 42 is stored as cut-out image data by only one operation, in which a position included in the range on the touch panel 10 corresponding to the designated range frame 42 is touched. The information terminal device 100 therefore does not need a plurality of operations to store desired image data, and is easy to operate.

  FIG. 6 is a flowchart showing the procedure of the iconization process. FIG. 7 is a diagram showing the display surface 2 on which the save menu image is displayed. FIG. 8 is a diagram showing the display surface 2 on which the icon image 45 is displayed.

  In step S16a, the icon display unit 21b determines whether the setting stored in the save setting unit 38 is the manual setting. If it is the manual setting, the process proceeds to step S16b. If it is not the manual setting, that is, if it is the auto setting, the process proceeds to step S16j.

  In step S16b, the icon display unit 21b displays a save menu image as shown in FIG. In the save menu image, the file name column 44 is blank, and the user can input the file name in the column 44.

  In step S16c, the icon display unit 21b determines whether there is a save instruction. The presence of a save instruction is determined by whether or not the character image "Save" displayed on the display surface 2 as shown in FIG. 7 is touched via the touch panel 10. If there is a save instruction, the process proceeds to step S16d. If the character image "Do not save" displayed as shown in FIG. 7 is touched via the touch panel 10, the iconization process ends without creating an icon.

  In step S16d, the icon display unit 21b determines whether or not the file name column 44 shown in FIG. 7 is blank. If the file name field 44 is blank, the process proceeds to step S16e. If the file name field 44 is not blank, the process proceeds to step S16f.

  In step S16e, the icon display unit 21b sets the current date and time as a file name, and stores the file name information in the icon display information storage unit 36.

  In step S16f, the icon display unit 21b sets the input character as a file name and stores the file name information in the icon display information storage unit 36.
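The file-name logic of steps S16d-S16f can be sketched as a single fallback: a blank file-name field yields a date-and-time name, while any entered text is used as-is. The function name and the date format are assumptions; the embodiment only says the current date and time become the file name.

```python
from datetime import datetime


def icon_file_name(entered, now=None):
    """Sketch of steps S16d-S16f: choose the icon file name."""
    if entered.strip():
        return entered                       # user-entered name (step S16f)
    now = now or datetime.now()
    return now.strftime("%Y%m%d_%H%M%S")    # date/time name (step S16e)
```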

  In step S16g, the icon display unit 21b stores the URL of the web page in which the cutout image data exists as URL information as information related to the cutout image data in the icon display information storage unit 36.

  In step S16h, the icon display unit 21b stores the coordinate position information of the clipped image data stored in the input coordinate storage unit 31 as information related to the clipped image data in the icon display information storage unit 36.

  In step S16i, the icon display unit 21b stores the current date and time as information related to the cut-out image data in the icon display information storage unit 36 as date and time information.

  The information related to the cut-out image data can be confirmed by referring to its properties. The cut-out image data can therefore be used more effectively.

  In step S16j, as shown in FIG. 8, the icon display unit 21b creates an icon for displaying the cut-out image data on the display surface 2, and displays the icon image 45 at a predetermined position on the display surface 2. In the present embodiment, the icon image 45 is displayed at the bottom of the display surface 2, where it least affects image browsing. The file name is displayed superimposed on the icon image 45.

In this way, the icon is created and the icon image 45 is displayed on the display surface 2, whereby the cut-out image data can be displayed on the display surface 2 with a simple operation, and information management can be facilitated.
This is the end of the iconization process.

3. Image data reference process
Hereinafter, the image data reference process for displaying cut-out image data on the display surface 2 with a simple operation will be described. FIG. 9 is a flowchart showing the procedure of the image data reference process. FIG. 10 is a diagram showing the display surface 2 on which cut-out image data is displayed. FIG. 11 is a diagram showing the display surface 2 on which the URL information image 46 is displayed. FIG. 12 is an enlarged view of the cut-out image data. In the image data reference process, when an image including the icon image 45 is displayed on the display surface 2, the process proceeds to step S20.

  In step S20, the position detection unit 22a determines whether or not the position of the touch panel 10 corresponding to the icon image 45 has been touched. If it is determined that the touch has been made, the process proceeds to step S21.

  In step S21, the contact number detection unit 22d detects the number of touches in step S20 and determines whether the touch is a single click. If it is a single click, the process proceeds to step S22. If it is not a single click, that is, if it is a double click, the process proceeds to step S23. Thus, in the image data reference process, the subsequent operation changes depending on the type of touch performed on the icon image 45 via the touch panel 10.

In step S22, the position determination unit 21c acquires, from the icon information storage unit 33, a command for displaying the cut-out image data on the display surface 2, acquires the coordinate position data of the cut-out image data from the icon display information storage unit 36, and determines the position on the display surface 2 at which the cut-out image data is to be displayed. At that position, as shown in FIG. 10, the position determination unit 21c displays the cut-out image data so as to overlap the currently displayed image. The cut-out image data is surrounded by a display frame 42a.
Thereafter, the image data reference process ends.

  In step S23, the display control unit 21 acquires URL information from the icon information storage unit 33, and displays a URL information image 46 near the icon image 45 as shown in FIG.

  In step S24, the contact detection unit 22b determines whether the touch panel 10 has been touched, that is, whether a click has been detected. If a click is detected, the process proceeds to step S25. If no click is detected, the process returns to step S24.

  In step S25, the position detection unit 22a determines whether the URL information has been selected. If the position touched on the touch panel 10 corresponds to the URL information image 46, the position detection unit 22a determines that the URL information has been selected, and the process proceeds to step S26. Otherwise, that is, if a position not corresponding to the URL information image 46 was touched, the process returns to step S20.
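The step-S25 decision is a hit test of the touched point against the rectangle occupied by the URL information image 46. A minimal sketch, assuming a (x, y, width, height) region representation that the patent does not specify:

```python
# Minimal hit test for the step-S25 decision: does the touched position
# fall inside the region of the URL information image 46? The
# (x, y, width, height) tuple representation is an assumption.

def touches_url_image(touch_x, touch_y, region):
    x, y, w, h = region
    return x <= touch_x < x + w and y <= touch_y < y + h
```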

  In step S26, the control unit 20 downloads image information related to the selected URL information via a communication unit (not shown).

  In step S27, the position determination unit 21c acquires, from the icon information storage unit 33, a command for displaying the cut-out image data on the display surface 2, acquires the coordinate position data of the cut-out image data from the icon display information storage unit 36, and determines the position on the display surface 2 at which the cut-out image data is to be displayed.

  In step S28, the display control unit 21 displays the downloaded image information on the display surface 2. At this time, the image indicated by the cut-out image data is displayed at the position indicated by the coordinate position data acquired from the icon display information storage unit 36. The user can therefore browse not only the cut-out image data but also the other image data in the downloaded image information. Further, as shown in FIG. 12, the cut-out image data and the other image data can be enlarged and displayed.
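The compositing in step S28 can be sketched as pasting the cut-out image back into the downloaded page image at the stored coordinate position. This is an illustration only: images are modeled here as 2-D lists of pixel values, a representation the patent does not prescribe.

```python
# Sketch of step S28's compositing: the cut-out image is pasted into the
# downloaded page image at the coordinate position read from the icon
# display information storage unit 36. The 2-D-list image model and the
# function name are assumptions.

def overlay_cutout(page, cutout, pos):
    x0, y0 = pos  # top-left coordinate where the cut-out is placed
    for dy, row in enumerate(cutout):
        for dx, px in enumerate(row):
            page[y0 + dy][x0 + dx] = px
    return page
```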

  This ends the image data reference process. Note that the icon image 45 remains displayed on the display surface 2 unless the user performs an erasure process, and it is not deleted even when the information terminal device 100 is powered off.

DESCRIPTION OF SYMBOLS
1 Display part
2 Display surface
10 Touch panel
20 Control part
21 Display control part
22 Input detection part
23 Time measurement part
30 Storage part
31 Input coordinate storage part
32 Input information storage part
33 Icon information storage part
34 Drawing information storage part
35 Image information storage part
36 Icon display information storage part
37 Flag storage part
38 Save setting
100 Information terminal device

Claims (4)

  1. An information terminal device comprising:
    display means having a display surface on which an image is displayed;
    a touch panel arranged over the display surface;
    touch detection means for detecting a position at which the touch panel is touched;
    designation means for designating at least a part of the image displayed on the display surface as a designated range;
    storage means for storing image data; and
    control means for storing, in the storage means as cut-out image data, image data of the image displayed in the designated range when the touch detection means detects, in a state where the designated range is designated, that a position included in the range on the touch panel corresponding to the designated range has been touched.
  2. The information terminal device according to claim 1, wherein
    the touch detection means detects a touch start position and a touch end position on the touch panel when the touch panel is touched continuously, and
    the designation means designates the designated range based on the touch start position and the touch end position detected by the touch detection means.
  3. The information terminal device according to claim 1, wherein,
    when the touch detection means detects, in a state where the designated range is designated, that the inside of the range on the touch panel corresponding to the designated range has been touched, the control means stores the cut-out image data in the storage means and creates an icon for displaying the stored cut-out image data on the display surface.
  4. The information terminal device according to claim 1, wherein,
    when the touch detection means detects, in a state where the designated range is designated, that the range on the touch panel corresponding to the designated range has been touched, the control means stores, in the storage means, information related to the cut-out image data in addition to the cut-out image data.

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012005239A JP2013145449A (en) 2012-01-13 2012-01-13 Information terminal device

Publications (1)

Publication Number Publication Date
JP2013145449A 2013-07-25 (en)

Family

ID=49041222

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012005239A Pending JP2013145449A (en) 2012-01-13 2012-01-13 Information terminal device

Country Status (1)

Country Link
JP (1) JP2013145449A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000163444A (en) * 1998-11-25 2000-06-16 Seiko Epson Corp Portable information device and information storage medium
JP2000311040A (en) * 1998-10-19 2000-11-07 Chisato Okabe Device and method for data delivery and recording medium recording data delivery program
JP2005301992A (en) * 2004-03-19 2005-10-27 Ricoh Co Ltd Electronic apparatus with display unit, information-processing method and program for making computer execute the same program
JP2011050038A (en) * 2009-07-27 2011-03-10 Sanyo Electric Co Ltd Image reproducing apparatus and image sensing apparatus

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JPN6015042993; "あなたの知らないFirefox" [The Firefox you don't know], ネトラン (Netran), Vol. 3, No. 10, October 1, 2009, pp. 68-73, 株式会社にゅーあきば *
JPN6015042994; 岡田 長治, 中村 睦: "スクリーンショット" [Screenshots], お気に入りのUbuntu [Favorite Ubuntu], 1st edition, October 10, 2009, pp. 320-325, 株式会社カットシステム *

Legal Events

Date Code Title Description
A621 Written request for application examination; JAPANESE INTERMEDIATE CODE: A621; Effective date: 20140918
A977 Report on retrieval; JAPANESE INTERMEDIATE CODE: A971007; Effective date: 20150624
A131 Notification of reasons for refusal; JAPANESE INTERMEDIATE CODE: A131; Effective date: 20150728
A521 Written amendment; JAPANESE INTERMEDIATE CODE: A523; Effective date: 20150925
A131 Notification of reasons for refusal; JAPANESE INTERMEDIATE CODE: A131; Effective date: 20151027
A521 Written amendment; JAPANESE INTERMEDIATE CODE: A523; Effective date: 20151228
A02 Decision of refusal; JAPANESE INTERMEDIATE CODE: A02; Effective date: 20160315