WO2016006074A1 - Dispositif électronique, procédé et programme associés - Google Patents

Dispositif électronique, procédé et programme associés Download PDF

Info

Publication number
WO2016006074A1
WO2016006074A1 PCT/JP2014/068358 JP2014068358W
Authority
WO
WIPO (PCT)
Prior art keywords
information
sticky note
background image
image
display
Prior art date
Application number
PCT/JP2014/068358
Other languages
English (en)
Japanese (ja)
Inventor
広志 賀澤
Original Assignee
株式会社東芝
東芝ライフスタイル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社東芝, 東芝ライフスタイル株式会社
Priority to PCT/JP2014/068358
Publication of WO2016006074A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F15/00 Digital computers in general; Data processing equipment in general
    • G06F15/02 Digital computers in general; Data processing equipment in general manually operated with input through keyboard and computation using a built-in program, e.g. pocket calculators

Definitions

  • Embodiments described herein relate generally to an electronic device, a method, and a program.
  • a program is known that displays a sticky note-shaped memo (sticky note data) on a display like a real sticky note.
  • An electronic apparatus includes a display processing unit and a processing unit.
  • the display processing unit displays a plurality of tag images including information input by a user operation and a background image of a first area including a display area of the plurality of tag images.
  • when the background image of the first area is switched from a first background image to a second background image, the processing unit changes the display positions of the plurality of tag images using the second background image and the information associated with each of the plurality of tag images.
  • FIG. 1 is a perspective view showing a tablet according to one embodiment.
  • FIG. 2 is a block diagram illustrating an example of the configuration of the tablet according to one embodiment.
  • FIG. 3 is a block diagram illustrating an example of a functional configuration of a tablet according to one embodiment.
  • FIG. 4 is a diagram illustrating an example of a screen displayed on the display of one embodiment.
  • FIG. 5 is a diagram illustrating an example of a screen on a display on which sticky note data is created according to one embodiment.
  • FIG. 6 is a diagram illustrating an example of a screen on a display on which a calendar is displayed according to one embodiment.
  • FIG. 7 is a diagram illustrating an example of a screen on a display on which a background image is displayed according to one embodiment.
  • FIG. 8 is a flowchart illustrating a part of the procedure of the calendar display process in the tablet of one embodiment.
  • FIG. 9 is a diagram illustrating an example of a screen on a display on which an expanded display according to one embodiment is displayed.
  • FIG. 10 is a flowchart illustrating a part of the procedure of the background image display process in the tablet of one embodiment.
  • Hereinafter, an embodiment will be described with reference to FIGS. 1 to 10. Note that a plurality of expressions may be given together for a constituent element or description according to the embodiment. Other expressions not given here are not precluded for those elements and descriptions, and other expressions are likewise not precluded for elements and descriptions for which only one expression is given.
  • FIG. 1 is a perspective view showing a tablet 10 according to one embodiment.
  • the tablet 10 is an example of an electronic device.
  • the electronic device is not limited to this, and may be any of various devices such as a personal computer, a portable computer, a television receiver, a monitor, a mobile phone, a smartphone, a personal digital assistant (PDA), or a game machine.
  • the tablet 10 includes a housing 11 and a display unit 12.
  • the housing 11 is formed in a substantially rectangular box shape.
  • the housing 11 accommodates the display unit 12 and various components such as a substrate.
  • the display unit 12 is exposed from the opening 15 provided in the housing 11 and displays an image. Note that in this specification, images include still images and moving images.
  • FIG. 2 is a block diagram showing an example of the configuration of the tablet 10.
  • the tablet 10 includes a central processing unit (CPU) 21, a main storage device 22, an auxiliary storage device 23, a communication interface (hereinafter referred to as communication I / F) 24, an external interface (external I / F) 25, and the display unit 12, which are connected to each other via a bus 20.
  • the CPU 21 comprehensively controls various operations in the tablet 10.
  • the main storage device 22 includes a ROM (Read Only Memory) storing a control program executed by the CPU 21 and a RAM (Random Access Memory) providing a work area to the CPU 21, and holds various setting information and control information.
  • the auxiliary storage device 23 is a non-volatile memory that stores an operating system executed by the CPU 21, various application programs, various data necessary for executing the programs, and the like.
  • the communication I / F 24 is connected to the Internet and performs communication, for example.
  • the communication I / F 24 may also be connected, by wireless communication, to an input device such as a keyboard or a mouse.
  • the external I / F 25 inputs and outputs information with the input device and the removable medium, for example.
  • the display unit 12 is configured as a so-called touch screen in which a display 26 and a touch panel 27 are combined.
  • the display 26 is, for example, a liquid crystal display or an organic EL (Electro Luminescence) display.
  • the touch panel 27 is an input device that detects a position (touch position) on the display screen of the display 26 touched by a user's finger or a stylus pen.
  • FIG. 3 is a block diagram illustrating an example of a functional configuration of the tablet 10. As shown in FIG. 3, the tablet 10 includes a processing unit 31 and an operation unit 32. FIG. 3 also shows the display 26 for illustrative purposes.
  • the tablet 10 implements the processing unit 31 shown in FIG. 3 in cooperation with the CPU 21 of FIG. 2 and the programs (operating system and various application programs) stored in the main storage device 22 and the auxiliary storage device 23.
  • the operation unit 32 inputs operation information to the processing unit 31.
  • the user performs an input operation on the tablet 10 through the operation unit 32.
  • the operation unit 32 is realized by, for example, the input device such as the touch panel 27 of FIG. 2, a keyboard and a mouse connected to the communication I / F 24 or the external I / F 25.
  • the processing unit 31 includes an input control unit 35, a display control unit 36, a sticky note display unit 41, a sticky note storage unit 42, a sticky note creation unit 43, a background image display unit 44, a background image storage unit 45, a recording information acquisition unit 46, a sticky note parameter changing unit 47, a destination information setting unit 48, and a destination information storage unit 49.
  • the display control unit 36, the sticky note display unit 41, and the background image display unit 44 are examples of a display processing unit.
  • FIG. 4 is a diagram illustrating an example of a screen displayed on the display 26.
  • the display control unit 36 can perform control to display the desktop 51 and the window 52 on the display 26 of the display unit 12.
  • the desktop 51 is an example of a first background image.
  • the tag display unit 41 displays the tag data 53 on the display 26 of the display unit 12.
  • the tag data 53 is an example of a tag image.
  • the desktop 51 forms the lowest layer of the desktop environment image displayed on the display 26.
  • the window 52 and the tag data 53 are displayed on the upper layer (front side) of the desktop 51.
  • the window 52 is displayed by various application programs executed by the processing unit 31.
  • the tag data 53 is displayed by a tag application program executed by the processing unit 31.
  • the tag data 53 includes information (position information) about the position (coordinates) and various information (recording information) such as text data. The recorded information will be described later.
  • the sticky note display unit 41 in FIG. 3 displays the sticky note data 53 on the display 26 at a position corresponding to the position information included in the sticky note data 53.
  • the tag display unit 41 displays the text data of the tag data 53 on the tag data 53. Note that other information such as image data may be displayed in the tag data 53.
  • the sticky note storage unit 42 stores information about the sticky note data 53 having position information and recording information.
  • the tag display unit 41 acquires information about the tag data 53 from the tag storage unit 42 and displays the tag data 53 on the display 26.
  • the input control unit 35 performs input control of various operations. For example, when the user touches the position where the tag data 53 on the display 26 is displayed with a finger and moves the finger, the input control unit 35 inputs the movement of the finger as an event in which the touch operation is continued. As will be described later, the tag parameter changing unit 47 can move the display position of the tag data 53 by changing the position information of the tag data 53 in accordance with the input.
  • the sticky note data 53 may be moved by a drag and drop operation with a mouse, for example. Thus, the user can arrange the sticky note data 53 at a desired position.
  • when the sticky note parameter changing unit 47 changes the position information of the sticky note data 53 and moves the sticky note data 53, it stores information on the position of the sticky note data 53 after the movement (the normal position) in the sticky note storage unit 42. In other words, the sticky note data 53 further includes information on the normal position.
  • each sticky note data 53 is provided with a delete button 61, an add button 62, a calendar display button 63, and an image display button 64.
  • the calendar display button 63 and the image display button 64 are examples of operators.
  • when the delete button 61 is operated, the tag parameter changing unit 47 deletes the information about the tag data 53 provided with that delete button 61 from the tag storage unit 42. As a result, the tag display unit 41 erases the tag data 53 from the display 26.
  • the user operates the delete button 61 by, for example, performing a touch operation at a position on the display 26 where the delete button 61 is provided.
  • the delete button 61 may be operated by a click operation with a mouse, for example.
  • the add button 62, the calendar display button 63, and the image display button 64 are operated in the same manner as the delete button 61.
  • FIG. 5 is a diagram showing an example of a screen on the display 26 where the sticky note data 53 is created.
  • the tag creation unit 43 displays a tag creation window 71 on the display 26.
  • the tag creation window 71 is provided with, for example, a title input field 72, a memo input field 73, an image input field 74, a date and time input field 75, and a creation button 76.
  • the user operates the operation unit 32 and inputs information to the tag creation unit 43 via the input control unit 35, thereby entering various information into the title input field 72, the memo input field 73, the image input field 74, and the date / time input field 75.
  • the user inputs text data D1 as a title of the newly created sticky note data 53 in the title input field 72.
  • the user inputs text data D2 as the contents of the newly created sticky note data 53 in the memo input field 73.
  • the user inputs the text data D1 and D2 using a software keyboard generated on the display 26, but may input the text data using other input devices.
  • the user inputs the image data D3 in the image input field 74.
  • the user creates the image data D3 by performing a touch operation (handwriting input) on the touch panel 27, for example, but may use existing image data stored in the tablet 10, for example.
  • the user inputs date / time data D4 in the date / time input field 75.
  • the date / time data D4 includes information about year, month, day, and time. Note that the date / time data D4 may include information on only the year, month, and day, for example.
  • the user inputs the date / time data D4 by selecting the date / time from a pull-down menu, but may input it directly using a software keyboard.
  • the sticky note creation unit 43 creates information about the sticky note data 53 having the input text data D1, D2, image data D3, and date / time data D4, and stores it in the sticky note storage unit 42.
  • Text data D1, D2, image data D3, and date / time data D4 are examples of information input by a user operation. Note that at least one of the title input field 72, the memo input field 73, the image input field 74, and the date / time input field 75 may be blank.
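  • The following is an illustrative sketch (not part of the original disclosure) of the information held for one piece of sticky note data 53, written in Python; the class name and all field names are assumptions chosen to mirror the position information, normal position, and recording information (D1 to D4) described above:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class StickyNote:
    """Illustrative record for one piece of sticky note data 53 (names are assumptions)."""
    # Position information: the current display position, and the "normal position"
    # remembered so the note can be restored after a calendar/background rearrangement.
    position: Tuple[int, int] = (0, 0)
    normal_position: Tuple[int, int] = (0, 0)
    # Recording information entered in the sticky note creation window 71.
    title: str = ""                # text data D1 (title input field 72)
    memo: str = ""                 # text data D2 (memo input field 73)
    image: Optional[bytes] = None  # image data D3 (image input field 74)
    datetime_text: str = ""        # date/time data D4 (date/time input field 75), e.g. "2014/1/18"
```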
  • the sticky note display unit 41 acquires information about the newly created sticky note data 53 from the sticky note storage unit 42, and displays the sticky note data 53 on the display 26.
  • the sticky note display unit 41 causes the sticky note data 53 to display, for example, text data D2 input in the memo input field 73.
  • FIG. 6 shows an example of a screen on the display 26 on which a calendar is displayed.
  • each unit of the processing unit 31 performs a calendar display process.
  • the background image display unit 44 displays the calendar image B1 on the display 26, and the tag parameter changing unit 47 moves the plurality of tag data 53 on the calendar image B1.
  • the background image display unit 44 switches the background image of the area including the display area of the tag data 53 from the desktop 51 and the window 52 to the calendar image B1.
  • the calendar image B1 is an example of a second background image. The calendar display process will be described later.
  • FIG. 7 is a diagram showing an example of a screen on the display 26 on which a background image is displayed.
  • each unit of the processing unit 31 performs a background image display process.
  • the background image display unit 44 displays the background image B2 on the display 26, and the tag parameter changing unit 47 moves the plurality of tag data 53 on the background image B2.
  • the background image display unit 44 switches the background image of the area including the display area of the tag data 53 from the desktop 51 and the window 52 to the background image B2.
  • the background image B2 is an example of a second background image. The background image display process will be described later.
  • the background image display unit 44 in FIG. 3 displays the calendar image B1 on the display 26 in the calendar display process described above.
  • the background image display unit 44 generates (draws) the calendar image B1 based on, for example, information about the date and time used in the operating system, and displays it on the display 26.
  • the background image display unit 44 is not limited to this; for example, it may acquire the calendar image B1 stored in advance in the background image storage unit 45, or acquire the calendar image B1 via the Internet, and display it on the display 26.
  • the background image storage unit 45 stores the background image B2.
  • the background image display unit 44 acquires the background image B2 from the background image storage unit 45 and displays the background image B2 on the display 26 in the background image display process described above.
  • the background image display unit 44 is not limited to this, and may acquire the background image B2 via the Internet and display it on the display 26, for example.
  • the recording information acquisition unit 46 acquires the recording information (text data D1, D2, image data D3, and date / time data D4) of each tag data 53 from the tag storage unit 42.
  • the recording information acquisition unit 46 extracts predetermined information such as date and time from the acquired text data D1 and D2, image data D3, and date and time data D4.
  • the sticky note parameter changing unit 47 changes the parameters related to the display of the sticky note data 53 in the calendar display process and the background image display process described above.
  • the parameters related to display are, for example, position information of the sticky note data 53, information about display / non-display of the sticky note data 53, information about the color of the sticky note data 53, and information about the size of the sticky note data 53.
  • the display parameters are not limited to this.
  • the movement destination information setting unit 48 sets information (movement destination information) about the position (coordinates) of the movement destination of the tag data 53 in the background image display process.
  • the destination information is an example of one or more positions set on the second background image.
  • the destination information storage unit 49 stores the destination information. The destination information will be described later.
  • FIG. 8 is a flowchart illustrating a part of the procedure of the calendar display process in the tablet 10.
  • the recording information acquisition unit 46 determines whether or not the calendar display button 63 of any tag data 53 has been operated (step S11). The recording information acquisition unit 46 repeats the determination until the calendar display button 63 is operated (step S11: No).
  • when the calendar display button 63 is operated (step S11: Yes), the recording information acquisition unit 46 acquires the text data D1, D2, image data D3, and date / time data D4 of one piece of sticky note data 53 from the sticky note storage unit 42.
  • the recording information acquisition unit 46 determines whether or not the text data D1, D2, the image data D3, and the date / time data D4 include information about date / time (date / time information) (step S12).
  • the date / time information is an example of information regarding a date or time associated with the information input by the user's operation, or of information regarding the date or time at which the information was input by the user's operation.
  • when date / time information is included (step S12: Yes), the recording information acquisition unit 46 acquires the date / time information.
  • the text data D2 includes date / time information “1/18”.
  • the recorded information acquisition unit 46 analyzes the text data D2 to determine the presence / absence of date / time information, and extracts the date / time information “1/18”.
  • the image data D3 includes information “1/18” about the date and time.
  • the recorded information acquisition unit 46 determines the presence / absence of date / time information by analyzing the image data D3 by optical character recognition, for example, and extracts the date / time information “1/18”.
  • the recording information acquisition unit 46 extracts date and time information written in various notations from the text data D1, D2 and the image data D3. For example, even if the date / time information exists in the text data D1, D2 or the image data D3 in a notation such as "January 18" or "1.18", the recording information acquisition unit 46 extracts it.
  • the created sticky note data 53 includes date / time data D4.
  • the recording information acquisition unit 46 acquires date / time data D4 “2014/1/18” as date / time information.
  • the recording information acquisition unit 46 may have a priority order for extracting date / time information. For example, when the text data D1, D2, the image data D3, and the date / time data D4 each contain date / time information, the recording information acquisition unit 46 extracts the date / time information from the recording information with the highest priority (for example, the date / time data D4).
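  • As a minimal sketch of the extraction and priority handling described above (assuming the StickyNote record sketched earlier; the regular expressions, the priority order, and the omission of OCR for image data D3 are all assumptions), the recording information acquisition unit 46 could be modeled as follows:

```python
import re
from typing import Optional

# Assumed priority: the explicit date/time field D4 first, then memo D2, then title D1.
FIELD_PRIORITY = ("datetime_text", "memo", "title")

MONTHS = ("January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December")

# A few of the notations mentioned in the description ("1/18", "1.18", "January 18", ...).
DATE_PATTERNS = [
    re.compile(r"\b\d{4}/\d{1,2}/\d{1,2}(?:\s+\d{1,2}:\d{2})?"),  # "2014/1/18 9:00"
    re.compile(r"\b\d{1,2}[/.]\d{1,2}(?:\s+\d{1,2}:\d{2})?"),     # "1/18", "1.18", "1/18 9:00"
    re.compile(r"\b(?:" + "|".join(MONTHS) + r")\s+\d{1,2}\b"),   # "January 18"
]

def extract_datetime_info(note) -> Optional[str]:
    """Return the first date/time string found in the note's recording information,
    scanning the fields in priority order (image data D3 would additionally need
    optical character recognition, which is omitted in this sketch)."""
    for field_name in FIELD_PRIORITY:
        text = getattr(note, field_name, "") or ""
        for pattern in DATE_PATTERNS:
            match = pattern.search(text)
            if match:
                return match.group(0)
    return None
```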
  • the sticky note data 53 may be individually referred to as sticky note data 53A, 53B, 53C, 53D, and 53E as shown in FIG.
  • the tag data 53A has date / time information “1/18 9:00”.
  • the tag data 53B has date / time information “1/18”.
  • the tag data 53C includes date / time information “1/30”.
  • the tag data 53D has date / time information "1/18 10:00".
  • the tag data 53E does not have date / time information.
  • the record information acquisition unit 46 provides the date / time information of the tag data 53 extracted from the record information of the tag data 53 to the tag parameter change unit 47.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53 according to the date and time information (step S13).
  • the background image display unit 44 sets XY coordinates (positions) corresponding to each date of the calendar image B1 shown in FIG. 6. For example, the background image display unit 44 sets "X: 820, Y: 290" as the XY coordinates corresponding to the position of the date "1/18" in the calendar image B1. The background image display unit 44 also sets XY coordinates corresponding to the positions of the other dates in the calendar image B1. The background image display unit 44 provides the information about the XY coordinates corresponding to the dates of the calendar image B1 to the sticky note parameter changing unit 47.
  • the calendar image B1 has a cell indicating a plurality of dates.
  • the cell indicating the date is an example of an image that can specify the date or time.
  • the background image display unit 44 sets the display positions (XY coordinates) of each cell indicating the date.
  • the tag parameter changing unit 47 extracts information (date information) related to the month and day excluding time from the date / time information of the tag data 53. For example, the sticky note parameter changing unit 47 extracts “1/18” that is date information by excluding the time “9:00” from the date and time information of the sticky note data 53A.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53A, 53B, 53D having the date information "1/18" to the XY coordinates "X: 820, Y: 290" corresponding to the date "1/18" of the calendar image B1. That is, the sticky note parameter changing unit 47 moves the sticky note data 53A, 53B, 53D to the position corresponding to their date information. In other words, the sticky note parameter changing unit 47 changes the display positions of the sticky note data 53A, 53B, 53D using the display position of the cell indicating the date "1/18" of the calendar image B1 and the date information "1/18" associated with each of the sticky note data 53A, 53B, 53D.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53C having the date information “1/30” to XY coordinates corresponding to the date “1/30” of the calendar image B1. That is, the tag parameter changing unit 47 moves the tag data 53C to a position corresponding to the date information of the tag data 53C.
  • when the date information of the sticky note data 53 does not belong to the month of the current date displayed in the calendar image B1, the sticky note parameter changing unit 47 changes the position information of that sticky note data 53 to a predetermined position. For example, if the date information of the sticky note data 53 is for a month before the month of the current date, the sticky note parameter changing unit 47 moves the sticky note data 53 to the left end of the calendar image B1. If the date information of the sticky note data 53 is for a month after the month of the current date, the sticky note parameter changing unit 47 moves the sticky note data 53 to the right end of the calendar image B1.
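  • A minimal sketch of this repositioning step (step S13) is shown below; the coordinate table, the edge positions, and the assumption that the displayed month equals the month of the current date are all illustrative, with only the "1/18" coordinates taken from the description:

```python
# Hypothetical table mapping each date cell of calendar image B1 to its XY coordinates.
CALENDAR_CELL_XY = {
    "1/18": (820, 290),   # the only coordinates stated in the description
    "1/30": (1090, 420),  # assumed coordinates for the "1/30" cell
    # ... one entry per cell of the displayed month
}
LEFT_EDGE_XY = (40, 290)     # assumed position at the left end of the calendar
RIGHT_EDGE_XY = (1160, 290)  # assumed position at the right end of the calendar
DISPLAYED_MONTH = 1          # assumed: the month of the current date

def reposition_on_calendar(note, date_info: str) -> None:
    """Move one sticky note to the cell of calendar image B1 that matches its date
    information (slash notation such as "1/18", optionally with a year and a time),
    or to an edge of the calendar when the date falls in another month."""
    parts = date_info.split()[0].split("/")   # "2014/1/18 9:00" -> ["2014", "1", "18"]
    month, day = int(parts[-2]), int(parts[-1])
    if month < DISPLAYED_MONTH:
        note.position = LEFT_EDGE_XY          # earlier month: left end of the calendar
    elif month > DISPLAYED_MONTH:
        note.position = RIGHT_EDGE_XY         # later month: right end of the calendar
    else:
        note.position = CALENDAR_CELL_XY.get(f"{month}/{day}", note.position)
```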
  • in step S12, if the recording information acquisition unit 46 cannot acquire date / time information from the text data D1, D2, image data D3, and date / time data D4 of the tag data 53 (step S12: No), it notifies the sticky note parameter changing unit 47 that the tag data 53 has no date / time information.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53 having no date / time information to a predetermined position (step S14).
  • the background image display unit 44 presets a position (outside the frame) outside the area where each date of the calendar image B1 is displayed as a position where the sticky note data 53 not including date / time information is arranged.
  • the background image display unit 44 may randomly determine the position where the sticky note data 53 that does not include date and time information is arranged, for example, outside the frame of the calendar image B1.
  • the sticky note parameter changing unit 47 changes, for example, the position information of the sticky note data 53E having no date / time information to preset XY coordinates. That is, the sticky note parameter change unit 47 moves the sticky note data 53E outside the frame of the calendar image B1.
  • the sticky note parameter changing unit 47 determines whether there is another sticky note data 53 at the changed position (step S15). For example, the sticky note parameter changing unit 47 acquires the position information of each sticky note data 53 from the sticky note storage unit 42 and compares it with the positional information of the moved sticky note data 53, so that another sticky note data 53 is at the changed position. Determine if it exists.
  • the sticky note parameter changing unit 47 further changes the position information of the sticky note data 53 when another sticky note data 53 exists at the position of the sticky note data 53 whose position information has been changed (step S15: Yes). For example, the sticky note parameter changing unit 47 adds or subtracts predetermined values to the X coordinate and the Y coordinate of the position information of the sticky note data 53, respectively. In this way, the sticky note parameter changing unit 47 arranges a plurality of sticky note data 53 arranged at positions corresponding to the same date on the calendar image B1 while being shifted from each other.
  • the sticky note parameter changing unit 47 determines the position of the sticky note data 53 when no other sticky note data 53 exists at the position of the sticky note data 53 whose position information has been changed (step S15: No).
  • the sticky note parameter changing unit 47 may further change the position of the sticky note data 53 when, for example, a part of the sticky note data 53 overlaps with another sticky note data 53.
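  • The overlap handling of steps S15 and S16 could be sketched as follows (the offset value is an assumption; detection of partial overlaps between note rectangles is omitted here):

```python
OFFSET = (12, 12)  # assumed amount by which a colliding note is shifted

def resolve_overlaps(notes) -> None:
    """Shift any note whose changed position coincides with that of an
    already-placed note, so notes moved to the same date remain visible."""
    occupied = set()
    for note in notes:
        while note.position in occupied:
            note.position = (note.position[0] + OFFSET[0],
                             note.position[1] + OFFSET[1])
        occupied.add(note.position)
```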
  • the sticky note parameter changing unit 47 determines whether or not the position information of all the sticky note data 53 has been changed (step S17).
  • if the position information of some of the sticky note data 53 has not yet been changed (step S17: No), the recording information acquisition unit 46 determines whether date / time information is included in the recording information of the next piece of sticky note data 53 (step S12).
  • when the position information of all the sticky note data 53 has been changed (step S17: Yes), the background image display unit 44 displays the calendar image B1 on the display 26 (step S18).
  • the background image display unit 44 may display the calendar image B1 on the display 26 when the user operates the calendar display button 63 (step S11: Yes).
  • the background image display unit 44 displays the calendar image B1 behind the plurality of tag data 53 and in front of the desktop 51 and the window 52. That is, the tag data 53 is located on the calendar image B1, and the calendar image B1 covers the desktop 51 and the window 52.
  • the tag parameter changing unit 47 arranges the tag data 53 at the position corresponding to the date of the displayed calendar image B1.
  • the tag data 53 is rearranged at a position suitable for the calendar image B1.
  • FIG. 6 shows the sticky note data 53A to 53E before rearrangement by two-dot chain lines, and the rearranged sticky note data 53A to 53E by solid lines. Therefore, the user can grasp at a glance the date and time to which each piece of tag data 53 relates, and how near or far it is.
  • the background image display unit 44 displays the part B1a corresponding to the current date in the calendar image B1 in a display format different from the part B1b corresponding to other dates. For example, the background image display unit 44 displays the part B1a corresponding to the current date in a color different from the part B1b corresponding to another date. Therefore, the user can understand at a glance how far the current date is from the date and time related to each tag data 53.
  • the background image display unit 44 can change the calendar image B1 in accordance with a user operation. For example, the background image display unit 44 can change the calendar image B1 to the calendar of the next month or the previous month, or change the month display calendar to a year display calendar.
  • the tag parameter changing unit 47 changes the position information of the tag data 53 again to a position suitable for the calendar image B1.
  • FIG. 9 is a diagram illustrating an example of a screen on the display 26 on which an expanded display is performed.
  • the sticky note parameter changing unit 47 expands the sticky note data 53A, 53B, 53D arranged on the calendar image B1 in accordance with a user operation.
  • when the display areas of a plurality of sticky note data 53 at least partially overlap, the sticky note parameter changing unit 47 changes, in response to an operation on one of the overlapping sticky note data 53, the position of each of the one or more overlapping sticky note data 53 to a position where it is arranged individually.
  • in the calendar display process, the sticky note parameter changing unit 47 places the sticky note data 53A, 53B, 53D having the date information "1/18" at the position corresponding to the date "1/18" of the calendar image B1, arranging them in overlapping layers.
  • FIG. 9 shows the sticky note data 53A, 53B, 53D arranged in an overlapping manner with a two-dot chain line. It becomes difficult for the user to read the text data of the tag data 53A and 53D covered by the tag data 53B.
  • each unit of the processing unit 31 performs an unfolding display process.
  • the background image display unit 44 displays the time image B3 on the display 26.
  • the time image B3 is an image in which scales arranged at equal intervals and numbers indicating time are arranged.
  • the time image B3 is an example of a second background image, and includes an image whose time can be specified.
  • the tag parameter changing unit 47 changes the positions of the tag data 53A, 53B, and 53D to the developed positions on the time image B3.
  • the sticky note parameter changing unit 47 places the sticky note data 53A, 53B, 53D at the developed position at positions separated from each other without overlapping each other.
  • the sticky note parameter changing unit 47 changes the position of each of the sticky note data 53A, 53B, 53D arranged in a superimposed manner to a position where it is individually arranged.
  • the recording information acquisition unit 46 extracts information about time (time information) from the date / time information of the superimposed tag data 53A, 53B, and 53D. For example, the recording information acquisition unit 46 extracts the time information "9:00" from the date / time information "1/18 9:00" of the tag data 53A. Similarly, the recording information acquisition unit 46 extracts the time information "10:00" from the date / time information "1/18 10:00" of the tag data 53D. Since the date / time information "1/18" of the tag data 53B has no time component, the recording information acquisition unit 46 does not extract time information from the date / time information of the tag data 53B.
  • the tag parameter changing unit 47 changes the position information of the tag data 53A and 53D in accordance with the extracted time information. For example, the tag parameter changing unit 47 moves the position information of the tag data 53A having the time information “9:00” to a position corresponding to the time “9:00” of the time image B3. Similarly, the sticky note parameter changing unit 47 moves the position information of the sticky note data 53D having the time information “10:00” to a position corresponding to the time “10:00” of the time image B3. That is, the sticky note parameter changing unit 47 changes the display positions of the sticky note data 53A and 53D to the positions where they are individually arranged using the time information associated with the sticky note data 53A and 53D, respectively.
  • the sticky note parameter changing unit 47 may shift the position information of the sticky note data 53.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53A and 53D to separate the sticky note data 53A and 53D from each other.
  • the position corresponding to the time "9:00" of the time image B3 and the tag data 53A are connected by a line, and the position corresponding to the time "10:00" of the time image B3 and the tag data 53D are connected by a line. Thereby, the user can easily grasp the times related to the tag data 53A and 53D together with the text data of the tag data 53A and 53D.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53B for which time information could not be acquired to a predetermined position.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53B to a position (outside the frame) outside the area where each time of the time image B3 is displayed.
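  • A sketch of this expanded display along the time image B3, reusing the resolve_overlaps sketch above, is shown below; the axis coordinates, spacing, and outside-the-frame position are assumptions:

```python
import re

TIME_AXIS_X0 = 100           # assumed X coordinate of the 0:00 scale mark of time image B3
TIME_AXIS_Y = 180            # assumed Y coordinate of the time scale
PIXELS_PER_HOUR = 40         # assumed spacing between hour marks
OUTSIDE_FRAME_XY = (40, 60)  # assumed position for notes whose date/time information has no time

def expand_on_time_image(notes_with_datetime) -> None:
    """Spread overlapping notes for one date along the time image B3.
    `notes_with_datetime` is an iterable of (note, date_time_info) pairs, where
    date_time_info is the extracted string such as "1/18 9:00" or "1/18"."""
    placed = []
    for note, date_time_info in notes_with_datetime:
        match = re.search(r"\b(\d{1,2}):(\d{2})\b", date_time_info)
        if match:
            hours = int(match.group(1)) + int(match.group(2)) / 60
            note.position = (TIME_AXIS_X0 + int(hours * PIXELS_PER_HOUR), TIME_AXIS_Y)
        else:
            # e.g. sticky note data 53B, whose "1/18" carries no time information
            note.position = OUTSIDE_FRAME_XY
        placed.append(note)
    resolve_overlaps(placed)  # keep the expanded notes from covering each other
```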
  • the background image display unit 44 displays the time image B3 on the display 26.
  • the background image display unit 44 displays the time image B3 behind the tag data 53A, 53B, 53D and in front of the calendar image B1. That is, the tag data 53A, 53B, and 53D are located on the time image B3, and the time image B3 covers the calendar image B1.
  • the tag parameter changing unit 47 changes the positions of the tag data 53A, 53B, 53D arranged in an overlapping manner on the calendar image B1 to the expanded positions.
  • the sticky note parameter changing unit 47 arranges the sticky note data 53A, 53B, 53D at the developed position according to the time information of the sticky note data 53A, 53B, 53D. Therefore, the user can understand at a glance how far the time relating to the tag data 53A, 53B, 53D is.
  • the sticky note parameter changing unit 47 can change the positions of the sticky note data 53A, 53B, 53D back to the positions corresponding to the date "1/18" on the calendar image B1. Further, the background image display unit 44 erases the time image B3 from the display 26. Thereby, the screen on the display 26 returns from the state of FIG. 9 to the state of FIG. 6.
  • the sticky note parameter changing unit 47 acquires the information about the above-described normal position from the sticky note storage unit 42, and changes the position of the sticky note data 53 back to the normal position. Thereby, the tag data 53 returns to the position before the calendar display process was performed. Further, the background image display unit 44 erases the calendar image B1 from the display 26. Thereby, the screen on the display 26 returns from the state of FIG. 6 to the state of FIG. 4.
  • the background image display unit 44 of the processing unit 31 displays the calendar image B1 on the display 26 when the user operates the calendar display button 63.
  • the sticky note parameter changing unit 47 of the processing unit 31 automatically changes the position of the sticky note data 53 on the calendar image B1 to a position corresponding to the date information of the sticky note data 53 with the display of the calendar image B1.
  • FIG. 10 is a flowchart illustrating a part of the procedure of the background image display process in the tablet 10.
  • the recording information acquisition unit 46 determines whether or not the image display button 64 of any tag data 53 has been operated (step S111). The recording information acquisition unit 46 repeats the determination until the image display button 64 is operated (step S111: No).
  • when the image display button 64 is operated (step S111: Yes), the recording information acquisition unit 46 acquires the text data D1, D2, image data D3, and date / time data D4 of one piece of sticky note data 53 from the sticky note storage unit 42.
  • the recording information acquisition unit 46 determines whether the text data D1, D2, the image data D3, and the date / time data D4 include information (image related information) related to the background image B2 (step S112).
  • Image-related information is an example of information related to a user who has input information.
  • when the recording information acquisition unit 46 determines that image-related information exists in at least one of the acquired text data D1, D2, image data D3, and date / time data D4 (step S112: Yes), it acquires the image-related information.
  • the background image B2 is a family photo showing the father, mother, and child. That is, the background image B2 includes images (a father image, a mother image, and a child image) that can specify a plurality of users.
  • the recorded information acquisition unit 46 determines whether text data D1, D2, image data D3, and date / time data D4 include information (image related information) related to the father, mother, and child. Then, the image related information is extracted.
  • the recording information acquisition unit 46 extracts image related information using an image related information database (hereinafter referred to as an image related information DB) 81 in FIG.
  • the image-related information DB 81 stores a father, mother, and child in association with specific words.
  • the image related information DB 81 stores the word “home visit” in association with “child”.
  • the recorded information acquisition unit 46 extracts the word “home visit” as the image related information from the text data D2 of the tag data 53A.
  • the extraction of the image related information by the recording information acquisition unit 46 is not limited to this.
  • the image-related information may be set by being selected from a pull-down menu, or may be automatically set based on the OS login ID.
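  • The keyword-based path (the image related information DB 81) could be sketched as follows; only the association of "home visit" with the child is taken from the description, and the remaining keywords as well as the function name are assumptions:

```python
# Illustrative contents of the image related information DB 81.
IMAGE_RELATED_INFO_DB = {
    "home visit": "child",              # stated in the description
    "parent-teacher meeting": "child",  # assumed additional entries
    "business trip": "father",
    "recital": "mother",
}

def extract_image_related_info(note):
    """Return the user ("father", "mother" or "child") associated with the first
    DB keyword found in the note's text, or None if no keyword matches
    (the pull-down and login-ID paths mentioned above are not modeled here)."""
    text = " ".join((note.title, note.memo))
    for keyword, user in IMAGE_RELATED_INFO_DB.items():
        if keyword in text:
            return user
    return None
```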
  • the recording information acquisition unit 46 provides the image related information of the tag data 53, extracted from the recording information of the tag data 53, to the sticky note parameter changing unit 47.
  • the tag parameter changing unit 47 changes the position information of the tag data 53 according to the image related information (step S113).
  • the sticky note parameter changing unit 47 changes the display position of the sticky note data 53 using the display position of the image that can identify a user and the information about the user who input the information to the sticky note data 53.
  • the movement destination information storage unit 49 stores a plurality of pieces of movement destination information set in advance for the background image B2 shown in FIG. 7.
  • the destination information is XY coordinates (position) on the background image B2 corresponding to each image related information. For example, “X: 230, Y: 220” is set as the XY coordinates on the background image B2 corresponding to the image-related information “child”. Other destination information is also set.
  • the destination information may be set by the user.
  • the movement destination information setting unit 48 sets an arbitrary position on the background image B2 set by the user on the setting screen as movement destination information on the background image B2.
  • the destination information setting unit 48 sets one or more positions on the background image B2 based on a user operation.
  • the destination information storage unit 49 stores the set destination information.
  • the tag parameter changing unit 47 acquires the destination information from the destination information storage unit 49.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53A, which has the image related information associated with "child", to the XY coordinates "X: 230, Y: 220" on the background image B2, which are the movement destination information corresponding to that image related information. That is, the sticky note parameter changing unit 47 moves the sticky note data 53A to the position corresponding to the image related information of the sticky note data 53A. In other words, the sticky note parameter changing unit 47 changes the display position of the sticky note data 53A using the movement destination information and the image related information.
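  • A sketch of this repositioning using the movement destination information (step S113) follows; only the "child" coordinates "X: 230, Y: 220" come from the description, and the other coordinates and the peripheral position are assumptions:

```python
# Movement destination information as it might be held in the destination
# information storage unit 49 for background image B2.
DESTINATION_INFO = {
    "child":  (230, 220),   # coordinates stated in the description
    "father": (640, 210),   # assumed
    "mother": (430, 230),   # assumed
}
PERIPHERAL_XY = (40, 640)   # assumed peripheral position for notes without image related information

def reposition_on_background(note, image_related_user) -> None:
    """Move one sticky note to the destination set for its image related
    information on background image B2, or to the periphery if there is none."""
    note.position = DESTINATION_INFO.get(image_related_user, PERIPHERAL_XY)
```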
  • when the recording information acquisition unit 46 cannot acquire image related information from the recording information of the tag data 53 (step S112: No), it notifies the sticky note parameter changing unit 47 that the tag data 53 has no image related information.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53 having no image related information to a predetermined position (step S114).
  • the movement destination information storage unit 49 stores in advance the peripheral portion of the background image B2 as the position where the sticky note data 53 not including the image related information is arranged.
  • the sticky note parameter changing unit 47 changes the position information of the sticky note data 53 having no image related information to XY coordinates set in advance.
  • the sticky note parameter changing unit 47 determines whether there is another sticky note data 53 at the changed position (step S115). For example, the sticky note parameter changing unit 47 acquires the position information of each sticky note data 53 from the sticky note storage unit 42 and compares it with the positional information of the moved sticky note data 53, so that another sticky note data 53 is at the changed position. Determine if it exists.
  • the sticky note parameter changing unit 47 further changes the position information of the sticky note data 53 when another sticky note data 53 exists at the position of the sticky note data 53 whose position information has been changed (step S115: Yes). Thereby, the tag parameter changing unit 47 arranges the plurality of tag data 53 having the image related information corresponding to the same XY coordinates while being shifted from each other on the background image B2.
  • the sticky note parameter changing unit 47 determines the position of the sticky note data 53 when there is no other sticky note data 53 at the position of the sticky note data 53 whose position information has been changed (step S115: No).
  • the sticky note parameter changing unit 47 may further change the position of the sticky note data 53 when, for example, a part of the sticky note data 53 overlaps with another sticky note data 53.
  • the sticky note parameter changing unit 47 determines whether or not the position information of all the sticky note data 53 has been changed (step S117).
  • if the position information of some of the sticky note data 53 has not yet been changed (step S117: No), the recording information acquisition unit 46 determines whether image related information is included in the text data D1, D2, image data D3, and date / time data D4 of the next piece of sticky note data 53 (step S112).
  • when the position information of all the sticky note data 53 has been changed (step S117: Yes), the background image display unit 44 acquires the background image B2 from the background image storage unit 45 and displays it on the display 26 (step S118). The background image display unit 44 may display the background image B2 on the display 26 when the user operates the image display button 64 (step S111: Yes).
  • the background image display unit 44 displays the background image B2 behind the plurality of tag data 53 and in front of the desktop 51 and the window 52. That is, the tag data 53 is located on the background image B2, and the background image B2 covers the desktop 51 and the window 52.
  • the sticky note parameter changing unit 47 arranges the sticky note data 53 at the positions corresponding to the image related information of the sticky note data 53 on the displayed background image B2. In other words, the sticky note data 53 is rearranged at positions suitable for the background image B2.
  • in FIG. 7, the tag data 53A to 53E before rearrangement are indicated by two-dot chain lines, and the rearranged tag data 53A to 53E are indicated by solid lines. For this reason, the user can see at a glance to which of the father, the mother, and the child of the background image B2 each piece of tag data 53 relates.
  • the sticky note parameter changing unit 47 acquires the information about the above-described normal position from the sticky note storage unit 42, and changes the position of the sticky note data 53 back to the normal position. Thereby, the tag data 53 returns to the position before the background image display process was performed. Further, the background image display unit 44 erases the background image B2 from the display 26. Thereby, the screen on the display 26 returns from the state of FIG. 7 to the state of FIG. 4.
  • the background image display unit 44 of the processing unit 31 displays the background image B2 on the display 26 when the image display button 64 is operated by the user.
  • the sticky note parameter changing unit 47 of the processing unit 31 changes the position of the sticky note data 53 on the background image B2 to a position corresponding to the image related information in accordance with the display of the background image B2.
  • the background image B2 is a family photo, but the background image B2 is not limited to this.
  • the background image B2 may be a map.
  • the recording information acquisition unit 46 extracts information related to the place as image related information from the recording information of the tag data 53.
  • as described above, the background image display unit 44 of the processing unit 31 displays the calendar image B1 or the background image B2 on the display 26. Further, in accordance with the display of the calendar image B1 or the background image B2, the sticky note parameter changing unit 47 of the processing unit 31 changes the parameters related to the display of the sticky note data 53 according to the calendar image B1 or the background image B2 and the recording information of the sticky note data 53. That is, the background image display unit 44 displays the calendar image B1 or the background image B2, and the sticky note data 53 is displayed in a display form suited to the calendar image B1 or the background image B2.
  • thereby, the user can read at a glance, from the display form of the sticky note data 53, the information related to each piece of sticky note data 53 that corresponds to the calendar image B1 or the background image B2 (for example, the date, person, or place related to the sticky note data 53). In other words, the listability of the sticky note data 53 is improved.
  • the tag parameter changing unit 47 of the processing unit 31 changes the position of the tag data 53 on the calendar image B1 or the background image B2.
  • the user can read the information related to each tag data 53 corresponding to, for example, the calendar image B1 or the background image B2, the relative relationship of each tag data 53, etc. at a glance from the position of the tag data 53.
  • the listability of the tag data 53 is improved.
  • the background image display unit 44 of the processing unit 31 displays the calendar image B1 on the display 26, and in accordance with the display of the calendar image B1, the position of the tag data 53 on the calendar image B1 is changed to a position appropriate to the date information of the tag data 53. Thereby, the user can read the absolute temporal position (date) of each piece of sticky note data 53 at a glance from the position of the sticky note data 53, and the listability of the sticky note data 53 is improved.
  • the sticky note parameter changing unit 47 of the processing unit 31 can change the positions of the sticky note data 53 arranged in an overlapping manner in the calendar display process to the expanded positions. Thereby, for example, the user can read that the plurality of tag data 53 relate to the same date and time, and can also read the contents of the plurality of tag data 53.
  • the sticky note parameter changing unit 47 arranges the sticky note data 53 at the developed position according to the recording information. For example, as in the embodiment, the sticky note parameter changing unit 47 arranges the sticky note data 53 relating to the same date at the developed position in a state where they are arranged based on the time information. Thereby, the user can read at a glance from the arrangement of the expanded sticky note data 53 about the information related to the sticky note data 53 arranged in an overlapping manner, and the listability of the sticky note data 53 is improved.
  • in the embodiment, the background image display unit 44 displays the calendar image B1 on the display 26, and the sticky note parameter changing unit 47 changes the position of the sticky note data 53 along with the display of the calendar image B1. That is, when the calendar display button 63 provided on the sticky note data 53 is operated in a state where the sticky note data 53 is at the position desired by the user, the calendar image B1 is displayed and the position of the sticky note data 53 is changed.
  • thereby, the user can normally place the sticky note data 53 at a desired position, and can easily change the position of the sticky note data 53 to match the calendar image B1 simply by operating the calendar display button 63.
  • the user sets the destination information by the destination information setting unit 48.
  • the sticky note parameter changing unit 47 changes the position of the sticky note data 53 on the background image B2 according to the destination information and the image related information of the sticky note data 53. Thereby, the user can move the tag data 53 to an arbitrary position of the arbitrary background image B2.
  • the sticky note application program executed by the tablet 10 of the present embodiment is provided by being incorporated in advance in a ROM or the like.
  • the sticky note application program executed on the tablet 10 of the present embodiment may be recorded as a file in an installable or executable format on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, or a DVD (Digital Versatile Disk).
  • the sticky note application program executed on the tablet 10 of the present embodiment may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network. The sticky note application program executed on the tablet 10 of the present embodiment may also be provided or distributed via a network such as the Internet.
  • the sticky note application program executed on the tablet 10 of this embodiment has a module configuration including the above-described units (the sticky note display unit 41, the sticky note storage unit 42, the sticky note creation unit 43, the background image display unit 44, the background image storage unit 45, the recording information acquisition unit 46, the sticky note parameter changing unit 47, the destination information setting unit 48, and the destination information storage unit 49 of the processing unit 31).
  • as actual hardware, the CPU (processor) 21 executes the sticky note application program, whereby the above-described units are loaded onto the main storage device 22, and the sticky note display unit 41, the sticky note storage unit 42, the sticky note creation unit 43, the background image display unit 44, the background image storage unit 45, the recording information acquisition unit 46, the sticky note parameter changing unit 47, the destination information setting unit 48, and the destination information storage unit 49 are generated on the main storage device 22.
  • according to at least one embodiment described above, when the background image is switched, the processing unit changes the display positions of the plurality of sticky note images using the second background image and the information associated with each of the plurality of sticky note images. Thereby, the listability of the sticky note information is improved.
  • in the embodiment, the processing unit 31 performs the calendar display process when the calendar display button 63 is operated, but the trigger is not limited to this.
  • the processing unit 31 may perform the calendar display process by operating a predetermined key on the keyboard.
  • in the embodiment, the sticky note parameter changing unit 47 arranges the sticky note data 53A, 53B, and 53D, placed at the expanded positions by the expanded display process, according to the time information.
  • the sticky note parameter changing unit 47 may arrange, for example, the sticky note data 53A, 53B, and 53D according to the priority set as the recording information.
  • in the embodiment described above, the parameter related to the display of the tag data 53 that the processing unit 31 changes is the position information of the tag data 53, but the parameter is not limited to this.
  • the processing unit 31 may change information on display / non-display of the tag data 53, information on the color of the tag data 53, and information on the size of the tag data 53, for example.
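  • As an illustrative sketch of changing parameters other than the position (all names and values here are assumptions, not taken from the disclosure):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DisplayParameters:
    """Illustrative bundle of the other display parameters mentioned above."""
    visible: bool = True                # display / non-display of the sticky note data 53
    color: str = "#fff7a8"              # color of the sticky note data 53
    size: Tuple[int, int] = (160, 120)  # width and height of the sticky note data 53

def emphasize(params: DisplayParameters) -> None:
    """Example change of parameters other than the position, e.g. for a note
    moved onto the cell of the current date in calendar image B1."""
    params.visible = True
    params.color = "#ffd27f"
    params.size = (200, 150)
```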
  • the calendar image B1 and the background image B2 are provided by the processing unit 31, but the present invention is not limited to this.
  • the processing unit 31 may be configured to use a calendar image or background image provided by the OS, or a calendar image or background image provided by a service on the Internet.
  • the background image is not limited to the calendar image B1, the background image B2, and the time image B3 of the present embodiment.
  • the background image is an image displayed in an area including the area where the sticky note image is displayed, and may be any image as long as the image is displayed as the background of the sticky note image.
  • the tag image is not limited to the tag data 53 of the present embodiment.
  • the tag image may be any image that is associated with information input by the user through at least one operation such as a software keyboard, a hardware keyboard, handwriting, or voice input, and that can display at least part of that information so that its presence can be identified.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an electronic device comprising a display processing unit and a processing unit. The display processing unit displays: a plurality of sticky note images containing information input as a result of a user operation; and a background image of a first area that contains the display area of the plurality of sticky note images. When the background image of the first area has been switched from a first background image to a second background image, the processing unit changes the display positions of the plurality of sticky note images using the second background image and the information associated with each of the plurality of sticky note images.
PCT/JP2014/068358 2014-07-09 2014-07-09 Dispositif électronique, procédé et programme associés WO2016006074A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/068358 WO2016006074A1 (fr) 2014-07-09 2014-07-09 Dispositif électronique, procédé et programme associés

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/068358 WO2016006074A1 (fr) 2014-07-09 2014-07-09 Dispositif électronique, procédé et programme associés

Publications (1)

Publication Number Publication Date
WO2016006074A1 true WO2016006074A1 (fr) 2016-01-14

Family

ID=55063749

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/068358 WO2016006074A1 (fr) 2014-07-09 2014-07-09 Dispositif électronique, procédé et programme associés

Country Status (1)

Country Link
WO (1) WO2016006074A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023001172A (ja) * 2019-05-21 2023-01-04 カシオ計算機株式会社 図形表示装置、図形表示方法及びプログラム

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010008088A1 (fr) * 2008-07-17 2010-01-21 日本電気株式会社 Appareil de traitement de l’information, moyen de stockage sur lequel un programme a été enregistré et procédé de modification d’objet
JP2011164696A (ja) * 2010-02-04 2011-08-25 Ricoh Co Ltd ネットワークシステム、サーバ装置、及びグループウェアプログラム
JP2013058026A (ja) * 2011-09-07 2013-03-28 Bettsu & Systems:Kk スケジュール管理装置、スケジュール管理方法、およびプログラム

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010008088A1 (fr) * 2008-07-17 2010-01-21 日本電気株式会社 Appareil de traitement de l’information, moyen de stockage sur lequel un programme a été enregistré et procédé de modification d’objet
JP2011164696A (ja) * 2010-02-04 2011-08-25 Ricoh Co Ltd ネットワークシステム、サーバ装置、及びグループウェアプログラム
JP2013058026A (ja) * 2011-09-07 2013-03-28 Bettsu & Systems:Kk スケジュール管理装置、スケジュール管理方法、およびプログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023001172A (ja) * 2019-05-21 2023-01-04 カシオ計算機株式会社 図形表示装置、図形表示方法及びプログラム
JP7396433B2 (ja) 2019-05-21 2023-12-12 カシオ計算機株式会社 図形表示装置、図形表示方法及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14897354; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14897354; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)