JP5254137B2 - Mobile communication device and communication device - Google Patents

Mobile communication device and communication device

Info

Publication number
JP5254137B2
Authority
JP
Japan
Prior art keywords
object
item
key
unit
group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2009151722A
Other languages
Japanese (ja)
Other versions
JP2011008556A (en)
Inventor
Eizo Fujisawa (栄三 藤澤)
Original Assignee
Kyocera Corporation (京セラ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corporation
Priority to JP2009151722A
Priority claimed from PCT/JP2010/060902 (WO2010150893A1)
Publication of JP2011008556A
Application granted
Publication of JP5254137B2
Legal status: Active (current)
Anticipated expiration


Description

  The present invention relates to a mobile communication device and a communication device that select an object based on an input to an area where an image is displayed by a display unit.

  A communication device such as a cellular phone may use various input devices, other than pressed keys such as a keyboard, numeric keypad, or buttons, to receive an operator's instructions. For example, Patent Document 1 describes a mobile terminal that detects, from the signal of an acceleration sensor, the tilt angle of the terminal and gestures such as "shake" and "tilt" as inputs. The apparatus of Patent Document 1 also controls a group of icons representing information content according to the tilt angle of the terminal, and displays the moving icons on a display.

JP 2002-149616 A

  Another input device is a touch panel, which detects as an input the operator's contact (touch) with the area where an image is displayed. Since the operator can operate the device while directly touching the screen, intuitive operation and high operability are possible.

  Here, a communication device may perform a narrowing search when acquiring information via the Internet or when extracting some of the files in a folder. In such a search, a search condition serving as a key is compared with the information of each file, and the files satisfying the condition are extracted. One way to display the extracted files is to compute a relevance rate (degree of match) with respect to the search condition and list the files in descending order of relevance. Displaying the results of the narrowing search in descending order of relevance lets the operator review first the files that best match the input search conditions.
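
  As a concrete illustration of this kind of relevance-ranked narrowing search, the following sketch filters and sorts files by a matching rate. The `relevance` scoring function and the file records are hypothetical stand-ins for illustration, not part of the patent.

```python
def relevance(metadata: dict, condition: str) -> float:
    """Toy score: the fraction of condition keywords found in the file's tags."""
    keywords = condition.split()
    hits = sum(1 for k in keywords if k in metadata.get("tags", []))
    return 100.0 * hits / len(keywords) if keywords else 0.0

def narrowing_search(files, condition, threshold=50.0):
    """Extract the files whose relevance rate meets the threshold,
    sorted in descending order of relevance."""
    scored = [(relevance(meta, condition), name) for name, meta in files]
    return sorted([(r, n) for r, n in scored if r >= threshold], reverse=True)

files = [("IMG_001.jpg", {"tags": ["landscape", "single"]}),
         ("IMG_002.jpg", {"tags": ["group", "party"]})]
print(narrowing_search(files, "single"))  # [(100.0, 'IMG_001.jpg')]
```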

  However, with this method the operator cannot know, until the results are displayed, how many files will be extracted for a given relevance rate. Conversely, when the number of files to extract is fixed, the relevance rate of the extracted files varies with the search condition. The operator therefore has to repeat the search under various conditions, which limits operability.

  The present invention has been made in view of the above, and an object of the present invention is to provide a mobile communication device and a communication device capable of executing a narrowing search with higher operability.

  In order to solve the above-described problems and achieve the object, the present invention provides a display unit that displays an image; an input detection unit that detects an input instruction made by touching the area where the image is displayed by the display unit; a communication unit that communicates with another terminal; and a control unit that performs control based on the input instruction detected by the input detection unit and controls the image displayed on the display unit. While the display unit displays a group object having a plurality of communication-destination specifying objects, each holding address information used for communication by the communication unit, together with a key object serving as a condition for narrowing down the communication-destination specifying objects in the group object, when the input detection unit detects a touch on the key object and a movement instruction in the touched state, the control unit obtains the matching rate of each communication-destination specifying object in the group object with respect to the key object, detects the amount of movement of the key object through the input detection unit, calculates the range of matching rates to be extracted based on that amount of movement, and extracts from the group object the communication-destination specifying objects that satisfy the calculated range. When the input detection unit detects that an instruction selecting and confirming one of the extracted communication-destination specifying objects has been input, the communication unit starts communication based on the address information associated with the selected communication-destination specifying object.

  In order to solve the above-described problems and achieve the object, the present invention also provides a display unit that displays an image; an input detection unit that detects an input instruction made by touching the area where the image is displayed by the display unit; a communication unit that communicates with another terminal; and a control unit that performs control based on the input instruction detected by the input detection unit and controls the image displayed on the display unit. While the display unit displays a group object having a plurality of item objects together with a key object serving as a condition for narrowing down the item objects in the group object, when the input detection unit detects a touch on the key object and a movement instruction in the touched state, the control unit obtains the matching rate of each item object in the group object with respect to the key object, detects the amount of movement of the key object through the input detection unit, calculates the range of matching rates of the item objects to be extracted based on that amount of movement, and extracts from the group object the item objects that satisfy the calculated range.

  Here, it is preferable that the control unit sets the extracted set of item objects as a new group object.

  The control unit may display an item object included in the group object on the display unit, and display the extracted item object in the vicinity of the key object.

  Preferably, the control unit varies the calculated range of matching rates according to the distance between the key object and the group object, extracting item objects down to a lower matching rate as the distance becomes shorter, and extracting only item objects with a higher matching rate as the distance becomes longer.

  Preferably, the control unit calculates the range of matching rates of the item objects to be extracted according to the distance the key object moves within the display area of the group object, extracting item objects down to a lower matching rate as the movement distance becomes longer, and extracting only item objects with a higher matching rate as the movement distance becomes shorter.

  Preferably, when the input detection unit detects a tap on the key object while the key object is selected and item objects have been extracted based on the matching-rate range in effect at that time, the control unit narrows the range of matching rates of the item objects to be extracted relative to the range before the tap.

  Moreover, it is preferable that the control unit narrows the range of the matching rate of the item object to be extracted as the tap time is longer.

  Further, it is preferable that the control unit narrows the range of the matching rate of the item object to be extracted as the number of taps increases.

  Further, it is preferable that the control unit displays the group object larger on the display unit as the number of item objects included in the group object increases.

  Further, it is preferable that the control unit does not extract the item object until a preset time elapses after the key object enters the display area of the group object.

  Preferably, the control unit displays the group object on the display unit with an outline drawn around its outer periphery.

  Preferably, the input detection unit is a contact sensor disposed in front of the display unit and detects a force applied to the display unit as an input, and the control unit determines which key object was operated by associating the position where the load was input with the image displayed on the display unit.

  The mobile communication device and the communication device according to the present invention provide the effect that a narrowing search can be executed with high operability.

FIG. 1 is a block diagram showing a schematic configuration of a mobile communication device according to an embodiment of the communication device of the present invention.
FIG. 2 is a block diagram showing a schematic configuration of the software of the mobile communication device shown in FIG. 1.
FIG. 3 is an explanatory diagram illustrating an example of an image displayed on the display device.
FIG. 4 is a flowchart showing an example of the processing operation of the mobile communication device.
FIG. 5 is a flowchart showing an example of the processing operation of the mobile communication device.
FIGS. 6-1 to 6-5 are explanatory diagrams for explaining the operation of the mobile communication device.
FIGS. 7-1 and 7-2 are flowcharts illustrating an example of the processing operation of the mobile communication device.
FIGS. 8-1 to 8-3 are explanatory diagrams for explaining the operation of the mobile communication device.
FIG. 9 is a graph for explaining the operation of the mobile communication device.
FIGS. 10-1 and 10-2 are explanatory diagrams for explaining the operation of the mobile communication device.
FIG. 11 is an explanatory diagram for explaining the operation of the mobile communication device.
FIG. 12 is a graph for explaining the operation of the mobile communication device.

  Hereinafter, the present invention will be described in detail with reference to the drawings. The present invention is not limited to the following description. The constituent elements described below include those that a person skilled in the art could easily conceive, those that are substantially the same, and those within a so-called range of equivalents. In the following, a mobile communication device, more specifically a mobile phone, is described as an example of the communication device, but the application target of the present invention is not limited to mobile phones; it also includes, for example, PHS (Personal Handy-phone System) terminals, PDAs, portable and vehicle-mounted navigation devices, notebook computers, and game machines.

  FIG. 1 is a block diagram showing a schematic configuration of a mobile communication device according to an embodiment of the communication device of the present invention. As shown in FIG. 1, the mobile communication device 10 includes a CPU (Central Processing Unit) 22, a communication unit 26, an audio processing unit 30, a display device 32, an input device 34, a ROM 36, a RAM 38, an internal storage 40, and an external storage interface (I/F) 42. The mobile communication device 10 is connected to an external storage 46 via the external storage I/F 42. In addition to the above, the mobile communication device 10 has the various components of a mobile communication device, such as an imaging unit and various terminals. The casing of the mobile communication device can take various shapes, for example a foldable shape in which two members are connected by a hinge, a shape in which two members slide against each other, or a single box shape.

  The CPU 22 is a processing unit that comprehensively controls the overall operation of the mobile communication device 10. That is, the CPU 22 controls the operation of the communication unit 26, the display device 32, and the like so that the various processes of the mobile communication device 10 are executed in an appropriate procedure according to the operation of the input device 34 and the software stored in the ROM 36 and the internal storage 40. The various processes of the mobile communication device 10 include, for example, voice calls performed via a circuit-switched network, creation and transmission/reception of e-mail, and browsing of Web (World Wide Web) sites on the Internet. The operations of the communication unit 26, the audio processing unit 30, the display device 32, and the like include, for example, transmission and reception of signals by the communication unit 26, input and output of audio by the audio processing unit 30, and display of images by the display device 32.

  The CPU 22 executes processing based on programs (for example, operating system programs, application programs, etc.) stored in the ROM 36 and the internal storage 40. The CPU 22 is constituted by, for example, a microprocessor unit (MPU: Micro Processor Unit), and executes various processes of the mobile communication device 10 described above according to a procedure instructed by the software. That is, the CPU 22 sequentially reads instruction codes from the operating system program, application program, and the like stored in the ROM 36 and the internal storage 40 and executes processing.

  The CPU 22 has a function of executing a plurality of application programs. The application programs executed by the CPU 22 include, for example, an image display application program for displaying images on the display device 32, an operation detection application program for determining the input operation from the input detected by the input device 34, a search application program for performing a narrowing search, an Internet application program for Internet communication, a mail application program for creating mail, and a telephone application program for making calls.

  The communication unit 26 establishes a radio signal line by a CDMA system or the like with a base station via a channel assigned by the base station, and performs telephone communication and information communication with the base station.

  The audio processing unit 30 processes the audio signal input to the microphone 50 and the audio signal output from the receiver 52. That is, the audio processing unit 30 amplifies the sound input from the microphone 50, performs AD conversion (analog-to-digital conversion), then applies signal processing such as encoding to convert it into digital audio data, and outputs the data to the CPU 22. It also decodes audio data sent from the CPU 22, performs DA conversion (digital-to-analog conversion) and amplification to convert it into an analog audio signal, and outputs the signal to the receiver 52.

  The display device 32 includes a display panel such as a liquid crystal display (LCD) or an organic EL (Organic Electro-Luminescence) panel, and displays on the panel video corresponding to video data and images corresponding to image data supplied from the CPU 22.

  The input device 34 is a touch panel disposed on the front surface of the display device 32; when the operator touches its surface, it detects the contact as an input. The input device 34 detects the touched position, the strength of the contact, and the like. Various types of touch panel can be used, such as matrix-switch, resistive-film, surface-acoustic-wave, infrared, electromagnetic-induction, and capacitive panels. When the display device 32 displays images of keys assigned various functions, such as a power key, a call key, numeric keys, character keys, direction keys, and a decision key, and the operator presses the input device 34, the input device 34 detects the pressed position (the position where contact occurred). The CPU 22 of the mobile communication device 10 then processes the input as an operation of the key corresponding to the position where the input device 34 detected the contact.
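
  The mapping from a detected touch position to the key displayed there can be pictured with the following sketch; the key layout and the rectangle test are illustrative assumptions, not the device's actual implementation.

```python
KEY_REGIONS = {
    "call":   (0, 0, 80, 40),    # x, y, width, height of the displayed key
    "decide": (80, 0, 80, 40),
}

def key_at(x: int, y: int):
    """Return the key whose displayed region contains the touch point,
    or None if the touch landed outside every key image."""
    for key, (rx, ry, rw, rh) in KEY_REGIONS.items():
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return key
    return None

print(key_at(100, 20))  # 'decide'
```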

  A ROM (Read Only Memory) 36 is a read-only storage device and stores firmware, such as a BIOS, used for driving the mobile communication device 10. A RAM (Random Access Memory) 38 is a readable and writable storage device composed of, for example, SRAM (Static Random Access Memory) or DRAM (Dynamic Random Access Memory). The ROM 36 and the RAM 38 constitute the main storage. The CPU 22 allocates work areas in them for computer programs and for temporary data used during software processing, and the allocated data are held there temporarily.

  The internal storage (internal memory) 40 is a readable and writable storage device provided in the mobile communication device 10, for example a hard disk. The internal storage 40 is used as an auxiliary storage device and stores the software and data used for processing by the CPU 22. It also holds, for example, an image folder for image data acquired by communication or downloading, and a standby-image folder for image files used as standby screens. In addition, the internal storage 40 stores and manages, for example, audio data acquired by communication or downloading, software used by the CPU 22 to control the internal storage 40 itself, an address book holding the telephone numbers and mail addresses of communication partners, sound files such as dial tones and ring tones, and temporary data used during software processing.

  The external storage I/F 42 is a terminal to which the removable external storage 46 is connected. Connecting the external storage 46 to the external storage I/F 42 allows data to be exchanged between the CPU 22 and the external storage 46. The external storage (external memory) 46 is a storage device that can be attached to and detached from the mobile communication device 10 via the external storage I/F 42; a memory card such as an SD card (registered trademark) or CompactFlash (registered trademark), or a removable HDD (Hard Disk Drive), can be used. A removable disk drive and a removable disk may also be used as the combination of the external storage I/F 42 and the external storage 46.

  Next, the software configuration of the mobile communication device 10 will be described with reference to FIG. 2. FIG. 2 is a block diagram illustrating a schematic configuration of the software of the mobile communication device, and FIG. 3 is an explanatory diagram illustrating an example of an image displayed on the display device. In FIG. 2, each piece of software (application program) is shown as a separate unit for the sake of explanation, but in practice the functions are realized by signal exchange and data processing among the CPU 22, the ROM 36, the RAM 38, the internal storage 40, and so on, and the individual parts do not exist as physically separate units. Although FIG. 2 shows only the software related to the item-object narrowing search and image display, the mobile communication device 10 includes various other software. An object is an item to be operated by the operator and includes group objects (folders), item objects (files), key objects (search conditions), and the like. The folders (directories) include not only folders created by the operator but also folders created by the mobile communication device 10 and temporary folders created to manage the sets of files found by a narrowing search. Examples of files include image data, audio data, Internet home page data, and an address book that stores telephone numbers and mail addresses together with names. Examples of search conditions include a character string used as a keyword, a numerical value used as a threshold (a date and time, or, for an image, the number of people in it), and the genre of a file (for example, landscape or portrait for images; classical, hip hop, or rock for music).

  Here, with reference to FIG. 3, the elements displayed on the screen during a narrowing search will be described. As shown in FIG. 3, the display device displays group objects 102, each having a plurality of item objects 104, and key objects 106 representing search conditions. In the example illustrated in FIG. 3, each group object 102 is a folder in which a plurality of item objects 104 are stored; a group object named “Mr. A” and a group object named “Mr. B” are displayed. For each group object 102, the outer periphery (outline) of its display area is drawn as a solid line; that is, the boundary between the group object and the rest of the screen is displayed as the group object's outline. The item objects 104 belonging to a group object 102 are displayed inside it.

  Each item object 104 corresponds to an image file, and a thumbnail of the image file is displayed on the display device as the item object 104. Each key object 106 is labeled with its narrowing condition and is displayed at a position separated from the group objects 102 by at least a certain distance. The method of operating the image displayed on the display device 32 and the control method of the narrowing search based on that operation will be described later.

  As illustrated in FIG. 2, the software of the mobile communication device 10 includes a search unit 60 that performs the narrowing search, a display unit 62 that controls the image displayed on the display device 32, an input unit 64 that detects inputs to the input device 34, and a content control unit 66 that exchanges data with the internal storage 40 and the external storage 46. The configuration of each unit is described below.

  The search unit 60 includes a search control unit 68, an object control unit 70, an object display unit 72, an object operation unit 74, an object management unit 76, an object information database 78, a content management unit 80, a search index database 82, a content information database 84, and a content access unit 86. The search unit 60 reads a group object in which a plurality of item objects are collected, calculates the matching rate between each item object in the group object and the search condition, and performs the narrowing search by extracting from the group object the item objects that satisfy the condition, that is, whose matching rate lies within a predetermined range. In the present embodiment, the search unit 60 extracts item objects in descending order of matching rate, extracting those whose matching rate lies between 100% and a determined lower bound. How the range of matching rates to extract is determined will be described later.
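
  A minimal sketch of this extraction step, assuming matching rates are expressed as percentages; the function name and data layout are illustrative only.

```python
def extract_by_rate(item_rates, lower_bound):
    """Return the item objects whose matching rate lies in
    [lower_bound, 100], i.e. from the best matches downward."""
    return [item for item, rate in item_rates
            if lower_bound <= rate <= 100.0]

rates = [("IMG_001.jpg", 95.0), ("IMG_002.jpg", 60.0), ("IMG_003.jpg", 20.0)]
print(extract_by_rate(rates, 50.0))  # ['IMG_001.jpg', 'IMG_002.jpg']
```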

  The search control unit 68 exchanges information with the object control unit 70 and the content management unit 80, and performs the narrowing search for item objects based on the information supplied from each unit. The object control unit 70 exchanges information with the object display unit 72, the object operation unit 74, and the object management unit 76, and controls the behavior of the various objects, namely group objects, item objects, and key objects. Specifically, it selects the objects to display, determines where to display them, creates new group objects based on decisions of the search control unit 68, determines the group object to which each item object belongs, moves item objects, and creates key objects.

  The object display unit 72 sends information on the object to be displayed, determined based on the control by the object control unit 70, to the display control unit 88 of the display unit 62. That is, the object display unit 72 sends to the display control unit 88 information on which group object, item object, and key object are displayed, and at what position and how each object is displayed.

  The object operation unit 74 determines, from the input signal sent by the input unit 64, which object is to be operated and what operation is to be performed on it, and sends signals indicating the determined target object and the determined operation to the object control unit 70.

  The object management unit 76 manages each object, and the object information database 78 is a storage unit that stores information about each object. The stored information includes the group object to which each object belongs, the information used to calculate the object's matching rate in the narrowing search, information about the key objects, and the various other information necessary for operating the objects. The object information database 78 stores information about the objects displayed on the screen of the display device 32. The object management unit 76 reads information from the object information database 78 as needed, sends it to the object control unit 70, and updates the stored information whenever information about an object changes.

  Next, the content management unit 80 includes the search index database 82, the content information database 84, and the content access unit 86; it reads files such as photographs, music data, and address book data stored in the internal storage 40 and the external storage 46 via the content control unit 66, and writes files to the internal storage 40 or the external storage 46.

  The search index database 82 stores information related to search conditions used for narrowing search. Note that the search index database 82 also stores information on search conditions other than information on search conditions displayed as key objects on the screen of the display device 32 as necessary.

  The content information database 84 stores the audio data, image files, Internet sites, and the like that serve as item objects; that is, it stores the file information (audio information and image information) corresponding to the item objects displayed on the screen. As necessary, it also stores file information for item objects not currently displayed on the display device 32. The content access unit 86 exchanges information with the content control unit 66: it reads the file information, software information, and the like acquired by the content control unit 66, and sends information to the content control unit 66.

  The display unit 62 includes a display control unit 88 and a drawing engine unit 90; it generates the image signal for the image to be displayed on the display device 32 based on the information sent from the object display unit 72 of the search unit 60, and sends it to the display device 32. The display control unit 88 composes the image to display based on the information sent from the object display unit 72, that is, on which object is displayed where and how. The drawing engine unit 90 converts the image composed by the display control unit 88 into an image signal and sends it to the display device 32. In this way, the display device 32 displays the image created by the display unit 62 from the information supplied by the search unit 60. When other software is active, the display unit 62 generates images based on information sent from that software rather than from the object display unit 72.

  The input unit 64 includes an input interface (I / F) control unit 92, and sends an operation of the operator detected by the input device 34 to the object operation unit 74. The input interface control unit 92 converts the signal sent from the input device 34 into a signal that can be analyzed by the search unit 60 and sends the signal to the object operation unit 74.

  The content control unit 66 has a file system 94; it reads information from the internal storage 40 and/or the external storage 46 and sends it to the content access unit 86, and writes the information received from the content access unit 86 to the internal storage 40 and/or the external storage 46. The file system 94 manages the reading and writing of information.

  Next, the operation of the mobile communication device 10, specifically the narrowing-search method and the associated image display method, will be described with reference to FIGS. 4 to 12. FIG. 4 is a flowchart showing an example of the processing operation of the mobile communication device.

  First, when the software shown in FIG. 2 is activated, group objects each having a plurality of item objects, and a plurality of key objects, are displayed on the display device 32 as shown in FIG. 3. With these displayed, a narrowing search is performed when a predetermined operation is made. Two types of key object are displayed: A-mode key objects and B-mode key objects. The A mode and the B mode differ in how the range (threshold) of matching rates of the item objects to extract is determined from the operation of the key object. An A-mode key object and a B-mode key object may therefore carry the same narrowing condition (keyword, reference value, or reference criterion).

  In step S12, the search unit 60 determines whether a touch operation has been performed, that is, whether a signal from the input device 34, which detects the operator's operation, has been input to the object operation unit 74 via the input unit 64. If it determines in step S12 that no touch operation has been performed (No), the search unit 60 ends the process.

  If it determines in step S12 that a touch operation has been performed (Yes), the search unit 60 determines in step S14 whether the touch coordinates are within an object's area, that is, whether the position touched by the operator corresponds to an object. For each object displayed on the screen of the display device 32, the area of the input device 34 corresponding to its screen position is set as that object's area. If it determines that the touch coordinates are within an object's area (Yes), the search unit 60 proceeds to step S16; if not (No), it ends the process. That is, when the touch coordinates are within an object's area, the search unit 60 judges that an instruction to operate the object has been input and shifts to operating the object.

  If it is determined Yes in step S14, the search unit 60 determines in step S16 whether the touch coordinates are within the area of a group object, that is, whether a group object is designated as the operation target. If it determines in step S16 that the touch coordinates are within the area of a group object (Yes), that is, a group object is the operation target, the search unit 60 proceeds to the object movement mode in step S18. The object movement mode is a mode in which, when the operator moves the contact position on the input device 34 while a group object is designated, that is, when the touch coordinates change, the position of the group object moves accordingly. In the object movement mode, when the touch operation ends, that is, when the operator is no longer touching the input device 34, the search unit 60 determines that the movement of the group object has finished, sets the position where the touch ended as the group object's position, and ends the process.

  If it determines in step S16 that the touch coordinates are not within the area of a group object (No), that is, no group object is the operation target, the search unit 60 determines in step S20 whether the touch coordinates are within the area of an A-mode key object, that is, whether an A-mode key object is designated as the operation target. If it determines in step S20 that the touch coordinates are within the area of an A-mode key object (Yes), the search unit 60 proceeds to the object adsorption A mode in step S22, which will be described after the flowchart of FIG. 4. When the object adsorption A mode processing of step S22 ends, the search unit 60 ends the process.

  Next, if it determines in step S20 that the touch coordinates are not within the area of an A-mode key object (No), that is, no A-mode key object is the operation target, the search unit 60 determines in step S24 whether the touch coordinates are within the area of a B-mode key object, that is, whether a B-mode key object is designated as the operation target. If it determines in step S24 that the touch coordinates are within the area of a B-mode key object (Yes), the search unit 60 proceeds to the object adsorption B mode in step S26, which will be described later. When the object adsorption B mode processing of step S26 ends, the search unit 60 ends the process.

  If it determines in step S24 that the touch coordinates are not within the area of a B-mode key object (No), that is, no B-mode key object is the operation target, the search unit 60 ends the process. The search unit 60 repeats the processing from step S14 onward once for each object being displayed; that is, the above determinations are repeated as many times as there are group objects and key objects on screen, and the subsequent processing is performed according to whether a group object or a key object was selected. For example, when a group object is selected and moved, the item objects it contains move with it. By controlling in this way, the search unit 60 can operate whichever object the operator touches.

  If there are a plurality of group objects on the screen, the search unit 60 performs the determination of step S16 for each group object; if there are a plurality of A-mode key objects, it performs the determination of step S20 for each A-mode key object; and if there are a plurality of B-mode key objects, it performs the determination of step S24 for each B-mode key object.

  Next, the object adsorption A mode will be described with reference to FIG. 5 and FIGS. 6-1 to 6-5. FIG. 5 is a flowchart showing an example of the processing operation of the mobile communication device in the object adsorption A mode. The object adsorption A mode is a mode in which the range of matching rates of the item objects extracted from a group object changes according to the distance between the key object and the group object; that is, the threshold of the matching rate at which item objects are attracted to the key object changes with that distance. Specifically, the closer the key object is to the group object, the wider the range of matching rates of the item objects to extract, so that item objects with low matching rates are also extracted; the farther the key object is from the group object, the narrower the range, so that only item objects with high matching rates are extracted. In the present embodiment, item objects are extracted in descending order of matching rate, so widening the range means that item objects with lower matching rates are also extracted.
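
  The distance-to-threshold mapping can be sketched as follows; the linear form and the constant D are assumptions, since the text only requires that a shorter distance widen the range (lower threshold) and a longer distance narrow it.

```python
from typing import Optional

D = 300.0  # distance (in pixels) beyond which no search is performed; assumed value

def lower_bound_from_distance(distance: float) -> Optional[float]:
    """Map the key-object/group-object distance to the lower bound of the
    matching-rate range: 0% when the objects touch (widest range), rising
    to 100% at distance D (narrowest range)."""
    if distance > D:
        return None                # farther than D: no narrowing search
    return 100.0 * distance / D

# A closer key object lowers the bound, so more item objects are adsorbed.
print(lower_bound_from_distance(60.0))   # 20.0 -> items with rate >= 20%
print(lower_bound_from_distance(270.0))  # 90.0 -> only items with rate >= 90%
```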

  First, on shifting to the object adsorption A mode in step S22, the search unit 60 calculates the distance between the key object and the target group object in step S30. Since the key object is being moved by the operator, the key object position used is the one detected at the time of the calculation in step S30. The reference position of the group object is a preset position in the group object's area, for example the center or an edge of the area. The target group object is the group object subjected to the narrowing search.

  After calculating the distance in step S30, the search unit 60 determines in step S32 whether the distance between the objects (the key object and the group object) is within D. The distance D is a preset distance; when the key object and the group object are separated by more than D, no narrowing search is performed.

  If it determines in step S32 that the distance between the objects is within D (Yes), the search unit 60 stops the adsorption-hold timer in step S34. The adsorption-hold timer is a timer that runs while the distance between the objects is greater than D; details will be described later. If the target group object is in the adsorption-hold mode described later, the adsorption-hold timer is stopped and the object then enters the adsorption mode described later.

  After stopping the adsorption-hold timer in step S34, the search unit 60 determines in step S36 whether the target group object is in the normal mode. The normal mode is the initial state, in which the matching rates of the item objects in the group object with respect to the key object have not been calculated. If it determines in step S36 that the target group object is not in the normal mode (No), the search unit 60 proceeds to step S42; in that case the object is in the adsorption mode.

  If it determines in step S36 that the target group object is in the normal mode (Yes), the search unit 60 shifts the object state from the normal mode to the adsorption mode in step S38. It then searches the target group object with the key object in step S40; specifically, the matching rate with respect to the key object is calculated for each item object belonging to (included in) the target group object.

  After calculating the matching rate of each item object in step S40, or if step S36 was No (the object is already in the adsorption mode), the search unit 60 calculates in step S42 the range of matching rates to attract, based on the distance between the key object and the target group object calculated in step S30. As described above, the shorter the distance, the wider the calculated range (item objects with lower matching rates can also be extracted); the longer the distance, the narrower the range (only item objects with higher matching rates can be extracted).

  After calculating the range of matching rates in step S42, the search unit 60 extracts and displays the adsorption targets in step S44. That is, from the item objects included in the group object, it extracts those whose matching rate calculated in step S40 falls within the range calculated in step S42, and displays the extracted item objects around the key object as adsorption targets. After displaying the adsorption targets around the key object in step S44, the search unit 60 proceeds to step S52.

  If the search unit 60 determines in step S32 that the distance between the objects is not within D (No), that is, the distance is greater than D, it determines in step S46 whether the target group object is in the adsorption mode. If it determines in step S46 that the object is not in the adsorption mode (No), the search unit 60 proceeds to step S52.

  If it determines in step S46 that the object is in the adsorption mode (Yes), the search unit 60 shifts the object state to the adsorption-hold mode in step S48; that is, the mode of the target group object is set to the adsorption-hold mode. In the adsorption-hold mode, the range of matching rates is not calculated and no item objects are extracted from the target group object. As described above, the target group object and the key object are separated by more than the fixed distance in this state.

  On shifting to the adsorption-hold mode in step S48, the search unit 60 starts the adsorption-hold timer in step S50. The adsorption-hold timer is a measuring unit that serves as the reference for deciding whether to shift from the adsorption-hold mode back to the normal mode; the time can be measured with a timer built into the apparatus or with timing software. After starting the timer in step S50, the search unit 60 proceeds to step S52.

  Note that the search unit 60 repeats the processing from step S30 to immediately before step S52 for each object. That is, when a plurality of group objects are displayed, the above processing is performed for each group.

  Next, when step S46 is No (the object is in the normal mode), or when the processing of step S44 or step S50 is completed, the search unit 60 determines in step S52 whether the adsorption-hold timer has timed out. The time used as the timeout reference is set in advance.

  If it determines in step S52 that the timer has not timed out (No), the search unit 60 proceeds to step S58. If it determines in step S52 that the timer has timed out (Yes), the search unit 60 shifts the object state from the adsorption-hold mode to the normal mode in step S54. In the adsorption mode the adsorption-hold timer is not running, so the process does not reach step S54 from that mode. On shifting to the normal mode in step S54, the search unit 60 discards the search information of the target group object in step S56; that is, it discards the matching rates, calculated in step S40, of the item objects in the target group object with respect to the key object. The search unit 60 then proceeds to step S58.

  When step S52 is No, or when the processing of step S56 is completed, the search unit 60 determines in step S58 whether the drag state has been released, that is, whether the operator has ended input to the input device and the key object is no longer selected. If it determines in step S58 that the drag state has not been released (No), the search unit 60 returns to step S30 and repeats the above processing. If it determines in step S58 that the drag state has been released (Yes), the search unit 60 ends the process.
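
  The mode transitions of FIG. 5 (normal, adsorption, adsorption hold, and the adsorption-hold timer of steps S48 to S56) can be summarized in the following sketch; the timeout value, the distance constant, and the class layout are assumptions for illustration.

```python
import time

D = 300.0            # maximum key-to-group distance for the search (assumed)
HOLD_TIMEOUT = 2.0   # adsorption-hold timeout in seconds (assumed)

class GroupObject:
    """Sketch of the normal / adsorption / adsorption-hold transitions."""

    def __init__(self, items):
        self.items = items
        self.mode = "normal"
        self.hold_since = None   # adsorption-hold timer start time
        self.rates = None        # per-item matching rates, kept while adsorbed

    def update(self, distance, compute_rates):
        if distance <= D:
            self.hold_since = None                       # S34: stop hold timer
            if self.mode == "normal":
                self.mode = "adsorption"                 # S38
                self.rates = compute_rates(self.items)   # S40: search once
            elif self.mode == "adsorption_hold":
                self.mode = "adsorption"                 # hold released
        elif self.mode == "adsorption":
            self.mode = "adsorption_hold"                # S48
            self.hold_since = time.monotonic()           # S50: start timer
        elif (self.mode == "adsorption_hold"
              and time.monotonic() - self.hold_since > HOLD_TIMEOUT):
            self.mode = "normal"                         # S54: timeout
            self.rates = None                            # S56: discard search info
```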

  Next, a specific example of the operation will be described with reference to FIGS. 6-1 to 6-5, which are explanatory diagrams of the operation of the mobile communication device in the object adsorption A mode, showing examples of the screen display. FIGS. 6-1 to 6-5 show only one group object with its item objects and four A-mode key objects; other group objects, their item objects, and other key objects are omitted.

  First, as shown in FIG. 6-1, when the operator touches the area corresponding to one of the key objects, in this example the “single” key object 106a, the search unit 60 determines that the “single” key object 106a has been selected and shifts to the object adsorption A mode.

  Next, when the operator moves the “single” key object 106a and, as shown in FIG. 6-2, the distance between the key object 106a and the group object 102a comes within the fixed distance, the group object 102a enters the adsorption mode and the matching rate of each item object in the group object 102a with respect to the search condition of the key object 106a is calculated. In this example the item objects are image files and the key object is “single”, so the matching rate is calculated from whether the image shows a single person. An image file in which the subject is clearly photographed alone has a matching rate of 100%; an image in which the subject is indistinct or small, or in which other people appear, has a lower matching rate. Whether something appears in an image can be analyzed automatically using image analysis software and a face recognition system. After calculating the matching rate of each item object, the search unit 60 calculates the range of matching rates of the item objects to attract based on the distance between the key object 106a and the group object 102a, extracts from the group object 102a the item objects whose matching rates fall within that range, and, as shown in FIG. 6-2, moves the extracted item objects 104a toward the “single” key object 106a so that they appear adsorbed to it.
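
  How such a matching rate might be computed for the “single” condition is sketched below; `detect_faces` is a stand-in for the face recognition system mentioned above, and the score breakpoints are invented for illustration.

```python
def detect_faces(image):
    """Stand-in for the face recognition system; a real implementation
    would analyze pixels. Here `image` is already a list of face
    records, e.g. [{"confidence": 0.98}]."""
    return image

def single_rate(image) -> float:
    """Toy matching rate for the 'single person in the photo' condition."""
    faces = detect_faces(image)
    if len(faces) == 1 and faces[0]["confidence"] > 0.9:
        return 100.0                              # clearly photographed alone
    if len(faces) == 1:
        return 70.0                               # one face, but indistinct
    if len(faces) > 1:                            # other people also captured
        return max(0.0, 50.0 - 10.0 * (len(faces) - 1))
    return 0.0                                    # no face found

print(single_rate([{"confidence": 0.98}]))                      # 100.0
print(single_rate([{"confidence": 0.5}, {"confidence": 0.9}]))  # 40.0
```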

  When the “single” key object 106a is moved still closer to the group object 102a from the position shown in FIG. 6-2, so that the distance between the key object 106a and the group object 102a becomes shorter, the range of matching rates of the item objects to attract is calculated to be wider and item objects with lower matching rates are also extracted; as shown in FIG. 6-3, more item objects 104b become adsorbed to the “single” key object 106a.

  Next, when the operator releases the drag state by lifting the finger from the input device 34, the item objects extracted (adsorbed) by the search unit 60 are finalized; that is, the range of matching rates of the item objects to extract is fixed, and the item objects satisfying that range are extracted. The set of extracted item objects becomes a new group object 102b.

  Next, when the operator touches the set of extracted item objects as shown in FIG. 6-4 and moves it away from the original group object 102a, it is displayed separately as a new group object 102b, as shown in FIG. 6-5. The new group object 102b is named by appending the name of the key object 106a to the name of the original group object 102a; that is, the search condition used for the extraction appears in the name of the group object 102b, which makes the attributes of the group object easier to understand.

  Next, the object adsorption B mode will be described with reference to FIGS. 7-1 and 7-2, which are flowcharts illustrating an example of the processing operation of the mobile communication device in the object adsorption B mode. The object adsorption B mode is a mode in which the range of matching rates of the item objects extracted from a group object changes according to the distance the key object moves within the group object; that is, the threshold of the matching rate at which item objects are attracted to the key object changes with the distance the key object has moved inside the group object. Specifically, the longer the distance the key object moves within the group object, the wider the range of matching rates of the item objects to extract, so that item objects with low matching rates are also extracted; the shorter the distance, the narrower the range, so that only item objects with high matching rates are extracted. In this embodiment, item objects are extracted in descending order of matching rate, so widening the range means that item objects with lower matching rates are also extracted.

  First, on shifting to the object adsorption B mode in step S26, the search unit 60 determines in step S60 whether the touch coordinates are within the area of a group object while the key object is being operated, that is, whether the key object has been moved into the group object's area.

  If it determines in step S60 that the coordinates are within the group object's area (Yes), the search unit 60 stops the adsorption-hold timer in step S62. The adsorption-hold timer counts time while the key object is outside the group object's area, and is used when the object state of the group object is the adsorption-hold mode described later.

  After stopping the adsorption-hold timer in step S62, the search unit 60 determines in step S64 whether the state (object state) of the target group object is the adsorption mode. The target group object is the group object from which item objects are extracted. The adsorption mode is the mode in which the matching rate of each item object in the group object with respect to the key object has been calculated and the item objects of the target group object can be attracted to the key object. The shift to the adsorption mode is made by the processing of steps S92 and S94 described later. In the adsorption-hold mode, stopping the adsorption-hold timer returns the object to the adsorption mode.

  If it determines in step S64 that the object is in the adsorption mode (Yes), the search unit 60 calculates in step S66 the movement distance and speed from the previous and current coordinates and the previous and current times. Then, in step S68, the search unit 60 calculates from the movement distance (or speed) an amount of matching rate to subtract, and subtracts it from the acquired matching rate. The acquired matching rate is the matching rate serving as the threshold: since item objects are extracted in descending order of matching rate in this embodiment, calculating the amount to subtract from the movement distance and using the result as the lower bound sets the range of matching rates of the item objects to extract. Because the coordinates and times are sampled at the same timing, movement distance and movement speed are proportional, so the acquired matching rate is the same whichever is used. In the above example the movement speed is the average speed, but the range of matching rates may instead be set from the maximum speed: for example, the maximum speed may be calculated regardless of the movement distance, with the range widened when the maximum speed is high and narrowed when it is low. In the above embodiment the range of matching rates is set from the movement distance and/or speed, but it may also be determined from the length of time spent within the group object: for example, regardless of whether the key object moves, the range may be widened the longer the key object stays within the group object and narrowed when the stay is short. These calculation methods may also be combined.
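
  The threshold update of steps S66 and S68 can be sketched as follows; the per-pixel subtraction constant is an assumption, and, as noted above, distance and average speed give the same result because the sampling interval is fixed.

```python
RATE_PER_PIXEL = 0.2   # matching-rate percentage subtracted per pixel moved (assumed)

def update_threshold(threshold, prev_xy, cur_xy):
    """Subtract a distance-proportional amount from the acquired
    matching rate (the lower bound of the extraction range)."""
    dx = cur_xy[0] - prev_xy[0]
    dy = cur_xy[1] - prev_xy[1]
    moved = (dx * dx + dy * dy) ** 0.5
    return max(0.0, threshold - RATE_PER_PIXEL * moved)

# Starting from 100%, a 150 px drag inside the group object lowers the
# lower bound to 70%, so more item objects are attracted.
print(update_threshold(100.0, (10, 10), (160, 10)))  # 70.0
```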

  After calculating the acquired matching rate in step S68, the search unit 60 determines in step S70 whether there are item objects at or above the acquired matching rate, that is, whether any item object's matching rate falls within the range. If it determines in step S70 that there are such item objects (Yes), the search unit 60 moves them to the key object in step S72; that is, the item objects whose matching rates fall within the range are adsorbed to the key object, and the process then proceeds to step S88. If it determines in step S70 that there are no item objects at or above the acquired matching rate (No), there is nothing to adsorb, and the search unit 60 proceeds to step S88. In steps S70 and S72, only the item objects that differ from the previous pass may be targeted; that is, it may be determined whether there are item objects to be newly adsorbed, and only those are moved to the key object.

  Next, if it determines in step S64 that the object is not in the adsorption mode (No), the search unit 60 determines in step S74 whether the object state of the target group object is the normal mode. If it determines in step S74 that it is not the normal mode (No), the search unit 60 proceeds to step S88. If it determines in step S74 that it is the normal mode (Yes), the search unit 60 shifts the object state to the adsorption candidate mode in step S76, starts the adsorption candidate timer in step S78, and proceeds to step S88. The adsorption candidate mode is a mode set during the transition from the normal mode to the adsorption mode; while it is active, no adsorption occurs even if the key object is within the group object's area. When the key object has remained within the group object's area for a fixed time in the adsorption candidate mode, the state shifts to the adsorption mode. Because adsorption does not begin until the key object has dwelt in the area, the item objects of a group object other than the intended one are not adsorbed when the key object merely passes through its area for a short time.
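
  The adsorption candidate debounce can be sketched as follows; the dwell time and the class structure are assumptions for illustration.

```python
import time

CANDIDATE_DELAY = 0.5  # dwell time (s) before adsorption starts; assumed value

class CandidateFilter:
    """Debounce for steps S74-S78 and S90-S92: a group object enters the
    adsorption mode only after the key object has stayed inside its area
    for CANDIDATE_DELAY, so briefly passing through adsorbs nothing.
    (The adsorption-hold handling of steps S82-S86 is omitted here.)"""

    def __init__(self):
        self.mode = "normal"
        self.entered = None      # adsorption candidate timer start time

    def update(self, key_inside: bool) -> str:
        if not key_inside:
            self.mode, self.entered = "normal", None   # S80: stop timer
        elif self.mode == "normal":
            self.mode = "adsorption_candidate"         # S76
            self.entered = time.monotonic()            # S78: start timer
        elif (self.mode == "adsorption_candidate"
              and time.monotonic() - self.entered >= CANDIDATE_DELAY):
            self.mode = "adsorption"                   # S92: dwelt long enough
        return self.mode
```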

  Next, if it is determined in step S60 that the key object is not within the area of the group object (No), the search unit 60 stops the suction candidate timer in step S80. After stopping the suction candidate timer in step S80, the search unit 60 determines in step S82 whether the object state of the target group object is the suction mode. If it is determined in step S82 that the suction mode is not set (No), the search unit 60 proceeds to step S88. If it is determined in step S82 that the suction mode is set (Yes), the search unit 60 shifts the object state to the suction hold mode in step S84 and then starts the suction hold timer in step S86. Steps S84 and S86 are the same processes as steps S48 and S50 in the flowchart shown in FIG.

  Further, the search unit 60 repeats the processing from step S60 to immediately before step S88 for each target group object. That is, when a plurality of target group objects are displayed, the above processing is performed for each group object.

  Next, when No is determined in step S70, step S74, or step S82, or when the processing of step S72, step S78, or step S86 is completed, the search unit 60 stores the current coordinate as the previous coordinate and the current time as the previous time in step S88; that is, the coordinates and time are updated. When the coordinates and time have been updated in step S88, the search unit 60 determines in step S90 whether the suction candidate timer has timed out, that is, whether a certain period of time has elapsed since the key object entered the area of the target group object.

  If it is determined in step S90 that the timer has timed out (Yes), the search unit 60 shifts the object state to the suction mode in step S92 and then searches the target group object with the key object in step S94. Specifically, the matching rate with respect to the key object is calculated for each item object belonging to (included in) the target group object. After performing the search in step S94, the search unit 60 proceeds to step S102.

  If it is determined in step S90 that the timer has not timed out (No), the search unit 60 determines in step S96 whether the suction hold timer has timed out. If it is determined in step S96 that it has not timed out (No), the search unit 60 proceeds to step S102. If it is determined in step S96 that it has timed out (Yes), the search unit 60 shifts the object state to the normal mode in step S98; that is, the object state shifts from the suction hold mode to the normal mode. After shifting to the normal mode in step S98, the search unit 60 discards the search information of the target group object in step S100; that is, the matching rate of each item object in the target group object with respect to the key object, calculated in step S94, is discarded. Thereafter, the search unit 60 proceeds to step S102.

  When No is determined in step S96, or when the processing of step S94 or step S100 is completed, the search unit 60 determines in step S102 whether the drag state has been released, that is, whether the operator has finished the input to the input device so that the key object is no longer selected. If it is determined in step S102 that the drag state has not been released (No), the search unit 60 returns to step S60 and repeats the above processing. If the search unit 60 determines in step S102 that the drag state has been released (Yes), it proceeds to step S104.

  In step S104, the search unit 60 determines whether the target group object is in the suction mode. If it is determined in step S104 that the suction mode is not set (No), the search unit 60 ends the process.

  If it is determined in step S104 that the suction mode is set (Yes), the search unit 60 shifts the object state to the hit mode in step S106 and proceeds to step S108. Here, the hit mode is a mode in which, when a tap (click) is detected by the input device 34 while the key object is within the area of the target group object, the acquisition matching rate is changed so that the range of the matching rate is narrowed.

  The search unit 60 determines in step S108 whether a touch has been made, that is, whether the operator has touched the input device 34. If the search unit 60 determines that no touch has been made (No) in step S108, it proceeds to step S120. If it is determined in step S108 that a touch has been made (Yes), the search unit 60 determines in step S110 whether the touched object is the search key object, that is, the key object for which the narrow search is being performed. If the search unit 60 determines in step S110 that the key object has not been touched (No), that is, another part has been touched, it ends the process.

  If it is determined in step S110 that the key object has been touched (Yes), the search unit 60 starts counting the hit release timer in step S112. Here, the hit release timer is a measuring means that serves as the reference for determining whether to release the hit mode. After starting the hit release timer in step S112, the search unit 60 adds to the acquisition matching rate in step S114. Specifically, the amount to add to the acquisition matching rate is calculated according to the interval between the touch detected in step S108 and the previous touch release (the drag release in step S104 or the release of the touch in the previous pass of step S108). In the present embodiment, the longer this time interval, the larger the amount added to the acquisition matching rate (that is, the larger the change in the acquisition matching rate), and the shorter the interval, the smaller the amount added. In other words, a longer interval raises the acquisition matching rate by a larger step and narrows the range of the matching rate more, while a shorter interval raises it by a smaller step and narrows the range less.
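
  As a sketch, the step S114 addition might map the touch interval linearly to the amount added; the per-second constant is an assumption of the sketch:

```python
def add_acquisition_rate(acq_rate, interval_seconds, per_second=10.0):
    """Raise the threshold by more when the interval since the previous
    release was long, and by less when it was short (step S114)."""
    return min(100.0, acq_rate + per_second * interval_seconds)
```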

  After adding to the acquisition matching rate in step S114, the search unit 60 determines in step S116 whether there is an item object with a matching rate equal to or lower than the acquisition matching rate, that is, whether any of the item objects attracted to the key object now has a matching rate that is no longer included in the range of matching rates. If the search unit 60 determines in step S116 that there is no such item object (No), it proceeds to step S120. If it is determined in step S116 that there are item objects equal to or lower than the acquisition matching rate (Yes), the search unit 60 moves those item objects back to the original group object in step S118; that is, item objects whose matching rates are equal to or lower than the acquisition matching rate are released from the key object and returned to the group object to which they originally belonged. Thereafter, the search unit 60 proceeds to step S120.
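
  Steps S116 and S118 could then partition the attracted set as in this sketch, using the same hypothetical item-object representation as above:

```python
def release_below_threshold(attracted, acq_rate):
    """Split the attracted item objects at the raised threshold; the
    released ones return to their original group object (step S118)."""
    kept = [item for item in attracted if item.matching_rate > acq_rate]
    released = [item for item in attracted if item.matching_rate <= acq_rate]
    return kept, released
```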

  When No is determined in step S108 or step S116, or when the processing of step S118 is completed, the search unit 60 determines in step S120 whether the hit release timer has timed out. If it is determined in step S120 that it has not timed out (No), the search unit 60 returns to step S108 and repeats the above processing; if it is determined that it has timed out (Yes), the process ends. That is, the search unit 60 repeats the hit mode processing until the timer times out or an area other than the key object is touched.

  Next, a specific operation example will be described with reference to FIGS. 8 to 12. FIGS. 8A to 8C are explanatory diagrams for explaining the operation of the mobile communication device when operated in the object adsorption B mode. FIG. 9 is a graph for explaining the operation of the mobile communication device. FIGS. 10A, 10B, and 11 are likewise explanatory diagrams for explaining the operation in the object adsorption B mode, and FIG. 12 is a graph for explaining the operation of the mobile communication device. In FIGS. 8A to 8C, 10A, 10B, and 11, only the group objects and key objects related to the operation are shown; some group objects and key objects are omitted.

  First, as shown in FIG. 8A, when the operator touches the area corresponding to one of the key objects, in this example the "plurality" key object 106b, the search unit 60 determines that the "plurality" key object 106b has been selected and shifts to the object adsorption B mode.

  Next, the "plurality" key object 106b is moved by the operator. At this time, if the key object 106b passes through in a short time, the group object 102a does not enter the suction mode, so no narrowing search is performed and, as shown in FIG. 8B, no item object is attracted to the key object 106b. By thus not shifting to the suction mode when the key object passes through in a short time, the operator who wants to narrow down the group object 102c with the key object 106b does not need to steer the key object 106b around the area of the group object 102a.

  Further, when the key object 106b is operated by the operator and stays within the area of the group object 102a for a certain period of time or longer, the group object 102a enters the suction mode. The search unit 60 then calculates the matching rate of each item object included in the group object 102a with respect to the search condition of the key object 106b. In this example, the item objects are image files and the key object is "plurality", so the matching rate is calculated based on whether the image file shows a plurality of persons. The matching rate is 100% for an image file in which a plurality of persons can clearly be determined to appear, and lower for an image file that is indistinct or that merely contains objects that appear to be persons. After calculating the matching rate of each item object, the search unit 60 calculates the acquisition matching rate (that is, the range of the matching rate) based on the movement distance and movement speed of the key object 106b. Item objects whose matching rates fall within that range are then extracted and, as shown in FIG. 8C, the extracted item objects 104c move toward the "plurality" key object 106b and are attracted to it.
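
  How a matching rate for the "plurality" key might be derived is sketched below; the detector confidences and thresholds are assumptions, since the embodiment does not specify the image analysis:

```python
def plurality_matching_rate(person_confidences):
    """Score an image against the "plurality" key object (0-100), given
    hypothetical detector confidences (one value in [0, 1] per candidate)."""
    clear = [c for c in person_confidences if c >= 0.9]
    if len(clear) >= 2:
        return 100.0                    # plainly shows a plurality of persons
    candidates = [c for c in person_confidences if c >= 0.3]
    if len(candidates) >= 2:
        return 100.0 * min(candidates)  # indistinct subjects score lower
    return 0.0

# e.g. plurality_matching_rate([0.95, 0.97]) -> 100.0
#      plurality_matching_rate([0.5, 0.4])   -> 40.0
```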

  Here, the acquisition matching rate changes as shown in the graph of FIG. 9 according to the movement distance and movement speed within the area of the group object. Specifically, as shown in FIG. 9, at time t1, when a predetermined time has elapsed after the key object enters the area of the group object, the object state becomes the suction mode and calculation of the acquisition matching rate starts. After that, while the key object is moved slowly, as between time t1 and time t2, the rate of change of the acquisition matching rate is small, that is, it changes gradually; while the key object is moved quickly, as between time t2 and time t3, the rate of change of the acquisition matching rate is large, that is, it changes greatly. In this way, how the range of the matching rate changes can be adjusted by how the operator moves the key object, which improves operability.

  Next, as shown in FIG. 10A, when the key object 106b, with the item objects 104c of the group object attracted to it, is moved out of the area of the group object and then released from the drag state, a new group object 102d is displayed separately, as shown in FIG. 10B. At this time, the new group object 102d is given a name obtained by adding the name of the key object 106b to the name of the original group object 102a; that is, the search condition used for the extraction is displayed as the name of the group object. This makes the attributes of the group object easier to understand.
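
  The renaming of the split-off group object can be sketched as follows; the embodiment states only that the key object's name is added to the original name, so the separator is an assumption:

```python
def new_group_name(original_name, key_name):
    """Name the new group object after the search condition used,
    e.g. "Photos" + "plurality" -> "Photos plurality" (format assumed)."""
    return f"{original_name} {key_name}"
```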

  Also, as shown in FIG. 11, when the drag state of the key object 106b is released while the key object 106b is within the area of the group object 102a, and the key object 106b is then touched, that is, clicked while within the area of the group object 102a, the search unit 60 recalculates the acquisition matching rate and raises it. As a result, among the item objects attracted to the key object, those with matching rates equal to or lower than the acquisition matching rate return from the attracted state to the state of belonging to the group object; that is, they are no longer attracted. In this way, clicking the key object raises the acquisition matching rate, so that when too many item objects have been attracted, those with low matching rates can be removed. A desired number of item objects can therefore be attracted without redoing the search from the beginning.

  Here, when the acquisition matching rate is changed by clicking (that is, tapping) the key object, it is preferable to change the rate of change (slope) of the acquisition matching rate according to the timing of the clicks, as shown in FIG. 12. For example, as shown in FIG. 12, when the click interval is long, the rate of change (slope) of the acquisition matching rate is made gentle, that is, the acquisition matching rate increases gradually; when the click interval is short, the rate of change (slope) is made steep, that is, the acquisition matching rate increases rapidly. By changing the rate of change (slope) of the acquisition matching rate according to the timing (interval) of the clicks in this way, the operator can easily and appropriately adjust the attracted state of the item objects, and operability can be made high. In the above embodiment, the amount of change in the acquisition matching rate is determined from the click interval and time; however, the present invention is not limited to this, and the amount to add per click may be set in advance so that the amount of change (rate of change) in the acquisition matching rate is determined from the number of clicks.

  As described above, the mobile communication device 10 calculates the range of the matching rate of the item objects to be extracted based on the positional relationship between the group object and the key object and extracts the item objects that satisfy that range, so the operator can visually recognize the state of the search result. This makes it easy to obtain the search result the operator wants. In addition, since the range of the matching rate can be easily adjusted by the operator's actions, the operability for the operator can be improved.

  Also, by displaying the extracted item objects attracted to the key object, the number of extracted item objects can be recognized, which further improves operability. In the present embodiment, an item object is shown as extracted by being attracted to the key object and displayed there; however, the present invention is not limited to this. It suffices that the operator can visually recognize that item objects have been extracted by the search with the key object; for example, the number of extracted item objects may simply be displayed.

  Moreover, it is preferable that the display area of a group object is set in proportion to the number of item objects belonging to it. That is, it is preferable to display a group object with few belonging item objects in a smaller area and a group object with many belonging item objects in a larger area. The collection of item objects attracted to the key object is preferably displayed in the same manner. This makes the attracted state of the item objects easy to grasp visually and improves operability.
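
  If group objects are drawn as circles, the preferred proportionality between display area and item count could be realized as in this sketch; the unit area per item is an illustrative assumption:

```python
import math

def group_display_radius(item_count, unit_area=120.0):
    """Radius of a circular group object whose display area is
    proportional to the number of item objects it holds."""
    return math.sqrt(unit_area * max(item_count, 1) / math.pi)
```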

  In addition, a line, that is, an outline, is preferably displayed on the outer periphery of the display area of the group object and/or of the collection of item objects. By clearly marking the outer periphery of the display area with a line, the size of the display area can be grasped accurately and the operator's touch operations can be performed appropriately.

  Here, in the above embodiment, the range of the matching rate is calculated after the matching rate of each item object is calculated; however, the matching-rate calculation and the range calculation may be performed in either order. To reduce the amount of calculation, it is preferable that the matching rate of an item object, once calculated, is reused until it is discarded. In the present embodiment, the matching rate of each item object is calculated by the search unit, but the present invention is not limited to this; it suffices that the matching rate of each item object can be determined, and matching-rate information supplied from an external device via a network or the like may be used.

  In the above embodiment, the matching rate of each item object is calculated when the key object serving as the search condition and the group object serving as the search target are determined. However, the matching rate of each item object for each key object may instead be calculated in advance, at the stage when the key objects and group objects are created, and the matching rate of each item object with respect to the key object may then be acquired by reading the stored values.

  In the above embodiment, image files have been described as an example; however, as described above, the search can also be applied to music files, web pages, and the like. It may further be used for searching for a communication destination (a partner to be contacted). In this case, the group object is a so-called address book, and the item objects are individual addresses (communication destination specifying objects). Examples of the key object in this case include "names in the 'A' row of the syllabary", part of a telephone number, part of a mail address, an address group, and the like. Each individual address stores information such as a name, telephone number, mail address, and street address. In addition, the device may shift to a mode for placing a call to an individual address extracted by searching the address book, or to a mail creation screen for composing a mail to be transmitted to the extracted individual address. In this case, the communication unit 26 communicates with the specified communication destination, that is, sends mail or places a call, based on the information stored in at least one individual address selected by the operator's operation on the input device from among the extracted individual addresses.
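
  For the address-book case, the same extraction can be expressed as in the following sketch; the field names and the scoring predicate are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Address:
    """A communication destination specifying object (fields assumed)."""
    name: str
    phone: str
    mail: str

def narrow_addresses(addresses, score, acq_rate):
    """Extract the individual addresses whose matching rate for the key
    object, as computed by `score`, falls within the range."""
    return [a for a in addresses if score(a) >= acq_rate]

# e.g. a key object matching part of a telephone number:
# hits = narrow_addresses(book, lambda a: 100.0 if "090" in a.phone else 0.0, 80.0)
```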

  In the above embodiment, the matching rate is calculated based on the degree of coincidence with the keyword or condition; in addition to this, however, the matching rate may be calculated in consideration of parameters such as the number of times an item has been viewed and its popularity.

  In the above embodiment, a mobile telephone provided with a touch panel has been described as a preferred example, but the present invention is not limited to this; as described above, it can be used for various communication devices. Examples of the communication unit include a unit that connects to a public communication network via a base station and a unit that communicates directly with other terminals, and examples of the connection destination include servers and other terminals.

  In the above embodiment, a liquid crystal display device was used as the display device and a touch panel was used as the input device, but the present invention is not limited to this. For example, a projection display unit such as a projector may be used as the display device, and a contact detection unit that detects contact input within the display area of the projected image may be used as the input device. Specifically, the operator places a hand in the display area, brings it to the area where an object is displayed, and moves it within the display area from there; the same control can be performed by detecting such hand movements with the contact detection means.

  As described above, the mobile communication device and the communication device according to the present invention are useful for extracting an object that meets a search condition from a plurality of objects.

10 Mobile communication device
22 CPU
26 Communication unit
30 Audio processing unit
32 Display device
34 Input device
36 ROM
38 RAM
40 Internal storage
42 External storage interface
46 External storage
50 Microphone
52 Receiver
60 Search unit
62 Display unit
64 Input unit
66 Content control unit
68 Search control unit
70 Object control unit
72 Object display unit
74 Object operation unit
76 Object management unit
78 Object information database
80 Content management unit
82 Search index database
84 Content information database
86 Content access unit
88 Display control unit
90 Drawing engine unit
92 Input interface control unit
94 File system
102 Group object
104 Item object
106 Key object

Claims (13)

  1. A display unit for displaying an image;
    An input detection unit that detects an input instruction input by touching a region where an image is displayed by the display unit;
    A communication unit that communicates with other terminals;
    A control unit that performs control based on an input instruction detected by the input detection unit, and that controls an image to be displayed on the display unit,
    wherein, in a state where a group object having a plurality of communication destination specifying objects, which are address information used when communicating via the communication unit, and a key object, which is a condition for narrowing down the communication destination specifying objects from the group object, are displayed on the screen by the display unit, when the input detection unit detects contact with the key object and a movement instruction in the contact state, the control unit performs:
    obtaining a matching rate of each communication destination specifying object in the group object with respect to the key object;
    detecting the amount of movement of the key object by the input detection unit;
    calculating the range of the matching rate of the communication destination specifying objects to be extracted based on the amount of movement of the key object; and
    extracting the communication destination specifying objects satisfying the calculated range of the matching rate from the communication destination specifying objects in the group object, and, when the input detection unit detects that an instruction to select and confirm one of the extracted communication destination specifying objects has been input, starting communication at the communication unit based on the address information associated with the selected communication destination specifying object.
  2. A display unit for displaying an image;
    An input detection unit that detects an input instruction input by touching a region where an image is displayed by the display unit;
    A communication unit that communicates with other terminals;
    A control unit that performs control based on an input instruction detected by the input detection unit, and that controls an image to be displayed on the display unit,
    wherein, in a state where a group object having a plurality of item objects and a key object serving as a condition for narrowing down the item objects from the group object are displayed on the screen by the display unit, when the input detection unit detects contact with the key object and a movement instruction in the contact state, the control unit performs:
    obtaining a matching rate of each item object in the group object with respect to the key object;
    detecting the amount of movement of the key object by the input detection unit;
    calculating the range of the matching rate of the item objects to be extracted based on the amount of movement of the key object; and
    extracting the item objects satisfying the calculated range of the matching rate from the item objects in the group object.
  3.   The communication device according to claim 2, wherein the control unit sets the set of extracted item objects as a new group object.
  4.   The communication device according to claim 2 or 3, wherein the control unit displays the item objects included in the group object on the display unit and displays the extracted item objects in the vicinity of the key object.
  5. The communication device according to any one of claims 2 to 4, wherein the control unit varies the calculated range of the matching rate according to the distance between the key object and the group object, extracting item objects with lower matching rates as the distance becomes shorter and extracting only item objects with higher matching rates as the distance becomes longer.
  6. The communication device according to any one of claims 2 to 5, wherein the control unit calculates the range of the matching rate of the item objects to be extracted according to the movement distance of the key object within the display area of the group object, extracting item objects with lower matching rates as the movement distance becomes longer and extracting only item objects with higher matching rates as the movement distance becomes shorter.
  7. The communication device according to claim 6, wherein, in a state where the key object is selected and item objects have been extracted based on the range of the matching rate at that time, when the input detection unit detects a tap on the key object, the control unit makes the range of the matching rate of the item objects to be extracted narrower than before the tap.
  8.   The communication device according to claim 7, wherein the control unit narrows the range of the matching rate of the item objects to be extracted further as the tap interval becomes longer.
  9.   The communication device according to claim 7, wherein the control unit narrows the range of the matching rate of the item object to be extracted as the number of taps increases.
  10.   The communication device according to any one of claims 6 to 9, wherein the control unit does not extract the item objects until a preset time has elapsed after the key object enters the display area of the group object.
  11.   The communication device according to claim 2, wherein the control unit displays the group object larger on the display unit as the number of item objects included in the group object increases.
  12.   The communication device according to any one of claims 2 to 11, wherein the control unit displays an outline on the outer periphery of the group object on the display unit.
  13. The communication device according to any one of claims 1 to 12, wherein the input detection unit is a contact sensor disposed on the front surface of the display unit and detects a force applied to the display unit as an input, and the control unit determines the operated key object by associating the position where the load is input with the image displayed on the display unit.
JP2009151722A 2009-06-26 2009-06-26 Mobile communication device and communication device Active JP5254137B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009151722A JP5254137B2 (en) 2009-06-26 2009-06-26 Mobile communication device and communication device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2009151722A JP5254137B2 (en) 2009-06-26 2009-06-26 Mobile communication device and communication device
PCT/JP2010/060902 WO2010150893A1 (en) 2009-06-26 2010-06-25 Communication device and electronic device
US13/380,691 US9626094B2 (en) 2009-06-26 2010-06-25 Communication device and electronic device
EP10792211.4A EP2447857A4 (en) 2009-06-26 2010-06-25 Communication device and electronic device

Publications (2)

Publication Number Publication Date
JP2011008556A JP2011008556A (en) 2011-01-13
JP5254137B2 true JP5254137B2 (en) 2013-08-07

Family

ID=43565129

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009151722A Active JP5254137B2 (en) 2009-06-26 2009-06-26 Mobile communication device and communication device

Country Status (1)

Country Link
JP (1) JP5254137B2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012150582A (en) * 2011-01-18 2012-08-09 Just Syst Corp Retrieval device and retrieval method
JP5865810B2 * 2012-09-24 2016-02-17 NTT Docomo, Inc. Search instruction device, search instruction method and program
JP6365741B2 * 2017-07-13 2018-08-01 Sony Corporation Information processing apparatus, information processing method, and program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003233626A (en) * 2002-02-06 2003-08-22 Just Syst Corp Association presence/absence display device, association presence/absence display method and program for implementing the method on computer
JP2008097175A (en) * 2006-10-10 2008-04-24 Fuji Xerox Co Ltd Electronic file retrieving device
EP2180700A1 (en) * 2007-08-03 2010-04-28 Loilo Inc Interface system for editing video data

Also Published As

Publication number Publication date
JP2011008556A (en) 2011-01-13

Similar Documents

Publication Publication Date Title
US10042513B2 (en) Multifunction device with integrated search and application selection
TWI637310B (en) Continuity
JP6138866B2 (en) Device, method and graphical user interface for document manipulation
CN205608689U (en) Electronic equipment and be used for with interactive device of user interface component
US10222977B2 (en) Portable electronic device performing similar operations for different gestures
JP6077685B2 (en) Device, method, and graphical user interface for moving current position in content with variable scrub speed
US10097792B2 (en) Mobile device and method for messenger-based video call service
US10606470B2 (en) List scrolling and document translation, scaling, and rotation on a touch-screen display
US20180309875A1 (en) Voicemail manager for portable multifunction device
DE202017002875U1 (en) User interface for camera effects
US10430078B2 (en) Touch screen device, and graphical user interface for inserting a character from an alternate keyboard
KR20180034637A (en) Intelligent Device Identification
US9575646B2 (en) Modal change based on orientation of a portable multifunction device
US9207855B2 (en) Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
US10409461B2 (en) Portable multifunction device, method, and graphical user interface for interacting with user input elements in displayed content
CN105260049B (en) For contacting the equipment for carrying out display additional information, method and graphic user interface in response to user
US9229634B2 (en) Portable multifunction device, method, and graphical user interface for interpreting a finger gesture
US9606715B2 (en) Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor
US8624935B2 (en) Smart keyboard management for a multifunction device with a touch screen display
US9420093B2 (en) Apparatus and method for providing additional information by using caller phone number
US10254949B2 (en) Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
DE202016001489U1 (en) Devices and graphical user interfaces for displaying and using menus
US20170090748A1 (en) Portable device, method, and graphical user interface for scrolling to display the top of an electronic document
KR101873908B1 (en) Method and Apparatus for Providing User Interface of Portable device
US10379728B2 (en) Methods and graphical user interfaces for conducting searches on a portable multifunction device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20120406

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130326

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20130417

R150 Certificate of patent or registration of utility model

Ref document number: 5254137

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150


FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20160426

Year of fee payment: 3