AU2014101081A4 - System, method and graphical user interface for facilitating a search - Google Patents

System, method and graphical user interface for facilitating a search

Info

Publication number
AU2014101081A4
Authority
AU
Australia
Prior art keywords
search
nodes
interface
items
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2014101081A
Inventor
Jonathan Robert Burnett
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to AU2014101081A
Application granted
Publication of AU2014101081A4
Priority to PCT/AU2015/000541 (published as WO2016033639A1)
Anticipated expiration
Legal status: Ceased

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/95: Retrieval from the web
    • G06F16/954: Navigation, e.g. using categorised browsing

Abstract

SYSTEM, METHOD, AND GRAPHICAL USER INTERFACE FOR FACILITATING A SEARCH

Aspects of the present disclosure are related to a method for facilitating searching. The method comprises displaying an interactive tree on a graphical user interface (GUI). The tree is formed from a classification of metadata associated with items, and includes one or more value nodes, value ranges, attribute nodes, sub-attribute nodes, sub-group nodes, and group nodes. The method further includes detecting user selection of one or more nodes of the interactive tree, each selected node including one state from a set of states comprising at least two of optional, required and excluded states, and converting the selected nodes into a Boolean search string based on the corresponding states of the selected nodes. Moreover, the method provides the Boolean search string to a search engine to retrieve search results from the items, and displays the search results on the GUI.

Description

SYSTEM, METHOD, AND GRAPHICAL USER INTERFACE FOR FACILITATING A SEARCH

FIELD

[0001] Aspects of the present disclosure are related to information retrieval systems, and more particularly, to search interfaces for facilitating searching of items based on Boolean search queries.

BACKGROUND

[0002] Broadly, information retrieval (IR) refers to finding material (e.g., items) that satisfies an information need from within large data collections or sources. Accordingly, any search conducted on a computing system or on the World Wide Web is considered to be a form of information retrieval.

[0003] Various models can be used to retrieve information, including Boolean models, statistical models and probabilistic models. Boolean models, which are based on Boolean logic and classical set theory, are, however, the most widely used. In these models, the items to be searched and the user's query are conceived as sets of terms, and retrieval is based on whether or not the items contain the specified query terms.

[0004] In spite of the widespread acceptance of standard Boolean models, they have a number of shortcomings, the main shortcoming being their unsuitability for novice users. Non-technical users often make errors when forming Boolean queries because they resort to the plain English meaning of the logical operators AND, OR, or NOT. For example, a noun phrase in the form of "A and B" usually refers to more entities than "A" alone, whereas when used in the context of information retrieval, "A AND B" refers to fewer items than would be retrieved by "A" alone. Hence, one of the common mistakes made by users is to substitute the logical AND operator for the logical OR operator when translating an English sentence to a Boolean query. Furthermore, non-technical users are often unfamiliar with the rules of precedence and parentheses, and therefore are unable to form complex queries. Novices also face difficulties using parentheses, especially nested parentheses. Finally, users can be overwhelmed by the multitude of ways a query can be structured or modified, because of the combinatorial explosion of feasible queries as the number of concepts increases. In particular, users have difficulty identifying and applying the different strategies that are available for narrowing or broadening a Boolean query.

[0005] To aid non-technical users, a range of search interfaces have been available for some time. These interfaces include keyword search interfaces, parametric search interfaces, faceted search interfaces, and search builders. Each of these interfaces has different capabilities.

[0006] A keyword search interface is the most common interface used by Internet search engines and computer file management systems. This interface automatically inserts an AND operator between multiple terms when a space is detected between keywords in a search string, an OR operator if an OR keyword is used in the search string, and a NOT operator if a minus sign is detected in the search string. The primary benefit of keyword search interfaces is the open-ended nature of the search: any keyword can be entered as the basis for a search. However, the construction of more complex searches can be difficult. Moreover, applying constraints in this type of interface to reduce the number of results can be particularly difficult for novice users. Additionally, the lack of a context for a keyword makes it difficult to resolve ambiguity.
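By way of illustration only, the keyword-to-operator translation described in [0006] could be sketched as follows. The function name and tokenisation rules are assumptions made for this sketch rather than part of the disclosure; a real keyword interface would also handle quoting, phrase matching and field scoping.

```python
# Minimal sketch of the keyword-search translation described in [0006]:
# spaces imply AND, an explicit OR keyword switches to OR, and a leading
# minus sign maps to NOT. Illustrative only.
def keywords_to_boolean(raw_query: str) -> str:
    required, excluded = [], []
    use_or = False
    for term in raw_query.split():
        if term.upper() == "OR":          # explicit OR keyword
            use_or = True
        elif term.startswith("-"):        # minus sign maps to NOT
            excluded.append(term[1:])
        else:
            required.append(term)
    joiner = " OR " if use_or else " AND "    # spaces default to AND
    query = joiner.join(required)
    for term in excluded:
        query += f" NOT {term}"
    return query

print(keywords_to_boolean("television LED -plasma"))   # television AND LED NOT plasma
print(keywords_to_boolean("LED OR LCD"))               # LED OR LCD
```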
[0007] The parametric search interface is generally used as an 'advanced search' option within Internet search engines. FIGS. 1 and 2 are screenshots illustrating examples of a typical parametric search interface. This interface includes a finite set of metadata fields for refining a search. In most parametric search interfaces, users specify their search parameters using a variety of controls such as checkboxes, pull-down menus, and sliders to construct an advanced Boolean query. Although these interfaces allow non-technical users to create advanced queries, the parametric search interfaces usually have predefined metadata fields that are often quite rigid. Consequently, these interfaces fare well when used with databases having common metadata, such as websites for real estate, vehicles, or jobs. However, because these interfaces support a finite set of metadata fields relevant to all the items within the search domain, a user is unable to search for items based on attributes applicable to a subset of items within the search domain.

[0008] The faceted search interface, which is an extension of a parametric interface, leverages metadata fields and values to provide users with visible options for clarifying and refining queries. FIG. 3 is a screenshot illustrating a typical faceted search interface. The distinguishing feature between parametric and faceted interfaces is that the faceted interface is dynamic, with the available options adjusting automatically based on the values already selected.
Many faceted search implementations present the options as a set of cascading pick-lists, although this is an optional feature. Although faceted search systems provide a mechanism for refining a search, they generally do not support any mechanism for differentiating between optional and required values, or for controlling the way in which constraints involving multiple attributes are applied. Moreover, these interfaces are generally unable to exclude attributes.

[0009] Search builders provide a more flexible approach to constructing metadata-based searches. These interfaces present a user with a list of fields and the corresponding set of available values. Additional constraints are specified by adding new sets of name-value pairs. An example of a search builder is the Apple iTunes Smart Playlist builder, shown in FIG. 4. Search builders often support AND, OR and NOT logical operators as part of constructing the search. Many search builders also allow searches to be nested. More advanced search builders allow users to select tables as well as fields. However, at this level, formulating queries on the search builder requires some technical expertise.

[0010] Very few of the search interfaces described above support a full set of logical operators (AND, OR, NOT). Further, those interfaces that support all logical operators typically apply logical operators to groups of values, for example, by using controls that specify "any of the values," "all of the values," or "none of the values". This approach is often misunderstood by non-technical users, who tend to think in terms of individual values being 'optional' or 'required'.

[0011] Furthermore, most search interfaces are unsuccessful in constructing complex queries. It is generally difficult to search for results within a set of search results or to combine results from multiple searches. The systems that do support such complex queries tend to be beyond the capabilities of casual or non-technical users. Accordingly, there are currently no established search interfaces that can provide non-technical users with the ability to construct intuitive Boolean search queries that incorporate a full set of logical operators. Furthermore, there is no established search mechanism that assists non-technical users to construct complex or nested Boolean search queries in an intuitive way.

SUMMARY

[0012] Aspects of the present disclosure are related to a search interface that builds a graphical user interface (GUI) based on the underlying content. Specifically, the search interface is configured to customize a graphical user interface based on metadata associated with the underlying content. For instance, the GUI for a clothing online shopping website may look different from the GUI for an electronics shopping website, which may look different from a GUI for a data management system. Furthermore, the search interface allows non-technical users to search for relevant content by constructing complex Boolean search queries that incorporate a full set of logical operators.

[0013] According to one aspect of the present disclosure, a method for facilitating searching is provided. The method includes displaying an interactive tree on a graphical user interface (GUI). The tree is formed from a classification of metadata associated with items, and includes one or more value nodes, attribute nodes and group nodes. The method further includes detecting user selection of one or more nodes of the interactive tree.
Each selected node includes one state from a set of states comprising at least two of optional, required and excluded states. Subsequently, the method proceeds to convert the selected nodes into a Boolean search string based on the corresponding states of the selected nodes, provide the Boolean search string to a search engine to retrieve search results from the items, and display the search results on the GUI.

[0014] According to another aspect of the present disclosure, a search interface for facilitating searching of items is provided. The search interface includes a control builder configured to display a graphical user interface (GUI) comprising a filter interface having an interactive tree. The interactive tree is based on a classification of metadata associated with the items and includes one or more of group nodes, sub-group nodes, attribute nodes, sub-attribute nodes, value ranges, and value nodes. The interactive tree further includes interactive objects for selecting the one or more nodes. The search interface also includes a query builder configured to detect user selection of the one or more nodes, each selected node including one state from a set of at least two states comprising optional, required, and excluded states, retrieve the selected nodes from the display module, and convert the selected nodes into a Boolean search query based on the corresponding states of the nodes. Moreover, the search interface includes a query search engine configured to execute a search of the items based on the search query to retrieve search results, and a results module configured to retrieve the search results from the query search engine and display the search results on the GUI.

[0015] According to yet another aspect of the present disclosure, a graphical user interface (GUI) for searching for electronic items is provided. The GUI includes a filter interface configured to display an interactive tree, where the interactive tree is based on a classification tree of metadata associated with the electronic items. The interactive tree includes one or more value nodes, value ranges, sub-attribute nodes, attribute nodes, sub-group nodes, and group nodes, and interactive objects for selecting the one or more nodes, wherein the interactive objects allow a node to be in a state from a set of states comprising unselected, optional, required, and excluded. The GUI further includes a results interface configured to display a list of search results comprising one or more electronic items corresponding to the one or more selected nodes, and a focus interface configured to display one or more value, attribute, and/or group nodes corresponding to the search results, and to accept a user input to refine the search results by excluding one or more value, attribute and/or group nodes displayed in the focus interface.

BRIEF DESCRIPTION OF THE DRAWINGS

[0016] At least one embodiment of the present invention will now be described with reference to the drawings.

[0017] FIG. 1 is a screenshot of a prior art parametric search interface.

[0018] FIG. 2 is a screenshot of another prior art parametric search interface.

[0019] FIG. 3 is a screenshot of a prior art faceted search interface.

[0020] FIG. 4 is a screenshot of a prior art search builder interface.

[0021] FIG. 5 is a block diagram illustrating an information retrieval platform where an exemplary search interface is employed.
[0022] FIGs. 6A and 6B collectively form a schematic block diagram representation of an electronic device upon which described arrangements can be practiced.

[0023] FIG. 7 is a block diagram of the exemplary search interface of FIG. 5.

[0024] FIG. 8 is a schematic diagram of an exemplary graphical user interface (GUI).

[0025] FIG. 9 illustrates an exemplary classification tree according to aspects of the present disclosure.

[0026] FIG. 10 illustrates an exemplary filter interface of the GUI.

[0027] FIG. 11 illustrates a first rule of the filter interface.
[0028] FIG. 12 illustrates a second rule of the filter interface.

[0029] FIG. 13 illustrates a third rule of the filter interface.

[0030] FIG. 14 illustrates an exemplary keyword search bar of the GUI.

[0031] FIG. 15 illustrates an exemplary tag bar of the GUI.

[0032] FIG. 16 illustrates an exemplary focus interface of the GUI.

[0033] FIG. 17 is a flowchart of an exemplary method for facilitating searching of items based on a Boolean search query.

[0034] FIG. 18 is a flowchart of a method for updating a graphical user interface.

[0035] FIG. 19 is a flowchart illustrating an exemplary method for conducting a search based on a Boolean search query.

[0036] FIG. 20 is a flowchart illustrating an exemplary method for facilitating searching based on a keyword search query.

[0037] FIG. 21 is a flowchart of a method for refining search results of FIG. 19.

[0038] FIG. 22 is a flowchart of a method for adding nodes in a classification tree.

[0039] While the systems and methods described herein are amenable to various modifications and alternative forms, specific implementations are shown by way of example in the drawings and are described in detail. It should be understood, however, that the drawings and detailed description thereto are not intended to limit the systems and methods described herein to the particular form disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention as defined by the appended claims.

DETAILED DESCRIPTION

[0040] Aspects of the present disclosure are related to an exemplary search interface that allows novice users to easily perform Boolean searches or refine previous Boolean searches. To that end, the search interface classifies the available items based on metadata associated with the items and displays the classification in the form of an interactive tree. The interactive tree includes interactive objects that allow multiple nodes of the interactive tree to be selected in one of four states: unselected, optionally selected, mandatorily selected, or excluded. Furthermore, the search interface is configured to construct structured Boolean search queries based on a user's selection of the nodes in one of the four states. This structured search combines some elements of parametric and faceted searches, such as presentation of available options and support for AND, OR and NOT logical operators. Moreover, the search interface presents a consistent graphical user interface (GUI) and allows users to add new search values on the fly.

[0041] The presently disclosed search interface can be implemented in various information retrieval systems to search for various forms of items. For instance, the search interface may be employed for searching products on a website, documents in a document management system, files on a user device, or any other items on an Internet search engine.

PLATFORM

[0042] FIG. 5 is a block diagram of an exemplary information retrieval (IR) platform 500 where the search interface can be employed. Specifically, the IR platform 500 illustrates one possible implementation of the search interface. The IR platform 500 includes a server computer 502 associated with a communications network 504. The network 504 interconnects and operatively couples one or more user devices 506a, 506b, 506c, ... 506n (collectively referred to as user devices 506) to the server 502.
Further, one or more users 508a, 508b, ... 508c (collectively referred to as users 508) operate the corresponding user devices 506. The platform 500 also includes a search interface 510 that renders a graphical user interface (GUI) 512 on a display of the user device 506.

[0043] In the implementation illustrated in FIG. 5, the search interface 510 is installed and operative on the server computer 502, while the GUI 512 is displayed on the user devices 506. In alternative implementations, the search interface 510 and GUI 512 may both be installed and operative on the user device 506. One scenario where the search interface 510 is completely operative on the user device 506 is when the search interface 510 is employed to retrieve items (such as files or documents) from the user device 506. Scenarios where portions of the search interface 510 are implemented on the server 502 while other portions are implemented on the user device 506 include Internet applications, organizational networks, and so forth.

[0044] The server computer 502 can include a single server computer, a distributed server computer, a cloud computing system or any computing system suitable for performing server functions. In general, any computing device capable of being programmed to perform server functions in accordance with the present disclosure can be used.

[0045] The user device 506 can be a wireless mobile phone (e.g., a smart phone), a personal digital assistant, a portable computer (e.g., a laptop, netbook, notepad computer, tablet computer, or the like), an e-book reader, a portable media player, a desktop computer or other suitable computing device. Furthermore, the network 504 can include one or more of a local area network, a wide area network, the Internet, a virtual private network, a wireless network (Wi-Fi, cellular, Bluetooth or the like), a wired network or the like.

[0046] In the case of a network solution, a user such as user 508 can use the user device 506 to access the server 502. The software for accessing the server 502 can be executed from the user device 506. The software can be, for example, an application (or "app") that is downloaded from an online application marketplace such as that provided for iPhone, Android, Blackberry and Palm wireless devices. Alternatively, or in addition to an application, the software can be provided from the server computer 502 in the form of software as a service or as a web service.

[0047] The search interface 510 is configured to display the GUI 512 on a display of the user device 506. Particularly, the search interface 510 displays an interactive classification tree of one or more items on the GUI 512 to aid novice users in building a Boolean search query without knowledge of any technical syntax. Items may include documents, audio files, video files, image files, executables, or any other form of information. Moreover, the interactive tree may include one or more nodes corresponding to the underlying items. To build the Boolean search query, a user 508 may select one or more nodes from the interactive tree. The search interface 510, in turn, retrieves the user-selected nodes and converts that selection into a Boolean search string. The search interface then retrieves one or more items matching the search string and the GUI 512 displays the search results on a display device coupled to, or integrated with, the user devices 506.
In the implementation illustrated in FIG. 5, the search interface 510 employs an applet operating on the user device 506 to locally modify the GUI 512 in response to commands entered by the user 508.

EXEMPLARY DEVICE

[0048] FIGs. 6A and 6B collectively form a schematic block diagram of an exemplary user device 506, which in this case is implemented as a general purpose electronic device 506 including embedded components, upon which the search interface 510 is implemented and the methods to be described are desirably practiced. As described previously, the user device 506 may be, for example, a mobile phone, a smart phone, a tablet, a portable computer or a higher-level device such as a laptop, a desktop computer, a server computer, and other such devices with significantly larger processing resources.

[0049] As seen in FIG. 6A, the electronic device 506 comprises an embedded controller 602. Accordingly, the electronic device 506 may be referred to as an "embedded device." In the present example, the controller 602 has a processing unit (or processor) 605 which is bi-directionally coupled to the storage medium 603. The storage medium 603 may be formed from non-volatile semiconductor read only memory (ROM) 660 and semiconductor random access memory (RAM) 670, as seen in FIG. 6B. The RAM 670 may be volatile, non-volatile or a combination of volatile and non-volatile memory.

[0050] The electronic device 506 includes a display controller 607, which is connected to a video display 614, such as a liquid crystal display (LCD) panel or the like. The display controller 607 is configured to display the GUI 512 on the video display 614 in accordance with instructions received from the embedded controller 602, to which the display controller 607 is connected.

[0051] The electronic device 506 also includes user input devices 613, which are typically formed by keys, a keypad or like controls. In some implementations, the user input devices 613 may include a touch sensitive panel physically associated with the display 614 to collectively form a touch-screen. Other forms of user input devices may also be used, such as a microphone (not illustrated) for voice commands or a joystick/thumb wheel (not illustrated) for ease of navigation about menus.

[0052] The electronic device 506 also has a communications interface 608 to permit coupling of the device 506 to a computer or the communication network 504 via a connection 621. The connection 621 may be wired or wireless. For example, the connection 621 may be radio frequency or optical. An example of a wired connection includes Ethernet. Further, examples of wireless connections include Bluetooth-type local interconnection, Wi-Fi (including protocols based on the standards of the IEEE 802.11 family), Infrared Data Association (IrDA) and the like.

[0053] Typically, the electronic device 506 is configured to perform some special function. The embedded controller 602, possibly in conjunction with further special function components 610, is provided to perform that special function. For example, where the device 506 is a mobile phone, the components 610 may represent a microphone, a speaker and an interface for connection to cellular networks. The special function components 610 are connected to the embedded controller 602. Where the device 506 is a portable device, the special function components 610 may represent a number of encoders and decoders of a type including Joint Photographic Experts Group (JPEG), Moving Picture Experts Group (MPEG), MPEG-1 Audio Layer 3 (MP3), and the like.
[0054] The methods performed by the search interface 510 and the GUI 512 may be implemented using the embedded controller 602, where the processes of FIGs. 5-15 and 17-20 may be implemented as one or more software application programs 633 executable within the embedded controller 602. The electronic device 506 of FIG. 6A implements the described methods. In particular, with reference to FIG. 6B, the steps of the described methods are effected by instructions in the software 633 that are carried out within the controller 602. The software instructions may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the described methods and a second part and the corresponding code modules manage a user interface between the first part and the user.

[0055] The software 633 of the embedded controller 602 is typically stored in the non-volatile ROM 660 of the storage medium 603. The software 633 stored in the ROM 660 can be updated when required from a computer readable medium. The software 633 can be loaded into and executed by the processor 605. In some instances, the processor 605 may execute software instructions that are located in RAM 670. Software instructions may be loaded into the RAM 670 by the processor 605 initiating a copy of one or more code modules from ROM 660 into RAM 670. Alternatively, the software instructions of one or more code modules may be pre-installed in a non-volatile region of RAM 670 by a manufacturer. After one or more code modules have been located in RAM 670, the processor 605 may execute software instructions of the one or more code modules.

[0056] The application program 633 is typically pre-installed and stored in the ROM 660 by a manufacturer, prior to distribution of the electronic device 506. However, in some instances, the application programs 633 may be supplied to the user encoded on one or more CD-ROMs (not shown) and read via the portable memory interface 606 of FIG. 6A prior to storage in the storage medium 603 or in the portable memory 625. In another alternative, the software application program 633 may be read by the processor 605 from the network 620, or loaded into the controller 602 or the portable storage medium 625 from other computer readable media.
Computer readable storage media refers to any non-transitory tangible storage medium that participates in providing instructions and/or data to the controller 602 for execution and/or processing. Examples of such storage media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, flash memory, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external to the device 506. Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the device 506 include radio or infra-red transmission channels, as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like. A computer readable medium having such software or a computer program recorded on it is a computer program product.

[0057] The second part of the application programs 633 and the corresponding code modules mentioned above may be executed to implement the GUI 512 to be rendered or otherwise represented upon the display 614 of FIG. 6A. Through manipulation of the user input device 613 (e.g., the keypad), a user of the device 506 and the application programs 633 may manipulate the GUI 512 in a functionally adaptable manner to provide controlling commands and/or input to the search interface 510 associated with the GUI 512.

[0058] FIG. 6B illustrates in detail the embedded controller 602 having the processor 605 for executing the application programs 633 and the storage medium 603 for storing application data 634. The processor 605 is able to execute the application programs 633 stored in one or both of the connected memories 660 and 670. When the electronic device 506 is initially powered up, a system program resident in the ROM 660 is executed. The application program 633 permanently stored in the ROM 660 is sometimes referred to as "firmware". Execution of the firmware by the processor 605 may fulfil various functions, including processor management, memory management, device management, storage management and user interface.

[0059] The processor 605 typically includes a number of functional modules including a control unit (CU) 651, an arithmetic logic unit (ALU) 652 and a local or internal memory comprising a set of registers 654, which typically contain atomic data elements 656, 657, along with internal buffer or cache memory 655. One or more internal buses 659 interconnect these functional modules. The processor 605 typically also has one or more interfaces 658 for communicating with external devices via system bus 681, using a connection 661.

[0060] The application program 633 includes a sequence of instructions 662 through 663 that may include conditional branch and loop instructions. The program 633 may also include application data 634, which is used in execution of the program 633. This data 634 may be stored as part of the instructions or in a separate location within the ROM 660 or RAM 670.

[0061] In general, the processor 605 is given a set of instructions, which are executed therein. This set of instructions may be organised into blocks, which perform specific tasks or handle specific events that occur in the electronic device 506. Typically, the application program 633 waits for events and subsequently executes the block of code associated with that event.
Events may be triggered in response to input from a user, via the user input devices 613 of FIG. 6A, as detected by the processor 605. Events may also be triggered in response to other sensors and interfaces in the electronic device 506.

[0062] The execution of a set of the instructions may require numeric variables to be read and modified. Such numeric variables are stored in the RAM 670. The disclosed method uses input variables 671 that are stored in known locations 672, 673 in the memory 670. The input variables 671 are processed to produce output variables 677 that are stored in known locations 678, 679 in the memory 670. Intermediate variables 674 may be stored in additional memory locations 675, 676 of the memory 670. Alternatively, some intermediate variables may only exist in the registers 654 of the processor 605.

[0063] The execution of a sequence of instructions is achieved in the processor 605 by repeated application of a fetch-execute cycle. The control unit 651 of the processor 605 maintains a register called the program counter, which contains the address in ROM 660 or RAM 670 of the next instruction to be executed. At the start of the fetch-execute cycle, the contents of the memory address indexed by the program counter are loaded into the control unit 651. The instruction thus loaded controls the subsequent operation of the processor 605, causing, for example, data to be loaded from ROM memory 660 into processor registers 654, the contents of a register to be arithmetically combined with the contents of another register, the contents of a register to be written to the location stored in another register, and so on. At the end of the fetch-execute cycle the program counter is updated to point to the next instruction in the system program code. Depending on the instruction just executed, this may involve incrementing the address contained in the program counter or loading the program counter with a new address in order to achieve a branch operation.

[0064] Each step or sub-process in the processes of the methods described below is associated with one or more segments of the application program 633, and is performed by repeated execution of a fetch-execute cycle in the processor 605 or similar programmatic operation of other independent processor blocks in the electronic device 506. The described methods may alternatively be implemented in dedicated hardware, such as one or more integrated circuits performing the functions or sub-functions of the described processes. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.

SEARCH INTERFACE

[0065] As described previously, the search interface 510 is configured to render the GUI 512, detect commands entered by users, generate search queries based on user commands, refine search queries, retrieve search results and display the results on the GUI 512. Particularly, the search interface 510 is configured to display a simple GUI 512 for Boolean searches that allows users 508 to select individual search terms instead of forming relationships between various search terms. Specifically, the search interface 510 allows users to optionally or mandatorily select individual objects. Although these selections echo traditional Boolean search operators, they shift the emphasis away from a relationship between the search terms and towards a constraint on the individual search terms.
The search interface 510 is also configured to classify items in a hierarchical structure, and to generate an interactive representation of the hierarchical structure. Furthermore, the search interface 510 is configured to allow new search terms to be defined and to save search queries so that the saved search queries can be used for nested searches.

[0066] To perform the above-mentioned functions, the search interface 510 includes application programs 633 and application data 634 that together render the GUI 512. FIG. 7 is a block diagram illustrating an exemplary search interface 510. The application programs 633 include a classification module 703, an indexing module 704, a control builder 705, a query builder 706, a keyword module 707, a refinement module 708, a results module 709, and a query engine 710. The application data 634 includes a classification tree 722, an index 724, and metadata 720, which are stored in the storage medium 603. Furthermore, the search interface 510 is operatively coupled to a keyword search engine 711 that performs searches on the underlying items 718. The items 718 may be stored on the storage medium 603 or on any other internal or external database on the server 502 without departing from the scope of the present disclosure.

[0067] As described previously, the GUI 512 is an application interface between the search interface 510 and the users 508 that is displayed on the display 614 of the user device 506 and operates to permit a user to interact with the application program 633 through graphical icons and visual indicators. Particularly, the GUI 512 includes interactive icons and indicators that a user can select/modify/add/delete using a suitable input device such as a mouse, keyboard, touch screen, joystick, etc. Referring back to FIG. 7, the GUI 512 includes a filter interface 712, a focus interface 713, a keyword search bar 714, a tagger 715, a results window 716 and a previewer 717.

[0068] FIG. 8 is a diagram illustrating a basic configuration of the GUI 512 displayed in the form of a window 800 including multiple sections/bars. The main section 802 is divided into the results window 716 and the previewer 717. The GUI 512 also includes a left panel 804 and a right panel 806. The left panel 804 includes the keyword search bar 714, the focus interface 713 and the filter interface 712, while the right panel 806 includes the tagger 715. It will be understood that the positions of these windows and bars are interchangeable and, in one implementation, may be reconfigured by the user 508 by simply dragging and dropping the different sections using the mouse. Furthermore, the user can resize each of the windows/interfaces by simply dragging their edges.

[0069] The windows/interfaces depicted in FIG. 8 are configured to depict information regarding the underlying classification tree 722 and to accept commands from the user 508 to perform certain functions such as conducting searches, displaying lists of results, displaying document previews, and so on. Particularly, the keyword search bar 714 allows a user to perform a keyword based search, the tagger 715 allows users to insert additional tag elements into the items 718, the filter interface 712 allows users to build a search based on the nodes of the classification tree 722, and the focus interface 713 allows users to further refine the search query built on the filter interface 712. Functions of the application programs 633 and the GUI 512 will be described with reference to FIGS. 7 and 8.
[0070] In FIG. 7, the search interface 510 is implemented to render a GUI for a document management system installed and operating on the server 502. Accordingly, only the control builder 705, the query builder 706 and the GUI 512 are installed and operative on the user device 506, while the other application programs 633 and application data 634 are installed and operative on the server 502.

[0071] However, it will be appreciated that in other implementations, the application programs and data may be present on the user device 506. For instance, in case the search interface 510 is used to search files (such as image, video, audio or text files) stored on the user device 506, the modules of the search interface 510 are present on the user device 506. Alternatively, if the search interface 510 is used on an Internet search engine or for a website, some application programs 633, such as the control builder 705, the query builder 706, the results module 709 and the keyword module 707, may be present on the user device 506 while the other application programs 633 are present on the server 502. It will be understood that other implementations are anticipated where the arrangement of the application programs 633 may be different, and none of these other arrangements is outside the scope of the present disclosure.

[0072] The classification module 703 is configured to generate the classification tree 722 and capture underlying items 718 as individual nodes in the classification tree structure based on the metadata 720 associated with the items 718. The metadata 720 may be retrieved by various techniques based on the particular implementation of the search interface 510 and saved in a particular memory location in the storage medium 603. For instance, the classification module 703 can transfer the metadata 720 associated with each item 718 using loadfiles or sidecar files. Alternatively, in systems where the search interface 510 has access to native items 718, the metadata 720 is derived directly from the items 718 using an ingestion process. This process can extract file system data, such as time stamps. Depending on the document type, it is also possible to extract embedded metadata. In a web or cloud environment, the classification module 703 is typically deployed as a client-side process using technology such as JavaScript. In this case, the metadata 720 is transferred as required from the server 502 using technologies such as AJAX. If datafiles are used, a loader program or process is used to extract the metadata 720 and populate the storage medium 603. The loader may be configured to support a specific loadfile, sidecar, or document format. Where large loadfiles are used, the metadata 720 is transferred to the storage medium 603 as a batch process. In this scenario, the metadata 720 for multiple items 718 is updated in a batch process that is run on a frequency reflecting the nature of the underlying data. This frequency could be weekly, daily, or hourly. When metadata 720 is available in sidecar files, incremental transfers are generally more appropriate. These transfers can be processed on a close-to-live basis, possibly using an agent process that monitors a particular folder. Once the metadata 720 is retrieved, the classification module 703 generates the classification tree 722. FIG. 9 illustrates an exemplary classification tree 722, which includes one or more top-level nodes called group nodes 902. In a retail environment, one of the group nodes 902 may be 'Products', which includes sub-group nodes 903 such as 'Furniture' and 'Electrical'. These sub-groups may further include a further level of sub-groups such as 'Computers', 'Televisions', and 'Whitegoods'. The number of levels of sub-groups will depend on the classification scheme that is appropriate for the set of documents.
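A minimal sketch of one possible in-memory representation of the classification tree 722 described in [0072], using the FIG. 9 retail example, is shown below. The class and field names are assumptions made for illustration; the disclosure does not prescribe any particular data structure.

```python
# Sketch of a possible in-memory form of the classification tree 722.
# The node kinds and example hierarchy follow the FIG. 9 retail example;
# class and field names are assumptions.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    node_id: int                       # unique identifier for the node
    label: str                         # label shown in the filter interface
    kind: str                          # 'group', 'sub-group', 'attribute', 'sub-attribute' or 'value'
    parent: Optional["Node"] = None
    children: List["Node"] = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        child.parent = self
        self.children.append(child)
        return child

# Fragment of the retail example: Products > Electrical > Televisions > Video > Screen Type
products   = Node(1, "Products", "group")
electrical = products.add(Node(2, "Electrical", "sub-group"))
tvs        = electrical.add(Node(3, "Televisions", "sub-group"))
video      = tvs.add(Node(4, "Video", "attribute"))
screen     = video.add(Node(5, "Screen Type", "sub-attribute"))
led = screen.add(Node(6, "LED", "value"))
lcd = screen.add(Node(7, "LCD", "value"))
```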
[0073] The group nodes 902 or sub-group nodes 903 may include one or more attribute nodes 904 that correspond to individual metadata elements associated with the underlying items 718. In the present example, the 'Furniture' sub-group may include an attribute node 'Country of Origin', while the sub-group 'Electrical' may include the attribute node 'Energy rating', and the sub-sub-group 'Televisions' may include the attributes 'Video' and 'Audio'. Similarly, in a document management system, the metadata 720 may be classified into a group node 902 called 'document properties', which includes attribute nodes 904 such as 'last modified date', 'last saved date', 'creation date', 'document title', 'author', 'document type', 'project number', etc. Similarly, in a contract management system, one of the group nodes may be 'contract properties', which includes attribute nodes such as 'contract start date', 'contract end date', 'client name', etc.

[0074] In some cases, the attribute nodes 904 may further include sub-attributes 905. For example, in FIG. 9, the attribute 'Video' has sub-attributes 'screen type', 'screen size', and 'definition'. The attribute or sub-attribute nodes 904, 905 further include one or more value nodes 906. Value nodes refer to actual element values of the metadata elements. For instance, the attribute node 'document type' may include value nodes such as 'email', 'image', 'MS Word', 'MS PowerPoint', and so on. In the retail environment, the attribute node 'screen type' includes the value nodes 'LED' and 'LCD'. In some cases, it is possible for a value node to include sub-value nodes. For example, the value node 'LCD' can include sub-value nodes 'flat screen', 'front projection' and 'rear projection'.

[0075] Value nodes 906 may exist in a single level under an attribute node 904. However, it is more likely that multiple levels of value nodes 906 exist. Classification of the value nodes into multiple levels is generally based on the data type and the context of the attribute node 904. For example, the value nodes 906 can be structured into multiple levels if the data type is hierarchical in nature or can be easily deconstructed. Moreover, the value nodes 906 can be structured by taxonomies, alpha ranges, numeric ranges, or size ranges. Typically, attributes that are inherently hierarchical, such as file locations, can be classified into hierarchical value node levels. Alternatively, a deconstructive approach can be used to classify value nodes 906 into cascading levels for attributes that are conventionally non-hierarchical. One example of such an attribute type is 'date', which can be broken into multiple cascading levels of value nodes corresponding to year, month and day. Another example of this type of attribute node is 'location', which can be broken down into multiple cascading levels of value nodes corresponding to country, state, city, suburb, street name or street number.
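To make the deconstructive approach of [0075] concrete, the sketch below breaks a flat date value into the cascading year > month > day value-node levels used in the later examples (FIGS. 11 to 13). The helper name is an assumption.

```python
# Sketch of the 'date' deconstruction described in [0075]: a single metadata
# value becomes a path of cascading value-node labels (year > month > day).
from datetime import date

def date_value_path(d: date) -> list:
    """Return the cascading value-node labels for a date attribute."""
    return [str(d.year), d.strftime("%B"), str(d.day)]

print(date_value_path(date(2014, 6, 12)))   # ['2014', 'June', '12']
```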
[0076] Taxonomy structuring can be used for attribute nodes 904 that are inherently taxonomic in nature, such as the Dewey decimal classification system for books in a library. The value nodes for this attribute can be represented in increasing degree of specification. Similarly, alpha ranges are used to classify value nodes 906 that can be readily broken down into multiple alphabetical ranges, such as 'names', which can be classified into multiple levels corresponding to their first letters. Consequently, the attribute 'document name' can have value nodes from A to Z. Numeric ranges can be used to classify attribute nodes 904 that can be readily segmented into numerical ranges. Examples of such attribute nodes include 'price of product' or 'file size', which can be itemized into value nodes corresponding to the magnitude of the attribute node. Finally, size ranges can be used to classify attributes that can be segmented into size ranges. Examples include clothing, which can be categorized into levels of value nodes corresponding to the target body size or file size.

[0077] The process used to classify attributes and values into a hierarchical tree can be controlled using sets of rules that are understood by the classification module 703. These rules can be captured in a data dictionary that is included with the metadata 720. Individual metadata elements can be given a unique name that is then associated with a group hierarchy, an attribute type, an attribute label, and a classification rule. For example, the screen technology used by a particular model of television could be recorded as an attribute called 'TelevisionScreenType', which has possible values of 'LCD' or 'LED'. A record in a data dictionary associated with the attribute 'TelevisionScreenType' would instruct the classification module 703 to classify this data using the group hierarchy 'Products > Televisions', with an attribute label of 'Screen Type', using a 'pick-list' classification rule to present the possible 'text' values.

[0078] The classification module 703 may understand a set of basic classification rules. These rules could include a 'pick-list' rule for classifying data into a pick-list, a 'filename' rule for deconstructing filepath information into a folder-based hierarchy, or a 'date' rule for deconstructing date information into a year-month-day hierarchy. More complex rules could also be defined, such as a 'price' rule, which the classification module 703 interprets as data that needs to be divided into price ranges.
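The data-dictionary record and rule dispatch of [0077] and [0078] might be sketched as below. The dictionary layout, the 'ProductPrice' element name, the 100-unit price bucketing and the dispatch function are assumptions; only the 'pick-list', 'date' and 'price' rule names and the 'TelevisionScreenType' example come from the text.

```python
# Sketch of a data-dictionary record ([0077]) and of dispatching on its
# classification rule ([0078]). The record layout and the 100-unit price
# bucketing are assumptions made for illustration.
DATA_DICTIONARY = {
    "TelevisionScreenType": {
        "group_hierarchy": "Products > Televisions",
        "attribute_label": "Screen Type",
        "data_type": "text",
        "rule": "pick-list",
        "values": ["LCD", "LED"],
    },
    "ProductPrice": {                      # hypothetical element for the 'price' rule
        "group_hierarchy": "Products",
        "attribute_label": "Price",
        "data_type": "integer",
        "rule": "price",                   # divide into price ranges
    },
}

def classify(element_name: str, raw_value):
    """Return the value-node labels produced by the element's classification rule."""
    entry = DATA_DICTIONARY[element_name]
    if entry["rule"] == "pick-list":
        return [raw_value]                 # present the value as a pick-list entry
    if entry["rule"] == "price":
        low = int(raw_value // 100) * 100  # assumed 100-unit ranges
        return [f"{low}-{low + 99}"]
    raise ValueError(f"unsupported rule: {entry['rule']}")

print(classify("TelevisionScreenType", "LED"))   # ['LED']
print(classify("ProductPrice", 1249))            # ['1200-1299']
```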
[0079] Once the classification tree 722 is created for the items in the underlying IR platform 500, the classification module 703 stores the information associated with each value node 906 as application data 634. This information may include one or more of a unique identifier for each node, a pointer to the parent, sibling, or daughter node, a sequence identifier used to control the order in which value nodes are presented, a name for identifying the value node, an easily recognizable label used to display the node, a pointer to the corresponding attribute definition, a flag to indicate value nodes, a flag to indicate attribute nodes that can only have one value, or a flag to indicate sub-trees that can be modified by the user 508. Furthermore, the information associated with group and attribute nodes 902, 904 is also stored as application data 634. In addition to the fields listed for value nodes 906, the information associated with group and attribute nodes 902, 904 includes the data type of the attribute (i.e., a basic data type (such as Date, Text, Memo or Integer) or a complex data type (such as Location or Person)), or the rule used to classify the attribute node 904.

[0080] The classification module 703 is also configured to populate the value nodes 906 with the underlying items 718. This can be done as part of the attribute definition, particularly where the range of available values is finite and known in advance. For example, an attribute for 'stock availability' could be classified in advance using the values 'in-stock', 'back-ordered' and 'unavailable'. Alternatively, the value nodes 906 can be populated based on metadata associated with the items. This approach allows new value nodes to be added at any time, making it difficult to know all possible value nodes 906 in advance. For example, the value nodes 906 for the attribute node 'author', which are derived from the metadata 720, can increase or decrease as books are added to or removed from the system. Various techniques are available to associate metadata 720 with the items 718, and any of these techniques may be employed in the present disclosure to associate metadata with the items without departing from the scope of the present disclosure.

[0081] The indexing module 704 is configured to generate the index 724 of the items 718 against the classification tree 722. The item index 724 can be represented as a logical join between a list of items 718 and the nodes of the classification tree 722. In a simple implementation, the item index 724 is represented by two fields: ItemRef, which is a pointer to the item 718 being classified, and NodeRef, which is a pointer to the classification node used to classify the item. Alternatively, in a more sophisticated approach, supporting indexes can be used to create links between an item 718 and the nodes in the classification tree 722, from the root to the leaf, that represent the actual classification.

[0082] The ItemRef pointer may refer to an item 718 in the IR platform 500 that may or may not be integrated with the search interface 510. In case the items 718 are not integrated with the search interface 510, the metadata 720 is stored in the storage medium 603. Example metadata fields include unique identifiers for the item, a name to identify the item, the nominal date of the item, the title of the item, and the type of item.
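The two-field item index of [0081] can be pictured with the small sketch below; looking up the items classified under a node is then a simple filter over the ItemRef/NodeRef rows. The structure and function names, and the sample identifiers, are assumptions.

```python
# Sketch of the simple two-field item index described in [0081]: each row
# joins an item (ItemRef) to a classification node (NodeRef).
from collections import namedtuple

IndexRow = namedtuple("IndexRow", ["item_ref", "node_ref"])

item_index = [
    IndexRow(item_ref=101, node_ref=6),   # item 101 classified under value node 6
    IndexRow(item_ref=102, node_ref=7),   # item 102 classified under value node 7
    IndexRow(item_ref=103, node_ref=6),
]

def items_for_node(node_ref: int) -> list:
    return [row.item_ref for row in item_index if row.node_ref == node_ref]

print(items_for_node(6))   # [101, 103]
```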
[0083] The control builder 705 is configured to display the filter interface 712 on the GUI 512 based on the classified and indexed metadata 720. FIG. 10 depicts an exemplary filter interface 712, which includes an interactive tree 1002 and a settings control 1010. The interactive tree 1002 depicts the nodes of the classification tree 722 in a hierarchical structure. Alternatively, instead of an interactive tree, the individual attribute nodes of the classification tree 722 may be depicted as distinct visual controls. For example, visual styles such as boxes, shading or fonts can be used to represent each attribute node and the corresponding set of value nodes. It is also possible to represent the value nodes in different styles, such as individual labels that are collected into columns. Irrespective of the representation style of the classification tree 722, the filter interface 712 depicts group, sub-group, attribute, sub-attribute, and value node labels of the underlying classification tree 722. In some arrangements, the interactive tree 1002 also depicts a count of the number of items 718 under each node as part of the node label.

[0084] The nodes of the interactive tree 1002 also include interactive objects 1004, such as pull-down menus, checkboxes, and so on, to allow a particular node to be selected. In FIG. 10, the interactive objects 1004 are depicted as check boxes. However, it will be understood that any other interactive object 1004 known in the art can replace the check boxes without departing from the scope of the present disclosure.

[0085] Furthermore, the control builder 705 is configured to employ different interactive objects 1004 for the group/attribute nodes and value nodes. This distinction makes it easy for a user 508 to recognize and distinguish value nodes from other nodes. In one implementation, the shape of the check boxes can reflect the distinction between the nodes. For example, group and attribute nodes 902, 904 may employ check boxes 906 with rounded corners to differentiate them from squarer check boxes 908 associated with value nodes 906. Alternatively, the check boxes 906 for group and attribute nodes may be of a different colour or different shape to the check boxes 908 of value nodes. In yet another implementation, different interactive objects 1004 may be employed for the attribute and value nodes.

[0086] It is generally desirable to allow only value nodes 906 to be selected. For example, selecting the 'red' value for the attribute 'colour' would translate into the criteria "where 'colour' is 'red'". Allowing a user to select the attribute node 904 in this example would translate into the criteria "where colour is unspecified", which may be an inappropriate search string. The distinction between the interactive objects for value nodes and other types of nodes is therefore useful.

[0087] However, in some cases it may be desirable for attribute or group nodes to also be selected. The interpretation of this selection would depend on the context; rather than translating into the criteria "where colour is unspecified", it could mean "where colour has been specified".

[0088] By selecting the interactive objects 1004, users can select any value node 906 as optional, required or excluded. These options reflect the underlying Boolean operators of OR, AND, and NOT, respectively. In the case of check boxes, different symbols can be used to represent the different selection options. For instance, a '•' can represent an optional state, a '+' can reflect a required state and a '-' can represent an excluded state. However, it will be understood that other symbols can be used just as easily without departing from the scope of the present disclosure. Table 1 illustrates the different symbols for attribute nodes and value nodes.

State          Value Node    Attribute Node
Unselected     (empty)       (empty)
Optional       •             •
Required       +             +
Excluded       -             -

Table 1: Symbols for value and attribute nodes

[0089] When the interactive objects 1004 are represented as check boxes, specifying a search criterion involves navigating down the interactive tree 1002 to locate and select one or more value nodes 906. Repeated selection of a checkbox causes the checkbox to cycle through the available states. Alternatively, a single selection of the checkbox may reveal a pop-up that allows a user to choose a specific state without having to cycle through the different states.
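The repeated-selection behaviour of [0089] amounts to cycling a checkbox through the available states; a minimal sketch of the full four-state cycle is shown below (the state names follow [0040], the function name is an assumption).

```python
# Sketch of the checkbox cycling described in [0089]: each repeated selection
# advances the node to the next state in the cycle.
STATES = ["unselected", "optional", "required", "excluded"]

def next_state(current: str) -> str:
    return STATES[(STATES.index(current) + 1) % len(STATES)]

state = "unselected"
for _ in range(4):
    state = next_state(state)
    print(state)          # optional, required, excluded, unselected
```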
[0090] The filter interface 712 also includes a settings control 1010, which allows users to switch the filter interface 712 between a basic, standard and advanced mode. In FIG. 10, the settings control 1010 is depicted as buttons; however, a drop-down menu can also be used to select from the three available settings. In the basic mode, the checkboxes cycle through two states, 'optional' and 'unselected', thereby allowing a value node 906 either to be optionally selected or not selected. In the standard mode, the checkboxes may be configured to cycle through three states, 'unselected', 'optional' and 'required', thereby allowing a user 508 to select a value node 906 as an optional or mandatory field. In the advanced mode, the control builder 705 configures the checkboxes to cycle through four states, 'unselected', 'optional', 'required' and 'excluded', thereby allowing a user 508 to optionally select a value node 906, mandatorily select a value node, or exclude a value node from a search query.

[0091] Furthermore, when a user selects a particular value node 906 by cycling through the various states of the checkbox associated with the value node 906, the attribute and group nodes associated with that value node 906 are automatically selected based on specific rules. If the state of a particular child node is 'optional', the parent node is also optionally selected. FIG. 11 illustrates this rule by depicting two instances of the filter interface 712. In the first instance 1102, no value nodes are selected, whereas in the second instance 1104, the value node "12" is selected. As seen in the figure, when value node "12" is selected, the parent value nodes "June" and "2014" are automatically set to the optional state. Moreover, the attribute node "document date" and the group node "document" are also set to 'optional'. Therefore, optional selection of a value node propagates up the interactive tree 1002 from the value node 906 to the group node 902.

[0092] According to the second rule, if a number of child nodes corresponding to a parent node are selected in a mix of optional and required states, the state of the parent node is automatically set to 'required'. FIG. 12 illustrates this rule by depicting two instances of the filter interface 712. In the first instance 1202, no value nodes are selected, but in the second instance 1204, value nodes "12" and "13" are selected in the optional state, while value node "24" is selected in the required state. As seen in this case, the parent value nodes "June" and "2014", the attribute node "document date" and the group node "document" are automatically set as required.

[0093] In the third rule, if the state of a child node is selected as 'excluded', the state of the parent node is automatically set to 'required'. This reflects the way in which an attribute node 904 with an excluded value node 906 is itself included in the search query. An attribute node 904 in which all the available value nodes 906 are excluded essentially becomes a query that returns items 718 with a null value for that attribute. FIG. 13 illustrates the third rule. When value node "14" is switched to the excluded state in instance 1204, the parent value nodes "April" and "2014" are automatically set as required. The parent attribute node "document date" and the group node "document" are also set to the required state.
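The three parent-state rules of [0091] to [0093] can be summarised in a single function: within one attribute, all-optional children make the parent optional, while any selection that includes a required or excluded child makes the parent required. The sketch below assumes that an all-required selection behaves like the mixed case, which the text does not spell out.

```python
# Sketch of the parent-state rules in [0091]-[0093] for one parent's children.
# The treatment of an all-required selection is an assumption.
def parent_state(child_states):
    selected = [s for s in child_states if s != "unselected"]
    if not selected:
        return "unselected"
    if all(s == "optional" for s in selected):
        return "optional"             # rule 1: optional selections propagate upwards
    return "required"                 # rules 2 and 3: mixed, required or excluded children

print(parent_state(["optional", "unselected"]))             # optional
print(parent_state(["optional", "optional", "required"]))   # required
print(parent_state(["excluded"]))                           # required
```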
[0094] Furthermore, when one or more value nodes are selected from two or more attribute nodes, the corresponding attribute and group nodes are automatically set as 'required', even when the subordinate value nodes are set as 'optional'. This corresponds to the most common expectation of a Boolean search string. For example, a user that selects a series of optional values for a 'date' attribute and a series of optional values for a 'document type' would usually be taken to have intended that a logical 'AND' operation would apply between the two sets of constraints. However, it is possible to change the status of the attribute nodes to 'optional' if an 'OR' operation between the two attributes is desired. [0095] Preferably, the state of attribute and group nodes 902, 904 is activated by the selection of a corresponding value node 906. However, it is conceivable that selection of an inactive group (or attribute) node could trigger a 'select-all' event for the corresponding child nodes. In one implementation, the checkbox for a group or attribute node can cycle between three states - optional, required and unselected. Alternatively, a two-state cycle may be supported in which the check box toggles between an activated and deactivated state. Furthermore, simply deactivating a corresponding parent node can clear all the children nodes of a particular parent node. [0096] If the user selects a particular value node 906, which includes children nodes, the control builder 705 clears any previous selections of the children nodes. For example, if a user selects value nodes corresponding to three days in a month and then selects the value node corresponding to the whole month, the individually selected value nodes are redundant. Similarly, if a user selects all the children value nodes beneath a parent value node, then this has the same effect as if the parent value node had been selected. The control builder 705 detects if the user makes any selection in the filter interface 712 and communicates these selections to the query builder 706. [0097] The query builder 706 is configured to convert a user-selection of value nodes 906 into a Boolean search query. Specifically, the query builder monitors the interactive objects 1004 to determine if their state changes. A new Boolean search query is generated whenever the query builder detects a state change in the filter interface 712. Furthermore, the query builder is configured to generate a search query based on selection of one or more nodes in the focus interface 713. To form the Boolean search string, the query builder 706 groups the selected value nodes 906 into sets. The sets are then connected using brackets and operators based on the group and attribute nodes. [0098] For example, if a user selects three optional value nodes (A,B,C) of one attribute node (which is automatically set as 'required') and two optional values (X,Y) from another attribute (which is also automatically set as 'required'), the query builder 706 generates a search string for this selection that can be represented as

+(•A•B•C)+(•X•Y) (1)

where A, B, C, X, Y and Z are numeric identifiers for the corresponding nodes in the classification tree 722.
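A minimal sketch of the grouping step just described: selected value nodes are collected per attribute and each group is bracketed and marked as required, reproducing the form of expression (1). The data layout (a mapping from attribute identifier to a list of value/state pairs) is an assumption made for illustration.

```python
def build_search_string(selections: dict) -> str:
    """selections maps an attribute id to a list of (value_id, state) pairs,
    where state is '•' (optional), '+' (required) or '-' (excluded)."""
    groups = []
    for attribute, values in selections.items():
        inner = "".join(f"{state}{value}" for value, state in values)
        groups.append(f"+({inner})")   # each attribute group is itself required
    return "".join(groups)

# Reproducing expression (1):
# build_search_string({"date": [("A", "•"), ("B", "•"), ("C", "•")],
#                      "type": [("X", "•"), ("Y", "•")]})
# -> '+(•A•B•C)+(•X•Y)'
```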
[0099] If a search query can be further simplified, the query builder 706 refines the Boolean search string using logical simplification. For example, a search query represented by the following expression

+(+A+B•C)+(+X•Y•Z) (2)

can be refined into

+(+A+B)+(+X) (3)

as the optional values (C,Y,Z) are logically ignored in the presence of the required values. This expression can further be simplified as

+(A)+(B)+(X)

[00100] Similarly, any search string including an excluded criterion can be refined by the query builder 706 in a similar fashion. For example, the search string represented below

+(•A•B•C)+(+X•Y-Z) (4)

can be simplified as:

+(•A•B•C)+(X)-(Z) (5)

[00101] Accordingly, by using conventional Boolean logic, the query builder 706 can represent the selected nodes as a sequence of values that are grouped using brackets and joined using AND or NOT operators. The values within the brackets are grouped using OR operators.
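The refinement shown in expressions (2)-(5) can be sketched as a per-group simplification: inside one attribute group, optional terms are dropped whenever a required term is present, and excluded terms are split out so they can be joined with a NOT operator. This is a sketch of the described simplification, not the query builder's actual code.

```python
def simplify_group(values):
    """values: list of (node_id, state) pairs for a single attribute group,
    with state '•' (optional), '+' (required) or '-' (excluded)."""
    required = [v for v, s in values if s == "+"]
    optional = [v for v, s in values if s == "•"]
    excluded = [v for v, s in values if s == "-"]
    if required:
        kept = [(v, "+") for v in required]   # optional terms are redundant here
    else:
        kept = [(v, "•") for v in optional]   # nothing required, keep the options
    return kept, excluded

# The '(+X•Y-Z)' group of expression (4):
# simplify_group([("X", "+"), ("Y", "•"), ("Z", "-")])
# -> ([("X", "+")], ["Z"])    rendered as '+(X)-(Z)' in expression (5)
```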
[00102] The query builder 706 is further configured to parse and convert the simplified search strings into a set of nested SQL queries and communicate these SQL queries to the query engine 710. [00103] The query engine 710 is configured to execute the search query against the item index 724 to identify items 718 that satisfy the search criteria. Once items are identified, the query engine 710 communicates the identified items 718 to the results module 709, which displays a list of the retrieved results in the results window 716. Furthermore, the results module retrieves the first item in the list of results and displays the item in the previewer 717. [00104] Referring back to FIG. 7, the GUI 512 includes a keyword search bar 714. This search bar is operatively coupled to the keyword module 707. FIG. 14 is a schematic diagram of the keyword search bar 714, which includes a search field 1402 and a search scope selector 1404. The user can enter keywords in the search field 1402 and select the scope of the search from the scope selector 1404. As shown in FIG. 14, the search scope includes metadata search and content search. The keyword module 707 detects these inputs and communicates the inputs to the keyword search engine 711 to perform a persistent search in the case of a content search. Alternatively, the user inputs are communicated to the query engine 710 to search for the keywords in the metadata 720. Results from the search are displayed on the results window 716. [00105] In one arrangement, the keyword module 707 also includes a thesaurus and/or dictionary and is able to generate additional keywords by using synonyms of the retrieved keyword or the stem of the retrieved keyword. For instance, if a user 508 enters a keyword "contract", the keyword module 707 can automatically generate keywords "contractor", "contracting", "contracted", "deed", "agreement" or "commitment". Moreover, the keyword module 707 can request the keyword search engine 711 to perform individual searches for each keyword. Subsequently, the keyword module 707 provides the keywords and the search results to the classification module 703 that inserts metadata tags associated with the keywords in the search results and generates attribute and value nodes related to the keywords. These attribute and value nodes are then updated in the filter interface 712. It will be understood that the classification module 703 can create different value nodes for each of the keyword synonyms or create one value node that includes the search results for all the keyword synonyms. Where the search interface 510 is hosted on a website, the keywords and associated attribute and value nodes are saved so that subsequent users may view previously used keywords and associated search results to improve their own search experience. [00106] FIG. 15 illustrates an exemplary tagger 715, which allows users to add metadata tags to existing items and subsequently applies the metadata tags to the underlying documents. The tagger 715 is operatively coupled with the refinement module 708 and is used in combination with the filter interface 712 and/or the focus interface 713 to create and apply metadata tags to sets of searched documents. Metadata tags can be used to bookmark and/or classify documents of interest. For example, a new tag group called 'Pets' could be created, with corresponding tag values of 'Dogs' and 'Cats'. The user could then use keyword and/or filter searches on various breeds of animals to locate relevant documents, which are then tagged appropriately. It is then possible to quickly locate documents based on the 'dogs' or 'cats' tags. [00107] As depicted in FIG. 15, the tagger 715 includes the group nodes and a node "new pick list" 1502. Selecting this control will create a new pick list attribute that can then be populated with relevant values. A user can select any of the group nodes displayed in the tagger 715. Such selection launches a new window 1504 requesting the user 508 to add a new value node for the specified attribute node. For example, a user can create a new metadata tag as a picklist called 'Document Type' and then add corresponding metadata tag values of 'correspondence' and 'contract'. This information is provided to the refinement module 708, which subsequently provides the user-entered information to the classification module 703 to add an attribute node 'document type' and value nodes 'correspondence' and 'contract' in the classification tree 722. Creation of these nodes initiates a process in which the classification module 703 adds the new nodes to the classification tree 722, the indexing module 704 generates an index reference for one or more items 718 against these new nodes based on the metadata 720 associated with the items 718, and the control builder 705 and refinement module 708 update the GUI 512 with the new nodes. [00108] Tags are generally static. Accordingly, the tags are applied to a set of selected documents and remain in place even if the underlying set of documents changes. However, it is also possible to define dynamic tags that capture the underlying query associated with a tag.
Dynamic tags automatically update the tagged documents in response to changes in the underlying set of documents. [00109] Furthermore, the combination of tagger 715 and refinement module 708 can be used to capture intermediate results as part of a more complex query. For example, if a user wants to locate documents related to two television brands, say Sony and Samsung, but only wants LCD TVs from Sony and LED TVs from Samsung, the user could use three queries. In the first query, the user can select the value node "Sony" from the attribute node "brand" and the value node "LCD" from the attribute node "Type". In query 2, the user can select the value node "Samsung" from the attribute node "brand" and the value node "LED" from the attribute node "type". Query 3 can be a combination of queries 1 and 2. The results from the first and second query can be captured using distinct tags such as "Sony LCD TVs" and "Samsung LED TVs". Subsequently, these tags can form the basis of the third query. [00110] The refinement module 708 is also configured to collapse or expand value nodes 906 on the fly. For example, initially for an attribute 'date', value nodes may exist for each year from 1990 to 2014. However, over time, the number of items corresponding to each value node 906 in the attribute 'year' may keep increasing. Under these circumstances, the refinement module 708 is configured to expand a value node by adding new sub-level value nodes under each particular 'year' value node when the number of items in that value node exceeds a predefined threshold. Such sub-levels may be levels for each month of the year, each fortnight in the year and so on, depending on the number of items in the parent 'year' value node. Similarly, if the number of items under a sub-value node drops below a predetermined threshold value, the refinement module 708 may collapse the sub-level value nodes and instead place the items 718 under the parent value node 906. [00111] Moreover, the refinement module 708 is configured to save filter selections and tags as a search string. This in turn makes it possible to treat filters as dynamic tags, in the same way search tokens are captured and reused. Saving search selections effectively enables saved search strings to be used within new filter searches, creating the ability to nest queries. [00112] FIG. 16 illustrates an exemplary focus interface 713, which allows users to navigate within the filtered set of results. Accordingly, the focus interface 713 depicts the classification tree 722 with each node label depicting the number of responsive items retrieved based on the nodes selected in the filter interface 712. Because the nodes of the focus interface 713 are dependent on the selection in the filter interface 712, the focus interface 713 is refreshed each time the filter interface 712 is modified. Nodes with zero responsive items are generally omitted from the focus interface 713. Although the focus interface 713 uses the same underlying classification tree as the filter interface 712, it can be implemented in a way that dynamically reorganizes the classification tree to make it easier to navigate. For example, it can be made to collapse levels in the hierarchy in which there is only a single child node, or to readjust value ranges based on the value nodes present in the responsive set. Moreover, the focus interface 713 may not include any interactive objects that allow multiple nodes to be selected. Consequently, the user 508 is allowed to select only one node at a time and the selection of a new node replaces the prior selection.
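A sketch of how the focus interface's node labels could be derived: count, for every node of the classification tree, how many of the currently responsive items fall under it, and omit nodes with a zero count. The flat dictionary representation of the tree used here is an assumption for illustration.

```python
def focus_counts(node_items: dict, responsive_ids: set) -> dict:
    """node_items maps a node id to the set of item ids indexed under it;
    responsive_ids is the set of items returned by the filter search."""
    counts = {}
    for node_id, item_ids in node_items.items():
        hit = len(item_ids & responsive_ids)
        if hit:                  # nodes with zero responsive items are omitted
            counts[node_id] = hit
    return counts
```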
[00113] The ability of the focus interface 713 to reflect the set of documents responsive to the selection in the filter interface 712 is of significant utility in situations where large datasets generate very large classification trees. In these cases, the underlying hierarchy may have too many levels and/or too many value nodes to be readily navigated via the filter interface 712. In these situations, it may be desirable to introduce additional functionality into the focus interface 713 to facilitate refinement of the filter interface selections. For example, through the focus interface 713, a user may view additional attribute-value pairs associated with the current set of responsive documents. These attribute or value nodes may be candidates for further refinement of the search (either by inclusion or exclusion). It may therefore be convenient to allow users to set the corresponding filter interface nodes by right-clicking on a focus interface 713 node and choosing to 'exclude' or 'include' the node, which then triggers the corresponding change to be made in the filter control. [00114] FIG. 16 depicts example focus interface nodes when, in the filter interface 712, value node "2014" is selected from the attribute "document date" and the value node "pdf" is selected from the attribute "file extension" to display 30 pdf documents with a document date in 2014. Accordingly, in FIG. 16, the focus interface 713 depicts the group, attribute, and value nodes included with these 30 results and allows the user to further refine the search by selecting one or more value nodes 906. [00115] The focus interface 713 also includes functionality to facilitate refinement of the search selections by facilitating adjustment of the selected nodes in the filter interface 712. For example, in FIG. 16, if the user selects the value node "S" from the "content search" attribute node, a window pops up that allows the user to set the corresponding filter interface 712 nodes by choosing to 'exclude' or 'include' node S, which then triggers corresponding changes in the filter interface 712.

EXEMPLARY METHODS

[00116] FIGs. 17-22 are flowcharts illustrating exemplary methods for retrieving and displaying search results based on Boolean search queries. The methods will be described with reference to FIGS. 5-16. As described previously, the steps of the methods are implemented as one or more software code modules of the software application program 633 resident in the ROM 660 and controlled in execution by the processor 605 of the device 506 or the server 502. In some arrangements, one or more steps of the methods may be executed on the server 502, while other steps are executed on the user device 506. In another arrangement, the steps of the following methods may be executed on the user device 506. Moreover, one or more steps of the methods may be deleted, added, or reordered without departing from the scope of the present disclosure. [00117] FIG. 17 is a flowchart illustrating an exemplary method 1700 for generating a Boolean search query and facilitating searching based on the Boolean search query. The method 1700 begins at step 1702, where the classification module 703 retrieves metadata 720 associated with the underlying items 718 from one or more sources and stores the retrieved metadata 720 as application data 634.
Various techniques may be utilized to retrieve the metadata 720 depending on the source of the native items 718, the presence or absence of metadata included with the native items, the size of the metadata files, and the technology used to write and store the metadata. [00118] Subsequently (at step 1704), the classification module 703 generates a classification tree 722 including one or more of group, sub-group, attribute, sub-attribute or value nodes based on elements of the stored metadata 720. The nodes of the classification tree 722 depend on the underlying metadata 720 and in turn the underlying items 718. So, if the underlying items 718 are files related to products, the classification tree 722 can include nodes related to product features. Alternatively, if the underlying items 718 are related to official documents, the nodes of the classification tree 722 can be related to properties of the documents. [00119] The items 718 are subsequently indexed with the nodes of the classification tree 722 at step 1706. To that end, the indexing module 704 logically joins a list of items with the corresponding nodes of the classification tree 722 to generate the item index 724. Specifically, an item pointer is joined with a node pointer to generate the unique index 724.
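The join of item pointers and node pointers at step 1706 can be pictured as building a simple list of (item, node) pairs. The classify helper standing in for the classification rules, and the pair-list layout, are assumptions made for illustration.

```python
def build_item_index(items, classify):
    """items: iterable of (item_id, metadata) pairs.
    classify(metadata) -> iterable of node ids the item falls under."""
    index = []
    for item_id, metadata in items:
        for node_id in classify(metadata):
            index.append((item_id, node_id))  # one index entry per item/node pair
    return index
```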
[00120] At step 1708, the GUI 512 is displayed on the display 614 of the user device 506. The GUI 512 includes, among other interfaces, the filter interface 712, the focus interface 713, the keyword search bar 714 and the tagger 715. The filter interface 712 displays an interactive tree 1002 that is generated by appending interactive objects 1004 to the nodes of the classification tree 722, and the focus interface 713 displays nodes of the classification tree 722 corresponding to responsive items selected from the interactive tree 1002. The interactive objects 1004 allow nodes of the interactive tree 1002 to be selected. Specifically, the interactive objects 1004 allow each node of the interactive tree 1002 to be selected in one of three states - optional, required, or excluded. The focus interface 713 may not include interactive objects, but allows selection of a single node at a time. [00121] At step 1710, user input is detected. A user may enter one or more commands in the GUI 512 using the input device 613. Depending on the type of input, the method 1700 can proceed in a number of directions. For instance, if the user input is detected in the filter interface 712, the method proceeds to process A. Alternatively, if the user input is detected in the keyword search bar 714, the method 1700 proceeds to process B. According to yet another event, if the user input is detected in the focus interface 713, the method proceeds to process C, and if the user input is detected in the tagger 715, the method proceeds to process D. Each of the processes A-D will be described in detail with reference to FIGS. 19-22. [00122] FIG. 18 is a flowchart illustrating the sub-steps of method step 1708. Specifically, the flowchart illustrates an exemplary method for displaying the GUI 512 based on the classification tree 722. At step 1802, the filter interface 712 and the tagger 715 are generated based on the classification tree 722. Subsequently (at step 1804), selected filter nodes from the filter interface 712 are retrieved. As no user selection has been made so far, all the value nodes of the filter interface 712 are retrieved. The query builder 706 then converts the retrieved filter nodes into a first search string based on Boolean logic at step 1806. The focus interface 713 is then generated based on the first search string. Subsequently, at step 1810, the query builder 706 retrieves the selected focus nodes. [00123] Again, as no focus nodes are selected, the query builder 706 assumes that all the focus nodes are selected and generates a second search query at step 1812. It will be understood that when the GUI 512 is initiated for the first time, the second search query is the same as the first search query as none of the filter or focus nodes are selected. However, subsequently, once a user begins a search, the first and second search queries may be different. The second search query is provided to the query engine 710 at step 1812, which retrieves search results and populates the results window 716 at step 1814. Moreover, the previewer 717 is updated with a preview of the first result from the results window 716 at step 1816. It will be understood that the method of FIG. 18 is performed whenever a GUI 512 is initiated or when the GUI 512 is updated after a particular user input/command.
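The two-stage refresh of FIG. 18 can be sketched as follows. The build_query, run_query and nodes_for callables stand in for the query builder, query engine and classification lookup, and the argument defaults mirror the "treat everything as selected" behaviour on first launch; all names are illustrative assumptions.

```python
def refresh_gui(filter_selection, focus_selection, all_nodes,
                build_query, run_query, nodes_for):
    """One refresh cycle: the first query (from the filter selection) decides
    which nodes appear in the focus interface; the second query (from the
    focus selection) fills the results window and the previewer."""
    first_query = build_query(filter_selection or all_nodes)    # steps 1804-1806
    responsive = run_query(first_query)
    focus_nodes = nodes_for(responsive)                         # focus interface
    second_query = build_query(focus_selection or focus_nodes)  # steps 1810-1812
    results = run_query(second_query)                           # results window
    preview = results[0] if results else None                   # previewer
    return focus_nodes, results, preview
```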
[00124] FIG. 19 is a flowchart illustrating an exemplary process A that is initiated when user input is detected in the filter interface 712. The method begins at step 1902 where the control builder 705 updates the status of one or more selected nodes of the interactive tree 1002. For instance, if the user selects the interactive icon 904 associated with a particular value node 906, the control builder 705 updates the status of the interactive icon 904 based on the next state of the value node 906. Subsequently (at step 1904), the control builder 705 propagates the selection of the value node 906 to any parent or child nodes based on the rules described with reference to FIGs. 11-13. [00125] The query builder 706, at step 1906, retrieves the one or more selected nodes from the filter interface 712 and converts the selected value nodes into a Boolean search string at step 1908. The state of the user-selected nodes is utilized to generate and refine the Boolean search string as described previously. At step 1910, the refined search string is provided to the query engine 710, which conducts a search in the item index 724 based on the refined search string and communicates the retrieved search results to the results module 709. [00126] Subsequently, at step 1912, the GUI is updated as described with reference to FIG. 18. Particularly, the focus interface is updated. To that end, the control builder 705 updates the nodes of the focus interface based on the refined search string. The query builder 706 then retrieves the focus interface selections to generate a second search string. A search is conducted based on the second search string, and results retrieved for this search are displayed on the results window 716. Furthermore, the previewer 717 is updated with a snapshot of the first result in the results window 716, and the tagger 715 is also updated. [00127] In addition to performing a search and displaying results based on the user selection, the search interface 510 may also be configured to save the search criteria. Accordingly, at step 1914, the query builder 706 may save the refined search string in the application data 634, the classification module 703 may update the classification tree 722 with a node for the saved search string and the control builder 705 may update the interactive tree 1002 in the filter interface 712 to display the value node corresponding to the saved search string. Consequently, the saved search string can be optionally or mandatorily selected or excluded from other value nodes to generate nested search strings. [00128] FIG. 20 is a flowchart illustrating an exemplary process B that is initiated when user input is detected in the keyword search bar 714. The method 2000 begins at step 2002 where the keyword module 707 retrieves keywords entered by the user 508. The keyword module 707 also retrieves the search scope selected by the user 508. In one embodiment, the search scope includes metadata searching and content searching. If, at decision box 2004, the method 2000 determines that the search scope is content search, the method proceeds to step 2006. Alternatively, the method proceeds to step 2016. [00129] At step 2006, the keyword module 707 generates alternative keywords based on the stem of the retrieved keywords and generates synonyms of the retrieved keywords using an inbuilt thesaurus or dictionary. Next, the keyword search engine 711 conducts a search in the content of the underlying items 718 at step 2008. The search results and keywords are communicated to the classification module 703 (at step 2010).
The classification module 703 generates new metadata tags for the keywords in the retrieved search results and also generates new attribute and/or value nodes for the keywords. In one arrangement, the classification module 703 may generate separate value nodes for each keyword and synonym. Alternatively, a single node may be created for the keyword and its synonyms. [00130] Next (at step 2012), the retrieved documents are indexed with the new value nodes and stored in the item index 724. Finally, the GUI 512 is updated at step 2014. Particularly, the interactive tree 1002 is updated with node labels for the new value nodes, and the focus interface 713, the tagger 715, the results window 716 and the previewer 717 are updated based on the updated filter interface 712. [00131] If the user selects metadata search at step 2002, the method proceeds to step 2018, where the keywords are provided to the query builder 706, which converts the keywords into a search string and communicates the search string to the query engine 710. The query engine executes the search string against the item index 724 and retrieves search results corresponding to the search string. Subsequently, the GUI 512 is updated at step 2014 based on the retrieved search results.
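A toy sketch of the keyword expansion performed at step 2006. The stemming rule and thesaurus entries are illustrative stand-ins for the inbuilt dictionary/thesaurus, not its real contents.

```python
# Stand-in thesaurus; the real keyword module would consult its own dictionary.
THESAURUS = {"contract": ["deed", "agreement", "commitment"]}

def expand_keyword(keyword: str) -> list:
    stem = keyword.lower().rstrip("s")                 # crude stemming stand-in
    variants = {keyword.lower(), stem, stem + "or", stem + "ing", stem + "ed"}
    variants.update(THESAURUS.get(stem, []))
    return sorted(variants)

# expand_keyword("contract")
# -> ['agreement', 'commitment', 'contract', 'contracted',
#     'contracting', 'contractor', 'deed']
```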
[00132] FIG. 21 is a flowchart illustrating an exemplary process C that is initiated when user input is detected in the focus interface 713. The method 2100 begins at step 2102 where the query builder 706 retrieves a focus node selection from the focus interface 713. The query builder 706 generates a focus search string based on the selected focus node and communicates the focus search string to the query engine 710. At step 2106, the query engine 710 retrieves search results based on the focus search string and communicates these results to the control builder 705. [00133] Finally, the GUI 512 is updated at step 2108. To that end, the node labels of the focus interface 713 are updated to reflect the search results, the search results are displayed on the results window 716 and the first result from the results window 716 is retrieved and displayed in the previewer 717. In some arrangements, the tagger 715 is also updated based on the selected focus nodes. [00134] FIG. 22 is a flowchart illustrating an exemplary process D that is initiated when user input is detected in the tagger 715. Specifically, the method 2200 begins at step 2201, where the refinement module 708 detects whether a new tag is selected or not. If the user wishes to add a new tag, the method proceeds to step 2202 where the refinement module 708 displays a pop-up window requesting the user 508 to enter a label for the new tag. The refinement module 708 also communicates the tag label to the classification module 703, which adds the new tag label to the classification tree 722 at step 2202. Subsequently, the user selects one or more responsive documents from the results window to add to the new tag label. The refinement module 708 communicates the selected documents to the classification module 703, which subsequently retrieves the selected items (at step 2204) and updates the index 724 with the new tag and the responsive items 718 (step 2206). [00135] Finally, the GUI is updated at step 2208. Specifically, the filter interface 712 is updated with the new tags, the focus interface 713 is updated based on the selected nodes of the filter interface, the results window 716 is updated based on the selected focus nodes, the previewer 717 is updated with the first result from the results window 716 and the tagger 715 is updated with the new tag. If, instead of adding a new tag, the user wishes to update an existing tag, the method proceeds to step 2210 where the user input is communicated to the indexing module 704, which updates the item index 724. Thereafter, the GUI is updated at step 2208.
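The index update at step 2206 can be sketched as adding one (item, tag-node) entry per tagged document. The pair-list layout follows the indexing sketch given earlier and is an assumption for illustration.

```python
def apply_tag(item_index: list, tag_node_id, item_ids) -> list:
    """Add (item id, tag node id) entries so tagged documents can later be
    retrieved through the tag's value node in the filter interface."""
    for item_id in item_ids:
        entry = (item_id, tag_node_id)
        if entry not in item_index:        # avoid duplicate index entries
            item_index.append(entry)
    return item_index
```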
[00136] Aspects of the disclosed search interface 510 support the full set of Boolean operators (AND, OR, NOT) and provide a solution for non-technical users to conduct complex Boolean search queries without needing to learn any complex search syntax. Moreover, the search interface 510 allows users to easily refine searches by including or excluding additional value nodes. Furthermore, the search interface 510 can be implemented with any underlying database structures, making the search interface 510 very versatile.

EXEMPLARY APPLICATIONS

[00137] In addition to allowing users to search for particular documents based on their metadata, conduct complex nested queries, and refine search results, the disclosed search interface may also be utilized as a content aggregator. For instance, the search interface 510 may ingest metadata 720 from multiple locations, storage media, or websites (such as multiple electronics online shops, or multiple women's clothing shops, etc.). The ingested metadata may be classified and displayed in the filter interface as an interactive tree, thereby allowing users to search documents/products from multiple websites in a single location. In another example, the search interface may be utilized to aggregate content from multiple social media platforms, such as Facebook*, Twitter*, Instagram*. To that end, the search interface ingests metadata associated with feeds, photographs, albums, news, etc., and classifies this metadata into a classification tree based on one or more rules. The classified metadata is depicted as an interactive tree, allowing a user to search relevant information from all of his/her social media accounts in one place. [00138] Furthermore, the search interface may be utilized by website developers to manage websites and specifically to ascertain whether metadata associated with documents is properly recorded. To that end, a developer may ingest metadata associated with the underlying documents of a website into the search interface 510. Thereafter, the search interface 510 classifies and indexes the documents 718 based on the metadata 720 and displays the classification tree 722 in the filter interface 712. At this stage, the developer may review the various nodes of the classification tree 722 to determine if the documents 718 are properly classified. For instance, some documents may be incorrectly classified under the wrong group or attribute. Such errors would indicate that the microdata associated with the documents was incorrectly programmed. Accordingly, the developer may add new tags to correctly classify documents. If the search interface 510 is operatively coupled to the content management system of the website, the content management system can be automatically updated when new tags are added. It will be appreciated that by utilizing the search interface to update metadata tags and correct any tagging issues, the visibility of the website can be increased, which in turn can increase the number of visitors to the website/webpage. [00139] The classification tree 722 generated by the search interface 510 may also be utilized as a sitemap for a website. Particularly, the search interface 510 may ingest metadata 720 associated with the content 718 of a website and display an interactive tree 1002 of the website content once the metadata 720 is classified.
It will be appreciated that these applications are merely examples, and the search interface of the present disclosure may be utilized in numerous other such applications without departing from the scope of the present disclosure. [00140] In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.

Claims (21)

1. A method for facilitating searching, the method comprises: displaying an interactive tree on a graphical user interface (GUI), the interactive tree being formed from a classification of metadata associated with items, and including one or more of value nodes, value ranges, sub-attribute nodes, attribute nodes, sub-group nodes, and group nodes; detecting user selection of one or more nodes of the interactive tree, each selected node including one state from a set of states comprising optional, required and excluded states; converting the selected nodes into a Boolean search string based on the corresponding states of the selected nodes; providing the Boolean search string to a search engine to retrieve search results from the items; and displaying the search results on the GUI.
2. The method of claim 1 further comprising refining the search results by: displaying one or more group, attribute and value nodes associated with the search results; detecting a user selection to exclude a displayed node from the selected nodes; generating a new Boolean search query by excluding the selected node from the selected nodes of the interactive tree; updating the GUI based on the new Boolean search query.
3. The method of claim 1 further comprising: inserting one or more metadata tag elements to one or more items in the search results; and adding one or more nodes in the interactive tree corresponding to the metadata tag elements.
4. The method of claim 1 further comprising: retrieving metadata associated with the items from a source; classifying the items into a classification tree including one or more of the group nodes, sub-group nodes, attribute nodes, sub-attribute nodes, and associated value nodes by applying one or more rules on the retrieved metadata; and indexing the items with the nodes of the classification tree.
5. The method of claim 4, wherein displaying the interactive tree comprises: appending an interactive object to each node of the classification tree to generate the interactive tree.
6. The method of claim 3, further comprising: refining one or more value nodes of the classification tree based on a comparison between a number of items responsive to the value node and a threshold value.
7. The method of claim 1 further comprising performing a keyword search by: detecting one or more keywords and search scope entered in the GUI, the search scope including at least one of metadata search and content search; generating alternative keywords based on the detected keywords; retrieving search results for the keywords based on a search of the keywords in the content of the items; adding metadata elements to the search results based on the keywords; updating the interactive tree based on the additional metadata elements.
8. A search interface for facilitating searching of items, the search interface comprising: a control builder configured to: display a graphical user interface (GUI) comprising a filter interface having an interactive tree, the interactive tree being based on a classification of metadata associated with the items and including: one or more of group nodes, sub-group nodes, attribute nodes, sub-attribute nodes, value ranges, and value nodes and interactive objects for selecting the one or more nodes; a query builder configured to: detect user selection of the one or more nodes, each selected node including one state from a set of at least two states comprising optional, required, and excluded states; retrieve the selected nodes from the display module; convert the selected nodes into a Boolean search query based on the corresponding states of the nodes; and a query search engine configured to: execute a search of the items based on the search query to retrieve search results; and a results module configured to: retrieve the search results from the query search engine; and display the search results on the GUI.
9. The search interface of claim 8, wherein the GUI further comprises: a focus interface configured to: display group, attribute, and value nodes associated with the search results; and accept a user input to exclude one or more displayed nodes.
10. The search interface of claim 8, wherein the GUI further comprises: a keyword search bar configured to accept entry of one or more keywords and a search scope; a tag bar configured to: allow a user to input one or more metadata tags and associate the one or more metadata tags with one or more electronic items from the search results; and display the one or more user selected metadata tags; and a results window configured to display the search results.
11. The search interface of claim 8 further comprising: a classification module configured to retrieve the metadata associated with the items from one or more sources and classify the items into a classification tree including one or more of the group nodes, sub-group nodes, attribute nodes, sub-attribute nodes, value ranges and value nodes by applying one or more rules to the retrieved metadata; an indexing module configured to create an index including an item reference and a node reference.
12. The search interface of claim 8, wherein the interactive objects are checkboxes, pop-up menus, or drop down menus.
13. The search interface of claim 12, wherein the optional state is represented by a dot, the required state is represented by a "+" and the excluded state is represented by a "-".
14. The search interface of claim 12 is configured to operate in a setting comprising one of basic, standard, and advanced setting, wherein in the basic setting, the state of a node from the one or more nodes of the interactive tree is selected between an unselected state and the optional state, in the standard setting, the state of the node is selected between the unselected state, optional state or required state, and in the advanced setting, the state of the node is selected between the unselected state, optional state, required state, and excluded state.
15. The search interface of claim 8 further comprising: a keyword module configured to: detect entry of one or more keywords in the keyword search bar; retrieve a user selected search scope, the search scope including metadata search and content search; generate one or more keywords for the detected keywords based on a stem or synonym of the detected keywords; and communicate the keywords and search results associated with the keywords to the classification module.
16. The search interface of claim 15, wherein the classification module is configured to: add metadata tags corresponding to the keywords to the search results associated with the keywords; and update the classification tree based on the additional metadata tags.
17. The search interface of claim 8, further comprising a refinement module configured to: refine the value nodes of the interactive tree once the number of items corresponding to a value node exceeds or drops below a threshold value; and save the Boolean search string.
18. A graphical user interface (GUI) for searching for electronic items, comprising: a filter interface configured to display an interactive tree, the interactive tree being based on a classification tree of metadata associated with the electronic items and including: one or more value nodes, value ranges, sub-attribute nodes, attribute nodes, sub-group nodes, and group nodes, and interactive objects for selecting the one or more nodes, wherein the interactive objects allow a node to be in a state from a set of states comprising unselected, optional, required, and excluded; a results interface configured to display a list of search results comprising one or more electronic items corresponding to the one or more selected nodes; and a focus interface configured to: display one or more value, attribute, and/or group nodes corresponding to the search results, and accept a user input to refine the search results by including or excluding one or more value, attribute and/or group nodes displayed in the focus interface.
19. The graphical user interface (GUI) of claim 18 further comprising: a tagger configured to allow a user to select one or more tags and associate the one or more tags with one or more electronic items from the displayed search results; and display the one or more user selected tags.
20. The graphical user interface (GUI) of claim 19, further comprising: a keyword search interface configured to: allow a user to enter one or more keywords; and select a search scope from metadata search and full-text search.
21. The graphical user interface (GUI) of claim 19, wherein the interactive objects comprise at least one of checkboxes or drop-down menus, wherein the optional state is represented by a dot, the required state is represented by a "+" and the excluded state is represented by a "-".

Jonathan Robert Burnett
Patent Attorneys for the Applicant/Nominated Person
SPRUSON & FERGUSON
AU2014101081A 2014-09-05 2014-09-05 System, method and graphical user interface for facilitating a search Ceased AU2014101081A4 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2014101081A AU2014101081A4 (en) 2014-09-05 2014-09-05 System, method and graphical user interface for facilitating a search
PCT/AU2015/000541 WO2016033639A1 (en) 2014-09-05 2015-09-04 System, method, and graphical user interface for facilitating a search

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2014101081A AU2014101081A4 (en) 2014-09-05 2014-09-05 System, method and graphical user interface for facilitating a search

Publications (1)

Publication Number Publication Date
AU2014101081A4 true AU2014101081A4 (en) 2014-10-09

Family

ID=51684636

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2014101081A Ceased AU2014101081A4 (en) 2014-09-05 2014-09-05 System, method and graphical user interface for facilitating a search

Country Status (2)

Country Link
AU (1) AU2014101081A4 (en)
WO (1) WO2016033639A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3107045A1 (en) * 2015-06-18 2016-12-21 Simmonds Precision Products, Inc. Deep filtering of health and usage management (hums) data

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112416334A (en) * 2019-08-23 2021-02-26 腾讯科技(深圳)有限公司 Page configuration method, device, equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2710548B2 (en) * 1993-03-17 1998-02-10 インターナショナル・ビジネス・マシーンズ・コーポレイション Method for retrieving data and converting between Boolean algebraic and graphic representations
US7873649B2 (en) * 2000-09-07 2011-01-18 Oracle International Corporation Method and mechanism for identifying transaction on a row of data
WO2005109244A1 (en) * 2004-05-11 2005-11-17 Angoss Software Corporation A method and system for interactive decision tree modification and visualization
JP4921103B2 (en) * 2006-10-13 2012-04-25 インターナショナル・ビジネス・マシーンズ・コーポレーション Apparatus, method and program for visualizing Boolean expressions

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3107045A1 (en) * 2015-06-18 2016-12-21 Simmonds Precision Products, Inc. Deep filtering of health and usage management (hums) data
US9555710B2 (en) 2015-06-18 2017-01-31 Simmonds Precision Products, Inc. Deep filtering of health and usage management system (HUMS) data

Also Published As

Publication number Publication date
WO2016033639A1 (en) 2016-03-10

Similar Documents

Publication Publication Date Title
US11409777B2 (en) Entity-centric knowledge discovery
US20230385033A1 (en) Storing logical units of program code generated using a dynamic programming notebook user interface
CN105493075B (en) Attribute value retrieval based on identified entities
US9489119B1 (en) Associative data management system utilizing metadata
US10599643B2 (en) Template-driven structured query generation
US9015175B2 (en) Method and system for filtering an information resource displayed with an electronic device
US10133823B2 (en) Automatically providing relevant search results based on user behavior
US10013144B2 (en) Visual preview of search results
US9659054B2 (en) Database browsing system and method
US20200342029A1 (en) Systems and methods for querying databases using interactive search paths
US20160210355A1 (en) Searching and classifying unstructured documents based on visual navigation
US20170097946A1 (en) Method and apparatus for saving search query as metadata with an image
KR101441219B1 (en) Automatic association of informational entities
US8612882B1 (en) Method and apparatus for creating collections using automatic suggestions
US20120117093A1 (en) Method and system for fusing data
US20210406268A1 (en) Search result annotations
AU2014101081A4 (en) System, method and graphical user interface for facilitating a search
JPH11282882A (en) Document management method
JP2002082965A (en) Document retrieval method
Ksikes Towards exploratory faceted search systems
Smith Exploratory and faceted browsing, over heterogeneous and cross-domain data sources
KR20240019303A (en) User interface for displaying web browser history data
Uma¹ et al. WEBPACS: A Tool to Show Case Library's Resources for Their Effective Use
Borden TimeIn

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry