WO2017034576A1 - Sharing similar electronic files based on user-generated criteria - Google Patents

Sharing similar electronic files based on user-generated criteria

Info

Publication number
WO2017034576A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic
user
electronic file
files
file
Prior art date
Application number
PCT/US2015/047195
Other languages
English (en)
Inventor
Baljit Singh
Jonathan Thomas MOZZETTA
Robert P. Cazier
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2015/047195
Publication of WO2017034576A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually

Definitions

  • Electronic devices and networking environments allow for the sharing of electronic files between various users. For example, multimedia files such as photographs and videos taken by various individuals can be shared with other individuals via a physical network, a networking application or other networking environment.
  • FIG. 1 is a diagram of a system for sharing similar electronic files based on user-generated criteria, according to one example of the principles described herein.
  • FIG. 2 is a diagram of an electronic device for sharing similar electronic files based on user-generated criteria, according to one example of the principles described herein.
  • FIG. 3 is a flowchart of a method for sharing similar electronic files based on user-generated criteria, according to one example of the principles described herein.
  • Fig. 4 is a diagram of an electronic device for sharing similar electronic files based on user-generated criteria, according to another example of the principles described herein.
  • FIG. 5 is a flowchart of a method for sharing similar electronic files based on user-generated criteria, according to another example of the principles described herein.
  • FIG. 6 is a diagram of an electronic device for sharing similar electronic files based on user-generated criteria, according to another example of the principles described herein.
  • Electronic device usage and networking usage have become widespread in facilitating the dissemination of information amongst various users.
  • users, or suppliers of electronic files, can use the network environment to provide access to the electronic files to other select users.
  • a first user may take photographs at an event, such as a vacation or other social event. The first user may then upload the photographs to a network such as a home network or a public network so other users may view the photographs. While such network-sharing of electronic files has become widespread, its practical implementation may be cumbersome for users desiring to share electronic files.
  • the devices and systems of the present specification and the appended claims address these and other issues. Specifically, the present specification and the appended claims rely on user-selected criteria and metadata associated with the electronic files to determine whether certain electronic files are similar. If the files are similar as defined by the user-generated criteria, then they are grouped together and the suppliers of those similar electronic files are allowed access to the electronic files.
  • the present specification describes a method for sharing similar electronic files based on user-generated criteria.
  • user-generated criteria are received.
  • the user-generated criteria define a similarity between electronic files.
  • Electronic files that are similar are identified by comparing metadata associated with the electronic files against the user-generated criteria. Similar electronic files are shared with suppliers of the similar electronic files.
  • the present specification also describes a device for sharing similar electronic files based on user-generated criteria.
  • the device includes a user-generated criteria engine to receive user-generated criteria to define a similarity between electronic files.
  • the device also includes a compare engine to compare 1 ) first metadata associated with a first electronic file received from a first supplier, 2) second metadata associated with a second electronic file received from a second supplier, and 3) the user-generated criteria.
  • a share engine of the device shares the first electronic file and the second electronic file with the first supplier and the second supplier when the first electronic file and the second electronic file are similar as defined by the user-generated criteria.
  • the present specification also describes a non-transitory machine-readable storage medium encoded with instructions for sharing electronic files.
  • the instructions executable by a processor cause the processor to receive user-generated criteria to define a similarity between electronic files, analyze first metadata for a first electronic file received from a first supplier, analyze second metadata for a second electronic file received from a second supplier, determine whether the first electronic file and the second electronic file are similar by comparing the first metadata and second metadata against the user-generated criteria, and allow joint access to the first electronic file and the second electronic file by the first supplier and the second supplier when the first electronic file and the second electronic file are similar.
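  • For illustration only (not part of the claimed subject matter), the following is a minimal Python sketch of the instruction flow just described, assuming simple in-memory structures; the names Metadata, Criteria, determine_similar, and allow_joint_access are editorial assumptions rather than terms from the specification.

```python
from typing import Callable, Dict, Set

# A file's metadata as a plain dictionary, e.g. {"lat": ..., "lon": ..., "taken_at": ...}.
Metadata = Dict[str, object]
# User-generated criteria expressed as a similarity test over two metadata records.
Criteria = Callable[[Metadata, Metadata], bool]

def determine_similar(first: Metadata, second: Metadata, criteria: Criteria) -> bool:
    """Compare the first and second metadata against the user-generated criteria."""
    return criteria(first, second)

def allow_joint_access(access: Dict[str, Set[str]],
                       first_file: str, first_supplier: str,
                       second_file: str, second_supplier: str) -> None:
    """Grant both suppliers access to both files once they are determined to be similar."""
    for supplier in (first_supplier, second_supplier):
        access.setdefault(supplier, set()).update({first_file, second_file})

# Example: with criteria that match files taken on the same day, access["supplier_a"]
# and access["supplier_b"] would each end up containing both file identifiers.
```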
  • Certain examples of the present disclosure are directed to a device and method for grouping and sharing electronic files that 1) simplifies the process of sharing of electronic files; 2) reduces user-involvement in providing and sharing the electronic files; and 3) provides flexibility in generating a file-sharing policy.
  • the devices and systems disclosed herein may prove useful in addressing other deficiencies in a number of technical areas. Therefore the systems and devices disclosed herein should not be construed as addressing just the particular elements or deficiencies discussed herein.
  • similarity may refer to a geographic similarity, meaning that electronic files originated close to one another. Similarity may also refer to a temporal similarity, meaning that the electronic files were generated at nearly the same time. Other examples of similarities include file format similarity and file attribute similarity, among other similarities. In some examples, similar electronic files may have at least one, and in some cases multiple, shared characteristics.
  • Metadata and similar terminology may refer to data associated with the electronic file that is not content data.
  • content data refers to data that is visually represented such as shapes, images, and individual faces.
  • these shapes, images and faces are not metadata, but content data and the metadata refers to information describing the electronic file such as geographic location of where the electronic file was generated, a time stamp for the electronic file, as well as textual information describing file attributes, formats and characteristics.
  • EXIF (exchangeable image file) data is one example of such metadata for photographs.
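  • For illustration only, the sketch below shows one possible way to read such EXIF metadata (time stamp, GPS tags, camera make and model) from a photograph using the Pillow library in Python; this is an editorial example that assumes a recent Pillow version and is not an implementation required by the specification.

```python
from PIL import Image, ExifTags  # pip install pillow

def read_photo_metadata(path: str) -> dict:
    """Return a small dictionary of EXIF metadata (not pixel/content data) for a photo."""
    exif = Image.open(path).getexif()
    # Map numeric EXIF tag ids to readable names, e.g. 306 -> "DateTime".
    named = {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    # The GPS block lives in its own IFD; it is empty when the photo has no location data.
    gps_raw = exif.get_ifd(ExifTags.IFD.GPSInfo)
    gps = {ExifTags.GPSTAGS.get(tag_id, tag_id): value for tag_id, value in gps_raw.items()}
    return {
        "timestamp": named.get("DateTime"),
        "camera": (named.get("Make"), named.get("Model")),
        "gps": gps,
    }
```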
  • a supplier refers to a user or entity that supplies an electronic file.
  • a supplier may be an individual that takes a photograph or shoots a video.
  • a supplier may be an individual that receives the electronic file from another entity or user.
  • The term "a number of" or similar language is meant to be understood broadly as any positive number including 1 to infinity; zero is not a number, but the absence of a number.
  • Fig. 1 is a diagram of a system (100) for sharing similar electronic files (106) based on user-generated criteria, according to one example of the principles described herein.
  • users such as suppliers using electronic devices such as personal computers, laptops, mobile phones, mobile devices, personal digital assistants (PDAs) and other electronic devices may share electronic files (106) with other users.
  • a first supplier such as a first user (104-1 )
  • a second supplier such as a second user (104-2)
  • a third supplier such as a third user (104-3) may supply different sets of electronic files (106-1 , 106-2, 106-3), respectively.
  • the supplying of the electronic files (106-1, 106-2, 106-3) is indicated by the solid lines (108-1, 108-2, 108-3).
  • (106-1 ) refers to electronic files provided by a first user (104-1 ).
  • elements without the identifier "-1 " refer to a generic instance of an element.
  • (106) refers to electronic files in general.
  • the electronic files (106) are shared via a network environment.
  • a network or network environment may be a network of a number of computers, an internet, an intranet, or the Internet.
  • the networking environment may also comprise a cloud network environment including, for example, a private cloud network, a public cloud network, or a hybrid cloud network, among others.
  • the networking environment may be a mobile network environment.
  • the networking environment may be a virtualized network environment.
  • the electronic files (106) refer to any electronic information that is to be shared with other users (104).
  • the electronic files (106) may be multimedia files such as photographic files, video files, and audio files among other types of multimedia files.
  • the electronic files may be textual documents.
  • the system (100) also includes a sharing electronic device (102) that allows for auto-grouping and auto-sharing of the electronic files (106) based on user-generated criteria. More specifically, the user-generated criteria allow a user to specify what electronic files (106) are to be shared. For example, a user, such as a first user (104-1 ), may generate criteria that indicate electronic files (106) that have a geographic similarity may be shared. In other words, electronic files (106) that were generated within a threshold distance from one another may be shared. As a specific example of a geographic user-generated criteria, a first user (104-1 ) generates a criteria that indicates electronic files (106) that were generated within "x" miles of one another may be shared.
  • a user such as a second user (104-2) generates criteria that indicate that electronic files (106) that have a temporal similarity may be shared.
  • a second user (104-2) generates criteria that indicate that electronic files (106) that were generated on the same date may be shared.
  • combinations of criteria may be generated. For example, a user may generate a criteria that indicates that electronic files (106) that were generated within "x" miles of one another and on the same day may be shared. While specific reference is made to geographic and temporal similarities, other similarities may exist upon which the user-generated criteria are based. Moreover, while specific reference is made to individual criteria and combined geographic-temporal criteria, any number and any combination of criteria may be implemented. Examples of other similarities are described below in connection with Fig. 3.
  • the sharing electronic device (102), using the user-generated criteria, analyzes metadata of the electronic files (106) to identify similar electronic files (106) and to group the similar electronic files (106) together.
  • metadata refers to information other than the content data that describes the electronic files (106).
  • content data of a photograph includes image data such as buildings, individuals, or other visual components
  • metadata includes geographic location identification information such as global positioning system (GPS) data and a time stamp in addition to other metadata. While specific reference is made to GPS data and a time stamp, other metadata may be used in the implementation of the present specification, examples of which are provided below in connection with Fig. 3.
  • the sharing electronic device (102) shares the similar electronic files (106) with specific users (104). Specifically, the sharing electronic device (102) shares the electronic files (106) that match the user-generated criteria with those users (104) that supplied the similar electronic files (106). For example, given the user-generated criteria that electronic files (106) generated within 10 miles of each other on the same day are similar and may be shared, the electronic device (102) analyzes the metadata of a first electronic file (106-1 ) to determine that a photograph was taken on August 6, 2015 at a location A. The electronic device (102) also analyzes the metadata of a second electronic file (106-2) to determine that a photograph was taken on August 6, 2015 at a location B that is less than 10 miles from location A. Still further, the electronic device (102) also analyzes the metadata of a third electronic file (106-3) to determine that a video was taken on August 4, 2015 at a location C that is more than 10 miles from location A and more than 10 miles from location B.
  • the electronic device (102) determines that the first electronic file (106-1) and the second electronic file (106-2) are similar based on the user-generated criteria of being within 10 miles from one another and having a time stamp of the same day. Accordingly, the electronic device (102) shares the first electronic file (106-1) and the second electronic file (106-2) with both the first user (104-1) and the second user (104-2), as indicated by the dashed lines (110-1, 110-2), respectively.
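  • For illustration only, a sketch of how this 10-miles-and-same-day comparison could be computed from GPS and time-stamp metadata; the haversine formula and the field names lat, lon, and taken_at are editorial assumptions.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS coordinates, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))  # mean Earth radius of roughly 3958.8 miles

def within_ten_miles_same_day(meta_a, meta_b, max_miles=10.0):
    """True when two files were generated within max_miles of each other on the same day."""
    close = haversine_miles(meta_a["lat"], meta_a["lon"],
                            meta_b["lat"], meta_b["lon"]) <= max_miles
    same_day = meta_a["taken_at"].date() == meta_b["taken_at"].date()
    return close and same_day

# Under these criteria, the August 6 photographs at locations A and B would be grouped,
# while the August 4 video at location C would not.
```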
  • the first user (104-1 ) can view both the first electronic file (106-1 ) submitted by the first user (104-1 ) as well as the second electronic file (106-2) supplied by the second user (104-2) on account of the similarity between the electronic files (106-1 , 106-2).
  • the second user (104-2) can view both the second electronic file (106- 2) submitted by the second user (104-2) as well as the first electronic file (106- 1 ) supplied by the first user (104-1 ) on account of the similarity between the electronic files (106-1 , 106-2).
  • Because the third electronic file (106-3) does not share a similarity with either the first electronic file (106-1) or the second electronic file (106-2), the third electronic file (106-3) is neither grouped nor shared, nor is the third user (104-3) granted access to view the first electronic file (106-1) and the second electronic file (106-2).
  • Fig. 2 is a diagram of an electronic device (102) for sharing similar electronic files (Fig. 1 , 106) based on user-generated criteria, according to one example of the principles described herein.
  • the electronic device (102) includes various hardware components. Specifically, the electronic device (102) includes a processor (212) and a number of engines (214, 216, 218). The engines (214, 216, 218) cause the processor (212) to execute the designated function of the engines (214, 216, 218).
  • the engines (214, 216, 218) refer to a combination of hardware and program instructions to perform a designated function.
  • the engines (214, 216, 218) may be hardware.
  • the engines (214, 216, 218) may be implemented in the form of electronic circuitry (e.g., hardware).
  • Each of the engines (214, 216, 218) may include its own processor, but one processor may be used by all the engines (214, 216, 218).
  • each of the engines (214, 216, 218) may include a processor and memory. Alternatively, one processor may execute the designated function of each of the engines (214, 216, 218).
  • the user-generated criteria engine (214) receives user-generated criteria.
  • the user-generated criteria define a similarity between electronic files (Fig. 1 , 106).
  • the user-generated criteria may be used to determine whether electronic files (Fig. 1 , 106) are similar.
  • the user-generated criteria may indicate that photographs are geographically similar if they are taken within 10 miles of one another.
  • the user-generated criteria may indicate that photographs are temporally similar if they are taken within 4 hours of one another.
  • the user- generated criteria indicate a relative similarity between multiple electronic files (Fig. 1 , 106).
  • the user-generated criteria engine (214) may provide a user interface wherein a user (Fig. 1, 104) may select from a number of candidate criteria. For example, radio buttons and drop-down menus may provide a user (Fig. 1, 104) with a number of candidate criteria that may be used in the generation of criteria. In these examples, after selecting candidate criteria, the user (Fig. 1, 104) may enter a threshold value for determining similarity. For example, after selecting a criteria of "distance", a user (Fig. 1, 104) selects a value of "10 miles" to be used by the compare engine (216) in determining similarity of electronic files (Fig. 1, 106).
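  • For illustration only, one way the selected candidate criteria and their user-entered thresholds could be represented once chosen through such an interface; the class and field names are editorial assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SelectedCriterion:
    """A candidate criterion chosen from the interface plus the user-entered threshold."""
    name: str          # e.g. "distance", "time", "white_balance"
    threshold: float   # e.g. 10 (miles) for "distance" or 4 (hours) for "time"
    unit: str = ""     # optional unit label shown in the interface

# Criteria as a user might assemble them from radio buttons and drop-down menus:
user_generated_criteria: List[SelectedCriterion] = [
    SelectedCriterion(name="distance", threshold=10.0, unit="miles"),
    SelectedCriterion(name="time", threshold=4.0, unit="hours"),
]
```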
  • a map may be presented to the user. Via an icon, the user may visually indicate the threshold distance and time used in determining similarity. For example, a user may manipulate an icon, such as a shape on a map to identify a threshold distance for determining geographic similarity. As a specific example, on the map a circle is generated around Paris. A user expands or contracts this circle to generate the geographic criteria for determining similarity. Similarly, temporal criteria may include a visual indicator on a user interface that may be dragged to expand or compress the temporal criteria for determining similarity.
  • the user-generated criteria engine (214) may also allow, via an interface, a user to select a criteria relating to white balance. For example, a user could indicate that photographs taken outside are to be shared. In this example, the compare engine (216) would compare white balance metadata associated with various photographs to determine similarity. This white balance criteria could be paired with other criteria such as geographic or temporal criteria.
  • the user-generated criteria may also indicate particular users to incorporate into the sharing of electronic files (Fig. 1 , 106). For example, within a family network, a young child may have taken photos that a parent may not desire to share. Accordingly, the user generated criteria may specify which users within a sharing network are to be included. In this example, electronic files from those specified users would be analyzed for similarity.
  • the user-generated criteria may be dynamic, meaning that they can change. For example, users (Fig. 1, 104) may generate their own criteria in addition to the candidate criteria stored in the user-generated criteria engine (214). Further, the criteria may be dynamic in that new candidate criteria may be added to the engine (214).
  • the compare engine (216) compares first metadata associated with a first electronic file (Fig. 1 , 106-1 ) and second metadata associated with a second electronic file (Fig. 1 , 106-2). Each electronic file (Fig. 1 , 106) is associated with a supplier. Specifically the first electronic file (Fig. 1 , 106-1 ) is received from a first supplier and the second electronic file (Fig. 1 , 106-2) is received from a second supplier. The compare engine (216) compares this metadata against the aforementioned user-generated criteria to determine whether the pertinent electronic files (Fig. 1 , 106) are similar.
  • the compare engine (216) analyzes the first metadata which includes GPS information for the first electronic file (Fig. 1 , 106-1 ) and the second metadata which includes GPS information for the second electronic file (Fig. 1 , 106-2) to determine whether the first electronic file (Fig. 1 , 106-1 ) and the second electronic file (Fig. 1 , 106- 2) are similar, i.e., they were taken within 10 miles of one another.
  • the compare engine (216) performs this operation for all electronic files (Fig. 1 , 106) uploaded to the network environment.
  • the metadata that the compare engine (216) analyzes may be distinct from the content data.
  • content data may be image data that includes visual information communicated via the photograph.
  • the visual information, i.e., faces, buildings, etc., is not used to determine similarity; rather, the metadata, i.e., the textual information describing the visual information, may be used to determine similarity between the photographs.
  • the share engine (218) shares electronic files (Fig. 1 , 106) that are determined to be similar with the users (Fig. 1 , 104) that submitted those similar electronic files (Fig. 1 , 106). For example, given that the first electronic file (Fig. 1 , 106-1 ) and the second electronic file (Fig. 1 , 106-2) are determined to be similar as defined by the user-generated criteria, the share engine (218) shares the first electronic file (Fig. 1 , 106-1 ) and the second electronic file (Fig. 1 , 106-2) with both the supplier of the first electronic file (Fig. 1 , 106-1 ) and the supplier of the second electronic file (Fig. 1 , 106-2). In other words, both suppliers of the similar electronic files (Fig. 1 , 106) can see both electronic files (Fig. 1 , 106).
  • the share engine (218) generates a distinct shared location, such as a shared folder, and copies and moves the electronic files (Fig. 1 , 106) from their original locations to the shared location.
  • the share engine (218) may generate a name for the location to identify its contents, i.e., the electronic files (Fig. 1, 106), based on their similarities. For example, a folder titled "User A's and User B's photos in Hawaii" may be created in a shared location and used to indicate that it contains photographs from User A and User B which had temporal similarity and geographic similarity.
  • the share engine (218) creates duplicate data on electronic devices associated with the corresponding users (Fig. 1, 104). For example, rather than copying and moving the electronic files (Fig. 1, 106), the share engine (218) creates a folder titled "User A's and User B's photos in Hawaii" which is placed on each of User A's and User B's actual devices or associated with User A's and User B's accounts.
  • the share engine (218) includes a database that includes pointers to the original electronic file (Fig. 1, 106) locations. For example, rather than moving the data or creating a new folder, the share engine (218) updates a database to indicate a location of the electronic files (Fig. 1, 106) and to indicate that both User A and User B have access to that folder.
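  • For illustration only, a sketch of the pointer-database variant of the share engine using SQLite: the files stay where they are, and a table records which users may access which original locations. The table name and layout are editorial assumptions, not a schema from the specification.

```python
import sqlite3

def init_share_db(path="shares.db"):
    """Create a table mapping each shared file's original location to users with access."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS shared_files (
                      file_location TEXT NOT NULL,
                      user_id       TEXT NOT NULL,
                      UNIQUE (file_location, user_id))""")
    return db

def share_files(db, file_locations, user_ids):
    """Record that every listed user may access every listed file, without moving the files."""
    db.executemany("INSERT OR IGNORE INTO shared_files VALUES (?, ?)",
                   [(loc, uid) for loc in file_locations for uid in user_ids])
    db.commit()

# e.g. share_files(db, ["/photos/a.jpg", "/photos/b.jpg"], ["user_a", "user_b"])
```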
  • While specific reference is made to creating folders and moving folders to certain electronic devices, the examples described herein could be implemented in a virtual environment, meaning users (Fig. 1, 104) have accounts and the similar electronic files (Fig. 1, 106) are virtually associated with a user (Fig. 1, 104).
  • the data is represented and sorted by something other than a file structure.
  • photographs from a trip to the Grand Canyon may be placed on a backup drive.
  • a database for all the electronic files (Fig. 1 , 106) is created.
  • the database holds references to the true location of the electronic files (Fig. 1 ,106) but does not move the files.
  • the content in the database can be sorted and virtualized based on the similarity criteria as described above.
  • Fig. 3 is a flowchart of a method (300) for sharing similar electronic files (Fig. 1 , 106) based on user-generated criteria, according to one example of the principles described herein.
  • the methods (300, 500) may be described below as being executed or performed by at least one device, for example, the sharing electronic device (Fig. 1 , 102).
  • Other suitable systems and/or computing devices may be used as well.
  • the methods (300, 500) may be implemented in the form of executable instructions stored on at least one machine-readable storage medium of the electronic device (Fig. 1 , 102) and executed by at least one processor of the electronic device (Fig. 1 , 102).
  • the methods (300, 500) may be implemented in the form of electronic circuitry (e.g., hardware). While Figs. 3 and 5 depict operations occurring in a particular order, a number of the operations of the methods (300, 500) may be executed concurrently or in a different order than shown in Figs. 3 and 5. In some examples, the methods (300, 500) may include more or fewer operations than are shown in Figs. 3 and 5. In some examples, a number of the operations of the methods (300, 500) may, at certain times, be ongoing and/or may repeat.
  • user-generated criteria that define a similarity between electronic files (Fig. 1 , 106) are received (block 301 ).
  • the user-generated criteria define different types of similarity.
  • the user-generated criteria may define geographic similarity meaning that electronic files (Fig. 1 , 106), such as multimedia files, were taken or generated close to one another, as defined by the criteria.
  • a user-generated criteria defines geographic similarity as electronic files (Fig. 1, 106) that were generated within 5 miles of one another.
  • the user-generated criteria define a temporal similarity, meaning that electronic files (Fig. 1, 106), such as multimedia files, were generated within a predetermined threshold, i.e., one hour, of one another.
  • any user-input threshold value may be used to determine geographic and temporal similarity.
  • a user may input a geographic threshold value of 10 miles as compared to the 5 miles previously mentioned.
  • a user may input a temporal similarity of "within 3 hours of one another" as opposed to the one hour described previously.
  • Other examples of similarities that may be defined by the user-generated criteria include a file attribute similarity.
  • a file attribute may indicate a content characteristic of a file.
  • an attribute of a photograph may indicate a component of the photograph such as white balance, exposure, aperture, light levels, etc.
  • Attribute similarity may indicate, for example, that a first photograph and a second photograph were both taken outside.
  • Another example of a similarity is a file format.
  • a file format indicates the form of an electronic file (Fig. 1 , 106).
  • the format may be a photograph, a video, a multimedia presentation, or a text document, among other file formats.
  • a specific example of another file format is an audio file that could similarly be uploaded and compared to determine similarity.
  • Other examples of user-generated criteria may indicate that electronic files (Fig. 1, 106) 1) were generated by the same device, 2) were generated by the same author, 3) were generated at a similar elevation, 4) were taken at the same velocity (i.e., indicating both photos were taken on a train between cities), or 5) have audio similarity. While specific reference is made to certain criteria, any number of criteria could be implemented in accordance with the present method (300).
  • the electronic files (Fig. 1, 106) that are similar are identified (block 302) by comparing metadata associated with the electronic files (Fig. 1, 106) against the user-generated criteria. For example, metadata for each electronic file (Fig. 1, 106) is analyzed to determine whether it satisfies the user-generated criteria. If it does, then these electronic files (Fig. 1, 106) are identified as similar electronic files (Fig. 1, 106) and they are grouped together. In a specific example, if metadata associated with a first electronic file (Fig. 1, 106-1) and metadata associated with a second electronic file (Fig. 1, 106-2) indicate that the first electronic file (Fig. 1, 106-1) and the second electronic file (Fig. 1, 106-2) are photographs that were taken within 5 miles of each other, and therefore satisfy a user-generated criteria defining geographic similarity as within 5 miles from each other, the first electronic file (Fig. 1, 106-1) and the second electronic file (Fig. 1, 106-2) are identified as similar and are grouped together. As described above, such comparison may be performed by the compare engine (Fig. 2, 216).
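  • For illustration only, a sketch of how identified files might be grouped: any two files whose metadata satisfies the user-generated criteria are placed in the same group, and groups bridged by a common match are merged. The specification does not mandate this particular grouping strategy.

```python
def group_similar(files, metadata, is_similar):
    """Group file ids whose metadata pairwise satisfies the user-generated criteria."""
    groups = []  # each group is a set of file ids
    for f in files:
        merged = None
        for group in groups:
            if any(is_similar(metadata[f], metadata[g]) for g in group):
                if merged is None:
                    group.add(f)       # join the first matching group
                    merged = group
                else:
                    merged |= group    # f bridges two groups; merge them into one
                    group.clear()
        groups = [g for g in groups if g]  # drop groups emptied by merging
        if merged is None:
            groups.append({f})             # no match: start a new group
    return groups

# e.g. group_similar(["a.jpg", "b.jpg", "c.mp4"], metadata, within_ten_miles_same_day)
# could yield [{"a.jpg", "b.jpg"}, {"c.mp4"}] for the earlier example.
```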
  • the metadata may include geographical information for the electronic file (Fig. 1 , 106).
  • geographical information may include global positioning system (GPS) information and longitude and latitude coordinates among other geographical information.
  • Other types of metadata that may be used in identifying (block 302) similar electronic files include time stamp information, authorship of the electronic file (Fig. 1, 106), electronic file attributes, and electronic file formats, as described above.
  • other examples of metadata include information indicating a source device, an author, a filename, elevation information, speed information, audio information, file attributes (such as white balance), and formats such as audio, video, or photograph. While specific reference is made to certain types of metadata, any number and any combination of metadata information may be used and compared against the user-generated criteria to determine similarity of electronic files (Fig. 1, 106).
  • identifying (block 302) similar electronic files may include analyzing and correlating the text of the documents, i.e., data mining to determine similarity.
  • identifying (block 302) similar electronic files may include using a facial recognition engine to recognize faces, correlate whether those faces show up in photographs having the same date and time, and identify them as coming from a shared experience.
  • the similar electronic files (Fig. 1, 106) are shared (block 303) with suppliers of the similar electronic files (Fig. 1, 106).
  • For example, given that the first electronic file (Fig. 1, 106-1) and the second electronic file (Fig. 1, 106-2) are deemed similar and the third electronic file (Fig. 1, 106-3) is deemed dissimilar from both, the first and second electronic files (Fig. 1, 106-1, 106-2) are shared with the first supplier and the second supplier.
  • sharing (block 303) of the similar electronic files (Fig. 1 , 106) may be exclusive to those suppliers of the similar electronic files (Fig. 1 , 106).
  • the third supplier can be excluded, or prevented from accessing both the first electronic file (Fig. 1 , 106-1 ) and the second electronic file (Fig. 1 , 106-2) on account of not having provided an electronic file (Fig. 1 , 106) similar to either.
  • sharing (block 303) of the similar electronic files (Fig. 1 , 106) may include creating a shared folder, duplicating the electronic files (Fig. 1 , 106) to devices or accounts associated with the respective users (Fig. 1 , 104), or virtually sharing the information via a database that includes pointers to the physical electronic files (Fig. 1 , 106).
  • Receiving (block 301 ) the user-generated criteria may be temporally separate from the operation to identify (block 302) the similar electronic files (Fig. 1 , 106) and sharing (block 303) the similar electronic files (Fig. 1 , 106).
  • in an initial setup phase, the electronic device (Fig. 1, 102) may receive the user-generated criteria.
  • at a later point in time, the electronic device (Fig. 1, 102) may receive the electronic files (Fig. 1, 106);
  • receiving (block 301) the user-generated criteria, identifying (block 302) similar electronic files (Fig. 1, 106), and sharing (block 303) the similar electronic files (Fig. 1, 106) may be performed after a single user action.
  • the user (Fig. 1 , 104) may input to the electronic device (Fig. 1 , 102) the user-generated criteria.
  • the user (Fig. 1, 104) may then not perform any action other than taking photographs in order to group and share the electronic files (Fig. 1, 106).
  • Such a method (300) may allow for simple auto-sharing of electronic files (Fig. 1, 106) with reduced user involvement, as the user involvement includes just setting the criteria and uploading electronic files (Fig. 1, 106), and avoids a tedious manual selection of users and photos to share. Doing so alleviates the need to manually indicate, for each file, whether to share that photograph and with whom to share it.
  • Fig. 4 is a diagram of an electronic device (102) for sharing similar electronic files (Fig. 1 , 106) based on user-generated criteria, according to another example of the principles described herein.
  • the electronic device (102) may include a processor (212), a user-generated criteria engine (214), a compare engine (216), and a share engine (218) as described above in connection with Fig. 2.
  • the electronic device (102) may also include a network engine (420) to determine whether the suppliers of the electronic files (Fig. 1, 106) are in a common sharing network. For example, many users (Fig. 1, 104) may share electronic files (Fig. 1, 106) across a network; however, a first user (Fig. 1, 104-1) and a third user (Fig. 1, 104-3) may not be in a common sharing network.
  • the network engine (420) can determine that the first user (Fig. 1, 104-1) and the third user (Fig. 1, 104-3) are not in a common sharing network and prevent the sharing of photographs between the users.
  • the network engine (420) may determine whether users are in a common sharing network by receiving user input or by analyzing a profile of the users to determine whether the two users are associated in any fashion or whether the users have previously indicated their affiliation or association.
  • the network engine (420) allows a user (Fig. 1 , 104) to specify, either previous to, or contemporaneously with the generation of user-generated criteria, who may have access to electronic files (Fig. 1 , 106) provided by that user (Fig. 1 , 104). Put another way, the network engine (420) allows for selection of a subset of users of a network environment with whom the user (Fig. 1 , 104) is willing to share electronic files (Fig. 1 , 106). Doing so may increase the efficiency of the compare engine (216) and may prevent unwanted entities from viewing or sharing electronic files (Fig. 1 , 106) with the user.
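  • For illustration only, a small sketch of such a common-sharing-network check: each user's profile lists the networks the user belongs to, and two suppliers may share only when those sets intersect. The profile structure is an editorial assumption.

```python
from typing import Dict, Set

def in_common_sharing_network(profiles: Dict[str, Set[str]],
                              user_a: str, user_b: str) -> bool:
    """True when the two users belong to at least one sharing network in common."""
    return bool(profiles.get(user_a, set()) & profiles.get(user_b, set()))

# Example: the first and second users share the "family" network, the third user does not,
# so files would never be auto-shared between the first user and the third user.
profiles = {
    "user_1": {"family", "friends"},
    "user_2": {"family"},
    "user_3": {"work"},
}
```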
  • the system (102) also may include an analytics engine (422) to generate a prompt to query for user-generated criteria.
  • the analytics engine (422) may analyze metadata of received electronic files (Fig. 1 , 106) for indicia of potential similarity with other electronic files (Fig. 1 , 106).
  • a specific example is given as follows.
  • the first user (Fig. 1 , 104- 1 ) uploads a photograph taken in Hawaii, as identified by metadata of the photograph.
  • the analytics engine (422) analyzes the metadata of this photograph and other photographs and determines that another user (Fig. 1, 104-2) in the first user's sharing network has also uploaded a photograph taken in Hawaii around the same time. Accordingly, the analytics engine (422) prompts the first user (Fig. 1, 104-1), for example to ask whether these photographs should be grouped and shared.
  • the analytics engine (422) may be a learning engine to ascertain additional user-generated criteria for grouping and sharing electronic files (Fig. 1 , 106).
  • the analytics engine (422) finds photographs from a first supplier and select photographs from a second supplier that the compare engine (216) has identified as similar. The analytics engine (422) then examines the metadata of these similar photographs and identifies other photos that seem to be similar to the similar photographs.
  • the analytics engine (422) may allow for adjustment of the user-generated criteria. For example, assume 100 photos were taken by two different suppliers on a shared vacation. Of those photos, one lies outside the +/- 2 day range but is taken in a similar location. The analytics engine (422) could then query, "We found a candidate for this collection. Confirm or eliminate from this collection?" If the user elects to confirm this photograph in the collection, the user has extended the user-generated criteria. In this example, the statistics that inform the query may be guided by frequency. For example, if many photographs are taken in a given area the sensitivity will go down, and if few photographs are taken in a given area or time (or whatever criteria) the sensitivity for outliers will be increased.
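  • For illustration only, a rough sketch of such a frequency-guided query: a photo outside the temporal range but in a similar location is surfaced as a confirmation candidate, and only when few photos were taken in that area (higher outlier sensitivity). The thresholds and helper functions here are editorial assumptions.

```python
def outlier_candidates(collection, uploads, is_similar_location, days_outside_range,
                       sparse_limit=5):
    """Propose photos that miss the temporal criteria but sit in a similar location,
    surfacing them only when the area is sparsely photographed (higher sensitivity)."""
    candidates = []
    for photo in uploads:
        if photo in collection:
            continue
        nearby = [p for p in collection if is_similar_location(photo, p)]
        sparse_area = len(nearby) <= sparse_limit   # few photos here -> raise sensitivity
        if nearby and days_outside_range(photo) > 0 and sparse_area:
            # Would prompt: "We found a candidate for this collection.
            # Confirm or eliminate from this collection?"
            candidates.append(photo)
    return candidates
```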
  • the analytics engine (422) may be used to further adjust the user-generated criteria. As another example, if a user often takes a number of photos over a short period and then usage falls off, the analytics engine (422) may trigger a confirmation message to help support or supplement the logic already selected by a user.
  • FIG. 5 is a flowchart of a method (500) for sharing similar electronic files (Fig. 1 , 106) based on user-generated criteria, according to another example of the principles described herein.
  • user-generated criteria that define a similarity between electronic files (Fig. 1 , 106) are received (block 501 ). This may be performed as described above in connection with Fig. 3.
  • First metadata for a first electronic file (Fig. 1 , 106-1 ) - which first electronic file (Fig. 1 , 106-1 ) is received from a first supplier - is analyzed (block 502).
  • metadata for an electronic file (Fig. 1 , 106) indicates characteristics such as an origin location, an origin time, a file format, a file type, file attributes, or combinations thereof.
  • the electronic device (Fig. 1 , 102) may acquire and parse the metadata to be used in comparison with other metadata.
  • second metadata for a second electronic file (Fig. 1 , 106-2) - which second electronic file (Fig. 1 , 106-2) is received from a second supplier - is analyzed (block 503).
  • Both sets of metadata are compared (block 504) against the user-generated criteria to determine their similarity.
  • metadata for the first electronic file (Fig. 1 , 106-1 ) may indicate it was a photograph taken on a particular day at a particular location.
  • Metadata for the second electronic file (Fig. 1 , 106-2) may indicate it was also a photograph taken on the particular day at the particular location.
  • the system (Fig. 1 , 102) may determine their similarity (block 505) and group the electronic files (Fig. 1 ,106) associated with the first and second metadata together. This may be performed as described above in connection with Fig. 3.
  • the similar electronic files (Fig. 1 , 106) are then shared (block 506) to the suppliers of those similar electronic files (Fig. 1 , 106).
  • the supplier of the first electronic file (Fig. 1 , 106-1 ) and the supplier of the second electronic file (Fig. 1 , 106-2) may view both the first electronic file (Fig. 1 , 106-1 ) and the second electronic file (Fig. 1 , 106-2). Doing so allows for a simple mechanism for determining which electronic files (Fig. 1 , 106) to share and auto-sharing those electronic files (Fig. 1 , 106) at some point in time removed from when the criteria for sharing are generated. As such, simple time-minimal sharing of electronic files (Fig. 1 , 106) is promoted by the systems and methods described herein.
  • Fig. 6 is a diagram of an electronic device (102) for sharing electronic files (Fig. 1, 106) based on user-generated criteria, according to another example of the principles described herein.
  • the electronic device (102) includes a processor (212) and a machine-readable storage medium (624).
  • the instructions may be distributed (e.g., stored) across multiple machine-readable storage mediums and the instructions may be distributed (e.g., executed) across multiple processors.
  • the processor (212) may include at least one processor and other resources used to process programmed instructions.
  • the processor (212) may be a number of central processing units (CPUs), microprocessors, and/or other hardware devices suitable for retrieval and execution of instructions stored in machine-readable storage medium (624).
  • the processor (212) may fetch, decode, and execute instructions (626, 628, 630, 632).
  • the processor (212) may include a number of electronic circuits comprising a number of electronic components for performing the functionality of a number of the instructions in the machine-readable storage medium (624).
  • the machine-readable storage medium (624) represents generally any memory capable of storing data such as programmed instructions or data structures used by the electronic device (102).
  • the machine-readable storage medium (624) includes a machine readable storage medium that contains machine readable program code to cause tasks to be executed by the processor (212).
  • the machine-readable storage medium (624) may be a tangible and/or non-transitory storage medium.
  • the machine-readable storage medium (624) may be any appropriate storage medium that is not a transmission storage medium.
  • the machine-readable storage medium (624) may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions.
  • machine-readable storage medium (624) may be, for example, Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like.
  • the machine-readable storage medium (624) may be disposed within the electronic device (102), as shown in Fig. 6. In this situation, the executable instructions may be "installed" on the electronic device (102).
  • the machine-readable storage medium (624) may be a portable, external or remote storage medium, for example, that allows the electronic device (102) to download the instructions from the portable/external/remote storage medium.
  • the executable instructions may be part of an "installation package".
  • the machine-readable storage medium (624) may be encoded with executable instructions.
  • user-generated criteria instructions (626), when executed by a processor (212), cause the electronic device (102) to receive user-generated criteria to define a similarity between electronic files (Fig. 1, 106).
  • Metadata analysis instructions (628), when executed by a processor (212), cause the electronic device (102) to analyze first metadata for a first electronic file (Fig. 1, 106-1) received from a first supplier and to analyze second metadata for a second electronic file (Fig. 1, 106-2) received from a second supplier.
  • Similarity instructions (630), when executed by a processor (212), cause the electronic device (102) to determine whether the first electronic file (Fig. 1, 106-1) and the second electronic file (Fig. 1, 106-2) have a similarity by comparing the first metadata and the second metadata against the user-generated criteria.
  • Joint access instructions (632), when executed by a processor (212), cause the electronic device (102) to allow joint access to the first electronic file (Fig. 1, 106-1) and the second electronic file (Fig. 1, 106-2) by the first supplier and the second supplier when the first electronic file (Fig. 1, 106-1) and the second electronic file (Fig. 1, 106-2) are similar.
  • Allowing joint access may include generating a shared folder to hold the first electronic file (Fig. 1 , 106-1 ) and the second electronic file (Fig. 1 , 106-2), the shared folder being accessible to both the first supplier and the second supplier, and in some cases accessible by just the first supplier and the second supplier.
  • Allowing joint access includes generating, or adding to, a database that includes pointers to original locations of the first electronic file (Fig. 1 , 106-1 ) and the second electronic file (Fig. 1 , 106-2).
  • the processor (212) and machine-readable storage medium (624) are located within the same physical component, such as a server, or a network component.
  • the machine-readable storage medium (624) may be part of the physical component's main memory, caches, registers, non-volatile memory, or elsewhere in the physical component's memory hierarchy.
  • the machine-readable storage medium (624) may be in communication with the processor (212) over a network.
  • the electronic device (102) may be implemented on a user device, on a server, on a collection of servers, or combinations thereof.
  • the electronic device (102) of Fig. 6 may be part of a general purpose computer. However, in alternative examples, the electronic device (102) is part of an application-specific integrated circuit (ASIC).
  • Certain examples of the present disclosure are directed to a system and device for grouping and sharing electronic files that 1) simplifies the process of sharing of electronic files; 2) reduces user-involvement in providing and sharing the electronic files; and 3) provides flexibility in generating a file-sharing policy.
  • the devices and systems disclosed herein may prove useful in addressing other deficiencies in a number of technical areas. Therefore the systems and devices disclosed herein should not be construed as addressing just the particular elements or deficiencies discussed herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Library & Information Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

An example in accordance with the present disclosure relates to a method. In the method, user-generated criteria that define a similarity between electronic files are received. Electronic files that are similar are identified by comparing metadata associated with the electronic files against the user-generated criteria. The similar electronic files are shared with suppliers of the similar electronic files.
PCT/US2015/047195 2015-08-27 2015-08-27 Sharing similar electronic files based on user-generated criteria WO2017034576A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2015/047195 WO2017034576A1 (fr) 2015-08-27 2015-08-27 Sharing similar electronic files based on user-generated criteria

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2015/047195 WO2017034576A1 (fr) 2015-08-27 2015-08-27 Sharing similar electronic files based on user-generated criteria

Publications (1)

Publication Number Publication Date
WO2017034576A1 true WO2017034576A1 (fr) 2017-03-02

Family

ID=58101172

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/047195 WO2017034576A1 (fr) 2015-08-27 2015-08-27 Partage de fichiers électroniques similaires sur la base de critères générés par un utilisateur

Country Status (1)

Country Link
WO (1) WO2017034576A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070255785A1 (en) * 2006-04-28 2007-11-01 Yahoo! Inc. Multimedia sharing in social networks for mobile devices
US20130222369A1 (en) * 2012-02-23 2013-08-29 Charles D. Huston System and Method for Creating an Environment and for Sharing a Location Based Experience in an Environment
US20140208235A1 (en) * 2000-10-10 2014-07-24 Addnclick, Inc. Dynamic information management system and method for content delivery and sharing in content-, metadata- & viewer-based, live social networking among users concurrently engaged in the same and/or similar content
US20140365432A1 (en) * 2013-06-10 2014-12-11 Dropbox, Inc. Dropsite for shared content
US20150039616A1 (en) * 2013-08-02 2015-02-05 Shoto, Inc. Discovery and sharing of photos between devices



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15902440

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15902440

Country of ref document: EP

Kind code of ref document: A1