US20150302249A1 - System, Method, and Computer Program Product for Detecting Unwanted Data Based on an Analysis of an Icon


Info

Publication number
US20150302249A1
Authority
US
United States
Prior art keywords
icon
image
program product
computer program
analysis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/679,864
Inventor
Alexander James Hinchliffe
Oliver Georges Devane
Lee Codel Lawson Tarbotton
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
McAfee LLC
Original Assignee
McAfee LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by McAfee LLC filed Critical McAfee LLC
Priority to US14/679,864
Publication of US20150302249A1
Assigned to MCAFEE, LLC reassignment MCAFEE, LLC CHANGE OF NAME AND ENTITY CONVERSION Assignors: MCAFEE, INC.
Assigned to MORGAN STANLEY SENIOR FUNDING, INC. reassignment MORGAN STANLEY SENIOR FUNDING, INC. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MCAFEE, LLC
Assigned to JPMORGAN CHASE BANK, N.A. reassignment JPMORGAN CHASE BANK, N.A. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MCAFEE, LLC
Assigned to JPMORGAN CHASE BANK, N.A. reassignment JPMORGAN CHASE BANK, N.A. CORRECTIVE ASSIGNMENT TO CORRECT THE REMOVE PATENT 6336186 PREVIOUSLY RECORDED ON REEL 045055 FRAME 786. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST. Assignors: MCAFEE, LLC
Assigned to MORGAN STANLEY SENIOR FUNDING, INC. reassignment MORGAN STANLEY SENIOR FUNDING, INC. CORRECTIVE ASSIGNMENT TO CORRECT THE REMOVE PATENT 6336186 PREVIOUSLY RECORDED ON REEL 045056 FRAME 0676. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST. Assignors: MCAFEE, LLC
Assigned to MCAFEE, LLC reassignment MCAFEE, LLC RELEASE OF INTELLECTUAL PROPERTY COLLATERAL - REEL/FRAME 045055/0786 Assignors: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT
Assigned to MCAFEE, LLC reassignment MCAFEE, LLC RELEASE OF INTELLECTUAL PROPERTY COLLATERAL - REEL/FRAME 045056/0676 Assignors: MORGAN STANLEY SENIOR FUNDING, INC., AS COLLATERAL AGENT


Classifications

    • G06K9/00496
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/51Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems at application loading time, e.g. accepting, rejecting, starting or inhibiting executable software based on integrity or source reliability
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/554Detecting local intrusion or implementing counter-measures involving event detection and direct action
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00Aspects of pattern recognition specially adapted for signal processing

Definitions

  • the present invention relates to detecting unwanted data, and more particularly to systems for detecting unwanted data.
  • a system, method, and computer program product are provided for detecting unwanted data based on an analysis of an icon.
  • an icon is analyzed. Furthermore, unwanted data is detected based on the analysis.
  • FIG. 1 illustrates a network architecture, in accordance with one embodiment.
  • FIG. 2 shows a representative hardware environment that may be associated with the servers and/or clients of FIG. 1 , in accordance with one embodiment.
  • FIG. 3 shows a method for detecting unwanted data based on an analysis of an icon, in accordance with one embodiment.
  • FIG. 4 shows a method for detecting unwanted icons, in accordance with another embodiment.
  • FIG. 1 illustrates a network architecture 100 , in accordance with one embodiment.
  • a plurality of networks 102 is provided.
  • the networks 102 may each take any form including, but not limited to a local area network (LAN), a wireless network, a wide area network (WAN) such as the Internet, peer-to-peer network, etc.
  • servers 104 which are capable of communicating over the networks 102 .
  • clients 106 are also coupled to the networks 102 and the servers 104 .
  • Such servers 104 and/or clients 106 may each include a desktop computer, lap-top computer, hand-held computer, mobile phone, personal digital assistant (PDA), peripheral (e.g. printer, etc.), any component of a computer, and/or any other type of logic.
  • at least one gateway 108 is optionally coupled therebetween.
  • FIG. 2 shows a representative hardware environment that may be associated with the servers 104 and/or clients 106 of FIG. 1 , in accordance with one embodiment.
  • Such figure illustrates a typical hardware configuration of a workstation in accordance with one embodiment having a central processing unit 210 , such as a microprocessor, and a number of other units interconnected via a system bus 212 .
  • the workstation shown in FIG. 2 includes a Random Access Memory (RAM) 214 , Read Only Memory (ROM) 216 , an I/O adapter 218 for connecting peripheral devices such as disk storage units 220 to the bus 212 , a user interface adapter 222 for connecting a keyboard 224 , a mouse 226 , a speaker 228 , a microphone 232 , and/or other user interface devices such as a touch screen (not shown) to the bus 212 , communication adapter 234 for connecting the workstation to a communication network 235 (e.g., a data processing network) and a display adapter 236 for connecting the bus 212 to a display device 238 .
  • the workstation may have resident thereon any desired operating system. It will be appreciated that an embodiment may also be implemented on platforms and operating systems other than those mentioned.
  • One embodiment may be written using JAVA, C, and/or C++ language, or other programming languages, along with an object oriented programming methodology.
  • Object oriented programming (OOP) has become increasingly used to develop complex applications.
  • FIG. 3 shows a method 300 for detecting unwanted data based on an analysis of an icon, in accordance with one embodiment.
  • the method 300 may be carried out in the context of the architecture and environment of FIGS. 1 and/or 2 . Of course, however, the method 300 may be carried out in any desired environment.
  • an icon is analyzed.
  • the icon may include any type of image.
  • the icon may include an image of a file, a folder, etc.
  • the icon may be associated with an application.
  • Such application may include an executable application, in one embodiment. To this end, selecting the icon may result in execution of the application.
  • the icon may be selected by a user clicking on the icon.
  • analyzing the icon may include generating a hash of the icon.
  • a checksum, signature, etc. of the icon may be generated in other embodiments.
  • the analysis may include comparing the hash to a white list.
  • such white list may optionally include a list of known wanted icons (e.g. icons predetermined to be associated with wanted, non-malicious, etc. applications).
  • the analysis of the icon may further include determining whether an application associated with the hash is valid.
  • the application associated with the hash may be determined to be valid if the application includes one which is predetermined to be valid for the hash. For example, each hash in the white list may be predetermined to be associated with a particular application (e.g. an application determined to be wanted, non-malicious, legitimate, etc.). Thus, the application associated with the hash may be compared to an application predetermined to be valid for such hash.
  • the application associated with the hash may be determined to be valid if a characteristic of the application includes one which is predetermined to indicate that the application is valid.
  • a characteristic of the application may include the application not being packed (e.g. encrypted, compressed, etc.).
  • the characteristic may include the application containing multiple different icons. Thus, if it is determined that the application associated with the hash contains multiple different icons, it may optionally be determined that the application is valid. As another option, if it is determined that the application associated with the hash does not contain multiple different icons (e.g. contains only the icon from which the hash is generated), it may be determined that the application is not valid.
  • the characteristic may include any desired aspect capable of being predetermined to indicate whether the application associated with the hash is valid.
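The hash-and-white-list comparison described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the SHA-256 choice, the `WHITE_LIST` contents, and all names are assumptions; the patent only requires some hash, checksum, or signature of the icon mapped to an application predetermined to be valid for it.

```python
import hashlib

# Hypothetical white list: icon hash -> application predetermined
# to be valid for that hash (illustrative entries only).
WHITE_LIST = {
    hashlib.sha256(b"notepad-icon").hexdigest(): "notepad.exe",
}

def analyze_icon(icon_bytes, associated_app):
    """Return 'valid', 'invalid', or 'unknown' for an icon/application pair."""
    icon_hash = hashlib.sha256(icon_bytes).hexdigest()
    if icon_hash not in WHITE_LIST:
        return "unknown"  # not a known wanted icon; other analyses may apply
    # Hash is white-listed: check that the application using the icon is
    # the one predetermined to be valid for this hash.
    if WHITE_LIST[icon_hash] == associated_app:
        return "valid"
    return "invalid"  # a known icon used by an unexpected application
```

A white-listed icon carried by an unexpected application (e.g. `analyze_icon(b"notepad-icon", "dropper.exe")`) is flagged as invalid, which is the deceptive-reuse case the patent targets.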
  • analysis may include comparing the hash to a black list.
  • the black list may include a list of known unwanted icons.
  • known unwanted icons may include icons predetermined to be associated with unwanted, malicious, etc. applications, for example.
  • the analysis may include a heuristic analysis of at least one aspect associated with the icon.
  • the heuristic analysis may only be performed if it is determined that the hash of the icon is included in the white list.
  • the heuristic analysis may include any analysis of the icon that is based on a context of the aspect associated with the icon.
  • the aspect associated with the icon may include a file in which the icon resides.
  • the heuristic analysis may include determining whether the file in which the icon resides is valid (e.g. is predetermined to be wanted, non-malicious etc.).
  • the heuristic analysis may include determining whether a name of the file, an extension of the file, etc. in which the icon resides indicates that the file is valid (e.g. whether the aspect associated with the icon is predetermined to be associated with a valid file, etc.).
  • the aspect associated with the icon may include version information corresponding with an application associated with the icon.
  • the heuristic analysis may optionally include determining whether the application associated with the icon is valid by determining whether the application includes version information for such application. For example, if the application includes version information, it may be determined that such application is valid.
  • the aspect associated with the icon may include a language in which the application associated with the icon is written.
  • the heuristic analysis may include determining whether the language in which the application associated with the icon is written indicates that the application is valid. In one embodiment, if the language in which the application is written is predetermined to be a language in which known wanted applications are written, it may be determined that the application is valid. In another embodiment, if the language in which the application is written is not predetermined to be a language in which known wanted applications are written (e.g. is predetermined to be a language in which known unwanted applications are written), it may be determined that the application is not valid.
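The heuristic aspects listed above (file name/extension, version information, implementation language) might be combined along these lines. The indicator lists and the threshold are illustrative assumptions; the patent does not specify how the aspects are weighted.

```python
def heuristic_score(file_name, has_version_info, language):
    """Count heuristic indicators that the file carrying the icon is valid.

    All indicator sets below are assumed examples, not values from the patent.
    """
    indicators = 0
    # Extension predetermined to be associated with valid files.
    if file_name.lower().endswith((".exe", ".dll")):
        indicators += 1
    # Presence of version information suggests a legitimate application.
    if has_version_info:
        indicators += 1
    # Language predetermined to be one in which known wanted apps are written.
    if language in {"C", "C++", "Java"}:
        indicators += 1
    return indicators

def is_valid(file_name, has_version_info, language, threshold=2):
    """Treat the file as valid when enough indicators are present."""
    return heuristic_score(file_name, has_version_info, language) >= threshold
```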
  • the analysis of the icon may involve pixels used to render the icon.
  • the pixels may be compared with at least one set of pixels associated with an icon predetermined to be wanted, predetermined to be associated with a wanted application, etc.
  • the analysis may include an image recognition involving the icon.
  • image recognition may compare the icon to at least one predetermined image (e.g. image predetermined to be associated with an icon used by a wanted application, etc.) for determining whether the icon includes the predetermined image.
  • the image recognition may include determining whether (and optionally an extent to which) the pixels used to render the icon vary from the predetermined image, such as a color of the pixels, a number of the pixels, etc.
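A simple way to measure how far an icon's pixels vary from a predetermined image is a per-pixel difference ratio. This sketch assumes icons are available as flat lists of RGB tuples and picks an arbitrary 5% tolerance; both are assumptions for illustration.

```python
def pixel_difference(icon_pixels, reference_pixels):
    """Fraction of pixels differing between an icon and a reference image."""
    if len(icon_pixels) != len(reference_pixels):
        return 1.0  # different pixel counts: treat as maximally different
    differing = sum(1 for a, b in zip(icon_pixels, reference_pixels) if a != b)
    return differing / len(icon_pixels)

def matches_known_icon(icon_pixels, reference_pixels, tolerance=0.05):
    """True if the icon varies from the predetermined image by at most `tolerance`."""
    return pixel_difference(icon_pixels, reference_pixels) <= tolerance
```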
  • the analysis of the icon may be performed at any desired time.
  • the icon may be analyzed on-access.
  • the icon may be analyzed in response to a detection of a selection of the icon (e.g. by a user, etc.).
  • the icon may be analyzed on-demand.
  • the icon may be analyzed independent of a selection of the icon.
  • the icon may be analyzed based on a scheduled analysis.
  • unwanted data is detected based on the analysis.
  • the unwanted data may include malware, spyware, adware, etc.
  • the unwanted data may include any data determined to be unwanted.
  • the unwanted data may include the application associated with the icon, the icon itself, and/or any other data associated with the icon.
  • the unwanted data may be detected if it is determined that the icon is included in the white list but is associated with an invalid application.
  • it may be determined that the icon corresponds to unwanted data if the application associated with the icon does not include one which is predetermined to be valid for the hash of the icon.
  • the unwanted data may be detected if it is determined that the icon is included in the black list.
  • the unwanted data may be detected if the heuristic analysis indicates that the aspect associated with the icon indicates that the icon is associated with unwanted data. For example, if it is determined that the file in which the icon resides is not valid, based on the heuristic analysis, the unwanted data may be detected. As another example, if it is determined that the language in which the application associated with the icon is written indicates that the application is not valid, based on the heuristic analysis, the unwanted data may be detected.
  • if the image recognition involving the icon results in a determination that the icon does not include a predetermined image, that the pixels used to render the icon vary from a predetermined image, etc., the unwanted data may be detected.
  • unwanted data may be detected based on an analysis of an icon. For example, use of the icon by unwanted data for executing such unwanted data may be detected. Such use of the icon may be detected if the icon is also associated with a valid application (e.g. wanted data), and also if the icon is unique to (e.g. only associated with) the unwanted data, such as if the icon is created specifically for the unwanted data.
  • FIG. 4 shows a method 400 for detecting unwanted icons, in accordance with another embodiment.
  • the method 400 may be carried out in the context of the architecture and environment of FIGS. 1-3 .
  • the method 400 may be carried out in any desired environment. It should also be noted that the aforementioned definitions may apply during the present description.
  • an icon is identified.
  • the icon may be identified based on a scan for icons. For example, a device storing the icon, a system (e.g. operating system) employing the icon, etc. may be scanned for identifying the icon.
  • the icon may be identified based on an on-access scan.
  • the icon may be identified in response to a selection of the icon.
  • the icon may be identified based on an on-demand scan, such that the icon may be identified independent of a selection thereof.
  • the icon is analyzed utilizing signatures. For example, a hash of the icon may be generated. In one embodiment, the hash may be compared to a black list of hashes associated with known unwanted icons (e.g. icons predetermined to be associated with unwanted applications). Thus, if the hash is included in the black list, it may optionally be determined that the icon is unwanted.
  • the hash may be compared to a white list of hashes associated with known wanted icons (e.g. icons predetermined to be associated with wanted applications). If the hash is determined to be included in the white list, the icon may be identified as a type of icon which is associated with a wanted application. However, it may further be determined whether the particular application associated with the icon (e.g. the application executed via the icon, etc.) is a wanted application.
  • At least one characteristic of the application associated with the icon may be identified for determining whether such characteristic is indicative of the application being a wanted application. Just by way of example, if the application is packed, it may be determined that the application is unwanted. If the application is not packed, it may be determined that the application is wanted.
  • if the application includes multiple different icons, it may be determined that the application is wanted (e.g. as multiple icons may indicate that the application is a legitimate, non-malicious application and a single icon included in the application may indicate that the icon is utilized for deceptive purposes, such as for mischaracterizing an unwanted application as a wanted application). If the application does not include multiple different icons, it may be determined that the application is unwanted. In this way, if the icon is associated with multiple different applications, it may be determined whether any of such applications are unwanted.
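The two application characteristics just described (packing, and the number of icons the application carries) reduce to a small predicate. The function name and the single-icon cutoff are assumptions; in practice the icon count would come from parsing the executable's resources.

```python
def application_is_wanted(is_packed, icon_count):
    """Apply the characteristics above: a packed application, or one
    carrying only a single icon, is treated as unwanted (simplified sketch)."""
    if is_packed:
        return False  # packed (encrypted/compressed) apps are treated as unwanted
    if icon_count < 2:
        return False  # a lone icon may indicate deceptive reuse of a known icon
    return True
```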
  • it is determined whether the icon is unwanted, as shown in decision 406 .
  • the icon may be determined to be unwanted if the hash of the icon is included in the black list. In another embodiment, it may be determined that the icon is unwanted if the hash of the icon is included in the white list and if it is determined that the particular application associated with the icon is not a wanted application.
  • if it is not determined that the icon is unwanted in decision 406 , the icon is analyzed utilizing heuristics. Note operation 408 .
  • the heuristics may be utilized for identifying whether the icon is unwanted when the icon cannot be determined to be predetermined to be unwanted (e.g. based on the black list).
  • the heuristics may be utilized for identifying whether the icon is unwanted when the hash of the icon is included in the white list, but it is unknown whether the application associated with the icon is unwanted.
  • a file containing the icon may be analyzed for determining whether such file is predetermined to be wanted. Such analysis may include determining whether a name of the file is indicative of a file predetermined to be wanted. For example, if the file is not predetermined to be wanted, it may be determined that the icon is unwanted.
  • the analysis of the file may include determining whether a file extension of the file is indicative of a file predetermined to be wanted.
  • the icon may be predetermined to be of a type that is associated with a file of a particular type (e.g. as identified by the file extension). To this end, if the file is not predetermined to be wanted, it may be determined that the icon is unwanted.
  • the analysis of the file may include determining whether the file is stored in a predetermined location (e.g. a predetermined folder). Just by way of example, it may be determined whether the file is stored in a predetermined folder that is indicative of the file being wanted. If the file is not stored in the predetermined location, it may be determined that the icon is unwanted.
  • the analysis of the file may include determining whether the file is written in a predetermined language (e.g. high-level language).
  • Such predetermined language may include a language in which wanted files are predetermined to be written. If the file is not written in the predetermined language, it may be determined that the icon is unwanted.
  • the analysis of the file may include determining whether the file includes version information (e.g. information indicating a version of the data stored in the file, etc.). For example, it may be determined whether the file includes version information for the file in resources of the file. If it is determined that the file does not include version information, it may be determined that the icon is unwanted.
  • it may be determined whether the icon is unwanted, as shown in decision 410 . If it is not determined that the icon is unwanted, the icon is analyzed utilizing pixels of the icon. Note operation 412 .
  • the pixels of the icon may be analyzed in any manner capable of being utilized to determine whether the icon is unwanted.
  • the pixels of the icon may be analyzed via pixel identification.
  • a pattern of pixels of the icon may be identified and compared to sets of pixels of icons predetermined to be wanted. Thus, if the pattern of pixels of the icon matches one of such sets of pixels, it may be determined that the icon is wanted. Of course, as another option, if the pattern of pixels of the icon matches one of such sets of pixels, it may further be determined whether the particular application associated with the icon is a wanted application, as described above with respect to operation 404 . If, however, the pattern of pixels of the icon does not match one of such sets of pixels, it may be determined that the icon is unwanted.
  • the pixels of the icon may be analyzed via image recognition.
  • the image recognition may utilize the pixels of the icon to identify image characteristics of the icon.
  • the characteristics may include a number of pixels of a particular color that exist in the icon (e.g. for determining whether the icon is of a folder).
  • a characteristic of an icon may be identified even when pixels of the icon vary (e.g. up to a threshold amount).
  • Such characteristics may further be compared to characteristics predetermined to be associated with wanted icons. If the characteristics of the icon match the characteristics predetermined to be associated with wanted icons, it may be determined that the icon is wanted. Optionally, if the characteristics of the icon match the characteristics predetermined to be associated with wanted icons, it may further be determined whether the particular application associated with the icon is a wanted application, as described above with respect to operation 404 . If, however, the characteristics of the icon do not match the characteristics predetermined to be associated with wanted icons, it may be determined that the icon is unwanted.
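One concrete image characteristic mentioned above is the number of pixels of a particular color (e.g. for recognizing a folder icon). A tolerance-based comparison of that characteristic might look like this; the pixel representation, the color values, and the threshold are all assumed for illustration.

```python
def color_count(pixels, color):
    """Number of pixels of a particular color in an icon (flat list of RGB tuples)."""
    return sum(1 for p in pixels if p == color)

def characteristic_matches(pixels, wanted_color, expected_count, threshold=10):
    """True if the icon's count of `wanted_color` pixels is within `threshold`
    of the count predetermined for a wanted icon, so that pixels may vary
    up to a threshold amount without breaking the match."""
    return abs(color_count(pixels, wanted_color) - expected_count) <= threshold
```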
  • the unwanted icon is detected. Note operation 416 . Accordingly, a detection of an unwanted icon may be made.
  • reaction to the detection of the unwanted icon is performed, as shown in operation 418 .
  • the reaction may include blocking access to the unwanted icon.
  • Such access may include selection of the icon, for example.
  • the reaction may include deleting the icon. In yet another embodiment, the reaction may include removing any data associated with the icon, such as an application associated with the icon. In yet another embodiment, the icon and/or data associated therewith may be quarantined. In still yet other embodiments, detection of the unwanted icon may be logged, a user may be notified, etc.
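Method 400 as a whole (signatures in operation 402/404, heuristics in 408, pixel analysis in 412, detection in 416, reaction in 418) can be sketched as a staged pipeline. The callable-per-stage structure is an assumption made to keep the sketch compact.

```python
def method_400(icon, signature_check, heuristic_check, pixel_check, react):
    """Staged sketch of method 400: signatures, then heuristics, then pixel
    analysis. Each *_check is a callable returning True when its analysis
    determines the icon is unwanted; `react` performs the chosen reaction
    (block, delete, quarantine, log, notify, etc.)."""
    for check in (signature_check, heuristic_check, pixel_check):
        if check(icon):
            react(icon)   # operation 418: react to the unwanted icon
            return True   # operation 416: unwanted icon detected
    return False          # no stage flagged the icon
```

A later stage runs only when the earlier stages could not determine that the icon is unwanted, mirroring decisions 406 and 410 in the figure.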


Abstract

A system, method, and computer program product are provided for detecting unwanted data based on an analysis of an icon. In use, an icon is analyzed. Furthermore, unwanted data is detected based on the analysis.

Description

    FIELD OF THE INVENTION
  • The present invention relates to detecting unwanted data, and more particularly to systems for detecting unwanted data.
  • BACKGROUND
  • Traditionally, systems have been provided for detecting unwanted data, such that devices employing such systems may be secured from the unwanted data. However, techniques utilized by such systems for detecting unwanted data have generally exhibited various limitations. Just by way of example, traditional systems have been incapable of detecting unwanted data based on icons associated with such unwanted data.
  • There is thus a need for addressing these and/or other issues associated with the prior art.
  • SUMMARY
  • A system, method, and computer program product are provided for detecting unwanted data based on an analysis of an icon. In use, an icon is analyzed. Furthermore, unwanted data is detected based on the analysis.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a network architecture, in accordance with one embodiment.
  • FIG. 2 shows a representative hardware environment that may be associated with the servers and/or clients of FIG. 1, in accordance with one embodiment.
  • FIG. 3 shows a method for detecting unwanted data based on an analysis of an icon, in accordance with one embodiment.
  • FIG. 4 shows a method for detecting unwanted icons, in accordance with another embodiment.
  • DETAILED DESCRIPTION
  • FIG. 1 illustrates a network architecture 100, in accordance with one embodiment. As shown, a plurality of networks 102 is provided. In the context of the present network architecture 100, the networks 102 may each take any form including, but not limited to a local area network (LAN), a wireless network, a wide area network (WAN) such as the Internet, peer-to-peer network, etc.
  • Coupled to the networks 102 are servers 104 which are capable of communicating over the networks 102. Also coupled to the networks 102 and the servers 104 is a plurality of clients 106. Such servers 104 and/or clients 106 may each include a desktop computer, lap-top computer, hand-held computer, mobile phone, personal digital assistant (PDA), peripheral (e.g. printer, etc.), any component of a computer, and/or any other type of logic. In order to facilitate communication among the networks 102, at least one gateway 108 is optionally coupled therebetween.
  • FIG. 2 shows a representative hardware environment that may be associated with the servers 104 and/or clients 106 of FIG. 1, in accordance with one embodiment. Such figure illustrates a typical hardware configuration of a workstation in accordance with one embodiment having a central processing unit 210, such as a microprocessor, and a number of other units interconnected via a system bus 212.
  • The workstation shown in FIG. 2 includes a Random Access Memory (RAM) 214, Read Only Memory (ROM) 216, an I/O adapter 218 for connecting peripheral devices such as disk storage units 220 to the bus 212, a user interface adapter 222 for connecting a keyboard 224, a mouse 226, a speaker 228, a microphone 232, and/or other user interface devices such as a touch screen (not shown) to the bus 212, communication adapter 234 for connecting the workstation to a communication network 235 (e.g., a data processing network) and a display adapter 236 for connecting the bus 212 to a display device 238.
  • The workstation may have resident thereon any desired operating system. It will be appreciated that an embodiment may also be implemented on platforms and operating systems other than those mentioned. One embodiment may be written using JAVA, C, and/or C++ language, or other programming languages, along with an object oriented programming methodology. Object oriented programming (OOP) has become increasingly used to develop complex applications.
  • Of course, the various embodiments set forth herein may be implemented utilizing hardware, software, or any desired combination thereof. For that matter, any type of logic may be utilized which is capable of implementing the various functionality set forth herein.
  • FIG. 3 shows a method 300 for detecting unwanted data based on an analysis of an icon, in accordance with one embodiment. As an option, the method 300 may be carried out in the context of the architecture and environment of FIGS. 1 and/or 2. Of course, however, the method 300 may be carried out in any desired environment.
  • As shown in operation 302, an icon is analyzed. With respect to the present description, the icon may include any type of image. For example, the icon may include an image of a file, a folder, etc.
  • As another example, the icon may be associated with an application. Such application may include an executable application, in one embodiment. To this end, selecting the icon may result in execution of the application. As an option, the icon may be selected by a user clicking on the icon.
  • In one embodiment, analyzing the icon may include generating a hash of the icon. Of course, however, a checksum, signature, etc. of the icon may be generated in other embodiments. Additionally, the analysis may include comparing the hash to a white list. Such white list may optionally include a list of known wanted icons (e.g. icons predetermined to be associated with wanted, non-malicious, etc. applications).
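As a minimal sketch of the hash-based analysis described above, the following illustrates generating a hash of an icon and comparing it against a white list of known wanted icons. The white-list contents, the stand-in icon bytes, and the helper names are illustrative assumptions, not part of the disclosed embodiments; a checksum or signature could equally be used in place of the hash.

```python
import hashlib

def icon_hash(icon_bytes: bytes) -> str:
    """Generate a hash of the icon; a checksum, signature, etc. could be used instead."""
    return hashlib.sha256(icon_bytes).hexdigest()

# Hypothetical white list: hashes of icons predetermined to be associated with
# wanted applications, each mapped to the application valid for that icon.
KNOWN_WANTED_ICON = b"\x00\x01folder-icon-pixels"  # stand-in icon bytes for illustration
WHITE_LIST = {icon_hash(KNOWN_WANTED_ICON): "explorer.exe"}

def is_known_wanted(icon_bytes: bytes) -> bool:
    """Compare the icon's hash to the white list of known wanted icons."""
    return icon_hash(icon_bytes) in WHITE_LIST
```

A black-list comparison would look the same, with membership in the list indicating a known unwanted icon rather than a known wanted one.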
  • In another embodiment, if it is determined that the hash is included in the white list, based on the comparison, the analysis of the icon may further include determining whether an application associated with the hash is valid. As an option, the application associated with the hash may be determined to be valid if the application includes one which is predetermined to be valid for the hash. For example, each hash in the white list may be predetermined to be associated with a particular application (e.g. an application determined to be wanted, non-malicious, legitimate, etc.). Thus, the application associated with the hash may be compared to an application predetermined to be valid for such hash.
  • As another option, the application associated with the hash may be determined to be valid if a characteristic of the application includes one which is predetermined to indicate that the application is valid. Just by way of example, such characteristic may include the application not being packed (e.g. encrypted, compressed, etc.). To this end, if it is determined that the application associated with the hash is packed (or optionally that the icon from which the hash is generated is included in a packed file), it may optionally be determined that the application is not valid. If, however, it is determined that the application associated with the hash is not packed, it may optionally be determined that the application is valid.
  • As another example, the characteristic may include the application containing multiple different icons. Thus, if it is determined that the application associated with the hash contains multiple different icons, it may optionally be determined that the application is valid. As another option, if it is determined that the application associated with the hash does not contain multiple different icons (e.g. contains only the icon from which the hash is generated), it may be determined that the application is not valid. Of course, it should be noted that the characteristic may include any desired aspect capable of being predetermined to indicate whether the application associated with the hash is valid.
  • As yet another option, analysis may include comparing the hash to a black list. The black list may include a list of known unwanted icons. Such known unwanted icons may include icons predetermined to be associated with unwanted, malicious, etc. applications, for example.
  • In another embodiment, the analysis may include a heuristic analysis of at least one aspect associated with the icon. As an option, the heuristic analysis may only be performed if it is determined that the hash of the icon is included in the white list. Moreover, the heuristic analysis may include any analysis of the icon that is based on a context of the aspect associated with the icon.
  • The aspect associated with the icon may include a file in which the icon resides. For example, the heuristic analysis may include determining whether the file in which the icon resides is valid (e.g. is predetermined to be wanted, non-malicious etc.). As another example, the heuristic analysis may include determining whether a name of the file, an extension of the file, etc. in which the icon resides indicates that the file is valid (e.g. whether the aspect associated with the icon is predetermined to be associated with a valid file, etc.).
  • As another option, the aspect associated with the icon may include version information corresponding with an application associated with the icon. Thus, the heuristic analysis may optionally include determining whether the application associated with the icon is valid by determining whether the application includes version information for such application. For example, if the application includes version information, it may be determined that such application is valid.
  • As yet another option, the aspect associated with the icon may include a language in which the application associated with the icon is written. For example, the heuristic analysis may include determining whether the language in which the application associated with the icon is written indicates that the application is valid. In one embodiment, if the language in which the application is written is predetermined to be a language in which known wanted applications are written, it may be determined that the application is valid. In another embodiment, if the language in which the application is written is not predetermined to be a language in which known wanted applications are written (e.g. is predetermined to be a language in which known unwanted applications are written), it may be determined that the application is not valid.
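The heuristic aspects above (the file in which the icon resides, version information, and the language in which the application is written) can be sketched as a set of checks, any one of which failing indicates an invalid application. The extension set, language set, and function signature are illustrative assumptions.

```python
# Illustrative, assumed values; any aspect predetermined to indicate
# validity could be substituted.
VALID_EXTENSIONS = {".exe", ".lnk"}         # extensions predetermined to be valid
WANTED_LANGUAGES = {"C", "C++", "Delphi"}   # languages known wanted applications are written in

def heuristic_valid(file_name: str, has_version_info: bool, language: str) -> bool:
    """Return True only if every heuristic aspect indicates a valid application."""
    # Aspect 1: the extension of the file in which the icon resides
    if not any(file_name.lower().endswith(ext) for ext in VALID_EXTENSIONS):
        return False
    # Aspect 2: presence of version information for the application
    if not has_version_info:
        return False
    # Aspect 3: the language in which the application is written
    return language in WANTED_LANGUAGES
```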
  • In still yet another embodiment, the analysis of the icon may involve pixels used to render the icon. For example, the pixels may be compared with at least one set of pixels associated with an icon predetermined to be wanted, predetermined to be associated with a wanted application, etc. As another option, the analysis may include an image recognition involving the icon. Such image recognition may compare the icon to at least one predetermined image (e.g. an image predetermined to be associated with an icon used by a wanted application, etc.) for determining whether the icon includes the predetermined image. As another option, the image recognition may include determining whether (and optionally an extent to which) the pixels used to render the icon vary from the predetermined image, such as a color of the pixels, a number of the pixels, etc.
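One way to sketch the pixel-variance idea above is to count pixels of a particular color and allow the count to vary from the predetermined image up to a threshold. The pixel representation (color tuples) and the threshold value are illustrative assumptions.

```python
def color_count(pixels, color):
    """Number of pixels of a particular color in the icon."""
    return sum(1 for p in pixels if p == color)

def matches_predetermined(icon_pixels, reference_pixels, color, threshold=0.1):
    """True if the icon's count of `color` pixels is within `threshold`
    (as a fraction) of the count in the predetermined reference image."""
    ref = color_count(reference_pixels, color)
    got = color_count(icon_pixels, color)
    if ref == 0:
        return got == 0
    return abs(got - ref) / ref <= threshold
```

With such a comparison, an icon whose pixels vary slightly from a known wanted icon (e.g. recolored by a few pixels) can still be recognized, while a larger variance indicates it does not include the predetermined image.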
  • In addition, the analysis of the icon may be performed at any desired time. In one embodiment, the icon may be analyzed on-access. For example, the icon may be analyzed in response to a detection of a selection of the icon (e.g. by a user, etc.).
  • In another embodiment, the icon may be analyzed on-demand. Thus the icon may be analyzed independent of a selection of the icon. Just by way of example, the icon may be analyzed based on a scheduled analysis.
  • Furthermore, as shown in operation 304, unwanted data is detected based on the analysis. In various embodiments, the unwanted data may include malware, spyware, adware, etc. Of course, however, the unwanted data may include any data determined to be unwanted. Also, the unwanted data may include the application associated with the icon, the icon itself, and/or any other data associated with the icon.
  • In one embodiment, the unwanted data may be detected if it is determined that the icon is included in the white list but is associated with an invalid application. Just by way of example, it may be determined that the icon corresponds to unwanted data if the application associated with the icon does not include one which is predetermined to be valid for the hash of the icon. In another embodiment, the unwanted data may be detected if it is determined that the icon is included in the black list.
  • In yet another embodiment, the unwanted data may be detected if the heuristic analysis indicates that the aspect associated with the icon indicates that the icon is associated with unwanted data. For example, if it is determined that the file in which the icon resides is not valid, based on the heuristic analysis, the unwanted data may be detected. As another example, if it is determined that the language in which the application associated with the icon is written indicates that the application is not valid, based on the heuristic analysis, the unwanted data may be detected.
  • In still yet another embodiment, if it is determined that pixels used to render the icon do not include pixels associated with an icon predetermined to be wanted, predetermined to be associated with a wanted application, etc., the unwanted data may be detected. In another embodiment, if the image recognition involving the icon results in a determination that the icon does not include a predetermined image, that the pixels used to render the icon vary from a predetermined image, etc., the unwanted data may be detected.
  • In this way, unwanted data may be detected based on an analysis of an icon. For example, use of the icon by unwanted data for executing such unwanted data may be detected. Such use of the icon may be detected if the icon is also associated with a valid application (e.g. wanted data), and also if the icon is unique to (e.g. only associated with) the unwanted data, such as if the icon is created specifically for the unwanted data.
  • More illustrative information will now be set forth regarding various optional architectures and features with which the foregoing technique may or may not be implemented, per the desires of the user. It should be strongly noted that the following information is set forth for illustrative purposes and should not be construed as limiting in any manner. Any of the following features may be optionally incorporated with or without the exclusion of other features described.
  • FIG. 4 shows a method 400 for detecting unwanted icons, in accordance with another embodiment. As an option, the method 400 may be carried out in the context of the architecture and environment of FIGS. 1-3. Of course, however, the method 400 may be carried out in any desired environment. It should also be noted that the aforementioned definitions may apply during the present description.
  • As shown in operation 402, an icon is identified. In one embodiment, the icon may be identified based on a scan for icons. For example, a device storing the icon, a system (e.g. operating system) employing the icon, etc. may be scanned for identifying the icon.
  • In one embodiment, the icon may be identified based on an on-access scan. For example, the icon may be identified in response to a selection of the icon. In another embodiment, the icon may be identified based on an on-demand scan, such that the icon may be identified independent of a selection thereof.
  • Additionally, as shown in operation 404, the icon is analyzed utilizing signatures. For example, a hash of the icon may be generated. In one embodiment, the hash may be compared to a black list of hashes associated with known unwanted icons (e.g. icons predetermined to be associated with unwanted applications). Thus, if the hash is included in the black list, it may optionally be determined that the icon is unwanted.
  • In another embodiment, the hash may be compared to a white list of hashes associated with known wanted icons (e.g. icons predetermined to be associated with wanted applications). If the hash is determined to be included in the white list, the icon may be identified as a type of icon which is associated with a wanted application. However, it may further be determined whether the particular application associated with the icon (e.g. the application executed via the icon, etc.) is a wanted application.
  • As an option, at least one characteristic of the application associated with the icon may be identified for determining whether such characteristic is indicative of the application being a wanted application. Just by way of example, if the application is packed, it may be determined that the application is unwanted. If the application is not packed, it may be determined that the application is wanted.
  • As another example, if the application includes multiple different icons, it may be determined that the application is wanted (e.g. as multiple icons may indicate that the application is a legitimate, non-malicious application and a single icon included in the application may indicate that the icon is utilized for deceptive purposes, such as for mischaracterizing an unwanted application as a wanted application). If the application does not include multiple different icons, it may be determined that the application is unwanted. In this way, if the icon is associated with multiple different applications, it may be determined whether any of such applications are unwanted.
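The two characteristic checks just described (whether the application is packed, and whether it contains multiple different icons) can be combined in a small sketch. The `Application` record and its field names are illustrative assumptions about how an executable might be described.

```python
from dataclasses import dataclass

@dataclass
class Application:
    is_packed: bool   # e.g. encrypted or compressed
    icon_count: int   # number of different icons the application contains

def characteristics_indicate_wanted(app: Application) -> bool:
    """An unpacked application containing multiple different icons is treated
    as wanted; a packed or single-icon application is treated as unwanted,
    since a lone icon may be used to mischaracterize an unwanted application."""
    return (not app.is_packed) and app.icon_count > 1
```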
  • Further, it is determined whether the icon is unwanted, as shown in decision 406. In one embodiment, the icon may be determined to be unwanted if the hash of the icon is included in the black list. In another embodiment, it may be determined that the icon is unwanted if the hash of the icon is included in the white list and if it is determined that the particular application associated with the icon is not a wanted application.
  • If it is not determined that the icon is unwanted, the icon is analyzed utilizing heuristics. Note operation 408. Thus, as an option, the heuristics may be utilized for identifying whether the icon is unwanted when the icon cannot be determined to be predetermined to be unwanted (e.g. based on the black list). For example, the heuristics may be utilized for identifying whether the icon is unwanted when the hash of the icon is included in the white list, but it is unknown whether the application associated with the icon is unwanted.
  • In one embodiment, a file containing the icon may be analyzed for determining whether such file is predetermined to be wanted. Such analysis may include determining whether a name of the file is indicative of a file predetermined to be wanted. For example, if the file is not predetermined to be wanted, it may be determined that the icon is unwanted.
  • As another option, the analysis of the file may include determining whether a file extension of the file is indicative of a file predetermined to be wanted. For example, the icon may be predetermined to be of a type that is associated with a file of a particular type (e.g. as identified by the file extension). To this end, if the file is not predetermined to be wanted, it may be determined that the icon is unwanted.
  • As yet another option, the analysis of the file may include determining whether the file is stored in a predetermined location (e.g. a predetermined folder). Just by way of example, it may be determined whether the file is stored in a predetermined folder that is indicative of the file being wanted. If the file is not stored in the predetermined location, it may be determined that the icon is unwanted.
  • As yet another option, the analysis of the file may include determining whether the file is written in a predetermined language (e.g. high-level language). Such predetermined language may include a language in which wanted files are predetermined to be written. If the file is not written in the predetermined language, it may be determined that the icon is unwanted.
  • As still yet another option, the analysis of the file may include determining whether the file includes version information (e.g. information indicating a version of the data stored in the file, etc.). For example, it may be determined whether the file includes version information for the file in resources of the file. If it is determined that the file does not include version information, it may be determined that the icon is unwanted.
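The staged flow of method 400 (signature analysis in operation 404, heuristics in operation 408, pixel analysis in operation 412) can be sketched as a cascade in which the icon is detected as unwanted as soon as any stage flags it. The stage callables are assumptions for illustration; each would wrap one of the analyses described above.

```python
def method_400(icon, stages):
    """Run each analysis stage in order; return True (unwanted) at the first
    stage that flags the icon, or False if every stage passes it."""
    for analyze in stages:
        if analyze(icon):  # stage reports the icon as unwanted
            return True
    return False
```

A cheap signature lookup thus runs first, with the more expensive heuristic and pixel analyses reached only when earlier stages are inconclusive.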
  • Accordingly, it may be determined whether the icon is unwanted, as shown in decision 410. If it is not determined that the icon is unwanted, the icon is analyzed utilizing pixels of the icon. Note operation 412. The pixels of the icon may be analyzed in any manner capable of being utilized to determine whether the icon is unwanted.
  • In one embodiment, the pixels of the icon may be analyzed via pixel identification. As an option, a pattern of pixels of the icon may be identified and compared to sets of pixels of icons predetermined to be wanted. Thus, if the pattern of pixels of the icon matches one of such sets of pixels, it may be determined that the icon is wanted. Of course, as another option, if the pattern of pixels of the icon matches one of such sets of pixels, it may further be determined whether the particular application associated with the icon is a wanted application, as described above with respect to operation 404. If, however, the pattern of pixels of the icon does not match one of such sets of pixels, it may be determined that the icon is unwanted.
  • In another embodiment, the pixels of the icon may be analyzed via image recognition. The image recognition may utilize the pixels of the icon to identify image characteristics of the icon. Just by way of example, the characteristics may include a number of pixels of a particular color that exist in the icon (e.g. for determining whether the icon is of a folder). Thus, a characteristic of an icon may be identified even when pixels of the icon vary (e.g. up to a threshold amount).
  • Such characteristics may further be compared to characteristics predetermined to be associated with wanted icons. If the characteristics of the icon match the characteristics predetermined to be associated with wanted icons, it may be determined that the icon is wanted. Optionally, if the characteristics of the icon match the characteristics predetermined to be associated with wanted icons, it may further be determined whether the particular application associated with the icon is a wanted application, as described above with respect to operation 404. If, however, the characteristics of the icon do not match the characteristics predetermined to be associated with wanted icons, it may be determined that the icon is unwanted.
  • Moreover, as shown in decision 414, it is determined whether the icon is unwanted. If it is not determined that the icon is unwanted, the method 400 ends. In this way, icons not determined to be unwanted based on an analysis thereof may be allowed to exist, be utilized, etc.
  • If, however, it is determined that the icon is unwanted (in decision 406, 410, or 414), the unwanted icon is detected. Note operation 416. Accordingly, a detection of an unwanted icon may be made.
  • Furthermore, a reaction to the detection of the unwanted icon is performed, as shown in operation 418. In one embodiment, the reaction may include blocking access to the unwanted icon. Such access may include selection of the icon, for example.
  • In another embodiment, the reaction may include deleting the icon. In yet another embodiment, the reaction may include removing any data associated with the icon, such as an application associated with the icon. In yet another embodiment, the icon and/or data associated therewith may be quarantined. In still yet other embodiments, detection of the unwanted icon may be logged, a user may be notified, etc.
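The possible reactions enumerated above can be sketched as a simple dispatch. The reaction names and the log-line format are illustrative assumptions; any combination of the embodiments (blocking, deleting, quarantining, logging, notifying) could be performed.

```python
def react(icon_path: str, reaction: str) -> str:
    """Return a log line describing the reaction taken for the unwanted icon."""
    actions = {
        "block": f"access to {icon_path} blocked",
        "delete": f"{icon_path} deleted",
        "quarantine": f"{icon_path} and associated data quarantined",
        "log": f"detection of {icon_path} logged; user notified",
    }
    return actions[reaction]
```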
  • While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims (20)

1. A computer program product embodied on a computer readable medium, comprising:
computer code for analyzing an image; and
computer code for detecting unwanted data based on the analysis.
2. The computer program product of claim 1, wherein the image is associated with an application.
3. The computer program product of claim 1, wherein the image is analyzed in response to a detection of a selection of the image.
4. The computer program product of claim 1, wherein the image is analyzed independent of a selection of the image.
5. The computer program product of claim 1, wherein the analysis includes generating a hash of the image.
6. The computer program product of claim 5, wherein the analysis further includes comparing the hash with a white list.
7. The computer program product of claim 5, wherein the analysis further includes comparing the hash with a black list.
8. The computer program product of claim 1, wherein the analysis includes determining whether the image is included in a packed file.
9. The computer program product of claim 1, wherein the analysis includes a heuristic analysis of at least one aspect associated with the image.
10. The computer program product of claim 9, wherein the at least one aspect includes a file in which the image resides.
11. The computer program product of claim 9, wherein the at least one aspect includes version information corresponding with an application associated with the image.
12. The computer program product of claim 9, wherein the at least one aspect includes a language in which an application associated with the image is written.
13. The computer program product of claim 1, wherein the analysis includes an analysis involving pixels used to render the image.
14. The computer program product of claim 1, wherein the analysis includes an image recognition involving the image.
15. The computer program product of claim 1, wherein the unwanted data includes at least one of malware, spyware, and adware.
16. The computer program product of claim 1, and further comprising computer code for reacting in response to the detection of the unwanted data.
17. The computer program product of claim 16, wherein the reacting includes blocking a selection of the image.
18. A method, comprising:
analyzing an image; and
detecting unwanted data based on the analysis.
19. A system comprising:
a processor for detecting unwanted data based on an analysis of an image.
20. The system of claim 19, wherein the processor is coupled to memory via a bus.
US14/679,864 2008-08-06 2015-04-06 System, Method, and Computer Program Product for Detecting Unwanted Data Based on an Analysis of an Icon Abandoned US20150302249A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/679,864 US20150302249A1 (en) 2008-08-06 2015-04-06 System, Method, and Computer Program Product for Detecting Unwanted Data Based on an Analysis of an Icon

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/187,196 US9003314B2 (en) 2008-08-06 2008-08-06 System, method, and computer program product for detecting unwanted data based on an analysis of an icon
US14/679,864 US20150302249A1 (en) 2008-08-06 2015-04-06 System, Method, and Computer Program Product for Detecting Unwanted Data Based on an Analysis of an Icon

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/187,196 Continuation US9003314B2 (en) 2008-08-06 2008-08-06 System, method, and computer program product for detecting unwanted data based on an analysis of an icon

Publications (1)

Publication Number Publication Date
US20150302249A1 true US20150302249A1 (en) 2015-10-22

Family

ID=49326333

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/187,196 Active 2030-09-29 US9003314B2 (en) 2008-08-06 2008-08-06 System, method, and computer program product for detecting unwanted data based on an analysis of an icon
US14/679,864 Abandoned US20150302249A1 (en) 2008-08-06 2015-04-06 System, Method, and Computer Program Product for Detecting Unwanted Data Based on an Analysis of an Icon

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/187,196 Active 2030-09-29 US9003314B2 (en) 2008-08-06 2008-08-06 System, method, and computer program product for detecting unwanted data based on an analysis of an icon

Country Status (1)

Country Link
US (2) US9003314B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10395026B2 (en) * 2015-09-22 2019-08-27 Samsung Electronics Co., Ltd. Method for performing security function and electronic device for supporting the same

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2881882B1 (en) * 2013-12-05 2017-03-01 Kaspersky Lab, ZAO System and method for blocking elements of application interface
RU2645265C2 (en) * 2013-12-05 2018-02-19 Закрытое акционерное общество "Лаборатория Касперского" System and method of blocking elements of application interface
KR101624276B1 (en) * 2014-05-21 2016-05-26 주식회사 안랩 Method and apparatus for detecting icon spoofing of mobile application
US9954874B2 (en) * 2014-10-07 2018-04-24 Symantec Corporation Detection of mutated apps and usage thereof
US9197663B1 (en) * 2015-01-29 2015-11-24 Bit9, Inc. Methods and systems for identifying potential enterprise software threats based on visual and non-visual data
CN106055602A (en) * 2016-05-24 2016-10-26 腾讯科技(深圳)有限公司 File verification method and apparatus
US10354173B2 (en) * 2016-11-21 2019-07-16 Cylance Inc. Icon based malware detection
RU2654146C1 (en) * 2017-06-16 2018-05-16 Акционерное общество "Лаборатория Касперского" System and method of detecting malicious files accompanied with using the static analysis elements
EP3416085B1 (en) * 2017-06-16 2020-06-03 AO Kaspersky Lab System and method of detecting malicious files with the use of elements of static analysis
CN109344279B (en) * 2018-12-12 2021-08-10 山东山大鸥玛软件股份有限公司 Intelligent handwritten English word recognition method based on Hash retrieval
US11385766B2 (en) * 2019-01-07 2022-07-12 AppEsteem Corporation Technologies for indicating deceptive and trustworthy resources

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050251758A1 (en) * 2004-04-26 2005-11-10 Microsoft Corporation Indicating file type on thumbnail preview icon
US20060242712A1 (en) * 2005-04-22 2006-10-26 Linn Christopher S Security methods and systems
US20070056035A1 (en) * 2005-08-16 2007-03-08 Drew Copley Methods and systems for detection of forged computer files
US20080127340A1 (en) * 2006-11-03 2008-05-29 Messagelabs Limited Detection of image spam
US7421587B2 (en) * 2001-07-26 2008-09-02 Mcafee, Inc. Detecting computer programs within packed computer files
US20090158164A1 (en) * 2007-12-14 2009-06-18 International Business Machines Corporation Managing icon integrity
US20110302655A1 (en) * 2010-06-08 2011-12-08 F-Secure Corporation Anti-virus application and method
US8256000B1 (en) * 2009-11-04 2012-08-28 Symantec Corporation Method and system for identifying icons



Also Published As

Publication number Publication date
US20130276105A1 (en) 2013-10-17
US9003314B2 (en) 2015-04-07

Similar Documents

Publication Publication Date Title
US9003314B2 (en) System, method, and computer program product for detecting unwanted data based on an analysis of an icon
AU2018217323B2 (en) Methods and systems for identifying potential enterprise software threats based on visual and non-visual data
US8601451B2 (en) System, method, and computer program product for determining whether code is unwanted based on the decompilation thereof
CA2856729C (en) Detecting malware using stored patterns
US8621608B2 (en) System, method, and computer program product for dynamically adjusting a level of security applied to a system
US8677493B2 (en) Dynamic cleaning for malware using cloud technology
US8256000B1 (en) Method and system for identifying icons
US8127354B1 (en) System, method, and computer program product for identifying vulnerabilities associated with data loaded in memory
US9106688B2 (en) System, method and computer program product for sending information extracted from a potentially unwanted data sample to generate a signature
US20130031111A1 (en) System, method, and computer program product for segmenting a database based, at least in part, on a prevalence associated with known objects included in the database
US9141797B1 (en) Detection of fake antivirus in computers
US10747879B2 (en) System, method, and computer program product for identifying a file used to automatically launch content as unwanted
US20130246352A1 (en) System, method, and computer program product for generating a file signature based on file characteristics
US8370941B1 (en) Rootkit scanning system, method, and computer program product
US8627461B2 (en) System, method, and computer program product for verifying an identification of program information as unwanted
US8458794B1 (en) System, method, and computer program product for determining whether a hook is associated with potentially unwanted activity
US20130247182A1 (en) System, method, and computer program product for identifying hidden or modified data objects
US8645949B2 (en) System, method, and computer program product for scanning data utilizing one of a plurality of virtual machines of a device
US8484725B1 (en) System, method and computer program product for utilizing a threat scanner for performing non-threat-related processing

Legal Events

Date Code Title Description
AS Assignment

Owner name: MCAFEE, LLC, CALIFORNIA

Free format text: CHANGE OF NAME AND ENTITY CONVERSION;ASSIGNOR:MCAFEE, INC.;REEL/FRAME:043665/0918

Effective date: 20161220

AS Assignment

Owner name: JPMORGAN CHASE BANK, N.A., NEW YORK

Free format text: SECURITY INTEREST;ASSIGNOR:MCAFEE, LLC;REEL/FRAME:045055/0786

Effective date: 20170929

Owner name: MORGAN STANLEY SENIOR FUNDING, INC., MARYLAND

Free format text: SECURITY INTEREST;ASSIGNOR:MCAFEE, LLC;REEL/FRAME:045056/0676

Effective date: 20170929

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MORGAN STANLEY SENIOR FUNDING, INC., MARYLAND

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE REMOVE PATENT 6336186 PREVIOUSLY RECORDED ON REEL 045056 FRAME 0676. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST;ASSIGNOR:MCAFEE, LLC;REEL/FRAME:054206/0593

Effective date: 20170929

Owner name: JPMORGAN CHASE BANK, N.A., NEW YORK

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE REMOVE PATENT 6336186 PREVIOUSLY RECORDED ON REEL 045055 FRAME 786. ASSIGNOR(S) HEREBY CONFIRMS THE SECURITY INTEREST;ASSIGNOR:MCAFEE, LLC;REEL/FRAME:055854/0047

Effective date: 20170929

AS Assignment

Owner name: MCAFEE, LLC, CALIFORNIA

Free format text: RELEASE OF INTELLECTUAL PROPERTY COLLATERAL - REEL/FRAME 045055/0786;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT;REEL/FRAME:054238/0001

Effective date: 20201026

AS Assignment

Owner name: MCAFEE, LLC, CALIFORNIA

Free format text: RELEASE OF INTELLECTUAL PROPERTY COLLATERAL - REEL/FRAME 045056/0676;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC., AS COLLATERAL AGENT;REEL/FRAME:059354/0213

Effective date: 20220301