US10262122B2 - Analysis apparatus, analysis system, analysis method, and analysis program - Google Patents

Analysis apparatus, analysis system, analysis method, and analysis program

Info

Publication number
US10262122B2
Authority
US
United States
Prior art keywords
access
access logs
authentication information
logs
source
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/518,157
Other languages
English (en)
Other versions
US20170308688A1 (en)
Inventor
Shingo Orihara
Hiroshi Asakura
Yang Zhong
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nippon Telegraph and Telephone Corp
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Assigned to NIPPON TELEGRAPH AND TELEPHONE CORPORATION reassignment NIPPON TELEGRAPH AND TELEPHONE CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ASAKURA, HIROSHI, ORIHARA, SHINGO, ZHONG, YANG
Publication of US20170308688A1 publication Critical patent/US20170308688A1/en
Application granted granted Critical
Publication of US10262122B2 publication Critical patent/US10262122B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/316 User authentication by observing the pattern of computer usage, e.g. typical user behaviour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50 Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55 Detecting local intrusion or implementing counter-measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50 Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/57 Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • G06F21/577 Assessing vulnerabilities and evaluating computer system security
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1441 Countermeasures against malicious traffic
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00 Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21 Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2101 Auditing as a secondary aspect
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00 Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21 Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2151 Time stamp
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/08 Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/083 Network architectures or network communication protocols for network security for authentication of entities using passwords
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1425 Traffic logging, e.g. anomaly detection

Definitions

  • the present invention relates to an analysis apparatus, an analysis system, an analysis method, and an analysis program.
  • an object of the present invention is to solve the above described problem and to detect attacks accurately.
  • the present invention is an analysis apparatus that analyzes access logs including authentication results and authentication information of users, the analysis apparatus comprising: an extracting unit that groups together, from the access logs, access logs of the same access source; a calculation unit that calculates a similarity between pieces of authentication information in plural access logs of the same access source and, if the calculated similarity is equal to or greater than a predetermined value, presumes that the authentication information in those access logs has been input by a human; and a determination unit that determines that the access source in the access logs may be an attack source if the authentication result of any of the plural access logs is authentication failure and the calculation unit presumes that any of the pieces of authentication information of the plural access logs has not been input by a human.
  • FIG. 1 is a diagram illustrating an example of a configuration of a system.
  • FIG. 2 is a diagram for explanation of a specific example of processing by a series extracting unit.
  • FIG. 3 is a diagram for explanation of a specific example of similarity calculation by a calculation unit in FIG. 1 .
  • FIG. 4 is a diagram for explanation of a specific example of the similarity calculation by the calculation unit in FIG. 1 .
  • FIG. 5 is a diagram for explanation of a specific example of level determination by a risk determination unit in FIG. 1 .
  • FIG. 6 is a flow chart illustrating a processing sequence by an analysis apparatus in FIG. 1 .
  • FIG. 7 is a diagram for explanation of an anonymous IP accumulation unit in FIG. 1 .
  • FIG. 8 is a diagram illustrating a computer that executes an analysis program.
  • the system includes an authentication apparatus 1 and an analysis apparatus 10 .
  • the authentication apparatus 1 receives access from one or more user terminal devices (illustration thereof omitted), and executes authentication processing. Results of the authentication are recorded as access logs.
  • the analysis apparatus 10 receives the access logs from the authentication apparatus 1 and analyzes the access logs.
  • These access logs include information, such as IP addresses of terminal devices of access sources (transmission sources), authentication information used in the authentication (for example, IDs and passwords), authentication results, and dates and times of access.
  • the authentication results include: information on whether or not the authentication has succeeded; if the authentication has failed, information on reasons for the failure; and the like.
  • the reasons for failure in the authentication are, for example: account error where the ID or the like used for the authentication is different from the ID or the like that has been registered; password error where the password associated with the ID is different from the password that has been registered; and the like.
  • the analysis apparatus 10 includes a receiving unit 11 , a series extracting unit 12 , a calculation unit 13 , and a risk determination unit (determination unit) 14 .
  • An anonymous IP accumulation unit 15 illustrated with a broken line may or may not be included; the case where the anonymous IP accumulation unit 15 is included will be described later.
  • the receiving unit 11 receives access logs of each user from the authentication apparatus 1 .
  • the receiving unit 11 receives access logs of each user in chronological order from the authentication apparatus 1 .
  • the series extracting unit 12 groups together the access logs received by the receiving unit 11 by access source (user). That is, the series extracting unit 12 extracts a series of access logs for each access source. For example, the series extracting unit 12 forms an access log group having the same access source IP address or IP address range (for example, an access log group of a user identifier A and an access log group of a user identifier B in FIG. 1 ), from the group of the access logs received by the receiving unit 11 . If the access source IP address or IP address range is the same, the series extracting unit 12 regards the access logs as being of the same user and assigns the same user identifier to them.
  • the series extracting unit 12 may group an access log as an access log of another user when the time interval between accesses is equal to or greater than a predetermined value, “t”. For example, as illustrated in FIG. 2 , for access logs with the same access source IP address or IP address range among access logs “1” to “12”, the series extracting unit 12 groups an access log as an access log of the same user when the time interval is less than “t”, and groups it as an access log of another user (a new user) when the time interval is equal to or greater than “t”.
  • the series extracting unit 12 is able to group the access logs as access logs of different users according to the change.
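The per-source series extraction described above can be sketched as follows. This is a minimal illustration in Python; the `AccessLog` fields, the user-identifier format, and the concrete value of “t” are assumptions, since the source only describes the grouping behavior abstractly.

```python
from dataclasses import dataclass

@dataclass
class AccessLog:
    ip: str           # access-source IP address
    timestamp: float  # access time, seconds since epoch
    user_id: str      # ID used for authentication
    success: bool     # authentication result

def group_by_source(logs, gap_t=1800.0):
    """Group chronologically ordered access logs into per-user series.

    Logs sharing a source IP are regarded as the same user, but a gap of
    gap_t seconds or more between accesses starts a new user series
    (gap_t is a hypothetical stand-in for the predetermined value "t").
    """
    series = {}     # user identifier -> list of AccessLog
    last_seen = {}  # ip -> (user identifier, timestamp of last access)
    counter = 0
    for log in logs:
        prev = last_seen.get(log.ip)
        if prev is None or log.timestamp - prev[1] >= gap_t:
            counter += 1                 # new user series for this IP
            uid = f"user-{counter}"
        else:
            uid = prev[0]                # same user as the previous access
        last_seen[log.ip] = (uid, log.timestamp)
        series.setdefault(uid, []).append(log)
    return series
```

Keeping only the last timestamp per IP is enough here because the receiving unit delivers the logs in chronological order.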
  • the calculation unit 13 calculates a similarity between pieces of authentication information (for example, IDs and passwords) of two access logs from access logs of the same user grouped together by the series extracting unit 12 . If the calculated similarity is equal to or greater than a predetermined value, the calculation unit 13 presumes that the piece of authentication information in any one access log of these two access logs has been input by a human. On the contrary, if the calculated similarity is less than the predetermined value, the calculation unit 13 presumes that the piece of authentication information in any one access log of these two access logs has not been input by a human, that is, has been input by a machine (for example, by list based attack). The calculation unit 13 may, of course, calculate similarities between pieces of authentication information of three or more access logs of the same user.
  • the calculation unit 13 receives: a user identifier; and IDs included in two access logs that are chronologically consecutive (the most recent two access logs) of a series of access logs of that user identifier. The calculation unit 13 then compares the IDs (ID1 and ID2) included in the most recent two access logs of this user identifier B with each other, and calculates a similarity therebetween. If the calculated similarity is equal to or greater than the predetermined value, the calculation unit 13 presumes that the piece of authentication information has been input by a human, and outputs the presumed result to the risk determination unit 14 .
  • otherwise, the calculation unit 13 presumes that this piece of authentication information has been input by a machine, and outputs the presumed result to the risk determination unit 14 .
  • a specific example of this similarity calculation by the calculation unit 13 will be described later.
  • the risk determination unit 14 determines the level of the possibility that the user is an attack source. For example, if the authentication result of either of two consecutive access logs in a user's access logs grouped together by the series extracting unit 12 is authentication failure, and the calculation unit 13 presumes that the authentication information of either of these two access logs has not been input by a human (that is, has been input by a machine), the risk determination unit 14 determines that the possibility that the user of the access source in these access logs is an attack source is high.
  • in that case, the risk determination unit 14 raises the level of the risk that an attack is occurring with the user of the access source in these access logs as the attack source.
  • otherwise, the risk determination unit 14 lowers the level of the risk that an attack is occurring with the user of the access source of these access logs as the attack source.
  • the risk determination unit 14 may output a user identifier of a user of an access source, for which the level has been increased consecutively twice (for example, the level has reached “3” or higher from “1”), as a user identifier of a user highly likely to be an attack source. Thereby, an administrator or the like of the system is able to know the user identifier of the user highly likely to be the attack source.
  • the nearer to each other on the input device the locations of the characters, numbers, and symbols composing the respective pieces of authentication information are, or the more similar the input operations for them are, the higher the similarity the calculation unit 13 calculates between them.
  • the shorter the distance (keyboard distance) on the keyboard layout between the characters, numbers, and symbols composing the respective pieces of authentication information, the higher the similarity the calculation unit 13 calculates between them.
  • in such a case, the similarity between them is calculated to be high.
  • in such a case, the calculation unit 13 may likewise calculate a high similarity.
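A minimal illustration of keyboard-distance-based similarity follows, assuming a plain QWERTY layout and a simple inverse-distance score; the patent does not specify the actual formula, so both the layout table and the scoring function are assumptions.

```python
# Hypothetical QWERTY layout coordinates: (row index, column index).
QWERTY_ROWS = ["1234567890", "qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEY_POS = {ch: (r, c) for r, row in enumerate(QWERTY_ROWS)
           for c, ch in enumerate(row)}

def keyboard_similarity(a, b):
    """Score in [0, 1]; nearer keys between corresponding characters give
    a higher score (illustrative only, not the patented measure)."""
    if len(a) != len(b) or not a:
        return 0.0
    total = 0.0
    for x, y in zip(a.lower(), b.lower()):
        px, py = KEY_POS.get(x), KEY_POS.get(y)
        if px is None or py is None:
            continue  # character not on the modeled layout
        # Manhattan distance between the two keys on the layout grid.
        d = abs(px[0] - py[0]) + abs(px[1] - py[1])
        total += 1.0 / (1.0 + d)
    return total / len(a)
```

With such a score, "qwerty" versus "qwertu" (adjacent keys) rates higher than "qwerty" versus "qwertp" (distant keys), matching the intuition that near-miss keystrokes indicate human input.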
  • a similarity between “yamada.taro” and “taro.yamada” illustrated with a reference numeral 304 is calculated highly, because the latter piece of authentication information is a character string resulting from interchange between character strings divided by the dot (.) of the former piece of authentication information.
  • if each of two pieces of authentication information (for example, IDs) to be compared ends with a number, and interchanging the character strings before and after a delimiter in one of them (with the number omitted) and then adding the number back yields the other piece, the calculation unit 13 may calculate the similarity highly.
  • a similarity between “taro.yamada123” and “yamada.taro123” illustrated with a reference numeral 305 is calculated highly, since when the character strings before and after the delimiter in the former piece of authentication information with the number omitted therefrom are interchanged and the number is added back, the obtained character string becomes the same as the latter piece of authentication information.
  • the calculation unit 13 may calculate the similarity highly. For example, the calculation unit 13 calculates a similarity between “yamada” and “yamada123” highly, since the latter piece of authentication information, from which the number therein has been deleted, is the same as the former piece of authentication information.
  • the calculation unit 13 may calculate the similarity highly.
  • a similarity between “yamada.taro@example.co.jp” and “yamada.taro” illustrated with a reference numeral 401 in FIG. 4 is calculated highly, since the partial character string up to the at-sign of the former matches the latter.
  • the calculation unit 13 may calculate the similarity highly.
  • a similarity between “yamada-taro” and “yamadataro” is calculated highly, since when the symbol (hyphen (-)) is removed from the former, the former matches the latter.
  • the calculation unit 13 may calculate the similarity highly.
  • a similarity between “yamada-taro” and “yamada-t” illustrated with a reference numeral 403 in FIG. 4 or between “yamada.taro” and “yama.taro” illustrated with a reference numeral 404 , is calculated highly, because when the partial character string before or after the symbol is partially omitted, the obtained character string matches the other piece of authentication information.
  • if one of the pieces of authentication information includes one non-alphanumeric symbol, and omitting part of the character string before or after the symbol and interchanging the resulting partial character strings yields the other piece of authentication information, the similarity may be calculated highly, too.
  • the calculation unit 13 calculates a similarity between these pieces of authentication information highly. The patterns for which the calculation unit 13 determines a high similarity between the pieces of authentication information to be compared (that is, patterns typical of human input errors) may, for example, be stored in a predetermined area of a storage unit (illustration thereof omitted) of the analysis apparatus 10 , and be modified as appropriate by an administrator or the like of the analysis apparatus 10 .
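The input-error patterns above (swapped parts around a delimiter, a trailing number added or dropped, a mail-address local part, a removed symbol) could be checked with rules along the following lines. This is a sketch of the examples illustrated in FIGS. 3 and 4, not an exhaustive reproduction of the stored patterns.

```python
import re

def looks_like_human_variant(a, b):
    """True if b matches a under a pattern typical of human input errors
    (illustrative rules, assumed from the examples in the description)."""
    if a == b:
        return True
    # Swapped parts around a delimiter: "yamada.taro" vs "taro.yamada",
    # optionally keeping a trailing number in place ("taro.yamada123").
    m = re.fullmatch(r"(.+?)([.\-_])(.+?)(\d*)", a)
    if m and m.group(3) + m.group(2) + m.group(1) + m.group(4) == b:
        return True
    # Trailing number added or dropped: "yamada" vs "yamada123".
    if a.rstrip("0123456789") == b or b.rstrip("0123456789") == a:
        return True
    # Mail-address local part: "yamada.taro@example.co.jp" vs "yamada.taro".
    if a.split("@")[0] == b or b.split("@")[0] == a:
        return True
    # Symbol removed: "yamada-taro" vs "yamadataro".
    if re.sub(r"[^0-9a-zA-Z]", "", a) == b or re.sub(r"[^0-9a-zA-Z]", "", b) == a:
        return True
    return False
```

In a real deployment these rules would live in the modifiable pattern store the description mentions, rather than being hard-coded.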
  • the level determination by the risk determination unit 14 raises the level when behavior resembling list based attack (login using authentication information input by a machine, or the like) is observed in an access log, and lowers the level when behavior not resembling list based attack (login using authentication information input by a human, or the like) is observed therein.
  • a level 1 indicates a “normal IP address”; level 2 indicates “caution”; level 3 indicates that “possibility of list based attack is high”; level 4 indicates that “list based attack is ongoing”; and level 5 indicates that “possibility of success in list based attack is extremely high”.
  • An upper diagram in FIG. 5 is a state transition diagram illustrating an example of the level determination by the risk determination unit 14 .
  • Nodes in this state transition diagram represent levels (level 1 to level 5), and arrows (edges) connecting between the nodes represent events for transition to the respective states.
  • An event is described with, for example, as illustrated with a reference numeral 601 , a combination of success or failure of the login, and a determination result (input determination) of whether the authentication information has been input by a machine or by a human.
  • Success or failure of the login is any one of login success (LOGIN), login failure due to password error (FAIL), login failure due to account error (UNKNOWN), and success or failure of login not being questioned (*).
  • the determination result of whether the authentication information has been input by a machine or by a human is any one of: whether the authentication information has been input by a machine or by a human being not questioned (no input determination) (*); machine (MACHINE); and human (HUMAN).
  • LOGIN-* indicates that whether the authentication information has been input by a machine or a human is not questioned, and the login has succeeded.
  • FAIL-MACHINE indicates that the login has failed due to password error, and the authentication information has been input by a machine.
  • “*-MACHINE” indicates that success or failure of the login is not questioned, and the authentication information has been input by a machine.
  • every time the risk determination unit 14 finds an access log for which the authentication information is presumed to have been input by a machine, for example, it raises the level of the risk that an attack is occurring with the access source IP address of that access log as the attack source, and every time it finds an access log for which the authentication information is presumed to have been input by a human, it lowers that level.
  • every time the risk determination unit 14 finds, in a series of access logs, a characteristic typical of list based attack, it raises the level of the risk of attack from the access source of that access log, and when it finds no characteristic typical of list based attack, it lowers that level.
  • the risk determination unit 14 then outputs, for example, the IP address (user identifier), for which the level has become equal to or greater than the level 3, as a user identifier of a user that is highly likely to be the attack source. Thereafter, the analysis apparatus 10 may output an alert to the terminal of the IP address, or notify the authentication apparatus 1 or the like of the IP address as a target to be blocked. Further, if an event does not occur for a predetermined time period or longer during state transition to the respective nodes of the state transition diagram illustrated in FIG. 5 , the risk determination unit 14 may reset the level to the level 1.
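Because the concrete transition table lives in FIG. 5 and is not reproduced in the text, the following is only an assumed approximation of the level determination: machine-presumed input raises the level (more sharply when the login also succeeded, approximating the "possibility of success is extremely high" state), human-presumed input lowers it, and level 3 or higher is reported.

```python
MIN_LEVEL, MAX_LEVEL = 1, 5  # level 1 "normal" ... level 5 "success extremely likely"

def next_level(level, login_result, input_kind):
    """login_result: 'LOGIN' | 'FAIL' | 'UNKNOWN'; input_kind: 'MACHINE' | 'HUMAN' | '*'.

    Illustrative transition rules, not the table from FIG. 5.
    """
    if input_kind == "MACHINE":
        # Machine input is attack-like; a successful login on top of it is worse.
        step = 2 if login_result == "LOGIN" else 1
        return min(MAX_LEVEL, level + step)
    if input_kind == "HUMAN":
        # Human-like input looks like normal use, so the risk level decays.
        return max(MIN_LEVEL, level - 1)
    return level  # '*': no input determination, keep the current level

def is_suspicious(level):
    return level >= 3  # "possibility of list based attack is high" or worse
```

The reset rule (dropping back to level 1 when no event occurs for a predetermined period) would sit outside this function, driven by a timer per access source.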
  • the receiving unit 11 receives access logs from the authentication apparatus 1 (S1), and the series extracting unit 12 classifies the received access logs by user (S2). For example, from the group of access logs received by the receiving unit 11 , the series extracting unit 12 groups together access logs having the same access source IP address or IP address range.
  • the calculation unit 13 calculates a similarity between pieces of authentication information of the most recent two access logs for each user (S 3 ). For example, the calculation unit 13 calculates a similarity between pieces of authentication information included in two access logs that are chronologically consecutive, for each user, by the method illustrated in FIG. 3 and FIG. 4 .
  • if the similarity calculated by the calculation unit 13 is equal to or greater than a predetermined value (S4: Yes), the piece of authentication information included in those access logs is presumed to have been input by a human (S5); if the similarity is less than the predetermined value (S4: No), it is presumed to have been input by a machine (S6).
  • the calculation unit 13 then outputs the presumed result to the risk determination unit 14 .
  • the risk determination unit 14 determines the level of the possibility that the user is an attack source (S7). For example, by the method illustrated in FIG. 5 , the risk determination unit 14 determines the level of the risk that an attack is occurring with each user as the attack source.
  • the risk determination unit 14 then outputs the user identifier of any user for which the level determined at S7 is equal to or greater than a predetermined value (S8). For example, the risk determination unit 14 outputs the user identification information of a user for which the level determined by the method illustrated in FIG. 5 is equal to or greater than “3”. Further, after S8, if there are any access logs for which a similarity has not been calculated among the access logs classified by user at S2 (S9: No for “end of logs?”), the processing returns to S3. On the contrary, if similarities have been calculated for all of the access logs classified by user at S2 (S9: Yes for “end of logs?”), the processing ends.
  • the analysis apparatus 10 executes the above described processing for access logs of each user classified at S 2 .
  • the analysis apparatus 10 may process the access logs of the respective users in order, or may process the access logs of the respective users concurrently.
  • in that case, the analysis apparatus 10 includes plural calculation units 13 and risk determination units 14 according to the degree of concurrency.
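Putting steps S3 to S8 together for one user's series, an end-to-end sketch might look like the following; here `difflib.SequenceMatcher` stands in for the patented similarity calculation, and the 0.8 threshold and the level rules are assumptions.

```python
from difflib import SequenceMatcher

def analyze_series(series, threshold=0.8):
    """Walk one user's chronologically ordered (auth_id, success) pairs.

    For each pair of consecutive logs (S3): compare the IDs, presume
    HUMAN when the similarity clears the threshold and MACHINE
    otherwise (S4-S6), then raise or lower a risk level (S7) and flag
    the user once the level reaches 3 (S8).
    """
    level = 1
    for (prev_id, _), (cur_id, success) in zip(series, series[1:]):
        sim = SequenceMatcher(None, prev_id, cur_id).ratio()
        human = sim >= threshold
        if not human and not success:
            level = min(5, level + 1)   # machine-like input + failed login
        elif human:
            level = max(1, level - 1)   # human-like input lowers the risk
        if level >= 3:
            return level, True          # report this user as a likely attacker
    return level, False
```

A series of mutually dissimilar failing IDs (the signature of a list based attack) drives the level up, while typo-like retries by a genuine user keep it at the floor.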
  • the analysis apparatus 10 is thereby able to detect list based attack. Specifically, the analysis apparatus 10 is able to output the level of the possibility that each user is an attack source of list based attack, or the user identifier of a user highly likely to be such an attack source. Further, since the analysis apparatus 10 uses the success or failure of authentication, the reason for authentication failure, and the presumed result for the input of authentication information in detecting attack, even if an attacker slows down the frequency of login trials in order to disguise them as normal logins, the analysis apparatus 10 can still detect the attack more easily.
  • the analysis apparatus 10 may further include the anonymous IP accumulation unit 15 illustrated in FIG. 1 .
  • This anonymous IP accumulation unit 15 accumulates therein IP addresses, each of which is likely to be shared among plural users.
  • the receiving unit 11 removes, from the access logs received from the authentication apparatus 1 , any access log whose IP address or IP address range matches an IP address accumulated in the anonymous IP accumulation unit 15 , and outputs the remaining access logs to the series extracting unit 12 .
  • the series extracting unit 12 is able to group together an access log group from the same user, based on the IP addresses or IP address ranges of the access sources in the access logs.
  • An IP address (anonymous IP address) accumulated in this anonymous IP accumulation unit 15 is, for example, an IP address that is used for The Onion Router (Tor), a proxy server, a mobile carrier, a public access point, or the like.
  • An anonymous IP address accumulated in this anonymous IP accumulation unit 15 is, for example, as illustrated in FIG. 7 , obtained by an obtaining unit (illustration thereof omitted in FIG. 1 ) through a Tor list, an open proxy list, a mobile carrier IP address range, other manual input, or the like, and accumulated therein. Further, the obtaining unit may obtain these IP addresses by an obtaining method and at time intervals, which have been determined for each IP address obtainment source.
  • the obtaining unit refers to anonymous IP address obtainment information illustrated with a reference numeral 501 in FIG. 7 , obtains IP addresses (a list of IP addresses) every 30 minutes from a providing site of a Tor list, and accumulates the IP addresses in the anonymous IP accumulation unit 15 . Further, the obtaining unit refers to the anonymous IP address obtainment information illustrated with the reference numeral 501 in FIG. 7 , obtains IP addresses (a list of IP addresses) by manual input as required from a mobile carrier IP address range, and accumulates the IP addresses in the anonymous IP accumulation unit 15 .
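A minimal stand-in for the anonymous IP accumulation unit and the receiving unit's filtering step can be written with Python's standard `ipaddress` module; the class and method names, and the dict-shaped logs, are assumptions for illustration.

```python
import ipaddress

class AnonymousIPStore:
    """Sketch of the anonymous IP accumulation unit 15: it accumulates
    address ranges from sources such as a Tor list or a mobile carrier
    IP address range, and answers membership queries."""

    def __init__(self):
        self._networks = []

    def add_range(self, cidr):
        # e.g. entries from a Tor list fetched every 30 minutes,
        # or a manually entered carrier range.
        self._networks.append(ipaddress.ip_network(cidr))

    def contains(self, ip):
        addr = ipaddress.ip_address(ip)
        return any(addr in net for net in self._networks)

def filter_logs(logs, store):
    """Drop access logs whose source IP is a known anonymous IP,
    as the receiving unit 11 does before series extraction."""
    return [log for log in logs if not store.contains(log["ip"])]
```

Filtering these shared addresses first matters because the series extracting unit treats one IP as one user; Tor exits, proxies, and carrier NAT ranges would otherwise merge many unrelated users into a single series.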
  • a program that describes the processing executed by the analysis apparatus 10 according to the above described embodiment in a language executable by a computer may be generated and executed.
  • by the computer executing the program, effects that are the same as those of the above described embodiment are able to be obtained.
  • by recording this program in a computer readable recording medium and causing the computer to load and execute the program recorded in this recording medium, processing that is the same as that of the above described embodiment may be realized.
  • a computer which executes an analysis program that realizes functions that are the same as those of the analysis apparatus 10 , will be described.
  • FIG. 8 is a diagram illustrating the computer that executes the analysis program.
  • a computer 1000 has, for example, a memory 1010 , a central processing unit (CPU) 1020 , a hard disk drive interface 1030 , a disk drive interface 1040 , a serial port interface 1050 , a video adapter 1060 , and a network interface 1070 . These units are connected to one another via a bus 1080 .
  • the memory 1010 includes a read only memory (ROM) 1011 and a random access memory (RAM) 1012 .
  • the ROM 1011 stores therein a boot program, such as Basic Input Output System (BIOS), for example.
  • the hard disk drive interface 1030 is connected to a hard disk drive 1090 .
  • the disk drive interface 1040 is connected to a disk drive 1100 .
  • An attachable and detachable storage medium, such as a magnetic disk or an optical disk, for example, is inserted in the disk drive 1100 .
  • a mouse 1110 and a keyboard 1120 are connected to the serial port interface 1050 .
  • a display 1130 , for example, is connected to the video adapter 1060 .
  • the hard disk drive 1090 stores therein, for example, an OS 1091 , an application program 1092 , a program module 1093 , and program data 1094 .
  • the analysis program is stored in the hard disk drive 1090 as a program module in which, for example, commands executed by the computer 1000 are described.
  • a program module in which the processing executed by the analysis apparatus 10 described in the above embodiment is described, is stored in the hard disk drive 1090 .
  • data used in information processing by the analysis program are stored as program data in, for example, the hard disk drive 1090 .
  • the CPU 1020 loads the program module 1093 and the program data 1094 stored in the hard disk drive 1090 as necessary into the RAM 1012 , and executes the above described sequences.
  • the program module 1093 and the program data 1094 related to the analysis program are not necessarily stored in the hard disk drive 1090 , and for example, may be stored in an attachable and detachable storage medium and read out by the CPU 1020 via the disk drive 1100 or the like. Or, the program module 1093 and the program data 1094 related to the analysis program may be stored in another computer connected via a network, such as a local area network (LAN) or a wide area network (WAN), and read out by the CPU 1020 via the network interface 1070 .
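The claims concern grouping access logs by access source and by the authentication information they carry. As an illustration only (the function name, log format, and threshold below are assumptions, not taken from the patent text), the following sketch shows the general kind of analysis such an apparatus performs: collecting the distinct pieces of authentication information presented by each source and flagging sources that cycled through many credentials, a pattern characteristic of list-based account attacks.

```python
from collections import defaultdict

def suspicious_sources(access_logs, threshold=3):
    """Flag access sources that presented many distinct credentials.

    access_logs: iterable of (source_ip, auth_info, success) tuples.
    Returns the set of source IPs whose count of distinct auth_info
    values reaches the threshold.
    """
    creds_per_source = defaultdict(set)
    for source_ip, auth_info, _success in access_logs:
        creds_per_source[source_ip].add(auth_info)
    return {src for src, creds in creds_per_source.items()
            if len(creds) >= threshold}

logs = [
    ("10.0.0.1", "alice", False),   # one source cycling through IDs
    ("10.0.0.1", "bob", False),
    ("10.0.0.1", "carol", False),
    ("192.168.0.5", "dave", True),  # ordinary single-user source
]
print(sorted(suspicious_sources(logs)))  # ['10.0.0.1']
```

The threshold and the decision to key on distinct credentials per source are hypothetical simplifications; the patented method is defined by the claims, not by this sketch.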

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)
US15/518,157 2014-10-28 2015-10-22 Analysis apparatus, analysis system, analysis method, and analysis program Active US10262122B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2014219276 2014-10-28
JP2014-219276 2014-10-28
PCT/JP2015/079796 WO2016068007A1 (ja) 2014-10-28 2015-10-22 Analysis apparatus, analysis system, analysis method, and analysis program

Publications (2)

Publication Number Publication Date
US20170308688A1 US20170308688A1 (en) 2017-10-26
US10262122B2 true US10262122B2 (en) 2019-04-16

Family

ID=55857346

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/518,157 Active US10262122B2 (en) 2014-10-28 2015-10-22 Analysis apparatus, analysis system, analysis method, and analysis program

Country Status (3)

Country Link
US (1) US10262122B2 (ja)
JP (1) JP6200101B2 (ja)
WO (1) WO2016068007A1 (ja)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6564137B2 (ja) * 2016-06-01 2019-08-21 Nippon Telegraph and Telephone Corporation Detection device, detection method, detection system, and detection program
US11102207B2 (en) * 2017-11-21 2021-08-24 T-Mobile Usa, Inc. Adaptive greylist processing
US11606372B2 (en) 2017-12-19 2023-03-14 T-Mobile Usa, Inc. Mitigating against malicious login attempts
WO2020021811A1 (ja) * 2018-07-25 2020-01-30 Nippon Telegraph and Telephone Corporation Analysis device, analysis method, and analysis program
JP7231024B2 (ja) * 2019-06-06 2023-03-01 Fujitsu Limited Information processing program, information processing method, and information processing device
CN112069424A (zh) * 2019-06-10 2020-12-11 Beijing Gridsum Technology Co., Ltd. Access behavior data analysis method and device
US20220247750A1 (en) * 2021-01-29 2022-08-04 Paypal, Inc. Evaluating access requests using assigned common actor identifiers

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09212458A (ja) 1996-01-30 1997-08-15 Toshiba Corp Password authentication method
US7032026B1 (en) * 2001-08-31 2006-04-18 Oracle International Corp. Method and apparatus to facilitate individual and global lockouts to network applications
US20070005985A1 (en) * 2005-06-30 2007-01-04 Avigdor Eldar Techniques for password attack mitigation
US20080060078A1 (en) * 2006-08-31 2008-03-06 Lord Robert B Methods and systems for detecting an access attack
US20090031406A1 (en) * 2007-07-26 2009-01-29 Fuji Xerox Co., Ltd. Authentication information processing device, authentication information processing method, storage medium, and data signal
JP2010079562A (ja) 2008-09-25 2010-04-08 Fujitsu Ltd Information processing device, information processing method, and program
US8312540B1 (en) * 2008-06-13 2012-11-13 Juniper Networks, Inc. System for slowing password attacks
US20150106930A1 (en) * 2013-10-11 2015-04-16 Fujitsu Limited Log analysis device and method

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
"Block Unauthorized Login! Authentication Enhancement by OpenAM", [online], http://atmarkit.co.jp/alt/article/1310/17/news003.html, Total 15 Pages, (Retrieval Date Oct. 7, 2014).
"How to Protect Oneself From Threats That Are Getting More Sophisticated, such as Advanced Persistent Threat and Unauthorized Login", [online], http://www.atmarkit.co.jp/alt/articles/1406/27/news012.html, Total 14 Pages, (Retrieval Date Oct. 7, 2014).
"InfoCage SiteShell Ver2.0.1 Password List Kogeki Taisaku Settei Tejunsho", NEC Corporation, Total 31 Pages, (Nov. 9, 2015).
"Measures to Deal With Unauthorized Login by List Based Account Hacking (Collection of Measures for Internet Service Providers such as Site Administrators)", [online], http://www.soumu.go.jp/main_content/000265403.pdf, Total 18 Pages, (Retrieval Date Oct. 7, 2014).
"Unlocking Lockout of User Account", [online], http://www.atmarkit.co.jp/alt/articles/0311/29/news005.html, Total 5 Pages, (Retrieval Date Oct. 7, 2014).
Hiroshi Tokumaru, "Systematically Learning How to Make Safe Web Applications", SB Creative Corporation, Total 1 Page, (Mar. 2011).
International Search Report dated Nov. 17, 2015 in PCT/JP2015/079796 Filed Oct. 22, 2015.

Also Published As

Publication number Publication date
JP6200101B2 (ja) 2017-09-20
US20170308688A1 (en) 2017-10-26
JPWO2016068007A1 (ja) 2017-04-27
WO2016068007A1 (ja) 2016-05-06

Similar Documents

Publication Publication Date Title
US10262122B2 (en) Analysis apparatus, analysis system, analysis method, and analysis program
US10949534B2 (en) Method for predicting and characterizing cyber attacks
US11044264B2 (en) Graph-based detection of lateral movement
US11025664B2 (en) Identifying security actions for responding to security threats based on threat state information
CN107408181B (zh) Device for detecting malware-infected terminal, system for detecting malware-infected terminal, method for detecting malware-infected terminal, and recording medium
US11487880B2 (en) Inferring security incidents from observational data
JP6528448B2 (ja) Network attack monitoring device, network attack monitoring method, and program
US9876814B2 (en) Detecting domains generated by a domain generation algorithm
US8549314B2 (en) Password generation methods and systems
JP6697123B2 (ja) Profile generation device, attack detection device, profile generation method, and profile generation program
CN107438049B (zh) Malicious login identification method and device
JP6386593B2 (ja) Malicious communication pattern extraction device, malicious communication pattern extraction system, malicious communication pattern extraction method, and malicious communication pattern extraction program
JP6181884B2 (ja) Detection device for malware-infected terminal, detection method for malware-infected terminal, and detection program for malware-infected terminal
CN111183620B (zh) Intrusion investigation
JP2017076185A (ja) Network monitoring device, network monitoring method, and network monitoring program
CN112437062A (zh) ICMP tunnel detection method and device, storage medium, and electronic device
Mohammadmoradi et al. Making whitelisting-based defense work against badusb
WO2019159809A1 (ja) Access analysis system and access analysis method
KR101153115B1 (ko) Method, server, and terminal for detecting hacking tools
US11233809B2 (en) Learning device, relearning necessity determination method, and relearning necessity determination program

Legal Events

Date Code Title Description
AS Assignment

Owner name: NIPPON TELEGRAPH AND TELEPHONE CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ORIHARA, SHINGO;ASAKURA, HIROSHI;ZHONG, YANG;REEL/FRAME:041942/0821

Effective date: 20170316

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4