KR101678179B1 - Device of detecting wrongful use of personal information - Google Patents

Device of detecting wrongful use of personal information

Info

Publication number
KR101678179B1
Authority
KR
South Korea
Prior art keywords
decryption
requests
encryption
log data
user
Prior art date
Application number
KR1020150064461A
Other languages
Korean (ko)
Other versions
KR20160131619A (en)
Inventor
박성은
Original Assignee
(주)케이사인
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)케이사인 filed Critical (주)케이사인
Priority to KR1020150064461A priority Critical patent/KR101678179B1/en
Publication of KR20160131619A publication Critical patent/KR20160131619A/en
Application granted granted Critical
Publication of KR101678179B1 publication Critical patent/KR101678179B1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/02Network architectures or network communication protocols for network security for separating internal from external traffic, e.g. firewalls
    • H04L63/0227Filtering policies
    • H04L63/0236Filtering by address, protocol, port number or service, e.g. IP-address or URL
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/04Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
    • H04L63/0428Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload

Abstract

The device for detecting wrongful use of personal information includes a log data collector, a condition log data generator, and a detector. The log data collector receives log data provided from a plurality of token servers. The condition log data generator separates the log data according to a predetermined detection condition and provides condition log data. The detector determines whether the system is being used fraudulently based on the condition log data. The apparatus for detecting fraudulent use of personal information according to the embodiments of the present invention can thus determine whether the system is being used fraudulently based on condition log data generated according to the detection condition.

Description

DEVICE OF DETECTING WRONGFUL USE OF PERSONAL INFORMATION

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to personal information protection, and more particularly, to a device for detecting fraudulent use of personal information.

Personal information may be encrypted to keep it secure. Even when encrypted, however, personal information may still be used illegally by others. Various studies have therefore been conducted to prevent such illegal use of personal information.

An object of the present invention is to provide an apparatus for detecting fraudulent use of personal information that can determine whether the system is being used fraudulently, based on condition log data generated according to detection conditions.

In order to accomplish one object of the present invention, a device for detecting fraudulent use of personal information according to embodiments of the present invention includes a log data collector, a condition log data generator, and a detector. The log data collector receives log data provided from a plurality of token servers. The condition log data generator separates the log data according to a predetermined detection condition and provides condition log data. The detector determines whether the system is being used fraudulently based on the condition log data.

In an exemplary embodiment, the detection condition includes a system access IP address, a system access time, a number of encryption requests corresponding to the number of requests for encryption to the system, and a number of decryption requests corresponding to the number of requests for decryption to the system.

In an exemplary embodiment, if the system access IP address is an overseas IP address, the detector may determine that the system is being used fraudulently.

In an exemplary embodiment, if the system access time is a time during which access to the system is not allowed and the number of decryption requests is greater than a predetermined decryption limit count, the detector may determine that the system is being used fraudulently.

In an exemplary embodiment, if the number of decryption requests is greater than the decryption limit count for a policy that is not allowed to be accessed among the policies included in the system, the detector may determine that the system is being used fraudulently.

In an exemplary embodiment, if the data size of the encryption and decryption is greater than a predetermined data limit size, and the number of encryption requests and the number of decryption requests are greater than a predetermined encryption limit count and decryption limit count, the detector may determine that the system is being used fraudulently.

In an exemplary embodiment, the personal information fraud detection apparatus may further include a multidimensional analyzer for providing, based on the log data, pattern information corresponding to the pattern with which each user of a system access IP address accesses the system.

In an exemplary embodiment, the pattern information may be determined based on the user, the system access time, the number of encryption requests, the number of decryption requests, and the policy included in the system.

In an exemplary embodiment, the pattern information may be determined according to a dimension selection signal that selects some of the system access time, the number of encryption requests, the number of decryption requests, and the policies included in the system.

In an exemplary embodiment, the pattern information may be the average, standard deviation, and maximum value of the number of encryption requests and the number of decryption requests for each system access time period.

In an exemplary embodiment, the pattern information is updated at predetermined time intervals, and the pattern information may be stored in a pattern information register included in the multi-dimensional analyzer.

In an exemplary embodiment, when the user's number of encryption requests and number of decryption requests are greater than the maximum value, the multidimensional analyzer may provide a warning message. The multidimensional analyzer may provide a caution message when the user's number of encryption requests and number of decryption requests are smaller than the maximum value and larger than the sum of the average and the standard deviation, and may provide a safety message when they are smaller than the sum of the average and the standard deviation.

The apparatus for detecting fraudulent use of personal information according to the embodiments of the present invention can determine whether the system is being used fraudulently based on condition log data generated according to the detection condition.

FIG. 1 is a block diagram illustrating an apparatus for detecting fraudulent use of personal information according to embodiments of the present invention.
FIG. 2 is a diagram for explaining a case where the personal information fraud detection apparatus of FIG. 1 determines that the system is being used fraudulently.
FIG. 3 is a diagram for explaining a case where the personal information fraud detection apparatus of FIG. 1 determines that the system is being used fraudulently based on the number of encryption/decryption requests.
FIG. 4 is a diagram for explaining a case where the personal information fraud detection apparatus of FIG. 1 determines that the system is being used fraudulently based on a data size.
FIG. 5 is a diagram for explaining a case where the personal information fraud detection apparatus of FIG. 1 determines that the system is being used fraudulently based on the number of encryption/decryption requests for each policy.
FIG. 6 is a diagram for explaining a case where the personal information fraud detection apparatus of FIG. 1 determines that the system is being used fraudulently based on the data size for each policy.
FIG. 7 is a block diagram illustrating an apparatus for detecting fraudulent use of personal information according to an exemplary embodiment of the present invention.
FIGS. 8 and 9 are diagrams showing examples of the user's number of encryption requests and number of decryption requests according to the dimension selection signal in the personal information fraud detection apparatus of FIG. 7.
FIG. 10 is a diagram illustrating an example of the multidimensional analyzer included in the personal information fraud detection apparatus of FIG. 7.

For the embodiments of the invention disclosed herein, specific structural and functional descriptions are set forth only for the purpose of describing those embodiments; the invention may be embodied in various forms and is not to be construed as limited to the embodiments described herein.

The present invention is capable of various modifications and various forms, and specific embodiments are illustrated in the drawings and described in detail in the text. It is to be understood, however, that the invention is not intended to be limited to the particular forms disclosed, but on the contrary, is intended to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

The terms first, second, etc. may be used to describe various components, but the components should not be limited by the terms. The terms may be used for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements. Other expressions describing the relationship between components, such as "between" and "directly between" or "adjacent to" and "directly adjacent to", should be interpreted in the same manner.

The terminology used in this application is used only to describe specific embodiments and is not intended to limit the invention. The singular expressions include plural expressions unless the context clearly dictates otherwise. In the present application, the terms "comprise", "have", and the like are intended to specify the presence of stated features, integers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or combinations thereof.

Unless otherwise defined, all terms used herein, including technical or scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries should be construed as having a meaning consistent with their meaning in the context of the relevant art and are not to be construed as having an ideal or overly formal meaning unless expressly so defined in the present application.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The same reference numerals are used for the same constituent elements in the drawings and redundant explanations for the same constituent elements are omitted.

FIG. 1 is a block diagram illustrating an apparatus for detecting fraudulent use of personal information according to embodiments of the present invention.

Referring to FIG. 1, the personal information fraud detection apparatus 10 includes a log data collector 200, a condition log data generator 300, and a detector 400. The log data collector 200 receives log data LOG_D provided from a plurality of token servers. For example, the plurality of token servers may include a first token server 110, a second token server 120, and a third token server 130. The log data collector 200 may receive the first log data LOG_D1 provided from the first token server 110. In addition, the log data collector 200 may receive the second log data LOG_D2 provided from the second token server 120. In the same manner, the log data collector 200 may receive the third log data LOG_D3 provided from the third token server 130.

The condition log data generator 300 separates the log data LOG_D according to a predetermined detection condition D_C and provides the condition log data CLOG_D. The detector 400 determines whether the system is being used fraudulently (DD) based on the condition log data CLOG_D. The system may include the personal information fraud detection apparatus 10. For example, the detection condition D_C may include a system access IP address, a system access time, an encryption request count RC_E corresponding to the number of encryption requests made to the system, and a decryption request count RC_D corresponding to the number of decryption requests made to the system. For example, the condition log data generator 300 may classify, according to the system access IP address, overseas system access IP addresses used to access the system from abroad and domestic system access IP addresses used to access the system domestically. In this case, the condition log data CLOG_D may be the overseas system access IP addresses and the domestic system access IP addresses.
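
As an illustration of this separation step, the following sketch splits collected log entries into domestic and overseas groups by their access IP address. Every field name, the prefix table, and the function name are assumptions made for the example, not details taken from the patent.

```python
# Hypothetical sketch of the condition log data generator: it splits the
# collected log data LOG_D into condition log data CLOG_D groups according to
# a detection condition, here "domestic vs. overseas system access IP address".
# The field names and the domestic prefix list are illustrative assumptions.
from collections import defaultdict

DOMESTIC_PREFIXES = ("210.124.121.",)      # assumed domestic IP prefixes;
                                           # e.g. 72.244.64.XX falls into "overseas"

def generate_condition_log_data(log_entries):
    """Separate log data according to the IP-origin detection condition."""
    condition_log_data = defaultdict(list)
    for entry in log_entries:              # entry: dict with "ip", "time", "rc_e", "rc_d", ...
        origin = "domestic" if entry["ip"].startswith(DOMESTIC_PREFIXES) else "overseas"
        condition_log_data[origin].append(entry)
    return condition_log_data

# Usage with two illustrative log entries collected from the token servers:
logs = [
    {"ip": "210.124.121.10", "time": "14:00", "rc_e": 3, "rc_d": 2},
    {"ip": "72.244.64.5",    "time": "23:30", "rc_e": 1, "rc_d": 6},
]
clog_d = generate_condition_log_data(logs)
print(len(clog_d["overseas"]))             # 1
```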

In addition, the condition log data generator 300 can determine, based on the system access IP address and the system access time, whether system access is attempted from a plurality of countries within a predetermined time. In this case, the IP addresses from which system access is attempted from a plurality of countries within the predetermined time and the IP addresses from which it is not may both be included in the condition log data CLOG_D. Therefore, the condition log data CLOG_D may be data classified according to the detection condition D_C.
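
A complementary sketch for this multi-country check might look as follows; the one-hour window, the country codes, and the data layout are arbitrary example choices rather than values from the patent.

```python
# Hypothetical sketch: decide whether one system access IP address (or user)
# attempts access from more than one country within a predetermined time window.
from datetime import datetime, timedelta

def accessed_from_multiple_countries(accesses, window=timedelta(hours=1)):
    """accesses: time-sorted list of (timestamp, country) pairs for one IP/user."""
    for i, (t_i, c_i) in enumerate(accesses):
        for t_j, c_j in accesses[i + 1:]:
            if t_j - t_i > window:
                break                       # later accesses are outside the window
            if c_j != c_i:
                return True                 # two different countries inside the window
    return False

log = [(datetime(2015, 5, 8, 10, 0), "KR"), (datetime(2015, 5, 8, 10, 20), "US")]
print(accessed_from_multiple_countries(log))   # True
```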

For example, the condition log data CLOG_D generated according to the detection condition D_C can be analyzed to determine whether the same user uses multiple IP addresses, whether multiple users use the same IP address, and whether an IP address is being used fraudulently based on the location from which it is used, as determined according to the detection condition D_C. The apparatus 10 for detecting fraudulent use of personal information according to the embodiments of the present invention can thus determine whether the system is being used fraudulently based on the condition log data CLOG_D generated according to the detection condition D_C.

FIG. 2 is a diagram for explaining a case where the personal information fraud detection apparatus of FIG. 1 determines that the system is being used fraudulently.

Referring to FIGS. 1 and 2, the detection condition D_C includes a system access IP address, a system access time, an encryption request count RC_E corresponding to the number of encryption requests made to the system, and a decryption request count RC_D corresponding to the number of decryption requests made to the system.

In an exemplary embodiment, if the system access IP address is an overseas IP address, the detector 400 may determine that the system is being used fraudulently. For example, 210.124.121.XX may be a domestic IP address, and 72.244.64.XX may be an overseas IP address. When the system access IP address included in the first log data LOG_D1 provided from the first token server 110 is 210.124.121.XX, the detector 400 may not determine that the system is being used fraudulently. On the other hand, when the system access IP address included in the second log data LOG_D2 provided from the second token server 120 is 72.244.64.XX, the detector 400 may determine that the system is being used fraudulently.

In an exemplary embodiment, if the system access time is a time during which access to the system is not allowed and the number of decryption requests RC_D is greater than a predetermined decryption limit count DLN, the detector 400 may determine that the system is being used fraudulently. For example, the time during which access to the system is not allowed may be from 11 PM to 3 AM. The system access time of the system access IP address included in the third log data LOG_D3 provided from the third token server 130 may be 11:30 PM. Also, the same system access IP address may make more decryption requests than the decryption limit count DLN between 11 PM and 3 AM. For example, the decryption limit count DLN may be 5. If that system access IP address is used to make 6 decryption requests between 11 PM and 3 AM, the detector 400 may determine that the system is being used fraudulently. The time during which access to the system is not allowed may be set based on the day of the week, day, month, and year.

In an exemplary embodiment, if the number of decryption requests RC_D is greater than the decryption limit count DLN for a policy that is not allowed to be accessed among the policies included in the system, the detector 400 may determine that the system is being used fraudulently. For example, the policies included in the system may include encryption policies and decryption policies. An encryption policy may include an encryption method, the length of data to be encrypted, and the like. A decryption policy may include a decryption method or the length of data to be decrypted. Some policies may not be allowed to be accessed for security reasons. For example, the decryption limit count DLN may be 5. If a system access IP address is used to make 6 decryption requests against a policy that is not allowed to be accessed, the detector 400 may determine that the system is being used fraudulently.
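
As a sketch of the three determinations described for FIG. 2: the window of 11 PM to 3 AM, the decryption limit count of 5, and the overseas/domestic distinction follow the text, while the field names, the policy identifier, and the function itself are assumptions made only for illustration.

```python
# Hypothetical detector sketch for the FIG. 2 conditions: an overseas access IP,
# access during a disallowed time window with too many decryption requests, and
# too many decryption requests against a policy whose access is not allowed.
DLN = 5                                                   # example decryption limit count
DISALLOWED_HOURS = {23, 0, 1, 2}                          # 11 PM to 3 AM, as in the text
DISALLOWED_POLICIES = {"restricted_policy"}               # assumed policy identifier

def is_fraudulent_access(entry):
    """Return True if a condition-log entry indicates fraudulent use of the system."""
    if entry["origin"] == "overseas":                     # foreign system access IP address
        return True
    if entry["hour"] in DISALLOWED_HOURS and entry["rc_d"] > DLN:
        return True                                       # disallowed time + too many decryptions
    if entry.get("policy") in DISALLOWED_POLICIES and entry["rc_d"] > DLN:
        return True                                       # decryptions against a disallowed policy
    return False

print(is_fraudulent_access({"origin": "domestic", "hour": 23, "rc_d": 6}))  # True
print(is_fraudulent_access({"origin": "domestic", "hour": 14, "rc_d": 2}))  # False
```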

FIG. 3 is a diagram for explaining a case where the personal information fraud detection apparatus of FIG. 1 determines that the system is being used fraudulently based on the number of encryption/decryption requests.

Referring to FIG. 3, when the encryption request count RC_E and the decryption request count RC_D are greater than a predetermined encryption limit count ELN and decryption limit count DLN, the detector 400 may determine that the system is being used fraudulently. For example, the predetermined encryption limit count ELN may be 5. If the encryption request count RC_E for the same system access IP address is 3, the detector 400 may not determine that the system is being used fraudulently. On the other hand, if the encryption request count RC_E for the same system access IP address is 7, the detector 400 may determine that the system is being used fraudulently. For example, the predetermined decryption limit count DLN may be 5. If the decryption request count RC_D for the same system access IP address is 2, the detector 400 may not determine that the system is being used fraudulently. On the other hand, if the decryption request count RC_D for the same system access IP address is 6, the detector 400 may determine that the system is being used fraudulently.

The apparatus 10 for detecting fraudulent use of personal information according to the embodiments of the present invention can determine whether the system is being used fraudulently based on the condition log data CLOG_D generated according to the detection condition D_C.

FIG. 4 is a diagram for explaining a case where the personal information fraud detection apparatus of FIG. 1 determines that the system is being used fraudulently based on a data size.

Referring to FIG. 4, if the data size DS of the encryption and decryption is larger than a predetermined data limit size DLS, the detector 400 may determine that the system is being used fraudulently. For example, the predetermined encryption data limit size may be 100. If the encryption data size DS for the same system access IP address is 10, the detector 400 may not determine that the system is being used fraudulently. On the other hand, when the encryption data size DS for the same system access IP address is 10000, the detector 400 may determine that the system is being used fraudulently. For example, the predetermined decryption data limit size may be 150. If the decryption data size DS for the same system access IP address is 20, the detector 400 may not determine that the system is being used fraudulently. On the other hand, if the decryption data size DS for the same system access IP address is 1000, the detector 400 may determine that the system is being used fraudulently.

FIG. 5 is a diagram for explaining a case where the personal information fraud detection apparatus of FIG. 1 determines that the system is being used fraudulently based on the number of encryption/decryption requests for each policy, and FIG. 6 is a diagram for explaining a case where it determines that the system is being used fraudulently based on the data size for each policy.

Referring to FIG. 5, if the encryption request count RC_E and the decryption request count RC_D for a particular policy are greater than a predetermined encryption limit count ELN and decryption limit count DLN, the detector 400 may determine that the system is being used fraudulently. If the encryption request count RC_E and the decryption request count RC_D for the policy are smaller than the predetermined encryption limit count ELN and decryption limit count DLN, the detector 400 may not determine that the system is being used fraudulently.

Referring to FIG. 6, if the data size DS of the encryption and decryption for a specific policy is larger than a predetermined data limit size DLS, the detector 400 may determine that the system is being used fraudulently. Conversely, if the data size DS of the encryption and decryption for the specific policy is smaller than the predetermined data limit size DLS, the detector 400 may not determine that the system is being used fraudulently.
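
The comparisons of FIGS. 3 to 6 can be collected into one sketch. The limit values follow the examples in the text (ELN = 5, DLN = 5, DLS = 100), each count is compared against its own limit as in the worked examples, and the function names and dictionary layout are assumptions for illustration only.

```python
# Hypothetical sketch of the count and data-size comparisons of FIGS. 3 to 6.
# ELN/DLN are the encryption/decryption limit counts and DLS the data limit size.
ELN, DLN, DLS = 5, 5, 100

def exceeds_request_limits(rc_e, rc_d, eln=ELN, dln=DLN):
    """FIG. 3: too many encryption or decryption requests from one access IP."""
    return rc_e > eln or rc_d > dln

def exceeds_data_limit(data_size, dls=DLS):
    """FIG. 4: encrypted/decrypted data size larger than the data limit size."""
    return data_size > dls

def policy_is_fraudulent(policy_stats, eln=ELN, dln=DLN, dls=DLS):
    """FIGS. 5 and 6: the same checks applied to the statistics of one policy."""
    return (policy_stats["rc_e"] > eln or policy_stats["rc_d"] > dln
            or policy_stats["data_size"] > dls)

print(exceeds_request_limits(7, 2))                                    # True  (FIG. 3)
print(exceeds_data_limit(10000))                                       # True  (FIG. 4)
print(policy_is_fraudulent({"rc_e": 3, "rc_d": 2, "data_size": 20}))   # False (FIGS. 5 and 6)
```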

The apparatus 10 for detecting fraudulent use of personal information according to the embodiments of the present invention can determine whether the system is being used fraudulently based on the condition log data CLOG_D generated according to the detection condition D_C.

FIG. 7 is a block diagram illustrating an apparatus for detecting fraudulent use of personal information according to an exemplary embodiment of the present invention. FIGS. 8 and 9 are diagrams showing examples of the user's number of encryption requests and number of decryption requests according to the dimension selection signal in the personal information fraud detection apparatus of FIG. 7.

Referring to FIGS. 7 to 9, the personal information fraud detection apparatus 10 includes a log data collector 200, a condition log data generator 300, and a detector 400. The log data collector 200 receives log data LOG_D provided from a plurality of token servers. The condition log data generator 300 separates the log data LOG_D according to a predetermined detection condition D_C and provides the condition log data CLOG_D. The detector 400 determines whether the system is being used fraudulently based on the condition log data CLOG_D.

In an exemplary embodiment, the personal information fraud detection apparatus 10 further includes a multidimensional analyzer 500 for providing, based on the log data LOG_D, pattern information PA_I corresponding to the pattern with which each user of a system access IP address accesses the system. In an exemplary embodiment, the pattern information PA_I may be determined based on the user, the system access time, the encryption request count RC_E, the decryption request count RC_D, and the policy included in the system.

In an exemplary embodiment, the pattern information PA_I is determined according to a dimension selection signal DS_S that selects some of the system access time, the encryption request count RC_E, the decryption request count RC_D, and the policies included in the system. For example, when the dimension selection signal DS_S is a first selection signal, the pattern information PA_I may be determined based on the system access time, the encryption request count RC_E, and the decryption request count RC_D. When the dimension selection signal DS_S is a second selection signal, the pattern information PA_I may be determined based on the system access time, the encryption request count RC_E, the decryption request count RC_D, and the policy.

In an exemplary embodiment, the pattern information PA_I may be the average, standard deviation, and maximum value of the encryption request count RC_E and the decryption request count RC_D for each system access time period. The pattern information PA_I may be determined based on information about the user, the system access time, the encryption request count RC_E, and the decryption request count RC_D collected during a first time interval.

The pattern information PA_I can be calculated for each time unit. For example, the average, the standard deviation, and the maximum value of the user's encryption request count RC_E at 1:00 PM each day may be included in the pattern information PA_I. Likewise, the average, the standard deviation, and the maximum value of the user's decryption request count RC_D at 1:00 PM each day may be included in the pattern information PA_I.

The pattern information PA_I can also be calculated on a daily basis. For example, the average, the standard deviation, and the maximum value of the user's encryption request count RC_E on the 10th day of each month may be included in the pattern information PA_I, and the average, the standard deviation, and the maximum value of the user's decryption request count RC_D on the 10th day of each month may also be included in the pattern information PA_I.

The pattern information PA_I can likewise be calculated for each day of the week. For example, the average, the standard deviation, and the maximum value of the user's encryption request count RC_E every Monday may be included in the pattern information PA_I, and the average, the standard deviation, and the maximum value of the user's decryption request count RC_D every Monday may also be included in the pattern information PA_I.
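
The per-user statistics described in the last few paragraphs could be computed roughly as follows. The grouping dimensions play the role of the dimension selection signal DS_S, and the field names, bucketing keys, and use of Python's statistics module are assumptions rather than details from the patent.

```python
# Hypothetical sketch: compute pattern information PA_I (average, standard
# deviation, maximum of the encryption/decryption request counts) per group,
# where the grouping dimensions (user plus hour, day, weekday, policy, ...)
# stand in for the dimension selection signal DS_S. Field names are assumed.
import statistics
from collections import defaultdict

def compute_pattern_info(entries, dims=("user", "hour")):
    """entries: iterable of dicts carrying the selected dimensions, rc_e and rc_d."""
    samples = defaultdict(lambda: {"rc_e": [], "rc_d": []})
    for e in entries:
        key = tuple(e[d] for d in dims)
        samples[key]["rc_e"].append(e["rc_e"])
        samples[key]["rc_d"].append(e["rc_d"])

    pattern_info = {}
    for key, counts in samples.items():
        pattern_info[key] = {
            name: {"avg": statistics.mean(values),
                   "std": statistics.pstdev(values),   # population standard deviation
                   "max": max(values)}
            for name, values in counts.items()
        }
    return pattern_info

history = [
    {"user": "user_a", "hour": 13, "rc_e": 4, "rc_d": 2},
    {"user": "user_a", "hour": 13, "rc_e": 6, "rc_d": 3},
]
pa_i = compute_pattern_info(history, dims=("user", "hour"))
print(pa_i[("user_a", 13)]["rc_e"])        # avg 5, std 1.0, max 6
```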

FIG. 10 is a diagram illustrating an example of the multidimensional analyzer included in the personal information fraud detection apparatus of FIG. 7.

Referring to FIG. 10, the pattern information PA_I may be updated at predetermined time intervals, and the pattern information PA_I may be stored in the pattern information register 510 included in the multidimensional analyzer 500.

In an exemplary embodiment, if the user's encryption request count and decryption request count are greater than the maximum value, the multidimensional analyzer 500 may provide a warning message M_D. When the user's encryption request count and decryption request count are smaller than the maximum value and larger than the sum of the average and the standard deviation, the multidimensional analyzer 500 may provide a caution message M_D. When the user's encryption request count and decryption request count are smaller than the sum of the average and the standard deviation, the multidimensional analyzer 500 may provide a safety message M_D. The apparatus 10 for detecting fraudulent use of personal information according to the embodiments of the present invention can determine whether the system is being used fraudulently based on the condition log data CLOG_D generated according to the detection condition D_C.
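
The three-level message decision could then be sketched as below. The function name and return values are illustrative choices, while the boundaries (the maximum, and the average plus standard deviation) follow the text.

```python
# Hypothetical sketch of the multidimensional analyzer's message selection:
# compare the user's current request count with the stored pattern information
# (average, standard deviation, maximum) for the matching time bucket.
def classify_request_count(current_count, avg, std, maximum):
    """Return 'warning', 'caution', or 'safe' according to the thresholds in the text."""
    if current_count > maximum:
        return "warning"                   # above the historical maximum
    if current_count > avg + std:
        return "caution"                   # above average + standard deviation
    return "safe"                          # within the usual range

print(classify_request_count(9, avg=5, std=1.0, maximum=6))   # warning
print(classify_request_count(7, avg=5, std=1.0, maximum=8))   # caution
print(classify_request_count(5, avg=5, std=1.0, maximum=8))   # safe
```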

The apparatus for detecting fraudulent use of personal information according to the embodiments of the present invention can determine whether the system is being used fraudulently based on the condition log data generated according to the detection condition, and can be applied to various security systems.

While the present invention has been described with reference to preferred embodiments thereof, it will be understood by those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (12)

A log data collector for receiving log data provided from a plurality of token servers;
The log data is separated according to predetermined detection conditions including a system access IP address, a system access time, a number of encryption requests corresponding to the number of requests for encryption to the system, and a number of decryption requests corresponding to the number of requests for decryption to the system A condition log data generator for providing condition log data;
A detector for determining whether the system is illegally used based on the condition log data; And
And a multi-dimensional analyzer for analyzing the log data and periodically updating pattern information corresponding to a pattern of accessing the system for each user of the system access IP address at predetermined time intervals,
Wherein the pattern information includes an average, a standard deviation, and a maximum value of the number of encryption requests and the number of decryption requests for each system access time period,
Wherein the multidimensional analyzer analyzes the number of encryption requests in the user's current time zone and the number of decryption requests in the user's current time zone against the average number of encryption requests, the average number of decryption requests, the standard deviation, and the maximum value, and provides one of a warning message, a caution message, and a safety message based thereon.
delete
The apparatus according to claim 1,
Wherein the detector determines that the system is being used fraudulently if the system access IP address is an overseas IP address.
The apparatus of claim 3,
Wherein the detector determines that the system is being used fraudulently if the system access time is a time during which access to the system is not allowed and the number of decryption requests is greater than a predetermined decryption limit count.
The apparatus of claim 4,
Wherein the detector determines that the system is being used fraudulently if the number of decryption requests is greater than the decryption limit count for a policy that is not allowed to be accessed among the policies included in the system.
The apparatus of claim 5,
Wherein the detector determines that the system is being used fraudulently when the data size of the encryption and decryption is larger than a predetermined data limit size and the number of encryption requests and the number of decryption requests are larger than a predetermined encryption limit count and a decryption limit count.
delete
The apparatus according to claim 1,
Wherein the pattern information is determined based on the user, the system access time, the number of encryption requests, the number of decryption requests, and the policy included in the system.
The apparatus of claim 8,
Wherein the pattern information is determined according to a dimension selection signal for selecting some of the system access time, the number of encryption requests, the number of decryption requests, and the policies included in the system.
delete
The apparatus according to claim 1,
Wherein the pattern information is stored in a pattern information register included in the multidimensional analyzer.
The apparatus according to claim 1,
Wherein the multidimensional analyzer provides the warning message when the user's number of encryption requests and number of decryption requests are larger than the maximum value,
Provides the caution message when the user's number of encryption requests and number of decryption requests are smaller than the maximum value and larger than the sum of the average and the standard deviation, and
Provides the safety message when the user's number of encryption requests and number of decryption requests are smaller than the sum of the average and the standard deviation.
KR1020150064461A 2015-05-08 2015-05-08 Device of detecting wrongful use of personal information KR101678179B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150064461A KR101678179B1 (en) 2015-05-08 2015-05-08 Device of detecting wrongful use of personal information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150064461A KR101678179B1 (en) 2015-05-08 2015-05-08 Device of detecting wrongful use of personal information

Publications (2)

Publication Number Publication Date
KR20160131619A KR20160131619A (en) 2016-11-16
KR101678179B1 (en) 2016-11-21

Family

ID=57537845

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150064461A KR101678179B1 (en) 2015-05-08 2015-05-08 Device of detecting wrongful use of personal information

Country Status (1)

Country Link
KR (1) KR101678179B1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102151173B1 (en) * 2019-01-14 2020-09-02 펜타시큐리티시스템 주식회사 Method and apparatus for detecting abnormal behavior of groupware user
KR102134653B1 (en) 2019-11-25 2020-07-16 한국인터넷진흥원 Apparatus for rule optimization to improve detection accuracy for exploit attacks and method thereof
KR20230113121A (en) 2022-01-21 2023-07-28 한양대학교 산학협력단 Heteroaryl derivatives and uses thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002318734A (en) 2001-04-18 2002-10-31 Teamgia:Kk Method and system for processing communication log
JP5179792B2 (en) * 2007-07-13 2013-04-10 株式会社日立システムズ Operation detection system
KR101278971B1 (en) 2011-04-12 2013-07-30 주식회사 위즈디엔에스코리아 Interception system for preventing dishonestly using information and Method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4575190B2 (en) * 2005-02-24 2010-11-04 三菱電機株式会社 Audit log analysis apparatus, audit log analysis method, and audit log analysis program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002318734A (en) 2001-04-18 2002-10-31 Teamgia:Kk Method and system for processing communication log
JP5179792B2 (en) * 2007-07-13 2013-04-10 株式会社日立システムズ Operation detection system
KR101278971B1 (en) 2011-04-12 2013-07-30 주식회사 위즈디엔에스코리아 Interception system for preventing dishonestly using information and Method thereof

Also Published As

Publication number Publication date
KR20160131619A (en) 2016-11-16

Similar Documents

Publication Publication Date Title
US10211978B2 (en) Data security in a disconnected environment
US7415719B2 (en) Policy specification framework for insider intrusions
US9152808B1 (en) Adapting decoy data present in a network
US20060174346A1 (en) Instrumentation for alarming a software product
KR101678179B1 (en) Device of detecting wrongful use of personal information
KR20130118335A (en) Using power fingerprinting (pfp) to monitor the integrity and enhance security of computer based systems
US10462170B1 (en) Systems and methods for log and snort synchronized threat detection
US11070876B2 (en) Security monitoring with attack detection in an audio/video processing device
US11310278B2 (en) Breached website detection and notification
CN114884678A (en) Block chain-based data security management method and system
KR101451782B1 (en) User verification system via mouse movement pattern and method thereof
CN102945254A (en) Method for detecting abnormal data among TB-level mass audit data
GB2535579A (en) Preventing unauthorized access to an application server
US8353032B1 (en) Method and system for detecting identity theft or unauthorized access
Kim et al. A system for detection of abnormal behavior in BYOD based on web usage patterns
US8819815B1 (en) Method and system for distributing and tracking information
Weng et al. TLSmell: Direct Identification on Malicious HTTPs Encryption Traffic with Simple Connection-Specific Indicators.
US11882107B2 (en) Application single sign-on determinations based on intelligent traces
GB2505529A (en) Protecting a user from compromised web resources
JP5454166B2 (en) Access discrimination program, apparatus, and method
US9043943B1 (en) Self-destructing content
KR101566882B1 (en) System and method for monitoring encrypted database and preventing massive decryption
CN116980237B (en) Urban safety informatization data acquisition method
Margulies Sweeping Claims and Casual Legal Analysis in the Latest UN Mass Surveillance Report
Нестерчук et al. SECURITY OF PERSONAL DATE

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant