WO2022107241A1 - Processing device, processing method, and program - Google Patents

Processing device, processing method, and program

Info

Publication number: WO2022107241A1
Authority: WIPO (PCT)
Prior art keywords: fraud, data, call, possibility, fraudulent
Application number: PCT/JP2020/042976
Other languages: French (fr), Japanese (ja)
Inventor: Shota Endo (渉太 遠藤)
Original Assignee: Nippon Telegraph and Telephone Corporation (日本電信電話株式会社)
Application filed by Nippon Telegraph and Telephone Corporation
Priority to PCT/JP2020/042976
Publication of WO2022107241A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 3/00: Automatic or semi-automatic exchanges
    • H04M 3/42: Systems providing special services or facilities to subscribers

Definitions

  • The present invention relates to a processing device, a processing method, and a program.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a technique capable of appropriately determining the possibility that a call is fraudulent.
  • A processing device according to one aspect of the present invention includes: an acquisition unit that acquires voice data of a fraud suspect in a call between the fraud suspect and a user; an estimation unit that estimates the fraud technique used by the fraud suspect by comparing utterances in the voice data with pattern data that associates an identifier of a fraud technique using calls with the person the fraud suspect impersonates in that technique and with keywords used in the fraud suspect's utterances in that technique; a calculation unit that refers to priority data associating a technique identifier, identifiers of analysis methods for analyzing the possibility that the call is fraudulent, and priorities for analyzing with each analysis method the possibility that the call uses that technique, and that calculates, as a risk level, the possibility of fraud analyzed by the high-priority analysis methods among the analysis methods associated with the estimated technique; and a notification unit that notifies the calculated risk level.
  • In a processing method according to one aspect of the present invention, a computer acquires voice data of a fraud suspect in a call between the fraud suspect and a user; the computer estimates the fraud technique used by the fraud suspect by comparing utterances in the voice data with pattern data that associates an identifier of a fraud technique using calls with the person the fraud suspect impersonates in that technique and with keywords used in the fraud suspect's utterances in that technique; the computer refers to priority data associating a technique identifier, identifiers of analysis methods for analyzing the possibility that the call is fraudulent, and priorities for analyzing with each analysis method the possibility that the call uses that technique, and calculates, as a risk level, the possibility of fraud analyzed by the high-priority analysis methods among the analysis methods associated with the estimated technique; and the computer notifies the calculated risk level.
  • One aspect of the present invention is a program that causes a computer to function as the above-mentioned processing device.
  • FIG. 1 is a diagram illustrating a functional block of a processing system according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an example of a data structure of call data.
  • FIG. 3 is a diagram illustrating an example of a data structure of pattern data.
  • FIG. 4 is a diagram illustrating an example of a data structure of priority data.
  • FIG. 5 is a diagram illustrating an example of a data structure of voiceprint data.
  • FIG. 6 is a diagram illustrating an example of a data structure of number data.
  • FIG. 7 is a flowchart illustrating an example of the calculation process by the calculation unit.
  • FIG. 8 is a diagram illustrating a functional block of the sentiment analysis unit.
  • FIG. 9 is a diagram illustrating an example of a data structure of technique emotion data.
  • FIG. 10 is a diagram illustrating an example of a data structure of utterance emotion data.
  • FIG. 11 is a flowchart illustrating an example of sentiment analysis processing by the sentiment analysis unit.
  • FIG. 12 is a diagram illustrating a hardware configuration of a computer used in a processing device.
  • The processing device 1 outputs the possibility that a call is being used for fraud as a risk level.
  • Fraud using a call is a type of special fraud in which a fraudster impersonates a specific person and deceives the other party on the call.
  • Fraud using a call includes, for example, ore-ore fraud, in which the fraudster impersonates a close relative of the other party, and refund fraud, in which the fraudster impersonates a member of an organization such as a city hall employee or a bank clerk.
  • The processing device 1 acquires data on a call between the calling terminal 61 and the receiving terminal 62 from the call system 6 and calculates the risk level.
  • Of the two callers using the calling terminal 61 and the receiving terminal 62, one is a user of the risk notification service provided by the processing device 1, and the other is a fraud suspect, that is, a party suspected of fraud.
  • In the embodiment of the present invention, the caller other than the user is referred to as the fraud suspect.
  • The fraud suspect is the party for whom a risk level is output indicating whether special fraud is being committed in the call with the user.
  • The fraud suspect may or may not actually be committing fraud. If the fraud suspect is likely to be committing fraud, the risk level is high; if not, the risk level is low.
  • The processing device 1 includes call data 11, pattern data 12, priority data 13, voiceprint data 14, number data 15, an acquisition unit 21, an estimation unit 22, a calculation unit 23, a notification unit 24, and a feedback unit 25.
  • The call data 11, the pattern data 12, the priority data 13, the voiceprint data 14, and the number data 15 are stored in the memory 902, the storage 903, or the like.
  • The acquisition unit 21, the estimation unit 22, the calculation unit 23, the notification unit 24, and the feedback unit 25 are functions implemented in the CPU 901.
  • The call data 11 is data on the call for which the risk level is calculated.
  • The call data 11 is used by the processing device 1 to calculate the risk of fraud in the call.
  • As shown in FIG. 2, the call data 11 includes the telephone number and voice data of the calling terminal 61 and the telephone number and voice data of the receiving terminal 62.
  • The call data 11 may include at least the voice data of the fraud suspect's utterances, a telephone number, and the like.
  • The pattern data 12 associates an identifier of a fraud technique using calls with the person the fraud suspect impersonates in that technique and with keywords used in the fraud suspect's utterances in that technique.
  • The pattern data 12 is used to identify, from the combination of impersonation target and keywords, the fraud technique presumed to be in use when the call being processed is a special fraud.
  • A single technique identifier may be associated with multiple impersonation targets or multiple keywords.
  • In the pattern data 12 shown in FIG. 3, the technique identifier "ore-ore fraud" is associated with a data set consisting of the impersonation target "son" and the keyword "accident".
  • The technique identifier "refund fraud" is associated with two data sets: the impersonation target "bank clerk" with the keyword "account", and the impersonation target "city hall employee" with the keyword "medical expenses".
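  • As an illustration only (the publication does not prescribe a storage format), the pattern data 12 could be held in memory as in the following sketch; the Python structure and field names are hypothetical:

```python
# Hypothetical in-memory representation of pattern data 12, mirroring the
# FIG. 3 example: each technique identifier maps to one or more data sets
# of (impersonation target, keyword).
PATTERN_DATA = {
    "ore-ore fraud": [
        {"impersonation_target": "son", "keyword": "accident"},
    ],
    "refund fraud": [
        {"impersonation_target": "bank clerk", "keyword": "account"},
        {"impersonation_target": "city hall employee", "keyword": "medical expenses"},
    ],
}
```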
  • The priority data 13 associates a technique identifier, identifiers of analysis methods for analyzing the possibility that a call is fraudulent, and priorities for analyzing with each analysis method the possibility that the call uses that technique.
  • The priority data 13 is used to calculate the risk level while giving more weight to the analysis methods suited to the technique.
  • The priority expresses how likely it is that analyzing the technique associated with that priority in the priority data 13, using the analysis method associated with that priority, will reveal whether the call is fraudulent. When the priority is high, analyzing that technique with that analysis method is likely to reveal whether it is fraud; when the priority is low, analyzing that technique with that analysis method is unlikely to do so.
  • As shown in FIG. 4, the priority data 13 associates each technique identifier with high-priority analysis methods and low-priority analysis methods.
  • In the priority data 13, whether the priority is high or low for each technique is set for each analysis method implemented by the processing device 1.
  • In the example shown in FIG. 4, for ore-ore fraud, number analysis using the individual whitelist and voiceprint analysis have high priority, while sentiment analysis has low priority.
  • In the example shown in FIG. 4, the priority data 13 expresses the priority as one of two values, "high" or "low", but the priority is not limited to this.
  • The priority may be set as any numerical value or any number of labels, as long as it can be converted into a weight for each analysis method.
  • When a numerical priority is set for each technique and each analysis method in the priority data 13, the priority may be treated as high or low depending on whether it is equal to or greater than a predetermined threshold value.
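  • A minimal sketch of how the priority data 13 and its conversion into weights might look, assuming the two labels of FIG. 4 and an arbitrary label-to-weight mapping (neither the numeric values nor the method names are specified in the publication):

```python
# Hypothetical representation of priority data 13 for one technique.
PRIORITY_DATA = {
    "ore-ore fraud": {
        "number analysis (individual whitelist)": "high",
        "voiceprint analysis": "high",
        "sentiment analysis": "low",
    },
}

# One possible conversion of the "high"/"low" labels into numeric weights;
# the concrete values 1.0 and 0.2 are assumptions.
LABEL_TO_WEIGHT = {"high": 1.0, "low": 0.2}

def get_weights(technique: str) -> dict[str, float]:
    """Return a weight for each analysis method of the estimated technique."""
    return {method: LABEL_TO_WEIGHT[label]
            for method, label in PRIORITY_DATA[technique].items()}

print(get_weights("ore-ore fraud"))
```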
  • The voiceprint data 14 is referred to in order to identify the fraud suspect from their voiceprint. As shown in FIG. 5, the voiceprint data 14 includes individual whitelists and a shared blacklist.
  • An individual whitelist contains the voiceprint data of the user's close relatives, such as the user's son or daughter, whom a fraud suspect may impersonate.
  • An individual whitelist is generated for each individual user, such as user A and user B.
  • When the voiceprint data 14 is used for the call being processed, the individual whitelist of the user on that call is referred to.
  • The shared blacklist contains the voiceprint data of known fraudsters.
  • The shared blacklist is provided by a public authority such as the police.
  • Each fraudster's identifier on the shared blacklist is associated with a fraud possibility and a score.
  • The score is updated through user feedback, described later, and the fraud possibility is set from the score.
  • The number data 15 is referred to in order to identify the nature of the telephone number used by the fraud suspect. As shown in FIG. 6, the number data 15 includes a shared whitelist, individual whitelists, and a shared blacklist.
  • The shared whitelist contains the telephone numbers of parties a fraud suspect may impersonate, such as police stations and banks, or of the organizations to which they belong.
  • An individual whitelist contains the telephone numbers of the user's close relatives, such as the user's son or daughter, whom a fraud suspect may impersonate.
  • An individual whitelist is generated for each individual user, such as user A and user B.
  • When the number data 15 is used for the call being processed, the individual whitelist of the user on that call is referred to.
  • The shared blacklist contains the telephone numbers of known fraudsters. The shared blacklist is provided by a public authority such as the police. Each fraudster's identifier on the shared blacklist is associated with a fraud possibility and a score. The score is updated through user feedback, described later, and the fraud possibility is set from the score.
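  • As a rough sketch of the whitelist/blacklist layout described above (field names and values are hypothetical; the voiceprint data 14 would be analogous, with voiceprint feature data in place of telephone numbers):

```python
# Hypothetical layout of number data 15.
NUMBER_DATA = {
    # Shared whitelist: numbers of organizations a fraud suspect may impersonate.
    "shared_whitelist": {"+81-3-0000-0000"},
    # Individual whitelists: one per user, holding close relatives' numbers.
    "individual_whitelist": {
        "user_A": {"+81-90-1111-2222"},
    },
    # Shared blacklist: known fraudsters, each with a fraud possibility and a score.
    "shared_blacklist": {
        "+81-90-9999-8888": {"fraud_possibility": 0.9, "score": 12},
    },
}
```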
  • The acquisition unit 21 acquires the voice data, telephone number, and other information of the fraud suspect in the call between the fraud suspect and the user, and generates the call data 11.
  • The acquisition unit 21 acquires data on the call from whichever of the calling terminal 61 and the receiving terminal 62 the user is using, or from the call control system 55 that controls the call between the calling terminal 61 and the receiving terminal 62.
  • The acquisition unit 21 may notify, during the call, that the voice is being acquired.
  • The estimation unit 22 compares the pattern data 12 with the utterances in the voice data and estimates the fraud technique used by the fraud suspect.
  • The estimation unit 22 converts the voice data into text data with a speech recognition system and extracts keywords from the fraud suspect's utterances in the text data by natural language processing.
  • The estimation unit 22 estimates that the technique whose identifier is associated, in the keyword column of the pattern data 12, with an extracted keyword is the fraud technique being used in the target call.
  • The estimation unit 22 may handle synonyms of the extracted keywords in the same way as the keywords themselves, estimating that the technique whose keyword column in the pattern data 12 contains the synonym is the fraud technique being used in the target call.
  • When the extracted keywords match multiple techniques, the estimation unit 22 may estimate the technique that matches the larger number of keywords as the fraud technique.
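  • A minimal sketch of the keyword-based estimation, assuming keywords have already been extracted by speech recognition and natural language processing; the counting rule for ties and the omission of synonym handling are simplifications:

```python
from collections import Counter

def estimate_technique(extracted_keywords: list[str], pattern_data: dict) -> str | None:
    """Match extracted keywords against pattern data 12 and return the
    technique whose keyword column matches the most keywords."""
    hits = Counter()
    for technique, datasets in pattern_data.items():
        for dataset in datasets:
            if dataset["keyword"] in extracted_keywords:
                hits[technique] += 1
    return hits.most_common(1)[0][0] if hits else None

pattern_data = {"ore-ore fraud": [{"impersonation_target": "son", "keyword": "accident"}]}
print(estimate_technique(["accident", "money"], pattern_data))  # -> "ore-ore fraud"
```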
  • The calculation unit 23 refers to the priority data 13 and calculates, as the risk level, the possibility of fraud analyzed by the high-priority analysis methods among the analysis methods associated with the technique estimated by the estimation unit 22.
  • In the embodiment of the present invention, the calculation unit 23 includes a priority acquisition unit 31, a sentiment analysis unit 32, a voiceprint analysis unit 33, and a number analysis unit 34.
  • The sentiment analysis unit 32, the voiceprint analysis unit 33, and the number analysis unit 34 each analyze the possibility that the call is fraudulent.
  • In the embodiment of the present invention, three analysis methods are implemented, but two analysis methods or four or more analysis methods may be used.
  • The priority acquisition unit 31 refers to the priority data 13 for the analysis methods associated with the technique estimated by the estimation unit 22 and acquires the priority of each analysis method implemented by the processing device 1. For example, if the technique estimated by the estimation unit 22 is ore-ore fraud, the priority acquisition unit 31 acquires number analysis using the individual whitelist and voiceprint analysis as high-priority analysis methods, and sentiment analysis as a low-priority analysis method. When the priority data 13 sets the weight of each analysis method as a numerical value or a label, the priority acquisition unit 31 acquires the numerical value or label of each analysis method.
  • The calculation unit 23 also causes the sentiment analysis unit 32, the voiceprint analysis unit 33, and the number analysis unit 34 to each analyze the call data 11 and calculate a possibility of fraud.
  • The voiceprint analysis unit 33 compares the fraud suspect's voiceprint in the call data 11 with each voiceprint in the voiceprint data 14; if it matches a whitelist voiceprint, the possibility of fraud is set to zero, and if it matches a blacklist voiceprint, the possibility of fraud associated with that matching voiceprint is acquired.
  • The possibility of fraud may be adjusted according to the similarity between a voiceprint registered in the voiceprint data 14 and the fraud suspect's voiceprint. For example, the voiceprint analysis unit 33 may calculate the similarity between the fraud suspect's voiceprint and a blacklist voiceprint, and output, as the possibility of fraud, the value obtained by multiplying the possibility of fraud associated with that blacklist voiceprint by the similarity.
  • The voiceprint analysis unit 33 may also calculate the similarity between the fraud suspect's voiceprint and a whitelist voiceprint and output a value negatively correlated with the calculated similarity as the possibility of fraud.
  • The number analysis unit 34 compares the fraud suspect's telephone number in the call data 11 with each telephone number in the number data 15; if it matches a whitelist number, the possibility of fraud is set to zero, and if it matches a blacklist number, the possibility of fraud associated with that matching number is acquired.
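  • A sketch of the number analysis lookup under the behavior described above; the neutral value returned for an unknown number is an assumption, and voiceprint analysis would follow the same pattern with a similarity score instead of exact matching:

```python
def analyze_number(suspect_number: str, number_data: dict, user_id: str) -> float:
    """Return a fraud possibility in [0, 1] from number data 15:
    zero for a whitelist hit, the registered possibility for a blacklist hit,
    and an assumed neutral value otherwise."""
    if (suspect_number in number_data["shared_whitelist"]
            or suspect_number in number_data["individual_whitelist"].get(user_id, set())):
        return 0.0
    entry = number_data["shared_blacklist"].get(suspect_number)
    if entry is not None:
        return entry["fraud_possibility"]
    return 0.5  # unknown number: neutral default (assumption)

number_data = {
    "shared_whitelist": {"+81-3-0000-0000"},
    "individual_whitelist": {"user_A": {"+81-90-1111-2222"}},
    "shared_blacklist": {"+81-90-9999-8888": {"fraud_possibility": 0.9, "score": 12}},
}
print(analyze_number("+81-90-9999-8888", number_data, "user_A"))  # 0.9
```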
  • The sentiment analysis unit 32 analyzes the fraud suspect's emotions from the fraud suspect's voice data in the call data 11 and calculates the possibility of fraud from, for example, a match with the emotions observed during fraud or a contradiction with the emotions of a non-fraudulent speaker. The processing of the sentiment analysis unit 32 is described in detail later.
  • The calculation unit 23 calculates the risk level of the call from the possibilities of fraud calculated by the priority acquisition unit 31, the sentiment analysis unit 32, the voiceprint analysis unit 33, and the number analysis unit 34.
  • For example, the calculation unit 23 refers to the priority data and calculates, as the risk level, the possibility of fraud analyzed by the high-priority analysis methods among the analysis methods associated with the technique estimated by the estimation unit 22.
  • When the technique estimated by the estimation unit 22 is ore-ore fraud and the priority acquisition unit 31 acquires number analysis using the individual whitelist and voiceprint analysis as the high-priority analysis methods, the calculation unit 23 acquires the possibility of fraud based on the collation result against the individual whitelist in the number analysis unit 34 and the possibility of fraud based on the collation result against the voiceprint data 14 in the voiceprint analysis unit 33, and calculates the risk level from these possibilities of fraud.
  • The risk level may be the average of these possibilities of fraud.
  • The method of calculating the risk level only from the possibilities of fraud obtained by the high-priority analysis methods has been described, but the calculation is not limited to this. The calculation unit 23 may calculate the risk level by giving a high weight to the possibilities of fraud calculated by high-priority analysis methods and a low weight to the possibilities of fraud calculated by low-priority analysis methods.
  • In that case, the priority acquisition unit 31 refers to the priority data 13 and acquires the priority of each analysis method associated with the technique estimated by the estimation unit 22.
  • The calculation unit 23 calculates the risk level by weighting the possibility of fraud obtained by each analysis method with the priority of that analysis method. For example, if the technique estimated by the estimation unit 22 is ore-ore fraud, the priority acquisition unit 31 refers to the priority data 13 and acquires, for ore-ore fraud, the priority of number analysis using the individual whitelist, the priority of voiceprint analysis, and the priority of sentiment analysis.
  • The calculation unit 23 then calculates the risk level by multiplying each of the possibility of fraud from number analysis using the individual whitelist, the possibility of fraud from voiceprint analysis, and the possibility of fraud from sentiment analysis by the priority acquired by the priority acquisition unit 31, and adding the products.
  • The method by which the calculation unit 23 calculates the risk level is not limited to this; it suffices that the possibility of fraud output by each analysis method is weighted according to the priority of that analysis method as set for each fraud technique.
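  • A sketch of a priority-weighted risk calculation. The publication only requires that higher-priority methods receive larger weights; the normalized weighted average below and the concrete numbers are illustrative assumptions:

```python
def calculate_risk(fraud_possibilities: dict[str, float],
                   weights: dict[str, float]) -> float:
    """Weighted average of per-method fraud possibilities, using the
    weights derived from the priorities of the estimated technique."""
    total_weight = sum(weights.get(m, 0.0) for m in fraud_possibilities)
    if total_weight == 0.0:
        return 0.0
    return sum(p * weights.get(m, 0.0)
               for m, p in fraud_possibilities.items()) / total_weight

possibilities = {"number analysis (individual whitelist)": 0.9,
                 "voiceprint analysis": 0.7,
                 "sentiment analysis": 0.4}
weights = {"number analysis (individual whitelist)": 1.0,
           "voiceprint analysis": 1.0,
           "sentiment analysis": 0.2}
print(round(calculate_risk(possibilities, weights), 2))  # 0.76
```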
  • The processing of the calculation unit 23 according to the embodiment of the present invention is described with reference to FIG. 7.
  • The process shown in FIG. 7 is an example, and the processing is not limited to it.
  • In step S101, the calculation unit 23 refers to the priority data 13 and acquires the priority of each analysis method for the technique estimated by the estimation unit 22.
  • In step S102, the calculation unit 23 calculates the possibility of fraud by sentiment analysis.
  • In step S103, the calculation unit 23 calculates the possibility of fraud by voiceprint analysis.
  • In step S104, the calculation unit 23 calculates the possibility of fraud by number analysis.
  • In step S105, the calculation unit 23 calculates the risk level of the call according to the priorities acquired in step S101.
  • The calculation unit 23 may calculate the risk level from the possibilities of fraud obtained by the high-priority analysis methods.
  • Alternatively, the calculation unit 23 may calculate the risk level by weighting the possibilities of fraud obtained by the analysis methods in steps S102 to S104 with the priorities acquired in step S101.
  • The notification unit 24 outputs the risk level calculated by the calculation unit 23.
  • The risk level may be displayed on the user's telephone, for example, or notified by voice so that only the user can hear it.
  • The risk level may also be notified to a telephone number, an e-mail address, an SNS (Social Networking Service) account, or the like of another person or another terminal registered in advance as a contact for the user.
  • The notification unit 24 does not have to notify the risk level when the voiceprint or number has been matched against a whitelist. The notification method may also be changed depending on whether the risk level is above a threshold value. For example, when the risk level is above the threshold value, the notification unit 24 may notify through a highly immediate means, such as calling a pre-registered telephone number, and when the risk level is below the threshold value, it may notify through a less immediate means, such as sending a message to a pre-registered e-mail address.
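  • A minimal sketch of choosing a notification channel by threshold, as described above; the threshold value and channel descriptions are placeholders:

```python
def choose_notification_channel(risk: float, threshold: float = 0.7) -> str:
    """Pick a more or less immediate notification means based on the risk level."""
    if risk >= threshold:
        return "call a pre-registered telephone number"  # highly immediate means
    return "send a message to a pre-registered e-mail address"  # less immediate means

print(choose_notification_channel(0.85))
print(choose_notification_channel(0.30))
```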
  • The feedback unit 25 has the user input an evaluation of the fraud suspect who was the other party on the call, and updates the blacklist scores in the voiceprint data 14 and the number data 15 according to the input evaluation.
  • When the user inputs a negative evaluation, such as judging that the fraud suspect is a fraudster, the feedback unit 25 increments the score corresponding to that fraud suspect. When the user inputs a positive evaluation, such as judging that the fraud suspect is not a fraudster, the feedback unit 25 decrements the score corresponding to that fraud suspect.
  • At a predetermined timing, the feedback unit 25 refers to the score of each fraudster in the blacklists of the voiceprint data 14 and the number data 15 and updates the value of each fraudster's fraud possibility.
  • The feedback unit 25 may add data on the fraud suspect to the voiceprint data 14 and the number data 15. The feedback unit 25 may also update the voiceprint data 14 and the number data 15 based on information obtained from a public institution such as the police.
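  • A sketch of the feedback-driven score update; incrementing and decrementing follow the description above, while the mapping from score to fraud possibility is an assumption the publication leaves open:

```python
def apply_feedback(blacklist_entry: dict, judged_fraud: bool) -> None:
    """Update a blacklist entry's score from user feedback, then refresh
    its fraud possibility from the score."""
    blacklist_entry["score"] += 1 if judged_fraud else -1
    # Assumed mapping: clamp a linear scaling of the score into [0, 1].
    blacklist_entry["fraud_possibility"] = max(0.0, min(1.0, blacklist_entry["score"] / 20))

entry = {"fraud_possibility": 0.60, "score": 12}
apply_feedback(entry, judged_fraud=True)
print(entry)  # {'fraud_possibility': 0.65, 'score': 13}
```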
  • As described above, the processing device 1 according to the embodiment of the present invention estimates the fraud technique from the fraud suspect's utterances in the call, changes the weight of each analysis method according to the estimated technique, and calculates the risk of fraud. Since the processing device 1 weights the possibilities of fraud output by the analysis methods according to the estimated technique, it can appropriately determine the possibility that the call is fraudulent.
  • The sentiment analysis unit 32 includes the call data 11, emotion score data 41, technique emotion data 42, utterance emotion data 43, a score calculation unit 51, a feature detection unit 52, a contradiction detection unit 53, and a sentiment analysis output unit 54.
  • The call data 11, the emotion score data 41, the technique emotion data 42, and the utterance emotion data 43 are stored in the memory 902, the storage 903, or the like.
  • The score calculation unit 51, the feature detection unit 52, the contradiction detection unit 53, and the sentiment analysis output unit 54 are functions implemented in the CPU 901.
  • The call data 11 is as described with reference to FIGS. 1 and 2.
  • The technique emotion data 42 associates an identifier of a fraud technique using calls with the tendency of the fraud suspect's emotion scores in that technique.
  • As the tendency of the emotion scores, the tendency of each emotion type is associated with the technique identifier.
  • The tendency of the emotion scores in the technique emotion data 42 may include temporal changes in the emotion scores, such as at the time an incoming call is received or when a specific utterance is made.
  • In the example shown in FIG. 9, the joy emotion score is high at the time of the incoming call, and the calm emotion score is low when a matter such as an accident is disclosed.
  • The technique emotion data 42 may associate a single technique identifier with the tendencies of multiple emotion scores.
  • The utterance emotion data 43 associates a keyword with the characteristics of the emotion scores of a non-fraudulent speaker when that keyword is uttered.
  • The utterance emotion data 43 specifies the emotions of an ordinary person who is not a fraud suspect when uttering a keyword. As shown in FIG. 10, the utterance emotion data 43 associates a keyword with a tendency of the emotion scores. In the example shown in FIG. 10, the joy emotion score tends to be low for the keyword "accident".
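  • As an illustration, the technique emotion data 42 and utterance emotion data 43 could be represented as follows, following the FIG. 9 and FIG. 10 examples; the field names and the "high"/"low" labels are hypothetical:

```python
# Hypothetical representation of technique emotion data 42: emotion score
# tendencies of a fraud suspect, possibly tied to moments in the call.
TECHNIQUE_EMOTION_DATA = {
    "ore-ore fraud": [
        {"moment": "incoming call", "emotion": "joy", "tendency": "high"},
        {"moment": "disclosing an accident", "emotion": "calm", "tendency": "low"},
    ],
}

# Hypothetical representation of utterance emotion data 43: emotion score
# tendencies of a non-fraudulent speaker when uttering a keyword.
UTTERANCE_EMOTION_DATA = {
    "accident": {"emotion": "joy", "tendency": "low"},
}
```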
  • The score calculation unit 51 calculates the fraud suspect's emotion scores from the voice data in the call data 11.
  • In the embodiment of the present invention, the score calculation unit 51 calculates emotion scores for four emotion types: joy, anger, sadness, and calm.
  • The score calculation unit 51 calculates the emotion scores sequentially as the fraud suspect's utterances progress over time.
  • The score calculation unit 51 may calculate the fraud suspect's emotion scores by a commonly used method.
  • The feature detection unit 52 refers to the technique emotion data 42 and, when the fraud suspect's emotion scores match the emotion score tendency associated with the technique estimated by the estimation unit 22, detects that the call is likely to use that technique.
  • The feature detection unit 52 determines the possibility of fraud based on whether the characteristics of the specific fraud technique estimated by the estimation unit 22 appear in the fraud suspect's emotion scores.
  • The feature detection unit 52 determines that the possibility of fraud is high for the call being processed when the characteristics of the estimated technique appear, and that the possibility of fraud is low when they do not appear.
  • For example, when the emotion scores calculated by the score calculation unit 51 show a high joy score at the time of the incoming call and a low calm score when a matter such as an accident is disclosed, as in the example shown in FIG. 9, the feature detection unit 52 determines that the possibility of fraud is high.
  • The feature detection unit 52 may also calculate the possibility of fraud based on the similarity between the emotion tendency of the technique registered in the technique emotion data 42 and the emotion score tendency calculated by the score calculation unit 51.
  • When the similarity is high, the feature detection unit 52 calculates a high possibility of fraud; when the similarity is low, it calculates a low possibility of fraud.
  • Since the feature detection unit 52 can change the viewpoint from which the fraud suspect's emotions are analyzed depending on the estimated technique, the possibility of fraud can be calculated appropriately.
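  • A rough sketch of the feature detection, assuming emotion scores in [0, 1]; using the fraction of matched tendencies as the similarity, and ignoring the moment of each tendency, are simplifying assumptions rather than the prescribed formula:

```python
def detect_feature(emotion_scores: dict[str, float],
                   expected_tendencies: list[dict],
                   threshold: float = 0.5) -> float:
    """Fraud possibility from how well the suspect's emotion scores match
    the tendencies registered for the estimated technique."""
    matched = 0
    for tendency in expected_tendencies:
        score = emotion_scores.get(tendency["emotion"], 0.0)
        if tendency["tendency"] == "high" and score >= threshold:
            matched += 1
        elif tendency["tendency"] == "low" and score < threshold:
            matched += 1
    return matched / len(expected_tendencies) if expected_tendencies else 0.0

scores = {"joy": 0.8, "anger": 0.1, "sadness": 0.2, "calm": 0.3}
expected = [{"emotion": "joy", "tendency": "high"}, {"emotion": "calm", "tendency": "low"}]
print(detect_feature(scores, expected))  # 1.0
```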
  • The contradiction detection unit 53 refers to the utterance emotion data 43 and, when the emotion scores calculated by the score calculation unit 51 contradict the characteristics of the emotion scores of a non-fraudulent speaker in the utterance emotion data 43, detects that the call is highly likely to be fraudulent.
  • The contradiction detection unit 53 determines the possibility of fraud based on whether the fraud suspect's emotion scores contradict the emotional tendency of an ordinary person who is not a fraud suspect.
  • The contradiction detection unit 53 determines that the possibility of fraud is high for the call being processed when the scores contradict the emotional tendency of an ordinary person, and that the possibility of fraud is low when they show a similar tendency.
  • For example, when the scores calculated by the score calculation unit 51 show a low joy score, as indicated for the keyword "accident" in the example shown in FIG. 10, the contradiction detection unit 53 determines that the possibility of fraud is low; when the joy score is high, it determines that the possibility of fraud is high.
  • The contradiction detection unit 53 may also calculate the possibility that the call is fraudulent from the dissimilarity between the emotion scores calculated by the score calculation unit 51 and the emotion score tendency specified in the utterance emotion data 43.
  • The contradiction detection unit 53 determines that the call is likely to be fraudulent when the dissimilarity between the emotion scores calculated by the score calculation unit 51 and the emotion score tendency specified in the utterance emotion data 43 is high, and that the call is unlikely to be fraudulent when the dissimilarity is low.
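  • A rough sketch of the contradiction detection, assuming emotion scores in [0, 1]; treating the score itself (or its complement) as the dissimilarity is an illustrative choice, not the publication's formula:

```python
def detect_contradiction(emotion_scores: dict[str, float],
                         keyword: str,
                         utterance_emotion_data: dict) -> float:
    """Fraud possibility from the mismatch between the suspect's emotion
    score and the tendency expected of a non-fraudulent speaker."""
    expected = utterance_emotion_data.get(keyword)
    if expected is None:
        return 0.0
    score = emotion_scores.get(expected["emotion"], 0.0)
    if expected["tendency"] == "low":
        return score        # high score where a low one is expected looks suspicious
    return 1.0 - score      # low score where a high one is expected looks suspicious

data = {"accident": {"emotion": "joy", "tendency": "low"}}
print(detect_contradiction({"joy": 0.8}, "accident", data))  # 0.8
```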
  • The sentiment analysis output unit 54 outputs the possibility of fraud based on the analyses of the feature detection unit 52 and the contradiction detection unit 53.
  • The sentiment analysis output unit 54 outputs, for example, the average of the possibility of fraud determined by the feature detection unit 52 and the possibility of fraud determined by the contradiction detection unit 53 as the possibility of fraud by sentiment analysis.
  • The sentiment analysis output unit 54 may output the possibility of fraud by sentiment analysis from the possibility of fraud of at least one of the feature detection unit 52 and the contradiction detection unit 53, and the calculation method is not limited.
  • The sentiment analysis processing by the sentiment analysis unit 32 is described with reference to FIG. 11.
  • The process shown in FIG. 11 is an example, and the processing is not limited to it.
  • In step S201, the sentiment analysis unit 32 initializes the possibility of fraud by sentiment analysis.
  • In step S202, the sentiment analysis unit 32 calculates the score of each emotion from the fraud suspect's voice data.
  • In step S203, the sentiment analysis unit 32 determines whether the emotion scores calculated in step S202 show the emotion tendency that the technique emotion data 42 associates with the estimated technique. If they do, the sentiment analysis unit 32 raises the possibility of fraud by sentiment analysis in step S204. If they do not, the sentiment analysis unit 32 lowers the possibility of fraud by sentiment analysis in step S205.
  • In step S206, the sentiment analysis unit 32 refers to the utterance emotion data 43 and determines whether there is a contradiction between the content of the fraud suspect's utterance and the emotion scores at the time of the utterance. If there is a contradiction, the sentiment analysis unit 32 raises the possibility of fraud by sentiment analysis in step S207. If there is no contradiction, the sentiment analysis unit 32 lowers the possibility of fraud by sentiment analysis in step S208.
  • In step S209, the sentiment analysis unit 32 outputs the possibility of fraud by sentiment analysis obtained through the processing of steps S201 to S208.
  • Since the sentiment analysis unit 32 analyzes the possibility of fraud based on whether the characteristics of the estimated technique appear in the fraud suspect's voice data, it can appropriately determine, using sentiment analysis, the possibility that the call is fraudulent. Furthermore, since the sentiment analysis unit 32 analyzes the possibility of fraud based on whether the emotions in the fraud suspect's utterances differ from the emotions of an ordinary person making the same utterances, it can also appropriately determine, using sentiment analysis, the possibility that the call is fraudulent.
  • As the processing device 1 of the present embodiment described above, for example, a general-purpose computer system including a CPU (Central Processing Unit, processor) 901, a memory 902, a storage 903 (HDD: Hard Disk Drive, SSD: Solid State Drive), a communication device 904, an input device 905, and an output device 906 is used.
  • In this computer system, each function of the processing device 1 is realized by the CPU 901 executing a predetermined program loaded into the memory 902.
  • The processing device 1 may be implemented on a single computer or on multiple computers. The processing device 1 may also be a virtual machine implemented on a computer.
  • The program of the processing device 1 can be stored in a computer-readable recording medium such as an HDD, an SSD, a USB (Universal Serial Bus) memory, a CD (Compact Disc), or a DVD (Digital Versatile Disc), or can be distributed via a network.
  • The present invention is not limited to the above embodiment, and many modifications can be made within the scope of its gist.
  • 1 Processing device, 6 Call system, 11 Call data, 12 Pattern data, 13 Priority data, 14 Voiceprint data, 15 Number data, 21 Acquisition unit, 22 Estimation unit, 23 Calculation unit, 24 Notification unit, 25 Feedback unit, 31 Priority acquisition unit, 32 Sentiment analysis unit, 33 Voiceprint analysis unit, 34 Number analysis unit, 41 Emotion score data, 42 Technique emotion data, 43 Utterance emotion data, 51 Score calculation unit, 52 Feature detection unit, 53 Contradiction detection unit, 54 Sentiment analysis output unit, 55 Call control system, 61 Calling terminal, 62 Receiving terminal, 901 CPU, 902 Memory, 903 Storage, 904 Communication device, 905 Input device, 906 Output device

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Telephonic Communication Services (AREA)

Abstract

A processing device 1 comprises: an acquisition unit 21 that acquires voice data of a fraud suspect in a call between the fraud suspect and a user; an estimation unit 22 that estimates the fraud trick used by the fraud suspect by comparing, with utterances in the voice data, pattern data 12 that associates an identifier of a trick using calls, the person the fraud suspect impersonates in the trick, and keywords used in the fraud suspect's utterances in the trick; a calculation unit 23 that refers to priority data associating the identifier of the trick, identifiers of analysis schemes for analyzing the possibility that a call is fraudulent, and priority levels for analyzing with each analysis scheme the possibility that the call uses the trick, and calculates, as a risk level, the possibility of fraud analyzed by the high-priority analysis schemes among the analysis schemes associated with the estimated trick; and a notification unit 24 that notifies the calculated risk level.

Description

Processing device, processing method, and program
The present invention relates to a processing device, a processing method, and a program.
Special frauds in which a caller uses a call to impersonate a user's close relative, or an employee of an organization such as a city hall, and defraud the user are increasing. Devices exist that detect special fraud by analyzing the voice data of a call, and various analysis methods have been proposed. For example, in Patent Document 1, text data obtained from voice data is compared with a trained model to calculate the probability that the call corresponds to a specific type of call, and when the calculated probability is equal to or greater than a predetermined value, a predetermined terminal is notified.
Japanese Unexamined Patent Publication No. 2019-153961
However, due to the diversification and increasing sophistication of special fraud techniques, a single analysis method may not be able to identify special fraud. Even when many analysis methods are available, the effective analysis method may differ from technique to technique. In such a situation, it is difficult to appropriately determine the possibility that a call is fraudulent.
The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a technique capable of appropriately determining the possibility that a call is fraudulent.
A processing device according to one aspect of the present invention includes: an acquisition unit that acquires voice data of a fraud suspect in a call between the fraud suspect and a user; an estimation unit that estimates the fraud technique used by the fraud suspect by comparing utterances in the voice data with pattern data that associates an identifier of a fraud technique using calls with the person the fraud suspect impersonates in that technique and with keywords used in the fraud suspect's utterances in that technique; a calculation unit that refers to priority data associating a technique identifier, identifiers of analysis methods for analyzing the possibility that the call is fraudulent, and priorities for analyzing with each analysis method the possibility that the call uses that technique, and that calculates, as a risk level, the possibility of fraud analyzed by the high-priority analysis methods among the analysis methods associated with the estimated technique; and a notification unit that notifies the calculated risk level.
In a processing method according to one aspect of the present invention, a computer acquires voice data of a fraud suspect in a call between the fraud suspect and a user; the computer estimates the fraud technique used by the fraud suspect by comparing utterances in the voice data with pattern data that associates an identifier of a fraud technique using calls with the person the fraud suspect impersonates in that technique and with keywords used in the fraud suspect's utterances in that technique; the computer refers to priority data associating a technique identifier, identifiers of analysis methods for analyzing the possibility that the call is fraudulent, and priorities for analyzing with each analysis method the possibility that the call uses that technique, and calculates, as a risk level, the possibility of fraud analyzed by the high-priority analysis methods among the analysis methods associated with the estimated technique; and the computer notifies the calculated risk level.
One aspect of the present invention is a program that causes a computer to function as the above processing device.
According to the present invention, it is possible to provide a technique capable of appropriately determining the possibility that a call is fraudulent.
FIG. 1 is a diagram illustrating functional blocks of a processing system according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating an example of a data structure of call data.
FIG. 3 is a diagram illustrating an example of a data structure of pattern data.
FIG. 4 is a diagram illustrating an example of a data structure of priority data.
FIG. 5 is a diagram illustrating an example of a data structure of voiceprint data.
FIG. 6 is a diagram illustrating an example of a data structure of number data.
FIG. 7 is a flowchart illustrating an example of calculation processing by the calculation unit.
FIG. 8 is a diagram illustrating functional blocks of the sentiment analysis unit.
FIG. 9 is a diagram illustrating an example of a data structure of technique emotion data.
FIG. 10 is a diagram illustrating an example of a data structure of utterance emotion data.
FIG. 11 is a flowchart illustrating an example of sentiment analysis processing by the sentiment analysis unit.
FIG. 12 is a diagram illustrating a hardware configuration of a computer used for the processing device.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the description of the drawings, the same parts are designated by the same reference numerals, and duplicate description is omitted.
(Processing device)
The processing device 1 according to the embodiment of the present invention will be described with reference to FIG. 1. The processing device 1 outputs the possibility that a call is being used for fraud as a risk level. Fraud using a call is a type of special fraud in which a fraudster impersonates a specific person and deceives the other party on the call. Fraud using a call includes, for example, ore-ore fraud, in which the fraudster impersonates a close relative of the other party, and refund fraud, in which the fraudster impersonates a member of an organization such as a city hall employee or a bank clerk.
The processing device 1 acquires data on a call between the calling terminal 61 and the receiving terminal 62 from the call system 6 and calculates the risk level. Of the two callers using the calling terminal 61 and the receiving terminal 62, one is a user of the risk notification service provided by the processing device 1, and the other is a fraud suspect, that is, a party suspected of fraud. In the embodiment of the present invention, the caller other than the user is referred to as the fraud suspect. The fraud suspect is the party for whom a risk level is output indicating whether special fraud is being committed in the call with the user. The fraud suspect may or may not actually be committing fraud. If the fraud suspect is likely to be committing fraud, the risk level is high; if not, the risk level is low.
The processing device 1 includes call data 11, pattern data 12, priority data 13, voiceprint data 14, number data 15, an acquisition unit 21, an estimation unit 22, a calculation unit 23, a notification unit 24, and a feedback unit 25. The call data 11, the pattern data 12, the priority data 13, the voiceprint data 14, and the number data 15 are stored in the memory 902, the storage 903, or the like. The acquisition unit 21, the estimation unit 22, the calculation unit 23, the notification unit 24, and the feedback unit 25 are functions implemented in the CPU 901.
The call data 11 is data on the call for which the risk level is calculated. The call data 11 is used by the processing device 1 to calculate the risk of fraud in the call. As shown in FIG. 2, the call data 11 includes the telephone number and voice data of the calling terminal 61 and the telephone number and voice data of the receiving terminal 62. The call data 11 may include at least the voice data of the fraud suspect's utterances, a telephone number, and the like.
The pattern data 12 associates an identifier of a fraud technique using calls with the person the fraud suspect impersonates in that technique and with keywords used in the fraud suspect's utterances in that technique. The pattern data 12 is used to identify, from the combination of impersonation target and keywords, the fraud technique presumed to be in use when the call being processed is a special fraud. A single technique identifier may be associated with multiple impersonation targets or multiple keywords.
In the pattern data 12 shown in FIG. 3, the technique identifier "ore-ore fraud" is associated with a data set consisting of the impersonation target "son" and the keyword "accident". The technique identifier "refund fraud" is associated with two data sets: the impersonation target "bank clerk" with the keyword "account", and the impersonation target "city hall employee" with the keyword "medical expenses".
The priority data 13 associates a technique identifier, identifiers of analysis methods for analyzing the possibility that a call is fraudulent, and priorities for analyzing with each analysis method the possibility that the call uses that technique. The priority data 13 is used to calculate the risk level while giving more weight to the analysis methods suited to the technique. The priority expresses how likely it is that analyzing the technique associated with that priority in the priority data 13, using the analysis method associated with that priority, will reveal whether the call is fraudulent. When the priority is high, analyzing that technique with that analysis method is likely to reveal whether it is fraud; when the priority is low, it is unlikely to do so.
As shown in FIG. 4, the priority data 13 associates each technique identifier with high-priority analysis methods and low-priority analysis methods. In the priority data 13, whether the priority is high or low for each technique is set for each analysis method implemented by the processing device 1. In the example shown in FIG. 4, for ore-ore fraud, number analysis using the individual whitelist and voiceprint analysis have high priority, while sentiment analysis has low priority.
In the example shown in FIG. 4, the priority data 13 expresses the priority as one of two values, "high" or "low", but the priority is not limited to this. The priority may be set as any numerical value or any number of labels, as long as it can be converted into a weight for each analysis method. When a numerical priority is set for each technique and each analysis method in the priority data 13, the priority may be treated as high or low depending on whether it is equal to or greater than a predetermined threshold value.
The voiceprint data 14 is referred to in order to identify the fraud suspect from their voiceprint. As shown in FIG. 5, the voiceprint data 14 includes individual whitelists and a shared blacklist.
An individual whitelist contains the voiceprint data of the user's close relatives, such as the user's son or daughter, whom a fraud suspect may impersonate. An individual whitelist is generated for each individual user, such as user A and user B. When the voiceprint data 14 is used for the call being processed, the individual whitelist of the user on that call is referred to.
The shared blacklist contains the voiceprint data of known fraudsters. The shared blacklist is provided by a public authority such as the police. Each fraudster's identifier on the shared blacklist is associated with a fraud possibility and a score. The score is updated through user feedback, described later, and the fraud possibility is set from the score.
The number data 15 is referred to in order to identify the nature of the telephone number used by the fraud suspect. As shown in FIG. 6, the number data 15 includes a shared whitelist, individual whitelists, and a shared blacklist.
The shared whitelist contains the telephone numbers of parties a fraud suspect may impersonate, such as police stations and banks, or of the organizations to which they belong.
An individual whitelist contains the telephone numbers of the user's close relatives, such as the user's son or daughter, whom a fraud suspect may impersonate. An individual whitelist is generated for each individual user, such as user A and user B. When the number data 15 is used for the call being processed, the individual whitelist of the user on that call is referred to.
The shared blacklist contains the telephone numbers of known fraudsters. The shared blacklist is provided by a public authority such as the police. Each fraudster's identifier on the shared blacklist is associated with a fraud possibility and a score. The score is updated through user feedback, described later, and the fraud possibility is set from the score.
The acquisition unit 21 acquires the voice data, telephone number, and other information of the fraud suspect in the call between the fraud suspect and the user, and generates the call data 11. The acquisition unit 21 acquires data on the call from whichever of the calling terminal 61 and the receiving terminal 62 the user is using, or from the call control system 55 that controls the call between the calling terminal 61 and the receiving terminal 62. The acquisition unit 21 may notify, during the call, that the voice is being acquired.
The estimation unit 22 compares the pattern data 12 with the utterances in the voice data and estimates the fraud technique used by the fraud suspect. The estimation unit 22 converts the voice data into text data with a speech recognition system and extracts keywords from the fraud suspect's utterances in the text data by natural language processing. The estimation unit 22 estimates that the technique whose identifier is associated, in the keyword column of the pattern data 12, with an extracted keyword is the fraud technique being used in the target call. The estimation unit 22 may handle synonyms of the extracted keywords in the same way as the keywords themselves, estimating that the technique whose keyword column in the pattern data 12 contains the synonym is the fraud technique being used in the target call. When the extracted keywords match multiple techniques, the estimation unit 22 may estimate the technique that matches the larger number of keywords as the fraud technique.
The calculation unit 23 refers to the priority data 13 and calculates, as the risk level, the possibility of fraud analyzed by the high-priority analysis methods among the analysis methods associated with the technique estimated by the estimation unit 22. In the embodiment of the present invention, the calculation unit 23 includes a priority acquisition unit 31, a sentiment analysis unit 32, a voiceprint analysis unit 33, and a number analysis unit 34. The sentiment analysis unit 32, the voiceprint analysis unit 33, and the number analysis unit 34 each analyze the possibility that the call is fraudulent. In the embodiment of the present invention, three analysis methods are implemented, but two analysis methods or four or more analysis methods may be used.
The priority acquisition unit 31 refers to the priority data 13 for the analysis methods associated with the technique estimated by the estimation unit 22 and acquires the priority of each analysis method implemented by the processing device 1. For example, if the technique estimated by the estimation unit 22 is ore-ore fraud, the priority acquisition unit 31 acquires number analysis using the individual whitelist and voiceprint analysis as high-priority analysis methods, and sentiment analysis as a low-priority analysis method. When the priority data 13 sets the weight of each analysis method as a numerical value or a label, the priority acquisition unit 31 acquires the numerical value or label of each analysis method.
 The calculation unit 23 also has each of the sentiment analysis unit 32, the voiceprint analysis unit 33, and the number analysis unit 34 analyze the call data 11 and calculate a possibility of fraud.
 The voiceprint analysis unit 33 compares the fraud suspect's voiceprint in the call data 11 with each voiceprint in the voiceprint data 14. If the voiceprint matches a whitelist voiceprint, the possibility of fraud is set to zero; if it matches a blacklist voiceprint, the possibility of fraud associated with that voiceprint is used. The possibility of fraud may also be adjusted according to the similarity between a voiceprint registered in the voiceprint data 14 and the fraud suspect's voiceprint. For example, the voiceprint analysis unit 33 may calculate the similarity between the fraud suspect's voiceprint and a blacklist voiceprint and output, as the possibility of fraud, the possibility associated with that blacklist voiceprint multiplied by the similarity. The voiceprint analysis unit 33 may also calculate the similarity between the fraud suspect's voiceprint and a whitelist voiceprint and output a value negatively correlated with that similarity as the possibility of fraud.
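 A minimal sketch of this whitelist/blacklist voiceprint check follows, assuming voiceprints are represented as embedding vectors and compared by cosine similarity; the vectors, the threshold, and the registered possibility values are placeholders, since the text does not specify the voiceprint representation or matcher.

    import numpy as np

    # Hypothetical sketch: voiceprints modeled as embedding vectors.
    WHITELIST = {"family_member": np.array([0.9, 0.1, 0.2])}
    BLACKLIST = {"known_scammer": (np.array([0.1, 0.8, 0.6]), 0.9)}  # (voiceprint, registered fraud possibility)

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def voiceprint_possibility(suspect, match_threshold=0.95):
        # Whitelist hit: treat as non-fraud.
        if any(cosine(suspect, v) >= match_threshold for v in WHITELIST.values()):
            return 0.0
        # Blacklist: scale the registered possibility by the similarity.
        best = 0.0
        for vp, possibility in BLACKLIST.values():
            best = max(best, possibility * max(cosine(suspect, vp), 0.0))
        return best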
 The number analysis unit 34 compares the fraud suspect's telephone number in the call data 11 with each telephone number in the number data 15. If the number matches a whitelist number, the possibility of fraud is set to zero; if it matches a blacklist number, the possibility of fraud associated with that number is used.
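 The corresponding number check can be sketched as a plain lookup; modeling the number data 15 as two dictionaries and returning None for unregistered numbers are assumptions, since the text does not say how unregistered numbers are handled.

    # Hypothetical sketch of the number check against number data 15.
    NUMBER_WHITELIST = {"+81-90-0000-0001"}
    NUMBER_BLACKLIST = {"+81-90-9999-0001": 0.8}  # number -> registered fraud possibility

    def number_possibility(caller_number):
        if caller_number in NUMBER_WHITELIST:
            return 0.0
        return NUMBER_BLACKLIST.get(caller_number)  # None when the number is not registered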
 The sentiment analysis unit 32 analyzes the fraud suspect's emotions from the fraud suspect's voice data in the call data 11 and calculates the possibility of fraud from, for example, agreement with the emotions typical of fraud or contradiction with the emotions of a non-fraudulent speaker. The processing of the sentiment analysis unit 32 is described in detail later.
 The calculation unit 23 calculates the risk level of the call from the priorities obtained by the priority acquisition unit 31 and the possibilities of fraud calculated by the sentiment analysis unit 32, the voiceprint analysis unit 33, and the number analysis unit 34.
 For example, the calculation unit 23 refers to the priority data and calculates, as the risk level, the possibility of fraud analyzed by the higher-priority analysis method among the analysis methods associated with the technique estimated by the estimation unit 22. If the technique estimated by the estimation unit 22 is the oreore scam and the priority acquisition unit 31 obtains individual-whitelist number analysis and voiceprint analysis as the high-priority analysis methods, the calculation unit 23 obtains the possibility of fraud from the number analysis unit 34's check against the individual whitelist and the possibility of fraud from the voiceprint analysis unit 33's check against the voiceprint data 14, and calculates the risk level from these possibilities. The risk level may be, for example, the average of these possibilities. Calculating the risk level only from the possibilities of fraud produced by the high-priority analysis methods has been described above, but the calculation is not limited to this. The calculation unit 23 may instead calculate the risk level by assigning a larger weight to the possibility of fraud produced by a high-priority analysis method and a smaller weight to the possibility of fraud produced by a low-priority analysis method.
 When a priority is set for each technique and each analysis method in the priority data 13, the priority acquisition unit 31 refers to the priority data 13 and obtains the priority of each analysis method associated with the technique estimated by the estimation unit 22. The calculation unit 23 then calculates the risk level by weighting the possibility of fraud obtained by each analysis method with the priority of that analysis method. For example, if the estimated technique is the oreore scam, the priority acquisition unit 31 refers to the priority data 13 and obtains the priorities of individual-whitelist number analysis, voiceprint analysis, and sentiment analysis for the oreore scam. The calculation unit 23 multiplies the possibility of fraud from the individual-whitelist number analysis, the possibility of fraud from the voiceprint analysis, and the possibility of fraud from the sentiment analysis each by the corresponding priority obtained by the priority acquisition unit 31, and sums the products to calculate the risk level.
 The method by which the calculation unit 23 calculates the risk level is not limited to the above; it is sufficient that the possibility of fraud output by each analysis method is weighted according to the priority set for that analysis method for each fraud technique.
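 A minimal sketch of such a priority-weighted combination follows (a weighted average here); the priority values, technique names, and analyzer outputs are assumed numbers used only for illustration.

    # Hypothetical sketch: combine per-method fraud possibilities into one risk level
    # using the per-technique priorities as weights.
    PRIORITY_DATA = {  # technique -> {analysis method: priority weight}
        "oreore": {"number": 0.4, "voiceprint": 0.4, "sentiment": 0.2},
        "refund": {"number": 0.2, "voiceprint": 0.3, "sentiment": 0.5},
    }

    def risk_level(technique, possibilities):
        """possibilities: {analysis method: fraud possibility in [0, 1]}."""
        weights = PRIORITY_DATA[technique]
        total = sum(weights.values())
        return sum(weights[m] * possibilities.get(m, 0.0) for m in weights) / total

    print(risk_level("oreore", {"number": 0.8, "voiceprint": 0.7, "sentiment": 0.3}))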
 The processing of the calculation unit 23 according to the embodiment of the present invention is described with reference to FIG. 7. The processing shown in FIG. 7 is an example, and the processing is not limited to it.
 In step S101, the calculation unit 23 refers to the priority data 13 and determines the priority of each analysis method for the technique estimated by the estimation unit 22.
 In step S102, the calculation unit 23 calculates the possibility of fraud by sentiment analysis. In step S103, the calculation unit 23 calculates the possibility of fraud by voiceprint analysis. In step S104, the calculation unit 23 calculates the possibility of fraud by number analysis.
 In step S105, the calculation unit 23 calculates the risk level of the call according to the priorities obtained in step S101. The calculation unit 23 may calculate the risk level from the possibility of fraud produced by the high-priority analysis method, or it may calculate the risk level by weighting the possibilities of fraud calculated in steps S102 through S104 with the priorities obtained in step S101.
 The notification unit 24 outputs the risk level calculated by the calculation unit 23. The risk level may, for example, be displayed on the user's telephone, or announced by voice so that only the user can hear it. The risk level may also be sent to a telephone number, e-mail address, SNS (Social Networking Service) account, or the like of another person or another terminal registered in advance as the user's contact.
 The notification unit 24 need not report the risk level when the voiceprint or number matches the whitelist. The notification method may also be changed depending on whether the risk level is higher than a threshold. For example, when the risk level is higher than the threshold, the notification unit 24 may notify by a highly immediate means such as calling a pre-registered telephone number, and when the risk level is lower than the threshold, it may notify by a less immediate means such as sending a message to a pre-registered e-mail address.
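 A minimal sketch of this threshold-based routing follows; the channel names, the threshold value, and the notify function are assumptions for illustration only.

    # Hypothetical sketch of threshold-based routing of the notification.
    def notify(risk, threshold=0.7, whitelisted=False):
        if whitelisted:
            return None                       # whitelist hit: no notification needed
        if risk > threshold:
            return "call_registered_number"   # highly immediate channel
        return "send_registered_email"        # less immediate channel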
 The feedback unit 25 has the user enter an evaluation of the fraud suspect on the other end of the call and updates the blacklist scores in the voiceprint data 14 and the number data 15 according to the entered evaluation. When the user enters a negative evaluation, such as believing the fraud suspect to be a scammer, the feedback unit 25 increments the score corresponding to that fraud suspect. When the user enters a positive evaluation, such as believing the fraud suspect not to be a scammer, the feedback unit 25 decrements the score corresponding to that fraud suspect. At a predetermined timing, the feedback unit 25 refers to the score of each scammer in the blacklists of the voiceprint data 14 and the number data 15 and updates each scammer's possibility-of-fraud value.
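 A minimal sketch of this feedback loop follows; the score-to-possibility mapping, the scale value, and the data layout are assumptions, since the text only states that the possibility value is updated from the accumulated score.

    # Hypothetical sketch of the feedback update on a blacklist entry.
    BLACKLIST_SCORES = {"+81-90-9999-0001": {"score": 3, "possibility": 0.6}}

    def record_feedback(number, thinks_fraud):
        entry = BLACKLIST_SCORES.setdefault(number, {"score": 0, "possibility": 0.5})
        entry["score"] += 1 if thinks_fraud else -1

    def refresh_possibilities(scale=10.0):
        # Map the accumulated score to a possibility in [0, 1]; the mapping is assumed.
        for entry in BLACKLIST_SCORES.values():
            entry["possibility"] = min(1.0, max(0.0, 0.5 + entry["score"] / scale))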
 When the fraud suspect's data is not registered in the voiceprint data 14 or the number data 15, the feedback unit 25 may add the fraud suspect's data to the voiceprint data 14 and the number data 15. The feedback unit 25 may also update the voiceprint data 14 and the number data 15 based on information obtained from a public institution such as the police.
 Such a processing device 1 estimates the fraud technique from the fraud suspect's utterances in the call, changes the weights of the analysis methods according to the estimated technique, and calculates the risk of fraud. Because the processing device 1 according to the embodiment of the present invention weights the possibilities of fraud output by the analysis methods according to the estimated technique when calculating the risk level, it can appropriately determine the possibility that the call is fraudulent.
 (Sentiment analysis unit)
 The sentiment analysis unit 32 is described with reference to FIG. 8. The sentiment analysis unit 32 comprises the call data 11, the emotion score data 41, the technique emotion data 42, the utterance emotion data 43, the score calculation unit 51, the feature detection unit 52, the contradiction detection unit 53, and the sentiment analysis output unit 54. The call data 11, the emotion score data 41, the technique emotion data 42, and the utterance emotion data 43 are data stored in the memory 902, the storage 903, or the like. The score calculation unit 51, the feature detection unit 52, the contradiction detection unit 53, and the sentiment analysis output unit 54 are functions implemented on the CPU 901.
 The call data 11 is as described with reference to FIGS. 1 and 2.
 The technique emotion data 42 associates the identifier of a fraud technique that uses a call with the trend of the fraud suspect's emotion scores in that technique. As shown in FIG. 9, the technique emotion data 42 associates a technique identifier with the trend of each emotion type as the emotion-score trend. The emotion-score trend in the technique emotion data 42 may include temporal changes of the emotion scores, such as at the time of the incoming call or when a specific utterance is made. In the example shown in FIG. 9, for the oreore scam, the joy emotion score is high at the time of the incoming call and the calm emotion score is low at the time of the confession of an accident or the like. Considering that one technique may have multiple patterns, the technique emotion data 42 may associate one technique identifier with multiple emotion-score trends.
 The utterance emotion data 43 associates a keyword with characteristics of the emotion scores of a non-fraudulent speaker when that keyword is uttered. The utterance emotion data 43 specifies the emotion of an ordinary person who is not a fraud suspect when uttering the keyword. As shown in FIG. 10, the utterance emotion data 43 associates the keyword with an emotion-score trend. In the example shown in FIG. 10, for the keyword "accident", the joy emotion score tends to be low.
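 One way these two tables could be held in memory is sketched below; the emotion types, the trend encoding, and the example values are assumptions made only to make the later sketches concrete.

    # Hypothetical sketch of the two emotion tables.
    TECHNIQUE_EMOTION_DATA = {  # technique emotion data 42
        "oreore": {"joy@incoming_call": "high", "calm@confession": "low"},
    }
    UTTERANCE_EMOTION_DATA = {  # utterance emotion data 43
        "accident": {"joy": "low"},
    }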
 The score calculation unit 51 calculates emotion scores for the fraud suspect from the voice data in the call data 11. In the embodiment of the present invention, the score calculation unit 51 calculates emotion scores for four emotion types: joy, anger, sadness, and calm. The score calculation unit 51 calculates the emotion scores successively as the fraud suspect's utterances progress over time. The score calculation unit 51 may calculate the fraud suspect's emotion scores by any common method.
 The feature detection unit 52 refers to the technique emotion data 42 and, when the fraud suspect's emotion scores fit the emotion-score trend associated with the technique estimated by the estimation unit 22, detects that the call is likely to be that technique. The feature detection unit 52 judges the possibility of fraud by whether the characteristics of the specific fraud technique estimated by the estimation unit 22 appear in the fraud suspect's emotion scores. When the characteristics of the estimated technique appear, the feature detection unit 52 judges that the call being processed has a high possibility of fraud; when they do not appear, it judges that the call has a low possibility of fraud. For example, when the emotion scores calculated by the score calculation unit 51 show, as for the oreore scam in the example of FIG. 9, a high joy score at the time of the incoming call and a low calm score at the time of the confession of an accident or the like, the feature detection unit 52 judges that the possibility of fraud is high.
 The feature detection unit 52 may also calculate the possibility of fraud from the similarity between the emotion trend of the technique registered in the technique emotion data 42 and the trend of the emotion scores calculated by the score calculation unit 51. When that similarity is high, the feature detection unit 52 calculates a high possibility of fraud; when it is low, it calculates a low possibility of fraud.
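 A minimal sketch of such trend matching follows, assuming the expected and observed trends are encoded as numeric vectors over (joy, anger, sadness, calm) and compared by cosine similarity; both the encoding and the mapping to [0, 1] are assumptions.

    import numpy as np

    # Hypothetical sketch of trend matching in the feature detection unit.
    OREORE_TREND = np.array([0.9, 0.2, 0.3, 0.1])  # assumed expected scores for the estimated technique

    def feature_possibility(observed_scores):
        """Map similarity between observed and expected trends to a fraud possibility in [0, 1]."""
        a, b = observed_scores, OREORE_TREND
        sim = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
        return max(0.0, min(1.0, sim))

    print(feature_possibility(np.array([0.8, 0.1, 0.2, 0.2])))  # close to the trend -> high possibility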
 Because the feature detection unit 52 can change the viewpoint from which the fraud suspect's emotions are analyzed according to the estimated technique, it can calculate the possibility of fraud appropriately.
 The contradiction detection unit 53 refers to the utterance emotion data 43 and, when the emotion scores calculated by the score calculation unit 51 contradict the characteristics of the non-fraudulent speaker's emotion scores in the utterance emotion data 43, detects that the call is likely to be fraudulent. The contradiction detection unit 53 judges the possibility of fraud by whether the fraud suspect's emotion scores show a tendency opposite to the emotional tendency of an ordinary person who is not a fraud suspect. When the tendency is opposite to that of an ordinary person, the contradiction detection unit 53 judges that the call being processed has a high possibility of fraud; when the tendency is similar, it judges that the possibility of fraud is low. For example, when the scores calculated by the score calculation unit 51 show a low joy emotion, as indicated for the keyword "accident" in the example of FIG. 10, the contradiction detection unit 53 judges that the possibility of fraud is low; when the joy emotion is high, it judges that the possibility of fraud is high.
 The contradiction detection unit 53 may also calculate the possibility that the call is fraudulent from the dissimilarity between the emotion scores calculated by the score calculation unit 51 and the emotion-score trend specified by the utterance emotion data 43. When that dissimilarity is high, the contradiction detection unit 53 judges that the call is likely to be fraudulent; when it is low, it judges that the call is unlikely to be fraudulent.
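 A minimal sketch of this contradiction check follows; the numeric encoding of the ordinary-person trend, the neutral value of 0.5 for unknown keywords, and the function name are assumptions for illustration only.

    # Hypothetical sketch of the contradiction check in the contradiction detection unit.
    EXPECTED_ORDINARY = {"accident": {"joy": 0.1}}  # utterance emotion data 43, numeric form

    def contradiction_possibility(keyword, observed):
        """Higher output when the suspect's emotion diverges from the ordinary trend."""
        expected = EXPECTED_ORDINARY.get(keyword, {})
        if not expected:
            return 0.5  # no reference trend registered: neutral
        dissim = sum(abs(observed.get(e, 0.0) - v) for e, v in expected.items()) / len(expected)
        return max(0.0, min(1.0, dissim))

    print(contradiction_possibility("accident", {"joy": 0.9}))  # joyful about an accident -> high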
 The sentiment analysis output unit 54 outputs a possibility of fraud based on the analyses of the feature detection unit 52 and the contradiction detection unit 53. For example, the sentiment analysis output unit 54 outputs the average of the possibility of fraud judged by the feature detection unit 52 and the possibility of fraud judged by the contradiction detection unit 53 as the possibility of fraud by sentiment analysis. It is sufficient for the sentiment analysis output unit 54 to output the possibility of fraud by sentiment analysis from the possibility of fraud of at least one of the feature detection unit 52 and the contradiction detection unit 53; the calculation method is not limited.
 The sentiment analysis processing by the sentiment analysis unit 32 is described with reference to FIG. 11. The processing shown in FIG. 11 is an example, and the processing is not limited to it.
 First, in step S201, the sentiment analysis unit 32 initializes the possibility of fraud by sentiment analysis. In step S202, the sentiment analysis unit 32 calculates the score of each emotion from the fraud suspect's voice data.
 In step S203, the sentiment analysis unit 32 determines whether the emotion scores calculated in step S202 show the emotion trend of the estimated technique in the technique emotion data 42. If the trend is present, in step S204 the sentiment analysis unit 32 raises the possibility of fraud by sentiment analysis. If it is not present, in step S205 the sentiment analysis unit 32 lowers the possibility of fraud by sentiment analysis.
 In step S206, the sentiment analysis unit 32 refers to the utterance emotion data 43 and determines whether there is a contradiction between the content of the fraud suspect's utterance and the emotion scores at the time of the utterance. If there is a contradiction, in step S207 the sentiment analysis unit 32 raises the possibility of fraud by sentiment analysis. If there is no contradiction, in step S208 the sentiment analysis unit 32 lowers the possibility of fraud by sentiment analysis.
 In step S209, the sentiment analysis unit 32 outputs the possibility of fraud by sentiment analysis obtained through the processing of steps S201 through S208.
 Because the sentiment analysis unit 32 analyzes the possibility of fraud based on whether the characteristics of the estimated technique appear in the fraud suspect's voice data, it can appropriately determine, using sentiment analysis, the possibility that the call is fraudulent. In addition, because the sentiment analysis unit 32 analyzes the possibility of fraud based on whether the emotion in an utterance in the fraud suspect's voice data differs from the emotion an ordinary person would show for that utterance, it can also appropriately determine, using sentiment analysis, the possibility that the call is fraudulent.
 The processing device 1 of the present embodiment described above is implemented, for example, by a general-purpose computer system including a CPU (Central Processing Unit) 901, a memory 902, a storage 903 (HDD: Hard Disk Drive, SSD: Solid State Drive), a communication device 904, an input device 905, and an output device 906. In this computer system, each function of the processing device 1 is realized by the CPU 901 executing a predetermined program loaded onto the memory 902.
 The processing device 1 may be implemented on one computer or on multiple computers. The processing device 1 may also be a virtual machine implemented on a computer.
 The program of the processing device 1 can be stored on a computer-readable recording medium such as an HDD, an SSD, a USB (Universal Serial Bus) memory, a CD (Compact Disc), or a DVD (Digital Versatile Disc), or can be distributed via a network.
 The present invention is not limited to the above embodiment, and many modifications are possible within the scope of its gist.
 1 Processing device
 6 Call system
 11 Call data
 12 Pattern data
 13 Priority data
 14 Voiceprint data
 15 Number data
 21 Acquisition unit
 22 Estimation unit
 23 Calculation unit
 24 Notification unit
 25 Feedback unit
 31 Priority acquisition unit
 32 Sentiment analysis unit
 33 Voiceprint analysis unit
 34 Number analysis unit
 41 Emotion score data
 42 Technique emotion data
 43 Utterance emotion data
 51 Score calculation unit
 52 Feature detection unit
 53 Contradiction detection unit
 54 Sentiment analysis output unit
 55 Call control system
 61 Calling terminal
 62 Receiving terminal
 901 CPU
 902 Memory
 903 Storage
 904 Communication device
 905 Input device
 906 Output device

Claims (4)

  1.  A processing device comprising:
     an acquisition unit that acquires voice data of a fraud suspect in a call between the fraud suspect and a user;
     an estimation unit that estimates the fraud technique used by the fraud suspect by comparing the utterances in the voice data with pattern data that associates an identifier of a fraud technique using a call, an impersonation target of the fraud suspect in the technique, and keywords used in the fraud suspect's utterances in the technique;
     a calculation unit that refers to priority data associating the identifier of a technique, an identifier of an analysis method for analyzing the possibility that the call is fraudulent, and a priority for analyzing, by the analysis method, the possibility that the call is the technique, and calculates, as a risk level, the possibility of fraud analyzed by the higher-priority analysis method among the analysis methods associated with the estimated technique; and
     a notification unit that reports the calculated risk level.
  2.  The processing device according to claim 1, wherein the calculation unit refers to the priority data, obtains the priority of each analysis method associated with the estimated technique, and calculates the risk level by weighting the possibility obtained by each analysis method with the priority of that analysis method.
  3.  A processing method comprising:
     acquiring, by a computer, voice data of a fraud suspect in a call between the fraud suspect and a user;
     estimating, by the computer, the fraud technique used by the fraud suspect by comparing the utterances in the voice data with pattern data that associates an identifier of a fraud technique using a call, an impersonation target of the fraud suspect in the technique, and keywords used in the fraud suspect's utterances in the technique;
     referring, by the computer, to priority data associating the identifier of a technique, an identifier of an analysis method for analyzing the possibility that the call is fraudulent, and a priority for analyzing, by the analysis method, the possibility that the call is the technique, and calculating, as a risk level, the possibility of fraud analyzed by the higher-priority analysis method among the analysis methods associated with the estimated technique; and
     reporting, by the computer, the calculated risk level.
  4.  A program for causing a computer to function as the processing device according to claim 1 or 2.
PCT/JP2020/042976 2020-11-18 2020-11-18 Processing device, processing method, and program WO2022107241A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/042976 WO2022107241A1 (en) 2020-11-18 2020-11-18 Processing device, processing method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/042976 WO2022107241A1 (en) 2020-11-18 2020-11-18 Processing device, processing method, and program

Publications (1)

Publication Number Publication Date
WO2022107241A1 true WO2022107241A1 (en) 2022-05-27

Family

ID=81708551

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/042976 WO2022107241A1 (en) 2020-11-18 2020-11-18 Processing device, processing method, and program

Country Status (1)

Country Link
WO (1) WO2022107241A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001094517A (en) * 1999-09-20 2001-04-06 Matsushita Electric Ind Co Ltd Device and method for voice communication
JP2007139864A (en) * 2005-11-15 2007-06-07 Nec Corp Apparatus and method for detecting suspicious conversation, and communication device using the same


Similar Documents

Publication Publication Date Title
US10574812B2 (en) Systems and methods for cluster-based voice verification
EP2622832B1 (en) Speech comparison
US20080059198A1 (en) Apparatus and method for detecting and reporting online predators
US20070038460A1 (en) Method and system to improve speaker verification accuracy by detecting repeat imposters
EP2437477A1 (en) Fraud detection
CN105723450A (en) Envelope comparison for utterance detection
US11245791B2 (en) Detecting robocalls using biometric voice fingerprints
KR101795593B1 (en) Device and method for protecting phone counselor
CN111754982A (en) Noise elimination method and device for voice call, electronic equipment and storage medium
JP2007266944A (en) Telephone terminal and caller verification method
EP4055592A1 (en) Systems and methods for customer authentication based on audio-of-interest
Derakhshan et al. Detecting telephone-based social engineering attacks using scam signatures
GB2584827A (en) Multilayer set of neural networks
WO2017005071A1 (en) Communication monitoring method and device
WO2022107241A1 (en) Processing device, processing method, and program
JP2016071068A (en) Call analysis device, call analysis method, and call analysis program
CN113191787A (en) Telecommunication data processing method, device electronic equipment and storage medium
WO2022107242A1 (en) Processing device, processing method, and program
JP6733901B2 (en) Psychological analysis device, psychological analysis method, and program
US11582336B1 (en) System and method for gender based authentication of a caller
CN113064983B (en) Semantic detection method, semantic detection device, computer equipment and storage medium
JP6772832B2 (en) Crime judgment device, relay system, telephone system, crime judgment method and program
US20200184352A1 (en) Information output system, information output method, and recording medium
JP2018013529A (en) Specific conversation detection device, method and program
US20230252190A1 (en) Obfuscating communications that include sensitive information based on context of the communications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20962405

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20962405

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP