WO2021042895A1 - Method and system for identifying a verification code based on a neural network, and computer device - Google Patents

Method and system for identifying a verification code based on a neural network, and computer device

Info

Publication number
WO2021042895A1
Authority
WO
WIPO (PCT)
Prior art keywords
layer
dimensional
feature maps
convolution
convolutional
Prior art date
Application number
PCT/CN2020/103596
Other languages
English (en)
Chinese (zh)
Inventor
李国安
Original Assignee
深圳壹账通智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳壹账通智能科技有限公司
Publication of WO2021042895A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/36 - User authentication by graphic or iconic representation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G06F 18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks

Definitions

  • the embodiments of the present application relate to the field of artificial intelligence, and in particular to a method, system, computer device, and computer-readable storage medium for identifying a verification code based on a neural network.
  • the image verification code (CAPTCHA) technology is widely used in various fields as an important technical means to distinguish between humans and malicious programs.
  • the image verification code can be applied to pages of a WEB system that involve login or input. Specifically, a randomly generated verification code image is displayed on the website page; the user recognizes the verification code with the naked eye, enters the recognized verification code into a form, and submits the form to the server for verification. Because the image verification code is random and difficult for malicious programs to recognize, a different verification code can be randomly generated every time the website is accessed, so as to protect the website from malicious use.
  • the current technical means is as follows: the verification code is automatically identified through a general convolutional neural network and a recurrent neural network, and the identified verification code is automatically submitted with the form for website verification, so as to improve the efficiency of obtaining information resources.
  • the inventor has realized that the above technical means has low recognition accuracy and requires considerable computer computing resources.
  • the purpose of the embodiments of the present application is to provide a verification code recognition method, system, computer device, and computer-readable storage medium based on a neural network, which solve the problems of low recognition accuracy and high consumption of computer computing resources.
  • an embodiment of the present application provides a method for identifying a verification code based on a neural network, which includes the following steps:
  • obtaining a verification code image in a target webpage;
  • performing a convolution operation on the verification code image through a two-dimensional convolution module to obtain n two-dimensional convolution feature maps;
  • converting the n two-dimensional convolution feature maps into n one-dimensional convolution feature maps;
  • performing a convolution operation on the n one-dimensional convolution feature maps through a one-dimensional convolution module to obtain n one-dimensional convolution feature maps processed by the one-dimensional convolution module;
  • inputting the n one-dimensional convolution feature maps into a first fully connected layer and a first classifier in sequence to output a first prediction vector;
  • inputting the n one-dimensional convolution feature maps processed by the one-dimensional convolution module into a second fully connected layer and a second classifier in sequence to output a second prediction vector;
  • calculating a final prediction result based on the first prediction vector and the second prediction vector.
  • the embodiment of the present application also provides a verification code recognition system based on a neural network, including:
  • the obtaining module is used to obtain the verification code image in the target webpage
  • the first convolution module is configured to perform a convolution operation on the verification code image through the two-dimensional convolution module to obtain n two-dimensional convolution feature maps;
  • the conversion module is configured to convert the n two-dimensional convolution feature maps into n one-dimensional convolution feature maps;
  • the second convolution module is configured to perform a convolution operation on the n one-dimensional convolution feature maps through the one-dimensional convolution module to obtain n one-dimensional convolution feature maps processed by the one-dimensional convolution module;
  • the first output module is configured to input the n one-dimensional convolution feature maps into the first fully connected layer and the first classifier in sequence to output the first prediction vector;
  • the second output module is configured to sequentially input the n one-dimensional convolution feature maps processed by the one-dimensional convolution module into the second fully connected layer and the second classifier to output the second prediction vector;
  • the prediction module is configured to calculate the final prediction result based on the first prediction vector and the second prediction vector.
  • an embodiment of the present application also provides a computer device, which includes a memory, a processor, and computer-readable instructions that are stored in the memory and executable on the processor; when the computer-readable instructions are executed by the processor, the following steps are implemented:
  • inputting the n one-dimensional convolution feature maps into the first fully connected layer and the first classifier in sequence to output the first prediction vector;
  • an embodiment of the present application also provides a computer-readable storage medium having computer-readable instructions stored therein, and the computer-readable instructions can be executed by at least one processor, so that the at least one processor executes the following steps:
  • inputting the n one-dimensional convolution feature maps into the first fully connected layer and the first classifier in sequence to output the first prediction vector;
  • the neural network-based verification code recognition method, system, computer device, and computer-readable storage medium provided in the embodiments of the present application recognize the image to be verified through the combined results of a two-dimensional convolution and a one-dimensional convolution neural network; they have high recognition accuracy and can effectively shorten calculation time and reduce computing resources.
  • FIG. 1 is a schematic flowchart of Embodiment 1 of a verification code recognition method based on a neural network according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the program modules of Embodiment 2 of a verification code recognition system based on a neural network according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of the hardware structure of the third embodiment of the computer equipment of this application.
  • FIG. 1 shows a flowchart of the method for identifying a verification code based on a neural network in the first embodiment of the present application. It can be understood that the flowchart in this method embodiment is not intended to limit the order in which the steps are executed. The details are as follows.
  • Step S100 Obtain the verification code image in the target webpage.
  • the computer device 2 generates a login request message for the website webpage through a browser in response to a user operation; the login request is an HTTP request message based on the HTTP protocol. The server of the website receives the HTTP request message, generates a login page carrying a verification code picture based on the HTTP request message, and returns the login page to the browser. The computer device parses the login page and extracts the verification code picture from it.
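  • For illustration only, a minimal Python sketch of this acquisition step is given below. The login URL and the assumption that the verification code image is an <img> tag whose src contains "captcha" are hypothetical and not part of the disclosure.

```python
# Illustrative sketch only: fetch a login page and extract the verification code image.
# The URL and the "captcha" img selector are assumptions, not part of the patent disclosure.
import re
from urllib.parse import urljoin

import requests

LOGIN_URL = "https://example.com/login"  # hypothetical target webpage


def fetch_captcha_image(login_url: str) -> bytes:
    """Request the login page, locate the CAPTCHA <img> tag, and download the image bytes."""
    session = requests.Session()
    page = session.get(login_url, timeout=10)
    page.raise_for_status()

    # Assume the verification code image is an <img> tag whose src contains "captcha".
    match = re.search(r'<img[^>]+src="([^"]*captcha[^"]*)"', page.text, re.IGNORECASE)
    if match is None:
        raise ValueError("no verification code image found on the login page")

    image_url = urljoin(login_url, match.group(1))
    image = session.get(image_url, timeout=10)
    image.raise_for_status()
    return image.content


if __name__ == "__main__":
    with open("captcha.png", "wb") as f:
        f.write(fetch_captcha_image(LOGIN_URL))
```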
  • the verification code picture may be a picture processed by means of distortion, adhesion, noise, animation, etc.
  • the content of the verification code picture can be any combination of characters from a given character set, including numbers, English letters, and Chinese characters.
  • Step S102 Perform a convolution operation on the verification code image by a two-dimensional convolution module to obtain n two-dimensional convolution feature maps.
  • the two-dimensional convolution module includes, connected in sequence: an input layer, a Lambda layer (used to apply an arbitrary Theano/TensorFlow expression to the output of the previous layer), and multiple two-dimensional convolutional layer combinations. The two-dimensional convolutional layer combinations are connected in sequence, and each two-dimensional convolutional layer combination includes two convolution branches and a merge layer connected to the two convolution branches. Each convolution branch includes, connected in sequence: a two-dimensional convolutional layer, a Batch Normalization layer, a LeakyReLU layer (activation layer), a Depthwise two-dimensional convolutional layer, a Batch Normalization layer, a LeakyReLU layer, a two-dimensional convolutional layer, and a Batch Normalization layer.
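  • As an illustration, a minimal Keras sketch of one such two-dimensional convolutional layer combination is given below (see also the 96/128 kernel example further on). The kernel sizes, strides, and the use of channel-wise concatenation for the merge layer are assumptions, not values fixed by the disclosure.

```python
# Illustrative Keras sketch of one "two-dimensional convolutional layer combination":
# two parallel branches (Conv2D -> BN -> LeakyReLU -> DepthwiseConv2D -> BN -> LeakyReLU
# -> Conv2D -> BN) whose outputs are merged. Kernel sizes and strides are assumptions.
from tensorflow.keras import layers


def conv_branch(x, filters: int):
    """One convolution branch of the combination."""
    x = layers.Conv2D(filters, kernel_size=3, padding="same")(x)
    x = layers.BatchNormalization()(x)
    x = layers.LeakyReLU()(x)
    x = layers.DepthwiseConv2D(kernel_size=3, strides=2, padding="same")(x)
    x = layers.BatchNormalization()(x)
    x = layers.LeakyReLU()(x)
    x = layers.Conv2D(filters, kernel_size=1, padding="same")(x)
    return layers.BatchNormalization()(x)


def conv2d_combination(x_a, x_b):
    """Apply the two convolution branches and merge their outputs (assumed: concatenation)."""
    branch_a = conv_branch(x_a, filters=96)   # e.g. 96 convolution kernels, as in the example below
    branch_b = conv_branch(x_b, filters=128)  # e.g. 128 convolution kernels
    return layers.Concatenate(axis=-1)([branch_a, branch_b])
```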
  • step S102 may further include: S1: input the verification code picture through the input layer, and divide the verification code picture into two feature maps through the Lambda layer, where the size of each feature map after division is (H/2) x W x C; S2: input the two feature maps into the i-th two-dimensional convolutional layer combination connected to the Lambda layer to obtain multiple superimposed i-th convolution feature maps, where the initial value of i is 1; S3: divide the multiple i-th convolution feature maps into two groups of convolution feature maps, each group including multiple divided convolution feature maps; S4: correspondingly input the two groups of convolution feature maps into the next two-dimensional convolutional layer combination after the i-th two-dimensional convolutional layer combination, which outputs multiple (i+1)-th convolution feature maps; S5: repeat S3 and S4 for the multiple (i+1)-th convolution feature maps until n two-dimensional convolution feature maps with a height H of 1 are obtained.
  • Step (1) input the verification code image through the input layer, and divide the verification code image into two feature maps through the Lambda layer; the size of each feature map after division is (height H/2) x width W x number of channels C;
  • Step (2) input the two feature maps into the two-dimensional convolutional layer combination connected to the Lambda layer to obtain multiple superimposed first convolution feature maps of size H1 x W1.
  • the convolution kernel parameters and the number of convolution kernels of the two-dimensional convolutional layer a1 and the two-dimensional convolutional layer b1 that process the above two feature maps can be different, while the other network layers are the same.
  • the two-dimensional convolutional layer a1 is composed of 96 convolution kernels
  • the two-dimensional convolutional layer b1 is composed of 128 convolution kernels.
  • Step (3) divide the first convolution feature maps of size H1 x W1 into two groups of convolution feature maps, each group including multiple convolution feature maps of size (H1/2) x W2; input each group of convolution feature maps of size (H1/2) x W2 into the next two-dimensional convolutional layer combination to obtain multiple second convolution feature maps of size (H1/2) x W3;
  • Step (4) input the n second convolution feature maps of size (H1/2) x W3 into the next two-dimensional convolutional layer combination, and so on, until n two-dimensional convolution feature maps with a height H of 1 are obtained.
  • through the above steps, the verification code image of shape (none, 140, 180, 3) becomes (none, 1, 19, 256), where "none" represents the number of verification code images, "1" represents the height, "19" represents the width, and "256" represents the number of channels.
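  • The height split performed by the Lambda layer and the stacking of two-dimensional convolutional layer combinations can be sketched as follows, reusing the conv2d_combination helper from the previous sketch. The input shape (140, 180, 3) follows the example above; the number of stacked combinations and the channel-wise regrouping are assumptions.

```python
# Illustrative sketch: the Lambda layer splits the picture into two (H/2) x W x C halves,
# the first combination produces the superimposed first feature maps, and further
# combinations are stacked in the same way until the height of the feature maps reaches 1.
from tensorflow.keras import Model, layers

inputs = layers.Input(shape=(140, 180, 3))

# Lambda split into two halves of size (H/2) x W x C.
top_half = layers.Lambda(lambda t: t[:, :70, :, :])(inputs)
bottom_half = layers.Lambda(lambda t: t[:, 70:, :, :])(inputs)

# First two-dimensional convolutional layer combination (sketched above).
x = conv2d_combination(top_half, bottom_half)

# Regroup the feature maps into two groups along the channel axis and feed the next
# combination; in the full model this is repeated until the height dimension equals 1,
# ending with shape (None, 1, 19, 256) in the text's example.
group_a = layers.Lambda(lambda t: t[..., : t.shape[-1] // 2])(x)
group_b = layers.Lambda(lambda t: t[..., t.shape[-1] // 2 :])(x)
x = conv2d_combination(group_a, group_b)

two_dim_module = Model(inputs, x)
two_dim_module.summary()
```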
  • Step S104 Convert the n two-dimensional convolution feature maps into n one-dimensional convolution feature maps.
  • n two-dimensional convolution feature maps with H equal to 1 are input into the Permute layer and the TimeDistributed layer to obtain n one-dimensional convolution feature maps.
  • the Permute layer (arrangement layer) is used to rearrange the dimensions of the input vector, such as replacing (none, 1, 19, 256) with (none, 19, 1, 256).
  • the Flatten layer (flattening layer) is used to compress the dimensions; for example, the permuted tensor (none, 19, 1, 256) is flattened to obtain (none, 19, 256).
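  • The conversion step can be reproduced with standard Keras layers as sketched below; the shapes follow the example in the text.

```python
# Illustrative sketch of the conversion: Permute swaps the height and width axes, and
# TimeDistributed(Flatten) collapses each of the 19 column positions into a 256-dimensional
# vector, yielding n one-dimensional convolution feature maps.
from tensorflow.keras import Model, layers

two_dim_maps = layers.Input(shape=(1, 19, 256))                      # (none, 1, 19, 256)
permuted = layers.Permute((2, 1, 3))(two_dim_maps)                   # -> (none, 19, 1, 256)
one_dim_maps = layers.TimeDistributed(layers.Flatten())(permuted)    # -> (none, 19, 256)

converter = Model(two_dim_maps, one_dim_maps)
print(converter.output_shape)  # (None, 19, 256)
```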
  • Step S106 Perform a convolution operation on the n one-dimensional convolution feature maps through the one-dimensional convolution module to obtain n one-dimensional convolution feature maps processed by the one-dimensional convolution module. Go to step S110.
  • the one-dimensional convolution module includes, connected in sequence: a one-dimensional convolutional layer, multiple one-dimensional convolutional layer combinations, and a post-merging layer. An intermediate merging layer is arranged between every two adjacent one-dimensional convolutional layer combinations.
  • the intermediate merging layer is used to merge the output and the input of the one-dimensional convolutional layer combination connected to its input terminal, and the output of the intermediate merging layer is used as the input of the one-dimensional convolutional layer combination connected to its output terminal; the post-merging layer is used to merge the output of the last one-dimensional convolutional layer combination and the outputs of the intermediate merging layers.
  • each one-dimensional convolutional layer combination includes, connected in sequence: a one-dimensional dilated (atrous) convolutional layer, an activation layer, a Lambda layer, a spatial Dropout layer (used to prevent overfitting), and a one-dimensional convolutional layer.
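  • A minimal Keras sketch of the one-dimensional convolution module is given below. The dilation rates, kernel sizes, filter count, number of combinations, and the use of element-wise addition for the intermediate and post-merging layers are assumptions.

```python
# Illustrative sketch of one "one-dimensional convolutional layer combination" plus the
# intermediate merging layers between adjacent combinations and the post-merging layer.
from tensorflow.keras import Model, layers


def conv1d_combination(x, filters=256, dilation_rate=2):
    """Dilated 1D conv -> activation -> Lambda -> spatial Dropout -> 1D conv (assumed sizes)."""
    y = layers.Conv1D(filters, kernel_size=3, padding="same",
                      dilation_rate=dilation_rate)(x)   # one-dimensional dilated convolution
    y = layers.LeakyReLU()(y)                           # activation layer
    y = layers.Lambda(lambda t: t)(y)                   # placeholder for the (unspecified) Lambda expression
    y = layers.SpatialDropout1D(0.1)(y)                 # spatial Dropout against overfitting
    y = layers.Conv1D(filters, kernel_size=1, padding="same")(y)
    return y


one_dim_maps = layers.Input(shape=(19, 256))            # n one-dimensional feature maps
h = layers.Conv1D(256, kernel_size=1, padding="same")(one_dim_maps)  # leading 1D convolutional layer

intermediate_merges = []
for rate in (1, 2):                                     # combinations 1..N-1 (assumed count and rates)
    y = conv1d_combination(h, dilation_rate=rate)
    h = layers.Add()([h, y])                            # intermediate merging layer between combinations
    intermediate_merges.append(h)

last = conv1d_combination(h, dilation_rate=4)           # last one-dimensional convolutional layer combination
merged = layers.Add()(intermediate_merges + [last])     # post-merging layer
one_dim_module = Model(one_dim_maps, merged)
```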
  • an exemplary operation flow is as follows:
  • Step S108 The n one-dimensional convolution feature maps obtained in step S104 are sequentially input into the first fully connected layer and the first classifier to output the first prediction vector.
  • Step S110 The n one-dimensional convolution feature maps obtained in step S106 and processed by the one-dimensional convolution module are sequentially input into the second fully connected layer and the second classifier to output the second prediction vector.
  • Step S112 Based on the first prediction vector and the second prediction vector, a final prediction result is obtained by calculation.
  • the output structure of the fully connected layer is (number of batches, number of features, number of text types). Assuming that the full sample contains 7071 text types, the "number of text types" is 7071, and these 7071 texts can be recognized.
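  • As a hedged illustration of steps S108 to S112, the sketch below passes the one-dimensional feature maps (before and after the one-dimensional convolution module) through a fully connected layer and a softmax classifier each, then combines the two prediction vectors; the averaging rule, sequence length, and layer sizes are assumptions.

```python
# Illustrative sketch of the two prediction heads and the final prediction. Each head outputs
# a tensor of shape (batch, number of features, number of text types); averaging the two
# prediction vectors before the argmax is an assumed combination rule.
import numpy as np
from tensorflow.keras import Model, layers

NUM_CLASSES = 7071  # number of text types in the example

features_before = layers.Input(shape=(19, 256))   # n one-dimensional feature maps (step S104)
features_after = layers.Input(shape=(19, 256))    # feature maps processed by the 1D module (step S106)

pred_1 = layers.Softmax()(layers.Dense(NUM_CLASSES)(features_before))  # first FC layer + classifier
pred_2 = layers.Softmax()(layers.Dense(NUM_CLASSES)(features_after))   # second FC layer + classifier
heads = Model([features_before, features_after], [pred_1, pred_2])


def final_prediction(p1: np.ndarray, p2: np.ndarray) -> np.ndarray:
    """Combine the two prediction vectors and take the most likely text type per position."""
    return np.argmax((p1 + p2) / 2.0, axis=-1)
```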
  • This embodiment recognizes the image to be verified using the neural network results of two-dimensional convolution and one-dimensional convolution, which gives high recognition accuracy; the neural network model structure is configured so that data can be computed in parallel, which effectively improves CPU/GPU utilization and shortens the calculation time.
  • FIG. 2 shows a schematic diagram of the program modules of the second embodiment of the neural network-based verification code recognition system according to the embodiment of the present application.
  • the verification code recognition system 20 may include or be divided into one or more program modules, and the one or more program modules are stored in a storage medium and executed by one or more processors to complete the present application and realize the above neural network-based verification code recognition method.
  • the program module referred to in the embodiments of the present application refers to a series of computer-readable instruction segments capable of completing specific functions, and is more suitable than the program itself for describing the execution process of the verification code recognition system 20 in the storage medium. The following description will specifically introduce the functions of each program module in this embodiment:
  • the obtaining module 200 is used to obtain the verification code image in the target webpage
  • the first convolution module 202 is configured to perform a convolution operation on the verification code image through the two-dimensional convolution module to obtain n two-dimensional convolution feature maps;
  • the conversion module 204 is configured to convert the n two-dimensional convolution feature maps into n one-dimensional convolution feature maps;
  • the second convolution module 206 is configured to perform a convolution operation on n one-dimensional convolution feature maps through the one-dimensional convolution module to obtain n one-dimensional convolution feature maps processed by the one-dimensional convolution module ;
  • the first output module 208 is configured to sequentially input the n one-dimensional convolution feature maps into the first fully connected layer and the first classifier to output the first prediction vector;
  • the second output module 210 is configured to sequentially input the n one-dimensional convolution feature maps processed by the one-dimensional convolution module into the second fully connected layer and the second classifier to output a second prediction vector;
  • the prediction module 212 is configured to calculate a final prediction result based on the first prediction vector and the second prediction vector.
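  • To make the program-module decomposition concrete, a minimal Python skeleton is sketched below; the class and method names mirror modules 200 to 212 but are otherwise assumptions, and the neural network layers themselves are left to the sketches elsewhere in this description.

```python
# Illustrative skeleton of program modules 200-212; names and signatures are assumptions that
# mirror the description, not the patent's actual code.
import numpy as np


class VerificationCodeRecognitionSystem:
    def obtain(self, target_webpage: str) -> np.ndarray:
        """Obtaining module 200: fetch the verification code image from the target webpage."""
        raise NotImplementedError

    def conv2d(self, image: np.ndarray) -> np.ndarray:
        """First convolution module 202: produce n two-dimensional convolution feature maps."""
        raise NotImplementedError

    def convert(self, maps_2d: np.ndarray) -> np.ndarray:
        """Conversion module 204: turn the 2D feature maps into n one-dimensional feature maps."""
        raise NotImplementedError

    def conv1d(self, maps_1d: np.ndarray) -> np.ndarray:
        """Second convolution module 206: process the feature maps with the 1D convolution module."""
        raise NotImplementedError

    def heads(self, maps_1d: np.ndarray, processed: np.ndarray):
        """Output modules 208/210: the two fully connected layer + classifier heads."""
        raise NotImplementedError

    def predict(self, target_webpage: str) -> np.ndarray:
        """Prediction module 212: combine the two prediction vectors into the final result."""
        image = self.obtain(target_webpage)
        maps_1d = self.convert(self.conv2d(image))
        processed = self.conv1d(maps_1d)
        p1, p2 = self.heads(maps_1d, processed)
        return np.argmax((p1 + p2) / 2.0, axis=-1)  # the combination rule is an assumption
```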
  • the two-dimensional convolution module includes, connected in sequence: an input layer, a Lambda layer, and multiple two-dimensional convolutional layer combinations. The two-dimensional convolutional layer combinations are connected in sequence, and each two-dimensional convolutional layer combination includes two convolution branches and a merge layer connected to the two convolution branches. Each convolution branch includes, connected in sequence: a two-dimensional convolutional layer, a first Batch Normalization layer, a LeakyReLU layer, a Depthwise two-dimensional convolutional layer, a second Batch Normalization layer, a LeakyReLU layer, a two-dimensional convolutional layer, and a third Batch Normalization layer.
  • the first convolution module 202 is further configured to: S1: input the verification code image through the input layer, and divide the verification code image into two feature maps through the Lambda layer, where the size of each feature map after division is (H/2) x W x C; S2: input the two feature maps into the i-th two-dimensional convolutional layer combination connected to the Lambda layer to obtain multiple superimposed i-th convolution feature maps, where the initial value of i is 1; S3: divide the multiple i-th convolution feature maps into two groups of convolution feature maps, each group including multiple divided convolution feature maps; S4: correspondingly input the two groups of convolution feature maps into the next two-dimensional convolutional layer combination after the i-th two-dimensional convolutional layer combination, which outputs multiple (i+1)-th convolution feature maps; S5: repeat S3 and S4 for the multiple (i+1)-th convolution feature maps until n two-dimensional convolution feature maps with a height H of 1 are obtained.
  • the one-dimensional convolution module includes, connected in sequence: a one-dimensional convolutional layer, multiple one-dimensional convolutional layer combinations, and a post-merging layer. An intermediate merging layer is arranged between every two adjacent one-dimensional convolutional layer combinations, and the intermediate merging layer is used to merge the output and the input of the one-dimensional convolutional layer combination connected to its input terminal; the output of the intermediate merging layer is used as the input of the one-dimensional convolutional layer combination connected to its output terminal. The post-merging layer is used to merge the output of the last one-dimensional convolutional layer combination and the outputs of the intermediate merging layers.
  • each one-dimensional convolutional layer combination includes, connected in sequence: a one-dimensional dilated convolutional layer, an activation layer, a Lambda layer, a spatial Dropout layer, and a one-dimensional convolutional layer.
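  • Putting the pieces together, the overall data flow (two-dimensional convolution module, conversion, one-dimensional convolution module, and the two classification heads) can be sketched end to end as follows; the stand-in layers, filter counts, and the averaging of the two prediction vectors are assumptions.

```python
# Illustrative end-to-end data flow with simple stand-in layers: a strided 2D convolution stack
# reduces the height to 1, Permute/TimeDistributed(Flatten) converts to 1D feature maps, a
# dilated 1D convolution stands in for the 1D convolution module, and the two fully connected
# + softmax heads are averaged into the final prediction.
from tensorflow.keras import Model, layers

NUM_CLASSES = 7071
image = layers.Input(shape=(140, 180, 3))

# Stand-in for the two-dimensional convolution module: halve the height until it equals 1.
x = image
while x.shape[1] > 1:
    x = layers.Conv2D(64, kernel_size=3, strides=(2, 1), padding="same", activation="relu")(x)

permuted = layers.Permute((2, 1, 3))(x)                               # swap height and width axes
maps_1d = layers.TimeDistributed(layers.Flatten())(permuted)          # n one-dimensional feature maps
processed = layers.Conv1D(64, kernel_size=3, padding="same",
                          dilation_rate=2, activation="relu")(maps_1d)  # stand-in 1D module

pred_1 = layers.Softmax()(layers.Dense(NUM_CLASSES)(maps_1d))         # first prediction vector
pred_2 = layers.Softmax()(layers.Dense(NUM_CLASSES)(processed))       # second prediction vector
final = layers.Average()([pred_1, pred_2])                            # combined final prediction (assumed rule)

captcha_model = Model(image, final)
captcha_model.summary()
```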
  • the computer device 2 is a device that can automatically perform numerical calculation and/or information processing in accordance with pre-set or stored instructions.
  • the computer device 2 may be a PC, a rack server, a blade server, a tower server, or a cabinet server (including an independent server or a server cluster composed of multiple servers).
  • the computer device 2 at least includes, but is not limited to, a memory 21, a processor 22, a network interface 23, and a verification code recognition system 20, which can communicate with each other through a system bus. Among them:
  • the memory 21 includes at least one type of computer-readable storage medium.
  • the readable storage medium includes flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory, etc.), random access memory ( RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disks, optical disks, etc.
  • the memory 21 may be an internal storage unit of the computer device 2, for example, a hard disk or a memory of the computer device 2.
  • the memory 21 may also be an external storage device of the computer device 2, for example, a plug-in hard disk, a smart media card (Smart Media Card, SMC), a Secure Digital (SD) card, or a flash card equipped on the computer device 2.
  • the memory 21 may also include both the internal storage unit of the computer device 2 and its external storage device.
  • the memory 21 is generally used to store the operating system and various application software installed in the computer device 2, for example, the program code of the verification code recognition system 20 in the second embodiment.
  • the memory 21 can also be used to temporarily store various types of data that have been output or will be output.
  • the processor 22 may be a central processing unit (Central Processing Unit, CPU), a controller, a microcontroller, a microprocessor, or other data processing chips.
  • the processor 22 is generally used to control the overall operation of the computer device 2.
  • the processor 22 is used to run the program code or process data stored in the memory 21, for example, to run the verification code recognition system 20, so as to implement the neural network-based verification code recognition method of the first embodiment.
  • the network interface 23 may include a wireless network interface or a wired network interface, and the network interface 23 is generally used to establish a communication connection between the computer device 2 and other electronic devices.
  • the network interface 23 is used to connect the computer device 2 with an external terminal through a network, and establish a data transmission channel and a communication connection between the computer device 2 and the external terminal.
  • the network may be an intranet (Intranet), the Internet (Internet), a Global System for Mobile communications (GSM) network, a Wideband Code Division Multiple Access (WCDMA) network, a 4G network, a 5G network, Bluetooth (Bluetooth), Wi-Fi, or another wireless or wired network.
  • FIG. 3 only shows the computer device 2 with components 20-23, but it should be understood that it is not required to implement all the components shown, and more or fewer components may be implemented instead.
  • the verification code recognition system 20 stored in the memory 21 may also be divided into one or more program modules, and the one or more program modules are stored in the memory 21 and executed by one or more processors (in this embodiment, the processor 22) to complete the application.
  • FIG. 2 shows a schematic diagram of the program modules for implementing the second embodiment of the verification code recognition system 20.
  • the verification code recognition system 20 can be divided into an obtaining module 200, a first convolution module 202, a conversion module 204, a second convolution module 206, a first output module 208, a second output module 210, and a prediction module 212.
  • the program module referred to in the present application refers to a series of computer-readable instruction segments that can complete specific functions, and is more suitable than the program itself for describing the execution process of the verification code recognition system 20 in the computer device 2.
  • the specific functions of the program modules 200-212 have been described in detail in the second embodiment, and will not be repeated here.
  • the computer-readable storage medium may be nonvolatile or volatile, such as a flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server, or an application store, on which computer-readable instructions are stored; the corresponding functions are realized when the instructions are executed by the processor.
  • the computer-readable storage medium of this embodiment is used to store the verification code recognition system 20, which, when executed by the processor, causes the at least one processor to execute the following steps:
  • inputting the n one-dimensional convolution feature maps into the first fully connected layer and the first classifier in sequence to output the first prediction vector.


Abstract

The present invention is applied to the field of artificial intelligence. The present invention relates to a method for identifying a verification code based on a neural network. The method comprises: obtaining a verification code image in a target webpage; performing a convolution operation on the verification code image by means of a two-dimensional convolution module so as to obtain n two-dimensional convolution feature maps; converting the n two-dimensional convolution feature maps into n one-dimensional convolution feature maps; performing a convolution operation on the n one-dimensional convolution feature maps by means of a one-dimensional convolution module so as to obtain n one-dimensional convolution feature maps processed by the one-dimensional convolution module; sequentially inputting the n one-dimensional convolution feature maps into a first fully connected layer and a first classifier so as to output a first prediction vector; sequentially inputting the n one-dimensional convolution feature maps processed by the one-dimensional convolution module into a second fully connected layer and a second classifier so as to output a second prediction vector; and, according to the first prediction vector and the second prediction vector, performing a calculation to obtain a final prediction result. The embodiments of the present invention have high identification accuracy, effectively shorten calculation time, and reduce computing resources.
PCT/CN2020/103596 2019-09-06 2020-07-22 Procédé et système d'identification de code de vérification sur la base d'un réseau neuronal, et dispositif informatique WO2021042895A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910844014.4A CN110674488B (zh) 2019-09-06 2019-09-06 基于神经网络的验证码识别方法、系统及计算机设备
CN201910844014.4 2019-09-06

Publications (1)

Publication Number Publication Date
WO2021042895A1 true WO2021042895A1 (fr) 2021-03-11

Family

ID=69076620

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/103596 WO2021042895A1 (fr) 2019-09-06 2020-07-22 Procédé et système d'identification de code de vérification sur la base d'un réseau neuronal, et dispositif informatique

Country Status (2)

Country Link
CN (1) CN110674488B (fr)
WO (1) WO2021042895A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113822276A (zh) * 2021-09-30 2021-12-21 中国平安人寿保险股份有限公司 基于神经网络的图片矫正方法、装置、设备及介质

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110674488B (zh) * 2019-09-06 2024-04-26 深圳壹账通智能科技有限公司 基于神经网络的验证码识别方法、系统及计算机设备
CN112270325B (zh) * 2020-11-09 2024-05-24 携程旅游网络技术(上海)有限公司 字符验证码识别模型训练方法、识别方法、系统、设备及介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106845381A (zh) * 2017-01-16 2017-06-13 西北工业大学 基于双通道卷积神经网络的空谱联合的高光谱图像分类方法
CN108509991A (zh) * 2018-03-29 2018-09-07 青岛全维医疗科技有限公司 基于卷积神经网络的肝部病理图像分类方法
CN108734222A (zh) * 2018-05-24 2018-11-02 西南大学 基于校对网络的卷积神经网络图像分类方法
CN108852350A (zh) * 2018-05-18 2018-11-23 中山大学 一种基于深度学习算法的头皮脑电图致痫区的识别与定位方法
CN110674488A (zh) * 2019-09-06 2020-01-10 深圳壹账通智能科技有限公司 基于神经网络的验证码识别方法、系统及计算机设备

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105868785A (zh) * 2016-03-30 2016-08-17 乐视控股(北京)有限公司 基于卷积神经网络的图片鉴别方法及系统
CN106339753A (zh) * 2016-08-17 2017-01-18 中国科学技术大学 一种有效提升卷积神经网络稳健性的方法
CN107811649B (zh) * 2017-12-13 2020-12-22 四川大学 一种基于深度卷积神经网络的心音多分类方法
CN108596069A (zh) * 2018-04-18 2018-09-28 南京邮电大学 基于深度3d残差网络的新生儿疼痛表情识别方法及系统
CN109034224B (zh) * 2018-07-16 2022-03-11 西安电子科技大学 基于双分支网络的高光谱分类方法
CN109376753B (zh) * 2018-08-31 2022-06-28 南京理工大学 一种三维空谱空间维像元类属概率计算方法
CN109948475B (zh) * 2019-03-06 2021-03-16 武汉大学 一种基于骨架特征和深度学习的人体动作识别方法
CN110047506B (zh) * 2019-04-19 2021-08-20 杭州电子科技大学 一种基于卷积神经网络和多核学习svm的关键音频检测方法
CN110188761A (zh) * 2019-04-22 2019-08-30 平安科技(深圳)有限公司 验证码的识别方法、装置、计算机设备和存储介质
CN110070067B (zh) * 2019-04-29 2021-11-12 北京金山云网络技术有限公司 视频分类方法及其模型的训练方法、装置和电子设备


Also Published As

Publication number Publication date
CN110674488A (zh) 2020-01-10
CN110674488B (zh) 2024-04-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20861877

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM F1205A DATED 20.07.2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20861877

Country of ref document: EP

Kind code of ref document: A1