CN113240394B - Electric power business hall service method based on artificial intelligence - Google Patents


Info

Publication number
CN113240394B
CN113240394B
Authority
CN
China
Prior art keywords: face, image, client, data, information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110545056.5A
Other languages
Chinese (zh)
Other versions
CN113240394A (en)
Inventor
许为
胡宗富
程修远
张江龙
陈冬隐
刘怡桑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Fujian Electric Power Co Ltd
Information and Telecommunication Branch of State Grid Fujian Electric Power Co Ltd
Original Assignee
State Grid Fujian Electric Power Co Ltd
Information and Telecommunication Branch of State Grid Fujian Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid Fujian Electric Power Co Ltd, Information and Telecommunication Branch of State Grid Fujian Electric Power Co Ltd filed Critical State Grid Fujian Electric Power Co Ltd
Priority to CN202110545056.5A
Publication of CN113240394A
Application granted
Publication of CN113240394B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/103Workflow collaboration or project management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K19/00Record carriers for use with machines and with at least a part designed to carry digital markings
    • G06K19/06Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
    • G06K19/06009Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking
    • G06K19/06037Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking multi-dimensional coding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/14172D bar codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/06Energy or water supply
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/40Document-oriented image-based pattern recognition
    • G06V30/41Analysis of document content
    • G06V30/413Classification of content, e.g. text, photographs or tables
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition


Abstract

The invention relates to an artificial intelligence-based electric power business hall service method, which comprises the following steps. Step S1: customer identity recognition. Step S2: a service acceptance stage; the electric power company provides a two-dimensional code to the client; after obtaining the code, the client scans it to log in to the business hall client for online business handling, and offline staff scan it to obtain the relevant client information when accepting business. Step S3: a data auditing stage; the picture file data uploaded during online business handling is audited using the system's picture recognition technology. Step S4: a service handling stage. The invention effectively addresses problems such as low customer identification efficiency, high online business handling pressure, high electronic material auditing difficulty and a lack of proactive customer service; it can improve the operation efficiency of the business expansion process, shorten the business process, improve the customer experience, raise customer satisfaction and strengthen the core competitiveness of the company.

Description

Electric power business hall service method based on artificial intelligence
Technical Field
The invention relates to the field of electric power business hall service and electric power business hall service mode innovation under the background of Internet +, in particular to an electric power business hall service method based on artificial intelligence.
Background
In recent years, research on artificial intelligence has surged, and artificial intelligence is becoming a driving force for technological innovation and social transformation. Artificial intelligence is the science of studying and developing theories, methods, techniques and applications for simulating, extending and expanding human intelligence. It covers research areas such as speech recognition, picture recognition and face recognition, which lay the technical foundation for the invention.
In addition, increased competition in the electricity market and rising customer expectations place higher demands and challenges on grid companies. In this context, customers are no longer satisfied with routine service responses; they expect faster, better and more up-to-date service experiences, so the conventional electric power business hall service model requires innovation.
Although the service flow of some electric power business halls has been shortened, and Internet+ technical means have been introduced to accept services and audit materials online, customer expectations for service response are still not met, and the service difficulty for power grid staff has increased. Deep analysis and summarization show that the existing electric power business hall model has the following defects:
(1) Low efficiency of customer identity recognition during service acceptance
When a client transacts business online, the client's identity is not verified, so the authenticity of the identity is at risk. When a client transacts business offline, the client must carry various certificates; identity recognition lacks the support of high-tech means and is inefficient.
(2) High pressure of online and offline business handling
Business handling usually requires the customer account number, yet the channels for obtaining it are few and the number is hard to remember, so the client spends considerable time entering the account number and other related information during the handling stage; business handling therefore takes longer, the handling volume is large, and the handling difficulty is high.
(3) High difficulty of auditing online electronic materials
As the variety of online transactions grows, the associated electronic materials become diverse; most are in picture format and demand high accuracy from manual identification, so auditing the materials online is difficult.
(4) Lack of proactive service for customers
The traditional power customer service model is passive: customers usually have to approach the company themselves to handle related business, and proactive information push and process management are lacking.
Disclosure of Invention
In view of the above, the present invention provides an artificial intelligence-based electric power business hall service method that addresses the shortcomings of current service modes; by deeply analyzing the current service modes of electric power business halls and the actual needs of electric power customers, it designs an artificial intelligence-based service innovation mode for the electric power business hall.
The invention is realized by adopting the following scheme: an electric power business hall service method based on artificial intelligence comprises the following steps:
step S1: carrying out customer identification;
step S2: a service acceptance stage; the electric power company provides a two-dimensional code to the client; after obtaining the code, the client scans it to log in to the business hall client for online business handling, and offline staff scan it to obtain the relevant client information when accepting business;
step S3: a data auditing stage; the picture file data uploaded during online business handling is audited using the system picture recognition technology;
step S4: a service handling stage.
Further, the step S1 specifically includes the following steps:
step S11: in an environment with suitable illumination, neither overexposed nor too dark, a camera collects frontal face information of the client with the facial features unobstructed, capturing a picture with clear face edges and no motion blur;
step S12: performing face detection on the collected face information: the collected client face pictures are input into a face detection model that judges whether a region of the picture belongs to a face; a region that passes the detection of all feature sub-windows is judged to be a detected face region;
step S13: preprocessing the face image: compress the image to 20 × 20, convert the color image to a gray image, and use a gray stretching formula to expand the gray distribution of the original image over the full gray-level range, obtaining face image multidimensional feature vectors of the same size and the same gray value range (a preprocessing sketch follows the step list below);
wherein the specific formula of the image gray stretching is:
N(i,j) = 255 × (I(i,j) - min) / (max - min)  (1)
where N(i,j) is the gray value of the pixel at position (i,j) after conversion, I(i,j) is the gray value of the pixel at position (i,j) before conversion, min is the minimum gray value over the pixels of the original image, and max is the maximum gray value over the pixels of the original image;
step S14: reducing the dimensionality of the multidimensional face image feature vectors obtained in step S13 by principal component analysis and extracting the face feature coefficient matrix;
step S15: face matching and recognition: first, the feature templates stored in the database are projected onto the space W_pca to obtain their projected feature coefficient matrices; then the feature data of the collected face image is projected onto the space W_pca to obtain its projection feature coefficient matrix; the Euclidean distance between the two projection coefficient matrices gives the face matching similarity; given a similarity threshold, the collected face feature data is matched against the feature templates stored in the database, and the matching result is output when the similarity exceeds the threshold.
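To make the preprocessing concrete, here is a minimal sketch of step S13 in Python, assuming OpenCV and NumPy; the function name and the flattening note are illustrative, while the 20 × 20 size and formula (1) come from the description above:

```python
import cv2
import numpy as np

def preprocess_face(img_bgr):
    """Step S13: compress to 20 x 20, convert to gray, stretch per formula (1)."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (20, 20), interpolation=cv2.INTER_AREA).astype(np.float64)
    lo, hi = small.min(), small.max()
    if hi == lo:                                  # flat image: nothing to stretch
        return small.astype(np.uint8)
    stretched = 255.0 * (small - lo) / (hi - lo)  # N(i,j) = 255*(I(i,j)-min)/(max-min)
    return stretched.astype(np.uint8)

# The 20 x 20 result is flattened into a 400-dimensional feature vector for the
# PCA of step S14, e.g.:
# vec = preprocess_face(cv2.imread("face.jpg")).flatten().astype(np.float64)
```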
Further, the step S12 specifically includes the following steps:
step S121: 2000 face pictures of size 20 × 20, 2000 non-face pictures of size 20 × 20 and 4000 large pictures containing faces are used as positive and negative samples for face detection;
step S122: before training, the pictures are preprocessed, and all Haar-like feature sub-images of the input 20 × 20 pictures are obtained exhaustively by permutation and combination; in general, exhaustive enumeration over an ordinary-definition picture can generate roughly 100,000 feature sub-images;
step S123: distinguishing a human face from a non-human face by adopting an Adaboost algorithm, and training a human face detection model;
step S124: and inputting the newly acquired face image into the trained face detection model, and outputting a sub-window containing the whole face region.
Further, the step S122 specifically includes the following steps:
step S1221: obtain all Haar-like feature sub-images of the input 20 × 20 picture exhaustively by permutation and combination;
step S1222: construct an integral image, a matrix describing global information; this is a fast algorithm that can compute the pixel sum of any region of the image after traversing the image only once; the specific formula is as follows:
g(u,v) = Σ_{u'≤u} Σ_{v'≤v} I(u',v')  (2)
where g(u,v) is the integral-image value at position (u,v), equal to the sum of all pixels of the original image above and to the left of (u,v);
step S1223: once the integral image is constructed, the Haar-like feature value of a sub-image can be computed from differences of such pixel sums; the specific formula is as follows:
sum = g(x2,y2) - g(x1,y2) - g(x2,y1) + g(x1,y1)  (3)
where sum is the pixel sum of the rectangular sub-image with the four vertices (x1,y1), (x2,y1), (x1,y2) and (x2,y2), and g(x2,y2), g(x1,y2), g(x2,y1) and g(x1,y1) are the sums of all pixels above and to the left of (x2,y2), (x1,y2), (x2,y1) and (x1,y1) in the original image, respectively;
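A small sketch of the integral image of formula (2) and the rectangle sum of formula (3), assuming NumPy; it uses inclusive rectangle corners, so the corner look-ups are shifted by one relative to the formula:

```python
import numpy as np

def integral_image(img):
    """Formula (2): g(u,v) = sum of all pixels above and left of (u,v), inclusive."""
    return img.astype(np.int64).cumsum(axis=0).cumsum(axis=1)

def rect_sum(g, x1, y1, x2, y2):
    """Formula (3): pixel sum of the rectangle with corners (x1,y1)..(x2,y2)."""
    total = g[y2, x2]
    if x1 > 0:
        total -= g[y2, x1 - 1]
    if y1 > 0:
        total -= g[y1 - 1, x2]
    if x1 > 0 and y1 > 0:
        total += g[y1 - 1, x1 - 1]
    return int(total)

# A two-rectangle Haar-like feature on a 20 x 20 window is then a difference of
# two such sums, e.g.:
# g = integral_image(window)
# feature = rect_sum(g, 0, 0, 9, 19) - rect_sum(g, 10, 0, 19, 19)
```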
further, the step S123 specifically includes the following steps:
step S1231: during face detection, the Adaboost algorithm selects the rectangular features that best represent the face, i.e. the weak classifiers; first, learning over many 20 × 20 face and non-face training samples yields the first weak classifier, i.e. a Haar-like feature value of the training set; the Haar-like feature value of an input image is then computed and compared with the weak classifier's feature value, and the input image is judged to be a face when its feature value exceeds that of the weak classifier;
step S1232: train weak classifiers on different training sets, then combine the weak classifiers obtained from the different training sets into a strong classifier to improve the accuracy of face recognition, specifically as follows:
step Sa: setting a training sample set, training positive and negative sample numbers and a maximum cycle number of training;
step Sb: initializing the sample weight to be 1/N, namely the reciprocal of the number of samples, namely the initial probability distribution of the training samples;
step Sc: carrying out first iterative training to obtain a first optimal weak classifier;
step Sd: increase the weights of the samples misjudged in the previous round;
step Se: putting a new sample and the sample which is wrongly divided in the last time together for a new round of training;
step Sf: repeat step Sd and step Se; after T rounds, T optimal weak classifiers are obtained and combined into a strong classifier; when the accuracy of the strong classifier reaches 95%, the training result is considered ideal and the strong classifier is output;
step S1233: repeat step S1232 to obtain several strong classifiers, then connect the trained strong classifiers in series into a cascade classifier for face detection; because training takes a large picture as input, the picture is cut over multiple regions and scales into many sub-windows, which are fed into the cascade classifier in turn for detection; a sub-window that passes all strong classifiers is output as a face; if any classifier stage fails, the sub-window is rejected; when the recognition accuracy of the cascade classifier reaches 95%, the training result is considered ideal and the cascade classifier is output.
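The following is a compact sketch of the AdaBoost rounds in steps Sa to Sf, assuming NumPy and using one-dimensional threshold stumps over Haar-like feature values as the weak classifiers; the exhaustive search over features and thresholds is kept deliberately simple:

```python
import numpy as np

def adaboost_train(X, y, T):
    """Steps Sa-Sf: X is (n_samples, n_features) of Haar-like feature values,
    y in {-1, +1} for non-face / face. Returns T weighted threshold stumps."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # step Sb: weights start at 1/N
    stumps = []
    for _ in range(T):                           # steps Sc-Sf: T boosting rounds
        best = None
        for j in range(d):                       # pick feature/threshold/polarity
            for thr in np.unique(X[:, j]):       # with the lowest weighted error
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, j] - thr) > 0, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        err = float(np.clip(err, 1e-10, 1 - 1e-10))
        alpha = 0.5 * np.log((1 - err) / err)    # weight of this weak classifier
        pred = np.where(pol * (X[:, j] - thr) > 0, 1, -1)
        w *= np.exp(-alpha * y * pred)           # step Sd: misjudged samples gain weight
        w /= w.sum()                             # renormalise before the next round
        stumps.append((alpha, j, thr, pol))
    return stumps

def adaboost_predict(stumps, X):
    """Strong classifier: sign of the alpha-weighted vote of the weak classifiers."""
    score = sum(a * np.where(p * (X[:, j] - t) > 0, 1, -1) for a, j, t, p in stumps)
    return np.sign(score)
```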
Further, the step S14 specifically includes the following steps:
step S141: represent the face image as a one-dimensional column vector by stacking. The expression of the face image feature vector is:
f_i = (f_i(1,1), f_i(2,1), ..., f_i(N,1), f_i(1,2), ..., f_i(N,M))^T  (4)
where f_i is the one-dimensional column vector formed by reading face image i column by column, from the first column of the image matrix to the last, with successive columns joined end to end; N and M are the length and width of the image;
step S142: represent each face image in the sample set as a column vector, transpose each image's column vector into a row vector in turn, and form the face sample matrix; the specific expression is:
f = {f_i(m,n)} = (f_1, f_2, ..., f_k, ..., f_L)^T  (5)
where f is the sample matrix, i.e. the image set, L is the total number of training samples, and f_i is the one-dimensional row vector formed from an m × n face image;
step S143: based on step S142, the covariance matrix of the training samples is:
[C_f] = E{(f - m_f)(f - m_f)^T}  (6)
where [C_f] is the covariance matrix of the training samples, m_f is the mean vector of all training samples, and (f - m_f) is the vector of an original face sample relative to the average face;
step S144: from equation (6), obtain the mutually orthogonal eigenvectors belonging to each eigenvalue:
[C_f] a_i = λ_i a_i  (7)
where λ_i is the ith eigenvalue of [C_f] and a_i is the corresponding ith eigenvector;
step S145: then sort the eigenvalues in descending order; the eigenvector corresponding to each eigenvalue forms a column of an orthogonal matrix, i.e. a multidimensional orthogonal space. Writing A = (f_1 - m_f, f_2 - m_f, ..., f_L - m_f), equation (6) is transformed into the matrix form
[C_f] = (1/L) A A^T
and the simplified result is the much smaller L × L matrix:
[R] = (1/L) A^T A  (8)
step S146: solve the eigenvalues and eigenvectors of the matrix in equation (8), then apply SVD to the solved eigenvectors and eigenvalues to obtain the eigenvectors of the original training samples; the specific expression is:
u_i = A v_i / sqrt(L λ_i),  i = 1, 2, ..., p  (9)
where u_i is the face projection (eigenface) of the ith feature, v_i is the eigenvector of the covariance matrix in (8), and p is the number of eigenvectors retained;
step S147: obtain the projection feature space W_pca required for PCA face recognition; the specific expression is:
W_pca = (u_1, u_2, ..., u_{p-1}, u_p)  (10)
step S148: projection calculation via the K-L transform formula gives each sample's feature coefficient matrix in the space W_pca:
g = [A](f - m_f)  (11)
where g holds the projected feature coefficients as a matrix, each row of g being the feature coefficients of one face sample; [A] represents the space W_pca, and (f - m_f) is the original face sample being projected into the space W_pca.
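A sketch of steps S141 to S148 in NumPy under the usual eigenface reading of formulas (5) to (11); the small L × L matrix of formula (8) keeps the eigenproblem tractable, and the function names and the 1e-12 guard are illustrative:

```python
import numpy as np

def train_eigenfaces(F, p):
    """F is the L x (N*M) sample matrix of formula (5), one flattened face per row.
    Returns the mean face m_f and the projection space W_pca of formula (10)."""
    L = F.shape[0]
    m_f = F.mean(axis=0)
    A = (F - m_f).T                                   # columns are f_k - m_f
    R = A.T @ A / L                                   # small L x L matrix, formula (8)
    lam, V = np.linalg.eigh(R)                        # eigenvalues in ascending order
    order = np.argsort(lam)[::-1][:p]                 # keep the p largest
    lam, V = lam[order], V[:, order]
    W_pca = A @ V / np.sqrt(np.maximum(L * lam, 1e-12))  # u_i = A v_i / sqrt(L*lam_i)
    return m_f, W_pca                                 # columns of W_pca are the u_i

def project(W_pca, m_f, f):
    """Feature coefficients of formula (11) for one flattened face f."""
    return W_pca.T @ (f - m_f)

def matches(g1, g2, threshold):
    """Step S15: Euclidean distance; a smaller distance means higher similarity."""
    return np.linalg.norm(g1 - g2) < threshold
```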
Further, the step S2 specifically includes the following steps:
step S21: two-dimensional code production: collect information including the account number, the house owner's name, the detailed electricity utilization address, the district manager's name, the district manager's contact telephone and the palm power app installation path;
the method specifically comprises the following steps:
step S211: data analysis: determine the type of the characters to be encoded and convert them into symbol characters according to the corresponding character set; select an error correction level from L, M, Q and H, noting that for a given specification, the higher the error correction level, the smaller the capacity available for real data;
step S212: data encoding: convert the data characters into a bit stream, every 8 bits forming one codeword, the whole forming the codeword sequence of the data; this codeword sequence carries the actual data content of the two-dimensional code; the specific encoding process first groups the numbers, letters and Chinese characters to be encoded, then converts the content of each group into binary form and merges the sequences, and finally adds a mode indicator at the head of the sequence;
step S213: error correction coding: partition the codeword sequence formed in the previous step as required, generate error correction codewords from the partitioned codewords according to the error correction level, and append them after the data codeword sequence to form a new sequence; error correction coding improves the fault tolerance of a stained or damaged two-dimensional code;
step S214: construct the final data information: with the specification determined, place the sequences formed in steps S211 to S213 into the blocks in order, compute the corresponding error correction codeword block for each block, assemble the error correction codeword blocks into a sequence in order, and append it after the original data codeword sequence;
step S215: construct the matrix: place the finder patterns, separators, timing patterns, alignment patterns and codeword modules into the matrix, and fill the complete sequence into the area of the two-dimensional code matrix of the corresponding specification;
step S216: masking: apply a mask pattern to the encoding region of the symbol so that the dark and light (black and white) areas of the two-dimensional code are distributed in an optimal ratio;
step S217: format and version information: generate the format and version information and place them in the corresponding areas; versions 7 to 40 contain version information, and for versions without it the area is filled with 0; the version information appears in two positions on the two-dimensional code for redundancy; it is an 18-bit, 6 × 3 matrix, of which 6 bits are data bits and the following 12 bits are error correction bits;
step S22: two-dimensional code provision: the electric power company provides the two-dimensional code to customers in various ways, including online self-service acquisition, business hall printing, community property cooperation and sticking it to the meter box;
step S23: two-dimensional code acquisition: the client obtains the two-dimensional code in various ways, including online self-service printing, business hall printing, counter printing, from the meter box, from the property handover materials and from the property management office;
step S24: two-dimensional code application: scanning the code online yields information including the account number, the owner's name and the detailed electricity utilization address, and the code serves for identity authentication when business is transacted offline.
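As an illustration of steps S21 and S211 to S217, the following sketch uses the third-party Python package qrcode, which internally performs the data analysis, encoding, Reed-Solomon error correction, masking and version handling described above; the payload field names are assumptions, not part of the patent:

```python
import qrcode

# Illustrative payload; the field names are assumptions, not part of the patent.
payload = (
    "account_no=1234567890;"
    "owner=Zhang San;"
    "address=No. 1 Example Road, Fuzhou;"
    "manager=Li Si;manager_tel=0591-0000000;"
    "app=https://example.com/palm-power"
)

qr = qrcode.QRCode(
    version=None,                                        # smallest version that fits
    error_correction=qrcode.constants.ERROR_CORRECT_M,   # one of L, M, Q, H (step S211)
    box_size=10,
    border=4,
)
qr.add_data(payload)
qr.make(fit=True)
qr.make_image(fill_color="black", back_color="white").save("customer_qr.png")
```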
Further, the step S3 specifically includes the following steps:
step S31: picture acquisition and conversion: OCR picture recognition is used to recognize the characters in pictures; when a client transacts business online or offline, uploaded picture files, including ID cards, bank cards and bills, are recognized and archived with OCR picture recognition;
step S32: intelligent system screening of the data uploaded by the client: the State Grid marketing business system first screens whether the client has uploaded the required materials and, if not, prompts that the materials are wrong and must be uploaded again; it then compares the attributes shared between the recognized information and the application information the client filled in online; if the key information is consistent, the system dispatches the work order automatically; if it is inconsistent, the comparison result is sent to the relevant department for manual recheck;
step S33: manual auxiliary processing: first, automatically dispatched work orders are manually spot-checked on a regular basis; second, for work orders with inconsistent information, the system provides detailed comparison results, and after manual review the screening opinion is fed back to the client, who is asked to adjust the uploaded electronic materials; finally, the system summarizes the failure cases in which manual review, client feedback and sampling checks are normal but the system screening was judged abnormal.
Further, the step S31 specifically includes the following steps:
step S311: collect data via the camera lens or by file upload; a shooting area is set according to the type of document being uploaded, and when the user uploads a picture the document is shot and uploaded according to the shooting requirements, with suitable margins around the document so that the uploaded picture contains all the required information;
step S312: image preprocessing: first rotate the picture so that it is aligned horizontally and vertically; the receipt data are adjusted with an adaptive thresholding function (adaptive_threshold) using OpenCV together with the scikit-image framework; binarization with the adaptive_threshold method yields the binarized picture data, and keeping white pixels in high-gradient areas and black pixels in low-gradient areas with scikit-image yields a high-contrast sample picture; the picture is then cropped to obtain the bill information;
step S313: picture segmentation: for multi-line text, segmentation consists of line segmentation and character segmentation; first, the tilt-corrected characters are projected onto the Y axis and the values are accumulated into a histogram along the Y axis; scanning horizontally from left to right counts the black points in each row, so that regions with non-zero values on the Y axis are where text lines lie and regions with value 0 are the gaps between lines; then a top-to-bottom vertical scan is performed in the same way within each line to obtain the individual characters;
step S314: steps S311 to S313 yield small picture blocks; the feature vectors extracted from the characters scanned in each part of the picture undergo coarse template classification and fine template matching against the feature template library, and the characters are recognized;
step S315: judge whether the picture uploaded by the user is the required material: match the extracted characters against keywords; if the requirements are met, store the picture and character information; if not, remind the user to upload again.
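A sketch of the binarization and line segmentation of steps S312 and S313, assuming OpenCV and NumPy; the file name and threshold parameters are illustrative, and deskewing and character-level matching are omitted:

```python
import cv2
import numpy as np

# Load a grayscale bill image (file name is illustrative).
img = cv2.imread("bill.png", cv2.IMREAD_GRAYSCALE)

# Adaptive thresholding (binarization), per the adaptive_threshold step in S312;
# block size and offset are illustrative tuning values.
binary = cv2.adaptiveThreshold(
    img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY_INV, 25, 15
)

# Step S313: horizontal projection; rows with a non-zero sum contain text,
# zero-sum rows are the gaps between lines.
row_profile = binary.sum(axis=1)
lines, start = [], None
for y, v in enumerate(row_profile):
    if v > 0 and start is None:
        start = y
    elif v == 0 and start is not None:
        lines.append(binary[start:y, :])      # one segmented text line
        start = None
if start is not None:
    lines.append(binary[start:, :])           # flush a line touching the bottom edge

# Each line would then be projected column-wise the same way to cut out single
# characters for template matching (step S314).
print(f"found {len(lines)} text lines")
```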
Further, the step S4 specifically includes the following steps:
step S41: establish the data sources: the data come from internal and third-party data of the electric power app, the marketing business system, the power supply business hall and the customer service system; the related data include the user name, electricity utilization address, business work order information, payment records, channel usage records, complaints, fault repairs, customer contact records, electricity consumption collection information and business expansion information;
step S42: clean, convert and merge the data along four dimensions (basic user attributes, recently handled services, historically handled services, and services not yet handled) to obtain a wide table of user data;
step S43: cluster power customers with similar characteristics using a k-means-based clustering algorithm and separate customers with different characteristics into groups; cluster analysis groups the data objects according to the descriptive and relational information found in the data, so that objects within a group are similar while objects in different groups differ; the assign-and-update steps are repeated until the iteration converges and the centroids no longer change, and the clustering result is output;
step S44: after clustering, observe the data of each group, set a corresponding label, and build customer group portraits of the same behavior type;
step S45: using the customer portraits, intelligently analyze and predict, based on a collaborative filtering algorithm, the list of services a user is likely to need soon, and provide electricity utilization reminders, business handling reminders, electricity plan reminders and online communication services in advance;
step S46: proactive information push: covering the two scenarios of online and offline business handling, whole-process intelligent reminder service is provided in the three periods before, during and after the client's business handling, through channels including the online app and offline manual service, according to the intelligent analysis and prediction results.
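A sketch of steps S42 to S44, assuming pandas and scikit-learn; the columns of the wide table and the number of clusters are illustrative assumptions:

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# A toy wide table of user data (step S42); columns are illustrative.
wide = pd.DataFrame({
    "monthly_kwh":        [120, 1500, 90, 2000, 110, 130],
    "payments_last_year": [12, 12, 6, 11, 12, 10],
    "complaints":         [0, 2, 0, 3, 1, 0],
    "recent_services":    [1, 5, 0, 6, 1, 2],
})

# Step S43: k-means clustering; scaling puts the features on one scale, and the
# algorithm iterates assign/update until the centroids stop changing.
X = StandardScaler().fit_transform(wide)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
wide["group"] = km.labels_

# Step S44: inspect each group and attach a human-readable portrait label.
print(wide.groupby("group").mean())
```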
Further, the step S45 specifically includes the following steps:
step S451: based on the collaborative filtering algorithm, compute the similarity between user portraits from the portrait data of the client's different dimensions:
sim(u,i) = Σ_k w_k · sim(k,i)  (12)
where sim(u,i) is the predicted weight of user u for behavior i, w_k is the similarity between user k and user u, and sim(k,i) is the weight of user k for behavior i;
step S452: for a given user, after computing the similarity with all other users, take the top 100 associated users in descending order of similarity;
step S453: according to the recent electricity transactions and services of the top-100 associated users, if some service is being handled among them, proactively ask the user via information push whether that service is needed and provide a page linking to the service; if the user answers no or does not answer, record that information.
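A sketch of steps S451 to S453, assuming NumPy; cosine similarity stands in for the unspecified similarity measure, and the tiny user-behavior matrix is illustrative:

```python
import numpy as np

# Rows are users, columns are service behaviors; weights are illustrative.
W = np.array([
    [1.0, 0.0, 0.5],
    [0.9, 0.1, 0.4],
    [0.0, 1.0, 0.0],
])

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

u = 0                                              # target user
sims = np.array([cosine(W[u], W[k]) for k in range(len(W))])
sims[u] = 0.0                                      # exclude the user themself
top = np.argsort(sims)[::-1][:100]                 # step S452: top-100 associated users

# Formula (12): predicted weight of user u for each behavior i.
pred = sum(sims[k] * W[k] for k in top)
print("candidate service to push:", int(np.argmax(pred)))  # step S453
```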
Compared with the prior art, the invention has the following beneficial effects:
the invention effectively solves the problems of low customer identification efficiency, high online business handling pressure, high electronic material auditing difficulty, low customer service initiative and the like, can improve the operation efficiency of the business expansion process, shorten the time of the business process, improve the customer experience, improve the customer satisfaction level and improve the core competitiveness of a company.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention.
Fig. 2 is a flow chart of the application of the face recognition technology in the embodiment of the present invention.
Fig. 3 is a flowchart of a face detection cascade classifier technique according to an embodiment of the present invention.
Fig. 4 is a flowchart of an application of two-dimensional code information sharing according to an embodiment of the present invention.
FIG. 5 is a flowchart of an image recognition technique according to an embodiment of the present invention.
Fig. 6 is a flowchart of an application of intelligent prediction and real-time reminding according to an embodiment of the present invention.
Detailed Description
The invention is further explained below with reference to the drawings and the embodiments.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
As shown in fig. 1, the present embodiment provides an electric power business hall service method based on artificial intelligence, which includes the following steps:
step S1: carrying out customer identification;
step S2: a service acceptance stage; the electric power company provides a two-dimensional code to the client; after obtaining the code, the client scans it to log in to the business hall client for online business handling, and offline staff scan it to obtain the relevant client information when accepting business;
step S3: a data auditing stage; the picture file data uploaded during online business handling is audited using the system picture recognition technology;
step S4: a service handling stage.
As shown in fig. 2, in this embodiment, the step S1 specifically includes the following steps:
step S11: in an environment with suitable illumination, neither overexposed nor too dark, a camera collects frontal face information of the client with the facial features unobstructed, capturing a picture with clear face edges and no motion blur;
step S12: performing face detection on the collected face information: the collected client face pictures are input into a face detection model that judges whether a region of the picture belongs to a face; a region that passes the detection of all feature sub-windows is judged to be a detected face region;
step S13: preprocessing the face image: compress the image to 20 × 20, convert the color image to a gray image, and use a gray stretching formula to expand the gray distribution of the original image over the full gray-level range, obtaining face image multidimensional feature vectors of the same size and the same gray value range;
wherein the specific formula of the image gray stretching is:
N(i,j) = 255 × (I(i,j) - min) / (max - min)  (1)
where N(i,j) is the gray value of the pixel at position (i,j) after conversion, I(i,j) is the gray value of the pixel at position (i,j) before conversion, min is the minimum gray value over the pixels of the original image, and max is the maximum gray value over the pixels of the original image;
step S14: reducing the dimensionality of the multidimensional face image feature vectors obtained in step S13 by principal component analysis and extracting the face feature coefficient matrix;
step S15: face matching and recognition: first, the feature templates stored in the database are projected onto the space W_pca to obtain their projected feature coefficient matrices; then the feature data of the collected face image is projected onto the space W_pca to obtain its projection feature coefficient matrix; the Euclidean distance between the two projection coefficient matrices gives the face matching similarity; given a similarity threshold, the collected face feature data is matched against the feature templates stored in the database, and the matching result is output when the similarity exceeds the threshold.
In this embodiment, the step S12 specifically includes the following steps:
step S121: using an open-source face recognition dataset from Wuhan University, 2000 face pictures of size 20 × 20, 2000 non-face pictures of size 20 × 20 and 4000 large pictures containing faces are used as positive and negative samples for face detection;
step S122: the Haar-like feature is a simple rectangular feature that reflects local grayscale variation in the image and is used to describe local features of the face; for example, the eyes are generally darker than the cheeks, and the two sides of the nose bridge are darker than the bridge itself. Before training, the pictures are preprocessed, and all Haar-like feature sub-images of the input 20 × 20 pictures are obtained exhaustively by permutation and combination; in general, exhaustive enumeration over an ordinary-definition picture can generate roughly 100,000 feature sub-images;
step S123: distinguishing a human face from a non-human face by adopting an Adaboost algorithm, and training a human face detection model;
step S124: and inputting the newly acquired face image into the trained face detection model, and outputting a sub-window containing the whole face region.
In this embodiment, the step S122 specifically includes the following steps:
step S1221: obtain all Haar-like feature sub-images of the input 20 × 20 picture exhaustively by permutation and combination;
step S1222: construct an integral image, a matrix describing global information; this is a fast algorithm that can compute the pixel sum of any region of the image after traversing the image only once. The specific formula is as follows:
g(u,v) = Σ_{u'≤u} Σ_{v'≤v} I(u',v')  (2)
where g(u,v) is the integral-image value at position (u,v), equal to the sum of all pixels of the original image above and to the left of (u,v);
step S1223: once the integral image is constructed, the Haar-like feature value of a sub-image can be computed from differences of such pixel sums; the specific formula is as follows:
sum = g(x2,y2) - g(x1,y2) - g(x2,y1) + g(x1,y1)  (3)
where sum is the pixel sum of the rectangular sub-image with the four vertices (x1,y1), (x2,y1), (x1,y2) and (x2,y2), and g(x2,y2), g(x1,y2), g(x2,y1) and g(x1,y1) are the sums of all pixels above and to the left of (x2,y2), (x1,y2), (x2,y1) and (x1,y1) in the original image, respectively.
In this embodiment, the step S123 specifically includes the following steps:
step S1231: during face detection, the Adaboost algorithm selects the rectangular features that best represent the face, i.e. the weak classifiers; first, learning over many 20 × 20 face and non-face training samples yields the first weak classifier, i.e. a Haar-like feature value of the training set; the Haar-like feature value of an input image is then computed and compared with the weak classifier's feature value, and the input image is judged to be a face when its feature value exceeds that of the weak classifier;
step S1232: train weak classifiers on different training sets, then combine the weak classifiers obtained from the different training sets into a strong classifier to improve the accuracy of face recognition, specifically as follows:
step Sa: setting a training sample set, training positive and negative sample numbers and a maximum cycle number of training;
step Sb: initializing the sample weight to be 1/N, namely the reciprocal of the number of samples, namely the initial probability distribution of the training samples;
step Sc: carrying out first iterative training to obtain a first optimal weak classifier;
step Sd: increase the weights of the samples misjudged in the previous round;
step Se: putting a new sample and the sample which is wrongly divided in the last time together for a new round of training;
step Sf: repeat step Sd and step Se; after T rounds, T optimal weak classifiers are obtained and combined into a strong classifier; when the accuracy of the strong classifier reaches 95%, the training result is considered ideal and the strong classifier is output;
step S1233: repeat step S1232 to obtain several strong classifiers, then connect the trained strong classifiers in series into a cascade classifier for face detection; because training takes a large picture as input, the picture is cut over multiple regions and scales into many sub-windows, which are fed into the cascade classifier in turn for detection; a sub-window that passes all strong classifiers is output as a face; if any classifier stage fails, the sub-window is rejected; when the recognition accuracy of the cascade classifier reaches 95%, the training result is considered ideal and the cascade classifier is output.
In this embodiment, the step S14 specifically includes the following steps:
step S141: represent the face image as a one-dimensional column vector by stacking. The expression of the face image feature vector is:
f_i = (f_i(1,1), f_i(2,1), ..., f_i(N,1), f_i(1,2), ..., f_i(N,M))^T  (4)
where f_i is the one-dimensional column vector formed by reading face image i column by column, from the first column of the image matrix to the last, with successive columns joined end to end; N and M are the image length and width, 20 in this example;
step S142: represent each face image in the sample set as a column vector, transpose each image's column vector into a row vector in turn, and form the face sample matrix; the specific expression is:
f = {f_i(m,n)} = (f_1, f_2, ..., f_k, ..., f_L)^T  (5)
where f is the sample matrix, i.e. the image set, L is the total number of training samples, and f_i is the one-dimensional row vector formed from an m × n face image;
step S143: based on step S142, the covariance matrix of the training samples is:
[C_f] = E{(f - m_f)(f - m_f)^T}  (6)
where [C_f] is the covariance matrix of the training samples, m_f is the mean vector of all training samples, and (f - m_f) is the vector of an original face sample relative to the average face;
step S144: from equation (6), obtain the mutually orthogonal eigenvectors belonging to each eigenvalue:
[C_f] a_i = λ_i a_i  (7)
where λ_i is the ith eigenvalue of [C_f] and a_i is the corresponding ith eigenvector;
step S145: then sort the eigenvalues in descending order; the eigenvector corresponding to each eigenvalue forms a column of an orthogonal matrix, i.e. a multidimensional orthogonal space. Writing A = (f_1 - m_f, f_2 - m_f, ..., f_L - m_f), equation (6) is transformed into the matrix form
[C_f] = (1/L) A A^T
and the simplified result is the much smaller L × L matrix:
[R] = (1/L) A^T A  (8)
step S146: solve the eigenvalues and eigenvectors of the matrix in equation (8), then apply SVD to the solved eigenvectors and eigenvalues to obtain the eigenvectors of the original training samples; the specific expression is:
u_i = A v_i / sqrt(L λ_i),  i = 1, 2, ..., p  (9)
where u_i is the face projection (eigenface) of the ith feature, v_i is the eigenvector of the covariance matrix in (8), and p is the number of eigenvectors retained;
step S147: obtain the projection feature space W_pca required for PCA face recognition; the specific expression is:
W_pca = (u_1, u_2, ..., u_{p-1}, u_p)  (10)
step S148: projection calculation via the K-L transform formula gives each sample's feature coefficient matrix in the space W_pca:
g = [A](f - m_f)  (11)
where g holds the projected feature coefficients as a matrix, each row of g being the feature coefficients of one face sample; [A] represents the space W_pca, and (f - m_f) is the original face sample being projected into the space W_pca.
As shown in fig. 3 and 4, in this embodiment, the step S2 specifically includes the following steps:
step S21: two-dimensional code production: collect information including the account number, the house owner's name, the detailed electricity utilization address, the district manager's name, the district manager's contact telephone and the palm power app installation path;
the method specifically comprises the following steps:
step S211: data analysis: determine the type of the characters to be encoded and convert them into symbol characters according to the corresponding character set; select an error correction level from L, M, Q and H, noting that for a given specification, the higher the error correction level, the smaller the capacity available for real data;
step S212: data encoding: convert the data characters into a bit stream, every 8 bits forming one codeword, the whole forming the codeword sequence of the data; this codeword sequence carries the actual data content of the two-dimensional code; the specific encoding process first groups the numbers, letters and Chinese characters to be encoded, then converts the content of each group into binary form and merges the sequences, and finally adds a mode indicator at the head of the sequence (e.g., the numeric mode indicator is "0001");
step S213: error correction coding: partition the codeword sequence formed in the previous step as required, generate error correction codewords from the partitioned codewords according to the error correction level, and append them after the data codeword sequence to form a new sequence; error correction coding improves the fault tolerance of a stained or damaged two-dimensional code;
step S214: construct the final data information: with the specification determined, place the sequences formed in steps S211 to S213 into the blocks in order, compute the corresponding error correction codeword block for each block, assemble the error correction codeword blocks into a sequence in order, and append it after the original data codeword sequence;
step S215: construct the matrix: place the finder patterns, separators, timing patterns, alignment patterns and codeword modules into the matrix, and fill the complete sequence into the area of the two-dimensional code matrix of the corresponding specification;
step S216: masking: apply a mask pattern to the encoding region of the symbol so that the dark and light (black and white) areas of the two-dimensional code are distributed in an optimal ratio;
step S217: format and version information: generate the format and version information and place them in the corresponding areas; versions 7 to 40 contain version information, and for versions without it the area is filled with 0; the version information appears in two positions on the two-dimensional code for redundancy; it is an 18-bit, 6 × 3 matrix of which 6 bits are data bits (e.g., for version number 8 the data bits are 001000) and the following 12 bits are error correction bits;
step S22: two-dimensional code provision: the power company provides two-dimensional code information to customers in various ways, including online self-service acquisition, business hall printing, community property cooperation, and affixing to the meter box;
step S23: two-dimensional code acquisition: the customer obtains the two-dimensional code information in various ways, including online self-service printing, business hall printing, counter printing, the meter box location, property move-in materials, and the property management office;
step S24: two-dimensional code application: online, scanning the two-dimensional code retrieves information including the account number, the homeowner's name and the detailed electricity-use address; offline, the two-dimensional code is used for identity authentication when business is transacted.
As shown in fig. 5, in this embodiment, the step S3 specifically includes the following steps:
step S31: picture acquisition and conversion, recognizing the characters in the picture with OCR picture recognition technology; when a client transacts business online or offline, uploaded picture files, including identity cards, bank cards and bills, are recognized and archived using OCR;
the method specifically comprises the following steps:
step S311: collecting data via the camera lens or by file upload; a document shooting area is set according to the type of document being uploaded, and when a user uploads a picture the document is shot and uploaded according to the shooting requirements, leaving an appropriate margin around the document to ensure the uploaded picture contains all the required information.
Step S312: image preprocessing: first rotating the picture so that it is aligned horizontally and vertically; adjusting the receipt data using the adaptive thresholding function adaptive_threshold in OpenCV together with the scikit-image framework; binarizing with the adaptive_threshold method to obtain the binarized picture data; using scikit-image to keep white pixels in high-gradient regions and black pixels in low-gradient regions, yielding a high-contrast sample picture; then cropping the picture to obtain the bill information.
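A minimal sketch of the adaptive thresholding in step S312, using OpenCV's `cv2.adaptiveThreshold` (the exact parameters are not specified in the text, so the neighbourhood size and offset here are assumptions, and the file names are illustrative):

```python
import cv2

img = cv2.imread("uploaded_bill.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Adaptive (local) thresholding: each pixel is compared against a Gaussian-
# weighted mean of its 35x35 neighbourhood, which keeps text strokes legible
# under uneven lighting better than a single global threshold would.
binary = cv2.adaptiveThreshold(gray, 255,
                               cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY, 35, 10)
cv2.imwrite("bill_binarized.png", binary)
```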
Step S313: picture segmentation: for a passage of multi-line text, text segmentation comprises two steps, line segmentation and character segmentation; first, the skew-corrected characters are projected onto the Y axis and all values are accumulated to obtain a histogram along the Y axis; second, a horizontal scan counting from left to right yields the number of black points per row, where regions with non-zero values are where characters lie and regions with zero values are the gaps between lines; then a vertical scan counting from top to bottom, within the same horizontal band, yields each individual character;
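The projection-profile line segmentation of step S313 can be sketched as follows, assuming a binarized image in which text pixels are non-zero (the helper name is illustrative):

```python
import numpy as np

def segment_lines(binary: np.ndarray) -> list:
    """Return (top, bottom) row ranges of text lines from a Y-axis projection."""
    row_ink = (binary > 0).sum(axis=1)    # black-point count per scan row
    lines, start = [], None
    for y, count in enumerate(row_ink):
        if count > 0 and start is None:
            start = y                     # entering a text line
        elif count == 0 and start is not None:
            lines.append((start, y))      # a zero run is an inter-line gap
            start = None
    if start is not None:
        lines.append((start, len(row_ink)))
    return lines

# Characters are then cut the same way, projecting each line band onto the X axis.
```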
step S314: steps S311 to S313 each yield a small picture block; the feature vectors extracted from the characters scanned in each part of the picture are put through coarse template classification and fine template matching against the feature template library to recognize the characters;
step S315: judging whether the picture uploaded by the user is the required material: matching the extracted characters against keywords; if they meet the requirements, storing the picture and character information, and if not, reminding the user to upload again.
Step S32: intelligent system screening of the data uploaded by the client: using the State Grid marketing business system, first screen whether the client has uploaded the required materials and, if not, prompt that the client's materials are wrong and must be uploaded again; then compare attributes shared between the recognized information and the application information the client filled in online; if the key information is consistent, the system dispatches the work order automatically; if inconsistent, the comparison result is sent to the relevant department for manual rechecking;
step S33: manual auxiliary processing: first, automatically dispatched work orders are manually spot-checked on a regular basis; second, for work orders whose information comparison is inconsistent, the system provides detailed results, and after manual review the screening opinion is fed back to the client, who is asked to adjust the uploaded electronic materials; finally, the system summarizes failure cases in which manual review, client feedback and sampling checks are all normal but the system screening flagged an anomaly.
as shown in fig. 6, in this embodiment, the step S4 specifically includes the following steps:
step S41: establishing the data source: the data derive from internal data and third-party data of the electric power app, the marketing business system, the power supply business hall and the customer service system, and the related data include the user name, electricity-use address, business work order information, payment records, channel usage records, complaints, fault repairs, customer contact records, electricity collection information and business expansion information;
step S42: cleaning, converting and combining the data along four dimensions, namely basic user attributes, recently handled services, historically handled services, and the list of services not yet handled, to obtain a wide table of user data;
step S43: clustering power customers with similar characteristics using a k-means-based clustering algorithm and separating power customers with different characteristics into groups; cluster analysis groups data objects according to the descriptive and relational information found in the data, so that objects within a group are similar while objects in different groups differ; the assignment and update steps are repeated until the iteration converges and the centroids no longer change, at which point the clustering result is output;
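A minimal sketch of the k-means grouping in step S43, using scikit-learn; the feature columns stand in for the user wide table of step S42 and are assumptions for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical wide-table features per customer: [monthly usage (kWh),
# payments per year, services handled recently, complaints].
X = np.array([[320, 12, 2, 0],
              [310, 12, 1, 0],
              [90,   4, 0, 1],
              [85,   3, 0, 2],
              [640, 12, 5, 0]])
X_scaled = StandardScaler().fit_transform(X)   # put features on comparable scales

# Lloyd's algorithm: assign each point to its nearest centroid, recompute the
# centroids, and repeat until they stop moving (the convergence in step S43).
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_scaled)
print(km.labels_)   # group index per customer, labelled in step S44
```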
step S44: after clustering, observing the data of each group, assigning a corresponding label, and building customer-group portraits of the same behavior type;
step S45: from the customer portraits, intelligently analyzing and predicting, with a collaborative filtering algorithm, the list of services a user is likely to need to handle in the near term, and carrying out electricity-use reminders, service-handling reminders, electricity-plan reminders and online communication services in advance;
step S46: active information push: covering the two scenarios of online and offline service handling by clients, and, based on the intelligent analysis and prediction results, delivering a whole-process intelligent reminder service over channels including the online APP and offline manual service, across the three periods before the client handles a service, during handling, and after handling is complete.
In this embodiment, the step S45 specifically includes the following steps:
step S451: based on a collaborative filtering algorithm, calculating the similarity between user portraits from the portrait data of a client's different dimensions:
sim(u,i) = Σ_k w_k · sim(k,i)   (12)
where sim(u,i) represents the weight of user u on behavior i, w_k represents the similarity between user k and user u, and sim(k,i) represents the weight of user k on behavior i;
step S452: for a given user, after calculating the similarity between that user and every other user, obtain the top-100 associated users in descending order of score;
step S453: according to the recent electricity-business activity and services of the top-100 associated users, if some service is being handled among them, proactively ask the user via an information push whether they need to handle that service, providing a jump page for the service; if the user answers no, or does not answer, the item is recorded.
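A minimal sketch of steps S451–S452 under equation (12); cosine similarity between portrait vectors is an assumed similarity measure, since the text does not fix one, and the data are illustrative:

```python
import numpy as np

# Rows: users; columns: portrait dimensions (illustrative data only).
portraits = np.array([[1.0, 0.2, 0.0, 0.7],
                      [0.9, 0.1, 0.1, 0.8],
                      [0.0, 0.9, 0.8, 0.1]])
# behaviors[k, i] = weight of user k on behavior i (e.g., services handled).
behaviors = np.array([[1, 0, 0],
                      [1, 1, 0],
                      [0, 0, 1]])

def cosine_sim(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def predict_weights(u, top_n=100):
    """Equation (12): sim(u,i) = sum_k w_k * sim(k,i) over the top-N neighbours."""
    others = np.array([k for k in range(len(portraits)) if k != u])
    sims = np.array([cosine_sim(portraits[u], portraits[k]) for k in others])
    order = np.argsort(-sims)[:top_n]     # descending similarity, step S452
    return sims[order] @ behaviors[others[order]]

print(predict_weights(0))   # predicted behavior weights for user 0
```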
The above description is only a preferred embodiment of the present invention, and all equivalent changes and modifications made in accordance with the claims of the present invention should be covered by the present invention.

Claims (8)

1. An electric power business hall service method based on artificial intelligence is characterized in that: the method comprises the following steps:
step S1: carrying out customer identification;
step S2: a service acceptance stage; the power company provides a two-dimensional code to the client; after obtaining the two-dimensional code, the client logs into the business hall client by scanning the two-dimensional code when handling business online, and when handling business offline, staff obtain the client's related information by scanning the two-dimensional code;
and step S3: a data auditing stage; the picture file materials uploaded in the client's online business are audited using the system's picture identification technology;
and step S4: a service handling stage;
the step S1 specifically includes the steps of:
step S11: in a suitable lighting environment, neither overexposed nor too dark, collecting frontal face information of the client, with no occlusion of the facial features, using a camera, capturing a picture with clear face edges and no motion blur;
step S12: performing face detection on the collected face information: inputting the collected pictures of the client's face into a face detection model, which judges whether a given region of the picture belongs to a face; a region is judged to be a detected face region when it passes the detection of all the feature sub-windows;
step S13: preprocessing the face image: compressing the image to 20 × 20 pixels, converting the color image to grayscale, and stretching the gray distribution of the original image across the full gray-level range using a gray-stretching formula, obtaining face-image multidimensional feature vectors of identical size and identical gray-value range;
wherein the specific formula of the image gray stretching is:
N(i,j) = 255 × (I(i,j) − min) / (max − min)   (1)
where N(i,j) is the gray level of the pixel at position (i,j) after conversion, I(i,j) is the gray level of the pixel at position (i,j) before conversion, min is the minimum gray level over the pixels of the original image, and max is the maximum gray level over the pixels of the original image (a numerical sketch of this stretch is given after claim 1 below);
step S14: performing dimensionality reduction on the multi-dimensional feature vector of the face image obtained in the step S13 by adopting a principal component analysis method, and extracting a face feature coefficient matrix;
step S15: performing face matching and recognition: first, projecting the feature templates stored in the database onto the space W_PCA to obtain a projected feature coefficient matrix; then projecting the feature data of the collected face image onto the space W_PCA to obtain its projected feature coefficient matrix; measuring the Euclidean distance between the two projected feature coefficient matrices to obtain the face-matching similarity; given a similarity threshold, matching the collected face-image feature data against the feature templates stored in the database, and outputting the matched result when the similarity exceeds the threshold;
the step S4 specifically includes the following steps:
step S41: establishing the data source: the data derive from internal data and third-party data of the electric power app, the marketing business system, the power supply business hall and the customer service system, and the related data include the user name, electricity-use address, business work order information, payment records, channel usage records, complaints, fault repairs, customer contact records, electricity collection information and business expansion information;
step S42: cleaning, converting and combining the data along four dimensions, namely basic user attributes, recently handled services, historically handled services, and the list of services not yet handled, to obtain a wide table of user data;
step S43: clustering power customers with similar characteristics using a k-means-based clustering algorithm and separating power customers with different characteristics into groups; cluster analysis groups data objects according to the descriptive and relational information found in the data, so that objects within a group are similar while objects in different groups differ; the assignment and update steps are repeated until the iteration converges and the centroids no longer change, at which point the clustering result is output;
step S44: after clustering, observing the data of each group, assigning a corresponding label, and building customer-group portraits of the same behavior type;
step S45: from the customer portraits, intelligently analyzing and predicting, with a collaborative filtering algorithm, the list of services a user is likely to need to handle in the near term, and carrying out in advance services including electricity-use reminders, service-handling reminders, electricity-plan reminders and online communication;
step S46: active information push: covering the two scenarios of online and offline service handling by clients, and, based on the intelligent analysis and prediction results, delivering a whole-process intelligent reminder service over channels including the online APP and offline manual service, across the three periods before the client handles a service, during handling, and after handling is complete.
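A minimal numerical sketch of the gray-stretching formula (1) in claim 1 (NumPy; the 0–255 output range is an assumption consistent with full 8-bit gray levels):

```python
import numpy as np

def gray_stretch(img: np.ndarray) -> np.ndarray:
    """N(i,j) = 255 * (I(i,j) - min) / (max - min): spread the gray levels of
    the input across the full 8-bit range, as in formula (1)."""
    lo, hi = float(img.min()), float(img.max())
    if hi == lo:                 # flat image: nothing to stretch
        return np.zeros_like(img)
    return (255.0 * (img - lo) / (hi - lo)).astype(np.uint8)

face = np.array([[60, 80], [100, 120]], dtype=np.uint8)
print(gray_stretch(face))        # -> [[0, 85], [170, 255]]
```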
2. The artificial intelligence based electric power business hall service method according to claim 1, wherein: the step S12 specifically includes the following steps:
step S121: 2000 face pictures of size 20 × 20, 2000 non-face pictures of size 20 × 20, and 4000 face pictures taken from larger images are used as the positive and negative samples for face identification;
step S122: before training, preprocessing the pictures: exhaustively obtaining, by permutation and combination, all the Haar-like feature sub-images of the input 20 × 20 pictures; an ordinary-definition picture can exhaustively generate on the order of 100,000 feature sub-images;
step S123: distinguishing a human face from a non-human face by adopting an Adaboost algorithm, and training a human face detection model;
step S124: inputting the newly acquired face image into the trained face detection model and outputting a sub-window containing the whole face region.
3. The artificial intelligence based electric power business hall service method according to claim 2, characterized in that: the step S122 specifically includes the following steps:
step S1221: exhaustively obtaining sub-images of all Haar-like features of the input 20 × 20 size picture by permutation and combination;
step S1222: constructing an integral image to describe the matrix of global information, with the specific formula:
g(u,v) = Σ_{u′≤u} Σ_{v′≤v} I(u′,v′)   (2)
where g(u,v) represents the integral-image value at position (u,v), equal to the sum of all pixels of the original image in the upper-left direction of (u,v), from which Haar-like feature values are computed;
step S1223: once the integral image is constructed, the Haar-like feature value of a given sub-image is obtained from rectangle pixel sums and their differences, with the specific formula:
sum = g(x2,y2) − g(x1,y2) − g(x2,y1) + g(x1,y1)   (3)
where sum represents the pixel sum, used for the Haar-like feature values, of the sub-image rectangle with vertices (x1, y1), (x2, y1), (x1, y2) and (x2, y2); g(x2, y2) is the sum of all pixels in the upper-left direction of (x2, y2) in the original image, g(x1, y2) the sum in the upper-left direction of (x1, y2), g(x2, y1) the sum in the upper-left direction of (x2, y1), and g(x1, y1) the sum in the upper-left direction of (x1, y1).
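A minimal sketch of formulas (2) and (3) from claim 3: the integral image built with cumulative sums, and a constant-time rectangle pixel sum (the zero padding is an implementation convenience, not part of the claim):

```python
import numpy as np

def integral_image(img: np.ndarray) -> np.ndarray:
    """Formula (2): g(u,v) = sum of all pixels above and left of (u,v),
    padded with a zero row/column so rectangle queries need no edge cases."""
    g = img.cumsum(axis=0).cumsum(axis=1)
    return np.pad(g, ((1, 0), (1, 0)))

def rect_sum(g: np.ndarray, x1: int, y1: int, x2: int, y2: int) -> int:
    """Formula (3): pixel sum of img[y1:y2, x1:x2] from four table lookups."""
    return int(g[y2, x2] - g[y2, x1] - g[y1, x2] + g[y1, x1])

img = np.arange(16).reshape(4, 4)
g = integral_image(img)
assert rect_sum(g, 1, 1, 3, 3) == img[1:3, 1:3].sum()
# Haar-like feature values are then differences of such rectangle sums.
```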
4. The artificial intelligence based electric power business hall service method according to claim 2, wherein: the step S123 specifically includes the following steps:
step S1231: in face detection, the Adaboost algorithm is used to select the rectangular features that best represent a face, i.e., the weak classifiers; first, learning over a number of 20 × 20 face-image and non-face-image training samples yields the first weak classifier for a face, i.e., a Haar-like feature value over the training set; the Haar-like feature value of an input image is then computed and compared with the weak classifier's feature value: the input image is judged to be a face when its feature value exceeds the weak classifier's, and otherwise judged not to be a face;
step S1232: training the same weak classifier on different training sets, then integrating the weak classifiers obtained from the different training sets into a strong classifier to improve the accuracy of face recognition, which specifically comprises the following steps:
step Sa: setting the training sample set, the numbers of positive and negative training samples, and the maximum number of training cycles;
step Sb: initializing each sample weight to 1/N, the reciprocal of the number of samples, i.e., the initial probability distribution of the training samples;
step Sc: carrying out the first iteration of training to obtain the first optimal weak classifier;
step Sd: increasing the weights of the samples misjudged in the previous round;
step Se: putting new samples together with the previously misclassified samples for a new round of training;
step Sf: executing step Sd and step Se in a loop; after T rounds, T optimal weak classifiers are obtained and combined into a strong classifier; when the accuracy of the strong classifier reaches 95%, the training result is considered ideal and the strong classifier is output;
step S1233: executing step S1232 repeatedly to obtain several strong classifiers, then connecting the trained strong classifiers in series to form a cascade classifier for face-image recognition; because training takes a large picture as input, the picture is cut over multiple regions and scales into many sub-windows, which are fed into the cascade classifier in turn for detection; if a sub-window passes all the strong classifiers, a face is output; if any strong-classifier stage rejects it, the sub-window is returned; when the recognition accuracy of the cascade classifier reaches 95%, the training result is considered ideal and the classifier is output.
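As a hedged illustration of the cascade detection flow in claims 2 and 4 (using OpenCV's pretrained Haar cascade in place of the classifier trained in steps S121–S1233; file names are illustrative):

```python
import cv2

# OpenCV ships pretrained Haar cascade XMLs; this stands in for the cascade
# of strong classifiers assembled in step S1233.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("client_photo.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# detectMultiScale slides multi-scale sub-windows over the image and keeps
# only those that pass every stage of the cascade, as described above.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                 minSize=(20, 20))
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detected.png", img)
```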
5. The artificial intelligence based electric power business hall service method according to claim 1, wherein: the step S14 specifically includes the following steps:
step S141: representing the face-image feature vector as a one-dimensional column vector by stacking; the expression of the face-image feature vector is:
f_i = (f_i(1,1), f_i(2,1), …, f_i(N,1), f_i(1,2), …, f_i(N,M))^T   (4)
where f_i denotes the one-dimensional column vector formed from the i-th face image by taking the image matrix column by column, from the first column to the last, each column joined end to end; N and M denote the length and width of the image;
step S142: the column vector of each face image in the sample set is transposed in turn into a row vector, forming the face sample matrix, with the specific expression:
f = {f_i(m,n)} = (f_1, f_2, …, f_k, …, f_L)^T   (5)
where f denotes the sample matrix, i.e., the image set, L denotes the total number of training samples, and f_i denotes the one-dimensional row vector formed from an m × n face image;
step S143: on the basis of step S142, the covariance matrix of the training samples is:
[C_f] = E{(f − m_f)(f − m_f)^T}   (6)
where [C_f] is the covariance matrix of the training samples, m_f is the mean vector over all training samples, and (f − m_f) is the vector of the original face sample relative to the average face;
step S144: obtaining the eigenvalues and their mutually orthogonal eigenvectors:
[C_f] a_i = λ_i a_i   (7)
where λ_i denotes the i-th eigenvalue of [C_f] and a_i denotes the corresponding i-th eigenvector;
step S145: the eigenvalues are then arranged in descending order, and the eigenvectors corresponding to each eigenvalue form an orthogonal matrix, i.e., a multidimensional orthogonal space; transforming equation (6) into the matrix form
[C_f] = (1/L) Σ_{k=1}^{L} (f_k − m_f)(f_k − m_f)^T
the simplified result is as follows:
C = (1/L)(f − m_f)(f − m_f)^T   (8)
where, with f the sample matrix of step S142, C is an L × L matrix;
step S146: solving the eigenvalues and eigenvectors of the matrix in equation (8), then applying SVD (singular value decomposition) to the solved eigenvectors and eigenvalues to obtain the eigenvectors of the original training samples, with the specific expression:
u_i = (f − m_f)^T v_i / √(L λ_i),  i = 1, …, p   (9)
where u_i denotes the face projection of the i-th feature, v_i is an eigenvector of the covariance matrix in (8), and p is the number of eigenvectors;
step S147: obtaining the projection feature space W_PCA needed for PCA face recognition, with the specific expression:
W_PCA = (u_1, u_2, …, u_{p−1}, u_p)   (10)
step S148: performing the projection calculation via the K-L transform formula to obtain each sample's feature coefficient matrix in the space W_PCA:
g = [A](f − m_f)   (11)
where g is the matrix of projected feature coefficients, each row of g representing the feature coefficients of one face sample, and [A] denotes the space W_PCA.
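A minimal NumPy sketch of the eigenface pipeline in claim 5, formulas (4)–(11), including the small-matrix simplification of equations (8)–(9) and the Euclidean matching of step S15 in claim 1; the random sample data, the number of retained components p, and the nearest-match decision (in place of a similarity threshold) are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
L, N, M = 40, 20, 20
faces = rng.random((L, N * M))          # stacked sample vectors, formulas (4)-(5)

m_f = faces.mean(axis=0)                # average face
X = faces - m_f                         # (f - m_f), formula (6)

# Small-matrix simplification, equation (8): eigendecompose the L x L matrix
# (1/L) X X^T instead of the (N*M) x (N*M) covariance matrix.
lam, V = np.linalg.eigh(X @ X.T / L)
order = np.argsort(lam)[::-1]           # eigenvalues in descending order, step S145
lam, V = lam[order], V[:, order]

p = 10                                  # number of retained eigenfaces (assumed)
U = (X.T @ V[:, :p]) / np.sqrt(L * lam[:p])   # u_i, formula (9)
W_pca = U                               # projection space W_PCA, formula (10)

g = X @ W_pca                           # coefficients g = [A](f - m_f), formula (11)

# Step S15: match a probe face against the gallery by Euclidean distance.
probe = (faces[3] + 0.01 * rng.random(N * M) - m_f) @ W_pca
dists = np.linalg.norm(g - probe, axis=1)
print(int(np.argmin(dists)))            # -> 3, the matching gallery sample
```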
6. The artificial intelligence based electric power business hall service method according to claim 1, wherein: the step S2 specifically includes the following steps:
step S21: two-dimensional code creation: collecting information including the account number, the homeowner's name, the detailed electricity-use address, the district manager's name, the district manager's contact telephone, and the palm electricity app installation path;
the method specifically comprises the following steps:
step S211: data analysis: determining the type of the characters to be encoded and converting them into symbol characters according to the corresponding character set; selecting an error correction level from L, M, Q and H, where, for a given symbol specification, the higher the error correction level, the smaller the capacity available for real data;
step S212: data encoding: converting the data characters into a bit stream in which every 8 bits form one codeword, yielding the codeword sequence of the data as a whole; the codeword sequence carries the actual data content of the two-dimensional code; the specific encoding process first groups the numerals, letters and Chinese characters to be encoded, then converts the content of each group into binary form, merges the sequences, and finally prepends a mode indicator to the sequence;
step S213: error correction coding: partitioning the codeword sequence formed in the previous step as required, generating error correction codewords from the partitioned codewords according to the error correction level, and appending them after the data codeword sequence to form a new sequence; error correction coding improves the tolerance of the two-dimensional code to dirt and damage;
step S214: constructing the final data information: with the specification determined, the sequences formed in steps S211 to S213 are placed into the blocks in order, each block is then processed to obtain its corresponding error correction codeword block, and the error correction codeword blocks are assembled in order into a sequence appended after the original data codeword sequence;
step S215: constructing the matrix: placing the finder patterns, separators, timing patterns, alignment patterns and codeword modules into a matrix, and filling the complete sequence into the region of the two-dimensional code matrix of the corresponding specification;
step S216: masking: applying a mask pattern to the encoding region of the symbol so that dark and light modules of the two-dimensional code are distributed in an optimal ratio;
step S217: format and version information: generating the format and version information and placing it into the corresponding regions; versions 7 to 40 carry version information, and for versions without it the field is all zeros; the version information appears redundantly at two positions on the two-dimensional code; it is 18 bits, a 6×3 matrix, of which 6 bits are data bits and the following 12 bits are error correction bits;
step S22: two-dimensional code provision: the power company provides two-dimensional code information to customers in various ways, including online self-service acquisition, business hall printing, community property cooperation, and affixing to the meter box;
step S23: two-dimensional code acquisition: the customer obtains the two-dimensional code information in various ways, including online self-service printing, business hall printing, counter printing, the meter box location, property move-in materials, and the property management office;
step S24: two-dimensional code application: online, scanning the two-dimensional code retrieves information including the account number, the homeowner's name and the detailed electricity-use address; offline, the two-dimensional code is used for identity authentication when business is transacted.
7. The artificial intelligence based electric power business hall service method according to claim 1, wherein: the step S3 specifically includes the following steps:
step S31: picture acquisition and conversion, recognizing the characters in the picture with OCR picture recognition technology; when a client transacts business online or offline, uploaded picture files, including identity cards, bank cards and bills, are recognized and archived using OCR;
the method specifically comprises the following steps:
step S311: collecting data via the camera lens or by file upload; a document shooting area is set according to the type of document being uploaded, and when a user uploads a picture the document is shot and uploaded according to the shooting requirements, leaving an appropriate margin around the document to ensure the uploaded picture contains all the required information;
step S312: image preprocessing: first rotating the picture so that it is aligned horizontally and vertically; adjusting the receipt data using the adaptive thresholding function adaptive_threshold in OpenCV together with the scikit-image framework; binarizing with the adaptive_threshold method to obtain the binarized picture data; using scikit-image to keep white pixels in high-gradient regions and black pixels in low-gradient regions, yielding a high-contrast sample picture; and cropping the picture to obtain the bill information;
step S313: picture segmentation: for a passage of multi-line text, text segmentation comprises two steps, line segmentation and character segmentation; first, the skew-corrected characters are projected onto the Y axis and all values are accumulated to obtain a histogram along the Y axis; second, a horizontal scan counting from left to right yields the number of black points per row, where regions with non-zero values are where characters lie and regions with zero values are the gaps between lines; then a vertical scan counting from top to bottom yields each individual character;
step S314: steps S311 to S313 each yield a small picture block; the feature vectors extracted from the characters scanned in each part of the picture are put through coarse template classification and fine template matching against the feature template library to recognize the characters;
step S315: judging whether the picture uploaded by the user is the required material: matching the extracted characters against keywords; if they meet the requirements, storing the picture and character information, and if not, reminding the user to upload again;
step S32: intelligent system screening of the data uploaded by the client: using the State Grid marketing business system, first screen whether the client has uploaded the required materials and, if not, prompt that the client's materials are wrong and must be uploaded again; then compare attributes shared between the recognized information and the application information the client filled in online; if the key information is consistent, the system dispatches the work order automatically; if inconsistent, the comparison result is sent to the relevant department for manual rechecking;
step S33: manual auxiliary processing: first, automatically dispatched work orders are manually spot-checked on a regular basis; second, for work orders whose information comparison is inconsistent, the system provides detailed results, and after manual review the screening opinion is fed back to the client, who is asked to adjust the uploaded electronic materials; finally, the system summarizes failure cases in which manual review, client feedback and sampling checks are all normal but the system screening flagged an anomaly.
8. The artificial intelligence based electric power business hall service method according to claim 1, wherein: the step S45 specifically includes the following steps:
step S451: based on a collaborative filtering algorithm, calculating the similarity between user portraits from the portrait data of a client's different dimensions:
sim(u,i) = Σ_k w_k · sim(k,i)   (12)
where sim(u,i) represents the weight of user u on behavior i, w_k represents the similarity between user k and user u, and sim(k,i) represents the weight of user k on behavior i;
step S452: for a given user, after calculating the similarity between that user and every other user, obtain the top-100 associated users in descending order of score;
step S453: according to the recent electricity-business activity and services of the top-100 associated users, if some service is being handled among them, proactively ask the user via an information push whether they need to handle that service, providing a jump page for the service; if the user answers no, or does not answer, the item is recorded.
CN202110545056.5A 2021-05-19 2021-05-19 Electric power business hall service method based on artificial intelligence Active CN113240394B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110545056.5A CN113240394B (en) 2021-05-19 2021-05-19 Electric power business hall service method based on artificial intelligence

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110545056.5A CN113240394B (en) 2021-05-19 2021-05-19 Electric power business hall service method based on artificial intelligence

Publications (2)

Publication Number Publication Date
CN113240394A CN113240394A (en) 2021-08-10
CN113240394B true CN113240394B (en) 2023-04-07

Family

ID=77137639

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110545056.5A Active CN113240394B (en) 2021-05-19 2021-05-19 Electric power business hall service method based on artificial intelligence

Country Status (1)

Country Link
CN (1) CN113240394B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114066607A (en) * 2021-11-17 2022-02-18 中国银行股份有限公司 Bank branch business handling method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102364527A (en) * 2011-10-21 2012-02-29 中国科学技术大学 Real-time identity recognition and authentication method for self-service equipment system of bank
CN106096517A (en) * 2016-06-01 2016-11-09 北京联合大学 A kind of face identification method based on low-rank matrix Yu eigenface
CN108052864B (en) * 2017-11-17 2019-04-26 平安科技(深圳)有限公司 Face identification method, application server and computer readable storage medium
CN109544096A (en) * 2018-10-18 2019-03-29 国网福建省电力有限公司 Low pressure industry based on artificial intelligence expands business handling method and device on line
CN111881315A (en) * 2020-06-24 2020-11-03 华为技术有限公司 Image information input method, electronic device, and computer-readable storage medium

Also Published As

Publication number Publication date
CN113240394A (en) 2021-08-10

Similar Documents

Publication Publication Date Title
KR100543707B1 (en) Face recognition method and apparatus using PCA learning per subgroup
Samaria Face recognition using hidden Markov models
Torralba Contextual priming for object detection
US7596247B2 (en) Method and apparatus for object recognition using probability models
CN102906787A (en) Facial analysis techniques
EP2808827A1 (en) System and method for OCR output verification
JP2016134175A (en) Method and system for performing text-to-image queries with wildcards
JPH1055444A (en) Recognition of face using feature vector with dct as base
CN103605972A (en) Non-restricted environment face verification method based on block depth neural network
KR20040037180A (en) System and method of face recognition using portions of learned model
Machidon et al. Face recognition using eigenfaces, geometrical pca approximation and neural networks
CN111695456A (en) Low-resolution face recognition method based on active discriminability cross-domain alignment
CN116110089A (en) Facial expression recognition method based on depth self-adaptive metric learning
CN110555386A (en) Face recognition identity authentication method based on dynamic Bayes
CN110580510A (en) clustering result evaluation method and system
CN117151870A (en) Portrait behavior analysis method and system based on guest group
CN113240394B (en) Electric power business hall service method based on artificial intelligence
CN108932501A (en) A kind of face identification method being associated with integrated dimensionality reduction based on multicore
CN114971294A (en) Data acquisition method, device, equipment and storage medium
Schettini et al. Automatic classification of digital photographs based on decision forests
CN114357307A (en) News recommendation method based on multi-dimensional features
Reddy et al. Comparison of HOG and fisherfaces based face recognition system using MATLAB
Fan Efficient multiclass object detection by a hierarchy of classifiers
Machhour Efficient Image Retrieval Based on Support Vector Machine and Genetic Algorithm Using Color, Texture and Shape Features
Zhao et al. A head pose estimation method based on multi-feature fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant