CN111582184B - Page detection method, device, equipment and storage medium - Google Patents

Page detection method, device, equipment and storage medium

Info

Publication number
CN111582184B
CN111582184B
Authority
CN
China
Prior art keywords
detected
page
image
region
gradient
Prior art date
Legal status
Active
Application number
CN202010392403.0A
Other languages
Chinese (zh)
Other versions
CN111582184A (en)
Inventor
陈宗文
杜义明
Current Assignee
Hanhai Information Technology Shanghai Co Ltd
Original Assignee
Hanhai Information Technology Shanghai Co Ltd
Priority date
Filing date
Publication date
Application filed by Hanhai Information Technology Shanghai Co Ltd filed Critical Hanhai Information Technology Shanghai Co Ltd
Priority to CN202010392403.0A
Publication of CN111582184A
Application granted
Publication of CN111582184B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G06V 10/507 Summing image-intensity values; Histogram projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/02 Recognising information on displays, dials, clocks
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a page detection method, device, equipment and storage medium, and belongs to the technical field of computers. The method comprises the following steps: acquiring an image to be detected corresponding to a page to be detected and gradient histogram information corresponding to a negative sample; dividing the image to be detected into regions to obtain regions to be detected, and determining the gradient histogram corresponding to each region to be detected; matching the gradient histogram corresponding to a region to be detected with a candidate gradient histogram, wherein the candidate gradient histogram is the gradient histogram corresponding to the candidate region in a negative sample that satisfies the matching condition; and determining that the page to be detected has failed to load in response to successful matching of the gradient histogram corresponding to any region to be detected. Because each region to be detected is detected separately in this process, the detection granularity is finer and a loading failure confined to a single region can still be identified; in addition, the gradient histogram carries rich information, which improves the accuracy of judging whether the page has failed to load.

Description

Page detection method, device, equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a page detection method, device, equipment and storage medium.
Background
With the development of computer technology, more applications and web pages are installed on terminals. Such as map-like applications, navigation-like web pages, etc. During the use of these applications or web pages, page loading failures may occur, such as map page loading failure, shopping page loading failure, payment page loading failure, etc. How to accurately detect the page loading failure is a key for timely solving the problem of loading failure and further recovering the normal service of the application program or the webpage.
In the related art, the gray-level histogram of the whole image to be detected corresponding to the page to be detected is matched with the gray-level histogram of a standard image captured when the page is loaded successfully, and whether the page has failed to load is judged according to the matching result. In this page detection process, the whole image to be detected is checked at once by a matching method based on the gray-level histogram, so the detection granularity is coarse, the accuracy of judging whether the page has failed to load is low, and the page detection effect is poor.
Disclosure of Invention
The embodiment of the application provides a page detection method, device, equipment and storage medium, which can be used for solving the problems in the related art. The technical scheme is as follows:
In one aspect, an embodiment of the present application provides a page detection method, where the method includes:
acquiring an image to be detected corresponding to a page to be detected and gradient histogram information corresponding to a negative sample, wherein the negative sample is obtained by training on images corresponding to pages that failed to load, and the gradient histogram information corresponding to the negative sample is used for indicating gradient histograms corresponding to regions at different positions in the negative sample;
dividing the image to be detected into areas to obtain an area to be detected, and determining a gradient histogram corresponding to the area to be detected;
matching the gradient histogram corresponding to the region to be detected with a candidate gradient histogram, wherein the candidate gradient histogram is the gradient histogram corresponding to a candidate region, and the candidate region is a region which is in a position corresponding to the region to be detected in a negative sample meeting a matching condition;
and determining that the page to be detected has failed to load in response to successful matching of the gradient histogram corresponding to any region to be detected.
In another aspect, there is provided a page detection apparatus, the apparatus including:
the acquisition module is used for acquiring a to-be-detected image corresponding to a to-be-detected page and gradient histogram information corresponding to a negative sample, wherein the negative sample is obtained based on image training corresponding to the page with loading failure, and the gradient histogram information corresponding to the negative sample is used for indicating gradient histograms corresponding to areas at different positions in the negative sample;
The dividing module is used for dividing the region of the image to be detected to obtain a region to be detected;
the first determining module is used for determining a gradient histogram corresponding to the region to be detected;
the matching module is used for matching the gradient histogram corresponding to the region to be detected with a candidate gradient histogram, wherein the candidate gradient histogram is the gradient histogram corresponding to a candidate region, and the candidate region is the region, in a negative sample satisfying the matching condition, at the position corresponding to the region to be detected;
and the second determining module is used for determining that the page to be detected fails to be loaded in response to successful matching of the gradient histograms corresponding to any one of the areas to be detected.
Optionally, the dividing module is further configured to divide the to-be-detected image into regions according to a first granularity and a second granularity, so as to obtain a to-be-detected region with a first size and a to-be-detected region with a second size;
the first determining module is further configured to determine a gradient histogram corresponding to the region to be detected of the first size and a gradient histogram corresponding to the region to be detected of the second size.
Optionally, the first size is larger than the second size, the candidate gradient histogram includes a first candidate gradient histogram and a second candidate gradient histogram, and the matching module is further configured to match a gradient histogram corresponding to the region to be detected of the first size with the first candidate gradient histogram; and matching the gradient histogram corresponding to the region to be detected with the second size with the second candidate gradient histogram in response to failure in matching the gradient histograms corresponding to the region to be detected with the first size.
Optionally, the second determining module is further configured to determine that the page to be detected fails to be loaded in response to successful matching of the gradient histograms corresponding to the area to be detected of any first size; or determining that the page to be detected fails to be loaded in response to failure in matching of the gradient histograms corresponding to the areas to be detected in the first size and success in matching of the gradient histograms corresponding to the areas to be detected in any second size.
Optionally, the apparatus further comprises:
the edge detection module is used for performing edge detection on the image to be detected in response to failure of matching of the gradient histograms corresponding to each of the regions to be detected;
and the second determining module is further configured to determine that the page to be detected fails to be loaded in response to the result of edge detection indicating that the target shape does not exist in the image to be detected.
Optionally, the acquiring module is further configured to acquire a color value of at least one standard layer corresponding to the page to be detected;
the apparatus further comprises:
the operation module is used for responding to the result of edge detection to indicate that a target shape exists in the image to be detected, and performing binarization operation on the image to be detected according to the color value of at least one standard image layer corresponding to the page to be detected;
And the second determining module is further configured to determine that the page to be detected fails to be loaded in response to an image obtained by performing binarization operation on the image to be detected according to the color value of any standard image layer not meeting a color condition.
Optionally, the acquiring module is further configured to acquire a color value of at least one standard layer corresponding to the page to be detected;
the operation module is further configured to perform binarization operation on the image to be detected according to the color value of at least one standard image layer corresponding to the page to be detected in response to failure in matching the gradient histograms corresponding to the area to be detected;
and the second determining module is further configured to determine that the page to be detected fails to be loaded in response to an image obtained by performing binarization operation on the image to be detected according to the color value of any standard image layer not meeting a color condition.
Optionally, the apparatus further comprises:
the acquisition module is used for acquiring equipment information;
and the uploading module is used for uploading the equipment information and the image to be detected to a server.
In another aspect, a computer device is provided, the computer device including a processor and a memory, the memory storing at least one program code, the at least one program code loaded and executed by the processor to implement any of the above-described page detection methods.
In another aspect, there is provided a computer readable storage medium having at least one program code stored therein, the at least one program code loaded and executed by a processor to implement any of the above page detection methods.
The technical scheme provided by the embodiment of the application at least brings the following beneficial effects:
The image to be detected is divided into regions to be detected, the gradient histogram corresponding to each region to be detected is matched with the candidate gradient histogram corresponding to the candidate region in a negative sample, and the page to be detected is determined to have failed to load when the gradient histogram corresponding to any region to be detected is successfully matched. In this page detection process, each region to be detected is checked separately, so the detection granularity is fine and a page whose loading failed in only a certain region can be identified. In addition, the gradient histogram carries rich information, so detection based on the gradient histogram improves the accuracy of judging whether the page has failed to load, and the page detection effect is good.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an implementation environment of a page detection method according to an embodiment of the present application;
FIG. 2 is a flowchart of a method for detecting a page according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an image to be detected and the gradient histogram corresponding to the image to be detected according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an image obtained in the process of performing shape detection on an image to be detected corresponding to a map page according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an image to be detected and the image obtained after binarizing the image to be detected according to the color value of a text label layer according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a page detection process according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a page detection apparatus according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a page detection device according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a server according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
With the development of computer technology, more applications and web pages are installed on terminals. Such as map-like applications, navigation-like web pages, etc. During the use of these applications or web pages, page loading failures may occur, such as map page loading failure, shopping page loading failure, payment page loading failure, etc. How to accurately detect the page loading failure is a key for timely solving the problem of loading failure and further recovering the normal service of the application program or the webpage.
By way of example, taking an application as a map-class application, during the use of the map-class application, a map page loading failure (e.g., text label loading failure, navigation route loading failure, etc.) may occur. At this time, only if the page loading failure is detected in time, the normal map service can be restored by solving the problem of map loading failure. It should be noted that, the manner in which different map class applications recover normal map services may be different. For example, for a unified map class application that encapsulates maps provided by multiple service parties, when a failure of loading a map page of a certain map is detected, the normal map service may be restored by dynamically switching to the map page of another map seamlessly.
The embodiment of the application provides a page detection method for judging whether a page fails to be loaded or not. Referring to fig. 1, a schematic diagram of an implementation environment of a page detection method according to an embodiment of the present application is shown. The implementation environment may include: a terminal 11 and a server 12.
The terminal 11 is provided with an application program or a web page capable of displaying a page, and when the page displayed by the application program or the web page needs to be detected, the method provided by the embodiment of the application can be applied to page detection. The server 12 may train a negative sample for judging whether the page fails to be loaded, and then send gradient histogram information of the negative sample to the terminal 11, and the terminal 11 detects an image to be detected corresponding to the page to be detected based on the gradient histogram information of the negative sample, so as to judge whether the page fails to be loaded. Of course, the terminal 11 may also send the image to be detected corresponding to the page to be detected to the server 12, and the server 12 detects the image to be detected corresponding to the page to be detected based on the gradient histogram information of the negative sample, so as to determine whether the page fails to be loaded.
Alternatively, the terminal 11 may be any electronic product that can perform man-machine interaction with a user through one or more of a keyboard, a touch pad, a touch screen, a remote controller, voice interaction or a handwriting device, such as a PC (Personal Computer), a mobile phone, a smart phone, a PDA (Personal Digital Assistant), a wearable device, a pocket PC (PPC), a tablet computer, a smart car machine, a smart television, a smart speaker, etc. The server 12 may be one server, a server cluster comprising a plurality of servers, or a cloud computing service center. The terminal 11 establishes a communication connection with the server 12 through a wired or wireless network.
Those skilled in the art will appreciate that the above-described terminal 11 and server 12 are by way of example only, and that other terminals or servers, either now present or later, may be suitable for use in the present application, and are intended to be within the scope of the present application and are incorporated herein by reference.
Based on the implementation environment shown in fig. 1, the embodiment of the present application provides a page detection method, which is applied to the terminal 11 as an example. As shown in fig. 2, the method provided in the embodiment of the present application may include the following steps:
in step 201, gradient histogram information corresponding to the image to be detected and the negative sample corresponding to the page to be detected is obtained.
The negative sample is obtained based on image training corresponding to the page with loading failure, and gradient histogram information corresponding to the negative sample is used for indicating gradient histograms corresponding to areas at different positions in the negative sample.
The page to be detected refers to any page of the application program or the webpage installed by the terminal, for example, when the application program installed by the terminal is a map application program, the page to be detected may refer to a top page of the map application program, or may refer to a navigation page in the use process of the map application program, etc.
The image to be detected corresponding to the page to be detected may be an image obtained after the screenshot is performed on a visible area in the page to be detected, and the image may be an RGB (Red Green Blue) image. Optionally, the terminal may be provided with a page detection system, and the process of acquiring the image to be detected corresponding to the page to be detected by the terminal may be: responding to a detection instruction of the page to be detected, and calling a page detection system by the terminal to capture a visible area of the page to be detected to obtain a to-be-detected image corresponding to the page to be detected. The detection instruction of the page to be detected may be triggered automatically when the page to be detected is displayed, or may be triggered manually by a user, which is not limited in the embodiment of the present application.
Alternatively, the page detection system may be built based on a page detection SDK (Software Development Kit). It should be noted that the embodiment of the present application does not limit the start timing of the page detection system; the page detection system may be started before being invoked, for example, when the terminal starts a certain application program or web page, or may be started when a detection instruction of a page to be detected is detected, and so on.
The negative sample is a sample of the case in which page loading fails and is used for detecting whether the page corresponding to the image to be detected has failed to load. It should be noted that the negative sample may be a negative sample common to all pages to be detected, or a negative sample specific to the application program or web page corresponding to the page to be detected. Whether a common negative sample or a sample specific to a certain application program or web page is used, compared with the related art, which requires a standard image (positive sample) of the successfully loaded page for every page to be detected, the embodiment of the application needs a smaller number of samples and achieves a better page detection effect.
Alternatively, the negative samples may be trained based on images corresponding to pages that failed to be loaded. The page that fails to load may refer to one or more of a page that fails to load locally, a page that fails to load shape, and a page that fails to load layer. The process of training to obtain the negative sample may be performed by the terminal or may be performed by the server, which is not limited in this embodiment of the present application. The process of training to obtain the negative sample in the embodiment of the application is described as an example by the server.
The server trains and obtains the negative sample based on images corresponding to pages that failed to load; these images may be uploaded to the server by the service side of the application program or web page, or by users of the application program or web page. After obtaining the images corresponding to pages that failed to load, the server trains on them to obtain a negative sample. Illustratively, the training procedure may be: extracting the features of each image, and training based on the features of each image to obtain an image that fuses the features of the multiple images as the negative sample. Optionally, in the process of training on the images corresponding to pages that failed to load, the images may first be clustered, and training is then performed on each class of images separately to obtain the negative sample corresponding to each class. The clustering criterion may be the loading failure reason, the loading failure area, and so on, which is not limited in the embodiment of the present application.
After the negative sample is obtained through training, the server can obtain gradient histogram information corresponding to the negative sample, wherein the gradient histogram information corresponding to the negative sample is used for indicating gradient histograms corresponding to areas at different positions in the negative sample. It should be noted that the number of negative samples may be one or more, and different negative samples may correspond to different loading failure reasons, different loading failure areas, different zoom levels, etc. When the number of the negative samples is multiple, the server respectively acquires gradient histogram information corresponding to each negative sample.
Alternatively, the gradient histogram information may refer to the whole gradient histogram corresponding to the negative sample, where the size of the whole gradient histogram is consistent with the size of the negative sample, and the gradient histogram corresponding to a region at any position in the negative sample can be cropped from it. The gradient histogram information may also refer to the gradient histograms corresponding to the individual unit areas in the negative sample; by concatenating the gradient histograms of several unit areas, the gradient histogram corresponding to a region at any position in the negative sample can be obtained.
Alternatively, the entire gradient histogram corresponding to the negative sample may be obtained based on the gradient histograms corresponding to the respective unit areas in the negative sample. The manner of obtaining the whole gradient histogram corresponding to the negative sample based on the gradient histogram corresponding to each unit area in the negative sample includes, but is not limited to, the following three ways:
Mode 1: and directly connecting the gradient histograms corresponding to each unit area in the negative sample in sequence to obtain the whole gradient histogram corresponding to the negative sample.
In this manner 1, the efficiency of obtaining the entire gradient histogram corresponding to the negative sample is high.
Mode 2: and carrying out normalization processing on the gradient histograms corresponding to the unit areas in the negative samples, and connecting the gradient histograms corresponding to the unit areas after the normalization processing in sequence to obtain the whole gradient histogram corresponding to the negative samples.
In this manner 2, the influence of illumination can be eliminated to some extent.
Mode 3: and merging the unit areas in the negative samples into larger blocks, carrying out normalization processing on the gradient histograms corresponding to the unit areas in the blocks according to the blocks, and connecting the gradient histograms corresponding to the unit areas after the normalization processing in sequence to obtain the whole gradient histogram corresponding to the negative samples.
In this manner 3, the individual cell regions are combined into large, spatially connected blocks. The different blocks overlap each other, so that the gradient histogram of each unit area appears in the whole gradient histogram corresponding to the negative sample multiple times with different results. The whole gradient histogram corresponding to the negative sample obtained according to the mode 3 is closer to the real characteristics of the negative sample.
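For illustration only, the following Python sketch shows how mode 1 and mode 3 could be implemented once the gradient histograms of the unit areas are available; the 2×2 block of unit areas with a stride of 1 and the L2 normalization are assumed choices that the application does not fix, and all names are illustrative.

```python
import numpy as np

def concat_histograms(cell_hists: np.ndarray) -> np.ndarray:
    """Mode 1: directly concatenate the unit-area gradient histograms in order.

    cell_hists has shape (rows, cols, bins): one histogram per unit area."""
    return cell_hists.reshape(-1)

def block_normalized_histogram(cell_hists: np.ndarray, block: int = 2) -> np.ndarray:
    """Mode 3: merge unit areas into overlapping block x block blocks,
    L2-normalize the histograms inside each block, then concatenate."""
    rows, cols, _ = cell_hists.shape
    features = []
    for r in range(rows - block + 1):
        for c in range(cols - block + 1):
            v = cell_hists[r:r + block, c:c + block].reshape(-1)
            norm = np.linalg.norm(v) + 1e-6  # avoid division by zero
            features.append(v / norm)
    return np.concatenate(features)
```

Because the blocks overlap, each unit-area histogram contributes several times with different normalization, which matches the behaviour described for mode 3.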
No matter which way to obtain the whole gradient histogram corresponding to the negative sample based on the gradient histogram corresponding to each unit area in the negative sample is the above way, the gradient histogram corresponding to each unit area in the negative sample needs to be obtained before the whole gradient histogram corresponding to the negative sample is obtained. Alternatively, the gradient histogram in the embodiment of the present application may refer to a visualized gradient histogram, so as to facilitate visual comparison.
A negative sample is essentially an image, so the process of acquiring the gradient histogram corresponding to each unit area in an image is described next. The gradient histogram may refer to the HOG (Histogram of Oriented Gradients). Alternatively, the process of acquiring the gradient histograms corresponding to the unit areas in an image may include the following 3 steps:
(1) The image is preprocessed.
The purpose of the preprocessing is to normalize the image so as to reduce the effect of local shadows and illumination variations on the image. Alternatively, the preprocessing may be gamma correction of the image.
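As a minimal sketch of this preprocessing step, gamma correction of an 8-bit image could look as follows; the gamma value 0.5 is only an assumed example and is not specified by the application.

```python
import numpy as np

def gamma_correct(image: np.ndarray, gamma: float = 0.5) -> np.ndarray:
    """Normalize an 8-bit image with gamma correction to reduce the effect
    of local shadows and illumination variations (gamma value assumed)."""
    normalized = image.astype(np.float32) / 255.0  # scale pixel values to [0, 1]
    corrected = np.power(normalized, gamma)        # apply gamma correction
    return (corrected * 255.0).astype(np.uint8)    # scale back to 8-bit
```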
(2) A gradient is calculated for each pixel in the image.
The gradient of each pixel point comprises a gradient amplitude value and a gradient direction of each pixel point, and the gradient amplitude value and the gradient direction of each pixel point can be calculated based on the horizontal gradient and the vertical gradient of each pixel point. Therefore, before calculating the gradient of each pixel, it is necessary to calculate the horizontal gradient and the vertical gradient of each pixel. The horizontal gradient of each pixel point can be calculated based on formula 1, and the vertical gradient of each pixel point can be calculated based on formula 2:
Gx(x, y) = H(x+1, y) − H(x−1, y) (formula 1)
Gy(x, y) = H(x, y+1) − H(x, y−1) (formula 2)
where Gx(x, y) represents the horizontal gradient of the pixel with coordinates (x, y); Gy(x, y) represents the vertical gradient of the pixel with coordinates (x, y); H(x+1, y) represents the pixel value of the pixel with coordinates (x+1, y); H(x−1, y) represents the pixel value of the pixel with coordinates (x−1, y); H(x, y+1) represents the pixel value of the pixel with coordinates (x, y+1); and H(x, y−1) represents the pixel value of the pixel with coordinates (x, y−1).
In implementation, the horizontal gradient can be obtained by convolving the image with the gradient operator [-1, 0, 1] (with right as the positive direction), and the vertical gradient can be obtained by convolving the image with the gradient operator [-1, 0, 1]^T (with up as the positive direction).
After the horizontal gradient and the vertical gradient of each pixel have been calculated, the gradient magnitude and the gradient direction of each pixel can be calculated using formula 3 and formula 4 respectively, so as to obtain the gradient of each pixel:
G(x, y) = √(Gx(x, y)^2 + Gy(x, y)^2) (formula 3)
α(x, y) = arctan(Gy(x, y) / Gx(x, y)) (formula 4)
where G(x, y) represents the gradient magnitude of the pixel with coordinates (x, y); Gx(x, y) represents the horizontal gradient of that pixel; Gy(x, y) represents the vertical gradient of that pixel; and α(x, y) represents the gradient direction of the pixel with coordinates (x, y).
It should be noted that, for an RGB image, gradients (gradient magnitude and gradient direction) are calculated on three channels of each pixel point. Taking the maximum gradient amplitude value on the three channels as the gradient amplitude value of the pixel point, and taking the gradient direction corresponding to the maximum gradient amplitude value as the gradient direction of the pixel point.
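The following sketch illustrates formulas 1 to 4 together with the per-pixel selection of the strongest of the three RGB channels described above; all function and variable names are illustrative assumptions.

```python
import numpy as np

def channel_gradients(channel: np.ndarray):
    """Horizontal and vertical gradients of one channel using the [-1, 0, 1]
    operator (formulas 1 and 2); image borders are left at zero."""
    h = channel.astype(np.float32)
    gx = np.zeros_like(h)
    gy = np.zeros_like(h)
    gx[:, 1:-1] = h[:, 2:] - h[:, :-2]   # H(x+1, y) - H(x-1, y)
    gy[1:-1, :] = h[2:, :] - h[:-2, :]   # H(x, y+1) - H(x, y-1)
    return gx, gy

def pixel_gradients(rgb_image: np.ndarray):
    """Per-pixel gradient magnitude and direction (formulas 3 and 4),
    keeping, for each pixel, the channel with the largest magnitude."""
    magnitudes, directions = [], []
    for c in range(3):
        gx, gy = channel_gradients(rgb_image[:, :, c])
        magnitudes.append(np.hypot(gx, gy))                       # formula 3
        directions.append(np.degrees(np.arctan2(gy, gx)) % 360)   # formula 4, in [0, 360)
    magnitudes = np.stack(magnitudes, axis=-1)
    directions = np.stack(directions, axis=-1)
    best = np.argmax(magnitudes, axis=-1)                         # strongest channel per pixel
    idx = np.indices(best.shape)
    return magnitudes[idx[0], idx[1], best], directions[idx[0], idx[1], best]
```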
(3) A gradient histogram is constructed for each unit region.
The preprocessed image is divided into a plurality of unit areas, each unit area being constituted by a plurality of pixels, for example, 6*6 pixels. And counting the gradient amplitude and the gradient direction of the pixel points contained in each unit area by adopting a direction block mode, so as to construct a gradient histogram for each unit area.
Optionally, the process of counting the gradient magnitude and gradient direction of the pixel point included in each unit area by adopting a direction block mode, so as to construct a gradient histogram for each unit area may be: the 360-degree gradient direction is cut into 9 direction blocks, and the gradient amplitude and the gradient direction of each pixel point in the unit area are subjected to weighted projection (mapped to a fixed angle range) in the histogram, so that a gradient histogram (direction gradient histogram) of the unit area is obtained. In the process of weighting projection of each pixel point in the unit area in the histogram by using the gradient amplitude and the gradient direction, the gradient direction is used for positioning the direction block, and the gradient amplitude is used as the weight of projection.
For example, the result of cutting a 360 degree gradient direction into 9 direction blocks may be: 0 to 20 degrees and 180 to 200 degrees represent the direction block 1, 20 to 40 degrees and 200 to 220 degrees represent the direction block 2, 160 to 180 degrees and 340 to 360 degrees represent the direction block 9. In the process of performing weighted projection on each pixel point in the unit area in the histogram by using the gradient amplitude and the gradient direction, if the gradient direction of a certain pixel point in the unit area is 30 degrees, the gradient amplitude of the pixel point is increased for the count of the direction block 2.
After the numerical value representation result of the gradient histogram of each unit area is obtained, the numerical value representation result can be visualized, and the visualized result is used as the gradient histogram corresponding to each unit area.
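A sketch of constructing the 9-bin gradient histogram for each unit area by weighted projection is given below; the 6×6 cell size follows the example above, and mapping each direction and its opposite direction into the same block follows the pairing described in the example.

```python
import numpy as np

def cell_histogram(magnitude: np.ndarray, direction: np.ndarray, bins: int = 9) -> np.ndarray:
    """Histogram of one unit area: the gradient direction selects the
    direction block, the gradient magnitude is used as the projection weight."""
    # 0-20 and 180-200 degrees fall into block 1, 20-40 and 200-220 into block 2, ...
    block_index = (direction % 180.0) // (180.0 / bins)
    hist = np.zeros(bins, dtype=np.float32)
    for b in range(bins):
        hist[b] = magnitude[block_index == b].sum()
    return hist

def unit_area_histograms(magnitude: np.ndarray, direction: np.ndarray, cell: int = 6) -> np.ndarray:
    """Split the image into cell x cell unit areas and build one 9-bin
    gradient histogram per unit area."""
    rows, cols = magnitude.shape[0] // cell, magnitude.shape[1] // cell
    hists = np.zeros((rows, cols, 9), dtype=np.float32)
    for r in range(rows):
        for c in range(cols):
            m = magnitude[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell]
            d = direction[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell]
            hists[r, c] = cell_histogram(m, d)
    return hists
```

For instance, a pixel with a gradient direction of 30 degrees falls into direction block 2, and its gradient magnitude is added to that block's count, as in the example above.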
After the gradient histogram corresponding to each unit area is obtained, the server can obtain the gradient histogram information corresponding to the negative sample no matter the gradient histogram information refers to the whole gradient histogram corresponding to the negative sample or the gradient histogram corresponding to each unit area in the negative sample.
Alternatively, the process of acquiring the gradient histogram information corresponding to the negative sample by the terminal may: and the terminal calls a page detection system to pull gradient histogram information corresponding to the negative sample from the server.
It should be noted that, in the embodiment of the present application, the order of acquiring the to-be-detected image corresponding to the to-be-detected page and the gradient histogram information corresponding to the negative sample is not limited. For example, gradient histogram information corresponding to the negative sample may be acquired first, and then an image to be detected corresponding to the page to be detected may be acquired, so as to ensure that the image to be detected corresponding to the page to be detected is detected immediately after the image to be detected corresponding to the page to be detected is acquired; the method can also acquire the image to be detected corresponding to the page to be detected, and acquire the gradient histogram information corresponding to the negative sample so as to reduce the occupied memory of the terminal.
In step 202, the image to be detected is divided into regions to obtain a region to be detected, and a gradient histogram corresponding to the region to be detected is determined.
After the image to be detected is obtained, the image to be detected is divided into regions to obtain regions to be detected, so that whether the page to be detected has failed to load can be judged through detection of the regions to be detected. Compared with directly detecting the whole image to be detected, this approach refines the detection granularity and improves the accuracy of judging whether the page to be detected has failed to load.
Optionally, the image to be detected may be divided into regions once or multiple times, and each division is performed on the whole image to be detected. It should be noted that a division may be an equal division or an unequal division, which is not limited in the embodiment of the present application. When a division is an equal division, regions to be detected of the same size are obtained after that division; when a division is an unequal division, regions to be detected of different sizes are obtained after that division. The embodiment of the application is described by taking the case that each division is an equal division as an example.
Alternatively, the process of equally dividing the region to be detected may be: and carrying out region division on the image to be detected according to the reference granularity. The reference granularity is used to indicate how to divide the image to be detected into equal parts, alternatively, the reference granularity may indicate that the length and the width of the image to be detected are divided into equal parts respectively, for example, the reference granularity may be 2×2, and at this reference granularity, the length and the width of the image to be detected are divided into 2 equal parts respectively.
The number and size of the areas to be detected are determined by the reference granularity. It should be noted that the reference granularity may be one granularity or may be multiple granularities. When the reference granularity is one granularity, carrying out primary region division on the image to be detected to obtain a size of region to be detected, wherein the number of the size of region to be detected is determined by the reference granularity. For example, assuming that the reference granularity is 2×2, the length and the width of the image to be detected are equally divided into 2 parts according to the granularity, respectively, to obtain 4 areas to be detected with a size of 1/4 of the size of the image to be detected.
When the reference granularity is multiple granularities, respectively carrying out primary region division on the image to be detected according to each granularity, and obtaining a region to be detected with one size after each region division. Taking two granularity as an example, the reference granularity is divided into a first granularity and a second granularity, in this case, the process of dividing the area of the image to be detected into the area to be detected may be: and dividing the region of the image to be detected according to the first granularity and the second granularity respectively to obtain a region to be detected with the first size and a region to be detected with the second size. Alternatively, the first particle size may be a coarser particle size and the second particle size may be a finer particle size. For example, the first particle size is 2 x 2 and the second particle size is 4*4. In this case, the first dimension is greater than the second dimension. Alternatively, any first size of the area to be detected may be constituted by at least two second sizes of the area to be detected. For example, when the first granularity is 2×2 and the second granularity is 4*4, the to-be-detected area of any one first size is composed of 4 to-be-detected areas of the second size.
When the reference granularity is multiple granularities, whether the page to be detected fails to be loaded or not can be judged through multi-granularity detection. Alternatively, the multiple granularities may be sequentially arranged from coarse granularity to fine granularity, where the coarse granularity corresponds to a large-sized region to be detected, and the fine granularity corresponds to a small-sized region to be detected. After the region division is carried out according to a plurality of granularities which are sequentially arranged from coarse to fine, the region to be detected with the size from large to small can be obtained. It should be noted that, since the entire image to be detected is divided into regions according to each granularity, the region to be detected of each size constitutes the entire image to be detected.
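As an illustration of the region division, the following sketch equally divides the image to be detected at a given granularity and records each region's position so that the candidate region at the corresponding position in a negative sample can later be located; the granularities 2×2 and 4×4 follow the examples above, and the function name is an assumption.

```python
import numpy as np

def divide_regions(image: np.ndarray, granularity: int):
    """Equally divide the image into granularity x granularity regions,
    returning (row_index, col_index, region) tuples."""
    h, w = image.shape[:2]
    step_h, step_w = h // granularity, w // granularity
    regions = []
    for r in range(granularity):
        for c in range(granularity):
            region = image[r * step_h:(r + 1) * step_h, c * step_w:(c + 1) * step_w]
            regions.append((r, c, region))
    return regions

# First-size regions (first granularity 2x2) and second-size regions (second granularity 4x4):
# first_size_regions = divide_regions(image_to_detect, 2)
# second_size_regions = divide_regions(image_to_detect, 4)
```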
After the areas to be detected are obtained, determining gradient histograms corresponding to the areas to be detected, wherein each area to be detected corresponds to the gradient histogram. Optionally, the ways of determining the gradient histogram corresponding to the region to be detected include, but are not limited to, the following two ways:
mode one: extracting a gradient histogram corresponding to the image to be detected, and taking the gradient histogram at the same position as the region to be detected in the gradient histogram corresponding to the image to be detected as the gradient histogram corresponding to the region to be detected.
Mode two: extracting a gradient histogram corresponding to a unit area in the image to be detected, and connecting the gradient histograms of the unit areas forming the area to be detected to obtain the gradient histogram corresponding to the area to be detected.
The gradient histogram corresponding to the image to be detected in the first mode may be obtained based on the gradient histogram corresponding to the unit area in the image to be detected in the second mode. The process of extracting the gradient histogram corresponding to the unit area in the image to be detected and the implementation process of obtaining the gradient histogram corresponding to the image to be detected based on the gradient histogram corresponding to the unit area in the image to be detected can refer to the process of obtaining the gradient histogram information corresponding to the negative sample by the server in step 201, which is not described herein.
For example, the image to be detected and the gradient histogram corresponding to the image to be detected may be as shown in fig. 3. Fig. 3 (1) is an image to be detected, and fig. 3 (2) is a gradient histogram corresponding to the image to be detected shown in fig. 3 (1). As can be seen from (2) in fig. 3, gradients in the rectangular frame 301 and the rectangular frame 302 are small, and in the process of detecting the region to be detected based on the gradient histogram, it can be determined that loading of the regions to be detected corresponding to the rectangular frame 301 and the rectangular frame 302 fails.
Optionally, for the case that the image to be detected is divided into the region according to the first granularity and the second granularity to obtain the region to be detected with the first size and the region to be detected with the second size, the region to be detected includes the region to be detected with the first size and the region to be detected with the second size. At this time, the process of determining the gradient histogram corresponding to the region to be detected is: and determining a gradient histogram corresponding to the region to be detected of the first size and a gradient histogram corresponding to the region to be detected of the second size.
In step 203, the gradient histogram corresponding to the region to be detected is matched with the candidate gradient histogram, where the candidate gradient histogram is the gradient histogram corresponding to the candidate region, and the candidate region is the region in the negative sample that meets the matching condition and is in the corresponding position with the region to be detected.
After the gradient histogram corresponding to the region to be detected is obtained, the region to be detected is detected based on the gradient histogram corresponding to the region to be detected, and whether the page to be detected fails to be loaded is further judged according to the detection result of the region to be detected.
And matching the gradient histogram corresponding to the region to be detected with the candidate gradient histogram in the process of detecting the region to be detected based on the gradient histogram corresponding to the region to be detected. The candidate gradient histogram is a gradient histogram corresponding to a candidate region in a corresponding position of the region to be detected in the negative sample meeting the matching condition. It should be noted that, in the implementation process of step 203, the number of the to-be-detected areas is plural, and each to-be-detected area is detected based on the gradient histogram corresponding to each to-be-detected area.
When there is only one size of the region to be detected, each region to be detected is sequentially detected until the gradient histogram corresponding to any one region to be detected is successfully matched, and step 204 is executed. When there are multiple to-be-detected areas with different sizes, the to-be-detected areas with large sizes can be detected sequentially, and if the gradient histogram matching corresponding to any one of the to-be-detected areas with large sizes is successful, step 204 is executed; if the gradient histograms corresponding to the large-size to-be-detected areas are failed to match, detecting each small-size to-be-detected area, and repeating the above process until the gradient histograms corresponding to any to-be-detected area are successfully matched, and executing step 204. It should be noted that, when the gradient histogram corresponding to any one of the to-be-detected areas is successfully matched, the detection process of other to-be-detected areas may be stopped, or the detection process of other to-be-detected areas may be continuously completed.
The detection process of any region to be detected is as follows no matter what size of region to be detected: and matching the gradient histogram corresponding to the region to be detected with the candidate gradient histogram. The candidate gradient histogram is a gradient histogram corresponding to a region in a corresponding position of the region to be detected in the negative sample meeting the matching condition. Next, a detection process of any one of the regions to be detected will be described as an example.
Each negative sample has a region in a position corresponding to the region to be detected. When the size of the negative sample is the same as that of the image to be detected, the region at the corresponding position to the region to be detected may refer to the region at the same position as the region to be detected; when the negative sample is different from the size of the image to be detected, the region at the corresponding position to the region to be detected may refer to a region where the relative position in the negative sample coincides with the relative position of the region to be detected in the image to be detected. And taking the region which is positioned at the corresponding position with the region to be detected as a candidate region.
Before the detection of the region to be detected, the negative samples meeting the matching condition need to be determined, so that the gradient histogram corresponding to the candidate region for matching is further determined according to the negative samples meeting the matching condition.
The number of negative samples may be plural, and there may be negative samples among the plural negative samples that are unsuitable for detecting the region to be detected, so that it is necessary to determine the negative samples suitable for detecting the region to be detected first. Optionally, negative examples suitable for detecting the region to be detected include, but are not limited to, the following features: failure in loading the region which is positioned at the corresponding position with the region to be detected; the zoom level is the same as the page to be detected.
After screening out the negative samples suitable for detecting the region to be detected, the negative samples meeting the matching condition can be determined according to the negative samples suitable for detecting the region to be detected. Alternatively, according to a negative sample suitable for detecting the region to be detected, the manner of determining the negative sample satisfying the matching condition may be: all negative samples suitable for detecting the region to be detected are taken as negative samples meeting the matching condition. Optionally, each negative sample may be preset with a priority, and according to a negative sample suitable for detecting a region to be detected, a manner of determining a negative sample that meets a matching condition may be: one or more negative samples with the front priority suitable for detecting the region to be detected are taken as negative samples meeting the matching condition.
After the negative sample meeting the matching condition is determined, the gradient histogram corresponding to the candidate region in the negative sample meeting the matching condition can be used as a candidate gradient histogram, and then the gradient histogram corresponding to the region to be detected is matched with the candidate gradient histogram. It should be noted that, for any region to be detected, there may be one or more negative samples satisfying the matching condition. When a plurality of negative samples meeting the matching condition are provided, each candidate region in the negative samples meeting the matching condition corresponds to one candidate gradient histogram, and the gradient histogram corresponding to the region to be detected is matched with each candidate gradient histogram respectively.
The above process describes a process of determining the negative samples satisfying the matching condition by taking one to-be-detected area as an example, and it should be noted that the negative samples satisfying the matching condition corresponding to different to-be-detected areas may be the same or different, which is not limited in this embodiment of the present application.
For any region to be detected, in the process of matching with the candidate gradient histograms, the similarity between the gradient histograms corresponding to the region to be detected and the candidate gradient histograms can be calculated, and whether the gradient histograms corresponding to the region to be detected are successfully matched or not is judged according to the similarity. Alternatively, the manner of calculating the similarity between the two gradient histograms may be: the similarity between the two gradient histograms is calculated based on the euclidean distance.
In the process of matching the gradient histograms corresponding to the areas to be detected with each candidate gradient histogram respectively for a plurality of negative samples meeting the matching condition, a plurality of similarities can be obtained, then the average similarity of the plurality of similarities can be calculated, and whether the gradient histograms corresponding to the areas to be detected are successfully matched is judged according to the average similarity.
Optionally, the process of judging whether the gradient histogram corresponding to a region to be detected is successfully matched according to the similarity comprises: determining that the gradient histogram corresponding to the region to be detected is successfully matched in response to the similarity exceeding a similarity threshold; and determining that the matching of the gradient histogram corresponding to the region to be detected fails in response to the similarity not exceeding the similarity threshold. The similarity threshold may be a preset fixed threshold, in which case the matching results of all regions to be detected are compared with this fixed threshold. The similarity threshold may also be set per negative sample, with different negative samples corresponding to different similarity thresholds. The manner of setting the similarity threshold according to a negative sample may be: determining the similarity threshold corresponding to the negative sample according to the noise condition of the negative sample.
For the situation that the different negative samples respectively correspond to the similarity threshold values, when the number of the negative samples meeting the matching condition is one, comparing the calculated similarity with the similarity threshold value corresponding to the negative sample meeting the matching condition; when the number of the negative samples meeting the matching condition is multiple, an average similarity threshold value of similarity thresholds corresponding to the negative samples meeting the matching condition can be calculated, and then the calculated average similarity is compared with the average similarity threshold value.
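The matching step could be sketched as follows. The application states that the similarity is calculated based on the Euclidean distance and, when several negative samples satisfy the matching condition, averaged before being compared with the threshold; the exact distance-to-similarity mapping used here (1 / (1 + d)) and the helper names are assumptions.

```python
import numpy as np

def histogram_similarity(hist_a: np.ndarray, hist_b: np.ndarray) -> float:
    """Similarity based on the Euclidean distance (assumed mapping: 1 / (1 + d))."""
    distance = float(np.linalg.norm(hist_a - hist_b))
    return 1.0 / (1.0 + distance)

def region_matches(region_hist: np.ndarray,
                   candidate_hists: list,
                   similarity_threshold: float) -> bool:
    """Match one region to be detected against the candidate gradient histograms
    of all negative samples that satisfy the matching condition."""
    if not candidate_hists:
        return False
    similarities = [histogram_similarity(region_hist, cand) for cand in candidate_hists]
    average_similarity = sum(similarities) / len(similarities)
    # A successful match means the region resembles a loading-failure region.
    return average_similarity > similarity_threshold
```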
The above procedure describes a procedure for detecting an area to be detected. It should be noted that the process of detecting all the areas to be detected may be implemented according to the above process, which is not described herein.
Optionally, when the regions to be detected include regions of a first size and regions of a second size, with the first size larger than the second size, the candidate gradient histograms include first candidate gradient histograms and second candidate gradient histograms. Matching the gradient histogram corresponding to a region to be detected with the candidate gradient histogram may include: matching the gradient histogram corresponding to each region to be detected of the first size with the corresponding first candidate gradient histogram; and, in response to the failure of matching of the gradient histograms corresponding to all regions to be detected of the first size, matching the gradient histogram corresponding to each region to be detected of the second size with the corresponding second candidate gradient histogram.
A first candidate gradient histogram is a candidate gradient histogram used for matching against the gradient histogram of a region to be detected of the first size, and a second candidate gradient histogram is used for matching against the gradient histogram of a region to be detected of the second size. It should be noted that each region to be detected of the first size corresponds to one first candidate gradient histogram, and each region to be detected of the second size corresponds to one second candidate gradient histogram. In other words, the large-size regions are detected first, and the small-size regions are detected only when the gradient histograms of all large-size regions fail to match, as sketched below.
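A sketch of this coarse-to-fine order; it assumes each entry pairs a region's gradient histogram with the candidate gradient histogram at the corresponding position, and is_match is the threshold comparison sketched earlier. The argument names are illustrative.

    def detect_by_size(first_size_pairs, second_size_pairs):
        # first_size_pairs / second_size_pairs: (region_histogram, candidate_histogram)
        # pairs for the large-size and small-size regions respectively.
        for region_hist, candidate_hist in first_size_pairs:
            if is_match(region_hist, candidate_hist):
                return True   # a large-size region matched: loading failure detected
        # Small-size regions are checked only when every large-size region failed to match.
        for region_hist, candidate_hist in second_size_pairs:
            if is_match(region_hist, candidate_hist):
                return True
        return False          # no region matched at either size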
The result of detecting the region to be detected includes the following two types:
1. the gradient histogram corresponding to any region to be detected is successfully matched.
It should be noted that any region to be detected here may be a region to be detected of any size. Because the negative sample satisfying the matching condition is an image of a page that failed to load, its candidate region is a region that failed to load; since the gradient histogram of the region to be detected is matched against the gradient histogram of that candidate region, a successful match indicates that the corresponding region of the image to be detected also failed to load, and step 204 is executed.
2. The gradient histograms corresponding to all regions to be detected fail to match.
When the gradient histograms corresponding to all regions to be detected fail to match, no region of the image to be detected is found to have failed to load at the gradient-histogram-based region detection level, so it cannot yet be determined at this level that the page to be detected has failed to load. It should be noted that, when the regions to be detected include regions of multiple sizes, the failure of matching of the gradient histograms corresponding to the regions to be detected means that the gradient histograms corresponding to all regions of all sizes fail to match.
Optionally, when the gradient histograms corresponding to all regions to be detected fail to match, the following three processing options are included, but the processing is not limited to them:
First processing option: in response to the failure of matching of the gradient histograms corresponding to all regions to be detected, determining that the page to be detected is successfully loaded.
In this option, once no loading failure is found at the gradient-histogram-based region detection level, the page to be detected is considered to have loaded successfully. Here, the entire page detection process of judging whether the page to be detected fails to load is completed through region-by-region detection; compared with directly detecting the whole image to be detected as in the related art, the detection granularity is finer.
Second processing option: in response to the failure of matching of the gradient histograms corresponding to all regions to be detected, performing edge detection on the image to be detected; and, in response to the edge detection result indicating that the target shape does not exist in the image to be detected, determining that the page to be detected fails to load.
When no loading failure is found at the gradient-histogram-based region detection level, the image to be detected can be further examined at the shape detection level, so as to improve the accuracy of judging whether the page to be detected fails to load. Detecting the image to be detected at the shape detection level includes: performing edge detection on the image to be detected, and judging whether the target shape exists in the image to be detected according to the edge detection result.
Edge detection is a method for analyzing images in image processing and computer vision; its purpose is to identify points in the image where the brightness changes sharply, and the detected points often form contours. The edge detection applied to the image to be detected may be Sobel edge detection, Laplacian edge detection or Canny edge detection; different edge detectors may produce slightly different results, but all serve the purpose of edge detection.
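A minimal OpenCV sketch of this edge detection step, using the Canny detector named above; the two threshold values are illustrative assumptions.

    import cv2

    def detect_edges(image_bgr):
        # Convert to grayscale and run the Canny edge detector; Sobel or Laplacian
        # could be substituted here with comparable effect.
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 50, 150)   # low/high thresholds are illustrative
        return edges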
After edge detection is performed on the image to be detected, an edge detection result is obtained; the terminal can then analyze this result to judge whether the target shape exists in the image to be detected. The target shape is a basic shape that should be present in the page to be detected. It may be set according to the application program or web page corresponding to the page to be detected, which is not limited in the embodiments of the present application. For example, when the application corresponding to the page to be detected is a map-type application, the target shape may be set to a straight line. The number of target shapes may be one or more, and a single target shape may exist at different levels; for example, for a page to be detected in a map-type application whose target shape is a straight line, the straight lines may belong to an ordinary road level as well as a navigation route level.
Optionally, the edge detection result may be analyzed by applying a Hough transform to it. The Hough transform is one of the basic methods for identifying geometric shapes in an image, and is mainly used to separate geometric shapes sharing certain features (e.g., straight lines, circles) from the image. Its flow is roughly as follows: given the edge detection result and the target shape to be identified, the Hough transform maps the edge feature points from image space into a parameter space for voting, and the local extreme points of the accumulated votes yield the set of points conforming to the specified shape. If the target shape can be obtained by this processing, the edge detection result indicates that the target shape exists in the image to be detected; otherwise, the edge detection result indicates that the target shape does not exist in the image to be detected. A sketch of this step follows.
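The sketch below judges whether straight lines (the target shape in the map-page example) can be obtained from the edge detection result via the Hough transform; all parameter values and the function name are illustrative assumptions.

    import numpy as np
    import cv2

    def has_target_lines(edges):
        # Probabilistic Hough transform: map edge points to the parameter space and
        # keep line segments supported by enough votes.
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                                minLineLength=80, maxLineGap=10)
        return lines is not None and len(lines) > 0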
Optionally, when the target shape includes straight lines of two levels, the Hough transform may be applied as follows: preprocess the edge detection result to mask the straight lines of one level, and then apply the Hough transform to judge whether straight lines of the other level exist. Optionally, lines of different levels may correspond to different colors, and a level of lines may be masked by masking its color in the edge detection result.
For example, when the page to be detected is a map page in a map-type application, the target shape corresponding to the page may include straight lines at the ordinary road level and straight lines at the navigation route level. In the shape detection of the image to be detected corresponding to the map page, an image as shown in fig. 4 can be obtained: (1) in fig. 4 shows the image obtained by masking the ordinary-road-level straight lines in the edge detection result, and applying the Hough transform to it yields the image shown in (2) in fig. 4, from which the straight lines at the navigation route level in the image to be detected can be identified. A sketch of the masking step follows the example.
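A sketch of masking one level of lines by its color before applying the Hough transform, as in the fig. 4 example; the color values, tolerance and function name are illustrative assumptions.

    import numpy as np
    import cv2

    def mask_level_by_color(image_bgr, level_color_bgr, tolerance=10):
        # Blank out pixels whose color is close to the given level's color (e.g. the
        # ordinary-road level), so the subsequent Hough transform only sees the lines
        # of the other level (e.g. the navigation route).
        color = np.array(level_color_bgr, dtype=np.int32)
        lower = np.clip(color - tolerance, 0, 255).astype(np.uint8)
        upper = np.clip(color + tolerance, 0, 255).astype(np.uint8)
        mask = cv2.inRange(image_bgr, lower, upper)
        masked = image_bgr.copy()
        masked[mask > 0] = 0               # paint the masked level black
        return masked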
After the edge detection result is analyzed, two results can be obtained:
result one: the result of the edge detection indicates that no target shape is present in the image to be detected.
This indicates that a target shape that should exist is missing from the image to be detected, and it can therefore be determined that the page to be detected fails to load. That is, in response to the edge detection result indicating that the target shape does not exist in the image to be detected, it is determined that the page to be detected fails to load.
And a second result: the result of the edge detection indicates the presence of a target shape in the image to be detected.
Under the second result, the target shape that should exist is not missing from the image to be detected, so the page to be detected can be considered not to have failed to load at either the gradient-histogram-based region detection level or the shape detection level. At this point, the page to be detected can be regarded directly as successfully loaded, or detection at further levels can be performed so that whether the page fails to load is judged more finely according to those detection results. The detection process of the other levels is not limited in the embodiments of the present application; for example, it may be layer-level detection.
Optionally, the layer-level detection process may be: in response to the edge detection result indicating that the target shape exists in the image to be detected, performing a binarization operation on the image to be detected according to the color value of at least one standard layer corresponding to the page to be detected; and, in response to the image obtained by binarizing the image to be detected according to the color value of any standard layer not meeting the color condition, determining that the page to be detected fails to load.
Before the layer-level detection is performed, the color value of at least one standard layer corresponding to the page to be detected needs to be acquired. The page to be detected is composed of at least one standard layer, and different standard layers display different overlays. For example, when the page to be detected is a map page, the standard layers corresponding to the map page may include one or more of a Marker (logo) layer, a Polyline (multi-segment line) layer, a hand-drawn layer and a Text layer.
Each standard layer corresponds to a color value. It should be noted that, the number of standard layers corresponding to the page to be detected and the color value of each standard layer may be determined according to the application program or the web page corresponding to the page to be detected, and the page to be detected from different application programs or web pages may correspond to different standard layers.
Optionally, the color value of at least one standard layer corresponding to the page to be detected may be already stored locally in the terminal when the application program or the web page corresponding to the page to be detected is installed, and at this time, the terminal may directly extract the color value of at least one standard layer corresponding to the page to be detected locally. Optionally, the color value of at least one standard layer corresponding to the page to be detected may also be stored in the server, and the terminal pulls the color value of at least one standard layer corresponding to the page to be detected from the server.
After the color value of at least one standard image layer corresponding to the page to be detected is obtained, binarizing the image to be detected according to the color value of at least one standard image layer corresponding to the page to be detected. It should be noted that, because the color values of different standard image layers are different, in the process of performing binarization operation on the image to be detected according to the color value of at least one standard image layer corresponding to the page to be detected, the binarization operation is performed on the image to be detected once according to the color value of each standard image layer.
Each binarization operation performed on the image to be detected according to the color value of one standard layer yields an image containing at least one of two colors: white and the color corresponding to that standard layer's color value. After each such binarization operation, it is checked whether the resulting image meets the color condition. Optionally, meeting the color condition may mean that the image contains the color corresponding to the standard layer's color value. Since different standard layers have different color values, the color condition that the binarized image needs to meet differs depending on which standard layer's color value was used, as sketched below.
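A sketch of one binarization pass and its color-condition check for a single standard layer; the color tolerance, the choice of a white background and the function name are illustrative assumptions.

    import numpy as np
    import cv2

    def binarize_by_layer_color(image_bgr, layer_color_bgr, tolerance=10):
        # Pixels close to the standard layer's color keep that color; all other
        # pixels become white.
        color = np.array(layer_color_bgr, dtype=np.int32)
        lower = np.clip(color - tolerance, 0, 255).astype(np.uint8)
        upper = np.clip(color + tolerance, 0, 255).astype(np.uint8)
        mask = cv2.inRange(image_bgr, lower, upper)
        result = np.full_like(image_bgr, 255)          # white background
        result[mask > 0] = layer_color_bgr             # layer-colored foreground
        # The color condition is met when the layer's color actually appears.
        return result, bool(np.any(mask))

One such pass is run per standard layer; the second return value tells whether that layer's overlay appears at all in the image to be detected.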
After binarization operation is performed on the image to be detected according to the color value of the standard image layer, two results can be obtained:
1. The image obtained by binarizing the image to be detected according to the color value of any standard layer does not meet the color condition.
Under this result, the overlay corresponding to that standard layer can be considered missing from the image to be detected. For example, when the binarized image contains only white, the text label overlay corresponding to the text label layer is missing from the image to be detected. As shown in fig. 5, (1) in fig. 5 represents the image to be detected; binarizing it according to the color value of the text label layer yields the image shown in (2) in fig. 5. Since the image in (2) in fig. 5 contains only white, it can be determined from it that the text label overlay corresponding to the text label layer is missing from the image to be detected.
In response to the image obtained by binarizing the image to be detected according to the color value of any standard layer not meeting the color condition, it is determined that the page to be detected fails to load.
2. The images obtained by binarizing the image to be detected according to the color values of the standard layers all meet the color condition.
Under this result, the image to be detected can be considered to contain the overlay corresponding to every standard layer, and the page to be detected can be considered not to have failed to load at any of the gradient-histogram-based region detection level, the shape detection level and the layer detection level. At this point, it can be determined directly that the page to be detected is successfully loaded; that is, in response to all images obtained by binarizing the image to be detected according to the color values of the standard layers corresponding to the page to be detected meeting the color condition, it is determined that the page to be detected is successfully loaded. Of course, detection at other levels may be performed further, which is not limited in the embodiments of the present application.
In this option, when no loading failure is found at the gradient-histogram-based region detection level, shape-level detection is further performed; when the shape detection level still finds no loading failure, layer-level detection is further performed. This three-level detection process can effectively improve the accuracy of judging whether the page to be detected fails to load.
Third processing option: in response to the failure of matching of the gradient histograms corresponding to all regions to be detected, performing a binarization operation on the image to be detected according to the color value of at least one standard layer corresponding to the page to be detected; and, in response to the image obtained by binarizing the image to be detected according to the color value of any standard layer not meeting the color condition, determining that the page to be detected fails to load.
When no loading failure is found at the gradient-histogram-based region detection level, the image to be detected can be further examined at the layer detection level, so as to improve the accuracy of judging whether the page to be detected fails to load. Detecting the image to be detected at the layer detection level includes: performing a binarization operation on the image to be detected according to the color value of at least one standard layer corresponding to the page to be detected, and judging whether the page to be detected fails to load according to whether the resulting images meet the color condition.
The result of detecting the image to be detected at the layer detection level includes two types:
A. The image obtained by binarizing the image to be detected according to the color value of any standard layer does not meet the color condition.
Under this result, the overlay corresponding to that standard layer can be considered missing from the image to be detected. In response to the image obtained by binarizing the image to be detected according to the color value of any standard layer not meeting the color condition, it is determined that the page to be detected fails to load.
B. The images obtained by binarizing the image to be detected according to the color values of the standard layers all meet the color condition.
Under this result, the image to be detected can be considered to contain the overlay corresponding to every standard layer, and the page to be detected can be considered not to have failed to load at either the gradient-histogram-based region detection level or the layer detection level. At this point, it can be determined directly that the page to be detected is successfully loaded, or detection at further levels can be performed so that whether the page fails to load is judged more finely according to those detection results. The detection process of the other levels is not limited in the embodiments of the present application; for example, it may be shape-level detection.
Optionally, the shape-level detection process may be: in response to all images obtained by binarizing the image to be detected according to the color values of the standard layers meeting the color condition, performing edge detection on the image to be detected; and, in response to the edge detection result indicating that the target shape does not exist in the image to be detected, determining that the page to be detected fails to load. The implementation of this process is the same as in the second processing option and is not repeated here.
In this option, when no loading failure is found at the gradient-histogram-based region detection level, layer-level detection is further performed; when the layer detection level still finds no loading failure, shape-level detection is further performed. This three-level detection process can likewise effectively improve the accuracy of judging whether the page to be detected fails to load.
In step 204, in response to successful matching of the gradient histograms corresponding to any of the regions to be detected, it is determined that the page to be detected fails to be loaded.
When the gradient histogram corresponding to any region to be detected is successfully matched, the similarity between that gradient histogram and the candidate gradient histogram of the corresponding candidate region in the negative sample satisfying the matching condition is high. Because that candidate region is a region that failed to load, the successful match indicates that the corresponding region of the image to be detected also failed to load, and the loading failure of that region means that the page to be detected fails to load.
Optionally, for the case that the to-be-detected area includes a to-be-detected area of a first size and a to-be-detected area of a second size, determining that the to-be-detected page fails to be loaded in response to successful matching of the gradient histogram corresponding to any one of the to-be-detected areas, including the following two cases:
Case 1: and determining that the page to be detected fails to be loaded in response to successful matching of the gradient histogram corresponding to the region to be detected of any first size.
In case 1, the successful matching of the gradient histogram corresponding to a region to be detected of the first size indicates that this region failed to load, and it can be determined that the page to be detected fails to load without detecting the regions to be detected of the second size.
Case 2: in response to the failure of matching of the gradient histograms corresponding to all regions to be detected of the first size and the successful matching of the gradient histogram corresponding to any region to be detected of the second size, determining that the page to be detected fails to load.
That is, when the gradient histograms corresponding to all regions to be detected of the first size fail to match, the regions to be detected of the second size are detected; if the gradient histogram corresponding to any region of the second size is successfully matched, that region failed to load, and it is determined that the page to be detected fails to load.
When the regions to be detected include regions of multiple sizes, the detection process for the regions of the largest size is executed first; when the gradient histograms corresponding to all regions of the largest size fail to match, the detection process for the regions of the next-largest size is executed; and so on, until, when the gradient histograms corresponding to all regions of the second-smallest size fail to match, the detection process for the regions of the smallest size is executed. When the regions to be detected include regions of N sizes (N being an integer not less than 1), the successful matching of the gradient histogram corresponding to any region to be detected covers N cases:
Case 1: the gradient histogram corresponding to any region to be detected of the largest size is successfully matched.
Case 2: the gradient histograms corresponding to all regions to be detected of the largest size fail to match, and the gradient histogram corresponding to any region to be detected of the second-largest size is successfully matched.
Case N: the gradient histograms corresponding to all regions to be detected of every size larger than the smallest size fail to match, and the gradient histogram corresponding to any region to be detected of the smallest size is successfully matched.
When any one of the N conditions exists, determining that the page to be detected fails to be loaded.
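A sketch of this N-size cascade, generalizing the two-size sketch given earlier; regions are supplied grouped by size from largest to smallest, is_match is the threshold comparison sketched earlier, and the names are illustrative.

    def detect_across_sizes(pairs_grouped_by_size):
        # pairs_grouped_by_size: a list ordered from the largest size to the smallest,
        # each item holding the (region_histogram, candidate_histogram) pairs of one size.
        for pairs_of_one_size in pairs_grouped_by_size:
            for region_hist, candidate_hist in pairs_of_one_size:
                if is_match(region_hist, candidate_hist):
                    return True    # any match at this size: the page failed to load
            # every histogram of this size failed to match; fall through to the next size
        return False               # no match at any size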
Optionally, after determining that the page to be detected fails to load, the terminal may collect device information and then upload the device information together with the image to be detected to the server. The device information indicates the current state of the terminal and includes, but is not limited to, current stack information, current memory information, information on external devices connected to the terminal, and current CPU (Central Processing Unit) information. It should be noted that, besides the case in step 204, the cases in which the page to be detected is determined to have failed to load also include the cases covered by the second and third processing options in step 203.
After the device information and the image to be detected are uploaded, the server may analyze the cause of the loading failure based on them. After analyzing the cause, the server can classify and count the uploaded information, and can also send early-warning information to the terminal, the early-warning information being used to notify the terminal of the cause of the loading failure. Optionally, if the detection of the remaining regions to be detected is continued even after the gradient histogram of one region has been successfully matched, then, upon determining that the page to be detected fails to load, the terminal can identify all regions that failed to load, mark them in the image to be detected, and upload the marked image together with the device information to the server, which facilitates rapid classification and statistics by the server.
Optionally, after classifying and counting the uploaded information, the server may update the negative sample based on the to-be-detected image corresponding to the uploaded loading failure page, so as to facilitate the subsequent more accurate detection by using the new negative sample.
Optionally, after the server sends the early warning information to the terminal, the terminal may receive the early warning information sent by the server, and display the early warning information on the display page. The display page may refer to a page to be detected, or may refer to another page to which the page to be detected jumps, which is not limited in the embodiment of the present application. After the early warning information is displayed on the display page, a user of the terminal can timely acquire the reason of page loading failure, and then take solving measures to eliminate bad factors of page loading failure.
For example, when the page to be detected is a map page in a map-type application, the cause of the loading failure may be: insufficient memory on the terminal, an incorrect initialization of the map-type application, the terminal not being connected to the network, or the map having been moved to an unsupported area (e.g., Africa). Different causes may correspond to different solutions. For example, when the early-warning information indicates that the cause is insufficient memory, the user of the terminal can remove the adverse factor by freeing occupied memory; when the cause is an incorrect initialization of the map-type application, the user can remove it by restarting the application; when the cause is that the terminal is not connected to the network, the user can remove it by connecting to the network; and when the cause is that the map has been moved to an unsupported area, the user can remove it by moving the map back to a supported area.
The entire page detection process may be as shown in fig. 6, for example: the application program or web page on the terminal is started, the page detection system is started, and the page detection system pulls the gradient histogram information corresponding to the negative samples from the server; the user clicks to enter a page to be detected of the application program or web page, and the page to be detected finishes loading; the page detection system is invoked to capture the visible area of the page to be detected as the image to be detected; the gradient histograms corresponding to the regions to be detected, obtained by dividing the image to be detected, are determined; and each gradient histogram is matched with the candidate gradient histogram corresponding to the candidate region in the negative sample satisfying the matching condition. Whether any gradient histogram corresponding to a region to be detected is successfully matched is then judged: when the gradient histogram corresponding to any region to be detected is successfully matched, it is determined that the page to be detected fails to load; when the gradient histograms corresponding to all regions to be detected fail to match, edge detection is performed on the image to be detected, and whether the target shape exists in the image to be detected is judged.
When the target shape does not exist in the image to be detected, determining that the page to be detected fails to be loaded; when a target shape exists in the image to be detected, performing binarization operation on the image to be detected according to the color value of the standard image layer corresponding to the page to be detected; and judging whether the images obtained after the binarization operation meet the color conditions. When any image obtained after the binarization operation does not meet the color condition, determining that the page to be detected fails to be loaded; and when the images obtained after the binarization operation meet the color conditions, determining that the page to be detected is successfully loaded. And when the loading failure of the page to be detected is determined, the terminal acquires equipment information and uploads the equipment information and the image to be detected to the server.
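The flow of fig. 6 can be summarized in the following sketch, which chains the helper functions from the earlier sketches; the parameter names and return strings are illustrative assumptions, not the patented implementation.

    def detect_page(image_bgr, region_candidate_pairs, layer_colors):
        # 1. Gradient-histogram-based region detection.
        for region_hist, candidate_hist in region_candidate_pairs:
            if is_match(region_hist, candidate_hist):
                return "loading failed"
        # 2. Shape-level detection: edge detection followed by the Hough transform.
        edges = detect_edges(image_bgr)
        if not has_target_lines(edges):
            return "loading failed"
        # 3. Layer-level detection: one binarization pass per standard layer color.
        for layer_color in layer_colors:
            _, color_condition_met = binarize_by_layer_color(image_bgr, layer_color)
            if not color_condition_met:
                return "loading failed"
        return "loading succeeded"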
In the embodiments of the present application, the terminal can automatically judge whether the page to be detected fails to load. The terminal can detect regional loading failures, shape loading failures and layer loading failures, so the detection is more comprehensive and the judgment of whether the page to be detected fails to load is more accurate. After a loading failure is determined, the device information and the image to be detected are uploaded to the server in time for analysis, so that the loading problem can be resolved promptly and the normal service of the application program or web page restored as soon as possible, giving a good page detection effect. In addition, the method provided by the embodiments of the present application is not limited to a particular page type and therefore has a wide application range.
In the embodiment of the application, an image to be detected is divided into areas to be detected, a gradient histogram corresponding to the areas to be detected is matched with a candidate gradient histogram corresponding to a candidate area in a negative sample, and when the gradient histogram corresponding to any one of the areas to be detected is successfully matched, the failure of loading the page to be detected is determined. In the page detection process, each area to be detected is detected respectively, the detection granularity is fine, the condition that the page to be detected fails to be loaded due to the loading failure of a certain area can be identified, in addition, the information related to the gradient histogram is rich, the detection is carried out based on the gradient histogram, the accuracy of judging whether the page fails to be loaded can be improved, and the page detection effect is good.
Referring to fig. 7, an embodiment of the present application provides a page detection apparatus, including:
the acquiring module 701 is configured to acquire a to-be-detected image corresponding to a to-be-detected page and gradient histogram information corresponding to a negative sample, where the negative sample is obtained based on image training corresponding to a page that fails to be loaded, and the gradient histogram information corresponding to the negative sample is used to indicate gradient histograms corresponding to areas at different positions in the negative sample;
the dividing module 702 is configured to divide an area of the image to be detected to obtain an area to be detected;
a first determining module 703, configured to determine a gradient histogram corresponding to the region to be detected;
the matching module 704 is configured to match a gradient histogram corresponding to a region to be detected with a candidate gradient histogram, where the candidate gradient histogram is a gradient histogram corresponding to a candidate region, and the candidate region is a region in a negative sample that meets a matching condition and is located at a position corresponding to the region to be detected;
the second determining module 705 is configured to determine that the page to be detected fails to be loaded in response to successful matching of the gradient histograms corresponding to any one of the regions to be detected.
Optionally, the dividing module 702 is further configured to perform region division on the image to be detected according to a first granularity and a second granularity respectively, so as to obtain regions to be detected of a first size and regions to be detected of a second size;
The first determining module 703 is further configured to determine a gradient histogram corresponding to the first size of the region to be detected and a gradient histogram corresponding to the second size of the region to be detected.
Optionally, the first size is larger than the second size, the candidate gradient histograms include a first candidate gradient histogram and a second candidate gradient histogram, and the matching module 704 is further configured to match the gradient histogram corresponding to the region to be detected of the first size with the first candidate gradient histogram; and to match the gradient histogram corresponding to the region to be detected of the second size with the second candidate gradient histogram in response to the failure of matching of the gradient histograms corresponding to the regions to be detected of the first size.
Optionally, the second determining module 705 is further configured to determine that the loading of the page to be detected fails in response to successful matching of the gradient histograms corresponding to the area to be detected of any one of the first sizes; or determining that the page to be detected fails to be loaded in response to the failure of matching of the gradient histograms corresponding to the areas to be detected in the first size and the success of matching of the gradient histograms corresponding to the areas to be detected in any second size.
Optionally, referring to fig. 8, the apparatus further includes:
the edge detection module 706 is configured to perform edge detection on the image to be detected in response to failure in matching of the gradient histograms corresponding to the region to be detected;
The second determining module 705 is further configured to determine that the page to be detected fails to be loaded in response to the result of edge detection indicating that the target shape does not exist in the image to be detected.
Optionally, the acquiring module 701 is further configured to acquire a color value of at least one standard layer corresponding to the page to be detected;
referring to fig. 8, the apparatus further includes:
an operation module 707, configured to respond to the result of edge detection to indicate that a target shape exists in the image to be detected, and perform binarization operation on the image to be detected according to a color value of at least one standard layer corresponding to the page to be detected;
the second determining module 705 is further configured to determine that the page to be detected fails to be loaded, in response to an image obtained by performing binarization operation on the image to be detected according to a color value of any standard layer not meeting a color condition.
Optionally, the acquiring module 701 is further configured to acquire a color value of at least one standard layer corresponding to the page to be detected;
the operation module 707 is further configured to perform binarization operation on the image to be detected according to the color value of at least one standard layer corresponding to the page to be detected in response to the failure of matching of the gradient histograms corresponding to the area to be detected;
the second determining module 705 is further configured to determine that the page to be detected fails to be loaded, in response to an image obtained by performing binarization operation on the image to be detected according to a color value of any standard layer not meeting a color condition.
Optionally, referring to fig. 8, the apparatus further includes:
an acquisition module 708 for acquiring device information;
an uploading module 709 for uploading the device information and the image to be detected to a server.
In the embodiment of the application, an image to be detected is divided into areas to be detected, a gradient histogram corresponding to the areas to be detected is matched with a candidate gradient histogram corresponding to a candidate area in a negative sample, and when the gradient histogram corresponding to any one of the areas to be detected is successfully matched, the failure of loading the page to be detected is determined. In the page detection process, each area to be detected is detected respectively, the detection granularity is fine, the condition that the page to be detected fails to be loaded due to the loading failure of a certain area can be identified, in addition, the information related to the gradient histogram is rich, the detection is carried out based on the gradient histogram, the accuracy of judging whether the page fails to be loaded can be improved, and the page detection effect is good.
It should be noted that, when the apparatus provided in the foregoing embodiment performs the functions thereof, only the division of the foregoing functional modules is used as an example, in practical application, the foregoing functional allocation may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to perform all or part of the functions described above. In addition, the apparatus and the method embodiments provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the apparatus and the method embodiments are detailed in the method embodiments and are not repeated herein.
Fig. 9 is a schematic structural diagram of a server provided in the embodiment of the present application, where the server may have a relatively large difference due to different configurations or performances, and may include one or more processors (Central Processing Units, CPU) 901 and one or more memories 902, where at least one program code is stored in the one or more memories 902, and the at least one program code is loaded and executed by the one or more processors 901, so as to implement the page detection method provided in each method embodiment described above. Of course, the server may also have a wired or wireless network interface, a keyboard, an input/output interface, and other components for implementing the functions of the device, which are not described herein.
Fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal may be: smart phones, tablet computers, notebook computers or desktop computers. Terminals may also be referred to by other names as user equipment, portable terminals, laptop terminals, desktop terminals, etc.
Generally, the terminal includes: a processor 1001 and a memory 1002.
The processor 1001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 1001 may be implemented in at least one hardware form of DSP (Digital Signal Processing ), FPGA (Field-Programmable Gate Array, field programmable gate array), PLA (Programmable Logic Array ). The processor 1001 may also include a main processor, which is a processor for processing data in an awake state, also referred to as a CPU (Central Processing Unit ), and a coprocessor; a coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit, image processor) for taking care of rendering and drawing of content that the display screen needs to display. In some embodiments, the processor 1001 may also include an AI (Artificial Intelligence ) processor for processing computing operations related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. Memory 1002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1002 is used to store at least one instruction for execution by processor 1001 to implement the page detection method provided by the method embodiments in the present application.
In some embodiments, the terminal may further optionally include: a peripheral interface 1003, and at least one peripheral. The processor 1001, the memory 1002, and the peripheral interface 1003 may be connected by a bus or signal line. The various peripheral devices may be connected to the peripheral device interface 1003 via a bus, signal wire, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, touch display 1005, camera assembly 1006, audio circuitry 1007, positioning assembly 1008, and power supply 1009.
Peripheral interface 1003 may be used to connect I/O (Input/Output) related at least one peripheral to processor 1001 and memory 1002. In some embodiments, processor 1001, memory 1002, and peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 1001, memory 1002, and peripheral interface 1003 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
Radio Frequency circuit 1004 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. Radio frequency circuitry 1004 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 1004 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 includes: antenna systems, RF transceivers, one or more amplifiers, tuners, oscillators, digital signal processors, codec chipsets, subscriber identity module cards, and so forth. Radio frequency circuitry 1004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity ) networks. In some embodiments, the radio frequency circuitry 1004 may also include NFC (Near Field Communication ) related circuitry, which is not limited in this application.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 1005 is a touch screen, the display 1005 also has the ability to capture touch signals at or above the surface of the display 1005. The touch signal may be input to the processor 1001 as a control signal for processing. At this time, the display 1005 may also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards. In some embodiments, the display 1005 may be one, disposed on the front panel of the terminal; in other embodiments, the display 1005 may be at least two, respectively disposed on different surfaces of the terminal or in a folded design; in still other embodiments, the display 1005 may be a flexible display disposed on a curved surface or a folded surface of the terminal. Even more, the display 1005 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display 1005 may be made of LCD (Liquid Crystal Display ), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and virtual reality (VR) shooting functions or other fused shooting functions. In some embodiments, the camera assembly 1006 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash; the latter is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and environments, converting the sound waves into electric signals, and inputting the electric signals to the processor 1001 for processing, or inputting the electric signals to the radio frequency circuit 1004 for voice communication. For the purpose of stereo acquisition or noise reduction, a plurality of microphones can be respectively arranged at different parts of the terminal. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuit 1007 may also include a headphone jack.
The positioning component 1008 is used to locate the current geographic position of the terminal to enable navigation or LBS (Location Based Service). The positioning component 1008 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1009 is used to supply power to the various components in the terminal. The power source 1009 may be alternating current, direct current, disposable battery or rechargeable battery. When the power source 1009 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal further includes one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: acceleration sensor 1011, gyroscope sensor 1012, pressure sensor 1013, fingerprint sensor 1014, optical sensor 1015, and proximity sensor 1016.
The acceleration sensor 1011 can detect the magnitudes of accelerations on three coordinate axes of a coordinate system established with the terminal. For example, the acceleration sensor 1011 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1001 may control the touch display 1005 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 1011. The acceleration sensor 1011 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 1012 may detect a body direction and a rotation angle of the terminal, and the gyro sensor 1012 may collect a 3D motion of the user to the terminal in cooperation with the acceleration sensor 1011. The processor 1001 may implement the following functions according to the data collected by the gyro sensor 1012: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 1013 may be provided at a side frame of the terminal and/or a lower layer of the touch display 1005. When the pressure sensor 1013 is provided at a side frame of the terminal, a grip signal of the terminal by a user can be detected, and the processor 1001 performs left-right hand recognition or quick operation according to the grip signal collected by the pressure sensor 1013. When the pressure sensor 1013 is provided at the lower layer of the touch display 1005, the processor 1001 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 1005. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1014 is used to collect a fingerprint of the user, and the processor 1001 identifies the identity of the user based on the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the identity of the user based on the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1001 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying for and changing settings, etc. The fingerprint sensor 1014 may be provided on the front, back or side of the terminal. When a physical key or vendor Logo is provided on the terminal, the fingerprint sensor 1014 may be integrated with the physical key or vendor Logo.
The optical sensor 1015 is used to collect ambient light intensity. In one embodiment, the processor 1001 may control the display brightness of the touch display 1005 based on the ambient light intensity collected by the optical sensor 1015. Specifically, when the intensity of the ambient light is high, the display brightness of the touch display screen 1005 is turned up; when the ambient light intensity is low, the display brightness of the touch display screen 1005 is turned down. In another embodiment, the processor 1001 may dynamically adjust the shooting parameters of the camera module 1006 according to the ambient light intensity collected by the optical sensor 1015.
A proximity sensor 1016, also known as a distance sensor, is typically provided on the front panel of the terminal. The proximity sensor 1016 is used to collect the distance between the user and the front of the terminal. In one embodiment, when the proximity sensor 1016 detects a gradual decrease in the distance between the user and the front face of the terminal, the processor 1001 controls the touch display 1005 to switch from the bright screen state to the off screen state; when the proximity sensor 1016 detects that the distance between the user and the front surface of the terminal gradually increases, the touch display 1005 is controlled by the processor 1001 to switch from the off-screen state to the on-screen state.
It will be appreciated by those skilled in the art that the structure shown in fig. 10 is not limiting of the terminal and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
In an exemplary embodiment, a computer device is also provided that includes a processor and a memory having at least one program code stored therein. The at least one piece of program code is loaded and executed by one or more processors to implement any of the page detection methods described above.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein at least one program code loaded and executed by a processor of a computer device to implement any of the page detection methods described above.
Alternatively, the above-mentioned computer readable storage medium may be a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a Read-Only optical disk (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
It should be understood that references herein to "a plurality" are to two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
It should be noted that the terms "first," "second," and the like in the description and in the claims of the present application are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
The foregoing description of the exemplary embodiments of the present application is not intended to limit the invention to the particular embodiments disclosed, but on the contrary, the intention is to cover all modifications, equivalents, alternatives, and alternatives falling within the spirit and scope of the invention.

Claims (8)

1. A method of page detection, the method comprising:
acquiring an image to be detected corresponding to a page to be detected and gradient histogram information corresponding to a negative sample, wherein the negative sample is obtained by training on images corresponding to pages that failed to load, and the gradient histogram information corresponding to the negative sample is used to indicate the gradient histograms corresponding to regions at different positions in the negative sample;
dividing the image to be detected into regions to obtain a region to be detected, and determining a gradient histogram corresponding to the region to be detected;
matching the gradient histogram corresponding to the region to be detected with a candidate gradient histogram, wherein the candidate gradient histogram is the gradient histogram corresponding to a candidate region, and the candidate region is the region, in a negative sample that meets a matching condition, located at the position corresponding to the region to be detected;
in response to successful matching of the gradient histogram corresponding to any region to be detected, determining that the page to be detected fails to load;
wherein the dividing the image to be detected into regions to obtain a region to be detected and determining a gradient histogram corresponding to the region to be detected comprises:
dividing the image to be detected into regions according to a first granularity and a second granularity respectively, to obtain a first-size region to be detected and a second-size region to be detected;
determining a gradient histogram corresponding to the first-size region to be detected and a gradient histogram corresponding to the second-size region to be detected;
wherein the first size is larger than the second size, the candidate gradient histogram includes a first candidate gradient histogram and a second candidate gradient histogram, and the matching the gradient histogram corresponding to the region to be detected with the candidate gradient histogram includes:
matching the gradient histogram corresponding to the first-size region to be detected with the first candidate gradient histogram;
and, in response to failure in matching the gradient histogram corresponding to the first-size region to be detected, matching the gradient histogram corresponding to the second-size region to be detected with the second candidate gradient histogram.
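For illustration only (not part of the claims), the following Python/NumPy sketch shows one way the coarse-to-fine matching of claim 1 could look for a grayscale screenshot. The grid sizes (2 and 4), the L1 distance with a 0.1 threshold, and the layout of the negative-sample histogram table are assumptions made for the example, not details taken from the claim.

import numpy as np

def gradient_histogram(region, bins=9):
    # Unsigned gradient-orientation histogram of one grayscale region, weighted by magnitude.
    gy, gx = np.gradient(region.astype(np.float32))
    magnitude = np.hypot(gx, gy)
    orientation = np.degrees(np.arctan2(gy, gx)) % 180.0
    hist, _ = np.histogram(orientation, bins=bins, range=(0.0, 180.0), weights=magnitude)
    return hist / (hist.sum() + 1e-6)

def split_into_regions(image, grid):
    # Divide the image into a grid x grid set of regions, keyed by (row, col) position.
    h, w = image.shape[:2]
    return {(r, c): image[r * h // grid:(r + 1) * h // grid,
                          c * w // grid:(c + 1) * w // grid]
            for r in range(grid) for c in range(grid)}

def regions_match_negative(image, negative_histograms, grid, threshold=0.1):
    # negative_histograms: {(row, col): [histogram, ...]} built from images of pages
    # that failed to load; a small L1 distance counts as a successful match.
    for pos, region in split_into_regions(image, grid).items():
        hist = gradient_histogram(region)
        for candidate in negative_histograms.get(pos, []):
            if np.abs(hist - candidate).sum() < threshold:
                return True
    return False

def page_load_failed(image, first_size_histograms, second_size_histograms):
    # First-size (coarser, larger) regions are checked first; the second-size
    # (finer, smaller) regions are only checked when the coarse match fails.
    if regions_match_negative(image, first_size_histograms, grid=2):
        return True
    return regions_match_negative(image, second_size_histograms, grid=4)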
2. The method of claim 1, wherein the determining, in response to successful matching of the gradient histogram corresponding to any region to be detected, that the page to be detected fails to load comprises:
in response to successful matching of the gradient histogram corresponding to any first-size region to be detected, determining that the page to be detected fails to load; or,
in response to failure in matching the gradient histograms corresponding to the first-size regions to be detected and successful matching of the gradient histogram corresponding to any second-size region to be detected, determining that the page to be detected fails to load.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
in response to failure in matching the gradient histograms corresponding to the regions to be detected, performing edge detection on the image to be detected;
and, in response to a result of the edge detection indicating that a target shape does not exist in the image to be detected, determining that the page to be detected fails to load.
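By way of illustration only, a minimal OpenCV sketch of the edge-detection fallback in claim 3, under the assumptions that the target shape is a rectangle (for example a card or list block) and that OpenCV 4 is used; the Canny thresholds and the minimum contour area are arbitrary example values.

import cv2

def target_shape_present(image_bgr, min_area=2000):
    # Run edge detection and look for a rectangle-like contour of reasonable size;
    # if none is found, the page content likely did not render.
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue
        perimeter = cv2.arcLength(contour, True)
        approx = cv2.approxPolyDP(contour, 0.02 * perimeter, True)
        if len(approx) == 4:   # four corners, treated here as the target shape
            return True
    return False

In this sketch, a screenshot containing no such contour would be reported as a loading failure.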
4. The method according to claim 3, characterized in that the method further comprises:
acquiring a color value of at least one standard layer corresponding to the page to be detected;
in response to the result of the edge detection indicating that the target shape exists in the image to be detected, performing a binarization operation on the image to be detected according to the color value of the at least one standard layer corresponding to the page to be detected;
and, in response to the image obtained after the binarization operation is performed on the image to be detected according to the color value of any standard layer not meeting a color condition, determining that the page to be detected fails to load.
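Again for illustration only, a NumPy sketch of the per-layer binarization check described in claims 4 and 5. The color tolerance and the particular color condition used here (the proportion of pixels close to the standard layer's color must fall inside a band) are assumptions for the example.

import numpy as np

def binarize_by_color(image_bgr, layer_color_bgr, tolerance=20):
    # Pixels whose color is close to the standard layer's color value become 1, all others 0.
    diff = np.abs(image_bgr.astype(np.int32) - np.array(layer_color_bgr, dtype=np.int32))
    return (diff.max(axis=2) <= tolerance).astype(np.uint8)

def color_condition_met(image_bgr, layer_color_bgr, min_ratio=0.05, max_ratio=0.95):
    # Example color condition: the layer should cover neither almost none nor
    # almost all of the screenshot; a blank or solid-color page violates this.
    ratio = binarize_by_color(image_bgr, layer_color_bgr).mean()
    return min_ratio <= ratio <= max_ratio

def failed_by_layer_colors(image_bgr, standard_layer_colors):
    # The page is treated as failed if the binarized image for any standard
    # layer's color value does not meet the color condition.
    return any(not color_condition_met(image_bgr, color) for color in standard_layer_colors)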
5. The method according to claim 1 or 2, characterized in that the method further comprises:
acquiring a color value of at least one standard layer corresponding to the page to be detected;
in response to failure in matching the gradient histograms corresponding to the regions to be detected, performing a binarization operation on the image to be detected according to the color value of the at least one standard layer corresponding to the page to be detected;
and, in response to the image obtained after the binarization operation is performed on the image to be detected according to the color value of any standard layer not meeting a color condition, determining that the page to be detected fails to load.
6. The method of claim 1, wherein after the determining that the page to be detected fails to load, the method further comprises:
collecting equipment information;
and uploading the equipment information and the image to be detected to a server.
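For illustration only, a minimal sketch of the reporting step in claim 6 using the Python requests library; the upload URL, the field names, and the choice of equipment information are placeholders, not details from the claim.

import platform
import requests

def report_failure(image_path, upload_url="https://example.com/api/page-failure"):
    # Collect basic equipment information and upload it together with the image
    # of the page that failed to load; URL and field names are placeholders.
    device_info = {
        "system": platform.system(),
        "release": platform.release(),
        "machine": platform.machine(),
    }
    with open(image_path, "rb") as f:
        response = requests.post(
            upload_url,
            data=device_info,
            files={"image": ("screenshot.png", f, "image/png")},
            timeout=10,
        )
    response.raise_for_status()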
7. A computer device comprising a processor and a memory, wherein the memory has stored therein at least one program code that is loaded and executed by the processor to implement the page detection method of any of claims 1 to 6.
8. A computer readable storage medium having stored therein at least one program code, the at least one program code being loaded and executed by a processor to implement the page detection method of any of claims 1 to 6.
CN202010392403.0A 2020-05-11 2020-05-11 Page detection method, device, equipment and storage medium Active CN111582184B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010392403.0A CN111582184B (en) 2020-05-11 2020-05-11 Page detection method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111582184A (en) 2020-08-25
CN111582184B (en) 2024-02-20

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113077437B (en) * 2021-03-31 2023-07-25 上海晨兴希姆通电子科技有限公司 Workpiece quality detection method and system

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP5756322B2 (en) * 2011-04-08 2015-07-29 任天堂株式会社 Information processing program, information processing method, information processing apparatus, and information processing system
US20150154659A1 (en) * 2013-12-03 2015-06-04 Yahoo! Inc. System and method for displaying transitional mobile ads during network page download latency time
CN107273031B (en) * 2017-06-23 2020-10-16 阿里巴巴(中国)有限公司 Information flow page loading method and device

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN103049751A (en) * 2013-01-24 2013-04-17 苏州大学 Improved weighting region matching high-altitude video pedestrian recognizing method
WO2018033155A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Video image processing method, apparatus and electronic device
CN110163287A (en) * 2019-05-24 2019-08-23 三亚中科遥感研究所 A kind of mesoscale eddy detection method and device

Non-Patent Citations (1)

Title
老实和尚. A method for protecting applications: simulating the Windows PE loader and loading a DLL from a memory resource. Programmer (程序员), 2008, (07), full text. *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant