CN111582184A - Page detection method, device, equipment and storage medium - Google Patents

Page detection method, device, equipment and storage medium

Info

Publication number
CN111582184A
Authority
CN
China
Prior art keywords
detected
page
region
image
gradient histogram
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010392403.0A
Other languages
Chinese (zh)
Other versions
CN111582184B (en)
Inventor
陈宗文
杜义明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hanhai Information Technology Shanghai Co Ltd
Original Assignee
Hanhai Information Technology Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hanhai Information Technology Shanghai Co Ltd filed Critical Hanhai Information Technology Shanghai Co Ltd
Priority to CN202010392403.0A priority Critical patent/CN111582184B/en
Publication of CN111582184A publication Critical patent/CN111582184A/en
Application granted granted Critical
Publication of CN111582184B publication Critical patent/CN111582184B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/50 - Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G06V10/507 - Summing image-intensity values; Histogram projection analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 - Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 - Indexing scheme relating to image or video recognition or understanding
    • G06V2201/02 - Recognising information on displays, dials, clocks
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a page detection method, device, equipment and storage medium, belonging to the field of computer technology. The method comprises: acquiring an image to be detected corresponding to a page to be detected and gradient histogram information corresponding to a negative sample; performing region division on the image to be detected to obtain regions to be detected, and determining the gradient histogram corresponding to each region to be detected; matching the gradient histogram corresponding to each region to be detected with a candidate gradient histogram, where the candidate gradient histogram is the gradient histogram corresponding to the candidate region in a negative sample satisfying the matching condition; and, in response to successful matching of the gradient histogram corresponding to any region to be detected, determining that the page to be detected has failed to load. Because each region to be detected is detected separately, the detection granularity is fine and a loading failure confined to a single region can be identified; and because the gradient histogram carries rich information, the accuracy of judging whether the page has failed to load is improved.

Description

Page detection method, device, equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a page detection method, a page detection device, page detection equipment and a storage medium.
Background
With the development of computer technology, a terminal can install more and more applications and access more and more web pages, such as map applications and navigation web pages. While these applications or web pages are being used, page loading failures may occur; for example, a map page, a shopping page, or a payment page may fail to load. Accurately detecting a page loading failure is the key to resolving the failure in time and restoring the normal service of the application program or web page.
In the related art, the gray histogram of the entire image to be detected corresponding to the page to be detected is matched with the gray histogram of a standard image captured when the page loads successfully, and whether the page has failed to load is judged from the matching result. In this page detection process, the entire image to be detected is examined with a matching method based on the gray histogram, so the detection granularity is coarse, the accuracy of judging whether the page has failed to load is low, and the page detection effect is poor.
Disclosure of Invention
The embodiments of the application provide a page detection method, device, equipment and storage medium, which can be used to solve the problems in the related art. The technical solution is as follows:
in one aspect, an embodiment of the present application provides a page detection method, where the method includes:
acquiring an image to be detected corresponding to a page to be detected and gradient histogram information corresponding to a negative sample, where the negative sample is obtained by training on images corresponding to pages that failed to load, and the gradient histogram information corresponding to the negative sample indicates the gradient histograms corresponding to regions at different positions in the negative sample;
performing region division on the image to be detected to obtain regions to be detected, and determining the gradient histogram corresponding to each region to be detected;
matching the gradient histogram corresponding to the region to be detected with a candidate gradient histogram, where the candidate gradient histogram is the gradient histogram corresponding to a candidate region, and the candidate region is the region at the position corresponding to the region to be detected in a negative sample satisfying the matching condition;
and, in response to successful matching of the gradient histogram corresponding to any region to be detected, determining that the page to be detected has failed to load.
In another aspect, a page detection apparatus is provided, the apparatus including:
an acquiring module, configured to acquire an image to be detected corresponding to a page to be detected and gradient histogram information corresponding to a negative sample, where the negative sample is obtained by training on images corresponding to pages that failed to load, and the gradient histogram information corresponding to the negative sample indicates the gradient histograms corresponding to regions at different positions in the negative sample;
a dividing module, configured to perform region division on the image to be detected to obtain regions to be detected;
a first determining module, configured to determine the gradient histogram corresponding to each region to be detected;
a matching module, configured to match the gradient histogram corresponding to the region to be detected with a candidate gradient histogram, where the candidate gradient histogram is the gradient histogram corresponding to a candidate region, and the candidate region is the region at the position corresponding to the region to be detected in a negative sample satisfying the matching condition;
and a second determining module, configured to determine, in response to successful matching of the gradient histogram corresponding to any region to be detected, that the page to be detected has failed to load.
Optionally, the dividing module is further configured to perform region division on the image to be detected according to a first granularity and a second granularity respectively, to obtain regions to be detected of a first size and regions to be detected of a second size;
the first determining module is further configured to determine the gradient histograms corresponding to the regions to be detected of the first size and the gradient histograms corresponding to the regions to be detected of the second size.
Optionally, the first size is larger than the second size, and the candidate gradient histograms include a first candidate gradient histogram and a second candidate gradient histogram; the matching module is further configured to match the gradient histogram corresponding to a region to be detected of the first size with the first candidate gradient histogram, and, in response to failure of matching of the gradient histograms corresponding to all regions to be detected of the first size, to match the gradient histogram corresponding to a region to be detected of the second size with the second candidate gradient histogram.
Optionally, the second determining module is further configured to determine that the page to be detected has failed to load in response to successful matching of the gradient histogram corresponding to any region to be detected of the first size; or to determine that the page to be detected has failed to load in response to failure of matching of the gradient histograms corresponding to all regions to be detected of the first size and successful matching of the gradient histogram corresponding to any region to be detected of the second size.
Optionally, the apparatus further includes:
an edge detection module, configured to perform edge detection on the image to be detected in response to failure of matching of the gradient histograms corresponding to all the regions to be detected;
the second determining module is further configured to determine that the page to be detected has failed to load in response to the edge detection result indicating that the target shape does not exist in the image to be detected.
Optionally, the acquiring module is further configured to acquire the color value of at least one standard layer corresponding to the page to be detected;
the apparatus further includes:
an operation module, configured to perform a binarization operation on the image to be detected according to the color value of the at least one standard layer corresponding to the page to be detected, in response to the edge detection result indicating that the target shape exists in the image to be detected;
the second determining module is further configured to determine that the page to be detected has failed to load in response to the image obtained after the binarization operation performed according to the color value of any standard layer not satisfying the color condition.
Optionally, the acquiring module is further configured to acquire the color value of at least one standard layer corresponding to the page to be detected;
the operation module is further configured to perform a binarization operation on the image to be detected according to the color value of the at least one standard layer corresponding to the page to be detected, in response to failure of matching of the gradient histograms corresponding to all the regions to be detected;
the second determining module is further configured to determine that the page to be detected has failed to load in response to the image obtained after the binarization operation performed according to the color value of any standard layer not satisfying the color condition.
Optionally, the apparatus further comprises:
the acquisition module is used for acquiring equipment information;
and the uploading module is used for uploading the equipment information and the image to be detected to a server.
In another aspect, a computer device is provided, which includes a processor and a memory, where at least one program code is stored in the memory, and the at least one program code is loaded and executed by the processor to implement any one of the above-mentioned page detection methods.
In another aspect, a computer-readable storage medium is provided, where at least one program code is stored, and the at least one program code is loaded and executed by a processor to implement any of the above-mentioned page detection methods.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
the image to be detected is divided into regions to be detected, the gradient histogram corresponding to each region to be detected is matched with the candidate gradient histogram corresponding to the candidate region in the negative sample, and the page to be detected is determined to have failed to load when the gradient histogram corresponding to any region to be detected matches successfully. In this page detection process, each region to be detected is examined separately, so the detection granularity is fine and a loading failure confined to a single region can be identified; in addition, the gradient histogram carries rich information, so detecting on the basis of gradient histograms improves the accuracy of judging whether the page has failed to load, and the page detection effect is good.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a page detection method provided in an embodiment of the present application;
fig. 2 is a flowchart of a page detection method provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of an image to be detected and a gradient histogram corresponding to the image to be detected according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of an image obtained in a process of performing shape detection on an image to be detected corresponding to a map page according to an embodiment of the present application;
fig. 5 is a schematic diagram of an image to be detected and an image obtained after binarization operation is performed on the image to be detected according to a color value of a text labeling layer according to an embodiment of the application;
FIG. 6 is a schematic diagram of a page detection process provided in an embodiment of the present application;
fig. 7 is a schematic diagram of a page detection apparatus according to an embodiment of the present application;
fig. 8 is a schematic diagram of a page detection apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a server provided in an embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
With the development of computer technology, a terminal can install more and more applications and access more and more web pages, such as map applications and navigation web pages. While these applications or web pages are being used, page loading failures may occur; for example, a map page, a shopping page, or a payment page may fail to load. Accurately detecting a page loading failure is the key to resolving the failure in time and restoring the normal service of the application program or web page.
For example, taking a map application program as an example, a map page may fail to load during use (for example, a text label or a navigation route fails to load). Normal map service can then only be restored by resolving the map loading failure, which requires the loading failure to be detected in time. It should be noted that different map applications may restore normal map service in different ways. For example, for a unified map application program that encapsulates maps provided by multiple service parties, when a map page loading failure of one map is detected, normal map service may be restored by dynamically and seamlessly switching to the map page of another map.
The embodiment of the application provides a page detection method to judge whether a page fails to be loaded. Please refer to fig. 1, which illustrates a schematic diagram of an implementation environment of a page detection method according to an embodiment of the present application. The implementation environment may include: a terminal 11 and a server 12.
The terminal 11 is installed with an application program or web page capable of displaying a page. When a page displayed by the application program or web page needs to be detected, the method provided by the embodiment of the present application can be applied to perform page detection. The server 12 may train a negative sample used to judge whether a page has failed to load and then send the gradient histogram information of the negative sample to the terminal 11, and the terminal 11 detects the image to be detected corresponding to the page to be detected based on the gradient histogram information of the negative sample to judge whether the page has failed to load. Alternatively, the terminal 11 may send the image to be detected corresponding to the page to be detected to the server 12, and the server 12 performs the detection based on the gradient histogram information of the negative sample to judge whether the page has failed to load.
Optionally, the terminal 11 may be any electronic product capable of man-machine interaction with a user through one or more of a keyboard, a touch pad, a touch screen, a remote controller, voice interaction or a handwriting device, such as a PC (Personal Computer), a mobile phone, a smart phone, a PDA (Personal Digital Assistant), a wearable device, a pocket PC, a tablet computer, a smart car, a smart television or a smart speaker. The server 12 may be a single server, a server cluster composed of multiple servers, or a cloud computing service center. The terminal 11 establishes a communication connection with the server 12 through a wired or wireless network.
It should be understood by those skilled in the art that the above-mentioned terminal 11 and server 12 are only examples, and other existing or future terminals or servers may be suitable for the present application and are included within the scope of the present application and are herein incorporated by reference.
Based on the implementation environment shown in fig. 1, the present application provides a page detection method, which is applied to the terminal 11 as an example. As shown in fig. 2, the method provided by the embodiment of the present application may include the following steps:
in step 201, gradient histogram information corresponding to the to-be-detected image and the negative sample corresponding to the to-be-detected page is obtained.
The negative sample is obtained based on image training corresponding to the page with the loading failure, and the gradient histogram information corresponding to the negative sample is used for indicating gradient histograms corresponding to regions at different positions in the negative sample.
The page to be detected refers to any page in an application program or a webpage installed by the terminal, for example, when the application program installed by the terminal is a map application program, the page to be detected may refer to a home page of the map application program, or may refer to a navigation page in the use process of the map application program.
The image to be detected corresponding to the page to be detected may be an image obtained by capturing the visible region of the page to be detected, and it may be an RGB (Red Green Blue) image. Optionally, the terminal may be equipped with a page detection system, and the process of acquiring the image to be detected may be: in response to a detection instruction for the page to be detected, the terminal calls the page detection system to capture the visible region of the page to be detected, obtaining the image to be detected corresponding to the page to be detected. The detection instruction may be triggered automatically when the page to be detected is displayed, or triggered manually by the user, which is not limited in the embodiment of the present application.
Optionally, the page detection system may be built on a page detection SDK (Software Development Kit). It should be noted that the embodiment of the present application does not limit when the page detection system is started; it only needs to be started before it is called, for example when the terminal starts an application program or a web page, or when a detection instruction for the page to be detected is detected.
The negative sample is a sample of a page loading failure and is used to detect whether the image to be detected failed to load. It should be noted that a negative sample may be common to all pages to be detected, or specific to the application program or web page to which the page to be detected belongs. In either case, compared with the related art, which requires a standard image (positive sample) of the successful loading of every page to be detected, the embodiment of the present application needs fewer samples and achieves a better page detection effect.
Optionally, the negative sample may be obtained by training on images corresponding to pages that failed to load. A page that failed to load may be one or more of a page with a local loading failure, a page with a shape loading failure, and a page with a layer loading failure. The training of the negative sample may be executed by the terminal or by the server, which is not limited in the embodiment of the present application. The embodiment of the present application is described taking the case in which the server performs the training.
The server obtains the negative sample by training on images corresponding to pages that failed to load. These images may be uploaded to the server by the business side of the application program or web page, or by users of the application program or web page, which is not limited in the embodiment of the present application. After obtaining the images corresponding to pages that failed to load, the server trains on them to obtain the negative sample. For example, the training process may be: extracting the features of each image, training on those features, and taking an image that fuses the features of the multiple images as the negative sample. Optionally, the images may first be clustered and each class of images trained on separately, yielding one negative sample per class. The clustering criterion may be the reason for the loading failure, the region that failed to load, and so on, which is not limited in the embodiment of the present application.
After the negative sample is obtained through training, the server may obtain the gradient histogram information corresponding to the negative sample, which indicates the gradient histograms corresponding to regions at different positions in the negative sample. It should be noted that there may be one or more negative samples, and different negative samples may correspond to different loading failure reasons, different loading failure regions, different zoom levels, and so on. When there are multiple negative samples, the server obtains the gradient histogram information corresponding to each negative sample separately.
Optionally, the gradient histogram information may be the whole gradient histogram corresponding to the negative sample, whose size is consistent with the size of the negative sample; the gradient histograms corresponding to regions at different positions in the negative sample can then be cut out of the whole gradient histogram. The gradient histogram information may also be the gradient histogram corresponding to each unit region in the negative sample; concatenating the gradient histograms of several unit regions yields the gradient histograms corresponding to regions at different positions in the negative sample.
Optionally, the whole gradient histogram corresponding to the negative sample may be obtained from the gradient histograms corresponding to the individual unit regions in the negative sample, in ways that include but are not limited to the following three:
mode 1: and directly connecting the gradient histograms corresponding to all unit areas in the negative sample in sequence to obtain the whole gradient histogram corresponding to the negative sample.
In this way 1, the efficiency of obtaining the whole gradient histogram corresponding to the negative sample is high.
Mode 2: and carrying out normalization processing on the gradient histograms corresponding to the unit areas in the negative sample, and connecting the gradient histograms corresponding to the unit areas after the normalization processing according to the sequence to obtain the whole gradient histogram corresponding to the negative sample.
In this manner 2, the influence of light can be eliminated to some extent.
Mode 3: combining the unit areas in the negative sample into a larger block, normalizing the gradient histograms corresponding to the unit areas in the block according to the block, and connecting the gradient histograms corresponding to the normalized unit areas in sequence to obtain the whole gradient histogram corresponding to the negative sample.
In such mode 3, the individual cell regions are combined into large, spatially connected blocks. Different blocks are overlapped with each other, so that the gradient histogram of each unit area appears in the whole gradient histogram corresponding to the negative sample for a plurality of times with different results. The whole gradient histogram corresponding to the negative sample obtained in the manner 3 is closer to the real characteristics of the negative sample.
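As an illustration of the three modes, the following is a minimal Python/NumPy sketch, not taken from the patent text: cell_hists is assumed to be a (rows, cols, bins) array holding the gradient histogram of every unit region (computed as in step (3) further below), and the 2 x 2-cell block size and L2 normalization used for modes 2 and 3 are assumptions the text does not fix.

```python
import numpy as np

def whole_histogram(cell_hists, mode=3, block=2, eps=1e-6):
    """Concatenate per-unit-region histograms into the whole gradient histogram.

    cell_hists: array of shape (rows, cols, bins), one histogram per unit region.
    mode 1: direct concatenation in order.
    mode 2: normalize each unit region's histogram, then concatenate.
    mode 3: group unit regions into overlapping block x block blocks,
            normalize per block, then concatenate (assumed L2 norm, 2 x 2 blocks).
    """
    rows, cols, _ = cell_hists.shape
    if mode == 1:
        return cell_hists.reshape(-1)
    if mode == 2:
        norms = np.linalg.norm(cell_hists, axis=2, keepdims=True)
        return (cell_hists / (norms + eps)).reshape(-1)
    pieces = []
    for r in range(rows - block + 1):          # overlapping blocks, step of one cell
        for c in range(cols - block + 1):
            blk = cell_hists[r:r + block, c:c + block].reshape(-1)
            pieces.append(blk / (np.linalg.norm(blk) + eps))
    return np.concatenate(pieces)
```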
Regardless of the above-described manner of obtaining the entire gradient histogram corresponding to the negative sample based on the gradient histogram corresponding to each unit region in the negative sample, the gradient histogram corresponding to each unit region in the negative sample needs to be obtained before obtaining the entire gradient histogram corresponding to the negative sample. Optionally, the gradient histogram in the embodiment of the present application may refer to a visualized gradient histogram, so as to facilitate visual comparison.
A negative sample is in essence an image, so the process of obtaining the gradient histogram corresponding to each unit region of an image is described next. The gradient histogram may also be referred to as a HOG (Histogram of Oriented Gradients). Optionally, the process of obtaining the gradient histogram corresponding to each unit region in an image may include the following three steps:
(1) and preprocessing the image.
The purpose of the preprocessing is to normalize the image to reduce the influence caused by local shadows and illumination changes of the image. Alternatively, the manner of preprocessing the image may refer to gamma-correcting the image.
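A minimal sketch of this preprocessing step in Python/NumPy; the gamma value of 0.5 is an assumed example, since the text only states that gamma correction may be used.

```python
import numpy as np

def gamma_correct(img, gamma=0.5):
    """Normalize the image via gamma correction to reduce the influence of
    local shadow and illumination changes; gamma = 0.5 is an assumed value."""
    scaled = img.astype(np.float64) / 255.0
    return np.clip(scaled ** gamma * 255.0, 0, 255).astype(np.uint8)
```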
(2) The gradient of each pixel point in the image is calculated.
The gradient of each pixel point comprises a gradient amplitude and a gradient direction of each pixel point, and the gradient amplitude and the gradient direction of each pixel point can be obtained by calculation based on the horizontal direction gradient and the vertical direction gradient of each pixel point. Therefore, before calculating the gradient of each pixel point, it is necessary to calculate the horizontal gradient and the vertical gradient of each pixel point. The horizontal direction gradient of each pixel point can be calculated based on formula 1, and the vertical direction gradient of the pixel point can be calculated based on formula 2:
Gx(x, y) = H(x+1, y) - H(x-1, y)   (formula 1)
Gy(x, y) = H(x, y+1) - H(x, y-1)   (formula 2)
where Gx(x, y) represents the horizontal gradient of the pixel with coordinates (x, y); Gy(x, y) represents the vertical gradient of the pixel with coordinates (x, y); H(x+1, y) represents the pixel value of the pixel with coordinates (x+1, y); H(x-1, y) represents the pixel value of the pixel with coordinates (x-1, y); H(x, y+1) represents the pixel value of the pixel with coordinates (x, y+1); and H(x, y-1) represents the pixel value of the pixel with coordinates (x, y-1).
The horizontal and vertical gradients of each pixel may be computed as follows: convolving the image with the [-1, 0, 1] gradient operator gives the horizontal gradient (with right as the positive direction); convolving the image with the [-1, 0, 1]^T gradient operator gives the vertical gradient (with up as the positive direction).
After the horizontal direction gradient and the vertical direction gradient of each pixel point are obtained through calculation, the gradient amplitude and the gradient direction of each pixel point can be calculated through a formula 3 and a formula 4 respectively, and therefore the gradient of each pixel point is obtained.
G(x, y) = sqrt(Gx(x, y)^2 + Gy(x, y)^2)   (formula 3)
α(x, y) = arctan(Gy(x, y) / Gx(x, y))   (formula 4)
where G(x, y) represents the gradient amplitude of the pixel with coordinates (x, y); Gx(x, y) and Gy(x, y) represent its horizontal and vertical gradients; and α(x, y) represents its gradient direction.
It should be noted that for an RGB image, the gradient (gradient amplitude and gradient direction) is calculated on each of the three channels of every pixel. The largest of the three channel amplitudes is taken as the gradient amplitude of the pixel, and the gradient direction corresponding to that largest amplitude is taken as the gradient direction of the pixel.
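The following Python/NumPy sketch implements formulas 1 to 4 with the per-channel maximum rule described above. It is an illustration rather than the patent's implementation; zero padding at the image border and the use of arctan2 to obtain the full 0-360 degree direction range are assumptions.

```python
import numpy as np

def pixel_gradients(img):
    """Per-pixel gradient amplitude and direction for an RGB image of shape (H, W, 3).

    Horizontal/vertical differences follow formulas 1 and 2 (the [-1, 0, 1]
    operators); amplitude and direction follow formulas 3 and 4; for each pixel
    the channel with the largest amplitude is kept.
    """
    img = img.astype(np.float64)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1, :] = img[:, 2:, :] - img[:, :-2, :]   # Gx(x, y) = H(x+1, y) - H(x-1, y)
    gy[1:-1, :, :] = img[2:, :, :] - img[:-2, :, :]   # Gy(x, y) = H(x, y+1) - H(x, y-1)

    amplitude = np.sqrt(gx ** 2 + gy ** 2)                # formula 3, per channel
    direction = np.degrees(np.arctan2(gy, gx)) % 360.0    # formula 4, per channel

    best = amplitude.argmax(axis=2)                       # strongest of the 3 channels
    rows, cols = np.indices(best.shape)
    return amplitude[rows, cols, best], direction[rows, cols, best]
```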
(3) A gradient histogram is constructed for each unit region.
The preprocessed image is divided into multiple unit regions, each composed of several pixels, for example 6 × 6 pixels. The gradient amplitudes and gradient directions of the pixels in each unit region are accumulated into direction blocks, thereby constructing a gradient histogram for each unit region.
Optionally, the process of constructing the gradient histogram for each unit region may be: cut the 360-degree range of gradient directions into 9 direction blocks, and for each pixel in the unit region perform a weighted projection into the histogram (mapping the pixel to a fixed angle range) using its gradient amplitude and gradient direction, yielding the gradient histogram (histogram of oriented gradients) of the unit region. In this weighted projection, the gradient direction locates the direction block and the gradient amplitude serves as the projection weight.
For example, the 360-degree range of gradient directions may be cut into 9 direction blocks as follows: 0-20 degrees and 180-200 degrees form direction block 1, 20-40 degrees and 200-220 degrees form direction block 2, ..., and 160-180 degrees and 340-360 degrees form direction block 9. During the weighted projection, if the gradient direction of a pixel in the unit region is 30 degrees, the gradient amplitude of that pixel is added to the count of direction block 2.
After obtaining the numerical representation result of the gradient histogram for each unit region, the numerical representation result may be visualized, and the visualized result may be used as the gradient histogram corresponding to each unit region.
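A Python/NumPy sketch of step (3), assuming 6 × 6-pixel unit regions as in the example; the pairing of opposite directions into the same block (e.g. 0-20 and 180-200 degrees both feeding block 1) is realized here by taking the direction modulo 180.

```python
import numpy as np

def cell_histograms(amplitude, direction, cell_size=6, n_bins=9):
    """Build a 9-bin weighted gradient histogram for every unit region.

    amplitude/direction: (H, W) arrays from the gradient step. The gradient
    direction locates the direction block and the gradient amplitude is the
    projection weight.
    """
    h, w = amplitude.shape
    n_rows, n_cols = h // cell_size, w // cell_size
    hists = np.zeros((n_rows, n_cols, n_bins))
    bin_width = 180.0 / n_bins                      # 20-degree direction blocks
    for r in range(n_rows):
        for c in range(n_cols):
            ys = slice(r * cell_size, (r + 1) * cell_size)
            xs = slice(c * cell_size, (c + 1) * cell_size)
            mag = amplitude[ys, xs].ravel()
            ang = direction[ys, xs].ravel() % 180.0
            bins = np.minimum((ang // bin_width).astype(int), n_bins - 1)
            np.add.at(hists[r, c], bins, mag)       # weighted projection
    return hists
```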
After obtaining the gradient histogram corresponding to each unit region, the server may obtain the gradient histogram information corresponding to the negative sample no matter whether the gradient histogram information refers to the entire gradient histogram corresponding to the negative sample or the gradient histogram corresponding to each unit region in the negative sample.
Optionally, the process of the terminal acquiring the gradient histogram information corresponding to the negative sample may be: and the terminal calls a page detection system to pull the gradient histogram information corresponding to the negative sample from the server.
It should be noted that, the order of obtaining the to-be-detected image corresponding to the to-be-detected page and the gradient histogram information corresponding to the negative sample is not limited in the embodiment of the present application. Illustratively, gradient histogram information corresponding to the negative sample can be obtained first, and then an image to be detected corresponding to the page to be detected is obtained, so as to ensure that detection is performed immediately after the image to be detected corresponding to the page to be detected is obtained; the image to be detected corresponding to the page to be detected can be obtained first, and then the gradient histogram information corresponding to the negative sample is obtained, so that the memory occupied by the terminal is reduced.
In step 202, the image to be detected is subjected to region division to obtain a region to be detected, and a gradient histogram corresponding to the region to be detected is determined.
After the image to be detected is obtained, it is divided into regions to be detected, and whether the page to be detected has failed to load is judged by detecting these regions. Compared with directly detecting the whole image to be detected, this refines the detection granularity and improves the accuracy of judging whether the page to be detected has failed to load.
Optionally, the number of times of performing the area division on the image to be detected may be one time, or may also be multiple times, and each time of the area division is performed on the whole image to be detected. It should be noted that one dividing process may refer to an equal dividing process or a non-equal dividing process, and this is not limited in the embodiment of the present application. And when the primary dividing process is an equal dividing process, obtaining the areas to be detected with the same size after the primary dividing process, and when the primary dividing process is a non-equal dividing process, obtaining the areas to be detected with different sizes after the primary dividing process. In the embodiment of the present application, each dividing process is an equal dividing process as an example.
Alternatively, the process of equally dividing the area to be detected may be: and carrying out region division on the image to be detected according to the reference granularity. The reference granularity is used to indicate how to equally divide the image to be detected, and alternatively, the reference granularity may indicate that the length and width of the image to be detected are equally divided into equal parts, respectively, for example, the reference granularity may be 2 × 2, and in this reference granularity, the length and width of the image to be detected are equally divided into 2 equal parts, respectively.
The number and size of the regions to be detected are determined by the reference granularity. There may be one reference granularity or multiple reference granularities. With one reference granularity, the image to be detected is divided into regions once, yielding regions to be detected of a single size whose number is determined by that granularity. For example, with a reference granularity of 2 × 2, dividing the length and width of the image to be detected into 2 equal parts each yields 4 regions to be detected, each 1/4 the size of the image to be detected.
With multiple reference granularities, the image to be detected is divided into regions once per granularity, and each division yields regions to be detected of one size. Taking two granularities as an example, a first granularity and a second granularity, the region division process may be: dividing the image to be detected into regions according to the first granularity and according to the second granularity respectively, obtaining regions to be detected of a first size and regions to be detected of a second size. Optionally, the first granularity may be coarser and the second granularity finer, for example a first granularity of 2 × 2 and a second granularity of 4 × 4; in that case the first size is larger than the second size. Optionally, a region to be detected of the first size may be composed of at least two regions to be detected of the second size; for example, with a first granularity of 2 × 2 and a second granularity of 4 × 4, each region to be detected of the first size is composed of 4 regions to be detected of the second size.
When the reference granularity is multiple granularities, whether the page to be detected fails to be loaded can be judged through multi-granularity detection. Optionally, the multiple granularities may be arranged in sequence from coarse to fine, the coarse granularity corresponds to a large-sized region to be detected, and the fine granularity corresponds to a small-sized region to be detected. After the regions are divided according to the multiple granularities which are sequentially arranged from coarse to fine, the regions to be detected with the sizes from large to small can be obtained. It should be noted that, since the entire image to be detected is divided into regions according to each granularity, the region to be detected of each size constitutes the entire image to be detected.
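The following sketch illustrates this multi-granularity region division; the helper name divide and the truncation of image dimensions that are not exact multiples of the granularity are assumptions.

```python
def divide(image, granularity):
    """Equally divide an image (H x W array) into granularity x granularity regions.

    Regions are returned keyed by their (row, column) position; dimensions that
    are not exact multiples of the granularity are truncated (an assumption).
    """
    h, w = image.shape[:2]
    rh, rw = h // granularity, w // granularity
    return {
        (i, j): image[i * rh:(i + 1) * rh, j * rw:(j + 1) * rw]
        for i in range(granularity)
        for j in range(granularity)
    }

# e.g. a first granularity of 2 x 2 (first, larger size) and
# a second granularity of 4 x 4 (second, smaller size):
# first_size_regions = divide(image_to_detect, 2)
# second_size_regions = divide(image_to_detect, 4)
```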
After the regions to be detected are obtained, determining a gradient histogram corresponding to the regions to be detected, wherein each region to be detected corresponds to a gradient histogram. Optionally, the manner of determining the gradient histogram corresponding to the region to be detected includes, but is not limited to, the following two ways:
the first method is as follows: and extracting a gradient histogram corresponding to the image to be detected, and taking the gradient histogram at the same position as the region to be detected in the gradient histogram corresponding to the image to be detected as the gradient histogram corresponding to the region to be detected.
The second method comprises the following steps: and extracting the gradient histograms corresponding to the unit areas in the image to be detected, and connecting the gradient histograms of the unit areas forming the area to be detected to obtain the gradient histogram corresponding to the area to be detected.
The gradient histogram corresponding to the image to be detected in the first mode can be obtained based on the gradient histogram corresponding to the unit region in the image to be detected in the second mode. The process of extracting the gradient histogram corresponding to the unit region in the image to be detected and the implementation process of obtaining the gradient histogram corresponding to the image to be detected based on the gradient histogram corresponding to the unit region in the image to be detected may refer to the process of obtaining the gradient histogram information corresponding to the negative sample by the server in step 201, and are not described herein again.
For example, the image to be detected and the corresponding gradient histogram of the image to be detected can be as shown in fig. 3. Fig. 3 (1) shows an image to be detected, and fig. 3 (2) shows a gradient histogram corresponding to the image to be detected shown in fig. 3 (1). As can be seen from (2) in fig. 3, the gradients in the rectangular frame 301 and the rectangular frame 302 are smaller, and it can be determined that the loading of the to-be-detected region corresponding to the rectangular frame 301 and the rectangular frame 302 fails in the process of detecting the to-be-detected region based on the gradient histogram.
Optionally, for the condition that the image to be detected is divided into regions according to the first granularity and the second granularity respectively to obtain the region to be detected of the first size and the region to be detected of the second size, the region to be detected includes the region to be detected of the first size and the region to be detected of the second size. At this time, the process of determining the gradient histogram corresponding to the region to be detected is as follows: and determining a gradient histogram corresponding to the region to be detected with the first size and a gradient histogram corresponding to the region to be detected with the second size.
In step 203, the gradient histogram corresponding to the region to be detected is matched with a candidate gradient histogram, where the candidate gradient histogram is the gradient histogram corresponding to the candidate region, and the candidate region is a region in the negative sample that satisfies the matching condition, where the region to be detected is located at a position corresponding to the region to be detected.
After the gradient histogram corresponding to the to-be-detected region is obtained, the to-be-detected region is detected based on the gradient histogram corresponding to the to-be-detected region, and whether the loading of the to-be-detected page fails is judged according to the detection result of the to-be-detected region.
To detect a region to be detected based on its gradient histogram, that gradient histogram is matched with the candidate gradient histogram, i.e. the gradient histogram corresponding to the candidate region at the position corresponding to the region to be detected in a negative sample satisfying the matching condition. It should be noted that there are multiple regions to be detected, and in step 203 each region to be detected is detected based on its own gradient histogram.
When the regions to be detected have only one size, they are detected in turn until the gradient histogram corresponding to some region matches successfully, and step 204 is then executed. When there are regions to be detected of several sizes, the large regions are detected first; if the gradient histogram corresponding to any large region matches successfully, step 204 is executed, and if matching fails for all large regions, the regions of the next smaller size are detected, and this process is repeated until the gradient histogram corresponding to some region matches successfully, at which point step 204 is executed. It should be noted that once the gradient histogram corresponding to some region matches successfully, the detection of the remaining undetected regions may either be stopped or be completed, which is not limited in the embodiment of the present application.
Whatever the size of the region to be detected, detecting it means matching its gradient histogram with the candidate gradient histogram, i.e. the gradient histogram corresponding to the region at the corresponding position in a negative sample satisfying the matching condition. The detection of a single region to be detected is described next as an example.
Each negative sample has a region corresponding to the region to be detected. When the size of the negative sample is the same as that of the image to be detected, the region corresponding to the region to be detected may be a region at the same position as the region to be detected; when the size of the negative sample is different from that of the image to be detected, the region corresponding to the region to be detected may be a region whose relative position in the negative sample is the same as that of the region to be detected in the image to be detected. And taking the region at the corresponding position with the region to be detected as a candidate region.
Before detecting the region to be detected, the negative sample meeting the matching condition needs to be determined first, so as to further determine the gradient histogram corresponding to the candidate region for matching according to the negative sample meeting the matching condition.
There may be multiple negative samples, and some of them may not be suitable for detecting a given region to be detected, so the negative samples suitable for detecting the region need to be determined first. Optionally, a negative sample suitable for detecting the region to be detected has features including, but not limited to: the region at the position corresponding to the region to be detected failed to load; and its zoom level is the same as that of the page to be detected.
After the negative sample suitable for detecting the region to be detected is screened out, the negative sample meeting the matching condition can be determined according to the negative sample suitable for detecting the region to be detected. Optionally, according to a negative sample suitable for detecting the region to be detected, the manner of determining the negative sample satisfying the matching condition may be: and taking all negative samples which are suitable for detecting the area to be detected as the negative samples meeting the matching condition. Optionally, each negative sample may be preset with a priority, and according to the negative sample suitable for detecting the area to be detected, the manner of determining the negative sample satisfying the matching condition may further be: and taking one or more negative samples with the top priority suitable for detecting the area to be detected as the negative samples meeting the matching condition.
After the negative sample meeting the matching condition is determined, the gradient histogram corresponding to the candidate region in the negative sample meeting the matching condition can be used as a candidate gradient histogram, and then the gradient histogram corresponding to the region to be detected is matched with the candidate gradient histogram. It should be noted that, for any region to be detected, there may be one or more negative samples satisfying the matching condition. And when a plurality of negative samples meeting the matching condition exist, each candidate region in each negative sample meeting the matching condition corresponds to one candidate gradient histogram, and the gradient histograms corresponding to the regions to be detected are respectively matched with each candidate gradient histogram.
The process of determining the negative samples satisfying the matching condition is described by taking one to-be-detected region as an example in the above process, it should be noted that the negative samples satisfying the matching condition corresponding to different to-be-detected regions may be the same or different, and this is not limited in the embodiment of the present application.
For any region to be detected, in the process of matching with the candidate gradient histogram, the similarity between the gradient histogram corresponding to the region to be detected and the candidate gradient histogram can be calculated, and then whether the gradient histogram corresponding to the region to be detected is successfully matched or not is judged according to the similarity. Alternatively, the way to calculate the similarity between two gradient histograms may be: the similarity between the two gradient histograms is calculated based on the euclidean distance.
For the condition that a plurality of negative samples meet the matching condition, a plurality of similarities can be obtained in the process of matching the gradient histogram corresponding to the region to be detected with each candidate gradient histogram, then the average similarity of the similarities can be calculated, and whether the gradient histogram corresponding to the region to be detected is successfully matched or not is judged according to the average similarity.
Optionally, the process of determining whether the gradient histogram corresponding to the region to be detected is successfully matched according to the similarity includes: responding to the similarity exceeding a similarity threshold, and determining that the gradient histogram corresponding to the to-be-detected region is successfully matched; and determining that the gradient histogram corresponding to the region to be detected fails to be matched in response to the similarity not exceeding the similarity threshold. The similarity threshold may be a preset fixed threshold, and the matching process of all the regions to be detected is compared with the fixed threshold. The similarity threshold can also be set according to negative samples, and different negative samples correspond to different similarity thresholds. The way of setting the similarity threshold according to the negative sample may be: and determining the similarity threshold corresponding to the negative sample according to the noise condition of the negative sample.
For the situation that different negative samples respectively correspond to the similarity threshold, when the number of the negative samples meeting the matching condition is one, comparing the calculated similarity with the similarity threshold corresponding to the negative sample meeting the matching condition; when the number of the negative samples satisfying the matching condition is multiple, an average similarity threshold of similarity thresholds corresponding to the multiple negative samples satisfying the matching condition may be calculated, and then the calculated average similarity may be compared with the average similarity threshold.
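A Python/NumPy sketch of this matching step. Mapping the Euclidean distance to a similarity via 1 / (1 + d) is an assumed choice, since the text only says the similarity is calculated based on the Euclidean distance; the helper names are hypothetical.

```python
import numpy as np

def histogram_similarity(hist_a, hist_b):
    """Similarity between two gradient histograms based on the Euclidean distance;
    converting the distance into (0, 1] via 1 / (1 + d) is an assumed choice."""
    d = np.linalg.norm(np.asarray(hist_a, dtype=float) - np.asarray(hist_b, dtype=float))
    return 1.0 / (1.0 + d)

def region_matches(region_hist, candidate_hists, thresholds):
    """Match one region to be detected against the candidate gradient histograms
    of the negative samples satisfying the matching condition.

    candidate_hists and thresholds are parallel lists, one entry per negative
    sample; with several negative samples the average similarity is compared
    with the average similarity threshold, as described above.
    """
    sims = [histogram_similarity(region_hist, c) for c in candidate_hists]
    return float(np.mean(sims)) > float(np.mean(thresholds))
```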
The above process describes the process of detecting a region to be detected. It should be noted that the process of detecting all the regions to be detected can be implemented according to the above process, and details are not described here.
Optionally, for a case that the region to be detected includes a region to be detected of a first size and a region to be detected of a second size, when the first size is larger than the second size, the candidate gradient histogram includes a first candidate gradient histogram and a second candidate gradient histogram. The process of matching the gradient histogram corresponding to the region to be detected with the candidate gradient histogram may include: matching the gradient histogram corresponding to the to-be-detected region with the first size with the first candidate gradient histogram; and in response to the failure of matching of the gradient histograms corresponding to the to-be-detected region with the first size, matching the gradient histogram corresponding to the to-be-detected region with the second candidate gradient histogram.
The first candidate gradient histogram is the candidate gradient histogram used for matching against the gradient histogram corresponding to a region to be detected of the first size, and the second candidate gradient histogram is the candidate gradient histogram used for matching against the gradient histogram corresponding to a region to be detected of the second size. It should be noted that each region to be detected of the first size corresponds to one first candidate gradient histogram, and each region to be detected of the second size corresponds to one second candidate gradient histogram. The large-size regions are detected first, and the small-size regions are detected only when the gradient histograms of the large-size regions all fail to match; this realizes a multi-granularity detection process and improves detection efficiency.
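The coarse-to-fine matching order described above could be sketched as follows; the pairing of each region with its matching negative samples and the use of the region_match_succeeds helper sketched earlier are assumptions of this illustration:

```python
def detect_with_two_granularities(first_size_regions, second_size_regions):
    """Each argument is a list of (region_hist, matching_negatives) pairs,
    where matching_negatives has the form expected by region_match_succeeds.

    Returns True when some region histogram is successfully matched, which in
    this scheme means a loading failure was detected.
    """
    for region_hist, negatives in first_size_regions:
        if region_match_succeeds(region_hist, negatives):
            return True
    # Only when every first-size (larger) region fails to match does the
    # detection fall back to the second-size (smaller) regions.
    for region_hist, negatives in second_size_regions:
        if region_match_succeeds(region_hist, negatives):
            return True
    return False
```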
The detection result of the region to be detected comprises the following two types:
1. The gradient histogram corresponding to some region to be detected is successfully matched.
It should be noted that any region to be detected here may be a region to be detected of any size. Because the gradient histogram of the region to be detected is matched against the gradient histogram of the corresponding candidate region in the negative sample satisfying the matching condition, and that candidate region failed to load, when the gradient histogram corresponding to any region to be detected is successfully matched, it indicates that the corresponding region in the image to be detected has failed to load, and at this time step 204 is executed.
2. The gradient histograms corresponding to all regions to be detected fail to match.
When the gradient histograms corresponding to all the regions to be detected fail to match, it can be considered that, at the gradient-histogram-based region detection level, none of the regions to be detected in the image to be detected has failed to load. At this time, it may be determined that the page to be detected has not failed to load at the gradient-histogram-based region detection level. It should be noted that, when the region to be detected includes regions of various sizes, the failure of matching here means that the gradient histograms corresponding to the regions to be detected of all sizes fail to match.
Optionally, in a case that all gradient histograms corresponding to the regions to be detected fail to match, the following three processing procedures are included, but not limited to:
The first processing process: in response to the failure of matching of all gradient histograms corresponding to the regions to be detected, determining that the page to be detected is successfully loaded.
In this processing process, when it is determined at the gradient-histogram-based region detection level that the page to be detected has not failed to load, the page to be detected is considered to be successfully loaded. In this processing process, the whole page detection process for judging whether the page to be detected fails to load is completed by region-based detection; compared with the mode of directly detecting the whole image to be detected in the related art, the detection granularity is finer.
The second processing process: in response to the failure of matching of all gradient histograms corresponding to the regions to be detected, performing edge detection on the image to be detected; and determining that the page to be detected fails to be loaded in response to the result of the edge detection indicating that the target shape does not exist in the image to be detected.
When it is determined at the gradient-histogram-based region detection level that the page to be detected has not failed to load, the image to be detected can be further examined at the shape detection level, so as to improve the accuracy of judging whether the page to be detected fails to load. The process of examining the image to be detected at the shape detection level includes: performing edge detection on the image to be detected, and judging whether the target shape exists in the image to be detected according to the result of the edge detection.
Edge detection is a method for analyzing an image in image processing and computer vision. Its purpose is to identify points in the image whose brightness changes sharply, and the result of edge detection is often presented as a contour map. The method used to perform edge detection on the image to be detected may be the Sobel edge detection method, the Laplacian edge detection method, or the Canny edge detection method. Different edge detection methods may produce different results, but all of them can fulfill the edge detection function, and the embodiment of the application does not limit which edge detection method is used for performing edge detection on the image to be detected.
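Any standard edge detector can be used here; purely as a sketch, the Canny detector from OpenCV might be applied to the captured page image like this (the hysteresis threshold values are arbitrary assumptions):

```python
import cv2

def detect_edges(image_path):
    """Run Canny edge detection on the image to be detected and return the
    binary edge map (white pixels mark detected edges)."""
    image = cv2.imread(image_path)            # BGR screenshot of the page
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)          # thresholds are assumptions
    return edges
```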
After edge detection is performed on the image to be detected, the result of the edge detection is obtained, and the terminal can then analyze the result to judge whether the target shape exists in the image to be detected. The target shape refers to a basic shape that should be included in the page to be detected. The target shape may be set according to the application program or web page corresponding to the page to be detected, which is not limited in the embodiment of the present application. For example, when the application program corresponding to the page to be detected is a map application program, the target shape that should be included in the page to be detected may be set as a straight line. The number of target shapes may be one or more, and when there is only one target shape, that shape may have different levels. For example, for a page to be detected in a map application, the target shape may be set as a straight line, and the straight line may belong to an ordinary-road level or a navigation-route level.
Optionally, the analysis of the result of the edge detection may be: performing Hough transform on the result of the edge detection. The Hough transform is one of the basic methods for identifying geometric shapes from images, and is mainly used for separating geometric shapes with certain identical features (such as straight lines and circles) from an image. The algorithm flow of the Hough transform is roughly as follows: given the result of edge detection and the target shape to be identified, the Hough transform maps the edge feature points from the image space into the parameter space for voting, and a set of points conforming to the specific shape is obtained by detecting the local extreme points of the accumulated votes. If the target shape can be obtained after processing according to the Hough transform, the result of the edge detection indicates that the target shape exists in the image to be detected; if the target shape cannot be obtained, the result of the edge detection indicates that the target shape does not exist in the image to be detected.
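For a straight-line target shape, the Hough transform step could be illustrated with OpenCV's probabilistic Hough transform; the vote threshold and length parameters below are assumptions, not values from the disclosure:

```python
import cv2
import numpy as np

def target_line_exists(edges):
    """Apply the probabilistic Hough transform to an edge map and report whether
    at least one sufficiently long straight line (the target shape) is found."""
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=60, maxLineGap=5)
    return lines is not None and len(lines) > 0
```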
Optionally, when the target shape includes straight lines of two levels, the Hough transform on the result of the edge detection may proceed as follows: the result of the edge detection is first preprocessed to mask the straight lines of one level, and the Hough transform is then applied to judge whether straight lines of the other level exist. Optionally, the straight lines of different levels may correspond to different colors, and one level can be masked by filtering out the color that matches that level's straight lines from the edge detection result.
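One possible interpretation of this masking step is sketched below: the color range of the level to be suppressed is located in the original screenshot with cv2.inRange and the corresponding pixels are removed from the edge map before the Hough transform. The BGR range shown is purely hypothetical:

```python
import cv2
import numpy as np

def mask_level_by_color(image, edges, lower_bgr, upper_bgr):
    """Suppress edge pixels whose color in the original screenshot falls inside
    the BGR range of the level to be masked, and return the filtered edge map.

    image:  the original BGR screenshot of the page
    edges:  the binary edge map produced by edge detection
    lower_bgr / upper_bgr: BGR bounds of the level's color (hypothetical values).
    """
    level_mask = cv2.inRange(image, lower_bgr, upper_bgr)  # 255 where that color appears
    keep = cv2.bitwise_not(level_mask)                     # keep everything else
    return cv2.bitwise_and(edges, edges, mask=keep)

# Hypothetical usage, masking a light-gray ordinary-road level:
# filtered = mask_level_by_color(image, edges,
#                                np.array([200, 200, 200], dtype=np.uint8),
#                                np.array([255, 255, 255], dtype=np.uint8))
```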
For example, when the page to be detected is a map page in a map application, the target shape corresponding to the page to be detected may include straight lines of the ordinary-road level and straight lines of the navigation-route level. In the process of shape detection on the image to be detected corresponding to the map page, the images shown in fig. 4 can be obtained. In fig. 4, (1) shows the image obtained after masking the straight lines of the ordinary-road level in the edge detection result; after Hough transform is performed on the image shown in (1) of fig. 4, the image shown in (2) of fig. 4 is obtained, from which it can be seen that straight lines of the navigation-route level exist in the image to be detected.
After analyzing and processing the edge detection result, two results can be obtained:
Result one: the result of the edge detection indicates that the target shape does not exist in the image to be detected.
Under the first result, the target shape that should exist in the image to be detected is missing, and at this time it can be determined that the page to be detected has failed to load. That is, in response to the result of the edge detection indicating that the target shape does not exist in the image to be detected, it is determined that the page to be detected fails to be loaded.
Result two: the result of the edge detection indicates that the target shape exists in the image to be detected.
Under the second result, the target shape that should exist in the image to be detected is not missing. At this time, it can be considered that no loading failure has been found at either the gradient-histogram-based region detection level or the shape detection level. The page to be detected may be directly considered successfully loaded, or detection at other levels may be further performed, so as to judge more finely according to those results whether the page to be detected fails to load. The detection process of other levels is not limited in the embodiment of the application; for example, the detection of other levels may refer to layer-level detection.
Optionally, the detection process of the layer level may be: responding to the result of the edge detection to indicate that a target shape exists in the image to be detected, and performing binarization operation on the image to be detected according to the color value of at least one standard layer corresponding to the page to be detected; and determining that the page to be detected fails to be loaded according to the fact that the image obtained after the binarization operation is carried out on the image to be detected according to the color value of any standard layer does not meet the color condition.
Before the layer-level detection is performed, the color value of at least one standard layer corresponding to the page to be detected needs to be obtained. The page to be detected is composed of at least one standard layer, and different standard layers are used for displaying different overlays. For example, when the page to be detected is a map page, the standard layers corresponding to the map page may include one or more of a Marker layer, a Polyline layer, a hand-drawn layer, and a Text layer.
Each standard layer corresponds to a color value. It should be noted that the number of the standard layers corresponding to the page to be detected and the color value of each standard layer may be determined according to the application program or the web page corresponding to the page to be detected, and the page to be detected from different application programs or web pages may correspond to different standard layers, which is not limited in this embodiment of the application.
Optionally, the color value of the at least one standard layer corresponding to the page to be detected may be already stored locally in the terminal when the application program or the webpage corresponding to the page to be detected is installed, and at this time, the terminal may directly extract the color value of the at least one standard layer corresponding to the page to be detected locally. Optionally, the color value of the at least one standard layer corresponding to the page to be detected may also be stored in the server, and the terminal pulls the color value of the at least one standard layer corresponding to the page to be detected from the server.
After the color value of at least one standard layer corresponding to the page to be detected is obtained, binarization operation is carried out on the image to be detected according to the color value of at least one standard layer corresponding to the page to be detected. It should be noted that, because the color values of different standard layers are different, in the process of performing binarization operation on the image to be detected according to the color value of at least one standard layer corresponding to the page to be detected, binarization operation is performed on the image to be detected once according to the color value of each standard layer.
After the image to be detected is subjected to one binarization operation according to the color value of each standard layer, an image containing only white and, at most, the color corresponding to the color value of that standard layer is obtained. After each such binarization operation, whether the resulting image satisfies the color condition is detected. Optionally, satisfying the color condition may mean that the image contains the color corresponding to the color value of the standard layer. Because the color values of different standard layers are different, the color condition that the image obtained after each binarization operation needs to satisfy is also different.
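A possible sketch of the per-layer binarization and color-condition check described above, using cv2.inRange; the color tolerance and the way the condition is tested (any non-zero pixel) are assumptions of this illustration:

```python
import cv2
import numpy as np

def layer_color_present(image, layer_color_bgr, tolerance=10):
    """Binarize the image to be detected against one standard layer's color value
    and report whether the color condition is met (the layer's color appears).

    image:           BGR screenshot of the page to be detected
    layer_color_bgr: the standard layer's color value, e.g. (0, 0, 255) for red
    tolerance:       per-channel tolerance around the color value (assumption).
    """
    color = np.array(layer_color_bgr, dtype=np.int32)
    lower = np.clip(color - tolerance, 0, 255).astype(np.uint8)
    upper = np.clip(color + tolerance, 0, 255).astype(np.uint8)
    binary = cv2.inRange(image, lower, upper)  # white where the layer's color appears
    return cv2.countNonZero(binary) > 0

def all_layers_present(image, layer_colors):
    """Return True only when every standard layer's color condition is satisfied."""
    return all(layer_color_present(image, c) for c in layer_colors)
```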
After binarization operation is performed on an image to be detected according to the color value of the standard image layer, two results can be obtained:
First: the image obtained after performing the binarization operation on the image to be detected according to the color value of some standard layer does not satisfy the color condition.
Under this result, the overlay corresponding to that standard layer can be considered absent from the image to be detected. Exemplarily, when the standard layer is a text annotation layer whose color value corresponds to red, and the image obtained after binarizing the image to be detected according to that color value contains only white, it indicates that the text annotation overlay corresponding to the text annotation layer is absent from the image to be detected. For example, as shown in fig. 5, (1) in fig. 5 represents the image to be detected; after the binarization operation is performed on the image shown in (1) of fig. 5 according to the color value of the text annotation layer, the image shown in (2) of fig. 5 is obtained. Since the image shown in (2) of fig. 5 contains only white, it can be determined from it that the text annotation overlay corresponding to the text annotation layer is missing from the image to be detected.
And determining that the page to be detected fails to be loaded according to the fact that the image obtained after the binarization operation is carried out on the image to be detected according to the color value of any standard layer does not meet the color condition.
Second: the images obtained after performing the binarization operation on the image to be detected according to the color values of the standard layers all satisfy the color condition.
Under this result, the image to be detected can be considered to include the overlay corresponding to each standard layer, and it can be considered that no loading failure has been found at the gradient-histogram-based region detection level, the shape detection level, or the layer detection level. At this time, it may be directly determined that the page to be detected is successfully loaded; that is, in response to the images obtained after the binarization operations performed according to the color values of the standard layers corresponding to the page to be detected all satisfying the color condition, it is determined that the page to be detected is successfully loaded. Of course, detection at other levels may be further performed, which is not limited in the embodiments of the present application.
When it is determined at the gradient-histogram-based region detection level that the page to be detected has not failed to load, shape-level detection is further performed; and when the shape detection level still determines that the page to be detected has not failed to load, layer-level detection is further performed. The detection processes of the three levels can effectively improve the accuracy of judging whether the page to be detected fails to load.
The third processing process: in response to the failure of matching of all gradient histograms corresponding to the regions to be detected, performing a binarization operation on the image to be detected according to the color value of at least one standard layer corresponding to the page to be detected; and determining that the page to be detected fails to be loaded in response to the image obtained after the binarization operation performed according to the color value of any standard layer not satisfying the color condition.
When it is determined that the page to be detected does not fail to be loaded on the partitioned detection level based on the gradient histogram, the image to be detected can be further detected on the layer detection level, so as to improve the accuracy of judging whether the page to be detected fails to be loaded. The process of detecting the image to be detected on the layer detection level is as follows: and performing binarization operation on the layer to be detected according to the color value of at least one standard layer corresponding to the page to be detected, and judging whether the page to be detected fails to be loaded according to whether the image obtained after the binarization operation meets the color condition.
The results of detecting the image to be detected at the layer detection level include two types:
A. The image obtained after performing the binarization operation on the image to be detected according to the color value of some standard layer does not satisfy the color condition.
Under this result, the overlay corresponding to that standard layer can be considered absent from the image to be detected, and it is determined that the page to be detected fails to be loaded in response to the image obtained after the binarization operation performed according to the color value of any standard layer not satisfying the color condition.
B. The images obtained after performing the binarization operation on the image to be detected according to the color values of the standard layers all satisfy the color condition.
Under this result, the image to be detected can be considered to include the overlay corresponding to each standard layer, and it can be considered that no loading failure has been found at either the gradient-histogram-based region detection level or the layer detection level. At this time, it can be directly determined that the page to be detected is successfully loaded, and detection at other levels can be further performed, so as to judge more finely whether the page to be detected fails to load according to the detection results of the other levels. The detection process of the other levels is not limited in the embodiment of the application. Illustratively, the detection of other levels may refer to shape-level detection.
Optionally, the detection process of the shape level may be: performing edge detection on the image to be detected in response to that the image obtained after performing binarization operation on the image to be detected according to the color value of the standard image layer meets the color condition; and determining that the page to be detected fails to be loaded in response to the result of the edge detection indicating that the target shape does not exist in the image to be detected. The implementation process of this process may refer to process two, which is not described herein again.
When it is determined at the gradient-histogram-based region detection level that the page to be detected has not failed to load, layer-level detection is further performed; and when the layer detection level still determines that the page to be detected has not failed to load, shape-level detection is further performed. These three levels of detection can also effectively improve the accuracy of judging whether the page to be detected fails to load.
In step 204, in response to the successful matching of the gradient histogram corresponding to any one of the regions to be detected, it is determined that the page to be detected fails to be loaded.
When the gradient histogram corresponding to any region to be detected is successfully matched, the similarity between the gradient histogram of that region and the candidate gradient histogram corresponding to the candidate region in the negative sample satisfying the matching condition is high. Because the candidate region of that negative sample failed to load, the corresponding region in the image to be detected is considered to have failed to load; and because a region to be detected has failed to load, the page to be detected is considered to have failed to load.
Optionally, for the condition that the to-be-detected region includes a to-be-detected region of a first size and a to-be-detected region of a second size, determining that the to-be-detected page fails to be loaded in response to a successful matching of the gradient histogram corresponding to any one of the to-be-detected regions, where the following two conditions are included:
case 1: and determining that the page to be detected fails to be loaded in response to the successful matching of the gradient histogram corresponding to the area to be detected with any first size.
In case 1, matching of the gradient histogram corresponding to the to-be-detected region of the first size is successful, which indicates that the to-be-detected region of the first size fails to be loaded, and the page to be detected fails to be loaded can be determined without detecting the to-be-detected region of the second size.
Case 2: and responding to the matching failure of the gradient histograms corresponding to the regions to be detected in the first size, and determining the loading failure of the page to be detected if the matching of the gradient histograms corresponding to the regions to be detected in any second size is successful.
When the gradient histograms corresponding to all the regions to be detected of the first size fail to match, the regions to be detected of the second size are detected. If the gradient histogram corresponding to any region to be detected of the second size is successfully matched, it indicates that that region of the second size has failed to load, and at this moment it is determined that the page to be detected fails to be loaded.
When the region to be detected includes regions to be detected of multiple sizes, the detection process of the regions to be detected of the largest size is executed first; when all the gradient histograms corresponding to the regions of the largest size fail to match, the detection process of the regions of the next-largest size is executed; and so on, until, when all the gradient histograms corresponding to the regions of the second-smallest size fail to match, the detection process of the regions of the smallest size is executed. When the region to be detected includes regions to be detected of N sizes (N being an integer not less than 1), successful matching of the gradient histogram corresponding to any region to be detected covers N cases:
case 1: and matching the gradient histograms corresponding to the regions to be detected with any maximum size successfully.
Case 2: matching of the gradient histograms corresponding to the regions to be detected with the largest size fails, and matching of the gradient histograms corresponding to the regions to be detected with any size succeeds.
Case N: matching of the gradient histograms corresponding to the regions to be detected with the sizes except the minimum size fails, and matching of the gradient histograms corresponding to the regions to be detected with any minimum size succeeds.
And when any one of the N conditions exists, determining that the page to be detected fails to be loaded.
Optionally, after determining that the page to be detected fails to be loaded, the terminal may collect device information and then upload the device information and the image to be detected to the server. The device information is used to indicate the current state of the terminal, and includes, but is not limited to, current device stack information, current device memory information, information on external devices connected to the current device, and current device CPU (Central Processing Unit) information. It should be noted that the determination that the page to be detected fails to be loaded includes, in addition to the case involved in step 204, the cases involved in the second and third processing processes in step 203.
After the device information and the image to be detected are uploaded to the server, the server can analyze the reason for the loading failure of the page to be detected based on them. After the reason is analyzed, the server can perform classification statistics on the uploaded information and can also send early warning information to the terminal, where the early warning information is used to notify the terminal of the reason for the page loading failure. Optionally, when the gradient histogram corresponding to any one region to be detected is successfully matched, the detection of the remaining undetected regions may still be completed; when it is determined that the page to be detected fails to be loaded, the terminal may determine all the regions that failed to load, mark them in the image to be detected, and then upload the marked image together with the device information to the server, so as to facilitate fast classification statistics by the server.
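As an illustration of marking the failed regions before uploading, the rectangles could be drawn with OpenCV and the payload assembled as a plain dictionary; every field name below is hypothetical and not part of the disclosure:

```python
import cv2

def mark_failed_regions(image, failed_regions, color=(0, 0, 255)):
    """Draw a rectangle around each region that failed to load.

    failed_regions: list of (x, y, width, height) tuples in image coordinates."""
    marked = image.copy()
    for x, y, w, h in failed_regions:
        cv2.rectangle(marked, (x, y), (x + w, y + h), color, thickness=2)
    return marked

def build_upload_payload(marked_image_path, device_info):
    """Bundle the marked screenshot and the collected device information;
    the field names are illustrative only."""
    return {
        "image_path": marked_image_path,
        "stack_info": device_info.get("stack"),
        "memory_info": device_info.get("memory"),
        "external_devices": device_info.get("external"),
        "cpu_info": device_info.get("cpu"),
    }
```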
Optionally, after performing classification statistics on the uploaded information, the server may update the negative sample based on the image to be detected corresponding to the uploaded loading failure page, so as to perform more accurate detection subsequently by using a new negative sample.
Optionally, after the server sends the early warning information to the terminal, the terminal may receive the early warning information and display it on a display page. The display page may be the page to be detected, or another page to which the page to be detected jumps, which is not limited in the embodiment of the present application. After the early warning information is displayed, the user of the terminal can learn in time the reason for the page loading failure and then take measures to eliminate the adverse factors causing it.
For example, when the page to be detected is a map page in a map application, the reason for the failure of loading the map page may be: insufficient memory of the terminal, a wrong initialization mode of the map application program, no network access of the terminal, or movement of the map to a non-supported area (such as Africa). Different reasons may correspond to different solutions. For example, when the early warning information indicates that the reason for the page loading failure is insufficient memory of the terminal, the user of the terminal can eliminate it by cleaning up occupied memory; when the reason is a wrong initialization mode of the map application program, the user can eliminate it by restarting the map application program; when the reason is that the terminal has no network access, the user can eliminate it by connecting to a network; and when the reason is that the map has been moved to a non-supported area, the user can eliminate it by moving the map back to a supported area.
Illustratively, the entire page detection process may be as shown in FIG. 6. Starting an application program or a webpage of the terminal, starting a page detection system, and pulling gradient histogram information corresponding to the negative sample from the server by the page detection system; clicking a to-be-detected page entering an application program or a webpage by a user, and completing loading of the to-be-detected page; calling a page detection system to intercept a visible area of a page to be detected as an image to be detected; determining a gradient histogram corresponding to a to-be-detected region divided by an image to be detected; and matching the gradient histogram corresponding to the region to be detected with the candidate gradient histogram corresponding to the candidate region in the negative sample meeting the matching condition. Judging whether the gradient histograms corresponding to the areas to be detected are all failed to be matched; when the gradient histogram corresponding to any region to be detected is successfully matched, determining that the page to be detected fails to be loaded; when the gradient histograms corresponding to the areas to be detected fail to be matched, carrying out edge detection on the images to be detected; and judging whether the target shape exists in the image to be detected.
When the target shape does not exist in the image to be detected, determining that the page to be detected fails to be loaded; when the target shape exists in the image to be detected, carrying out binarization operation on the image to be detected according to the color value of the standard image layer corresponding to the page to be detected; and judging whether the images obtained after the binarization operation all meet the color condition. When any image obtained after binarization operation does not meet the color condition, determining that the page to be detected fails to be loaded; and when the images obtained after the binarization operation all meet the color condition, determining that the page to be detected is loaded successfully. And when the loading failure of the page to be detected is determined, the terminal acquires the equipment information and uploads the equipment information and the image to be detected to the server.
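The overall flow of fig. 6 could be condensed into the following sketch, which chains the hypothetical helpers defined in the earlier sketches; the ordering of the three levels follows the second processing process described above and is an assumption of this illustration:

```python
import cv2

def page_load_failed(image, regions, layer_colors):
    """regions: list of (region_hist, matching_negatives) pairs for the regions
    obtained by dividing the image to be detected.

    Returns True when the page to be detected is judged to have failed to load."""
    # Level 1: region-based detection using gradient histograms.
    for region_hist, negatives in regions:
        if region_match_succeeds(region_hist, negatives):
            return True
    # Level 2: shape detection via edge detection and the Hough transform.
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    if not target_line_exists(edges):
        return True
    # Level 3: layer detection via per-layer binarization.
    return not all_layers_present(image, layer_colors)
```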
In the embodiment of the application, the terminal can automatically judge whether the page to be detected fails to be loaded. The terminal can detect the conditions of local loading failure, shape loading failure and layer loading failure, the detection is more comprehensive, and the accuracy rate of judging whether the page to be detected fails in loading is higher. After the loading failure of the page to be detected is determined, the equipment information and the image to be detected are uploaded to the server for analysis and processing, the problem of the loading failure is solved in time, the normal service of the application program or the webpage is recovered as soon as possible, and the page detection effect is good. In addition, the method provided by the embodiment of the application does not limit the page type, and the application range is wider.
In the embodiment of the application, the image to be detected is divided into the areas to be detected, the gradient histogram corresponding to the areas to be detected is matched with the candidate gradient histogram corresponding to the candidate area in the negative sample, and when the gradient histogram corresponding to any area to be detected is successfully matched, the page to be detected is determined to be failed to be loaded. In the page detection process, each region to be detected is detected respectively, the detection granularity is fine, the condition that the page to be detected fails to be loaded due to the loading failure of a certain region can be identified, in addition, the information related to the gradient histogram is rich, the detection is carried out based on the gradient histogram, the accuracy rate of judging whether the page fails to be loaded can be improved, and the page detection effect is good.
Referring to fig. 7, an embodiment of the present application provides a page detection apparatus, including:
an obtaining module 701, configured to obtain an image to be detected corresponding to a page to be detected and gradient histogram information corresponding to a negative sample, where the negative sample is obtained based on image training corresponding to a page with failed loading, and the gradient histogram information corresponding to the negative sample is used to indicate gradient histograms corresponding to regions in different positions in the negative sample;
a dividing module 702, configured to perform region division on an image to be detected to obtain a region to be detected;
a first determining module 703, configured to determine a gradient histogram corresponding to a to-be-detected region;
a matching module 704, configured to match a gradient histogram corresponding to the to-be-detected region with a candidate gradient histogram, where the candidate gradient histogram is the gradient histogram corresponding to the candidate region, and the candidate region is a region, in the negative sample that meets the matching condition, that is in a position corresponding to the to-be-detected region;
the second determining module 705 is configured to determine that the page to be detected fails to be loaded in response to a successful gradient histogram matching corresponding to any one of the regions to be detected.
Optionally, the dividing module 702 is further configured to perform region division on the image to be detected according to a first granularity and a second granularity, so as to obtain a region to be detected of a first size and a region to be detected of a second size;
the first determining module 703 is further configured to determine a gradient histogram corresponding to the to-be-detected region with the first size and a gradient histogram corresponding to the to-be-detected region with the second size.
Optionally, the first size is larger than the second size, the candidate gradient histograms include a first candidate gradient histogram and a second candidate gradient histogram, and the matching module 704 is further configured to match the gradient histogram corresponding to the to-be-detected region of the first size with the first candidate gradient histogram; and in response to the failure of matching of the gradient histograms corresponding to the to-be-detected region with the first size, matching the gradient histogram corresponding to the to-be-detected region with the second candidate gradient histogram.
Optionally, the second determining module 705 is further configured to determine that the page to be detected fails to be loaded in response to a successful matching of the gradient histogram corresponding to the region to be detected of any first size; or, determining that the page to be detected fails to be loaded in response to the fact that the gradient histograms corresponding to the regions to be detected of the first size fail to be matched and the gradient histograms corresponding to the regions to be detected of any second size fail to be matched.
Optionally, referring to fig. 8, the apparatus further comprises:
the edge detection module 706 is configured to perform edge detection on the image to be detected in response to a failure in matching all the gradient histograms corresponding to the region to be detected;
the second determining module 705 is further configured to determine that the page to be detected fails to be loaded in response to that the result of the edge detection indicates that the target shape does not exist in the image to be detected.
Optionally, the obtaining module 701 is further configured to obtain a color value of at least one standard layer corresponding to the page to be detected;
referring to fig. 8, the apparatus further comprises:
an operation module 707, configured to perform binarization operation on the image to be detected according to a color value of at least one standard layer corresponding to the page to be detected in response to an edge detection result indicating that a target shape exists in the image to be detected;
the second determining module 705 is further configured to determine that the page to be detected fails to be loaded in response to that an image obtained after performing binarization operation on the image to be detected according to the color value of any standard layer does not satisfy a color condition.
Optionally, the obtaining module 701 is further configured to obtain a color value of at least one standard layer corresponding to the page to be detected;
the operation module 707 is further configured to, in response to that matching of all gradient histograms corresponding to the to-be-detected region fails, perform binarization operation on the to-be-detected image according to a color value of at least one standard layer corresponding to the to-be-detected page;
the second determining module 705 is further configured to determine that the page to be detected fails to be loaded in response to that an image obtained after performing binarization operation on the image to be detected according to the color value of any standard layer does not satisfy a color condition.
Optionally, referring to fig. 8, the apparatus further comprises:
an acquisition module 708 for acquiring device information;
and an uploading module 709, configured to upload the device information and the image to be detected to a server.
In the embodiment of the application, the image to be detected is divided into the areas to be detected, the gradient histogram corresponding to the areas to be detected is matched with the candidate gradient histogram corresponding to the candidate area in the negative sample, and when the gradient histogram corresponding to any area to be detected is successfully matched, the page to be detected is determined to be failed to be loaded. In the page detection process, each region to be detected is detected respectively, the detection granularity is fine, the condition that the page to be detected fails to be loaded due to the loading failure of a certain region can be identified, in addition, the information related to the gradient histogram is rich, the detection is carried out based on the gradient histogram, the accuracy rate of judging whether the page fails to be loaded can be improved, and the page detection effect is good.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Fig. 9 is a schematic structural diagram of a server according to an embodiment of the present application, where the server may generate a relatively large difference due to different configurations or performances, and may include one or more processors (CPUs) 901 and one or more memories 902, where the one or more memories 902 store at least one program code, and the at least one program code is loaded and executed by the one or more processors 901 to implement the page detection method provided by the foregoing method embodiments. Of course, the server may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface, so as to perform input/output, and the server may also include other components for implementing the functions of the device, which are not described herein again.
Fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal may be: a smartphone, a tablet, a laptop, or a desktop computer. A terminal may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
Generally, a terminal includes: a processor 1001 and a memory 1002.
Processor 1001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1001 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1001 may further include an AI (Artificial Intelligence) processor for processing a computing operation related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1002 is used to store at least one instruction for execution by the processor 1001 to implement the page detection method provided by the method embodiments herein.
In some embodiments, the terminal may further include: a peripheral interface 1003 and at least one peripheral. The processor 1001, memory 1002 and peripheral interface 1003 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 1003 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, touch screen display 1005, camera assembly 1006, audio circuitry 1007, positioning assembly 1008, and power supply 1009.
The peripheral interface 1003 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1001 and the memory 1002. In some embodiments, processor 1001, memory 1002, and peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral interface 1003 may be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The Radio Frequency circuit 1004 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1004 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1004 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1004 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1005 is a touch display screen, the display screen 1005 also has the ability to capture touch signals on or over the surface of the display screen 1005. The touch signal may be input to the processor 1001 as a control signal for processing. At this point, the display screen 1005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display screen 1005 may be one, disposed on a front panel of the terminal; in other embodiments, the display screens 1005 may be at least two, respectively disposed on different surfaces of the terminal or in a folded design; in still other embodiments, the display 1005 may be a flexible display, disposed on a curved surface or a folded surface of the terminal. Even more, the display screen 1005 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display screen 1005 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1006 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1001 for processing or inputting the electric signals to the radio frequency circuit 1004 for realizing voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones can be arranged at different parts of the terminal respectively. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1007 may also include a headphone jack.
The positioning component 1008 is used to locate the current geographic location of the terminal to implement navigation or LBS (Location Based Service). The positioning component 1008 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1009 is used to supply power to each component in the terminal. The power source 1009 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 1009 includes a rechargeable battery, the rechargeable battery may support wired charging or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal also includes one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: acceleration sensor 1011, gyro sensor 1012, pressure sensor 1013, fingerprint sensor 1014, optical sensor 1015, and proximity sensor 1016.
The acceleration sensor 1011 can detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal. For example, the acceleration sensor 1011 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1001 may control the touch display screen 1005 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1011. The acceleration sensor 1011 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1012 may detect a body direction and a rotation angle of the terminal, and the gyro sensor 1012 and the acceleration sensor 1011 may cooperate to collect a 3D motion of the user with respect to the terminal. From the data collected by the gyro sensor 1012, the processor 1001 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1013 may be disposed at a side frame of the terminal and/or at a lower layer of the touch display screen 1005. When the pressure sensor 1013 is disposed on a side frame of the terminal, a user's holding signal of the terminal can be detected, and the processor 1001 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1013. When the pressure sensor 1013 is disposed at a lower layer of the touch display screen 1005, the processor 1001 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 1005. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1014 is used to collect a fingerprint of the user, and the processor 1001 identifies the user according to the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 1001 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1014 may be disposed on the front, back, or side of the terminal. When a physical key or vendor Logo is provided on the terminal, the fingerprint sensor 1014 may be integrated with the physical key or vendor Logo.
The optical sensor 1015 is used to collect the ambient light intensity. In one embodiment, the processor 1001 may control the display brightness of the touch display screen 1005 according to the intensity of the ambient light collected by the optical sensor 1015. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1005 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1005 is turned down. In another embodiment, the processor 1001 may also dynamically adjust the shooting parameters of the camera assembly 1006 according to the intensity of the ambient light collected by the optical sensor 1015.
A proximity sensor 1016, also known as a distance sensor, is typically provided on the front panel of the terminal. The proximity sensor 1016 is used to collect the distance between the user and the front of the terminal. In one embodiment, when the proximity sensor 1016 detects that the distance between the user and the front surface of the terminal gradually decreases, the processor 1001 controls the touch display screen 1005 to switch from a bright screen state to a dark screen state; when the proximity sensor 1016 detects that the distance between the user and the front surface of the terminal gradually becomes larger, the touch display screen 1005 is controlled by the processor 1001 to switch from the breath screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 10 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, a computer device is also provided that includes a processor and a memory having at least one program code stored therein. The at least one program code is loaded and executed by one or more processors to implement any of the above-described page detection methods.
In an exemplary embodiment, there is also provided a computer readable storage medium having at least one program code stored therein, the at least one program code being loaded and executed by a processor of a computer device to implement any of the above-described page detection methods.
Alternatively, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
It is noted that the terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (11)

1. A page detection method, characterized in that the method comprises:
acquiring an image to be detected corresponding to a page to be detected and gradient histogram information corresponding to a negative sample, wherein the negative sample is obtained by training based on an image corresponding to a page that failed to load, and the gradient histogram information corresponding to the negative sample is used for indicating gradient histograms corresponding to regions at different positions in the negative sample;
dividing the image to be detected into regions to obtain a region to be detected, and determining a gradient histogram corresponding to the region to be detected;
matching the gradient histogram corresponding to the region to be detected with a candidate gradient histogram, wherein the candidate gradient histogram is the gradient histogram corresponding to a candidate region, and the candidate region is a region, in a negative sample that meets a matching condition, at a position corresponding to the region to be detected;
and in response to successful matching of the gradient histogram corresponding to any region to be detected, determining that the page to be detected fails to be loaded.
2. The method according to claim 1, wherein the dividing the image to be detected into regions to obtain a region to be detected, and determining the gradient histogram corresponding to the region to be detected comprises:
respectively carrying out region division on the image to be detected according to a first granularity and a second granularity to obtain a region to be detected with a first size and a region to be detected with a second size;
and determining a gradient histogram corresponding to the region to be detected with the first size and a gradient histogram corresponding to the region to be detected with the second size.
3. The method according to claim 2, wherein the first size is larger than the second size, the candidate gradient histograms include a first candidate gradient histogram and a second candidate gradient histogram, and the matching the gradient histogram corresponding to the region to be detected with the candidate gradient histograms includes:
matching the gradient histogram corresponding to the region to be detected of the first size with the first candidate gradient histogram;
and in response to failure of matching of the gradient histograms corresponding to the regions to be detected of the first size, matching the gradient histogram corresponding to the region to be detected of the second size with the second candidate gradient histogram.
4. The method according to claim 3, wherein the determining that the page to be detected fails to be loaded in response to successful matching of the gradient histogram corresponding to any region to be detected comprises:
in response to successful matching of the gradient histogram corresponding to any region to be detected of the first size, determining that the page to be detected fails to be loaded; or,
and in response to failure of matching of the gradient histograms corresponding to the regions to be detected of the first size and successful matching of the gradient histogram corresponding to any region to be detected of the second size, determining that the page to be detected fails to be loaded.
5. The method according to any one of claims 1-4, further comprising:
in response to failure of matching of the gradient histograms corresponding to the regions to be detected, performing edge detection on the image to be detected;
and in response to the result of the edge detection indicating that a target shape does not exist in the image to be detected, determining that the page to be detected fails to be loaded.
6. The method of claim 5, further comprising:
obtaining a color value of at least one standard layer corresponding to the page to be detected;
in response to the result of the edge detection indicating that a target shape exists in the image to be detected, performing a binarization operation on the image to be detected according to the color value of the at least one standard layer corresponding to the page to be detected;
and in response to an image, obtained after the binarization operation is performed on the image to be detected according to the color value of any standard layer, failing to meet a color condition, determining that the page to be detected fails to be loaded.
7. The method according to any one of claims 1-4, further comprising:
obtaining a color value of at least one standard layer corresponding to the page to be detected;
in response to failure of matching of the gradient histograms corresponding to the regions to be detected, performing a binarization operation on the image to be detected according to the color value of the at least one standard layer corresponding to the page to be detected;
and in response to an image, obtained after the binarization operation is performed on the image to be detected according to the color value of any standard layer, failing to meet a color condition, determining that the page to be detected fails to be loaded.
8. The method according to claim 1, wherein after determining that the page to be detected fails to be loaded, the method further comprises:
collecting equipment information;
and uploading the equipment information and the image to be detected to a server.
9. A page detection apparatus, characterized in that the apparatus comprises:
the acquiring module is used for acquiring an image to be detected corresponding to a page to be detected and gradient histogram information corresponding to a negative sample, wherein the negative sample is obtained by training based on an image corresponding to a page that failed to load, and the gradient histogram information corresponding to the negative sample is used for indicating gradient histograms corresponding to regions at different positions in the negative sample;
the dividing module is used for carrying out region division on the image to be detected to obtain a region to be detected;
the first determining module is used for determining a gradient histogram corresponding to the to-be-detected region;
the matching module is used for matching the gradient histogram corresponding to the region to be detected with a candidate gradient histogram, wherein the candidate gradient histogram is the gradient histogram corresponding to a candidate region, and the candidate region is a region, in a negative sample that meets a matching condition, at a position corresponding to the region to be detected;
and the second determining module is used for determining, in response to successful matching of the gradient histogram corresponding to any region to be detected, that the page to be detected fails to be loaded.
10. A computer device comprising a processor and a memory, the memory having stored therein at least one program code, the at least one program code being loaded and executed by the processor to implement a page detection method as claimed in any one of claims 1 to 8.
11. A computer-readable storage medium having stored therein at least one program code, the at least one program code being loaded and executed by a processor, to implement the page detection method according to any one of claims 1 to 8.
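For illustration, the matching flow recited in claims 1 to 4 above can be read as follows: divide the screenshot of the page to be detected into regions at a coarse granularity and a fine granularity, compute a gradient-orientation histogram for each region, and compare it with the candidate histogram stored for the same position in a negative sample; a sufficiently close match at either granularity indicates that the page failed to load. The Python sketch below is only an illustration under that reading and is not the patented implementation; the 9-bin histogram, the correlation threshold of 0.9, the tile sizes, and the helper names are assumptions.

import numpy as np

def orientation_histogram(region: np.ndarray, bins: int = 9) -> np.ndarray:
    """Gradient-orientation histogram of a grayscale region, weighted by gradient magnitude."""
    gy, gx = np.gradient(region.astype(np.float32))
    magnitude = np.hypot(gx, gy)
    angle = np.mod(np.degrees(np.arctan2(gy, gx)), 180.0)  # unsigned orientation in [0, 180)
    hist, _ = np.histogram(angle, bins=bins, range=(0.0, 180.0), weights=magnitude)
    total = hist.sum()
    return hist / total if total > 0 else hist

def split_regions(image: np.ndarray, size: int):
    """Yield (grid_row, grid_col, tile) for non-overlapping size x size tiles."""
    height, width = image.shape
    for r in range(0, height - size + 1, size):
        for c in range(0, width - size + 1, size):
            yield r // size, c // size, image[r:r + size, c:c + size]

def histograms_match(h1: np.ndarray, h2: np.ndarray, threshold: float = 0.9) -> bool:
    """Two histograms are treated as matching when their correlation is high."""
    if h1.std() == 0 or h2.std() == 0:
        return bool(np.allclose(h1, h2))
    return float(np.corrcoef(h1, h2)[0, 1]) >= threshold

def page_load_failed(gray_image: np.ndarray,
                     coarse_candidates: dict, fine_candidates: dict,
                     coarse_size: int = 128, fine_size: int = 64) -> bool:
    """Coarse-to-fine matching against per-position histograms of a negative sample.

    coarse_candidates / fine_candidates map (grid_row, grid_col) positions to the
    candidate histograms extracted from the negative sample at the same positions.
    """
    for row, col, tile in split_regions(gray_image, coarse_size):
        candidate = coarse_candidates.get((row, col))
        if candidate is not None and histograms_match(orientation_histogram(tile), candidate):
            return True   # any coarse region matching the negative sample means load failure
    for row, col, tile in split_regions(gray_image, fine_size):
        candidate = fine_candidates.get((row, col))
        if candidate is not None and histograms_match(orientation_histogram(tile), candidate):
            return True   # finer regions are only consulted when no coarse region matched
    return False

In this sketch the fine-grained pass runs only when no coarse region matches, mirroring the fallback order described in claim 3.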
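Claims 5 to 7 add a fallback for the case where no histogram matches: run edge detection and check whether an expected target shape is present in the screenshot, and binarize the screenshot against the color value of each standard layer to test a color condition. The OpenCV-based Python sketch below is again only an illustration; the Canny thresholds, the four-sided-contour test used as the target shape, the color tolerance, and the reading of the color condition as "a single layer color must not fill nearly the whole image" are assumptions rather than details taken from the patent.

import cv2
import numpy as np

def has_target_shape(image_bgr: np.ndarray, min_area: float = 1000.0) -> bool:
    """Edge-detect the screenshot and look for a reasonably large four-sided contour."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x signature
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue
        approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
        if len(approx) == 4:
            return True
    return False

def layer_color_dominates(image_bgr: np.ndarray, layer_bgr, tolerance: int = 10,
                          ratio_threshold: float = 0.95) -> bool:
    """Binarize against one standard-layer color and test whether that color fills the image."""
    color = np.array(layer_bgr, dtype=np.int32)
    lower = np.clip(color - tolerance, 0, 255).astype(np.uint8)
    upper = np.clip(color + tolerance, 0, 255).astype(np.uint8)
    mask = cv2.inRange(image_bgr, lower, upper)
    return (mask > 0).mean() >= ratio_threshold

def fallback_load_failed(image_bgr: np.ndarray, standard_layer_colors) -> bool:
    """Fallback decision when no region matched a negative-sample histogram."""
    if not has_target_shape(image_bgr):
        return True   # expected shape missing -> page treated as failed to load
    # shape present: a screen almost entirely covered by one standard-layer color still fails
    return any(layer_color_dominates(image_bgr, color) for color in standard_layer_colors)

A typical call would pass the screenshot in BGR order together with the configured standard-layer colors, for example fallback_load_failed(screenshot, [(255, 255, 255), (244, 244, 244)]), where the color values are illustrative placeholders.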
CN202010392403.0A 2020-05-11 2020-05-11 Page detection method, device, equipment and storage medium Active CN111582184B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010392403.0A CN111582184B (en) 2020-05-11 2020-05-11 Page detection method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010392403.0A CN111582184B (en) 2020-05-11 2020-05-11 Page detection method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111582184A true CN111582184A (en) 2020-08-25
CN111582184B CN111582184B (en) 2024-02-20

Family

ID=72118757

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010392403.0A Active CN111582184B (en) 2020-05-11 2020-05-11 Page detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111582184B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120257788A1 (en) * 2011-04-08 2012-10-11 Creatures Inc. Computer-readable storage medium having information processing program stored therein, information processing method, information processing apparatus, and information processing system
CN103049751A (en) * 2013-01-24 2013-04-17 苏州大学 Improved weighting region matching high-altitude video pedestrian recognizing method
US20150154659A1 (en) * 2013-12-03 2015-06-04 Yahoo! Inc. System and method for displaying transitional mobile ads during network page download latency time
WO2018033155A1 (en) * 2016-08-19 2018-02-22 北京市商汤科技开发有限公司 Video image processing method, apparatus and electronic device
US20180373804A1 (en) * 2017-06-23 2018-12-27 Guangzhou Shenma Mobile Information Technology Co. Ltd. Method and device for loading information stream page
CN110163287A (en) * 2019-05-24 2019-08-23 三亚中科遥感研究所 A kind of mesoscale eddy detection method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
老实和尚: "A method of protecting an application: emulating the Windows PE loader to load a DLL from a memory resource" *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113077437A (en) * 2021-03-31 2021-07-06 上海晨兴希姆通电子科技有限公司 Workpiece quality detection method and system
CN113077437B (en) * 2021-03-31 2023-07-25 上海晨兴希姆通电子科技有限公司 Workpiece quality detection method and system
CN114185624A (en) * 2021-12-30 2022-03-15 深圳前海微众银行股份有限公司 Report loading update detection method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN111582184B (en) 2024-02-20

Similar Documents

Publication Publication Date Title
CN110070056B (en) Image processing method, image processing apparatus, storage medium, and device
CN109410220B (en) Image segmentation method and device, computer equipment and storage medium
CN110059685B (en) Character area detection method, device and storage medium
CN109815150B (en) Application testing method and device, electronic equipment and storage medium
CN110490179B (en) License plate recognition method and device and storage medium
CN110650379B (en) Video abstract generation method and device, electronic equipment and storage medium
CN110807361A (en) Human body recognition method and device, computer equipment and storage medium
CN112749613B (en) Video data processing method, device, computer equipment and storage medium
CN110839128B (en) Photographing behavior detection method and device and storage medium
CN110490186B (en) License plate recognition method and device and storage medium
CN110570460A (en) Target tracking method and device, computer equipment and computer readable storage medium
CN113378705B (en) Lane line detection method, device, equipment and storage medium
CN111027490A (en) Face attribute recognition method and device and storage medium
CN110647881A (en) Method, device, equipment and storage medium for determining card type corresponding to image
CN113627413A (en) Data labeling method, image comparison method and device
CN111582184B (en) Page detection method, device, equipment and storage medium
CN111586279B (en) Method, device and equipment for determining shooting state and storage medium
CN110728167A (en) Text detection method and device and computer readable storage medium
CN111931712A (en) Face recognition method and device, snapshot machine and system
CN110163192B (en) Character recognition method, device and readable medium
CN112818979A (en) Text recognition method, device, equipment and storage medium
CN112053360A (en) Image segmentation method and device, computer equipment and storage medium
CN113343709B (en) Method for training intention recognition model, method, device and equipment for intention recognition
CN114118236A (en) Method and device for training intelligent model
CN110728275B (en) License plate recognition method, license plate recognition device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant