WO2022170554A1 - Image display method, terminal, chip and storage medium - Google Patents

Image display method, terminal, chip and storage medium

Info

Publication number
WO2022170554A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
quadrilateral
group
stable
terminal
Prior art date
Application number
PCT/CN2021/076494
Other languages
English (en)
Chinese (zh)
Inventor
顾磊
Original Assignee
Oppo广东移动通信有限公司
Priority date
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Priority to PCT/CN2021/076494 priority Critical patent/WO2022170554A1/fr
Priority to CN202180084568.4A priority patent/CN116686281A/zh
Publication of WO2022170554A1 publication Critical patent/WO2022170554A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/04Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa

Definitions

  • the present invention relates to the technical field of image processing, and in particular, to an image display method, a terminal and a storage medium.
  • the related art proposes a document image scanning technology based on photos, which can realize automatic identification of information by scanning photos.
  • the scanning technology relies on the quadrilateral detection method of the image.
  • Before using scanning for information identification, the terminal needs to use the detection method to find the quadrilateral frame containing the target object in the captured image, and then preview, in real time, the currently captured image together with the found quadrilateral frame, so as to further acquire information about the target object within the quadrilateral frame.
  • the embodiments of the present application provide an image display method, a terminal, a chip and a storage medium, which solve the problem of unstable display of a quadrangle frame in a preview picture, and overcome the defect that the preview picture is not displayed smoothly.
  • an embodiment of the present application provides an image display method, the method includes:
  • an embodiment of the present application provides a terminal, the terminal includes: an acquisition part, a detection part, a clustering part, a selection part, a determination part, and a display part,
  • the acquisition part is configured to acquire the i-th frame preview image corresponding to the target object
  • the detection part is configured to perform frame detection processing on the i-th frame preview image to obtain the i-th quadrilateral frame corresponding to the target object; wherein, the i is an integer greater than 0;
  • the clustering part is configured to perform similarity clustering processing based on the first quadrilateral frame corresponding to the target object to the i-th quadrilateral frame to obtain at least one frame group;
  • the selection part is configured to select a target frame group from the at least one frame group
  • the determining part is configured to determine an initial stable frame from the target frame group; and determine an i-th stable frame based on the initial stable frame and the (i-1)th stable frame;
  • the display part is configured to perform display processing on the i-th frame preview image according to the i-th stable frame.
  • an embodiment of the present application provides a terminal, where the terminal includes: a quadrilateral detection module, a timing stabilization module, a denoising stabilization module, and a preview module,
  • the quadrilateral detection module is configured to obtain the i-th frame preview image corresponding to the target object; and perform frame detection processing on the i-th frame preview image to obtain the i-th quadrilateral frame corresponding to the target object;
  • the i is an integer greater than 0;
  • the timing stabilization module is configured to perform similarity clustering processing based on the first quadrilateral frame corresponding to the target object to the i-th quadrilateral frame to obtain at least one frame group; select a target frame group from the at least one frame group; and determine an initial stable frame from the target frame group;
  • the denoising stabilization module is configured to determine the i-th stable frame based on the initial stable frame and the (i-1)-th stable frame;
  • the preview module is configured to perform display processing on the i-th frame preview image according to the i-th stable frame.
  • an embodiment of the present application provides a terminal, where the terminal includes a quadrilateral detection module, a timing stabilization module, a denoising stabilization module, a preview module, a processor, and a memory storing executable instructions of the processor.
  • an embodiment of the present application provides a chip, wherein the chip includes a processor and an interface, the processor obtains program instructions through the interface, and the processor is used to execute the program instructions, to perform the image display method as described above.
  • an embodiment of the present application provides a computer-readable storage medium on which a program is stored and applied in a terminal.
  • the program is executed by a processor, the above-described image display method is implemented.
  • the embodiments of the present application provide an image display method, a terminal, a chip, and a storage medium.
  • the terminal acquires the i-th frame preview image corresponding to the target object, and performs frame detection processing on the i-th frame preview image to obtain the i-th quadrilateral frame corresponding to the target object.
  • i is an integer greater than 0; similarity clustering processing is performed based on the first quadrilateral frame corresponding to the target object to the i-th quadrilateral frame to obtain at least one frame group; a target frame group is selected from the at least one frame group, and an initial stable frame is determined from the target frame group; the i-th stable frame is determined based on the initial stable frame and the (i-1)-th stable frame; and the i-th frame preview image is displayed according to the i-th stable frame. That is to say, in the embodiment of the present application, after performing frame detection processing on the current preview image containing the target object to obtain a quadrilateral frame corresponding to the target object, the terminal may first perform clustering based on frame similarity on the quadrilateral frame.
  • In this way, the terminal no longer directly performs the image preview based on the quadrilateral frame obtained by frame detection, but performs similarity clustering on the detected quadrilateral frames, selection of the target frame group, determination of the initial stable frame, and other processing, which solves the problem of unstable display of the quadrilateral frame in the preview picture and overcomes the defect that the picture is not displayed smoothly.
  • FIG. 1 is a schematic diagram 1 of an implementation flow of an image display method proposed by an embodiment of the present application
  • FIG. 2 is a second implementation flowchart of the image display method proposed by the embodiment of the present application.
  • FIG. 3 is a schematic diagram of a curve of a frame group smoothing filtering proposed by an embodiment of the present application
  • FIG. 4 is a schematic diagram 3 of the implementation flow of the image display method proposed by the embodiment of the present application.
  • FIG. 5 is a schematic diagram of a scene of initial stable frame smoothing filtering proposed by an embodiment of the present application.
  • FIG. 6 is a fourth schematic diagram of the implementation flow of the image display method proposed by the embodiment of the present application.
  • FIG. 7 is a schematic diagram five of the implementation flow of the image display method proposed by the embodiment of the present application.
  • FIG. 8 is a sixth schematic diagram of the implementation flow of the image display method proposed by the embodiment of the present application.
  • FIG. 9 is a seventh schematic diagram of the implementation flow of the image display method proposed by the embodiment of the present application.
  • FIG. 10 is a schematic diagram eight of the implementation flow of the image display method proposed by the embodiment of the present application.
  • FIG. 11 is a schematic diagram 9 of the implementation flow of the image display method proposed by the embodiment of the present application.
  • FIG. 12 is a schematic diagram ten of the implementation flow of the image display method proposed by the embodiment of the present application.
  • FIG. 13A is a schematic diagram 1 of a scene for determining a target stable frame according to an embodiment of the present application
  • FIG. 13B is a second schematic diagram of a scene for determining a target stable frame proposed by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of an execution flow of image processing proposed by an embodiment of the present application.
  • FIG. 15 is a schematic diagram 1 of the composition structure of a terminal proposed by an embodiment of the present application.
  • FIG. 16 is a second schematic diagram of the composition and structure of a terminal according to an embodiment of the present application.
  • FIG. 17 is a third schematic diagram of the composition and structure of a terminal according to an embodiment of the present application.
  • the scanning technology relies on the quadrilateral detection method of the image.
  • the terminal can use the detection method to first find the quadrilateral frame containing the target object in the captured image, and then preview the currently captured image and the found quadrilateral frame in real time, so as to finally obtain information about the target object in the quadrilateral frame.
  • the related art adopts a direct time-series filtering method, such as Kalman filtering or mean filtering, to reduce the negative influence of the unstable quadrilateral display.
  • the embodiments of the present application provide an image display method, a terminal, a chip and a storage medium. Specifically, after performing frame detection processing on the current preview image containing the target object to obtain a quadrilateral frame corresponding to the target object, the terminal may first perform clustering processing based on frame similarity on the quadrilateral frame, select a target frame group from the obtained at least one frame group, further determine an initial stable frame from the target frame group, and then determine the current stable frame based on the comparison between the initial stable frame and the historical stable frame, so that the current preview image will be displayed according to the current stable frame.
  • In this way, the terminal no longer directly performs the image preview based on the quadrilateral frame obtained by frame detection, but performs similarity clustering on the detected quadrilateral frames, selection of the target frame group, determination of the initial stable frame, and other processing, which solves the problem of unstable display of the quadrilateral frame in the preview picture and overcomes the defect that the picture is not displayed smoothly.
  • FIG. 1 is a schematic diagram 1 of the implementation flow of the image display method proposed by the embodiment of the present application.
  • the method for performing image processing by a terminal may include the following steps:
  • Step 101 Obtain the ith frame preview image corresponding to the target object, and perform frame detection processing on the ith frame preview image to obtain the ith quadrilateral frame corresponding to the target object; wherein, i is an integer greater than 0.
  • the terminal may acquire a preview image including the target object in real time, that is, the i-th frame preview image, and perform frame detection processing on the i-th frame preview image to obtain the real-time quadrilateral frame corresponding to the target object, That is, the i-th quadrilateral border.
  • the terminal may be any electronic device with a text scanning function.
  • the terminal may have a camera, and image frames are collected through the camera.
  • the terminal may be, but is not limited to, an electronic device such as a smart phone, a tablet computer, a personal computer (Personal Computer, PC), or a notebook computer.
  • the ith frame preview image refers to a frame preview image of the document picture collected by the terminal at the ith moment when the terminal captures the document picture through the camera.
  • the target object refers to the target object specified in the preview image, such as a rectangular object whose frame is a rectangle.
  • a document picture may include documents, paper, business cards, photos, whiteboards, screens, etc.
  • the target objects may be various rectangular objects in the document picture, such as person photos, ID cards, passports, driver's licenses, tickets, business cards, and work cards.
  • the terminal may perform frame detection processing, such as quadrilateral detection, on the preview image to acquire the quadrilateral frame corresponding to the target object.
  • the terminal may determine the contour of the rectangular object by using the method of feature line detection, that is, the quadrilateral frame.
  • the terminal may also establish a quadrilateral detection model based on deep learning, and after obtaining the preview image in real time, input the preview image into the pre-trained model to perform quadrilateral detection processing on the image frame to be detected, and then output the quadrilateral frame.
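  • As a non-limiting illustration only, the contour-based detection mentioned above could be sketched as follows with OpenCV; the function name find_quadrilateral, the Canny thresholds, and the minimum area are assumptions made for this sketch and are not part of the disclosed method.

        # Minimal sketch of contour-based quadrilateral detection (illustrative helper, not the patented detector).
        import cv2

        def find_quadrilateral(preview_bgr, min_area=1000.0):
            # Return a (4, 2) array of vertex coordinates for the largest 4-sided contour, or None.
            gray = cv2.cvtColor(preview_bgr, cv2.COLOR_BGR2GRAY)
            edges = cv2.Canny(gray, 50, 150)
            contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x signature
            best, best_area = None, min_area
            for contour in contours:
                approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
                area = cv2.contourArea(approx)
                if len(approx) == 4 and area > best_area:
                    best, best_area = approx.reshape(4, 2), area
            return best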
  • the terminal may further perform the clustering processing based on frame similarity on the quadrilateral frame.
  • Step 102 Perform similarity clustering processing based on the first quadrilateral frame corresponding to the target object to the i-th quadrilateral frame to obtain at least one frame group.
  • the terminal may further perform similarity clustering processing based on the first quadrilateral frame to the i-th quadrilateral frame, and then obtain at least one frame group.
  • the clustering process is unsupervised machine learning to group similar objects into groups.
  • the terminal may perform clustering on the quadrilateral borders based on the similarity of the borders, and classify the quadrilateral borders with high similarity into one category.
  • the terminal may obtain vertex coordinate data corresponding to the quadrilateral frame, and perform similarity calculation based on the vertex coordinate data, so as to realize the classification of the quadrilateral frame based on the similarity result.
  • the terminal may obtain vertex coordinate data corresponding to the first quadrilateral frame to the i-th quadrilateral frame, and then perform similarity clustering processing based on the vertex coordinate data to construct at least one frame group.
  • the terminal may further carry out the selection process of the target frame group based on the at least one frame group.
  • Step 103 Select a target frame group from at least one frame group, and determine an initial stable frame from the target frame group.
  • the terminal may first select one frame group from the at least one frame group as the target frame group (step 103a), and then determine one frame in the target frame group as the initial stable frame (step 103b).
  • the number of quadrilateral frame samples in each frame group may be different.
  • the terminal may select, from the at least one frame group, a frame group whose quadrilateral frame samples are relatively stable as the target frame group.
  • FIG. 2 is a second implementation flowchart of the image display method proposed by the embodiment of the application.
  • the method for the terminal to select a target frame group from at least one frame group includes the following steps:
  • Step 103a1 Acquire the number of quadrilateral frames included in each frame group in at least one frame group.
  • Step 103a2 Determine the frame group corresponding to the maximum number of quadrilateral frames as the target frame group.
  • the terminal may directly determine a frame group in at least one frame group with a maximum number of quadrilateral frame samples as a target frame group.
  • the terminal may first perform a certain smoothing filtering process on each frame group, such as mean filtering, so as to reduce the jump when selecting the target frame group.
  • the terminal can track each frame group in time sequence, perform smoothing filtering on the number of quadrilateral frame samples in each frame group, and then select, from the filtered at least one frame group, the frame group with the largest number of quadrilateral frame samples as the target frame group.
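  • A minimal sketch of this selection step is given below, assuming each frame group keeps its per-time-step sample counts; the moving-average window size is an illustrative choice, not a value taken from the disclosure.

        # Sketch: smooth each group's sample-count history, then pick the group with the largest filtered count.
        def moving_average(counts, window=5):
            smoothed = []
            for t in range(len(counts)):
                lo = max(0, t - window + 1)
                smoothed.append(sum(counts[lo:t + 1]) / (t + 1 - lo))
            return smoothed

        def select_target_group(group_counts):
            # group_counts: dict mapping a group id to its list of sample counts over time.
            filtered = {gid: moving_average(c) for gid, c in group_counts.items()}
            return max(filtered, key=lambda gid: filtered[gid][-1])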
  • FIG. 3 is a schematic diagram of a curve for smoothing filtering of a frame group proposed by an embodiment of the present application.
  • the abscissa of the schematic diagram of the curve indicates different time sequences, and the ordinate indicates the change in the number of samples in the frame group;
  • the thick solid line represents the curve of the number of quadrilateral frame samples in the original frame group 1
  • the thin solid line represents the curve of the number of quadrilateral frame samples in the original frame group 2
  • the thick dashed line represents the curve of the number of quadrilateral frame samples in the filtered frame group 1
  • the thin dotted line represents the curve of the number of quadrilateral frame samples in frame group 2 after filtering.
  • As shown in FIG. 3, in the original (unfiltered) curves, the number of quadrilaterals in the original frame group 2 is greater than the number of quadrilaterals in the original frame group 1 for a period of time, so if the target frame group were selected directly from the raw counts, the selection would jump between the two groups and the target frame group could not be accurately selected. The terminal therefore performs smoothing filtering on the sample counts, after which, within each time period, one frame group's filtered count stays consistently larger. For example, during the period in which the number of quadrilateral samples in the filtered frame group 1 is always greater than that of the filtered frame group 2, the terminal can select frame group 1 as the target frame group; during the time period from t2 to t3, after smoothing filtering, the number of quadrilateral samples in the filtered frame group 2 is always greater than that of the filtered frame group 1, and at this time the terminal can select frame group 2 as the target frame group.
  • the terminal may further determine a frame from the target frame group as an initial stable frame.
  • FIG. 4 is a schematic diagram 3 of the implementation flow of the image display method proposed by the embodiment of the present application.
  • the terminal determines the initial stable frame (step 103b ) from the target frame group.
  • the method includes the following steps:
  • Step 103b1 arranging the quadrilateral frames in the target frame group in a chronological order to obtain a frame list.
  • Step 103b2 Determine the last quadrilateral frame in the frame list as the initial stable frame.
  • the terminal can arrange all quadrilateral frames in the target frame group in chronological order, from earliest to latest, to obtain a frame time-series list. Further, the terminal may determine the last quadrilateral frame in the list as the initial stable frame, that is, the quadrilateral frame corresponding to the latest preview image in the target frame group is determined as the initial stable frame.
  • the terminal may also perform smoothing filtering, such as mean filtering or Kalman filtering, on all quadrilateral frames in the target frame group, where the filtered objects are the vertex coordinate data or the center point coordinate data of the quadrilateral frames, so as to obtain the initial stable frame.
  • FIG. 5 is a schematic diagram of the initial stable frame smoothing filtering proposed by the embodiment of the present application. It is assumed that the target frame group includes quadrilateral frame A, quadrilateral frame B, and quadrilateral frame C. As shown in FIG. 5, although frames A, B, and C belong to one frame group and are similar frames, there are actually gaps between the three frames, and their vertex coordinate data and center point coordinate data differ. Therefore, the terminal can perform mean filtering on the three frames in chronological order to obtain a more stable quadrilateral frame D, and frame D is determined as the initial stable frame.
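  • The mean-filtering alternative described above could be sketched as follows; the data layout (a chronological list of (4, 2) vertex arrays) is an assumption made for illustration.

        # Sketch: derive an initial stable frame D by averaging the vertices of the frames (e.g., A, B, C) in the target group.
        import numpy as np

        def mean_filter_frames(group_frames):
            stacked = np.stack(group_frames, axis=0)   # shape (n, 4, 2), chronological order
            return stacked.mean(axis=0)                # element-wise mean of vertex coordinates -> stable frame D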
  • the terminal may further perform the determination process of the target stable frame.
  • Step 104 Determine the i-th stable frame based on the initial stable frame and the (i-1)-th stable frame.
  • the terminal may further determine, based on the initial stable frame and the historical reference stable frame, namely the (i-1)-th stable frame, the quadrilateral frame for final preview output, that is, the i-th stable frame.
  • the (i-1)th stable frame refers to the stable frame of the final output preview of the previous preview image.
  • After the terminal completes the processing procedures corresponding to each frame of preview image, such as similarity clustering processing, target frame group selection, initial stable frame determination, and stable frame determination, the terminal stores the stable frame information corresponding to the current frame preview image and uses it as the historical reference stable frame when the stable frame of the next frame preview image is determined.
  • the terminal in order to reduce the jitter of the quadrilateral frame when the preview interface is displayed, the terminal will not directly determine the currently obtained initial stable frame as the i-th stable frame corresponding to the current i-th frame preview image, Instead, the similarity is compared between the currently obtained initial stable frame and the pre-stored historical (i-1) stable frame, and then the i-th stable frame for final output preview is determined based on the comparison result.
  • the terminal may further perform display processing on the i-th preview image according to the stable frame.
  • Step 105 Perform display processing on the ith preview image according to the ith stable frame.
  • the terminal may further perform display processing on the i-th preview image according to the stable frame.
  • the terminal may perform rendering processing on the ith preview image based on the ith stable frame, obtain a rendered preview image, and then display the rendered preview image.
  • the terminal renders the i-th stable frame in the i-th preview image to obtain a rendered stable frame, and then generates a rendered preview image based on the rendered stable frame and the i-th preview image, so as to display the rendered preview image on the preview screen.
  • the terminal may perform real-time scanning processing on the rendered preview image, so as to obtain specific parameters of the target object. Specifically, the terminal may only perform real-time scanning processing on the target object in the target stable frame, so as to perform automatic information identification.
  • An embodiment of the present application provides an image display method. After performing frame detection processing on a current preview image containing a target object and obtaining a quadrilateral frame corresponding to the target object, the terminal may first perform clustering based on frame similarity on the quadrilateral frame. process, and select a target frame group from the obtained at least one frame group, and further determine an initial stable frame from the target frame group, and then further determine the current stable frame based on the comparison between the initial stable frame and the historical stable frame, so that The current preview image will be displayed according to the current stable frame.
  • In this way, the terminal no longer directly performs the image preview based on the quadrilateral frame obtained by frame detection, but performs similarity clustering on the detected quadrilateral frames, selection of the target frame group, determination of the initial stable frame, and other processing, which solves the problem of unstable display of the quadrilateral frame in the preview picture and overcomes the defect that the picture is not displayed smoothly.
  • FIG. 6 is a fourth schematic diagram of the implementation flow of the image display method proposed by the embodiment of the present application.
  • After the terminal performs frame detection processing on the i-th frame preview image and obtains the i-th quadrilateral frame corresponding to the target object, that is, after step 101, and before the terminal performs similarity clustering processing based on the first quadrilateral frame corresponding to the target object to the i-th quadrilateral frame to obtain at least one frame group, that is, before step 102, the method for the terminal to perform image processing further includes:
  • Step 106 Store the i-th quadrilateral frame in the N-th position of a first-in-first-out queue (First Input First Output, FIFO); wherein, N is an integer greater than 2, and N represents the maximum storage capacity of the FIFO.
  • the terminal performs quadrilateral detection processing on each frame of preview image, and after obtaining each quadrilateral frame, it first stores the quadrilateral frame corresponding to the current preview image at the tail of the FIFO queue, that is, the last position of the queue.
  • the capacity of the FIFO queue is determined by its maximum storage number, that is, how many quadrilateral frames can be stored in the FIFO queue at most. If it is assumed that the current FIFO queue has N positions, the maximum storage number of the FIFO queue is N.
  • the FIFO follows the "first-in, first-out" principle.
  • the terminal always stores the i-th quadrilateral frame obtained by detecting the current i-th frame preview image to the tail of the FIFO queue, that is, the N-th position.
  • the historical quadrilateral frames obtained from historical preview image detection are shifted forward in sequence in the FIFO queue: the historical quadrilateral frame originally located at the first position of the FIFO queue is moved out of the queue, and the (i-1)-th historical quadrilateral frame is shifted to the (N-1)-th position.
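  • The FIFO behaviour described here can be sketched with a bounded deque; collections.deque with maxlen discards the oldest entry automatically, which matches the shift-out of the frame at the first position. This is a sketch under that assumption, not the claimed implementation; the value of N is illustrative.

        # Sketch: a bounded FIFO of quadrilateral frames; appending the i-th frame at the tail (N-th position)
        # evicts the oldest frame once N entries are stored.
        from collections import deque

        N = 30                              # assumed maximum storage number of the FIFO (illustrative)
        frame_fifo = deque(maxlen=N)

        def push_quadrilateral(quad):
            frame_fifo.append(quad)         # the frame at the first position is dropped automatically when full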
  • FIG. 7 is a schematic diagram 5 of the implementation flow of the image display method proposed in the embodiment of the present application.
  • After the terminal stores the i-th quadrilateral frame at the N-th position of the FIFO, that is, after step 106, if i is less than N, the terminal performs similarity clustering processing based on the first quadrilateral frame corresponding to the target object to the i-th quadrilateral frame to obtain at least one frame group.
  • the method includes the following steps:
  • Step 102a Read the first quadrilateral frame to the i-th quadrilateral frame from the FIFO.
  • Step 102b Perform similarity clustering processing based on the first quadrilateral frame to the i-th quadrilateral frame to obtain at least one frame group.
  • the number of quadrilateral borders stored in the FIFO is associated with the maximum storage number N of the FIFO queue.
  • the FIFO queue contains the first to the i-th quadrilateral frames, that is, the FIFO queue space is large enough that no quadrilateral frame has been removed from it.
  • the terminal may read the first to i-th quadrilateral frames corresponding to the preview images from the first frame to the i-th frame from the FIFO queue, and perform clustering processing based on frame similarity based on the i quadrilateral frames.
  • FIG. 8 is a schematic diagram 6 of the implementation process of the image display method proposed by the embodiment of the present application.
  • the method by which the terminal performs similarity clustering processing based on the first quadrilateral frame to the i-th quadrilateral frame to obtain at least one frame group includes:
  • Step 102b1 Obtain the k-th vertex coordinate data corresponding to the k-th quadrilateral frame, and the first (k-1) vertex coordinate data corresponding to the first (k-1) quadrilateral frames; wherein, k is an integer greater than 1 and less than or equal to i.
  • Step 102b2 Calculate the first (k-1) distance differences corresponding to the kth vertex coordinate data and the previous (k-1) vertex coordinate data according to the preset similarity function.
  • Step 102b3 Determine the minimum distance difference from the previous (k-1) distance differences.
  • Step 102b4 Construct at least one border group based on the minimum distance difference and the first historical border group corresponding to the first (k-1) quadrilateral borders.
  • the terminal first performs clustering processing on the first quadrilateral frame among the first to the i-th quadrilateral frames in the FIFO queue; since before the first quadrilateral frame there is no quadrilateral frame sample that has completed clustering, that is, there is no frame group yet, the terminal can first create a new frame group for the first quadrilateral frame.
  • When performing clustering processing on the second quadrilateral frame in the FIFO queue, that is, when k is equal to 2, the terminal can first compare the similarity between the second quadrilateral frame and the clustered first quadrilateral frame, and then determine the frame group to which the second quadrilateral frame belongs based on the comparison result.
  • the terminal may obtain the vertex coordinate data of the second quadrilateral frame and the vertex coordinate data of the first quadrilateral frame respectively, and then calculate the distance difference that can characterize the similarity based on the preset similarity function and the two vertex coordinate data.
  • the terminal may calculate the distance difference based on formula (1) to formula (3) to determine the similarity of the quadrilateral frame.
  • the quadrilateral information Q is the coordinate positions of the four vertices of each quadrilateral frame.
  • the preset similarity function is to obtain the distance difference between the two quadrilaterals.
  • p specifies the L_p spatial norm. Commonly, when p is 1 the distance is the Manhattan distance, when p is 2 it is the Euclidean distance, and when p is ∞ the maximum absolute value is taken.
  • M(Q) is a mapping function of quadrilateral information, which is used to map the original quadrilateral information Q to the space of distance calculation.
  • k_i(Q) is a specific mapping function; for example, the center point position, the area, and other quantities of the quadrilateral can be computed as mapping items.
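  • The formulas themselves are not reproduced in this text. Based on the surrounding definitions, one plausible reconstruction of formulas (1) to (3) is given below in LaTeX; the exact forms and numbering in the original publication may differ.

        % Plausible reconstruction (assumed forms; numbering may differ from the original publication)
        Q = \{(x_1, y_1), (x_2, y_2), (x_3, y_3), (x_4, y_4)\}            % (1) quadrilateral information: the four vertex positions
        M(Q) = \big(k_1(Q), k_2(Q), \dots, k_n(Q)\big)                    % (2) mapping of Q into the distance-calculation space
        D(Q_a, Q_b) = \lVert M(Q_a) - M(Q_b) \rVert_p
                    = \Big( \sum_{j=1}^{n} \lvert k_j(Q_a) - k_j(Q_b) \rvert^{p} \Big)^{1/p}   % (3) distance difference under the L_p norm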
  • the terminal can first, based on formula (1) and formula (3) in the preset similarity function, as well as the vertex coordinate data of the first quadrilateral frame and the vertex coordinate data of the second quadrilateral frame, map the two quadrilateral frames to the distance space respectively to obtain the corresponding distances of the two quadrilateral frames, and then calculate the distance difference based on formula (3) to determine the similarity comparison result between the first quadrilateral frame and the second quadrilateral frame.
  • the terminal may preset a preset distance threshold that can characterize the similarity result, and the terminal may compare the above-mentioned distance difference with the preset distance threshold, and then determine the first quadrilateral frame and the second quadrilateral frame based on the comparison result. similarity results.
  • If the distance difference is less than the preset distance threshold, the terminal can determine that the first quadrilateral frame is similar to the second quadrilateral frame, and the terminal classifies the second quadrilateral frame into the frame group to which the first quadrilateral frame belongs. If the distance difference is greater than or equal to the preset distance threshold, the terminal can determine that the first quadrilateral frame is not similar to the second quadrilateral frame; the terminal then creates a new frame group and classifies the second quadrilateral frame into the new frame group.
  • the terminal can calculate the (k-1) distance differences between the k-th quadrilateral frame and the first (k-1) quadrilateral frames, determine the minimum distance difference from the (k-1) distance differences, and then construct at least one frame group based on the minimum distance difference and the frame groups corresponding to the first (k-1) quadrilateral frames.
  • If the minimum distance difference is greater than or equal to the preset distance threshold, the terminal may create a new frame group corresponding to the k-th quadrilateral frame, and construct at least one frame group based on the newly added frame group and the first historical frame group.
  • If the minimum distance difference is less than the preset distance threshold, the terminal can classify the k-th quadrilateral frame into the frame group, among the frame groups corresponding to the first (k-1) quadrilateral frames, that corresponds to the minimum distance difference, and construct at least one frame group based on the frame groups after the number of quadrilateral frame samples is updated.
  • For example, for the third quadrilateral frame in the FIFO queue, the terminal uses formula (1) to formula (3) to calculate the distance differences between the third quadrilateral frame and the first and second quadrilateral frames. If the distance difference between the third quadrilateral frame and the first quadrilateral frame is less than the preset distance threshold, and the distance difference between the third quadrilateral frame and the second quadrilateral frame is greater than the preset distance threshold, the terminal may classify the third quadrilateral frame into the frame group to which the first quadrilateral frame belongs; if the distance difference from the first quadrilateral frame is less than the preset distance threshold, and the distance difference from the second quadrilateral frame is also less than the preset distance threshold, but the former is smaller, the terminal classifies the third quadrilateral frame into the frame group to which the first quadrilateral frame, with the smaller distance difference, belongs; if both distance differences are greater than the preset distance threshold, the terminal creates a new frame group and classifies the third quadrilateral frame into the new frame group.
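  • A compact sketch of this grouping rule follows, assuming distance() implements the preset similarity function above and threshold is the preset distance threshold; both names are placeholders introduced here for illustration.

        # Sketch: assign a quadrilateral frame to the group containing its nearest neighbour, or open a new group.
        def assign_to_group(quad, groups, distance, threshold):
            # groups: list of lists of quadrilateral frames that have already been clustered.
            if not groups:
                groups.append([quad])                   # first frame: create the first frame group
                return groups
            diffs = [(distance(quad, member), gi)
                     for gi, group in enumerate(groups) for member in group]
            min_diff, gi = min(diffs)                   # minimum distance difference and its group
            if min_diff < threshold:
                groups[gi].append(quad)                 # similar enough: join that frame group
            else:
                groups.append([quad])                   # otherwise start a new frame group
            return groups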
  • FIG. 9 is a seventh schematic diagram of the implementation flow of the image display method proposed by the embodiment of the present application.
  • After the terminal stores the i-th quadrilateral frame at the N-th position of the FIFO, that is, after step 106, if i is greater than or equal to N, the method by which the terminal performs similarity clustering processing based on the first quadrilateral frame corresponding to the target object to the i-th quadrilateral frame to obtain at least one frame group may also include the following steps:
  • Step 102c Read the (i-N+1)th quadrilateral frame to the i-th quadrilateral frame from the FIFO.
  • Step 102d Perform similarity clustering processing based on the (i-N+1)th quadrilateral frame to the ith quadrilateral frame to obtain at least one frame group.
  • At this time, the FIFO queue contains the (i-N+1)-th to the i-th quadrilateral frames, that is, there is not enough space in the FIFO queue and the first (i-N) quadrilateral frames have been removed from the queue.
  • the terminal can read, from the FIFO queue, the (i-N+1)-th to the i-th quadrilateral frames corresponding to the (i-N+1)-th to the i-th frame preview images, and perform clustering processing based on frame similarity.
  • FIG. 10 is a schematic diagram 8 of the implementation process of the image display method proposed by the embodiment of the present application.
  • the method by which the terminal performs similarity clustering processing based on the (i-N+1)-th quadrilateral frame to the i-th quadrilateral frame to obtain at least one frame group includes:
  • Step 102d1 Obtain the (i-N+k)-th vertex coordinate data corresponding to the (i-N+k)-th quadrilateral frame, and the first (i-N+k-1) vertex coordinate data corresponding to the first (i-N+k-1) quadrilateral frames; wherein, k is an integer greater than 1 and less than or equal to N.
  • Step 102d2 Calculate, according to the preset similarity function, the (i-N+k-1) distance differences corresponding to the (i-N+k)-th vertex coordinate data and the first (i-N+k-1) vertex coordinate data.
  • Step 102d3 Determine the minimum distance difference from the previous (i-N+k-1) distance differences.
  • Step 102d4 Construct at least one border group based on the minimum distance difference and the second historical border group corresponding to the first (i-N+k-1) quadrilateral borders.
  • the terminal always performs clustering processing only on all the quadrilateral frames currently existing in the FIFO sequence, and does not save the clustering results of the quadrilateral frames in the historical FIFO sequence.
  • the terminal first performs clustering processing, among the (i-N+1)-th to the i-th quadrilateral frames in the FIFO queue, on the (i-N+1)-th quadrilateral frame; since before the (i-N+1)-th quadrilateral frame there is no quadrilateral frame sample that has completed clustering, that is, there is no frame group yet, the terminal can first create a new frame group for the (i-N+1)-th quadrilateral frame.
  • When performing clustering processing on the (i-N+2)-th quadrilateral frame in the FIFO queue, that is, when k is equal to 2, the terminal can first compare the similarity between the (i-N+2)-th quadrilateral frame and the clustered (i-N+1)-th quadrilateral frame, and determine the frame group to which the (i-N+2)-th quadrilateral frame belongs based on the comparison result.
  • the terminal can obtain the vertex coordinate data of the (i-N+2)-th quadrilateral frame and the vertex coordinate data of the (i-N+1)-th quadrilateral frame respectively, calculate the distance difference based on formula (1) to formula (3), and compare it with the preset distance threshold that characterizes the similarity result. If the distance difference is less than or equal to the preset distance threshold, the terminal can determine that the (i-N+1)-th quadrilateral frame is similar to the (i-N+2)-th quadrilateral frame, and the terminal classifies the (i-N+2)-th quadrilateral frame into the frame group to which the (i-N+1)-th quadrilateral frame belongs.
  • If the distance difference is greater than the preset distance threshold, the terminal can determine that the (i-N+1)-th quadrilateral frame is not similar to the (i-N+2)-th quadrilateral frame; the terminal then creates a new frame group and classifies the (i-N+2)-th quadrilateral frame into the new frame group.
  • the terminal can calculate the (i-N+k-1) distance differences between the (i-N+k)-th quadrilateral frame and the first (i-N+k-1) quadrilateral frames, determine the minimum distance difference from these (i-N+k-1) distance differences, and then construct at least one frame group based on the minimum distance difference and the frame groups corresponding to the first (i-N+k-1) quadrilateral frames.
  • If the minimum distance difference is greater than or equal to the preset distance threshold, the terminal can establish a new frame group corresponding to the (i-N+k)-th quadrilateral frame, and construct at least one frame group based on the newly added frame group and the second historical frame group.
  • If the minimum distance difference is less than the preset distance threshold, the terminal can classify the (i-N+k)-th quadrilateral frame into the frame group, among the frame groups corresponding to the first (i-N+k-1) quadrilateral frames, that corresponds to the minimum distance difference, and construct at least one frame group based on the frame groups after the number of quadrilateral frame samples is updated.
  • For example, for the (i-N+3)-th quadrilateral frame, the terminal uses formula (1) to formula (3) to calculate, based on the vertex coordinate data, the distance differences between the (i-N+3)-th quadrilateral frame and the (i-N+1)-th and (i-N+2)-th quadrilateral frames respectively. If the distance difference between the (i-N+3)-th quadrilateral frame and the (i-N+1)-th quadrilateral frame is less than the preset distance threshold, and the distance difference from the (i-N+2)-th quadrilateral frame is greater than the preset distance threshold, the terminal can classify the (i-N+3)-th quadrilateral frame into the frame group to which the (i-N+1)-th quadrilateral frame belongs; if the distance difference from the (i-N+1)-th quadrilateral frame is less than the preset distance threshold, and the distance difference from the (i-N+2)-th quadrilateral frame is also less than the preset distance threshold, but the former is smaller, the terminal classifies the (i-N+3)-th quadrilateral frame into the frame group to which the (i-N+1)-th quadrilateral frame, with the smaller distance difference, belongs; if both distance differences are greater than the preset distance threshold, the terminal creates a new frame group and classifies the (i-N+3)-th quadrilateral frame into the new frame group.
  • FIG. 11 is a schematic diagram 9 of the implementation flow of the image display method proposed by the embodiment of the present application.
  • the method by which the terminal performs similarity clustering processing based on the first quadrilateral frame corresponding to the target object to the i-th quadrilateral frame to obtain at least one frame group may further include the following steps:
  • Step 102e Obtain the i-th vertex coordinate data corresponding to the i-th quadrilateral frame, and the first (i-1) vertex coordinate data corresponding to the previous (i-1) quadrilateral frames that have been grouped in history.
  • Step 102f Calculate (i-1) distance differences corresponding to the i-th vertex coordinate data and the previous (i-1) vertex coordinate data according to the preset similarity function.
  • Step 102g Determine the minimum distance difference from the (i-1) distance differences.
  • Step 102h construct at least one border group based on the minimum distance difference and the third historical border group corresponding to the first (i-1) quadrilateral borders.
  • the terminal does not need to store the quadrilateral frame obtained by detection in the FIFO queue; instead, it directly compares the similarity between the latest detected frame, that is, the i-th quadrilateral frame corresponding to the current i-th preview image, and the quadrilateral frame samples that have already been classified, and thereby carries out the clustering of quadrilateral frames.
  • the terminal can use formula (1) to formula (3) to calculate the distance difference between the i-th quadrilateral frame and each of the previous (i-1) historical quadrilateral frames, that is, (i-1) distance differences, and compare the distance differences with the preset distance threshold, so as to determine the frame similarity result according to the comparison result and realize the clustering of the quadrilateral frames.
  • the terminal may determine a minimum distance difference from the (i-1) distance differences, and construct at least one frame group based on the minimum distance difference and a historical frame group corresponding to the previous (i-1) quadrilateral frames.
  • If the minimum distance difference is greater than or equal to the preset distance threshold, the terminal may establish a new frame group corresponding to the i-th quadrilateral frame, and construct at least one frame group based on the newly added frame group and the third historical frame group.
  • If the minimum distance difference is less than the preset distance threshold, that is, there is a frame group, among the frame groups corresponding to the first (i-1) quadrilateral frames, into which the i-th quadrilateral frame can be classified, the terminal can classify the i-th quadrilateral frame into the frame group corresponding to the minimum distance difference, and construct at least one frame group based on the frame groups after the number of quadrilateral frame samples is updated.
  • the embodiment of the present application proposes an image display method in which the terminal performs similarity clustering, target frame group selection, initial stable frame determination, and other processing on the quadrilateral frames obtained by detection to remove abnormal frames, which solves the problem of unstable display of the quadrilateral frame in the preview picture and overcomes the defect that the preview picture is not displayed smoothly.
  • FIG. 12 is a schematic diagram ten of the implementation flow of the image display method proposed by the embodiment of the present application.
  • the method by which the terminal determines the i-th stable frame based on the initial stable frame and the (i-1)-th stable frame may include the following steps:
  • Step 104a Obtain the first vertex coordinate data corresponding to the initial stable frame and the second vertex coordinate data corresponding to the (i-1)th stable frame.
  • Step 104b Calculate the distance difference between the first vertex coordinate data and the second vertex coordinate data according to the preset similarity function.
  • Step 104c If the distance difference is smaller than the preset distance threshold, determine the (i-1)th stable frame as the i-th stable frame.
  • Step 104d If the distance difference is greater than or equal to the preset distance threshold, determine the initial stable frame as the i-th stable frame.
  • In the process of determining the i-th stable frame based on the initial stable frame and the (i-1)-th stable frame, the terminal may first obtain the first vertex coordinate data corresponding to the initial stable frame and the second vertex coordinate data corresponding to the (i-1)-th stable frame, and then calculate the similarity between the initial stable quadrilateral frame and the (i-1)-th stable frame.
  • the terminal may first, based on formula (1) and formula (3) in the preset similarity function, as well as the first vertex coordinate data of the initial stable quadrilateral frame and the second vertex coordinate data of the (i-1)-th stable frame, map the two quadrilateral frames to the distance space respectively to obtain the first distance corresponding to the initial stable quadrilateral frame and the second distance corresponding to the (i-1)-th stable frame, and then calculate the distance difference based on formula (3).
  • the terminal may preset a preset distance threshold representing the similarity result, and the terminal may compare the above-mentioned distance difference with the preset distance threshold, and then determine the initial stable quadrilateral border and the (i-1)th threshold based on the comparison result. Similarity results for stable bounding boxes.
  • If the distance difference is less than the preset distance threshold, the terminal can determine that the initial stable quadrilateral frame is similar to the (i-1)-th stable frame; in order to ensure the smoothness of the preview picture, the terminal keeps the same stable quadrilateral frame as the previous frame image, that is, the (i-1)-th stable frame continues to be determined as the i-th stable frame corresponding to the current i-th preview image.
  • At this time, since the i-th stable frame has not changed, the terminal does not update the pre-stored (i-1)-th stable frame used for stable frame comparison either, and it continues to serve as the reference stable quadrilateral frame when the stable quadrilateral frame of the next frame, that is, the (i+1)-th stable frame, is determined.
  • FIG. 13A is a schematic diagram 1 of a scenario for determining a stable frame proposed by an embodiment of the present application. It is assumed that the dotted line is the (i-1)th stable frame, and the solid line is the initial stable frame. As shown in FIG. 13A , the initial stable frame If the similarity with the (i-1) th stable frame is high, the terminal can retain the (i-1) th stable frame as the i-th stable frame of the current image frame.
  • If the distance difference is greater than or equal to the preset distance threshold, the terminal can determine that the initial stable frame is not similar to the (i-1)-th stable frame, that is, the quadrilateral frame corresponding to the target object in the preview image has changed; in order to ensure the accuracy of the preview image, the terminal determines the currently determined initial stable quadrilateral frame as the stable quadrilateral frame corresponding to the current i-th frame preview image.
  • At this time, the terminal also needs to update the previously stored (i-1)-th stable frame used for stable frame comparison, so that the initial stable frame corresponding to the current i-th preview image continues to serve as the reference stable quadrilateral frame when the stable quadrilateral frame of the next frame, that is, the (i+1)-th stable frame, is determined.
  • FIG. 13B is a schematic diagram 2 of the scenario for determining a stable frame proposed by the embodiment of the present application. It is assumed that the dotted line is the (i-1)-th stable frame and the solid line is the initial stable frame. As shown in FIG. 13B, if the similarity between the initial stable frame and the (i-1)-th stable frame is poor, the terminal can update the stored (i-1)-th stable frame and use the initial stable frame as the i-th stable frame of the i-th frame image.
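  • Steps 104a to 104d can be sketched as follows, reusing the assumed distance() and threshold placeholders from the clustering sketch above: the previous stable frame is kept when the initial stable frame is close to it, and replaced (and stored as the new reference) otherwise.

        # Sketch: decide the i-th stable frame from the initial stable frame and the (i-1)-th stable frame.
        def determine_stable_frame(initial_stable, prev_stable, distance, threshold):
            if prev_stable is not None and distance(initial_stable, prev_stable) < threshold:
                return prev_stable          # similar: keep the (i-1)-th stable frame to avoid jitter
            return initial_stable           # changed: adopt the initial stable frame as the new reference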
  • An embodiment of the present application provides an image display method.
  • the terminal compares the similarity between the quadrilateral frame of the current latest frame and the historically stored reference stable quadrilateral frame, and determines the current stable quadrilateral frame according to the different similarity results, which solves the problem of unstable display of the quadrilateral frame in the preview picture, overcomes the defect that the preview picture is not displayed smoothly, and further realizes high-efficiency picture scanning.
  • FIG. 14 is a schematic diagram of the execution flow of the image processing proposed by the embodiment of the present application.
  • the terminal first obtains the preview image (step S01), then the terminal performs frame detection on the preview image, such as quadrilateral detection processing (step S02); and stores the obtained quadrilateral frame at the tail of the FIFO, that is, the last bit of the queue (step S03).
  • the terminal can sequentially select unclassified quadrilateral frame samples from the quadrilateral frame samples existing in the current FIFO queue, in the order in which they entered the FIFO queue, and perform the distance calculation on them according to the above-mentioned preset similarity function (step S04).
  • Based on the distance differences, the terminal judges whether, among the frame groups that have already been clustered, there is a frame group into which the unclassified quadrilateral frame sample can be classified (step S05). If there is a frame group with a similar distance, it can be determined that the unclassified quadrilateral frame sample belongs to that frame group, and the quadrilateral frame is added to it directly (step S06); if there are multiple frame groups with similar distances among the clustered frame groups, the distances can also be sorted and the quadrilateral frame added to the frame group with the closest distance. On the other hand, if there is no frame group with a similar distance among the clustered frame groups, the terminal can create a new frame group and add the quadrilateral frame to the new frame group (step S07).
  • Further, the terminal can judge whether all the unclassified quadrilateral frame samples in the FIFO sequence have completed clustering, that is, whether there are still unclassified quadrilateral frame samples in the current FIFO queue (step S08); if it is determined that some still exist, the terminal jumps back to step S03 and repeats the above steps; if not, the terminal can select the target frame group from the at least one frame group obtained by clustering, for example, the frame group with the largest number of quadrilateral frame samples among the at least one frame group, as the target frame group (step S09), and, based on the time sequence, select the quadrilateral frame sample corresponding to the latest frame from the target frame group to determine the initial stable quadrilateral frame (step S010).
  • the terminal may perform a similarity-based distance calculation on the initial stable quadrilateral frame and the historically stored reference stable quadrilateral frame (step S011), and determine whether the distance is less than the preset distance threshold (step S012). If it is less than the threshold, the terminal does not need to update the historical reference stable quadrilateral frame, and directly uses the historical reference stable quadrilateral frame as the target stable quadrilateral frame corresponding to the current preview image and outputs it (step S013); if it is not less than the threshold, the terminal can update the historical reference stable quadrilateral frame with the initial stable quadrilateral frame (step S014), and determine the newly updated historical reference stable quadrilateral frame as the target stable quadrilateral frame corresponding to the current preview image and output it. Further, the terminal may render the obtained stable quadrilateral frame, and generate and display a rendered preview image based on the rendered quadrilateral frame and the current preview image (step S015).
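  • Putting the steps of FIG. 14 together, one high-level per-frame sketch might look like the following; the helpers find_quadrilateral, push_quadrilateral, frame_fifo, assign_to_group, and determine_stable_frame are the illustrative placeholders sketched earlier in this description, distance and threshold remain assumed placeholders, and render_frame is an assumed rendering routine, none of which is the disclosed implementation.

        # Sketch: one iteration of the preview pipeline for the i-th frame (steps S01 to S015, simplified).
        import cv2
        import numpy as np

        prev_stable = None                                      # historical reference stable frame

        def render_frame(preview_bgr, quad):
            # Placeholder rendering: draw the stable quadrilateral onto a copy of the preview image.
            pts = np.asarray(quad, dtype=np.int32).reshape(-1, 1, 2)
            return cv2.polylines(preview_bgr.copy(), [pts], True, (0, 255, 0), 2)

        def process_preview_frame(preview_bgr, distance, threshold):
            global prev_stable
            quad = find_quadrilateral(preview_bgr)              # S02: quadrilateral detection
            if quad is None:
                return preview_bgr                              # nothing to stabilize in this frame
            push_quadrilateral(quad)                            # S03: store at the FIFO tail
            groups = []
            for q in frame_fifo:                                # S04-S08: cluster the frames currently in the FIFO
                assign_to_group(q, groups, distance, threshold)
            target = max(groups, key=len)                       # S09: group with the most samples as target group
            initial_stable = target[-1]                         # S010: latest frame in the target group
            stable = determine_stable_frame(initial_stable,     # S011-S014: compare with the stored reference
                                            prev_stable, distance, threshold)
            prev_stable = stable
            return render_frame(preview_bgr, stable)            # S015: render the stable frame into the preview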
  • In the embodiment of the present application, the terminal no longer performs the image frame preview directly based on the detected quadrilateral frame; instead, after a stable quadrilateral frame is obtained, the image preview is performed based on the stable quadrilateral frame, which solves the problem of unstable display of the quadrilateral frame in the preview picture and overcomes the defect that the preview picture is not displayed smoothly.
  • FIG. 15 is a schematic diagram of the composition structure of the terminal proposed by the present application.
  • the terminal 10 proposed by the embodiment of the present application may include a quadrilateral detection module 11, Timing stabilization module 12, denoising stabilization module 13 and preview module 14,
  • the quadrilateral detection module 11 is configured to obtain the i-th frame preview image corresponding to the target object, and perform frame detection processing on the i-th frame preview image to obtain the i-th quadrilateral frame corresponding to the target object; wherein the i is an integer greater than 0;
  • the timing stabilization module 12 is configured to perform similarity clustering processing based on the first quadrilateral frame to the i-th quadrilateral frame corresponding to the target object to obtain at least one frame group; select a target frame group from the at least one frame group; and determine an initial stable frame from the target frame group;
  • the denoising stabilization module 13 is configured to determine the i-th stable frame based on the initial stable frame and the (i-1)-th stable frame;
  • the preview module 14 is configured to perform display processing on the i-th frame preview image according to the i-th stable frame.
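  • Purely as an illustrative sketch of how the four modules of FIG. 15 might cooperate for each preview frame, the class below composes them into a simple pipeline; the method names detect, update and render are placeholders invented for this sketch and do not describe the actual interfaces of the modules.

      class Terminal:
          # Illustrative composition of the modules shown in FIG. 15.
          def __init__(self, detector, timing_stabilizer, denoiser, previewer):
              self.detector = detector                    # quadrilateral detection module 11
              self.timing_stabilizer = timing_stabilizer  # timing stabilization module 12
              self.denoiser = denoiser                    # denoising stabilization module 13
              self.previewer = previewer                  # preview module 14

          def process_frame(self, preview_image):
              quad = self.detector.detect(preview_image)
              initial_stable = self.timing_stabilizer.update(quad)
              stable = self.denoiser.update(initial_stable)
              return self.previewer.render(preview_image, stable)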
  • FIG. 16 is a second schematic diagram of the composition and structure of the terminal proposed by the present application.
  • the terminal 10 proposed by the embodiment of the present application may include an acquisition part 15, a detection part 16, a clustering part 17, a selection part 18, a determination part 19, a display part 110 and a storage part 111, wherein:
  • the acquisition part 15 is configured to acquire the i-th frame preview image corresponding to the target object;
  • the detection part 16 is configured to perform frame detection processing on the i-th frame preview image to obtain the i-th quadrilateral frame corresponding to the target object; wherein, the i is an integer greater than 0;
  • the clustering part 17 is configured to perform similarity clustering processing based on the first quadrilateral frame corresponding to the target object to the i-th quadrilateral frame to obtain at least one frame group;
  • the selection part 18 is configured to select a target frame group from the at least one frame group
  • the determining part 19 is configured to determine an initial stable frame from the target frame group; and determine the i-th stable frame based on the initial stable frame and the (i-1)-th stable frame;
  • the display part 110 is configured to perform display processing on the i-th frame preview image according to the i-th stable frame.
  • the storage part 111 is configured to store the i-th quadrilateral frame in the N-th position of the FIFO after the i-th quadrilateral frame corresponding to the target object is obtained and before the similarity clustering processing is performed based on the i-th quadrilateral frame to obtain the at least one frame group; wherein N is an integer greater than 2, and N represents the maximum storage capacity of the FIFO.
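  • The FIFO of maximum capacity N described above can be realized, for example, with a bounded double-ended queue: collections.deque with maxlen automatically discards the oldest quadrilateral once N frames are stored. The value N = 10 below is only an example; the application merely requires N to be an integer greater than 2.

      from collections import deque

      N = 10                       # assumed maximum FIFO capacity
      quad_fifo = deque(maxlen=N)  # bounded FIFO of quadrilateral frames

      def store_quad(quad):
          # Storing the i-th quadrilateral frame in the "N-th position": with a
          # bounded deque the newest element always sits at the right end, and the
          # oldest element is dropped automatically once the FIFO is full.
          quad_fifo.append(quad)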
  • the clustering part 17 is specifically configured to read the first quadrilateral frame to the i-th quadrilateral frame from the FIFO, and perform the similarity clustering process based on the first quadrilateral frame to the i-th quadrilateral frame to obtain the at least one frame group.
  • the clustering part 17 is further specifically configured to read the (i-N+1)-th quadrilateral frame to the i-th quadrilateral frame from the FIFO, and perform the similarity clustering process based on the (i-N+1)-th quadrilateral frame to the i-th quadrilateral frame to obtain the at least one frame group.
  • the clustering part 17 is also specifically configured to obtain the k-th vertex coordinate data corresponding to the k-th quadrilateral frame and the first (k-1) vertex coordinate data corresponding to the first (k-1) quadrilateral frames; calculate, according to a preset similarity function, the (k-1) distance differences between the k-th vertex coordinate data and the first (k-1) vertex coordinate data; determine a minimum distance difference from the (k-1) distance differences; and construct the at least one frame group based on the minimum distance difference and the first historical frame group corresponding to the first (k-1) quadrilateral frames.
  • the clustering part 17 is also specifically configured to: if the minimum distance difference is greater than or equal to a preset distance threshold, establish a newly added frame group corresponding to the k-th quadrilateral frame, and construct the at least one frame group based on the newly added frame group and the first historical frame group; and if the minimum distance difference is less than the preset distance threshold, classify the k-th quadrilateral frame into the first historical frame group, and construct the at least one frame group based on the first historical frame group.
  • the clustering part 17 is also specifically configured to acquire the (i-N+k)-th vertex coordinate data corresponding to the (i-N+k)-th quadrilateral frame and the first (i-N+k-1) vertex coordinate data corresponding to the first (i-N+k-1) quadrilateral frames, wherein k is an integer greater than 1 and less than or equal to N; calculate, according to a preset similarity function, the (i-N+k-1) distance differences between the (i-N+k)-th vertex coordinate data and the first (i-N+k-1) vertex coordinate data; determine a minimum distance difference from the (i-N+k-1) distance differences; and construct the at least one frame group based on the minimum distance difference and the second historical frame group corresponding to the first (i-N+k-1) quadrilateral frames.
  • the clustering part 17 is also specifically configured to obtain the i-th vertex coordinate data corresponding to the i-th quadrilateral frame and the first (i-1) vertex coordinate data corresponding to the previously grouped (i-1) quadrilateral frames; calculate, according to a preset similarity function, the (i-1) distance differences between the i-th vertex coordinate data and the first (i-1) vertex coordinate data; determine a minimum distance difference from the (i-1) distance differences; and construct the at least one frame group based on the minimum distance difference and the third historical frame group corresponding to the first (i-1) quadrilateral frames.
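  • Taking the clauses above together, one possible and purely illustrative way for the clustering part to re-cluster every quadrilateral currently held in the FIFO (whether that is the first i frames or the sliding window of the last N frames) is sketched below; it reuses the assign_to_group helper and threshold assumed in the earlier sketch and is not a literal implementation of the claimed clustering part.

      def cluster_fifo(quad_fifo, threshold=DIST_THRESHOLD):
          # Walk the FIFO in arrival order and assign each quadrilateral either to
          # the closest existing frame group or to a newly created one.
          groups = []
          for quad in quad_fifo:
              groups = assign_to_group(quad, groups, threshold)
          return groups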
  • the selection part 18 is specifically configured to obtain the number of quadrilateral frames included in each frame group in the at least one frame group, and determine the frame group with the largest number of quadrilateral frames as the target frame group.
  • the determining part 19 is specifically configured to arrange the quadrilateral frames in the target frame group in chronological order to obtain a frame list, and determine the last quadrilateral frame in the frame list as the initial stable frame.
  • the determining part 19 is further specifically configured to perform mean filtering processing on the quadrilateral frames in the target frame group to obtain an initial stable frame.
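  • As an alternative to taking the latest sample, the mean-filtering variant mentioned above can be sketched as a per-vertex average over all quadrilaterals in the target frame group; the use of a plain arithmetic mean is an assumption of this sketch, and other filters would equally fit the description.

      import numpy as np

      def mean_filtered_stable_quad(target_group):
          # target_group: list of 4x2 arrays of vertex coordinates.
          # Each vertex of the initial stable frame is the mean of the
          # corresponding vertices across every quadrilateral in the group.
          return np.mean(np.asarray(target_group, dtype=np.float64), axis=0)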
  • the determining part 19 is further specifically configured to acquire the first vertex coordinate data corresponding to the initial stable frame and the second vertex coordinate data corresponding to the (i-1)-th stable frame; calculate the distance difference between the first vertex coordinate data and the second vertex coordinate data according to a preset similarity function; if the distance difference is less than a preset distance threshold, determine the (i-1)-th stable frame as the i-th stable frame; and if the distance difference is greater than or equal to the preset distance threshold, determine the initial stable frame as the i-th stable frame.
  • the display part 110 is specifically configured to perform rendering processing on the i-th stable frame to obtain a rendered stable frame; generate a rendered preview image based on the rendered stable frame and the i-th frame preview image; and display the rendered preview image.
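  • One way to render the i-th stable frame onto the preview image is to draw the quadrilateral as a closed polyline over a copy of the frame, for example with OpenCV as sketched below; the use of OpenCV, the green colour and the line thickness are assumptions made for this illustration only.

      import cv2
      import numpy as np

      def render_preview(preview_image, stable_quad):
          # preview_image: H x W x 3 image array; stable_quad: 4x2 vertex coordinates.
          rendered = preview_image.copy()
          pts = np.asarray(stable_quad, dtype=np.int32).reshape(-1, 1, 2)
          # Draw the stable quadrilateral as a closed green polyline, 3 px thick.
          cv2.polylines(rendered, [pts], True, (0, 255, 0), 3)
          return rendered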
  • FIG. 17 is a third schematic diagram of the composition structure of the terminal proposed by the embodiment of the present application.
  • the terminal 10 proposed by the embodiment of the present application may further include a processor 112 and a memory 113 storing instructions executable by the processor 112.
  • the terminal 10 may further include a communication interface 114, and a bus 115 for connecting the processor 112, the memory 113 and the communication interface 114.
  • the above-mentioned processor 112 may be at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a central processing unit (CPU), a controller, a microcontroller and a microprocessor.
  • the terminal 10 may also include a memory 113, which may be connected to the processor 112, wherein the memory 113 is used for storing executable program codes, and the program codes include computer operation instructions; the memory 113 may include a high-speed RAM memory, and may also include a non-volatile memory, for example, at least two disk drives.
  • the bus 115 is used to connect the communication interface 114, the processor 112 and the memory 113, and to enable mutual communication among these devices.
  • the memory 113 is used to store instructions and data.
  • the above-mentioned processor 112 is configured to: obtain the i-th frame preview image corresponding to the target object, and perform frame detection processing on the i-th frame preview image to obtain the i-th quadrilateral frame corresponding to the target object, wherein the i is an integer greater than 0; perform similarity clustering processing based on the first quadrilateral frame to the i-th quadrilateral frame corresponding to the target object to obtain at least one frame group; select a target frame group from the at least one frame group, and determine an initial stable frame from the target frame group; determine the i-th stable frame based on the initial stable frame and the (i-1)-th stable frame; and perform display processing on the i-th frame preview image according to the i-th stable frame.
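  • Tying the configured operations together, a hypothetical per-frame driver loop is sketched below. It reuses the helpers assumed in the earlier sketches (store_quad, quad_fifo, cluster_fifo, select_initial_stable_quad, stabilize and render_preview), and detect_quadrilateral is an imaginary stand-in for the frame detection processing; none of these names describe the actual implementation of the processor 112.

      reference_stable = None  # historical reference stable quadrilateral frame

      def on_new_preview_frame(preview_image):
          global reference_stable
          quad = detect_quadrilateral(preview_image)      # hypothetical detector
          store_quad(quad)                                # push into the bounded FIFO
          groups = cluster_fifo(quad_fifo)                # similarity clustering
          initial_stable = select_initial_stable_quad(groups)
          stable, reference_stable = stabilize(initial_stable, reference_stable)
          return render_preview(preview_image, stable)    # rendered preview image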
  • the above-mentioned memory 113 may be a volatile memory, such as a random access memory (RAM); or a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); or a combination of the above types of memory, and provides instructions and data to the processor 112.
  • each functional module in this embodiment may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware, or can be implemented in the form of software function modules.
  • if the integrated unit is implemented in the form of a software function module and is not sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solution of this embodiment, in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to make a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor execute all or part of the steps of the method in this embodiment.
  • the aforementioned storage medium includes: U disk, mobile hard disk, read only memory (Read Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk or optical disk and other media that can store program codes.
  • An embodiment of the present application provides a terminal. After performing frame detection processing on a current preview image containing a target object and obtaining a quadrilateral frame corresponding to the target object, the terminal may first perform clustering processing based on frame similarity on the quadrilateral frame, select a target frame group from the at least one frame group obtained, and further determine an initial stable frame from the target frame group; the terminal then determines the current stable frame based on a comparison between the initial stable frame and the historical stable frame, so as to perform display processing on the current preview image according to the current stable frame.
  • the terminal therefore no longer performs the image preview directly based on the quadrilateral frame obtained by frame detection; instead, it performs similarity clustering on the detected quadrilateral frames, selects the target frame group, determines the initial stable frame and applies the denoising stabilization operation against the historical stable frame, and then previews the image based on the resulting stable quadrilateral frame, which solves the problem of unstable display of the quadrilateral frame in the preview picture and overcomes the defect that the preview picture is not displayed smoothly.
  • An embodiment of the present application provides a computer-readable storage medium, on which a program is stored, and when the program is executed by a processor, the above-described image display method is implemented.
  • the program instructions corresponding to the image display method in this embodiment may be stored on a storage medium such as an optical disk, a hard disk or a U disk.
  • An embodiment of the present application provides a chip, which includes a processor and an interface, the processor obtains program instructions through the interface, and the processor is configured to execute the program instructions to implement the image display method as described above.
  • the image display method includes the following steps: acquiring the i-th frame preview image corresponding to the target object, and performing frame detection processing on the i-th frame preview image to obtain the i-th quadrilateral frame corresponding to the target object, wherein i is an integer greater than 0; performing similarity clustering processing based on the first quadrilateral frame to the i-th quadrilateral frame corresponding to the target object to obtain at least one frame group; selecting a target frame group from the at least one frame group, and determining an initial stable frame from the target frame group; determining the i-th stable frame based on the initial stable frame and the (i-1)-th stable frame; and performing display processing on the i-th frame preview image according to the i-th stable frame.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the application may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, optical storage, and the like) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising an instruction means, and the instruction means implements the functions specified in one or more flows of the flow diagram and/or one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce a computer-implemented process, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flow diagram and/or one or more blocks of the block diagram.
  • the embodiments of the present application disclose an image display method, a terminal, a chip and a storage medium.
  • the method includes: acquiring an i-th frame preview image corresponding to a target object, and performing frame detection processing on the i-th frame preview image to obtain an i-th quadrilateral frame corresponding to the target object, wherein i is an integer greater than 0; performing similarity clustering processing based on the first quadrilateral frame to the i-th quadrilateral frame corresponding to the target object to obtain at least one frame group; selecting a target frame group from the at least one frame group, and determining an initial stable frame from the target frame group; determining the i-th stable frame based on the initial stable frame and the (i-1)-th stable frame; and performing display processing on the i-th frame preview image according to the i-th stable frame.
  • the terminal no longer performs the image preview directly based on the quadrilateral frame obtained by frame detection. Instead, it removes abnormal frames through operations such as similarity clustering, selection of the target frame group and determination of the initial stable frame, and applies a denoising stabilization operation that compares against the historical stable frame; after the current stable quadrilateral frame is obtained, the image preview is performed based on that stable quadrilateral frame, which solves the problem of unstable display of the quadrilateral frame in the preview picture and overcomes the defect that the preview picture is not displayed smoothly.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the present application relate to an image display method, a terminal, a chip and a storage medium. The method comprises: acquiring an i-th frame preview image corresponding to a target object, and performing frame detection processing on the i-th frame preview image to obtain an i-th quadrilateral frame corresponding to the target object, i being an integer greater than 0; performing similarity clustering processing on the basis of a first quadrilateral frame to the i-th quadrilateral frame corresponding to the target object, so as to obtain at least one frame group; selecting a target frame group from the at least one frame group, and determining an initial stable frame from the target frame group; determining an i-th stable frame on the basis of the initial stable frame and an (i-1)-th stable frame; and performing display processing on the i-th frame preview image according to the i-th stable frame.
PCT/CN2021/076494 2021-02-10 2021-02-10 Procédé d'affichage d'image, terminal, puce et support de stockage WO2022170554A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/076494 WO2022170554A1 (fr) 2021-02-10 2021-02-10 Procédé d'affichage d'image, terminal, puce et support de stockage
CN202180084568.4A CN116686281A (zh) 2021-02-10 2021-02-10 图像显示方法、终端、芯片及存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/076494 WO2022170554A1 (fr) 2021-02-10 2021-02-10 Procédé d'affichage d'image, terminal, puce et support de stockage

Publications (1)

Publication Number Publication Date
WO2022170554A1 true WO2022170554A1 (fr) 2022-08-18

Family

ID=82837433

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/076494 WO2022170554A1 (fr) 2021-02-10 2021-02-10 Procédé d'affichage d'image, terminal, puce et support de stockage

Country Status (2)

Country Link
CN (1) CN116686281A (fr)
WO (1) WO2022170554A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10115031B1 (en) * 2015-02-27 2018-10-30 Evernote Corporation Detecting rectangular page and content boundaries from smartphone video stream
CN111445566A (zh) * 2020-03-27 2020-07-24 腾讯科技(深圳)有限公司 一种信息处理方法、装置及计算机可读存储介质
CN111464716A (zh) * 2020-04-09 2020-07-28 腾讯科技(深圳)有限公司 一种证件扫描方法、装置、设备及存储介质
CN112183529A (zh) * 2020-09-23 2021-01-05 创新奇智(北京)科技有限公司 四边形物体检测、模型训练方法、装置、设备及存储介质

Also Published As

Publication number Publication date
CN116686281A (zh) 2023-09-01

Similar Documents

Publication Publication Date Title
CN110532984B (zh) 关键点检测方法、手势识别方法、装置及系统
CN108920580B (zh) 图像匹配方法、装置、存储介质及终端
CN108805170B (zh) 形成用于全监督式学习的数据集
CN107403424B (zh) 一种基于图像的车辆定损方法、装置及电子设备
US11176415B2 (en) Assisted image annotation
RU2711029C2 (ru) Классификация касаний
TW201837786A (zh) 基於圖像的車輛定損方法、裝置、電子設備及系統
JP6188400B2 (ja) 画像処理装置、プログラム及び画像処理方法
WO2019080411A1 (fr) Appareil électrique, procédé de recherche de regroupement d'images faciales, et support d'informations lisible par ordinateur
US11288307B2 (en) Method, electronic device, and computer readable medium for photo organization
US9779292B2 (en) System and method for interactive sketch recognition based on geometric contraints
JP6245880B2 (ja) 情報処理装置および情報処理手法、プログラム
JP6997369B2 (ja) プログラム、測距方法、及び測距装置
WO2022126914A1 (fr) Procédé et appareil de détection de corps vivant, dispositif électronique et support de stockage
US11526708B2 (en) Information processing device, information processing method, and recording medium
WO2022247403A1 (fr) Procédé de détection de point clé, dispositif électronique, programme, et support de stockage
JP2020154773A (ja) 画像処理装置、画像処理方法及び画像処理システム
CN114255223A (zh) 基于深度学习的双阶段卫浴陶瓷表面缺陷检测方法和设备
CN113780116A (zh) 发票分类方法、装置、计算机设备和存储介质
WO2022170554A1 (fr) Procédé d'affichage d'image, terminal, puce et support de stockage
WO2020244076A1 (fr) Procédé et appareil de reconnaissance faciale, dispositif électronique et support d'informations
CN109961061A (zh) 一种边缘计算视频数据结构化方法及系统
CN113591657B (zh) Ocr版面识别的方法、装置、电子设备及介质
WO2018168515A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et support d'enregistrement
CN114494751A (zh) 证照信息识别方法、装置、设备及介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21925217

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180084568.4

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21925217

Country of ref document: EP

Kind code of ref document: A1