CN109815349B - Information acquisition method and terminal equipment - Google Patents
- Publication number: CN109815349B (application CN201811564076.1A)
- Authority
- CN
- China
- Prior art keywords
- images
- result information
- image
- target object
- version
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- User Interface Of Digital Computer (AREA)
- Image Analysis (AREA)
Abstract
The embodiment of the invention discloses an information acquisition method and a terminal device, relates to the technical field of communication, and aims to solve the problem that the interface difference points before and after an application version update cannot be accurately and comprehensively determined by manual checking and comparison. The method comprises the following steps: acquiring a first image set and a second image set, wherein the first image set comprises M first images, the second image set comprises N second images, the M first images are images of a target object of a first version, and the N second images are images of the target object of a second version; comparing the M first images with the N second images to obtain target result information, wherein the target result information is the difference information between the target object of the first version and the target object of the second version; and outputting the target result information. The method can be applied to scenarios in which the terminal device acquires difference information between different versions of an application, different versions of a web page, or different versions of the appearance structure of a device.
Description
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an information acquisition method and terminal equipment.
Background
With the rapid development of communication technology, terminal devices (such as mobile phones, tablet computers, or e-book readers) are updated increasingly rapidly.
Currently, practitioners at internet companies generally obtain the difference points before and after an application update by manual checking and comparison. Specifically, if a practitioner needs to know what has changed in the updated version of application 1 (hereinafter referred to as version A) relative to the pre-update version of application 1 (hereinafter referred to as version B), the practitioner may determine the difference points between the interfaces of version B and version A of application 1 by comparing each pair of corresponding interfaces (for example, comparing interface 1 of version B of application 1 with interface 1 of version A of application 1), and thereby learn the difference points before and after the update of application 1.
However, because the difference points before and after an application version update are obtained by manual checking and comparison in the above method, some difference points are easily missed; that is, the method may fail to obtain all of the difference points before and after the update, so the interface difference points before and after the application version update cannot be accurately and comprehensively determined.
Disclosure of Invention
The embodiment of the invention provides an information acquisition method, which aims to solve the problem that the interface difference points before and after an application version update cannot be accurately and comprehensively determined by manual checking and comparison.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an information obtaining method, where the method may include: acquiring a first image set and a second image set; comparing the M first images with the N second images to obtain target result information; and outputting the target result information. The first image set comprises M first images, the M first images are images of a target object of a first version, the second image set comprises N second images, the N second images are images of a target object of a second version, and the target result information is difference information between the target object of the first version and the target object of the second version.
In a second aspect, an embodiment of the present invention provides a terminal device, where the terminal device includes an obtaining module, a comparing module, and an output module. An obtaining module, configured to obtain a first image set and a second image set, where the first image set includes M first images, the second image set includes N second images, the M first images are images of a target object in a first version, and the N second images are images of a target object in a second version; the comparison module is used for comparing the M first images and the N second images acquired by the acquisition module to acquire target result information, wherein the target result information is difference information between the target object of the first version and the target object of the second version; and the output module is used for outputting the target result information obtained by the comparison of the comparison module.
In a third aspect, an embodiment of the present invention provides a terminal device, where the terminal device includes a processor, a memory, and a computer program stored on the memory and operable on the processor, and when the computer program is executed by the processor, the terminal device implements the steps of the information obtaining method provided in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the information acquisition method provided in the first aspect.
In the embodiment of the present invention, when a target object is updated (for example, a new version of the target object is released) or a user triggers the process, an image set of the target object before the update (i.e., the first image set) and an image set of the target object after the update (i.e., the second image set) may be obtained automatically; the two image sets are compared to obtain the difference information between the target object after the update and the target object before the update (i.e., the difference information between different versions of the target object); and the difference information is output. With this scheme, when the target object is updated or the user triggers the process, the terminal device can obtain the difference information between the new and old versions of the target object by automatically comparing their image sets, without obtaining the difference information by manual checking and comparison, so that the difference points before and after the update of the target object can be accurately and comprehensively determined.
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an information obtaining method according to an embodiment of the present invention;
fig. 3 is a second schematic diagram of an information obtaining method according to an embodiment of the present invention;
fig. 4 is a third schematic diagram of an information obtaining method according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating difference information between similar interfaces of different versions of applications, according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of various regions in an application interface according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 8 is a hardware schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The term "and/or" herein describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The symbol "/" herein denotes an "or" relationship between the associated objects; for example, A/B denotes A or B.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first image and the second image, etc. are for distinguishing different images, rather than for describing a particular order of the images.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration, or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present invention, unless otherwise specified, "a plurality" means two or more, for example, a plurality of sub-results means two or more sub-results, and the like.
In the embodiment of the present invention, the same interface in the applications of different versions means that the interface is completely the same interface in the applications of different versions. That is, if an interface is identical in different versions of an application, the interface may be referred to as the same interface in different versions of the application. The similar interfaces in the applications of different versions refer to interfaces with similarity greater than or equal to a preset similarity threshold in the applications of different versions.
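The distinction between "same" and "similar" interfaces can be expressed as a tiny decision rule. This is an illustrative sketch only; the patent defines the terms but prescribes no code, and the function name is an assumption:

```python
def classify_interfaces(similarity, threshold):
    """Label an interface pair per the definitions above: 'same' if the
    interface images are completely identical (similarity 1.0),
    'similar' if the similarity meets the preset threshold, and
    'different' otherwise."""
    if similarity == 1.0:
        return "same"
    return "similar" if similarity >= threshold else "different"
```

For example, with a preset threshold of 0.8, a pair scoring 0.9 is classified as "similar" while a pair scoring 0.5 is "different".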
In the embodiment of the present invention, when a target object is updated (for example, a new version of the target object is released) or a user triggers the process, an image set of the target object before the update (i.e., the first image set) and an image set of the target object after the update (i.e., the second image set) may be obtained automatically; the two image sets are compared to obtain the difference information between the target object after the update and the target object before the update (i.e., the difference information between different versions of the target object); and the difference information is output. With this scheme, when the target object is updated or the user triggers the process, the terminal device can obtain the difference information between the new and old versions of the target object by automatically comparing their image sets, without obtaining the difference information by manual checking and comparison, so that the difference points before and after the update of the target object can be accurately and comprehensively determined.
The terminal in the embodiment of the present invention may be a terminal having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment to which the information acquisition method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the information acquisition method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the information acquisition method may operate based on the android operating system shown in fig. 1. Namely, the processor or the terminal can implement the information acquisition method provided by the embodiment of the invention by running the software program in the android operating system.
The terminal device in the embodiment of the invention may be a mobile terminal or a non-mobile terminal. For example, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the non-mobile terminal may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine. The embodiment of the present invention is not particularly limited in this respect.
The execution main body of the information acquisition method provided by the embodiment of the present invention may be the terminal device, or may also be a functional module and/or a functional entity capable of implementing the information acquisition method in the terminal device, which may be specifically determined according to actual use requirements, and the embodiment of the present invention is not limited. The following takes a terminal device as an example to exemplarily describe the information acquisition method provided by the embodiment of the present invention.
In the embodiment of the present invention, the target object may be an application program (hereinafter referred to as an application), a web page, or any other object that may have different versions, such as the appearance structure of a terminal device. When the target object is an application or a web page, an update of the target object may be an update of an element in the application or the web page (for example, the name of an interface or a function option); when the target object is the appearance structure of a terminal device, an update of the target object may be an update of that appearance structure (for example, the shape of the terminal device or a function key on it). Taking an application as the target object as an example, in order to follow the update dynamics of the application in real time, a user needs to learn the difference points before and after the application update (i.e., the difference information before and after the target object is updated in the embodiment of the present invention) in a timely, accurate, and comprehensive manner. Specifically, when the application is updated (for example, a new version of the application is released) or the user triggers the process, the terminal device may automatically acquire images of all interfaces before the update (for example, the M first images in the embodiment of the present invention) and images of all interfaces after the update (for example, the N second images in the embodiment of the present invention), obtain the difference points before and after the update by comparing the images of the interfaces before and after the update, and display the difference points so that the user can learn them.
It can be understood that the user in the embodiment of the present invention may be a practitioner (for example, any possible background worker such as a programmer and a software maintenance worker) in the internet industry, and may also be an end user using a terminal device. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
The following describes an information acquisition method provided by an embodiment of the present invention by way of example with reference to the accompanying drawings.
As shown in fig. 2, an embodiment of the present invention provides an information acquisition method, which may include S201 to S203 described below.
S201, the terminal device acquires a first image set and a second image set.
The first image set includes M first images, the second image set includes N second images, the M first images are images of a first version of a target object, and the N second images are images of a second version of the target object.
Optionally, in the embodiment of the present invention, the target object may be any one of an application installed in the terminal device, a webpage supported by the terminal device, and an appearance structure of the terminal device, and may be specifically determined according to an actual use requirement, which is not limited in the embodiment of the present invention.
Optionally, in the embodiment of the present invention, taking the target object as an application installed in the terminal device as an example, the terminal device may automatically run the application in the background through a virtual machine or a simulator, so as to obtain the first image set and the second image set. Specifically, when the version of the application is the first version, the terminal device may automatically run the application in the background through the virtual machine or the simulator to obtain the image set (that is, the first image set) of the application in the first version; when the version of the application is the second version, the terminal device may automatically run the application in the background through a virtual machine or a simulator to obtain the image set (i.e., the second image set) of the application in the second version.
In this embodiment of the present invention, when detecting that the application has an update (for example, the application has a new version release), the terminal device may automatically run the application in the background through the virtual machine or the simulator to obtain the first image set and the second image set. Specifically, the terminal device may detect whether an application in the terminal device is updated in real time, and if the terminal device detects that the application is updated (for example, the application has a new version release), the terminal device may automatically run the application in the background through a virtual machine or a simulator to obtain the first image set and the second image set.
Optionally, in the embodiment of the present invention, the terminal device may capture a screenshot of each interface in the application to obtain the first image set and the second image set in the process of running the application. Specifically, when the terminal device captures the screen of each interface in the application, the screen capture may be performed according to a certain sequence (for example, a display sequence of each interface). It can be understood that, in order to improve the accuracy of the acquired difference information between applications of different versions, the terminal device may capture images in the same order when capturing images of each interface in the applications of different versions.
For example, in a case that the terminal device detects that the application is updated to the first version, the terminal device may automatically run the application in the background through a virtual machine or a simulator, and capture and save M interfaces in the application in a certain order (hereinafter referred to as a first order) (hereinafter referred to as a first capture and save); in the case that the terminal device detects that the application is updated to the second version, the terminal device may automatically run the application in the background again through the virtual machine or the simulator, and capture and save the N interfaces in the application in the same order as the first order (hereinafter referred to as second capture and save). It can be understood that the screenshot image of the interface captured and saved for the first time is the image of the target object in the first version (i.e., the M first images), and the screenshot image of the interface captured and saved for the second time is the image of the target object in the second version (i.e., the N second images).
In the embodiment of the invention, each interface of the application may be captured as it is run; that is, an image of each interface is saved while that interface of the application is running.
In the embodiment of the invention, after the terminal device captures images of the interfaces in the application in a certain order, it may store the captured screenshot images in the order in which they were captured, where the order may be the order in which the interfaces of the application are run. Specifically, the terminal device may set a number for the screenshot image of each interface (for distinguishing screenshot images of the same interface, or of similar interfaces, in different versions) by combining the version number of the current version of the application with the interface hierarchy identifier of each interface, and then store the screenshot images according to these numbers.
Optionally, in the embodiment of the present invention, the method for setting numbers for the screenshot images of each interface by the terminal device specifically may be: version number + interface level identification.
For example, assume that the version number of the version of the application before update is Va, and the version number of the version of the application after update is Vb; the interface hierarchy of each interface in the application is represented as a 0-level interface, a 1-level interface, a 2-level interface, a 3-level interface, … … and an X-level interface, wherein X is an integer. Then, the following exemplarily describes a method for setting numbers for screenshot images of each interface provided in the embodiment of the present invention with reference to table 1, where the version number is Va.
The level 0 interface may be an interface when the application is started, the level 1 interface may be a main interface of the application, the level 2 interface may be a next level interface of the level 1 interface (for example, an interface displayed by the terminal device is triggered after the user clicks any function option on the main interface), the level 3 interface may be a next level interface of the level 2 interface, … …, and so on, and the level X interface may be a next level interface of the level X-1 interface.
TABLE 1
| Level 0 interface | Level 1 interface | Level 2 interface | Level 3 interface | … | Level X interface |
|---|---|---|---|---|---|
| Va00 | Va01 | Va01-01 | Va01-01-01 | … | Va01-01-…-01 |
|      | Va02 | Va01-02 | Va01-01-02 | … | Va01-01-…-02 |
|      | Va03 | Va01-03 | Va01-01-03 | … | Va01-01-…-03 |
|      | …    | …       | …          | … | …             |
As can be seen from Table 1, in the embodiment of the present invention, interfaces at different levels may be represented by different interface hierarchy identifiers. For example, a level-0 interface may be represented by the interface hierarchy identifier "00"; level-1 interfaces may be represented by the interface hierarchy identifiers "01", "02", "03", and so on; level-2 interfaces may be represented by "01-01", "01-02", "01-03", and so on; level-3 interfaces may be represented by "01-01-01", "01-01-02", "01-01-03", and so on; and level-X interfaces may be represented by "01-01-…-01", "01-01-…-02", "01-01-…-03", and so on. It can be understood that Table 1 takes the level-0, level-1, level-2, level-3, and level-X interfaces as examples; the other interfaces after level 3 and before level X are represented in a similar manner and are not repeated here.
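The numbering rule above (version number + interface hierarchy identifier) can be sketched as a small helper. The function name and the list-of-integers path representation are assumptions for illustration, not part of the patent:

```python
def interface_number(version, path):
    """Build a screenshot number from a version number and an
    interface-hierarchy path, following the pattern of Table 1.
    E.g. ("Va", [1, 1, 2]) -> "Va01-01-02".
    An empty path denotes the level-0 (launch) interface, numbered "Va00"."""
    if not path:
        return version + "00"
    return version + "-".join(f"{p:02d}" for p in path)
```

For example, `interface_number("Vb", [3])` yields "Vb03", so screenshots of corresponding interfaces in versions Va and Vb share the same hierarchy identifier and differ only in the version prefix.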
Optionally, in this embodiment of the present invention, the first version may be a version before the target object is updated, and the second version may be a version after the target object is updated; alternatively, the first version may be a version of the target object after update, and the second version may be a version of the target object before update. The specific method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
S202, the terminal equipment compares the M first images with the N second images to obtain target result information.
The target result information is difference information between the target object of the first version and the target object of the second version.
Optionally, in this embodiment of the present invention, the terminal device may use an image recognition technology to recognize contents in the M first images and the N second images, and then compare the contents in the M first images and the N second images to obtain difference information between the M first images and the N second images.
Of course, in the embodiment of the present invention, the terminal device may also compare the M first images and the N second images by using any other possible technology for comparing difference points between the images, which may be specifically determined according to actual use requirements, and the embodiment of the present invention is not limited.
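As a simplified stand-in for the comparison step, the sketch below scores two equally sized grayscale images (represented as 2-D lists of pixel values) by the fraction of identical pixels. This is an illustrative assumption; a real implementation would use an image recognition technique as described above:

```python
def pixel_similarity(img_a, img_b):
    """Fraction of identical pixels between two same-sized, non-empty
    grayscale images given as 2-D lists. A crude proxy for the image
    comparison step; real systems would use recognition/feature matching."""
    total = sum(len(row) for row in img_a)
    same = sum(a == b
               for row_a, row_b in zip(img_a, img_b)
               for a, b in zip(row_a, row_b))
    return same / total
```

A score of 1.0 then corresponds to "the same interface", while scores at or above the preset threshold correspond to "similar interfaces".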
Optionally, in this embodiment of the present invention, the target result information may include first result information and second result information. Wherein, the first result information is a result obtained by comparing any two images in the M first images and the N second images; the second result information is obtained by comparing the images with the same number in the M first images and the N second images. This is explained in detail below with reference to fig. 3.
For example, in conjunction with fig. 2, as shown in fig. 3, the above S202 may be specifically implemented by the following S202a and S202 b.
S202a, the terminal device compares the ith first image in the M first images with the jth second image in the N second images to obtain the first result information.
Wherein i is an integer from 1 to M, and j is an integer from 1 to N.
For example, in the embodiment of the present invention, assuming that M is equal to 5 and N is equal to 5, i may be taken from 1 to 5, and j may also be taken from 1 to 5. With the scheme of S202a described above, the terminal device may sequentially compare the 1 st first image of the 5 first images with each of the 5 second images, compare the 2 nd first image of the 5 first images with each of the 5 second images, and so on until the terminal device compares the 5 th first image of the 5 first images with each of the 5 second images. In this manner, the terminal device can obtain the above-described first result information by comparing each of the 5 first images with each of the 5 second images.
Specifically, the specific method for the terminal device to obtain the first result information by comparing each of the 5 first images with each of the 5 second images will be described in detail in the following embodiments, and details of the method are not repeated herein.
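The all-pairs comparison of S202a can be sketched as follows. The `similarity` callback is a hypothetical placeholder for whatever image-comparison technique is actually used:

```python
def compare_all_pairs(first_images, second_images, similarity):
    """S202a sketch: compare the i-th first image with the j-th second
    image for every (i, j) pair, i = 1..M and j = 1..N, and record the
    similarity score of each pair as part of the first result information."""
    return {(i, j): similarity(a, b)
            for i, a in enumerate(first_images, 1)
            for j, b in enumerate(second_images, 1)}
```

With M = 5 and N = 5 as in the example above, this produces 25 scored pairs, one for each combination of a first image and a second image.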
S202b, the terminal device compares the q-th first image in the M first images with the q-th second image in the N second images to obtain second result information.
Wherein q is an integer from 1 to P, and P is the minimum of M and N.
It should be noted that, in this embodiment of the present invention, since M and N may not be equal, the second result information may include, in addition to the pairwise comparison results, the content of any images that the terminal device did not compare (the content of such uncompared images may itself be regarded as difference information between the M first images and the N second images).
For example, in the embodiment of the present invention, if M is equal to 5 and N is equal to 4, q may be taken from 1 to 4. With the scheme of S202b described above, the terminal device may sequentially compare the 1 st first image in the 5 first images with the 1 st second image in the 4 second images, compare the 2 nd first image in the 5 first images with the 2 nd second image in the 4 second images, and so on until the terminal device compares the 4 th first image in the 5 first images with the 4 th second image in the 4 second images. In this way, the terminal device may obtain the second result information by comparing 4 first images and 4 second images of the 5 first images and combining 1 first image not compared of the 5 first images. It is understood that the second result information may include result information of comparison between 4 first images and 4 second images of the 5 first images by the terminal device, and content of another first image other than the compared 4 first images of the 5 first images (i.e., the content of the another first image may be considered as difference information between the 5 first images and the 4 second images).
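The index-wise comparison of S202b, including the handling of uncompared leftover images when M and N differ, can be sketched as below. The `diff` callback is a hypothetical placeholder for the per-pair comparison:

```python
def compare_by_index(first_images, second_images, diff):
    """S202b sketch: compare the q-th first image with the q-th second
    image for q = 1..P, where P = min(M, N); images beyond index P in
    the longer set are reported whole as difference information."""
    p = min(len(first_images), len(second_images))
    pairwise = [diff(first_images[q], second_images[q]) for q in range(p)]
    leftover = first_images[p:] + second_images[p:]
    return pairwise, leftover
```

In the M = 5, N = 4 example, `pairwise` would hold the four comparison results and `leftover` the one uncompared first image.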
The details of the above S202a and S202b are described below with reference to fig. 4 and table 1.
Optionally, in the embodiment of the present invention, with reference to fig. 3, as shown in fig. 4, the above-mentioned S202a may be specifically implemented by the following S202a1-S202a3.
S202a1, the terminal device compares the ith first image in the M first images with the jth second image in the N second images to determine K image groups, where the similarity between the images in each of the K image groups is greater than or equal to a preset similarity threshold.
In the embodiment of the present invention, the terminal device may determine an image combination in which the similarity between the images is greater than or equal to the preset similarity threshold by comparing any two images of the M first images and the N second images. Specifically, the terminal device may determine K image groups by comparing any two images of the M first images and the N second images, and the similarity between the images in each image group is greater than or equal to a preset similarity threshold (that is, each image group includes one first image and one second image, and the similarity between the first image and the second image in each image group is greater than or equal to the preset similarity threshold). It is understood that the similarity between images in different image groups is less than a preset similarity threshold.
Illustratively, taking K as 3 as an example, it is assumed that 3 image groups are an image group 1, an image group 2, and an image group 3, respectively, and an image 10 and an image 11 are included in the image group 1, an image 20 and an image 21 are included in the image group 2, and an image 30 and an image 31 are included in the image group 3. Then, the similarity between the image 10 and the image 11 is greater than or equal to the preset similarity threshold, the similarity between the image 20 and the image 21 is greater than or equal to the preset similarity threshold, and the similarity between the image 30 and the image 31 is also greater than or equal to the preset similarity threshold. And the similarity between the images 10, 20 and 30 is less than a preset similarity threshold.
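The grouping in S202a1 can be sketched as follows. For illustration only, the "images" are represented here as sets of recognized interface elements and Jaccard overlap stands in for the unspecified similarity measure — both are assumptions, not part of the embodiment:

```python
def jaccard(a, b):
    """Toy similarity: overlap of recognized interface elements."""
    return len(a & b) / len(a | b) if a | b else 1.0

def group_by_similarity(first_images, second_images, similarity, threshold):
    """Form image groups (one first image plus one second image) whose
    similarity is greater than or equal to the preset threshold."""
    groups, used = [], set()
    for f in first_images:
        for j, s in enumerate(second_images):
            if j not in used and similarity(f, s) >= threshold:
                groups.append((f, s))
                used.add(j)  # each image joins at most one group
                break
    return groups

v1 = [{"title", "btn_a"}, {"title", "list"}]      # first-version "images"
v2 = [{"title", "btn_a", "badge"}, {"settings"}]  # second-version "images"
groups = group_by_similarity(v1, v2, jaccard, 0.5)  # K = 1 group here
```

In a real implementation the similarity function would operate on pixel data or extracted interface features rather than element sets.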
It should be noted that the first image in each image group may be an image of a first version of the target object, and the second image in each image group may be an image of a second version of the target object.
It can be understood that, in the embodiment of the present invention, images in the M first images and the N second images whose similarity is greater than or equal to the preset similarity threshold may be images with the same or similar content across different versions of the target object, for example, screenshots of the same or similar interfaces in different versions of an application.
Optionally, in the embodiment of the present invention, the preset similarity threshold may be a default value of the system of the terminal device, or may be a user-defined value. This may be determined according to actual use requirements and is not limited in the embodiment of the present invention.
When the preset similarity threshold is a user-defined value, the terminal device may provide a setting interface to the user (for example, a setting option may be added in the settings application for the user to set the preset similarity threshold), so that the user may set the preset similarity threshold through the setting interface.
S202a2, the terminal device recognizes and compares the images in each image group by using an image recognition technique to obtain K pieces of first recognition result information.
Each image group corresponds to one piece of first recognition result information among the K pieces of first recognition result information. It can be understood that different image groups correspond to different pieces of first recognition result information, and K is a positive integer.
S202a3, the terminal device uses at least one of the K pieces of first recognition result information as the first result information.
In the embodiment of the present invention, after the terminal device determines K image groups (each image group includes a first image and a second image respectively), for each image group, the terminal device may identify the content of the first image (hereinafter referred to as a first content) and the content of the second image (hereinafter referred to as a second content) in the image group by using an image identification technology, and obtain identification result information of the image group (i.e., difference information between the first image and the second image in the image group) by comparing the first content and the second content. In this way, the terminal device may obtain K pieces of first recognition result information by performing recognition comparison on each of the K image groups, and then the terminal device may determine at least one piece of the K pieces of first recognition result information, which indicates that there is difference information between two images, as the first result information (i.e., the terminal device may ignore the first recognition result information, which indicates that there is no difference between images, in the K pieces of first recognition result information).
For example, in the embodiment of the present invention, assuming that K is 3, that is, the terminal device determines 3 image groups after performing S202a1, where each image group includes two images (the two images are respectively an image of the first version of the target object and an image of the second version of the target object), for each image group, the terminal device may identify the contents of the two images in the image group by using an image recognition technology, and compare the contents of the two images in the image group to obtain one piece of first identification result information. In this manner, the terminal device can obtain 3 pieces of first recognition result information corresponding to the 3 image groups by performing recognition comparison on the 3 image groups, respectively, and then, the terminal device can output at least one piece of first recognition result information indicating information that there is a difference between images among the 3 pieces of first recognition result information as the first result information (i.e., the terminal device can ignore the first recognition result information indicating no difference between images among the 3 pieces of first recognition result information).
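The filtering of S202a2-S202a3 — recognize the contents of both images in each group, then keep only the results that indicate a difference — can be sketched as below. The `recognize` callback is a placeholder for the unspecified image recognition technique:

```python
def first_result_info(groups, recognize):
    """Return only the recognition results that indicate a difference;
    groups whose two images have identical content are ignored."""
    results = []
    for first, second in groups:
        c1, c2 = recognize(first), recognize(second)
        added, removed = c2 - c1, c1 - c2
        if added or removed:
            results.append({"added": added, "removed": removed})
    return results

# Here the "images" are already element sets, so recognition is the identity.
groups = [({"title", "btn_a"}, {"title", "btn_a", "badge"}),
          ({"list"}, {"list"})]  # second group: no difference
results = first_result_info(groups, lambda img: img)
# one result remains: the "badge" element was added in the second version
```

This mirrors the step in which recognition result information indicating no difference between images is ignored, so that only difference information reaches the first result information.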
In the embodiment of the invention, the terminal device can determine the image groups in which the similarity between the images is greater than or equal to the preset similarity threshold by fully comparing the M first images with the N second images, then identify the content of the images in each image group by using an image recognition technique, and finally determine the difference information between the images by comparing the content. Taking the target object as an application as an example, with the scheme of S202a (or S202a1-S202a3), the difference information between any two interfaces with higher similarity can be determined; for example, even if the interface level of a certain interface changes between application versions, the scheme of S202a (or S202a1-S202a3) can still determine the difference points of that interface between the versions.
Optionally, in the embodiment of the present invention, with reference to fig. 3, as shown in fig. 4, the above-mentioned S202b may be specifically implemented by the following S202b1 and S202b2.
S202b1, the terminal device recognizes and compares the q-th first image in the M first images and the q-th second image in the N second images by using an image recognition technique to obtain T pieces of second recognition result information.
Wherein T is the maximum of M and N.
S202b2, the terminal device uses at least one of the T pieces of second recognition result information as the second result information.
In the embodiment of the present invention, the terminal device may recognize the content of the q-th first image in the M first images and the content of the q-th second image in the N second images by using an image recognition technology, and compare the content of the q-th first image and the content of the q-th second image to obtain recognition result information (i.e., difference information between the q-th first image and the q-th second image) on the q-th first image and the q-th second image. In this manner, the terminal device may sequentially recognize and compare the contents of the M first images and the contents of the N second images to obtain the T pieces of second recognition result information, so that the terminal device may determine at least one piece of second recognition result information indicating that there is difference information between the two images among the T pieces of second recognition result information as the above piece of second result information (i.e., the terminal device may ignore the second recognition result information indicating that there is no difference between the images among the T pieces of second recognition result information).
For example, as shown in fig. 5, assume that the q-th first image is the image shown in (a) of fig. 5 and the q-th second image is the image shown in (b) of fig. 5. The terminal device may identify the content of the q-th first image and the content of the q-th second image using an image recognition technique, and compare the two to obtain the recognition result information of the q-th first image and the q-th second image (i.e., the difference information between them). For example, this difference information may be as shown at 50 and 51 in fig. 5.
In the embodiment of the invention, the terminal device can determine the difference information between images with the same interface level identification in different versions by comparing, one by one, the images with the same interface level identification in the M first images and the N second images. Taking the target object as an application as an example, with the scheme of S202b (or S202b1), difference information between images with the same interface level identification in different versions can be determined; for example, even if the similarity of an interface between applications of different versions is smaller than the preset similarity threshold, the difference points of that interface between the versions can still be determined.
It should be noted that, in the embodiment of the present invention, by combining the schemes of S202a and S202b (or S202a1-S202a3 and S202b1), the terminal device can determine the difference points of an interface between applications of different versions regardless of whether the interface level of that interface changes between the versions or the similarity of the interface between the versions is smaller than the preset similarity threshold.
And S203, the terminal device outputs the target result information.
In the embodiment of the present invention, after the terminal device obtains the comparison result information (i.e., the target result information) obtained by comparing the M first images and the N second images, the terminal device may output the comparison result information to show the comparison result information to the user, so that the user may obtain the difference information before and after the target object is updated according to the comparison result information, and may further timely, accurately and comprehensively know the update dynamics of the target object.
Optionally, in this embodiment of the present invention, the target result information may include a plurality of pieces of sub-result information, where each piece of sub-result information in the plurality of pieces of sub-result information is a piece of difference information between the target object of the first version and the target object of the second version. That is, the target result information includes a plurality of items of difference information between the target object of the first version and the target object of the second version.
Optionally, in this embodiment of the present invention, the terminal device may output the target result information by displaying it (including the plurality of pieces of sub-result information) on the display screen of the terminal device, or by outputting it in document form. Further optionally, the terminal device may display at least one of the plurality of pieces of sub-result information on its display screen. This may be determined according to actual use requirements and is not limited in the embodiment of the present invention.
For example, the terminal device displays at least one piece of sub-result information in the plurality of pieces of sub-result information on a display screen of the terminal device. Optionally, in the embodiment of the present invention, with reference to fig. 3, the step S203 may be specifically implemented by the step S203a described below.
S203a, the terminal device displays at least one piece of sub-result information on the display screen of the terminal device.
Optionally, in an embodiment of the present invention, the at least one piece of sub-result information may be the sub-result information, among the plurality of pieces of sub-result information, whose importance level is greater than or equal to a preset degree level threshold. The importance level of a piece of sub-result information may be determined by the importance level of the area of the target object to which the difference information indicated by that sub-result information corresponds.
Optionally, in the embodiment of the present invention, taking the target object as an application as an example, the area corresponding to the difference information indicated by the multiple pieces of sub-result information in the application may be an area in an interface of the application. Specifically, the areas may include an area in which the title bar is located (i.e., an area in which the title in the interface is located, such as the title bar shown in fig. 6), an area in which the bottom bar is located (i.e., an area in which the function control located at the bottom of the interface is located, such as the bottom bar shown in fig. 6), and an area in which the content area is located (i.e., an area in which the data in the interface is located, such as the content area shown in fig. 6). The importance level of the area in which the title bar is located may be "highest", the importance level of the area in which the bottom bar is located may be "high", and the importance level of the area in which the content area is located may be "normal".
Accordingly, the importance level of the difference information corresponding to the area where the title bar is located is "highest" (i.e., the importance level of the sub-result information indicating that difference information is highest); the importance level of the difference information corresponding to the area where the bottom bar is located is "high"; and the importance level of the difference information corresponding to the content area is "general".
It should be noted that the descriptions "highest", "high", and "general" are exemplary descriptions of importance levels. In a specific implementation, the importance level may also be quantified. For example, the importance level of each region in an interface may be measured by how much a change in the content of that region affects the frame of the interface: the greater the influence of a change in the content of a region on the frame of the interface, the higher the importance level of that region.
In the embodiment of the present invention, the importance level of each region in the interface may be predefined.
Optionally, in this embodiment of the present invention, the preset degree level threshold may be the "highest", "high", or "general" importance level, or may be a quantized value. This may be determined according to actual use requirements and is not limited in the embodiment of the present invention.
Optionally, in the embodiment of the present invention, the preset degree level threshold may be a default of the system of the terminal device, or may be user-defined, again according to actual use requirements.
When the preset degree level threshold is user-defined, the terminal device may provide a setting interface to the user (for example, a setting option may be added to the settings application for the user to set the preset degree level threshold), so that the user may set the preset degree level threshold through the setting interface.
In this embodiment of the present invention, the terminal device may determine the importance level of each piece of the plurality of sub-result information according to the importance level of the area of the target object to which the difference information indicated by that sub-result information corresponds. The terminal device may then display, on its display screen, at least one piece of sub-result information whose importance level is greater than or equal to the preset degree level threshold. For the sub-result information whose importance level is smaller than the preset degree level threshold, the terminal device may either not output it or output it in document form.
For example, in the embodiment of the present invention, assuming that the system default preset degree level threshold of the terminal device is "high", after the terminal device obtains a plurality of pieces of sub-result information, it may display on its display screen the pieces whose importance level is "high" or "highest", thereby intuitively indicating to the user the sub-result information with a relatively high importance level. For the remaining pieces whose importance level is "general", the terminal device may either not output them or output them in document form.
In the embodiment of the present invention, the terminal device may directly display, on its display screen, the sub-result information (indicating the difference information between the objects of different versions) whose importance level is greater than or equal to the preset degree level threshold, and may either not output or output in document form the sub-result information whose importance level is smaller than that threshold. In this way, the sub-result information with higher importance levels can be intuitively indicated to the user, so that the user can directly check the difference information it indicates, which can improve the user experience.
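The importance-based routing described above can be sketched as follows; the region names, the 3/2/1 encoding of "highest"/"high"/"general", and the field names are chosen purely for illustration:

```python
# Illustrative importance levels per the example: title bar -> highest (3),
# bottom bar -> high (2), content area -> general (1).
LEVELS = {"title_bar": 3, "bottom_bar": 2, "content_area": 1}

def split_by_importance(sub_results, threshold):
    """Sub-result info at or above the threshold is displayed on screen;
    the rest is routed to a document (or not output) instead."""
    on_screen, to_document = [], []
    for info in sub_results:
        (on_screen if LEVELS[info["region"]] >= threshold
         else to_document).append(info)
    return on_screen, to_document

sub_results = [
    {"region": "title_bar",    "diff": "title renamed"},
    {"region": "content_area", "diff": "list item reordered"},
    {"region": "bottom_bar",   "diff": "new share control"},
]
# Threshold "high" (2): title-bar and bottom-bar differences are shown,
# the content-area difference goes to a document.
shown, doc = split_by_importance(sub_results, 2)
```

The same split works unchanged with a quantized threshold, since the levels are already numeric.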
In the embodiments of the present invention, the information obtaining method shown in each of the above drawings is exemplarily described with reference to one drawing of the embodiments of the present invention. In a specific implementation, the information obtaining method shown in each of the above drawings may also be implemented in combination with any other combinable drawings illustrated in the above embodiments, which is not repeated herein.
As shown in fig. 7, an embodiment of the present invention provides a terminal device 300, which includes an obtaining module 301, a comparing module 302, and an outputting module 303. Specifically, the obtaining module 301 is configured to obtain a first image set and a second image set, where the first image set includes M first images, the second image set includes N second images, the M first images are images of a target object in a first version, and the N second images are images of a target object in a second version; a comparing module 302, configured to compare the M first images and the N second images obtained by the obtaining module 301 to obtain target result information, where the target result information is difference information between a target object in a first version and a target object in a second version; and the output module 303 is configured to output the target result information obtained by the comparison performed by the comparison module 302.
Optionally, in this embodiment of the present invention, the comparing module 302 is specifically configured to compare an ith first image in the M first images with a jth second image in the N second images to obtain first result information; comparing the q-th first image in the M first images with the q-th second image in the N second images to obtain second result information; where i is an integer from 1 to M, j is an integer from 1 to N, q is an integer from 1 to P, and P is the minimum of M and N.
Optionally, in this embodiment of the present invention, the comparing module 302 is specifically configured to compare an ith first image in the M first images with a jth second image in the N second images, and determine K image groups; identifying and comparing the images in each image group by adopting an image identification technology to obtain K pieces of first identification result information; and using at least one piece of the K pieces of first recognition result information as first result information; in each image group of the K image groups, the similarity between the images is greater than or equal to a preset similarity threshold, each image group corresponds to one piece of first identification result information, and K is a positive integer.
Optionally, in this embodiment of the present invention, the comparing module 302 is specifically configured to identify and compare a q-th first image in the M first images and a q-th second image in the N second images by using an image identification technology, to obtain T pieces of second identification result information, and use at least one piece of second identification result information in the T pieces of second identification result information as the second result information, where T is a maximum value of M and N.
Optionally, in this embodiment of the present invention, the output module 303 is specifically configured to display at least one piece of sub-result information on a display screen of the terminal device, where the at least one piece of sub-result information is the sub-result information, among the plurality of pieces of sub-result information, whose importance level is greater than or equal to a preset degree level threshold, and the importance level of a piece of sub-result information is determined by the importance level of the area of the target object to which the difference information indicated by that sub-result information corresponds.
The terminal device provided by the embodiment of the present invention can implement each process implemented by the terminal device in the above method embodiments, and is not described herein again to avoid repetition.
The embodiment of the invention provides a terminal device, which can automatically acquire an image set (namely a first image set) of a target object before updating and an image set (namely a second image set) of the target object after updating under the condition that the target object is updated (for example, the target object is released by a new version) or a user triggers; comparing the image set of the target object before updating with the image set of the target object after updating to obtain difference information between the target object after updating and the target object before updating; and outputting the difference information. According to the scheme, the terminal equipment can obtain the difference information between the target object of the new version and the target object of the old version by automatically comparing the image set of the target object of the new version with the image set of the target object of the old version under the condition that the target object is updated or triggered by a user, and the difference information of the target object before and after the target object is updated is not required to be obtained in a manual checking and comparing mode, so that the difference points of the target object before and after the target object is updated can be accurately and comprehensively determined.
Fig. 8 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention. As shown in fig. 8, the terminal device 100 includes but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 8 does not constitute a limitation of the terminal device, and that the terminal device may include more or fewer components than shown, or combine certain components, or a different arrangement of components. In the embodiment of the present invention, the terminal device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 110 is configured to obtain a first image set and a second image set, compare the M first images and the N second images to obtain target result information, and output the target result information. The first image set includes M first images, the M first images are images of a first version of a target object, the second image set includes N second images, the N second images are images of a second version of the target object, and the target result information is difference information between the first version of the target object and the second version of the target object.
Optionally, in this embodiment of the present invention, the processor 110 may be specifically configured to control the display unit 106 to output the target result information (for example, the target result information may be displayed on a display screen of the terminal device).
The embodiment of the invention provides a terminal device, which can automatically acquire an image set (namely a first image set) of a target object before updating and an image set (namely a second image set) of the target object after updating under the condition that the target object is updated (for example, the target object is released by a new version) or a user triggers; comparing the image set of the target object before updating with the image set of the target object after updating to obtain difference information between the target object after updating and the target object before updating; and outputting the difference information. According to the scheme, the terminal equipment can obtain the difference information between the target object of the new version and the target object of the old version by automatically comparing the image set of the target object of the new version with the image set of the target object of the old version under the condition that the target object is updated or triggered by a user, and the difference information of the target object before and after the target object is updated is not required to be obtained in a manual checking and comparing mode, so that the difference points of the target object before and after the target object is updated can be accurately and comprehensively determined.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 101.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal device posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 1071 using a finger, a stylus, or any suitable object or attachment). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, a switch key, etc.), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 8 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal device, which is not limited herein.
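The detection device, controller, and processor flow described above can be sketched as follows; all class and field names (`TouchEvent`, `TouchController`, the raw-signal keys) are illustrative assumptions rather than anything specified by this patent:

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class TouchEvent:
    x: int
    y: int
    kind: str  # "down", "move", "up" -- event types are illustrative

class TouchController:
    """Converts raw touch signals into touch point coordinates and
    forwards them, mirroring the flow in which the touch detection
    device reports a signal, the controller converts it to coordinates,
    and the processor receives the result."""

    def __init__(self, dispatch: Callable[[TouchEvent], None]):
        self.dispatch = dispatch  # stands in for the processor side

    def on_raw_signal(self, raw: Dict[str, int], kind: str) -> None:
        # The controller converts the raw panel reading into touch
        # point coordinates before dispatching it onward.
        event = TouchEvent(x=raw["col"], y=raw["row"], kind=kind)
        self.dispatch(event)
```

For example, wiring the dispatch callback to a list shows one raw reading becoming one coordinate event that the processor side would classify by event type.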
The interface unit 108 is an interface for connecting an external device to the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal device 100, or may be used to transmit data between the terminal device 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the terminal device. It connects the various parts of the entire terminal device through various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units. Optionally, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It can be understood that the modem processor may also not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
Optionally, an embodiment of the present invention further provides a terminal device, including the processor 110 and the memory 109 shown in fig. 8, and a computer program stored in the memory 109 and executable on the processor 110. When executed by the processor 110, the computer program implements the processes of the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, the details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements the processes of the foregoing method embodiments and can achieve the same technical effects; to avoid repetition, the details are not repeated here. The computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (7)
1. An information acquisition method is applied to a terminal device, and the method comprises the following steps:
acquiring a first image set and a second image set, wherein the first image set comprises M first images, the second image set comprises N second images, the M first images are images obtained by screenshot of the terminal device on M interfaces of a first version of a target object, and the N second images are images obtained by screenshot of the terminal device on N interfaces of a second version of the target object;
comparing the M first images with the N second images to obtain target result information, wherein the target result information is difference information of the target object of the first version and the target object of the second version, and the target result information comprises first result information and second result information;
the comparing the M first images with the N second images to obtain target result information includes:
comparing the ith first image in the M first images with the jth second image in the N second images to obtain the first result information, wherein i is an integer from 1 to M, and j is an integer from 1 to N;
wherein the comparing the ith first image of the M first images with the jth second image of the N second images to obtain the first result information includes:
comparing the ith first image in the M first images with the jth second image in the N second images, and determining K image groups, wherein in each image group in the K image groups, the similarity between the images is greater than or equal to a preset similarity threshold value, and K is a positive integer;
identifying and comparing the images in each image group by adopting an image identification technology to obtain K pieces of first identification result information, wherein each image group corresponds to one piece of first identification result information;
taking at least one piece of the K pieces of first recognition result information as the first result information;
outputting the target result information;
the target result information comprises a plurality of pieces of sub-result information, and each piece of sub-result information in the plurality of pieces of sub-result information is one piece of difference information between the target object of the first version and the target object of the second version;
the outputting the target result information includes:
displaying at least one piece of sub-result information on a display screen of the terminal device, where the at least one piece of sub-result information is a piece of sub-result information of which the importance degree level is greater than or equal to a preset degree level threshold value in the plurality of pieces of sub-result information, and the importance degree level of one piece of sub-result information is determined by the importance degree level of the corresponding area of the difference information indicated by the one piece of sub-result information in the target object.
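As a minimal sketch of the steps recited in claim 1 (pairing images across the two versions, keeping the K groups whose similarity meets a preset threshold, extracting per-group difference information, and filtering sub-results by importance level), the code below uses flat pixel tuples in place of real screenshots. The similarity measure, threshold values, and function names are illustrative assumptions; a real system would apply image recognition rather than pixel-wise equality:

```python
from itertools import product

SIM_THRESHOLD = 0.9    # preset similarity threshold (illustrative value)
LEVEL_THRESHOLD = 2    # preset importance-level threshold (illustrative)

def similarity(img_a, img_b) -> float:
    """Stand-in similarity measure; a real system would use perceptual
    hashing or feature matching. Images here are flat pixel tuples."""
    matches = sum(1 for a, b in zip(img_a, img_b) if a == b)
    return matches / max(len(img_a), len(img_b))

def compare_versions(first_images, second_images):
    # Step 1: pair each i-th first image with each j-th second image and
    # keep the K groups whose similarity meets the preset threshold.
    groups = [
        (i, j)
        for (i, img_a), (j, img_b)
        in product(enumerate(first_images), enumerate(second_images))
        if similarity(img_a, img_b) >= SIM_THRESHOLD
    ]
    # Step 2: "recognize and compare" the images in each group; here the
    # recognition result is simply the set of differing pixel positions.
    results = []
    for i, j in groups:
        diff = [k for k, (a, b)
                in enumerate(zip(first_images[i], second_images[j]))
                if a != b]
        results.append({"group": (i, j), "diff": diff})
    return results

def filter_by_importance(sub_results, level_of):
    # Step 3: display only sub-results whose region importance level
    # meets the preset threshold (level_of maps a result to its level).
    return [r for r in sub_results if level_of(r) >= LEVEL_THRESHOLD]
```

Grouping by similarity first means the per-group comparison only runs on interfaces that plausibly correspond across versions, which keeps the difference report focused on changed interfaces rather than unrelated screens.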
2. The method of claim 1, wherein comparing the M first images to the N second images to obtain target result information further comprises:
and comparing the q-th first image in the M first images with the q-th second image in the N second images to obtain the second result information, wherein q is an integer from 1 to P, and P is the minimum value of M and N.
3. The method according to claim 1 or 2, wherein the comparing the q-th first image of the M first images with the q-th second image of the N second images to obtain the second result information comprises:
identifying and comparing a q-th first image in the M first images with a q-th second image in the N second images by adopting an image identification technology to obtain T pieces of second identification result information, wherein T is the maximum value of M and N;
and using at least one piece of second identification result information in the T pieces of second identification result information as the second result information.
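A minimal sketch of the position-wise comparison of claims 2 and 3: the q-th first image is paired with the q-th second image for q = 1..P with P = min(M, N), and interfaces present in only one version (one way T = max(M, N) results can arise) are reported as differences. The names and the boolean "identical" result are illustrative assumptions standing in for real image recognition:

```python
def compare_by_position(first_images, second_images):
    """Pair the q-th first image with the q-th second image, producing
    one second-result entry per position up to T = max(M, N)."""
    p = min(len(first_images), len(second_images))
    second_results = []
    for q in range(p):
        # Stand-in for image recognition on the positional pair.
        same = first_images[q] == second_images[q]
        second_results.append({"q": q + 1, "identical": same})
    # Interfaces beyond P exist in only one version, which is itself
    # a difference between the two versions of the target object.
    t = max(len(first_images), len(second_images))
    for q in range(p, t):
        second_results.append({"q": q + 1, "identical": False})
    return second_results
```

The positional pairing is cheap and catches interfaces that were added or removed between versions, complementing the similarity-grouped comparison that handles reordered interfaces.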
4. A terminal device, characterized by comprising an acquisition module, a comparison module, and an output module;
the acquiring module is configured to acquire a first image set and a second image set, where the first image set includes M first images, the second image set includes N second images, the M first images are images obtained by screenshot of the terminal device on M interfaces of a first version of a target object, and the N second images are images obtained by screenshot of the terminal device on N interfaces of a second version of the target object;
the comparison module is configured to compare the M first images and the N second images acquired by the acquisition module to obtain target result information, where the target result information is difference information between a target object of the first version and a target object of the second version, and the target result information includes first result information and second result information;
the comparison module is specifically configured to compare an ith first image of the M first images with a jth second image of the N second images to obtain the first result information; wherein i is an integer from 1 to M, and j is an integer from 1 to N;
the comparison module is specifically configured to compare an ith first image in the M first images with a jth second image in the N second images, and determine K image groups; identifying and comparing the images in each image group by adopting an image identification technology to obtain K pieces of first identification result information; and using at least one piece of the K pieces of first recognition result information as the first result information; in each image group of the K image groups, the similarity between the images is greater than or equal to a preset similarity threshold, each image group corresponds to first identification result information, and K is a positive integer;
the output module is used for outputting the target result information obtained by the comparison of the comparison module;
the target result information comprises a plurality of pieces of sub-result information, and each piece of sub-result information in the plurality of pieces of sub-result information is one piece of difference information between the target object of the first version and the target object of the second version;
the output module is specifically configured to display at least one piece of sub-result information on a display screen of the terminal device, where the at least one piece of sub-result information is a piece of sub-result information of which an importance level is greater than or equal to a preset degree level threshold in the plurality of pieces of sub-result information, and an importance level of one piece of sub-result information is determined by an importance level of a region corresponding to the difference information indicated by the one piece of sub-result information in the target object.
5. The terminal device of claim 4,
the comparison module is further specifically configured to compare a qth first image of the M first images with a qth second image of the N second images to obtain the second result information; wherein q is an integer from 1 to P, and P is the minimum of M and N.
6. The terminal device according to claim 5, wherein the comparing module is specifically configured to identify and compare a q-th first image in the M first images and a q-th second image in the N second images by using an image recognition technology to obtain T pieces of second recognition result information, and use at least one piece of second recognition result information in the T pieces of second recognition result information as the second result information, where T is a maximum value of M and N.
7. A terminal device, characterized by comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the information acquisition method according to any one of claims 1 to 3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811564076.1A CN109815349B (en) | 2018-12-20 | 2018-12-20 | Information acquisition method and terminal equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109815349A CN109815349A (en) | 2019-05-28 |
CN109815349B true CN109815349B (en) | 2022-02-22 |
Family
ID=66601730
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811564076.1A Active CN109815349B (en) | 2018-12-20 | 2018-12-20 | Information acquisition method and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109815349B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112906474B (en) * | 2019-08-01 | 2022-12-16 | 北京旷视科技有限公司 | Prompt message generation method and device, electronic equipment and storage medium |
CN111930826A (en) * | 2020-07-16 | 2020-11-13 | 深圳市富途网络科技有限公司 | Order generation method and system of software interface |
CN115225930B (en) * | 2022-07-25 | 2024-01-09 | 广州博冠信息科技有限公司 | Live interaction application processing method and device, electronic equipment and storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106557335A (en) * | 2015-09-28 | 2017-04-05 | 网易(杭州)网络有限公司 | The update method of application program, apparatus and system |
CN108170606B (en) * | 2018-01-03 | 2022-02-15 | 中国工商银行股份有限公司 | System and method for testing system software upgrading application |
CN108874392A (en) * | 2018-05-10 | 2018-11-23 | 中国联合网络通信集团有限公司 | User's guideline interface generation method and device |
- 2018-12-20: CN201811564076.1A filed in China; granted as CN109815349B (status: Active)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||