CN115686320A - Application analysis report generation method and device, computer equipment and storage medium - Google Patents

Application analysis report generation method and device, computer equipment and storage medium

Info

Publication number
CN115686320A
CN115686320A (application CN202211149521.4A)
Authority
CN
China
Prior art keywords: report, target, image, analysis, application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211149521.4A
Other languages
Chinese (zh)
Inventor
苏日娜
孙亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd (ICBC)
Priority to CN202211149521.4A
Publication of CN115686320A

Abstract

The present application relates to an application analysis report generation method, an application analysis report generation device, computer equipment, and a storage medium. It relates to the technical field of computers and can be used in the field of financial technology and other related fields. The method comprises: obtaining an object operation image that a wearable device captures and sends while a target object operates a target application; determining operation behavior data and interface element identification data from the object operation image; adding the operation behavior data and the interface element identification data to a preset analysis report template to obtain a report to be analyzed; analyzing the report to be analyzed to obtain an analysis result indicating whether each analysis item meets a standardized inspection condition; and generating, from the analysis result and the report to be analyzed, a target analysis report for auditing the target application. With this method, the target application can be analyzed against preset standardized inspection conditions to produce a target analysis report, improving the efficiency of both standardization auditing and analysis of the target application.

Description

Application analysis report generation method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of computers, and in particular, to a method, an apparatus, a computer device, a storage medium, and a computer program product for generating an application analysis report.
Background
Against the background of financial technology, internet financial products have become the outward image of an enterprise and an important medium for acquiring and retaining customers. However, because these products belong to different business departments, they are built by different development teams, and standards are applied inconsistently during design and development. As a result, application products released under the same brand are inconsistent, which gives customers a fragmented experience, hurts usability, and lowers the perceived professionalism of the products.
In conventional technology, however, standardizing internet financial products is costly, inefficient, and time-consuming, and the standardization process is error-prone. Application products are therefore analyzed slowly, which hinders any improvement in the efficiency of their standardization audits.
Disclosure of Invention
In view of the above technical problems, it is desirable to provide an application analysis report generation method, apparatus, computer device, computer-readable storage medium, and computer program product that can improve the efficiency of analyzing and auditing application products.
In a first aspect, the present application provides an application analysis report generation method, applied to a terminal device, the method including:
acquiring an object operation image sent by wearable equipment of a target object; the object operation image is an image acquired by the wearable device in the process of operating a target application by the target object;
determining operation behavior data of the target object and interface element identification data of the target application according to the object operation image;
adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed;
analyzing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets the corresponding standardized inspection condition;
generating a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
In one embodiment, the determining, according to the object operation image, the operation behavior data of the target object and the interface element identification data of the target application includes:
determining an eye movement track of the target object according to the eye image of the target object;
determining the position information of the fixation point of the target object in each application interface image according to the eye movement track;
and taking the position information as the operation behavior data of the target object.
In one embodiment, the determining, according to the object operation image, operation behavior data of the target object and interface element identification data of the target application includes:
performing image recognition on the application interface image, and determining an interface recognition result of the target application; the interface identification result comprises at least one of interface character information or interface icon information; the interface icon information comprises at least one of size information, position information and color information of the icon object;
and taking the interface identification result of the target application as interface element identification data of the target application.
In one embodiment, the obtaining a target analysis report according to the analysis result and the report to be analyzed includes:
adding the analysis result to the report to be analyzed to obtain a report to be annotated;
and acquiring a use record of the target object on the terminal, and annotating the report to be annotated by using the use record as annotation information to obtain the target analysis report.
In one embodiment, after obtaining the target analysis report according to the analysis result and the report to be analyzed, the method further includes:
sending an analysis report uploading request to a cloud platform; the analysis report uploading request is used for indicating the cloud platform to store the target analysis report carried by the analysis report uploading request;
the cloud platform is used for responding to a report processing request of the target analysis report, and executing report processing operation corresponding to the processing request on the target analysis report.
The present application further provides an application analysis report generation method, applied to a wearable device, the method including:
acquiring an object operation image generated by a target object in the process of operating a target application;
sending the object operation image to a terminal device; the terminal equipment is used for determining the operation behavior data of the target object and the interface element identification data of the target application according to the object operation image; adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed; analyzing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets a corresponding standardized inspection condition; generating a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
In one embodiment, the wearable device is provided with a first camera and a second camera, and the acquiring an object operation image generated by a target object in the process of operating a target application includes:
acquiring an image of the eye of the target object through the first camera to obtain an eye image corresponding to the target object;
acquiring an image of the interface of the target application through the second camera to obtain an application interface image corresponding to the target application;
and taking the eye image and the application interface image as the object operation image.
In one embodiment, the method further comprises:
displaying an image setting area in a user interface of the wearable device; the image setting area is used for representing an area where the application interface image is located in the user interface;
responding to the adjustment operation of the image setting area, and displaying the adjusted image setting area; the adjustment operation includes an adjustment operation of a focus area range or a focus position in the image setting area.
In one embodiment, the wearable device is further provided with a laser transmitter for transmitting laser onto a target display device; the target display device is a display device of an electronic device running the target application, and the method further comprises:
identifying a landing position of laser emitted by the laser emitter in the application interface image;
determining a collection state of the wearable device based on a distance between the drop point location and the focus area range or focus location;
and outputting acquisition adjustment prompt information through the wearable equipment under the condition that the acquisition state meets an abnormal acquisition condition.
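The acquisition-state check described above can be sketched as a simple distance test between the recognized laser drop point and the configured focus position. This is a minimal illustration, not the patent's implementation: the pixel threshold and the prompt text are assumptions, as the description does not specify concrete values.

```python
import math

# Hypothetical threshold -- the patent does not specify a concrete value.
MAX_OFFSET_PX = 50.0  # max allowed distance between laser dot and focus point

def acquisition_state(drop_point, focus_point, max_offset=MAX_OFFSET_PX):
    """Classify the capture state from the distance (in pixels) between the
    laser drop point and the configured focus position."""
    dx = drop_point[0] - focus_point[0]
    dy = drop_point[1] - focus_point[1]
    return "normal" if math.hypot(dx, dy) <= max_offset else "abnormal"

def maybe_prompt(drop_point, focus_point):
    """Return an adjustment prompt when the acquisition state is abnormal."""
    if acquisition_state(drop_point, focus_point) == "abnormal":
        return "Please re-aim the wearable device at the target display."
    return None
```

A landing point close to the focus position yields a "normal" state and no prompt; a large offset triggers the adjustment prompt.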
In one embodiment, the wearable device is further provided with an inertial sensing module, and the method further comprises:
acquiring the motion speed and the equipment posture of the wearable equipment through the inertial sensing module;
determining a wearing state of the wearable device based on the movement speed and the device posture;
and outputting wearing adjustment prompt information through the wearable equipment under the condition that the wearing state meets the abnormal wearing condition.
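The wearing-state check from the inertial sensing module can likewise be sketched as threshold tests on the motion speed and device posture. The thresholds below are purely illustrative assumptions; the patent leaves the abnormal-wearing condition unspecified.

```python
# Hypothetical thresholds for illustration only.
MAX_SPEED_M_S = 1.5   # above this, the device is assumed to be shaking
MAX_TILT_DEG = 30.0   # above this, the device is assumed to sit crooked

def wearing_state(speed_m_s, tilt_deg):
    """Derive the wearing state from inertial-sensor readings:
    motion speed (m/s) and device tilt angle (degrees)."""
    if speed_m_s > MAX_SPEED_M_S or abs(tilt_deg) > MAX_TILT_DEG:
        return "abnormal"
    return "normal"
```

When the returned state is "abnormal", the wearable device would output the wearing adjustment prompt described above.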
In a second aspect, the present application further provides an application analysis report generation apparatus, applied to a terminal device, where the apparatus includes:
the image acquisition module is used for acquiring an object operation image sent by wearable equipment of a target object; the object operation image is an image acquired by the wearable device in the process of operating a target application by the target object;
the data identification module is used for determining the operation behavior data of the target object and the interface element identification data of the target application according to the object operation image;
the data adding module is used for adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed;
the report analysis module is used for analyzing and processing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets the corresponding standardized inspection condition;
the report generation module is used for generating a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
The present application further provides an application analysis report generation apparatus, applied to a wearable device, the apparatus including:
the image acquisition module is used for acquiring an object operation image generated by a target object in the process of operating a target application;
the image sending module is used for sending the object operation image to the terminal equipment; the terminal equipment is used for determining the operation behavior data of the target object and the interface element identification data of the target application according to the object operation image; adding the operation behavior data and the interface element identification data to a preset analysis report template to obtain a report to be analyzed; analyzing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets a corresponding standardized inspection condition; generating a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
In a third aspect, the present application further provides a computer device, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the method in any one of the above aspects when executing the computer program.
In a fourth aspect, the present application also provides a computer-readable storage medium having a computer program stored thereon, which when executed by a processor, performs the steps of the method of any one of the above aspects.
In a fifth aspect, the present application also provides a computer program product comprising a computer program that, when executed by a processor, performs the steps of the method of any one of the above aspects.
The application analysis report generation method, apparatus, computer device, storage medium, and computer program product acquire the object operation image captured by the wearable device worn by the target object; determine the operation behavior data of the target object and the interface element identification data of the target application from the object operation image; add the operation behavior data and the interface element identification data to a preset analysis report template to obtain a report to be analyzed; analyze the report to be analyzed to obtain an analysis result; and generate a target analysis report from the analysis result and the report to be analyzed.
Drawings
For a better description and illustration of the embodiments and examples disclosed herein, reference may be made to one or more of the accompanying drawings. The additional details or examples used to describe the drawings should not be considered to limit the scope of any of the disclosed inventions, the presently described embodiments and examples, or the presently understood best modes of these inventions.
FIG. 1 is a diagram of an application environment in which a method for generating an application analysis report is implemented, according to an embodiment;
FIG. 2 is a flow diagram that illustrates a method for application analytics report generation, according to one embodiment;
FIG. 3 is a schematic flow chart diagram that illustrates the determination of operational behavior data and interface element identification data in one embodiment;
FIG. 4 is a schematic flow diagram illustrating the determination of a target analysis report in one embodiment;
FIG. 5 is a flow chart illustrating a method for generating an application analysis report according to another embodiment;
FIG. 6 is a schematic flow chart illustrating an embodiment of collecting an image of an operation of an object;
FIG. 7 is a flowchart illustrating an embodiment of adjusting a setting region of an image;
FIG. 8 is a schematic flow chart illustrating the determination of the acquisition state in one embodiment;
FIG. 9 is a schematic diagram of a process for determining a wearing state according to an embodiment;
FIG. 10 is a block diagram of an application analysis report generation apparatus according to an embodiment;
FIG. 11 is a block diagram showing the construction of an application analysis report generation apparatus according to another embodiment;
FIG. 12 is a diagram of the internal structure of a computer device, in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It should be noted that the application analysis report generation method, apparatus, computer device, storage medium, and computer program product disclosed in the present application may be applied to the field of financial technology, and may also be applied to any field other than the field of financial technology.
The application analysis report generation method provided by the embodiment of the application can be applied to the application environment shown in fig. 1. Wherein the terminal 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process. The data storage system may be integrated on the server 104, or may be located on the cloud or other network server. In practical application, the terminal 102 acquires an object operation image sent by the wearable device 106 of the target object; the object operation image is an image acquired in the process of operating a target application by a target object through wearable equipment; the terminal 102 determines operation behavior data of a target object and interface element identification data of a target application according to the object operation image; the terminal 102 adds the operation behavior data and the interface element identification data to a preset analysis report template to obtain a report to be analyzed; the terminal 102 analyzes the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets the corresponding standardized inspection condition; the terminal 102 generates a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application. The terminal 102 sends a target analysis report to the server 104. The terminal 102 may be, but is not limited to, various personal computers, notebook computers, smart phones, and tablet computers. The server 104 may be implemented as a stand-alone server or as a server cluster comprised of multiple servers. 
It is understood that the method may also be applied to the server 104, and may also be applied to a system comprising the terminal 102 and the server 104, and implemented through interaction of the terminal 102 and the server 104.
In an exemplary embodiment, as shown in fig. 2, an application analysis report generating method is provided, which is described in this embodiment by taking an example that the method is applied to a terminal, and in this embodiment, the method includes the following steps:
and S210, acquiring the object operation image sent by the wearable device of the target object.
The target object may refer to a person wearing the wearable device. In practical applications, that person may be, but is not limited to, a user of the target application or an application inspector performing an analysis review of the target application.
The object operation image may be an image acquired by the wearable device while the target object operates the target application. In practical applications, the image format of the object operation image may include, but is not limited to, Bitmap (BMP), Tagged Image File Format (TIFF), Joint Photographic Experts Group (JPEG), and Portable Network Graphics (PNG).
The target application may refer to an application program that needs to be analyzed and audited. For example, the target application may be an internet financial product that requires a standardized audit.
In some embodiments, when the target object needs to perform a standardized audit of an application program, the wearable device currently worn by the target object captures images while the target object operates the target application, producing the object operation image. The wearable device then sends the captured object operation image to the terminal, and the terminal receives the object operation image.
And S220, determining the operation behavior data of the target object and the interface element identification data of the target application according to the object operation image.
The operation behavior data may refer to behavior data of the target object collected by the wearable device while the target object operates the target application. In practical applications, it may include, but is not limited to, the eye movement speed of the target object, the dwell time of the eyes on different pages, and the movement time and angle of the wearable device.
The interface element identification data may be element identification data obtained by recognizing interface images of the target application collected by the wearable device while the target object operates the target application. In practical applications, the interface element identification data may include text data and icon data in the interface of the target application.
In some embodiments, the terminal performs image recognition on the object operation image using an image recognition technique to obtain the operation behavior data of the target object and the interface element identification data of the target application. For example, the terminal recognizes the object operation image through Optical Character Recognition (OCR) and obtains the text data within the interface element identification data of the target application.
And S230, adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed.
The analysis report template may refer to an analysis report reference template preset by the system. In practical applications, the format of the analysis report template includes, but is not limited to, Word format and a preset proprietary format; the analysis report template may also be an imported, manually completed template.
The report to be analyzed may refer to the report containing the content that needs to be analyzed. In practical applications, the format of the report to be analyzed includes, but is not limited to, Word format and a preset proprietary format.
In some embodiments, the terminal identifies an object operation image to obtain operation behavior data and interface element identification data, and the terminal adds the operation behavior data and the interface element identification data to a preset analysis report template according to a preset sequence to obtain a report to be analyzed.
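Step S230 can be sketched as merging the two recognized data sets into a preset template. This is a minimal illustration under assumed data structures: the field names (`operation_behavior`, `interface_elements`, `analysis_results`) are hypothetical, since the patent does not define the template schema.

```python
# Hypothetical template schema; the patent leaves the actual format
# (Word or a proprietary format) unspecified.
ANALYSIS_REPORT_TEMPLATE = {
    "operation_behavior": None,   # gaze positions, dwell times, ...
    "interface_elements": None,   # recognized text and icon data
    "analysis_results": [],       # filled in by the later analysis step
}

def build_report_to_analyze(behavior_data, element_data):
    """Add the recognized data to a copy of the preset template,
    yielding the report to be analyzed."""
    report = dict(ANALYSIS_REPORT_TEMPLATE)
    report["operation_behavior"] = behavior_data
    report["interface_elements"] = element_data
    return report
```

The resulting dictionary plays the role of the "report to be analyzed" consumed by step S240.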
S240, analyzing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets the corresponding standardized inspection condition.
As one example, the standardized check conditions may include, but are not limited to, rule check conditions, page check conditions, information definition check conditions, and behavior check conditions.
In some embodiments, the terminal checks the content in the report to be analyzed item by item based on a preset standardized checking condition to obtain an analysis result.
In an exemplary embodiment, the terminal checks the text information in the interface element identification data item by item against preset rule checking conditions. The terminal first determines a keyword in the text information. Taking the keyword 'detail' as an example, the terminal checks, according to the preset check rule corresponding to that keyword, whether the text information contains information elements such as product code, product number, product type, product effective days, product expiration date, product unit price, product fee rate, product charging mode, currency type, interest rate, currency/remittance, and third-party information. If all required elements are present, the terminal judges that the rule passes and moves on to the next keyword; if any element is missing, the terminal judges that the rule fails, records the nonconforming information, and then continues with the next keyword. This repeats until all keywords have been checked. After all rule checks are completed, the terminal generates a list of items that failed the rule check, yielding the rule check result.
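The keyword-driven rule check above can be sketched as follows. The 'detail' keyword and a few of its required elements are taken from the description; the `RULE_CHECKS` mapping itself and the lowercase matching are assumptions about how such rules might be encoded.

```python
# Hypothetical encoding of rule checking conditions: keyword -> required
# information elements (a subset of the examples in the description).
RULE_CHECKS = {
    "detail": [
        "product code", "product number", "product type",
        "product unit price", "currency type", "interest rate",
    ],
}

def rule_check(interface_text):
    """Return a list of (keyword, missing_element) pairs that failed."""
    failures = []
    text = interface_text.lower()
    for keyword, required in RULE_CHECKS.items():
        if keyword not in text:
            continue  # the rule only applies to pages containing the keyword
        for element in required:
            if element not in text:
                failures.append((keyword, element))
    return failures
```

An empty return value means every applicable rule passed; the failure list corresponds to the "information that does not meet the requirements" recorded by the terminal.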
In an exemplary embodiment, the terminal checks the interface information in the interface element identification data item by item against preset page checking conditions to judge whether the page design meets the standardization requirements. Taking the size of the button at the bottom of the page as an example, the terminal judges from the image recognition result whether the button size meets the requirement. If it does, the terminal judges that the check passes and continues with the next item; if it does not, the terminal judges that the check fails, records the nonconforming information, and then continues with the next item, until all items have been checked. After all page checks are completed, the terminal generates a list of items that failed the page check, yielding the page check result.
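The bottom-button size check can be sketched as a range test on the recognized button dimensions. The allowed height range is a hypothetical placeholder; the actual standard would define the real values.

```python
# Hypothetical size requirement -- the patent does not state the real range.
MIN_BUTTON_HEIGHT_PX = 40
MAX_BUTTON_HEIGHT_PX = 60

def check_button_size(button):
    """button: dict with recognized 'width' and 'height' in pixels.
    Returns a check-result record in the spirit of the page check."""
    ok = MIN_BUTTON_HEIGHT_PX <= button["height"] <= MAX_BUTTON_HEIGHT_PX
    return {
        "item": "bottom_button_size",
        "passed": ok,
        "detail": None if ok else f"height {button['height']}px out of range",
    }
```

Records with `passed` set to `False` would be collected into the page-check failure list described above.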
In an exemplary embodiment, the terminal checks the input and output information definitions in the interface element identification data item by item against preset information definition checking conditions. Taking the page name check as an example, the preset standardization rule requires the page name format to be 'atomic function name | page name'. The terminal judges from the image recognition result whether the page name meets this requirement. If it does, the terminal judges that the check passes and continues with the next item; if it does not, the terminal judges that the check fails, records the nonconforming information, and then continues with the next item, until all items have been checked. After all checks are completed, the terminal generates a list of information definitions that failed the check, yielding the information definition check result.
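The page-name format check can be sketched with a regular expression. The 'atomic function name | page name' format comes from the description; the exact separator handling (surrounding spaces, disallowing extra '|' characters) is an assumption.

```python
import re

# "atomic function name | page name": exactly one '|' separator,
# non-empty parts on both sides, optional surrounding whitespace.
PAGE_NAME_PATTERN = re.compile(r"^[^|]+\s*\|\s*[^|]+$")

def check_page_name(page_name):
    """Return True if the page name matches the standardized format."""
    return bool(PAGE_NAME_PATTERN.match(page_name))
```

Names that fail the check would be recorded in the information-definition failure list described above.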
As one example, the analysis results may include, but are not limited to, rule check results, page check results, and information definition check results.
S250, generating a target analysis report according to the analysis result and the report to be analyzed; and the target analysis report is used for auditing the target application.
The format of the target analysis report includes, but is not limited to, word format and preset proprietary format.
In some embodiments, after the analysis is completed, the terminal generates the target analysis report from the analysis result and the report to be analyzed. In practical applications, the terminal may add the analysis result to the report to be analyzed in real time, or may write the analysis result into a separate preset file and then combine that file with the report to be analyzed.
With the above application analysis report generation method, apparatus, computer device, storage medium, and computer program product, the object operation image captured by the wearable device worn by the target object is acquired; the operation behavior data of the target object and the interface element identification data of the target application are determined from the object operation image; the operation behavior data and the interface element identification data are added to a preset analysis report template to obtain a report to be analyzed; the report to be analyzed is analyzed to obtain an analysis result; and a target analysis report is generated from the analysis result and the report to be analyzed.
In an exemplary embodiment, as shown in fig. 3, the determining, according to the object operation image, operation behavior data of the target object and interface element identification data of the target application includes:
and S310, determining the eye movement track of the target object according to the eye image of the target object.
The eye image may be an image of an eye of the target object, which is acquired by a camera arranged on the wearable device in a process that the wearable device operates the target application; the eye movement trajectory may refer to a movement trajectory of an eye of the target object obtained from the eye image analysis.
In some embodiments, the terminal determines a preset feature in the eye images as a reference point, using at least two eye images of the target object collected by the wearable device. In practical applications, the preset feature may include the pupil position. The terminal obtains the eye movement trajectory of the target object by comparing the position change of the reference point across the at least two eye images.
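Given pupil reference-point positions already extracted from successive eye images, the trajectory comparison above can be sketched as a sequence of per-frame displacement vectors. Pupil detection itself (an image-processing step) is out of scope here, and this representation of the trajectory is an assumption.

```python
def eye_movement_trajectory(pupil_positions):
    """pupil_positions: list of (x, y) pupil centers, one per eye image.
    Returns the displacement vector between each pair of consecutive frames,
    i.e. how the reference point moved from one eye image to the next."""
    trajectory = []
    for (x0, y0), (x1, y1) in zip(pupil_positions, pupil_positions[1:]):
        trajectory.append((x1 - x0, y1 - y0))
    return trajectory
```

With fewer than two eye images there is no position change to compare, so the trajectory is empty.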
And S320, determining the position information of the fixation point of the target object in each application interface image according to the eye movement track.
As an example, the gaze point may refer to the point on the application interface at which the pupil of the target object is directed.
In some embodiments, the terminal determines a movement track of a gaze point of the target object according to the eye movement track, determines position information of the gaze point of the target object in the application interface image based on a correspondence between the eye movement track and the application interface image, and determines position information change data of the gaze point in the application interface based on the position information and the movement track of the gaze point.
And S330, using the position information as the operation behavior data of the target object.
The position information may be position information corresponding to the point of regard in the application interface.
In some embodiments, the terminal calculates, from the operation behavior data, the dwell time of the target object's eyes on different coordinates in the application page, providing a decision basis for adjusting the page content, structure, and salient marks; in practical applications, the operation behavior data of the target object can also be used for behavior checks.
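The dwell-time calculation above can be sketched as follows, under the assumptions that gaze-point samples arrive at a fixed interval and that page coordinates are quantized to grid cells (both the grid size and the sampling interval are illustrative, not specified by the method):

```python
# Hypothetical sketch: accumulate how long the gaze stays on each region of
# the application page, given gaze-point samples taken at a fixed interval.
from collections import defaultdict

def dwell_times(gaze_samples, sample_interval_ms=20, cell_px=50):
    """Map each visited grid cell of the page to accumulated dwell time (ms)."""
    times = defaultdict(int)
    for x, y in gaze_samples:
        cell = (x // cell_px, y // cell_px)  # quantize coordinates to a cell
        times[cell] += sample_interval_ms
    return dict(times)

# Example: two samples in the top-left cell, one further right.
print(dwell_times([(10, 10), (20, 20), (120, 30)]))
# {(0, 0): 40, (2, 0): 20}
```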
In this embodiment, the eye image in the object operation image is recognized and analyzed, the position information of the gaze point of the target object in the application interface image is determined, and the position information is used as the operation behavior data of the target object, so that the operation behavior of the target object can be captured, providing a basis for analyzing the operation behavior data of the target object.
In an exemplary embodiment, the determining, according to the object operation image, operation behavior data of the target object and interface element identification data of the target application includes:
performing image recognition on the application interface image, and determining an interface recognition result of the target application; the interface identification result comprises at least one of interface character information or interface icon information; the interface icon information includes at least one of size information, position information, and color information of the icon object.
The interface recognition result may refer to element information included in an application interface image obtained through image recognition, and in practical application, the interface recognition result may include, but is not limited to, text information, page design information, and page icon information in an interface.
In some embodiments, the terminal performs image recognition on the application interface image based on an image recognition technology and obtains the interface recognition result corresponding to the target application according to preset standardized auditing rules. In practical applications, the preset standardized auditing rules may include, but are not limited to, the positional relationship between the interface text information and the interface icon information, and can be formulated according to specific business requirements. For example: button icons in the interface are arranged at the bottom of the interface; when a page has multiple buttons, the 'back' button is positioned below all associated transaction buttons, the associated transaction buttons are all solid, and the 'back' button is white; when the application displays an operation-success interface, the size of the icon identifying the successful operation may be 128px * 80px; when a page has only a 'back' button, that button is solid; and the format of dates in the interface text information includes, but is not limited to, day/month/year and year-month-day.
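The example rules above can be expressed as simple check functions over the recognized interface elements. The record fields (`width`, `label`, `y`, `color`, `style`) are assumptions made for illustration; the method itself does not fix a schema:

```python
# Illustrative standardized-auditing checks over recognized interface
# elements; field names and record shapes are assumptions, not a fixed API.
import re

def check_success_icon_size(icon):
    """Operation-success icon is expected to be 128px * 80px."""
    return icon["width"] == 128 and icon["height"] == 80

def check_back_button(buttons):
    """With multiple buttons, 'back' sits below all transaction buttons,
    the transaction buttons are solid, and 'back' is white."""
    back = [b for b in buttons if b["label"] == "back"]
    others = [b for b in buttons if b["label"] != "back"]
    if not back or not others:
        return True  # rule only applies when both kinds are present
    b = back[0]
    below_all = all(b["y"] > o["y"] for o in others)  # larger y = lower
    colors_ok = b["color"] == "white" and all(o["style"] == "solid" for o in others)
    return below_all and colors_ok

def check_date_format(text):
    """Accepted date formats: day/month/year or year-month-day."""
    return bool(re.fullmatch(r"\d{1,2}/\d{1,2}/\d{4}|\d{4}-\d{1,2}-\d{1,2}", text))

print(check_date_format("2022-09-21"))  # True
print(check_date_format("21.09.2022"))  # False
```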
And taking the interface identification result of the target application as the interface element identification data of the target application.
In this embodiment, the interface recognition result of the target application is determined by recognizing and analyzing the application interface image in the object operation image, and the interface recognition result is used as the interface element identification data of the target application, so that the interface elements of the target application can be captured, providing a basis for analyzing the interface element identification data of the target application.
In an exemplary embodiment, as shown in fig. 4, the obtaining a target analysis report according to the analysis result and the report to be analyzed includes:
and S410, adding the analysis result to the report to be analyzed to obtain a report to be annotated.
The report to be annotated may be a report that needs to be annotated after the analysis is completed, and in practical application, the format of the report to be annotated includes, but is not limited to, a Word format and a preset special format.
And S420, acquiring a use record of the target object on the terminal, and annotating the report to be annotated by using the use record as annotation information to obtain the target analysis report.
Here, the usage record may refer to data for recording usage traces.
In practice, usage records may include, but are not limited to, user information of the analysis tool, the IP address used, and process logs.
In some embodiments, the terminal obtains a usage record related to the operation in the usage record database, and the terminal annotates the to-be-annotated report with the usage record as annotation information to obtain a target analysis report corresponding to the target application.
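A minimal sketch of the annotation step above, assuming the report to be annotated and the usage record are available as simple dictionaries (the field names `user`, `ip`, and `process_log` are illustrative, taken from the usage-record examples earlier):

```python
# Hypothetical sketch: attach the usage record to the analyzed report as
# annotation information, yielding the target analysis report.
def annotate_report(report_to_annotate, usage_record):
    """Return the target analysis report with the usage record attached."""
    annotated = dict(report_to_annotate)  # leave the input report unchanged
    annotated["annotations"] = {
        "user": usage_record.get("user"),
        "ip": usage_record.get("ip"),
        "process_log": usage_record.get("process_log", []),
    }
    return annotated

report = annotate_report({"items": ["analysis result"]},
                         {"user": "auditor01", "ip": "10.0.0.1"})
```

In a real deployment the report would more likely live in a Word or other document format, as the text notes; the dictionary form here only shows the association between usage record and report.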
In this embodiment, the usage record of the target object on the terminal is used as annotation information to annotate the report to be annotated, which associates the usage record with the target analysis report, enriches the information content of the target analysis report, and at the same time allows the generation process of the target analysis report to be monitored and recorded.
In an exemplary embodiment, after obtaining the target analysis report according to the analysis result and the report to be analyzed, the method further includes:
sending an analysis report uploading request to a cloud platform; the analysis report uploading request is used for indicating the cloud platform to store the target analysis report carried by the analysis report uploading request; the cloud platform is used for responding to a report processing request of the target analysis report and executing report processing operation corresponding to the processing request on the target analysis report.
As an example, the report processing request may include, but is not limited to, a store request, a view request, and a call request.
In some embodiments, the cloud platform is further provided with a communication module, a storage module, a control module, a distribution module, a tool updating module, and a client information checking module. The communication module is used for interacting with the wearable device and the terminal. The storage module is used for storing, viewing, and managing reports, as well as the scanning information of the terminal and the wearable device; it can also provide an API and support multi-application, multi-platform access and calling. The control module is used for checking and managing the running state of the terminal. The distribution module is used for checking and managing manual posts and supports distributing tasks. The tool updating module is used for developing, updating, debugging, and uploading new tools directly within the module; it can rely on various development languages such as python and vb, and supports auxiliary development of user UI interaction by means of screen recording, control capture, and the like. The client information checking module is used for checking the version information of each local client and the intelligent wearable device to determine which local clients need to be updated.
In an exemplary embodiment, the terminal is further provided with a scanning module, an installation module, a document version detection module, and a debugging module. Based on a preset period, the terminal automatically starts the scanning module to scan the application version information of the terminal; after the scanning is completed, the scanning module sends update information such as the terminal's application version numbers, IP, and MAC address to the storage module in the cloud platform. In practical applications, the preset period parameter of the scanning module can be adjusted according to the requirements of the target object; the adjusted period parameter includes, but is not limited to, once a week, once a month, once a quarter, and once a year. Taking application updating on the terminal as an example: the terminal compares the scanning result obtained by the scanning module with the latest scanning information corresponding to the terminal in the storage module of the cloud platform, determines the applications that need updating, and generates a list to be updated; according to the IP and MAC addresses in the update information, the terminal obtains from the cloud platform the update commands and installation packages for the list to be updated, installs and updates the applications, and executes operations such as registration installation, script execution, and plug-in installation. After installation, the document version detection module detects the document editing software on the terminal; if the document editing software does not meet the version requirements, the latest version of document editing software such as word and wps is automatically installed, and if it does, the flow directly enters the debugging module, which executes operations such as creating new paths and new standard word banks. After installation and updating, an installation log is generated and sent to the cloud platform, and the cloud platform updates the installation log.
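The comparison that produces the list to be updated can be sketched as below. Dotted integer version strings and the dictionary shapes are assumptions for illustration; the actual scanning information format is not specified:

```python
# Hypothetical sketch: compare the terminal's scanned application versions
# with the latest scanning information stored on the cloud platform and
# generate the list of applications to be updated.
def apps_to_update(local_scan, cloud_latest):
    """Return names of applications whose local version is behind the cloud's.

    Versions are compared as dotted integer tuples (an assumption)."""
    def key(version):
        return tuple(int(part) for part in version.split("."))
    return [app for app, ver in local_scan.items()
            if app in cloud_latest and key(ver) < key(cloud_latest[app])]

print(apps_to_update({"word": "1.2.0", "wps": "3.0.1"},
                     {"word": "1.3.0", "wps": "3.0.1"}))
# ['word']
```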
In this embodiment, the target analysis report is uploaded to the cloud platform through the analysis report uploading request, so that the target analysis report can be managed on the cloud platform and conveniently called.
In an exemplary embodiment, as shown in fig. 5, there is provided an application analysis report generation method applied to a wearable device, the method including:
and S510, acquiring an object operation image generated by the target object in the process of operating the target application.
S520, sending the object operation image to the terminal equipment; the terminal equipment is used for determining the operation behavior data of the target object and the interface element identification data of the target application according to the object operation image; adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed; analyzing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets the corresponding standardized inspection condition; generating a target analysis report according to the analysis result and the report to be analyzed; and the target analysis report is used for auditing the target application.
In this embodiment, the wearable device collects the object operation image and sends it to the terminal device, so that the object operation image can be exchanged between the wearable device and the terminal device after collection, making it convenient for the terminal device to identify and analyze the object operation image.
In an exemplary embodiment, as shown in fig. 6, the wearable device is provided with a first camera and a second camera, and the acquiring an object operation image generated by a target object in the process of operating a target application includes:
and S610, acquiring an image of the eye of the target object through the first camera, and obtaining an eye image corresponding to the target object.
The first camera may refer to a camera for acquiring an image of an eye of the target object; the second camera may refer to a camera for image acquisition of an interface of the target application.
And S620, acquiring an image of the interface of the target application through the second camera to obtain an application interface image corresponding to the target application.
S630, using the eye image and the application interface image as the object operation image.
In some embodiments, the first camera performs image acquisition from the front side to obtain an eye image corresponding to the target object, the second camera performs image acquisition from the side to obtain an application interface image corresponding to the target application, and the wearable device uses the eye image and the application interface image as the object operation image.
In this embodiment, the wearable device collects the eye image corresponding to the target object and the application interface image corresponding to the target application, so that the collection of the operation image of the object can be realized, and a data basis is provided for the analysis of the operation image of the object.
In an exemplary embodiment, as shown in fig. 7, the method further includes:
s710, displaying an image setting area in a user interface of the wearable device; the image setting area is used for representing the area of the application interface image in the user interface.
The user interface may refer to an interface through which the target object interacts with the wearable device.
S720, responding to the adjustment operation of the image setting area, and displaying the adjusted image setting area; the adjustment operation includes an adjustment operation of a focal region range or a focal position in the image setting region.
As an example, the image setting area may include or overlap with the focus area range.
In some embodiments, the target object adjusts the image setting area according to the operation state or the operation habit, the wearable device responds to the adjustment operation of the target object for the image setting area, and the wearable device correspondingly adjusts the range size or the position of the image setting area based on the adjustment operation.
In this embodiment, the flexibility of the wearable device when acquiring the operation image of the object can be increased by adjusting the image setting area of the wearable device.
In an exemplary embodiment, as shown in fig. 8, the wearable device is further provided with a laser emitter for emitting laser onto a target display device; the target display device is a display device of an electronic device running the target application, and the method further includes:
and S810, identifying the position of the falling point of the laser emitted by the laser emitter in the application interface image.
The laser emitter may emit laser light, and the laser light may include visible light or invisible light.
In some embodiments, the terminal acquires an application interface image acquired by the second camera, extracts a laser drop point in the application interface by a preset extraction method to obtain drop point information, and performs denoising processing on the drop point information to obtain drop point position information in the application interface.
S820, determining the acquisition state of the wearable device based on the distance between the drop point position and the focal area range or the focal position.
The focal area range may refer to the area, within the image setting area, that is to be analyzed and checked; this area occupies a preset proportion of the image setting area, and in practical applications the preset proportion may include, for example, 1/3 or 1/2.
The focal position may refer to the position of the central point of the focal area range; in practical applications, the shape of the focal area range may include a circle or a rectangle.
The acquisition state may refer to the working state of the wearable device when acquiring the target application; in practical applications, the acquisition state may include, but is not limited to, image captured, image not captured, focusing, and defocusing.
In some embodiments, the position information of the focal area range or the focal position is acquired through image recognition; whether the laser drop point is within the focal area range is determined from the drop point position information and the position information of the boundary of the focal area range, and the distance between the laser drop point and the focus is determined from the drop point position information and the focus position information. If the laser drop point is within the focal area range, the state is judged to be image captured; otherwise, the state is image not captured. If the distance between the laser drop point and the focus is greater than a preset value, the state is judged to be defocusing; if it is less than or equal to the preset value, the state is focusing. By determining the relationship between the laser drop point and the focal area range or focal position, whether an image has been acquired can be judged, and the method can also be used to assist focusing.
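The state judgment above can be sketched as a single classification function, under the assumption of a circular focal area (the text allows circles or rectangles) with the focus at its center; the radius and threshold values are illustrative parameters:

```python
# Hypothetical sketch: classify the wearable device's acquisition state
# from the laser drop point's distance to the focal position, assuming a
# circular focal area centered on the focus.
import math

def acquisition_state(drop_point, focus_center, focus_radius, focus_threshold):
    """Return 'image not captured', 'defocusing', or 'focusing'."""
    dist = math.dist(drop_point, focus_center)  # Euclidean distance
    if dist > focus_radius:
        return "image not captured"   # drop point outside the focal area
    if dist > focus_threshold:
        return "defocusing"           # captured, but beyond the preset value
    return "focusing"

print(acquisition_state((0, 8), (0, 0), focus_radius=10, focus_threshold=5))
# defocusing
```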
And S830, outputting acquisition adjustment prompt information through the wearable device under the condition that the acquisition state meets the abnormal acquisition condition.
The abnormal acquisition condition may refer to an acquisition state that does not meet a preset requirement.
The acquisition adjustment prompt information may be information for prompting the target object to adjust the acquisition action; in practical applications, the prompt mode of the acquisition adjustment prompt information may include voice broadcast and text prompt.
In some embodiments, after the wearable device obtains the position information of the focus area range or the focus position, if the laser drop point is not in the focus area range or the distance between the laser drop point and the focus is greater than a preset value, the wearable device may prompt the target object to adjust the collection action in a voice broadcast manner, or display preset collection adjustment text information through a user interface of the wearable device.
In this embodiment, by determining the positional relationship between the laser drop point and the focal area range or focal position, the acquisition state can be monitored in real time while the target object uses the wearable device for acquisition, whether an image is captured or out of focus can be judged, and monitoring and correction prompting of the wearable device's acquisition state are realized.
In an exemplary embodiment, as shown in fig. 9, the wearable device is further provided with an inertial sensing module, and the method further includes:
s910, acquiring the motion speed and the equipment posture of the wearable equipment through the inertial sensing module.
The inertial sensing module can be a module for measuring the movement speed and the equipment posture of the wearable equipment; the movement speed of the wearable device may refer to a movement speed characterizing the wearable device while working; the device pose of the wearable device may refer to a device pose that characterizes when the wearable device is worn by the target object.
And S920, determining the wearing state of the wearable device based on the movement speed and the device posture.
The wearing state of the wearable device may be a physical state of the wearable device when worn by the target user; in practical applications, the wearing state can be obtained by measuring, via the inertial sensing module, the angle between the wearable device and the vertical direction.
In some embodiments, after the wearable device obtains the movement speed, the movement speed is compared with a preset speed threshold value to determine whether the movement speed meets a preset requirement; after the wearable device acquires the device posture, the included angle degree used for representing the device posture is compared with a preset included angle threshold value, and whether the device posture meets the preset requirement is determined.
And S930, outputting, by the wearable device, wearing adjustment prompt information when the wearing state satisfies an abnormal wearing condition.
The abnormal wearing condition may refer to a wearing state which does not meet a preset requirement;
the wearing adjustment prompt information may be information for prompting the target object to adjust the wearing state, and in practical application, the prompt mode of wearing adjustment prompt information may include voice broadcast and text prompt.
In some embodiments, the wearable device obtains its movement speed and device posture through the inertial sensing module. If the movement speed is greater than a preset speed threshold, or the difference between the angle representing the device posture and a preset angle threshold falls outside a preset error interval, the wearable device judges that the wearing state needs to be adjusted, and may prompt the target object to adjust the wearing state by voice broadcast or by displaying preset wearing-adjustment text through its user interface. For example, when the target object rotates the head quickly while wearing the wearable device, or the wearing angle of the wearable device is too inclined, the wearable device outputs the wearing-adjustment prompt text to prompt the target user to adjust the wearing state.
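Combining the speed and angle checks, the wearing-state judgment can be sketched as below; all threshold values are illustrative assumptions, since the method leaves the preset thresholds unspecified:

```python
# Hypothetical sketch: judge the wearing state from the movement speed and
# the tilt angle reported by the inertial sensing module. All thresholds
# are assumed values for illustration.
def wearing_state(speed, tilt_deg, speed_limit=1.5, tilt_ref=0.0, tilt_tol=15.0):
    """Return 'normal', or an 'abnormal: ...' reason needing a prompt."""
    if speed > speed_limit:
        return "abnormal: moving too fast"       # e.g. rapid head rotation
    if abs(tilt_deg - tilt_ref) > tilt_tol:
        return "abnormal: worn at too steep an angle"
    return "normal"

print(wearing_state(speed=0.5, tilt_deg=5.0))   # normal
print(wearing_state(speed=0.5, tilt_deg=30.0))  # abnormal: worn at too steep an angle
```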
In some embodiments, the acquisition adjustment prompt and the wearing adjustment prompt may be implemented as a prompt function that the target object can selectively turn on or off according to specific circumstances such as usage habits; after the target object turns off the prompt function, the wearable device no longer outputs the acquisition adjustment prompt information or the wearing adjustment prompt information based on the acquisition state and the wearing state.
In an exemplary embodiment, the wearable device, the terminal, and the cloud platform may be connected via a communication link, which may include a plurality of connection types, such as a wired and/or wireless communication link.
In this embodiment, the movement speed and device posture of the wearable device are acquired through the inertial sensing module to determine the wearing state of the wearable device, so that the wearing state can be monitored in real time while the target object uses the wearable device for acquisition, making it convenient to detect the wearing state and issue correction prompts, thereby realizing monitoring of the wearing state of the wearable device.
It should be understood that, although the steps in the flowcharts of the embodiments described above are displayed sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in those flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different times; their execution order is not necessarily sequential, and they may be performed in turn or alternately with other steps, or with at least part of the sub-steps or stages of other steps.
Based on the same inventive concept, the embodiment of the present application further provides an application analysis report generation apparatus for implementing the application analysis report generation method. The implementation scheme for solving the problem provided by the device is similar to the implementation scheme described in the above method, so specific limitations in one or more embodiments of the application analysis report generation device provided below can be referred to the limitations in the above application analysis report generation method, and are not described herein again.
In an exemplary embodiment, as shown in fig. 10, there is provided an application analysis report generation apparatus, applied to a terminal device, including: an image acquisition module 1010, a data identification module 1020, a data addition module 1030, a report analysis module 1040, and a report generation module 1050, wherein:
the image acquisition module 1010 is used for acquiring an object operation image sent by wearable equipment of a target object; the object operation image is an image acquired by the wearable device in the process of operating a target application by the target object;
a data identification module 1020, configured to determine, according to the object operation image, operation behavior data of the target object and interface element identification data of the target application;
the data adding module 1030 is configured to add the operation behavior data and the interface element identification data to a preset analysis report template to obtain a report to be analyzed;
the report analysis module 1040 is configured to analyze the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets a corresponding standardized inspection condition;
a report generating module 1050, configured to generate a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
In an exemplary embodiment, the data identification module 1020 is specifically configured to determine an eye movement trajectory of the target object according to the eye image of the target object, determine position information of a gaze point of the target object in each application interface image according to the eye movement trajectory, and use the position information as operation behavior data of the target object.
In an exemplary embodiment, the data identification module 1020 is further specifically configured to perform image identification on the application interface image, and determine an interface identification result of the target application; the interface identification result comprises at least one of interface character information or interface icon information; the interface icon information comprises at least one of size information, position information and color information of an icon object, and an interface identification result of the target application is used as interface element identification data of the target application.
In an exemplary embodiment, the report generating module 1050 is specifically configured to add the analysis result to the report to be analyzed to obtain a report to be annotated, obtain a usage record of the target object on the terminal, and annotate the report to be annotated by using the usage record as annotation information to obtain the target analysis report.
In an exemplary embodiment, the apparatus further includes a report uploading module, where the report uploading module is configured to send an analysis report uploading request to the cloud platform; the analysis report uploading request is used for indicating the cloud platform to store the target analysis report carried by the analysis report uploading request; the cloud platform is used for responding to a report processing request of the target analysis report, and executing report processing operation corresponding to the processing request on the target analysis report.
In an exemplary embodiment, as shown in fig. 11, there is provided an application analysis report generation apparatus applied to a wearable device, including: an image acquisition module 1110 and an image transmission module 1120, wherein:
an image collecting module 1110, configured to collect an object operation image generated by a target object in an operation target application process;
an image sending module 1120, configured to send the object operation image to a terminal device; the terminal equipment is used for determining the operation behavior data of the target object and the interface element identification data of the target application according to the object operation image; adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed; analyzing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets a corresponding standardized inspection condition; generating a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
In an exemplary embodiment, the image capturing module 1110 is specifically configured to perform image capture of the eye of the target object through the first camera to obtain an eye image corresponding to the target object, perform image capture of the interface of the target application through the second camera to obtain an application interface image corresponding to the target application, and use the eye image and the application interface image as the object operation image.
In an exemplary embodiment, the apparatus further includes a focus adjustment module, configured to display an image setting area in a user interface of the wearable device; the image setting area is used for representing an area where the application interface image is located in the user interface, responding to adjustment operation on the image setting area, and displaying the adjusted image setting area; the adjustment operation includes an adjustment operation of a focus area range or a focus position in the image setting area.
In an exemplary embodiment, the above apparatus further includes an acquisition adjustment module, the wearable device is further provided with a laser emitter, and the laser emitter is configured to emit laser onto a target display device; the target display device is a display device of an electronic device running the target application, the acquisition adjusting module is used for identifying a falling point position of laser emitted by the laser emitter in the application interface image, determining an acquisition state of the wearable device based on a distance between the falling point position and the focus area range or the focus position, and outputting acquisition adjusting prompt information through the wearable device under the condition that the acquisition state meets an abnormal acquisition condition.
In an exemplary embodiment, the apparatus further includes a wearing adjustment module, where the wearing adjustment module is configured to obtain a movement speed and an apparatus posture of the wearable apparatus through an inertial sensing module, determine a wearing state of the wearable apparatus based on the movement speed and the apparatus posture, and output wearing adjustment prompt information through the wearable apparatus when the wearing state meets an abnormal wearing condition.
The modules in the application analysis report generation device can be implemented in whole or in part by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent of a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In an exemplary embodiment, a computer device is provided, which may be a terminal, and whose internal structure diagram may be as shown in fig. 12. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input device. The processor, the memory, and the input/output interface are connected by a system bus, and the communication interface, the display unit, and the input device are connected to the system bus through the input/output interface. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The input/output interface of the computer device is used for exchanging information between the processor and an external device. The communication interface of the computer device is used for communicating with an external terminal in a wired or wireless manner; the wireless manner can be realized through Wi-Fi, a mobile cellular network, NFC (near field communication), or other technologies. The computer program, when executed by the processor, implements an application analysis report generation method. The display unit of the computer device is used for forming a visual picture and can be a display screen, a projection device, or a virtual reality imaging device; the display screen can be a liquid crystal display screen or an electronic ink display screen. The input device of the computer device can be a touch layer covering the display screen, a key, a trackball, or a touch pad arranged on the housing of the computer device, or an external keyboard, touch pad, or mouse.
Those skilled in the art will appreciate that the structure shown in fig. 12 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer devices to which the solution is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In an exemplary embodiment, a computer device is provided, comprising a memory and a processor, the memory storing a computer program, and the processor implementing the steps of the above method embodiments when executing the computer program.
In an exemplary embodiment, a computer-readable storage medium is provided, on which a computer program is stored; the computer program, when executed by a processor, implements the steps of the above method embodiments.
In an exemplary embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the above method embodiments.
It should be noted that the user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, displayed data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by all parties, and the collection, use, and processing of the related data must comply with the relevant laws, regulations, and standards of the relevant countries and regions.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase-change memory (PCM), graphene memory, and the like. The volatile memory may include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases; the non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, etc., without limitation.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination of technical features involves no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and while their description is relatively specific and detailed, it should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (15)

1. An application analysis report generation method, applied to a terminal device, the method comprising:
acquiring an object operation image sent by wearable equipment of a target object; the object operation image is an image acquired by the wearable device in the process of operating a target application by the target object;
determining operation behavior data of the target object and interface element identification data of the target application according to the object operation image;
adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed;
analyzing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets a corresponding standardized inspection condition;
generating a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
2. The method according to claim 1, wherein the object operation image comprises an application interface image of the target application and an eye image of the target object, and the determining operation behavior data of the target object and interface element identification data of the target application according to the object operation image comprises:
determining the eye movement track of the target object according to the eye image of the target object;
determining the position information of the fixation point of the target object in each application interface image according to the eye movement track;
and taking the position information as the operation behavior data of the target object.
3. The method according to claim 2, wherein the determining operation behavior data of the target object and interface element identification data of the target application according to the object operation image comprises:
performing image recognition on the application interface image, and determining an interface recognition result of the target application; the interface identification result comprises at least one of interface character information or interface icon information; the interface icon information comprises at least one of size information, position information and color information of the icon object;
and taking the interface identification result of the target application as interface element identification data of the target application.
4. The method according to claim 1, wherein the generating a target analysis report according to the analysis result and the report to be analyzed comprises:
adding the analysis result to the report to be analyzed to obtain a report to be annotated;
and acquiring a use record of the target object on the terminal device, and annotating the report to be annotated by using the use record as annotation information to obtain the target analysis report.
5. The method according to claim 1, wherein after the generating a target analysis report according to the analysis result and the report to be analyzed, the method further comprises:
sending an analysis report uploading request to a cloud platform; the analysis report uploading request is used for indicating the cloud platform to store the target analysis report carried by the analysis report uploading request;
the cloud platform is used for responding to a report processing request of the target analysis report and executing report processing operation corresponding to the processing request on the target analysis report.
6. An application analysis report generation method applied to a wearable device, the method comprising:
acquiring an object operation image generated by a target object in the process of operating a target application;
sending the object operation image to a terminal device; the terminal equipment is used for determining the operation behavior data of the target object and the interface element identification data of the target application according to the object operation image; adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed; analyzing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets the corresponding standardized inspection condition; generating a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
7. The method according to claim 6, wherein the wearable device is provided with a first camera and a second camera, and the acquiring an object operation image generated by a target object in the process of operating a target application comprises:
acquiring an image of the eye of the target object through the first camera to obtain an eye image corresponding to the target object;
acquiring an image of the interface of the target application through the second camera to obtain an application interface image corresponding to the target application;
and taking the eye image and the application interface image as the object operation image.
8. The method of claim 6, further comprising:
displaying an image setting area in a user interface of the wearable device; the image setting area is used for representing an area where the application interface image is located in the user interface;
responding to the adjustment operation of the image setting area, and displaying the adjusted image setting area; the adjustment operation includes an adjustment operation of a focus area range or a focus position in the image setting area.
9. The method of claim 8, wherein the wearable device is further provided with a laser emitter for emitting laser light onto a target display device; the target display device is a display device of an electronic device running the target application, and the method further comprises:
identifying a landing point position of the laser emitted by the laser emitter in the application interface image;
determining an acquisition state of the wearable device based on a distance between the landing point position and the focus area range or the focus position;
and outputting acquisition adjustment prompt information through the wearable equipment under the condition that the acquisition state meets an abnormal acquisition condition.
10. The method of claim 6, wherein the wearable device is further provided with an inertial sensing module, the method further comprising:
acquiring the movement speed and the equipment posture of the wearable equipment through the inertial sensing module;
determining a wearing state of the wearable device based on the movement speed and the device posture;
and outputting wearing adjustment prompt information through the wearable equipment under the condition that the wearing state meets the abnormal wearing condition.
11. An application analysis report generation apparatus, applied to a terminal device, the apparatus comprising:
the image acquisition module is used for acquiring an object operation image sent by wearable equipment of a target object; the object operation image is an image acquired by the wearable device in the process of operating a target application by the target object;
the data identification module is used for determining the operation behavior data of the target object and the interface element identification data of the target application according to the object operation image;
the data adding module is used for adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed;
the report analysis module is used for analyzing and processing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets a corresponding standardized inspection condition;
the report generation module is used for generating a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
12. An application analysis report generation device, applied to a wearable device, the device comprising:
the image acquisition module is used for acquiring an object operation image generated by a target object in the process of operating a target application;
the image sending module is used for sending the object operation image to the terminal equipment; the terminal equipment is used for determining the operation behavior data of the target object and the interface element identification data of the target application according to the object operation image; adding the operation behavior data and the interface element identification data into a preset analysis report template to obtain a report to be analyzed; analyzing the report to be analyzed to obtain an analysis result; the analysis result is used for representing whether at least one analysis item in the report to be analyzed meets a corresponding standardized inspection condition; generating a target analysis report according to the analysis result and the report to be analyzed; the target analysis report is used for auditing the target application.
13. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor realizes the steps of the method of any one of claims 1 to 10 when executing the computer program.
14. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 10.
15. A computer program product comprising a computer program, characterized in that the computer program realizes the steps of the method of any one of claims 1 to 10 when executed by a processor.
CN202211149521.4A 2022-09-21 2022-09-21 Application analysis report generation method and device, computer equipment and storage medium Pending CN115686320A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211149521.4A CN115686320A (en) 2022-09-21 2022-09-21 Application analysis report generation method and device, computer equipment and storage medium


Publications (1)

Publication Number Publication Date
CN115686320A true CN115686320A (en) 2023-02-03

Family

ID=85061762

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211149521.4A Pending CN115686320A (en) 2022-09-21 2022-09-21 Application analysis report generation method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115686320A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination