CN114546762A - Processing method and equipment - Google Patents

Processing method and equipment

Info

Publication number
CN114546762A
CN114546762A (application CN202210114309.8A)
Authority
CN
China
Prior art keywords
data
target
target device
image data
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210114309.8A
Other languages
Chinese (zh)
Inventor
张雅
陈笑曦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202210114309.8A priority Critical patent/CN114546762A/en
Publication of CN114546762A publication Critical patent/CN114546762A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/3055: Monitoring arrangements for monitoring the status of the computing system or of the computing system component, e.g. monitoring if the computing system is on, off, available, not available
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90: Details of database functions independent of the retrieved data types
    • G06F 16/901: Indexing; Data structures therefor; Storage structures

Abstract

The application discloses a processing method and device, comprising the following steps: obtaining first data transmitted by a target device; obtaining second data containing environment information of the environment where the target device is located; and processing the first data and the second data to obtain associated first data and second data.

Description

Processing method and equipment
Technical Field
The present application relates to the field of information processing technologies, and in particular, to a processing method and device.
Background
While using an electronic device, a user can generally only judge the device's data processing from personal experience, obtaining the externally observable data of the electronic device. Even if the user can obtain internal data of the electronic device, such as software and hardware configuration parameters, through human-computer interaction, the externally observable data cannot be associated with the internal data of the electronic device.
Disclosure of Invention
Based on the above problems, the embodiments of the present application provide a processing method and device.
The technical scheme provided by the embodiment of the application is as follows:
the embodiment of the application provides a processing method, which comprises the following steps:
obtaining first data transmitted by a target device;
obtaining second data containing environment information of the environment where the target device is located;
and processing the first data and the second data to obtain associated first data and second data.
In one embodiment, the obtaining the first data transmitted by the target device includes:
obtaining first data related to the running state of the target device through communication connection with the target device;
the obtaining of the second data containing the environment information of the environment where the target device is located includes:
acquiring second data which is output by the target equipment and is related to the running state of the target equipment;
wherein the associated first data and second data can be used to determine an operating state of the target device.
In one embodiment, the method further comprises:
obtaining input third data; wherein the associated first data, second data, and third data can be used to determine an operating state of the target device.
In one embodiment, the first data comprises data generated by the target device after the communication connection is established; the method further comprises the following steps:
obtaining fourth data transmitted by the target equipment; wherein the fourth data comprises data generated by the target device prior to establishment of the communication connection; the first data and the second data, and the fourth data, which establish the association, can be used to determine an operating state of the target device.
In one embodiment, the method further comprises:
analyzing the first data to determine acquisition parameters; the acquisition parameters comprise at least position information and/or identification information of at least one module of the target device; the acquisition parameters are used by the data acquisition device to acquire the second data.
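As a minimal illustration of this step, the sketch below extracts hypothetical per-module identification and position fields from the first data; the field names (`modules`, `id`, `position`) are assumptions for illustration, not part of the disclosure:

```python
def acquisition_parameters(first_data):
    """Derive acquisition parameters from the first data.

    Keeps only each module's identification and position information,
    which a data-acquisition device could then use when collecting the
    second data. All field names here are illustrative assumptions.
    """
    return [
        {"id": module["id"], "position": module["position"]}
        for module in first_data.get("modules", [])
    ]
```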
In one embodiment, the first data comprises text data; the second data comprises first image data; the processing the first data and the second data to obtain the first data and the second data which establish the association includes:
processing the text data and the first image data to obtain second image data; wherein the second image data includes at least a part of the first image data and data derived from at least a part of the text data.
In one embodiment, the first image data comprises a collection of multiple frames of image data; the second image data comprises a set of multiple frames of image data; the processing the text data and the first image data to obtain second image data includes:
acquiring first time period information of the text data and second time period information of the first image data;
determining a target period based on the first period information and the second period information;
processing target image data and target text data to obtain second image data; wherein the target image data includes a plurality of frames of image data within the target period in the first image data; the target text data comprises at least part of text data in the text data within the target time period; the nth frame image data of the second image data comprises at least partial image data of the nth frame image data of the target image data and data obtained from text data at the nth moment of the target text data; n is an integer greater than or equal to 1.
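The period intersection and frame-wise composition described above can be sketched as follows, assuming frames and text entries are simple timestamped records (the tuple layout and helper names are illustrative assumptions, not the claimed implementation):

```python
def target_period(first_period, second_period):
    """Intersection of the text data's period and the image data's period."""
    start = max(first_period[0], second_period[0])
    end = min(first_period[1], second_period[1])
    if start > end:
        raise ValueError("the two periods do not overlap")
    return start, end


def compose(frames, texts, period):
    """Build the second image data: for each frame inside the target
    period, attach the text entry active at that frame's timestamp.

    frames: list of (timestamp, image) tuples, sorted by timestamp
    texts:  list of (timestamp, text) tuples, sorted by timestamp
    """
    start, end = period
    result = []
    for ts, image in frames:
        if not (start <= ts <= end):
            continue
        active = None
        for text_ts, text in texts:
            if text_ts <= ts:
                active = text  # most recent text at or before this frame
            else:
                break
        result.append((image, active))
    return result
```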
In one embodiment, the method further comprises:
analyzing the first data and the second data which are associated to obtain an analysis result;
obtaining first instruction information based on the analysis result; the first instruction information comprises instruction information for executing operation on the target equipment;
and outputting the first instruction information.
In one embodiment, the method further comprises:
sending the first data and the second data associated to a remote processing device;
obtaining second instruction information transmitted by the remote processing equipment; the second instruction information is obtained by analyzing the first data and the second data which are related; the second instruction information comprises instruction information for executing operation on the target device;
and outputting the second instruction information.
An embodiment of the present application further provides a processing device, including:
the first communication module is used for obtaining first data transmitted by target equipment;
the acquisition module is used for acquiring second data containing environment information of the environment where the target equipment is located;
and the processing module is used for processing the first data and the second data to obtain the first data and the second data which are associated.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor of an electronic device, the processing method as in any one of the foregoing is implemented.
In the processing method provided by the embodiment of the application, the processing device can obtain first data related to the operating state of the target device through its communication connection with the target device, can also acquire second data, output by the target device, related to the operating state of the target device, and can obtain the associated first data and second data. In this way, the internal data of the target device (the first data) and its external data (the second data) are associated comprehensively, flexibly, and objectively.
Drawings
Fig. 1 is a schematic flow chart of a processing method provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of obtaining second image data according to an embodiment of the present application;
fig. 3 is a schematic flowchart of outputting first instruction information according to an embodiment of the present disclosure;
FIG. 4 is a flowchart illustrating outputting second instruction information according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an implementation of a processing method provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of a processing apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
While using various electronic devices, a user usually can only subjectively judge the external performance of a device's data processing according to personal experience, obtaining the externally observable data of the electronic device. Sometimes the user can also obtain internal data of the electronic device, such as software and hardware configuration parameters, but the user cannot associate the externally observable data with the internal data of the electronic device.
Based on the above problems, embodiments of the present application provide a processing method and a processing device.
The embodiment of the present application first provides a processing method, which can be implemented by a processor of a processing device, where the processor may be at least one of an Application-Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field-Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor.
Fig. 1 is a schematic flowchart of a processing method according to an embodiment of the present disclosure. As shown in fig. 1, the process may include steps 101 to 103:
Step 101, obtaining first data transmitted by the target device.
In one embodiment, the target device may include an electronic device used by a user, such as a television, a washing machine, an air conditioner, or a computer device; for example, the target device may also be a smart mobile device, such as a smartphone or a smart watch.
In one embodiment, a communication connection may be established between the processing device and the target device, and the processing device receives first data transmitted by the target device through the communication connection; for example, the communication connection may be a wired connection or a wireless connection; illustratively, the wired connection may be through a standard communication bus such as Type-C connecting the processing device with the target device.
In one embodiment, the target device may trigger real-time transmission of the first data upon detecting that the communication connection between the target device and the processing device is established; for example, the target device may also transmit the first data to the processing device after receiving a data acquisition instruction sent by the processing device; for example, the target device may further obtain or generate the first data and transmit it to the processing device after establishing a communication connection with the processing device and receiving a data acquisition instruction sent by the processing device.
In one embodiment, the first data may include data currently processed by the target device, such as video data currently played by the target device, text data displayed by the target device, and the like; for example, the first data may further include additional parameters related to data currently processed by the target device, where the additional parameters may include a size of the video data, a playing rate of the video data, a decoding manner of the video data, a decoding rate of the video data, and the like, and the additional parameters may further include a storage address of the text data, identification information of an application program for presenting the text data, and the like.
In an embodiment, the first data may further include resource data that the target device depends on in the data processing process, and for example, the resource data that the target device depends on in the data processing process may include data representations of hardware resources or software resources that are occupied to implement the data processing process, such as a size of a memory space occupied to implement the data processing process, a size of a hard disk storage space occupied to implement the data processing process, a number of created processes, an identifier of each process, and priorities of threads included in each process.
In an embodiment, the first data may further include data generated by the target device during data processing, such as a target video generated during video capture, a target file obtained during data downloading, a target text obtained during text editing, and the like.
In one embodiment, the first data may further include operation data performed by the user during data processing of the target device, such as the number of times the user pauses video capture during video capture, resolution parameters set by the user for the image capture device, such as network connection parameters selected by the user during data downloading, and data storage addresses.
Step 102, obtaining second data containing environment information of the environment where the target device is located.
In one embodiment, the environment information of the environment where the target device is located may include brightness information, humidity information, temperature information, noise information, time information, air quality level information, and the like of the environment where the target device is located.
In one embodiment, the environment information of the environment where the target device is located may include information of an influence of the target device on the environment where the target device is located, for example, in a case where the target device is a television, the environment information may include information of a degree of influence on a temperature and/or a brightness of the environment where the television is located due to continuous playing of a television program; in the case where the target device is a cleaning device, such as an air purifier, the environmental information of the environment in which the cleaning device is located may include air quality information in the space in which the cleaning device is located.
In one embodiment, the second data may further include time information corresponding to environment information including an environment of the target device, such as environment information in which the air quality information corresponds to a first time period, environment information in which the temperature information corresponds to a second time period, and the like.
In one embodiment, the second data may be obtained by a sensor device of the processing device, for example, temperature information of an environment where the target device is located may be obtained by a temperature sensor of the processing device, brightness information of the environment where the target device is located may be obtained by a brightness sensor of the processing device, and the like.
Step 103, processing the first data and the second data to obtain the associated first data and second data.
In an embodiment, the processing device may associate the first data and the second data according to time information carried in the first data and the second data, so as to obtain the first data and the second data that establish association.
In one embodiment, the processing device may associate the first data and the second data according to a preset association policy, so as to obtain the first data and the second data establishing the association; for example, the association policy may include analyzing a degree of correlation between at least a portion of the first data and at least a portion of the second data, and associating the first data with the second data if the degree of correlation is greater than or equal to a preset threshold, and correspondingly, not associating the first data with the second data if the degree of correlation is less than the preset threshold.
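The two association strategies above (by carried time information, and by a preset correlation policy with a threshold) can be sketched as follows; the record layout, the `time` field, and the caller-supplied `correlation` function are illustrative assumptions, not the claimed implementation:

```python
def associate_by_time(first, second, tolerance=1.0):
    """Pair first-data and second-data records whose timestamps differ
    by at most `tolerance` seconds; each record carries a 'time' field."""
    return [(a, b) for a in first for b in second
            if abs(a["time"] - b["time"]) <= tolerance]


def associate_by_policy(first, second, correlation, threshold=0.8):
    """Pair records whose correlation score meets the preset threshold.

    `correlation` scores two records in [0, 1]; pairs below `threshold`
    are left unassociated, mirroring the policy described above.
    """
    return [(a, b) for a in first for b in second
            if correlation(a, b) >= threshold]
```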
As can be seen from the above, the processing method provided in the embodiment of the present application can, after obtaining the first data transmitted by the target device and the second data containing environment information of the environment where the target device is located, process the first data and the second data to obtain the associated first data and second data. Because the first data is internal data transmitted by the target device, and the second data is external data containing environment information of the environment where the target device is located, the internal state of the target device can be obtained from the first data, while the second data objectively represents the influence of the target device on its environment. By establishing the associated first data and second data, the internal state and the external performance of the target device can therefore be obtained objectively and accurately at the same time, and the internal data and the externally observable data of the target device can be flexibly and comprehensively integrated, thereby achieving an objective, flexible, and comprehensive association between the internal data and the externally observable data of the target device.
For example, in the case that the target device is a device used by a user, the processing device may be a device such as a smart phone convenient for the user to operate, and external data and internal data of the device used by the user may be objectively and accurately associated through simple data processing of the smart phone device familiar to the user and capable of being flexibly operated, so that a foundation is laid for the user to accurately grasp the real performance of the electronic device.
Based on the foregoing embodiment, in the processing method provided in the embodiment of the present application, obtaining the first data transmitted by the target device may be implemented in the following manner:
first data related to the operating state of the target device is obtained through the communication connection with the target device.
In one embodiment, the operation state of the target device may include a normal operation state of the target device, and may also include an abnormal operation state of the target device.
In one embodiment, the first data related to the operation state of the target device may include data related to a continuous operation state of the target device, such as a memory occupancy rate of the computer device in the continuous operation state.
In one embodiment, the first data related to the operation state of the target device may include data related to a specified operation state of the target device, such as data of a task scheduling process of the computer device for multitask concurrency when the computer device is in a data downloading state.
In one embodiment, the first data related to the operation state of the target device may include data related to an abnormal operation state of the target device, such as data related to a severe stuck state occurring during video playing of the target device, and for example, the first data at this time may include video data acquired by a processor of the target device, decoding state data of a video decoder, decoding state data of an audio decoder, an image display state of a display device, and the like.
In one embodiment, the first data related to the operating state of the target device may include data that is continuously updated as the operating state of the target device continues, and may also include data that is generated after the target device switches to a specified state, such as device abnormal operation data captured by an abnormality capture mechanism provided in the target device after the target device operates abnormally.
In an embodiment, the first data related to the operating state of the target device may be data generated after the target device receives a data acquisition instruction sent by the processing device through the communication connection.
Correspondingly, in the processing method provided by the embodiment of the present application, obtaining the second data including the environment information of the environment where the target device is located may be implemented in the following manner:
and acquiring second data which is output by the target equipment and is related to the running state of the target equipment.
The associated first data and second data can be used to determine the operating state of the target device.
In one embodiment, the processing device may acquire second data related to an operating state of the target device through the sensor; for example, the processing device may collect data of a type corresponding to at least one sensor through the at least one sensor and determine the data as the second data, and for example, the processing device may collect a blinking state of an indicator light of the target device through the image collecting apparatus, collect sound output by the target device through the sound sensor, and collect luminance information of a display screen of the target device through the luminance sensor, and then determine at least one of the blinking state of the indicator light, the output sound, and the luminance information as the second data.
In one embodiment, the processing device may integrate at least two data collected by the processing device related to the operation state of the target device, and determine the integration result as the second data, for example, the processing device integrates the sound output by the target device collected by the sound sensor and the brightness information of the display screen of the target device collected by the brightness sensor, so as to obtain the second data related to the video playing state of the target device.
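A minimal sketch of this integration step, assuming each sensor produces a list of readings taken at the same sampling instants (the sensor names and the truncate-to-shortest behavior are illustrative assumptions):

```python
def integrate_second_data(samples):
    """Merge per-sensor sample streams into unified second-data records.

    samples: dict mapping sensor name (e.g. 'sound', 'brightness') to a
    list of readings; returns one combined record per sampling instant,
    truncated to the shortest stream.
    """
    length = min(len(readings) for readings in samples.values())
    return [
        {name: readings[i] for name, readings in samples.items()}
        for i in range(length)
    ]
```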
In one embodiment, the associated first data and second data may be used to determine a current operating state of the target device, such as whether the current video playing state of the target device is normal; for example, the associated first data and second data may also be used to determine a historical operating state of the target device, such as the operating state from five minutes before the current time to the current time; in this case, the time range covered by the first data and/or the second data may include the period from five minutes before the current time to the current time.
In an embodiment, the first data and the second data that are associated with each other are also used to predict an operating state of the target device at a next time or after a specified time period, for example, according to the first data and the second data that are associated with each other, a memory resource currently available in the target device and a memory resource that needs to be applied by at least one application program currently in an operating state may be obtained, so as to predict an operating state of the target device at the next time, where the operating state of the target device at the next time may include an abnormal operating state of the at least one application program due to a failure in applying for the memory resource.
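The memory-based prediction in the example above can be sketched as follows; the largest-first serving order and the record layout are illustrative assumptions, since the disclosure does not specify an allocation policy:

```python
def predict_memory_failures(free_memory, pending_requests):
    """Predict which applications may enter an abnormal state next.

    free_memory: bytes currently available on the target device (from
    the first data). pending_requests: dict mapping application name to
    the bytes it will request. Requests are served largest-first here
    purely as an illustrative assumption; applications whose request
    cannot be satisfied are flagged as at risk of abnormal operation.
    """
    remaining = free_memory
    at_risk = {}
    for app, request in sorted(pending_requests.items(), key=lambda kv: -kv[1]):
        if request <= remaining:
            remaining -= request
        else:
            at_risk[app] = request
    return at_risk
```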
As can be seen from the above, in the processing method provided in the embodiment of the present application, the processing device can obtain, through its communication connection with the target device, first data related to the operating state of the target device, and can also collect second data, output by the target device, related to its operating state, so as to determine the operating state of the target device based on the associated first data and second data. In this way, after acquiring the internal data (the first data) that accurately represents the operating state of the target device, the processing device can acquire the external data (the second data) related to that operating state; on the basis of improving the objectivity of the external data, the operating state of the target device can then be determined more comprehensively and objectively, from both the internal and external aspects, by establishing the associated first data and second data.
In the related art, when the actual performance of an electronic device differs from the result a user expects, the user may seek assistance from the device's after-sales technicians. If the user does not understand the working principle of the electronic device well enough, or does not describe its actual operating performance objectively and accurately, the after-sales technicians cannot accurately grasp the real state of the electronic device and therefore cannot provide targeted usage suggestions, which reduces both the efficiency of communication between the user and the technicians and the efficiency of resolving the user's problem.
In the processing method provided in the embodiment of the present application, when the target device is a device used by a user and the processing device is a smartphone the user operates with ease, the first data related to the internal operating state of the user's device and the collected second data related to the operating state output by that device can be associated comprehensively, objectively, and quickly by an application program installed on the smartphone that implements the processing method, without requiring the user to have professional knowledge of the device's operating state. On one hand, this improves the objectivity of the second data; on the other hand, by establishing the associated first data and second data, the external data and internal data of the user's device are flexibly integrated, and the operating state of the device can be determined comprehensively and objectively. This solves the technical problem in the related art that after-sales technicians cannot accurately grasp the fault state of a user's device because the user cannot accurately describe its actual operating performance.
Based on the foregoing embodiment, the processing method provided in the embodiment of the present application may further include the following operations:
the input third data is obtained.
The associated first data, second data, and third data can be used to determine the operating state of the target device.
In one embodiment, the third data may include at least one of text data, picture data, audio data, and gesture data.
In one embodiment, the third data may include data related to the overall operating state of the target device; for example, in a case where the target device is a computer device, the third data may include a statement such as "the computer device responds to the user's input instructions in time".
In one embodiment, the third data may include data related to an operating state of at least one module of the target electronic device, for example, in a case where the target electronic device is a computer device, the third data may include: the display screen is in a pause state in the video playing process, and at the moment, the third data comprises data related to the running states of the display screen and the video playing module.
In one embodiment, the third data may include data on the duration of a state of at least one module, such as the display screen having been in a stuck state for 10 minutes; for example, the third data may also include the operation the target device was performing before the at least one module entered a certain state, such as being in a video conference before the display screen became stuck.
In an embodiment of the present application, the third data may be obtained by any one of the following methods:
the human-computer interaction window of the processing equipment can display a control or an area for a user to input data, the user can perform data input operation after touching the control or the area, and the processing equipment detects and identifies the input operation performed by the user so as to obtain third data; for example, the user's editing operation on the above-mentioned control or region may further include a selection operation on various options.
A sensor of the processing device may collect audio data of the user and determine the collected audio data as third data.
An image acquisition device of the processing device may continuously capture images of the user, recognize gestures and/or postures in the captured images, and determine the recognition result as the third data.
In an embodiment, analyzing the associated first data, second data, and third data may yield a first result, based on which the overall operating state of the target device can be determined. For example, if the third data indicates that the target device runs normally and no data related to an abnormal operating state exists in the first data and the second data, it can be determined that the overall operating state of the target device is consistent with the state represented by the third data; if data related to an abnormal operating state does exist in the first data and the second data, it can be determined that the overall operating state is inconsistent with the state represented by the third data.
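The consistency check between the user's report (third data) and the associated first and second data can be sketched as follows; the boolean `abnormal` flag on each record is an illustrative assumption:

```python
def consistent_with_report(associated_records, user_reports_normal):
    """Check whether the associated first/second data agree with the
    user's third-data report. A "runs normally" report is consistent
    only when no record is flagged abnormal; a fault report is
    consistent only when some record is flagged abnormal.
    """
    any_abnormal = any(rec.get("abnormal", False) for rec in associated_records)
    return user_reports_normal == (not any_abnormal)
```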
In an embodiment, a second result obtained by analyzing the associated first data and second data together with the third data may be used to determine the operating state of at least one module of the target device. For example, in the case that the third data includes data related to the at least one module, the third data may be analyzed to obtain a module identifier of the at least one module, and then the data corresponding to the at least one module identifier in the associated first data and second data may be analyzed to determine the operating state of the at least one module.
In one embodiment, the third data may be analyzed, a target module may be determined based on the fault description information obtained from the third data, and the first data and the second data associated with each other may be analyzed, so as to determine the operating state of the target module.
In one embodiment, time information may be obtained from the third data, and the first data and the second data associated with each other are analyzed based on the time information, so as to determine an operating state of at least one module in a period corresponding to the time information.
In one embodiment, the operating state of at least one module can be analyzed to determine the abnormality level of the at least one module; for example, the operating state of the CPU of the target device may be analyzed to determine whether the CPU is at a serious abnormality level.
In one embodiment, the operation state of at least one module may be analyzed, the load degree of at least one module is determined, and the operation state of the target device is determined based on the number of modules with the load degree greater than or equal to a preset threshold; for example, if the number of modules with the load degree greater than or equal to the preset threshold is greater than or equal to the first threshold, it may be determined that the target device is currently in the overload operation state; if the number of modules with the load degree greater than or equal to the preset threshold is smaller than the first threshold, it may be determined that the target device is currently in a steady operation state.
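The counting rule above can be sketched as follows (illustrative Python; the load values and thresholds are assumed placeholders):

```python
def device_state(module_loads, load_threshold, count_threshold):
    """Classify the target device based on the number of modules whose load
    degree is greater than or equal to the preset load threshold: at or
    above the first (count) threshold -> overload, otherwise steady."""
    overloaded = sum(1 for load in module_loads.values() if load >= load_threshold)
    return "overload" if overloaded >= count_threshold else "steady"
```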
In an implementation manner, the analysis process of the first data, the second data, and the third data that are associated with each other may be implemented by a processor of the processing device, or may be implemented by a processor of another device, which is not limited in this embodiment of the present application.
As can be seen from the above, the processing method provided in the embodiment of the present application can further obtain third data related to the target device after obtaining the associated first data and second data, and the associated first data and second data, together with the third data, can be used to determine the operating state of the target device. Therefore, by adjusting or changing the third data, the operating states of different aspects, different times, or different modules of the target device can be determined, so that diversified and flexible analysis of the associated first data and second data can be realized through the third data, further improving the flexibility of determining the operating state of the target device.
Based on the foregoing embodiment, in the processing method provided by the embodiment of the present application, the first data includes data generated by the target device after the communication connection is established.
In one embodiment, the target device may start generating the first data and transmit the first data to the processing device via the communication connection upon detecting that the communication connection is established.
In one embodiment, the target device starts generating the first data after detecting that the communication connection is established; for example, the target device may transmit the first data to the processing device after receiving a data acquisition instruction sent by the processing device.
Correspondingly, the processing method provided by the embodiment of the application can further include the following steps:
and acquiring fourth data transmitted by the target equipment.
Wherein the fourth data comprises data generated by the target device before the communication connection is established; the associated first data and second data, together with the fourth data, can be used to determine the operating state of the target device.
In one embodiment, the fourth data may include data generated by the target device within a preset time period before the establishment of the communication connection, for example, data generated within five minutes before the establishment of the communication connection.
In one embodiment, the fourth data may include data relating to an operating state of at least one module generated by the target device within a preset time period before the establishment of the communication connection, such as data relating to an operating state of a video decoder generated within five minutes before the establishment of the communication connection.
In an implementation manner, after receiving a data acquisition instruction containing a module identifier and/or time period information sent by a processing device, a target device may filter historical data generated by the target device before a communication connection is established based on the module identifier and/or time period information in the data acquisition instruction to obtain fourth data, and send the fourth data to the processing device.
In one embodiment, the first data and the second data which are associated with each other are analyzed to determine the operating state of the target device at the time of establishing the communication connection and in the time period covered by the first data, and the fourth data is analyzed to obtain the operating state of the target device before establishing the communication connection and in the time period covered by the fourth data.
In one embodiment, by analyzing the first data and the second data which are associated with each other, a module identifier of the target device in an abnormal operating state at the time of establishing the communication connection and in the time period covered by the first data may be determined, and then, the fourth data may be analyzed based on the module identifier, so that the operating states of the module corresponding to the module identifier at the time of establishing the communication connection, in the time period covered by the first data and in the time period covered by the fourth data may be determined; illustratively, the number of the module identifications may be at least one.
In one embodiment, the operation state of the target device or the at least one module after the period covered by the first data may be predicted based on the operation states of the target device or the at least one module at the time of establishing the communication connection, the period covered by the first data, and the period covered by the fourth data.
For example, the analysis process of the first data, the second data, and the fourth data that are associated with each other may be implemented by the processing device, or by another device; the embodiment of the present application is not limited in this respect.
As can be seen from the above, in the processing method provided in the embodiment of the present application, the first data and the fourth data are data generated by the target device after and before the communication connection is established, respectively, so that, by means of the associated first data and second data together with the fourth data, at least the historical operating state and the current operating state of the target device can be determined, thereby implementing a consistency determination of the operating state of the target device within a certain time range.
Based on the foregoing embodiment, the processing method provided in the embodiment of the present application may further include the following steps:
and analyzing the first data to determine acquisition parameters.
The acquisition parameters at least comprise position information and/or identification information of at least one module of the target device; the acquisition parameters are used by the image acquisition device to acquire the second data.
In one embodiment, the at least one module of the target device may include a module provided integrally with the target device, such as a display screen module and an indicator light module of a television device.
In one embodiment, the at least one module of the target device may further include a module disposed in association with the target device, such as an external storage module connected to the computer device.
In one embodiment, the location information of the at least one module of the target device may include location information of the at least one module of the target device relative to a designated module, where the designated module may include a module with a higher degree of recognizability in the target device, for example, in a case where the target device is a television, the designated module may be a display screen, and the location information of the at least one module of the television may be described by the location information of the at least one module relative to the display screen, for example, the location information of the image capturing module may include an edge area of a top end of the display screen.
In one embodiment, the identification information of the at least one module of the target device may include naming information including a functional description of the at least one module, for example, the identification information of the camera module may include "image capture module".
In one embodiment, the identification information of the at least one module of the target device may include number information of the at least one module, such as number 1 of the image capturing module, number 2 of the display screen, and the like.
In one embodiment, the acquisition parameters may further include at least one of attitude information when the image acquisition device acquires the second data, acquisition duration information when the image acquisition device acquires the second data, the number of times the image acquisition device performs image acquisition, and time interval information between different image acquisition operations.
In one embodiment, the image acquisition device may be provided in the processing device, and may also be provided in another device that establishes a communication connection with the processing device.
In one embodiment, the processing device may determine the acquisition parameters by:
analyzing at least part of the first data to obtain identification information of a module of the target device that is currently in a running state; exemplarily, the position information of the module may be determined according to the identification information of the module and the structure information of the target device. For example, the structure information of the target device may include correspondences among the relative positional relationships of the plurality of modules in the target device, the position information of the modules, and the identification information of the modules.
Analyzing at least part of the first data to obtain a time point when the target device is in a specified state, then acquiring identification information of at least one module in a time period when the first data contains the time point, and determining the position information of the module according to the identification information of the module and the structure information of the target device.
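The lookup from module identification to position via the structure information can be sketched as follows (illustrative Python; the structure table entries are hypothetical examples in the spirit of the television example above):

```python
# Hypothetical structure information: module identifier -> position described
# relative to a designated, easily recognizable module (the display screen).
STRUCTURE_INFO = {
    "image_capture_module": "top edge of the display screen",
    "indicator_light_module": "bottom-right corner of the display screen",
}

def acquisition_parameters(module_ids):
    """Build acquisition parameters (identification plus position information)
    for the modules identified from the first data; identifiers absent from
    the structure information are skipped."""
    return [
        {"id": module_id, "position": STRUCTURE_INFO[module_id]}
        for module_id in module_ids
        if module_id in STRUCTURE_INFO
    ]
```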
In one embodiment, the processing device may output the acquisition parameters to instruct a user to control the image acquisition device to perform the image acquisition process based on at least one of position information and/or identification information of the at least one module, posture information of the image acquisition device, and continuous acquisition time of the image acquisition device included in the acquisition parameters, so as to obtain the second data.
In one embodiment, the processing device may transmit the acquisition parameters to the device where the image acquisition device is located, so that that device automatically and intelligently controls the image acquisition device to acquire the second data based on the acquisition parameters; illustratively, the device where the image acquisition device is located may be the processing device itself or an electronic device that establishes a communication connection with the processing device.
As can be seen from the above, the processing method provided in the embodiment of the present application can determine, after analyzing the first data, acquisition parameters at least including the position information and/or the identification information of the at least one module of the target device, so that the image acquisition device can acquire the second data. Therefore, the processing method provided by the embodiment of the present application can determine, according to the acquisition parameters obtained by analyzing the first data, how the image acquisition device acquires the second data, so that the correlation between the second data acquired by the image acquisition device and the first data is stronger; the associated first data and second data can thereby more comprehensively and objectively represent the operating state of the target device.
Based on the foregoing embodiment, in the processing method provided by the embodiment of the present application, the first data includes text data, and the second data includes first image data.
Correspondingly, the first data and the second data are processed to obtain the first data and the second data which establish the association, and the method can be realized by the following steps:
and processing the text data and the first image data to obtain second image data.
The second image data comprises at least part of the first image data and data obtained from at least part of the text data.
In one embodiment, the text data may include log data generated by the target device in the running process, and for example, the log data may include all data which is generated in the running process of the target device, is used for recording the running state of the target device, and carries time information; illustratively, the log data may include abnormal log data acquired by an abnormal capturing mechanism set in the target device after an abnormality occurs in the operation process of the target device; the log data can also include data representing a system and/or application program running log of the target device, which is captured when a specified condition is detected to be met in the running process of the target device; for example, the specified condition may include at least one of a memory occupancy greater than or equal to a first threshold and a concurrent task amount greater than or equal to a second threshold.
In one embodiment, the log data may include data generated by the target device during software operation, and may also include data generated by the target device during hardware driving or during hardware operation.
In one embodiment, the second data may include first image data acquired by an image acquisition device of the processing device, or first image data that is acquired by another apparatus provided with an image acquisition device and then transmitted to the processing device.
In one embodiment, at least a portion of the first image data may include data resulting from cropping the first image data; illustratively, the first image data may be clipped according to a keyword carried in the text data.
In one embodiment, the data obtained from at least part of the text data may include module identifications of all modules included in the text data, and may further include module identifications of modules in the text data in the target operating state; for example, the target operation state may include a state in which an abnormal phenomenon occurs at least once.
In one embodiment, the data derived from at least a portion of the textual data may include a module identification of at least one module, and the location information of the at least one module may be determined in conjunction with configuration information of the target device.
In one embodiment, the processing of the text data and the first image data to obtain the second image data may be implemented by any one of the following methods:
and marking an image area corresponding to at least one module in the first image data according to the position information of the at least one module obtained by processing the text data, thereby obtaining second image data.
According to the position information of at least one module, identifying the first image data to obtain an identification result, cutting the first image data according to the identification result to obtain at least part of data of the first image data comprising the at least one module, and marking at least one module in the at least part of data of the first image data obtained by cutting to obtain second image data; for example, the recognition result may include pixel position information of the image area corresponding to the at least one module in the first image data.
For example, marking the image area corresponding to the at least one module in the first image data, and marking the at least one module in at least part of the data of the cropped first image data, may include at least one of: identifying the image area corresponding to the at least one module with a line, and adding, to the image area, a text description extracted from the text data of the module being in a certain operating state at a certain time or within a certain period of time.
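The cropping-and-annotation step can be sketched in simplified form (illustrative Python; a nested list stands in for real image data, and the pixel box is an assumed input that would come from the recognition result):

```python
def mark_region(image, box, annotation):
    """Crop the image area corresponding to one module (box given as
    (top, left, bottom, right) pixel coordinates) and attach the textual
    state description extracted from the text data, yielding one piece
    of second image data."""
    top, left, bottom, right = box
    cropped = [row[left:right] for row in image[top:bottom]]
    return {"pixels": cropped, "annotation": annotation}
```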
As can be seen from the above, in the processing method provided in the embodiment of the present application, when the first data includes text data and the second data includes first image data, the second image data obtained by processing the text data and the first image data includes at least part of the first image data and data obtained from at least part of the text data. The second image data thus includes not only data obtained from the text data obtained from the target device, that is, internal data of the target device, but also at least part of the data in the first image data, that is, external data of the target device; therefore, through the second image data, intuitive presentations of the internal data and the external data of the target device can be obtained simultaneously.
In the related art, the most direct means for the user to obtain the external performance data of the electronic device is generally to take a photo of at least a partial area of the electronic device, and the user can also obtain the text data output by the electronic device by means of some devices or apparatuses, but the user cannot associate the photo taken by the user with the text data output by the electronic device.
By the processing method provided by the embodiment of the application, the text data output by the target device and the acquired first image data can be processed through the data processing process of the application program which is set in the device known by users such as a smart phone and used for realizing the processing method, so that the first image data and the text data output by the target device can be associated visually, intuitively and objectively.
In practical applications, a user usually obtains text data transmitted by the electronic device and collects the first image data when the electronic device is abnormally operated or needs to know the performance of the electronic device, but the user cannot determine the actual operating state or performance of the electronic device through the text data transmitted by the electronic device and the collected first image data.
By the processing method provided by the embodiment of the application, in the case that the text data contains data related to the running state of the target device and the first image data includes image data related to the actual running state of the target device, the running state information and/or performance information of the target device can be obtained intuitively, vividly, objectively and accurately through the second image data obtained by processing the text data and the first image data.
Based on the foregoing embodiment, an embodiment of the present application provides a processing method, in which the first image data includes a set of multiple frames of image data, and the second image data includes a set of multiple frames of image data.
In one embodiment, the first image data may be video data acquired by an image acquisition device; accordingly, the second image data may also be video data including a plurality of frames of image data.
Accordingly, the second image data may be obtained by the process shown in fig. 2, where fig. 2 is a schematic view of a process for obtaining the second image data provided in the embodiment of the present application, and as shown in fig. 2, the process may include steps 201 to 203:
step 201, acquiring first period information of the text data and second period information of the first image data.
In one embodiment, the text data may be log data transmitted by the target device, and therefore the text data may carry time information corresponding to each piece of data; illustratively, the first period information may include information of the period between the generation time of the first piece of log data in the text data and the generation time of the last piece of log data in the text data. For example, if the first piece of log data is generated at 12:29:36 on 26/1/2022 and the last piece of log data is generated at 12:39:36 on 26/1/2022, the first period information may be information of the period between 12:29:36 and 12:39:36 on 26/1/2022.
In one embodiment, the first period information may include the period from the time point at which an abnormality occurs in the log data to the time point at which the target device completes the abnormality recovery process. For example, if the time point at which the data recording an abnormality is generated in the log data is 12:30 on 26/1/2022, and the time point at which the target device completes the abnormality recovery process is 12:39 on 26/1/2022, the first period information may include information of the period from 12:30 to 12:39 on 26/1/2022.
In one embodiment, the first period information may be determined from the period information covered by the text data based on a preset analysis strategy or based on a selection of a user after analyzing the text data to obtain the period information covered by the text data.
In one embodiment, the second period information may include information of a period from a time point corresponding to the first frame image data of the first image data to a time point corresponding to the last frame image data of the first image data.
In one embodiment, the second period information may include a partial period in a period from a time point corresponding to a first frame image data of the first image data to a time point corresponding to a last frame image data of the first image data; for example, the first image data may be analyzed based on data obtained from at least part of text data of the text data, so as to obtain second period information, for example, if data obtained by analyzing at least part of the text data indicates that the first module and the second module of the target device are in an abnormal operation state, the position information of the first module and the second module may be obtained based on module identifiers of the first module and the second module, and then the multi-frame image in the first image data may be identified based on the position information of the first module and the second module, and a period including a corresponding time point in the multi-frame image data of the first module and the second module may be determined as the second period information.
Step 202, determining a target time interval based on the first time interval information and the second time interval information.
In one embodiment, the intersection of the periods represented by the first period information and the second period information may be determined as the target period. For example, if the first period information covers 12:30 to 12:39 on 26/1/2022 and the second period information covers 12:31 to 12:40 on 26/1/2022, the period between 12:31 and 12:39 on 26/1/2022 may be determined as the target period. For example, in the case where the first period information completely coincides with the second period information, the target period may be the period represented by the first period information or the period represented by the second period information.
In an embodiment, the target time period may be determined based on the data content of at least part of the text data corresponding to the first time period information and the image features carried in the multi-frame image corresponding to the second time period information, for example, if the at least part of the text data corresponding to the first time period information indicates that the operating state of the first module and the operating state of the second module are in an abnormal state, and the multi-frame image corresponding to the second time period information indicates that the operating state of the second module is in an abnormal state, the first time period information may be intercepted based on data related to the abnormal state of the second module in the at least part of the text data to obtain third time period information, and then an intersection of the third time period information and the second time period information may be determined as the target time period.
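The intersection rule of step 202 can be sketched as follows (illustrative Python using the example timestamps above; the (start, end) tuple representation of a period is an assumed simplification):

```python
from datetime import datetime

def target_period(first_period, second_period):
    """Return the intersection of two (start, end) periods, or None when
    the periods covered by the text data and the first image data do not
    overlap."""
    start = max(first_period[0], second_period[0])
    end = min(first_period[1], second_period[1])
    return (start, end) if start < end else None

# Example periods on 26/1/2022: log data 12:30-12:39, image data 12:31-12:40.
log_period = (datetime(2022, 1, 26, 12, 30), datetime(2022, 1, 26, 12, 39))
image_period = (datetime(2022, 1, 26, 12, 31), datetime(2022, 1, 26, 12, 40))
```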
And step 203, processing the target image data and the target text data to obtain second image data.
The target image data comprises the multiple frames of image data within the target period in the first image data; the target text data comprises at least part of the text data within the target period; the nth frame of image data of the second image data includes at least partial image data of the nth frame of image data of the target image data and data obtained from the text data at the nth time point of the target text data, where n is an integer greater than or equal to 1.
In one embodiment, the target image data may be obtained by cutting first image data based on a target time period, and for example, a plurality of frames of images with time information within the target time period may be cut from the first image data, and the cut plurality of frames of images may be determined as the target image data.
In one embodiment, the target text data may be obtained by intercepting at least a part of the text data based on the target period.
In an embodiment, each frame of image data in the target image data and each frame of data in the target text data may be fused according to time information carried by each frame of image data in the target image data and time information carried by each frame of data in the target text data to obtain each frame of fused image data, and then each frame of fused image data may be integrated according to the time information of each frame of fused image data to obtain the second image data.
For example, the obtaining process of each frame of image data in the second image data may be similar to the obtaining process of the second image data in the foregoing embodiment, and details are not repeated here.
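The per-timestamp fusion and time-ordered integration described above can be sketched as follows (illustrative Python; the (timestamp, payload) tuples are an assumed simplification of frame and log records):

```python
def fuse_frames(target_image_data, target_text_data):
    """Pair each frame of the target image data with the text record that
    carries the same time information, then integrate the fused frames in
    time order to form the second image data."""
    text_by_time = dict(target_text_data)
    fused = [
        (time, {"frame": frame, "text": text_by_time.get(time)})
        for time, frame in target_image_data
    ]
    return sorted(fused, key=lambda item: item[0])
```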
As can be seen from the above, in the processing method provided in the embodiment of the present application, in the case that the first image data includes a set of multi-frame image data, the target period can be determined based on the first period information of the text data and the second period information of the first image data, and then at least part of the text data corresponding to the target period and the multi-frame image data corresponding to the target period in the first image data are processed to obtain second image data comprising multiple frames of data. Therefore, through the second image data obtained from the acquired set of multi-frame image data and the text information transmitted by the target device, the data integration results of the internal dimension and the external dimension of the target device can be displayed more intuitively, and the data integration results of the target device in the target period can be displayed more continuously in the time dimension.
Further, in the case that the multi-frame image data included in the first image data is related to the running state of the target device, and the text data is data transmitted by the target device and related to the running state of the target device, the second image data obtained by processing the text data and the first image data can display the running state of the target device over a certain period continuously, vividly and objectively in the time dimension.
Based on the foregoing embodiment, the processing method provided in the embodiment of the present application may further include a flow shown in fig. 3, and fig. 3 is a schematic flow diagram of outputting the first instruction information provided in the embodiment of the present application. As shown in fig. 3, the process may include steps 301 to 303:
step 301, analyzing the first data and the second data which are associated to obtain an analysis result.
In one embodiment, the analysis result may be used to indicate whether the current operation state of the target device is normal; for example, the analysis result may be used to indicate whether the target device is currently in an overload operation state.
In one embodiment, the analysis result may include a cause causing the target device to switch to an abnormal operation state, or a cause causing at least one module in the target device to switch to an abnormal operation state; for example, the analysis result may further include a reason for causing the target device to be currently in an overload operation state.
In one embodiment, the first data and the second data which are associated in a specified time period can be analyzed, so that an analysis result is obtained; for example, the specified time period may be determined by the processing device based on the data analysis policy, or may be determined based on a user selection of a time range covered by the first data and the second data for establishing the association.
For example, the associated first data and second data, together with the third data, may be analyzed to obtain the analysis result.
Step 302, obtaining first instruction information based on the analysis result.
The first instruction information comprises instruction information for executing operation on the target equipment.
In one embodiment, the first instruction information may include information on the number of times the operation is performed on the target device.
In one embodiment, in the case where the first instruction information includes a plurality of operations to be performed on the target device, the first instruction information may further include order information between the respective operations, and time interval information between adjacent operations.
In one embodiment, the first instruction information may further include execution condition information for at least one operation; for example, a first operation to be executed when a designated indicator lamp of the target device flashes red, a second operation to be executed when the designated indicator lamp flashes blue, a third operation to be executed when the second operation ends and the display screen of the target device outputs the first information, and a fourth operation to be executed when the second operation ends and the display screen outputs the second information.
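A minimal sketch of how first instruction information carrying execution order, repeat counts, time intervals, and execution conditions could be represented; all field names here are hypothetical, not taken from the embodiment:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Operation:
    name: str                        # e.g. "press reset"
    repeat: int = 1                  # number of times to perform the operation
    interval_s: float = 0.0          # time interval after the previous operation
    condition: Optional[str] = None  # execution condition, if any

@dataclass
class InstructionInfo:
    # Operations are stored in execution order, carrying the order information.
    operations: List[Operation] = field(default_factory=list)

instr = InstructionInfo(operations=[
    Operation("press reset", condition="designated indicator lamp flashes red"),
    Operation("power cycle", repeat=2, interval_s=5.0,
              condition="designated indicator lamp flashes blue"),
])
print([op.name for op in instr.operations])  # → ['press reset', 'power cycle']
```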
In one embodiment, the first instruction information may be obtained based on the operation state of the target device indicated by the analysis result. For example, in a case that the analysis result indicates that at least one module of the target device is in an abnormal operation state, the first instruction information obtained based on the analysis result may include an instruction to reset the at least one module; in a case where the analysis result indicates that at least one module of the target device is in an overload operation state, the first instruction information obtained based on the analysis result may include an instruction to pause or stop at least one application occupying the module.
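The mapping from analysis result to first instruction information described above can be sketched as follows; the result fields and operation wordings are assumed for illustration:

```python
def build_first_instruction(result):
    """Derive first instruction information from an analysis result.
    The result fields and operation wordings are illustrative only."""
    ops = []
    if result.get("cause") == "overload":
        # Overload state: suggest pausing or stopping occupying applications.
        ops.append("pause or stop applications occupying the overloaded module")
    for module in result.get("abnormal_modules", []):
        # Abnormal module: suggest resetting it.
        ops.append(f"reset module: {module}")
    return {"operations": ops or ["no action required"]}

print(build_first_instruction({"cause": "overload", "abnormal_modules": ["wifi"]}))
```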
And step 303, outputting the first instruction information.
In one embodiment, after outputting the first instruction information, a user of the processing device and/or the target device may perform an operation on the target device based on various information contained in the first instruction information.
As can be seen from the above, after obtaining the first data transmitted by the target device and the second data containing the environment information of the environment where the target device is located, the processing method provided in the embodiment of the present application can analyze the associated first data and second data to obtain an analysis result, and then obtain and output the first instruction information based on the analysis result, so as to instruct an operation to be performed on the target device. Because the first instruction information is determined from an analysis of both the internal data of the target device (the first data) and the external data (the second data), it can match the device characteristics of the target device, which improves the accuracy and pertinence of the operation and reduces the probability of misoperation.
In practical applications, the target device may be a large device, such as a large electronic display screen. If a user encounters a problem while using the electronic display screen but cannot clearly and accurately describe its failure behavior when communicating with an after-sales technician, the technician must carry more accessories or detection tools. This increases the cost of the technician's on-site maintenance and also reduces its pertinence.
To mitigate this, after-sales technicians usually guide users online, based on the users' subjective descriptions, to take pictures or videos and send them to the technicians, so as to help the technicians understand the fault condition of the electronic device. However, when the fault behavior of the electronic device is complicated and the user does not sufficiently understand its operating principle, the user's subjective description may not be accurate enough, and consequently the picture-taking or video-taking advice provided by the after-sales technician may also lack pertinence.
In the processing method provided in the embodiment of the present application, when the first data and the second data are both related to the operating state of the target device, and the operating state becomes abnormal while the user is using the target device, the internal data transmitted by the target device (the first data) and the external data containing the environment information of the environment where the target device is located (the second data) may be obtained in real time. The first instruction information may then be obtained and output based on the analysis result of the associated first data and second data, so that the user of the target device can perform a self-inspection of the target device.
Based on the foregoing embodiments, the processing method provided in the embodiment of the present application may further include the flow shown in fig. 4. Fig. 4 is a schematic flow diagram of outputting the second instruction information provided in the embodiment of the present application. As shown in fig. 4, the flow may include steps 401 to 403:
step 401, sending the first data and the second data establishing the association to a remote processing device.
For example, the remote processing device may be a device that further analyzes the associated first data and second data. For example, the remote processing device may be the device where the after-sales service system of the target device is located; it may remotely receive and analyze data related to the operating state of the target device sent by other devices, such as the processing device, and may also accept operations by professional technicians, such as the target device's after-sales technicians, for example a manual analysis of the associated first data and second data by an after-sales technician.
For example, the processing device may further send the third data and/or the fourth data to a remote processing device for the remote processing device to analyze the third data and/or the fourth data, the first data establishing the association, and the second data.
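One plausible way to package the associated data, plus the optional third and fourth data, for transmission to the remote processing device in step 401; the JSON field names and framing are assumptions, not part of the embodiment:

```python
import json

def build_remote_payload(first_data, second_data, third_data=None, fourth_data=None):
    """Package the associated first and second data, plus optional third data
    (user input) and fourth data (historical data), for transmission to the
    remote processing device."""
    payload = {"first_data": first_data, "second_data": second_data}
    if third_data is not None:
        payload["third_data"] = third_data
    if fourth_data is not None:
        payload["fourth_data"] = fourth_data
    return json.dumps(payload)

msg = build_remote_payload([{"log": "fan error"}], [{"temp": 50}])
print(msg)
```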
Step 402, obtaining second instruction information transmitted by the remote processing device.
The second instruction information is obtained by analyzing the first data and the second data which are related; the second instruction information includes instruction information to perform an operation on the target device.
In one embodiment, the second instruction information may be obtained by analyzing the first data and the second data associated with each other by the remote processing device.
In an embodiment, the second instruction information may also be obtained by analyzing, by an after-market technician of the target device, the first data and the second data received by the remote processing device, which are associated with each other.
The second instruction information may also be obtained by the remote processing device, or by an after-sales technician of the target device, analyzing the third data and/or the fourth data together with the associated first data and second data.
For example, in this embodiment of the application, the second instruction information may include all contents in the first instruction information, and may further include generation time of the second instruction information, an overall analysis report on the current operating state of the target device, and the like; for example, in the case that the second instruction information is generated based on the analysis of the after-sales technician of the target device, the second instruction information may further include the contact information of the after-sales technician, so as to facilitate efficient communication between the user of the target device and the after-sales technician.
And step 403, outputting second instruction information.
For example, after the processing device outputs the second instruction information, the user of the target device may perform at least one operation on the target device according to the second instruction information.
As can be seen from the above, after the processing method provided in the embodiment of the present application sends the associated first data and second data to the remote processing device, the second instruction information transmitted by the remote processing device can be obtained and output, so that the user of the target device can perform at least one operation on the target device. The processing method provided in the embodiment of the present application thus realizes efficient and accurate remote maintenance of faults occurring during the operation of the target device, can reduce the cost of on-site maintenance by the target device's after-sales technicians, and can improve the fault maintenance efficiency of the target device.
Fig. 5 is a schematic structural diagram of an implementation of a processing method according to an embodiment of the present application. As shown in fig. 5, the target device 501 may include a first processor 5011, a first chip 5012, and a second chip 5013, and may further include a first module 5014, a second module 5015, and a third module 5016; the first processor 5011, the first chip 5012, the second chip 5013, the first module 5014, the second module 5015, and the third module 5016 may be disposed on a motherboard of the target device 501. The second processor 5017 may be a processor disposed on the target device 501 for collecting the first data and transmitting it to the processing device 502; the second processor 5017 may be fixed on the motherboard of the target device 501, or may be connected to the motherboard in a pluggable manner. Illustratively, the first processor 5011 may be the CPU of the target device 501.
For example, the second processor 5017 may establish data connections with the first processor 5011, the first chip 5012, the second chip 5013, the first module 5014, the second module 5015, and the third module 5016, respectively; these data connections may include, for example, connections established via a Universal Serial Bus (USB), an Inter-Integrated Circuit (I2C) bus, a Universal Asynchronous Receiver/Transmitter (UART), a Serial Peripheral Interface (SPI), or a General Purpose Input/Output (GPIO) interface.
Illustratively, the first processor 5011, the first chip 5012, the second chip 5013, the first module 5014, the second module 5015, and the third module 5016 may each transmit log data generated during operation to the second processor 5017 over their respective data connections, and the second processor 5017 may transmit the log data to the processing device 502 through its communication connection with the communication module 5021 of the processing device 502. For example, the log data may be the first data in the foregoing embodiments.
Illustratively, the processing device 502 may include a communication module 5021, an image acquisition module 5022, a data display module 5023, and a video fusion module 5024; the communication module 5021 is configured to receive the log data, i.e., the first data, from the target device 501. For example, the processing device 502 may further include a third processor (not shown in the figure). The third processor may analyze the log data, i.e., the first data, to obtain and output acquisition parameters, so as to instruct a user of the processing device 502 to control the image acquisition module 5022 to acquire the first image data based on those parameters; after the acquisition of the first image data ends, the first image data and the log data may be fused by the video fusion module 5024 to obtain the second image data. Illustratively, the first image data and the second image data may be displayed by the data display module 5023.
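At the data level, the fusion performed by the video fusion module 5024 might pair each captured frame with the log entry recorded closest in time, as in this sketch; the data shapes are assumptions, and a real implementation would render the text onto the frame pixels rather than attach it as a field:

```python
def fuse(log_entries, frames):
    """Produce second image data: each output frame carries the log text
    recorded closest to its capture time."""
    fused = []
    for frame in frames:
        # Pick the log entry whose timestamp is nearest to this frame's.
        nearest = min(log_entries,
                      key=lambda e: abs(e["timestamp"] - frame["timestamp"]))
        fused.append({**frame, "overlay_text": nearest["text"]})
    return fused

logs = [{"timestamp": 0, "text": "boot ok"}, {"timestamp": 2, "text": "fan error"}]
frames = [{"timestamp": 0.1, "pixels": "..."}, {"timestamp": 1.9, "pixels": "..."}]
print([f["overlay_text"] for f in fuse(logs, frames)])  # → ['boot ok', 'fan error']
```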
For example, the third processor may analyze the second image data to obtain an analysis result, obtain first instruction information according to the analysis result, and output the first instruction information through the data display module 5023 to instruct the user to perform at least one operation on the target device.
For example, the communication module 5021 may transmit the second image data to the remote processing device, receive second instruction information transmitted by the remote processing device, and output the second instruction information to instruct the user to perform at least one operation on the target device.
Illustratively, the user may further input data through a human-computer interaction window provided by the data display module 5023, so that the processing device 502 obtains the third data; the third data, together with the associated first data and second data, can be used to determine the operating state of the target device 501.
For example, the processing device 502 may further obtain historical data generated in the target device before the log data, that is, the fourth data; the fourth data, together with the associated first data and second data, can be used to determine the operating state of the target device.
As can be seen from the above, the processing method provided in the embodiment of the present application can accurately locate the operating state of the target device 501, without requiring the user to understand the operating principle of the target device 501 or to describe its operating state in detail, through the data transmission between the processing device 502 and the target device 501, the data acquisition by the processing device 502, and the video fusion operation. This reduces the cost of on-site maintenance by after-sales technicians and also improves the efficiency of locating and resolving faults of the target device 501.
Based on the foregoing embodiment, an embodiment of the present application further provides a processing device 502, fig. 6 is a schematic structural diagram of the processing device 502 provided in the embodiment of the present application, and as shown in fig. 6, the processing device 502 may include a first communication module 601, an acquisition module 602, and a processing module 603; wherein:
a first communication module 601, configured to obtain first data transmitted by a target device;
the acquisition module 602 is configured to obtain second data including environment information of an environment where the target device is located;
the processing module 603 is configured to process the first data and the second data to obtain the first data and the second data associated with each other.
In one embodiment, the first communication module 601 is configured to obtain first data related to an operating state of a target device through a communication connection with the target device;
the acquisition module 602 is configured to acquire second data, which is output by the target device and is related to the operating state of the target device;
the first data and the second data which are associated are established and can be used for determining the operation state of the target equipment.
In one embodiment, the processing module 603 is configured to obtain input third data; the first data and the second data and the third data which are associated are established and can be used for determining the operation state of the target equipment.
In one embodiment, the first communication module 601 is configured to obtain fourth data transmitted by the target device; wherein the fourth data comprises data generated by the target device before the communication connection is established; the first data and the second data and the fourth data which are associated are established and can be used for determining the operation state of the target equipment.
In one embodiment, the processing module 603 is configured to analyze the first data to determine acquisition parameters, wherein the acquisition parameters include at least position information and/or identification information of at least one module of the target device, and are used by the data acquisition device to acquire the second data.
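A hedged sketch of deriving acquisition parameters from the first data; the log fields and the `layout` map from module identifiers to positions are assumed inputs, not part of the embodiment:

```python
def derive_acquisition_parameters(first_data, layout):
    """Analyze the first data to determine acquisition parameters: the
    identification and position of each module reporting errors, for the
    data acquisition device (e.g. a camera) to acquire the second data.
    `layout` maps module identifiers to their positions on the device."""
    modules = {e["module"] for e in first_data if e.get("level") == "ERROR"}
    return [{"module_id": m, "position": layout.get(m, "unknown")}
            for m in sorted(modules)]

layout = {"fan": "rear left", "display": "front panel"}
logs = [{"level": "ERROR", "module": "fan"}, {"level": "INFO", "module": "display"}]
print(derive_acquisition_parameters(logs, layout))
```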
In one embodiment, the first data includes text data; the second data includes first image data; a processing module 603, configured to process the text data and the first image data to obtain second image data; and the second image data comprises at least part of the first image data and data obtained from at least part of the text data.
In one embodiment, the first image data comprises a set of multiple frames of image data; the second image data includes a set of multiple frames of image data;
The processing module 603 is configured to obtain first period information of the text data and second period information of the first image data; determine a target period based on the first period information and the second period information; and process the target image data and the target text data to obtain the second image data. The target image data includes multiple frames of image data within the target period in the first image data; the target text data includes at least part of the text data within the target period; the nth frame of the second image data includes at least partial image data of the nth frame of the target image data and data obtained from the text data at the nth time of the target text data; n is an integer greater than or equal to 1.
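The target-period determination can be read as the overlap of the two time ranges, with frames and text entries then paired index by index; a sketch under those assumptions (the tuple representation of periods is hypothetical):

```python
def determine_target_period(first_period, second_period):
    """Target period as the overlap of the text data's period (first period
    information) and the image data's period (second period information).
    Periods are (start, end) tuples; returns None if they do not overlap."""
    start = max(first_period[0], second_period[0])
    end = min(first_period[1], second_period[1])
    return (start, end) if start <= end else None

def pair_by_frame(target_image_data, target_text_data):
    """The nth frame of the second image data combines the nth target frame
    with the text data at the nth time of the target text data."""
    return list(zip(target_image_data, target_text_data))

print(determine_target_period((0, 10), (3, 15)))  # → (3, 10)
```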
In one embodiment, the processing module 603 is configured to analyze the first data and the second data that are associated with each other, so as to obtain an analysis result; obtaining first instruction information based on the analysis result; the first instruction information comprises instruction information for executing operation on the target equipment; and outputting the first instruction information.
In one embodiment, the processing device 502 further comprises a second communication module for transmitting the first data and the second data establishing the association to a remote processing device; obtaining second instruction information transmitted by the remote processing equipment; the second instruction information is obtained by analyzing the first data and the second data which are related; the second instruction information includes instruction information for executing an operation on the target device;
the processing module 603 is configured to output the second instruction information.
Illustratively, the first communication module 601, the acquisition module 602, the processing module 603, and the second communication module may be implemented by a processor of the processing device 502.
Based on the foregoing embodiments, the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor of an electronic device, the processing method according to any one of the foregoing embodiments can be implemented.
The foregoing description of the various embodiments is intended to highlight various differences between the embodiments, and the same or similar parts may be referred to each other, and for brevity, will not be described again herein.
The methods disclosed in the method embodiments provided by the present application can be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in various product embodiments provided by the application can be combined arbitrarily to obtain new product embodiments without conflict.
The features disclosed in the various method or apparatus embodiments provided herein may be combined in any combination to arrive at new method or apparatus embodiments without conflict.
The computer-readable storage medium may be a Read Only Memory (ROM), a Programmable Read Only Memory (PROM), an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Ferroelectric Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); it may also be any of various electronic devices including one or any combination of the above-mentioned memories, such as mobile phones, computers, tablet devices, and personal digital assistants.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly also by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods described in the embodiments of the present application.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. A method of processing, comprising:
obtaining first data transmitted by target equipment;
obtaining second data containing environment information of the environment where the target device is located;
and processing the first data and the second data to obtain the first data and the second data which are associated.
2. The method of claim 1, wherein the obtaining first data transmitted by a target device comprises:
obtaining first data related to the running state of the target device through communication connection with the target device;
the obtaining of the second data containing the environment information of the environment where the target device is located includes:
acquiring second data which is output by the target equipment and is related to the running state of the target equipment;
wherein the associated first data and second data can be used to determine an operating state of the target device.
3. The method of claim 2, wherein the method further comprises:
obtaining input third data; wherein the first data and the second data, and the third data associated with the target device are usable to determine an operating status of the target device.
4. The method of claim 2, wherein the first data comprises data generated by the target device after the communication connection is established; the method further comprises the following steps:
obtaining fourth data transmitted by the target equipment; wherein the fourth data comprises data generated by the target device prior to establishment of the communication connection; the first data and the second data, and the fourth data, which establish the association, can be used to determine an operating state of the target device.
5. The method of any of claims 1 to 4, further comprising:
analyzing the first data to determine acquisition parameters; the acquisition parameters at least comprise position information and/or identification information of at least one module of the target device; the acquisition parameters are used by the data acquisition device to acquire the second data.
6. The method of claim 1, wherein the first data comprises text data; the second data comprises first image data; the processing the first data and the second data to obtain the first data and the second data which establish the association includes:
processing the text data and the first image data to obtain second image data; wherein the second image data includes at least a part of the first image data and data derived from at least a part of the text data.
7. The method of claim 6, wherein the first image data comprises a set of multiple frames of image data; the second image data comprises a set of multiple frames of image data; the processing the text data and the first image data to obtain second image data includes:
acquiring first time period information of the text data and second time period information of the first image data;
determining a target period based on the first period information and the second period information;
processing target image data and target text data to obtain second image data; wherein the target image data includes a plurality of frames of image data within the target period in the first image data; the target text data comprises at least part of text data in the text data within the target time period; the nth frame image data of the second image data comprises at least partial image data of the nth frame image data of the target image data and data obtained from text data at the nth moment of the target text data; n is an integer greater than or equal to 1.
8. The method of claim 1, wherein the method further comprises:
analyzing the first data and the second data which are associated to obtain an analysis result;
obtaining first instruction information based on the analysis result; the first instruction information comprises instruction information for executing operation on the target equipment;
and outputting the first instruction information.
9. The method of claim 1, wherein the method further comprises:
sending the first data and the second data associated to a remote processing device;
obtaining second instruction information transmitted by the remote processing equipment; the second instruction information is obtained by analyzing the first data and the second data which are related; the second instruction information comprises instruction information for executing operation on the target device;
and outputting the second instruction information.
10. A processing device, comprising:
the first communication module is used for obtaining first data transmitted by target equipment;
the acquisition module is used for acquiring second data containing environment information of the environment where the target equipment is located;
and the processing module is used for processing the first data and the second data to obtain the first data and the second data which are associated.
CN202210114309.8A 2022-01-30 2022-01-30 Processing method and equipment Pending CN114546762A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210114309.8A CN114546762A (en) 2022-01-30 2022-01-30 Processing method and equipment


Publications (1)

Publication Number Publication Date
CN114546762A true CN114546762A (en) 2022-05-27

Family

ID=81672767



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination