CN114003160B - Data visual display method, device, computer equipment and storage medium

Data visual display method, device, computer equipment and storage medium

Info

Publication number
CN114003160B
Authority
CN
China
Prior art keywords
target
video frame
data analysis
data
video
Prior art date
Legal status
Active
Application number
CN202111277492.5A
Other languages
Chinese (zh)
Other versions
CN114003160A (en
Inventor
符峥
龙良曲
Current Assignee
Insta360 Innovation Technology Co Ltd
Original Assignee
Insta360 Innovation Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Insta360 Innovation Technology Co Ltd filed Critical Insta360 Innovation Technology Co Ltd
Priority to CN202111277492.5A priority Critical patent/CN114003160B/en
Publication of CN114003160A publication Critical patent/CN114003160A/en
Priority to PCT/CN2022/125844 priority patent/WO2023071861A1/en
Application granted granted Critical
Publication of CN114003160B publication Critical patent/CN114003160B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Generation (AREA)

Abstract

The application relates to a data visual display method, a data visual display device, computer equipment and a storage medium. The method comprises the following steps: responding to the video display operation of a user, and displaying the video data to be processed; responding to the user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame; responding to user data analysis selection operation, and determining a target data analysis identification corresponding to the target video frame; acquiring a target data analysis result corresponding to the target video frame according to the video frame identification of the target video frame and the target data analysis identification; and drawing and rendering the target data analysis result to a target video frame. By adopting the method, redundancy and information blurring of data visualization can be reduced.

Description

Data visual display method, device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to a method and apparatus for displaying data in a visual manner, a computer device, and a storage medium.
Background
With the popularity of video capturing and playing devices, video data is becoming an important carrier for delivering information. Video data is data containing both spatial attributes (position structure of an object in an image) and temporal attributes (image frame sequence), and the relationship and variation of data of variables in the spatial dimension and the temporal dimension need to be considered at the time of analysis.
In the conventional technology, for analysis results corresponding to video data, such as charts and spatial structure data such as detection frames, a corresponding layer is typically rendered first, and the layer is then overlaid on the video data for visual display.
However, because the layers are overlaid directly on the video data, the conventional method tends to cause redundancy and information blurring in the data visualization.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a data visualization display method, apparatus, computer device, and storage medium that can reduce redundancy and information blurring of data visualization.
A method of visual presentation of data, the method comprising:
responding to the video display operation of a user, and displaying the video data to be processed;
responding to the user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame;
responding to user data analysis selection operation, and determining a target data analysis identification corresponding to the target video frame;
acquiring a target data analysis result corresponding to the target video frame according to the video frame identification of the target video frame and the target data analysis identification;
and drawing and rendering the target data analysis result to a target video frame.
In one embodiment, the data visualization display method further comprises:
acquiring video data to be processed;
performing frame extraction processing on the video data to be processed to obtain a video frame set corresponding to the video data to be processed;
respectively carrying out information analysis on video frames in a video frame set to obtain a data analysis result corresponding to the video frames;
and acquiring a video frame identifier of the video frame, classifying the same item of data analysis result, generating a corresponding data analysis identifier, and establishing an association relationship between the video frame identifier and the data analysis result according to the data analysis identifier.
In one embodiment, respectively performing information analysis on video frames in a video frame set to obtain a data analysis result corresponding to the video frames, where the data analysis result includes:
and respectively carrying out target detection on the video frames in the video frame set to obtain target detection results corresponding to the video frames, wherein the target detection results comprise detection frame information.
In one embodiment, rendering and rendering the target data analysis results to the target video frame includes:
determining the data type of the target data analysis result;
and calling the graphic type according to the data type, and drawing and rendering the target data analysis result to the target video frame.
In one embodiment, invoking the graphics type according to the data type, drawing and rendering the target data analysis result to the target video frame comprises:
when the data type is numerical data, drawing the same item of data analysis results corresponding to all video frames in the video frame set into a numerical curve;
marking a target data analysis result on a numerical curve;
and rendering the marked numerical curve to a target video frame.
In one embodiment, invoking the graphics type according to the data type, drawing and rendering the target data analysis result to the target video frame includes:
when the data type is structural data, determining drawing coordinates according to a target data analysis result;
and drawing and rendering the target data analysis result to the target video frame according to the drawing coordinates.
In one embodiment, the data visualization display method further comprises:
creating a data analysis selection interaction control for responding to the user data analysis selection operation.
A data visualization presentation device, the device comprising:
the video display module is used for responding to the video display operation of a user and displaying the video data to be processed;
the video frame display module is used for responding to the user video frame selection operation, determining a target video frame in the video data to be processed and displaying the target video frame;
The data analysis response module is used for responding to the user data analysis selection operation and determining a target data analysis identifier corresponding to the target video frame;
the data acquisition module is used for acquiring a target data analysis result corresponding to the target video frame according to the video frame identification of the target video frame and the target data analysis identification;
and the processing module is used for drawing the target data analysis result and rendering the target data analysis result to a target video frame.
A computer device comprising a memory storing a computer program and a processor which when executing the computer program performs the steps of:
responding to the video display operation of a user, and displaying the video data to be processed;
responding to the user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame;
responding to user data analysis selection operation, and determining a target data analysis identification corresponding to the target video frame;
acquiring a target data analysis result corresponding to the target video frame according to the video frame identification of the target video frame and the target data analysis identification;
and drawing and rendering the target data analysis result to a target video frame.
A computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of:
responding to the video display operation of a user, and displaying the video data to be processed;
responding to the user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame;
responding to user data analysis selection operation, and determining a target data analysis identification corresponding to the target video frame;
acquiring a target data analysis result corresponding to the target video frame according to the video frame identification of the target video frame and the target data analysis identification;
and drawing and rendering the target data analysis result to a target video frame.
According to the data visualization display method, the device, the computer equipment, and the storage medium, the video data to be processed is displayed in response to the user video display operation; a target video frame in the video data to be processed is determined and displayed in response to the user video frame selection operation; a target data analysis identifier corresponding to the target video frame is determined in response to the user data analysis selection operation; a target data analysis result corresponding to the target video frame is acquired according to the video frame identifier of the target video frame and the target data analysis identifier; and the target data analysis result is drawn and rendered to the target video frame. Throughout this process, real-time display control of the data analysis result is achieved by responding to user operations, instead of directly overlaying the layer corresponding to the data analysis result on the video data, so that redundancy and information blurring of the data visualization can be reduced.
Drawings
FIG. 1 is a flow chart of a method for visualizing data in one embodiment;
FIG. 2 is a schematic diagram of response to user operation in one embodiment;
FIG. 3 is a flow chart of drawing and rendering the target data analysis result to a target video frame in one embodiment;
FIG. 4 is a flow chart of rendering a marked numerical curve to a target video frame according to one embodiment;
FIG. 5 is a flow chart of a method for visualizing data in another embodiment;
FIG. 6 is a block diagram of a data visualization presentation apparatus in one embodiment;
fig. 7 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, a method for visually displaying data is provided. The method is described here as applied to a terminal by way of illustration; it is understood that the method may also be applied to a server, or to a system including the terminal and the server and implemented through interaction between the terminal and the server. The terminal may be, but is not limited to, various personal computers, notebook computers, smartphones, tablet computers, portable wearable devices, motion cameras, panoramic cameras or pan-tilt cameras, and the server may be implemented by an independent server or a server cluster formed by a plurality of servers. In this embodiment, the method includes the steps of:
Step 102, displaying the video data to be processed in response to the video display operation of the user.
The user video display operation refers to an operation in which a user wants to display video on a terminal. The video data to be processed refers to video data that the user wants to display on the terminal.
Specifically, when the visual display of data is required, the user first needs to select video data corresponding to the visual display of data, and the user can perform video display operation through interaction with the terminal, namely, select the video data to be displayed, and the terminal can respond to the video display operation of the user and acquire and display the video data to be processed according to the video display operation of the user. For example, a video list may be displayed on a display interface of the terminal, and a user may select video data desired to be displayed on the terminal according to the video list.
Step 104, in response to the user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame.
The user video frame selection operation refers to an operation that a user wants to display any video frame in the video data to be processed on the terminal. The target video frame refers to a video frame that the user wants to display on the terminal.
Specifically, after the terminal displays the video data to be processed, the user further interacts with the terminal to perform a video frame selection operation, that is, selects the video frame to be displayed; the terminal responds to the user video frame selection operation, determines a target video frame in the video data to be processed according to the operation, and displays the target video frame. For example, when the terminal displays the video data to be processed, the terminal synchronously displays a video progress bar corresponding to the video data to be processed, and the user can perform the video frame selection operation by dragging the video progress bar, clicking any position on the video progress bar, and the like.
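By way of illustration only, a minimal Python sketch of mapping a progress-bar position to a target video frame index is given below; the function name and the assumption that the position is normalized to the range 0 to 1 are not part of the original disclosure.

```python
# Illustrative only: map a normalized progress-bar position (0.0-1.0) to a frame index.
def select_target_frame(progress_ratio: float, total_frames: int) -> int:
    progress_ratio = min(max(progress_ratio, 0.0), 1.0)      # clamp the user input
    return min(int(progress_ratio * total_frames), total_frames - 1)
```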
And step 106, responding to the user data analysis selection operation, and determining a target data analysis identification corresponding to the target video frame.
The user data analysis selection operation refers to an operation in which a user wants to display an arbitrary data analysis result on the terminal. The target data analysis identifier is a data analysis identifier corresponding to a data analysis result which the user wants to display on the terminal, the data analysis identifier is used for distinguishing the data analysis result, and the data analysis identifiers corresponding to different data analysis results are different. For example, the target data analysis identifier may specifically refer to a character string corresponding to a data analysis result that the user wants to display on the terminal.
Specifically, after the terminal displays the target video frame, the user further interacts with the terminal to perform data analysis selection operation, namely, selects the target data analysis identifier corresponding to the data analysis result to be displayed, and the terminal responds to the user data analysis selection operation and determines the target data analysis identifier corresponding to the target video frame according to the user data analysis selection operation. For example, when the terminal displays the target video frame, the terminal synchronously displays the data analysis selection interaction control, and the user can complete the data analysis selection operation by selecting the data analysis selection interaction control.
For example, as shown in fig. 2, the terminal may display the video data to be processed in response to the user video display operation. While displaying the video data to be processed, the terminal may synchronously display a video progress bar corresponding to the video data to be processed, and the user may perform a video frame selection operation by dragging the video progress bar, clicking any position on the video progress bar, or the like. The terminal responds to the user video frame selection operation and displays the target video frame. While displaying the target video frame, the terminal may synchronously display a data analysis selection interaction control, and the user may perform the data analysis selection operation by selecting the data analysis selection interaction control to determine the data analysis result to be displayed on the terminal.
Further, when the user needs to look up other video frames in the video data to be processed and the corresponding data analysis results, the user can reselect the target video frame by dragging the video progress bar, clicking any position on the video progress bar, and the like, and can change the selection of the data analysis selection interaction control to perform a new data analysis selection operation. It should be noted that, after the user reselects the target video frame, the user does not need to operate the data analysis selection interaction control again; in that case, the displayed data analysis result belongs to the same item as the data analysis result displayed for the previously selected target video frame.
And step 108, acquiring a target data analysis result corresponding to the target video frame according to the video frame identification of the target video frame and the target data analysis identification.
Wherein the video frame identifier is used to distinguish video frames. For example, the video frame identifier may specifically refer to a sequence number of the video frame, or to a timestamp corresponding to the video frame. The target data analysis result refers to the data analysis result that the user wants to display on the terminal; a data analysis result refers to analysis information extracted by applying a suitable statistical analysis method to the spatial and temporal information in a video frame, which is useful to a data analyst or the user, can support conclusions, and is usually expressed in the form of numerical values or characters. For example, the data analysis result may specifically refer to global image information such as brightness values of video frames. For another example, the data analysis result may specifically refer to a target detection result of an object in a video frame detected by using a target detection method, where the target detection result includes detection frame information. For another example, the data analysis result may specifically refer to character action category information in a video frame.
Specifically, the terminal may obtain a video frame identifier of a video frame in advance, classify the same item of data analysis result, generate a corresponding data analysis identifier, establish an association relationship between the video frame identifier and the data analysis result according to the data analysis identifier, and after determining a target video frame identifier and a target data analysis identifier, query the established association relationship according to the video frame identifier of the target video frame and the target data analysis identifier, and obtain a target data analysis result corresponding to the target video frame.
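A minimal Python sketch of such a lookup is shown below, assuming the association relationship has been loaded into memory as a nested mapping keyed by data analysis identifier and video frame identifier; the identifiers and values are illustrative only.

```python
# Illustrative layout of the association relationship: data analysis identifier ->
# video frame identifier -> data analysis result.
associations = {
    "brightness": {0: 1.000, 1: 1.000, 2: 1.234},                   # numerical results
    "detections": {0: [(0, 20, 20, 40, 30), (1, 15, 15, 25, 35)]},  # structural results
}

def get_target_result(frame_id: int, analysis_id: str):
    """Return the target data analysis result for the target video frame, or None."""
    return associations.get(analysis_id, {}).get(frame_id)
```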
And 110, drawing and rendering the target data analysis result to a target video frame.
Specifically, when information analysis is performed on video frames in the video data to be processed in advance, the terminal designates a data type of the data analysis result and stores the data type and the data analysis result together. Therefore, after the target data analysis result is obtained, the terminal can synchronously obtain the data type of the target data analysis result, and further can call the corresponding graph type according to the data type so as to draw and render the target data analysis result to the target video frame. Wherein the data type comprises numerical data, text data, structural data and the like.
According to the data visualization display method, the video data to be processed is displayed in response to the user video display operation; a target video frame in the video data to be processed is determined and displayed in response to the user video frame selection operation; a target data analysis identifier corresponding to the target video frame is determined in response to the user data analysis selection operation; a target data analysis result corresponding to the target video frame is acquired according to the video frame identifier of the target video frame and the target data analysis identifier; and the target data analysis result is drawn and rendered to the target video frame. Throughout this process, real-time display control of the data analysis result is achieved by responding to user operations, instead of directly overlaying the layer corresponding to the data analysis result on the video data, so that redundancy and information blurring of the data visualization can be reduced.
In one embodiment, the data visualization display method further comprises:
acquiring video data to be processed;
performing frame extraction processing on the video data to be processed to obtain a video frame set corresponding to the video data to be processed;
respectively carrying out information analysis on video frames in a video frame set to obtain a data analysis result corresponding to the video frames;
And acquiring a video frame identifier of the video frame, classifying the same item of data analysis result, generating a corresponding data analysis identifier, and establishing an association relationship between the video frame identifier and the data analysis result according to the data analysis identifier.
The video data to be processed refers to video data shot by shooting equipment capable of acquiring digital video, and the shooting equipment comprises, but is not limited to, a single-lens reflex camera, a non-lens reflex camera, a mobile phone with shooting and photographing functions, a moving camera, a panoramic camera, a cradle head camera, an unmanned aerial vehicle and the like.
Specifically, when video data needs to be analyzed, the terminal acquires the video data to be processed, performs frame extraction processing on the video data to be processed to obtain a video frame set corresponding to the video data to be processed, applies an image analysis and video analysis method based on digital image processing to respectively perform information analysis on video frames in the video frame set to obtain a data analysis result corresponding to the video frames, acquires video frame identifiers of the video frames, classifies the same data analysis result, generates corresponding data analysis identifiers, and establishes association relation between the video frame identifiers and the data analysis result according to the data analysis identifiers. The frame extraction mode of the video data to be processed may be uniform frame extraction, or may be set according to needs, which is not particularly limited herein.
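A minimal sketch of uniform frame extraction is shown below; the use of OpenCV and the sampling step are assumptions, since the disclosure does not prescribe a particular library or extraction interval.

```python
# Assumed implementation: uniform frame extraction with OpenCV (cv2).
import cv2

def extract_frames(video_path: str, step: int = 1):
    """Return the video frame set as a list of (frame_id, image) pairs."""
    frames = []
    cap = cv2.VideoCapture(video_path)
    frame_id = 0
    while True:
        ok, image = cap.read()
        if not ok:                       # end of the video data to be processed
            break
        if frame_id % step == 0:         # uniform frame extraction
            frames.append((frame_id, image))
        frame_id += 1
    cap.release()
    return frames
```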
Information analysis is performed on the video frames in the video frame set by using image analysis and video analysis methods based on digital image processing; the information analysis is mainly used to obtain global or local information of the video frames. For example, performing information analysis may specifically refer to determining global image information such as luminance values of video frames. As a further example, information analysis may specifically refer to obtaining detection frame information of an object in a video frame using a target detection method. As a further example, performing information analysis may specifically refer to determining a character action category in a video frame, and so forth. In this embodiment, the data analysis result is represented in the form of a numerical value or a character. For example, the luminance value may be represented directly by a numerical value, the detection frame information may be represented by numerical coordinates, and the character action category may be represented by characters.
The association relationship between the video frame identifier and the data analysis result is established as follows: after data analysis results of the same item are classified, a corresponding data analysis identifier is generated, a time attribute synchronized with the video frame is added to the data analysis result, and the data analysis result with the added time attribute is stored according to the data analysis identifier, which facilitates dynamic synchronous display of the video frame and the data analysis result. The video frame identifier here may specifically refer to a sequence number of the video frame or a corresponding timestamp. When the video data to be processed is analyzed, the number of video frames to be analyzed and the type of information to be analyzed can be set as needed. If information analysis is performed on all the video frames in the video frame set, the association relationship between all the video frame identifiers and the data analysis results is established directly. If information analysis is performed on only part of the video frames, then after the information analysis of the whole video data to be processed is completed, the data analysis results need to be supplemented or interpolated when the association relationship is established, so that the number of data analysis results is consistent with the number of video frame identifiers; that is, the association relationship between the video frame identifiers and the data analysis results also needs to be established for the video frames in the video frame set that have not undergone information analysis.
The data analysis result may be supplemented or interpolated when the association relationship is established or when the data analysis result is displayed. When the data analysis result is displayed, if the corresponding data analysis result is not obtained in the association relation according to the video frame identifier corresponding to the video frame, supplementing or interpolating is performed according to the existing data analysis result in the association relation so as to obtain the data analysis result of the video frame.
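A sketch of such supplementation or interpolation for numerical results is given below; linear interpolation is an assumption, and the sketch presumes that at least one frame has been analysed.

```python
# Illustrative linear interpolation of numerical results for frames without analysis.
def fill_missing_results(sparse: dict, total_frames: int) -> dict:
    """sparse maps analysed frame_id -> value; returns a value for every frame_id."""
    known = sorted(sparse)               # assumes sparse is non-empty
    full = {}
    for fid in range(total_frames):
        if fid in sparse:
            full[fid] = sparse[fid]
            continue
        prev = max((k for k in known if k < fid), default=None)
        nxt = min((k for k in known if k > fid), default=None)
        if prev is None:                 # before the first analysed frame
            full[fid] = sparse[nxt]
        elif nxt is None:                # after the last analysed frame
            full[fid] = sparse[prev]
        else:                            # between two analysed frames
            t = (fid - prev) / (nxt - prev)
            full[fid] = sparse[prev] + t * (sparse[nxt] - sparse[prev])
    return full
```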
The association relationship refers to a correspondence between video frame identifiers and data analysis results represented by structured sequence data, where the structured sequence data includes, but is not limited to, tables, CSV (Comma-Separated Values) data, JSON (JavaScript Object Notation) data, and the like. The specific format arranges the data analysis result corresponding to each frame or time point in video frame or time order. Taking CSV data as an example, the association relationship can be expressed as: frame_id,value (frame number, value): 0,1.000; 1,1.000; 2,1.234.
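By way of illustration, the CSV form of the association relationship described above could be read back as follows; the presence of a header row is an assumption.

```python
# Read "frame_id,value" records into a frame_id -> value mapping.
import csv

def load_numeric_association(path: str) -> dict:
    mapping = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):                # header row: frame_id,value
            mapping[int(row["frame_id"])] = float(row["value"])
    return mapping
```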
In this embodiment, the data analysis result can be obtained and the video frame identifier and the data analysis result can be associated, so that the alignment and accurate display can be realized when the data analysis result needs to be displayed on the video frame.
In one embodiment, respectively performing information analysis on video frames in a video frame set to obtain a data analysis result corresponding to the video frames, where the data analysis result includes:
and respectively carrying out target detection on the video frames in the video frame set to obtain target detection results corresponding to the video frames, wherein the target detection results comprise detection frame information.
The detection frame information refers to coordinate information of an object detected in the video frame. Taking detection frame information in CSV format as an example, frame_id is the frame number, bbox_id is the number of the detection frame within the frame, x1, y1 are the image coordinates of the upper left corner of the detection frame, and x2, y2 are the image coordinates of the lower right corner of the detection frame. The detection frame information may be specifically expressed as: frame_id,bbox_id,x1,y1,x2,y2: 0,0,20,20,40,30; 0,1,15,15,25,35; 1,2,30,30,45,50.
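A companion sketch for the detection frame records is shown below, grouping the boxes by frame number; a header row and the CSV layout described above are assumed.

```python
# Group frame_id,bbox_id,x1,y1,x2,y2 records by frame number.
import csv
from collections import defaultdict

def load_detection_frames(path: str) -> dict:
    boxes = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            boxes[int(row["frame_id"])].append(
                (int(row["bbox_id"]),
                 int(row["x1"]), int(row["y1"]),
                 int(row["x2"]), int(row["y2"])))
    return dict(boxes)
```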
Specifically, the terminal acquires a pre-trained target detection model, and performs target detection on the video frames in the video frame set by using the pre-trained target detection model to obtain target detection results corresponding to the video frames, where the target detection results include detection frame information. The pre-trained target detection model is obtained by training on sample image data carrying target frames. During pre-training, the terminal first obtains an initial target detection model, inputs the sample image data into the initial target detection model, and has the initial target detection model output predicted detection frames corresponding to the sample image data. A loss function corresponding to the initial target detection model is obtained by comparing the predicted detection frames with the target frames, the model parameters of the initial target detection model are adjusted accordingly, and target detection is performed on the sample image data again with the adjusted model to obtain a new loss value, until the loss function converges and the pre-trained target detection model is obtained.
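A very generic sketch of such a pre-training loop is shown below; the framework (PyTorch), optimizer, and loss function are assumptions, as the disclosure does not name them.

```python
# Assumed PyTorch-style pre-training loop for the target detection model.
import torch

def pretrain_detector(model, data_loader, loss_fn, epochs: int = 10, lr: float = 1e-4):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for images, target_frames in data_loader:
            predicted_frames = model(images)                 # predicted detection frames
            loss = loss_fn(predicted_frames, target_frames)  # compare with target frames
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()                                 # adjust model parameters
    return model
```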
In this embodiment, by respectively performing object detection on video frames in the video frame set, an object detection result corresponding to the video frames can be obtained, where the object detection result includes detection frame information, so as to implement information analysis on the video frames. For example, the analysis of the information of the video frame may specifically refer to that the target object included on the video frame and the type and the position of the target object may be determined according to the target detection result, so that the corresponding target detection result may be displayed when the video frame is displayed.
In one embodiment, rendering and rendering the target data analysis results to the target video frame includes:
determining the data type of the target data analysis result;
and calling the graphic type according to the data type, and drawing and rendering the target data analysis result to the target video frame.
Specifically, when information analysis is performed on video frames in the video data to be processed in advance, the terminal designates a data type of the data analysis result and stores the data type and the data analysis result together. Therefore, after the target data analysis result is obtained, the terminal can synchronously obtain the data type of the target data analysis result, and further can call the corresponding graph type according to the data type so as to draw and render the target data analysis result to the target video frame. Wherein the data type comprises numerical data, text data, structural data and the like. Further, when drawing and rendering, different graphic colors and fonts can be adopted to identify different target data analysis results.
Specifically, the terminal can draw and render the target data analysis result to the target video frame by using a programmable GUI (Graphical User Interface) library. A programmable GUI library here refers to a GUI code library that can be driven by program instructions to display information and operate the computer in a graphical manner; it can draw the target data analysis result as a specific shape, chart, or text according to the data type and render it to the target video frame. For example, a flow chart for drawing and rendering the target data analysis result to the target video frame may be shown in fig. 3. The program in the terminal reads the analysis result data (namely the target data analysis result) and calls the chart or graph type of the GUI library (namely the programmable GUI library) corresponding to the data type specified in the analysis result file, so that the GUI library calls the system graphical interface, draws and renders the analysis result chart or graph together with the target video frame, and displays the rendering result on the system graphical interface of the terminal.
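A minimal dispatch sketch is given below; the data type names and the drawing routines supplied through the drawers mapping are hypothetical placeholders for calls into the programmable GUI library.

```python
# Hypothetical dispatch from data type to a drawing routine of the GUI library.
def render_result(frame, result, data_type, drawers: dict, all_results=None):
    """drawers maps a data type (e.g. "numerical", "structural", "text") to a
    drawing function supplied by the programmable GUI library wrapper."""
    drawer = drawers.get(data_type)
    if drawer is None:
        return frame                                 # unknown data type: leave frame unchanged
    if data_type == "numerical":
        return drawer(frame, result, all_results)    # numerical curve with marked value
    return drawer(frame, result)                     # e.g. detection frames or a text label
```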
In this embodiment, by determining the data type of the target data analysis result and calling the graphics type according to the data type, the target data analysis result can be drawn and rendered to the target video frame.
In one embodiment, invoking the graphics type according to the data type, drawing and rendering the target data analysis result to the target video frame comprises:
when the data type is numerical data, drawing the same item of data analysis results corresponding to all video frames in the video frame set into a numerical curve;
marking a target data analysis result on a numerical curve;
and rendering the marked numerical curve to a target video frame.
Specifically, when the data type is numerical data, the terminal draws the same item of data analysis results corresponding to all video frames in the video frame set into a numerical curve, marks the target data analysis result on the numerical curve, and renders the marked numerical curve to the target video frame. For example, when the data analysis result is the luminance value of the video frame, the data type is numerical data, and when the luminance value corresponding to the video frame needs to be displayed on the target video frame, the terminal draws the luminance values corresponding to all the video frames into a numerical curve, marks the luminance value corresponding to the target video frame on the numerical curve, and renders the marked numerical curve to the target video frame.
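An illustrative sketch using matplotlib (a library choice not made in the disclosure) is given below: the same item of numerical results for all frames is drawn as a curve, and the target frame's value is marked on it.

```python
# Draw the numerical curve for one analysis item and mark the target frame's value.
import matplotlib.pyplot as plt

def draw_marked_curve(values: dict, target_frame_id: int, out_path: str = "curve.png"):
    frame_ids = sorted(values)
    series = [values[f] for f in frame_ids]
    fig, ax = plt.subplots(figsize=(4, 2))
    ax.plot(frame_ids, series, linewidth=1)                      # numerical curve
    ax.scatter([target_frame_id], [values[target_frame_id]],
               color="red", zorder=3)                            # mark the target result
    ax.set_xlabel("frame_id")
    ax.set_ylabel("value")
    fig.savefig(out_path, bbox_inches="tight")   # saved image can then be rendered onto the frame
    plt.close(fig)
```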
For example, a flow chart of rendering the marked numerical curve to the target video frame may be shown in fig. 4, where, for convenience of illustration, the marked numerical curve and the target video frame are displayed side by side in a layout form instead of a layer-stacking form. As can be seen from fig. 4, the numerical curve is drawn from the same item of data analysis results corresponding to all video frames, and for the left video frame and the right video frame, the corresponding target data analysis results are marked on the numerical curve, where the target data analysis result corresponding to the left video frame is 0.9 and the target data analysis result corresponding to the right video frame is 0.2.
In this embodiment, when the data type is numerical data, the same item of data analysis results corresponding to all video frames in the video frame set are drawn into a numerical curve, the numerical curve is marked with the target data analysis result, and the marked numerical curve is rendered to the target video frame, so that visual display of the numerical data can be realized.
In one embodiment, invoking the graphics type according to the data type, drawing and rendering the target data analysis result to the target video frame includes:
when the data type is structural data, determining drawing coordinates according to a target data analysis result;
and drawing and rendering the target data analysis result to the target video frame according to the drawing coordinates.
Specifically, when the data type is structural data, the terminal determines drawing coordinates according to the analysis result of the target data, and draws and renders the analysis result of the target data to the target video frame according to the drawing coordinates. For example, when the data analysis result is the detection frame information of the video frame, the data type is structural data, and when the corresponding detection frame information is required to be displayed on the target video frame, the terminal determines the drawing coordinates, namely the detection frame coordinates, according to the detection frame information, and then draws and renders the detection frame to the target video frame according to the detection frame coordinates.
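A sketch of drawing structural results (detection frames) at their image coordinates with OpenCV is shown below; the colour and line thickness are illustrative choices, not requirements of the disclosure.

```python
# Draw detection frames onto the target video frame at the given drawing coordinates.
import cv2

def draw_detection_boxes(frame, boxes):
    """boxes: iterable of (bbox_id, x1, y1, x2, y2) in image coordinates."""
    for bbox_id, x1, y1, x2, y2 in boxes:
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
        cv2.putText(frame, str(bbox_id), (x1, max(y1 - 5, 0)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    return frame
```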
In this embodiment, when the data type is structural data, the drawing coordinates are determined according to the analysis result of the target data, and the analysis result of the target data is drawn and rendered to the target video frame according to the drawing coordinates, so that visual display of the structural data can be realized.
In one embodiment, the data visualization display method further comprises:
creating a data analysis selection interaction control for responding to the user data analysis selection operation.
Specifically, the terminal creates data analysis selection interaction controls through the programmable GUI library; the number of data analysis selection interaction controls corresponds to the number of data analysis results, and the controls are configured to respond to the user data analysis selection operation, so that the user can control in real time which data analysis results are displayed on the video frame. Further, the terminal also creates a data type selection interaction control to respond to a user data type selection operation and control in real time which data types of data analysis results are displayed on the video frame.
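A hypothetical sketch with tkinter is shown below (the disclosure only requires a programmable GUI library): one checkbox is created per data analysis result, and a callback reports which results the user wants displayed.

```python
# One checkbox per data analysis result; on_change(analysis_id, selected) is a callback.
import tkinter as tk

def create_selection_controls(parent, analysis_ids, on_change):
    states = {}
    for analysis_id in analysis_ids:
        var = tk.BooleanVar(value=False)
        tk.Checkbutton(parent, text=analysis_id, variable=var,
                       command=lambda a=analysis_id, v=var: on_change(a, v.get())
                       ).pack(anchor="w")
        states[analysis_id] = var
    return states
```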
In the embodiment, the data analysis selection interaction control is created, so that the response to the user data analysis selection operation can be realized by utilizing the data analysis selection interaction control, and the real-time control data visual display is realized.
In one embodiment, as shown in fig. 5, a flow chart is used to illustrate a data visualization method of the present application, where the data visualization method specifically includes the following steps:
step 502, obtaining video data to be processed;
step 504, performing frame extraction processing on the video data to be processed to obtain a video frame set corresponding to the video data to be processed;
step 506, respectively performing information analysis on video frames in the video frame set to obtain a data analysis result corresponding to the video frames;
step 508, obtaining the video frame identification of the video frame, classifying the same item of data analysis result, generating a corresponding data analysis identification, and establishing the association relationship between the video frame identification and the data analysis result according to the data analysis identification;
step 510, responding to the video display operation of the user, and displaying the video data to be processed;
step 512, in response to the user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame;
step 514, creating a data analysis selection interaction control, wherein the data analysis selection interaction control is used for responding to user data analysis selection operation;
step 516, responding to the user data analysis selection operation, and determining a target data analysis identifier corresponding to the target video frame;
Step 518, obtaining a target data analysis result corresponding to the target video frame according to the video frame identification of the target video frame and the target data analysis identification;
step 520, determining the data type of the target data analysis result, jumping to step 522 when the data type is the numerical data, and jumping to step 528 when the data type is the structural data;
step 522, drawing the same item of data analysis result corresponding to all video frames in the video frame set into a numerical curve;
step 524, marking the analysis result of the target data on the numerical curve;
step 526, rendering the marked numerical curve to the target video frame;
step 528, determining the drawing coordinates according to the analysis result of the target data;
and step 530, drawing and rendering the target data analysis result to the target video frame according to the drawing coordinates.
It should be understood that, although the steps in the flowcharts related to the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated herein, the execution of these steps is not strictly limited to that order, and the steps may be executed in other orders. Moreover, at least a part of the steps in the flowcharts related to the above embodiments may include a plurality of sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times; the order of execution of these sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with at least a part of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 6, there is provided a data visualization display apparatus including: a video display module 602, a video frame display module 604, a data analysis response module 606, a data acquisition module 608, and a processing module 610, wherein:
the video display module 602 is configured to display video data to be processed in response to a user video display operation;
the video frame display module 604 is configured to determine a target video frame in the video data to be processed in response to a user video frame selection operation, and display the target video frame;
a data analysis response module 606 for determining a target data analysis identifier corresponding to the target video frame in response to the user data analysis selection operation;
the data acquisition module 608 is configured to acquire a target data analysis result corresponding to a target video frame according to a video frame identifier of the target video frame and a target data analysis identifier;
and the processing module 610 is used for drawing and rendering the target data analysis result to a target video frame.
According to the data visualization display device, the video data to be processed is displayed in response to the user video display operation; a target video frame in the video data to be processed is determined and displayed in response to the user video frame selection operation; a target data analysis identifier corresponding to the target video frame is determined in response to the user data analysis selection operation; a target data analysis result corresponding to the target video frame is acquired according to the video frame identifier of the target video frame and the target data analysis identifier; and the target data analysis result is drawn and rendered to the target video frame. Throughout this process, real-time display control of the data analysis result is achieved by responding to user operations, instead of directly overlaying the layer corresponding to the data analysis result on the video data, so that redundancy and information blurring of the data visualization can be reduced.
In one embodiment, the data visualization display device further comprises an analysis module, the analysis module is used for obtaining video data to be processed, performing frame extraction processing on the video data to be processed to obtain a video frame set corresponding to the video data to be processed, respectively performing information analysis on video frames in the video frame set to obtain a data analysis result corresponding to the video frames, obtaining video frame identifiers of the video frames, classifying the same item of data analysis result, generating corresponding data analysis identifiers, and establishing association relation between the video frame identifiers and the data analysis result according to the data analysis identifiers.
In one embodiment, the analysis module is further configured to perform object detection on video frames in the video frame set, to obtain an object detection result corresponding to the video frames, where the object detection result includes detection frame information.
In one embodiment, the processing module is further configured to determine a data type of the target data analysis result, call a graphics type according to the data type, and draw and render the target data analysis result to the target video frame.
In one embodiment, when the data type is numerical data, the processing module is further configured to draw the same item of data analysis results corresponding to all video frames in the video frame set into a numerical curve, mark the numerical curve with the target data analysis result, and render the marked numerical curve to the target video frame.
In one embodiment, the processing module is further configured to determine, when the data type is structural data, a rendering coordinate according to the target data analysis result, and render the target data analysis result to the target video frame according to the rendering coordinate.
In one embodiment, the data visualization presentation apparatus further includes a creation module for creating a data analysis selection interaction control for responding to the user data analysis selection operation.
For specific embodiments of the data visualization display apparatus, reference may be made to the above embodiments of the data visualization display method, which are not described herein. The modules in the data visualization display device can be realized in whole or in part by software, hardware and a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a terminal, and the internal structure of which may be as shown in fig. 7. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless mode can be realized through WIFI, an operator network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a data visualization presentation method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, can also be keys, a track ball or a touch pad arranged on the shell of the computer equipment, and can also be an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 7 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
responding to the video display operation of a user, and displaying the video data to be processed;
responding to the user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame;
responding to user data analysis selection operation, and determining a target data analysis identification corresponding to the target video frame;
acquiring a target data analysis result corresponding to the target video frame according to the video frame identification of the target video frame and the target data analysis identification;
and drawing and rendering the target data analysis result to a target video frame.
In one embodiment, the processor when executing the computer program further performs the steps of: obtaining video data to be processed, performing frame extraction processing on the video data to be processed to obtain a video frame set corresponding to the video data to be processed, respectively performing information analysis on video frames in the video frame set to obtain a data analysis result corresponding to the video frames, obtaining video frame identifiers of the video frames, classifying the same item of data analysis result to generate corresponding data analysis identifiers, and establishing association relation between the video frame identifiers and the data analysis result according to the data analysis identifiers.
In one embodiment, the processor when executing the computer program further performs the steps of: and respectively carrying out target detection on the video frames in the video frame set to obtain target detection results corresponding to the video frames, wherein the target detection results comprise detection frame information.
In one embodiment, the processor when executing the computer program further performs the steps of: and determining the data type of the target data analysis result, calling the graph type according to the data type, and drawing and rendering the target data analysis result to a target video frame.
In one embodiment, the processor when executing the computer program further performs the steps of: when the data type is numerical data, drawing the same item of data analysis results corresponding to all video frames in the video frame set into a numerical curve, marking the target data analysis result on the numerical curve, and rendering the marked numerical curve to the target video frame.
In one embodiment, the processor when executing the computer program further performs the steps of: when the data type is structural data, determining drawing coordinates according to the analysis result of the target data, and drawing and rendering the analysis result of the target data to the target video frame according to the drawing coordinates.
In one embodiment, the processor when executing the computer program further performs the steps of: creating a data analysis selection interaction control for responding to the user data analysis selection operation.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
responding to the video display operation of a user, and displaying the video data to be processed;
responding to the user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame;
responding to user data analysis selection operation, and determining a target data analysis identification corresponding to the target video frame;
acquiring a target data analysis result corresponding to the target video frame according to the video frame identification of the target video frame and the target data analysis identification;
and drawing and rendering the target data analysis result to a target video frame.
In one embodiment, the computer program, when executed by the processor, further performs the steps of: obtaining video data to be processed; performing frame extraction on the video data to be processed to obtain a video frame set corresponding to the video data to be processed; performing information analysis on each video frame in the video frame set to obtain a data analysis result corresponding to that video frame; and acquiring a video frame identifier of each video frame, classifying data analysis results of the same item to generate a corresponding data analysis identifier, and establishing an association between the video frame identifiers and the data analysis results according to the data analysis identifiers.
In one embodiment, the computer program, when executed by the processor, further performs the steps of: performing target detection on each video frame in the video frame set to obtain a target detection result corresponding to that video frame, where the target detection result includes detection box information.
In one embodiment, the computer program, when executed by the processor, further performs the steps of: determining the data type of the target data analysis result, calling a graph type according to the data type, and drawing and rendering the target data analysis result to the target video frame.
In one embodiment, the computer program, when executed by the processor, further performs the steps of: when the data type is numerical data, plotting the data analysis results of the same item corresponding to all video frames in the video frame set as a numerical curve, marking the target data analysis result on the numerical curve, and rendering the marked numerical curve to the target video frame.
In one embodiment, the computer program, when executed by the processor, further performs the steps of: when the data type is structured data, determining drawing coordinates according to the target data analysis result, and drawing and rendering the target data analysis result to the target video frame according to the drawing coordinates.
In one embodiment, the computer program when executed by the processor further performs the steps of: creating a data analysis selection interaction control for responding to the user data analysis selection operation.
Those skilled in the art will appreciate that all or part of the methods described above may be implemented by a computer program stored on a non-transitory computer-readable storage medium; when executed, the program may perform the steps of the method embodiments described above. Any reference to memory, storage, a database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, and the like. Volatile memory may include random access memory (RAM) or an external cache. By way of illustration and not limitation, RAM may take a variety of forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
The technical features of the above embodiments may be combined in any manner. For brevity, not all possible combinations of these technical features are described; however, any combination that involves no contradiction should be considered within the scope of this description.
The above embodiments merely represent several implementations of the present application, and their descriptions are specific and detailed, but they are not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art could make various modifications and improvements without departing from the spirit of the present application, and such modifications and improvements fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be determined by the appended claims.

Claims (10)

1. A method for visually displaying data, the method comprising:
responding to a user video display operation, and displaying video data to be processed;
responding to a user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame;
responding to a user data analysis selection operation, and determining a target data analysis identifier corresponding to the target video frame;
acquiring a target data analysis result corresponding to the target video frame according to a video frame identifier of the target video frame and the target data analysis identifier;
and drawing and rendering the target data analysis result to the target video frame.
2. The method according to claim 1, wherein the method further comprises:
acquiring video data to be processed;
performing frame extraction processing on the video data to be processed to obtain a video frame set corresponding to the video data to be processed;
respectively carrying out information analysis on video frames in the video frame set to obtain a data analysis result corresponding to the video frames;
and acquiring a video frame identifier of the video frame, classifying data analysis results of the same item to generate a corresponding data analysis identifier, and establishing an association relationship between the video frame identifier and the data analysis result according to the data analysis identifier.
3. The method according to claim 2, wherein respectively performing information analysis on the video frames in the video frame set to obtain a data analysis result corresponding to the video frames comprises:
respectively performing target detection on the video frames in the video frame set to obtain target detection results corresponding to the video frames, wherein the target detection results comprise detection box information.
4. The method of claim 1, wherein said drawing and rendering the target data analysis result to the target video frame comprises:
determining the data type of the target data analysis result;
and calling a graph type according to the data type, and drawing and rendering the target data analysis result to the target video frame.
5. The method of claim 4, wherein calling a graph type according to the data type, and drawing and rendering the target data analysis result to the target video frame comprises:
when the data type is numerical data, drawing the data analysis results of the same item corresponding to all video frames in the video frame set corresponding to the video data to be processed into a numerical curve;
marking the target data analysis result on the numerical curve;
and rendering the marked numerical curve to the target video frame.
6. The method of claim 4, wherein calling a graph type according to the data type, and drawing and rendering the target data analysis result to the target video frame comprises:
when the data type is structured data, determining drawing coordinates according to the target data analysis result;
and drawing and rendering the target data analysis result to the target video frame according to the drawing coordinates.
7. The method according to claim 1, wherein the method further comprises:
a data analysis selection interaction control is created for responding to a user data analysis selection operation.
8. A data visualization display device, the device comprising:
the video display module is used for responding to a user video display operation and displaying video data to be processed;
the video frame display module is used for responding to a user video frame selection operation, determining a target video frame in the video data to be processed, and displaying the target video frame;
the data analysis response module is used for responding to a user data analysis selection operation and determining a target data analysis identifier corresponding to the target video frame;
the data acquisition module is used for acquiring a target data analysis result corresponding to the target video frame according to a video frame identifier of the target video frame and the target data analysis identifier;
and the processing module is used for drawing and rendering the target data analysis result to the target video frame.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
CN202111277492.5A 2021-10-29 2021-10-29 Data visual display method, device, computer equipment and storage medium Active CN114003160B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111277492.5A CN114003160B (en) 2021-10-29 2021-10-29 Data visual display method, device, computer equipment and storage medium
PCT/CN2022/125844 WO2023071861A1 (en) 2021-10-29 2022-10-18 Data visualization display method and apparatus, computer device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111277492.5A CN114003160B (en) 2021-10-29 2021-10-29 Data visual display method, device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114003160A CN114003160A (en) 2022-02-01
CN114003160B true CN114003160B (en) 2024-03-29

Family

ID=79925927

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111277492.5A Active CN114003160B (en) 2021-10-29 2021-10-29 Data visual display method, device, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN114003160B (en)
WO (1) WO2023071861A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114003160B (en) * 2021-10-29 2024-03-29 影石创新科技股份有限公司 Data visual display method, device, computer equipment and storage medium
CN115237317B (en) * 2022-06-13 2024-05-17 北京达佳互联信息技术有限公司 Data display method and device, electronic equipment and storage medium
CN117370602A (en) * 2023-04-24 2024-01-09 深圳云视智景科技有限公司 Video processing method, device, equipment and computer storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9693030B2 (en) * 2013-09-09 2017-06-27 Arris Enterprises Llc Generating alerts based upon detector outputs
CN106534944B (en) * 2016-11-30 2020-01-14 北京字节跳动网络技术有限公司 Video display method and device
CN110213599A (en) * 2019-04-16 2019-09-06 腾讯科技(深圳)有限公司 A kind of method, equipment and the storage medium of additional information processing
CN111586319B (en) * 2020-05-27 2024-04-09 北京百度网讯科技有限公司 Video processing method and device
CN112637670B (en) * 2020-12-15 2022-07-29 上海哔哩哔哩科技有限公司 Video generation method and device
CN112948630B (en) * 2021-02-09 2024-02-06 北京奇艺世纪科技有限公司 List updating method, electronic equipment, storage medium and device
CN114003160B (en) * 2021-10-29 2024-03-29 影石创新科技股份有限公司 Data visual display method, device, computer equipment and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109714526A (en) * 2018-11-22 2019-05-03 中国科学院计算技术研究所 Intelligent video camera head and control system
CN110134830A (en) * 2019-04-15 2019-08-16 深圳壹账通智能科技有限公司 Video information data processing method, device, computer equipment and storage medium
CN111079553A (en) * 2019-11-25 2020-04-28 上海眼控科技股份有限公司 Method, device, equipment and storage medium for acquiring customer analysis data

Also Published As

Publication number Publication date
WO2023071861A1 (en) 2023-05-04
CN114003160A (en) 2022-02-01

Similar Documents

Publication Publication Date Title
CN114003160B (en) Data visual display method, device, computer equipment and storage medium
US10755480B2 (en) Displaying content in an augmented reality system
CN109117760B (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN108876934B (en) Key point marking method, device and system and storage medium
CN108762505B (en) Gesture-based virtual object control method and device, storage medium and equipment
US10430456B2 (en) Automatic grouping based handling of similar photos
US11030808B2 (en) Generating time-delayed augmented reality content
US11238623B2 (en) Automatic line drawing coloring program, automatic line drawing coloring apparatus, and graphical user interface program
CN113516666A (en) Image cropping method and device, computer equipment and storage medium
CN113608805B (en) Mask prediction method, image processing method, display method and device
CN111832561B (en) Character sequence recognition method, device, equipment and medium based on computer vision
CN111223155B (en) Image data processing method, device, computer equipment and storage medium
KR20210040305A (en) Method and apparatus for generating images
CN115690496A (en) Real-time regional intrusion detection method based on YOLOv5
CN113727039B (en) Video generation method and device, electronic equipment and storage medium
CN114168052A (en) Multi-graph display method, device, equipment and storage medium
CN113794831A (en) Video shooting method and device, electronic equipment and medium
CN116524088B (en) Jewelry virtual try-on method, jewelry virtual try-on device, computer equipment and storage medium
CN112612989A (en) Data display method and device, computer equipment and storage medium
CN109766530B (en) Method and device for generating chart frame, storage medium and electronic equipment
CN108614657B (en) Image synthesis method, device and equipment and image carrier thereof
US20200226833A1 (en) A method and system for providing a user interface for a 3d environment
CN106708958B (en) Method and device for displaying typesetting structure of browser kernel
CN115729544A (en) Desktop component generation method and device, electronic equipment and readable storage medium
US20220283698A1 (en) Method for operating an electronic device in order to browse through photos

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant