CN113610939A - UI object positioning method, terminal device and computer-readable storage medium


Info

Publication number: CN113610939A
Application number: CN202110857820.2A
Authority: CN (China)
Legal status: Pending
Other languages: Chinese (zh)
Prior art keywords: texture, vertex, target, drawing instruction, instruction
Inventors: 李旻昊, 商泽利, 高光磊, 陈汉文
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110857820.2A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour
    • G06T11/60 Editing figures and text; Combining figures or text


Abstract

The embodiment of the application discloses a UI object positioning method, a terminal device, and a computer-readable storage medium. The method includes the following steps: first, a drawing instruction sent to a graphics processor is intercepted, and a first texture identifier corresponding to the drawing instruction is obtained; then, when the first texture identifier is the texture identifier of a target UI object, position coordinate information of the vertices corresponding to the drawing instruction is obtained; finally, the area corresponding to the target UI object is determined according to the position coordinate information. By implementing the method, the area corresponding to the target UI object can be located efficiently and the amount of calculation can be reduced.

Description

UI object positioning method, terminal device and computer-readable storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method for positioning a UI object, a terminal device, and a computer-readable storage medium.
Background
At present, terminal devices (such as mobile phones and computers) mostly support adding a custom effect in the display area of a UI (User Interface) element, and the existing process for adding a custom effect is generally as follows: the terminal device first uses an image recognition (AI) algorithm to locate the UI element to which the custom effect is to be added, and then adds the custom effect at the identified position.
In practice, it is found that locating the UI element to which the custom effect is to be added by means of an AI algorithm usually takes a long time, so how to efficiently locate the position of that UI element has become a problem that urgently needs to be solved.
Disclosure of Invention
The embodiment of the application provides a UI object positioning method, terminal equipment and a computer readable storage medium, which are used for efficiently positioning an area corresponding to a target UI object and reducing the calculation amount.
A first aspect of an embodiment of the present application provides a method for positioning a UI object, including:
intercepting a drawing instruction sent to a graphics processor;
determining texture data corresponding to the drawing instruction, wherein the texture data comprises a first texture identifier;
if the first texture identifier is the texture identifier of the target UI object, obtaining vertex data corresponding to the drawing instruction, wherein the vertex data comprises position coordinate information of a vertex;
and determining the area corresponding to the target UI object according to the position coordinate information.
A second aspect of the embodiments of the present application provides a terminal device, including:
an intercepting unit for intercepting a rendering instruction sent to the graphics processor;
the determining unit is used for determining texture data corresponding to the drawing instruction, wherein the texture data comprises a first texture identifier;
the obtaining unit is used for obtaining vertex data corresponding to the drawing instruction when the first texture identifier is a texture identifier of a target UI object, and the vertex data comprises position coordinate information of a vertex;
the determining unit is further configured to determine an area corresponding to the target UI object according to the position coordinate information.
A third aspect of the embodiments of the present application provides a terminal device, which may include:
a memory storing executable program code;
and a processor coupled to the memory;
the processor calls the executable program code stored in the memory, and when executed by the processor, the executable program code causes the processor to implement the method according to the first aspect of the embodiments of the present application.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, on which executable program code is stored, and when the executable program code is executed by a processor, the method according to the first aspect of embodiments of the present application is implemented.
A fifth aspect of embodiments of the present application discloses a computer program product, which, when run on a computer, causes the computer to perform any one of the methods disclosed in the first aspect of embodiments of the present application.
A sixth aspect of the present embodiment discloses an application publishing platform, configured to publish a computer program product, where when the computer program product runs on a computer, the computer is caused to execute any of the methods disclosed in the first aspect of the present embodiment.
According to the technical scheme, the embodiment of the application has the following advantages:
in the embodiment of the application, a drawing instruction sent to a graphics processor is intercepted, and a first texture identifier corresponding to the drawing instruction is obtained; then, when the first texture identifier is the texture identifier of a target UI object, position coordinate information of the vertices corresponding to the drawing instruction is obtained; finally, the area corresponding to the target UI object is determined according to the position coordinate information. By implementing the method, each drawing instruction sent to the graphics processor is intercepted, and whether the drawing instruction is directed at the target UI object is judged based on the texture identifier corresponding to the intercepted drawing instruction; if so, the position coordinate information of the vertices corresponding to the drawing instruction is obtained and analyzed, thereby obtaining the area corresponding to the target UI object. Compared with locating the target UI object through an image recognition AI algorithm, the positioning method disclosed in the embodiment of the application is realized through simple analysis of vertex position coordinates, which efficiently locates the area corresponding to the target UI object and reduces the amount of calculation.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from these drawings without creative effort.
Fig. 1A is a schematic flowchart of a method for locating a UI object according to an embodiment of the present disclosure;
FIG. 1B is a schematic diagram illustrating the determination of a rectangular target area according to an embodiment of the present disclosure;
FIG. 1C is a schematic diagram illustrating the determination of a circular target area according to an embodiment of the present application;
FIG. 2 is a flow chart illustrating another method for locating a UI object disclosed in an embodiment of the present application;
FIG. 3 is a schematic illustration of a game interface disclosed in an embodiment of the present application;
FIG. 4 is another schematic illustration of a game interface disclosed in an embodiment of the present application;
FIG. 5 is a further schematic illustration of a game interface disclosed in an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device disclosed in an embodiment of the present application;
fig. 7 is a schematic diagram of another embodiment of the terminal device disclosed in the embodiment of the present application.
Detailed Description
The embodiment of the application provides a UI object positioning method, terminal equipment and a computer readable storage medium, which can efficiently position an area corresponding to a target UI object and reduce the calculation amount.
To enable a person skilled in the art to better understand the present application, the technical solutions in the embodiments of the present application are described below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, not all, of the embodiments of the present application; all other embodiments obtained based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The execution subject in the embodiment of the present application may be a terminal device, or a functional unit or module in the terminal device, for example a Central Processing Unit (CPU) in the terminal device. The following method embodiments are described by taking the central processing unit as the execution subject. It should be noted that the central processing unit runs the upper-layer application programs; if an upper-layer application program needs to render a UI interface, the central processing unit may call a Graphics Processing Unit (GPU) so that the graphics processing unit performs the rendering operation on the UI interface. The upper-layer applications may include a series of application packages, such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message applications, but are not limited thereto.
It should be noted that, in the embodiment of the present application, the terminal device may include a general handheld electronic terminal device, such as a mobile phone, a smartphone, a portable terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP) device, a laptop computer, a notepad (Note Pad), a Wireless Broadband (Wibro) terminal, a tablet computer (tablet PC), a smart PC, a Point of Sale (POS) terminal, a vehicle-mounted computer, and the like.
The terminal device may also include a wearable device. The wearable device may be worn directly on the user, or may be a portable electronic device integrated into the user's clothing or an accessory. A wearable device is more than a piece of hardware: through software support, data interaction, and cloud interaction, it can provide powerful intelligent functions, for example computing, positioning, and alarm functions, and it can be connected to mobile phones and various other terminals. Wearable devices may include, but are not limited to, wrist-worn devices (e.g., watches and wrist bands), foot-worn devices (e.g., shoes, socks, or other products worn on the legs), head-worn devices (e.g., glasses, helmets, and headbands), as well as smart clothing, bags, crutches, accessories, and other less common product types.
The technical solution of the present application is further described below by way of examples:
referring to fig. 1A, fig. 1A is a schematic flowchart illustrating a method for positioning a UI object according to an embodiment of the present disclosure. May include the steps of:
101. truncating the drawing instructions sent to the graphics processor.
In the embodiment of the present application, the drawing instruction corresponds to a UI object on the UI interface, and the UI object may indicate a bar, a button, a switch, an icon, a dynamic effect or a guide page on the UI interface. It should be noted that the graphics processor may implement rendering of the UI object corresponding to the drawing instruction by executing the drawing instruction.
In some embodiments, the central processor in the terminal device may intercept the drawing instructions sent to the graphics processor, which may include: the central processing unit identifies the type of the instruction needing to be sent to the graphics processor, and intercepts the drawing instruction when the instruction needing to be sent to the graphics processor is identified as the drawing instruction.
It should be noted that, when one frame of image is rendered by the graphics processor (one frame of image may include a plurality of UI objects), the central processing unit may send to the graphics processor, in addition to drawing instructions, instructions for setting the rendering state, instructions for setting the rendering data stream, and the like. Therefore, the central processing unit needs to identify the type of an instruction to be sent, and intercept it only when it is identified as a drawing instruction. It can be understood that the function corresponding to an instruction for setting the rendering state, the function corresponding to an instruction for setting the rendering data stream, and the function corresponding to a drawing instruction have different function names, so the central processing unit may determine whether an instruction is a drawing instruction by identifying the function name of the function corresponding to the instruction. Specifically, if the central processing unit recognizes that the function name of the function corresponding to the instruction to be sent is the function name corresponding to a drawing instruction, the central processing unit determines that the instruction to be sent is a drawing instruction.
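As a rough illustration of this function-name check, the following C++ sketch classifies an intercepted call as a drawing instruction or not; the concrete function names are assumptions based on common OpenGL ES entry points, since the embodiment does not name specific functions.

```cpp
#include <string>
#include <unordered_set>

// Candidate draw-function names (assumed); the embodiment only says that draw
// functions have function names distinct from state-setting functions.
static const std::unordered_set<std::string> kDrawFunctionNames = {
    "glDrawArrays", "glDrawElements", "glDrawRangeElements"};

bool IsDrawInstruction(const std::string& functionName) {
    // Rendering-state and data-stream instructions fail this check and are
    // forwarded to the graphics processor without interception.
    return kDrawFunctionNames.count(functionName) > 0;
}
```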
102. And determining texture data corresponding to the drawing instruction, wherein the texture data comprises a first texture identifier.
In this embodiment, it should be noted that, when the graphics processor executes the rendering instruction to render the UI object corresponding to the rendering instruction, the texture of the UI object corresponding to the rendering instruction also needs to be obtained. Therefore, when the central processing unit sends the drawing instruction to the graphics processor, the texture of the UI object corresponding to the drawing instruction needs to be read from the memory, and then the read texture of the UI object corresponding to the drawing instruction is written into the video memory, so that the graphics processor can read the texture from the video memory when rendering the UI object corresponding to the drawing instruction. Based on this, the drawing instruction and the texture corresponding to the drawing instruction need to be bound in advance.
In some embodiments, the binding of the drawing instruction and the texture corresponding to the drawing instruction may include, but is not limited to, the following:
(1) the drawing instruction carries a texture identifier of a texture corresponding to the drawing instruction;
(2) the drawing instruction carries the storage address of the texture corresponding to the drawing instruction in the memory.
Further, in some embodiments, the central processor determines the first texture identifier corresponding to the drawing instruction, which may include, but is not limited to, the following:
in the mode 1, the drawing instruction may carry a first texture identifier, and the central processing unit may obtain the first texture identifier by analyzing the drawing instruction.
In the mode 2, the drawing instruction may carry a storage address of a texture corresponding to the drawing instruction, and the central processing unit may obtain the storage address of the texture corresponding to the drawing instruction by analyzing the drawing instruction, and further read the first texture identifier from the storage address of the texture corresponding to the drawing instruction.
In this embodiment, the first texture identifier is used to uniquely identify a texture corresponding to the drawing instruction. In some embodiments, the first texture identifier may include one or a combination of numbers, letters, special characters, and the like, and the embodiments of the present application are not limited thereto. Illustratively, the first texture identification may be the number 1, the letter N, or a special character #.
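The two binding modes above can be sketched in C++ as follows; the DrawInstruction layout, its field names, and the assumption that the identifier sits at the start of the texture's storage address are all hypothetical.

```cpp
#include <cstdint>
#include <optional>

// Hypothetical in-memory view of an intercepted drawing instruction.
struct DrawInstruction {
    std::optional<uint32_t> textureId;                // mode 1: identifier carried directly
    const uint32_t* textureStorageAddress = nullptr;  // mode 2: address of the texture
};

uint32_t ResolveFirstTextureId(const DrawInstruction& cmd) {
    if (cmd.textureId)
        return *cmd.textureId;            // mode 1: parse the instruction itself
    return *cmd.textureStorageAddress;    // mode 2: read the identifier from the address
}
```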
103. And if the first texture identifier is the texture identifier of the target UI object, acquiring vertex data corresponding to the drawing instruction, wherein the vertex data comprises position coordinate information of a vertex.
In this embodiment, different UI objects may correspond to different texture identifiers, and a texture identifier (such as the first texture identifier or the texture identifier of the target UI object) determines a unique UI object. The target UI object may be a pre-marked UI object; if the first texture identifier is the texture identifier of the target UI object, it indicates that the drawing instruction is used to trigger the graphics processor to render the pre-marked UI object, and at this time vertex data corresponding to the drawing instruction, that is, vertex data of the pre-marked UI object, is obtained. The vertex data may include position coordinate information of the vertices, i.e., the position coordinates of the respective vertices, and may also include transparency information, index data, and texture coordinate information.
When the graphics processor executes a drawing instruction to render a UI object corresponding to the drawing instruction, vertex data of the UI object corresponding to the drawing instruction also needs to be obtained. Therefore, when the central processing unit sends the drawing instruction to the graphics processing unit, it is further required to read the vertex data of the UI object corresponding to the drawing instruction from the memory, and then write the read vertex data of the UI object corresponding to the drawing instruction into the video memory, so that the graphics processing unit can read the vertex data from the video memory when rendering the UI object corresponding to the drawing instruction. Based on this, the drawing instruction and vertex data of the UI object corresponding to the drawing instruction need to be bound in advance.
In some embodiments, the manner in which the drawing instruction and the vertex data corresponding to the drawing instruction are bound may include, but is not limited to: the drawing instruction can carry the cache address of the vertex data corresponding to the drawing instruction in the memory.
Further, the central processing unit may obtain a cache address of vertex data corresponding to the drawing instruction in the memory by analyzing the drawing instruction, and further read the vertex data corresponding to the drawing instruction from the cache address.
104. And determining the area corresponding to the target UI object according to the position coordinate information.
In some embodiments, the determining, by the central processor, the area corresponding to the target UI object according to the position coordinate information may include: the central processing unit obtains the position coordinates of a first vertex from the position coordinate information, wherein the first vertex can comprise a vertex with the largest abscissa, a vertex with the smallest abscissa, a vertex with the largest ordinate and a vertex with the smallest ordinate in all vertexes corresponding to the vertex data; determining a target area with a preset shape according to the position coordinates of the first vertex, wherein the target area can cover all vertexes corresponding to the vertex data; and taking the target area as an area corresponding to the target UI object. The area corresponding to the target UI object may refer to a display position of the target UI object on the UI interface.
In some embodiments, the preset shape may include any one of a circle, a rectangle, an ellipse, and the like, and the embodiments of the present application are not limited thereto.
In one embodiment, the central processor determines the target area of the preset shape according to the position coordinates of the first vertex, which may include, but is not limited to, the following ways:
in the mode 1, as an example, the preset shape is a rectangle, the central processing unit makes straight lines parallel to the longitudinal axis at the vertex with the maximum abscissa and the vertex with the minimum abscissa, respectively makes straight lines parallel to the transverse axis at the vertex with the maximum ordinate and the vertex with the minimum ordinate, and takes a rectangular region surrounded by the 4 straight lines as a target region. One side of the rectangular area is an abscissa difference value between a vertex having the largest abscissa and a vertex having the smallest abscissa, and the other side is an ordinate difference value between a vertex having the largest ordinate and a vertex having the smallest ordinate. Referring to fig. 1B, fig. 1B is a schematic flowchart illustrating a process for determining a rectangular target area according to an embodiment of the present disclosure. As shown in FIG. 1B, vertex A with the largest abscissa, vertex B with the smallest abscissa, vertex C with the largest ordinate, and vertex D with the smallest ordinate.
Mode 2: taking a circle as the preset shape as an example, the central processing unit connects the vertex with the largest abscissa and the vertex with the smallest abscissa to obtain a first straight line, connects the vertex with the largest ordinate and the vertex with the smallest ordinate to obtain a second straight line, and obtains the position coordinates of the intersection point of the two straight lines. It then determines, among the vertex with the largest abscissa, the vertex with the smallest abscissa, the vertex with the largest ordinate, and the vertex with the smallest ordinate, the second vertex farthest from the intersection point, obtains the distance between the second vertex and the intersection point, and determines the circular region centered at the intersection point with that distance as the radius as the target region. Referring to fig. 1C, fig. 1C is a schematic diagram illustrating the determination of a circular target area according to an embodiment of the disclosure. As shown in FIG. 1C, vertex A is the vertex with the largest abscissa, vertex B the vertex with the smallest abscissa, vertex C the vertex with the largest ordinate, and vertex D the vertex with the smallest ordinate. The first straight line passes through vertex A and vertex B, the second straight line passes through vertex C and vertex D, their intersection point is O, and the vertex farthest from O is vertex A; therefore the target area indicated in fig. 1C is the circular area centered at O whose radius is the length of line segment OA (one end of segment OA is the intersection point O, the other end is vertex A).
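Both constructions reduce to simple coordinate arithmetic over the extreme vertices. A minimal C++ sketch follows; the type and function names are assumptions, and lines AB and CD are taken to be non-parallel, as in FIG. 1C.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };
struct Rect { float left, bottom, right, top; };
struct Circle { Vec2 center; float radius; };

// Mode 1: axis-aligned rectangle spanned by the extreme abscissas and
// ordinates; by construction it covers every vertex.
Rect BoundingRect(const std::vector<Vec2>& verts) {
    Rect r{verts[0].x, verts[0].y, verts[0].x, verts[0].y};
    for (const Vec2& p : verts) {
        r.left   = std::min(r.left, p.x);   r.right = std::max(r.right, p.x);
        r.bottom = std::min(r.bottom, p.y); r.top   = std::max(r.top, p.y);
    }
    return r;
}

// Mode 2: circle centered at the intersection O of line AB and line CD, with
// radius the distance from O to the farthest of the four extreme vertices
// (A/B: largest/smallest abscissa, C/D: largest/smallest ordinate).
Circle BoundingCircle(Vec2 A, Vec2 B, Vec2 C, Vec2 D) {
    Vec2 d1{B.x - A.x, B.y - A.y}, d2{D.x - C.x, D.y - C.y};
    float denom = d1.x * d2.y - d1.y * d2.x;  // nonzero when AB, CD not parallel
    float t = ((C.x - A.x) * d2.y - (C.y - A.y) * d2.x) / denom;
    Vec2 O{A.x + t * d1.x, A.y + t * d1.y};
    float r = 0.0f;
    for (Vec2 p : {A, B, C, D})
        r = std::max(r, std::hypot(p.x - O.x, p.y - O.y));
    return Circle{O, r};
}
```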
By implementing the method, each drawing instruction sent to the graphics processor is intercepted, and whether the drawing instruction is directed at the target UI object is judged based on the texture identifier corresponding to the intercepted drawing instruction; if so, the position coordinate information of the vertices corresponding to the drawing instruction is obtained and analyzed, thereby obtaining the area corresponding to the target UI object. Compared with locating the target UI object through an image recognition AI algorithm, the positioning method disclosed in the embodiment of the application is realized through simple analysis of vertex position coordinates, which efficiently locates the area corresponding to the target UI object and reduces the amount of calculation.
Referring to fig. 2, fig. 2 is a schematic flowchart of another UI object positioning method disclosed in the embodiment of the present application, which may include the following steps:
201. Intercept the drawing instruction currently sent to the graphics processor.
In some embodiments, the central processing unit may include a graphics drawing instruction processing module, by which the central processing unit may generate drawing instructions and send them to the graphics processor. Based on this, the central processing unit intercepting the drawing instruction sent to the graphics processor may include: the central processing unit intercepts the drawing instruction generated by the graphics drawing instruction processing module. It is to be understood that the instructions for setting the rendering state, the instructions for setting the rendering data stream, and the like may also be generated by the graphics drawing instruction processing module and sent to the graphics processor through it.
In some embodiments, the central processor truncating the rendering instructions sent to the graphics processor may include: the central processing unit monitors the drawing function provided for the graphic processor through the first function and intercepts the drawing instruction for calling the drawing function.
Based on the above description, the drawing instruction is generated by the graphics drawing instruction processing module, and therefore, in some embodiments, a first function may be provided on the graphics drawing instruction processing module, and the first function may be configured to monitor a drawing function called by the graphics drawing instruction processing module and intercept a drawing instruction calling the drawing function.
In some embodiments, the first function may comprise a first HOOK function, which may comprise two portions of program code, one portion of program code for snooping draw functions called by the graphics drawing instruction processing module, and another portion of program code for intercepting draw instructions calling the draw functions.
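A minimal C++ sketch of such a first HOOK function follows, using glDrawElements as the hooked draw function. How the function pointer is actually swapped (PLT/GOT patching, LD_PRELOAD, and so on) is platform specific and omitted, and OnDrawInstructionIntercepted is a hypothetical analysis entry point.

```cpp
#include <GLES2/gl2.h>

// Analysis entry point corresponding to step 201, implemented elsewhere (hypothetical).
void OnDrawInstructionIntercepted(GLenum mode, GLsizei count,
                                  GLenum type, const void* indices);

// Pointer to the original driver function, filled in by the hooking framework.
static void (*real_glDrawElements)(GLenum, GLsizei, GLenum, const void*) = nullptr;

void hooked_glDrawElements(GLenum mode, GLsizei count,
                           GLenum type, const void* indices) {
    OnDrawInstructionIntercepted(mode, count, type, indices);  // intercept for analysis
    real_glDrawElements(mode, count, type, indices);           // forward to the GPU driver
}
```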
202. And determining texture data corresponding to the drawing instruction, wherein the texture data comprises a first texture identifier.
In some embodiments, the determining, by the central processor, the first texture identifier corresponding to the drawing instruction may include: the central processing unit obtains a texture storage instruction related to the drawing instruction, wherein the texture storage instruction is used for indicating to send the texture corresponding to the drawing instruction to the video memory; a first texture identification is determined based on the texture store instruction.
In this embodiment, the texture storage instruction may be generated by the central processing unit, and the central processing unit may read the texture corresponding to the drawing instruction from the memory by executing the texture storage instruction, and load the texture into the video memory.
In some embodiments, the texture storing instruction may include a first texture identifier, and the central processor may obtain the first texture identifier by parsing the texture storing instruction.
In some embodiments, the central processing unit may include a texture processing module, and the central processing unit may generate a texture storage instruction through the texture processing module, and execute the texture storage instruction through the texture processing module, so as to read a texture corresponding to the drawing instruction from the memory and write the texture into the video memory. Based on this, the obtaining, by the central processing unit, the texture storage instruction related to the drawing instruction may include: the central processing unit intercepts the texture storage instruction which is generated by the texture processing module and is related to the drawing instruction.
In some embodiments, the central processor may listen to the texture load function via the second function and intercept a texture store instruction associated with the draw instruction that calls the texture load function.
Based on the above description, since the texture storage instruction may be generated by the texture processing module, in some embodiments a second function may be provided on the texture processing module, and the second function may be configured to listen to the texture loading function called by the texture processing module and intercept the texture storage instruction, related to the drawing instruction, that calls the texture loading function. Illustratively, the texture loading function may include the glBindTexture function, which is used to bind a named texture.
In some embodiments, the second function may comprise a second HOOK function, which may comprise two portions of program code, one portion of program code for snooping a texture load function called by the texture processing module, and another portion of program code for intercepting a texture store instruction related to the draw instruction in calling the texture load function.
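Under the same hooking assumption, the second HOOK function can be sketched as a wrapper around glBindTexture that records the texture name (the first texture identifier) seen by subsequent draw calls:

```cpp
#include <GLES2/gl2.h>

static void (*real_glBindTexture)(GLenum, GLuint) = nullptr;
static GLuint g_lastBoundTexture = 0;  // texture identifier for later draw calls

void hooked_glBindTexture(GLenum target, GLuint texture) {
    if (target == GL_TEXTURE_2D)
        g_lastBoundTexture = texture;     // record the bound texture name
    real_glBindTexture(target, texture);  // forward the original call
}
```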
In some embodiments, the texture loading function may be obtained by the texture processing module calling a graphics Application Programming Interface (API). It should be noted that the graphics API disclosed in the embodiments of the present application may include, but is not limited to, any one of the Open Graphics Library (OpenGL), the multimedia programming interface DirectX (DX), OpenGL for Embedded Systems (OpenGL ES), and the low-level rendering application programming interface Metal.
203. And if the first texture identifier is the texture identifier of the target UI object, acquiring vertex data corresponding to the drawing instruction, wherein the vertex data comprises position coordinate information and transparency information of a vertex.
In some embodiments, the obtaining, by the central processor, vertex data corresponding to the drawing instruction may include:
the central processing unit obtains a vertex storage instruction related to the drawing instruction, wherein the vertex storage instruction is used for indicating to send vertex data to the video memory; and determining the cache address of the vertex data in the memory according to the vertex storage instruction.
In this embodiment, the vertex storing instruction may be generated by the central processing unit, and the central processing unit reads vertex data corresponding to the drawing instruction from the memory by executing the vertex storing instruction, and loads the vertex data into the video memory.
In some embodiments, the vertex storage instruction may include a cache address in the memory for storing vertex data corresponding to the drawing instruction, and the central processing unit may obtain the cache address by parsing the vertex storage instruction, and further read the vertex data corresponding to the drawing instruction from the cache address.
In some embodiments, the central processing unit may include a vertex processing module, and the central processing unit may generate a vertex storage instruction through the vertex processing module, and execute the vertex storage instruction through the vertex processing module, so as to read vertex data corresponding to the drawing instruction from the memory and write the vertex data into the video memory. Based on this, the central processing unit intercepting the vertex storage instruction related to the drawing instruction may include: the central processor intercepts the vertex storage instructions generated by the vertex processing module.
In some embodiments, the central processor may listen to the graphics memory function through the third function and intercept vertex memory instructions related to the drawing instructions that call the graphics memory function.
Based on the above description, since the vertex storage instruction may be generated by the vertex processing module, in some embodiments, a third function may be provided on the vertex processing module, and the third function may be configured to listen to the graphics storage function called by the vertex processing module and intercept the vertex storage instruction related to the drawing instruction calling the graphics storage function.
In some embodiments, the graphics storage function may be obtained by the vertex processing module calling a graphics API. Illustratively, the graphics storage function may include the glMapBufferRange function, used for obtaining all of the vertex data pointing to the target UI object, and the glUnmapBuffer function, used for obtaining part of the vertex data pointing to the target UI object.
In some embodiments, the third function may comprise a third HOOK function, which may comprise two portions of program code, one portion of program code for snooping graphics memory functions called by the vertex processing module, and another portion of program code for intercepting vertex memory instructions called by the graphics memory functions that are related to the drawing instructions.
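A sketch of the third HOOK function in the same style, wrapping glMapBufferRange so the CPU-side pointer to the vertex buffer can be inspected; ParseVertexPositions is a hypothetical parser.

```cpp
#include <GLES3/gl3.h>

// Hypothetical parser that copies/inspects vertex positions from the mapped range.
void ParseVertexPositions(const void* data, GLsizeiptr length);

static void* (*real_glMapBufferRange)(GLenum, GLintptr, GLsizeiptr, GLbitfield) = nullptr;

void* hooked_glMapBufferRange(GLenum target, GLintptr offset,
                              GLsizeiptr length, GLbitfield access) {
    void* ptr = real_glMapBufferRange(target, offset, length, access);
    if (target == GL_ARRAY_BUFFER && ptr != nullptr)
        ParseVertexPositions(ptr, length);  // vertex data at its cache address
    return ptr;
}
```

Note that an application typically writes vertex data into the mapped range after this call returns, so a practical implementation might instead read the buffer when glUnmapBuffer is invoked; the sketch simply follows the embodiment's framing of hooking the graphics storage function itself.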
204. And determining the area corresponding to the target UI object according to the position coordinate information.
It should be noted that, in the embodiment of the present application, for detailed description of step 204, please refer to the description of step 104 in fig. 1A, which is not described herein again.
In some embodiments, the central processor may add a custom image display effect to the region corresponding to the target UI object.
In some embodiments, the content of the image display effect may include, but is not limited to, one or a combination of text, images, and video.
In this embodiment of the application, the target UI object may be any UI object on the UI interface corresponding to any application, and the embodiment of the application is not limited thereto. The following embodiments take a game interface as an example, i.e., the target UI object is a UI object in the game interface.
Referring to fig. 3, fig. 3 is a schematic view of a game interface disclosed in the embodiment of the present application. The game interface shown in FIG. 3 includes a gun 310, a gun 320, and an image display effect 330, the content of the image display effect 330 being the text "first creditworthiness". The gun 310 is the target UI object, and the image display effect 330 is the image display effect corresponding to the gun 310. It can be appreciated that the central processor can resolve the area corresponding to the gun 310, i.e., the display area of the gun 310 on the game interface, based on the position coordinate information corresponding to the gun 310, and then add the image display effect 330 "first creditworthiness" at the corresponding location of that display area.
205. And determining a trigger state corresponding to the target UI object according to the vertex data, wherein the trigger state comprises a use state or a non-use state.
In the embodiment of the present application, the trigger state may indicate whether the target UI object is in a selected state.
In some embodiments, the manner of determining, by the central processor, the trigger state corresponding to the target UI object according to the vertex data may include, but is not limited to, the following manners:
in the mode 1, the vertex data further comprises transparency information corresponding to a vertex, and when the transparency information of the vertex corresponding to the drawing instruction is matched with the first transparency information, the central processing unit determines that the target UI object is in a use state; when the transparency information of the vertex corresponding to the drawing instruction matches the second transparency information, the central processor determines that the target UI object is in a non-use state. Optionally, the transparency information corresponding to the vertices may include transparency corresponding to each vertex. By implementing the method, the central processing unit can accurately identify the state of the target UI object according to the transparency information of the vertex.
Mode 2: if the second texture identifier corresponding to the drawing instruction is the texture identifier of the selection component, the area corresponding to the selection component is determined according to the vertex data corresponding to the drawing instruction; if the area corresponding to the selection component matches the area corresponding to the target UI object, the target UI object is determined to be in the use state; if the two areas do not match, the target UI object is determined to be in the non-use state.
Optionally, the area corresponding to the selection component matching the area corresponding to the target UI object may include, but is not limited to, the following cases:
(1) the area corresponding to the selection component overlaps the area corresponding to the target UI object;
(2) the area corresponding to the selection component does not overlap the area corresponding to the target UI object, but the gap between them is smaller than a preset threshold.
Illustratively, the target UI object may be the weapon currently being used by the game character. In practice, it has been found that, in order to let the user know which weapon the current character is using, a selection component is often used on the game interface to mark that weapon. Optionally, the selection component may include, but is not limited to, any one of a rectangular box pattern, an oval pattern, and a circular pattern. Based on this, the area corresponding to the selection component and the area corresponding to the weapon to be judged can be matched to determine which weapon the current character is using. It should be noted that the area corresponding to the selection component is obtained as in steps 201 to 204: steps 201 to 204 are performed with the selection component as the target UI object to obtain the area corresponding to the selection component, and steps 201 to 204 are performed with the weapon to be judged as the target UI object to obtain the area corresponding to that weapon; then, by matching the two areas, it is determined whether the weapon to be judged is in the use state. Specifically: if the area corresponding to the selection component matches the area corresponding to the weapon to be judged, the weapon is in the use state; if the two areas do not match, the weapon is not in the use state.
Referring to fig. 4, fig. 4 is another schematic view of a game interface disclosed in the embodiment of the present application. The game interface shown in fig. 4 includes a gun 410 and a gun 420, wherein the gun 410 is in use. It is understood that in the game interface shown in fig. 4, the transparency information of the vertex corresponding to the gun 410 is matched with the first transparency information. In the game interface shown in fig. 3, the gun 310 is not in use, and the transparency information of the vertex corresponding to the gun 310 is matched with the second transparency information.
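Mode 1 can be sketched in C++ as a comparison of per-vertex alpha against two reference values; the concrete reference alphas below are assumptions, since they depend on how the game renders in-use and unused objects.

```cpp
#include <cmath>
#include <vector>

enum class TriggerState { InUse, NotInUse, Unknown };

TriggerState ClassifyByTransparency(const std::vector<float>& vertexAlphas) {
    const float kInUseAlpha    = 1.0f;   // "first transparency information" (assumed)
    const float kNotInUseAlpha = 0.5f;   // "second transparency information" (assumed)
    const float kEps = 0.01f;            // tolerance for the match
    if (vertexAlphas.empty()) return TriggerState::Unknown;
    float a = vertexAlphas.front();      // transparency of a vertex in the draw call
    if (std::fabs(a - kInUseAlpha) < kEps)    return TriggerState::InUse;
    if (std::fabs(a - kNotInUseAlpha) < kEps) return TriggerState::NotInUse;
    return TriggerState::Unknown;
}
```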
Referring to fig. 5, fig. 5 is another schematic view of a game interface disclosed in the embodiment of the present application. The game interface shown in fig. 5 includes a gun 510, a gun 520, and a selection component 530, wherein the gun 510 is in the use state and the gun 520 is in the non-use state. It can be understood that the central processing unit obtains the area corresponding to the gun 510 and the area corresponding to the selection component 530 through steps 201 to 204, and determines that the gun 510 is in the use state because the area corresponding to the gun 510 matches the area corresponding to the selection component 530.
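For rectangular areas like those produced in FIG. 1B, the two matching cases of mode 2 (overlap, or a gap below a preset threshold) can be sketched as follows; the Rect type and function names are assumptions.

```cpp
#include <algorithm>
#include <cmath>

struct Rect { float left, bottom, right, top; };

bool RegionsMatch(const Rect& sel, const Rect& obj, float maxGap) {
    // Separation along each axis; 0 means the projections overlap on that axis.
    float dx = std::max({sel.left - obj.right, obj.left - sel.right, 0.0f});
    float dy = std::max({sel.bottom - obj.top, obj.bottom - sel.top, 0.0f});
    bool overlap = (dx == 0.0f && dy == 0.0f);       // case (1): regions overlap
    return overlap || std::hypot(dx, dy) < maxGap;   // case (2): gap below threshold
}
```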
206. And if the trigger state is the use state, generating prompt information corresponding to the target UI object.
In some embodiments, the output mode of the prompt information may include, but is not limited to, one or a combination of audio, video, text, image, and vibration. The embodiment of the present application is described by taking the vibration output mode as an example:
in practice, it is found that when a game is played against a game-type network, a game role currently used by a user usually has a gun, and in order to provide an immersive game experience for the user, the embodiment of the application can preset different vibration intensities for different guns. It is understood that the terminal device may vibrate with its built-in vibrator, that is, the intensity of vibration of the vibrator is controlled to be different for different guns, and for example, in the case of a pistol and a submachine gun, the vibration intensity preset for the pistol by the terminal device is smaller than the vibration intensity preset for the submachine gun.
The following description is made with reference to the game interface shown in fig. 5: the preset vibration intensities may include a first vibration intensity corresponding to the gun 510 and a second vibration intensity corresponding to the gun 520. In the game interface shown in fig. 5, the gun 510 is in the use state and the gun 520 is in the non-use state, so when the central processing unit recognizes that the gun 510 is in the use state, it controls the vibrator of the terminal device to vibrate with the first vibration intensity.
By implementing the method, each drawing instruction sent to the graphics processor is intercepted, and whether the drawing instruction is directed at the target UI object is judged based on the texture identifier corresponding to the intercepted drawing instruction; if so, the position coordinate information of the vertices corresponding to the drawing instruction is obtained and analyzed, thereby obtaining the area corresponding to the target UI object. Compared with locating the target UI object through an image recognition AI algorithm, the method is realized through simple analysis of vertex position coordinates, which efficiently locates the area corresponding to the target UI object and reduces the amount of calculation. In addition, when the target UI object is a game object in the game interface, the state of the game object can be identified based on the vertex data, so that a more immersive game experience is provided for the user according to the identification result, further improving the user's sense of immersion.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a terminal device disclosed in the embodiment of the present application. The terminal device shown in fig. 6 may include: an intercepting unit 601, a determining unit 602, and an obtaining unit 603, wherein:
an intercepting unit 601 configured to intercept a rendering instruction sent to the graphics processor;
a determining unit 602, configured to determine texture data corresponding to a drawing instruction, where the texture data includes a first texture identifier;
an obtaining unit 603, configured to obtain vertex data corresponding to the drawing instruction when the first texture identifier is a texture identifier of the target UI object, where the vertex data includes position coordinate information of a vertex;
the determining unit 602 is further configured to determine, according to the position coordinate information, an area corresponding to the target UI object.
In some embodiments, the manner in which the intercepting unit 601 intercepts the drawing instruction sent to the graphics processor may specifically include: the intercepting unit 601 is configured to monitor, through the first function, the drawing function provided for the graphics processor, and intercept the drawing instruction that calls the drawing function.
In some embodiments, the manner in which the determining unit 602 is configured to determine the first texture identifier corresponding to the drawing instruction may specifically include: a determining unit 602, configured to obtain a texture storage instruction related to a drawing instruction, where the texture storage instruction is used to instruct to send a texture corresponding to the drawing instruction to a video memory; according to the texture storage instruction, a first texture identification is determined.
In some embodiments, the manner in which the determining unit 602 is configured to obtain the texture storage instruction corresponding to the drawing instruction may specifically include: a determining unit 602, configured to listen to the texture load function through the second function, and intercept a texture store instruction related to the draw instruction, which calls the texture load function.
In some embodiments, the manner for the obtaining unit 603 to obtain vertex data corresponding to the drawing instruction may specifically include: an obtaining unit 603, configured to obtain a vertex storage instruction related to a drawing instruction, where the vertex storage instruction is used to instruct to send vertex data corresponding to the drawing instruction to a video memory; and according to the vertex storage instruction, determining the cache address of the vertex data corresponding to the drawing instruction in the memory, and copying the vertex data corresponding to the drawing instruction from the cache address.
In some embodiments, the manner for the obtaining unit 603 to obtain the vertex storage instruction related to the drawing instruction may specifically include: the obtaining unit 603 is configured to monitor the graph storage function through the third function, and intercept a vertex storage instruction related to the drawing instruction, which calls the graph storage function.
In some embodiments, the determining unit 602 is further configured to, after the obtaining unit 603 obtains vertex data corresponding to the drawing instruction, determine a trigger state corresponding to the target UI object according to the vertex data, where the trigger state includes a use state or a non-use state; and if the trigger state is the use state, generating prompt information corresponding to the target UI object.
In some embodiments, the manner of determining, by the determining unit 602, the trigger state corresponding to the target UI object according to the vertex data may specifically include, but is not limited to, the following manners:
mode 1, a determining unit 602, configured to determine that the target UI object is in a use state when transparency information of a vertex corresponding to the drawing instruction matches the first transparency information; and when the transparency information of the vertex corresponding to the drawing instruction is matched with the second transparency information, determining that the target UI object is in a non-use state.
Mode 2, the texture data may further include a second texture identifier, and the determining unit 602 is configured to determine, when the second texture identifier is the texture identifier of the selected component, an area corresponding to the selected component according to vertex data corresponding to the drawing instruction; when the area corresponding to the selected component is matched with the area corresponding to the target UI object, determining that the target UI object is in a use state; and when the area corresponding to the selection component is not matched with the area corresponding to the target UI object, determining that the target UI object is in a non-use state.
In some embodiments, the determining unit 602 is further configured to add a customized image display effect to the area corresponding to the target UI object after determining the area corresponding to the target UI object according to the position coordinate information.
Referring to fig. 7, fig. 7 is a schematic diagram of another embodiment of the terminal device disclosed in the embodiment of the present application, and fig. 7 is a block diagram of a partial structure of a mobile phone related to the terminal device provided in the embodiment of the present application. Referring to fig. 7, the handset includes: memory 710, display unit 720, and processor 730. Those skilled in the art will appreciate that the handset configuration shown in fig. 7 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 7:
the memory 710 may be used to store software programs and modules, and the processor 730 executes various functional applications and data processing of the mobile phone by operating the software programs and modules stored in the memory 710. The memory 710 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 710 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The display unit 720 may be used to display information input by the user or information provided to the user, as well as various menus of the mobile phone. The display unit 720 may include a display panel 721, a video memory 722, and a graphics processor 723. Optionally, the display panel 721 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The video memory 722, also called a frame buffer, is used to store rendering data that has been processed by, or is about to be fetched by, the graphics chip. The graphics processor 723 is used for rendering the display information required by the mobile phone system, providing line scan signals to the display panel 721, and controlling the display on the display panel 721.
The processor 730 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 710 and calling data stored in the memory 710, thereby performing overall monitoring of the mobile phone. Optionally, processor 730 may include one or more processing units; preferably, the processor 730 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 730.
Although not shown, the mobile phone may further include a Radio Frequency (RF) circuit, an input unit, a sensor, an audio circuit, a Wireless Fidelity (WiFi) module, a power supply, a camera, a bluetooth module, and other components, which are not described herein again.
In this embodiment, the processor 730 included in the terminal device has the following functions:
intercepting a drawing instruction sent to a graphics processor;
determining texture data corresponding to the drawing instruction, wherein the texture data comprises a first texture identifier;
if the first texture identifier is the texture identifier of the target UI object, vertex data corresponding to the drawing instruction is obtained, and the vertex data comprises position coordinate information of a vertex;
and determining the area corresponding to the target UI object according to the position coordinate information.
Optionally, the processor 730 further has the following functions:
and monitoring a drawing function provided for the graphics processor through the first function, and intercepting a drawing instruction for calling the drawing function.
Optionally, the processor 730 further has the following functions:
acquiring a texture storage instruction related to the drawing instruction, wherein the texture storage instruction is used for indicating to send a texture corresponding to the drawing instruction to a video memory;
according to the texture storage instruction, a first texture identification is determined.
Optionally, the processor 730 further has the following functions:
and monitoring the texture loading function through the second function, and intercepting a texture storage instruction which calls the texture loading function and is related to the drawing instruction.
Optionally, the processor 730 further has the following functions:
acquiring a vertex storage instruction related to the drawing instruction, wherein the vertex storage instruction is used for indicating to send vertex data to a video memory;
determining a cache address of vertex data corresponding to the drawing instruction in a memory according to the vertex storage instruction;
and copying the vertex data corresponding to the drawing instruction from the cache address of the vertex data in the memory.
Optionally, the processor 730 further has the following functions:
and monitoring the graph storage function through a third function, and intercepting a vertex storage instruction which calls the graph storage function and is related to the drawing instruction.
Optionally, the processor 730 further has the following functions:
determining a trigger state corresponding to the target UI object according to the vertex data, wherein the trigger state comprises a use state or a non-use state;
and if the trigger state is the use state, generating prompt information corresponding to the target UI object.
Optionally, the vertex data further includes transparency information corresponding to the vertex, and the processor 730 further has the following functions:
if the transparency information of the vertex corresponding to the drawing instruction is matched with the first transparency information, determining that the target UI object is in a use state;
and if the transparency information of the vertex corresponding to the drawing instruction is matched with the second transparency information, determining that the target UI object is in a non-use state.
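One way to read this matching, assuming the first and second transparency information are reference alpha values known in advance for the two states (the tolerance eps is an added assumption to absorb floating-point round-off):

    #include <cmath>

    enum class TriggerState { InUse, NotInUse };

    // vertexAlpha comes from the transparency information in the vertex data;
    // firstAlpha/secondAlpha are the reference transparencies of the two states.
    TriggerState StateFromAlpha(float vertexAlpha, float firstAlpha,
                                float secondAlpha, float eps = 1e-3f) {
        if (std::fabs(vertexAlpha - firstAlpha) <= eps) return TriggerState::InUse;
        if (std::fabs(vertexAlpha - secondAlpha) <= eps) return TriggerState::NotInUse;
        return TriggerState::NotInUse;  // neither reference matched; treat as unused
    }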
The texture data may further include a second texture identifier, and optionally, the processor 730 further has the following functions:
if the second texture identifier is the texture identifier of the selected component, determining an area corresponding to the selected component according to the vertex data corresponding to the drawing instruction;
if the area corresponding to the selection component is matched with the area corresponding to the target UI object, determining that the target UI object is in a use state;
and if the area corresponding to the selection component is not matched with the area corresponding to the target UI object, determining that the target UI object is in a non-use state.
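The match between the two regions can be read as a rectangle-overlap test; a stricter containment or intersection-over-union criterion could equally be substituted, so this sketch is only one possibility.

    struct Rect { float minX, minY, maxX, maxY; };

    // Regions "match" here when their bounding rectangles overlap.
    bool RegionsMatch(const Rect& selection, const Rect& target) {
        const bool overlapX =
            selection.minX < target.maxX && target.minX < selection.maxX;
        const bool overlapY =
            selection.minY < target.maxY && target.minY < selection.maxY;
        return overlapX && overlapY;
    }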
Optionally, the processor 730 further has the following functions:
and adding a self-defined image display effect in the area corresponding to the target UI object.
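As a deliberately simple illustration of confining an effect to the located area (a production effect would more likely blend an overlay quad or run a shader pass; a scissored clear merely overwrites the region with a tint):

    #include <GLES2/gl2.h>

    // Restrict rendering to the located region (GL window coordinates, origin
    // at the bottom-left) and fill it with a highlight color.
    void HighlightRegion(GLint x, GLint y, GLsizei w, GLsizei h) {
        glEnable(GL_SCISSOR_TEST);
        glScissor(x, y, w, h);
        glClearColor(1.0f, 0.85f, 0.0f, 1.0f);  // opaque highlight tint
        glClear(GL_COLOR_BUFFER_BIT);
        glDisable(GL_SCISSOR_TEST);
    }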
The embodiment of the application discloses a computer readable storage medium, which stores a computer program, wherein the computer program realizes the method described in the above embodiment when being executed by a processor.
Embodiments of the present application disclose a computer program product comprising a non-transitory computer readable storage medium storing a computer program, and the computer program, when executed by a processor, implements the method as described in the embodiments above.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a ROM, etc.
In the above embodiments, the implementation may be realized, in whole or in part, by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be implemented, in whole or in part, in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the application are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center over a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (13)

1. A method for locating a UI object, comprising:
intercepting a drawing instruction sent to a graphics processor;
determining texture data corresponding to the drawing instruction, wherein the texture data comprises a first texture identifier;
if the first texture identifier is the texture identifier of the target UI object, obtaining vertex data corresponding to the drawing instruction, wherein the vertex data comprises position coordinate information of a vertex;
and determining the area corresponding to the target UI object according to the position coordinate information.
2. The method of claim 1, wherein the intercepting a drawing instruction sent to a graphics processor comprises:
and monitoring a drawing function provided for the graphics processor through the first function, and intercepting a drawing instruction for calling the drawing function.
3. The method of claim 1 or 2, wherein determining the first texture identifier corresponding to the drawing instruction comprises:
acquiring a texture storage instruction related to the drawing instruction, wherein the texture storage instruction is used for indicating to send a texture corresponding to the drawing instruction to a video memory;
and determining a first texture identifier according to the texture storage instruction.
4. The method of claim 3, wherein the acquiring a texture storage instruction related to the drawing instruction comprises:
and monitoring a texture loading function through a second function, and intercepting a texture storage instruction which calls the texture loading function and is related to the drawing instruction.
5. The method according to claim 1 or 2, wherein the obtaining vertex data corresponding to the drawing instruction comprises:
obtaining a vertex storage instruction related to the drawing instruction, wherein the vertex storage instruction is used for indicating to send the vertex data to a video memory;
determining the cache address of the vertex data in the memory according to the vertex storage instruction;
and copying the vertex data from the cache address.
6. The method of claim 5, wherein the obtaining a vertex storage instruction related to the drawing instruction comprises:
and monitoring a graph storage function through a third function, and intercepting a vertex storage instruction which calls the graph storage function and is related to the drawing instruction.
7. The method of claim 1, wherein after obtaining vertex data corresponding to the drawing instruction, the method further comprises:
determining a trigger state corresponding to the target UI object according to the vertex data, wherein the trigger state comprises a use state or a non-use state;
and if the trigger state is the use state, generating prompt information corresponding to the target UI object.
8. The method of claim 7, wherein the vertex data further includes transparency information corresponding to a vertex, and wherein determining the trigger state corresponding to the target UI object according to the vertex data includes:
if the transparency information of the vertex corresponding to the drawing instruction is matched with the first transparency information, determining that the target UI object is in the use state;
and if the transparency information of the vertex corresponding to the drawing instruction is matched with the second transparency information, determining that the target UI object is in the non-use state.
9. The method of claim 7, wherein the texture data further includes a second texture identifier, and wherein determining the trigger state corresponding to the target UI object according to the vertex data comprises:
if the second texture identifier is the texture identifier of the selected component, determining an area corresponding to the selected component according to the vertex data corresponding to the drawing instruction;
if the area corresponding to the selection component is matched with the area corresponding to the target UI object, determining that the target UI object is in the use state;
and if the area corresponding to the selection component is not matched with the area corresponding to the target UI object, determining that the target UI object is in the non-use state.
10. The method according to claim 1, wherein after determining the area corresponding to the target UI object according to the position coordinate information, the method further comprises:
and adding a self-defined image display effect in the area corresponding to the target UI object.
11. A terminal device, comprising:
an intercepting unit for intercepting a drawing instruction sent to a graphics processor;
the determining unit is used for determining texture data corresponding to the drawing instruction, wherein the texture data comprises a first texture identifier;
the obtaining unit is used for obtaining vertex data corresponding to the drawing instruction when the first texture identifier is a texture identifier of a target UI object, and the vertex data comprises position coordinate information of a vertex;
the determining unit is further configured to determine an area corresponding to the target UI object according to the position coordinate information.
12. A terminal device, comprising:
a memory storing executable program code;
and a processor coupled to the memory;
the processor calls the executable program code stored in the memory, which when executed by the processor causes the processor to implement the method of any of claims 1-10.
13. A computer-readable storage medium having executable program code stored thereon, wherein the executable program code, when executed by a processor, implements the method of any of claims 1-10.
CN202110857820.2A 2021-07-28 2021-07-28 UI object positioning method, terminal device and computer-readable storage medium Pending CN113610939A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110857820.2A CN113610939A (en) 2021-07-28 2021-07-28 UI object positioning method, terminal device and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN113610939A true CN113610939A (en) 2021-11-05

Family

ID=78305800

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110857820.2A Pending CN113610939A (en) 2021-07-28 2021-07-28 UI object positioning method, terminal device and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN113610939A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101369249A (en) * 2007-08-14 2009-02-18 国际商业机器公司 Method and apparatus for marking GUI component of software
CN101952857A (en) * 2008-03-28 2011-01-19 科乐美数码娱乐株式会社 Image processing device, image processing device control method, program, and information storage medium
CN102789311A (en) * 2011-05-17 2012-11-21 天津市卓立成科技有限公司 Multipoint analyzing method in interactive system
CN111611031A (en) * 2019-02-26 2020-09-01 华为技术有限公司 Graph drawing method and electronic equipment
CN112947905A (en) * 2019-11-26 2021-06-11 腾讯科技(深圳)有限公司 Picture loading method and device
CN111625233A (en) * 2020-05-25 2020-09-04 天津中新智冠信息技术有限公司 Configuration method, device and equipment of state diagram and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797506A (en) * 2022-12-16 2023-03-14 江苏泽景汽车电子股份有限公司 Method and device for drawing lane line object, terminal equipment and storage medium
CN115797506B (en) * 2022-12-16 2023-11-17 江苏泽景汽车电子股份有限公司 Method, device, terminal equipment and storage medium for drawing lane line object

Similar Documents

Publication Publication Date Title
CN116996609A (en) Method for determining dial image and electronic device thereof
US10186244B2 (en) Sound effect processing method and device, plug-in unit manager and sound effect plug-in unit
CN107741820B (en) Input method keyboard display method and mobile terminal
CN106502703B (en) Function calling method and device
US11263997B2 (en) Method for displaying screen image and electronic device therefor
CN112230923A (en) User interface rendering method, user interface rendering device and server
CN111966491B (en) Method for counting occupied memory and terminal equipment
CN113610939A (en) UI object positioning method, terminal device and computer-readable storage medium
CN115018955B (en) Image generation method and device
CN107465646B (en) A kind of application method for down loading, system and relevant device
CN109718554B (en) Real-time rendering method and device and terminal
CN111580883A (en) Application program starting method, device, computer system and medium
KR20180088859A (en) A method for changing graphics processing resolution according to a scenario,
CN111210496A (en) Picture decoding method, device and equipment
KR102589496B1 (en) Method for displaying screen and electronic device implementing the same
CN112367429B (en) Parameter adjusting method and device, electronic equipment and readable storage medium
CN112905931A (en) Page information display method and device, electronic equipment and storage medium
CN108829600B (en) Method and device for testing algorithm library, storage medium and electronic equipment
CN107943495B (en) Method and device for setting business object, server and storage medium
CN112035180A (en) Automatic instance loading method and device, electronic equipment and storage medium
CN113392120A (en) Method and device for acquiring execution information of SQLite
CN108073508B (en) Compatibility detection method and device
CN115237317B (en) Data display method and device, electronic equipment and storage medium
CN112784622A (en) Image processing method and device, electronic equipment and storage medium
KR20140020108A (en) Method for recognizing touch pen and an electronic device thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination