CN113703622A - Display interface processing method and device, electronic equipment and storage medium - Google Patents

Display interface processing method and device, electronic equipment and storage medium

Info

Publication number
CN113703622A
Authority
CN
China
Prior art keywords
display
display interface
reference image
edge
straight line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110368441.7A
Other languages
Chinese (zh)
Inventor
孙笑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110368441.7A priority Critical patent/CN113703622A/en
Publication of CN113703622A publication Critical patent/CN113703622A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0485 Scrolling or panning
    • G06F 3/04855 Interaction with scrollbars
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application discloses a display interface processing method and device, an electronic device and a storage medium. The method displays a display interface corresponding to a target design drawing together with an operation control, the display interface comprising at least one display element; in response to a first operation on the operation control, a reference image is overlaid on the display interface so that it is aligned with the boundary of the display interface, the reference image being obtained by adjusting the target design drawing according to attribute information of the display interface; the position of the display element in the display interface is then detected based on the reference image, and a detection result is displayed on the display interface, the detection result comprising position difference information of the display element in the display interface relative to the reference image. The scheme can effectively improve the accuracy of display interface processing. Applicable fields include, but are not limited to, maps, transportation, social networking, gaming, news, medical care, education, conferencing, and the like.

Description

Display interface processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a display interface processing method and apparatus, an electronic device, and a storage medium.
Background
The development process of an Application (APP) mainly comprises two stages: a design stage, which includes the UI (User Interface) design of the APP, and a development stage, which includes the UI visual restoration of the APP. Visual restoration refers to the process of implementing the UI designed by the designer in the APP.
In order for interfaces developed from the same design draft to achieve a unified visual effect on different devices, the interfaces implemented by developers need to be compared with the interfaces designed by designers during interface development; for example, the accuracy of visual restoration can be checked by overlapping layers. However, detection in the existing visual restoration scheme is performed manually, which is labor-intensive and time-consuming, and restoration deviations caused by the limited accuracy of the human eye easily occur during visual restoration, so the accuracy is poor.
Disclosure of Invention
The embodiment of the application provides a display interface processing method and device, electronic equipment and a storage medium, and the accuracy of display interface processing can be effectively improved.
The embodiment of the application provides a display interface processing method, which comprises the following steps:
displaying a display interface corresponding to the target design drawing and an operation control, wherein the display interface comprises at least one display element;
responding to a first operation aiming at the operation control, and overlaying and displaying a reference image on the display interface to enable the reference image to be aligned with the boundary of the display interface, wherein the reference image is an image obtained by adjusting a target design drawing according to the attribute information of the display interface;
and detecting the position of a display element in the display interface based on the reference image, and displaying a detection result on the display interface, wherein the detection result comprises position difference information of the display element in the display interface relative to the reference image.
Correspondingly, an embodiment of the present application further provides a display interface processing apparatus, including:
the display unit is used for displaying a display interface corresponding to the target design drawing and an operation control, and the display interface comprises at least one display element;
the superposition unit is used for superposing and displaying a reference image on the display interface in response to a first operation aiming at the operation control, so that the reference image is aligned with the boundary of the display interface, and the reference image is an image obtained by adjusting a target design drawing according to the attribute information of the display interface;
the detection unit is used for detecting the position of the display element in the display interface based on the reference image and displaying a detection result on the display interface, wherein the detection result comprises position difference information of the display element in the display interface relative to the reference image.
Optionally, in some embodiments, the attribute information of the display interface includes a screen resolution of the display interface, and the superposition unit may include an obtaining subunit, an adjusting subunit, and a superposition subunit, as follows:
the obtaining subunit is configured to obtain, in response to a first operation on the operation control, a screen resolution of the display interface;
the adjusting subunit is configured to adjust the target design drawing by using a dynamic adaptation rule based on the screen resolution of the display interface to obtain a reference image;
and the superposition subunit is used for superposing and displaying the reference image on the display interface.
Optionally, in some embodiments, the adjusting subunit may be specifically configured to adjust the target design drawing by using a dynamic adaptation rule based on the screen resolution of the display interface to obtain an adjusted design drawing; if the boundaries of the adjusted design drawing and the display interface are not completely aligned, responding to the stretching operation aiming at the adjusted design drawing to generate a stretched design drawing; when the stretched design drawing is completely aligned with the boundary of the display interface, taking the stretched design drawing as a reference image; and if the adjusted design drawing is completely aligned with the boundary of the display interface, taking the adjusted design drawing as a reference image.
Optionally, in some embodiments, the stretching operation includes a touch operation and a moving operation, and the adjusting subunit is specifically configured to, in response to the touch operation on the adjusted design drawing, obtain a first coordinate value of a touch point on the display interface corresponding to the touch operation; responding to the moving operation of the adjusted design drawing, and acquiring a second coordinate value of a moving point corresponding to the moving operation on the display interface when the moving operation is stopped; and stretching the adjusted design drawing based on the first coordinate value and the second coordinate value to obtain a stretched design drawing.
Optionally, in some embodiments, the detecting unit may include a generating subunit, a determining subunit, a calculating subunit, and a marking subunit, as follows:
the generating subunit is used for generating a display image according to the display interface;
the determining subunit is configured to determine positions of the display element on the display image and the reference image respectively to obtain first position information and second position information of the display element;
the calculating subunit is configured to calculate a difference between the first position information and the second position information of the display element, so as to obtain position difference information of the display element in the display interface relative to the reference image;
the marking subunit is configured to mark the position difference information on the display interface.
Optionally, in some embodiments, the determining subunit may be specifically configured to identify the edge straight line group corresponding to the display element in the display image and the edge straight line group corresponding to the display element in the reference image, respectively, to obtain a first edge straight line group and a second edge straight line group; and respectively determine the position information of the first edge straight line group in the display image and the position information of the second edge straight line group in the reference image, to obtain first position information of the first edge straight line group and second position information of the second edge straight line group.
Optionally, the calculating subunit may be specifically configured to calculate a difference between first position information of the first edge straight line group and second position information of the second edge straight line group, so as to obtain position difference information of a display element in the display interface relative to the reference image.
Optionally, in some embodiments, the determining subunit is specifically configured to perform edge detection on the display image and the reference image respectively to obtain first edge information corresponding to the display image and second edge information corresponding to the reference image; performing straight line detection on the first edge information to obtain at least one first edge straight line; forming first edge straight lines corresponding to each display element into first edge straight line groups based on edge straight line conditions preset by the display elements, wherein each first edge straight line group comprises at least one first edge straight line; performing straight line detection on the second edge information to obtain at least one second edge straight line; and acquiring second edge straight lines corresponding to the display elements based on preset edge straight line conditions of the display elements to form second edge straight line groups, wherein each second edge straight line group comprises at least one second edge straight line.
Optionally, in some embodiments, the determining subunit is specifically configured to perform gaussian filtering on the display image, remove noise of the display image, and obtain a denoised display image; calculating the gradient amplitude and the gradient direction of the denoised display image; performing non-maximum suppression on the gradient amplitude value based on the gradient direction, and determining an initial edge point set of the denoised display image; detecting the initial edge point set by using a dual-threshold algorithm, and determining a target edge point set of the denoised display image according to a detection result; performing edge connection on the denoised display interface based on the target edge point set, and determining first edge information corresponding to the denoised display image; performing edge detection on the reference image to obtain second edge information corresponding to the reference image; performing straight line detection on the second edge information to obtain at least one second edge straight line; and acquiring second edge straight lines corresponding to the display elements based on preset edge straight line conditions of the display elements to form second edge straight line groups, and determining second position information of the second edge straight line groups, wherein each second edge straight line group comprises at least one second edge straight line.
Optionally, in some embodiments, the determining subunit is specifically configured to perform gaussian filtering on the reference image, remove noise of the reference image, and obtain a denoised reference image; calculating the gradient amplitude and the gradient direction of the denoised reference image; performing non-maximum suppression on the gradient amplitude value based on the gradient direction, and determining an initial edge point set of the denoised reference image; detecting the initial edge point set of the denoised reference image by using a dual-threshold algorithm, and determining a target edge point set of the denoised reference image according to a detection result; and performing edge connection on the denoised reference image based on the target edge point set, and determining second edge information corresponding to the denoised reference image.
Optionally, in some embodiments, the calculating subunit may be specifically configured to compare first position information of the first edge straight line group with second position information of a corresponding second edge straight line group; screening out a first edge straight line group different from second position information of the second edge straight line group according to the comparison result to obtain a screened edge straight line group; and calculating the difference between the first position information of the screened edge straight line group and the second position information of the corresponding target second edge straight line group to obtain the position difference information of the display element in the display interface relative to the reference image.
Optionally, in some embodiments, the position difference information includes a coordinate difference value, and the marking subunit may be specifically configured to convert the coordinate difference value into a pixel density difference value according to a screen resolution of the display interface; marking the screened edge straight line group and the pixel density difference value on a display interface.
In addition, a computer-readable storage medium is provided, where multiple instructions are stored, and the instructions are suitable for being loaded by a processor to perform steps in any one of the display interface processing methods provided in the embodiments of the present application.
In addition, an electronic device is further provided in an embodiment of the present application, and includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps in any one of the display interface processing methods provided in the embodiment of the present application when executing the program.
According to an aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium; a processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them to cause the computer device to perform the display interface processing method provided in the various alternative implementations described above.
The embodiment can display a display interface and an operation control corresponding to the target design drawing, wherein the display interface comprises at least one display element; then, responding to a first operation aiming at the operation control, and overlaying and displaying a reference image on the display interface to enable the reference image to be aligned with the boundary of the display interface, wherein the reference image is an image obtained by adjusting a target design drawing according to the attribute information of the display interface; and detecting the position of the display element in the display interface based on the reference image, and displaying a detection result on the display interface, wherein the detection result comprises position difference information of the display element in the display interface relative to the reference image. The scheme can effectively improve the accuracy of the display interface processing. The fields in which the scheme can be applied include, but are not limited to, the fields of maps, traffic, social interactions, games, news, medical, education, meetings, and the like.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1a is a scene schematic diagram of a display interface processing method provided in an embodiment of the present application;
FIG. 1b is a first flowchart of a display interface processing method according to an embodiment of the present application;
FIG. 2a is a second flowchart of a display interface processing method provided in an embodiment of the present application;
FIG. 2b is a diagram illustrating an embodiment of the present application for opening a floating window;
FIG. 2c is an overall architecture diagram of a development interface detection method provided in an embodiment of the present application;
FIG. 2d is a schematic diagram of a function operation control provided by an embodiment of the present application;
FIG. 2e is a schematic diagram of a target layout provided by an embodiment of the present application;
FIG. 2f is a schematic diagram of a reference image and development interface overlay provided by an embodiment of the present application;
FIG. 2g is a third flowchart of a display interface processing method provided by an embodiment of the present application;
FIG. 2h is a schematic diagram of a development interface detection process provided by an embodiment of the present application;
fig. 3 is a schematic structural diagram of a display interface processing apparatus provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The principles of the present application are illustrated as being implemented in a suitable computing environment. In the description that follows, specific embodiments of the present application will be described with reference to steps and symbols executed by one or more computers, unless otherwise indicated. Accordingly, these steps and operations will at times be referred to as being performed by a computer, with the computer's processing unit manipulating electronic signals that represent data in a structured form. This operation transforms the data or maintains it at locations in the computer's memory system, which may be reconfigured or otherwise altered in a manner well known to those skilled in the art. The data is maintained in a data structure, that is, at a physical location in memory with particular characteristics defined by the data's format. However, while the principles of the application are described in the language above, this is not intended to be limiting, and those of ordinary skill in the art will recognize that various of the steps and operations described below may also be implemented in hardware.
The term "unit" as used herein may be considered a software object executing on the computing system. The various components, units, engines, and services described herein may be viewed as objects of implementation on the computing system. The apparatus and method described herein may be implemented in software, or may be implemented in hardware, and are within the scope of the present application.
The terms "first", "second", and "third", etc. in this application are used to distinguish between different objects and not to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but rather, some embodiments may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The embodiment of the application provides a display interface processing method and device, electronic equipment and a storage medium. The display interface processing device may be integrated in an electronic device, and the electronic device may be a server or a terminal.
For example, as shown in fig. 1a, the electronic device integrated with the display interface processing apparatus may first display a display interface corresponding to a target design drawing, together with an operation control, where the display interface includes at least one display element; then, in response to a first operation on the operation control, overlay and display a reference image on the display interface so that the reference image is aligned with the boundary of the display interface, where the reference image is an image obtained by adjusting the target design drawing according to the attribute information of the display interface; and detect the position of the display element in the display interface based on the reference image, and display a detection result on the display interface, where the detection result includes position difference information of the display element in the display interface relative to the reference image. Because this scheme detects the display interface and displays the detection result on the display interface itself, developers can efficiently check the visual restoration effect at any time during development, avoiding the restoration deviations caused by the limited accuracy of the human eye and improving the accuracy of display interface processing; moreover, since the resolution is dynamically adapted, developers do not need to perform resolution adaptation for each of the various electronic devices they use, which greatly improves adaptation and development efficiency.
The following are detailed below. It should be noted that the following description of the embodiments is not intended to limit the preferred order of the embodiments.
The embodiment will be described from the perspective of a display interface processing apparatus, which may be specifically integrated in an electronic device, where the electronic device may be a server or a terminal; the terminal may include a mobile phone, a tablet Computer, a vehicle-mounted device, a notebook Computer, a Personal Computer (PC), and other devices.
In one or more embodiments, there is provided a display interface processing method including: displaying a display interface corresponding to the target design drawing and an operation control, wherein the display interface comprises at least one display element; then, responding to a first operation aiming at the operation control, and overlaying and displaying a reference image on the display interface to enable the reference image to be aligned with the boundary of the display interface, wherein the reference image is an image obtained by adjusting a target design drawing according to the attribute information of the display interface; and detecting the position of the display element in the display interface based on the reference image, and displaying a detection result on the display interface, wherein the detection result comprises position difference information of the display element in the display interface relative to the reference image. Applicable areas include, but are not limited to, areas of maps, traffic, social, gaming, news, medical, education, meetings, and the like.
As shown in fig. 1b, a specific flow of the display interface processing method may be as follows:
101. and displaying a display interface corresponding to the target design drawing and an operation control, wherein the display interface comprises at least one display element.
The target design drawing may refer to a design drawing produced by a visual designer, and the display interface may refer to an interface showing the development effect that the developer implements based on the target design drawing. The display elements may refer to the elements that the developer needs to develop according to the design drawing and display on the display interface, and they can take many forms: for example, the display interface may contain elements such as a display control, a status bar, and a scroll bar; the display image may contain the corresponding image elements, such as a display control, a status bar, and a scroll bar; and the reference image may likewise contain the corresponding image elements; and so on.
The control can take a variety of forms, for example an icon, an input box, a button, a selection box, etc., and can be set as required. The operation control can be displayed in various ways; for example, it may float above the display interface, or it may reside on another interface of the electronic device. The specific form is not limited here, as long as the control can carry out the first operation. For example, a floating window may be provided on the display interface, which may include an operation control, a hide control, and the like. A floating window refers to a movable window that hovers over the display interface while different applications can still be opened. The operation control can be used to add the design drawing so that the design drawing floats above the display interface, which facilitates comparing the positions of display elements in the design drawing with those in the display interface and finding incorrectly positioned display elements, i.e., display elements whose visual restoration deviates (is inaccurate).
For example, the target design drawing may be specifically displayed on the electronic device for a developer, and a display interface and an operation control may be displayed on the electronic device, where the operation control may be suspended on the display interface, and the display interface may include at least one display element.
102. And responding to a first operation aiming at the operation control, and overlaying and displaying a reference image on the display interface to enable the reference image to be aligned with the boundary of the display interface, wherein the reference image is an image obtained by adjusting the target design drawing according to the attribute information of the display interface.
The attribute information of the display interface may refer to a characteristic of the display interface, and for example, the attribute information of the display interface may include a screen resolution of the display interface, a size of the display interface, and the like. The screen resolution of the display interface may refer to a screen resolution of a display screen used by the electronic device to display the display interface. The size of the display interface may refer to the size of the display screen of the electronic device when the display screen of the electronic device displays the display interface full-screen.
For example, the screen resolution of the display interface may be specifically obtained in response to a first operation on the operation control; based on the screen resolution of the display interface, adjusting the target design drawing by using a dynamic adaptation rule to obtain a reference image; and superposing and displaying the reference image on the display interface.
The dynamic adaptation rule may refer to scaling according to different screen sizes (e.g., display interface sizes); for example, a percentage layout may be used for different screen widths. For example, the electronic device obtains the target design drawing in response to a first operation of the user on the operation control, then dynamically adjusts the target design drawing according to the screen resolution of the display interface so that it occupies the entire display screen of the electronic device, thereby obtaining a reference image, and displays the reference image over the display interface in an overlapping manner.
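As an illustration of such a dynamic adaptation step, the following is a minimal Android-style sketch that simply scales the design drawing proportionally to the screen width obtained from the display metrics; the class and method names (DesignAdapter, adaptToScreen, designBitmap) are illustrative and not part of the application.
// Sketch: scale the target design drawing to the screen resolution of the display interface
// (assumes the dynamic adaptation rule is a proportional, percentage-style scale to screen width).
import android.content.Context;
import android.graphics.Bitmap;
import android.util.DisplayMetrics;

public final class DesignAdapter {
    // designBitmap is the decoded target design drawing; the name is illustrative.
    public static Bitmap adaptToScreen(Context context, Bitmap designBitmap) {
        DisplayMetrics dm = context.getResources().getDisplayMetrics();
        int screenWidth = dm.widthPixels; // screen resolution (width) of the display interface
        float scale = (float) screenWidth / designBitmap.getWidth();
        int targetHeight = Math.round(designBitmap.getHeight() * scale);
        // Keep the design's aspect ratio and stretch its width to the full screen width.
        return Bitmap.createScaledBitmap(designBitmap, screenWidth, targetHeight, true);
    }
}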
Optionally, after the target design drawing is dynamically adjusted according to the dynamic adaptation rule, if the adjusted design drawing is not completely aligned with the boundary of the display interface, the adjusted design drawing may be further adapted so that the reference image is completely aligned with the boundary of the display interface, meeting the requirements of different screen sizes. For example, the step "based on the screen resolution of the display interface, adjust the target design drawing by using the dynamic adaptation rule to obtain the reference image" may specifically be:
based on the screen resolution of the display interface, adjusting the target design drawing by using a dynamic adaptation rule to obtain an adjusted design drawing; if the boundaries of the adjusted design drawing and the display interface are not completely aligned, responding to the stretching operation aiming at the adjusted design drawing to generate a stretched design drawing; when the stretched design drawing is completely aligned with the boundary of the display interface, taking the stretched design drawing as a reference image; and if the adjusted design drawing is completely aligned with the boundary of the display interface, taking the adjusted design drawing as a reference image. By this method, the flexibility of display interface processing can be effectively improved, adaptation to different screen sizes is improved, and the applicability of the display interface processing method is broadened.
The stretching operation may be performed on the adjusted design drawing in various ways, for example, the boundary position of the image may be changed by recording the horizontal and vertical coordinate values of the boundary of the image before and after stretching and calculating the difference value before and after stretching. For example, the stretching operation may comprise two steps: touch operation and moving operation, the step "generating a stretched design drawing in response to the stretching operation for the adjusted design drawing", may specifically be:
responding to the touch operation aiming at the adjusted design drawing, and acquiring a first coordinate value of a touch point corresponding to the touch operation on the display interface; responding to the moving operation of the adjusted design drawing, and acquiring a second coordinate value of a moving point corresponding to the moving operation on the display interface when the moving operation is stopped; and stretching the adjusted design drawing based on the first coordinate value and the second coordinate value to obtain a stretched design drawing.
For example, when the added target design drawing is superimposed on the display interface in the form of a floating window, the target design drawing may be dynamically adjusted by using the dynamic adaptation rule based on the screen resolution of the display interface to obtain an adjusted design drawing, and if the boundaries of the adjusted design drawing and the display interface are not completely aligned, the adjusted design drawing may be stretched (i.e., dragged). When the adjusted design drawing is stretched, it is necessary to ensure that the floating window cannot obtain key input focus, so that key or button events are not sent to the floating window, and that, even when the floating window's window can obtain focus, touch gestures outside the floating window's bounds are still delivered to the subsequent (underlying) windows for processing. Obtaining key input focus means that conventional keyboard input messages are sent to the component by default; most directly, for a button, pressing the space bar or Enter key after the button has focus has the same effect as clicking the button with the mouse.
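A minimal sketch of window parameters that produce the focus behavior described above on Android is shown below; FLAG_NOT_FOCUSABLE keeps key and button events away from the floating window, and FLAG_NOT_TOUCH_MODAL lets touch gestures outside its bounds pass to the windows behind it. The constants are standard Android APIs, while the helper class itself is an assumption for illustration.
// Sketch: window parameters for the floating design-draft window (Android).
import android.graphics.PixelFormat;
import android.os.Build;
import android.view.WindowManager;

public final class FloatWindowParams {
    public static WindowManager.LayoutParams create() {
        WindowManager.LayoutParams lp = new WindowManager.LayoutParams();
        lp.type = Build.VERSION.SDK_INT >= Build.VERSION_CODES.O
                ? WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY // API 26+
                : WindowManager.LayoutParams.TYPE_PHONE; // legacy overlay type
        // FLAG_NOT_FOCUSABLE: the floating window never takes key input focus,
        // so key or button events are not delivered to it.
        // FLAG_NOT_TOUCH_MODAL: touch gestures outside the window's bounds are
        // still delivered to the windows behind it.
        lp.flags = WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                | WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL;
        lp.format = PixelFormat.TRANSLUCENT; // allow a semi-transparent overlay
        lp.width = WindowManager.LayoutParams.MATCH_PARENT; // full-screen overlay
        lp.height = WindowManager.LayoutParams.MATCH_PARENT;
        return lp;
    }
}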
For example, taking an Android-based system as an example, the dispatchTouchEvent callback of the View can be overridden to listen for gestures. First, when a finger is pressed (in response to a touch operation on the adjusted design drawing), the abscissa value xInView or the ordinate value yInView on the floating window's View at the moment the finger is pressed is recorded, together with the abscissa value xDownInScreen or the ordinate value yDownInScreen on the display screen of the electronic device; then, when the finger moves (i.e., in response to a movement operation on the adjusted design drawing), the abscissa value xInScreen or the ordinate value yInScreen of the current finger position on the screen is recorded; and then the window's new coordinate values are calculated:
WindowManager.LayoutParams.x = xInScreen - xInView;
WindowManager.LayoutParams.y = yInScreen - yInView;
The new window parameters WindowManager.LayoutParams.x and WindowManager.LayoutParams.y are then passed by calling the WindowManager's updateViewLayout method, changing the position of the window and thereby realizing the stretching (dragging) operation on the adjusted design drawing.
In other systems, such as Linux, iOS, and so on, a similar implementation can be used.
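To make the calculation above concrete, the following is a minimal sketch of the gesture handling on Android, assuming the floating design-draft view overrides dispatchTouchEvent as described; the class name DesignDraftView and the constructor wiring are illustrative.
// Sketch: repositioning (dragging/stretching) the adjusted design drawing by listening to View gestures.
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

public class DesignDraftView extends View {
    private final WindowManager windowManager;
    private final WindowManager.LayoutParams params;
    private float xInView, yInView; // press position inside the floating view

    public DesignDraftView(Context context, WindowManager wm, WindowManager.LayoutParams lp) {
        super(context);
        this.windowManager = wm;
        this.params = lp;
    }

    @Override
    public boolean dispatchTouchEvent(MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN: {
                // first coordinate value: where the touch lands inside the view
                xInView = event.getX();
                yInView = event.getY();
                break;
            }
            case MotionEvent.ACTION_MOVE: {
                // second coordinate value: current finger position on the screen
                float xInScreen = event.getRawX();
                float yInScreen = event.getRawY();
                params.x = Math.round(xInScreen - xInView); // WindowManager.LayoutParams.x
                params.y = Math.round(yInScreen - yInView); // WindowManager.LayoutParams.y
                windowManager.updateViewLayout(this, params); // move the floating window
                break;
            }
            default:
                break;
        }
        return super.dispatchTouchEvent(event);
    }
}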
103. And detecting the position of a display element in the display interface based on the reference image, and displaying a detection result on the display interface, wherein the detection result comprises position difference information of the display element in the display interface relative to the reference image.
For example, a display image may be generated according to the display interface; the positions of the display elements on the display image and on the reference image are respectively determined to obtain first position information and second position information of the display elements; the difference between the first position information and the second position information of each display element is calculated to obtain the position difference information of the display element in the display interface relative to the reference image; and the position difference information is marked on the display interface. By calculating this difference and marking the position difference information on the display interface, developers can quickly see which display elements differ and by how much, accurately determine the specific deviation values, and promptly correct the deviating display elements, which improves development efficiency.
For example, a screenshot of the display interface can be taken to obtain a display image. Control identification is then performed on the display image and the reference image respectively: the control elements in the two images and their positions are identified, and first position information of the control elements on the display image and second position information of the control elements on the reference image are determined. Based on the first and second position information, the control elements in the display image are then compared with the corresponding control elements in the reference image, control elements whose first and second position information differ are recorded, the differences between them are calculated, the position difference information is drawn on the display image, and the drawing result is displayed on the display interface.
Optionally, edge straight lines corresponding to the display elements may be detected, and the position difference value may be calculated by using those straight lines. For example, line detection may be performed on the display image and the reference image respectively to obtain first edge straight lines and second edge straight lines; the first edge straight lines meeting the preset edge straight line condition of a display element are added to a display line set and their first position information is determined, while the second edge straight lines meeting the preset edge straight line condition are added to a reference line set and their second position information is determined; the lines in the display line set and the reference line set are then traversed, the first edge straight lines whose first position information differs from the second position information are marked to obtain a marked line set, and the difference between each first edge straight line in the marked line set and the corresponding second edge straight line in the reference line set is calculated to obtain the position difference information of the display elements in the display interface relative to the reference image, which is then marked on the display interface. Alternatively, line detection may be performed on the display image to determine the first edge straight line group corresponding to a display element and its first position information, and on the reference image to determine the second edge straight line group corresponding to the display element and its second position information; the difference between the first position information of the first edge straight line group and the second position information of the second edge straight line group is then calculated to obtain the position difference information of the display element in the display interface relative to the reference image, which is then marked on the display interface. And so on. In this way, the accuracy of display interface processing can be effectively improved.
The edge straight line condition can be set in various ways; for example, it may be set flexibly according to the requirements of the practical application, or it may be preset and stored in the electronic device. In addition, the preset condition may be built into the electronic device, or may be stored in a memory and transmitted to the electronic device, and so on. For example, the preset edge straight line condition may include one or more of the following: the two lines remain parallel with a certain (e.g., specifically set) distance between them, the two lines are both horizontal or both vertical, the length of the lines does not exceed 1/2 of the width or height of the picture, and the like.
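Purely as an illustration, the preset edge straight line condition listed above might be encoded as a simple predicate like the sketch below; the EdgeLine type, the pixel tolerances, and the expectedGap parameter are assumptions, not part of the application.
// Sketch: one possible encoding of the preset edge straight line conditions.
public final class EdgeLineConditions {

    public static final class EdgeLine {
        public final float x1, y1, x2, y2;
        public EdgeLine(float x1, float y1, float x2, float y2) {
            this.x1 = x1; this.y1 = y1; this.x2 = x2; this.y2 = y2;
        }
        boolean isHorizontal() { return Math.abs(y1 - y2) < 2f; }
        boolean isVertical()   { return Math.abs(x1 - x2) < 2f; }
        float length()         { return (float) Math.hypot(x2 - x1, y2 - y1); }
    }

    // Two lines belong to the same display element if they are parallel (both horizontal
    // or both vertical), roughly a set distance apart, and no longer than half the
    // picture's width or height.
    public static boolean matches(EdgeLine a, EdgeLine b, float expectedGap,
                                  int imageWidth, int imageHeight) {
        boolean parallel = (a.isHorizontal() && b.isHorizontal())
                || (a.isVertical() && b.isVertical());
        if (!parallel) {
            return false;
        }
        float gap = a.isHorizontal() ? Math.abs(a.y1 - b.y1) : Math.abs(a.x1 - b.x1);
        boolean gapOk = Math.abs(gap - expectedGap) < 3f; // tolerance is an assumed value
        float limit = a.isHorizontal() ? imageWidth / 2f : imageHeight / 2f;
        boolean lengthOk = a.length() <= limit && b.length() <= limit;
        return gapOk && lengthOk;
    }
}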
For example, the step "determining the positions of the display elements on the display image and the reference image respectively to obtain the first position information and the second position information of the display elements" may specifically be:
respectively identifying the corresponding edge straight line group of the display element in the display image and the corresponding edge straight line group of the display element in the reference image to obtain a first edge straight line group and a second edge straight line group; respectively determining the position information of the first edge straight line group in the display image and the position information of the second edge straight line group in the reference image to obtain first position information of the first edge straight line group and second position information of the second edge straight line group;
for example, specifically, an image recognition algorithm may be used to respectively recognize the edge line group corresponding to the display element in the display image and the edge line group corresponding to the display element in the reference image.
Then, the step "calculating a difference between the first position information and the second position information of the display element to obtain position difference information of the display element in the display interface with respect to the reference image" may specifically be: and calculating the difference between the first position information of the first edge straight line group and the second position information of the second edge straight line group to obtain the position difference information of the display elements in the display interface relative to the reference image.
Optionally, edge detection may be performed on the image, then linear detection may be performed on the detected edge information, then an edge linear group corresponding to each display element is determined, and then contrast detection may be performed on the edge linear group. For example, the step "respectively identify the edge straight line group corresponding to the display element in the display image and the edge straight line group corresponding to the display element in the reference image to obtain the first edge straight line group and the second edge straight line group", specifically may be:
respectively carrying out edge detection on the display image and the reference image to obtain first edge information corresponding to the display image and second edge information corresponding to the reference image; performing straight line detection on the first edge information to obtain at least one first edge straight line; forming first edge straight lines corresponding to each display element into first edge straight line groups based on edge straight line conditions preset by the display elements, wherein each first edge straight line group comprises at least one first edge straight line; performing straight line detection on the second edge information to obtain at least one second edge straight line; and acquiring second edge straight lines corresponding to the display elements based on preset edge straight line conditions of the display elements to form second edge straight line groups, and determining second position information of the second edge straight line groups, wherein each second edge straight line group comprises at least one second edge straight line. By the method, the accuracy of the display interface processing can be effectively improved.
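The application does not name a particular line detection algorithm or library; as one possible realization, the sketch below uses the Hough transform from OpenCV's Java bindings to obtain edge straight lines from an edge map, with all thresholds being illustrative values. Lines that satisfy the preset edge straight line condition for a display element would then be grouped into the first or second edge straight line groups.
// Sketch: straight line detection on a binary edge image (first or second edge information).
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;
import java.util.ArrayList;
import java.util.List;

public final class EdgeLineDetector {
    // Returns the detected line segments as {x1, y1, x2, y2} arrays.
    public static List<double[]> detectLines(Mat edges) {
        Mat lines = new Mat();
        // rho = 1 px, theta = 1 degree, accumulator threshold 50,
        // minimum line length 30 px, maximum gap 5 px -- all assumed values.
        Imgproc.HoughLinesP(edges, lines, 1, Math.PI / 180, 50, 30, 5);
        List<double[]> result = new ArrayList<>();
        for (int i = 0; i < lines.rows(); i++) {
            result.add(lines.get(i, 0)); // {x1, y1, x2, y2}
        }
        return result;
    }
}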
For example, the image may first be filtered to remove noise; gradients are then calculated using first-order finite differences, which yields the two matrices of partial derivatives of the image in the x and y directions; non-maximum values are suppressed; and a binary edge image is screened out using double thresholds, where selecting a suitable high threshold and low threshold produces an edge image closest to the real edges of the image. For example, the step of "performing edge detection on the display image and the reference image respectively to obtain first edge information corresponding to the display image and second edge information corresponding to the reference image" may specifically be:
carrying out Gaussian filtering processing on the display image, removing the noise of the display image, and obtaining a display image after denoising; calculating the gradient amplitude and the gradient direction of the denoised display image; performing non-maximum suppression on the gradient amplitude value based on the gradient direction, and determining an initial edge point set of the denoised display image; detecting the initial edge point set by using a dual-threshold algorithm, and determining a target edge point set of the denoised display image according to a detection result; performing edge connection on the denoised display interface based on the target edge point set, and determining first edge information corresponding to the denoised display image;
performing Gaussian filtering processing on the reference image, and removing noise of the reference image to obtain a denoised reference image; calculating the gradient amplitude and the gradient direction of the denoised reference image; performing non-maximum suppression on the gradient amplitude value based on the gradient direction, and determining an initial edge point set of the denoised reference image; detecting the initial edge point set of the denoised reference image by using a dual-threshold algorithm, and determining a target edge point set of the denoised reference image according to a detection result; and performing edge connection on the denoised reference image based on the target edge point set, and determining second edge information corresponding to the denoised reference image.
By this method, the accuracy of edge detection can be improved: actual edges are identified as far as possible, false alarms caused by noise are reduced as far as possible, lowering the error rate; the identified edges stay as close as possible to the actual edges in the image, improving localization; and each edge in the image is identified only once, achieving a minimal response.
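The steps above (Gaussian filtering, gradient magnitude and direction, non-maximum suppression, double-threshold detection, edge connection) correspond to the classical Canny edge detection pipeline. As one possible realization only, the sketch below uses OpenCV's Java bindings, where Imgproc.Canny performs the gradient, non-maximum suppression, double-threshold, and edge-linking stages internally; the kernel size and thresholds are assumed values.
// Sketch: Canny-style edge detection for the display image or the reference image.
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.imgproc.Imgproc;

public final class EdgeDetector {
    public static Mat detectEdges(Mat image) {
        Mat gray = new Mat();
        Imgproc.cvtColor(image, gray, Imgproc.COLOR_BGR2GRAY);
        Mat denoised = new Mat();
        // Gaussian filtering to remove noise (5x5 kernel; sigma derived from the kernel size).
        Imgproc.GaussianBlur(gray, denoised, new Size(5, 5), 0);
        Mat edges = new Mat();
        // Canny: gradient magnitude/direction, non-maximum suppression,
        // double-threshold detection and edge connection (low/high thresholds assumed).
        Imgproc.Canny(denoised, edges, 50, 150);
        return edges; // first edge information (display image) or second (reference image)
    }
}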
Optionally, the step of "calculating a difference between first position information of the first edge straight line group and second position information of the second edge straight line group to obtain position difference information of a display element in the display interface with respect to the reference image" may specifically be:
comparing the first position information of the first edge straight line group with the second position information of the corresponding second edge straight line group; screening out a first edge straight line group different from second position information of the second edge straight line group according to the comparison result to obtain a screened edge straight line group; and calculating the difference between the first position information of the screened edge straight line group and the second position information of the corresponding target second edge straight line group to obtain the position difference information of the display element in the display interface relative to the reference image. By the method, the accuracy of display interface processing is effectively improved.
Wherein, the target second edge straight line group may refer to a second edge straight line group corresponding to the post-screening edge straight line group.
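A minimal sketch of this comparison step follows, under the assumption that every edge straight line group can be summarized by a representative position and tied to the display element it belongs to; the LineGroup type, the element identifier, and the one-pixel tolerance are illustrative.
// Sketch: screen out the first edge straight line groups whose position differs from the
// corresponding second edge straight line group, and record the coordinate difference.
import java.util.ArrayList;
import java.util.List;

public final class PositionDiffDetector {

    public static final class LineGroup {
        public final String elementId; // which display element the group belongs to
        public final float x, y;       // representative position of the group
        public LineGroup(String elementId, float x, float y) {
            this.elementId = elementId; this.x = x; this.y = y;
        }
    }

    public static final class Diff {
        public final String elementId;
        public final float dx, dy; // coordinate difference: display interface vs. reference image
        public Diff(String elementId, float dx, float dy) {
            this.elementId = elementId; this.dx = dx; this.dy = dy;
        }
    }

    public static List<Diff> compare(List<LineGroup> firstGroups, List<LineGroup> secondGroups) {
        List<Diff> diffs = new ArrayList<>();
        for (LineGroup first : firstGroups) {
            for (LineGroup second : secondGroups) {
                if (!first.elementId.equals(second.elementId)) {
                    continue;
                }
                float dx = first.x - second.x;
                float dy = first.y - second.y;
                // keep only the screened groups whose positions actually differ
                if (Math.abs(dx) > 1f || Math.abs(dy) > 1f) {
                    diffs.add(new Diff(first.elementId, dx, dy));
                }
            }
        }
        return diffs;
    }
}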
Optionally, the first edge straight lines that show a difference may be drawn on the display image, with the difference value clearly marked. For example, the position difference information includes a coordinate difference value, and the step of "marking the position difference information on the display interface" may specifically be: converting the coordinate difference value into a pixel density difference value according to the screen resolution of the display interface; and marking the screened edge straight line group and the pixel density difference value on the display interface. In this way, the detection result can be made more intuitive and clear.
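On Android, converting a coordinate (pixel) difference into a density-independent difference uses the screen's density factor (density = dpi / 160); a minimal sketch follows, with the helper class name being illustrative.
// Sketch: convert a coordinate difference in pixels into a density-independent (dp)
// difference before marking it on the display interface.
import android.content.Context;
import android.util.DisplayMetrics;

public final class DiffConverter {
    public static float pxToDp(Context context, float pxDiff) {
        DisplayMetrics dm = context.getResources().getDisplayMetrics();
        // dm.density = dpi / 160, so dp = px / density.
        return pxDiff / dm.density;
    }
}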
The fields in which the scheme can be applied include, but are not limited to, the fields of maps, traffic, social interactions, games, news, medical, education, meetings, and the like.
As can be seen from the above, the present embodiment may display a display interface and an operation control corresponding to the target design drawing, where the display interface includes at least one display element; then, in response to a first operation on the operation control, overlay and display a reference image on the display interface so that the reference image is aligned with the boundary of the display interface, where the reference image is an image obtained by adjusting the target design drawing according to the attribute information of the display interface; and detect the position of the display element in the display interface based on the reference image, and display a detection result on the display interface, where the detection result includes position difference information of the display element in the display interface relative to the reference image. Because this scheme detects the display interface and displays the detection result on the display interface itself, developers can efficiently check the visual restoration effect at any time during development, avoiding the restoration deviations caused by the limited accuracy of the human eye and improving the accuracy of display interface processing; moreover, since the resolution is dynamically adapted, developers do not need to perform resolution adaptation for the various electronic devices they use, which greatly improves adaptation and development efficiency.
The method described in the previous embodiment is further detailed by way of example.
In this embodiment, the display interface processing apparatus is specifically integrated in an electronic device, the display interface is specifically a development interface, the display element is specifically a development element, the operation control is specifically an addition control, the first operation is specifically an addition operation, and the display image is specifically a development image, and the display interface processing method may be specifically a development interface detection method, which is described as an example.
As shown in fig. 2a, a development interface detection method may specifically include the following steps:
201. The electronic device displays a development interface corresponding to the target design drawing and an addition control, wherein the development interface includes at least one development element.
Here, the development interface may refer to an interface developed by a developer based on the target design drawing. A development element may refer to any of the elements that the developer needs to develop according to the design drawing, and it may take various forms: for example, the development interface may contain one or more elements such as a development control, a status bar, and a scroll bar; the development image may contain one or more image elements corresponding to the development control, the status bar, the scroll bar, and the like; and the reference image may likewise contain one or more image elements corresponding to the development control, the status bar, the scroll bar, and the like.
The addition control may be displayed in various ways: for example, it may be suspended over the development interface, or it may be located on another interface of the electronic device; the concrete form is not limited here, as long as the control can trigger the addition operation. For example, a floating window may be provided on the development interface, and the floating window may include the addition control, a hiding control, and so on. A floating window refers to a movable window floating on the surface of the development interface, which makes it convenient to open different applications. The addition control can be used to add the design drawing so that the design drawing is suspended over the development interface; this makes it easy to compare the positions of the development elements in the design drawing with those in the development interface and to find the development elements whose positions are inaccurate, that is, the development elements whose visual restoration has a deviation.
For example, the electronic device may install an application program A for performing development interface detection, such as "PixEye". The application program A may include a main page, a floating window permission manager, a Read-Only Memory (ROM) reading tool class, an authorization page jumper, a floating window control manager, a floating window parameter generator, a visual design draft control, function operation controls, and so on, for example as shown in fig. 2c. The functions of these modules may be as follows:
(1) Homepage: the main page displayed when the APP starts;
(2) Floating window permission manager: responsible for determining whether the floating window has the required permission;
(3) ROM reading tool class: determines the ROM version of the system by reading the SystemProperties attributes of the electronic device (see the sketch after this list);
(4) Authorization page jumper: jumps to the mobile phone's floating window permission management page according to the ROM version;
(5) Floating window control manager: includes the floating window parameter generator, the function operation panel control and the visual design draft control;
(6) Floating window parameter generator: through a series of parameter settings, displays the floating window in full screen and dynamically adapts it to the screen resolution;
(7) Visual design draft control: the design draft picture displayed in the floating window, which may, for example, display the reference image obtained by dynamically adjusting the target design drawing;
(8) Function operation controls: may include a display/hide PixEye control, an add design draft control (i.e., the addition control), a display/hide design draft control, a design draft reverse color processing control, and so on. For example, as shown in fig. 2d, the four function operation controls are the 01 display/hide PixEye control, the 02 add design draft control (i.e., the addition control), the 03 display/hide design draft control, and the 04 design draft reverse color processing control.
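As a minimal sketch of how the ROM reading tool class might query system properties, in Java (the use of reflection and the vendor property keys below are assumptions for illustration, not details taken from this filing):
// android.os.SystemProperties is not part of the public SDK, so it is commonly read via reflection.
public static String getSystemProperty(String key) {
    try {
        Class<?> systemProperties = Class.forName("android.os.SystemProperties");
        return (String) systemProperties.getMethod("get", String.class).invoke(null, key);
    } catch (Exception e) {
        return "";
    }
}
A non-empty value for a vendor key such as "ro.miui.ui.version.name" or "ro.build.version.emui" (assumed examples) would then suggest the corresponding ROM, which the authorization page jumper can use to pick the right permission management page.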
Next, the original visual design picture in png or jpg format (i.e., the target design drawing) may be imported into a designated storage path of the electronic device; the original design drawing is shown in fig. 2e, for example.
For example, a developer may develop an application program B on the electronic device based on the target design drawing; the development interface of the application program B and the addition control may then be displayed on the electronic device, the addition control may be suspended over the development interface, and the development interface may include at least one development element. Using the characteristics of the floating window, the electronic device can suspend the target design drawing above the development interface of the application program B, dynamically adapt the resolution of the floating window so that the floating window and the development interface keep the same proportion, make the floating window completely cover the development interface by dragging, then take page screenshots to generate a development image and a reference image, and identify the positions of all control elements in the two images by edge detection and Hough transform straight line detection; the control elements at different positions in the two images are then distinguished (diff), drawn, and output onto the original design draft (namely, the target design drawing). A specific flow can be shown in fig. 2g, and the specific implementation process is detailed in the following steps.
202. The electronic device, in response to the addition operation on the addition control, superimposes and displays a reference image on the development interface.
For example, the electronic device may specifically respond to the addition operation for the addition control, and superimpose and display a reference image on the development interface so that the reference image is aligned with the boundary of the development interface, where the reference image is an image obtained by adjusting the target design drawing according to the attribute information of the development interface.
For example, the imported target design drawing can be selected by clicking the addition control of PixEye and compared with the development interface of the application program B: the target design drawing is superimposed on the development interface of the application program B and visual restoration accuracy detection is performed, and the superposition result may be as shown in fig. 2f.
For example, the electronic device may specifically obtain the screen resolution of the development interface in response to the addition operation on the addition control; adjust the target design drawing by using a dynamic adaptation rule based on the screen resolution of the development interface to obtain a reference image; and display the reference image on the development interface in an overlapping manner. For example, the floating window may dynamically adapt to the screen resolution, ensuring that the floating window occupies the entire screen of the electronic device, and hide all decorative borders, such as decoration bars and status bars. When a layout is requested, the window might otherwise appear above or below the status bar and be partially occluded; an Application Programming Interface (API) provided by the electronic device system (i.e., operating the window manager) is used to ensure that the contents of the window are not covered by the decoration bar or the status bar.
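A minimal sketch, in Java, of how such floating window parameters might be configured so that the window covers the full screen and does not take the input focus (the specific window type and flag choices are assumptions for illustration and are not taken from this filing):
// Assumed imports: android.content.Context, android.graphics.PixelFormat,
// android.view.Gravity, android.view.View, android.view.WindowManager
void showDesignDraftOverlay(Context context, View designDraftView) {
    WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
    WindowManager.LayoutParams params = new WindowManager.LayoutParams();
    params.type = WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY;  // floating window type (API 26+)
    params.format = PixelFormat.TRANSLUCENT;                            // allow a semi-transparent overlay
    params.flags = WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE        // do not take key input focus
            | WindowManager.LayoutParams.FLAG_LAYOUT_IN_SCREEN          // lay out over the whole screen
            | WindowManager.LayoutParams.FLAG_LAYOUT_NO_LIMITS;         // ignore decoration limits
    params.width = WindowManager.LayoutParams.MATCH_PARENT;
    params.height = WindowManager.LayoutParams.MATCH_PARENT;
    params.gravity = Gravity.TOP | Gravity.START;
    wm.addView(designDraftView, params);  // designDraftView renders the adjusted design drawing
}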
Here, the dynamic adaptation rule may refer to scaling in equal proportion according to different screen sizes (e.g., development interface sizes); for example, a percentage layout may be used for different screen widths. For example, the electronic device obtains the target design drawing in response to the user's addition operation on the addition control, then dynamically adjusts the target design drawing according to the screen resolution of the development interface so that it occupies the whole display screen of the electronic device, obtains the reference image, and displays the reference image on the development interface in an overlapping manner.
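One possible reading of this scaling step, sketched in Java (the method name and the decision to stretch the bitmap to the full screen resolution are assumptions for illustration):
// Assumed import: android.graphics.Bitmap
// Scale the target design drawing to the screen resolution of the development interface.
static Bitmap adaptDesignDrawing(Bitmap targetDesign, int screenWidthPx, int screenHeightPx) {
    // filter = true gives smoother resampling when the design resolution differs from the screen's
    return Bitmap.createScaledBitmap(targetDesign, screenWidthPx, screenHeightPx, true);
}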
Optionally, after the target design drawing is dynamically adjusted, if the boundary between the adjusted design drawing and the development interface is not completely aligned, the adjusted design drawing may be adaptively adjusted, for example, the floating window is dragged to completely cover the development interface, so that the boundary between the reference image and the development interface is completely aligned, and the requirements of different screen sizes are met. For example, the step "based on the screen resolution of the development interface, adjust the target design drawing by using the dynamic adaptation rule to obtain the reference image" may specifically be:
based on the screen resolution of the development interface, adjusting the target design drawing by using a dynamic adaptation rule to obtain an adjusted design drawing; if the boundaries of the adjusted design drawing and the development interface are not completely aligned, responding to a touch operation on the adjusted design drawing and acquiring a first coordinate value of the touch point corresponding to the touch operation on the development interface; responding to a moving operation on the adjusted design drawing and acquiring, when the moving operation stops, a second coordinate value of the moving point corresponding to the moving operation on the development interface; stretching the adjusted design drawing based on the first coordinate value and the second coordinate value to obtain a stretched design drawing; when the stretched design drawing is completely aligned with the boundary of the development interface, taking the stretched design drawing as the reference image; and if the adjusted design drawing is already completely aligned with the boundary of the development interface, taking the adjusted design drawing as the reference image. In this way, the flexibility of development interface detection is effectively improved, and the method adapts to different screen sizes, which broadens its applicability.
For example, when the added target design drawing is suspended in the form of a floating window and superimposed on the development interface, the dynamic adaptation rule may first be used to dynamically adjust the target design drawing based on the screen resolution of the development interface to obtain an adjusted design drawing; if the boundaries of the adjusted design drawing and the development interface are not completely aligned, the adjusted design drawing may be stretched (i.e., dragged). When the adjusted design drawing is stretched, it is necessary to ensure that the display screen of the electronic device cannot obtain the key input focus (for example, a development control on the development interface cannot obtain the input focus) and that key or button events cannot be sent to the floating window, while, even when the floating window's window can obtain the focus, touch gestures outside the window range of the floating window are still dispatched to the subsequent (underlying) windows for processing. Here, obtaining the key input focus means that conventional keyboard input messages are sent to that component by default; most directly, for a button, pressing the space bar or the enter key after the button obtains the focus has the same effect as clicking the button with the mouse.
For example, the electronic device may listen for the gesture through the View's dispatchTouchEvent callback: first, when a finger is pressed (in response to the touch operation on the adjusted design drawing), record the abscissa value xInView and the ordinate value yInView of the press point on the floating window's View, and record the abscissa value xDownInScreen and the ordinate value yDownInScreen on the display screen of the electronic device; then, when the finger is moved (i.e., in response to the moving operation on the adjusted design drawing), record the abscissa value xInScreen and the ordinate value yInScreen of the current finger position on the screen; and then calculate the abscissa and ordinate of the window parameters:
layoutParams.x = (int) (xInScreen - xInView);   // layoutParams is the floating window's WindowManager.LayoutParams instance
layoutParams.y = (int) (yInScreen - yInView);   // reposition the window so that it follows the finger
Then, the new window parameter values layoutParams.x and layoutParams.y (fields of WindowManager.LayoutParams) are applied by calling the window manager's updateViewLayout method, which changes the position of the window and thereby implements the stretching (dragging) operation on the adjusted design drawing.
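The callback described above could be sketched as follows in Java, here using a View.OnTouchListener rather than overriding dispatchTouchEvent; the class name and the assumption that wm and params come from the earlier window setup are illustrative only:
// Assumed imports: android.view.MotionEvent, android.view.View, android.view.WindowManager
class DragToAlignListener implements View.OnTouchListener {
    private final WindowManager wm;
    private final WindowManager.LayoutParams params;
    private float xInView, yInView;  // press position inside the floating window's View

    DragToAlignListener(WindowManager wm, WindowManager.LayoutParams params) {
        this.wm = wm;
        this.params = params;
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:   // finger pressed: record the offset inside the View
                xInView = event.getX();
                yInView = event.getY();
                return true;
            case MotionEvent.ACTION_MOVE:   // finger moved: reposition the floating window
                params.x = (int) (event.getRawX() - xInView);  // xInScreen - xInView
                params.y = (int) (event.getRawY() - yInView);  // yInScreen - yInView
                wm.updateViewLayout(v, params);
                return true;
            default:
                return false;
        }
    }
}
Attaching an instance with designDraftView.setOnTouchListener(new DragToAlignListener(wm, params)) would then let the developer drag the overlaid design drawing until its boundary lines up with the development interface.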
203. The electronic device generates a development image according to the development interface.
For example, the electronic device may specifically obtain a screenshot of the development interface to obtain a development image corresponding to the development interface.
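The filing does not spell out the screenshot mechanism; one generic way to render a view hierarchy into a bitmap is sketched below in Java (the method name and the assumption that the relevant root View is accessible are illustrative only):
// Assumed imports: android.graphics.Bitmap, android.graphics.Canvas, android.view.View
static Bitmap captureView(View root) {
    Bitmap bitmap = Bitmap.createBitmap(root.getWidth(), root.getHeight(), Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(bitmap);
    root.draw(canvas);  // render the view hierarchy into the bitmap
    return bitmap;
}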
204. The electronic equipment identifies a development edge straight line group corresponding to the development element in the development image and current position information of the development edge straight line group by using an image identification algorithm.
For example, the electronic device may specifically perform edge detection on the development image (for example, as shown in fig. 2h (1)) to obtain development edge information of the development image, for example, as shown in fig. 2h (2); performing line detection on the development edge information to obtain at least one development edge line, which may be shown in fig. 2h (3), for example; forming development edge straight lines corresponding to each development element into a development edge straight line group based on edge straight line conditions preset by the development elements, and determining current position information of the development edge straight line group, wherein each development edge straight line group comprises at least one development edge straight line.
The edge straight line condition may be set in various ways: for example, it may be flexibly set according to the requirements of the practical application, or it may be preset and stored in the electronic device. In addition, the preset condition may be built into the electronic device, or may be stored in a memory and transmitted to the electronic device, and so on. For example, the preset edge straight line condition may include one or more of the following: the two straight lines are parallel; there is a certain (e.g., specifically set) distance between the two straight lines; the two straight lines are both horizontal or both vertical; the length of each straight line does not exceed 1/2 of the width or height of the picture; and so on.
For example, the image is filtered and de-noised, the gradient is calculated by using first-order finite differences to obtain two matrices of partial derivatives of the image in the x and y directions, non-maximum values are suppressed, the binary image is screened with double thresholds, and suitable high and low thresholds are selected to obtain an edge image that is as close as possible to the real edges of the image. For example, the electronic device may specifically perform Gaussian filtering on the development image to remove its noise and obtain a de-noised development image; calculate the gradient magnitude and gradient direction of the de-noised development image; perform non-maximum suppression on the gradient magnitude based on the gradient direction and determine an initial edge point set of the de-noised development image; detect the initial edge point set with a dual-threshold algorithm and determine a target edge point set of the de-noised development image according to the detection result; and perform edge connection on the de-noised development image based on the target edge point set to determine the development edge information of the de-noised development image. In this way, the accuracy of edge detection can be effectively improved: real edges are identified as far as possible, false alarms caused by noise are reduced, the error rate is lowered, the identified edges stay as close as possible to the actual edges in the image (good localization), and each edge in the image is identified only once (minimal response).
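A sketch of this edge-detection step using the OpenCV Java bindings (the choice of OpenCV and the Gaussian kernel size are assumptions for illustration; the filing itself only names the operations):
// Assumed imports: org.opencv.core.Mat, org.opencv.core.Size, org.opencv.imgproc.Imgproc
static Mat detectEdges(Mat image, double lowThreshold, double highThreshold) {
    Mat gray = new Mat();
    Mat blurred = new Mat();
    Mat edges = new Mat();
    Imgproc.cvtColor(image, gray, Imgproc.COLOR_BGR2GRAY);   // work on image intensity only
    Imgproc.GaussianBlur(gray, blurred, new Size(5, 5), 0);  // Gaussian filtering to suppress noise
    // Canny computes the gradient magnitude and direction, applies non-maximum suppression and
    // the dual-threshold screening described above, then links the surviving edge points.
    Imgproc.Canny(blurred, edges, lowThreshold, highThreshold);
    return edges;
}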
Since edge-detection algorithms rely mainly on the first and second derivatives of the image intensity, and derivatives are usually sensitive to noise, a filter can be used to improve the performance of the noise-affected edge detector, so the image is first Gaussian-blurred. Edge detection is then performed using Canny's algorithm, which mainly uses a dual-threshold method to reduce the number of false edges: if the magnitude at a pixel location exceeds the high threshold, the pixel is retained as an edge pixel; if the magnitude is below the low threshold, the pixel is excluded; and if the magnitude lies between the two thresholds, the pixel is retained only when it is connected to a pixel above the high threshold. In edge detection, a threshold set too high may miss important information, while a threshold set too low may treat incidental details as important, so the dual-threshold method is adopted. For example, before detection, all pages of the map application were imported and the edge-detection accuracy was evaluated, and the low threshold was finally set to 50 and the high threshold to 200, so that as many control elements of the pages as possible can be identified. After edge detection, straight line detection may be performed using the Hough transform. For example, by analyzing the styles and sizes of all control elements in the map APP pages, it was determined that the target straight lines are characterized by two straight lines that are parallel to each other, a certain distance between the two straight lines, the two straight lines being horizontal or vertical, the length of each straight line not exceeding 1/2 of the width or height of the picture, and so on. Thus, the edge straight line condition may be set such that the two straight lines must be parallel, there is a certain distance between them, they must be horizontal or vertical, and the length of each straight line does not exceed 1/2 of the width or height of the picture. The identification of the reference image, described below, is similar to that of the development image.
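A sketch of the straight line detection and of filtering by the edge straight line condition, again using the OpenCV Java bindings (the probabilistic Hough transform and its numeric parameters below are assumed values for illustration):
// Assumed imports: org.opencv.core.Mat, org.opencv.imgproc.Imgproc, java.util.ArrayList, java.util.List
static List<double[]> detectCandidateLines(Mat edges) {
    Mat lines = new Mat();
    // rho = 1 px, theta = 1 degree; vote threshold, minimum length and maximum gap are assumed values
    Imgproc.HoughLinesP(edges, lines, 1, Math.PI / 180, 50, 30, 5);
    List<double[]> kept = new ArrayList<>();
    for (int i = 0; i < lines.rows(); i++) {
        double[] l = lines.get(i, 0);            // {x1, y1, x2, y2}
        double dx = Math.abs(l[2] - l[0]);
        double dy = Math.abs(l[3] - l[1]);
        boolean horizontal = dy < 2 && dx <= edges.cols() / 2.0;  // horizontal, at most half the picture width
        boolean vertical = dx < 2 && dy <= edges.rows() / 2.0;    // vertical, at most half the picture height
        if (horizontal || vertical) {
            kept.add(l);                         // candidate edge straight line
        }
    }
    return kept;
}
Grouping the kept lines into parallel pairs separated by the expected spacing would then yield the development edge straight line groups described in this step.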
205. The electronic device identifies, by using an image identification algorithm, the reference edge straight line group corresponding to the development element in the reference image and the reference position information of the reference edge straight line group.
For example, the electronic device may specifically perform edge detection on the reference image to obtain reference edge information corresponding to the reference image; performing linear detection on the reference edge information to obtain at least one reference edge straight line; acquiring reference edge straight lines corresponding to development elements based on edge straight line conditions preset by the development elements, forming reference edge straight line groups, and determining reference position information of the reference edge straight line groups, wherein each reference edge straight line group comprises at least one reference edge straight line.
For example, the electronic device performs edge detection on the reference image, and specifically may perform gaussian filtering on the reference image to remove noise of the reference image, so as to obtain a denoised reference image; calculating the gradient amplitude and the gradient direction of the denoised reference image; performing non-maximum suppression on the gradient amplitude value based on the gradient direction, and determining an initial edge point set of the denoised reference image; detecting the initial edge point set of the denoised reference image by using a dual-threshold algorithm, and determining a target edge point set of the denoised reference image according to a detection result; and performing edge connection on the denoised reference image based on the target edge point set, and determining reference edge information corresponding to the denoised reference image.
206. The electronic device calculates the difference between the current position information of the development edge straight line group and the reference position information of the reference edge straight line group to obtain the position difference information of the development element in the development interface relative to the reference image.
For example, the electronic device may specifically compare the current position information of each development edge straight line group with the reference position information of the corresponding reference edge straight line group; screen out, according to the comparison result, the development edge straight line groups whose current position information differs from the reference position information of the corresponding reference edge straight line groups, to obtain screened edge straight line groups; and calculate the difference between the current position information of each screened edge straight line group and the reference position information of the corresponding target reference edge straight line group to obtain the position difference information of the development element in the development interface relative to the reference image.
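A sketch of this comparison in Java, under the simplifying assumption (not from the filing) that each edge straight line group has already been reduced to one representative coordinate, for instance the x value of a vertical line pair or the y value of a horizontal line pair:
// Assumed imports: java.util.ArrayList, java.util.List
// currentPositions[i] and referencePositions[i] hold the representative coordinate of the i-th
// edge straight line group in the development image and in the reference image, respectively.
static List<double[]> diffLineGroups(double[] currentPositions, double[] referencePositions) {
    List<double[]> diffSize = new ArrayList<>();  // each entry: {group index, coordinate difference}
    for (int i = 0; i < currentPositions.length; i++) {
        double diff = currentPositions[i] - referencePositions[i];
        if (Math.abs(diff) > 0) {                 // screen out the groups whose positions differ
            diffSize.add(new double[]{i, diff});  // a small tolerance could replace the exact comparison
        }
    }
    return diffSize;                              // later converted to dp and drawn on the development image
}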
207. The electronic device marks the position difference information on the development interface.
Optionally, the development edge straight lines that show a difference can be drawn on the development image, with the difference values clearly marked. For example, the electronic device may specifically convert the coordinate difference value into a pixel density difference value according to the screen resolution of the development interface, and mark the screened edge straight line groups and the pixel density differences on the development interface. For example, after the development edge straight line groups and the reference edge straight line groups of the development elements have been obtained, the edge straight lines (lines) having different coordinate values may be recorded to generate diffLines (the set of difference lines), the coordinate diff (difference) values may be recorded in an array diffSize (the coordinate difference array), and the diffLines may be drawn on the development image. The values in diffSize are then converted to dp according to the screen density (resolution). To express the difference result more clearly, a plus or minus sign may indicate whether a control element of the development image lies to the left of or above, or to the right of or below, the corresponding control element of the reference image; the control elements may, for example, be marked as -2dp, +3dp and +1dp as shown in fig. 2h (4). Other marking manners may also be used, which are not limited here and may be set according to the needs of developers. In this way, the detection result is more intuitive and clear.
For example, the coordinate difference is initially expressed in screen pixels (i.e., at the screen resolution) and is then converted into a pixel density (dp) value, giving the pixel density difference between a development edge straight line of the development element and the corresponding reference edge straight line.
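The original filing presents this conversion as a code figure; a minimal reconstruction in Java, using the standard Android density relationship, is sketched below (an assumption, not the filing's own code):
// Assumed import: android.util.DisplayMetrics
// Convert a coordinate difference measured in pixels into dp using the screen density.
static float pxToDp(float diffPx, DisplayMetrics metrics) {
    return diffPx / metrics.density;  // density = dpi / 160, so dp = px * 160 / dpi
}
Each value recorded in diffSize would pass through such a conversion before being drawn next to the corresponding control element with its + or - sign.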
As can be seen from the above, the present embodiment may display a development interface corresponding to the target design drawing and an addition control, where the development interface includes at least one development element; then, in response to the addition operation on the addition control, superimpose and display a reference image on the development interface so that the reference image is aligned with the boundary of the development interface, where the reference image is an image obtained by adjusting the target design drawing according to the attribute information of the development interface; and detect the position of the development element in the development interface based on the reference image and display a detection result on the development interface, where the detection result includes position difference information of the development element in the development interface relative to the reference image. With the PixEye software developed according to this scheme, when the interface of an application program is developed, the design draft (jpg/png) produced by the visual designer is imported into the mobile phone; using the floating window characteristics of the electronic device system, the design draft image is suspended above the development interface of the application program under development; through the floating window's dynamic screen adaptation and the floating window dragging technique, the resolution of the design draft image is adapted to be completely consistent with the development interface, and the two are superimposed; the positions of the control elements in the two images corresponding to the design draft and the development interface are then identified by image recognition. Control elements at different positions are found by comparing control element coordinates, that is, the control elements whose visual restoration has a deviation (is inaccurate) are found. The distances between control elements at different positions are calculated and converted into screen pixel information, namely the pixels of visual restoration deviation, and the deviation pixels are marked on the APP page picture with plus and minus signs, thereby completing the detection of visual restoration accuracy. Because of this scheme, developers can perform efficient visual restoration checks at any time during development, the restoration deviation caused by the inaccuracy of the human eye during visual restoration is avoided, and the accuracy of development interface detection is improved; meanwhile, because the resolution is dynamically adapted, developers do not need to perform resolution adaptation when using various electronic devices, which greatly improves adaptation and development efficiency.
In order to better implement the method, correspondingly, an embodiment of the present application further provides a display interface processing apparatus, where the display interface processing apparatus may be specifically integrated in an electronic device, and the electronic device may be a server or a terminal.
For example, as shown in fig. 3, the display interface processing apparatus may include a display unit 301, a superimposing unit 302, and a detecting unit 303, as follows:
the display unit 301 is configured to display a display interface and an operation control corresponding to the target design drawing, where the display interface includes at least one display element;
the superimposing unit 302 is configured to superimpose and display a reference image on the display interface in response to a first operation on the operation control, so that the reference image is aligned with a boundary of the display interface, where the reference image is an image obtained by adjusting a target design drawing according to attribute information of the display interface;
a detecting unit 303, configured to detect a position of a display element in the display interface based on the reference image, and display a detection result on the display interface, where the detection result includes position difference information of the display element in the display interface with respect to the reference image.
Optionally, in some embodiments, the attribute information of the display interface includes a screen resolution of the display interface, and the superimposing unit 302 may include an acquiring subunit, an adjusting subunit, and a superimposing subunit, as follows:
the obtaining subunit is configured to obtain, in response to a first operation on the operation control, a screen resolution of the display interface;
the adjusting subunit is configured to adjust the target design drawing by using a dynamic adaptation rule based on the screen resolution of the display interface to obtain a reference image;
and the superposition subunit is used for superposing and displaying the reference image on the display interface.
Optionally, in some embodiments, the adjusting subunit may be specifically configured to adjust the target design drawing by using a dynamic adaptation rule based on the screen resolution of the display interface to obtain an adjusted design drawing; if the boundaries of the adjusted design drawing and the display interface are not completely aligned, responding to the stretching operation aiming at the adjusted design drawing to generate a stretched design drawing; when the stretched design drawing is completely aligned with the boundary of the display interface, taking the stretched design drawing as a reference image; and if the adjusted design drawing is completely aligned with the boundary of the display interface, taking the adjusted design drawing as a reference image.
Optionally, in some embodiments, the stretching operation includes a touch operation and a moving operation, and the adjusting subunit is specifically configured to, in response to the touch operation on the adjusted design drawing, obtain a first coordinate value of a touch point on the display interface corresponding to the touch operation; responding to the moving operation of the adjusted design drawing, and acquiring a second coordinate value of a moving point corresponding to the moving operation on the display interface when the moving operation is stopped; and stretching the adjusted design drawing based on the first coordinate value and the second coordinate value to obtain a stretched design drawing.
Optionally, in some embodiments, the detecting unit 303 may include a generating subunit, a determining subunit, a calculating subunit, and a marking subunit, as follows:
the generating subunit is used for generating a display image according to the display interface;
the determining subunit is configured to determine positions of the display element on the display image and the reference image respectively to obtain first position information and second position information of the display element;
the calculating subunit is configured to calculate a difference between the first position information and the second position information of the display element, so as to obtain position difference information of the display element in the display interface relative to the reference image;
the marking subunit is configured to mark the position difference information on the display interface.
Optionally, in some embodiments, the determining subunit may be specifically configured to identify an edge line group corresponding to the display element in the display image and an edge line group corresponding to the display element in the reference image, respectively, to obtain a first edge line group and a second edge line group; and respectively determining the position information of the first edge straight line group in the display image and the position information of the second edge straight line group in the reference image to obtain the first position information of the first edge straight line group and the second position information of the second edge straight line group.
Optionally, the calculating subunit may be specifically configured to calculate a difference between first position information of the first edge straight line group and second position information of the second edge straight line group, so as to obtain position difference information of a display element in the display interface relative to the reference image.
Optionally, in some embodiments, the determining subunit is specifically configured to perform edge detection on the display image and the reference image respectively to obtain first edge information corresponding to the display image and second edge information corresponding to the reference image; performing straight line detection on the first edge information to obtain at least one first edge straight line; forming first edge straight lines corresponding to each display element into first edge straight line groups based on edge straight line conditions preset by the display elements, wherein each first edge straight line group comprises at least one first edge straight line; performing straight line detection on the second edge information to obtain at least one second edge straight line; and acquiring second edge straight lines corresponding to the display elements based on preset edge straight line conditions of the display elements to form second edge straight line groups, wherein each second edge straight line group comprises at least one second edge straight line.
Optionally, in some embodiments, the determining subunit is specifically configured to perform gaussian filtering on the display image, remove noise of the display image, and obtain a denoised display image; calculating the gradient amplitude and the gradient direction of the denoised display image; performing non-maximum suppression on the gradient amplitude value based on the gradient direction, and determining an initial edge point set of the denoised display image; detecting the initial edge point set by using a dual-threshold algorithm, and determining a target edge point set of the denoised display image according to a detection result; performing edge connection on the denoised display interface based on the target edge point set, and determining first edge information corresponding to the denoised display image; performing edge detection on the reference image to obtain second edge information corresponding to the reference image; performing straight line detection on the second edge information to obtain at least one second edge straight line; and acquiring second edge straight lines corresponding to the display elements based on preset edge straight line conditions of the display elements to form second edge straight line groups, and determining second position information of the second edge straight line groups, wherein each second edge straight line group comprises at least one second edge straight line.
Optionally, in some embodiments, the determining subunit is specifically configured to perform gaussian filtering on the reference image, remove noise of the reference image, and obtain a denoised reference image; calculating the gradient amplitude and the gradient direction of the denoised reference image; performing non-maximum suppression on the gradient amplitude value based on the gradient direction, and determining an initial edge point set of the denoised reference image; detecting the initial edge point set of the denoised reference image by using a dual-threshold algorithm, and determining a target edge point set of the denoised reference image according to a detection result; and performing edge connection on the denoised reference image based on the target edge point set, and determining second edge information corresponding to the denoised reference image.
Optionally, in some embodiments, the calculating subunit may be specifically configured to compare first position information of the first edge straight line group with second position information of a corresponding second edge straight line group; screening out a first edge straight line group different from second position information of the second edge straight line group according to the comparison result to obtain a screened edge straight line group; and calculating the difference between the first position information of the screened edge straight line group and the second position information of the corresponding target second edge straight line group to obtain the position difference information of the display element in the display interface relative to the reference image.
Optionally, in some embodiments, the position difference information includes a coordinate difference value, and the marking subunit may be specifically configured to convert the coordinate difference value into a pixel density difference value according to a screen resolution of the display interface; marking the screened edge straight line group and the pixel density difference value on a display interface.
In a specific implementation, the above units may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and the specific implementation of the above units may refer to the foregoing method embodiments, which are not described herein again.
As can be seen from the above, in this embodiment, the display unit 301 may display a display interface and an operation control corresponding to the target design drawing, where the display interface includes at least one display element; then, in response to the first operation on the operation control, the superimposing unit 302 superimposes and displays a reference image on the display interface so that the reference image is aligned with the boundary of the display interface, where the reference image is an image obtained by adjusting the target design drawing according to the attribute information of the display interface; and the detection unit 303 detects the position of the display element in the display interface based on the reference image and displays a detection result on the display interface, where the detection result includes position difference information of the display element in the display interface relative to the reference image. Because this scheme detects the display interface and displays the detection result on the display interface, developers can perform efficient visual restoration checks at any time during development, the restoration deviation caused by the inaccuracy of the human eye during visual restoration is avoided, and the accuracy of display interface processing is improved; meanwhile, because the resolution is dynamically adapted, developers do not need to perform resolution adaptation when using various electronic devices, which greatly improves adaptation and development efficiency.
In addition, an electronic device according to an embodiment of the present application is further provided, as shown in fig. 4, which shows a schematic structural diagram of the electronic device according to an embodiment of the present application, and specifically:
the electronic device may include components such as a processor 401 of one or more processing cores, memory 402 of one or more computer-readable storage media, a power supply 403, and an input unit 404. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 4 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the processor 401 is a control center of the electronic device, connects various parts of the whole electronic device by various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby performing overall monitoring of the electronic device. Optionally, processor 401 may include one or more processing cores; preferably, the processor 401 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 401.
The memory 402 may be used to store software programs and modules, and the processor 401 executes various functional applications and data processing by operating the software programs and modules stored in the memory 402. The memory 402 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data created according to use of the electronic device, and the like. Further, the memory 402 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 402 may also include a memory controller to provide the processor 401 access to the memory 402.
The electronic device further comprises a power supply 403 for supplying power to the various components, and preferably, the power supply 403 is logically connected to the processor 401 through a power management system, so that functions of managing charging, discharging, and power consumption are realized through the power management system. The power supply 403 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
The electronic device may further include an input unit 404, and the input unit 404 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
Although not shown, the electronic device may further include a display unit and the like, which are not described in detail herein. Specifically, in this embodiment, the processor 401 in the electronic device loads the executable file corresponding to the process of one or more application programs into the memory 402 according to the following instructions, and the processor 401 runs the application program stored in the memory 402, thereby implementing various functions as follows:
displaying a display interface corresponding to the target design drawing and an operation control, wherein the display interface comprises at least one display element; then, responding to a first operation aiming at the operation control, and overlaying and displaying a reference image on the display interface to enable the reference image to be aligned with the boundary of the display interface, wherein the reference image is an image obtained by adjusting a target design drawing according to the attribute information of the display interface; and detecting the position of the display element in the display interface based on the reference image, and displaying a detection result on the display interface, wherein the detection result comprises position difference information of the display element in the display interface relative to the reference image.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
As can be seen from the above, the present embodiment may display a display interface and an operation control corresponding to the target design drawing, where the display interface includes at least one display element; then, in response to a first operation on the operation control, superimpose and display a reference image on the display interface so that the reference image is aligned with the boundary of the display interface, where the reference image is an image obtained by adjusting the target design drawing according to the attribute information of the display interface; and detect the position of the display element in the display interface based on the reference image and display a detection result on the display interface, where the detection result includes position difference information of the display element in the display interface relative to the reference image. Because this scheme detects the display interface and displays the detection result on the display interface, developers can perform efficient visual restoration checks at any time during development, the restoration deviation caused by the inaccuracy of the human eye during visual restoration is avoided, and the accuracy of display interface processing is improved; meanwhile, because the resolution is dynamically adapted, developers do not need to perform resolution adaptation when using various electronic devices, which greatly improves adaptation and development efficiency.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application further provide a storage medium, where a plurality of instructions are stored, where the instructions can be loaded by a processor to perform steps in any one of the display interface processing methods provided in the embodiments of the present application. For example, the instructions may perform the steps of:
displaying a display interface corresponding to the target design drawing and an operation control, wherein the display interface comprises at least one display element; then, responding to a first operation aiming at the operation control, and overlaying and displaying a reference image on the display interface to enable the reference image to be aligned with the boundary of the display interface, wherein the reference image is an image obtained by adjusting a target design drawing according to the attribute information of the display interface; and detecting the position of the display element in the display interface based on the reference image, and displaying a detection result on the display interface, wherein the detection result comprises position difference information of the display element in the display interface relative to the reference image.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any display interface processing method provided in the embodiments of the present application, beneficial effects that can be achieved by any display interface processing method provided in the embodiments of the present application can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The display interface processing method, the display interface processing apparatus, the electronic device, and the storage medium provided in the embodiments of the present application are described in detail above, and a specific example is applied in the present application to explain the principle and the implementation of the present application, and the description of the above embodiments is only used to help understanding the method and the core idea of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (13)

1. A display interface processing method is characterized by comprising the following steps:
displaying a display interface corresponding to the target design drawing and an operation control, wherein the display interface comprises at least one display element;
responding to a first operation aiming at the operation control, and overlaying and displaying a reference image on the display interface to enable the reference image to be aligned with the boundary of the display interface, wherein the reference image is an image obtained by adjusting a target design drawing according to the attribute information of the display interface;
and detecting the position of a display element in the display interface based on the reference image, and displaying a detection result on the display interface, wherein the detection result comprises position difference information of the display element in the display interface relative to the reference image.
2. The method of claim 1, wherein the property information of the display interface comprises a screen resolution of the display interface, and wherein the displaying a reference image on the display interface in an overlaid manner in response to the first operation on the operation control comprises:
responding to a first operation aiming at the operation control, and acquiring the screen resolution of the display interface;
based on the screen resolution of the display interface, adjusting the target design drawing by using a dynamic adaptation rule to obtain a reference image;
and superposing and displaying the reference image on the display interface.
3. The method of claim 2, wherein the adjusting the target design drawing to obtain the reference image using the dynamic adaptation rule based on the screen resolution of the display interface comprises:
based on the screen resolution of the display interface, adjusting the target design drawing by using a dynamic adaptation rule to obtain an adjusted design drawing;
if the boundaries of the adjusted design drawing and the display interface are not completely aligned, responding to the stretching operation aiming at the adjusted design drawing to generate a stretched design drawing;
when the stretched design drawing is completely aligned with the boundary of the display interface, taking the stretched design drawing as a reference image;
and if the adjusted design drawing is completely aligned with the boundary of the display interface, taking the adjusted design drawing as a reference image.
4. The method of claim 3, wherein the stretching operation comprises a touch operation and a move operation, and wherein generating the stretched design drawing in response to the stretching operation for the adjusted design drawing comprises:
responding to the touch operation aiming at the adjusted design drawing, and acquiring a first coordinate value of a touch point corresponding to the touch operation on the display interface;
responding to the moving operation of the adjusted design drawing, and acquiring a second coordinate value of a moving point corresponding to the moving operation on the display interface when the moving operation is stopped;
and stretching the adjusted design drawing based on the first coordinate value and the second coordinate value to obtain a stretched design drawing.
5. The method according to any one of claims 1 to 4, wherein the detecting the position of the display element in the display interface based on the reference image and displaying the detection result on the display interface comprises:
generating a display image according to the display interface;
respectively determining the positions of the display elements on the display image and the reference image to obtain first position information and second position information of the display elements;
calculating the difference between the first position information and the second position information of the display element to obtain the position difference information of the display element in the display interface relative to the reference image;
and marking the position difference information on the display interface.
6. The method of claim 5, wherein the determining the position of the display element on the display image and the reference image respectively to obtain first position information and second position information of the display element comprises:
respectively identifying the corresponding edge straight line group of the display element in the display image and the corresponding edge straight line group of the display element in the reference image to obtain a first edge straight line group and a second edge straight line group;
respectively determining the position information of the first edge straight line group in the display image and the position information of the second edge straight line group in the reference image to obtain first position information of the first edge straight line group and second position information of the second edge straight line group;
the calculating a difference between the first position information and the second position information of the display element to obtain position difference information of the display element in the display interface relative to the reference image includes: and calculating the difference between the first position information of the first edge straight line group and the second position information of the second edge straight line group to obtain the position difference information of the display elements in the display interface relative to the reference image.
7. The method of claim 6, wherein the identifying the corresponding set of edge lines of the display element in the display image and the corresponding set of edge lines of the display element in the reference image respectively to obtain a first set of edge lines and a second set of edge lines comprises:
respectively carrying out edge detection on the display image and the reference image to obtain first edge information corresponding to the display image and second edge information corresponding to the reference image;
performing straight line detection on the first edge information to obtain at least one first edge straight line;
forming first edge straight lines corresponding to each display element into first edge straight line groups based on edge straight line conditions preset by the display elements, wherein each first edge straight line group comprises at least one first edge straight line;
performing straight line detection on the second edge information to obtain at least one second edge straight line;
and acquiring second edge straight lines corresponding to the display elements based on preset edge straight line conditions of the display elements to form second edge straight line groups, wherein each second edge straight line group comprises at least one second edge straight line.
8. The method according to claim 7, wherein the performing edge detection on the display image and the reference image respectively to obtain first edge information corresponding to the display image and second edge information corresponding to the reference image comprises:
carrying out Gaussian filtering processing on the display image, removing the noise of the display image, and obtaining a display image after denoising;
calculating the gradient amplitude and the gradient direction of the denoised display image;
performing non-maximum suppression on the gradient amplitude value based on the gradient direction, and determining an initial edge point set of the denoised display image;
detecting the initial edge point set by using a dual-threshold algorithm, and determining a target edge point set of the denoised display image according to a detection result;
performing edge connection on the denoised display interface based on the target edge point set, and determining first edge information corresponding to the denoised display image;
performing Gaussian filtering processing on the reference image, and removing noise of the reference image to obtain a denoised reference image;
calculating the gradient amplitude and the gradient direction of the denoised reference image;
performing non-maximum suppression on the gradient amplitude value based on the gradient direction, and determining an initial edge point set of the denoised reference image;
detecting the initial edge point set of the denoised reference image by using a dual-threshold algorithm, and determining a target edge point set of the denoised reference image according to a detection result;
and performing edge connection on the denoised reference image based on the target edge point set, and determining second edge information corresponding to the denoised reference image.
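The steps recited in claim 8 (Gaussian denoising, gradient amplitude and direction, non-maximum suppression, dual-threshold detection, edge connection) describe the classical Canny edge detector. A minimal sketch, assuming OpenCV is available; cv2.Canny performs the gradient, non-maximum-suppression, dual-threshold and hysteresis (edge-connection) stages internally, so only the denoising step is written out explicitly.

```python
import cv2

def edge_information(image_bgr, low=50, high=150):
    """Return edge information for one image following the claimed steps.
    cv2.Canny internally computes gradient amplitude/direction, applies
    non-maximum suppression, the dual thresholds and hysteresis-based
    edge connection; the Gaussian denoising is done explicitly here."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    denoised = cv2.GaussianBlur(gray, (5, 5), 1.4)   # Gaussian filtering / denoising
    return cv2.Canny(denoised, low, high)            # NMS + dual threshold + edge linking

# The same routine would be applied to both the display image (screenshot)
# and the reference image to obtain the first and second edge information.
```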
9. The method according to claim 6, wherein the calculating a difference between first position information of the first edge straight line group and second position information of the second edge straight line group to obtain position difference information of a display element in the display interface relative to the reference image comprises:
comparing the first position information of the first edge straight line group with the second position information of the corresponding second edge straight line group;
screening out, according to the comparison result, a first edge straight line group whose first position information differs from the second position information of the corresponding second edge straight line group, to obtain a screened edge straight line group;
and calculating the difference between the first position information of the screened edge straight line group and the second position information of the corresponding target second edge straight line group to obtain the position difference information of the display element in the display interface relative to the reference image.
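A sketch of the comparison and screening recited in claim 9, under the same illustrative assumptions as above (line groups keyed by element name, lines paired by index); the tolerance parameter is an added assumption, since the claim does not specify how differing position information is decided.

```python
import numpy as np

def position_differences(first_groups, second_groups, tolerance=1.0):
    """Compare per-element line groups from the screenshot (first_groups)
    and the reference image (second_groups); keep only elements whose
    position differs and return their (dx, dy) coordinate differences.

    Groups are dicts mapping element name -> (N, 4) array of lines;
    pairing by element name and by line index is assumed."""
    differences = {}
    for name, first in first_groups.items():
        second = second_groups.get(name)
        if second is None or len(first) == 0 or len(second) != len(first):
            continue
        first = np.asarray(first, dtype=float)
        second = np.asarray(second, dtype=float)
        # Compare group positions via line-segment midpoints
        mid_first = (first[:, :2] + first[:, 2:]) / 2.0
        mid_second = (second[:, :2] + second[:, 2:]) / 2.0
        offset = (mid_first - mid_second).mean(axis=0)
        if np.linalg.norm(offset) > tolerance:        # screened edge line group
            differences[name] = tuple(offset)         # coordinate difference value
    return differences
```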
10. The method of claim 5, wherein the position difference information comprises a coordinate difference value, and wherein the marking the position difference information on the display interface comprises:
converting the coordinate difference value into a pixel density difference value according to the screen resolution of the display interface;
and marking the screened edge straight line group and the pixel density difference value on the display interface.
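A sketch of the conversion recited in claim 10, assuming the "pixel density difference value" is a density-independent (dp-style) value and that the Android convention of one dp per pixel at 160 dpi applies; both are interpretive assumptions.

```python
def px_to_dp(coordinate_difference_px, screen_dpi):
    """Convert a coordinate difference measured in physical pixels into a
    density-independent value, assuming the Android-style convention that
    one dp equals one pixel at 160 dpi (an assumed interpretation of the
    claimed 'pixel density difference value')."""
    return coordinate_difference_px * 160.0 / screen_dpi

# e.g. a 12 px horizontal offset on a 480 dpi screen is a 4 dp deviation
# that would be drawn next to the screened edge straight line group.
print(px_to_dp(12, 480))  # 4.0
```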
11. A display interface processing apparatus, comprising:
a display unit, configured to display a display interface corresponding to a target design drawing and an operation control, wherein the display interface comprises at least one display element;
a superposition unit, configured to superpose and display a reference image on the display interface in response to a first operation on the operation control, so that the reference image is aligned with the boundary of the display interface, the reference image being an image obtained by adjusting the target design drawing according to attribute information of the display interface;
and a detection unit, configured to detect positions of the display elements in the display interface based on the reference image and display a detection result on the display interface, wherein the detection result comprises position difference information of the display elements in the display interface relative to the reference image.
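Purely as an illustration of how the three claimed units might map onto code, the skeleton below mirrors the display, superposition and detection units; the class and method names are assumptions, not part of the disclosure.

```python
class DisplayInterfaceProcessor:
    """Illustrative skeleton mirroring the three claimed units; the method
    names and any helpers they would call are assumptions."""

    def display(self, target_design_drawing):
        """Display unit: show the interface built from the design drawing,
        together with the operation control."""
        ...

    def overlay_reference(self, display_interface, target_design_drawing):
        """Superposition unit: scale the design drawing to the interface's
        attribute information (e.g. resolution) and overlay it, aligned
        to the interface boundary."""
        ...

    def detect(self, display_image, reference_image):
        """Detection unit: locate display elements and show their position
        difference relative to the reference image on the interface."""
        ...
```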
12. A computer-readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the display interface processing method according to any one of claims 1 to 10.
13. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method according to any of claims 1 to 10 are implemented when the program is executed by the processor.
CN202110368441.7A 2021-04-06 2021-04-06 Display interface processing method and device, electronic equipment and storage medium Pending CN113703622A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110368441.7A CN113703622A (en) 2021-04-06 2021-04-06 Display interface processing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110368441.7A CN113703622A (en) 2021-04-06 2021-04-06 Display interface processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113703622A true CN113703622A (en) 2021-11-26

Family

ID=78647923

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110368441.7A Pending CN113703622A (en) 2021-04-06 2021-04-06 Display interface processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113703622A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108269311A (en) * 2017-12-25 2018-07-10 福建省华渔教育科技有限公司 Method for checking visual design effect drawings based on UI, and storage medium
CN108984399A (en) * 2018-06-29 2018-12-11 上海连尚网络科技有限公司 Method for detecting interface differences, electronic device and computer-readable medium
CN110554957A (en) * 2019-07-31 2019-12-10 北京三快在线科技有限公司 Method and device for testing user interface, electronic device and readable storage medium
CN111443978A (en) * 2020-04-17 2020-07-24 贝壳技术有限公司 User interface adjusting method and device, storage medium and electronic equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108269311A (en) * 2017-12-25 2018-07-10 福建省华渔教育科技有限公司 Method for checking visual design effect drawings based on UI, and storage medium
CN108984399A (en) * 2018-06-29 2018-12-11 上海连尚网络科技有限公司 Method for detecting interface differences, electronic device and computer-readable medium
CN110554957A (en) * 2019-07-31 2019-12-10 北京三快在线科技有限公司 Method and device for testing user interface, electronic device and readable storage medium
CN111443978A (en) * 2020-04-17 2020-07-24 贝壳技术有限公司 User interface adjusting method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN110610453B (en) Image processing method and device and computer readable storage medium
EP3058512B1 (en) Organizing digital notes on a user interface
US10452920B2 (en) Systems and methods for generating a summary storyboard from a plurality of image frames
CN104375797B (en) Information processing method and electronic equipment
JP6264293B2 (en) Display control apparatus, display control method, and program
US10013408B2 (en) Information processing apparatus, information processing method, and computer readable medium
EP3163423B1 (en) Method and device for setting background of ui control
CN107608668B (en) Method and device for making and compatibly displaying H5 page, terminal equipment and storage medium
CN106648319A (en) Operation method and apparatus used for mind map
CN110865753B (en) Application message notification method and device
CN111179159B (en) Method and device for eliminating target image in video, electronic equipment and storage medium
CN104574454A (en) Image processing method and device
CN109951635A Photographing processing method, device, mobile terminal and storage medium
CN109815854B (en) Method and device for presenting associated information of icon on user equipment
CN108762740A Method and device for generating page data, and electronic device
CN112752158A (en) Video display method and device, electronic equipment and storage medium
CN107817935A Display method and device for application interface, terminal and computer-readable recording medium
CN111598996B (en) Article 3D model display method and system based on AR technology
CN113516697B (en) Image registration method, device, electronic equipment and computer readable storage medium
US20130169660A1 (en) Image editing system and method
CN103645937B Data processing method and electronic device
CN109766530B (en) Method and device for generating chart frame, storage medium and electronic equipment
DE102017102024A1 (en) INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD
JP2010028429A (en) Image processing apparatus, image processing method, and program
CN113703622A (en) Display interface processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination