CN116974681A - View processing method, device, electronic equipment, medium and product - Google Patents


Info

Publication number
CN116974681A
CN116974681A
Authority
CN
China
Prior art keywords
view
processed
interaction
information
normalization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310982335.7A
Other languages
Chinese (zh)
Inventor
涂勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Photosynthetic Signal Technology Co ltd
Original Assignee
Chengdu Photosynthetic Signal Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Photosynthetic Signal Technology Co ltd filed Critical Chengdu Photosynthetic Signal Technology Co ltd
Priority to CN202310982335.7A priority Critical patent/CN116974681A/en
Publication of CN116974681A publication Critical patent/CN116974681A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/02Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present disclosure disclose a view processing method and apparatus, an electronic device, a storage medium, and a program product. The method includes the following steps: acquiring a view to be processed and determining an interaction detection view corresponding to the view to be processed; determining, based on the interaction detection view, interaction information corresponding to the view to be processed; and normalizing the interaction information to obtain a normalization result, and sending the normalization result to a server so that the server obtains a processing result corresponding to the view to be processed. The technical solution of the embodiments of the present disclosure addresses the problem of low view processing efficiency and improves view processing efficiency.

Description

View processing method, device, electronic equipment, medium and product
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular to a view processing method and apparatus, an electronic device, a medium, and a program product.
Background
When the pages or interactive interface of an application are optimized and adjusted, the adjustments generally have to be made iteratively on top of an initial page of the application before a final view processing scheme can be determined. The view pages may therefore be adjusted repeatedly, which makes view processing inefficient and hinders the optimization of the application.
Disclosure of Invention
Embodiments of the present disclosure provide a view processing method and apparatus, an electronic device, a medium, and a program product, which can reduce repeated view adjustments during view optimization and improve view processing efficiency.
In a first aspect, an embodiment of the present disclosure provides a view processing method, including:
acquiring a view to be processed, and determining an interaction detection view corresponding to the view to be processed;
determining interaction information corresponding to the view to be processed based on the interaction detection view;
and carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server so that the server obtains a processing result corresponding to the view to be processed.
In a second aspect, embodiments of the present disclosure further provide a view processing apparatus, including:
the view creation module is used for acquiring a view to be processed and determining an interaction detection view corresponding to the view to be processed;
the interaction information acquisition module is used for determining interaction information corresponding to the view to be processed based on the interaction detection view;
and the view processing analysis module is used for carrying out normalization processing on the interaction information to determine a normalization processing result and sending the normalization processing result to a server so that the server can obtain a processing result corresponding to the view to be processed.
In a third aspect, embodiments of the present disclosure further provide an electronic device, including:
one or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the view processing method as described in any of the embodiments of the present disclosure.
In a fourth aspect, embodiments of the present disclosure further provide a storage medium containing computer-executable instructions which, when executed by a computer processor, are used to perform the view processing method described in any of the embodiments of the present disclosure.
In a fifth aspect, embodiments of the present disclosure further provide a computer program product comprising a computer program which, when executed by a processor, implements the view processing method described in any of the embodiments of the present disclosure.
According to the embodiment of the disclosure, the view to be processed is obtained in the view processing process, and a corresponding interaction detection view is determined for the view to be processed; further, based on the interaction detection view, interaction information corresponding to the view to be processed is determined; and carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server so that the server obtains a processing result corresponding to the view to be processed. The technical scheme of the embodiment of the disclosure solves the problem of low view processing efficiency, can reduce repeated view adjustment process in view optimization process, and improves view processing efficiency.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a schematic flow diagram of a view processing method according to an embodiment of the disclosure;
FIG. 2 is a schematic flow diagram of a further view processing method according to an embodiment of the disclosure;
FIG. 3 is a schematic diagram of interactive coordinate information for an interactive operation according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of an interactive coordinate information normalization process for interactive operations according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a coordinate point marking provided by an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a view processing device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and completely. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are open-ended, i.e., "including, but not limited to". The term "based on" means "based at least in part on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "one" and "a plurality" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
It will be appreciated that, before the technical solutions disclosed in the embodiments of the present disclosure are used, the user should be informed, in an appropriate manner and in accordance with relevant laws and regulations, of the type, scope of use, and usage scenarios of the personal information involved in the present disclosure, and the user's authorization should be obtained.
For example, in response to receiving an active request from a user, prompt information is sent to the user to explicitly inform the user that the operation the user has requested will require acquiring and using the user's personal information. Based on the prompt information, the user can then autonomously choose whether to provide personal information to the software or hardware, such as an electronic device, application, server, or storage medium, that performs the operations of the technical solution of the present disclosure.
As an optional but non-limiting implementation, in response to receiving an active request from the user, the prompt information may be sent to the user by way of, for example, a popup window, in which the prompt information may be presented as text. In addition, the popup window may carry a selection control allowing the user to choose "agree" or "disagree" to providing personal information to the electronic device.
It will be appreciated that the above-described notification and user authorization process is merely illustrative and not limiting of the implementations of the present disclosure, and that other ways of satisfying relevant legal regulations may be applied to the implementations of the present disclosure.
It will be appreciated that the data (including but not limited to the data itself, the acquisition or use of the data) involved in the present technical solution should comply with the corresponding legal regulations and the requirements of the relevant regulations.
Fig. 1 is a schematic flow chart of a view processing method provided by an embodiment of the present disclosure, where the embodiment of the present disclosure is applicable to a scenario of designing and optimizing an interactive interface, and the method may be performed by a view processing apparatus, where the apparatus may be implemented in a form of software and/or hardware, and optionally, may be implemented by an electronic device, where the electronic device may be a mobile terminal, a PC side, a server, or the like.
As shown in fig. 1, the view processing method includes:
s110, acquiring a view to be processed, and determining an interaction detection view corresponding to the view to be processed.
The view to be processed may be an interface view in any computer program product or application, for example an interface view of a user interaction interface or of a graphic display interface. The view to be processed may also be the view of any functional area in an application interface, i.e., a part of a presentation interface. The view to be processed is a view whose content, region size, or other presentation parameters require adjustment and optimization.
The view to be processed may be obtained via an instruction, input by the user, that specifies the view to be processed. The view processing apparatus may acquire the view name or identification information of the view to be processed input by the user, together with an instruction to perform view processing, from which the view to be processed can be determined. One view to be processed may be determined at a time, or several views to be processed may be determined so that each of them is analyzed and processed separately.
After the view to be processed is acquired, a corresponding interaction detection view is created for it. The interaction detection view is a transparent view (View) of the same size as the corresponding view to be processed. The interaction detection view is only used to determine interactions in the view to be processed and does not affect the business execution logic corresponding to normal interactions.
In a possible implementation, the interaction detection view is created by first acquiring the size parameters of the view to be processed, then adding an interaction detection view matching those size parameters via the view adding component, and attaching a corresponding view tag to the interaction detection view, finally obtaining a view of the same size as the corresponding view to be processed.
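The creation step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the `ViewRect` model and the `createDetectionView` name are hypothetical stand-ins for a real platform view system (e.g. DOM elements or Android Views).

```typescript
// Minimal view model; a real implementation would use platform views
// (e.g. DOM elements or Android Views) rather than plain objects.
interface ViewRect {
  x: number;      // origin of the view, relative to the screen
  y: number;
  width: number;
  height: number;
}

interface DetectionView extends ViewRect {
  tag: string;       // view tag identifying this as an interaction detection view
  transparent: true; // fully transparent, so the underlying view stays visible
}

// Create a transparent interaction detection view matching the size
// (and position) of the view to be processed, and attach a view tag.
function createDetectionView(target: ViewRect, tag: string): DetectionView {
  return {
    x: target.x,
    y: target.y,
    width: target.width,
    height: target.height,
    tag,
    transparent: true,
  };
}
```

Matching the position as well as the size keeps the two views coincident when the detection view is later overlaid on the view to be processed.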
S120, determining interaction information corresponding to the view to be processed based on the interaction detection view.
In this embodiment, the interaction information may be information about an interaction operation, where the interaction may be triggered by clicking, long pressing, sliding, or other means, and the interaction information may specifically be the position coordinate information corresponding to the interaction. The interaction information corresponding to the view to be processed may be the interaction information received by that view.
Specifically, to obtain the interaction information of interactions on the view to be processed through the interaction detection view, the interaction detection view is first overlaid on top of the view to be processed, so that the two coincide. The interaction information may be first coordinate information of the interaction in a first coordinate system and/or second coordinate information in a second coordinate system, where the first coordinate system is the coordinate system corresponding to the view to be processed, and the second coordinate system is the coordinate system corresponding to the device associated with the view to be processed.
It may be understood that, in this embodiment, the interaction detection view is only used to determine interactions on the view to be processed. So as not to affect the normal business logic, after the interaction information is normalized, event termination information corresponding to the normalized interaction information needs to be returned: once the interaction information at the interaction detection view layer has been obtained, the task logic ends and the return value is false, ending the task execution logic for that interaction. Returning false acts as a terminator: it prevents business logic from being triggered by an interaction performed at the interaction detection view layer, cancels event bubbling, and stops callback execution so that the handler returns immediately. The corresponding business logic is instead executed at the layer of the view to be processed.
S130, carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server so that the server obtains a processing result corresponding to the view to be processed.
The interaction information may be first coordinate information of the interaction in a first coordinate system and/or second coordinate information in a second coordinate system. The first coordinate system is the coordinate system corresponding to the view to be processed, and the second coordinate system is the coordinate system corresponding to the device associated with the view to be processed. The interaction information is normalized because the view to be processed is displayed at different sizes on different device screens. Moreover, the same coordinate value may not denote the same point on different terminal screens. It is therefore necessary to scale the interaction information, by normalization, relative to the width and height of the screen and the width and height of the view to be processed, so that the computed normalization result can adapt to various screens and the view processing method of this embodiment is generally applicable.
Further, after the interaction information has been normalized, the normalization result is sent to the target server. The target server can then obtain a view processing result for the view to be processed from the normalization results of the interaction information on that view uploaded by each client.
According to the technical scheme, the view to be processed is obtained in the view processing process, and a corresponding interaction detection view is determined for the view to be processed; further, based on the interaction detection view, interaction information corresponding to the view to be processed is determined; and carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server, so that the server obtains a processing result corresponding to the view to be processed, and determining the processing result corresponding to the view to be processed, which has universality to different types of terminal equipment screens, through the normalization processing result. The technical scheme of the embodiment of the disclosure solves the problem of low view processing efficiency, can reduce repeated view adjustment process in view optimization process, and improves view processing efficiency.
Fig. 2 is a schematic flow chart of a further view processing method according to an embodiment of the present disclosure, and further illustrates a process of interactive information normalization processing in a process of implementing the view processing method flow on the basis of the above embodiment. The method may be performed by view processing means, which may be implemented in the form of software and/or hardware, optionally by an electronic device, which may be a mobile terminal, a PC-side or a server, etc.
As shown in fig. 2, the view processing method includes:
s210, acquiring a view to be processed, and determining an interaction detection view corresponding to the view to be processed.
S220, covering the interaction detection view on the upper layer of the view to be processed.
S230, determining interaction coordinate information for interaction of the view to be processed based on the interaction detection view.
The interaction coordinate information may include first coordinate information of the interaction in a first coordinate system and/or second coordinate information in a second coordinate system. The first coordinate system is the coordinate system corresponding to the view to be processed, and the second coordinate system is the coordinate system corresponding to the screen of the device associated with the view to be processed. By way of example, fig. 3 is a schematic diagram of the screen interface of a terminal device, where the larger outer box represents the screen and the relatively smaller box within the screen interface is the view (View), in which the interaction detection view overlaps the view to be processed.
When an interaction is triggered on the view interface, the first coordinate information of the interaction in the first coordinate system corresponding to the view to be processed and the second coordinate information in the second coordinate system corresponding to the terminal interface displaying the view can both be determined. The coordinates of the interaction point relative to the top-left corner (0, 0) of the view are the first coordinate information in the first coordinate system, and the coordinates of the interaction point relative to the top-left corner (0, 0) of the screen are the second coordinate information in the second coordinate system.
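Deriving the two kinds of coordinate information from a single interaction point can be sketched as follows, under the assumption that raw points arrive in screen coordinates; the helper names are illustrative, not from the patent.

```typescript
interface Point { x: number; y: number; }
interface Rect { x: number; y: number; width: number; height: number; }

// First coordinate information: the interaction point expressed relative to
// the top-left corner (0, 0) of the view to be processed.
function toViewCoords(screenPoint: Point, view: Rect): Point {
  return { x: screenPoint.x - view.x, y: screenPoint.y - view.y };
}

// Second coordinate information: the interaction point relative to the
// top-left corner (0, 0) of the device screen. Raw points are assumed to
// already arrive in screen coordinates, so this is the identity here.
function toScreenCoords(screenPoint: Point): Point {
  return { x: screenPoint.x, y: screenPoint.y };
}
```

For a view whose top-left corner sits at (50, 100) on the screen, a tap at screen point (150, 250) yields first coordinate information (100, 150) and second coordinate information (150, 250).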
S240, carrying out normalization processing based on the first coordinate information and the first size information of the view to be processed, and determining a first normalized coordinate; and carrying out normalization processing based on the second coordinate information and second size information of a screen of the equipment, and determining second normalized coordinates.
In this step, the interaction coordinate information of the interaction is normalized in the two coordinate systems respectively, based on the first size information of the view to be processed and the second size information of the terminal interface. The first normalized coordinates may be obtained by dividing each first coordinate value by the corresponding width or height of the view to be processed. The second normalized coordinates may be obtained by dividing each second coordinate value by the corresponding width or height of the terminal screen interface.
The process of calculating the second normalized coordinates is illustrated in fig. 4. In fig. 4, the interface coordinates of the terminal device corresponding to the interaction are (100, 100), the screen width of the terminal device is 720 and its height is 1080, so the normalization is (100/720, 100/1080), and the final normalized position coordinates are (0.1389, 0.0926).
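The normalization step, checked against the worked numbers above, is a one-line computation; the `normalize` helper is a sketch, not the patent's code.

```typescript
// Normalize a coordinate pair by dividing each component by the
// corresponding screen (or view) dimension, as in the description.
function normalize(x: number, y: number, width: number, height: number): [number, number] {
  return [x / width, y / height];
}

// Worked example from the description: an interaction at (100, 100) on a
// screen of width 720 and height 1080 gives roughly (0.1389, 0.0926).
const [nx, ny] = normalize(100, 100, 720, 1080);
```

The same function applies to the first normalized coordinates by passing the view's width and height instead of the screen's.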
S250, the first normalized coordinates and the second normalized coordinates are sent to a server, so that the server obtains a processing result corresponding to the view to be processed according to the first normalized coordinates and the second normalized coordinates.
Specifically, the first normalized coordinates and the second normalized coordinates may be sent to the target server so that the target server constructs an interaction information set based on the normalization results. The server may receive multiple normalization results, which may be normalization results of interaction information sent by different devices, and integrate them into one interaction information set. The normalization results in the interaction information set are then marked in the corresponding target normalized coordinate system, as shown in fig. 5, which is a graph of coordinate points marked in a normalized coordinate system, so as to determine the processing result corresponding to the view to be processed.
Because the coordinates received by the target server have already been normalized, the view-relative normalization results can be marked in the normalized coordinate system corresponding to the view to be processed, and the device-relative normalization results in the normalized coordinate system corresponding to the associated device. In addition, the view-relative and device-relative normalization results can be combined in a further normalization calculation to obtain the normalized coordinates of the interaction within a standard-size view region of a standard-size device interface, from which the processing result corresponding to the view to be processed is determined. Furthermore, a final view adjustment scheme can be determined for the view to be processed directly from the distribution of the interaction coordinate information within it, and the view processed to obtain a further view processing result, avoiding repeated adjustment and interaction analysis of the view adjustment scheme during view processing.
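The server-side aggregation can be sketched as follows. The grid bucketing used to inspect the mark-point distribution is an assumed concrete choice (the patent does not specify how the distribution is evaluated), and `markDistribution` is a hypothetical name.

```typescript
interface NormalizedPoint { nx: number; ny: number; } // both components in [0, 1]

// Server-side sketch: integrate normalized results uploaded by different
// clients into one interaction information set, then bucket them into a
// coarse grid so the distribution of marked points can be inspected.
function markDistribution(points: NormalizedPoint[], gridSize = 10): Map<string, number> {
  const cells = new Map<string, number>();
  for (const p of points) {
    // Clamp to the last cell so a point exactly at 1.0 stays on the grid.
    const col = Math.min(gridSize - 1, Math.floor(p.nx * gridSize));
    const row = Math.min(gridSize - 1, Math.floor(p.ny * gridSize));
    const key = `${col},${row}`;
    cells.set(key, (cells.get(key) ?? 0) + 1);
  }
  return cells;
}
```

Because every client uploads coordinates already normalized to [0, 1], points from screens of different sizes land in the same grid, which is what makes the aggregated distribution meaningful across device types.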
According to the technical scheme, the view to be processed is obtained in the view processing process, and a corresponding interaction detection view is determined for the view to be processed; further, based on the interaction detection view, interaction information corresponding to the view to be processed is determined; and carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server, so that the server obtains a processing result corresponding to the view to be processed, and determining the processing result corresponding to the view to be processed, which has universality to different types of terminal equipment screens, through the normalization processing result. The technical scheme of the embodiment of the disclosure solves the problem of low view processing efficiency, can reduce repeated view adjustment process in view optimization process, and improves view processing efficiency.
Fig. 6 is a view processing apparatus provided by the embodiment of the present disclosure, where the apparatus is suitable for a scenario of designing and optimizing an interactive interface, and the view processing apparatus may be implemented in a form of software and/or hardware, and may be configured in an electronic device, where the electronic device may be a mobile terminal, a PC side, a server, or the like.
As shown in fig. 6, the view processing apparatus includes: a view creation module 310, an interaction information acquisition module 320, and a view processing analysis module 330.
The view creation module 310 is configured to obtain a view to be processed, and determine an interaction detection view corresponding to the view to be processed; an interaction information obtaining module 320, configured to determine interaction information corresponding to the view to be processed based on the interaction detection view; and the view processing analysis module 330 is configured to perform normalization processing on the interaction information to determine a normalization processing result, and send the normalization processing result to a server, so that the server obtains a processing result corresponding to the view to be processed.
According to the technical scheme, the view to be processed is obtained in the view processing process, and a corresponding interaction detection view is determined for the view to be processed; further, based on the interaction detection view, interaction information corresponding to the view to be processed is determined; and carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server so that the server obtains a processing result corresponding to the view to be processed. The technical scheme of the embodiment of the disclosure solves the problem of low view processing efficiency, can reduce repeated view adjustment process in view optimization process, and improves view processing efficiency.
In an alternative embodiment, the interaction information obtaining module 320 is specifically configured to:
covering the interaction detection view on the upper layer of the view to be processed;
and determining interaction coordinate information corresponding to the view to be processed based on the interaction detection view.
In an alternative embodiment, the interaction coordinate information includes:
first coordinate information of the interaction in a first coordinate system and/or second coordinate information in a second coordinate system; the first coordinate system is the coordinate system corresponding to the view to be processed, and the second coordinate system is the coordinate system corresponding to the device associated with the view to be processed.
In an alternative embodiment, the view processing analysis module 330 is specifically configured to:
performing normalization processing based on the first coordinate information and the first size information of the view to be processed, and determining a first normalized coordinate;
and carrying out normalization processing based on the second coordinate information and second size information of a screen of the equipment, and determining second normalized coordinates.
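A minimal sketch of this normalization step, assuming the common convention of dividing each coordinate by the corresponding dimension so that results fall in [0, 1] (the sizes and points used are hypothetical):

```python
def normalize(point, size):
    """Map a coordinate pair into [0, 1] x [0, 1] by dividing by the enclosing dimensions."""
    (x, y), (width, height) = point, size
    return (x / width, y / height)

# First normalized coordinate: view coordinates over the view's own size.
first = normalize((160, 240), (320, 480))    # -> (0.5, 0.5)
# Second normalized coordinate: device coordinates over the screen size.
second = normalize((200, 340), (400, 680))   # -> (0.5, 0.5)
print(first, second)
```

Normalizing against both the view size and the screen size lets the server compare interactions across devices with different resolutions and across views of different dimensions.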
In an alternative embodiment, the view processing analysis module 330 may be further configured to:
sending the normalization processing result to a target server side, so that the target server side builds an interaction information set based on the normalization processing result, marks the normalization processing results in the interaction information set in a corresponding target normalized coordinate system, and determines the processing result corresponding to the view to be processed according to the distribution of marked points in the target normalized coordinate system.
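The server-side behavior described above can be sketched as follows: normalized results are collected into an interaction information set, marked as points in a target normalized coordinate system, and the distribution of marked points is then inspected — here by bucketing the unit square into a grid and returning the most-touched cell. The grid-histogram analysis is an illustrative assumption; the disclosure does not specify how the distribution is evaluated.

```python
from collections import Counter

class InteractionAggregator:
    """Server-side sketch: collect normalized points and inspect their distribution."""

    def __init__(self, grid: int = 10):
        self.grid = grid
        self.points = []              # the interaction information set

    def mark(self, nx: float, ny: float) -> None:
        self.points.append((nx, ny))  # mark in the target normalized coordinate system

    def hottest_cell(self):
        # Bucket [0,1]^2 into grid x grid cells and return the most-touched one.
        cells = Counter(
            (min(int(nx * self.grid), self.grid - 1),
             min(int(ny * self.grid), self.grid - 1))
            for nx, ny in self.points
        )
        return cells.most_common(1)[0]

agg = InteractionAggregator()
for p in [(0.52, 0.48), (0.55, 0.45), (0.51, 0.49), (0.1, 0.9)]:
    agg.mark(*p)
print(agg.hottest_cell())   # ((5, 4), 3)
```

Clusters of marked points reveal which regions of the view users actually interact with, which is the signal used to derive the processing result for the view.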
In an alternative embodiment, the view processing analysis module 330 is specifically configured to:
the normalization processing result is sent to a target server side, so that the target server side marks the normalization result of the normalization processing result relative to the view to be processed in a normalization coordinate system corresponding to the view to be processed; marking the normalization result relative to the equipment in the normalization processing result in a normalization coordinate system corresponding to the equipment.
The view processing device provided by the embodiment of the disclosure can execute the view processing method provided by any embodiment of the disclosure, and has the corresponding functional modules and beneficial effects of the execution method.
It should be noted that the units and modules included in the above apparatus are divided only according to functional logic; other divisions are possible as long as the corresponding functions can be implemented. In addition, the specific names of the functional units are only intended to distinguish them from one another and do not limit the protection scope of the embodiments of the present disclosure.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. Referring now to fig. 7, a schematic diagram of an electronic device (e.g., a terminal device or server in fig. 7) 400 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 7 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 7, the electronic device 400 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 401, which may perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 402 or a program loaded from a storage means 408 into a Random Access Memory (RAM) 403. The RAM 403 also stores various programs and data necessary for the operation of the electronic device 400. The processing means 401, the ROM 402, and the RAM 403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
In general, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 408 including, for example, magnetic tape, hard disk, etc.; and a communication device 409. The communication means 409 may allow the electronic device 400 to communicate with other devices wirelessly or by wire to exchange data. While fig. 7 shows an electronic device 400 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via communications device 409, or from storage 408, or from ROM 402. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 401.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The electronic device provided by the embodiment of the present disclosure and the view processing method provided by the foregoing embodiment belong to the same inventive concept, and technical details not described in detail in the present embodiment may be referred to the foregoing embodiment, and the present embodiment has the same beneficial effects as the foregoing embodiment.
The embodiment of the present disclosure also provides a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the view processing method provided by the above embodiment.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to:
acquiring a view to be processed, and determining an interaction detection view corresponding to the view to be processed;
determining interaction information corresponding to the view to be processed based on the interaction detection view;
and carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server so that the server obtains a processing result corresponding to the view to be processed.
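Putting the three steps together, a hedged end-to-end sketch of the client-side flow — capture, normalize, and build the payload sent to the server — might look like the following; the frame/screen dictionaries and the JSON payload format are assumptions for illustration only:

```python
import json

def process_view(frame, screen, touches):
    """End-to-end sketch: capture -> normalize -> payload for the server."""
    results = []
    for sx, sy in touches:
        # Translate a screen touch into the view's coordinate system.
        vx, vy = sx - frame["x"], sy - frame["y"]
        results.append({
            # Normalized against the view's own size (first coordinate system).
            "view_norm":   (vx / frame["w"], vy / frame["h"]),
            # Normalized against the screen size (second coordinate system).
            "device_norm": (sx / screen["w"], sy / screen["h"]),
        })
    return json.dumps(results)   # body of the request sent to the server

payload = process_view({"x": 40, "y": 100, "w": 320, "h": 480},
                       {"w": 400, "h": 680},
                       [(200, 340)])
print(payload)
```

The server can deserialize this payload directly into its interaction information set, since every coordinate already lies in the unit square regardless of the originating device.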
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The name of the unit does not in any way constitute a limitation of the unit itself, for example the first acquisition unit may also be described as "unit acquiring at least two internet protocol addresses".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The disclosed embodiments also provide a computer program product comprising a computer program which, when executed by a processor, implements a view processing method as provided by any of the embodiments of the disclosure.
In an implementation of the computer program product, computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
According to one or more embodiments of the present disclosure, there is provided a view processing method [ example one ], the method comprising:
acquiring a view to be processed, and determining an interaction detection view corresponding to the view to be processed;
determining interaction information corresponding to the view to be processed based on the interaction detection view;
and carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server so that the server obtains a processing result corresponding to the view to be processed.
According to one or more embodiments of the present disclosure, there is provided a view processing method [ example two ] further comprising:
in some optional implementations, determining, based on the interaction detection view, interaction information corresponding to the view to be processed includes:
overlaying the interaction detection view on top of the view to be processed;
and determining interaction coordinate information corresponding to the view to be processed based on the interaction detection view.
According to one or more embodiments of the present disclosure, there is provided a view processing method [ example three ], the method comprising:
in some alternative implementations, the interaction coordinate information includes:
The interaction coordinate information includes first coordinate information in a first coordinate system and/or second coordinate information in a second coordinate system; the first coordinate system is a coordinate system corresponding to the view to be processed, and the second coordinate system is a coordinate system corresponding to the device associated with the view to be processed.
According to one or more embodiments of the present disclosure, there is provided a view processing method [ example four ], further comprising:
in some optional implementations, the determining the normalization result by performing normalization on the interaction information includes:
performing normalization processing based on the first coordinate information and the first size information of the view to be processed, and determining a first normalized coordinate;
and carrying out normalization processing based on the second coordinate information and second size information of a screen of the equipment, and determining second normalized coordinates.
According to one or more embodiments of the present disclosure, there is provided a view processing method [ example five ], further comprising:
in some optional implementations, sending the normalized processing result to a server, so that the server obtains a processing result corresponding to the view to be processed, where the processing result includes:
Sending the normalization processing result to a target server side, so that the target server side builds an interaction information set based on the normalization processing result, marks the normalization processing results in the interaction information set in a corresponding target normalized coordinate system, and determines the processing result corresponding to the view to be processed according to the distribution of marked points in the target normalized coordinate system.
According to one or more embodiments of the present disclosure, there is provided a view processing method [ example six ], further comprising:
in some optional implementations, marking the normalized processing results in the interaction information set in a corresponding target normalized coordinate system includes:
marking the normalization result relative to the view to be processed in the normalization processing result in a normalization coordinate system corresponding to the view to be processed;
marking the normalization result relative to the equipment in the normalization processing result in a normalization coordinate system corresponding to the equipment.
According to one or more embodiments of the present disclosure, there is provided a view processing apparatus [ example seven ], comprising:
the view creation module is used for acquiring a view to be processed and determining an interaction detection view corresponding to the view to be processed;
The interaction information acquisition module is used for determining interaction information corresponding to the view to be processed based on the interaction detection view;
and the view processing analysis module is used for carrying out normalization processing on the interaction information to determine a normalization processing result and sending the normalization processing result to a server so that the server can obtain a processing result corresponding to the view to be processed.
According to one or more embodiments of the present disclosure, there is provided a view processing apparatus [ example eight ], further comprising:
in an optional implementation manner, the interaction information acquisition module is specifically configured to:
overlaying the interaction detection view on top of the view to be processed;
and determining interaction coordinate information corresponding to the view to be processed based on the interaction detection view.
According to one or more embodiments of the present disclosure, there is provided a view processing apparatus [ example nine ], further comprising:
in an alternative embodiment, the interactive coordinate information includes:
the interaction coordinate information includes first coordinate information in a first coordinate system and/or second coordinate information in a second coordinate system; the first coordinate system is a coordinate system corresponding to the view to be processed, and the second coordinate system is a coordinate system corresponding to the device associated with the view to be processed.
According to one or more embodiments of the present disclosure, there is provided a view processing apparatus [ example ten ], further comprising:
in an alternative embodiment, the view processing analysis module is specifically configured to:
performing normalization processing based on the first coordinate information and the first size information of the view to be processed, and determining a first normalized coordinate;
and carrying out normalization processing based on the second coordinate information and second size information of a screen of the equipment, and determining second normalized coordinates.
According to one or more embodiments of the present disclosure, there is provided a view processing apparatus [ example eleven ], further comprising:
in an alternative embodiment, the view processing analysis module may be further configured to:
sending the normalization processing result to a target server side, so that the target server side builds an interaction information set based on the normalization processing result, marks the normalization processing results in the interaction information set in a corresponding target normalized coordinate system, and determines the processing result corresponding to the view to be processed according to the distribution of marked points in the target normalized coordinate system.
According to one or more embodiments of the present disclosure, there is provided a view processing apparatus [ example twelve ], further comprising:
In an alternative embodiment, the view processing analysis module is specifically configured to:
marking the normalization result relative to the view to be processed in the normalization processing result in a normalization coordinate system corresponding to the view to be processed;
marking the normalization result relative to the equipment in the normalization processing result in a normalization coordinate system corresponding to the equipment.
The foregoing description presents only the preferred embodiments of the present disclosure and an explanation of the technical principles employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to technical solutions formed by the specific combinations of features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure — for example, a solution in which the above features are replaced with (but not limited to) technical features with similar functions disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (10)

1. A method of view processing, comprising:
acquiring a view to be processed, and determining an interaction detection view corresponding to the view to be processed;
determining interaction information corresponding to the view to be processed based on the interaction detection view;
and carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server so that the server obtains a processing result corresponding to the view to be processed.
2. The method of claim 1, wherein determining interaction information corresponding to the view to be processed based on the interaction detection view comprises:
overlaying the interaction detection view on top of the view to be processed;
and determining interaction coordinate information corresponding to the view to be processed based on the interaction detection view.
3. The method of claim 2, wherein the interaction coordinate information comprises:
first coordinate information in a first coordinate system, and/or second coordinate information in a second coordinate system; the first coordinate system is a coordinate system corresponding to the view to be processed, and the second coordinate system is a coordinate system corresponding to equipment associated with the view to be processed.
4. A method according to claim 3, wherein said normalizing the interaction information to determine a normalized result comprises:
normalizing the first coordinate information based on the first coordinate information and the first size information of the view to be processed, and determining a first normalized coordinate;
and carrying out normalization processing on the second coordinate information based on the second coordinate information and second size information of the screen of the equipment, and determining a second normalized coordinate.
5. The method according to claim 3 or 4, wherein sending the normalized processing result to a server, so that the server obtains a processing result corresponding to the view to be processed, includes:
and sending the normalization processing result to a target server side so that the target server side builds an interaction information set based on the normalization processing result, and marking the normalization processing result in the interaction information set in a corresponding target normalization coordinate system, thereby determining the processing result corresponding to the view to be processed.
6. The method of claim 5, wherein labeling normalized processing results in the set of interaction information in a corresponding target normalized coordinate system comprises:
marking the normalization result relative to the view to be processed in the normalization processing result in a normalization coordinate system corresponding to the view to be processed;
marking the normalization result relative to the equipment in the normalization processing result in a normalization coordinate system corresponding to the equipment.
7. A view processing apparatus, comprising:
the view creation module is used for acquiring a view to be processed and determining an interaction detection view corresponding to the view to be processed;
the interaction information acquisition module is used for determining interaction information corresponding to the view to be processed based on the interaction detection view;
and the view processing analysis module is used for carrying out normalization processing on the interaction information to determine a normalization processing result, and sending the normalization processing result to a server so that the server can obtain a processing result corresponding to the view to be processed.
8. An electronic device, the electronic device comprising:
One or more processors;
storage means for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the view processing method according to any one of claims 1-6.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the view processing method according to any of claims 1-6.
10. A computer program product comprising a computer program which, when executed by a processor, implements the view processing method according to any of claims 1-6.
CN202310982335.7A 2023-08-04 2023-08-04 View processing method, device, electronic equipment, medium and product Pending CN116974681A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310982335.7A CN116974681A (en) 2023-08-04 2023-08-04 View processing method, device, electronic equipment, medium and product

Publications (1)

Publication Number Publication Date
CN116974681A true CN116974681A (en) 2023-10-31

Family

ID=88482960

Country Status (1)

Country Link
CN (1) CN116974681A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination