CN116466905A - OpenHarmony-based window split-screen operation interaction method and device - Google Patents

OpenHarmony-based window split-screen operation interaction method and device

Info

Publication number
CN116466905A
Authority
CN
China
Prior art keywords
split
screen
split screen
parameters
generating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310603541.2A
Other languages
Chinese (zh)
Inventor
戴海清
何举刚
李煊
李振华
贾誓言
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Ordnance Equipment Group Ordnance Equipment Research Institute
Original Assignee
China Ordnance Equipment Group Ordnance Equipment Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Ordnance Equipment Group Ordnance Equipment Research Institute filed Critical China Ordnance Equipment Group Ordnance Equipment Research Institute
Priority to CN202310603541.2A
Publication of CN116466905A
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an OpenHarmony-based window split-screen operation interaction method and device. The method comprises the following steps: acquiring window information and a split-screen operation signal; extracting split-screen parameters from the window information; generating a split-screen execution strategy according to the split-screen operation signal and the split-screen parameters; and executing the split-screen operation according to the split-screen execution strategy to obtain a split-screen display result. The invention solves the technical problems that, in existing HarmonyOS application interaction methods, the window split-screen operation simply divides the screen equally according to window values, the degree of intelligence is low, and split-screen operations cannot be performed reasonably and flexibly according to the actual application running environment and user requirements.

Description

OpenHarmony-based window split-screen operation interaction method and device
Technical Field
The invention relates to the field of human-computer interface interaction, and in particular to an OpenHarmony-based window split-screen operation interaction method and device.
Background
With the continuous development of intelligent technology, intelligent devices are increasingly used in people's daily life, work and study, and intelligent technical means improve people's quality of life and increase their learning and working efficiency.
At present, split-screen operation of application windows based on the open-source HarmonyOS (OpenHarmony) system usually merges or jointly displays multiple windows, or performs split-screen division according to the different content display areas of individual windows. However, in existing HarmonyOS application interaction methods, the window split-screen operation simply divides the screen equally according to window values, so the degree of intelligence is low and split-screen operations cannot be performed reasonably and flexibly according to the actual application running environment and user requirements.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiment of the invention provides an OpenHarmony-based window split-screen operation interaction method and device, which at least solve the technical problems that, in existing HarmonyOS application interaction methods, the window split-screen operation simply divides the screen equally according to window values, the degree of intelligence is low, and split-screen operations cannot be performed reasonably and flexibly according to the actual application running environment and user requirements.
According to an aspect of the embodiment of the invention, there is provided an OpenHarmony-based window split-screen operation interaction method, including: acquiring window information and a split-screen operation signal; extracting split-screen parameters from the window information; generating a split-screen execution strategy according to the split-screen operation signal and the split-screen parameters; and executing the split-screen operation according to the split-screen execution strategy to obtain a split-screen display result.
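For illustration, a minimal TypeScript sketch of this four-step flow is given below; all interfaces, helper functions and values are assumptions made for the example and are not OpenHarmony framework APIs.

```typescript
// Illustrative sketch of the four-step method; every type and helper here is an
// assumption for explanation, not an OpenHarmony framework API.
interface WindowInfo { width: number; height: number; hidden: boolean; content: string; }
interface SplitScreenSignal { source: "touch" | "voice"; requirement: [number, number]; }
interface SplitScreenStrategy { regions: { x: number; y: number; width: number; height: number }[]; }

// Step 2: extract the split-screen parameters carried by the window information.
function extractSplitScreenParams(info: WindowInfo): { size: [number, number]; content: string } {
  return { size: [info.width, info.height], content: info.content };
}

// Step 3: derive a split-screen execution strategy from the signal and the parameters.
function generateStrategy(signal: SplitScreenSignal, params: { size: [number, number] }): SplitScreenStrategy {
  const [w, h] = params.size;
  const ratio = signal.requirement[0] / (signal.requirement[0] + signal.requirement[1]);
  const left = Math.round(w * ratio);
  return {
    regions: [
      { x: 0, y: 0, width: left, height: h },
      { x: left, y: 0, width: w - left, height: h },
    ],
  };
}

// Step 4: execute the strategy; here the resulting layout is simply described as text.
function executeStrategy(strategy: SplitScreenStrategy): string {
  return strategy.regions.map(r => `${r.width}x${r.height}@(${r.x},${r.y})`).join(" | ");
}

// Step 1: acquire window information and the split-screen operation signal (stub values).
const info: WindowInfo = { width: 1280, height: 720, hidden: false, content: "document" };
const signal: SplitScreenSignal = { source: "touch", requirement: [2, 1] };
console.log(executeStrategy(generateStrategy(signal, extractSplitScreenParams(info))));
```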
Optionally, extracting the split-screen parameters from the window information includes: extracting size parameters and content parameters from the window information; and screening the size parameters and the content parameters according to a preset split-screen rule to obtain the split-screen parameters.
Optionally, generating the split-screen execution strategy according to the split-screen operation signal and the split-screen parameters includes: acquiring the split-screen requirement data from the split-screen operation signal; inputting the split-screen requirement data into a split-screen strategy matching matrix to obtain optimal split-screen parameters; and generating the split-screen execution strategy according to the optimal split-screen parameters and the split-screen parameters.
Optionally, inputting the split-screen requirement data into the split-screen strategy matching matrix to obtain the optimal split-screen parameters includes: generating the optimal split-screen parameters through the split-screen strategy matching matrix, wherein X1 and X2 are the split-screen requirement data and Y1 and Y2 are the optimal split-screen parameters.
According to another aspect of the embodiment of the present invention, there is also provided an OpenHarmony-based window split-screen operation interaction device, including: an acquisition module, configured to acquire window information and a split-screen operation signal; an extraction module, configured to extract split-screen parameters from the window information; a generation module, configured to generate a split-screen execution strategy according to the split-screen operation signal and the split-screen parameters; and an execution module, configured to execute the split-screen operation according to the split-screen execution strategy to obtain a split-screen display result.
Optionally, the extraction module includes: an extracting unit, configured to extract the size parameters and content parameters from the window information; and a screening unit, configured to screen the size parameters and the content parameters according to a preset split-screen rule to obtain the split-screen parameters.
Optionally, the generation module includes: an acquisition unit, configured to acquire the split-screen requirement data from the split-screen operation signal; a matching unit, configured to input the split-screen requirement data into a split-screen strategy matching matrix to obtain optimal split-screen parameters; and a generating unit, configured to generate the split-screen execution strategy according to the optimal split-screen parameters and the split-screen parameters.
Optionally, the matching unit includes: a matching node, configured to generate the optimal split-screen parameters through the split-screen strategy matching matrix, wherein X1 and X2 are the split-screen requirement data and Y1 and Y2 are the optimal split-screen parameters.
According to another aspect of the embodiment of the invention, a non-volatile storage medium is provided. The non-volatile storage medium comprises a stored program, and when the program runs, it controls the device where the non-volatile storage medium is located to execute the OpenHarmony-based window split-screen operation interaction method.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer-readable instructions, the processor is configured to run the computer-readable instructions, and the computer-readable instructions, when executed, perform the OpenHarmony-based window split-screen operation interaction method.
In the embodiment of the invention, window information and a split-screen operation signal are acquired; split-screen parameters are extracted from the window information; a split-screen execution strategy is generated according to the split-screen operation signal and the split-screen parameters; and the split-screen operation is executed according to the split-screen execution strategy to obtain a split-screen display result. This solves the technical problems that, in existing HarmonyOS application interaction methods, the window split-screen operation simply divides the screen equally according to window values, the degree of intelligence is low, and split-screen operations cannot be performed reasonably and flexibly according to the actual application running environment and user requirements.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flowchart of a window split screen operation interaction method based on OpenHarmony according to an embodiment of the invention;
FIG. 2 is a block diagram of a window split-screen operation interaction device based on OpenHarmony according to an embodiment of the invention;
FIG. 3 is a block diagram of a terminal device for performing the method according to an embodiment of the invention;
FIG. 4 is a block diagram of a memory unit for holding or carrying program code implementing the method according to an embodiment of the invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without making any inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, there is provided a method embodiment of an OpenHarmony-based window split-screen operation interaction method. It should be noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as by a set of computer-executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different from that described herein.
Example 1
Fig. 1 is a flowchart of a window split screen operation interaction method based on OpenHarmony according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
step S102, window information and split screen operation signals are acquired.
Specifically, in order to solve the technical problems that, in existing HarmonyOS application interaction methods, the window split-screen operation simply divides the screen equally according to window values, so that the degree of intelligence is low and split-screen operations cannot be performed reasonably and flexibly according to the actual application running environment and user requirements, the attribute information of the window and the split-screen operation signal first need to be acquired. The window information characterizes the basic attribute values of the window, such as its size and whether it is hidden; the split-screen operation signal characterizes the signal issued by the user for the split-screen operation, and may be a touch input signal or a voice input signal. Which input signal is used as the split-screen operation signal is not specifically limited in the embodiment of the invention.
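As an illustration of this acquisition step, the TypeScript sketch below stubs out both input channels; the types and helper functions are assumptions for explanation only, not OpenHarmony interfaces.

```typescript
// Sketch of step S102; the window query and the two signal channels are stubbed
// assumptions, not actual OpenHarmony window-manager or input APIs.
interface WindowAttributes { width: number; height: number; hidden: boolean; title: string; }

type OperationSignal =
  | { kind: "touch"; gesture: "two-finger-swipe"; ratio: number }
  | { kind: "voice"; utterance: string };

function acquireWindowInfo(): WindowAttributes {
  // A real application would query the window manager here; the values are stubbed.
  return { width: 1920, height: 1080, hidden: false, title: "editor" };
}

function acquireSignal(raw: unknown): OperationSignal {
  // Either a voice utterance or a touch gesture can carry the split-screen operation signal.
  if (typeof raw === "string") {
    return { kind: "voice", utterance: raw };
  }
  return { kind: "touch", gesture: "two-finger-swipe", ratio: 0.5 };
}

const windowInfo = acquireWindowInfo();
const operationSignal = acquireSignal("split the screen two to one");
console.log(windowInfo.title, operationSignal.kind);
```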
Step S104, extracting the split screen parameters in the window information.
Optionally, extracting the split-screen parameters from the window information includes: extracting size parameters and content parameters from the window information; and screening the size parameters and the content parameters according to a preset split-screen rule to obtain the split-screen parameters.
Specifically, in order to separate out the parameters related to the split-screen operation from the window information, the embodiment of the invention extracts the size parameters and the content parameters from the window information, and screens the size parameters and the content parameters according to a preset split-screen rule to obtain the split-screen parameters.
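A minimal sketch of this extraction and screening step follows; the preset rule used here (a minimum pane width) is an assumed example, not the rule defined by the invention.

```typescript
// Sketch of step S104; WindowInfo, SplitScreenParams and MIN_PANE_WIDTH are assumptions.
interface WindowInfo { width: number; height: number; contentRegions: string[]; }
interface SplitScreenParams { size: { width: number; height: number }; content: string[]; }

const MIN_PANE_WIDTH = 320; // assumed preset split-screen rule: each pane must be at least this wide

function extractParams(info: WindowInfo): SplitScreenParams {
  const size = { width: info.width, height: info.height };
  // Screen the content parameters: keep only as many content regions as panes that
  // satisfy the preset rule can fit into the window width.
  const maxPanes = Math.max(1, Math.floor(info.width / MIN_PANE_WIDTH));
  const content = info.contentRegions.slice(0, maxPanes);
  return { size, content };
}

console.log(extractParams({ width: 1280, height: 800, contentRegions: ["toolbar", "canvas", "sidebar", "console", "preview"] }));
```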
Step S106, generating a split-screen execution strategy according to the split-screen operation signal and the split-screen parameters.
Optionally, generating the split-screen execution strategy according to the split-screen operation signal and the split-screen parameters includes: acquiring the split-screen requirement data from the split-screen operation signal; inputting the split-screen requirement data into a split-screen strategy matching matrix to obtain optimal split-screen parameters; and generating the split-screen execution strategy according to the optimal split-screen parameters and the split-screen parameters.
Optionally, inputting the split-screen requirement data into the split-screen strategy matching matrix to obtain the optimal split-screen parameters includes: generating the optimal split-screen parameters through the split-screen strategy matching matrix, wherein X1 and X2 are the split-screen requirement data and Y1 and Y2 are the optimal split-screen parameters.
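The matrix itself is not reproduced in the text, so the sketch below assumes, purely for illustration, a linear 2x2 mapping from the requirement data (X1, X2) to the optimal parameters (Y1, Y2); the weights are placeholders.

```typescript
// Illustrative stand-in for the split-screen strategy matching matrix; the 2x2 linear
// form [Y1, Y2] = M * [X1, X2] and its weights are assumptions, not the patented matrix.
type Vec2 = [number, number];
type Mat2 = [[number, number], [number, number]];

const M: Mat2 = [
  [0.6, 0.4], // assumed weights mapping the requirement data to the first optimal parameter
  [0.4, 0.6], // assumed weights mapping the requirement data to the second optimal parameter
];

function matchOptimalParams(requirement: Vec2): Vec2 {
  const [x1, x2] = requirement;
  const y1 = M[0][0] * x1 + M[0][1] * x2;
  const y2 = M[1][0] * x1 + M[1][1] * x2;
  return [y1, y2]; // Y1 and Y2: the optimal split-screen parameters
}

console.log(matchOptimalParams([2, 1])); // X1 = 2, X2 = 1: the split-screen requirement data
```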
Step S108, executing the split-screen operation according to the split-screen execution strategy to obtain a split-screen display result.
Specifically, after the split-screen execution strategy is obtained, the processor extracts the execution elements and the execution parameters from the split-screen execution strategy, and split-screen display is performed according to the extracted execution elements and execution parameters.
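A sketch of this execution step is shown below; the strategy structure and the layout computation are assumptions used to illustrate extracting the execution elements and parameters before producing the split-screen display result.

```typescript
// Sketch of step S108; ExecutionElement and SplitScreenStrategy are assumed structures.
interface ExecutionElement { windowId: number; role: "primary" | "secondary"; }
interface SplitScreenStrategy { elements: ExecutionElement[]; ratios: number[]; }

function executeStrategy(strategy: SplitScreenStrategy, screenWidth: number): string[] {
  // Extract the execution elements and execution parameters from the strategy ...
  const { elements, ratios } = strategy;
  // ... and lay each window out in its pane to obtain the split-screen display result.
  return elements.map((el, i) =>
    `window ${el.windowId} (${el.role}) -> width ${Math.round(screenWidth * ratios[i])}px`);
}

console.log(executeStrategy(
  { elements: [{ windowId: 1, role: "primary" }, { windowId: 2, role: "secondary" }], ratios: [0.66, 0.34] },
  1920,
));
```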
This embodiment solves the technical problems that, in existing HarmonyOS application interaction methods, the window split-screen operation simply divides the screen equally according to window values, the degree of intelligence is low, and split-screen operations cannot be performed reasonably and flexibly according to the actual application running environment and user requirements.
Example 2
Fig. 2 is a block diagram of a window split-screen operation interaction device based on OpenHarmony according to an embodiment of the present invention, and as shown in fig. 2, the device includes:
and the acquisition module 20 is used for acquiring the window information and the split screen operation signal.
Specifically, in order to solve the technical problems that, in existing HarmonyOS application interaction methods, the window split-screen operation simply divides the screen equally according to window values, so that the degree of intelligence is low and split-screen operations cannot be performed reasonably and flexibly according to the actual application running environment and user requirements, the attribute information of the window and the split-screen operation signal first need to be acquired. The window information characterizes the basic attribute values of the window, such as its size and whether it is hidden; the split-screen operation signal characterizes the signal issued by the user for the split-screen operation, and may be a touch input signal or a voice input signal. Which input signal is used as the split-screen operation signal is not specifically limited in the embodiment of the invention.
The extraction module 22 is configured to extract the split-screen parameters from the window information.
Optionally, the extraction module includes: an extracting unit, configured to extract the size parameters and content parameters from the window information; and a screening unit, configured to screen the size parameters and the content parameters according to a preset split-screen rule to obtain the split-screen parameters.
Specifically, in order to separate out the parameters related to the split-screen operation from the window information, the embodiment of the invention extracts the size parameters and the content parameters from the window information, and screens the size parameters and the content parameters according to a preset split-screen rule to obtain the split-screen parameters.
The generation module 24 is configured to generate a split-screen execution strategy according to the split-screen operation signal and the split-screen parameters.
Optionally, the generation module includes: an acquisition unit, configured to acquire the split-screen requirement data from the split-screen operation signal; a matching unit, configured to input the split-screen requirement data into a split-screen strategy matching matrix to obtain optimal split-screen parameters; and a generating unit, configured to generate the split-screen execution strategy according to the optimal split-screen parameters and the split-screen parameters.
Optionally, the matching unit includes: a matching node, configured to generate the optimal split-screen parameters through the split-screen strategy matching matrix, wherein X1 and X2 are the split-screen requirement data and Y1 and Y2 are the optimal split-screen parameters.
The execution module 26 is configured to execute the split-screen operation according to the split-screen execution strategy to obtain a split-screen display result.
Specifically, after the split-screen execution strategy is obtained, the processor extracts the execution elements and the execution parameters from the split-screen execution strategy, and split-screen display is performed according to the extracted execution elements and execution parameters.
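As an illustration of how the four modules could be wired together, the sketch below composes them into a single device object; the class and method names are assumptions mirroring the modules described above.

```typescript
// Illustrative composition of the device's four modules; all names are assumptions.
class AcquisitionModule {
  acquire() {
    return { info: { width: 1280, height: 720 }, signal: [1, 1] as [number, number] };
  }
}
class ExtractionModule {
  extract(info: { width: number; height: number }) {
    return { size: [info.width, info.height] as [number, number] };
  }
}
class GenerationModule {
  generate(signal: [number, number], params: { size: [number, number] }) {
    return { ratio: signal[0] / (signal[0] + signal[1]), size: params.size };
  }
}
class ExecutionModule {
  execute(strategy: { ratio: number; size: [number, number] }) {
    const left = Math.round(strategy.size[0] * strategy.ratio);
    return `${left}px | ${strategy.size[0] - left}px`;
  }
}

const device = {
  acquisition: new AcquisitionModule(),
  extraction: new ExtractionModule(),
  generation: new GenerationModule(),
  execution: new ExecutionModule(),
};

const { info, signal } = device.acquisition.acquire();
const params = device.extraction.extract(info);
const strategy = device.generation.generate(signal, params);
console.log(device.execution.execute(strategy)); // split-screen display result
```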
This embodiment solves the technical problems that, in existing HarmonyOS application interaction methods, the window split-screen operation simply divides the screen equally according to window values, the degree of intelligence is low, and split-screen operations cannot be performed reasonably and flexibly according to the actual application running environment and user requirements.
According to another aspect of the embodiment of the invention, a non-volatile storage medium is provided. The non-volatile storage medium comprises a stored program, and when the program runs, it controls the device where the non-volatile storage medium is located to execute the OpenHarmony-based window split-screen operation interaction method.
Specifically, the method comprises the following steps: acquiring window information and a split-screen operation signal; extracting split-screen parameters from the window information; generating a split-screen execution strategy according to the split-screen operation signal and the split-screen parameters; and executing the split-screen operation according to the split-screen execution strategy to obtain a split-screen display result. Optionally, extracting the split-screen parameters from the window information includes: extracting size parameters and content parameters from the window information; and screening the size parameters and the content parameters according to a preset split-screen rule to obtain the split-screen parameters. Optionally, generating the split-screen execution strategy according to the split-screen operation signal and the split-screen parameters includes: acquiring the split-screen requirement data from the split-screen operation signal; inputting the split-screen requirement data into a split-screen strategy matching matrix to obtain optimal split-screen parameters; and generating the split-screen execution strategy according to the optimal split-screen parameters and the split-screen parameters. Optionally, inputting the split-screen requirement data into the split-screen strategy matching matrix to obtain the optimal split-screen parameters includes: generating the optimal split-screen parameters through the split-screen strategy matching matrix, wherein X1 and X2 are the split-screen requirement data and Y1 and Y2 are the optimal split-screen parameters.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer-readable instructions, the processor is configured to run the computer-readable instructions, and the computer-readable instructions, when executed, perform the OpenHarmony-based window split-screen operation interaction method.
Specifically, the method comprises the following steps: acquiring window information and a split-screen operation signal; extracting split-screen parameters from the window information; generating a split-screen execution strategy according to the split-screen operation signal and the split-screen parameters; and executing the split-screen operation according to the split-screen execution strategy to obtain a split-screen display result. Optionally, extracting the split-screen parameters from the window information includes: extracting size parameters and content parameters from the window information; and screening the size parameters and the content parameters according to a preset split-screen rule to obtain the split-screen parameters. Optionally, generating the split-screen execution strategy according to the split-screen operation signal and the split-screen parameters includes: acquiring the split-screen requirement data from the split-screen operation signal; inputting the split-screen requirement data into a split-screen strategy matching matrix to obtain optimal split-screen parameters; and generating the split-screen execution strategy according to the optimal split-screen parameters and the split-screen parameters. Optionally, inputting the split-screen requirement data into the split-screen strategy matching matrix to obtain the optimal split-screen parameters includes: generating the optimal split-screen parameters through the split-screen strategy matching matrix, wherein X1 and X2 are the split-screen requirement data and Y1 and Y2 are the optimal split-screen parameters.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, each embodiment is described with its own emphasis; for portions not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology content may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, fig. 3 is a schematic hardware structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to enable communication connections between the elements. The memory 33 may comprise a high-speed RAM memory or may further comprise a non-volatile memory NVM, such as at least one magnetic disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the processor 31 may be implemented as, for example, a central processing unit (Central Processing Unit, abbreviated as CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 31 is coupled to the input device 30 and the output device 32 through wired or wireless connections.
Alternatively, the input device 30 may include a variety of input devices, for example, may include at least one of a user-oriented user interface, a device-oriented device interface, a programmable interface of software, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware insertion interface (such as a USB interface, a serial port, etc.) for data transmission between devices; alternatively, the user-oriented user interface may be, for example, a user-oriented control key, a voice input device for receiving voice input, and a touch-sensitive device (e.g., a touch screen, a touch pad, etc. having touch-sensitive functionality) for receiving user touch input by a user; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, for example, an input pin interface or an input interface of a chip, etc.; optionally, the transceiver may be a radio frequency transceiver chip, a baseband processing chip, a transceiver antenna, etc. with a communication function. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, audio, or the like.
In this embodiment, the processor of the terminal device may include functions for executing each module of the data processing apparatus in each device, and specific functions and technical effects may be referred to the above embodiments and are not described herein again.
Fig. 4 is a schematic hardware structure of a terminal device according to another embodiment of the present application. Fig. 4 is a specific embodiment of the implementation of fig. 3. As shown in fig. 4, the terminal device of the present embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the methods of the above-described embodiments.
The memory 42 is configured to store various types of data to support operation at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, video, etc. The memory 42 may include a random access memory (random access memory, simply referred to as RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
Optionally, a processor 41 is provided in the processing assembly 40. The terminal device may further include: a communication component 43, a power supply component 44, a multimedia component 45, an audio component 46, an input/output interface 47 and/or a sensor component 48. The components and the like specifically included in the terminal device are set according to actual requirements, which are not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. The processing component 40 may include one or more processors 41 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 40 may include one or more modules that facilitate interactions between the processing component 40 and other components. For example, processing component 40 may include a multimedia module to facilitate interaction between multimedia component 45 and processing component 40.
The power supply assembly 44 provides power to the various components of the terminal device. Power supply components 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for terminal devices.
The multimedia component 45 comprises a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a Microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a speech recognition mode. The received audio signals may be further stored in the memory 42 or transmitted via the communication component 43. In some embodiments, audio assembly 46 further includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing assembly 40 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor assembly 48 includes one or more sensors for providing status assessment of various aspects for the terminal device. For example, the sensor assembly 48 may detect the open/closed state of the terminal device, the relative positioning of the assembly, the presence or absence of user contact with the terminal device. The sensor assembly 48 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 48 may also include a camera or the like.
The communication component 43 is configured to facilitate communication between the terminal device and other devices in a wired or wireless manner. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot, where the SIM card slot is used to insert a SIM card, so that the terminal device may log into a GPRS network and establish communication with a server through the internet.
From the above, it will be appreciated that the communication component 43, the audio component 46, and the input/output interface 47, the sensor component 48 referred to in the embodiment of fig. 4 may be implemented as an input device in the embodiment of fig. 3.
In the several embodiments provided in the present application, it should be understood that the disclosed technology content may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may, in essence or in part, be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing program code.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that several modifications and adaptations may be made by those skilled in the art without departing from the principles of the present invention, and such modifications and adaptations shall also fall within the scope of the present invention.

Claims (10)

1. An OpenHarmony-based window split-screen operation interaction method, characterized by comprising the following steps:
acquiring window information and split screen operation signals;
extracting split screen parameters in the window information;
generating a split screen execution strategy according to the split screen operation signal and the split screen parameters;
and executing the split screen operation according to the split screen execution strategy to obtain a split screen display result.
2. The method of claim 1, wherein the extracting the split-screen parameters in the window information comprises:
extracting size parameters and content parameters in the window information;
and selecting the size parameter and the content parameter according to a preset split screen rule to obtain the split screen parameter.
3. The method of claim 1, wherein generating a split-screen execution policy based on the split-screen operation signal and the split-screen parameter comprises:
acquiring the split screen requirement data in the split screen operation signal;
inputting the split screen requirement data into a split screen strategy matching matrix to obtain optimal split screen parameters;
and generating the split screen execution strategy according to the optimal split screen parameters and the split screen parameters.
4. The method of claim 3, wherein inputting the split-screen requirement data into the split-screen strategy matching matrix to obtain the optimal split-screen parameters comprises:
generating the optimal split-screen parameters through the split-screen strategy matching matrix, wherein X1 and X2 are the split-screen requirement data and Y1 and Y2 are the optimal split-screen parameters.
5. An OpenHarmony-based window split-screen operation interaction device is characterized by comprising:
an acquisition module, configured to acquire window information and a split-screen operation signal;
an extraction module, configured to extract split-screen parameters from the window information;
a generation module, configured to generate a split-screen execution strategy according to the split-screen operation signal and the split-screen parameters;
and an execution module, configured to execute the split-screen operation according to the split-screen execution strategy to obtain a split-screen display result.
6. The apparatus of claim 5, wherein the extraction module comprises:
an extracting unit, configured to extract a size parameter and a content parameter in the window information;
and a screening unit, configured to screen the size parameter and the content parameter according to a preset split-screen rule to obtain the split-screen parameter.
7. The apparatus of claim 5, wherein the generating module comprises:
an acquisition unit, configured to acquire the split-screen requirement data from the split-screen operation signal;
a matching unit, configured to input the split-screen requirement data into a split-screen strategy matching matrix to obtain optimal split-screen parameters;
and a generating unit, configured to generate the split-screen execution strategy according to the optimal split-screen parameters and the split-screen parameters.
8. The apparatus of claim 7, wherein the matching unit comprises:
a matching node, configured to generate the optimal split-screen parameters through the split-screen strategy matching matrix, wherein X1 and X2 are the split-screen requirement data and Y1 and Y2 are the optimal split-screen parameters.
9. A non-volatile storage medium, characterized in that the non-volatile storage medium comprises a stored program, wherein the program, when run, controls a device in which the non-volatile storage medium is located to perform the method of any one of claims 1 to 4.
10. An electronic device, comprising a processor and a memory; wherein the memory stores computer-readable instructions, the processor is configured to execute the computer-readable instructions, and the computer-readable instructions, when executed, perform the method of any one of claims 1 to 4.
CN202310603541.2A 2023-05-25 2023-05-25 OpenHarmony-based window split-screen operation interaction method and device Pending CN116466905A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310603541.2A CN116466905A (en) 2023-05-25 2023-05-25 OpenHarmony-based window split-screen operation interaction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310603541.2A CN116466905A (en) 2023-05-25 2023-05-25 OpenHarmony-based window split-screen operation interaction method and device

Publications (1)

Publication Number Publication Date
CN116466905A true CN116466905A (en) 2023-07-21

Family

ID=87173848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310603541.2A Pending CN116466905A (en) 2023-05-25 2023-05-25 OpenHarmony-based window split-screen operation interaction method and device

Country Status (1)

Country Link
CN (1) CN116466905A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination