WO2021164313A1 - Interface layout method, apparatus and system - Google Patents

Interface layout method, apparatus and system

Info

Publication number
WO2021164313A1
WO2021164313A1 (PCT/CN2020/125607)
Authority
WO
WIPO (PCT)
Prior art keywords
interface
terminal device
information
screen
sub
Prior art date
Application number
PCT/CN2020/125607
Other languages
English (en)
French (fr)
Inventor
马晓慧
周星辰
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to US 17/801,197 (published as US20230099824A1)
Priority to EP 20920102.9 (published as EP4080345A4)
Priority to JP 2022-550007 (published as JP2023514631A)
Publication of WO2021164313A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/7715Feature extraction, e.g. by transforming the feature space, e.g. multi-dimensional scaling [MDS]; Mappings, e.g. subspace methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/452Remote windowing, e.g. X-Window System, desktop virtualisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0442Handling or displaying different aspect ratios, or changing the aspect ratio
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller

Definitions

  • This application belongs to the field of artificial intelligence recognition technology, and particularly relates to an interface layout method, device and system.
  • Terminal devices can not only display the interface of an application, but also project that interface to other terminal devices, so that users can control the application and perform different functions through those devices. This enables users to experience seamless, continuous services with consistent operations on different terminal devices.
  • If the first terminal device detects a screen projection operation triggered by the user during the process of loading the application program, it can, according to that operation, project the interface currently displayed by the application program to the second terminal device indicated by the operation.
  • The second terminal device can then display the interface that the application program displays on the first terminal device.
  • The embodiments of the present application provide an interface layout method, apparatus, and system, which can solve the problem that, after the first terminal device projects the displayed interface to the second terminal device, the user cannot conveniently control the projected interface on the second terminal device.
  • In a first aspect, an embodiment of the present application provides an interface layout method, applied to a first terminal device connected to a second terminal device, and the method includes:
  • generating, according to interface information of a first interface and second device information, a second interface for display on the second terminal device, where the first interface is the interface displayed by the first terminal device, and the second device information is used to indicate the screen size and screen state of the second terminal device.
  • the generating the second interface for display on the second terminal device according to the interface information of the first interface and the second device information includes:
  • The interface information of the first interface includes element information of at least one interface element in the first interface, and the element information is used to indicate the name and type of the interface element and the position of the interface element in the first interface;
  • the interface information of the first interface further includes interface attributes, and the interface attributes are used to indicate the interface size and interface direction of the first interface;
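The element information, interface attributes, and device information described above can be pictured as simple records. The following sketch (Python, with invented field names; the patent does not prescribe a data format) shows one possible shape:

```python
from dataclasses import dataclass

# Hypothetical records for the information the method consumes; field names
# are illustrative and not taken from the patent.

@dataclass
class ElementInfo:
    name: str        # name of the interface element, e.g. "play_button"
    type: str        # type of the interface element, e.g. "button", "video"
    x: int           # position of the element in the first interface
    y: int
    width: int
    height: int

@dataclass
class InterfaceAttributes:
    width: int       # interface size of the first interface
    height: int
    orientation: str # interface direction: "portrait" or "landscape"

@dataclass
class DeviceInfo:
    screen_width: int   # screen size of the second terminal device
    screen_height: int
    screen_state: str   # screen state, e.g. "landscape"

first_interface = [
    ElementInfo("video_view", "video", 0, 0, 1080, 608),
    ElementInfo("play_button", "button", 490, 700, 100, 100),
]
attrs = InterfaceAttributes(1080, 2244, "portrait")
second_device = DeviceInfo(2244, 1080, "landscape")
```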
  • The determining of the interface category by recognition according to the element information of the at least one interface element in combination with a pre-trained interface recognition model includes:
  • performing feature extraction on at least one piece of the element information according to the interface attributes to obtain interface feature data; and inputting the interface feature data into the interface recognition model, and recognizing the interface feature data through the interface recognition model to obtain the interface category output by the interface recognition model.
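One way to picture these two steps, feature extraction followed by model inference, is sketched below. The type vocabulary, the nearest-centroid "model", and the category labels are stand-ins invented for illustration; the patent leaves the model architecture open (the embodiments mention machine learning and/or neural networks).

```python
# Turn element information into a fixed-length feature vector, then feed it
# to a stand-in "pre-trained" interface recognition model.

TYPE_VOCAB = ["button", "text", "image", "video", "list"]

def extract_features(elements, interface_width, interface_height):
    """Count elements per type and add the total area ratio they occupy."""
    counts = [0.0] * len(TYPE_VOCAB)
    area = 0.0
    for name, etype, x, y, w, h in elements:
        if etype in TYPE_VOCAB:
            counts[TYPE_VOCAB.index(etype)] += 1.0
        area += (w * h) / float(interface_width * interface_height)
    return counts + [area]

# Stand-in model: one centroid per interface category (invented values).
CENTROIDS = {
    "media_playback": [1, 0, 0, 1, 0, 0.6],
    "text_browsing":  [0, 4, 1, 0, 1, 0.5],
}

def recognize(features):
    """Return the category whose centroid is closest to the feature vector."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(CENTROIDS, key=lambda c: dist(CENTROIDS[c], features))

elements = [
    ("video_view", "video", 0, 0, 1080, 608),
    ("play_button", "button", 490, 700, 100, 100),
]
feats = extract_features(elements, 1080, 2244)
category = recognize(feats)
```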
  • The arranging of the at least one interface element according to the interface category and the second device information to obtain the second interface includes:
  • dividing, according to the interface category, the display area of the second terminal device indicated by the second device information to obtain multiple sub-regions; determining the interface elements arranged in each sub-region; and adjusting each interface element in each sub-region to obtain the second interface.
  • The adjusting, according to the size of the display area indicated by the second device information and the number of interface elements arranged in each sub-region, of each interface element in each sub-region to obtain the second interface includes:
  • adjusting the size and direction of each interface element in each sub-region according to the size of the display area, preset arrangement rules, and the number of elements corresponding to each sub-region, to obtain adjusted interface elements;
  • for each sub-region, adjusting the positions of the adjusted interface elements within the sub-region according to the number of elements corresponding to the sub-region, to obtain the second interface.
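The division-and-arrangement steps above can be sketched as follows. The per-category sub-region table and the vertical-stacking rule stand in for the "preset arrangement rules"; both are invented examples, not the patent's actual rules.

```python
# Each sub-region is (x, y, width, height) in target-screen coordinates,
# here for a landscape 2244x1080 second screen (invented example table).
REGION_TABLES = {
    "media_playback": {
        "content":  (0, 0, 1684, 1080),     # main media area on the left
        "controls": (1684, 0, 560, 1080),   # control strip on the right
    },
}

def assign_region(element_type):
    """Decide which sub-region an element belongs to (illustrative rule)."""
    return "content" if element_type in ("video", "image", "text") else "controls"

def layout(elements, category):
    """Place each element inside its sub-region, stacking vertically."""
    regions = REGION_TABLES[category]
    grouped = {name: [] for name in regions}
    for el in elements:
        grouped[assign_region(el["type"])].append(el)
    placed = []
    for name, group in grouped.items():
        if not group:
            continue
        rx, ry, rw, rh = regions[name]
        slice_h = rh // len(group)   # equal vertical slice per element
        for i, el in enumerate(group):
            placed.append({"name": el["name"], "x": rx, "y": ry + i * slice_h,
                           "width": rw, "height": slice_h})
    return placed

second_interface = layout(
    [{"name": "video_view", "type": "video"},
     {"name": "play_button", "type": "button"},
     {"name": "next_button", "type": "button"}],
    "media_playback")
```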
  • the method further includes: sending the second interface to the second terminal device, so that the second terminal device displays the second interface.
  • the method further includes:
  • obtaining feedback information, where the feedback information is information that the user feeds back on the second interface displayed by the second terminal device; and
  • if the feedback information meets a preset update condition, updating the interface recognition model according to the feedback information.
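A minimal sketch of this feedback loop, assuming the "preset update condition" is a simple sample-count threshold and the model exposes an incremental-update hook named `partial_fit`; both are assumptions, not details from the patent.

```python
UPDATE_THRESHOLD = 3  # assumed preset update condition

class FeedbackCollector:
    """Accumulates user feedback on the projected second interface."""
    def __init__(self):
        self.samples = []  # (interface_features, corrected_category) pairs

    def record(self, features, corrected_category):
        self.samples.append((features, corrected_category))

    def condition_met(self):
        return len(self.samples) >= UPDATE_THRESHOLD

def maybe_update(model, collector):
    """Update the recognition model only once the condition is met."""
    if not collector.condition_met():
        return False
    for features, label in collector.samples:
        model.partial_fit(features, label)  # hypothetical update hook
    collector.samples.clear()
    return True

class _DemoModel:
    """Stand-in model that just counts update calls."""
    def __init__(self):
        self.updates = 0
    def partial_fit(self, features, label):
        self.updates += 1

collector = FeedbackCollector()
model = _DemoModel()
collector.record([1, 0], "media_playback")
updated_early = maybe_update(model, collector)  # condition not met yet
collector.record([0, 1], "text_browsing")
collector.record([1, 1], "media_playback")
updated = maybe_update(model, collector)
```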
  • the method further includes:
  • extracting interface elements in the first interface according to an extraction operation triggered by the user to obtain a plurality of interface elements, and generating element information of the plurality of interface elements according to supplementary operations triggered by the user.
  • the method further includes:
  • recording an adjustment operation triggered by the user on at least one interface element in the second interface, and adjusting the arrangement rules according to the adjustment operation.
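Recording a user's manual adjustment and folding it back into the arrangement rules might look like the following, so that the next projection reuses the user's preference. The rule representation (a per-element preferred bounding box) is an assumption for illustration.

```python
arrangement_rules = {
    # element name -> preferred (x, y, width, height) on the target screen
    "play_button": (1684, 0, 560, 540),
}

def on_adjustment(element_name, new_box, rules):
    """Record the adjustment and update the arrangement rule in place."""
    rules[element_name] = new_box
    return rules

# User drags/resizes the play button on the projected interface:
on_adjustment("play_button", (1684, 270, 560, 270), arrangement_rules)
```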
  • an embodiment of the present application provides an interface layout device, which is applied to a first terminal device, and the first terminal device is connected to a second terminal device, and the device includes:
  • a receiving module configured to receive a screen casting instruction, the screen casting instruction being used to instruct the first terminal device to cast a screen to the second terminal device;
  • a generating module configured to generate a second interface for display on the second terminal device according to the interface information of the first interface and the second device information, where the first interface is the interface displayed by the first terminal device,
  • the second device information is used to indicate the screen size and screen state of the second terminal device.
  • The generating module is specifically configured to: obtain the interface information of the first interface and the second device information, where the interface information of the first interface includes element information of at least one interface element in the first interface, and the element information is used to indicate the name and type of the interface element and the position of the interface element in the first interface; determine the interface category according to the element information of the at least one interface element in combination with a pre-trained interface recognition model; and arrange the at least one interface element according to the interface category and the second device information to obtain the second interface.
  • the interface information of the first interface further includes interface attributes, and the interface attributes are used to indicate the interface size and interface direction of the first interface.
  • The generating module is further specifically configured to: perform feature extraction on at least one piece of the element information according to the interface attributes to obtain interface feature data; and input the interface feature data into the interface recognition model, recognize the interface feature data through the interface recognition model, and obtain the interface category output by the interface recognition model.
  • The generating module is further specifically configured to: divide, according to the interface category, the display area of the second terminal device indicated by the second device information to obtain multiple sub-regions; determine the interface elements arranged in each sub-region; and adjust, according to the size of the display area indicated by the second device information and the number of interface elements arranged in each sub-region, each interface element in each sub-region to obtain the second interface.
  • The generating module is further specifically configured to: determine the number of elements in each sub-region; adjust the size and direction of each interface element in each sub-region according to the size of the display area, preset arrangement rules, and the number of elements corresponding to each sub-region, to obtain adjusted interface elements; and, for each sub-region, adjust the positions of the adjusted interface elements within the sub-region according to the number of elements corresponding to the sub-region, to obtain the second interface.
  • the apparatus further includes:
  • the sending module is configured to send the second interface to the second terminal device, so that the second terminal device displays the second interface.
  • the device further includes:
  • an obtaining module configured to obtain feedback information, where the feedback information is information that the user feeds back on the second interface displayed by the second terminal device;
  • the update module is configured to update the interface recognition model according to the feedback information if the feedback information meets the preset update condition.
  • the device further includes:
  • An extraction module configured to extract interface elements in the first interface according to an extraction operation triggered by a user to obtain a plurality of the interface elements
  • the supplementary module is used to generate element information of multiple interface elements according to supplementary operations triggered by the user.
  • the apparatus further includes:
  • a recording module configured to record an adjustment operation triggered by a user on at least one interface element in the second interface
  • the adjustment module is used to adjust the arrangement rule according to the adjustment operation.
  • an embodiment of the present application provides an interface layout system, including: a first terminal device and a second terminal device, where the first terminal device is connected to the second terminal device;
  • the first terminal device generates a second interface for display on the second terminal device according to the interface information of the first interface and the second device information, and the first interface is the interface displayed by the first terminal device ,
  • the second device information is used to indicate the screen size and screen state of the second terminal device;
  • the second terminal device receives and displays the second interface.
  • an embodiment of the present application provides a terminal device.
  • the terminal device includes a memory, a processor, and a computer program that is stored in the memory and can run on the processor, and when the processor executes the computer program, the interface layout method according to any one of the above-mentioned first aspects is implemented.
  • an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the interface layout method according to any one of the above-mentioned first aspects is implemented.
  • the embodiments of the present application provide a computer program product that, when the computer program product runs on a terminal device, causes the terminal device to execute the interface layout method described in any one of the above-mentioned first aspects.
  • The first terminal device in the embodiments of the present application receives a screen projection instruction that instructs the first terminal device to project a screen to the second terminal device, and generates, according to the interface information of the first interface and the second device information, a second interface for display on the second terminal device, where the second device information is used to indicate the screen size and screen state of the second terminal device. In this way, the second terminal device can display a second interface that matches the second terminal device, and the user can easily control the second interface based on the second terminal device. This avoids the problem that users cannot easily control the screen-casting interface, and improves both the convenience with which users control the second interface based on the second terminal device and the consistency of manipulation across different terminal devices.
  • FIG. 1 is a system architecture diagram of an interface layout system involved in an interface layout method provided by an embodiment of the present application
  • FIG. 2 is a schematic structural diagram of a mobile phone provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a layered architecture of a software system provided by an embodiment of the present application.
  • FIG. 4 is a schematic flowchart of an interface layout method provided by an embodiment of the present application.
  • Fig. 5 is a schematic diagram of a first interface of a player provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an interface of an interface category 1 provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an interface of an interface category 2 provided by an embodiment of the present application.
  • Figure 8-a is a schematic diagram of an interface of interface category 3 provided by an embodiment of the present application.
  • Figure 8-b is a schematic diagram of another interface category 3 provided by an embodiment of the present application.
  • Figure 9-a is a schematic diagram of an interface of interface category 4 provided by an embodiment of the present application.
  • Figure 9-b is a schematic diagram of another interface category 4 provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of an interface of an interface category 5 provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of an interface of an interface category 6 provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of an interface of an interface category 7 provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of an interface of an interface category 8 provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of an interface for different terminal devices provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of another interface for different terminal devices provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of another interface for different terminal devices provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of an interface of a first interface provided by an embodiment of the present application.
  • FIG. 18 is a schematic diagram of an IDE interface provided by an embodiment of the present application.
  • FIG. 19 is a structural block diagram of an interface layout device provided by an embodiment of the present application.
  • FIG. 20 is a structural block diagram of another interface layout device provided by an embodiment of the present application.
  • FIG. 21 is a schematic structural diagram of a terminal device provided by an embodiment of the present application.
  • the interface layout method provided by the embodiments of this application can be applied to mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), and other terminal devices.
  • the terminal device may be a station (STATION, ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA) device, a handheld device with wireless communication functions, an in-vehicle device, an Internet-of-Vehicles terminal, a computer, a laptop computer, a handheld communication device, a handheld computing device, a satellite wireless device, etc.
  • a wearable device may also be a general term for devices developed by applying wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes.
  • a wearable device is a portable device that is directly worn on the body or integrated into the user's clothes or accessories.
  • Wearable devices are not only a kind of hardware device, but also realize powerful functions through software support, data interaction, and cloud interaction.
  • wearable smart devices include devices that are full-featured and large-sized and can implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only one type of application function and need to be used in conjunction with other devices such as smartphones, for example, various smart bracelets and smart jewelry for physical sign monitoring.
  • FIG. 1 is a system architecture diagram of an interface layout system involved in an interface layout method provided by an embodiment of the present application.
  • the interface layout system may include: a first terminal device 101 and at least one second terminal device 102.
  • the first terminal device can be connected to each second terminal device.
  • the first terminal device may be a terminal device that is convenient for the user to perform input operations
  • the second terminal device may be a terminal device that is frequently used by the user but is inconvenient to perform input operations.
  • the first terminal device may be a mobile phone or a tablet computer
  • the second terminal device may be a TV, a speaker, a headset, or a vehicle-mounted device, etc.
  • the input operations performed by the user may include inputting text information and clicking various interface elements in the interface, where the click operation may be a single-click operation, a double-click operation, or another form of operation.
  • The first terminal device may load different application programs and may display the first interface corresponding to an application program on the screen of the first terminal device. If the first terminal device detects a screen projection command triggered by the user, indicating that the user expects to project the first interface to the second terminal device and have the second terminal device display the interface on which the application program runs, the first terminal device can obtain the interface information of the first interface and the second device information of the second terminal device, and generate a re-laid-out second interface according to the interface information and the second device information.
  • the first terminal device may send the re-arranged second interface to the second terminal device, and the second terminal device may display the re-arranged second interface.
  • the interface information of the first interface may include element information of interface elements in the first interface that can be displayed on the second terminal device.
  • the element information may include the position of the interface element in the first interface, the element type to which the interface element belongs, the name of the interface element, etc.
  • the second device information may include information such as the screen size, screen orientation, and screen resolution of the second terminal device.
  • the second device information may indicate that the resolution of the second terminal device is 2244*1080 in a landscape orientation.
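If the second device information were exchanged as a short string like the "2244*1080, landscape" example above, it could be parsed as follows. The string format is purely illustrative; the patent does not specify an encoding.

```python
def parse_device_info(info: str) -> dict:
    """Split an assumed 'WIDTH*HEIGHT, orientation' string into fields."""
    resolution, orientation = (part.strip() for part in info.split(","))
    width, height = (int(v) for v in resolution.split("*"))
    return {"width": width, "height": height, "orientation": orientation}

second_device_info = parse_device_info("2244*1080, landscape")
```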
  • the first terminal device can analyze the preprocessed interface information through the pre-trained interface recognition model to determine the interface category, and then, according to the interface category and in combination with the screen size and screen orientation of the second terminal device indicated by the second device information, arrange each interface element included in the interface information to fit the screen of the second terminal device, obtaining a re-laid-out second interface.
  • the first terminal device can perform interface layout for one first interface, or for multiple first interfaces at the same time.
  • each first interface can correspond to one interface category; if there are multiple first interfaces, each of them can correspond to its own interface category.
  • the embodiments of this application take one first interface and one interface category as an example for description; the number of interfaces and categories is not limited.
  • the embodiments of the present application mainly relate to the field of artificial intelligence (AI) recognition, and particularly relate to the field of machine learning and/or neural network technology.
  • AI artificial intelligence
  • the interface recognition model in the embodiments of the present application is obtained through training based on AI recognition and machine learning techniques.
  • the first terminal device is a mobile phone as an example.
  • FIG. 2 is a schematic structural diagram of a mobile phone 200 according to an embodiment of the present application.
  • the mobile phone 200 may include a processor 210, an external memory interface 220, an internal memory 221, a USB interface 230, a charging management module 240, a power management module 241, a battery 242, antenna 1, antenna 2, mobile communication module 251, wireless communication module 252, Audio module 270, speaker 270A, receiver 270B, microphone 270C, earphone interface 270D, sensor module 280, buttons 290, motor 291, indicator 292, camera 293, display screen 294, SIM card interface 295 and so on.
  • the sensor module 280 may include a gyroscope sensor 280A, an acceleration sensor 280B, a proximity light sensor 280G, a fingerprint sensor 280H, and a touch sensor 280K. (Of course, the mobile phone 200 may also include other sensors, such as temperature sensors, pressure sensors, distance sensors, magnetic sensors, ambient light sensors, air pressure sensors, bone conduction sensors, etc., which are not shown in the figure.)
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the mobile phone 200.
  • the mobile phone 200 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 210 may include one or more processing units.
  • the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the mobile phone 200.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 210 for storing instructions and data.
  • the memory in the processor 210 is a cache memory.
  • the memory can store instructions or data that the processor 210 has just used or used cyclically. If the processor 210 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 210, and improves the efficiency of the system.
  • the memory may store interface attributes of the first terminal device, such as the interface size and interface direction of the first interface.
  • the processor 210 may run the interface layout method provided by the embodiment of the present application, so as to improve the convenience of the user to manipulate the second interface based on the second terminal device and the consistency of the manipulation based on different terminal devices.
  • the processor 210 may include different devices. For example, when a CPU and a GPU are integrated, the CPU and the GPU can cooperate to execute the interface layout method provided in the embodiments of the present application, so as to obtain faster processing efficiency. For example, the CPU can obtain, according to the received screen projection instruction, the interface information of the currently displayed first interface and the device information of the terminal device to which the screen is projected, and the GPU can generate, according to the interface information and the device information, a second interface suitable for the screen of that terminal device.
  • the display screen 294 is used to display images, videos, and the like.
  • the display screen 294 includes a display panel.
  • the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the mobile phone 200 may include one or N display screens 294, and N is a positive integer greater than one.
  • the display screen 294 can be used to display information input by the user or information provided to the user and various graphical user interfaces (GUI).
  • the display 294 may display photos, videos, web pages, or files.
  • the display 294 may display a graphical user interface.
  • the graphical user interface may include a status bar, a hidden navigation bar, time and weather widgets, and application icons, such as browser icons.
  • the status bar includes the name of the operator (for example, China Mobile), mobile network (for example, 4G), time, and remaining power.
  • the navigation bar includes a back button icon, a home button icon, and a forward button icon.
  • the status bar may also include a Bluetooth icon, a Wi-Fi icon, an external device icon, and the like.
  • the graphical user interface may also include a Dock bar, and the Dock bar may include commonly used application icons and the like.
  • the display screen 294 may be an integrated flexible display screen, or a spliced display screen composed of two rigid screens and a flexible screen located between the two rigid screens.
  • the processor 210 may control the GPU to generate a second interface for display by the second terminal device.
  • the camera 293 (a front camera or a rear camera, or one camera can be used as a front camera or a rear camera) is used to capture still images or videos.
  • the camera 293 may include photosensitive elements such as a lens group and an image sensor, where the lens group includes a plurality of lenses (convex or concave) for collecting the light signal reflected by the object to be photographed and transmitting the collected light signal to the image sensor.
  • the image sensor generates an original image of the object to be photographed according to the light signal.
  • the internal memory 221 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 210 executes various functional applications and data processing of the mobile phone 200 by running instructions stored in the internal memory 221.
  • the internal memory 221 may include a storage program area and a storage data area.
  • the storage program area can store operating system, application program (such as camera application, WeChat application, etc.) codes and so on.
  • the data storage area can store data created during the use of the mobile phone 200 (such as images and videos collected by a camera application) and the like.
  • the internal memory 221 may also store one or more computer programs corresponding to the interface layout method provided in the embodiments of the present application.
  • the one or more computer programs are stored in the aforementioned memory 221 and configured to be executed by the one or more processors 210.
  • the one or more computer programs include instructions, and the aforementioned instructions can be used to execute the method shown in FIG. 4 to FIG. 18.
  • the computer program may include a receiving module and a generating module.
  • the receiving module is used to receive a screen projection instruction, which is used to instruct the first terminal device to cast a screen to the second terminal device;
  • the generating module is used to generate, according to interface information of a first interface and second device information, a second interface for display by the second terminal device, where the first interface is an interface displayed by the first terminal device, and the second device information is used to indicate the screen size and screen status of the second terminal device.
  • the internal memory 221 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the code of the interface layout method provided in the embodiments of the present application may also be stored in an external memory.
  • the processor 210 may run the code of the interface layout method stored in the external memory through the external memory interface 220, and the processor 210 may control the GPU to generate the second interface for display by the second terminal device.
  • the function of the sensor module 280 is described below.
  • the gyroscope sensor 280A may be used to determine the movement posture of the mobile phone 200.
  • the gyroscope sensor 280A may be used to determine the angular velocity of the mobile phone 200 around three axes (i.e., the x, y, and z axes).
  • the gyroscope sensor 280A can be used to detect the current motion state of the mobile phone 200, such as shaking or static.
  • the gyroscope sensor 280A can be used to detect the folding or unfolding operation on the display screen 294.
  • the gyroscope sensor 280A may report the detected folding operation or unfolding operation as an event to the processor 210 to determine the folding state or unfolding state of the display screen 294.
  • the acceleration sensor 280B can detect the magnitude of the acceleration of the mobile phone 200 in various directions (generally along three axes), and can likewise be used to detect the current motion state of the mobile phone 200, such as shaking or static. When the display screen in the embodiment of the present application is a foldable screen, the acceleration sensor 280B can be used to detect folding or unfolding operations on the display screen 294, and may report the detected folding or unfolding operation as an event to the processor 210 to determine the folded or unfolded state of the display screen 294.
  • the proximity light sensor 280G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the mobile phone emits infrared light through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone; when insufficient reflected light is detected, the mobile phone can determine that there is no object nearby.
  • the proximity light sensor 280G can be arranged on the first screen of the foldable display screen 294, and the proximity light sensor 280G can detect the state of the first screen according to the optical path difference of the infrared signal.
  • the gyroscope sensor 280A (or the acceleration sensor 280B) may send the detected motion state information (such as angular velocity) to the processor 210.
  • the processor 210 determines whether it is currently in the hand-held state or the tripod state based on the motion state information (for example, when the angular velocity is not 0, it means that the mobile phone 200 is in the hand-held state).
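The decision rule in the preceding bullet can be sketched as follows. The non-zero angular-velocity test is taken directly from the text; the function name and tuple layout are assumptions for illustration.

```python
def holding_state(angular_velocity):
    """Classify the phone as hand-held or on a tripod from gyroscope data.

    angular_velocity: (wx, wy, wz) readings about the x, y, and z axes.
    Per the text, a non-zero angular velocity indicates the hand-held state.
    """
    return "hand-held" if any(w != 0 for w in angular_velocity) else "tripod"
```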
  • the fingerprint sensor 280H is used to collect fingerprints.
  • the mobile phone 200 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the touch sensor 280K is also called a “touch panel”.
  • the touch sensor 280K may be disposed on the display screen 294, and the touch screen is composed of the touch sensor 280K and the display screen 294, which is also called a “touch screen”.
  • the touch sensor 280K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 294.
  • the touch sensor 280K may also be disposed on the surface of the mobile phone 200, which is different from the position of the display screen 294.
  • the display screen 294 of the mobile phone 200 displays a main interface, and the main interface includes icons of multiple applications (such as a camera application, a WeChat application, etc.).
  • the display screen 294 displays the interface of the camera application, such as a viewfinder interface.
  • the wireless communication function of the mobile phone 200 can be realized by the antenna 1, the antenna 2, the mobile communication module 251, the wireless communication module 252, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile phone 200 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 251 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied on the mobile phone 200.
  • the mobile communication module 251 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 251 can receive electromagnetic waves from the antenna 1, perform processing such as filtering and amplification on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 251 can also amplify the signal modulated by the modem processor, and convert it to electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 251 may be provided in the processor 210.
  • the mobile communication module 251 can also be used to exchange information with other terminal devices, that is, to send audio output requests to other terminal devices, or the mobile communication module 251 can be used to receive an audio output request and encapsulate the received audio output request into a message in a specified format.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 270A, a receiver 270B, etc.), or displays an image or video through the display screen 294.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 210 and be provided in the same device as the mobile communication module 251 or other functional modules.
  • the wireless communication module 252 can provide wireless communication solutions applied on the mobile phone 200, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 252 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 252 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210.
  • the wireless communication module 252 may also receive the signal to be sent from the processor 210, perform frequency modulation, amplify it, and convert it into electromagnetic waves and radiate it through the antenna 2.
  • the wireless communication module 252 is configured to establish a connection with an audio output device, and output a voice signal through the audio output device.
  • the wireless communication module 252 may be used to access the access point device, send messages corresponding to audio output requests to other terminal devices, or receive messages corresponding to audio output requests sent from other terminal devices.
  • the wireless communication module 252 may also be used to receive voice data from other terminal devices.
  • the mobile phone 200 can implement audio functions, such as music playback and recording, through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the earphone interface 270D, and the application processor.
  • the mobile phone 200 can receive the key 290 input, and generate key signal input related to the user settings and function control of the mobile phone 200.
  • the mobile phone 200 can use the motor 291 to generate a vibration notification (for example, an incoming call vibration notification).
  • the indicator 292 in the mobile phone 200 can be an indicator light, which can be used to indicate the charging status, power change, and can also be used to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 295 in the mobile phone 200 is used to connect to the SIM card.
  • the SIM card can be connected to and separated from the mobile phone 200 by inserting into the SIM card interface 295 or pulling out from the SIM card interface 295.
  • the mobile phone 200 may include more or fewer components than those shown in FIG. 2, which is not limited in the embodiment of the present application.
  • the illustrated mobile phone 200 is only an example, and the mobile phone 200 may have more or fewer components than shown in the figure, may combine two or more components, or may have different component configurations.
  • the various components shown in the figure may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • the software system of the terminal device can adopt a layered architecture, event-driven architecture, micro-core architecture, micro-service architecture, or cloud architecture.
  • the embodiment of the present invention takes an Android system with a layered architecture as an example to exemplify the software structure of the terminal device.
  • Fig. 3 is a software structure block diagram of a terminal device according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, each of which has a clear role and division of labor. The layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as phone, camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, and screen projection.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and so on.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the interface attributes of the first interface can be acquired, such as the interface size and interface direction of the first interface.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include videos, images, audios, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, and so on.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
  • the telephone manager is used to provide the communication function of the terminal device. For example, the management of the call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and it can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, and so on.
  • the notification manager can also present notifications in the status bar at the top of the system in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or notifications that appear on the screen in the form of a dialog window. For example, a text message may be prompted in the status bar, a prompt tone may sound, the terminal device may vibrate, or an indicator light may flash.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • Fig. 4 is a schematic flowchart of an interface layout method provided by an embodiment of the present application. As an example and not a limitation, the method can be applied to the above-mentioned first terminal device. Referring to Fig. 4, the method includes:
  • Step 401 Receive a screen projection instruction.
  • the screen projection instruction is used to instruct the first terminal device to cast a screen to the second terminal device.
  • the screen projection instruction may include a second device identifier for instructing the second terminal device, and the first terminal device may determine to cast a screen to the second terminal device according to the second device identifier.
  • when the first terminal device is loading an application program, it can display the interface of the application program. When the network where the first terminal device is located also includes other terminal devices, for example, a second terminal device, the first terminal device can detect whether the user triggers a screen projection instruction directed at the second terminal device. If such triggering is detected, the screen projection instruction can be received, so that in subsequent steps the first terminal device can generate a second interface that matches the second terminal device.
  • for example, the first terminal device may be a mobile phone and the second terminal device may be a TV. When the first terminal device loads a fitness application program, the interface displayed on the first terminal device may be a fitness video. Because the user is in the process of exercising, it is inconvenient to hold the mobile phone, and the screen of the mobile phone is small. The first terminal device can then detect the screen-casting instruction triggered by the user, which instructs the interface of the fitness application program to be cast to the TV so that the user can view the fitness video on the TV.
  • Step 402 Obtain interface information of the first interface and second device information.
  • after the first terminal device receives the screen projection instruction, it indicates that the user expects to project the interface displayed by the first terminal device onto the second terminal device, and the second terminal device displays that interface, so that the user can conveniently and quickly control the projected interface based on the second terminal device.
  • the first terminal device can obtain the interface information of the first interface and the second device information, so that in subsequent steps it can generate, based on the interface information of the first interface and the second device information, a second interface that matches the second terminal device and is used for display on the second terminal device.
  • because different second terminal devices require the user to use different operations, the first terminal device can adjust the first interface it displays during the process of casting a screen to a different second terminal device, to obtain a second interface that matches that second terminal device.
  • the first interface is the interface displayed by the first terminal device.
  • the interface information may include interface attributes and element information of at least one interface element in the first interface.
  • the interface attributes are used to indicate the interface size and interface direction of the first interface.
  • the element information is used to indicate the name and type of each interface element and the position of the interface element in the first interface.
  • the first terminal device may recognize each interface element in the first interface according to a preset element identification mode, and determine multiple interface elements in the first interface and element information of each interface element.
  • Figure 5 shows a first interface of the player displayed by the first terminal device.
  • the first interface may include a title 501, a cover 502, a seek 503, a repeat 504, a pre 505, a play 506, a next 507, and a menu 508, among many other interface elements.
  • the first terminal device may also obtain element information of each interface element, and the element information of each interface element described above may include:
  • label is used to indicate the identification of each interface element, such as the number of each interface element
  • labelname is used to indicate the name of each interface element
  • uiRect is used to indicate the area corresponding to each interface element in the first interface
  • viewID is used to indicate the view identifier, which indicates the identification information of the image corresponding to the interface element.
  • uiRect can include four parameters: bottom, top, left, and right, where bottom is used to indicate the lower boundary of the interface element, top the upper boundary, left the left boundary, and right the right boundary.
  • the unit of each parameter in the element information may be pixels.
  • the area corresponding to the song title is: the upper boundary 102 pixels, the lower boundary 170 pixels, the left boundary 168 pixels, and the right boundary 571 pixels.
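As an illustration, the element information above can be modeled as follows. The class and field helpers are assumptions; the four uiRect boundaries and the song-title example values (upper boundary 102, lower boundary 170, left boundary 168, right boundary 571, in pixels) come from the text.

```python
from dataclasses import dataclass

@dataclass
class ElementInfo:
    label: int      # identification (e.g. number) of the interface element
    labelname: str  # name of the interface element
    top: int        # uiRect upper boundary, in pixels
    bottom: int     # uiRect lower boundary, in pixels
    left: int       # uiRect left boundary, in pixels
    right: int      # uiRect right boundary, in pixels

    @property
    def width(self) -> int:
        return self.right - self.left

    @property
    def height(self) -> int:
        return self.bottom - self.top

# The song-title area described in the text.
title = ElementInfo(label=1, labelname="title",
                    top=102, bottom=170, left=168, right=571)
```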
  • the interface element recognized by the first terminal device is an interface element that can be displayed on the second terminal device.
  • the first terminal device can first identify the interface elements, and compare and match the identified interface elements with the obtained second device information according to a preset recommendation algorithm. If it is determined that an interface element can be displayed on the second terminal device, the interface element can be extracted to obtain its element information; if it is determined that an interface element cannot be displayed on the second terminal device, the interface element can be ignored and not extracted.
  • the first terminal device may first request the second device information from the second terminal device according to the second device identifier carried in the screen projection instruction. After receiving the request, the second terminal device can extract its screen size and screen state according to preset configuration information, and feed back the second device information, composed of the screen size and screen state, to the first terminal device, which thereby completes the acquisition of the second device information.
  • the second device information of the second terminal device may include: (dst_width: 2244, dst_height: 1080, 2), which means that the resolution of the second terminal device is 2244*1080 and the screen state of the second terminal device is the landscape state represented by 2.
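A hypothetical parsing of the second device information string shown above. The text states that 2 denotes the landscape state; treating 1 as portrait, and the function and field names, are assumptions for illustration.

```python
import re

# Screen-state codes: 2 is landscape per the text; 1 as portrait is assumed.
SCREEN_STATES = {1: "portrait", 2: "landscape"}

def parse_device_info(info: str) -> dict:
    """Parse a string like '(dst_width: 2244, dst_height: 1080, 2)'."""
    width, height, state = (int(n) for n in re.findall(r"\d+", info))
    return {"width": width,
            "height": height,
            "state": SCREEN_STATES.get(state, "unknown")}
```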
  • Step 403 Perform identification according to the element information of at least one interface element in combination with the pre-trained interface recognition model, and determine the interface category.
  • the first terminal device can analyze the element information in the interface information according to the interface recognition model obtained in advance and the interface attributes included in the interface information, so as to determine the interface category corresponding to the first interface.
  • the various interface elements can be arranged according to the interface category.
  • the element information can be preprocessed, that is, each interface element is mapped into a smaller mapping area, feature extraction is performed on the mapping area to obtain interface feature data, and the interface category is then determined according to the location of each interface element indicated by the interface feature data.
  • the first terminal device may perform feature extraction on the element information of the multiple interface elements according to the interface attributes to obtain interface feature data, input the interface feature data into the interface recognition model, and identify the interface feature data through the interface recognition model to obtain the interface category output by the interface recognition model.
  • the first terminal device may first obtain the position of each interface element according to the multiple pieces of element information, and, combining the interface attributes in the interface information, calculate through a preset mapping formula the location of each interface element in the mapping area. Then, according to whether there is an interface element at each location in the mapping area, feature extraction is performed on the mapping area to obtain interface feature data indicating the locations of the interface elements. After that, the first terminal device can input the interface feature data into the pre-trained interface recognition model, analyze the interface feature data representing the positions of the interface elements through the interface recognition model, and finally identify the interface category of the first interface from the position of each interface element in the first interface.
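A minimal sketch of the preprocessing step described above: each element's position is mapped into a smaller, grid-shaped mapping area, and the grid's occupancy becomes the interface feature data fed to the interface recognition model. The 8x8 grid size and the center-point mapping formula are assumptions; the text does not specify them.

```python
GRID_W, GRID_H = 8, 8  # assumed size of the smaller mapping area

def extract_features(elements, iface_width, iface_height):
    """Map element rectangles (left, top, right, bottom) into a grid and
    return a flat 0/1 occupancy vector as the interface feature data."""
    grid = [[0] * GRID_W for _ in range(GRID_H)]
    for left, top, right, bottom in elements:
        cx = (left + right) / 2 / iface_width    # normalized element center
        cy = (top + bottom) / 2 / iface_height
        col = min(int(cx * GRID_W), GRID_W - 1)  # clamp to the grid
        row = min(int(cy * GRID_H), GRID_H - 1)
        grid[row][col] = 1
    return [cell for row in grid for cell in row]
```

The resulting vector would then be the input to the pre-trained interface recognition model that outputs the interface category.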
  • the interfaces of each application program can be divided into multiple interface types, and the embodiment of the present application does not limit the number of interface types.
  • 8 interface categories can be preset, and the schematic diagrams corresponding to each interface category are shown in Figs. 6 to 13 respectively.
  • Figure 6 shows a schematic diagram of interface category 1.
  • multiple interface elements can be located on the same layer, with no overlap between interface elements; for example, this category can be applied to a music playback interface;
  • Figure 7 shows a schematic diagram of interface category 2.
  • multiple interface elements can also be located on the same layer, but the interface elements are superimposed, for example, it can be applied to the video playback interface
  • Figure 8-a and Figure 8-b respectively show schematic diagrams of interface category 3 in the portrait and landscape states.
  • multiple interface elements can be located on the same layer, and extension items in the interface can be superimposed; for example, this category can be applied to a music playback interface with a pop-up playlist, or a video playback page with a pop-up episode selection.
  • the playlist and the video episode selection are the sliding parts;
  • Figure 9-a and Figure 9-b respectively show schematic diagrams of interface category 4 in the portrait and landscape states; each interface element in the interface is located on a different layer, and the Views area in the interface can slide up and down or in any direction. For example, this category can be applied to a page displaying multiple videos, such as the homepage or navigation interface of a video application;
  • Figure 10 shows a schematic diagram of interface category 5.
  • multiple interface elements can be located on different layers.
  • the top and bottom of the interface are provided with information bars (Bars), and the Views area of the interface can be slid; for example, this category can be applied to the chat interface or mail interface of social software.
  • Figure 11 shows a schematic diagram of interface category 6.
  • multiple interface elements can be located on different layers. Bars are set at the top of the interface, and the Views area of the interface can be slid. For example, it can be applied to The homepage of the mail application or the search interface of the shopping application;
  • Figure 12 shows a schematic diagram of interface category 7.
  • multiple interface elements can be located on different layers. The upper and lower parts of the interface are the Views area, and the Views area at the top is fixed, and the Views area at the bottom is fixed. The Views area can be slid, for example, it can be applied to the interface of live video;
  • Figure 13 shows a schematic diagram of interface category 8.
  • multiple interface elements can be located on different layers, from top to bottom, they are Bars, pictures, and labels. Tabbar, Views and Bars, where Views can slide, for example, can be applied to the product detail interface of a shopping application.
  • Step 404: Arrange the at least one interface element according to the interface category and the second device information to obtain a second interface.
  • Specifically, the first terminal device may, according to the determined interface category and in combination with the second device information of the second terminal device, arrange the at least one interface element according to the screen size and screen orientation of the second terminal device indicated by the second device information, thereby obtaining a second interface that matches the second terminal device.
  • In one implementation, the first terminal device may divide the display area of the second terminal device indicated by the second device information according to the interface category to obtain multiple sub-areas, determine the interface elements arranged in each sub-area, and then adjust each interface element in each sub-area according to the size of the display area indicated by the second device information and the number of interface elements arranged in each sub-area, to obtain the second interface.
  • That is, the first terminal device may determine the interface elements that can be arranged in each of the divided sub-areas, and then, according to the size of the display area and the number of interface elements that can be arranged in each sub-area, adjust the size, position, and direction of each interface element in each sub-area based on the element count corresponding to that sub-area and the importance of each interface element, to obtain the second interface.
  • In practice, the first terminal device may first count the interface elements arranged in each sub-area to determine the element count of each sub-area, and then, according to the size of the display area, preset arrangement rules, and the element count corresponding to each sub-area, adjust the size and direction of each interface element in each sub-area, so that the adjusted interface elements are better matched to the second terminal device. Finally, for each sub-area, the first terminal device can adjust the position of each adjusted interface element within the sub-area according to the element count corresponding to that sub-area, to obtain the second interface.
  • The importance of each adjusted interface element can also be obtained, and according to these importance values, the interface element whose importance parameter value is the largest can be arranged in the central area of the sub-area.
  • When adjusting interface elements, the first terminal device may perform various adjustment operations such as zooming, rotating, and shifting; the embodiments of the present application do not limit the adjustment operations.
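The divide-arrange-adjust procedure described above can be sketched in code. The sketch below is illustrative only: the class names, the fraction-based vertical split, and the centre-the-most-important rule are assumptions standing in for the patent's unspecified implementation.

```python
from dataclasses import dataclass

@dataclass
class Element:
    name: str
    importance: int   # assumed scale: higher means more important
    width: int
    height: int

def divide_display(height, fractions):
    """Split the display height into vertical sub-areas by the given fractions."""
    areas, top = [], 0
    for f in fractions:
        h = round(height * f)
        areas.append((top, top + h))
        top += h
    return areas

def arrange(elements, scale):
    """Scale each element for the target display, then move the most
    important element into the central position of the sub-area."""
    adjusted = [Element(e.name, e.importance,
                        round(e.width * scale), round(e.height * scale))
                for e in elements]
    adjusted.sort(key=lambda e: e.importance)   # most important element last
    mid = len(adjusted) // 2
    adjusted[mid], adjusted[-1] = adjusted[-1], adjusted[mid]
    return adjusted
```

A caller would first split the target display with `divide_display`, then run `arrange` once per sub-area with a scale factor derived from the ratio of the two screens' sizes.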
  • For example, the display area of the second terminal device can be divided into three sub-areas: upper, middle, and lower.
  • The middle sub-area may occupy 50% of the display area, and the lower sub-area may occupy 33% of the display area.
  • The song name and/or artist name can be located in the upper sub-area, and the cover and/or lyrics can be located in the middle sub-area. The interface elements comprising the play, directory, previous-song, next-song, and loop-playback controls and the progress bar can be located in the lower sub-area, which serves as the control area. According to the number of interface elements in the lower sub-area, all interface elements other than the progress bar can be arranged below the progress bar, or arranged separately on the upper and lower sides of the progress bar.
  • If the number of interface elements in the lower sub-area is less than an element threshold, the interface elements can be arranged at equal intervals below the progress bar; if the number of interface elements in the lower sub-area is greater than or equal to the element threshold, the interface elements can be arranged on the upper and lower sides of the progress bar.
  • For example, if the preset element threshold is 6 and the number of interface elements other than the progress bar in the lower sub-area shown in Figure 5 is 5, which is less than the element threshold, the other interface elements can be arranged at equal intervals below the progress bar. Moreover, during arrangement, the most important play element can be placed in the middle, the less important previous-song and next-song elements can be placed on its left and right sides, and finally the loop-playback control can be placed on the far left and the directory element on the far right.
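The threshold rule and the centre-outward importance ordering in this example can be sketched as follows; the importance scores are assumed values chosen so the ordering matches the described example, not values from the patent.

```python
def order_controls(controls):
    """Order (name, importance) pairs so the most important control sits in
    the middle and importance decreases outward, alternating right then left."""
    ranked = sorted(controls, key=lambda c: c[1], reverse=True)
    out = []
    for i, (name, _) in enumerate(ranked):
        if i % 2 == 0:
            out.append(name)       # even ranks extend to the right
        else:
            out.insert(0, name)    # odd ranks extend to the left
    return out

def layout_controls(controls, element_threshold=6):
    """One equal-interval row below the progress bar when under the
    threshold; otherwise split the controls across the two sides of it."""
    if len(controls) < element_threshold:
        return {"below": order_controls(controls)}
    ordered = order_controls(controls)
    half = len(ordered) // 2
    return {"above": ordered[:half], "below": ordered[half:]}
```

With five controls ranked play > previous > next > loop > directory, this yields loop, previous, play, next, directory from left to right, matching the arrangement described above.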
  • The proportion of the display area occupied by each sub-area is set according to the preset arrangement rules, and the element threshold of each sub-area can be obtained by learning from user habits.
  • The importance of each interface element can also be obtained according to the frequency with which the user triggers that interface element; for example, the higher the trigger frequency, the higher the importance of the interface element.
  • The embodiments of the present application do not limit the proportion of the display area occupied by each sub-area, the element threshold of each sub-area, or the way in which importance is determined.
  • It should be noted that the second terminal device may include multiple kinds of terminal devices, and the interface layout for each kind of terminal device may differ.
  • For example, a top-middle-bottom layout can be used for TVs, laptops, and tablets; a left-right layout can be used for vehicle-mounted terminal devices; and layouts with different layers can be used for watches, such as setting the Views area on the bottom layer with a floating up-and-down layout.
  • Taking the superimposed (overlay) layout as an example, an up-and-down layout can be adopted for TVs, while a left-right layout can be adopted for laptops, tablet computers, and vehicle-mounted terminal devices.
  • Step 405: Send the second interface to the second terminal device, so that the second terminal device displays the second interface.
  • After generating the second interface, the first terminal device can send it to the second terminal device, so that the second terminal device can display the second interface and show the user a second interface that matches the screen of the second terminal device.
  • It should be noted that not only can the first terminal device arrange the interface elements according to the interface category to obtain the second interface as in steps 403 and 404, but these steps may also be performed by the second terminal device.
  • That is, the second terminal device can receive the interface category and the interface elements sent by the first terminal device, and arrange the interface elements according to the interface category and the second device information, thereby generating and displaying the second interface. The process by which the second terminal device generates the second interface is similar to the process described above and will not be repeated here.
  • Step 406: Update the interface recognition model according to the obtained feedback information.
  • After the first terminal device sends the second interface to the second terminal device for display, the first terminal device can detect operations triggered by the user and obtain the feedback information that the user inputs for the second interface, so that the first terminal device can update the interface recognition model according to the acquired feedback information.
  • In one implementation, the first terminal device may first display a feedback interface to the user and detect an input operation triggered by the user. If an input operation is detected, the feedback information input by the user can be obtained and recorded. If the feedback information recorded this time, together with previously recorded feedback information, meets a preset update condition, the interface recognition model can be updated according to the multiple recorded pieces of feedback information.
  • For example, the number of recorded pieces of feedback information can be obtained and compared with a preset feedback threshold. If the number of pieces of feedback is greater than or equal to the feedback threshold, the interface recognition model can be updated according to the multiple recorded pieces of feedback information, so that the updated interface recognition model can determine the interface category more accurately.
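The update condition described here (compare the number of recorded feedback entries against a preset feedback threshold) can be sketched as a small buffer; the `retrain` callback is a hypothetical stand-in for the actual model update, which the patent does not specify.

```python
class FeedbackBuffer:
    """Accumulates user feedback and triggers a model update once the
    number of recorded feedback entries reaches a preset threshold."""

    def __init__(self, feedback_threshold, retrain):
        self.threshold = feedback_threshold
        self.retrain = retrain       # callback that rebuilds the model
        self.records = []

    def record(self, feedback):
        """Record one piece of feedback; return True if an update ran."""
        self.records.append(feedback)
        if len(self.records) >= self.threshold:   # preset update condition
            self.retrain(list(self.records))
            self.records.clear()                  # start a new round
            return True
        return False
```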
  • It should be noted that the interface layout method provided in the embodiments of the present application can be applied not only to interface projection scenarios but also to interface development scenarios.
  • In an interface development scenario, the interface elements in the first interface can be extracted manually.
  • That is, the first terminal device may extract the interface elements in the first interface according to an extraction operation triggered by the user to obtain multiple interface elements, and then generate the element information of the multiple interface elements according to a supplementary operation triggered by the user, so that in subsequent steps the interface layout can be performed according to the generated element information.
  • In one implementation, the first terminal device may load an integrated development environment (Integrated Development Environment, IDE) and, according to an input operation triggered by the user, input into the IDE the image corresponding to the first interface and the interface attributes of the first interface, that is, the first interface image and the resolution corresponding to the first interface.
  • The first terminal device can then detect a user-triggered box-selection operation for interface elements and box-select multiple interface elements in the first interface according to that operation (refer to the dashed selection boxes in Figure 17) to obtain multiple interface elements.
  • The area occupied by each interface element can be determined according to its selection box; for example, the coordinates corresponding to the four sides of the selection box can be determined as the coordinates of the interface element in the first interface.
  • The first terminal device can also prompt the user to supplement each interface element according to a preset table and generate the element information of each interface element. For example, it can obtain the name, element type, and other information of each interface element according to input operations triggered by the user, generate the element information of each interface element accordingly, and generate an overall element list based on the element information of the multiple interface elements.
  • In addition, the first terminal device may obtain, according to an operation triggered by the user, the second device information of the second terminal device input by the user. The second device information may include the name, screen resolution, and portrait/landscape state of the second terminal device.
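The element information and second device information gathered through the IDE might be represented as simple records like the following; all field names and the `width*height` resolution string format are assumptions for illustration (the orientation codes follow the Figure 18 convention of 1 for portrait and 2 for landscape).

```python
def element_info(name, elem_type, box):
    """Element info derived from a selection box: the four box edges give
    the element's coordinates within the first interface."""
    left, top, right, bottom = box
    return {"name": name, "type": elem_type,
            "left": left, "top": top, "right": right, "bottom": bottom}

def device_info(name, resolution, orientation):
    """Second device info: device name, screen resolution string, and
    screen orientation code (1 = portrait, 2 = landscape)."""
    width, height = (int(n) for n in resolution.split("*"))
    return {"name": name, "width": width, "height": height,
            "orientation": "portrait" if orientation == 1 else "landscape"}
```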
  • Afterwards, the first terminal device can perform operations similar to steps 402 and 403 to generate a second interface. It can then detect adjustment operations triggered by the user, adjust the size and position of each interface element in the second interface accordingly, and record the adjustment operations triggered by the user, so that the preset arrangement rules can be adjusted according to the recorded adjustment operations. That is, the first terminal device can record the adjustment operations the user triggers on at least one interface element in the second interface and adjust the arrangement rules according to those operations.
  • Figure 18 shows the IDE interface displayed by the first terminal device. As shown in Figure 18, the left side of the figure shows the first interface after the interface elements have been box-selected, and the upper right part records the various interface elements.
  • The middle of the right side shows the name of the first terminal device, "mobile", and the name of the second terminal device, "TV"; the screen resolution of the first terminal device, "720*1080", and the screen resolution of the second terminal device, "2244*1080"; and the portrait/landscape state of the first terminal device, "1" (indicating portrait), and that of the second terminal device, "2" (indicating landscape).
  • The lower right side shows the generated second interface, and the first terminal device can further adjust each interface element in the second interface according to adjustment operations triggered by the user.
  • In the embodiments of the present application, the first terminal device receives a screen projection instruction instructing the first terminal device to project a screen to the second terminal device, and generates, according to the second device information and the interface information of the first interface displayed by the first terminal device, a second interface for display on the second terminal device, where the second device information is used to indicate the screen size and screen state of the second terminal device. The second terminal device can thus display a second interface that matches it, and the user can conveniently control the second interface based on the second terminal device. This avoids the problem that the user cannot conveniently control the projected interface, and improves both the convenience with which the user controls the second interface based on the second terminal device and the consistency of control across different terminal devices.
  • In an interface development scenario, the user can be provided with a rearranged second interface, and each interface element in the second interface can be adjusted again according to operations triggered by the user, so that the user can obtain the second interface without laying it out manually; this reduces the time the user spends developing the interface and improves the efficiency of interface development.
  • Moreover, in the process of determining the interface category through the interface recognition model, feature extraction is first performed on the element information of the interface elements to obtain interface feature data. Using the interface feature data reduces the amount of calculation required to determine the interface category, thereby improving the efficiency of determining it. In addition, by updating the interface recognition model according to feedback information, the accuracy with which the interface recognition model identifies the interface category is improved.
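One plausible form of the feature extraction mentioned here is to normalise each element's coordinates by the interface size taken from the interface attributes and prepend a numeric type code, yielding a compact vector for the recognition model; the exact encoding is an assumption, not the patent's specified scheme.

```python
def extract_features(elements, interface_width, interface_height, type_codes):
    """Normalise each element's position and size by the interface size and
    encode its type, producing a flat feature vector for the model."""
    features = []
    for e in elements:
        features.extend([
            type_codes[e["type"]],
            e["left"] / interface_width,
            e["top"] / interface_height,
            (e["right"] - e["left"]) / interface_width,    # relative width
            (e["bottom"] - e["top"]) / interface_height,   # relative height
        ])
    return features
```

Because every coordinate is expressed as a fraction of the source interface, the same vector describes the layout regardless of the first device's absolute resolution.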
  • FIG. 19 is a structural block diagram of an interface layout device provided in an embodiment of the present application. For ease of description, only parts related to the embodiment of the present application are shown.
  • the device includes:
  • the receiving module 1901 is configured to receive a screen projection instruction, where the screen projection instruction is used to instruct the first terminal device to cast a screen to the second terminal device;
  • the generating module 1902 is configured to generate, according to the interface information of the first interface and the second device information, a second interface for display on the second terminal device, where the first interface is the interface displayed by the first terminal device, and the second device information is used to indicate the screen size and screen status of the second terminal device.
  • the generating module 1902 is specifically configured to: obtain the interface information of the first interface and the second device information, where the interface information of the first interface includes element information of at least one interface element in the first interface, and the element information is used to indicate the name and type of the interface element and the position of the interface element in the first interface; determine an interface category by performing recognition on the element information of the at least one interface element in combination with a pre-trained interface recognition model; and arrange the at least one interface element according to the interface category and the second device information to obtain the second interface.
  • the interface information of the first interface further includes interface attributes, and the interface attributes are used to indicate the interface size and interface direction of the first interface;
  • the generating module 1902 is also specifically configured to: perform feature extraction on the at least one piece of element information according to the interface attributes to obtain interface feature data; and input the interface feature data into the interface recognition model, which recognizes the interface feature data and outputs the interface category.
  • the generating module 1902 is further specifically configured to: divide, according to the interface category, the display area of the second terminal device indicated by the second device information to obtain multiple sub-areas; determine the interface elements arranged in each sub-area; and adjust each interface element in each sub-area according to the size of the display area indicated by the second device information and the number of interface elements arranged in each sub-area, to obtain the second interface.
  • the generating module 1902 is also specifically configured to: determine the element count of each interface element in each sub-area; adjust the size and direction of each interface element in each sub-area according to the size of the display area, preset arrangement rules, and the element count corresponding to each sub-area, to obtain adjusted interface elements; and, for each sub-area, adjust the position of the adjusted interface elements within the sub-area according to the element count corresponding to that sub-area, to obtain the second interface.
  • the device further includes:
  • the sending module 1903 is configured to send the second interface to the second terminal device, so that the second terminal device displays the second interface.
  • the device further includes:
  • the obtaining module 1904 is configured to obtain feedback information, where the feedback information is information that the user gives feedback on the second interface displayed by the second terminal device;
  • the update module 1905 is configured to update the interface recognition model according to the feedback information if the feedback information meets the preset update condition.
  • the device further includes:
  • the extraction module 1906 is configured to extract interface elements in the first interface according to the extraction operation triggered by the user to obtain multiple interface elements;
  • the supplement module 1907 is configured to generate element information of the multiple interface elements according to the supplementary operation triggered by the user.
  • the device also includes:
  • the recording module 1908 is used to record the adjustment operation triggered by the user on at least one interface element in the second interface;
  • the adjustment module 1909 is configured to adjust the arrangement rule according to the adjustment operation.
  • In the embodiments of the present application, the first terminal device receives a screen projection instruction instructing the first terminal device to project a screen to the second terminal device, and generates, according to the second device information and the interface information of the first interface displayed by the first terminal device, a second interface for display on the second terminal device, where the second device information is used to indicate the screen size and screen state of the second terminal device, so that the second terminal device can display a second interface matching the second terminal device. The user can conveniently control the second interface based on the second terminal device, which avoids the problem that the user cannot conveniently control the projected interface, and improves both the convenience with which the user controls the second interface based on the second terminal device and the consistency of control across different terminal devices.
  • An embodiment of the present application also provides a terminal device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor, where the processor, when executing the computer program, implements the steps in any of the foregoing interface layout method embodiments.
  • the embodiments of the present application also provide a computer-readable storage medium that stores a computer program that, when executed by a processor, implements the steps in any of the above-mentioned interface layout method embodiments.
  • FIG. 21 is a schematic structural diagram of a terminal device provided by an embodiment of the present application.
  • the terminal device 21 of this embodiment includes: at least one processor 211 (only one is shown in FIG. 21), a memory 212, and a computer program 212 stored in the memory 212 and runnable on the at least one processor 211. When the processor 211 executes the computer program 212, the steps in any of the above-mentioned interface layout method embodiments are implemented.
  • the terminal device 21 may be a computing device such as a desktop computer, a notebook, a palmtop computer, and a cloud server.
  • the terminal device may include, but is not limited to, a processor 211 and a memory 212.
  • FIG. 21 is only an example of the terminal device 21 and does not constitute a limitation on the terminal device 21; it may include more or fewer components than shown, a combination of certain components, or different components, and may, for example, also include input and output devices, network access devices, and so on.
  • the so-called processor 211 may be a central processing unit (Central Processing Unit, CPU), and may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the memory 212 may, in some embodiments, be an internal storage unit of the terminal device 21, such as a hard disk or memory of the terminal device 21. In other embodiments, the memory 212 may also be an external storage device of the terminal device 21, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, or a flash card equipped on the terminal device 21. Further, the memory 212 may also include both an internal storage unit and an external storage device of the terminal device 21.
  • the memory 212 is used to store an operating system, an application program, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 212 can also be used to temporarily store data that has been output or will be output.
  • the disclosed device and method may be implemented in other ways.
  • The system embodiment described above is merely illustrative. The division of the modules or units is only a logical function division, and there may be other division methods in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium. The computer program can be stored in a computer-readable storage medium and, when executed by a processor, can implement the steps of the foregoing method embodiments.
  • the computer program includes computer program code, and the computer program code may be in the form of source code, object code, executable file, or some intermediate forms.
  • The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a terminal device, a recording medium, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electric carrier signal, a telecommunications signal, and a software distribution medium, for example, a USB flash drive, a removable hard disk, a floppy disk, or a CD-ROM. In some jurisdictions, according to legislation and patent practice, computer-readable media cannot be electric carrier signals and telecommunications signals.


Abstract

This application is applicable to the technical field of artificial intelligence recognition, and provides an interface layout method, apparatus and system. The method includes: a first terminal device receives a screen projection instruction, where the screen projection instruction is used to instruct the first terminal device to project a screen to the second terminal device; and generates, according to interface information of a first interface and second device information, a second interface for display on the second terminal device, where the first interface is the interface displayed by the first terminal device, and the second device information is used to indicate the screen size and screen state of the second terminal device, so that the second terminal device can display a second interface matching the second terminal device. The user can conveniently control the second interface based on the second terminal device, which avoids the problem that the user cannot conveniently control the projected interface, and improves both the convenience with which the user controls the second interface based on the second terminal device and the consistency of control across different terminal devices.

Description

Interface layout method, apparatus and system
This application claims priority to the Chinese patent application filed with the China National Intellectual Property Administration on February 10, 2020, with application number 2020101068011 and titled "Interface layout method, apparatus and system", the entire contents of which are incorporated herein by reference.
Technical Field
This application belongs to the technical field of artificial intelligence recognition, and in particular relates to an interface layout method, apparatus and system.
Background
With the continuous development of terminal devices, a terminal device, while loading an application, can not only display the interface of the application but also project that interface to other terminal devices, so that the user can control the application to perform different functions through the other terminal devices and can experience a seamless, continuous service with consistent operations across different terminal devices.
In the related art, if a first terminal device detects a user-triggered screen projection operation while loading an application, it can project, according to that operation, the interface currently displayed by the application to the second terminal device indicated by the projection operation, and the second terminal device can then display the interface that the application displays on the first terminal device.
However, different terminal devices have different screen sizes, and the ease with which the user can operate each terminal device differs; as a result, after the interface displayed by the first terminal device is projected to the second terminal device, the user may not be able to conveniently control the projected interface based on the second terminal device.
Summary
The embodiments of the present application provide an interface layout method, apparatus and system, which can solve the problem that, after a first terminal device projects a displayed interface to a second terminal device, the user cannot conveniently control the projected interface based on the second terminal device.
In a first aspect, an embodiment of the present application provides an interface layout method applied to a first terminal device, where the first terminal device is connected to a second terminal device, and the method includes:
receiving a screen projection instruction, where the screen projection instruction is used to instruct the first terminal device to project a screen to the second terminal device;
generating, according to interface information of a first interface and second device information, a second interface for display on the second terminal device, where the first interface is the interface displayed by the first terminal device, and the second device information is used to indicate the screen size and screen state of the second terminal device.
In a first possible implementation of the first aspect, the generating, according to the interface information of the first interface and the second device information, a second interface for display on the second terminal device includes:
obtaining the interface information of the first interface and the second device information, where the interface information of the first interface includes element information of at least one interface element in the first interface, and the element information is used to indicate the name and type of the interface element and the position of the interface element in the first interface;
determining an interface category by performing recognition on the element information of the at least one interface element in combination with a pre-trained interface recognition model;
arranging the at least one interface element according to the interface category and the second device information to obtain the second interface.
Based on the first possible implementation of the first aspect, in a second possible implementation of the first aspect, the interface information of the first interface further includes interface attributes, where the interface attributes are used to indicate the interface size and interface direction of the first interface;
the determining an interface category by performing recognition on the element information of the at least one interface element in combination with a pre-trained interface recognition model includes:
performing feature extraction on the at least one piece of element information according to the interface attributes to obtain interface feature data;
inputting the interface feature data into the interface recognition model, and recognizing the interface feature data through the interface recognition model to obtain the interface category output by the interface recognition model.
Based on the first possible implementation of the first aspect, in a third possible implementation of the first aspect, the arranging the at least one interface element according to the interface category and the second device information to obtain the second interface includes:
dividing, according to the interface category, the display area of the second terminal device indicated by the second device information to obtain multiple sub-areas;
determining the interface elements arranged in each sub-area;
adjusting each interface element in each sub-area according to the size of the display area indicated by the second device information and the number of interface elements arranged in each sub-area, to obtain the second interface.
Based on the third possible implementation of the first aspect, in a fourth possible implementation of the first aspect, the adjusting each interface element in each sub-area according to the size of the display area indicated by the second device information and the number of interface elements arranged in each sub-area, to obtain the second interface, includes:
determining the element count of each interface element in each sub-area;
adjusting the size and direction of each interface element in each sub-area according to the size of the display area, preset arrangement rules, and the element count corresponding to each sub-area, to obtain adjusted interface elements;
for each sub-area, adjusting the position of the adjusted interface elements within the sub-area according to the element count corresponding to the sub-area, to obtain the second interface.
Based on any one of the first to fourth possible implementations of the first aspect, in a fifth possible implementation of the first aspect, after the generating, according to the interface information of the first interface and the second device information, a second interface for display on the second terminal device, the method further includes:
sending the second interface to the second terminal device so that the second terminal device displays the second interface.
Based on the fifth possible implementation of the first aspect, in a sixth possible implementation of the first aspect, after the sending the second interface to the second terminal device, the method further includes:
obtaining feedback information, where the feedback information is information fed back by the user with respect to the second interface displayed by the second terminal device;
if the feedback information meets a preset update condition, updating the interface recognition model according to the feedback information.
Based on any one of the first to fourth possible implementations of the first aspect, in a seventh possible implementation of the first aspect, before the generating, according to the interface information of the first interface and the second device information, a second interface for display on the second terminal device, the method further includes:
extracting the interface elements in the first interface according to an extraction operation triggered by the user to obtain multiple interface elements;
generating element information of the multiple interface elements according to a supplementary operation triggered by the user.
Based on any one of the first to fourth possible implementations of the first aspect, in an eighth possible implementation of the first aspect, after the generating, according to the interface information of the first interface and the second device information, a second interface for display on the second terminal device, the method further includes:
recording an adjustment operation triggered by the user on at least one interface element in the second interface;
adjusting the arrangement rules according to the adjustment operation.
第二方面,本申请实施例提供了一种界面布局装置,应用于第一终端设备,所述第一终端设备与第二终端设备连接,所述装置包括:
接收模块,用于接收投屏指令,所述投屏指令用于指示所述第一终端设备向所述第二终端设备投屏;
生成模块,用于根据第一界面的界面信息和第二设备信息,生成用于在所述第二终端设备展示的第二界面,所述第一界面为所述第一终端设备展示的界面,所述第二设备信息用于表示所述第二终端设备的屏幕尺寸和屏幕状态。
在第二方面的第一种可能的实现方式中,所述生成模块,具体用于获取所述第一界面的界面信息和所述第二设备信息,所述第一界面的界面信息包括所述第一界面中至少一个界面元素的元素信息,所述元素信息用于表示所述界面元素的名称、类型以及所述界面元素在所述第一界面中的位置;根据至少一个界面元素的所述元素信息,结合预先训练的界面识别模型进行识别,确定界面类别;根据所述界面类别和所述第二设备信息对至少一个所述界面元素进行排布,得到所述第二界面。
基于第二方面的第一种可能的实现方式,在第二方面的第二种可能的实现方式中,所述第一界面的界面信息还包括界面属性,所述界面属性用于表示所述第一界面的界面尺寸和界面方向;
所述生成模块,还具体用于根据所述界面属性对至少一个所述元素信息进行特征提取,得到界面特征数据;将所述界面特征数据输入所述界面识别模型,通过所述界面识别模型对所述界面特征数据进行识别,得到所述界面识别模型输出的所述界面类别。
基于第二方面的第一种可能的实现方式,在第二方面的第三种可能的实现方式中,所述生成模块,还具体用于根据所述界面类别,对所述第二设备信息所指示的第二终端设备的显示区域进行划分,得到多个子区域;确定每个所述子区域内排布的界面元素;根据所述第二设备信息所指示的所述显示区域的尺寸和每个所述子区域内排布的界面元素的元素数目,对各个所述子区域内的各个所述界面元素进行调整,得到所述第二界面。
基于第二方面的第三种可能的实现方式,在第二方面的第四种可能的实现方式中,所述生成模块,还具体用于确定每个所述子区域内各个所述界面元素的元素数目;根据所述显示区域的尺寸、预先设置的排布规则和每个所述子区域对应的元素数目,对每个所述子区域内的每个所述界面元素的大小和方向进行调整,得到调整后的界面元素;对于每个所述子区域,根据所述子区域对应的元素数目,对所述子区域内调整后的界面元素在所述子区域内的位置进行调整,得到所述第二界面。
基于第二方面的第一种至第四种任意一种可能的实现方式,在第二方面的第五种可能的实现方式中,所述装置还包括:
发送模块,用于向所述第二终端设备发送所述第二界面,使得所述第二终端设备展示所述第二界面。
基于第二方面的第五种可能的实现方式,所述装置还包括:
获取模块,用于获取反馈信息,所述反馈信息为用户针对所述第二终端设备展示的所述第二界面进行反馈的信息;
更新模块,用于若所述反馈信息满足预先设置的更新条件,根据所述反馈信息对界面识别模型进行更新。
基于第二方面的第一种至第四种任意一种可能的实现方式,所述装置还包括:
提取模块,用于根据用户触发的提取操作,对所述第一界面中的界面元素进行提取,得到多个所述界面元素;
补充模块,用于根据用户触发的补充操作,生成多个所述界面元素的元素信息。
基于第二方面的第一种至第四种任意一种可能的实现方式,在第二方面的第八种可能的实现方式中,所述装置还包括:
记录模块,用于记录用户对所述第二界面中至少一个界面元素触发的调整操作;
调整模块,用于根据所述调整操作对排布规则进行调整。
第三方面,本申请实施例提供了一种界面布局系统,包括:第一终端设备和第二终端设备,所述第一终端设备与所述第二终端设备连接;
所述第一终端设备接收投屏指令,所述投屏指令用于指示所述第一终端设备向所述第二终端设备投屏;
所述第一终端设备根据第一界面的界面信息和第二设备信息,生成用于在所述第二终端设备展示的第二界面,所述第一界面为所述第一终端设备展示的界面,所述第二设备信息用于表示所述第二终端设备的屏幕尺寸和屏幕状态;
所述第一终端设备向所述第二终端设备发送所述第二界面;
所述第二终端设备接收并展示所述第二界面。
第四方面,本申请实施例提供了一种终端设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,所述处理器执行所述计算机程序时实现如上述第一方面中任一项所述的界面布局方法。
第五方面,本申请实施例提供了一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时实现如上述第一方面中任一项所述的界面布局方法。
第六方面,本申请实施例提供了一种计算机程序产品,当计算机程序产品在终端设备上运行时,使得终端设备执行上述第一方面中任一项所述的界面布局方法。
本申请实施例与现有技术相比存在的有益效果是:
本申请实施例第一终端设备通过接收指示第一终端设备向第二终端设备投屏的投屏指令,并根据第二设备信息和第一终端设备展示的第一界面的界面信息,生成用于在第二终端设备展示的第二界面,其中第二设备信息用于表示第二终端设备的屏幕尺寸和屏幕状态,使得第二终端设备可以显示与第二终端设备相匹配的第二界面,用户基于第二终端设备可以方便地对第二界面进行操控,避免了用户无法方便操控投屏的界面的问题,提高了用户基于第二终端设备对第二界面进行操控的便捷性和基于不同终端设备进行操控的一致性。
附图说明
图1是本申请实施例提供的一种界面布局方法所涉及的界面布局系统的系统架构图;
图2是本申请实施例提供的手机的结构示意图;
图3是本申请实施例提供的一种软件系统的分层架构示意图;
图4是本申请实施例提供的一种界面布局方法的示意性流程图;
图5是本申请实施例提供的一种播放器的第一界面的示意图;
图6是本申请实施例提供的一种界面类别1的界面示意图;
图7是本申请实施例提供的一种界面类别2的界面示意图;
图8-a是本申请实施例提供的一种界面类别3的界面示意图;
图8-b是本申请实施例提供的另一种界面类别3的界面示意图;
图9-a是本申请实施例提供的一种界面类别4的界面示意图;
图9-b是本申请实施例提供的另一种界面类别4的界面示意图;
图10是本申请实施例提供的一种界面类别5的界面示意图;
图11是本申请实施例提供的一种界面类别6的界面示意图;
图12是本申请实施例提供的一种界面类别7的界面示意图;
图13是本申请实施例提供的一种界面类别8的界面示意图;
图14是本申请实施例提供的一种针对不同终端设备的界面示意图;
图15是本申请实施例提供的另一种针对不同终端设备的界面示意图;
图16是本申请实施例提供的又一种针对不同终端设备的界面示意图;
图17是本申请实施例提供的一种第一界面的界面示意图;
图18是本申请实施例提供的一种IDE界面的界面示意图;
图19是本申请实施例提供的一种界面布局装置的结构框图;
图20是本申请实施例提供的另一种界面布局装置的结构框图;
图21是本申请一实施例提供的终端设备的结构示意图。
具体实施方式
以下描述中,为了说明而不是为了限定,提出了诸如特定系统结构、技术之类的具体细节,以便透彻理解本申请实施例。然而,本领域的技术人员应当清楚,在没有这些具体细节的其它实施例中也可以实现本申请。在其它情况中,省略对众所周知的系统、装置、电路以及方法的详细说明,以免不必要的细节妨碍本申请的描述。
以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一个”、“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请实施例中,“一个或多个”是指一个、两个或两个以上;“和/或”,描述关联对象的关联关系,表示可以存在三种关系;例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。
本申请实施例提供的界面布局方法可以应用于手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等终端设备上,本申请实施例对终端设备的具体类型不作任何限制。
例如,所述终端设备可以是WLAN中的站点(STATION,ST),可以是蜂窝电话、无绳电话、会话启动协议(Session Initiation Protocol,SIP)电话、无线本地环路(Wireless Local Loop,WLL)站、个人数字助理(Personal Digital Assistant,PDA)设备、具有无线通信功能的手持设备、车载设备、车联网终端、电脑、膝上型计算机、手持式通信设备、手持式计算设备、卫星无线设备等。
作为示例而非限定,当所述终端设备为可穿戴设备时,该可穿戴设备可以是应用穿戴式技术对日常穿戴进行智能化设计、开发出的可穿戴设备的总称,如眼镜、手套、手表、服饰及鞋等。可穿戴设备即直接穿在身上,或是整合到用户的衣服或配件的一种便携式设备。可穿戴设备不仅仅是一种硬件设备,更是通过软件支持以及数据交互、云端交互来实现强大的功能。广义的穿戴式智能设备包括功能全、尺寸大、可不依赖智能手机实现完整或者部分功能的设备,如智能手表或智能眼镜等,以及只专注于某一类应用功能、需要和其它设备(如智能手机)配合使用的设备,如各类进行体征监测的智能手环、智能首饰等。
图1是本申请实施例提供的一种界面布局方法所涉及的界面布局系统的系统架构图,如图1所示,该界面布局系统可以包括:第一终端设备101和至少一个第二终端设备102,第一终端设备可以与每个第二终端设备连接。
其中,第一终端设备可以为用户方便执行输入操作的终端设备,第二终端设备可以为用户经常使用,但是不便执行输入操作的终端设备。例如,第一终端设备可以为手机或平板电脑,第二终端设备可以为电视、音箱、耳机或车载设备等,用户执行的输入操作可以包括:输入文本信息、对界面中各个界面元素触发的点击操作,该点击操作可以为单击操作、双击操作或其他形式的操作。
第一终端设备可以加载不同的应用程序,并可以在第一终端设备的屏幕上显示该应用程序对应的第一界面。若第一终端设备检测到用户触发的投屏指令,说明用户期望将该第一界面投影到第二终端设备,通过第二终端设备显示该应用程序运行的界面,则第一终端设备可以获取第一界面的界面信息和第二终端设备的第二设备信息,并根据界面信息和第二设备信息,生成重新布局的第二界面。之后,第一终端设备可以向第二终端设备发送重新布局的第二界面,第二终端设备则可以按照重新排布的第二界面进行展示。
其中,第一界面的界面信息可以包括第一界面中可以在第二终端设备进行展示的界面元素的元素信息,例如,该元素信息可以包括界面元素在第一界面中的位置、界面元素所属的元素类型以及界面元素的名称等。而且,第二设备信息可以包括第二终端设备的屏幕尺寸、屏幕方向和屏幕分辨率等信息。例如,第二设备信息可以指示第二终端设备的分辨率为2244*1080,横屏方向。
另外,在根据界面信息和第二设备信息,生成重新布局的第二界面的过程中,第一终端设备可以通过预先训练的界面识别模型对预处理后的界面信息进行分析,确定界面类型;再根据该界面类型,结合第二设备信息所指示的第二终端设备的屏幕大小和屏幕方向,按照第二终端设备的屏幕对界面信息中所包括的每个界面元素进行排布,得到重新布局的第二界面。
需要说明的是,在实际应用中,第一终端设备可以针对一个第一界面进行界面布局,也可以同时针对多个第一界面进行界面布局,相对应的,每个第一界面均可以对应一个界面类别,本申请实施例仅是以一个第一界面和一个界面类别为例进行说明,对第一界面和界面类别的数目不做限定。
另外,本申请实施例主要涉及人工智能(Artificial Intelligence,AI)识别领域,尤其涉及机器学习和/或神经网络技术领域。例如,本申请实施例中的界面识别模型即为通过AI识别和机器学习的技术训练得到的。
下文以第一终端设备是手机为例,图2是本申请实施例提供的手机200的结构示意图。
手机200可以包括处理器210,外部存储器接口220,内部存储器221,USB接口230,充电管理模块240,电源管理模块241,电池242,天线1,天线2,移动通信模块251,无线通信模块252,音频模块270,扬声器270A,受话器270B,麦克风270C,耳机接口270D,传感器模块280,按键290,马达291,指示器292,摄像头293,显示屏294,以及SIM卡接口295等。其中传感器模块280可以包括陀螺仪传感器280A,加速度传感器280B,接近光传感器280G、指纹传感器280H,触摸传感器280K(当然,手机200还可以包括其它传感器,比如温度传感器,压力传感器、距离传感器、磁传感器、环境光传感器、气压传感器、骨传导传感器等,图中未示出)。
可以理解的是,本发明实施例示意的结构并不构成对手机200的具体限定。在本申请另一些实施例中,手机200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器210可以包括一个或多个处理单元,例如:处理器210可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(Neural-network Processing Unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。其中,控制器可以是手机200的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器210中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器210中的存储器为高速缓冲存储器。该存储器可以保存处理器210刚用过或循环使用的指令或数据。如果处理器210需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器210的等待时间,因而提高了系统的效率。例如,存储器中可以存储第一终端设备的界面属性,如第一界面的界面尺寸和界面方向。
处理器210可以运行本申请实施例提供的界面布局方法,以便于提高用户基于第二终端设备对第二界面进行操控的便捷性和基于不同终端设备进行操控的一致性。处理器210可以包括不同的器件,比如集成CPU和GPU时,CPU和GPU可以配合执行本申请实施例提供的界面布局方法,比如界面布局方法中部分算法由CPU执行,另一部分算法由GPU执行,以得到较快的处理效率。例如,CPU可以根据接收的投屏指令,获取当前显示的第一界面的界面信息、以及被投屏的终端设备的设备信息,GPU则可以根据界面信息和设备信息生成适合被投屏的终端设备的第二界面。
显示屏294用于显示图像,视频等。显示屏294包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机200可以包括1个或N个显示屏294,N为大于1的正整数。显示屏294可用于显示由用户输入的信息或提供给用户的信息以及各种图形用户界面(graphical user interface,GUI)。例如,显示屏294可以显示照片、视频、网页、或者文件等。再例如,显示屏294可以显示图形用户界面。其中,该图形用户界面上可以包括状态栏、可隐藏的导航栏、时间和天气小组件(widget)、以及应用的图标,例如浏览器图标等。状态栏中包括运营商名称(例如中国移动)、移动网络(例如4G)、时间和剩余电量。导航栏中包括后退(back)键图标、主屏幕(home)键图标和前进键图标。此外,可以理解的是,在一些实施例中,状态栏中还可以包括蓝牙图标、Wi-Fi图标、外接设备图标等。还可以理解的是,在另一些实施例中,该图形用户界面中还可以包括Dock栏,Dock栏中可以包括常用的应用图标等。当处理器210检测到用户的手指(或触控笔等)针对某一应用图标的触摸事件后,响应于该触摸事件,打开与该应用图标对应的应用的用户界面,并在显示屏294上显示该应用的用户界面。
在本申请实施例中,显示屏294可以是一个一体的柔性显示屏,也可以采用两个刚性屏以及位于两个刚性屏之间的一个柔性屏组成的拼接显示屏。当处理器210运行本申请实施例提供的界面布局方法后,处理器210可以控制GPU生成用于第二终端设备展示的第二界面。
摄像头293(前置摄像头或者后置摄像头,或者一个摄像头既可作为前置摄像头,也可作为后置摄像头)用于捕获静态图像或视频。通常,摄像头293可以包括感光元件比如镜头组和图像传感器,其中,镜头组包括多个透镜(凸透镜或凹透镜),用于采集待拍摄物体反射的光信号,并将采集的光信号传递给图像传感器。图像传感器根据所述光信号生成待拍摄物体的原始图像。
内部存储器221可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器210通过运行存储在内部存储器221的指令,从而执行手机200的各种功能应用以及数据处理。内部存储器221可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,应用程序(比如相机应用,微信应用等)的代码等。存储数据区可存储手机200使用过程中所创建的数据(比如相机应用采集的图像、视频等)等。
内部存储器221还可以存储本申请实施例提供的界面布局方法对应的一个或多个计算机程序。该一个或多个计算机程序被存储在上述存储器221中并被配置为被该一个或多个处理器210执行,该一个或多个计算机程序包括指令,上述指令可以用于执行如图4至图18相应实施例中的各个步骤,该计算机程序可以包括接收模块和生成模块。其中,接收模块用于接收投屏指令,该投屏指令用于指示第一终端设备向第二终端设备投屏;生成模块,用于根据第一界面的界面信息和第二设备信息,生成用于在第二终端设备展示的第二界面,该第一界面为第一终端设备展示的界面,该第二设备信息用于表示第二终端设备的屏幕尺寸和屏幕状态。
此外,内部存储器221可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
当然,本申请实施例提供的界面布局方法的代码还可以存储在外部存储器中。这种情况下,处理器210可以通过外部存储器接口220运行存储在外部存储器中的界面布局方法的代码,处理器210可以控制GPU生成用于第二终端设备展示的第二界面。
下面介绍传感器模块280的功能。
陀螺仪传感器280A,可以用于确定手机200的运动姿态。在一些实施例中,可以通过陀螺仪传感器280A确定手机200围绕三个轴(即,x,y和z轴)的角速度。即陀螺仪传感器280A可以用于检测手机200当前的运动状态,比如抖动还是静止。
当本申请实施例中的显示屏为可折叠屏时,陀螺仪传感器280A可用于检测作用于显示屏294上的折叠或者展开操作。陀螺仪传感器280A可以将检测到的折叠操作或者展开操作作为事件上报给处理器210,以确定显示屏294的折叠状态或展开状态。
加速度传感器280B可检测手机200在各个方向上(一般为三轴)加速度的大小。即加速度传感器280B可以用于检测手机200当前的运动状态,比如抖动还是静止。当本申请实施例中的显示屏为可折叠屏时,加速度传感器280B可用于检测作用于显示屏294上的折叠或者展开操作。加速度传感器280B可以将检测到的折叠操作或者展开操作作为事件上报给处理器210,以确定显示屏294的折叠状态或展开状态。
接近光传感器280G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。手机通过发光二极管向外发射红外光。手机使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定手机附近有物体。当检测到不充分的反射光时,手机可以确定手机附近没有物体。当本申请实施例中的显示屏为可折叠屏时,接近光传感器280G可以设置在可折叠的显示屏294的第一屏上,接近光传感器280G可根据红外信号的光程差来检测第一屏与第二屏的折叠角度或者展开角度的大小。
陀螺仪传感器280A(或加速度传感器280B)可以将检测到的运动状态信息(比如角速度)发送给处理器210。处理器210基于运动状态信息确定当前是手持状态还是脚架状态(比如,角速度不为0时,说明手机200处于手持状态)。
指纹传感器280H用于采集指纹。手机200可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
触摸传感器280K,也称“触控面板”。触摸传感器280K可以设置于显示屏294,由触摸传感器280K与显示屏294组成触摸屏,也称“触控屏”。触摸传感器280K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏294提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器280K也可以设置于手机200的表面,与显示屏294所处的位置不同。
示例性的,手机200的显示屏294显示主界面,主界面中包括多个应用(比如相机应用、微信应用等)的图标。用户通过触摸传感器280K点击主界面中相机应用的图标,触发处理器210启动相机应用,打开摄像头293。显示屏294显示相机应用的界面,例如取景界面。
手机200的无线通信功能可以通过天线1,天线2,移动通信模块251,无线通信模块252,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。手机200中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块251可以提供应用在手机200上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块251可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块251可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块251还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块251的至少部分功能模块可以被设置于处理器210中。在一些实施例中,移动通信模块251的至少部分功能模块可以与处理器210的至少部分模块被设置在同一个器件中。在本申请实施例中,移动通信模块251还可以用于与其它终端设备进行信息交互,即向其它终端设备发送音频输出请求,或者移动通信模块251可用于接收音频输出请求,并将接收的音频输出请求封装成指定格式的消息。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器270A,受话器270B等)输出声音信号,或通过显示屏294显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器210,与移动通信模块251或其他功能模块设置在同一个器件中。
无线通信模块252可以提供应用在手机200上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块252可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块252经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器210。无线通信模块252还可以从处理器210接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。本申请实施例中,无线通信模块252,用于与音频输出设备建立连接,通过音频输出设备输出语音信号。或者无线通信模块252可以用于接入接入点设备,向其它终端设备发送音频输出请求对应的消息,或者接收来自其它终端设备发送的音频输出请求对应的消息。可选地,无线通信模块252还可以用于接收来自其它终端设备的语音数据。
另外,手机200可以通过音频模块270,扬声器270A,受话器270B,麦克风270C,耳机接口270D,以及应用处理器等实现音频功能。例如音乐播放,录音等。手机200可以接收按键290输入,产生与手机200的用户设置以及功能控制有关的键信号输入。手机200可以利用马达291产生振动提示(比如来电振动提示)。手机200中的指示器292可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。手机200中的SIM卡接口295用于连接SIM卡。SIM卡可以通过插入SIM卡接口295,或从SIM卡接口295拔出,实现和手机200的接触和分离。
应理解,在实际应用中,手机200可以包括比图2所示的更多或更少的部件,本申请实施例不作限定。图示手机200仅是一个范例,并且手机200可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。图中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
终端设备的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明终端设备的软件结构。图3是本发明实施例的终端设备的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图3所示,应用程序包可以包括电话、相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息和投屏等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图3所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。例如,可以获取第一界面的界面属性,如第一界面的界面尺寸和界面方向等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供终端设备的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,终端设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
图4是本申请实施例提供的一种界面布局方法的示意性流程图,作为示例而非限定,该方法可以应用于上述第一终端设备中,参见图4,该方法包括:
步骤401、接收投屏指令。
其中,该投屏指令用于指示第一终端设备向第二终端设备投屏。例如,该投屏指令可以包括用于指示第二终端设备的第二设备标识,则第一终端设备可以根据第二设备标识确定向第二终端设备进行投屏。
第一终端设备在加载应用程序的过程中,可以显示应用程序的界面,而第一终端设备所在网络中也包括其他终端设备时,例如包括第二终端设备时,第一终端设备可以检测用户触发的投屏指令,若检测到触发的向第二终端设备进行投屏的投屏指令,则可以接收该投屏指令,以便在后续步骤中,第一终端设备可以生成与第二终端设备相匹配的第二界面。
例如,第一终端设备可以为手机,第二终端设备可以为电视,第一终端设备加载健身类的应用程序,则在第一终端设备显示的界面可以为健身视频,而用户在健身过程中,不方便手持手机,且手机屏幕较小,第一终端设备可以检测用户触发的投屏指令,该投屏指令指示将健身类的应用程序的界面投屏至电视,以便用户通过电视查看该健身视频。
步骤402、获取第一界面的界面信息和第二设备信息。
第一终端设备在接收到投屏指令后,说明用户期望将第一终端设备显示的界面投屏到第二终端设备,通过第二终端设备显示第一终端设备所显示的界面,而为了用户可以方便快捷地基于第二终端设备对投屏的界面进行操控,第一终端设备可以获取第一界面的界面信息和第二设备信息,以便在后续步骤中,可以根据第一界面的界面信息和第二设备信息,生成与第二终端设备相匹配的、用于在第二终端设备展示的第二界面。
也即是,对于不同类型的第二终端设备,用户需要采用不同的操作控制第二终端设备,基于此,第一终端设备在向不同第二终端设备投屏的过程中,可以对第一终端设备显示的第一界面进行调整,得到与第二终端设备相匹配的第二界面。
其中,第一界面为第一终端设备展示的界面,界面信息可以包括界面属性和第一界面中至少一个界面元素的元素信息,界面属性用于表示第一界面的界面尺寸和界面方向,界面元素的元素信息用于表示界面元素的名称、类型以及界面元素在所述第一界面中的位置。例如,第一终端设备可以根据预先设置的元素识别方式,对第一界面中的各个界面元素进行识别,确定第一界面中多个界面元素和每个界面元素的元素信息。
例如,参见图5,图5示出了第一终端设备显示的播放器的第一界面,该第一界面中可以包括歌名(title)501、封面(cover)502、进度条(seek)503、循环播放控件(repeat)504、上一首(pre)505、播放(play)506、下一首(next)507和目录(menu)508等多个界面元素。
进一步地,第一终端设备还可以获取每个界面元素的元素信息,上述各个界面元素的元素信息可以包括:
[{"label":0,"labelName":"title","uiRect":{"bottom":170,"left":168,"right":571,"top":102},"viewId":684},{"label":1,"labelName":"seek","uiRect":{"bottom":1992,"left":0,"right":1080,"top":1924},"viewId":670},{"label":2,"labelName":"repeat","uiRect":{"bottom":2167,"left":84,"right":204,"top":2047},"viewId":675},{"label":3,"labelName":"pre","uiRect":{"bottom":2167,"left":279,"right":399,"top":2047},"viewId":676},{"label":4,"labelName":"play","uiRect":{"bottom":2212,"left":435,"right":645,"top":2002},"viewId":677},{"label":5,"labelName":"next","uiRect":{"bottom":2167,"left":681,"right":801,"top":2047},"viewId":678},{"label":6,"labelName":"menu","uiRect":{"bottom":2167,"left":876,"right":996,"top":2047},"viewId":679},{"label":7,"labelName":"cover","uiRect":{"bottom":1255,"left":0,"right":1080,"top":451},"viewId":618}]
其中,label用于表示各个界面元素的标识,例如可以是对各个界面元素的编号,labelName用于表示各个界面元素的名称,uiRect用于表示每个界面元素在第一界面中对应的区域,viewId用于表示视图标识,即界面元素对应的图像的标识信息。进一步地,uiRect可以包括bottom、top、left和right共4个参数,其中bottom用于表示界面元素的下边界,top用于表示界面元素的上边界,left用于表示界面元素的左边界,right用于表示界面元素的右边界。而且,元素信息中各个参数的单位均可以为像素,例如,歌名对应的区域为:上边界102像素、下边界170像素、左边界168像素、右边界571像素。
需要说明的是,上述各个界面元素的元素信息中示出的参数均为示例性的,对界面元素的元素信息不做限定。
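上述元素信息的解析可以用一段最小的Python草图示意(字段名与坐标取值节选自文中示例,辅助函数名为假设):

```python
import json

# 节选自文中播放器界面的两条元素信息,坐标单位为像素
element_info = '''[
  {"label": 0, "labelName": "title",
   "uiRect": {"bottom": 170, "left": 168, "right": 571, "top": 102}, "viewId": 684},
  {"label": 4, "labelName": "play",
   "uiRect": {"bottom": 2212, "left": 435, "right": 645, "top": 2002}, "viewId": 677}
]'''

def parse_elements(raw):
    """解析元素信息,由uiRect的上下左右边界计算每个界面元素的宽和高。"""
    result = []
    for item in json.loads(raw):
        rect = item["uiRect"]
        result.append({
            "name": item["labelName"],
            "width": rect["right"] - rect["left"],
            "height": rect["bottom"] - rect["top"],
        })
    return result

print(parse_elements(element_info))
```

例如,按上述示例数据,歌名(title)元素的宽为571-168=403像素、高为170-102=68像素。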
还需要说明的是,上述第一终端设备识别得到的界面元素为能够在第二终端设备显示的界面元素,在识别界面元素的过程中,第一终端设备可以先对第一界面中的每个界面元素进行识别,再根据预先设置的推荐算法,将识别得到的各个界面元素与获取的第二设备信息进行比较匹配,若确定某个界面元素能够在第二终端设备显示,则可以提取该界面元素,得到该界面元素的元素信息,若确定某个界面元素无法在第二终端设备显示,则可以忽略该界面元素,不再提取该界面元素。
另外,第一终端设备在获取第二设备信息的过程中,可以先根据投屏指令中所携带的第二设备标识,向第二终端设备请求第二设备信息,则第二终端设备在接收到第一终端设备发送的请求后,可以根据预先设置的配置信息,提取得到第二终端设备的屏幕尺寸和屏幕状态,并向第一终端设备反馈由屏幕尺寸和屏幕状态组成的第二设备信息,则第一终端设备获取第二设备信息完毕。
例如,第二终端设备的第二设备信息可以包括:(dst_width:2244,dst_height:1080,2),则说明第二终端设备的分辨率为2244*1080,且第二终端设备的屏幕状态为2所表示的横屏状态。
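这一格式的第二设备信息可以按如下Python草图解析(解析函数为示意性假设;屏幕状态的取值沿用文中约定,1表示竖屏、2表示横屏):

```python
def parse_device_info(info):
    """解析形如"(dst_width:2244,dst_height:1080,2)"的第二设备信息,
    返回第二终端设备的屏幕宽度、高度和横竖屏状态。"""
    parts = info.strip("()").split(",")
    fields = dict(p.split(":") for p in parts[:2])
    state = parts[2].strip()
    return {
        "width": int(fields["dst_width"]),
        "height": int(fields["dst_height"]),
        "orientation": "横屏" if state == "2" else "竖屏",
    }

print(parse_device_info("(dst_width:2244,dst_height:1080,2)"))
```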
步骤403、根据至少一个界面元素的元素信息,结合预先训练的界面识别模型进行识别,确定界面类别。
第一终端设备在获取界面信息后,可以根据预先训练得到的界面识别模型结合界面信息所包括的界面属性,对界面信息中的元素信息进行分析,从而可以确定第一界面对应的界面类别,以便在后续步骤中,可以根据界面类别对各个界面元素进行排布。
由于不同第一终端设备的屏幕分辨率各不相同,为了减少第一终端设备的计算量,可以对元素信息进行预处理,也即是,将各个界面元素映射在尺寸较小的映射区域中,并对映射区域进行特征提取,得到界面特征数据,进而根据该界面特征数据所指示的各个界面元素所在的位置确定界面类别。
可选的,第一终端设备可以根据界面属性对多个界面元素的元素信息进行特征提取,得到界面特征数据,将界面特征数据输入界面识别模型,通过界面识别模型对界面特征数据进行识别,得到界面识别模型输出的界面类别。
在一种可能的实现方式中,第一终端设备可以先根据多个元素信息获取各个界面元素的位置,并结合界面信息中的界面属性,通过预先设置的映射公式进行计算,得到各个界面元素在映射区域中的位置,再根据映射区域中各个位置是否存在界面元素,对映射区域进行特征提取,得到表示界面元素所在位置的界面特征数据。之后,第一终端设备可以将界面特征数据输入预先训练的界面识别模型,通过界面识别模型对表示界面元素位置的界面特征数据进行分析,最后根据各个界面元素在第一界面中的位置,识别得到第一界面的界面类别。
例如,映射公式可以为:
当xt≥ftop、xb≤fbot、xl≥fleft且xr≤fright时,f(x)=ftop+fleft+c;其他情况下f(x)=0。且若f(x)=0,则c=0;其他情况下c为非零常数。
其中,x=(xt,xb,xl,xr),ftop=top*dsth/src_height,fbot=bottom*dsth/src_height,fleft=left*dstw/src_width,fright=right*dstw/src_width,dsth表示映射区域的高度,dstw表示映射区域的宽度,src_height表示第一界面的高度,src_width表示第一界面的宽度。
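上述映射与特征取值可以用一段Python草图示意(各符号含义沿用文中定义,函数名与示例尺寸为假设):

```python
def map_rect(rect, src_width, src_height, dstw, dsth):
    """将界面元素在第一界面中的边界(单位像素)按文中公式缩放到映射区域,
    得到(ftop, fbot, fleft, fright)。"""
    ftop = rect["top"] * dsth / src_height
    fbot = rect["bottom"] * dsth / src_height
    fleft = rect["left"] * dstw / src_width
    fright = rect["right"] * dstw / src_width
    return ftop, fbot, fleft, fright

def f(x, mapped, c=1.0):
    """x=(xt, xb, xl, xr)为映射区域中的一个候选位置:
    当该位置完全落在映射后的元素边界内时取ftop+fleft+c,否则取0(c为非零常数)。"""
    xt, xb, xl, xr = x
    ftop, fbot, fleft, fright = mapped
    if xt >= ftop and xb <= fbot and xl >= fleft and xr <= fright:
        return ftop + fleft + c
    return 0.0

# 示例:将1080*2244的第一界面中的歌名元素映射到10*20的映射区域
mapped = map_rect({"top": 102, "bottom": 170, "left": 168, "right": 571},
                  src_width=1080, src_height=2244, dstw=10, dsth=20)
```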
需要说明的是,在实际应用中,可以将各个应用程序的界面划分为多个界面类别,本申请实施例对界面类别的数量不做限定。例如,可以预先设置8个界面类别,每个界面类别对应的示意图分别如图6至图13所示。
其中,图6展示了界面类别1的示意图,在该界面中多个界面元素可以位于同一图层,且各个界面元素之间无叠加,例如,可以应用于音乐播放的界面;图7展示了界面类别2的示意图,在该界面中多个界面元素也可以位于同一图层,但是界面元素存在叠加的情况,例如,可以应用于视频播放的界面;图8-a和图8-b分别展示了界面类别3在竖屏和横屏状态下的示意图,在该界面中多个界面元素可以位于同一图层,且界面中的扩展项可叠加,例如,可以应用于弹出歌单的音乐播放界面或弹出选集的视频播放页面,歌单和视频选集属于可滑动部分;图9-a和图9-b分别展示了界面类别4在竖屏状态下的示意图,界面中的各个界面元素位于不同图层,界面中的视图(Views)区域可以上下滑动或按照任意方向滑动,例如,可以应用在展示多个视频的页面,如视频应用程序的首页或导航界面;图10展示了界面类别5的示意图,在该界面中多个界面元素可以位于不同图层,界面顶部和底部均设置有信息栏(Bars),界面的Views区域可滑动,例如,可以应用于社交软件的聊天界面或邮件界面;图11展示了界面类别6的示意图,在该界面中多个界面元素可以位于不同图层,界面顶部设置有Bars,界面的Views区域可滑动,例如,可以应用于邮件应用程序的首页或购物应用程序的搜索界面;图12展示了界面类别7的示意图,在该界面中多个界面元素可以位于不同图层,界面上下均为Views区域,上方Views区域固定,下方Views区域可滑动,例如,可以应用于视频直播的界面;图13展示了界面类别8的示意图,在该界面中多个界面元素可以位于不同图层,从上到下依次为Bars、图片、标签栏(tabbar)、Views和Bars,其中Views可滑动,例如,可以应用于购物应用程序的商品详情界面。
步骤404、根据界面类别和第二设备信息对至少一个界面元素进行排布,得到第二界面。
在确定界面类别后,第一终端设备可以根据确定的界面类别,结合第二终端设备的第二设备信息,按照第二设备信息所指示的第二终端设备的屏幕尺寸和屏幕方向对至少一个界面元素进行排布,得到与第二终端设备相匹配的第二界面。
可选的,第一终端设备可以根据界面类别,对第二设备信息所指示的第二终端设备的显示区域进行划分,得到多个子区域,并确定每个子区域内排布的界面元素,再根据第二设备信息所指示的显示区域的尺寸和每个子区域内排布的界面元素的元素数目,对各个子区域内的各个界面元素进行调整,得到第二界面。
在一种可能的实现方式中,第一终端设备可以根据划分的多个子区域,确定每个子区域中可以排布的界面元素。之后,可以根据显示区域的尺寸和每个子区域中排布的界面元素的元素数目,按照各个界面元素的重要程度,对各个子区域中的各个界面元素在子区域中的大小、位置和方向进行调整,从而得到第二界面。
进一步地,在对界面元素的大小、位置和方向进行调整的过程中,第一终端设备可以先对每个子区域中排布的界面元素进行统计,确定每个子区域内各个界面元素的元素数目,并根据显示区域的尺寸、预先设置的排布规则和每个子区域对应的元素数目,对每个子区域内的每个界面元素的大小和方向进行调整,得到调整后的界面元素,使得调整后的界面元素与第二终端设备更匹配。最后,对于每个子区域,第一终端设备可以根据子区域对应的元素数目,对子区域内调整后的界面元素在子区域内的位置进行调整,得到第二界面。
而且,在对调整后的界面元素的位置进行调整的过程中,还可以获取各个调整后的界面元素的重要程度,并将重要程度的参数值最大的调整后的界面元素排布在子区域的中央区域。
其中,第一终端设备可以对界面元素执行缩放、旋转和位移等多种调整操作,本申请实施例对调整操作不做限定。
例如,参见图5,若图5展示的界面类别为类别1,则可以将第二终端设备的显示区域划分为上中下三个子区域,其中,上部子区域占据显示区域的17%、中部子区域占据显示区域的50%、下部子区域占据显示区域的33%。歌名和/或歌手名均可以位于上部子区域,封面和/或歌词均可以位于中部子区域,包括播放、目录、上一首、下一首、循环播放控件和进度条在内的多个界面元素则可以位于下部子区域,也即是控制区域,可以根据下部子区域中界面元素的数量,将除去进度条之外的其他界面元素均排布在进度条的下方,或者分别排布在进度条的上下两侧。
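上述类别1的区域划分可以用如下Python草图示意(上、中、下三个子区域的比例取自文中示例,函数名为假设):

```python
def split_display_area(height):
    """按文中类别1的示例,将第二终端设备显示区域的高度划分为
    上部(17%)、中部(50%)、下部(33%)三个子区域,返回各子区域的像素高度。"""
    top = round(height * 0.17)
    middle = round(height * 0.50)
    bottom = height - top - middle  # 取整余量归入下部,保证三者之和等于总高度
    return {"top": top, "middle": middle, "bottom": bottom}

print(split_display_area(1080))
```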
例如,若下部子区域的界面元素的元素数目小于元素阈值,则可以将各个界面元素等间距排布在进度条的下方;若下部子区域的界面元素的元素数目大于或等于元素阈值,则可以将各个界面元素排布在进度条的上下两侧。
若预先设置的元素阈值为6,而图5中所示的下部子区域中除去进度条之外的其他界面元素的元素数目为5,小于元素阈值,则可以将其他界面元素等间距地排布在进度条的下方。而且,在排布的过程中,可以将最重要的播放界面元素排布在中间,再将次重要的上一首和下一首分别排布在播放界面元素的左右两侧,最后可以将循环播放控件排布在最左侧,并将目录界面元素排布在最右侧。
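上述按元素阈值决定排布方式、并按重要程度自中心向两侧排布的逻辑,可以用如下Python草图示意(阈值6取自文中示例;重要程度数值与上下两侧的分法为假设):

```python
def arrange_controls(controls, threshold=6):
    """控制区界面元素的排布决策:元素数目小于元素阈值时,全部等间距排在进度条下方;
    否则分布在进度条的上下两侧(上下分法为假设)。"""
    if len(controls) < threshold:
        return {"above_seek": [], "below_seek": controls}
    half = (len(controls) + 1) // 2
    return {"above_seek": controls[:half], "below_seek": controls[half:]}

def order_by_importance(elements):
    """按重要程度自中心向两侧排布:最重要的居中,其余按名次交替排在左右两侧。
    elements为(名称, 重要程度)列表,重要程度数值为示意输入。"""
    ranked = sorted(elements, key=lambda e: e[1], reverse=True)
    row = []
    for i, (name, _) in enumerate(ranked):
        if i % 2 == 0:
            row.append(name)      # 第1、3、5名依次放在中心及右侧
        else:
            row.insert(0, name)   # 第2、4名依次放在左侧
    return row

controls = [("repeat", 2), ("pre", 4), ("play", 5), ("next", 3), ("menu", 1)]
print(order_by_importance(controls))
```

按文中示例的重要程度排序,得到的行内顺序即为循环播放控件、上一首、播放、下一首、目录。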
需要说明的是,每个子区域占据显示区域的大小是根据预先设置的排布规则设置的,而每个子区域的元素阈值则可以根据用户使用习惯进行学习后得到,类似的,每个界面元素的重要程度也可以根据用户触发界面元素的频次得到,例如触发的频次越高,则该界面元素的重要程度也越高,本申请实施例对每个子区域占据显示区域的大小、每个子区域的元素阈值、以及重要程度的确定方式不做限定。
另外,在实际应用中,第二终端设备可以包括多种终端设备,而每种终端设备的界面布局也有所不同。
例如,参见图14,以非覆盖式布局为例,针对电视、笔记本电脑和平板电脑可以采用上中下布局,针对车载终端设备可以采用左右布局,针对手表则可以采用不同图层的布局,如在底层设置Views区域,采用上下悬浮的布局。参见图15,以覆盖式布局为例,针对电视、笔记本电脑、平板电脑、车载终端设备和手表均可以采用底层设置Views,上层设置上下悬浮布局,如车载终端设备加载的地图类型的应用程序。另外,参见图16,以覆盖滚动式布局为例,针对电视可以采用上下布局的方式,针对笔记本电脑、平板电脑和车载终端设备均可以采用左右布局的方式。
步骤405、向第二终端设备发送第二界面,使得第二终端设备展示该第二界面。
第一终端设备在生成第二界面后,则可以向第二终端设备发送该第二界面,以便第二终端设备能够显示该第二界面,向用户展示与第二终端设备的屏幕相匹配的第二界面。
需要说明的是,在实际应用中,不但可以按照步骤403和步骤404,通过第一终端设备根据界面类别对界面元素进行排布得到第二界面,还可以通过第二终端设备执行步骤404,也即是,第二终端设备可以接收第一终端设备发送的界面类别和界面元素,并根据界面类别和第二设备信息对界面元素进行排布,从而生成并显示第二界面,第二终端设备生成第二界面的过程与步骤404中的过程类似,在此不再赘述。
步骤406、根据获取的反馈信息,对界面识别模型进行更新。
第一终端设备向第二终端设备发送第二界面使得第二终端设备展示第二界面后,第一终端设备可以检测用户触发的操作,获取用户输入的针对该第二界面的反馈信息,以便第一终端设备能够根据获取的反馈信息对界面识别模型进行更新。
在一种可能的实现方式中,第一终端设备在生成第二界面后,可以先向用户展示反馈界面,并检测用户触发的输入操作,若检测到输入操作,可以获取用户输入的反馈信息,在对反馈信息记录完毕后,若本次记录的反馈信息以及之前记录的反馈信息满足预先设置的更新条件,则可以根据记录的多个反馈信息,对界面识别模型进行更新。
进一步地,在确定反馈信息是否满足更新条件的过程中,可以获取记录的多个反馈信息的反馈数目,并将该反馈数目与预先设置的反馈阈值进行比较,若该反馈数目大于或等于反馈阈值,则可以根据记录的多个反馈信息对界面识别模型进行更新,使得更新后的界面识别模型能够更加准确的确定界面类别。
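上述"反馈数目达到反馈阈值则触发模型更新"的判断,可以用如下Python草图示意(阈值数值为假设,模型更新本身不在草图范围内):

```python
class FeedbackCollector:
    """记录用户针对第二终端设备展示的第二界面的反馈,
    反馈数目大于或等于反馈阈值时,判定满足更新条件。"""

    def __init__(self, threshold=3):
        self.threshold = threshold  # 反馈阈值,示例取3
        self.records = []

    def record(self, feedback):
        self.records.append(feedback)

    def should_update(self):
        return len(self.records) >= self.threshold

    def take_batch(self):
        """取出已记录的反馈用于一次界面识别模型的更新,并清空记录。"""
        batch, self.records = self.records, []
        return batch
```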
需要说明的是,本申请实施例提供的界面布局方法,不但可以应用在界面投影的场景,还可以应用在界面开发的场景。相应的,若应用在界面开发的场景中,在步骤401之前,可以对第一界面中的界面元素进行手动提取。
可选的,第一终端设备可以根据用户触发的提取操作,对第一界面中的界面元素进行提取,得到多个界面元素,再根据用户触发的补充操作,生成多个界面元素的元素信息,以便在后续步骤中,可以根据生成的元素信息进行界面布局。
例如,第一终端设备可以加载集成开发环境(Integrated Development Environment,IDE),并根据用户触发的输入操作,向IDE中输入第一界面对应的图像以及第一界面的界面属性,也即是,向IDE输入第一界面图像和第一界面对应的分辨率。之后,参见图17,第一终端设备可以检测用户触发的针对界面元素的框选操作,并根据框选操作对第一界面中的多个界面元素进行框选(参照图17中的虚线框选部分),得到多个界面元素。
其中,每个界面元素占据的区域可以根据框选界面元素的选框确定,例如可以将选框4条边分别对应的坐标,确定为界面元素在第一界面中上、下、左、右4个边界的坐标。
而且,第一终端设备还可以根据预先设置的表格,提醒用户对各个界面元素进行补充,生成每个界面元素的元素信息,例如,可以根据用户触发的输入操作,获取各个界面元素的名称和元素类型等多个信息,从而生成界面元素的元素信息,并根据多个界面元素的元素信息,生成总体元素列表。
另外,第一终端设备还可以根据用户触发的操作,获取用户输入的第二终端设备的第二设备信息,例如,该第二设备信息可以包括第二终端设备的名称、屏幕分辨率和横竖屏状态。
第一终端设备在获取界面元素后,可以执行与步骤402至步骤404类似的操作,生成第二界面,之后,可以检测用户触发的调整操作,对第二界面中的各个界面元素的大小和位置进行调整,并记录用户触发的调整操作,从而可以根据记录的调整操作对预先设置的排布规则进行调整,也即是,第一终端设备可以记录用户对第二界面中至少一个界面元素触发的调整操作,并根据调整操作对排布规则进行调整。
例如,参见图18,图18示出了第一终端设备展示的IDE界面,如图18所示,图中左侧展示了框选界面元素后的第一界面,右侧上部记录了各个界面元素的属性信息,如名称、位置和类型等,右侧中间展示了第一终端设备的名称“手机”和第二终端设备的名称“电视”、第一终端设备的屏幕分辨率“720*1080”、第二终端设备的屏幕分辨率“2244*1080”、第一终端设备的屏幕横竖屏状态“1”(表示竖屏)、以及第二终端设备的屏幕横竖屏状态“2”(表示横屏),右侧下方展示了生成的第二界面,第一终端设备可以根据用户触发的调整操作,对第二界面中的各个界面元素进行进一步调整。
综上所述,本申请实施例提供的界面布局方法,第一终端设备通过接收指示第一终端设备向第二终端设备投屏的投屏指令,并根据第二设备信息和第一终端设备展示的第一界面的界面信息,生成用于在第二终端设备展示的第二界面,其中第二设备信息用于表示第二终端设备的屏幕尺寸和屏幕状态,使得第二终端设备可以显示与第二终端设备相匹配的第二界面,用户基于第二终端设备可以方便地对第二界面进行操控,避免了用户无法方便操控投屏的界面的问题,提高了用户基于第二终端设备对第二界面进行操控的便捷性和基于不同终端设备进行操控的一致性。
而且,针对界面开发的用户,可以向用户提供重新排布的第二界面,并根据用户触发的操作再次对第二界面中的各个界面元素进行调整,用户无需手动操作即可得到第二界面,减少了用户开发界面所花费的时间,提高了用户开发界面的效率。
另外,在通过界面识别模型确定界面类别的过程中,先对界面元素的元素信息进行特征提取,得到界面特征数据,通过界面特征数据可以减少确定界面类别的计算量,从而提高确定界面类别的效率。
进一步地,通过获取反馈信息,并根据反馈信息对界面识别模型进行更新,提高了界面识别模型识别界面类型的准确度。
最后,通过对第一界面中各个界面元素的筛选和重新排布,仅对能够在第二终端设备显示的界面元素进行提取,并根据第二终端设备的屏幕大小和方向对提取的界面元素进行排布,使得生成的第二界面更加符合第二终端设备,提高了第二界面的美观可视性。
应理解,上述实施例中各步骤的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请实施例的实施过程构成任何限定。
对应于上文实施例所述的界面布局方法,图19是本申请实施例提供的一种界面布局装置的结构框图,为了便于说明,仅示出了与本申请实施例相关的部分。
参照图19,该装置包括:
接收模块1901,用于接收投屏指令,该投屏指令用于指示该第一终端设备向该第二终端设备投屏;
生成模块1902,用于根据第一界面的界面信息和第二设备信息,生成用于在该第二终端设备展示的第二界面,该第一界面为该第一终端设备展示的界面,该第二设备信息用于表示该第二终端设备的屏幕尺寸和屏幕状态。
可选的,该生成模块1902,具体用于获取该第一界面的界面信息和该第二设备信息,该第一界面的界面信息包括该第一界面中至少一个界面元素的元素信息,该元素信息用于表示该界面元素的名称、类型以及该界面元素在该第一界面中的位置;根据至少一个界面元素的该元素信息,结合预先训练的界面识别模型进行识别,确定界面类别;根据该界面类别和该第二设备信息对至少一个该界面元素进行排布,得到该第二界面。
可选的,该第一界面的界面信息还包括界面属性,该界面属性用于表示该第一界面的界面尺寸和界面方向;
该生成模块1902,还具体用于根据该界面属性对至少一个该元素信息进行特征提取,得到界面特征数据;将该界面特征数据输入该界面识别模型,通过该界面识别模型对该界面特征数据进行识别,得到该界面识别模型输出的该界面类别。
可选的,该生成模块1902,还具体用于根据该界面类别,对该第二设备信息所指示的第二终端设备的显示区域进行划分,得到多个子区域;确定每个该子区域内排布的界面元素;根据该第二设备信息所指示的该显示区域的尺寸和每个该子区域内排布的界面元素的元素数目,对各个该子区域内的各个该界面元素进行调整,得到该第二界面。
可选的,该生成模块1902,还具体用于确定每个该子区域内各个该界面元素的元素数目;根据该显示区域的尺寸、预先设置的排布规则和每个该子区域对应的元素数目,对每个该子区域内的每个该界面元素的大小和方向进行调整,得到调整后的界面元素;对于每个该子区域,根据该子区域对应的元素数目,对该子区域内调整后的界面元素在该子区域内的位置进行调整,得到该第二界面。
可选的,参见图20,该装置还包括:
发送模块1903,用于向该第二终端设备发送该第二界面,使得该第二终端设备展示该第二界面。
可选的,参见图20,该装置还包括:
获取模块1904,用于获取反馈信息,该反馈信息为用户针对该第二终端设备展示的该第二界面进行反馈的信息;
更新模块1905,用于若该反馈信息满足预先设置的更新条件,根据该反馈信息对界面识别模型进行更新。
可选的,参见图20,该装置还包括:
提取模块1906,用于根据用户触发的提取操作,对该第一界面中的界面元素进行提取,得到多个该界面元素;
补充模块1907,用于根据用户触发的补充操作,生成多个该界面元素的元素信息。
该装置还包括:
记录模块1908,用于记录用户对该第二界面中至少一个界面元素触发的调整操作;
调整模块1909,用于根据该调整操作对排布规则进行调整。
综上所述,本申请实施例提供的界面布局装置,第一终端设备通过接收指示第一终端设备向第二终端设备投屏的投屏指令,并根据第二设备信息和第一终端设备展示的第一界面的界面信息,生成用于在第二终端设备展示的第二界面,其中第二设备信息用于表示第二终端设备的屏幕尺寸和屏幕状态,使得第二终端设备可以显示与第二终端设备相匹配的第二界面,用户基于第二终端设备可以方便地对第二界面进行操控,避免了用户无法方便操控投屏的界面的问题,提高了用户基于第二终端设备对第二界面进行操控的便捷性和基于不同终端设备进行操控的一致性。
本申请实施例还提供一种终端设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,所述处理器执行所述计算机程序时实现上述任意各个界面布局方法实施例中的步骤。
本申请实施例还提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时实现上述任意各个界面布局方法实施例中的步骤。
图21是本申请一实施例提供的终端设备的结构示意图。如图21所示,该实施例的终端设备21包括:至少一个处理器211(图21中仅示出一个)、存储器212以及存储在所述存储器212中并可在所述至少一个处理器211上运行的计算机程序212,所述处理器211执行所述计算机程序212时实现上述任意各个界面布局方法实施例中的步骤。
所述终端设备21可以是桌上型计算机、笔记本、掌上电脑及云端服务器等计算设备。该终端设备可包括,但不仅限于,处理器211、存储器212。本领域技术人员可以理解,图21仅仅是终端设备21的举例,并不构成对终端设备21的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件,例如还可以包括输入输出设备、网络接入设备等。
所称处理器211可以是中央处理单元(Central Processing Unit,CPU),该处理器211还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现成可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
所述存储器212在一些实施例中可以是所述终端设备21的内部存储单元,例如终端设备21的硬盘或内存。所述存储器212在另一些实施例中也可以是所述终端设备21的外部存储设备,例如所述终端设备21上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。进一步地,所述存储器212还可以既包括所述终端设备21的内部存储单元也包括外部存储设备。所述存储器212用于存储操作系统、应用程序、引导装载程序(BootLoader)、数据以及其他程序等,例如所述计算机程序的程序代码等。所述存储器212还可以用于暂时地存储已经输出或者将要输出的数据。
所属领域的技术人员可以清楚地了解到,为了描述的方便和简洁,仅以上述各功能单元、模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能单元、模块完成,即将所述装置的内部结构划分成不同的功能单元或模块,以完成以上描述的全部或者部分功能。实施例中的各功能单元、模块可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中,上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。另外,各功能单元、模块的具体名称也只是为了便于相互区分,并不用于限制本申请的保护范围。上述系统中单元、模块的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述或记载的部分,可以参见其它实施例的相关描述。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
在本申请所提供的实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的系统实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通讯连接可以是通过一些接口,装置或单元的间接耦合或通讯连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实现上述实施例方法中的全部或部分流程,可以通过计算机程序来指令相关的硬件来完成,所述的计算机程序可存储于一计算机可读存储介质中,该计算机程序在被处理器执行时,可实现上述各个方法实施例的步骤。其中,所述计算机程序包括计算机程序代码,所述计算机程序代码可以为源代码形式、对象代码形式、可执行文件或某些中间形式等。所述计算机可读介质至少可以包括:能够将计算机程序代码携带到终端设备的任何实体或装置、记录介质、计算机存储器、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、电载波信号、电信信号以及软件分发介质。例如U盘、移动硬盘、磁碟或者光盘等。在某些司法管辖区,根据立法和专利实践,计算机可读介质不可以是电载波信号和电信信号。
以上所述实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围,均应包含在本申请的保护范围之内。

Claims (13)

  1. 一种界面布局方法,其特征在于,应用于第一终端设备,所述第一终端设备与第二终端设备连接,所述方法包括:
    接收投屏指令,所述投屏指令用于指示所述第一终端设备向所述第二终端设备投屏;
    根据第一界面的界面信息和第二设备信息,生成用于在所述第二终端设备展示的第二界面,所述第一界面为所述第一终端设备展示的界面,所述第二设备信息用于表示所述第二终端设备的屏幕尺寸和屏幕状态。
  2. 如权利要求1所述的界面布局方法,其特征在于,所述根据第一界面的界面信息和第二设备信息,生成用于在所述第二终端设备展示的第二界面,包括:
    获取所述第一界面的界面信息和所述第二设备信息,所述第一界面的界面信息包括所述第一界面中至少一个界面元素的元素信息,所述元素信息用于表示所述界面元素的名称、类型以及所述界面元素在所述第一界面中的位置;
    根据至少一个界面元素的所述元素信息,结合预先训练的界面识别模型进行识别,确定界面类别;
    根据所述界面类别和所述第二设备信息对至少一个所述界面元素进行排布,得到所述第二界面。
  3. 如权利要求2所述的界面布局方法,其特征在于,所述第一界面的界面信息还包括界面属性,所述界面属性用于表示所述第一界面的界面尺寸和界面方向;
    所述根据至少一个界面元素的所述元素信息,结合预先训练的界面识别模型进行识别,确定界面类别,包括:
    根据所述界面属性对至少一个所述元素信息进行特征提取,得到界面特征数据;
    将所述界面特征数据输入所述界面识别模型,通过所述界面识别模型对所述界面特征数据进行识别,得到所述界面识别模型输出的所述界面类别。
  4. 如权利要求2所述的界面布局方法,其特征在于,所述根据所述界面类别和所述第二设备信息对至少一个所述界面元素进行排布,得到所述第二界面,包括:
    根据所述界面类别,对所述第二设备信息所指示的第二终端设备的显示区域进行划分,得到多个子区域;
    确定每个所述子区域内排布的界面元素;
    根据所述第二设备信息所指示的所述显示区域的尺寸和每个所述子区域内排布的界面元素的元素数目,对各个所述子区域内的各个所述界面元素进行调整,得到所述第二界面。
  5. 如权利要求4所述的界面布局方法,其特征在于,所述根据所述第二设备信息所指示的所述显示区域的尺寸和每个所述子区域内排布的界面元素的元素数目,对各个所述子区域内的各个所述界面元素进行调整,得到所述第二界面,包括:
    确定每个所述子区域内各个所述界面元素的所述元素数目;
    根据所述显示区域的尺寸、预先设置的排布规则和每个所述子区域对应的元素数目,对每个所述子区域内的每个所述界面元素的大小和方向进行调整,得到调整后的界面元素;
    对于每个所述子区域,根据所述子区域对应的元素数目,对所述子区域内调整后的界面元素在所述子区域内的位置进行调整,得到所述第二界面。
  6. 如权利要求1至5任一所述的界面布局方法,其特征在于,在所述根据第一界面的界面信息和第二设备信息,生成用于在所述第二终端设备展示的第二界面之后,所述方法还包括:
    向所述第二终端设备发送所述第二界面,使得所述第二终端设备展示所述第二界面。
  7. 如权利要求6所述的界面布局方法,其特征在于,在所述向所述第二终端设备发送所述第二界面之后,所述方法还包括:
    获取反馈信息,所述反馈信息为用户针对所述第二终端设备展示的所述第二界面进行反馈的信息;
    若所述反馈信息满足预先设置的更新条件,根据所述反馈信息对界面识别模型进行更新。
  8. 如权利要求1至5任一所述的界面布局方法,其特征在于,在所述根据第一界面的界面信息和第二设备信息,生成用于在所述第二终端设备展示的第二界面之前,所述方法还包括:
    根据用户触发的提取操作,对所述第一界面中的界面元素进行提取,得到多个所述界面元素;
    根据用户触发的补充操作,生成多个所述界面元素的元素信息。
  9. 如权利要求1至5任一所述的界面布局方法,其特征在于,在所述根据第一界面的界面信息和第二设备信息,生成用于在所述第二终端设备展示的第二界面之后,所述方法还包括:
    记录用户对所述第二界面中至少一个界面元素触发的调整操作;
    根据所述调整操作对排布规则进行调整。
  10. 一种界面布局装置,其特征在于,应用于第一终端设备,所述第一终端设备与第二终端设备连接,包括:
    接收模块,用于接收投屏指令,所述投屏指令用于指示所述第一终端设备向所述第二终端设备投屏;
    生成模块,用于根据第一界面的界面信息和第二设备信息,生成用于在所述第二终端设备展示的第二界面,所述第一界面为所述第一终端设备展示的界面,所述第二设备信息用于表示所述第二终端设备的屏幕尺寸和屏幕状态。
  11. 一种界面布局系统,其特征在于,包括:第一终端设备和第二终端设备,所述第一终端设备与所述第二终端设备连接;
    所述第一终端设备接收投屏指令,所述投屏指令用于指示所述第一终端设备向所述第二终端设备投屏;
    所述第一终端设备根据第一界面的界面信息和第二设备信息,生成用于在所述第二终端设备展示的第二界面,所述第一界面为所述第一终端设备展示的界面,所述第二设备信息用于表示所述第二终端设备的屏幕尺寸和屏幕状态;
    所述第一终端设备向所述第二终端设备发送所述第二界面;
    所述第二终端设备接收并展示所述第二界面。
  12. 一种终端设备,包括存储器、处理器以及存储在所述存储器中并可在所述处理器上运行的计算机程序,其特征在于,所述处理器执行所述计算机程序时实现如权利要求1至9任一项所述的方法。
  13. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1至9任一项所述的方法。
PCT/CN2020/125607 2020-02-20 2020-10-30 界面布局方法、装置及系统 WO2021164313A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US17/801,197 US20230099824A1 (en) 2020-02-20 2020-10-30 Interface layout method, apparatus, and system
EP20920102.9A EP4080345A4 (en) 2020-02-20 2020-10-30 Interface layout method, apparatus and system
JP2022550007A JP2023514631A (ja) 2020-02-20 2020-10-30 インタフェースレイアウト方法、装置、及び、システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010106801.1 2020-02-20
CN202010106801.1A CN111399789B (zh) 2020-02-20 2020-02-20 界面布局方法、装置及系统

Publications (1)

Publication Number Publication Date
WO2021164313A1 true WO2021164313A1 (zh) 2021-08-26

Family

ID=71436045

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/125607 WO2021164313A1 (zh) 2020-02-20 2020-10-30 界面布局方法、装置及系统

Country Status (5)

Country Link
US (1) US20230099824A1 (zh)
EP (1) EP4080345A4 (zh)
JP (1) JP2023514631A (zh)
CN (1) CN111399789B (zh)
WO (1) WO2021164313A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113997786A (zh) * 2021-12-30 2022-02-01 江苏赫奕科技有限公司 一种适用于车辆的仪表界面显示方法和装置
WO2023050546A1 (zh) * 2021-09-30 2023-04-06 上海擎感智能科技有限公司 投屏处理方法、系统、电子设备和存储介质

Families Citing this family (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111324327B (zh) * 2020-02-20 2022-03-25 华为技术有限公司 投屏方法及终端设备
CN111399789B (zh) * 2020-02-20 2021-11-19 华为技术有限公司 界面布局方法、装置及系统
CN114363678A (zh) * 2020-09-29 2022-04-15 华为技术有限公司 一种投屏方法及设备
CN114115629A (zh) * 2020-08-26 2022-03-01 华为技术有限公司 一种界面显示方法及设备
EP4191400A4 (en) * 2020-08-25 2024-01-10 Huawei Tech Co Ltd METHOD AND DEVICE FOR IMPLEMENTING A USER INTERFACE
CN112153459A (zh) * 2020-09-01 2020-12-29 三星电子(中国)研发中心 用于投屏显示的方法和装置
CN114816294A (zh) 2020-09-02 2022-07-29 华为技术有限公司 一种显示方法及设备
CN113553014B (zh) * 2020-09-10 2023-01-06 华为技术有限公司 多窗口投屏场景下的应用界面显示方法及电子设备
CN114168236A (zh) * 2020-09-10 2022-03-11 华为技术有限公司 一种应用接入方法及相关装置
CN112040468B (zh) * 2020-11-04 2021-01-08 博泰车联网(南京)有限公司 用于车辆交互的方法、计算设备和计算机存储介质
CN112423084B (zh) * 2020-11-11 2022-11-01 北京字跳网络技术有限公司 热点榜单的显示方法、装置、电子设备和存储介质
CN112269527B (zh) * 2020-11-16 2022-07-08 Oppo广东移动通信有限公司 应用界面的生成方法及相关装置
CN112492358B (zh) * 2020-11-18 2023-05-30 深圳万兴软件有限公司 一种投屏方法、装置、计算机设备及存储介质
CN114579223A (zh) * 2020-12-02 2022-06-03 华为技术有限公司 一种界面布局方法、电子设备和计算机可读存储介质
CN112616078A (zh) * 2020-12-10 2021-04-06 维沃移动通信有限公司 投屏处理方法、装置、电子设备和存储介质
CN114756184A (zh) * 2020-12-28 2022-07-15 华为技术有限公司 协同显示方法、终端设备及计算机可读存储介质
CN112711389A (zh) * 2020-12-31 2021-04-27 安徽听见科技有限公司 应用于电子白板的多终端上屏方法、装置以及设备
CN112965773A (zh) * 2021-03-03 2021-06-15 闪耀现实(无锡)科技有限公司 用于信息显示的方法、装置、设备和存储介质
CN114286152A (zh) * 2021-08-02 2022-04-05 海信视像科技股份有限公司 显示设备、通信终端及投屏画面动态显示方法
CN113835802A (zh) * 2021-08-30 2021-12-24 荣耀终端有限公司 设备交互方法、系统、设备及计算机可读存储介质
CN113794917A (zh) * 2021-09-15 2021-12-14 海信视像科技股份有限公司 一种显示设备和显示控制方法
CN113934390A (zh) * 2021-09-22 2022-01-14 青岛海尔科技有限公司 一种投屏的反向控制方法和装置
CN113992958B (zh) * 2021-10-18 2023-07-18 深圳康佳电子科技有限公司 一种多窗口同屏互动方法、终端及存储介质
CN116243759B (zh) * 2021-12-08 2024-04-02 荣耀终端有限公司 一种nfc通信方法、电子设备、存储介质及程序产品
CN117850715A (zh) * 2022-09-30 2024-04-09 华为技术有限公司 投屏显示方法、电子设备及系统
CN116820229A (zh) * 2023-05-17 2023-09-29 荣耀终端有限公司 Xr空间的显示方法、xr设备、电子设备及存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130159880A1 (en) * 2011-12-14 2013-06-20 International Business Machines Corporation Dynamic screen sharing for optimal performance
CN108268225A (zh) * 2016-12-30 2018-07-10 乐视汽车(北京)有限公司 投屏方法及投屏装置
CN108874341A (zh) * 2018-06-13 2018-11-23 深圳市东向同人科技有限公司 屏幕投影方法及终端设备
CN109448709A (zh) * 2018-10-16 2019-03-08 华为技术有限公司 一种终端投屏的控制方法和终端
CN109508189A (zh) * 2018-10-18 2019-03-22 北京奇艺世纪科技有限公司 一种布局模板处理方法、装置及计算机可读存储介质
CN110377250A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏场景下的触控方法及电子设备
CN110381195A (zh) * 2019-06-05 2019-10-25 华为技术有限公司 一种投屏显示方法及电子设备
CN111399789A (zh) * 2020-02-20 2020-07-10 华为技术有限公司 界面布局方法、装置及系统

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102375733A (zh) * 2010-08-24 2012-03-14 北大方正集团有限公司 一种便捷的界面布局方法
CN103462695B (zh) * 2013-09-11 2015-11-18 深圳市科曼医疗设备有限公司 监护仪及其屏幕的布局方法与系统
CN103823620B (zh) * 2014-03-04 2017-01-25 飞天诚信科技股份有限公司 一种屏幕适配的方法和装置
CN104731589A (zh) * 2015-03-12 2015-06-24 用友网络科技股份有限公司 用户界面的自动生成方法及自动生成装置
CN106055327B (zh) * 2016-05-27 2020-02-21 联想(北京)有限公司 一种显示方法及电子设备
CN107168712B (zh) * 2017-05-19 2021-02-23 Oppo广东移动通信有限公司 界面绘制方法、移动终端及计算机可读存储介质
US20190296930A1 (en) * 2018-03-20 2019-09-26 Essential Products, Inc. Remote control of an assistant device using an adaptable user interface
CN109144656B (zh) * 2018-09-17 2022-03-08 广州视源电子科技股份有限公司 多元素布局的方法、装置、计算机设备和存储介质
CN110688179B (zh) * 2019-08-30 2021-02-12 华为技术有限公司 一种显示方法及终端设备


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4080345A4

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023050546A1 (zh) * 2021-09-30 2023-04-06 Shanghai Qinggan Intelligent Technology Co., Ltd. Screen projection processing method and system, electronic device, and storage medium
CN113997786A (zh) * 2021-12-30 2022-02-01 Jiangsu Heyi Technology Co., Ltd. Instrument interface display method and apparatus suitable for vehicles

Also Published As

Publication number Publication date
EP4080345A4 (en) 2023-06-28
US20230099824A1 (en) 2023-03-30
CN111399789B (zh) 2021-11-19
EP4080345A1 (en) 2022-10-26
CN111399789A (zh) 2020-07-10
JP2023514631A (ja) 2023-04-06

Similar Documents

Publication Publication Date Title
WO2021164313A1 (zh) Interface layout method, apparatus, and system
WO2021164631A1 (zh) Screen projection method and terminal device
WO2022100315A1 (zh) Application interface generation method and related apparatus
WO2021159922A1 (zh) Card display method, electronic device, and computer-readable storage medium
WO2021036571A1 (zh) Desktop editing method and electronic device
WO2021000841A1 (zh) Method for generating a user avatar, and electronic device
WO2021082835A1 (zh) Method for starting a function, and electronic device
JP7217357B2 (ja) Data binding method, apparatus, device, and computer program for mini programs
WO2021213449A1 (zh) Touch operation method and device
WO2022057852A1 (zh) Interaction method between multiple application programs
WO2023130921A1 (zh) Method for adapting page layout to multiple devices, and electronic device
WO2022152024A1 (zh) Widget display method and electronic device
WO2022222752A1 (zh) Display method and related apparatus
WO2021254113A1 (zh) Control method for a three-dimensional interface, and terminal
WO2022228138A1 (zh) Method for processing service cards, and electronic device
WO2022001261A1 (zh) Prompting method and terminal device
WO2022228043A1 (zh) Display method, electronic device, storage medium, and program product
WO2024067122A1 (zh) Window display method and electronic device
WO2023160455A1 (zh) Method for deleting an object, and electronic device
WO2023083184A1 (zh) Desktop management method, graphical user interface, and electronic device
WO2024066976A1 (zh) Control display method and electronic device
WO2023226922A1 (zh) Card management method, electronic device, and computer-readable storage medium
WO2022089276A1 (zh) Favorites processing method and related apparatus
WO2024037542A1 (zh) Touch input method and system, electronic device, and storage medium
WO2023072113A1 (zh) Display method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20920102

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020920102

Country of ref document: EP

Effective date: 20220722

ENP Entry into the national phase

Ref document number: 2022550007

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE