WO2020253282A1 - Item opening method and apparatus, and display device - Google Patents

Item opening method and apparatus, and display device

Info

Publication number
WO2020253282A1
Authority
WO
WIPO (PCT)
Prior art keywords
item
display area
application
view display
target application
Prior art date
Application number
PCT/CN2020/079703
Other languages
English (en)
French (fr)
Inventor
张振宝
申静
张耀仁
Original Assignee
海信视像科技股份有限公司
Priority date
Filing date
Publication date
Application filed by 海信视像科技股份有限公司
Publication of WO2020253282A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • G06F9/44505Configuring for program initiating, e.g. using registry, configuration files
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1431Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating

Definitions

  • This application relates to the field of display technology, and in particular to an item opening method and device, and a display device.
  • In split-screen mode, the applications in different split-screen windows have no knowledge of one another. Therefore, when a resource is selected in one of the split-screen windows, there is no way to know whether that resource can be opened by the application in the other split-screen window, nor can the selected resource be designated to be opened by the other application in split-screen mode. In addition, owing to limitations of the Android system, interaction between applications in split-screen mode can only be triggered through a specified interface, and the current operating system does not allow a resource to be opened by dragging it from one application into another.
  • Exemplary implementations of this application provide an item opening method and device, and a display device, which are used to open an item at a designated position by dragging it between applications in split-screen mode.
  • a display device including:
  • a display configured to display a user interface, wherein the user interface includes a first view display area and a second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application; each view display area includes one or more different items in its layout, and the user interface further includes a selector indicating that an item is selected, and the user can move the position of the selector in the user interface so that the item moves with the selector;
  • a controller in communication with the display, the controller configured to perform presentation of the user interface
  • the controller receives an instruction that an item located in the display area of the first view is selected by the user and used for movement, wherein the instruction includes the storage path of the item and the currently selected location;
  • if the controller determines that the item can be opened by the target application located in the second view display area, the controller draws the first display icon of the item;
  • the controller obtains the attributes of the starting application and the target application and the boundary position between the first view display area and the second view display area; if the controller determines that the item can be opened by the target application located in the second view display area, it draws the first display icon of the item; after detecting that the first display icon has been dragged by the user to the second view display area, it uses the target application to open the item, so that in split-screen mode the item is opened at the designated position by dragging it between applications.
  • after the controller receives the instruction that the item located in the first view display area is selected by the user, it obtains the file type of the item according to the storage path and cancels the event of the item being selected by the user in the starting application.
  • before the controller determines that the item can be opened by the target application located in the second view display area, the controller is further configured to:
  • determine, according to the selected position of the item, that the item needs to be opened by the target application located in the second view display area;
  • the controller determining that the item can be opened by the target application located in the second view display area specifically includes: the controller matching the file type of the item against the package name of the target application located in the second view display area to determine that the item can be opened by the target application.
  • the controller is further used for:
  • if it is determined that the item cannot be opened by the target application located in the second view display area, drawing a second display icon of the item.
  • the second display icon includes the icon of the item and indicates that the item cannot be opened.
  • when the controller detects that the first display icon is dragged by the user to the second view display area and uses the target application to open the item, the controller is specifically configured to:
  • when it is detected that the position coordinate of the object used to drag the first display icon is larger than the boundary position coordinate, open the item by using the target application; or, when it is detected that the position coordinate of the object used to drag the first display icon is smaller than the boundary position coordinate, open the item by using the target application.
  • An exemplary implementation manner of this application provides a project opening method, including:
  • the user interface includes a first view display area and a second view display area
  • the first view display area is the user interface of the starting application
  • the second view display area is the user interface of the target application
  • receiving an instruction indicating that an item located in the first view display area is selected by the user for movement, wherein the instruction includes the storage path of the item and the currently selected location; acquiring the attributes of the starting application and the target application and the boundary position between the first view display area and the second view display area, wherein the user interface includes the first view display area and the second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application; if it is determined that the item can be opened by the target application located in the second view display area, drawing the first display icon of the item; after detecting that the first display icon is dragged by the user to the second view display area, using the target application to open the item, so that in split-screen mode the item is opened at a designated position by dragging it between applications.
  • the instruction includes a storage path of the item and a location of the item when the instruction for being selected and moved by the user is received;
  • after the drag animation application receives the message that the item is selected, the event of the item being selected by the user in the starting application is cancelled.
  • before the drag animation application determines that the item can be opened by the target application located in the second view display area, the method further includes:
  • the drag animation application obtains the package name of the application located in the display area of the first view, and the package name of the target application located in the display area of the second view;
  • the drag animation application determines, according to the selected position of the item, that the item needs to be opened by the target application located in the second view display area;
  • the drag animation application determining that the item can be opened by the target application located in the second view display area specifically includes: the drag animation application matching the file type of the item against the package name of the target application located in the second view display area to determine that the item can be opened by the target application.
  • if it is determined that the item cannot be opened by the target application located in the second view display area, a second display icon of the item is drawn, and the second display icon includes an icon of the item and an icon indicating that the item cannot be opened by the target application;
  • the drag animation application draws an animation for instructing the item to return to the first view display area.
  • a project opening device provided in the exemplary embodiment of the present application includes:
  • the first unit is configured to receive an instruction that an item located in the display area of the first view is selected by the user and used to move, wherein the instruction includes the storage path of the item and the currently selected location;
  • the second unit is used to obtain the attributes of the start application and the target application and the boundary position of the first view display area and the second view display area, wherein the user interface includes a first view display area and The second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application;
  • the third unit is configured to draw the first display icon of the item if it is determined that the item can be opened by the target application located in the second view display area;
  • the fourth unit is configured to detect that the first display icon is dragged by the user to the second view display area, and use the target application to open the item.
  • a computing device provided in an exemplary implementation manner of the present application includes:
  • Memory used to store program instructions
  • the processor is configured to call the program instructions stored in the memory, and execute any one of the methods provided in the foregoing embodiments of the present application according to the obtained program.
  • An exemplary embodiment of the present application provides a computer storage medium that stores computer-executable instructions, and the computer-executable instructions are used to make the computer execute any of the above methods.
  • Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment
  • FIG. 2 exemplarily shows a configuration block diagram of the control device 100 according to an exemplary embodiment
  • FIG. 3 exemplarily shows a schematic diagram of the functional configuration of the display device 200 according to an exemplary embodiment
  • FIG. 4 exemplarily shows a configuration diagram of a software system in the display device 200 according to an exemplary embodiment
  • FIG. 5 exemplarily shows a schematic diagram of a user interface in the display device 200 according to an exemplary embodiment
  • FIGS. 6 to 8 exemplarily show schematic diagrams of operations between the user interface and user instructions in the display device 200 according to an exemplary embodiment
  • FIG. 9 exemplarily shows a schematic diagram of the locations of the A application and the B application in the split screen mode according to an exemplary embodiment
  • FIG. 10 exemplarily shows a schematic diagram of the effect of opening an item in the A application in the split screen mode according to an exemplary embodiment
  • Fig. 11 exemplarily shows a schematic diagram of a project opening method according to an exemplary embodiment
  • FIG. 12 exemplarily shows a schematic flow diagram of a method for opening a project by dragging an item according to an exemplary embodiment
  • FIG. 13 exemplarily shows an animation schematic diagram of an item drag process according to an exemplary embodiment
  • FIG. 14 exemplarily shows a schematic diagram of judging the split screen position of the display icon according to an exemplary embodiment
  • FIG. 15 exemplarily shows a schematic diagram of an existing project opening method according to an exemplary embodiment
  • FIGS. 16 to 18 exemplarily show schematic diagrams of a project opening method implemented by dragging items according to an exemplary embodiment
  • FIG. 19 exemplarily shows a schematic diagram of a project opening device according to an exemplary embodiment
  • Fig. 20 also exemplarily shows a schematic diagram of a project opening device according to an exemplary embodiment.
  • The term "module" as used in this application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or a combination of hardware and/or software code that can perform the function related to the element.
  • The term "gesture" as used in this application refers to a user behavior expressed through a change of hand shape or a hand movement to convey an expected idea, action, goal, and/or result.
  • Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment. As shown in FIG. 1, the user can operate the display device 200 through the control device 100.
  • the display device 200 may be a liquid crystal display, an OLED display, or a projection display device.
  • the specific display device type, size, resolution, etc. are not limited, and those skilled in the art can understand that the display device 200 can make some changes in performance and configuration as required.
  • Fig. 2 exemplarily shows a configuration block diagram of the control device 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
  • the control device 100 is configured to control the display device 200, and can receive input operation instructions from the user, and convert the operation instructions into instructions that the display device 200 can recognize and respond to, and play an intermediary role in the interaction between the user and the display device 200.
  • the user operates the channel addition and subtraction keys on the control device 100, and the display device 200 responds to the channel addition and subtraction operations.
  • control device 100 may be a smart device.
  • control device 100 can install various applications for controlling the display device 200 according to user requirements.
  • the mobile terminal 100B or other smart electronic devices can perform similar functions to the control device 100 after installing an application for controlling the display device 200.
  • By installing applications, the user can use various function keys or virtual buttons of a graphical user interface provided on the mobile terminal 100B or other smart electronic devices to realize the functions of the physical keys of the control device 100.
  • the controller 110 includes a processor 112, RAM 113 and ROM 114, a communication interface, and a communication bus.
  • the controller 110 is used to control the running and operation of the control device 100, as well as the communication and cooperation between internal components and the external and internal data processing functions.
  • the communicator 130 realizes communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the received user input signal is sent to the display device 200.
  • the communicator 130 may include at least one of communication modules such as a WIFI module 131, a Bluetooth module 132, and an NFC module 133.
  • the user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch panel 142, a sensor 143, and a button 144.
  • the user can implement the user instruction input function through voice, touch, gesture, pressing and other actions.
  • the input interface converts the received analog signal into a digital signal and the digital signal into a corresponding instruction signal, which is sent to the display device 200.
  • the output interface includes an interface for sending the received user instruction to the display device 200.
  • it may be an infrared interface or a radio frequency interface.
  • the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol, and sent to the display device 200 via the infrared sending module.
  • In the case of a radio frequency signal interface, a user input instruction needs to be converted into a digital signal, which is then modulated according to the radio frequency control signal modulation protocol and sent to the display device 200 by the radio frequency transmitting terminal.
  • control device 100 includes at least one of a communicator 130 and an output interface.
  • the control device 100 is equipped with a communicator 130, such as a WIFI module, a Bluetooth module, or an NFC module, which can encode user input commands through the WIFI protocol, the Bluetooth protocol, or the NFC protocol and send them to the display device 200.
  • the memory 190 is used to store various operating programs, data and applications for driving and controlling the control device 100 under the control of the controller 110.
  • the memory 190 can store various control signal instructions input by the user.
  • the power supply 180 is used to provide operating power support for each element of the control device 100 under the control of the controller 110. It may be a battery and related control circuit.
  • the controller 110 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display device 200, the controller 110 may perform an operation related to the object selected by the user command.
  • FIG. 3 exemplarily shows a schematic diagram of the functional configuration of the display device 200 according to an exemplary embodiment.
  • the memory 290 is used to store an operating system, application programs, content, user data, etc., and executes system operation of driving the display device 200 and responds to various operations of the user under the control of the controller 110.
  • the memory 290 may include volatile and/or non-volatile memory.
  • the memory 290 is specifically used to store the operating program that drives the controller 110 in the display device 200, and to store various application programs built into the display device 200, various application programs downloaded by the user from external devices, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and various internal data supporting the applications.
  • the memory 290 is used to store system software such as an operating system (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
  • OS operating system
  • the memory 290 is specifically used to store driver programs and related data for components such as the video processor 260-1, the audio processor 260-2, the display 280, the communication interface 230, the tuner and demodulator 220, the detector 240, and the input/output interface.
  • the memory 290 may store software and/or programs.
  • the software programs used to represent an operating system (OS) include, for example, a kernel, middleware, application programming interface (API), and/or application programs.
  • OS operating system
  • the kernel may control or manage system resources, or functions implemented by other programs (such as the middleware, API, or application program), and the kernel may provide interfaces to allow middleware and APIs, or applications to access the controller , In order to achieve control or management of system resources.
  • the memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external command recognition module 2907, a communication control module 2908, an optical receiver module 2909, a power control module 2910, an operating system 2911, other application programs 2912, a browser module, and the like.
  • the controller 110 executes various software programs in the memory 290, such as: broadcast and television signal reception and demodulation functions, TV channel selection control function, volume selection control function, image control function, display control function, audio control function, external command recognition function, communication control function, optical signal receiving function, power control function, a software control platform supporting various functions, a browser function, and other applications.
  • the memory 290 includes various software modules for driving and controlling the display device 200.
  • various software modules stored in the memory 290 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
  • the basic module is the underlying software module used for signal communication between various hardware in the display device 200 and sending processing and control signals to the upper module.
  • the detection module is a management module used to collect various information from various sensors or user input interfaces, and perform digital-to-analog conversion and analysis management.
  • FIG. 4 exemplarily shows a configuration block diagram of a software system in the display device 200 according to an exemplary embodiment.
  • the operating system 2911 includes operating software for processing various basic system services and for implementing hardware-related tasks, and acts as a medium for data processing between application programs and hardware components.
  • part of the operating system kernel may include a series of software to manage the hardware resources of the display device and provide services for other programs or software codes.
  • part of the operating system kernel may include one or more device drivers, and the device drivers may be a set of software codes in the operating system to help operate or control devices or hardware associated with the display device.
  • the driver may contain code to manipulate video, audio, and/or other multimedia components. Examples include displays, cameras, Flash, WiFi, and audio drivers.
  • the accessibility module 2911-1 is used to modify or access the application program, so as to realize the accessibility of the application program and the operability of its display content.
  • the communication module 2911-2 is used to connect to other peripherals via related communication interfaces and communication networks.
  • the user interface module 2911-3 is used to provide objects that display the user interface for access by various applications, and can realize user operability.
  • the control application 2911-4 is used to control process management, including runtime applications.
  • the event transmission system 2914 can be implemented in the operating system 2911 or in the application 2912. In some embodiments, it is implemented in the operating system 2911 on the one hand, and implemented in the application program 2912 at the same time, for monitoring various user input events, and responding to the recognition results of various events or sub-events according to various events. And implement one or more sets of pre-defined operation procedures.
  • the event monitoring module 2914-1 is used to monitor input events or sub-events of the user input interface.
  • the event recognition module 2914-1 is used to input the definitions of various events for various user input interfaces, recognize various events or sub-events, and transmit them to the corresponding one or more groups of handlers for execution.
  • the event or sub-event refers to the input detected by one or more sensors in the display device 200 and the input of an external control device (such as the control device 100).
  • For example, sub-events of voice input, gesture input sub-events of gesture recognition, and sub-events of remote-control key command input from a control device take multiple forms, including but not limited to one or a combination of pressing the up/down/left/right keys, pressing the confirm key, and other key presses, as well as operations of non-physical keys, such as moving, pressing, and releasing.
  • the interface layout management module 2913, which directly or indirectly receives the various user input events or sub-events monitored by the event transmission system 2914, is used to update the layout of the user interface, including but not limited to the position of each control or sub-control in the interface and the size, position, and level of the container, as well as other operations related to the interface layout.
  • Fig. 5 exemplarily shows a schematic diagram of a user interface in the display device 200 according to an exemplary embodiment.
  • the user interface includes multiple view display areas, for example, a first view display area 201 and a second view display area 202, and each view display area includes one or more different items in the layout.
  • the user interface also includes a selector indicating that any item is selected, and the position of the selector can be moved through user input to change the selection of different items.
  • the display area of the first view and the display area of the second view may be in different layers, or in the same layer, or may be nested or embedded in the layer where the display area of the first view is located.
  • the first view display area may also be referred to as the first view display layer
  • the second view display area may be referred to as the second view display layer.
  • the display area can also be a window.
  • When the first view display area and the second view display area are not in the same layer, in some cases one of them may be partially blocked by the other.
  • the boundaries between the multiple view display areas may be visible or invisible.
  • different view display areas can be distinguished by the different background colors of each view display area; visible marks such as boundary lines can also be used, or the boundaries can be invisible.
  • In some embodiments there is no visible or invisible boundary, and only the related items in a certain area are displayed on the screen; the items in that certain area are regarded as belonging to the same view partition.
  • The existence of the boundary of a view partition can be observed, for example, when the items in the view display area 201 are simultaneously reduced or enlarged while the view display area 202 changes differently.
  • the first view display area 201 is a zoomable view display.
  • Scalable may mean that the size or proportion of the first view display area 201 on the screen is scalable, or that the size or proportion of the items in the first view display area 201 on the screen is scalable.
  • the first view display area 201 is a scroll view display area, which can scroll and update the number of items displayed on the screen through user input.
  • Items refer to visual objects displayed in each view display area of the user interface of the display device 200 to represent corresponding content such as icons, thumbnails, and video clips.
  • the item can represent the image content or video clip of a movie, TV series, audio content of music, application program, or other user access content history information.
  • "item” may display image thumbnails.
  • When the item is a movie or TV series, the item can be displayed as a poster of the movie or TV series; if the item is music, the poster of the music album can be displayed.
  • When the item is an application, it can be displayed as the icon of the application, or as a screenshot of the application content captured when the application was most recently executed.
  • When the item is a user's access history, it can be displayed as a screenshot of the content from the most recent execution.
  • "Projects" can be displayed as video clips.
  • For example, the item can be a dynamic picture of a video clip, such as a trailer of a movie or TV series.
  • Each item can have the same size or different sizes. In some implementation examples, the size of the item can be changed.
  • The "selector" is used to indicate that an item has been selected, such as a cursor or a focus object. Based on the position of the icon or menu touched by the user on the display device 200, the focus object displayed in the display device 200 can be moved to select and control one or more items.
  • the focus object refers to the object that moves between items based on user input. For example, draw a thick line on the edge of the item to realize or identify the position of the focus object.
  • The form of the focus is not limited to these examples. It can be a tangible or intangible form that can be recognized by the user, such as a cursor or a 3D deformation of the item; the border line, size, color, transparency, outline, and/or font of the text or image of the focused item can also be changed.
  • items in the first view display area 201 and the second view display area 202 are respectively associated with different content or links.
  • each item in the first view display area 201 is a thumbnail or video clip of a poster
  • the second view display area 202 displays text and/or icons of various applications.
  • the first view display area 201 and the second view display area 202 each present different applications.
  • the first view display area 201 is a user interface that displays the initial application
  • the second view display area 202 is a user interface that displays the target application.
  • the interface layout management module 2913 is used to monitor the state of the user interface (including the position and/or size and change process of view partitions, items, focus or cursor objects, etc.), and can modify the display of each view partition according to the event or sub-event; modifying and adjusting the layout includes displaying or not displaying each view partition or the content of items in the view partition on the screen.
  • the user input interface is used to send the user's input signal to the controller 110, or to transmit the signal output from the controller to the user.
  • the control device (such as a mobile terminal or a remote control) may send input signals input by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user input interface, and then the user input interface forwards the input signal to the controller;
  • the user may input a user command on a graphical user interface (GUI) displayed on the display 200, and the user input interface receives the user input command through the graphical user interface (GUI).
  • GUI graphical user interface
  • the user can input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
  • An exemplary embodiment of the present application provides a display device, including:
  • a display configured to display a user interface, wherein the user interface includes a first view display area and a second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application; each view display area includes one or more different items in its layout, and the user interface further includes a selector indicating that an item is selected, and the user can move the position of the selector in the user interface so that the item moves with the selector;
  • a controller in communication with the display, the controller configured to perform presentation of the user interface
  • the controller receives an instruction that an item located in the display area of the first view is selected by the user and used for movement, wherein the instruction includes the storage path of the item and the currently selected location;
  • if the controller determines that the item can be opened by the target application located in the second view display area, the controller draws the first display icon of the item;
  • FIGS. 6 to 8 exemplarily show schematic diagrams of operations between the user interface and user instructions in the display device 200 according to an exemplary embodiment.
  • a plurality of different items are arranged in the first view display area 201, for example, items 1 to 6 are a plurality of different posters including text and image thumbnails.
  • No items have been displayed in the second view display area 202 (it is not limited here whether items are displayed in the second view display area, of course, pictures and other items can also be displayed).
  • The user can control item 5 (resource 5) in the display device 200 by triggering an instruction for selecting it (for example, long-pressing item 5) and moving it. When the user drags item 5 on the display device 200, the user interface changes to that shown in FIG. 7.
  • the display icon of item 5 is located at the boundary between the first view display area 201 and the second view display area 202 (the display icon of item 5 can be dragged by the user to any position on the user interface of the display device 200 );
  • The user continues to drag the display icon of item 5; when item 5 has been dragged to the second view display area 202 and the user raises his hand, the user interface changes to that shown in FIG. 8: item 5 is displayed directly in the second view display area 202, where the content of item 5 is shown enlarged or at half screen.
  • it can be dragged to the second view display area, or it can be directly opened by the target application.
  • After the controller receives an instruction that an item in the first view display area is selected by the user for movement, it obtains the storage path of the item and the selected location of the item, cancels the event of the item being selected by the user in the starting application, and obtains the file type according to the storage path of the item.
  • before the controller determines that the item can be opened by the target application located in the second view display area, the controller is further configured to:
  • determine, according to the selected position of the item, that the item needs to be opened by the target application located in the second view display area;
  • the controller determining that the item can be opened by the target application located in the second view display area specifically includes: the controller matching the file type of the item against the package name of the target application located in the second view display area to determine that the item can be opened by the target application.
  • the controller is further used for:
  • if it is determined that the item cannot be opened by the target application located in the second view display area, drawing a second display icon of the item.
  • the second display icon includes the icon of the item and indicates that the item cannot be opened.
  • FIG. 9 exemplarily shows a schematic diagram of the locations of the A application and the B application in split-screen mode according to an exemplary embodiment, where the A application is the starting application and is displayed in the first view display area, and the B application is the target application and is displayed in the second view display area.
  • Application A and application B in split-screen mode do not know each other's information. Therefore, when an item in application A is selected, it is impossible to know whether the item can be opened by application B, nor can the B application in split-screen mode be designated to open the item selected in the A application.
  • When startActivity is used to open an item, even if the item is specified to be opened by an application in the other window that is in a split-screen state with the current window, the launched application may directly cover the window in which the item was selected, rather than opening in the other window (because the Activity launch mode does not necessarily preserve the current split-screen state).
  • FIG. 10 exemplarily shows the effect of opening an item of the A application in split-screen mode with the existing method according to an exemplary embodiment. In the current split-screen mode, when an item in the A application is selected and opened, the following four situations can occur: in the first situation, the item is opened by the C application and the opened interface is in the A window; in the second situation, the item is opened by the C application and the opened interface is in the B window; in the third situation, the item is opened by the B application and the opened interface is in the A window; in the fourth situation, the item is opened by the B application and the opened interface is in the B window. The following example explains application A, application B, and application C: for example, application A is a file manager and a PPT document displayed in it is the item of application A, and application B is an electronic whiteboard tool. Since the electronic whiteboard tool does not support opening PPT documents, while a WPS application is integrated in the machine and the PPT can be opened by WPS, the WPS application is the C application.
  • FIG. 11 exemplarily shows a method for opening a project according to an exemplary embodiment, including:
  • S101 Receive an instruction that an item located in the display area of the first view is selected by the user and used for movement, where the instruction includes the storage path of the item and the currently selected location;
  • The instruction indicating that the item is selected by the user for movement is, for example, a long press on the item by the user; the long press is the drag trigger event. When the item is long-pressed by the user, the drag animation application AnimationApp is notified, and the AnimationApp receives the long-press event together with the item's storage path and selected location.
  • After the drag animation application receives the instruction to move the item to the target application, it obtains the information of the starting application and the target application and the split-screen position, parses the file type of the item from the item's storage path, and determines whether the item can be opened by the B application.
  • If the long-pressed item is a picture, the first display icon includes a thumbnail of the item and a green plus sign indicating that the item can be opened by the preset application; if the long-pressed item is another type of file, the first display icon includes the icon of the item (a copy of the long-pressed item's icon) and the green plus sign used to indicate that the item can be opened by the preset application.
  • the user interface of a smart TV or mobile phone terminal includes a first view display area and a second view display area.
  • the first view display area is the left screen of the interface
  • the second view display area is the right screen of the interface.
  • the first view display area is the right screen of the interface
  • the second view display area is the left screen of the interface.
  • the instruction includes a storage path of the item and a location of the item when the instruction for being selected and moved by the user is received;
  • after the drag animation application receives the message that the item is selected, the event of the item being selected by the user in the starting application is cancelled.
  • the message that the item is long-pressed includes the path information when the item is long-pressed.
  • The drag animation application obtains the extension according to the path information of the item, so as to obtain the file type of the item; the event of cancelling the selection of the item is, for example, cancelling the long press on the item.
  • before the drag animation application determines that the item can be opened by the target application located in the second view display area, the method further includes:
  • the drag animation application obtains the package name of the application located in the display area of the first view, and the package name of the target application located in the display area of the second view;
  • the drag animation application determines, according to the selected position of the item, that the item needs to be opened by the target application located in the second view display area;
  • the drag animation application determining that the item can be opened by the target application located in the second view display area specifically includes: the drag animation application matching the file type of the item against the package name of the target application located in the second view display area to determine that the item can be opened by the target application.
  • if it is determined that the item cannot be opened by the target application located in the second view display area, a second display icon of the item is drawn, and the second display icon includes an icon of the item and an icon indicating that the item cannot be opened by the target application;
  • If the long-pressed item is a picture, the second display icon includes a thumbnail of the item and a red stop sign indicating that the item cannot be opened by the preset application; if the long-pressed item is another type of file, the second display icon includes the icon of the item (a copy of the long-pressed item's icon) and a red stop sign used to indicate that the item cannot be opened by the preset application.
  • the drag animation application draws an animation for instructing the item to return to the first view display area.
  • Fig. 12 exemplarily shows a flowchart of a method for opening an item by dragging an item according to an exemplary embodiment, which specifically includes the following content:
  • In the first step, after the operating system is started, the drag animation application AnimationApp starts and saves the custom item opening methods and the supported file types in userHashMap (userHashMap is a custom opening-method mapping table; for example, for an electronic whiteboard application that supports inserting picture files, a piece of mapping data is generated in which the electronic whiteboard package name is the hash key and the method of inserting the picture is the data). For example, the customized information is saved in an xml file, and the AnimationApp parses it and saves it indexed by package name (the package name serves as the identification of an application in the operating system; the package name of each application in the operating system is different);
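  • As an illustration only, the userHashMap described in the first step might be held in memory roughly as sketched below once the xml file of customized information has been parsed; the class and field names (OpenMethodInfo, method, mimeTypes) are assumptions introduced for the sketch and are not part of this application.

        import java.util.HashMap;
        import java.util.Map;
        import java.util.Set;

        public final class UserOpenMethods {
            /** Hypothetical value type: one custom opening method plus the MIME types it supports. */
            public static final class OpenMethodInfo {
                public final String method;          // e.g. "insertImage" for the electronic whiteboard
                public final Set<String> mimeTypes;  // e.g. {"image/jpeg", "image/png"}

                public OpenMethodInfo(String method, Set<String> mimeTypes) {
                    this.method = method;
                    this.mimeTypes = mimeTypes;
                }
            }

            /** userHashMap: indexed by package name, as the first step describes. */
            private final Map<String, OpenMethodInfo> userHashMap = new HashMap<>();

            /** Adds one entry, e.g. after parsing a record from the xml file. */
            public void put(String packageName, OpenMethodInfo info) {
                userHashMap.put(packageName, info);
            }

            /** True when the table says packageName has a custom method for items of the given MIME type. */
            public boolean supports(String packageName, String mimeType) {
                OpenMethodInfo info = userHashMap.get(packageName);
                return info != null && info.mimeTypes.contains(mimeType);
            }
        }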
  • FIG. 13 exemplarily shows an animation diagram of the item drag process according to the exemplary embodiment.
  • In the second step, the A application and the B application are in split-screen mode, and the user long-presses the item S in the A application.
  • For example, item S is the picture s.jpg. On the long press, the application notifies the AnimationApp by broadcast that item S has been long-pressed by the user, and passes the save path of item S to the AnimationApp (for example, the save path of the long-pressed picture s.jpg);
  • After the AnimationApp receives the notification, it sends MotionEvent.ACTION_CANCEL through the inputfilter (a long press is a continuous process that lasts until the user raises his hand; this step forcibly ends the long press at the system level, that is, sends a touch-cancel event to the interface of the application currently being long-pressed), so as to cancel the long-press event in the A application and prevent the drag process from affecting the A application while the item is being dragged;
  • The inputfilter is a key event-processing channel; all touch events are first filtered through this channel, which determines whether each event continues to be dispatched;
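  • The inputfilter referred to above is a system-level hook rather than a public SDK API, so the sketch below only approximates the same idea at the application level: delivering MotionEvent.ACTION_CANCEL to the view that is tracking the long press so the starting application stops handling the gesture. The helper name and the coordinates passed in are assumptions made for the sketch.

        import android.os.SystemClock;
        import android.view.MotionEvent;
        import android.view.View;

        public final class TouchCancelHelper {
            /**
             * Dispatches an ACTION_CANCEL event to the view currently handling the
             * long press, so that the starting application ends the gesture. A
             * system-level inputfilter, as described in the embodiment, would
             * instead stop the touch events from being dispatched at all.
             */
            public static void cancelLongPress(View longPressedView, float x, float y) {
                long now = SystemClock.uptimeMillis();
                MotionEvent cancel = MotionEvent.obtain(now, now, MotionEvent.ACTION_CANCEL, x, y, 0);
                longPressedView.dispatchTouchEvent(cancel);
                cancel.recycle();
            }
        }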
  • The AnimationApp obtains the runningTaskInfo of the left and right split screens respectively through the new interfaces getTopTaskLeft() and getTopTaskRight() of the activity manager ActivityManager (RunningTaskInfo is a data structure natively provided by the operating system and can be used to determine which applications are currently running), thereby obtaining the package name packageA of the split-screen A application and the package name packageB of the B application (package names and applications correspond one to one, so the package name is used to distinguish different applications). According to the location information when item S is long-pressed, it determines that the initiator of the open-item request is packageA and the expected item opener is packageB (the application of the window in which the item is long-pressed is the initiator of the open-item request, and the other application is the expected item opener), and that the split-screen position where item S will be opened is openside, which is the window where the B application is located;
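  • getTopTaskLeft() and getTopTaskRight() are described here as new interfaces added to ActivityManager, so they are not part of the stock SDK; the sketch below only shows how the AnimationApp might turn the two RunningTaskInfo results into packageA and packageB, assuming such extensions exist (the ExtendedActivityManager interface below is a stand-in for them).

        import android.app.ActivityManager;
        import android.content.ComponentName;

        public final class SplitScreenTasks {
            /** Stand-in for the patched ActivityManager described in the embodiment. */
            public interface ExtendedActivityManager {
                ActivityManager.RunningTaskInfo getTopTaskLeft();
                ActivityManager.RunningTaskInfo getTopTaskRight();
            }

            public final String leftPackage;   // e.g. packageA when the A application is on the left
            public final String rightPackage;  // e.g. packageB when the B application is on the right

            private SplitScreenTasks(String leftPackage, String rightPackage) {
                this.leftPackage = leftPackage;
                this.rightPackage = rightPackage;
            }

            /** Reads the top task of each split-screen window and extracts its package name. */
            public static SplitScreenTasks query(ExtendedActivityManager am) {
                return new SplitScreenTasks(packageOf(am.getTopTaskLeft()),
                                            packageOf(am.getTopTaskRight()));
            }

            private static String packageOf(ActivityManager.RunningTaskInfo task) {
                ComponentName top = (task == null) ? null : task.topActivity;
                return (top == null) ? null : top.getPackageName();
            }
        }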
  • In the third step, the extension of item S is obtained from the save path information of item S passed to the AnimationApp in the second step, so as to obtain the MimeType (MimeType is the file format type), which here is image/jpeg. Then, using the obtained package name packageB of the B application, the userHashMap is queried to see whether it supports opening the selected item S (userHashMap is the opening-method mapping table; the table is indexed by the package name of the application, with the opening method and supported file types as the data; since the AnimationApp has already obtained the package name packageB of the B application, the opening method and file types supported by packageB can be compared with the file type of item S to learn whether the B application supports opening item S). If no information indicating that the B application can open item S is found in the userHashMap (that is, the B application does not declare support for items of that file type), a standard opening intent supported by the system is generated, and packageManager.queryIntentActivities(intent) is then used to determine whether the intent can be handled by the B application; if it can, the interface Intent.setPackage(packageB) is used to specify that the item be opened by the B application of the split screen (in this embodiment, item S can only be opened by the B application through the interface Intent.setPackage(packageB)). If information indicating that the B application can open item S is found in the userHashMap, the item can be opened by the B application in the split-screen state using the predefined item opening method.
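  • The fallback path of the third step uses standard Android APIs (MimeTypeMap, PackageManager.queryIntentActivities and Intent.setPackage). The sketch below shows that path under the assumption that the item is reachable through a content Uri; how the Uri is obtained and how read permission is granted are omitted, and the helper name is an assumption.

        import android.content.Context;
        import android.content.Intent;
        import android.content.pm.PackageManager;
        import android.net.Uri;
        import android.webkit.MimeTypeMap;

        public final class TargetAppResolver {
            /**
             * Builds a standard ACTION_VIEW intent for the item, pins it to the
             * target split-screen application with Intent.setPackage(packageB),
             * and checks whether that application can handle it. Returns the
             * ready-to-launch intent, or null when the target cannot open the item.
             */
            public static Intent resolveForPackage(Context context, Uri itemUri, String itemPath, String packageB) {
                String extension = MimeTypeMap.getFileExtensionFromUrl(itemPath);   // e.g. "jpg"
                String mimeType = MimeTypeMap.getSingleton()
                        .getMimeTypeFromExtension(extension.toLowerCase());          // e.g. "image/jpeg"

                Intent intent = new Intent(Intent.ACTION_VIEW);
                intent.setDataAndType(itemUri, mimeType);
                intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
                intent.setPackage(packageB);   // restrict resolution to the B application

                PackageManager pm = context.getPackageManager();
                boolean canOpen = !pm.queryIntentActivities(intent, 0).isEmpty();
                return canOpen ? intent : null;
            }
        }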
  • In the fourth step, different display icons are drawn according to the result of the third step, that is: if item S can be opened by the B application, the AnimationApp draws a thumbnail of item S and, on top of the thumbnail, draws an icon indicating that item S can be opened by the B application (for example, the green plus sign); if item S cannot be opened by the B application, the AnimationApp draws a thumbnail of item S and an icon indicating that item S cannot be opened by the B application (for example, the red stop sign). The long-press event of the A application was already cancelled in the second step, so what the user drags at this point is the display icon drawn by the AnimationApp, such as the icon ST in FIG. 13, instead of the originally long-pressed item; the display icon follows the movement of the user's finger to change its position.
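  • Because the original long press has been cancelled, the icon ST that the user drags is drawn by the AnimationApp itself; one common way to do this is an overlay window whose position is updated as the finger moves. This is only a sketch of that idea: the overlay type shown requires the SYSTEM_ALERT_WINDOW permission, and the class name and badge handling are assumptions.

        import android.content.Context;
        import android.graphics.PixelFormat;
        import android.view.Gravity;
        import android.view.View;
        import android.view.WindowManager;

        public final class DragIconOverlay {
            private final WindowManager windowManager;
            private final WindowManager.LayoutParams params;
            private final View iconView;   // thumbnail plus the green plus sign or red stop sign badge

            public DragIconOverlay(Context context, View iconView) {
                this.windowManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
                this.iconView = iconView;
                this.params = new WindowManager.LayoutParams(
                        WindowManager.LayoutParams.WRAP_CONTENT,
                        WindowManager.LayoutParams.WRAP_CONTENT,
                        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                                | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,
                        PixelFormat.TRANSLUCENT);
                this.params.gravity = Gravity.TOP | Gravity.START;
            }

            /** Adds the floating icon at the initial long-press position. */
            public void show(int x, int y) {
                params.x = x;
                params.y = y;
                windowManager.addView(iconView, params);
            }

            /** Called on every move event so the icon follows the user's finger. */
            public void moveTo(int x, int y) {
                params.x = x;
                params.y = y;
                windowManager.updateViewLayout(iconView, params);
            }

            /** Removes the floating icon when the user raises his hand. */
            public void dismiss() {
                windowManager.removeView(iconView);
            }
        }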
  • In the fifth step, when the user raises his hand, the split-screen position of the display icon is judged.
  • If the display icon is in the window where the B application is located and the result of the third step shows that the B application supports opening item S, the openside information obtained in the second step (that is, the window where the B application is located) is attached to the intent generated in the third step via intent.putExtra(), and the interface Context.startActivity(Intent) is used so that the B application opens item S at the designated split-screen position (that is, the window where the B application is located); if the result of the third step shows that the B application does not support opening item S, the AnimationApp draws an animation instructing item S to return to the window where the A application is located, so that item S returns to its original position (that is, the position where item S was long-pressed);
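  • The exact extra key carried by intent.putExtra() and the way the system consumes the openside value are not specified in this embodiment, so the key name below is made up. Hinting at the target window via ActivityOptions.setLaunchBounds() is shown only as a public-SDK way of expressing the same intent; it is an assumption, not necessarily the mechanism used here.

        import android.app.ActivityOptions;
        import android.content.Context;
        import android.content.Intent;
        import android.graphics.Rect;

        public final class SplitScreenLauncher {
            // Hypothetical extra key; the embodiment only says the openside window is passed via putExtra().
            private static final String EXTRA_OPEN_SIDE = "com.example.extra.OPEN_SIDE";

            /**
             * Launches the already-resolved intent (pinned to packageB) in the
             * window identified by openSide, hinting at the target window with
             * launch bounds (available since API 24).
             */
            public static void openInTargetWindow(Context context, Intent intent, int openSide, Rect targetWindowBounds) {
                intent.putExtra(EXTRA_OPEN_SIDE, openSide);
                intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);

                ActivityOptions options = ActivityOptions.makeBasic();
                options.setLaunchBounds(targetWindowBounds);

                context.startActivity(intent, options.toBundle());
            }
        }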
  • FIG. 14 exemplarily shows a schematic diagram of judging the split screen position of the display icon according to an exemplary embodiment.
  • the upper left corner of the screen is the origin
  • the x-axis coordinate position of the dividing line between the left and right screens is xsplit
  • By comparing the x-axis coordinate of the dividing line with the x-axis coordinate of the position where item S was long-pressed, it can be determined whether item S is being dragged by the user from the left screen to the right screen, or from the right screen to the left screen.
  • If the x-axis coordinate of the dividing line is greater than the x-axis coordinate of the long-pressed item S, then item S is being dragged by the user from the left screen to the right screen.
  • If the x-axis coordinate of the dividing line is less than the x-axis coordinate of the long-pressed item S, item S is being dragged by the user from the right screen to the left screen;
  • When the user raises his hand, the x-axis coordinate position xmove of the display icon is also determined.
  • When xmove < xsplit, the display icon has been dragged to the left screen by the user; when xmove > xsplit, the display icon has been dragged to the right screen by the user.
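  • A minimal sketch of the comparison shown in FIG. 14, with the origin at the upper-left corner of the screen, the dividing line at xsplit, and the lift point of the display icon at xmove; the helper and enum names are illustrative only.

        public final class SplitSideJudge {
            public enum Side { LEFT, RIGHT, ON_BOUNDARY }

            /** Window into which the display icon was dropped, judged from its x coordinate. */
            public static Side dropSide(float xmove, float xsplit) {
                if (xmove < xsplit) return Side.LEFT;    // dragged to the left screen
                if (xmove > xsplit) return Side.RIGHT;   // dragged to the right screen
                return Side.ON_BOUNDARY;
            }

            /** True when item S was long-pressed on the left screen (the dividing line lies to its right). */
            public static boolean longPressedOnLeft(float xLongPress, float xsplit) {
                return xLongPress < xsplit;
            }
        }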
  • FIG. 15 exemplarily shows a schematic diagram of an existing item opening method according to an exemplary embodiment. To insert a picture from the file manager application on the right screen into the whiteboard application on the left screen, the user needs to click the insert-picture button 1, call up the file manager application, select the picture, and click the insert button 2 to insert it.
  • FIGS. 16 to 18 exemplarily show schematic diagrams of the item opening method implemented by item dragging according to an exemplary embodiment. To open, in the whiteboard application on the left screen, a picture from the file manager application on the right screen, the user simply long-presses the picture on the right screen, and a floating thumbnail of the selected picture is displayed. The user can drag the floating thumbnail by moving a finger; when the user releases, if the release position is within the area of the whiteboard application, the long-press-selected picture is directly inserted into the whiteboard application.
  • FIG. 19 exemplarily shows an item opening device according to an exemplary embodiment, including:
  • the first unit 11 is configured to receive an instruction indicating that an item located in the first view display area is selected by the user for movement, wherein the instruction includes the storage path of the item and the currently selected position;
  • the second unit 12 is configured to obtain the attributes of the starting application and the target application and the boundary position of the first view display area and the second view display area, wherein the user interface includes a first view display area and a second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application;
  • the third unit 13 is configured to draw the first display icon of the item if it is determined that the item can be opened by the target application located in the second view display area;
  • the fourth unit 14 is configured to detect that the first display icon is dragged by the user to the second view display area, and use the target application to open the item.
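For orientation, the four units above can be read as the following Java interface; the method names and signatures are illustrative assumptions and are not part of the disclosure.

```java
// Structural sketch of the device in FIG. 19, with one method per unit.
interface ItemOpeningDevice {
    // Unit 11: receive the "item selected for movement" instruction (storage path + position).
    void onItemSelected(String storagePath, float selectedX, float selectedY);

    // Unit 12: obtain starting/target application attributes and the boundary between the views.
    void acquireApplicationsAndBoundary();

    // Unit 13: draw the first display icon once the item is known to be openable by the target app.
    void drawFirstDisplayIcon();

    // Unit 14: when the icon is dragged into the second view display area, open the item there.
    void openItemInTargetApplication();
}
```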
  • Fig. 20 also exemplarily shows a project opening device according to an exemplary embodiment, including:
  • the processor 600 is configured to read a program in the memory 610 and execute the following process: receiving an instruction indicating that an item located in the first view display area is selected by the user for movement, wherein the instruction includes the storage path of the item and the currently selected position; acquiring the attributes of the starting application and the target application and the boundary position of the first view display area and the second view display area, wherein the user interface includes a first view display area and a second view display area, the first view display area being the user interface of the starting application and the second view display area being the user interface of the target application; if it is determined that the item can be opened by the target application located in the second view display area, drawing the first display icon of the item; and, upon detecting that the first display icon is dragged by the user to the second view display area, opening the item with the target application, so that in split-screen mode the item is opened at a designated position by dragging between applications.
  • the instruction includes a storage path of the item and a location of the item when the instruction for being selected and moved by the user is received;
  • after the drag animation application receives the message that the item has been selected, the event of the item being selected by the user in the starting application is cancelled (a sketch of such a cancellation is given below).
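As a hedged illustration of this cancellation: in the embodiment it is done at the system level through an input filter, while an ordinary application can only approximate the idea for its own views by dispatching MotionEvent.ACTION_CANCEL, as sketched below. The class and method names are assumptions.

```java
// Sketch (assumption-based): application-level approximation of cancelling a long-press
// gesture by dispatching ACTION_CANCEL to the view that holds the long-pressed item.
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

final class LongPressCanceller {
    static void cancelTouch(View target, float x, float y) {
        long now = SystemClock.uptimeMillis();
        MotionEvent cancel = MotionEvent.obtain(now, now, MotionEvent.ACTION_CANCEL, x, y, 0);
        target.dispatchTouchEvent(cancel); // starting app stops treating the item as pressed
        cancel.recycle();
    }
}
```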
  • before the drag animation application determines that the item can be opened by the target application located in the second view display area, the method further includes:
  • the drag animation application obtains the package name of the application located in the first view display area and the package name of the target application located in the second view display area;
  • the drag animation application determines, according to the selected position of the item, that the item needs to be opened by the target application located in the second view display area;
  • the drag animation application determining that the item can be opened by the target application located in the second view display area specifically includes: the drag animation application determines that the item can be opened by the target application by matching the file type of the item with the package name of the target application located in the second view display area (see the openability check sketched below).
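A minimal sketch of this openability check follows, assuming the custom open-method table (the userHashMap mentioned elsewhere in the description) is modelled as a map from package name to supported MIME types, with packageManager.queryIntentActivities() as a fallback; the names and structure are illustrative rather than the disclosed data model.

```java
// Sketch of the "can the target app open this item?" check.
import android.content.Intent;
import android.content.pm.PackageManager;
import android.content.pm.ResolveInfo;
import android.net.Uri;
import android.webkit.MimeTypeMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

final class OpenabilityChecker {
    static boolean canOpen(Map<String, Set<String>> userHashMap, PackageManager pm,
                           String targetPackage, String itemPath) {
        String ext = MimeTypeMap.getFileExtensionFromUrl(itemPath);             // e.g. "jpg"
        String mime = MimeTypeMap.getSingleton().getMimeTypeFromExtension(ext); // e.g. "image/jpeg"
        if (mime == null) return false;

        // 1. Custom open methods registered when the drag-animation app started (userHashMap).
        Set<String> supported = userHashMap.get(targetPackage);
        if (supported != null && supported.contains(mime)) return true;

        // 2. Fallback: ask the system whether the target package declares a matching activity.
        Intent probe = new Intent(Intent.ACTION_VIEW);
        probe.setDataAndType(Uri.parse("file://" + itemPath), mime);
        List<ResolveInfo> handlers = pm.queryIntentActivities(probe, 0);
        for (ResolveInfo info : handlers) {
            if (targetPackage.equals(info.activityInfo.packageName)) return true;
        }
        return false;
    }
}
```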
  • if the drag animation application determines that the item cannot be opened by the target application located in the second view display area, a second display icon of the item is drawn, and the second display icon includes an icon of the item and an icon indicating that the item cannot be opened by the target application;
  • after the second display icon is dragged by the user to the second view display area, the drag animation application draws an animation instructing the item to return to the first view display area (a return-animation sketch is given below).
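A possible sketch of that return animation follows, assuming the FloatingDragIcon helper sketched earlier and a simple linear interpolation from the release point back to the long-press position; the timing and interpolation are illustrative choices.

```java
// Sketch of the "fly back" animation drawn when the item cannot be opened by the target
// application: the floating icon slides back to where the item was long-pressed, then is removed.
import android.animation.Animator;
import android.animation.AnimatorListenerAdapter;
import android.animation.ValueAnimator;

final class ReturnAnimation {
    static void playReturn(final FloatingDragIcon icon,
                           final int fromX, final int fromY,
                           final int longPressX, final int longPressY) {
        ValueAnimator animator = ValueAnimator.ofFloat(0f, 1f);
        animator.setDuration(250);
        animator.addUpdateListener(a -> {
            float t = a.getAnimatedFraction();
            int x = (int) (fromX + (longPressX - fromX) * t);
            int y = (int) (fromY + (longPressY - fromY) * t);
            icon.moveTo(x, y);                 // icon slides back toward its original position
        });
        animator.addListener(new AnimatorListenerAdapter() {
            @Override public void onAnimationEnd(Animator animation) {
                icon.dismiss();                // item visually returns to the first view display area
            }
        });
        animator.start();
    }
}
```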
  • the bus architecture may include any number of interconnected buses and bridges. Specifically, one or more processors represented by the processor 600 and various circuits of the memory represented by the memory 610 are linked together.
  • the bus architecture can also link various other circuits such as peripherals, voltage regulators, and power management circuits. These are all well-known in the art, and therefore, no further description will be given here.
  • the bus interface provides the interface.
  • An exemplary embodiment of the present application provides a display terminal, and the display terminal may specifically be a desktop computer, a portable computer, a smart phone, a tablet computer, a personal digital assistant (Personal Digital Assistant, PDA), etc.
  • the display terminal may include a central processing unit (CPU), a memory, an input/output device, etc.
  • the input device may include a keyboard, a mouse, a touch screen, etc.
  • an output device may include a display device, such as a liquid crystal display (Liquid Crystal Display, LCD), Cathode Ray Tube (CRT), etc.
  • the user interface 620 may be an interface capable of externally or internally connecting required devices; the connected devices include, but are not limited to, a keypad, a display, a speaker, a microphone, a joystick, and the like.
  • the processor 600 is responsible for managing the bus architecture and general processing, and the memory 610 can store data used by the processor 600 when performing operations.
  • the processor 600 may be a CPU (Central Processing Unit), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a CPLD (Complex Programmable Logic Device).
  • the memory 610 may include a read only memory (ROM) and a random access memory (RAM), and provides the processor with program instructions and data stored in the memory.
  • the memory may be used to store the program of any of the methods provided in the embodiment of the present application.
  • the processor calls the program instructions stored in the memory, and the processor is configured to execute any of the methods provided in the embodiments of the present application according to the obtained program instructions.
  • the exemplary embodiments of the present application provide a computer storage medium for storing computer program instructions used by the device provided in the foregoing embodiments of the present application, which include a program for executing any of the methods provided in the foregoing embodiments of the present application.
  • the computer storage medium may be any available medium or data storage device that can be accessed by the computer, including but not limited to magnetic storage (such as floppy disk, hard disk, magnetic tape, magneto-optical disk (MO), etc.), optical storage (such as CD, DVD, BD, HVD, etc.), and semiconductor memory (such as ROM, EPROM, EEPROM, non-volatile memory (NAND FLASH), solid state drive (SSD)), etc.
  • the exemplary embodiments of the present application provide a method and device for opening a project, and a display device, so that in the split-screen mode, the project can be opened at a designated position by dragging between applications.
  • the present application may adopt the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware.
  • this application may adopt the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, optical storage, etc.) containing computer-usable program codes.
  • These computer program instructions can also be stored in a computer-readable memory that can guide a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, and the instruction device implements the functions specified in one or more processes in the flowchart and/or one or more blocks in the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operation steps are executed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more processes in the flowchart and/or one or more blocks in the block diagram.

Abstract

An embodiment of the present application provides a display device, including: a display configured to display a user interface that includes a first view display area and a second view display area; and a controller configured to receive an instruction indicating that an item located in the first view display area is selected for movement, and to acquire the boundary position of the first and second view display areas; if it is determined that the item can be opened by a target application located in the second view display area, a display icon of the item is drawn; and upon detecting that the display icon is dragged to the second view display area, the item is opened with the target application.

Description

一种项目开启方法及装置、显示设备
本公开要求在2019年6月21日提交中国专利局、申请日为201910540234.8、申请名称为“一种项目开启方法及装置、显示设备”的中国专利申请的优先权,其全部内容通过引用结合在本公开中。
技术领域
本申请涉及显示技术领域,尤其涉及一种项目开启方法及装置、显示设备。
背景技术
在分屏模式下,处于不同的分屏窗口中的各个应用,互相不知道处于分屏下的其他应用的信息,因此,当选中其中一个分屏窗口的应用中的资源,无法获知该资源是否可以由其他分屏窗口中的应用打开,也无法指定由处于分屏模式下的其他应用打开被选中的应用中的资源;另外,由于安卓系统本身限制,在分屏模式下,应用之间的交互只能通过指定的接口打开,目前的操作系统无法通过拖动一个应用中的资源到其他应用实现资源的打开。
发明内容
本申请示例性的实施方式中提供了一种项目开启方法及装置、显示设备,用以在分屏模式下,通过应用之间的拖动实现项目在指定的位置打开。
根据示例性的实施方式中一方面,提供一种显示设备,包括:
显示器,该显示器被配置为显示用户界面,其中,所述用户界面包括第一 视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面,其中,各个所述视图显示区中包括布局一个或多个不同项目,以及,该用户界面中还包括指示所述项目被选择的选择器,可通过用户输入而移动所述选择器在所述用户界面中的位置,以使所述项目根据所述选择器而移动;
与所述显示器通信的控制器,所述控制器被配置为执行呈现所述用户界面;
控制器接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
所述控制器获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置;
若控制器确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
检测到所述第一显示图标被用户拖动到第二视图显示区后,采用所述目标应用开启所述项目。
通过控制器接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;所述控制器获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置;若控制器确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;检测到所述第一显示图标被用户拖动到第二视图显示区后,采用所述目标应用开启所述项目,从而在分屏模式下,通过应用之间的拖动实现项目在指定的位置打开。
在一些示例性的实施方式中,控制器接收到位于第一视图显示区的项目被 用户选中的指令之后,根据所述存储路径获取所述项目的文件类型,并取消所述项目在所述起始应用中被用户选中的事件。
在一些示例性的实施方式中,所述控制器确定所述项目能被位于第二视图显示区的目标应用开启之前,所述控制器还用于:
获取位于第一视图显示区的应用的包名,及位于第二视图显示区的目标应用的包名;
根据所述项目被选中的位置,确定所述项目需要被位于第二视图显示区的目标应用开启;
所述控制器确定所述项目能被位于第二视图显示区的目标应用开启,具体包括:所述控制器通过匹配所述项目的文件类型和位于第二视图显示区的目标应用的包名,确定所述项目能被所述目标应用开启。
在一些示例性的实施方式中,所述控制器还用于:
若控制器确定所述项目不能被位于第二视图显示区的目标应用开启,则绘制所述项目的第二显示图标,所述第二显示图标包括所述项目的图标和指示所述项目不能被所述目标应用开启的图标;
当所述第二显示图标被用户拖动到第二视图显示区后,绘制用于指示所述项目返回第一视图显示区的动画。
在一些示例性的实施方式中,所述控制器当检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目时,具体用于:
当检测到所述第一显示图标的任一位置坐标大于所述边界位置坐标变化到所述任一位置坐标小于所述边界位置坐标时,采用所述目标应用开启所述项目;
或者,当检测到所述第一显示图标的任一位置坐标小于所述边界位置坐标 变化到所述任一位置坐标大于所述边界位置坐标时,采用所述目标应用开启所述项目;
或者,当检测到用于拖动第一显示图标的物体的位置坐标大于所述边界位置坐标变化到所述物体的位置坐标小于所述边界位置坐标时,采用所述目标应用开启所述项目;或者,当检测到用于拖动第一显示图标的物体的位置坐标小于所述边界位置坐标变化到所述物体的位置坐标大于所述边界位置坐标时,采用所述目标应用开启所述项目。
本申请示例性的实施方式中提供一种项目开启方法,包括:
接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置,其中,用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面;
若确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目。
通过该方法,接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置,其中,用户界面包括第一视图显示区和第二视图显示区,该第 一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面;若确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目,从而在分屏模式下,通过应用之间的拖动实现项目在指定的位置打开。
在一些示例性的实施方式中,所述指令包括所述项目的存储路径和接收到用于被用户选中并移动的指令时所述项目所处的位置;
所述拖动动画应用接收到所述项目被选中的消息之后,取消所述项目在所述起始应用中被用户选中的事件。
在一些示例性的实施方式中,所述拖动动画应用确定所述项目能被位于第二视图显示区的目标应用开启之前,该方法还包括:
所述拖动动画应用获取位于第一视图显示区的应用的包名,及位于第二视图显示区的目标应用的包名;
所述拖动动画应用根据所述项目被选中的位置,确定所述项目需要被位于第二视图显示区的目标应用开启;
所述拖动动画应用确定所述项目能被位于第二视图显示区的目标应用开启,具体包括:所述拖动动画应用通过匹配所述项目的文件类型和位于第二视图显示区的目标应用的包名,确定所述项目能被所述目标应用开启。
在一些示例性的实施方式中,若拖动动画应用确定所述项目不能被位于第二视图显示区的目标应用开启,则绘制所述项目的第二显示图标,所述第二显示图标包括所述项目的图标和指示所述项目不能被所述目标应用开启的图标;
当所述第二显示图标被用户拖动到第二视图显示区后,拖动动画应用绘制 用于指示所述项目返回第一视图显示区的动画。
相应地,在装置侧,本申请示例性的实施方式中提供的一种项目开启装置,该装置包括:
第一单元,用于接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
第二单元,用于获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置,其中,用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面;
第三单元,用于若确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
第四单元,用于检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目。
本申请示例性的实施方式中还提供的一种计算设备,包括:
存储器,用于存储程序指令;
处理器,用于调用所述存储器中存储的程序指令,按照获得的程序执行上述本申请实施例提供的任一种所述的方法。
本申请示例性的实施方式中提供了一种计算机存储介质,所述计算机存储介质存储有计算机可执行指令,所述计算机可执行指令用于使所述计算机执行上述任一种方法。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简要介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域的普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1中示例性示出了根据实施例中显示设备与控制装置之间操作场景的示意图;
图2中示例性示出了根据示例性实施例中控制装置100的配置框图;
图3中示例性示出了根据示例性实施例中显示设备200功能配置示意图;
图4示例性示出了根据示例性实施例中显示设备200中软件系统的配置示意图;
图5中示例性示出了根据示例性实施例中显示设备200中用户界面的示意图;
图6~8中示例性示出了根据示例性实施例中显示设备200中用户界面与用户指令之间操作示意图;
图9中示例性示出了根据示例性实施例中分屏模式下A应用和B应用所处位置示意图;
图10中示例性示出了根据示例性实施例中分屏模式下现有的打开A应用中项目的效果示意图;
图11中示例性示出了根据示例性实施例中一种项目开启方法示意图;
图12中示例性示出了根据示例性实施例中通过拖动项目实现项目开启的方法流程示意图;
图13中示例性示出了根据示例性实施例中项目拖动过程的动画示意图;
图14中示例性示出了根据示例性实施例中判断显示图标所处的分屏位置示意图;
图15中示例性示出了根据示例性实施例中现有的项目打开方法示意图;
图16~18中示例性示出了根据示例性实施例中通过项目拖动实现的项目开启方法示意图;
图19中示例性示出了根据示例性实施例中一种项目开启装置示意图;
图20中还示例性示出了根据示例性实施例中一种项目开启装置示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,并不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
应当理解,本申请中说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,例如能够根据本申请实施例图示或描述中给出那些以外的顺序实施。
此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖但不排他的包含,例如,包含了一系列组件的产品或设备不必限于清楚地列出的那些组件,而是可包括没有清楚地列出的或对于这些产品或设备固有的其它组件。
本申请中使用的术语“模块”,是指任何已知或后来开发的硬件、软件、固 件、人工智能、模糊逻辑或硬件或/和软件代码的组合,能够执行与该元件相关的功能。
本申请中使用的术语“手势”,是指用户通过一种手型的变化或手部运动等动作,用于表达预期想法、动作、目的/或结果的用户行为。
下面结合说明书附图对本申请各个实施例进行详细描述。需要说明的是,本申请实施例的展示顺序仅代表实施例的先后顺序,并不代表实施例所提供的技术方案的优劣。
图1中示例性示出了根据实施例中显示设备与控制装置之间操作场景的示意图。如图1所示,用户可通过控制装置100来操作显示设备200。
显示设备200,可以是液晶显示器、OLED显示器、投影显示设备。具体显示设备类型,尺寸大小和分辨率等不作限定,本领技术人员可以理解的是,显示设备200可以根据需要做性能和配置上的一些改变。
图2中示例性示出了根据示例性实施例中控制装置100的配置框图。如图2所示,控制装置100包括控制器110、通信器130、用户输入/输出接口140、存储器190、供电电源180。
控制装置100被配置为控制显示设备200,以及可接收用户的输入操作指令,且将操作指令转换为显示设备200可识别和响应的指令,起着用户与显示设备200之间交互中介作用。如:用户通过操作控制装置100上频道加减键,显示设备200响应频道加减的操作。
在一些实施例中,控制装置100可是一种智能设备。如:控制装置100可根据用户需求安装控制显示设备200的各种应用。
在一些实施例中,如图1所示,移动终端100B或其他智能电子设备,可在 安装操控显示设备200的应用之后,可以起到控制装置100类似功能。如:用户可以通过安装应用,在移动终端100B或其他智能电子设备上可提供的图形用户界面的各种功能键或虚拟按钮,以实现控制装置100实体按键的功能。
控制器110包括处理器112、RAM113和ROM114、通信接口以及通信总线。控制器110用于控制控制装置100的运行和操作,以及内部各部件之间通信协作以及外部和内部的数据处理功能。
通信器130在控制器110的控制下,实现与显示设备200之间控制信号和数据信号的通信。如:将接收到的用户输入信号发送至显示设备200上。通信器130可包括WIFI模块131、蓝牙模块132、NFC模块133等通信模块中至少一种。
用户输入/输出接口140,其中,输入接口包括麦克风141、触摸板142、传感器143、按键144等输入接口中至少一者。如:用户可以通过语音、触摸、手势、按压等动作实现用户指令输入功能,输入接口通过将接收的模拟信号转换为数字信号,以及数字信号转换为相应指令信号,发送至显示设备200。
输出接口包括将接收的用户指令发送至显示设备200的接口。在一些实施例中,可以是红外接口,也可以是射频接口。如:红外信号接口时,需要将用户输入指令按照红外控制协议转化为红外控制信号,经红外发送模块进行发送至显示设备200。再如:射频信号接口时,需将用户输入指令转化为数字信号,然后按照射频控制信号调制协议进行调制后,由射频发送端子发送至显示设备200。
在一些实施例中,控制装置100包括通信器130和输出接口中至少一者。控制装置100中配置通信器130,如:WIFI、蓝牙、NFC等模块,可将用户输 入指令通过WIFI协议、或蓝牙协议、或NFC协议编码,发送至显示设备200.
存储器190,用于在控制器110的控制下存储驱动和控制控制装置100的各种运行程序、数据和应用。存储器190,可以存储用户输入的各类控制信号指令。
供电电源180,用于在控制器110的控制下为控制装置100各元件提供运行电力支持。可以电池及相关控制电路。
控制器110可以控制显示设备200的整体操作。例如:响应于接收到用于选择在显示设备200上显示UI对象的用户命令,控制器110便可以执行与由用户命令选择的对象有关的操作。
图3中示例性示出了根据示例性实施例中显示设备200功能配置示意图。如图3所示,存储器290用于存储操作系统、应用程序、内容和用户数据等,在控制器110控制下执行驱动显示设备200的系统运行以及响应用户的各种操作。存储器290可以包括易失性和/或非易失性存储器。
存储器290,具体用于存储驱动显示设备200中控制器110的运行程序,以及存储显示设备200内置各种应用程序,以及用户从外部设备下载的各种应用程序、以及与应用程序相关的各种图形用户界面,以及与图形用户界面相关的各种对象,用户数据信息,以及各种支持应用程序的内部数据。存储器290用于存储操作系统(OS)内核、中间件和应用等系统软件,以及存储输入的视频数据和音频数据、及其他用户数据。
存储器290,具体用于存储视频处理器260-1和音频处理器260-2、显示器280、通信接口230、调谐解调器220、检测器240、输入/输出接口等驱动程序和相关数据。
在一些实施例中,存储器290可以存储软件和/或程序,用于表示操作系统(OS)的软件程序包括,例如:内核、中间件、应用编程接口(API)和/或应用程序。示例性的,内核可控制或管理系统资源,或其它程序所实施的功能(如所述中间件、API或应用程序),以及内核可以提供接口,以允许中间件和API,或应用访问控制器,以实现控制或管理系统资源。
示例的,存储器290,包括广播接收模块2901、频道控制模块2902、音量控制模块2903、图像控制模块2904、显示控制模块2905、音频控制模块2906、外部指令识别模块2907、通信控制模块2908、光接收模块2909、电力控制模块2910、操作系统2911、以及其他应用程序2912、浏览器模块等等。控制器110通过运行存储器290中各种软件程序,来执行诸如:广播电视信号接收解调功能、电视频道选择控制功能、音量选择控制功能、图像控制功能、显示控制功能、音频控制功能、外部指令识别功能、通信控制功能、光信号接收功能、电力控制功能、支持各种功能的软件操控平台、以及浏览器功能等其他应用。
示例的,存储器290,包括存储用于驱动和控制显示设备200的各种软件模块。如:存储器290中存储的各种软件模块,包括:基础模块、检测模块、通信模块、显示控制模块、浏览器模块、和各种服务模块等。
其中,基础模块是用于显示设备200中各个硬件之间信号通信、并向上层模块发送处理和控制信号的底层软件模块。检测模块是用于从各种传感器或用户输入接口中收集各种信息,并进行数模转换以及分析管理的管理模块。
图4中示例性示出了根据示例性实施例中显示设备200中软件系统的配置框图。
如图4中所示,操作系统2911,包括用于处理各种基础系统服务和用于实 施硬件相关任务的执行操作软件,充当应用程序和硬件组件之间完成数据处理的媒介。
一些实施例中,部分操作系统内核可以包含一系列软件,用以管理显示设备硬件资源,并为其他程序或软件代码提供服务。
其他一些实施例中,部分操作系统内核可包含一个或多个设备驱动器,设备驱动器可以是操作系统中的一组软件代码,帮助操作或控制显示设备关联的设备或硬件。驱动器可以包含操作视频、音频和/或其他多媒体组件的代码。示例的,包括显示屏、摄像头、Flash、WiFi和音频驱动器。
其中,可访问性模块2911-1,用于修改或访问应用程序,以实现应用程序的可访问性和对其显示内容的可操作性。
通信模块2911-2,用于经由相关通信接口和通信网络与其他外设的连接。
用户界面模块2911-3,用于提供显示用户界面的对象,以供各应用程序访问,可实现用户可操作性。
控制应用程序2911-4,用于控制进程管理,包括运行时间应用程序等。
事件传输系统2914,可在操作系统2911内或应用程序2912中实现。一些实施例中,一方面在在操作系统2911内实现,同时在应用程序2912中实现,用于监听各种用户输入事件,将根据各种事件指代响应各类事件或子事件的识别结果,而实施一组或多组预定义的操作的处理程序。
其中,事件监听模块2914-1,用于监听用户输入接口输入事件或子事件。
事件识别模块2914-1,用于对各种用户输入接口输入各类事件的定义,识别出各种事件或子事件,且将其传输给处理用以执行其相应一组或多组的处理程序。
其中,事件或子事件,是指显示设备200中一个或多个传感器检测的输入,以及外界控制设备(如控制装置100等)的输入。如:语音输入各种子事件,手势识别的手势输入子事件,以及控制装置的遥控按键指令输入的子事件等。示例的,遥控器中一个或多个子事件包括多种形式,包括但不限于按键按上/下/左右/、确定键、按键按住等中一个或组合。以及非实体按键的操作,如移动、按住、释放等操作。
界面布局管理模块2913,直接或间接接收来自于事件传输系统2914监听到各用户输入事件或子事件,用于更新用户界面的布局,包括但不限于界面中各控件或子控件的位置,以及容器的大小或位置、层级等与界面布局相关各种执行操作。
图5中示例性示出了根据示例性实施例中显示设备200中用户界面的示意图。如图5所示,用户界面包括多个视图显示区,示例的,第一视图显示区201和第二视图显示区202,各个视图显示区中包括布局一个或多个不同项目。以及用户界面中还包括指示任一项目被选择的选择器,可通过用户输入而移动选择器的位置,以改变选择不同的项目。
在申请的一些实施方式中,第一视图显示区和第二视图显示区可以处于不同的图层,也可以处于同一图层,也可以是第一视图显示区所在的图层嵌套或者被嵌套于第二视图显示区。基于这样的情况,第一视图显示区也可以称为第一视图显示图层,第二视图显示区可以称为第二视图显示图层。同样的,显示区也可以为窗口。
如果第一视图显示区和第二视图显示区不在一个图层时,在某些情况下,会出现被遮挡的情况。
需要说明的是,多个视图显示区可以是可视的界线,也可以是不可视的界线。如:可通过各视图显示区的背景颜色不同标识不同视图显示区,还可以通过边界线等可视的标识,也可以有不可视的隐形边界。也可以不存在可视的或非可视的边界,而仅在屏幕上显示一定范围区域中相关联项目,具有尺寸和/或排布相同改变属性时,而该一定范围区域则被视同一种视图分区的边界的存在,如:视图显示区201中项目同时缩小或放大,而视图显示区202的变化不同。
其中,一些实施例中,第一视图显示区201为可缩放视图显示。“可缩放”可以表示第一视图显示区201在屏幕上尺寸或占比是可缩放的,或第一视图显示201中项目在在屏幕上尺寸或占比是可缩放的。第一视图显示区201为滚动视图显示区,其可通过用户输入而滚动更新在屏幕中显示项目的数量。
“项目”是指在显示设备200中用户界面的各视图显示区中显示以表示,诸如图标、缩略图、视频剪辑等对应内容的视觉对象。例如:项目可以表示电影、电视剧的图像内容或视频剪辑、音乐的音频内容、应用程序,或其他用户访问内容历史信息。
在本申请的一些实施例中,“项目”可显示图像缩略图。如:当项目为电影或电视剧时,项目可显示为电影或电视剧的海报。如项目为音乐时,可显示音乐专辑的海报。如项目为应用程序时,可显示为应用程序的图标,或当应用程序被执行最近执行时捕捉到应用程序的内容截图。如项目为用户访问历史时,可显示为最近执行过程中内容截图。“项目”可显示为视频剪辑。如:项目为电视或电视剧的预告片的视频剪辑动态画面。
各个项目可具有相同尺寸,也可以具有不相同的尺寸。在一些实施示例中,项目的尺寸可以被改变。
“选择器”用于指示其中任意项目已被选择,如:光标或焦点对象。根据用户在显示器200中触摸的图标或菜单位置来定位选择信息输入,可使显示设备200中显示焦点对象的移动来选择控制项目,可选择或控制其中一个或多个项目。
焦点对象指根据用户输入在项目之间移动的对象。示例的,通过项目边缘绘制粗线来实现或标识焦点对象位置。在其他实施例中,焦点形式不限于示例,可以是光标等有形或无形可被用户识别的形态,如可以项目的3D变形等形式,也可以改变聚焦的项目的文本或图像的边框线、尺寸、颜色、透明度和轮廓和/或字体等标识。
在本申请的一些实施例中,第一视图显示区201和第二视图显示区202中各项目中分别关联有不同内容或链接。示例性的,第一视图显示区201中各项目为海报的缩略图或视频剪辑,第二视图显示区202中显示各种应用程序的文本和/或图标。第一视图显示区201和第二视图显示区202各自呈现不同的应用,示例的,第一视图显示区201为显示起始应用的用户界面,第二视图显示区202为显示目标应用的用户界面。
界面布局管理模块2913,用于用户界面状态(包括视图分区、项目、焦点或光标对象等位置和/或大小、变化过程等)的监控,以及根据该事件或子事件,可执行修改各视图显示区的大小和位置、层级等布局,和/或,调整或修改各视图显示区各类项目布局的大小或/和位置、数量、类型、内容等布局。一些实施例中,修改和调整布局,包括在屏幕上显示或不显示各视图分区或视图分区中项目内容。
用户输入接口,用于将用户的输入信号发送给控制器110,或者,将从控制 器输出的信号传送给用户。示例性的,控制装置(例如移动终端或遥控器)可将用户输入的诸如电源开关信号、频道选择信号、音量调节信号等输入信号发送至用户输入接口,再由用户输入接口转送至控制器;
在本申请的一些实施例中,用户可在显示器200上显示的图形用户界面(GUI)输入用户命令,则用户输入接口通过图形用户界面(GUI)接收用户输入命令。或者,用户可通过输入特定的声音或手势进行输入用户命令,则用户输入接口通过传感器识别出声音或手势,来接收用户输入命令。
本申请示例性的实施方式中提供了一种显示设备,包括:
显示器,该显示器被配置为显示用户界面,其中,所述用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面,其中,各个所述视图显示区中包括布局一个或多个不同项目,以及,该用户界面中还包括指示所述项目被选择的选择器,可通过用户输入而移动所述选择器在所述用户界面中的位置,以使所述项目根据所述选择器而移动;
与所述显示器通信的控制器,所述控制器被配置为执行呈现所述用户界面;
控制器接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
所述控制器获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置;
若控制器确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
检测到所述第一显示图标被用户拖动到第二视图显示区后,采用所述目标 应用开启所述项目。
图6~8中示例性示出了根据示例性实施例中显示设备200中用户界面与用户指令之间操作示意图。示例的,如图6-8中,在第一视图显示区201中布局多个不同项目,例如项目1~项目6为多个不同的包括文本和图像缩略图的海报。在第二视图显示区202中还未显示任何项目(此处不限定第二视图显示区是否显示项目,当然,也可以显示图片等项目)。
如图6中,用户通过触发用于选中(例如用户通过长按项目5的方式)并移动项目5(资源5)的指令,实现对显示设备200中项目5的操控;当用户在显示设备200的用户界面拖动项目5的显示图标时,用户界面变化到如图7所示。
如图7所示,项目5的显示图标位于第一视图显示区201和第二视图显示区202的分界线位置(项目5的显示图标可以被用户拖动到显示设备200的用户界面的任何位置);用户继续拖动项目5的显示图标,当项目5被拖动到第二视图显示区202时,用户抬手,用户界面变化到如图8所示,项目5直接呈现在第二视图显示区202,在第二视图显示区202放大或半屏显示项目5的内容。可选的,被拖动到第二视图显示区,也可被目标应用直接打开。
在一些示例性的实施方式中,控制器接收到位于第一视图显示区的项目被用户选中并指示移动的指令之后,获取所述项目的存储路径和所述项目被选中的位置,并取消所述项目在起始应用中被用户选中的事件,其中,控制器根据所述项目的存储路径获取文件类型。
在一些示例性的实施方式中,所述控制器确定所述项目能被位于第二视图显示区的目标应用开启之前,所述控制器还用于:
获取位于第一视图显示区的应用的包名,及位于第二视图显示区的目标应用的包名;
根据所述项目被选中的位置,确定所述项目需要被位于第二视图显示区的目标应用开启;
所述控制器确定所述项目能被位于第二视图显示区的目标应用开启,具体包括:所述控制器通过匹配所述项目的文件类型和位于第二视图显示区的目标应用的包名,确定所述项目能被所述目标应用开启。
在一些示例性的实施方式中,所述控制器还用于:
若控制器确定所述项目不能被位于第二视图显示区的目标应用开启,则绘制所述项目的第二显示图标,所述第二显示图标包括所述项目的图标和指示所述项目不能被所述目标应用开启的图标;
当所述第二显示图标被用户拖动到第二视图显示区后,绘制用于指示所述项目返回第一视图显示区的动画。
图9中示例性示出了根据示例性实施例中分屏模式下A应用和B应用所处位置示意图,其中,A应用为起始应用,显示于第一视图显示区;B应用为目标应用,显示于第二视图显示区。对于分屏模式下的应用A和应用B,互相不知道处于分屏下的其他应用的信息,因此,当选中A应用中的项目,无法获知该项目是否可以由B应用打开,也无法指定由处于分屏模式下的B应用打开被选中的A应用中的项目。
分屏模式下,启动Activity;打开项目时,即便指定由与当前窗口处于分屏状态的另一窗口中的应用打开,也有可能启动的应用直接覆盖在了当前项目选择的窗口,而不是在另外的窗口打开(由于Activity启动方式也不一定保持当前 的分屏状态),图10中示例性示出了根据示例性实施例中分屏模式下现有的打开A应用中项目的效果图;当前处于分屏模式下,选中A应用中的项目并打开,会出现以下四种情况:第一种情况,项目由C应用打开,打开的界面在A窗口位置;第二种情况,项目由C应用打开,打开的界面在B窗口位置;第三种情况,项目由B应用打开,打开的界面在A窗口位置;第四种情况,项目由B应用打开,打开的界面在B窗口位置;下面举例说明A应用、B应用和C应用:例如A应用为文件管理器,其中显示了一个PPT文档即为A应用的项目,B应用为电子白板工具,由于电子白板工具不支持PPT文档打开,而本机内集成了WPS应用,PPT可以由WPS打开,则WPS应用就是C应用。
通过本申请示例性实施例提供的方法,可以保证项目的打开方式为图10中的第四种情况,也就是说,可以明确地告诉用户该项目可以由分屏的哪个应用打开,并且实现了分屏模式下通过拖动一个应用中的项目到另外一个应用中实现打开的方式。
图11中示例性示出了根据示例性实施例中一种项目开启方法,包括:
S101、接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
项目被用户选中并用于移动的指令例如包括项目被用户长按,长按是拖动触发事件,项目被用户长按时会通知拖动动画应用AnimationApp,AnimationApp接收到长按事件,同时接收到项目的存储路径和被选择的位置。
S102、获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置,其中,用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为 目标应用的用户界面;
S103、若确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
所述拖动动画应用接收到用于移动到目标应用的指令后,获取起始应用和目标应用的信息和分屏位置,并通过所述项目的存储路径解析出该项目的文件类型,判断是否可以由B应用打开。
在本申请的一些示例性的实施方式中,若被长按的项目为图片时,所述第一显示图标包括所述项目的缩略图和用于指示所述项目可以被预设应用打开的绿色加号;若被长按的项目为其他项目文件时,所述第一显示图标包括所述项目的图标(该图标为被长按的项目图标的复制体)和用于指示所述项目可以被预设应用打开的绿色加号。
S104、检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目。
例如,在分屏模式下,智能电视或手机终端的用户界面包括第一视图显示区和第二视图显示区,第一视图显示区为界面的左屏,第二视图显示区为界面的右屏,或者,第一视图显示区为界面的右屏,第二视图显示区为界面的左屏。
在一些示例性的实施方式中,所述指令包括所述项目的存储路径和接收到用于被用户选中并移动的指令时所述项目所处的位置;
所述拖动动画应用接收到所述项目被选中的消息之后,取消所述项目在所述起始应用中被用户选中的事件。
项目被长按的消息中,包括该项目被长按时的路径信息,拖动动画应该根据该项目的路径信息获取扩展名,从而获取到该项目的文件类型;取消项目被 选中的事件例如为取消项目被长按。
在一些示例性的实施方式中,所述拖动动画应用确定所述项目能被位于第二视图显示区的目标应用开启之前,该方法还包括:
所述拖动动画应用获取位于第一视图显示区的应用的包名,及位于第二视图显示区的目标应用的包名;
所述拖动动画应用根据所述项目被选中的位置,确定所述项目需要被位于第二视图显示区的目标应用开启;
所述拖动动画应用确定所述项目能被位于第二视图显示区的目标应用开启,具体包括:所述拖动动画应用通过匹配所述项目的文件类型和位于第二视图显示区的目标应用的包名,确定所述项目能被所述目标应用开启。
包名和应用是一一对应的,确定了包名就区分了不同的应用,例如左右分屏应用的包名packageA和packageB。
在一些示例性的实施方式中,若拖动动画应用确定所述项目不能被位于第二视图显示区的目标应用开启,则绘制所述项目的第二显示图标,所述第二显示图标包括所述项目的图标和指示所述项目不能被所述目标应用开启的图标;
例如,若被长按的项目为图片时,所述第二显示图标包括所述项目的缩略图和用于指示所述项目不能被预设应用打开的红色停止号;若被长按的项目为其他项目文件时,所述第二显示图标包括所述项目的图标(该图标为被长按的项目图标的复制体)和用于指示所述项目不能被预设应用打开的红色停止号。
当所述第二显示图标被用户拖动到第二视图显示区后,拖动动画应用绘制用于指示所述项目返回第一视图显示区的动画。
图12中示例性示出了根据示例性实施例中通过拖动项目实现项目开启的方 法流程图,具体包括以下内容:
第一步,操作系统启动以后,拖到动画应用AnimationApp启动时,会保存自定义的项目打开方式以及支持文件类型userHashMap(userHashMap是一种自定义的打开方式映射表,例如电子白板应用,支持图片的插入,会生成一条映射数据,电子白板包名为hashkey,插入图片的方式为数据),例如自定义的信息保存在xml文件中,AnimationApp解析以后以包名为索引保存起来(包名可以作为操作系统中应用的标识,操作系统中每个应用的包名都是不同的);
第二步,图13中示例性示出了根据示例性实施例中项目拖动过程的动画图,在图13中,A应用和B应用处于分屏模式下,长按A应用中的项目S,例如图片s.jpg,则长按应用会通过广播通知AnimationApp项目S被用户长按,并向AnimationApp传递项目S的保存路径(例如被长按的图片s.jpg的保存路径);AnimationApp接收到通知以后,通过inputfilter发送MotionEvent.ACTION_CANCEL(长按是一个持续的过程,除非用户抬手,该步骤是在系统层强制把长按按键结束掉,也就是发送触控取消的事件给当前长按应用的界面),从而取消A应用的长按事件以防止拖动项目时拖动过程对A应用的影响;inputfilter是一个按键处理通道,所有的触控事件都先经过这个通道过滤下,决定这个事件是否还继续分发;
AnimationApp通过活动管理ActivityManager的新增接口getTopTaskLeft()和getTopTaskRight()分别获取左右分屏的RunningTaskInfo(RunningTaskInfo是操作系统原生提供的数据结构,可以用于判断当前有哪些应用在运行),从而获取到左右分屏的A应用的包名packageA和B应用的包名packageB(包名和应用是一一对应的,确定了包名就区分了不同的应用),并根据项目S被长按 时的位置信息,确定打开项目发起者为packageA,期望项目打开者为packageA(项目被长按时所处的窗口的应用为打开项目发起者,另外一个应用为期望项目打开者),同时也确定了项目S将要被打开的分屏位置openside,也就是B应用所处的窗口;
第三步,通过第二步中向AnimationApp传递的项目S的保存路径信息,获取项目S的扩展名为jpg,从而获取到MimeType(MimeType为文件格式类型)为image/jpeg;根据获取到的B应用的包名packageB查询userHashMap是否支持打开被选中的项目S(userHashMap是打开项目映射表,该映射表以应用的包名为索引,以打开方式和支持的文件类型为数据,AnimationApp已经获取到B应用的包名packageB,以packageB支持的打开方式和支持文件类型,与项目S的文件类型做对比即可获知B应用是否支持打开项目S),在userHashMap中没有找到关于B应用可以打开项目S的信息(也就是说,B应用不支持该文件类型的项目),则生成系统支持的标准打开方式意图intent,然后通过packageMagager.queryIntentActivities(intent)判定intent是否可以由B应用打开,如果可以打开,则通过接口Intent.setPackage(packageB),指定由分屏的B应用打开(本实施例通过接口Intent.setPackage(packageB)这个接口设置了项目S只能由B应用打开);如果在userHashMap中找到关于B应用可以打开项目S的信息,则使用预定义的项目打开方式由处于分屏状态下的B应用打开项目。
第四步,根据第三步的结果绘制不同的显示图标,即:如果项目S可以被B应用打开,则由AnimationApp绘制项目S的缩略图,并在缩略图的基础上绘制一个用于指示项目S可以被B应用打开的图标(例如绿色的加号);如果项目 S不能被B应用打开,则由AnimationApp绘制项目S的缩略图,并在缩略图的基础上绘制一个用于指示项目S不可以被B应用打开的图标(例如红色停止号);在第二步中已经取消A应用的长按事件,因此,此时用户拖动的是AnimationApp绘制的显示图标,例如图13中的图标ST,而不再是最初用户长按的项目;显示图标可以跟随用户手指移动变动自身的位置。
第五步,当用户抬手时判断显示图标所处的分屏位置,当显示图标处于B应用所处的窗口时,如果根据第三步的结果获知B应用支持打开项目S,则根据第三步生成的intent,将第二步中得到的openside信息(即B应用所处的窗口)带入intent.putExtra(),通过接口Context.startActivity(Intent)由B应用在指定的分屏位置(即B应用所处的窗口)打开项目S;如果根据第三步的结果获知B应用不支持打开项目S,则由AnimationApp绘制动画,用于指示项目S返回A应用所处的窗口,从而项目S返回到原位置(即项目S被长按的位置);
当用户抬手时,判断显示图标所处的分屏位置的具体步骤包括:图14中示例性示出了根据示例性实施例中判断显示图标所处的分屏位置示意图,以图14中左屏的左上角为原点,左屏和右屏的分界线的x轴坐标位置为xsplit,项目S被长按的位置信息是确定的,通过比较分界线的x轴坐标和项目S被长按的x轴坐标,就可以确定项目S是被用户从左屏拖到右屏,还是被用户从右屏拖到左屏,例如,当分界线的x轴坐标大于项目S被长按的x轴坐标,则项目S是被用户从左屏拖到右屏,当分界线的x轴坐标小于项目S被长按的x轴坐标,则项目S是被用户从右屏拖到左屏;而用户拖动的显示图标的x轴坐标位置xmove也是确定的,当xmove<xsplit时,表示显示图标被用户拖到了左屏,当xmove>xsplit时,表示显示图标被用户拖到了右屏。
以下通过举例比较在分屏模式下,示例性示出了现有的项目打开方法和通过项目拖动实现的项目开启方法。
图15中示例性示出了根据示例性实施例中现有的项目打开方法示意图,若在左屏的白板应用中,插入右屏的文件管理器应用中的图片,则需要点击白板应用中的插入图片按钮①,调出文件管理器应用,选中图片后点击插入按钮②进行插入。
图16~18中示例性示出了根据示例性实施例中通过项目拖动实现的项目开启方法示意图,若在左屏的白板应用中,打开右屏的文件管理器应用中的图片,则直接长按选中右屏中的图片,会显示悬浮的所选中图片的缩略图,用户移动手指可以拖动悬浮的缩略图,当用户松手后,如果用户松手的位置在白板应用范围内,则直接在白板应用上插入当前长按选中的图片。
相应地,在装置侧,图19中示例性示出了根据示例性实施例中一种项目开启装置,包括:
第一单元11,用于接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
第二单元12,用于获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置,其中,用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面;
第三单元13,用于若确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
第四单元14,用于检测到所述第一显示图标被用户拖动到第二视图显示区, 采用所述目标应用开启所述项目。
图20中还示例性示出了根据示例性实施例中一种项目开启装置,包括:
处理器600,用于读取存储器610中的程序,执行下列过程:
接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置,其中,用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面;
若确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目。
通过该装置,接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置,其中,用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面;若确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目,从而在分屏模式下,通过应用之间的拖动实现项目在指定的位置打开。
在一些示例性的实施方式中,所述指令包括所述项目的存储路径和接收到用于被用户选中并移动的指令时所述项目所处的位置;
所述拖动动画应用接收到所述项目被选中的消息之后,取消所述项目在所述起始应用中被用户选中的事件。
在一些示例性的实施方式中,所述拖动动画应用确定所述项目能被位于第二视图显示区的目标应用开启之前,该方法还包括:
所述拖动动画应用获取位于第一视图显示区的应用的包名,及位于第二视图显示区的目标应用的包名;
所述拖动动画应用根据所述项目被选中的位置,确定所述项目需要被位于第二视图显示区的目标应用开启;
所述拖动动画应用确定所述项目能被位于第二视图显示区的目标应用开启,具体包括:所述拖动动画应用通过匹配所述项目的文件类型和位于第二视图显示区的目标应用的包名,确定所述项目能被所述目标应用开启。
在一些示例性的实施方式中,若拖动动画应用确定所述项目不能被位于第二视图显示区的目标应用开启,则绘制所述项目的第二显示图标,所述第二显示图标包括所述项目的图标和指示所述项目不能被所述目标应用开启的图标;
当所述第二显示图标被用户拖动到第二视图显示区后,拖动动画应用绘制用于指示所述项目返回第一视图显示区的动画。
其中,在图20中,总线架构可以包括任意数量的互联的总线和桥,具体由处理器600代表的一个或多个处理器和存储器610代表的存储器的各种电路链接在一起。总线架构还可以将诸如外围设备、稳压器和功率管理电路等之类的各种其他电路链接在一起,这些都是本领域所公知的,因此,本文不再对其进 行进一步描述。总线接口提供接口。
本申请示例性的实施方式中提供了一种显示终端,该显示终端具体可以为桌面计算机、便携式计算机、智能手机、平板电脑、个人数字助理(Personal Digital Assistant,PDA)等。该显示终端可以包括中央处理器(Center Processing Unit,CPU)、存储器、输入/输出设备等,输入设备可以包括键盘、鼠标、触摸屏等,输出设备可以包括显示设备,如液晶显示器(Liquid Crystal Display,LCD)、阴极射线管(Cathode Ray Tube,CRT)等。
针对不同的显示终端,在一些示例性的实施方式中,用户接口620可以是能够外接内接需要设备的接口,连接的设备包括但不限于小键盘、显示器、扬声器、麦克风、操纵杆等。
处理器600负责管理总线架构和通常的处理,存储器610可以存储处理器600在执行操作时所使用的数据。
在一些示例性的实施方式中,处理器600可以是CPU(中央处埋器)、ASIC(Application Specific Integrated Circuit,专用集成电路)、FPGA(Field-Programmable Gate Array,现场可编程门阵列)或CPLD(Complex Programmable Logic Device,复杂可编程逻辑器件)。
存储器610可以包括只读存储器(ROM)和随机存取存储器(RAM),并向处理器提供存储器中存储的程序指令和数据。在本申请实施例中,存储器可以用于存储本申请实施例提供的任一所述方法的程序。
处理器通过调用存储器存储的程序指令,处理器用于按照获得的程序指令执行本申请实施例提供的任一所述方法。
本申请示例性的实施方式中提供了一种计算机存储介质,用于储存为上述 本申请实施例提供的装置所用的计算机程序指令,其包含用于执行上述本申请实施例提供的任一方法的程序。
所述计算机存储介质可以是计算机能够存取的任何可用介质或数据存储设备,包括但不限于磁性存储器(例如软盘、硬盘、磁带、磁光盘(MO)等)、光学存储器(例如CD、DVD、BD、HVD等)、以及半导体存储器(例如ROM、EPROM、EEPROM、非易失性存储器(NAND FLASH)、固态硬盘(SSD))等。
综上所述,本申请示例性的实施方式中提供了一种项目开启方法及装置、显示设备,从而在分屏模式下,通过应用之间的拖动实现项目在指定的位置打开。
本领域内的技术人员应明白,本申请的一些示例性实施例可提供为方法、系统、或计算机程序产品。因此,本申请可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且,本申请可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器和光学存储器等)上实施的计算机程序产品的形式。
本申请是参照根据本申请实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
显然,本领域的技术人员可以对本申请进行各种改动和变型而不脱离本申请的精神和范围。这样,倘若本申请的这些修改和变型属于本申请权利要求及其等同技术的范围之内,则本申请也意图包含这些改动和变型在内。

Claims (11)

  1. 一种显示设备,其特征在于,包括:
    显示器,该显示器被配置为显示用户界面,其中,所述用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面,其中,各个所述视图显示区中包括布局一个或多个不同项目,以及,该用户界面中还包括指示所述项目被选择的选择器,可通过用户输入而移动所述选择器在所述用户界面中的位置,以使所述项目根据所述选择器而移动;
    与所述显示器通信的控制器,所述控制器被配置为执行呈现所述用户界面;
    控制器接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
    所述控制器获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置;
    若控制器确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
    检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目。
  2. 根据权利要求1所述的显示设备,其特征在于,控制器接收到位于第一视图显示区的项目被用户选中的指令之后,根据所述存储路径获取所述项目的文件类型,并取消所述项目在所述起始应用中被用户选中的事件。
  3. 根据权利要求2所述的显示设备,其特征在于,所述控制器确定所述项目能被位于第二视图显示区的目标应用开启之前,所述控制器还用于:
    获取位于第一视图显示区的应用的包名,及位于第二视图显示区的目标应用的包名;
    根据所述项目被选中的位置,确定所述项目需要被位于第二视图显示区的目标应用开启;
    所述控制器确定所述项目能被位于第二视图显示区的目标应用开启,具体包括:所述控制器通过匹配所述项目的文件类型和位于第二视图显示区的目标应用的包名,确定所述项目能被所述目标应用开启。
  4. 根据权利要求3所述的显示设备,其特征在于,所述控制器还用于:
    若控制器确定所述项目不能被位于第二视图显示区的目标应用开启,则绘制所述项目的第二显示图标,所述第二显示图标包括所述项目的图标和指示所述项目不能被所述目标应用开启的图标;
    当所述第二显示图标被用户拖动到第二视图显示区后,绘制用于指示所述项目返回第一视图显示区的动画。
  5. 根据权利要求3所述的显示设备,其特征在于,所述控制器当检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目时,具体用于:
    当检测到所述第一显示图标的任一位置坐标大于所述边界位置坐标变化到所述任一位置坐标小于所述边界位置坐标时,采用所述目标应用开启所述项目;
    或者,当检测到所述第一显示图标的任一位置坐标小于所述边界位置坐标变化到所述任一位置坐标大于所述边界位置坐标时,采用所述目标应用开启所述项目;
    或者,当检测到用于拖动第一显示图标的物体的位置坐标大于所述边界位 置坐标变化到所述物体的位置坐标小于所述边界位置坐标时,采用所述目标应用开启所述项目;或者,当检测到用于拖动第一显示图标的物体的位置坐标小于所述边界位置坐标变化到所述物体的位置坐标大于所述边界位置坐标时,采用所述目标应用开启所述项目。
  6. 一种项目开启方法,其特征在于,该方法包括:
    接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
    获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置,其中,用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面;
    若确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
    检测到所述第一显示图标被用户拖动到第二视图显示区,采用所述目标应用开启所述项目。
  7. 根据权利要求6所述的方法,其特征在于,所述指令包括所述项目的存储路径和接收到用于被用户选中并移动的指令时所述项目所处的位置;
    所述拖动动画应用接收到所述项目被选中的消息之后,取消所述项目在所述起始应用中被用户选中的事件。
  8. 根据权利要求7所述的方法,其特征在于,所述拖动动画应用确定所述项目能被位于第二视图显示区的目标应用开启之前,该方法还包括:
    所述拖动动画应用获取位于第一视图显示区的应用的包名,及位于第二视 图显示区的目标应用的包名;
    所述拖动动画应用根据所述项目被选中的位置,确定所述项目需要被位于第二视图显示区的目标应用开启;
    所述拖动动画应用确定所述项目能被位于第二视图显示区的目标应用开启,具体包括:所述拖动动画应用通过匹配所述项目的文件类型和位于第二视图显示区的目标应用的包名,确定所述项目能被所述目标应用开启。
  9. 根据权利要求8所述的方法,其特征在于,该方法还包括:
    若拖动动画应用确定所述项目不能被位于第二视图显示区的目标应用开启,则绘制所述项目的第二显示图标,所述第二显示图标包括所述项目的图标和指示所述项目不能被所述目标应用开启的图标;
    当所述第二显示图标被用户拖动到第二视图显示区后,拖动动画应用绘制用于指示所述项目返回第一视图显示区的动画。
  10. 一种项目开启装置,其特征在于,该装置包括:
    第一单元,用于接收位于第一视图显示区的项目被用户选中并用于移动的指令,其中,所述指令包括所述项目的存储路径和当前被选择的位置;
    第二单元,用于获取所述起始应用和所述目标应用的属性以及所述第一视图显示区和所述第二视图显示区的边界位置,其中,用户界面包括第一视图显示区和第二视图显示区,该第一视图显示区为起始应用的用户界面,第二视图显示区为目标应用的用户界面;
    第三单元,用于若确定所述项目能被位于第二视图显示区的目标应用开启,则绘制所述项目的第一显示图标;
    第四单元,用于检测到所述第一显示图标被用户拖动到第二视图显示区, 采用所述目标应用开启所述项目。
  11. 一种计算机存储介质,其特征在于,所述计算机存储介质存储有计算机可执行指令,所述计算机可执行指令用于使所述计算机执行权利要求9所述的方法。
PCT/CN2020/079703 2019-06-21 2020-03-17 一种项目开启方法及装置、显示设备 WO2020253282A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910540234.8 2019-06-21
CN201910540234.8A CN112199124B (zh) 2019-06-21 2019-06-21 一种项目开启方法及装置、显示设备

Publications (1)

Publication Number Publication Date
WO2020253282A1 true WO2020253282A1 (zh) 2020-12-24

Family

ID=74004645

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/079703 WO2020253282A1 (zh) 2019-06-21 2020-03-17 一种项目开启方法及装置、显示设备

Country Status (2)

Country Link
CN (1) CN112199124B (zh)
WO (1) WO2020253282A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974446A (zh) * 2023-09-18 2023-10-31 荣耀终端有限公司 一种动画效果的显示方法及装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103279269A (zh) * 2013-05-31 2013-09-04 华为技术有限公司 一种应用程序之间的数据交互方法及装置、终端设备
US20160139776A1 (en) * 2014-11-13 2016-05-19 Microsoft Technology Licensing Content Transfer to Non-Running Targets
CN106055246A (zh) * 2016-05-25 2016-10-26 努比亚技术有限公司 一种移动终端及其操作方法
CN106502557A (zh) * 2016-09-14 2017-03-15 深圳众思科技有限公司 一种分屏传输文件的方法及装置
CN107066172A (zh) * 2017-02-16 2017-08-18 北京小米移动软件有限公司 移动终端的文件传输方法及装置
CN109782976A (zh) * 2019-01-15 2019-05-21 Oppo广东移动通信有限公司 文件处理方法、装置、终端及存储介质

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9529494B2 (en) * 2011-09-27 2016-12-27 Z124 Unified desktop triad control user interface for a browser
US8773378B2 (en) * 2010-10-01 2014-07-08 Z124 Smartpad split screen
CN102156605B (zh) * 2010-02-12 2013-01-09 宏碁股份有限公司 物件移动方法、物件移动系统及电子装置
CN104133629A (zh) * 2014-07-10 2014-11-05 深圳市中兴移动通信有限公司 双屏互动的方法及移动终端
CN106250081A (zh) * 2016-07-29 2016-12-21 努比亚技术有限公司 一种基于双屏终端的显示方法和装置
CN107977152A (zh) * 2017-11-30 2018-05-01 努比亚技术有限公司 一种基于双屏移动终端的图片分享方法、终端和存储介质
CN109491632A (zh) * 2018-10-30 2019-03-19 维沃移动通信有限公司 一种资源分享方法及终端
CN109618206B (zh) * 2019-01-24 2021-11-05 海信视像科技股份有限公司 呈现用户界面的方法和显示设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103279269A (zh) * 2013-05-31 2013-09-04 华为技术有限公司 一种应用程序之间的数据交互方法及装置、终端设备
US20160139776A1 (en) * 2014-11-13 2016-05-19 Microsoft Technology Licensing Content Transfer to Non-Running Targets
CN106055246A (zh) * 2016-05-25 2016-10-26 努比亚技术有限公司 一种移动终端及其操作方法
CN106502557A (zh) * 2016-09-14 2017-03-15 深圳众思科技有限公司 一种分屏传输文件的方法及装置
CN107066172A (zh) * 2017-02-16 2017-08-18 北京小米移动软件有限公司 移动终端的文件传输方法及装置
CN109782976A (zh) * 2019-01-15 2019-05-21 Oppo广东移动通信有限公司 文件处理方法、装置、终端及存储介质

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974446A (zh) * 2023-09-18 2023-10-31 荣耀终端有限公司 一种动画效果的显示方法及装置

Also Published As

Publication number Publication date
CN112199124B (zh) 2022-07-01
CN112199124A (zh) 2021-01-08

Similar Documents

Publication Publication Date Title
US10963139B2 (en) Operating method for multiple windows and electronic device supporting the same
US11635869B2 (en) Display device and method of controlling the same
US10613701B2 (en) Customizable bladed applications
CN109164964B (zh) 内容分享方法、装置、终端及存储介质
EP2701054B1 (en) Method and apparatus for constructing a home screen in a terminal having a touch screen
US8881047B2 (en) Systems and methods for dynamic background user interface(s)
WO2020063091A1 (zh) 一种图片处理方法及终端设备
TWI612467B (zh) 行動裝置及其執行應用程式的方法
CN111142730B (zh) 一种分屏显示方法及电子设备
US9864443B2 (en) Method for controlling user input and electronic device thereof
US20120026105A1 (en) Electronic device and method thereof for transmitting data
US20160349946A1 (en) User terminal apparatus and control method thereof
CN111078076A (zh) 一种应用程序切换方法及电子设备
CN113672290B (zh) 一种文件打开方法及设备
CN110865765A (zh) 终端及地图控制方法
WO2020253282A1 (zh) 一种项目开启方法及装置、显示设备
CN107728898B (zh) 一种信息处理方法及移动终端
CN113672289B (zh) 一种文件打开方法及设备
KR101352506B1 (ko) 단말 장치에서의 아이템 표시 방법 및 그 방법에 따른 단말 장치
KR20210022027A (ko) 멀티윈도우 운용 방법 및 이를 지원하는 전자 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20826666

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20826666

Country of ref document: EP

Kind code of ref document: A1