CN112199124A - Project opening method and device and display equipment - Google Patents

Project opening method and device and display equipment

Info

Publication number
CN112199124A
Authority
CN
China
Prior art keywords
item
display area
view display
application
target application
Prior art date
Legal status
Granted
Application number
CN201910540234.8A
Other languages
Chinese (zh)
Other versions
CN112199124B (en)
Inventor
张振宝
申静
张耀仁
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN201910540234.8A priority Critical patent/CN112199124B/en
Priority to PCT/CN2020/079703 priority patent/WO2020253282A1/en
Publication of CN112199124A publication Critical patent/CN112199124A/en
Application granted granted Critical
Publication of CN112199124B publication Critical patent/CN112199124B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/445 Program loading or initiating
    • G06F 9/44505 Configuring for program initiating, e.g. using registry, configuration files
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 Drag-and-drop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1431 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using a single graphics controller
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/445 Program loading or initiating

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application discloses an item opening method and apparatus and a display device, so that, in a split-screen mode, an item can be opened at a designated position by dragging it between applications. The item opening method provided by an embodiment of the present application includes the following steps: receiving an instruction indicating that an item located in a first view display area has been selected by a user for moving; acquiring attributes of the starting application and the target application and the boundary position between the first view display area and the second view display area, wherein the user interface includes the first view display area and the second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application; if it is determined that the item can be opened by the target application located in the second view display area, drawing a first display icon of the item; and upon detecting that the first display icon is dragged to the second view display area by the user, opening the item with the target application.

Description

Project opening method and device and display equipment
Technical Field
The present application relates to the field of display technology, and in particular to an item opening method and apparatus and a display device.
Background
In split-screen mode, each application in a split-screen window has no information about the other applications running in split-screen mode. Therefore, when a resource is selected in the application of one split-screen window, it cannot be known whether that resource can be opened by the applications in the other split-screen windows, and the other applications in split-screen mode cannot be designated to open the resource selected in the current application. In addition, owing to limitations of the Android system, interaction between applications in split-screen mode can only be initiated through a specified interface, and current operating systems cannot open a resource by dragging it from one application to another.
Disclosure of Invention
The present application provides an item opening method and apparatus and a display device, which are used to open an item at a designated position by dragging it between applications in split-screen mode.
According to an aspect of exemplary embodiments, there is provided a display apparatus including:
a display configured to display a user interface, wherein the user interface includes a first view display area and a second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application; each view display area includes one or more different items arranged therein; and the user interface further includes a selector indicating that an item is selected, the position of the selector in the user interface being movable by user input so that the item moves with the selector;
a controller in communication with the display, the controller configured to perform presenting the user interface;
the method comprises the steps that a controller receives an instruction that an item located in a first view display area is selected by a user and used for moving, wherein the instruction comprises a storage path and a currently selected position of the item;
the controller acquires attributes of the starting application and the target application and boundary positions of the first view display area and the second view display area;
if the controller determines that the item can be opened by the target application located in the second view display area, a first display icon of the item is drawn;
and after detecting that the first display icon has been dragged to the second view display area by the user, the item is opened with the target application.
In this way, the controller receives an instruction indicating that an item located in the first view display area has been selected by a user for moving, wherein the instruction includes a storage path of the item and its currently selected position; the controller acquires attributes of the starting application and the target application and the boundary position between the first view display area and the second view display area; if the controller determines that the item can be opened by the target application located in the second view display area, a first display icon of the item is drawn; and after the first display icon is detected to have been dragged to the second view display area by the user, the item is opened with the target application, so that in split-screen mode the item is opened at a designated position by dragging between applications.
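The controller flow described above can be sketched end to end. The following Python sketch is illustrative only: the class name, the extension-based type check, the package name, and the single vertical boundary coordinate are assumptions made for the sketch, not details fixed by this application.

```python
# Hypothetical sketch of the controller's drag-to-open flow: receive the
# selection instruction, decide which display icon to draw, and open the
# item once the icon crosses the split-screen boundary.

class DragOpenController:
    def __init__(self, boundary_x, target_package, supported_types):
        self.boundary_x = boundary_x          # boundary between the two view display areas
        self.target_package = target_package  # package name of the target application
        self.supported_types = supported_types  # file types the target app can open (assumed)
        self.icon = None                      # "first" or "second" display icon
        self.position = None
        self.opened_with = None

    def on_item_selected(self, storage_path, position):
        """Receive the instruction (storage path + selected position),
        derive the file type from the path, and draw the first or second
        display icon depending on whether the target app can open it."""
        file_type = storage_path.rsplit(".", 1)[-1].lower()
        self.icon = "first" if file_type in self.supported_types else "second"
        self.position = position

    def on_drag(self, new_position):
        """Once the first display icon crosses the boundary into the second
        view display area, open the item with the target application."""
        crossed = (self.position[0] > self.boundary_x) != (new_position[0] > self.boundary_x)
        self.position = new_position
        if crossed and self.icon == "first":
            self.opened_with = self.target_package
```

A drag of a `.jpg` item into a (hypothetical) gallery application's area would set `opened_with`; an unsupported type instead gets the second display icon and is never opened.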
In some exemplary embodiments, after receiving the instruction indicating that an item located in the first view display area has been selected by a user, the controller obtains the file type of the item according to the storage path, and cancels the selection event of the item in the starting application.
In some exemplary embodiments, the controller is further configured to, before determining that the item can be opened by the target application located in the second view display area:
acquiring the package name of the starting application located in the first view display area and the package name of the target application located in the second view display area;
determining, according to the selected position of the item, that the item needs to be opened by the target application located in the second view display area;
the controller determining that the item can be opened by the target application located in the second view display area specifically includes: the controller determines that the item can be opened by the target application by matching the file type of the item against the package name of the target application located in the second view display area.
In some exemplary embodiments, the controller is further configured to:
if the controller determines that the item cannot be opened by the target application located in the second view display area, draw a second display icon of the item, wherein the second display icon includes an icon of the item and an icon indicating that the item cannot be opened by the target application;
and after the second display icon is dragged to the second view display area by the user, draw an animation indicating that the item returns to the first view display area.
In some exemplary embodiments, when detecting that the first display icon is dragged to the second view display area by the user and opening the item with the target application, the controller is specifically configured to:
open the item with the target application when it is detected that any position coordinate of the first display icon changes from being greater than the boundary position coordinate to being smaller than the boundary position coordinate;
or open the item with the target application when it is detected that any position coordinate of the first display icon changes from being smaller than the boundary position coordinate to being greater than the boundary position coordinate;
or open the item with the target application when it is detected that the position coordinate of the object dragging the first display icon changes from being greater than the boundary position coordinate to being smaller than the boundary position coordinate;
or open the item with the target application when it is detected that the position coordinate of the object dragging the first display icon changes from being smaller than the boundary position coordinate to being greater than the boundary position coordinate.
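The four alternatives above reduce to one symmetric check: open the item when the tracked coordinate (either the first display icon's or the dragging object's) moves to the other side of the boundary. A minimal sketch, assuming a vertical split with a single boundary x coordinate (the names are illustrative):

```python
# Boundary-crossing check for the drag-to-open decision, assuming the two
# view display areas are separated by a vertical boundary at boundary_x.

def crossed_boundary(prev_x: float, curr_x: float, boundary_x: float) -> bool:
    """True when the tracked coordinate moves from one side of the
    split-screen boundary to the other, in either direction."""
    return (prev_x > boundary_x) != (curr_x > boundary_x)

def should_open(prev_pos, curr_pos, boundary_x) -> bool:
    """The controller opens the item with the target application once the
    icon's (or the dragging finger's) coordinate crosses the boundary."""
    return crossed_boundary(prev_pos[0], curr_pos[0], boundary_x)
```

For a horizontal split the same check would apply to the y coordinate instead.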
An exemplary embodiment of the present application provides a project opening method, including:
receiving an instruction indicating that an item located in a first view display area has been selected by a user for moving, wherein the instruction includes a storage path of the item and its currently selected position;
acquiring attributes of a starting application and a target application and boundary positions of a first view display area and a second view display area, wherein a user interface comprises the first view display area and the second view display area, the first view display area is a user interface of the starting application, and the second view display area is a user interface of the target application;
if it is determined that the item can be opened by the target application located in the second view display area, drawing a first display icon of the item;
and upon detecting that the first display icon is dragged to the second view display area by the user, opening the item with the target application.
By this method, an instruction indicating that an item located in the first view display area has been selected by a user for moving is received, wherein the instruction includes a storage path of the item and its currently selected position; attributes of the starting application and the target application and the boundary position between the first view display area and the second view display area are acquired, wherein the user interface includes the first view display area and the second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application; if it is determined that the item can be opened by the target application located in the second view display area, a first display icon of the item is drawn; and upon detecting that the first display icon is dragged to the second view display area by the user, the item is opened with the target application, so that in split-screen mode the item is opened at a designated position by dragging between applications.
In some exemplary embodiments, the instruction includes the storage path of the item and the position where the item was located when it was selected by the user for moving;
and after receiving the message that the item has been selected, the drag animation application cancels the selection event of the item in the starting application.
In some exemplary embodiments, before the drag animation application determines that the item can be opened by the target application located in the second view display area, the method further includes:
the drag animation application acquiring the package name of the starting application located in the first view display area and the package name of the target application located in the second view display area;
the drag animation application determining, according to the selected position of the item, that the item needs to be opened by the target application located in the second view display area;
the drag animation application determining that the item can be opened by the target application located in the second view display area specifically includes: the drag animation application determines that the item can be opened by the target application by matching the file type of the item against the package name of the target application located in the second view display area.
In some exemplary embodiments, if the drag animation application determines that the item cannot be opened by the target application located in the second view display area, a second display icon of the item is drawn, the second display icon including an icon of the item and an icon indicating that the item cannot be opened by the target application;
and after the second display icon is dragged to the second view display area by the user, the drag animation application draws an animation indicating that the item returns to the first view display area.
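The return animation can be sketched as interpolated icon positions from the drop point back to the item's original position. Linear interpolation and the frame count here are assumptions; the application does not fix the easing of the animation.

```python
# Hypothetical sketch of the "return" animation: when the item cannot be
# opened by the target application, successive frames move the second
# display icon from the drop point back to the item's original position.

def return_animation_frames(drop_pos, origin_pos, steps: int = 10):
    """Yield icon positions from the drop point back to the origin,
    using linear interpolation (an assumption for the sketch)."""
    (x0, y0), (x1, y1) = drop_pos, origin_pos
    for i in range(1, steps + 1):
        t = i / steps
        yield (x0 + (x1 - x0) * t, y0 + (y1 - y0) * t)
```

The drag animation application would draw the icon at each yielded position; the final frame coincides with the item's original position in the first view display area.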
Accordingly, on the apparatus side, an exemplary embodiment of the present application provides an item opening apparatus, including:
a first unit, configured to receive an instruction indicating that an item located in the first view display area has been selected by a user for moving, wherein the instruction includes a storage path of the item and its currently selected position;
a second unit, configured to acquire attributes of the starting application and the target application and the boundary position between the first view display area and the second view display area, wherein the user interface includes the first view display area and the second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application;
a third unit, configured to draw a first display icon of the item if it is determined that the item can be opened by the target application located in the second view display area;
and a fourth unit, configured to, upon detecting that the first display icon is dragged to the second view display area by the user, open the item with the target application.
An exemplary embodiment of the present application further provides a computing device, including:
a memory for storing program instructions;
and a processor, configured to call the program instructions stored in the memory and, according to the obtained program, execute any of the methods provided by the embodiments of the present application.
An exemplary embodiment of the present application further provides a computer storage medium having stored thereon computer-executable instructions for causing a computer to perform any of the methods described above.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment;
fig. 2 is a block diagram schematically showing a configuration of the control apparatus 100 according to an exemplary embodiment;
fig. 3 is a diagram schematically illustrating a functional configuration of the display device 200 according to an exemplary embodiment;
fig. 4 exemplarily shows a configuration diagram of a software system in the display device 200 according to an exemplary embodiment;
FIG. 5 is a schematic diagram illustrating a user interface in the display device 200 according to an exemplary embodiment;
fig. 6 to 8 are diagrams illustrating operations between a user interface and a user instruction in the display apparatus 200 according to an exemplary embodiment;
fig. 9 is a diagram illustrating positions of an a application and a B application in a split screen mode according to an exemplary embodiment;
FIG. 10 is a diagram illustrating the effect of opening an item in application A according to the prior art in the split-screen mode, according to an exemplary embodiment;
FIG. 11 is a diagram illustrating a project opening method according to an exemplary embodiment;
FIG. 12 is a flow diagram illustrating a method for opening an item by dragging the item according to an exemplary embodiment;
FIG. 13 is an animation diagram illustrating an item drag process according to an exemplary embodiment;
fig. 14 is a diagram illustrating determination of the split-screen position where a display icon is located, according to an exemplary embodiment;
FIG. 15 is a diagram illustrating an existing item opening method in accordance with an illustrative embodiment;
fig. 16 to 18 are diagrams exemplarily illustrating a method of opening an item by item drag according to an exemplary embodiment;
FIG. 19 is a schematic diagram of an item opening device according to an exemplary embodiment;
fig. 20 is another schematic diagram of an item opening device according to an exemplary embodiment.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be understood that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and are not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can, for example, be implemented in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Various embodiments of the present application will be described in detail below with reference to the accompanying drawings. It should be noted that the display sequence of the embodiment of the present application only represents the sequence of the embodiment, and does not represent the merits of the technical solutions provided by the embodiments.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the control device 100.
The display device 200 may be a liquid crystal display, an OLED display, or a projection display device. The specific display device type, size, resolution, etc. are not limited here, and those skilled in the art will appreciate that the performance and configuration of the display device 200 may be varied as needed.
Fig. 2 is a block diagram schematically showing the configuration of the control apparatus 100 according to the exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
The control apparatus 100 is configured to control the display device 200: it may receive input operation instructions from a user and convert them into instructions that the display device 200 can recognize and respond to, serving as an intermediary for interaction between the user and the display device 200. For example: the user operates the channel up/down keys on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications that control the display device 200 according to user demands.
In some embodiments, as shown in fig. 1, the mobile terminal 100B or another intelligent electronic device may perform a function similar to that of the control apparatus 100 after an application for manipulating the display device 200 is installed. For example: by installing such an application, the user may implement the functions of the physical keys of the control apparatus 100 through various function keys or virtual buttons of a graphical user interface available on the mobile terminal 100B or other intelligent electronic devices.
The controller 110 includes a processor 112, a RAM 113, a ROM 114, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control apparatus 100, to manage communication and coordination among internal components, and to handle external and internal data processing.
The communicator 130 enables communication of control signals and data signals with the display apparatus 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display apparatus 200. The communicator 130 may include at least one of a WIFI module 131, a bluetooth module 132, an NFC module 133, and the like.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like. For example: the user can input user instructions through actions such as voice, touch, gesture, and pressing; the input interface converts the received analog signal into a digital signal, converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, it may be an infrared interface or a radio frequency interface. For example: when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, and the infrared control signal is sent to the display device 200 through the infrared sending module. For another example: when the radio frequency signal interface is used, the user input instruction needs to be converted into a digital signal, which is then modulated according to the radio frequency control signal modulation protocol and transmitted to the display device 200 through the radio frequency transmitting terminal.
In some embodiments, the control device 100 includes at least one of the communicator 130 and the output interface. With the communicator 130 configured in the control device 100, e.g. WIFI, Bluetooth, or NFC modules, the user input instruction may be encoded and sent to the display device 200 through the WIFI protocol, the Bluetooth protocol, or the NFC protocol.
The memory 190 is used for storing various operation programs, data, and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 190 may store various control signal instructions input by a user.
The power supply 180 is used for providing operational power support to the components of the control device 100 under the control of the controller 110, and may include a battery and associated control circuitry.
The controller 110 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display device 200, the controller 110 may perform an operation related to the object selected by the user command.
Fig. 3 is a diagram schematically illustrating a functional configuration of the display device 200 according to an exemplary embodiment. As shown in fig. 3, the memory 290 is used to store an operating system, an application program, contents, user data, and the like, and performs system operations for driving the display device 200 and various operations in response to a user under the control of the controller 110. The memory 290 may include volatile and/or nonvolatile memory.
The memory 290 is specifically used for storing an operating program for driving the controller 110 in the display device 200, and storing various applications installed in the display device 200, various applications downloaded by a user from an external device, various graphical user interfaces related to the applications, various objects related to the graphical user interfaces, user data information, and internal data of various supported applications. The memory 290 is used to store system software such as an Operating System (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
The memory 290 is specifically used for storing drivers for, and data related to, the video processor 260-1, the audio processor 260-2, the display 280, the communication interface 230, the tuner demodulator 220, the detector 240, the input/output interface, and the like.
In some embodiments, memory 290 may store software and/or programs, software programs for representing an Operating System (OS) including, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. For example, the kernel may control or manage system resources, or functions implemented by other programs (e.g., the middleware, APIs, or applications), and the kernel may provide interfaces to allow the middleware and APIs, or applications, to access the controller to implement controlling or managing system resources.
The memory 290, for example, includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external instruction recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, and other applications 2912, a browser module, and the like. The controller 110 performs functions such as: a broadcast television signal reception demodulation function, a television channel selection control function, a volume selection control function, an image control function, a display control function, an audio control function, an external instruction recognition function, a communication control function, an optical signal reception function, an electric power control function, a software control platform supporting various functions, a browser function, and the like.
Illustratively, the memory 290 includes various software modules for driving and controlling the display device 200. Such as: various software modules stored in memory 290, including: the system comprises a basic module, a detection module, a communication module, a display control module, a browser module, various service modules and the like.
The basic module is a bottom-layer software module for signal communication among the hardware in the display device 200 and for sending processing and control signals to upper-layer modules. The detection module is a management module for collecting various information from the sensors or user input interfaces, and for performing digital-to-analog conversion and analysis management.
A block diagram of a configuration of a software system in the display device 200 according to an exemplary embodiment is exemplarily shown in fig. 4.
As shown in fig. 4, an operating system 2911, including executing operating software for handling various basic system services and for performing hardware related tasks, acts as an intermediary between application programs and hardware components for performing data processing.
In some embodiments, portions of the operating system kernel may contain a series of software to manage the display device hardware resources and provide services to other programs or software code.
In other embodiments, portions of the operating system kernel may include one or more device drivers, which may be a set of software code in the operating system that assists in operating or controlling the devices or hardware associated with the display device. The drivers may contain code that operates the video, audio, and/or other multimedia components. Examples include a display screen, a camera, Flash, WiFi, and audio drivers.
The accessibility module 2911-1 is configured to modify or access the application program to achieve accessibility and operability of the application program for displaying content.
A communication module 2911-2 for connection to other peripherals via associated communication interfaces and a communication network.
The user interface module 2911-3 is configured to provide an object for displaying a user interface, so that each application program can access the object, and user operability can be achieved.
Control applications 2911-4 for controlling process management, including runtime applications and the like.
The event transmission system 2914 may be implemented within the operating system 2911 or within the applications 2912. In some embodiments, it is implemented partly within the operating system 2911 and partly within the applications 2912, and is used to listen for various user input events and to execute one or more sets of predefined operations in response to the recognition of various types of events or sub-events.
The event monitoring module 2914-1 is configured to monitor an event or a sub-event input by the user input interface.
The event identification module 2914-2 is configured to store the definitions of the various types of events expected from the various user input interfaces, identify incoming events or sub-events, and transmit them to the processes that execute the corresponding set or sets of handlers.
The event or sub-event refers to an input detected by one or more sensors in the display device 200, or an input from an external control device (e.g., the control apparatus 100), such as: a voice input sub-event, a gesture sub-event recognized through gesture recognition, a remote-control key command input from the control device, and the like. Illustratively, the sub-events from the remote control include, but are not limited to, one or a combination of up/down/left/right key presses, the OK key, long key presses, and the like, as well as non-physical key operations such as move, hold, and release.
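As a rough, platform-agnostic sketch of how such an event transmission system might map raw inputs to sub-events and dispatch them to listeners (the sub-event names, raw codes, and class layout below are illustrative, not from the patent):

```python
from enum import Enum, auto

class SubEvent(Enum):
    KEY_UP = auto()
    KEY_DOWN = auto()
    KEY_LEFT = auto()
    KEY_RIGHT = auto()
    KEY_OK = auto()

# Hypothetical mapping from raw remote-control codes to sub-events;
# actual codes depend on the remote-control protocol in use.
RAW_CODE_TO_SUBEVENT = {
    0x19: SubEvent.KEY_UP,
    0x1A: SubEvent.KEY_DOWN,
    0x1B: SubEvent.KEY_LEFT,
    0x1C: SubEvent.KEY_RIGHT,
    0x0D: SubEvent.KEY_OK,
}

class EventTransmissionSystem:
    """Listens for raw inputs and dispatches recognized sub-events
    to registered handlers (cf. modules 2914-1 and 2914-2)."""

    def __init__(self):
        self._handlers = {}

    def register(self, sub_event, handler):
        self._handlers.setdefault(sub_event, []).append(handler)

    def on_raw_input(self, raw_code):
        sub_event = RAW_CODE_TO_SUBEVENT.get(raw_code)
        if sub_event is None:
            return None  # unrecognized input is not distributed further
        for handler in self._handlers.get(sub_event, []):
            handler(sub_event)
        return sub_event
```

In this sketch, an unrecognized raw code is simply dropped, which corresponds to the event identification module deciding not to distribute an event further.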
The interface layout management module 2913, directly or indirectly receiving the input events or sub-events from the event transmission system 2914, monitors the input events or sub-events, and updates the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, and level of the container, which are related to the layout of the interface.
A schematic view of a user interface in a display device 200 according to an exemplary embodiment is illustrated in fig. 5. As shown in FIG. 5, the user interface includes a plurality of view display areas, illustratively a first view display area 201 and a second view display area 202, each of which includes a layout of one or more different items. And a selector in the user interface indicating that any one of the items is selected, the position of the selector being movable by user input to change the selection of a different item.
It should be noted that the boundaries of the plurality of view display areas may be visible or invisible. For example, different view display areas may be distinguished by different background colors, by visible marks such as boundary lines, or by invisible boundaries. It is also possible that no visible or invisible boundary exists at all, and the related items in a certain area of the screen simply share the same changing properties in size and/or arrangement; that area is then regarded as delimiting a single view partition. For example: the items in the first view display area 201 are zoomed in or out simultaneously, while the second view display area 202 changes differently.
Wherein, in some embodiments, the first view display area 201 is a scalable view display area. "Scalable" may mean that the first view display area 201 can change in size or proportion on the screen, or that the items in the first view display area 201 can change in size or proportion on the screen. The first view display area 201 may also be a scroll view display area, in which the number of items displayed on the screen can be updated by scrolling in response to a user input.
"item" refers to a visual object displayed in each view display area of the user interface in the display device 200 to represent corresponding content such as icons, thumbnails, video clips, and the like. For example: the items may represent movies, image content or video clips of a television show, audio content of music, applications, or other user access content history information.
In some embodiments, an "item" may be displayed as an image thumbnail. For example: when the item is a movie or a TV show, the item may be displayed as a poster of the movie or TV show. If the item is music, a poster of the music album may be displayed. If the item is an application, it may be displayed as the application's icon, or as a screenshot of the application's content captured when it was most recently executed. If the item is user access history, a content screenshot from the most recent execution may be displayed. An "item" may also be displayed as a video clip, such as a dynamic video clip of a trailer of a movie or TV show.
The items may or may not be the same size. In some implementation examples, the size of the items may be varied.
A "selector" is used to indicate that an item has been selected, such as a cursor or a focus object. Selection input positioned according to the icon or menu location touched by the user in the display device 200 may cause the focus object displayed in the display device 200 to move so as to select a control or item, and one or more items may be selected or controlled.
The focus object refers to an object that moves between items according to user input. Illustratively, the focus object position may be identified by drawing a thick line along the item's edge. In other embodiments, the focus form is not limited to this example; it may be tangible or intangible, such as a cursor recognizable by the user, a 3D deformation of the item, or a change to the border lines, size, color, transparency, outline, and/or font of the text or image of the focused item.
In some embodiments, different content or links are associated with each item in the first view display area 201 and the second view display area 202. Illustratively, each item in the first view display area 201 is a thumbnail of a poster or a video clip, and text and/or icons of various applications are displayed in the second view display area 202. The first view display area 201 and the second view display area 202 each present different applications, and illustratively, the first view display area 201 is a user interface displaying a starting application, and the second view display area 202 is a user interface displaying a target application.
The interface layout management module 2913 is configured to monitor the state of the user interface (including the position and/or size of the view partition, the item, the focus or the cursor object, the change process, and the like), and according to the event or the sub-event, may perform modification on the layout of the size, position, hierarchy, and the like of each view display area, and/or adjust or modify the layout of the size, position, number, type, content, and the like of each type of item layout of each view display area. In some embodiments, the layout is modified and adjusted, including displaying or not displaying the view sections or the content of the items in the view sections on the screen.
A user input interface for transmitting an input signal of a user to the controller 110 or transmitting a signal output from the controller to the user. For example, the control device (e.g., a mobile terminal or a remote controller) may send an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by a user to the user input interface, and then the input signal is forwarded to the controller by the user input interface;
in some embodiments, a user may input a user command on a Graphical User Interface (GUI) displayed on the display 200, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
There is provided in an exemplary embodiment of the present application a display device including:
a display configured to display a user interface, wherein the user interface includes a first view display area and a second view display area, the first view display area is a user interface of an initial application, the second view display area is a user interface of a target application, wherein each view display area includes one or more different items arranged therein, and the user interface further includes a selector indicating that the items are selected, and the position of the selector in the user interface can be moved by a user input so that the items move according to the selector;
a controller in communication with the display, the controller configured to perform presenting the user interface;
the method comprises the steps that a controller receives an instruction that an item located in a first view display area is selected by a user and used for moving, wherein the instruction comprises a storage path and a currently selected position of the item;
the controller acquires attributes of the starting application and the target application and boundary positions of the first view display area and the second view display area;
if the controller determines that the item can be started by a target application located in the second view display area, drawing a first display icon of the item;
and after detecting that the first display icon is dragged to a second view display area by a user, opening the item by adopting the target application.
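The controller behavior described above can be sketched as a small, platform-agnostic model. Everything below (class names, the extension-based capability check, the right-hand target area) is illustrative, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class MoveInstruction:
    storage_path: str   # where the item is stored
    position: tuple     # (x, y) where the item was selected

class DragOpenController:
    """Illustrative model of the drag-to-open flow: check whether the
    target app can open the item, draw a display icon, open on drop."""

    def __init__(self, boundary_x, target_app_supported_exts):
        self.boundary_x = boundary_x                     # x of the area boundary
        self.supported = set(target_app_supported_exts)  # e.g. {"jpg", "png"}

    def can_open(self, instruction):
        # File type is derived from the storage path (here: the extension).
        ext = instruction.storage_path.rsplit(".", 1)[-1].lower()
        return ext in self.supported

    def on_select(self, instruction):
        # "Can open" icon (plus sign) vs. "cannot open" icon (stop sign).
        return "first_display_icon" if self.can_open(instruction) else "second_display_icon"

    def on_drop(self, instruction, drop_x):
        in_target_area = drop_x > self.boundary_x  # target area assumed on the right
        if in_target_area and self.can_open(instruction):
            return "open_with_target_app"
        return "return_to_first_area"
```

A usage example: with the boundary at x = 960 and a target app that supports only `jpg`/`png`, dropping a `.jpg` item at x = 1200 opens it with the target app, while dropping it at x = 500 returns it to the first view display area.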
Fig. 6 to 8 are diagrams illustrating operations between a user interface and a user instruction in the display apparatus 200 according to an exemplary embodiment. As an example, as in fig. 6-8, a plurality of different items are laid out in the first-view display area 201, for example, items 1 to 6 are a plurality of different posters including text and image thumbnails. No item is yet displayed in the second view display area 202 (it is not limited herein whether the second view display area displays an item, and of course, items such as pictures may also be displayed).
As in fig. 6, the user implements manipulation of item 5 in display device 200 by triggering an instruction for selecting (e.g., by long-pressing item 5) and moving item 5 (resource 5); when the user drags the displayed icon of item 5 at the user interface of the display apparatus 200, the user interface changes to that shown in fig. 7.
As shown in fig. 7, the display icon of item 5 is located at the boundary between the first view display area 201 and the second view display area 202 (the display icon of item 5 may be dragged by the user to any position of the user interface of the display device 200). The user continues to drag the display icon of item 5; once item 5 has been dragged into the second view display area 202 and the user lifts the finger, the user interface changes to that shown in fig. 8: item 5 is presented directly in the second view display area 202, and the content of item 5 is enlarged or displayed on half the screen in the second view display area 202. Optionally, once the item has been dragged into the second view display area, it may also be opened directly by the target application.
In some exemplary embodiments, after receiving an instruction indicating that an item in the first view display area has been selected by the user for moving, the controller acquires the storage path of the item and the position where the item was selected, and cancels the item's selected event in the starting application; the controller then derives the file type from the item's storage path.
In some exemplary embodiments, the controller is further configured to, before determining that the item can be opened by the target application located in the second view display area:
acquiring a package name of an application positioned in a first view display area and a package name of a target application positioned in a second view display area;
determining that the item needs to be opened by a target application positioned in a second view display area according to the selected position of the item;
the determining, by the controller, that the item can be opened by a target application located in the second view display area specifically includes: the controller determines that the item can be opened by the target application by matching a file type of the item with a package name of the target application located in the second view display area.
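The matching step can be sketched as follows; the mapping from package names to supported file types is hypothetical (in practice it would come from the system's package manager or a user-defined table), and Python's standard `mimetypes` module stands in for whatever type registry the system actually uses:

```python
import mimetypes

# Hypothetical table: package name -> MIME types the app declares it can open.
SUPPORTED_TYPES = {
    "com.example.whiteboard": {"image/jpeg", "image/png"},
    "com.example.wps": {"application/vnd.ms-powerpoint", "image/jpeg"},
}

def can_open(item_path, target_package):
    """Match the item's file type (derived from its storage path)
    against the set of types registered for the target package name."""
    mime, _ = mimetypes.guess_type(item_path)
    return mime in SUPPORTED_TYPES.get(target_package, set())
```

For example, under this table a `.jpg` item matches the whiteboard package, while a `.ppt` item does not.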
In some exemplary embodiments, the controller is further configured to:
if the controller determines that the item cannot be opened by a target application located in a second view display area, drawing a second display icon of the item, wherein the second display icon comprises an icon of the item and an icon indicating that the item cannot be opened by the target application;
and drawing an animation for indicating that the item returns to the first view display area after the second display icon is dragged to the second view display area by the user.
In some exemplary embodiments, when detecting that the first display icon has been dragged into the second view display area by the user and opening the item with the target application, the controller is specifically configured to:
open the item with the target application when detecting that any position coordinate of the first display icon changes from being greater than the boundary position coordinate to being less than the boundary position coordinate;
or, open the item with the target application when detecting that any position coordinate of the first display icon changes from being less than the boundary position coordinate to being greater than the boundary position coordinate;
or, open the item with the target application when detecting that the position coordinate of the object dragging the first display icon changes from being greater than the boundary position coordinate to being less than the boundary position coordinate;
or, open the item with the target application when detecting that the position coordinate of the object dragging the first display icon changes from being less than the boundary position coordinate to being greater than the boundary position coordinate.
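The four crossing conditions above reduce to a single symmetric test on successive x coordinates. A minimal sketch, where the tracker class and names are illustrative, not from the patent:

```python
def crossed_boundary(prev_x, curr_x, boundary_x):
    """True when a tracked x coordinate (of the dragged icon, or of the
    finger dragging it) passes the boundary in either direction."""
    return (prev_x > boundary_x > curr_x) or (prev_x < boundary_x < curr_x)

class DragTracker:
    """Feeds successive coordinates and reports when the boundary is crossed,
    which is the trigger for opening the item with the target application."""

    def __init__(self, boundary_x):
        self.boundary_x = boundary_x
        self.prev_x = None

    def update(self, x):
        crossed = (self.prev_x is not None
                   and crossed_boundary(self.prev_x, x, self.boundary_x))
        self.prev_x = x
        return crossed
```

With the boundary at x = 960, feeding coordinates 900 then 1000 reports a crossing on the second update; a further move to 1100 stays on the same side and reports no crossing.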
Fig. 9 is a schematic diagram illustrating the positions of an A application and a B application in the split-screen mode according to an exemplary embodiment, where the A application is the starting application and is displayed in the first view display area, and the B application is the target application and is displayed in the second view display area. In the split-screen mode, the A application and the B application know nothing about each other; therefore, when an item in the A application is selected, it cannot be known whether that item can be opened by the B application, nor can the B application in the split-screen mode be designated to open the selected item of the A application.
In the split-screen mode, when an Activity is started to open an item, even if it is specified that the item be opened by the application in the other split-screen window, the launched application may directly cover the window where the item was selected rather than opening in the other window (because of the Activity start mode, the current split-screen state is not necessarily maintained). Fig. 10 exemplarily shows the possible effects of opening an item of the A application in the existing split-screen mode. Currently, when an item in the A application is selected and opened in the split-screen mode, four situations may occur: in the first situation, the item is opened by the C application, and the opened interface is at the position of window A; in the second situation, the item is opened by the C application, and the opened interface is at the position of window B; in the third situation, the item is opened by the B application, and the opened interface is at the position of window A; in the fourth situation, the item is opened by the B application, and the opened interface is at the position of window B. The A, B, and C applications are illustrated as follows: for example, the A application is a file manager that displays a PPT document as an item; the B application is an electronic whiteboard tool, which does not support opening PPT documents; a WPS application installed on the device can open the PPT, and that WPS application is the C application.
The method provided by the exemplary embodiment of the present application ensures that the item is opened in the fourth situation of fig. 10; that is, the user is explicitly told by which split-screen application the item can be opened, and opening is implemented by dragging the item from one application to the other application in the split-screen mode.
Fig. 11 illustrates an item opening method according to an exemplary embodiment, including:
S101, receiving an instruction for moving an item located in a first view display area, wherein the instruction comprises the storage path of the item and its currently selected position;
The instruction indicating that the item has been selected by the user for moving includes the user long-pressing the item; the long press is a drag trigger event. When the user long-presses the item, the drag animation application AnimationApp is notified; the AnimationApp receives the long-press event and, at the same time, receives the storage path and the selected position of the item.
S102, acquiring attributes of the starting application and the target application and boundary positions of the first view display area and the second view display area, wherein the user interface comprises the first view display area and the second view display area, the first view display area is a user interface of the starting application, and the second view display area is a user interface of the target application;
S103, if it is determined that the item can be opened by the target application located in the second view display area, drawing a first display icon of the item;
After receiving the instruction for moving toward the target application, the drag animation application acquires the information and split-screen positions of the starting application and the target application, parses the file type of the item from the item's storage path, and determines whether the item can be opened by the target application (the B application).
For example, if the item being pressed for a long time is a picture, the first display icon includes a thumbnail of the item and a green plus sign indicating that the item can be opened by a preset application; if the item which is long-pressed is other item files, the first display icon comprises an icon of the item (the icon is a copy of the long-pressed item icon) and a green plus sign for indicating that the item can be opened by a preset application.
And S104, detecting that the first display icon is dragged to a second view display area by a user, and opening the item by adopting the target application.
For example, in the split-screen mode, the user interface of the smart television or the mobile phone terminal includes a first view display area and a second view display area, where the first view display area is a left screen of the interface and the second view display area is a right screen of the interface, or the first view display area is a right screen of the interface and the second view display area is a left screen of the interface.
In some exemplary embodiments, the instruction includes a stored path of the item and a location where the item was located when the instruction was received for selection and movement by a user;
and after receiving the message that the item is selected, the dragging animation application cancels the event that the item is selected by the user in the starting application.
The message that the item has been long-pressed includes the path information of the long-pressed item; the drag animation application obtains the extension name from the item's path information, and thereby obtains the item's file type. Canceling the item's selected event means, for example, canceling the long press on the item.
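A minimal sketch of deriving the file type from the storage path, using Python's standard `os.path` and `mimetypes` modules as stand-ins for whatever type registry the system actually uses (the example path is hypothetical):

```python
import os
import mimetypes

def file_type_from_path(storage_path):
    """Derive (extension, MIME type) from an item's storage path."""
    ext = os.path.splitext(storage_path)[1].lstrip(".").lower()
    mime, _ = mimetypes.guess_type(storage_path)
    return ext, mime
```

For example, `file_type_from_path("/sdcard/Pictures/s.jpg")` yields `("jpg", "image/jpeg")`, which is the extension and MimeType pair used later when matching against the target application.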
In some exemplary embodiments, the drag animation application determines that the item can be opened by a target application located in the second view display area, and the method further comprises:
the method comprises the steps that the dragging animation application obtains a package name of an application located in a first view display area and a package name of a target application located in a second view display area;
the dragging animation application determines that the item needs to be opened by a target application positioned in a second view display area according to the selected position of the item;
the determining, by the drag animation application, that the item can be opened by a target application located in the second view display area specifically includes: and the dragging animation application determines that the item can be opened by the target application by matching the file type of the item with the package name of the target application positioned in the second view display area.
The package names and the applications are in one-to-one correspondence, and different applications are distinguished by determining the package names, such as package names PackageA and PackageB of left and right split-screen applications.
In some example embodiments, if the drag animation application determines that the item cannot be opened by a target application located in a second view display area, drawing a second displayed icon of the item, the second displayed icon including an icon of the item and an icon indicating that the item cannot be opened by the target application;
for example, if the item being pressed for a long time is a picture, the second display icon includes a thumbnail of the item and a red stop number for indicating that the item cannot be opened by a preset application; if the long-pressed item is another item file, the second display icon includes an icon of the item (the icon is a copy of the long-pressed item icon) and a red stop number indicating that the item cannot be opened by a preset application.
After the second display icon is dragged to the second view display area by the user, the drag animation application draws an animation for indicating that the item returns to the first view display area.
Fig. 12 is a flowchart illustrating a method for opening an item by dragging the item according to an exemplary embodiment, which specifically includes the following steps:
the method comprises the steps that firstly, after an operating system is started, when an animation application Animation App is dragged to be started, a user-defined project opening mode and a user HashMap (the user HashMap is a user-defined opening mode mapping table, such as an electronic whiteboard application, and supports the insertion of pictures, a piece of mapping data can be generated, the package name of the electronic whiteboard is a Hashkey, and the mode of inserting pictures is data) are stored, for example, user-defined information is stored in an xml file, and the Animation App is analyzed and then stored by taking the package name as an index (the package name can be used as an identifier of an application in the operating system, and the package name of each application in the operating system is different);
In the second step, fig. 13 exemplarily shows the animation of the item-dragging process. In fig. 13, the A application and the B application are in the split-screen mode. If an item S in the A application, for example the picture s.jpg, is long-pressed, the long-pressed application notifies the AnimationApp by broadcast that item S has been long-pressed by the user, and transmits the storage path of item S (for example, the storage path of the long-pressed picture s.jpg) to the AnimationApp. After receiving the notification, the AnimationApp sends a MotionEvent ACTION_CANCEL through the InputFilter (a long press is a continuous process; unless the user lifts the finger, this step forcibly ends the long press at the system layer, i.e., sends a touch-cancel event to the interface of the application currently being long-pressed), thereby canceling the A application's long-press event so that the drag process does not affect the A application while the item is being dragged. The InputFilter is a key-processing channel: all touch events are filtered through this channel, which decides whether the events continue to be distributed;
the AnimationApp acquires RunningTaskInfo of left and right split screens respectively through newly added interfaces getTopTaskLeft () and getTopTaskRight () of an activity management ActivintManager (the RunningTaskInfo is a data structure originally provided by an operating system and can be used for judging which applications are currently running), so as to acquire packet names of a packet name packet A of an A application and a packet name packet B of a B application of the left and right split screens (the packet names and the applications are in one-to-one correspondence, and different applications are distinguished by determining the packet names), and according to the position information of the item S of the long press, determines that an opening item initiator is the packet A, and an expected item opener is the packet A (the application of a window where the item is located by the long press is an opening item initiator, and the other application is an expected item opener), and determines a split screen position openside where the item S is to be opened, namely the window where the B application is located;
thirdly, acquiring the extension name of the item S as jpg through the saving path information of the item S transmitted to the animationApp in the second step, thereby acquiring the MimeType (the MimeType is a file format type) as image/jpeg; inquiring whether the userHashMap supports opening the selected item S or not according to the acquired packet name packageB of the application B (the userHashMap is an opening item mapping table which takes the packet name of the application as an index, takes the opening mode and the supported file type as data, and the AnamationApp already acquires the packet name packageB of the application B and can know whether the application B supports opening the item S or not by comparing the opening mode and the supported file type supported by the packageB with the file type of the item S), no information is found in the userHashMap that the B application can open item S (i.e., the B application does not support items of that file type), then a standard open style intent supported by the system is generated, then determines whether the intent can be opened by the B application through packagemagemagagerqueryintentactivities (intent), if the item S can be opened, the opening of the split screen B application is designated through an interface Intent.setPackageB (in the embodiment, the item S is set to be opened only by the B application through the interface Intent.setPackageB); and if the information that the B application can open the item S is found in the userHashMap, opening the item by the B application in the split screen state by using a predefined item opening mode.
In the fourth step, different display icons are drawn according to the result of the third step: if item S can be opened by the B application, the AnimationApp draws a thumbnail of item S and, on top of the thumbnail, an icon (for example, a green plus sign) indicating that item S can be opened by the B application; if item S cannot be opened by the B application, the AnimationApp draws a thumbnail of item S and, on top of it, an icon (for example, a red stop sign) indicating that item S cannot be opened by the B application. Since the A application's long-press event was canceled in the second step, what the user drags at this point is the display icon drawn by the AnimationApp, such as the icon ST in fig. 13, rather than the item originally long-pressed; the display icon changes its position following the movement of the user's finger.
Fifthly, when the user lifts the finger, the split-screen position where the display icon is located is judged. If the display icon is within the window of the B application and, according to the result of the third step, the B application supports opening the item S, the openSide information obtained in the second step (namely, the window of the B application) is carried into the intent generated in the third step through Intent.putExtra(), and the item S is opened by the B application at the designated split-screen position (namely, the window of the B application) through the interface Context.startActivity(intent). If it is known from the result of the third step that the B application does not support opening the item S, the AnimationApp draws an animation indicating that the item S returns to the window where the A application is located, so that the item S returns to its original position (i.e., the position where it was long-pressed).
The specific steps of judging the split-screen position where the display icon is located when the user lifts the finger are as follows. Fig. 14 exemplarily shows a schematic diagram of judging the split-screen position where a display icon is located according to an exemplary embodiment.
In Fig. 14, the upper left corner of the left screen is the origin, and the x-axis coordinate of the boundary between the left screen and the right screen is xsplit. The position of the long-pressed item S is determined, and whether the user drags the item S from the left screen to the right screen, or from the right screen to the left screen, is judged by comparing the x-axis coordinate of the boundary with the x-axis coordinate of the long-pressed item S: when the x-axis coordinate of the boundary is greater than that of the long-pressed item S, the user drags the item S from the left screen to the right screen; when the x-axis coordinate of the boundary is less than that of the long-pressed item S, the user drags the item S from the right screen to the left screen. The x-axis coordinate xmove of the display icon dragged by the user is also determined: when xmove < xsplit, the user has dragged the display icon to the left screen, and when xmove > xsplit, the user has dragged it to the right screen.
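The coordinate comparison above reduces to a simple hit test. The following minimal sketch uses the document's names xsplit and xmove; the class name, enum, and the example width of 1920 px are assumptions for illustration:

```java
// Minimal sketch of the split-screen hit test of Fig. 14. The origin is the
// top-left corner of the left screen; xsplit is the x coordinate of the
// boundary between the two screens.
public class SplitScreenHitTest {

    enum Screen { LEFT, RIGHT }

    // Which screen the long-pressed item S started on.
    static Screen sourceScreen(int xItem, int xsplit) {
        return xItem < xsplit ? Screen.LEFT : Screen.RIGHT;
    }

    // Which screen the dragged display icon is on when the user lifts the finger.
    static Screen dropScreen(int xmove, int xsplit) {
        return xmove < xsplit ? Screen.LEFT : Screen.RIGHT;
    }

    public static void main(String[] args) {
        int xsplit = 960;                               // e.g. a 1920-px-wide dual view
        System.out.println(sourceScreen(1200, xsplit)); // RIGHT
        System.out.println(dropScreen(300, xsplit));    // LEFT
    }
}
```

An item long-pressed at x = 1200 therefore originates on the right screen, and a display icon released at xmove = 300 lands on the left screen, i.e., a right-to-left drag.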
In the following, an existing item opening method and the item opening method implemented by item dragging in the split-screen mode are exemplarily shown and compared.
Fig. 15 exemplarily shows a schematic diagram of an existing item opening method. If a picture in the file manager application of the right screen is to be inserted into the whiteboard application of the left screen, the picture-insertion button (i) in the whiteboard application must be clicked to call out the file manager application, and after a picture is selected, the insertion button (ii) must be clicked to perform the insertion.
Figs. 16 to 18 exemplarily show schematic diagrams of the item opening method implemented by item dragging according to an exemplary embodiment. If a picture in the file manager application of the right screen is to be opened in the whiteboard application of the left screen, the picture is directly long-pressed and selected in the right screen, a floating thumbnail of the selected picture is displayed, and the floating thumbnail can be dragged by the movement of the user's finger; after the user releases the finger, if the release position is within the range of the whiteboard application, the currently long-pressed picture is directly inserted into the whiteboard application.
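The release logic of Figs. 16 to 18 can be summarized as a small decision function. This is only a sketch of the control flow; the class and method names are invented, and the returned strings stand in for the real Android actions (Context.startActivity versus the return animation drawn by the AnimationApp):

```java
// Hedged sketch of the drag-to-open flow: long-press selects the item, a
// floating thumbnail follows the finger, and on release the item is either
// opened by the target application or animated back to its origin.
public class DragToOpenFlow {

    static String onRelease(boolean targetCanOpen, boolean droppedOnTarget) {
        if (droppedOnTarget && targetCanOpen) {
            return "open item with target application";   // via Context.startActivity(intent)
        }
        return "animate item back to original position";  // drawn by the AnimationApp
    }

    public static void main(String[] args) {
        System.out.println(onRelease(true, true));   // dropped on a capable target
        System.out.println(onRelease(false, true));  // target cannot open the item
        System.out.println(onRelease(true, false));  // released outside the target window
    }
}
```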
Accordingly, on the device side, an item opening device according to an exemplary embodiment is exemplarily shown in fig. 19, including:
a first unit 11, configured to receive an instruction that an item located in a first view display area is selected by a user and used for moving, where the instruction includes a storage path and a currently selected position of the item;
a second unit 12, configured to obtain attributes of the starting application and the target application and the boundary positions of the first view display area and the second view display area, where the user interface includes the first view display area and the second view display area, the first view display area is a user interface of the starting application, and the second view display area is a user interface of the target application;
a third unit 13, configured to draw the first display icon of the item if it is determined that the item can be opened by the target application located in the second view display area;
a fourth unit 14, configured to detect that the first display icon is dragged to the second view display area by the user, and open the item using the target application.
Fig. 20 also illustrates an item opening apparatus according to an exemplary embodiment, including:
the processor 600, for reading the program in the memory 610, executes the following processes:
receiving an instruction for moving an item located in a first view display area, wherein the instruction comprises a storage path of the item and a currently selected position;
acquiring the attributes of the starting application and the target application and the boundary positions of the first view display area and the second view display area, wherein the user interface comprises the first view display area and the second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application;
if it is determined that the item can be opened by the target application located in the second view display area, drawing a first display icon of the item;
and detecting that the first display icon is dragged to a second view display area by a user, and opening the item by adopting the target application.
An instruction for moving an item located in the first view display area, issued when the item is selected by a user, is received, the instruction including the storage path and the currently selected position of the item; the attributes of the starting application and the target application and the boundary positions of the first view display area and the second view display area are acquired, where the user interface includes the first view display area and the second view display area, the first view display area being the user interface of the starting application and the second view display area being the user interface of the target application; if it is determined that the item can be opened by the target application located in the second view display area, a first display icon of the item is drawn; and upon detecting that the first display icon is dragged by the user to the second view display area, the item is opened by the target application, so that in the split-screen mode an item is opened at a designated position by dragging between applications.
In some exemplary embodiments, the instruction includes the storage path of the item and the position where the item was located when it was selected by the user for movement;
and after receiving the message that the item is selected, the dragging animation application cancels the event that the item is selected by the user in the starting application.
In some exemplary embodiments, the drag animation application determines that the item can be opened by a target application located in the second view display area, and the method further comprises:
the method comprises the steps that the dragging animation application obtains a package name of an application located in a first view display area and a package name of a target application located in a second view display area;
the dragging animation application determines that the item needs to be opened by a target application positioned in a second view display area according to the selected position of the item;
the determining, by the drag animation application, that the item can be opened by a target application located in the second view display area specifically includes: and the dragging animation application determines that the item can be opened by the target application by matching the file type of the item with the package name of the target application positioned in the second view display area.
In some example embodiments, if the drag animation application determines that the item cannot be opened by a target application located in a second view display area, drawing a second displayed icon of the item, the second displayed icon including an icon of the item and an icon indicating that the item cannot be opened by the target application;
after the second display icon is dragged to the second view display area by the user, the drag animation application draws an animation for indicating that the item returns to the first view display area.
In Fig. 20, the bus architecture may include any number of interconnected buses and bridges linking together various circuits, in particular one or more processors represented by the processor 600 and the memory represented by the memory 610. The bus architecture may also link together various other circuits, such as peripherals, voltage regulators and power management circuits, which are well known in the art and therefore are not described further herein. The bus interface provides the interface between the bus and the other components.
The exemplary embodiments of the present application provide a display terminal, which may specifically be a desktop computer, a portable computer, a smart phone, a tablet computer, a Personal Digital Assistant (PDA), or the like. The display terminal may include a Central Processing Unit (CPU), a memory and input/output devices; the input device may include a keyboard, a mouse, a touch screen, etc., and the output device may include a display device such as a Liquid Crystal Display (LCD) or a Cathode Ray Tube (CRT).
For different display terminals, in some exemplary embodiments, the user interface 620 may be an interface capable of interfacing externally to a desired device, including but not limited to a keypad, a display, a speaker, a microphone, a joystick, etc.
The processor 600 is responsible for managing the bus architecture and general processing, and the memory 610 may store data used by the processor 600 in performing operations.
In some exemplary embodiments, the processor 600 may be a CPU (central processing unit), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a CPLD (Complex Programmable Logic Device).
Memory 610 may include Read Only Memory (ROM) and Random Access Memory (RAM), and provides the processor with program instructions and data stored in the memory. In the embodiments of the present application, the memory may be used for storing a program of any one of the methods provided by the embodiments of the present application.
The processor is used for executing any one of the methods provided by the embodiment of the application according to the obtained program instructions by calling the program instructions stored in the memory.
The present application provides, in an exemplary embodiment, a computer storage medium for storing computer program instructions for an apparatus provided in the above-mentioned embodiment of the present application, which includes a program for executing any one of the methods provided in the above-mentioned embodiment of the present application.
The computer storage media may be any available media or data storage device that can be accessed by a computer, including, but not limited to, magnetic memory (e.g., floppy disks, hard disks, magnetic tape, magneto-optical disks (MOs), etc.), optical memory (e.g., CDs, DVDs, BDs, HVDs, etc.), and semiconductor memory (e.g., ROMs, EPROMs, EEPROMs, non-volatile memory (NAND FLASH), Solid State Disks (SSDs)), etc.
In summary, the exemplary embodiments of the present application provide a method and an apparatus for opening an item, and a display device, so that in a split-screen mode, an item is opened at a designated position by dragging between applications.
As will be appreciated by one of skill in the art, some of the exemplary embodiments of this application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A display device, comprising:
a display configured to display a user interface, wherein the user interface includes a first view display area and a second view display area, the first view display area is a user interface of an initial application, the second view display area is a user interface of a target application, wherein each view display area includes one or more different items arranged therein, and the user interface further includes a selector indicating that the items are selected, and the position of the selector in the user interface can be moved by a user input so that the items move according to the selector;
a controller in communication with the display, the controller configured to perform presenting the user interface;
the method comprises the steps that a controller receives an instruction that an item located in a first view display area is selected by a user and used for moving, wherein the instruction comprises a storage path and a currently selected position of the item;
the controller acquires attributes of the starting application and the target application and boundary positions of the first view display area and the second view display area;
if the controller determines that the item can be opened by a target application located in the second view display area, drawing a first display icon of the item;
and detecting that the first display icon is dragged to a second view display area by a user, and opening the item by adopting the target application.
2. The display device according to claim 1, wherein after receiving an instruction that an item in the first view display area is selected by a user, the controller obtains a file type of the item according to the storage path, and cancels an event that the item is selected by the user in the starting application.
3. The display device of claim 2, wherein the controller determines that the item can be opened by a target application located in the second view display area, the controller further configured to:
acquiring a package name of an application positioned in a first view display area and a package name of a target application positioned in a second view display area;
determining that the item needs to be opened by a target application positioned in a second view display area according to the selected position of the item;
the determining, by the controller, that the item can be opened by a target application located in the second view display area specifically includes: the controller determines that the item can be opened by the target application by matching a file type of the item with a package name of the target application located in the second view display area.
4. The display device of claim 3, wherein the controller is further configured to:
if the controller determines that the item cannot be opened by a target application located in a second view display area, drawing a second display icon of the item, wherein the second display icon comprises an icon of the item and an icon indicating that the item cannot be opened by the target application;
and drawing an animation for indicating that the item returns to the first view display area after the second display icon is dragged to the second view display area by the user.
5. The display device according to claim 3, wherein, when detecting that the first display icon is dragged by the user to the second view display area and opening the item with the target application, the controller is specifically configured to:
when it is detected that any position coordinate of the first display icon is larger than the boundary position coordinate and changes to being smaller than the boundary position coordinate, open the item by using the target application;
or, when it is detected that any position coordinate of the first display icon is smaller than the boundary position coordinate and changes to being larger than the boundary position coordinate, open the item by using the target application;
or, when detecting that the position coordinate of the object for dragging the first display icon is larger than the boundary position coordinate and changes to the position coordinate of the object which is smaller than the boundary position coordinate, adopting the target application to open the item;
or, when it is detected that the position coordinate of the object for dragging the first display icon is smaller than the boundary position coordinate and changes to the position coordinate of the object which is larger than the boundary position coordinate, the item is opened by using the target application.
6. A method for opening an item, the method comprising:
receiving an instruction for moving an item located in a first view display area, wherein the instruction comprises a storage path of the item and a currently selected position;
acquiring attributes of a starting application and a target application and boundary positions of a first view display area and a second view display area, wherein a user interface comprises the first view display area and the second view display area, the first view display area is a user interface of the starting application, and the second view display area is a user interface of the target application;
if it is determined that the item can be opened by the target application located in the second view display area, drawing a first display icon of the item;
and detecting that the first display icon is dragged to a second view display area by a user, and opening the item by adopting the target application.
7. The method of claim 6, wherein the instruction comprises a stored path of the item and a location where the item was located when the instruction was received for selection and movement by a user;
and after receiving the message that the item is selected, the dragging animation application cancels the event that the item is selected by the user in the starting application.
8. The method of claim 7, wherein the drag animation application determines that the item can be opened by a target application located in the second view display area, and further comprising:
the method comprises the steps that the dragging animation application obtains a package name of an application located in a first view display area and a package name of a target application located in a second view display area;
the dragging animation application determines that the item needs to be opened by a target application positioned in a second view display area according to the selected position of the item;
the determining, by the drag animation application, that the item can be opened by a target application located in the second view display area specifically includes: and the dragging animation application determines that the item can be opened by the target application by matching the file type of the item with the package name of the target application positioned in the second view display area.
9. The method of claim 8, further comprising:
if the drag animation application determines that the item cannot be opened by a target application located in a second view display area, drawing a second display icon of the item, the second display icon including an icon of the item and an icon indicating that the item cannot be opened by the target application;
after the second display icon is dragged to the second view display area by the user, the drag animation application draws an animation for indicating that the item returns to the first view display area.
10. An item opening device, comprising:
a first unit, configured to receive an instruction that an item located in a first view display area is selected by a user and used for moving, where the instruction includes a storage path and a currently selected position of the item;
the second unit is used for acquiring the attributes of the starting application and the target application and the boundary positions of the first view display area and the second view display area, wherein the user interface comprises the first view display area and the second view display area, the first view display area is the user interface of the starting application, and the second view display area is the user interface of the target application;
a third unit, configured to draw a first display icon of the item if it is determined that the item can be opened by a target application located in the second view display area;
and the fourth unit is used for detecting that the first display icon is dragged to the second view display area by the user and adopting the target application to open the item.
CN201910540234.8A 2019-06-21 2019-06-21 Project opening method and device and display equipment Active CN112199124B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910540234.8A CN112199124B (en) 2019-06-21 2019-06-21 Project opening method and device and display equipment
PCT/CN2020/079703 WO2020253282A1 (en) 2019-06-21 2020-03-17 Item starting method and apparatus, and display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910540234.8A CN112199124B (en) 2019-06-21 2019-06-21 Project opening method and device and display equipment

Publications (2)

Publication Number Publication Date
CN112199124A true CN112199124A (en) 2021-01-08
CN112199124B CN112199124B (en) 2022-07-01

Family

ID=74004645

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910540234.8A Active CN112199124B (en) 2019-06-21 2019-06-21 Project opening method and device and display equipment

Country Status (2)

Country Link
CN (1) CN112199124B (en)
WO (1) WO2020253282A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974446B (en) * 2023-09-18 2024-06-14 荣耀终端有限公司 Animation effect display method and device

Citations (9)

Publication number Priority date Publication date Assignee Title
CN102156605A (en) * 2010-02-12 2011-08-17 宏碁股份有限公司 Object moving method, object moving system and electronic device
US20120081318A1 (en) * 2010-10-01 2012-04-05 Imerj LLC Displaying the desktop upon device open
US20130080944A1 (en) * 2011-09-27 2013-03-28 Paul E. Reeves Unified desktop triad control user interface for a browser
CN104133629A (en) * 2014-07-10 2014-11-05 深圳市中兴移动通信有限公司 Double-screen interaction method and mobile terminal
CN106055246A (en) * 2016-05-25 2016-10-26 努比亚技术有限公司 Mobile terminal and operation method thereof
CN106250081A (en) * 2016-07-29 2016-12-21 努比亚技术有限公司 A kind of display packing based on double screen terminal and device
CN107977152A (en) * 2017-11-30 2018-05-01 努比亚技术有限公司 A kind of picture sharing method, terminal and storage medium based on dual-screen mobile terminal
CN109491632A (en) * 2018-10-30 2019-03-19 维沃移动通信有限公司 A kind of resource sharing method and terminal
CN109618206A (en) * 2019-01-24 2019-04-12 青岛海信电器股份有限公司 The method and display equipment at presentation user interface

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN103279269B (en) * 2013-05-31 2016-03-02 华为技术有限公司 Data interactive method between a kind of application program and device, terminal device
US9612732B2 (en) * 2014-11-13 2017-04-04 Microsoft Technology Licensing, Llc Content transfer to non-running targets
CN106502557A (en) * 2016-09-14 2017-03-15 深圳众思科技有限公司 A kind of split screen transmits the method and device of file
CN107066172B (en) * 2017-02-16 2020-07-10 北京小米移动软件有限公司 File transmission method and device of mobile terminal
CN109782976B (en) * 2019-01-15 2020-12-22 Oppo广东移动通信有限公司 File processing method, device, terminal and storage medium

Patent Citations (10)

Publication number Priority date Publication date Assignee Title
CN102156605A (en) * 2010-02-12 2011-08-17 宏碁股份有限公司 Object moving method, object moving system and electronic device
US20120081318A1 (en) * 2010-10-01 2012-04-05 Imerj LLC Displaying the desktop upon device open
CN103348311A (en) * 2010-10-01 2013-10-09 Flex Electronics ID Co.,Ltd. Long drag gesture in user interface
US20130080944A1 (en) * 2011-09-27 2013-03-28 Paul E. Reeves Unified desktop triad control user interface for a browser
CN104133629A (en) * 2014-07-10 2014-11-05 深圳市中兴移动通信有限公司 Double-screen interaction method and mobile terminal
CN106055246A (en) * 2016-05-25 2016-10-26 努比亚技术有限公司 Mobile terminal and operation method thereof
CN106250081A (en) * 2016-07-29 2016-12-21 努比亚技术有限公司 A kind of display packing based on double screen terminal and device
CN107977152A (en) * 2017-11-30 2018-05-01 努比亚技术有限公司 A kind of picture sharing method, terminal and storage medium based on dual-screen mobile terminal
CN109491632A (en) * 2018-10-30 2019-03-19 维沃移动通信有限公司 A kind of resource sharing method and terminal
CN109618206A (en) * 2019-01-24 2019-04-12 青岛海信电器股份有限公司 The method and display equipment at presentation user interface

Non-Patent Citations (2)

Title
Zhang Ru: "Design and Implementation of Dual-Screen Interaction and Remote Control Functions between Android Phones and Set-Top Boxes", China Masters' Theses Full-text Database, Engineering Science and Technology II *
Zhang Kai et al.: "Research on the Construction and Application of Dual-Screen Interactive Smart Classrooms", Digital Education *

Also Published As

Publication number Publication date
CN112199124B (en) 2022-07-01
WO2020253282A1 (en) 2020-12-24

Similar Documents

Publication Publication Date Title
US10963139B2 (en) Operating method for multiple windows and electronic device supporting the same
CN109164964B (en) Content sharing method and device, terminal and storage medium
CN109062479B (en) Split screen application switching method and device, storage medium and electronic equipment
EP3680764B1 (en) Icon moving method and device
CN101828162B (en) Unlocking a touch screen device
US11934848B2 (en) Control display method and electronic device
WO2020063091A1 (en) Picture processing method and terminal device
WO2021135068A1 (en) Selection control method for sound output device, and display device
CN108829314B (en) Screenshot selecting interface selection method, device, equipment and storage medium
CN111078076A (en) Application program switching method and electronic equipment
CN111225108A (en) Communication terminal and card display method of negative screen interface
CN106020698A (en) Mobile terminal and realization method of single-hand mode
KR20160003400A (en) user terminal apparatus and control method thereof
CN110865765A (en) Terminal and map control method
CN111124219A (en) Communication terminal and card display method of negative screen interface
CN112199124B (en) Project opening method and device and display equipment
CN107728898B (en) Information processing method and mobile terminal
CN113010056A (en) Desktop display control method, device, terminal and storage medium
CN103686271B (en) Display device and its control method
US20150234546A1 (en) Method for Quickly Displaying a Skype Contacts List and Computer Program Thereof and Portable Electronic Device for Using the Same
CN112449227B (en) Interaction method and device for touch screen application compatible with remote controller operation and smart television
CN114546219A (en) Picture list processing method and related device
CN109032728B (en) UI (user interface) display method, intelligent terminal and computer-readable storage medium
CN102956098B (en) Long-distance light remote control method, remote control terminal, display device and system
CN113573115B (en) Method for determining search characters and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant