CN113330415A - Interactive method, display device and computer-readable storage medium - Google Patents
Interactive method, display device and computer-readable storage medium
- Publication number
- CN113330415A (application number CN201980080826.4A)
- Authority
- CN
- China
- Prior art keywords
- screen
- sub
- target icon
- display device
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses an interaction method, a display device, and a computer-readable storage medium. The interaction method comprises the following steps: controlling a first sub-screen to display a target icon (100); detecting a touch operation (102) of an external object on the target icon; and when the touch operation is a dragging operation and the target icon is dragged to the intermediate screen, controlling to respond to the target icon (104). According to the method and the device, when a target icon located on the first sub-screen is dragged to the intermediate screen, the target icon is responded to on the second sub-screen side, so that a user can conveniently learn from the second sub-screen side the content shared by the other party, achieving the purpose of multi-user interaction.
Description
The present application relates to the field of intelligent devices, and in particular, to an interaction method, a display device, and a computer-readable storage medium.
With the development of science and technology, the functions and performance of electronic devices are continuously improving, bringing convenience to people's lives. On an electronic device with a large display screen, multiple users can operate in different areas of the display screen at the same time. While using the electronic device, a user may need to interact with other users, for example to share pictures, web pages, or videos, or to exchange permissions (e.g., the right to play sound, since playing sounds simultaneously may produce a jumble of audio). However, existing interaction among multiple users may require a third-party application and involves complex operations, which brings certain inconvenience to users.
Disclosure of Invention
The application provides an interaction method, a display device and a computer readable storage medium which are convenient to operate.
A first aspect of the embodiments of the present application provides an interaction method applied to a display device, where the display device includes a first sub-screen, a second sub-screen, and an intermediate screen connecting the first sub-screen and the second sub-screen. The method includes:
controlling the first sub-screen to display a target icon;
detecting touch operation of an external object on the target icon;
and when the touch operation is a dragging operation and the target icon is dragged to the intermediate screen, controlling to respond to the target icon.
A second aspect of embodiments of the present application provides a display device, including:
a display screen, which comprises a first sub-screen, a second sub-screen and an intermediate screen connecting the first sub-screen and the second sub-screen; and
a processor, which is connected with the display screen, controls the first sub-screen to display a target icon, and detects touch operation of an external object on the target icon; when the touch operation is a dragging operation and the target icon is dragged to the intermediate screen, the processor controls to respond to the target icon.
A third aspect of embodiments of the present application provides a computer-readable storage medium for storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform some or all of the steps as described in any one of the methods of the first aspect of embodiments of the present application.
Compared with the prior art, the embodiments of the present application provide an interaction method, a display device, and a computer-readable storage medium. By responding to the target icon on the second sub-screen side when the target icon located on the first sub-screen is dragged to the intermediate screen, a user can conveniently learn from the second sub-screen side the content shared by the other party, achieving the purpose of multi-user interaction; alternatively, when the target icon located on the first sub-screen is dragged to the intermediate screen, the operation instruction corresponding to the target icon is executed so as to interact with the second sub-screen, which brings certain convenience to interaction among multiple users.
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a flowchart illustrating steps of an interaction method according to an embodiment of the present application.
Fig. 2 is a schematic diagram of a three-dimensional structure of a display device according to an embodiment of the present application.
Fig. 3 is a schematic position diagram of each sub-screen of the display screen in an embodiment of the present application.
Fig. 4 is a schematic diagram of an icon display area of a shared sub-screen according to an embodiment of the present application.
Fig. 5 is a block diagram of a hardware structure of a display device in an embodiment of the present application.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. The described embodiments are only some embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
It should be noted that for simplicity of description, the following method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts, as some steps may occur in other orders or concurrently depending on the application.
Referring to fig. 1, a flowchart illustrating steps of an interaction method according to an embodiment of the present application is shown. The interaction method comprises the following steps:
and step 100, controlling a first sub-screen of the display device to display a target icon.
Fig. 2 is a schematic view showing a three-dimensional structure of a display device according to an embodiment of the present application. The display device 50 includes a body 520 and a display 504. The body 520 includes a first support 522, a second support 524, and a bendable mechanism 526 connecting the first support 522 and the second support 524. When the first supporting body 522 and the second supporting body 524 of the display device 50 bend around the bendable mechanism 526, the first supporting body 522 and the second supporting body 524 may approach each other, so that an included angle a between the first supporting body 522 and the second supporting body 524 is continuously changed, wherein the included angle a between the first supporting body 522 and the second supporting body 524 is related to the bending degree of the bendable mechanism 526.
In this embodiment, the display device 50 is a foldable device whose form has a plurality of modes, including but not limited to an unfolded mode, a folded mode and a tent mode. For example, when the second support body 524 moves away from the first support body 522, the first support body 522 and the second support body 524 may come to lie in the same plane after being flipped or rotated around the bendable mechanism 526 (the included angle a between the first support body 522 and the second support body 524 is approximately 180 degrees); at this time, the display device 50 is in the unfolded mode. When the first support body 522 and the second support body 524 are turned around the bendable mechanism 526 so that the opposite surfaces become parallel to each other (the included angle a between the first support body 522 and the second support body 524 is substantially 0 degrees), the display device 50 is in the folded mode. In one embodiment, when the included angle a between the first support body 522 and the second support body 524 is between 90 degrees and 180 degrees, it may be determined that the display device 50 is in the unfolded mode; when the included angle a between the first support body 522 and the second support body 524 is less than 30 degrees, it may be determined that the display device 50 is in the folded mode. When multiple users use the display device 50, if the display device 50 is in the folded mode (e.g., the included angle a is smaller than 30 degrees), the display device 50 may not stand on a horizontal surface (e.g., a desktop), which is inconvenient for the users. Therefore, to improve convenience, the display device 50 can be placed in the tent mode (e.g., the included angle a is between 30 degrees and 90 degrees), which lies between the unfolded mode and the folded mode, by controlling the bending degree of the bendable mechanism 526. In the present embodiment, when the display device 50 is in the tent mode, operations within the first sub-screen 530 and the second sub-screen 540 do not affect each other.
A sensor 506 (shown in fig. 5) is disposed within the display device 50, and the display device 50 determines its configuration based on the monitoring data transmitted by the sensor 506. For example, the sensor 506 is used to detect monitoring data of the included angle a between the first support body 522 and the second support body 524; the display device 50 then determines, according to the monitoring data detected by the sensor 506, whether the included angle a between the first support body 522 and the second support body 524 is within a preset angle range, so as to determine whether the configuration of the display device 50 is the unfolded mode, the tent mode, or the folded mode. For example, when the included angle a between the first support body 522 and the second support body 524 is in the range of 30 degrees to 90 degrees, it may be determined that the display device 50 is in the tent mode.
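As a rough illustration of this angle-based check, the following minimal sketch maps a hinge-angle reading to one of the three modes. The names (`FormMode`, `determine_mode`) and the exact boundary handling are assumptions made for illustration; only the 30-degree and 90-degree thresholds come from the embodiment above.

```python
from enum import Enum

class FormMode(Enum):
    FOLDED = "folded"       # included angle a smaller than 30 degrees
    TENT = "tent"           # included angle a between 30 and 90 degrees
    UNFOLDED = "unfolded"   # included angle a between 90 and 180 degrees

def determine_mode(angle_degrees: float) -> FormMode:
    """Map the included angle a reported by the hinge sensor to a form mode."""
    if angle_degrees < 30:
        return FormMode.FOLDED
    if angle_degrees <= 90:
        return FormMode.TENT
    return FormMode.UNFOLDED

# A reading of 60 degrees falls in the preset 30-90 degree range: tent mode.
assert determine_mode(60) is FormMode.TENT
```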
Please refer to fig. 3, which is a schematic diagram illustrating the positions of the sub-screens of the display screen according to an embodiment of the present application. The display 504 is disposed on a side surface of the body 520; for example, the display 504 is disposed on the surfaces of the first support body 522 and the second support body 524 that face away from each other when the two support bodies are brought close together. The display 504 may be a flexible display and include a preset number of sub-screens. In this embodiment, the display 504 includes a first sub-screen 530, a second sub-screen 540, and an intermediate screen 550 connecting the first sub-screen 530 and the second sub-screen 540, wherein the first sub-screen 530 and the second sub-screen 540 are located on the two sides of the intermediate screen 550, respectively. In this embodiment, the intermediate screen 550 includes a first sharing sub-screen 552 connected to the first sub-screen 530 and a second sharing sub-screen 554 connected to the second sub-screen 540. The first sharing sub-screen 552 and the second sharing sub-screen 554 are located on opposite sides of the folding line of the bendable mechanism 526, respectively; the distance between the folding line and the border line where the first sharing sub-screen 552 meets the first sub-screen 530 may be L, and the distance between the folding line and the border line where the second sharing sub-screen 554 meets the second sub-screen 540 may also be L. The first sub-screen 530 covers the first support body 522, the second sub-screen 540 covers the second support body 524, and the intermediate screen 550 covers the whole bendable mechanism 526 or a part of the bendable mechanism 526.
Several interfaces can be displayed in the first sub-screen 530 and the second sub-screen 540, including but not limited to a desktop interface, an application level interface, and the like. The user can perform operations including, but not limited to, a click operation, a drag operation, etc., on the first sub-screen 530 and the second sub-screen 540.
In this embodiment, one or more icons 560 may be included in the first sub-screen 530 and the second sub-screen 540, where the content of an icon 560 may be a picture, a video, audio, a web page, text, an executable program, an event reminder, or the like. The target icon is an icon located in the first sub-screen 530 that can be shared with the second sub-screen 540; alternatively, the target icon may also be an icon in the second sub-screen 540 that can be shared with the first sub-screen 530. For example, if the icons located in the first sub-screen 530 include a first icon, a second icon, and a third icon, and the first icon is to be shared to the second sub-screen 540, the target icon may be the first icon.
Step 102: detecting the touch operation of an external object on the target icon.
In this embodiment, the display device 50 may determine the manner of responding to the target icon according to the touch operation of the user (i.e., the external object). The touch operation includes, but is not limited to, a drag operation, a click operation, a long press operation, and the like.
Step 104: when the touch operation is a dragging operation and the target icon is dragged to the intermediate screen, controlling to respond to the target icon.
In this embodiment, each icon has a corresponding operation instruction, and the operation instruction of each icon may correspond to the type of the content of the icon. In one embodiment, when the type of the content of the icon 560 is a picture, a video, an audio, a web page, a text, an event reminder, etc., the display device 50 may determine that the operation instruction corresponding to the icon 560 is of a first type; when the type of the content of the icon 560 is an executable program, the display device 50 may determine that the operation instruction corresponding to the icon 560 is of a second type. In another embodiment, the type of the operation instruction corresponding to the icon 560 may also be set by other methods, for example, by manually setting the operation type corresponding to the icon whose content is a video to be the second type.
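A minimal sketch of this classification follows, assuming hypothetical names (`instruction_type`, `FIRST_TYPE`, `SECOND_TYPE`): content such as pictures, videos, audio, web pages, text and event reminders defaults to the first type, executable programs default to the second type, and a manual override (like marking video icons as second type) takes precedence. This is an illustration of the idea, not the application's actual logic.

```python
FIRST_TYPE = 1   # respond by sharing the icon to the other sub-screen
SECOND_TYPE = 2  # respond by executing the icon's operation instruction directly

# Content types whose icons are shared by default (first type).
SHAREABLE_CONTENT = {"picture", "video", "audio", "web page", "text", "event reminder"}

def instruction_type(content_type, manual_override=None):
    """Return the operation-instruction type for an icon, given its content type."""
    if manual_override and content_type in manual_override:
        return manual_override[content_type]
    return FIRST_TYPE if content_type in SHAREABLE_CONTENT else SECOND_TYPE

print(instruction_type("picture"))                          # 1 (first type)
print(instruction_type("executable program"))               # 2 (second type)
print(instruction_type("video", {"video": SECOND_TYPE}))    # 2 (manual override)
```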
When controlling to respond to the target icon, the display device 50 may respond according to the type of the operation instruction corresponding to the target icon. When the operation instruction corresponding to the target icon is of the first type, it indicates that the user wants to share the target icon located on the first sub-screen 530 with the second sub-screen 540, so that another user can operate on the second sub-screen 540 and trigger execution of the operation instruction corresponding to the target icon; when the operation instruction corresponding to the target icon is of the second type, it indicates that the user wants the operation instruction corresponding to the target icon to be executed directly, so as to interact with another user using the second sub-screen 540.
In an embodiment, when the operation instruction corresponding to the target icon is of the first type:
When the target icon is shared from the first sub-screen 530 to the second sub-screen 540, or from the second sub-screen 540 to the first sub-screen 530, the external object may drag the target icon into the intermediate screen 550, and the display device 50 may respond to the target icon. For example, when the first icon is the target icon and the first icon is dragged into the first sharing sub-screen 552 connected to the first sub-screen 530, this indicates that the first icon is shared with the second sub-screen 540; at this time, the display device 50 controls the second sub-screen 540 to display the first icon, or controls the first icon to be displayed in the second sharing sub-screen 554. In an embodiment, the display device 50 may also display the first icon dragged to the first sharing sub-screen 552 within the first sharing sub-screen 552, so that the user using the first sub-screen 530 can intuitively see the shared object or content from the first sharing sub-screen 552.
In an embodiment, when the first user drags the first icon to the first sharing sub-screen 552, the display device 50 detects that the position where the first icon is displayed has changed from the first sub-screen 530 to the first sharing sub-screen 552. At this time, the display device 50 may control the first icon to be displayed in the second sharing sub-screen 554, so that the second user can intuitively see the shared object or content from the second sharing sub-screen 554. The second user can also select a corresponding icon from the second sharing sub-screen 554 to execute the operation instruction corresponding to the selected icon.
In one embodiment, to reduce the visual discomfort that may be caused when the second sub-screen 540 suddenly displays the first icon, the display device 50 may control the first icon to transition smoothly into the second sub-screen 540. For example, when the first user drags the first icon to the first sharing sub-screen 552, the display device 50 detects that the position at which the first icon is displayed has changed from the first sub-screen 530 to the first sharing sub-screen 552. At this time, the display device 50 may control the first icon to move from the first sharing sub-screen 552, through the second sharing sub-screen 554, into the second sub-screen 540 along a preset path within a preset time, so that the second user using the second sub-screen 540 sees the first icon appear on the second sub-screen 540 in a smooth transition, which helps reduce visual discomfort and improves the user experience.
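One way to realize such a smooth transition is to interpolate the icon's position along the preset path over the preset time, frame by frame. The sketch below assumes a straight-line path, linear interpolation, and a 60 fps frame rate; the function name and coordinates are illustrative and not taken from the application.

```python
def transition_path(start, end, duration_s, fps=60):
    """Yield per-frame (x, y) positions that move an icon from `start` to `end`
    over `duration_s` seconds, so it appears to glide from the first sharing
    sub-screen, across the second sharing sub-screen, into the second sub-screen."""
    frames = max(1, int(duration_s * fps))
    (x0, y0), (x1, y1) = start, end
    for i in range(frames + 1):
        t = i / frames                      # interpolation factor in [0, 1]
        yield (x0 + (x1 - x0) * t, y0 + (y1 - y0) * t)

# Example: move the first icon across the fold in 0.3 s at 60 fps.
positions = list(transition_path((120, 40), (480, 40), duration_s=0.3))
print(len(positions), positions[0], positions[-1])  # 19 frames, start point, end point
```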
Conversely, when the target icon is shared from the second sub-screen 540 to the first sub-screen 530, the external object may drag the target icon located on the second sub-screen 540 to the second sharing sub-screen 554, and the display device 50 may control the first sub-screen 530 to display the target icon, or control the target icon to transition smoothly into the first sharing sub-screen 552. In other embodiments, the touch operation may be a long-press operation; for example, when the time for which the second icon is long-pressed exceeds a preset time, the display device 50 may also control to respond to the second icon so as to share the second icon to the second sub-screen 540 side.
Please refer to fig. 4, which is a schematic diagram illustrating a target icon displayed on a sharing sub-screen according to an embodiment of the present application. Each sharing sub-screen includes one or more icon display areas 570, and each icon display area 570 can be used to display one target icon. Thus, when the target icon is to be displayed on the second sharing sub-screen 554, the display device 50 controls the icon display area 570 at a preset position of the second sharing sub-screen 554 to display the target icon. The display device 50 may control the target icons to be displayed in the second sharing sub-screen 554 from left to right according to the order in which they were shared; conversely, when target icons are shared from the second sub-screen 540 to the first sub-screen 530, the display device 50 may also control them to be displayed in the first sharing sub-screen 552 from left to right according to the order in which they were shared.
In this embodiment, each sharing sub-screen includes a preset threshold number of icon display areas 570. When a target icon is to be displayed on the second sharing sub-screen 554, the display device 50 obtains the current number of icon display areas 570 in the second sharing sub-screen 554 in which icons are displayed and determines whether the current number has reached the preset threshold; when it has, the display device 50 deletes the earliest-shared icon from the icon display areas of the second sharing sub-screen 554 and updates the positions of the remaining icons in the second sharing sub-screen 554 according to the order in which they were shared. In this embodiment, the preset threshold may be the number of icon display areas of the sharing sub-screen; in other embodiments, the preset threshold may also be smaller than the number of icon display areas of the sharing sub-screen.
For example, suppose the preset threshold of icon display areas 570 in the second sharing sub-screen 554 is 5 and icons (the first to fifth icons) are already displayed in the first to fifth positions of the icon display areas 570. When another target icon (e.g., a sixth icon) shared to the second sub-screen 540 side is received, all positions of the icon display areas 570 in the second sharing sub-screen 554 are already occupied, so the display device 50 deletes the icon in the icon display area 570 at the first position of the second sharing sub-screen 554. The display device 50 may then update the positions of the icons in the icon display areas 570 according to the order in which they were shared. As a result, the second to fifth icons are displayed at the first to fourth positions of the icon display areas 570, respectively, and the sixth icon is displayed in the icon display area 570 at the fifth position.
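The behavior in this worked example can be sketched as a small fixed-capacity queue: when all icon display areas are occupied, the earliest-shared icon is dropped and the rest shift left. The class and method names below are hypothetical; only the threshold of 5 and the shifting behavior come from the example above.

```python
from collections import deque

class SharingSubScreen:
    """Icon display areas of a sharing sub-screen, ordered by sharing time
    (earliest-shared icon on the left, most recent on the right)."""

    def __init__(self, threshold=5):
        self.threshold = threshold       # number of icon display areas
        self._icons = deque()

    def share(self, icon):
        if len(self._icons) >= self.threshold:
            self._icons.popleft()        # delete the earliest-shared icon
        self._icons.append(icon)         # newest icon takes the rightmost area

    def layout(self):
        """Icons in display order, left to right."""
        return list(self._icons)

screen = SharingSubScreen(threshold=5)
for name in ["first", "second", "third", "fourth", "fifth", "sixth"]:
    screen.share(name)
print(screen.layout())  # ['second', 'third', 'fourth', 'fifth', 'sixth']
```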
When the target icon is displayed on the second sharing sub-screen 554, if the display device 50 detects a touch operation of the external object on the target icon again, the display device 50 executes the operation instruction corresponding to the target icon.
For example, when the content of the target icon is audio or video, if the target icon in the second sharing sub-screen 554 is clicked, the display device 50 may, when executing the operation instruction corresponding to the target icon, control the audio or video to be played on the second sub-screen 540, or control the first sub-screen 530 and the second sub-screen 540 to play it synchronously. When the content of the target icon is a picture, a file, or the like, if the target icon is clicked, the display device 50 may, when executing the corresponding operation instruction, control the second sub-screen 540 to display the content corresponding to the target icon, or control the first sub-screen 530 and the second sub-screen 540 to display it at the same time. When the content of the target icon is an event reminder, if the target icon is clicked, the display device 50 may, when executing the corresponding operation instruction, control the event reminder to be displayed on the second sub-screen 540, or control the first sub-screen 530 and the second sub-screen 540 to display it at the same time. When the content of the target icon is an executable program, if the target icon is clicked, the display device 50 executes the operation instruction corresponding to the target icon, and the executable program is run.
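A compact sketch of this click-time dispatch, keyed on the icon's content type, is shown below. The `Device` stand-in and its method names are assumptions made purely for illustration; in the application these actions are performed by display device 50.

```python
class Device:
    """Hypothetical stand-in for display device 50; each method just logs the action."""
    def play(self, icon, screens):   print(f"play {icon['name']} on {screens}")
    def show(self, icon, screens):   print(f"show {icon['name']} on {screens}")
    def remind(self, icon, screens): print(f"show reminder {icon['name']} on {screens}")
    def execute(self, icon):         print(f"run executable {icon['name']}")

def on_shared_icon_clicked(icon, device, both_screens=False):
    """Respond to a click on an icon displayed in the second sharing sub-screen."""
    screens = ("first", "second") if both_screens else ("second",)
    kind = icon["content_type"]
    if kind in ("audio", "video"):
        device.play(icon, screens)       # play on the second sub-screen, or on both synchronously
    elif kind in ("picture", "file", "text", "web page"):
        device.show(icon, screens)       # display the content on one or both sub-screens
    elif kind == "event reminder":
        device.remind(icon, screens)
    elif kind == "executable program":
        device.execute(icon)

on_shared_icon_clicked({"name": "holiday.mp4", "content_type": "video"}, Device())
```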
In one embodiment, the display device 50 may include one or more playback devices 510 for playing sound. For example, the display device 50 may include two playback devices 510: a first playback device disposed on the first support body 522 and a second playback device disposed on the second support body 524. If both playback devices 510 play at the same time, the sound may become jumbled; therefore, the display device 50 may display, in the first sub-screen 530, a fourth icon for muting the first playback device and a fifth icon for muting the second playback device. When the user drags the fourth icon to the first sharing sub-screen 552, the display device 50 controls the second sharing sub-screen 554 to display the fourth icon. At this time, if the second sharing sub-screen 554 also contains a third icon corresponding to audio content, then when the user clicks the third icon, the display device 50 may control the first playback device and the second playback device to play simultaneously; if the user at the second sub-screen 540 wants to mute the first playback device on the first support body 522, that user may click the fourth icon in the second sharing sub-screen 554, whereupon the display device 50 mutes the first playback device and keeps the second playback device playing, thereby reducing the jumbled sound that occurs when multiple playback devices play at once.
In this embodiment, when an icon in a sharing sub-screen is clicked, the display device 50 may delete the clicked icon from the sharing sub-screen and update the displayed positions of the remaining icons based on the order in which the icons in the icon display areas were shared. When the user wants to delete an icon within the second sharing sub-screen 554, the user may drag the icon located within the second sharing sub-screen 554 to a preset position to indicate that the dragged icon is to be deleted.
In an embodiment, when the operation instruction corresponding to the target icon is of the second type:
when the target icon is dragged from the first sub-screen 530 to the first shared sub-screen 552, the display device 50 directly executes the operation instruction corresponding to the target icon; alternatively, when the target icon is dragged from the second sub-screen 540 to the second shared sub-screen 554, the display device 50 may also directly execute the operation instruction corresponding to the target icon.
For example, if the operation instruction corresponding to the first icon is to close the display of the second sub-screen 540 (i.e., the first icon corresponds to an executable program), then when the user drags the first icon from the first sub-screen 530 to the first sharing sub-screen 552, the display device 50 directly executes the operation instruction corresponding to the first icon, that is, it closes the display of the second sub-screen 540. In this way, when the first user operates the target icon in the first sub-screen 530, the display device 50 may respond to the target icon to control the second sub-screen 540, so as to interact with the second sub-screen 540. Similarly, if the operation instruction corresponding to the fourth icon is to mute the first playback device, then when the user drags the fourth icon from the first sub-screen 530 to the first sharing sub-screen 552, the display device 50 directly executes the operation instruction corresponding to the fourth icon, that is, it mutes the first playback device. In this way, when multiple playback devices are playing, the user can mute the corresponding playback device. In other embodiments, the operation instruction corresponding to the target icon may also be at least one of controlling the second sub-screen to play audio or video, controlling the second sub-screen to display shared content, and controlling the second sub-screen to display an event reminder.
In one embodiment, if the display device 50 is in the unfolded mode, each user can see the content on the other users' sub-screens, which affects privacy. Therefore, to further improve the security of the display device during use, the display device 50 controls to respond to the target icon only when the display device 50 is in the tent mode, the touch operation is a dragging operation, and the target icon is dragged to the intermediate screen (e.g., the first sharing sub-screen 552).
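Putting the pieces together, a drag handler along the lines described above might look like the following sketch. It is only an illustration under the assumptions already noted: first-type icons are shared to the opposite side, second-type icons have their operation instruction executed directly, and nothing happens unless the device is in tent mode. All names are hypothetical.

```python
def on_icon_dragged_to_sharing_area(icon, source_side, device, in_tent_mode):
    """Respond to a target icon dragged from a sub-screen onto the intermediate screen.

    `source_side` is "first" or "second"; `icon["instruction_type"]` is "first"
    (share to the other side) or "second" (execute directly)."""
    if not in_tent_mode:
        return "ignored"                      # only respond while in tent mode
    if icon["instruction_type"] == "first":
        other_side = "second" if source_side == "first" else "first"
        device.display_on_sharing_sub_screen(icon, side=other_side)
        return f"shared to the {other_side} side"
    device.execute(icon)                      # e.g. close the second sub-screen, mute a playback device
    return "executed"

class DragDevice:
    """Hypothetical stand-in for display device 50."""
    def display_on_sharing_sub_screen(self, icon, side):
        print(f"display {icon['name']} in the {side} sharing sub-screen")
    def execute(self, icon):
        print(f"execute instruction of {icon['name']}")

print(on_icon_dragged_to_sharing_area(
    {"name": "photo.png", "instruction_type": "first"}, "first", DragDevice(), in_tent_mode=True))
```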
According to the interaction method, when a target icon located on the first sub-screen is dragged to the intermediate screen, the target icon is responded to on the second sub-screen side, so that a user can conveniently learn from the second sub-screen side the content shared by the other party, achieving the purpose of multi-user interaction; alternatively, when the target icon located on the first sub-screen is dragged to the intermediate screen, the operation instruction corresponding to the target icon is executed so as to interact with the second sub-screen, which brings certain convenience to interaction among multiple users.
Referring to fig. 5, a block diagram of a hardware structure of a display device according to an embodiment of the present application is shown. As shown in fig. 5, the display device 50 may implement the above embodiments. The display device 50 may include a processor 500, a storage device 502, a display 504, a sensor 506, a playback device 510, and a computer program (instructions) or a display control program (instructions) stored in the storage device 502 and executable on the processor 500. The display device 50 may further include other hardware components, such as keys and a communication device, which are not described here again. The processor 500 may exchange data with the storage device 502, the display 504, the sensor 506, and the playback device 510 via a bus.
The processor 500 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The processor is the control center of the display device 50 and uses various interfaces and lines to connect the various parts of the whole display device 50.
The storage device 502 may be used for storing the computer programs and/or modules, and the processor 500 implements the various functions of the interaction method by running or executing the computer programs and/or modules stored in the storage device 502 and calling the data stored in the storage device 502. The storage device 502 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function, and the like. In addition, the storage device 502 may include a high-speed random access memory and may also include a non-volatile storage device, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash memory card (Flash Card), at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The display 504 may display a User Interface (UI) or a Graphical User Interface (GUI) including data of photos, videos, chat contents, and the like, and the display 504 may also serve as an input device and an output device, and the display may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) touch display, a flexible display, a three-dimensional (3D) touch display, an ink screen display, and the like.
The sensor 506 is disposed on the body 520 and is used for detecting the included angle between the first support body 522 and the second support body 524. There may be one or more playback devices 510, and the one or more playback devices 510 may be disposed on the first support body 522 and/or the second support body 524.
The processor 500 runs a program corresponding to the executable program code by reading the executable program code stored in the storage device 502, so as to execute the steps of the interaction method performed by the display device in any of the previous embodiments.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
Claims (20)
- An interaction method is applied to a display device, and the display device comprises a first sub-screen, a second sub-screen and an intermediate screen connecting the first sub-screen and the second sub-screen; characterized in that the method comprises: controlling the first sub-screen to display a target icon; detecting touch operation of an external object on the target icon; and when the touch operation is a dragging operation and the target icon is dragged to the intermediate screen, controlling to respond to the target icon.
- The interaction method of claim 1, wherein said controlling to respond to said target icon comprises: controlling the intermediate screen to display the target icon.
- The interactive method of claim 2, wherein the intermediate screen includes a first shared sub-screen connected to the first sub-screen and a second shared sub-screen connected to the second sub-screen; the controlling the intermediate screen to display the target icon includes: controlling the second shared sub-screen to display the target icon.
- The interaction method according to any one of claims 1 to 3, wherein the target icon has a corresponding operation instruction; the method further comprises the following steps: after detecting the touch operation of an external object on the target icon, when the touch operation is a click operation, executing an operation instruction corresponding to the target icon.
- The interaction method according to claim 4, wherein the operation instruction corresponding to the target icon comprises at least one of controlling audio and video playing, controlling synchronous playing of the first sub-screen and the second sub-screen, displaying content sharing, and displaying an item reminder.
- The interaction method of claim 1, wherein said controlling to respond to said target icon comprises: controlling the second sub-screen to display the target icon.
- The interaction method of claim 1, wherein the target icon has a corresponding operation instruction; the controlling to respond to the target icon comprises: executing the operation instruction corresponding to the target icon.
- The interaction method of claim 7, wherein the operation instruction corresponding to the target icon comprises at least one of controlling the second sub-screen to close display, controlling the second sub-screen to play audio and video, controlling the second sub-screen to display content sharing, and controlling the second sub-screen to display an item reminder.
- The interaction method according to claim 1, wherein the display device is a bendable display device, the bendable display device comprising a first support, a second support, and a bendable mechanism connecting the first support and the second support; the first sub-screen covers the first support body; the second sub-screen covers the second support body; the intermediate screen covers all or part of the bendable mechanism.
- The interactive method of claim 9, wherein prior to said controlling to respond to said target icon, further comprising: judging whether the display device is in a tent mode or not; controlling to respond to the target icon when the display device is in the tent mode.
- The interaction method of claim 10, wherein said determining whether said display device is in a tent mode comprises: determining whether an included angle between the first support body and the second support body is within a preset angle range according to monitoring data detected by a sensor; when the included angle between the first support body and the second support body is located in the preset angle range, the display device is determined to be in the tent mode, and the operation in the first sub-screen and the operation in the second sub-screen are not influenced mutually in the tent mode.
- A display device, characterized in that the display device comprises: a display screen, which comprises a first sub-screen, a second sub-screen and an intermediate screen connecting the first sub-screen and the second sub-screen; and a processor, which is connected with the display screen, controls the first sub-screen to display a target icon, and detects touch operation of an external object on the target icon; when the touch operation is a dragging operation and the target icon is dragged to the intermediate screen, the processor controls to respond to the target icon.
- The display device according to claim 12, wherein the processor controls the intermediate screen to display the target icon when controlling to respond to the target icon.
- The display device of claim 13, wherein the intermediate screen includes a first shared sub-screen connected to the first sub-screen and a second shared sub-screen connected to the second sub-screen, and the processor controls the second shared sub-screen to display the target icon while controlling the intermediate screen to display the target icon.
- The display device according to any one of claims 12 to 14, wherein the target icon has a corresponding operation instruction, and the processor executes the operation instruction corresponding to the target icon if the touch operation is a click operation after detecting the touch operation of an external object on the target icon.
- The display device according to claim 15, wherein the operation instruction corresponding to the target icon includes at least one of controlling audio and video playing, controlling synchronous playing of the first sub-screen and the second sub-screen, displaying content sharing, and displaying an event reminder.
- The display device of claim 12, wherein the target icon has a corresponding operation instruction; and when the control responds to the target icon, the processor executes an operation instruction corresponding to the target icon.
- The display device according to claim 17, wherein the operation instruction corresponding to the target icon includes at least one of controlling the second sub-screen to close display, controlling the second sub-screen to play audio and video, controlling the second sub-screen to display content sharing, and controlling the second sub-screen to display an event reminder.
- The display device according to claim 12, wherein the display device is a bendable display device including a first support, a second support, and a bendable mechanism connecting the first support and the second support; the first sub-screen covers the first support body; the second sub-screen covers the second support body; the intermediate screen covers all or part of the bendable mechanism.
- A computer-readable storage medium storing computer instructions which, when executed by a processor, implement the interaction method of any one of claims 1 to 11.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/087887 WO2020232653A1 (en) | 2019-05-22 | 2019-05-22 | Interaction method, display apparatus, and computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113330415A true CN113330415A (en) | 2021-08-31 |
Family
ID=73459288
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201980080826.4A Pending CN113330415A (en) | 2019-05-22 | 2019-05-22 | Interactive method, display device and computer-readable storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113330415A (en) |
WO (1) | WO2020232653A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114594885A (en) * | 2022-02-28 | 2022-06-07 | 北京梧桐车联科技有限责任公司 | Application icon management method, device and equipment and computer readable storage medium |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4818427B2 (en) * | 2009-12-22 | 2011-11-16 | 株式会社東芝 | Information processing apparatus and screen selection method |
CN108255378B (en) * | 2018-02-09 | 2020-05-26 | 维沃移动通信有限公司 | Display control method and mobile terminal |
2019
- 2019-05-22 WO PCT/CN2019/087887 patent/WO2020232653A1/en active Application Filing
- 2019-05-22 CN CN201980080826.4A patent/CN113330415A/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107015777A (en) * | 2015-10-22 | 2017-08-04 | 三星电子株式会社 | Electronic equipment and its control method with curved displays |
CN109582477A (en) * | 2018-11-30 | 2019-04-05 | 北京小米移动软件有限公司 | Document transmission method, terminal and storage medium |
CN109710135A (en) * | 2018-12-29 | 2019-05-03 | 努比亚技术有限公司 | Split screen display available control method, terminal and computer readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2020232653A1 (en) | 2020-11-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11675391B2 (en) | User terminal device for displaying contents and methods thereof | |
US11360634B1 (en) | Shared-content session user interfaces | |
KR102571369B1 (en) | Display control method, storage medium and electronic device for controlling the display | |
KR102027612B1 (en) | Thumbnail-image selection of applications | |
US9933935B2 (en) | Device, method, and graphical user interface for editing videos | |
WO2022068721A1 (en) | Screen capture method and apparatus, and electronic device | |
CN113330415A (en) | Interactive method, display device and computer-readable storage medium | |
WO2020087504A1 (en) | Screenshot interaction method, electronic device, and computer-readable storage medium | |
CN113330407A (en) | Interaction method, flexible electronic device and readable storage medium | |
CN113396379A (en) | Interaction method, flexible electronic device and readable storage medium | |
WO2020206654A1 (en) | Interaction method, flexible electronic device and readable storage medium | |
CN113589992A (en) | Game interface interaction method, game interface interaction device, medium and terminal equipment |
Legal Events
Code | Title | Description |
---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20210831 |