WO2023179129A1 - Display device, screen projection device, and device control method based on trajectory extraction - Google Patents

Publication number
WO2023179129A1
Authority: WIPO (PCT)
Prior art keywords: touch, trajectory, screen projection, user interface, screen
Application number: PCT/CN2022/141150
Other languages: English (en), French (fr)
Inventors: 马晓燕, 宋子全, 刘美玉, 李金昆, 李乃金, 庞秀娟, 肖成创
Original Assignee: 海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Application filed by 海信视像科技股份有限公司

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/147: Digital output to display device using display panels

Definitions

  • The present application relates to the technical field of display devices, and in particular to a display device, a screen projection device, and a device control method based on trajectory extraction.
  • In the related art, a screen projection communication connection between the display device and the screen projection device is usually established first; it is then confirmed whether the screen projection communication protocol supports the display device reversely controlling the screen projection device.
  • The screen projection area on the display device side can monitor screen touch events that occur within its area, and send and inject the acquired operation trajectories into the screen projection device's system to achieve reverse control of the screen projection device.
  • However, when a touch occurs at the boundary of the screen projection area, the projection user interface cannot monitor the touch event, so the operation trajectory cannot be obtained and the screen projection device cannot be controlled.
  • As a result, the screen projection device sometimes does not respond when the user operates near the boundary of the screen projection area.
  • A first aspect of an embodiment of the present application provides a display device, including: a display, used to display a third user interface including a first screen projection area, the first screen projection area being used to synchronously display the first user interface of the first screen projection device; and a first controller, configured to: when the first user interface is displayed on the third user interface, determine the relative positional relationship between the first screen projection area and the third user interface; monitor touch events on the third user interface to obtain the first trajectory generated when the user operates the third user interface; determine, based on the relative positional relationship, the second trajectory formed by the first trajectory within the first screen projection area, and send the second trajectory to the first screen projection device; wherein the second trajectory is used by the first screen projection device to control the first user interface, so that the user controls the first user interface of the first screen projection device by operating the third user interface.
  • A second aspect of an embodiment of the present application provides a screen projection device, including: a display, used to display a first user interface, where the first user interface is synchronously displayed in the first screen projection area of the third user interface of the display device during the screen projection process; and a second controller, configured to: receive the second trajectory sent by the display device, the second trajectory being the user's operation trajectory in the first screen projection area of the third user interface; control the first user interface to update its display based on the second trajectory; and project the updated first user interface to the first screen projection area of the third user interface.
  • A third aspect of an embodiment of the present application provides a device control method based on trajectory extraction.
  • The method includes: when the first user interface of the first screen projection device is displayed on the third user interface, determining the relative positional relationship between the first screen projection area and the third user interface, the first screen projection area being used to synchronously display the first user interface of the first screen projection device; monitoring touch events on the third user interface to obtain the first trajectory generated when the user operates the third user interface; determining, according to the relative positional relationship, the second trajectory formed by the first trajectory within the first screen projection area, and sending the second trajectory to the first screen projection device; wherein the second trajectory is used by the first screen projection device to control its first user interface, so that the user controls the first user interface of the first screen projection device by operating the third user interface.
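  • The mapping steps above can be sketched in Python. This is an illustrative sketch only, not the patent's implementation: the names (`Rect`, `second_trajectory`) and the use of a simple axis-aligned rectangle for the screen projection area are assumptions.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class Rect:
    """Relative position of the projection area inside the third user interface."""
    x: float
    y: float
    width: float
    height: float

    def contains(self, p: Point) -> bool:
        px, py = p
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def map_to_projection_area(p: Point, area: Rect) -> Point:
    """Translate a third-user-interface point into the projection area's
    local coordinates, so the projection device can replay it."""
    return (p[0] - area.x, p[1] - area.y)

def second_trajectory(first: List[Point], area: Rect) -> List[Point]:
    """Keep only the touch points inside the projection area (the
    'second trajectory'), expressed in the area's own coordinates."""
    return [map_to_projection_area(p, area) for p in first if area.contains(p)]
```

  • For example, with an area at (100, 100) sized 200 by 200, a first trajectory entering from the left keeps only its in-area points, shifted into the area's local frame before being sent to the projection device.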
  • A fourth aspect of an embodiment of the present application provides a device control method based on trajectory extraction.
  • The method includes: receiving a second trajectory sent by a display device, where the second trajectory is the user's operation trajectory in the first screen projection area of the third user interface of the display device; controlling the first user interface to update its display based on the second trajectory; and projecting the updated first user interface to the first screen projection area of the third user interface, where the first user interface is synchronously displayed in the first screen projection area of the third user interface of the display device during the screen projection process.
  • Figure 1 is a schematic diagram of the operation scene between the display device and the control device according to the embodiment of the present application;
  • FIG. 2 is a hardware configuration block diagram of the display device 200 according to the embodiment of the present application.
  • FIG. 3 is a hardware configuration block diagram of the control device 100 according to the embodiment of the present application.
  • Figure 4 is a schematic diagram of the software configuration in the display device 200 according to the embodiment of the present application.
  • Figure 5A is a schematic diagram of the user interface of the display device and screen projection device according to the embodiment of the present application.
  • Figure 5B is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 5C is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 5D is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 6A is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • Figure 6B is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 6C is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 6D is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 6E is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • Figure 6F is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • Figure 6G is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • Figure 7A is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • Figure 7B is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • Figure 7C is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 7D is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 7E is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • Figure 8A is a schematic structural diagram of another display device reversely controlling a screen projection device according to an embodiment of the present application.
  • Figure 8B is a schematic diagram of the layer layout of another display device according to an embodiment of the present application.
  • Figure 8C is a schematic diagram of the layer layout of another display device according to an embodiment of the present application.
  • Figure 9A is a schematic diagram of a touch point according to an embodiment of the present application.
  • Figure 9B is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 9C is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 9D is a schematic diagram of the user interface of another display device and screen projection device according to the embodiment of the present application.
  • Figure 9E is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • Figure 10A is a schematic structural diagram of another display device reversely controlling a screen projection device according to an embodiment of the present application.
  • FIG. 10B is a schematic diagram of another display device and screen projection device according to an embodiment of the present application.
  • FIG. 1 is a schematic diagram of an operation scenario between a display device and a control device according to an embodiment. As shown in FIG. 1 , the user can operate the display device 200 through the smart device 300 or the control device 100 .
  • control device 100 may be a remote controller.
  • the communication between the remote controller and the display device includes infrared protocol communication or Bluetooth protocol communication, and other short-distance communication methods to control the display device 200 through wireless or wired methods.
  • the user can control the display device 200 by inputting user instructions through buttons on the remote control, voice input, control panel input, etc.
  • display device 200 also communicates data with server 400.
  • the display device 200 may communicate via a local area network (LAN), a wireless local area network (WLAN), or other networks.
  • the server 400 can provide various content and interactions to the display device 200.
  • the server 400 may be a cluster or multiple clusters, and may include one or more types of servers.
  • FIG. 3 schematically shows a configuration block diagram of the control device 100 according to an exemplary embodiment.
  • the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply.
  • the control device 100 can receive input operation instructions from the user, and convert the operation instructions into instructions that the display device 200 can recognize and respond to, thereby mediating the interaction between the user and the display device 200 .
  • the display device 200 includes at least one of a tuner-demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
  • the display 260 can be a liquid crystal display, an OLED display, a projection display, or a projection device and a projection screen.
  • the communicator 220 is a component for communicating with external devices or servers according to various communication protocol types.
  • the communicator may include at least one of a Wifi module, a Bluetooth module, a wired Ethernet module, other network communication protocol chips or near field communication protocol chips, and an infrared receiver.
  • the display device 200 can establish transmission and reception of control signals and data signals with the external control device 100 or the server 400 through the communicator 220 .
  • the user interface can be used to receive control signals from the control device 100 (such as an infrared remote control, etc.).
  • the detector 230 is used to collect signals from the external environment or interactions with the outside.
  • the detector 230 includes a light receiver, a sensor used to collect ambient light intensity; or the detector 230 includes an image collector, such as a camera, which can be used to collect external environment scenes and user attributes or user interaction gestures; or the detector 230 includes a sound collector, such as a microphone, for receiving external sounds.
  • the external device interface 240 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI), an analog or digital high-definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), an RGB port, etc.; it may also be a composite input/output interface formed from several of the above interfaces.
  • the tuner-demodulator 210 receives broadcast television signals through wired or wireless reception methods, and demodulates audio and video signals, as well as EPG data signals, from among multiple wireless or wired broadcast television signals.
  • the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in the memory.
  • the controller 250 controls the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on display 260, controller 250 may perform operations related to the object selected by the user command.
  • the application framework layer in the embodiment of this application includes managers (Managers), content providers (Content Providers), etc., where the managers include at least one of the following modules: the Activity Manager, which interacts with all activities running in the system; the Location Manager, which provides system services or applications with access to system location services; the Package Manager, which retrieves various information related to the application packages currently installed on the device; the Notification Manager, which controls the display and clearing of notification messages; and the Window Manager, which manages icons, windows, toolbars, wallpapers, and desktop widgets.
  • the kernel layer is the layer between hardware and software. As shown in Figure 4, the kernel layer contains at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a Wi-Fi driver, a USB driver, an HDMI driver, sensor drivers (such as fingerprint sensors, temperature sensors, pressure sensors, etc.), and a power driver.
  • Embodiments of the present application can be applied to various types of display devices and screen projection devices, including but not limited to: smart TVs, LCD TVs, VR headsets, tablets, mobile phones, smart terminals and other devices.
  • FIG. 5A is a schematic diagram of a user interface of a display device and a screen projection device provided by another embodiment of the present application.
  • the first screen projection device can project its first user interface to a third user interface of the display device, and the first user interface is configured to be displayed in the first screen projection area.
  • the content of the first user interface displayed in the first screen projection area will be synchronized with the first user interface on the first screen projection device.
  • the user can set the screen projection display position of the first screen projection device in the third user interface.
  • When the first screen projection device is displayed vertically, the first user interface displayed in the first screen projection area will also be displayed vertically.
  • When the first screen projection device is displayed horizontally, the first user interface displayed in the first screen projection area will also be displayed horizontally, as shown in Figure 5C with the first screen projection device and the first screen projection area.
  • the third user interface of the display device may further include a second screen projection area, which is used to display the second user interface projected by the second screen projection device, as shown in Figure 5D. It can be understood that during the screen sharing process, multiple screen projection devices can project screens to the same display device at the same time or one after another, and the third user interface can simultaneously display user interfaces projected by multiple screen projection devices.
  • FIG. 5B is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • The user inputs the trajectory (ABC) in the first screen projection area, and the trajectory acts on the first user interface of the first screen projection device, achieving reverse control of the first screen projection device through the display device.
  • The first screen projection device and the display device access the communication network based on the screen projection protocol; based on this protocol, the first screen projection device and the display device carry the relevant protocol information during the signaling interaction process to perform the relevant screen projection operations, where the first screen projection device acts as the protocol sender and the display device acts as the protocol receiver.
  • After receiving the screen projection message sent by the screen projection device, the display device establishes a connection with it; after the connection succeeds, in some embodiments, the first screen projection device transmits the data stream through the protocol in a screen-recording manner, so that the first user interface is projected and displayed on the third user interface; the display device lays out the first screen projection area to be displayed in the third user interface to realize the projection display, as shown in Figure 8A.
  • the first screen projection area in the third user interface is used to display the first user interface, and the first screen projection area or the layer where the first screen projection area is located can support user touch operations;
  • the first screen projection device sends signaling to the display device to confirm whether the current display device supports the reverse control function; after confirming that the display device supports the reverse control function, the first screen projection device performs the corresponding initialization.
  • the display device can monitor the touch operations input by the user; when the user inputs a touch trajectory on the first user interface in the first screen projection area, the first controller of the display device will send the touch operation to the first screen projection device to achieve reverse control.
  • the touch operation will be used as input operation data on the first screen projection device.
  • In addition to the first screen projection device, a second screen projection device, a third screen projection device, and further devices may be included; the control architecture is shown in Figure 8A.
  • The first screen projection area, or the layer where it is located, can monitor the touch event, and the first controller can obtain the first trajectory formed by the user's operation in the first screen projection area and send it to the first screen projection device through the control processing module; the second controller of the first screen projection device injects the received first trajectory into its system to achieve reverse control of the first screen projection device.
  • the third user interface includes a first screen projection area and a second screen projection area.
  • the user can simultaneously control the first screen projection device and the second screen projection device.
  • the first user interface in the first screen projection area corresponds to the first screen projection device
  • the second user interface in the second screen projection area corresponds to the second screen projection device
  • While the user inputs the touch trajectory (ABC) in the first screen projection area, he also inputs the touch trajectory (FGH) in the second screen projection area.
  • The above two touch trajectories will be sent to the corresponding first screen projection device and second screen projection device respectively; that is, the touch trajectory (ABC) is used to control the first screen projection device, and the touch trajectory (FGH) is used to control the second screen projection device.
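  • The routing of concurrent trajectories to their corresponding devices can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the helper name `dispatch` and the plain-tuple rectangles are assumptions.

```python
def dispatch(points, areas):
    """Route each touch point to the projection area containing it.
    areas maps a device id to its area as an (x, y, width, height)
    tuple; points that fall in no area are ignored."""
    routed = {device: [] for device in areas}
    for px, py in points:
        for device, (x, y, w, h) in areas.items():
            if x <= px <= x + w and y <= py <= y + h:
                routed[device].append((px, py))
                break  # a point belongs to at most one area
    return routed
```

  • With two non-overlapping areas, the points of a trajectory drawn over the first area and those of a trajectory drawn over the second area end up in separate groups, which can then be sent to the first and second screen projection devices respectively.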
  • In some embodiments, the user first inputs the touch trajectory (ABC) in the first screen projection area, and then inputs the touch trajectory (FGH) in the second screen projection area; the above two touch trajectories will be sent to the first screen projection device and the second screen projection device respectively.
  • Although the first screen projection area and the second screen projection area are configured as rectangles in the drawings, this application does not limit their shapes; the above-mentioned screen projection areas can also be configured into other shapes, such as circles, ovals, polygons, triangles, etc.
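  • For a non-rectangular projection area, deciding whether a touch point lies inside the area requires a general hit test. A standard ray-casting test for polygonal areas, given here as an illustrative sketch (the function name is an assumption, not from the patent):

```python
def point_in_polygon(p, poly):
    """Ray-casting test: a point is inside a polygon if a horizontal
    ray from it crosses the polygon's edges an odd number of times."""
    x, y = p
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            # x-coordinate where the edge crosses that height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

  • A circular area reduces to an even simpler distance check; the polygon test covers the triangle and general polygon shapes mentioned above.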
  • The above embodiment introduces the scenario where the display device reversely controls the screen projection device when the touch trajectory occurs entirely within the screen projection area; the following introduces the reverse control of the screen projection device by the display device when the touch trajectory partially occurs outside the screen projection area.
  • FIG. 6A is a schematic diagram of the user interface of another display device and screen projection device according to an embodiment of the present application.
  • When the third user interface displays the first user interface projected by the first screen projection device, the first controller will determine the position of the first screen projection area in the third user interface, that is, the relative positional relationship between the first screen projection area and the third user interface.
  • The first controller determines the position of the screen projection area in the third user interface so as to compare the touch trajectory input by the user with the screen projection area and confirm the user's operation intention.
  • When the position of the first screen projection area changes, the first controller will promptly obtain the updated relative positional relationship between the first screen projection area and the third user interface.
  • When the user operates the third user interface through a remote control, a control device, or a touch method, the first controller will monitor the touch events that occur on the third user interface and obtain in a timely manner the first trajectory generated by the touch operation.
  • In some embodiments, the user inputs a first trajectory (ABCDE) on the third user interface.
  • The first trajectory starts from outside the first screen projection area, passes through the left boundary of the first screen projection area, enters the first screen projection area, and ends at point E within the first screen projection area; point A, point B, point C, point D, and point E are touch points obtained by the first controller according to the touch sampling frequency of the display device's screen.
  • Based on the first trajectory and the relative positional relationship between the screen projection area and the third user interface, the first controller analyzes the first trajectory and extracts the valid trajectory contained therein.
  • The valid trajectory is the portion of the trajectory within the first screen projection area; it can also be called the second trajectory and is used for sending to the screen projection device for reverse control.
  • In this scenario, the valid trajectory is the trajectory (CDE), while the portion of the first trajectory outside the first screen projection area, the trajectory (ABC), can be regarded as an invalid trajectory; therefore, during reverse control, the first controller should eliminate the trajectory (ABC), use the extracted valid trajectory (CDE) as the second trajectory, and send the second trajectory to the first screen projection device.
  • the first trajectory (ABCDE) input by the user is actually equivalent to the second trajectory (CDE) input by the user in the first screen projection device;
  • The dotted trajectory shown in the first user interface of the first screen projection device in Figure 6A represents the second trajectory; it will not be displayed in the first user interface of the screen projection device in a visible form, but will only be injected into the system of the first screen projection device in the form of data, to realize the operation of the first screen projection device.
  • Figure 6A shows the scene interface where the first trajectory (ABCDE) enters the first screen projection area from outside it; correspondingly, when the first trajectory (EDCBA) is drawn from inside the first screen projection area to outside it, the second trajectory (EDC) can also be extracted according to the above solution and sent to the first screen projection device, as shown in Figure 6B.
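  • Both directions, entering and leaving the area, reduce to the same operation: keep the maximal runs of consecutive in-area touch points and discard the rest. A sketch of this run-splitting using `itertools.groupby`; the function name and predicate interface are illustrative assumptions, not the patent's implementation:

```python
from itertools import groupby

def extract_valid_segments(points, inside):
    """Split a trajectory into runs of consecutive touch points and
    keep only the runs whose points satisfy the 'inside' predicate;
    each kept run is a candidate second trajectory."""
    return [list(run) for is_in, run in groupby(points, key=inside) if is_in]
```

  • The out-of-area prefix (trajectory entering the area) or suffix (trajectory leaving it) is dropped either way, and the in-area run remains as the second trajectory.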
  • In the process of reversely controlling the screen projection device, it is first necessary to determine and record the position of the first screen projection area in the third user interface and obtain the relative positional relationship between the first screen projection area and the third user interface. Since the first trajectory enters the first screen projection area from outside, it may happen that the first screen projection area, or the layer where it is located, cannot sense the touch event, so the boundary area of the first screen projection area needs to be captured; in the process of laying out the first screen projection area, a margin can be left at the boundary, that is, part of the boundary area is reserved to prepare for the user's touch trajectory entering from outside.
  • The first controller will create a first layer, and the first layer will be used to carry the first screen projection area to display the delivered first user interface.
  • the third user interface of the display device shown in FIG. 6C includes a first layer and a second layer.
  • The first layer is configured to be displayed above the second layer, that is, the first layer is provided at the top of the third user interface, to display the delivered first user interface and to prevent the first screen projection area from being blocked by other layer elements during the screen projection process.
  • The second layer can be configured to display the other elements of the third user interface, including the content being played on the third user interface before the screen is cast, and system user interface elements such as the home button, search button, message prompts, signal icons, etc.
  • the first layer and the second layer can be overlaid to display the entire content of the third user interface.
  • The content displayed on the second layer can also be implemented as multiple layers; that is, the third user interface is formed by multiple layers, but the first layer is always displayed at the top during the screen projection process.
• the first layer is configured to be sized to fit the first screen projection area, that is, the first layer and the first screen projection area occupy areas of the same size; by moving the first layer, the user can adjust the projection position. Compared with adjusting the display position of the projection area within a layer, this mechanism adjusts the projection area more efficiently.
• the relative positional relationship between the first projection area and the second layer is equivalent to the relative positional relationship between the first projection area and the third user interface.
  • the third user interface is composed of a first layer and a second layer, and the first layer is set on the upper layer of the second layer.
  • the size of the first layer is the same as the first screen projection area.
  • the size of the second layer is the same as the size of the third user interface; the user inputs a touch operation, and the first controller obtains the first trajectory (ABCDE) input by the user;
• since the first trajectory starts from point A on the second layer, the second layer will monitor this touch trajectory event, and the first layer cannot monitor it;
• in the layer monitoring mechanism, the layer on which the touch operation starts monitors the touch trajectory, and the other layers no longer monitor it; this monitoring mechanism means that, in some embodiments, touch operations that occur at the boundary of the projection area cause a response-insensitivity problem. In this embodiment, the second layer can monitor and obtain the first trajectory (ABCDE), but the first layer cannot obtain it.
  • the first controller may obtain the first touch point sequence included in the first trajectory (ABCDE): point A, point B, point C, point D, point E;
• the touch points of the first touch point sequence that fall within the first screen projection area are extracted to form a second touch point sequence: point C, point D, and point E; the second touch point sequence constitutes a second trajectory (CDE), which is sent to the first screen projection device to achieve reverse control, as shown in Figure 6C.
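The extraction step above, keeping only the touch points of the first touch point sequence that fall inside the first screen projection area, can be sketched in Python. The point format, rectangle representation, and function name are illustrative assumptions and not part of the patent itself:

```python
# Minimal sketch of extracting the second touch point sequence from the
# first trajectory. Coordinates follow the embodiment's example of a
# projection area with lower-left vertex (630, 0) and size 656 x 1080.

def extract_second_sequence(first_sequence, area):
    """Keep only the touch points that fall inside the projection area.

    first_sequence: list of (x, y) touch points in display coordinates.
    area: (x1, y1, x2, y2) lower-left and upper-right vertices of the
          projection area in the same coordinate system.
    """
    x1, y1, x2, y2 = area
    return [(x, y) for (x, y) in first_sequence
            if x1 <= x <= x2 and y1 <= y <= y2]

# Trajectory ABCDE entering the projection area from outside: A and B
# lie outside, C, D, E inside, so the second trajectory is (CDE).
first = [(100, 500), (300, 500), (700, 500), (800, 500), (900, 500)]
second = extract_second_sequence(first, (630, 0, 1286, 1080))
print(second)  # [(700, 500), (800, 500), (900, 500)]
```

In practice the comparison would run against the vertex coordinates obtained from the layout of the projection area, as described later in the embodiment.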
  • the third user interface is configured to include multiple layers, such as a system layer, an activity layer, and a screen projection layer;
  • the system layer can be used to display the underlying user interface of the system, such as the menu interface, function icon interface, etc.;
  • the activity layer can be used to display foreground applications, such as displaying the currently playing video application, or smart doorbell application, etc.;
• the screen projection layer is used to display the screen projection interface; it is usually configured on the top layer of the third user interface, and its size is adapted to the first screen projection area hosting the first user interface.
• the layout of the above layers in the third user interface can be configured as shown in Figure 8B.
  • the third user interface may be composed of a screen projection layer and a system layer, and the screen projection layer may also be implemented as the first layer.
  • the system layer may be implemented as a second layer.
  • the foreground application can be displayed on the active layer.
• when the touch trajectory is input within the range of the active layer and the screen projection layer, the active layer is equivalent to the second layer, and the screen projection layer is equivalent to the first layer.
  • the first layer size is set to be larger than the first screen projection area, and the second layer size is set to full screen display.
  • the first controller will obtain the first trajectory generated by the user's operation on the first layer.
• the input trajectory starts from point E in the first projection area of the first layer, passes through point D, crosses boundary point C of the first projection area, passes through point B on the first layer, and ends at point A on the second layer.
• since the first layer first monitors the touch at point E, the second layer cannot monitor this touch trajectory; according to the screen touch sampling frequency, the touch point information contained in the first trajectory (EDCB) can be obtained.
  • the point A and the trajectory (BA) shown by the dotted lines indicate that although the user inputs this part of the operation, the display device cannot actually detect the touch operation of the point A and the trajectory (BA).
• the first controller will eliminate touch point B in the first touch point sequence, which exceeds the first screen projection area; point E, point D, and point C will be used as a second touch point sequence to form a second trajectory (EDC), and the second trajectory will be sent to the first screen projection device, as shown in Figure 6D.
• when the user inputs the first trajectory, the first controller will synchronously obtain the first touch point sequence it contains; that is, when the first trajectory has been drawn from point E to point D, the first controller can synchronously acquire the first touch point sequence: point E, point D;
• when detecting that the first touch point sequence contains a first touch point beyond the first screen projection area, the first controller will generate a second touch point sequence.
  • the first controller synchronously acquires the first sequence of touch points: point E, point D, point C, and point B;
  • the first controller will obtain the first coordinates and the vertex coordinates.
  • the vertex coordinates are the coordinates of the two diagonal vertices of the first screen projection area on the third user interface
  • the first coordinates are the coordinates of the touch point included in the first trajectory on the third user interface.
  • the coordinates of its lower left vertex can be expressed as (X1, Y1), and the coordinates of its upper right vertex can be expressed as (X2, Y2);
• the first coordinates of the touch points included in the first trajectory (ABCDE) can be expressed as: point A (XA, YA), point B (XB, YB), point C (XC, YC), point D (XD, YD), point E (XE, YE);
• by comparing the first coordinates with the vertex coordinates, the touch points located in the first screen projection area can be determined: point C, point D, and point E; these touch points constitute the second trajectory (CDE), as shown in Figure 6E.
  • determining the second trajectory based on the relative position relationship may be implemented by the following algorithm.
• the first trajectory is expressed as: first trajectory (point 1, point 2, point 3, point 4, point 5 ... point N); when the first trajectory enters the first projection area from outside it, and point 5 reaches the left boundary of the first projection area in the third user interface, it can be determined that the touch trajectory has reached the first projection area;
• point 6 is used as the trigger point at which the touch trajectory begins to operate in the first screen projection area, and this trigger point serves as the starting point of the user's touch operation on the first user interface; point 6 through point N are converted to obtain a new trajectory (new point 1, new point 2, new point 3, new point 4, new point 5 ... new point N), and the new trajectory is sent to the first screen projection device for reverse control.
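The trigger-point step above can be sketched as a small Python routine: the first touch point that falls inside the projection area (point 6 in the embodiment) becomes the start of the new trajectory, and everything before it is dropped. The point format, rectangle bounds, and function name are illustrative assumptions:

```python
# Sketch of the trigger-point algorithm: scan the first trajectory in
# order and return the sub-trajectory starting at the first point that
# lies inside the projection area.

def new_trajectory_on_entry(points, area):
    x1, y1, x2, y2 = area
    for i, (x, y) in enumerate(points):
        if x1 <= x <= x2 and y1 <= y <= y2:
            return points[i:]  # trigger point through point N
    return []  # the trajectory never entered the projection area

# A trajectory entering an area with lower-left (630, 0), upper-right (1286, 1080):
track = [(500, 540), (600, 540), (640, 540), (700, 540)]
print(new_trajectory_on_entry(track, (630, 0, 1286, 1080)))
# [(640, 540), (700, 540)]
```

The returned points would then be renumbered and converted into area-relative coordinates before injection, as the following lines of the embodiment describe.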
• Figure 6F shows that the first screen projection area is configured as 656 pixels × 1080 pixels, and its upper and lower boundaries respectively coincide with the upper and lower boundaries of the third user interface;
• the event information of the touch points included in the first trajectory can be expressed as (x, y, KEY, extra);
  • the first trajectory including the touch point can be expressed as follows:
• the relative-position starting point of the first screen projection area, that is, its lower left vertex, has coordinates (630, 0); since the y-coordinate values of the above touch points are greater than the starting point's ordinate 0, the ordinates y of these touch points do not need to be processed;
• the abscissas of the above touch points need to be converted; specifically, 630 pixels can be subtracted from each original abscissa to obtain the abscissa value of the touch point relative to the starting point of the first projection area, thereby obtaining a new trajectory; the touch points included in the new trajectory are represented as follows:
• the above touch points, namely point 6, point 7, point 8, point 9, ... point N, will be regarded as the touch points contained in the new trajectory; after the abscissa value conversion is performed on the new trajectory and the action parameter of the first touch point is changed from MOVE to DOWN, the new trajectory is converted into a trajectory relative to the first screen projection area, that is, the second trajectory, expressed as:
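The coordinate conversion described above, subtracting the projection area's lower-left vertex from each abscissa and rewriting the first event's action from MOVE to DOWN, can be sketched as follows. The (x, y, action, extra) tuple format mirrors the embodiment's event representation, but the function name and data shapes are illustrative assumptions:

```python
# Sketch of converting display-coordinate touch events into events
# relative to the projection area, so they can be injected into the
# projecting device as a fresh gesture.

def to_area_relative(events, origin):
    ox, oy = origin  # lower-left vertex of the projection area, e.g. (630, 0)
    converted = []
    for i, (x, y, action, extra) in enumerate(events):
        # the first injected event must be DOWN so the projecting device
        # treats it as the start of a touch gesture
        if i == 0 and action == "MOVE":
            action = "DOWN"
        converted.append((x - ox, y - oy, action, extra))
    return converted

events = [(700, 500, "MOVE", 0), (800, 500, "MOVE", 0), (900, 500, "UP", 0)]
print(to_area_relative(events, (630, 0)))
# [(70, 500, 'DOWN', 0), (170, 500, 'MOVE', 0), (270, 500, 'UP', 0)]
```

Because the embodiment's area starts at ordinate 0, the y values pass through unchanged here, matching the statement that the ordinates need no processing.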
• the above second trajectory will be sent to the first screen projection device, and the second controller of the first screen projection device receives the second trajectory sent by the display device; based on the second trajectory, the second controller controls its first user interface to update the display, and projects the updated first user interface to the first projection area again, thereby realizing reverse control of the first screen projection device by the display device.
• each touch point and user interface element in the drawings of this application is marked schematically; that is, the layout positions of the touch points in the drawings, the relative distances between points, the screen projection areas, and the user interfaces are all schematic displays, and their actual layout should be determined based on the parameters of each display element.
• the first trajectory is drawn from inside the first screen projection area to outside it, and the relative-position starting point coordinate of the first screen projection area is (630, 0); based on the coordinate value conversion technical solution in the above embodiment, a new trajectory relative to the first projection area is finally obtained, whose touch points are represented as follows:
  • the first controller sends the second trajectory to the first screen projection device, and the second trajectory will be injected into the system of the first screen projection device as event information to reversely control the first screen projection device, as shown in Figure 6G.
  • FIG. 7A is a schematic diagram of a user interface of a display device and a screen projection device provided by another embodiment of the present application.
  • the projection area in the third user interface is continuously adjusted, and the user response problem at the boundaries of different projection areas needs to be solved.
  • a solution to this problem will be proposed below.
• the display device of this embodiment can accept screen projection from multiple screen projection devices, and its third user interface also includes a second screen projection area for synchronously displaying the second user interface of the second screen projection device; correspondingly, when the user operates the user interfaces in the first screen projection area and the second screen projection area respectively, the corresponding first screen projection device and second screen projection device can be reversely controlled respectively.
• when the first screen projection device projects to the third user interface, or the first and second screen projection devices project to the third user interface at the same time, the first controller will create a new layer in time; the new layer can be implemented, for example, as a transparent template layer, which is usually set on the top layer of the third user interface and is used to monitor touch events that occur in the third user interface, as shown in Figure 8C.
• since the new layer is set to be transparent and placed on the top layer, it can monitor touch events occurring in the third user interface at the first moment without blocking the display content of other layers.
  • the first controller can obtain the touch events that occur in the third user interface.
• the transparent template layer is set above the layer where the entire screen projection area is located, and can directly monitor and obtain all touch events, enabling monitoring of touch events both inside and outside the different projection areas.
• the first controller can obtain the first trajectory generated by the user's touch on the transparent template layer, such as the first trajectory (ABCDE) used to control the first user interface, or the first trajectory (FGHIJ) used to control the second user interface;
• based on the relative positional relationship between the first screen projection area, the second screen projection area, and the transparent template layer in the third user interface, the first controller determines the second trajectory formed by the first trajectory in each projection area, and sends the determined second trajectory to the corresponding screen projection device;
• the second trajectory (CDE) is determined based on the first trajectory (ABCDE), and the second trajectory (HIJ) is determined based on the first trajectory (FGHIJ); the second trajectories are then sent to the first screen projection device or the second screen projection device respectively; the second controller controls its user interface to update the display based on the second trajectory, and projects the updated user interface to the corresponding projection area of the third user interface.
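The multi-device routing described above, splitting one monitored trajectory into per-area second trajectories and sending each to the matching device, can be sketched as a small dispatch function. Device identifiers, the rectangle format, and the non-overlap assumption for projection areas are illustrative choices, not requirements stated in the patent:

```python
# Sketch of routing a single trajectory captured on the transparent
# template layer to the screen projection devices whose areas it crosses.

def route_trajectories(first_sequence, areas):
    """Split one trajectory into per-area second trajectories.

    first_sequence: list of (x, y) points from the template layer.
    areas: dict mapping a device id to its (x1, y1, x2, y2) rectangle.
    Returns a dict: device id -> points that fell inside its area.
    """
    routed = {dev: [] for dev in areas}
    for (x, y) in first_sequence:
        for dev, (x1, y1, x2, y2) in areas.items():
            if x1 <= x <= x2 and y1 <= y <= y2:
                routed[dev].append((x, y))
                break  # projection areas are assumed non-overlapping
    return {dev: pts for dev, pts in routed.items() if pts}

areas = {"first_device": (630, 0, 1286, 1080), "second_device": (0, 0, 600, 1080)}
print(route_trajectories([(650, 100), (700, 100), (100, 100)], areas))
# {'first_device': [(650, 100), (700, 100)], 'second_device': [(100, 100)]}
```

Points that fall in neither area (e.g. on the gap between projection areas) are simply dropped, mirroring the elimination of out-of-area touch points in the single-device embodiments.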
• the scenario described in the above embodiment is that the user successively inputs first trajectories to operate the first user interface and the second user interface, each input touch trajectory controlling a single screen projection device; in actual application scenarios, the user can also input a touch trajectory that controls different screen projection devices at the same time.
• the first controller will determine the parts of the second trajectory distributed in the different projection areas, and send the different parts of the second trajectory to the different screen projection devices respectively, so as to achieve simultaneous reverse control of different screen projection devices by the display device.
• after receiving screen casting requests from multiple screen casting devices, the display device establishes connections with the multiple screen casting devices respectively; based on the screen casting protocol, the multiple screen casting devices deliver their user interfaces to the display device, and layout settings are made for the projection areas in the display device's user interface.
• the second trajectory determined by the first controller will include two parts: the trajectory located in the first projection area (CDEF), and the trajectory located in the second projection area (HIJK); the first controller sends the trajectory in each projection area to its corresponding screen projection device, that is, it sends the trajectory (CDEF) to the first screen projection device, and sends the trajectory (HIJK) to the second screen projection device.
• in some embodiments, when the first trajectory passes through the first screen projection area and the second screen projection area in sequence, the first trajectory is configured to take effect only on the first screen projection area.
  • the first controller synchronously obtains the first touch point sequence it contains.
  • the first trajectory passes from touch point A to point B, point C, point D, point E, and point F.
• the first controller will determine that the first screen projection area is the screen projection area that the first trajectory passes through first, and that point G is the first touch point at which the first trajectory exceeds the first screen projection area after passing through it;
• a second touch point sequence is generated: point C, point D, point E, and point F; the second touch point sequence does not include point G, that is, it does not include the first touch point beyond the first projection area, so the second touch point sequence constitutes the final second trajectory.
  • the second trajectory (CDEF) will only be sent to the first screen projection device for reverse control. This touch operation is invalid for the second screen projection device, as shown in Figure 7C.
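The first-area-only rule above can be sketched as follows, assuming the screen projection area that the trajectory enters first has already been identified; the gesture is truncated at the first touch point beyond that area (point G in the embodiment), so later points, including any in the second projection area, are discarded:

```python
# Sketch of truncating a trajectory so that it takes effect only on the
# first projection area it passes through.

def first_area_only(points, area):
    x1, y1, x2, y2 = area

    def inside(p):
        return x1 <= p[0] <= x2 and y1 <= p[1] <= y2

    second = []
    entered = False
    for p in points:
        if inside(p):
            entered = True
            second.append(p)
        elif entered:
            break  # first point beyond the area (point G) ends the gesture
    return second

# Leading points outside the area are skipped; once inside, the first
# exit point terminates the sequence.
track = [(-10, 50), (10, 50), (50, 50), (110, 50), (250, 50)]
print(first_area_only(track, (0, 0, 100, 100)))
# [(10, 50), (50, 50)]
```

With this rule the touch operation is simply invalid for any later projection area the trajectory crosses, matching the behavior described for the second screen projection device.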
  • the third user interface also displays other applications during the screen casting process.
• the first controller can create multiple layers to display different elements respectively; for example, after receiving a screen casting message from the first screen casting device, or from multiple screen casting devices, the first controller will create a first layer, as shown in Figure 7D; the first layer can display the first screen casting area and the second screen casting area, and the first layer is displayed on top of the second layer and below the transparent template layer.
• the second layer of the third user interface can display active foreground applications other than the screen projection interface, such as a video playback application or a smart doorbell application; the second layer can also be used to display the system layer content.
• the first controller needs to determine the relative positional relationship between the first screen projection area, the second screen projection area, and the second layer;
• the second layer and the first layer are overlaid to generate the third user interface, and the second layer is displayed full-screen in the third user interface.
• in some embodiments, the second layer is configured for non-full-screen display; it may then happen that the first touch point of the first trajectory falls on a third layer. In this scenario, the technical solution for determining the second trajectory from the first trajectory monitored by the third layer is similar to the technical solution recorded above, and will not be described again here.
• first, the vertex coordinates of the first screen projection area and the second screen projection area in the third user interface are obtained, as well as the second coordinates, in the third user interface, of the touch points contained in the first trajectory; the vertex coordinates include the coordinates of the diagonal vertices.
• the coordinates of the lower left vertex of the first screen projection area can be expressed as (X1, Y1), and the coordinates of its upper right vertex can be expressed as (X2, Y2);
  • the coordinates of the lower left vertex of the second projection area can be expressed as (X3, Y3), and the coordinates of its upper right vertex can be expressed as (X4, Y4);
  • the first trajectory includes the second coordinates of the touch point, which can be expressed as: point A (XA, YA), point B (XB, YB), point C (XC, YC), point D (XD, YD) , point E (XE, YE), point F (XF, YF), point G (XG, YG), point H (XH, YH), point I (XI, YI), point J (XJ, YJ), point K(XK,YK);
• by comparing the second coordinates with the vertex coordinates, the touch points located in each vertex coordinate area can be determined: the touch points in the first projection area include point C, point D, point E, and point F; the touch points in the second projection area include point H, point I, point J, and point K;
  • the second trajectory includes two parts, namely the trajectory (CDEF) and the trajectory (HIJK).
• correspondingly, this application also provides a device control method based on trajectory extraction and a multi-channel device control method based on trajectory extraction. The methods have been described in detail in the specific steps by which the display device reversely controls the screen projection device, and will not be described again here.
• the beneficial effect of this part of the embodiments is that, by constructing the relative position relationship, the location of the screen projection area in the user interface of the display device can be determined; further, by constructing the first trajectory, the complete touch operation input by the user can be obtained; further, by constructing the second trajectory, the effective touch operations sent to the screen projection device can be determined, so that touch operations that occur in the boundary area of the screen projection interface, or start outside its boundaries, can be responded to in a timely manner by the screen projection device.
• by constructing a transparent template layer, all touch operations that occur on the user interface of the display device can be monitored; further, by constructing the first trajectory, the complete touch operation input by the user can be obtained; further, by constructing the second trajectory, the effective touch operations sent to each screen projection device can be determined, so that touch operations that occur in the boundary areas of multiple screen projection interfaces, or start outside those boundaries, can be responded to in a timely manner by the corresponding screen projection devices.
  • FIG. 9A is a schematic diagram of touch points included in respective touch trajectories of another display device and a screen projection device according to an embodiment of the present application.
• when the user reversely controls the screen projection device, the first controller will obtain the touch trajectory generated by the user's operation on the second user interface; the touch trajectory includes a first touch point sequence, which is collected based on the first touch sampling frequency of the display device; the second controller of the screen projection device will obtain the touch trajectory generated by the user's operation on the first user interface, and the touch point sequence it contains is collected based on the screen touch sampling frequency.
• the touch sampling frequency determines the number of touch points that the user interface can collect per unit time and the sampling time interval between adjacent touch points; when the first touch sampling frequency of the display device differs from the screen touch sampling frequency of the screen projection device, and the user inputs the same touch trajectory on each, the display device and the screen projection device will generate touch trajectories containing different numbers of touch points.
  • the user inputs the same touch trajectory on the display device and the screen projection device respectively.
• the touch operations take the same time and follow the same trajectory; assuming that the first touch sampling frequency of the display device is 60HZ and the touch sampling frequency of the screen projection device is 120HZ, the display device will collect the touch points included in the touch trajectory at an approximate average time interval of 15ms; for the five touch events at point A, point B, point C, point D, and point E, the collection time interval between adjacent touch events is about 15ms, and the touch trajectory (ABCDE) is obtained;
• the first user interface of the screen projection device will collect the touch points of the input trajectory at an approximate average time interval of 7.5ms; that is, during the user's touch operation, the collection time interval between adjacent touch events is about 7.5ms, and the screen projection device obtains the touch trajectory (ABCDEFGHI).
• the touch sampling frequency of a device determines the touch points included in the touch trajectory during the user's touch operation; the higher the touch sampling frequency, the shorter the time interval for collecting touch points, and the more touch points are collected per unit time; if the touch trajectory acquired by the display device at a lower touch sampling frequency is directly injected into the screen projection device, the screen projection device's response will become insensitive.
• the first controller will compare the screen touch sampling frequency with the first touch sampling frequency of the display device to determine whether the screen touch sampling frequency is greater than the first touch sampling frequency.
• for example, if the first touch sampling frequency of the display device is 60HZ, and the received screen projection message indicates a screen touch sampling frequency of 120HZ, the first controller will determine that the screen touch sampling frequency is greater than the first touch sampling frequency.
• when the first touch sampling frequency of the display device is less than the screen touch sampling frequency, after the user inputs a touch operation on the display device for controlling the screen projection device, correction points can be added to the touch trajectory so that its touch sampling frequency is increased from the first touch sampling frequency to a second touch sampling frequency, improving the response sensitivity of the screen projection device.
• as the touch sampling frequency of the touch trajectory increases, the response sensitivity of the screen projection device to user-input touch operations also increases accordingly; therefore, increasing the touch sampling frequency of the touch trajectory can, to a certain extent, solve the low-response-sensitivity problem caused by the difference in touch sampling frequencies. Since human perception is insensitive to particularly small latencies, the raised second touch sampling frequency can be lower than, equal to, or higher than the screen touch sampling frequency, but it must be greater than the first touch sampling frequency.
• in some embodiments, when the first touch sampling frequency of the display device is less than the screen touch sampling frequency, after the user inputs a touch operation on the display device for controlling the screen projection device, correction points can be added to increase the touch sampling frequency of the touch trajectory, so that the touch trajectory sent to the screen projection device meets the touch sampling frequency requirements of the screen projection device; the correction points make the second touch sampling frequency of the touch trajectory greater than or equal to the screen touch sampling frequency, improving the response sensitivity of the screen projection device.
  • the first touch sampling frequency of the display device is 60HZ and the screen touch sampling frequency is 120HZ;
  • the first touch point sequence included in the touch trajectory (ABCDE) input by the user on the display device is: point A, point B, point C, point D, point E;
• the first controller can add correction points to the first touch point sequence; the correction points include point 1, point 2, point 3, and point 4, thus yielding a second touch point sequence containing more touch points: point A, point 1, point B, point 2, point C, point 3, point D, point 4, point E;
• for the touch trajectory after correction points are added, the touch sampling frequency is corrected from the first touch sampling frequency to the second touch sampling frequency, where the number of correction points required must make the second touch sampling frequency greater than or equal to the screen touch sampling frequency of 120HZ, as shown in Figure 9B.
• after the display device obtains the screen touch sampling frequency, it will determine the first number of correction points based on a comparison of the first touch sampling frequency with the screen touch sampling frequency.
• when the first touch sampling frequency is 60HZ, the collection time interval of adjacent touch events is about 15ms, and the display device collects a total of about 5 touch points for the touch trajectory (ABCDE) over a period of about 60ms.
• when the screen touch sampling frequency is 120HZ, the collection time interval of adjacent touch events is about 7.5ms, and the screen projection device can obtain approximately 9 touch points in the same 60ms period; it should be noted that in some embodiments, within the constraints of the touch sampling frequency and the system sampling frequency, the first controller can preset the sampling time interval of the touch points within a certain range, such as setting the sampling time interval to 15ms, or 10ms, within the constraint range.
  • the first number needs to be greater than or equal to 4, that is, at least 4 correction points need to be added to the first touch point sequence;
• the first controller can evenly insert 4 correction points into the first touch point sequence to obtain the second touch point sequence: point A, point 1, point B, point 2, point C, point 3, point D, point 4, and point E, so that the second touch sampling frequency of the touch trajectory is greater than or equal to the screen touch sampling frequency of 120HZ, as shown in Figure 9B.
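The even insertion of correction points described above can be sketched by placing one linearly interpolated point between each pair of adjacent touch points, which doubles the effective sampling frequency (5 points at 60HZ become 9 points at roughly 120HZ over the same 60ms span). Linear interpolation is an illustrative assumption; the patent does not specify how the correction points' coordinates are chosen:

```python
# Sketch of raising the touch sampling frequency of a trajectory by
# inserting one midpoint correction point between adjacent touch points.

def add_midpoint_corrections(points):
    """Return a second touch point sequence with a correction point
    interpolated between every pair of adjacent touch points."""
    if len(points) < 2:
        return list(points)
    corrected = [points[0]]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        corrected.append(((x0 + x1) / 2, (y0 + y1) / 2))  # correction point
        corrected.append((x1, y1))
    return corrected

seq = [(0, 0), (10, 0), (20, 0), (30, 0), (40, 0)]  # points A..E
print(len(add_midpoint_corrections(seq)))  # 9 points: A,1,B,2,C,3,D,4,E
```

To reach an arbitrary target frequency rather than a doubling, more than one correction point would be interpolated per interval, following the first-number calculation described in the embodiment.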
• the first controller sends the touch trajectory at the second touch sampling frequency to the screen projection device; the second controller of the screen projection device receives the touch trajectory containing the second touch point sequence sent by the display device, and the second touch sampling frequency of this trajectory is greater than or equal to the touch sampling frequency of the screen projection device itself.
• the second controller controls the first user interface of the screen projection device to update the display, and projects the updated first user interface to the display device again, so that the user can control the screen projection device by operating the second user interface of the display device.
• the dotted trajectory, touch points, and correction points shown in the first user interface of the screen projection device in Figure 9B indicate that the touch operation is directly injected into the screen projection device as control data; its first user interface does not actually display the touch trajectory.
• the correction points added to the first touch point sequence may be taken from sampling points between adjacent touch points, where the sampling points are acquired based on the system sampling frequency; the touch trajectory (ABCDE) shown in Figure 9C includes solid touch points and hollow sampling points; that is, the touch trajectory is composed of all the sampling points, and all the sampling points include all the touch points.
• the first controller acquires the above solid touch points based on the first touch sampling frequency, that is, the first touch sampling frequency determines the touch points included in the touch trajectory during the user's operation; during the user's input of the touch operation, the first controller can also obtain all the sampling points of the touch trajectory based on the system sampling frequency, and all the sampling points are recorded in the system database;
  • the display device can acquire and record all sampling points included in the touch trajectory based on the system sampling frequency, and all the sampling points include all the above-mentioned touch points; that is, the touch points are partial sampling points, and the corresponding display device system samples
  • the frequency is usually higher than its first touch sampling frequency.
  • the sampling points between adjacent touch points in the first touch point sequence are obtained based on the system sampling frequency; the first controller then filters these sampling points, takes those whose distances to the adjacent touch points on both sides are greater than or equal to a preset threshold as first sampling points, and adds the first sampling points to the first touch point sequence as correction points.
  • Figure 9D shows the touch trajectory (AE) acquired by the display device, whose first touch point sequence is: point A, point E; the sampling points acquired based on the system sampling frequency include, between touch point A and touch point E, a first sampling point C and a second sampling point D;
  • the sampling points can be filtered with a preset threshold L: the distance between a sampling point and a touch point is compared with the preset threshold L; when the distance is smaller than the preset threshold L, the sampling point and the touch point can be approximately considered to coincide; when the distance is greater than or equal to the preset threshold L, the sampling point can be inserted into the first touch point sequence as a correction point;
  • in the second user interface of the display device shown in Figure 9D, the second sampling point D is close to touch point E; since the distance is less than the preset threshold L, the second sampling point D and point E can be approximately considered to coincide, and point D should be removed; the distances from the first sampling point C to touch point A and to touch point E are both greater than or equal to the preset threshold L, so the first sampling point C can be added to the first touch point sequence as a correction point;
  • the acquired second touch point sequence includes: point A, point C, point E; the touch sampling frequency of the touch trajectory is increased from the first touch sampling frequency to the second touch sampling frequency, which is greater than or equal to the screen touch sampling frequency of 120 Hz.
  • the preset threshold L can be set according to the screen touch sampling frequency; by adjusting the size of the preset threshold L, the number of correction points inserted between adjacent touch points can be adjusted, and the number of correction points included in the touch trajectory affects its second touch sampling frequency, as shown in Figure 9D.
  • the first controller may obtain all sampling points between adjacent touch points in the first touch point sequence and add all of them to the first touch point sequence as correction points, thereby obtaining a touch trajectory with the second touch point sequence.
  • the touch points of a trajectory can be obtained by monitoring the touch events (ACTION_MOVE) that occur on the responding layer of the display device user interface; for the sampling points recorded between adjacent touch points based on the system sampling frequency, a set of sampling-point touch events can be obtained through a historical-data acquisition function, and the set includes all sampling points existing between the touch points; the distance of a sampling point from other sampling points and touch points can be determined from its abscissa and ordinate.
  • the sampling points are filtered based on this distance: if there are multiple sampling points between adjacent touch points, the sampling points with relatively large spacing are inserted between the adjacent touch points as correction points, i.e. a sampling point is inserted after the preceding touch point is emitted and before the following touch point is emitted, thereby increasing the amount of information contained in the touch trajectory.
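The distance-based screening of recorded sampling points described above can be sketched as follows. This is a minimal illustration only: the function name, the `(x, y)` tuple encoding, and the `samples_between` layout are hypothetical, not the patented implementation.

```python
import math

def insert_correction_points(touch_points, samples_between, threshold):
    """Build a second touch-point sequence by inserting, between each pair of
    adjacent touch points, every recorded sampling point whose distance to the
    most recently kept point AND to the following touch point is at least
    `threshold`. `samples_between[i]` holds the sampling points recorded
    between touch_points[i] and touch_points[i + 1]."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    result = [touch_points[0]]
    for i in range(len(touch_points) - 1):
        nxt = touch_points[i + 1]
        for s in samples_between[i]:
            # Keep the sample as a correction point only if it is far enough
            # from both the last kept point and the next touch point.
            if dist(s, result[-1]) >= threshold and dist(s, nxt) >= threshold:
                result.append(s)
        result.append(nxt)
    return result
```

Samples that nearly coincide with a neighbouring touch point (distance below the threshold) are dropped, matching the screening rule stated above.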
  • the first touch point sequence included in the touch trajectory is expressed as follows:
  • while the touch trajectory is converted from the first touch sampling frequency to the second touch sampling frequency, the historical information between touch point B and touch point C is queried, that is, the sampling points recorded based on the system sampling frequency are queried.
  • the above distance difference can be defined as the preset threshold L, and the preset threshold L can be adjusted according to specific circumstances to adjust the number of sampling points added between touch points.
  • the resulting second touch point sequence is expressed as follows:
  • the touch trajectory containing the second touch point sequence is sent to and injected into the screen projection device; since the second touch sampling frequency of the touch trajectory is greater than or equal to the touch sampling frequency of the screen projection device itself, its control effect is consistent with the user directly controlling the screen projection device, which can to a certain extent solve the problem of insensitive response when reversely controlling a device with a high touch sampling frequency.
  • while the first sampling point is added to the first touch point sequence as a correction point, the first controller is further configured to detect whether there is a second sampling point whose distance from the first sampling point is greater than or equal to the preset threshold, and whose distance from the adjacent touch point on its other side is also greater than or equal to the preset threshold; if such a second sampling point exists, it is also added to the first touch point sequence as a correction point.
  • Figure 9E shows the touch trajectory (AE) acquired by the display device; between touch point A and touch point E there are also sampling points acquired based on the system sampling frequency: a first sampling point B, a second sampling point C, and a third sampling point D;
  • for the first sampling point B, if its distances from touch point A and touch point E are both greater than the preset threshold L, the first sampling point B is added to the first touch point sequence as a correction point; for the second sampling point C, if its distances from the first sampling point B and touch point E are both greater than the preset threshold L, the second sampling point C is also added as a correction point; the third sampling point D is close to touch point E, with a distance less than the preset threshold L, so point D and point E can be approximately considered to coincide and the third sampling point D should be removed;
  • a second touch point sequence is thereby obtained, expressed as: point A, point B, point C, point E;
  • the sampling frequency of the touch trajectory containing the second touch point sequence is likewise increased from the first touch sampling frequency to the second touch sampling frequency, which is greater than or equal to the screen touch sampling frequency of 120 Hz.
  • the first touch point sequence included in the touch trajectory is expressed as follows:
  • there are multiple sampling points between adjacent touch points, and the sampling points can be filtered according to the actual situation; following the sampling-point screening scheme above, the sampling points are compared with touch point C and touch point D, among which the first sampling point and the second sampling point are usable; the third sampling point coincides with touch point D, or the distance between them is extremely small, so the third sampling point can be ignored;
  • after the correction points are added, the second touch point sequence of the touch trajectory is expressed as follows:
  • in order to make the user's control input on the display device feel closer to directly controlling the screen projection device, after the screen projection device receives the touch trajectory, the second controller removes the last touch point of the second touch point sequence to obtain a new touch trajectory.
  • this mechanism simulates the control effect of the user pressing the screen and accelerating the finger movement when directly controlling the screen projection device.
  • the difference between the display device and the screen projection device lies in the number of touch events collected per unit time.
  • the screen projection device responds to the touch events sent by the display device with accelerated processing; this processing does not affect the user's touch trajectory: only the last touch event is discarded, and the last touch event remaining in the sequence after the discard is used as the touch endpoint (ACTION_UP event), simulating the accelerated processing scenario of directly controlling the screen projection device and making the user's control feel closer to directly controlling the screen projection device.
  • the second controller of the screen projection device controls the first user interface to update its display based on the new touch trajectory, thereby enabling the screen projection device to process the user's touch trajectory input with acceleration.
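The discard-and-retag step described above might look like the following sketch. The `(x, y, action)` tuple encoding and the function name are assumptions for illustration, not the device's actual event representation.

```python
def accelerate_trajectory(events):
    """Simulate the 'accelerated' handling on the screen projection device:
    drop the final touch event of the received sequence and re-tag the new
    last event as the touch endpoint (UP), as if the finger lifted earlier.
    `events` is a list of (x, y, action) tuples, action in {DOWN, MOVE, UP}."""
    if len(events) < 2:
        return list(events)
    trimmed = list(events[:-1])      # discard the last touch event
    x, y, _ = trimmed[-1]
    trimmed[-1] = (x, y, "UP")       # new endpoint simulates a faster lift
    return trimmed
```

The trajectory's shape is unchanged; only its final event is replaced, matching the description that the discard does not affect the user's touch trajectory.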
  • after the display device receives the screen casting message sent by the screen casting device, it establishes a connection with the screen casting device; after the connection succeeds, the screen casting device records the screen of its first user interface and sends it to the second user interface of the display device as a data stream transmitted over the protocol, and the display device lays out and displays the screen projection area to be shown, as shown in Figure 10A.
  • the screen projection area in the second user interface of the display device, or the layer on which the screen projection area is located, may support user touch operations; during establishment of the screen projection, the screen projection device may be configured to send signaling to the display device to confirm whether the display device supports the reverse control function; after the protocol exchange confirms that the display device supports reverse control, the touch trajectory input by the user in the screen projection area is sent to the screen projection device, where it serves as input operation data.
  • the display device or screen projection device described above can also be implemented as the same type of terminal device, such as a large-screen terminal, a mobile phone, a tablet computer, a computer, or another terminal.
  • the overall process of controlling a device based on sampling frequency correction described in this application is applicable to any scenario where the response is insensitive due to differences in touch sampling frequencies, as shown in Figure 10B.
  • this application also provides a device control method based on sampling frequency correction.
  • the method has been described in detail in the specific steps of implementing touch sampling frequency correction on the display device and the screen projection device, and is not repeated here.
  • the beneficial effect of this part of the embodiments is that, by obtaining the screen touch sampling frequency during the screen casting process, it can be determined whether the touch sampling frequency of the touch trajectory needs correction; further, by constructing correction points, the second touch point sequence can be obtained; further, by constructing the second touch sampling frequency, the sampling frequency of the touch trajectory can be increased and the amount of data information it contains enlarged, improving the response sensitivity of the screen projection device and achieving a control experience consistent with operating the screen projection device directly.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This application relates to the technical field of display devices, and specifically to a display device, a screen projection device, and a device control method based on trajectory extraction, which can, to a certain extent, solve the problem that when a user's manipulation trajectory enters the screen projection area from outside it, the screen projection user interface cannot acquire the operation trajectory and thus cannot reversely control the screen projection device. The display device includes: a display; and a first controller configured to: when the first user interface is displayed on the third user interface, determine the relative positional relationship between the first screen projection area and the third user interface; acquire the generated first trajectory by monitoring touch events on the third user interface; determine, according to the relative positional relationship, the second trajectory formed by the first trajectory within the first screen projection area, and send it to the first screen projection device, so that the user controls the first user interface of the first screen projection device by operating the third user interface.

Description

Display device, screen projection device, and device control method based on trajectory extraction
Cross-reference to related applications
This application claims priority to the Chinese applications with application No. 202210303261.5 filed on March 24, 2022, application No. 202210303643.8 filed on March 24, 2022, and application No. 202210303291.6 filed on March 24, 2022, the entire contents of which are incorporated herein by reference.
Technical field
This application relates to the technical field of display devices, and in particular to a display device, a screen projection device, and a device control method based on trajectory extraction.
Background
With the increasing intelligence of display devices and the continuous emergence of new terminals, scenarios of screen sharing between devices are becoming more and more common; while casting a screen to a display device to share it, the user can also reversely control the screen projection device by operating the display device's user interface; for example, while a tablet computer is casting its screen to a television, the user can reversely control the tablet through operations on the television's user interface.
In some implementations of a display device reversely controlling a screen projection device, a screen projection communication connection between the display device and the screen projection device is usually established first; it is then confirmed whether the screen projection communication protocol supports the display device reversely controlling the screen projection device; when reverse control is supported, the screen projection area on the display device side can monitor screen touch events occurring within its area, and send and inject the acquired operation trajectory into the system of the screen projection device to realize reverse control of the screen projection device.
However, when the user's manipulation trajectory cuts in from outside the boundary of the screen projection area, the screen projection user interface cannot monitor that touch event, so the operation trajectory cannot be acquired and the screen projection device cannot be controlled, and the screen projection device sometimes fails to respond to manipulations near the boundary of the screen projection area.
Summary
A first aspect of the embodiments of this application provides a display device, including: a display for displaying a third user interface containing a first screen projection area, the first screen projection area being used to synchronously display the first user interface of a first screen projection device; and a first controller configured to: when the first user interface is displayed on the third user interface, determine the relative positional relationship between the first screen projection area and the third user interface; acquire, by monitoring touch events on the third user interface, a first trajectory generated when the user operates the third user interface; determine, according to the relative positional relationship, a second trajectory formed by the first trajectory within the first screen projection area, and send the second trajectory to the first screen projection device; wherein the second trajectory is used by the first screen projection device to control the first user interface, so that the user controls the first user interface of the first screen projection device by operating the third user interface.
A second aspect of the embodiments of this application provides a screen projection device, including: a display for displaying a first user interface, the first user interface being synchronously displayed during screen projection in a first screen projection area of a third user interface of a display device; and a second controller configured to: receive a second trajectory sent by the display device, the second trajectory being the user's operation trajectory in the first screen projection area of the third user interface; and control the first user interface to update its display based on the second trajectory, and project the updated first user interface to the first screen projection area of the third user interface.
A third aspect of the embodiments of this application provides a device control method based on trajectory extraction, the method including: when the first user interface of a first screen projection device is displayed on a third user interface, determining the relative positional relationship between a first screen projection area and the third user interface, the first screen projection area being used to synchronously display the first user interface of the first screen projection device; acquiring, by monitoring touch events on the third user interface, a first trajectory generated when the user operates the third user interface; and determining, according to the relative positional relationship, a second trajectory formed by the first trajectory within the first screen projection area, and sending the second trajectory to the first screen projection device; wherein the second trajectory is used by the first screen projection device to control its first user interface, so that the user controls the first user interface of the first screen projection device by operating the third user interface.
A fourth aspect of the embodiments of this application provides a device control method based on trajectory extraction, the method including: receiving a second trajectory sent by a display device, the second trajectory being the user's operation trajectory in a first screen projection area of a third user interface of the display device; and controlling a first user interface to update its display based on the second trajectory, and projecting the updated first user interface to the first screen projection area of the third user interface, the first user interface being synchronously displayed during screen projection in the first screen projection area of the third user interface of the display device.
Brief description of the drawings
Figure 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment of this application;
Figure 2 is a block diagram of the hardware configuration of the display device 200 according to an embodiment of this application;
Figure 3 is a block diagram of the hardware configuration of the control device 100 according to an embodiment of this application;
Figure 4 is a schematic diagram of the software configuration in the display device 200 according to an embodiment of this application;
Figure 5A is a schematic diagram of user interfaces of a display device and a screen projection device according to an embodiment of this application;
Figure 5B is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 5C is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 5D is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 6A is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 6B is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 6C is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 6D is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 6E is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 6F is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 6G is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 7A is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 7B is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 7C is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 7D is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 7E is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 8A is a schematic architecture diagram of another display device reversely controlling a screen projection device according to an embodiment of this application;
Figure 8B is a schematic layer layout diagram of another display device according to an embodiment of this application;
Figure 8C is a schematic layer layout diagram of another display device according to an embodiment of this application;
Figure 9A is a schematic diagram of touch points according to an embodiment of this application;
Figure 9B is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 9C is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 9D is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 9E is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application;
Figure 10A is a schematic architecture diagram of another display device reversely controlling a screen projection device according to an embodiment of this application;
Figure 10B is a schematic diagram of another display device and screen projection device according to an embodiment of this application.
Detailed description of embodiments
To make the objectives, implementations, and advantages of this application clearer, exemplary implementations of this application will be described clearly and completely below with reference to the drawings of the exemplary embodiments of this application; obviously, the described exemplary embodiments are only a part of the embodiments of this application, not all of them.
It should be noted that the brief explanations of terms in this application are only intended to facilitate understanding of the implementations described below, and are not intended to limit the implementations of this application. Unless otherwise stated, these terms should be understood according to their ordinary and usual meanings.
Figure 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in Figure 1, the user can operate the display device 200 through a smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote control; communication between the remote control and the display device includes infrared protocol communication or Bluetooth protocol communication, as well as other short-range communication methods, controlling the display device 200 wirelessly or by wire. The user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and so on.
In some embodiments, the display device 200 also performs data communication with a server 400. The display device 200 may be allowed to communicate via a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 400 can provide various content and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Figure 2 exemplarily shows a block diagram of the configuration of the control apparatus 100 according to an exemplary embodiment. As shown in Figure 2, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 can receive the user's input operation instructions and convert the operation instructions into instructions that the display device 200 can recognize and respond to, acting as an intermediary for interaction between the user and the display device 200.
As shown in Figure 3, the display device 200 includes at least one of a tuner-demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
The display 260 may be a liquid crystal display, an OLED display, or a projection display, and may also be a projection apparatus and projection screen.
The communicator 220 is a component used to communicate with external devices or servers according to various types of communication protocols. For example, the communicator may include at least one of a WiFi module, a Bluetooth module, a wired Ethernet module, other network communication protocol chips or near-field communication protocol chips, and an infrared receiver. The display device 200 can establish the transmission and reception of control signals and data signals with the external control apparatus 100 or the server 400 through the communicator 220.
The user interface can be used to receive control signals from the control apparatus 100 (such as an infrared remote control).
The detector 230 is used to collect signals from the external environment or signals of interaction with the outside. For example, the detector 230 includes a light receiver, a sensor for collecting ambient light intensity; or the detector 230 includes an image collector, such as a camera, which can be used to collect external environment scenes, user attributes, or user interaction gestures; or the detector 230 includes a sound collector, such as a microphone, for receiving external sounds.
The external device interface 240 may include, but is not limited to, any one or more of the following: a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), an RGB port, and so on. It may also be a composite input/output interface formed by the multiple interfaces above.
The tuner-demodulator 210 receives broadcast television signals by wired or wireless reception, and demodulates audio and video signals, as well as EPG data signals, from multiple wireless or wired broadcast television signals.
In some embodiments, the controller 250 controls the operation of the display device and responds to the user's operations through various software control programs stored in the memory. The controller 250 controls the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object displayed on the display 260, the controller 250 can perform the operation related to the object selected by the user command.
As shown in Figure 4, the application framework layer in the embodiments of this application includes managers (Managers), content providers (Content Provider), and so on, where the managers include at least one of the following modules: an activity manager (Activity Manager) for interacting with all activities running in the system; a location manager (Location Manager) for providing system services or applications with access to the system location service; a package manager (Package Manager) for retrieving various information related to the application packages currently installed on the device; a notification manager (Notification Manager) for controlling the display and clearing of notification messages; and a window manager (Window Manager) for managing icons, windows, toolbars, wallpapers, and desktop widgets on the user interface.
In some embodiments, the kernel layer is the layer between hardware and software. As shown in Figure 4, the kernel layer contains at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a WiFi driver, a USB driver, an HDMI driver, sensor drivers (such as a fingerprint sensor, a temperature sensor, a pressure sensor, etc.), a power driver, and so on.
The embodiments of this application can be applied to various types of display devices and screen projection devices, including but not limited to: smart televisions, liquid crystal televisions, VR headsets, tablet computers, mobile phones, smart terminals, and other devices.
Figure 5A is a schematic diagram of user interfaces of a display device and a screen projection device provided by another embodiment of this application.
In some embodiments, the first screen projection device can cast its first user interface to the third user interface of the display device, and the first user interface is configured to be displayed within the first screen projection area.
During the casting of the first user interface of the first screen projection device, the content of the first user interface displayed in the first screen projection area is synchronized with the first user interface on the first screen projection device side.
In some embodiments, by moving the first screen projection area, the user can set the position at which the first screen projection device is displayed in the third user interface. In some embodiments, when the first screen projection device displays in portrait orientation, the first user interface displayed in the first screen projection area is also displayed in portrait orientation, as with the first screen projection device and first screen projection area shown in Figure 5A; when the first screen projection device displays in landscape orientation, the first user interface displayed in the first screen projection area is also displayed in landscape orientation, as with the first screen projection device and first screen projection area shown in Figure 5C.
In some embodiments, the third user interface of the display device may further include a second screen projection area for displaying the second user interface cast by a second screen projection device, as shown in Figure 5D. It can be understood that, during screen sharing, multiple screen projection devices can cast to the same display device simultaneously or successively, and the third user interface can simultaneously display the user interfaces cast by multiple screen projection devices.
Figure 5B is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application.
In some embodiments, after the first screen projection device and the display device establish a connection, the user inputs a trajectory ABC in the first screen projection area; this trajectory acts on the first user interface of the first screen projection device, realizing reverse control of the first screen projection device through the display device.
For example, the first screen projection device and the display device access a communication network based on a screen projection protocol; based on the protocol, the first screen projection device and the display device carry the relevant protocol information during signaling interaction to perform the relevant screen projection operations; the first screen projection device acts as the sending end of the screen projection protocol, and the display device acts as the receiving end.
After receiving the screen projection message sent by the screen projection device, the display device establishes a connection with it; after the connection succeeds, based on the screen projection protocol, in some embodiments the first screen projection device transmits a data stream over the protocol by means of screen recording, casting its first user interface onto the third user interface; the display device lays out and displays the first screen projection area to be shown in the third user interface, realizing the projected display, as shown in Figure 8A.
In some embodiments, the first screen projection area in the third user interface is used to display the first user interface, and the first screen projection area, or the layer on which it is located, can support user touch operations;
During establishment of the projected display, the first screen projection device sends signaling to the display device to confirm whether the current display device supports the reverse control function; after confirming that the display device supports the reverse control function, the first screen projection device performs the corresponding initialization;
After confirmation through protocol interaction, the display device can perform control processing on the touch operations input by the user; when the user inputs a touch trajectory on the first user interface within the first screen projection area, the first controller of the display device sends the touch operation to the first screen projection device to realize reverse control, and the touch operation serves as input operation data on the first screen projection device; the reverse control process above may involve more devices, such as a second screen projection device and a third screen projection device, whose control architecture is illustrated in Figure 8A.
In some embodiments, when a touch operation occurs in the first screen projection area, the first screen projection area, or the layer on which it is located, can monitor the touch event; the first controller can acquire the first trajectory formed by the user operating the first screen projection area and send it to the first screen projection device through a control processing module; the second controller of the first screen projection device injects the received first trajectory into its system to realize reverse control of the first screen projection device.
In some embodiments, the third user interface includes a first screen projection area and a second screen projection area; when the user inputs touch trajectories in these multiple screen projection areas, the first screen projection device and the second screen projection device can be controlled simultaneously.
As shown in Figure 5D, the first user interface in the first screen projection area corresponds to the first screen projection device, and the second user interface in the second screen projection area corresponds to the second screen projection device;
The user inputs a touch trajectory (ABC) in the first screen projection area while inputting a touch trajectory (FGH) in the second screen projection area; the two touch trajectories are sent to the corresponding first and second screen projection devices respectively, so that the touch trajectory (ABC) controls the first screen projection device and the touch trajectory (FGH) controls the second screen projection device.
In some embodiments, the user first inputs the touch trajectory (ABC) in the first screen projection area and then inputs the touch trajectory (FGH) in the second screen projection area; the two touch trajectories are sent to the first screen projection device and the second screen projection device respectively.
It should be noted that although the first and second screen projection areas are configured as rectangles in the drawings, this application does not limit their shapes; the screen projection areas may also be configured as circles, ellipses, polygons, triangles, or other shapes.
The above embodiments describe the scenario in which the display device reversely controls the screen projection device when the touch trajectory is performed within the screen projection area; the following describes the display device reversely controlling the screen projection device when part of the touch trajectory is performed outside the screen projection area.
Figure 6A is a schematic diagram of user interfaces of another display device and screen projection device according to an embodiment of this application.
In some embodiments, when the third user interface displays the first user interface cast by the first screen projection device, the first controller determines the position of the first screen projection area in the third user interface, that is, the relative positional relationship between the first screen projection area and the third user interface.
That is, when the user interface of the screen projection device is displayed, the first controller determines the position of the screen projection area in the third user interface, so that the touch trajectory input by the user can be compared with the screen projection area to determine the user's operation intention.
In some embodiments, after the position of the first screen projection area in the third user interface is manually adjusted by the user or automatically adjusted by the system, the first controller promptly acquires the relative positional relationship between the first screen projection area and the third user interface.
In some embodiments, when the user operates the third user interface via a remote control, a control apparatus, or touch, the first controller monitors the touch events occurring on the third user interface so as to promptly acquire the first trajectory generated by the touch operation.
For example, in Figure 6A the user inputs a first trajectory (ABCDE) on the third user interface; the first trajectory starts outside the first screen projection area, passes over the left boundary of the first screen projection area, enters the first screen projection area, and ends at point E within the first screen projection area; points A, B, C, D, and E are touch points acquired by the first controller according to the screen touch sampling frequency of the display device;
It should be noted that, to facilitate explanation of the technical solution, Figure 6A shows only some of the touch points of the first trajectory; in actual sampling, the number and distribution density of touch points contained in the first trajectory differ depending on the parameter configuration of the screen touch sampling frequency.
It can be seen that part of the first trajectory (ABCDE) lies within the first screen projection area and part lies outside it; therefore, after acquiring the first trajectory, the first controller analyzes the acquired first trajectory according to the relative positional relationship between the screen projection area and the third user interface, and extracts the valid trajectory contained in it; the valid trajectory, i.e. the portion inside the first screen projection area, may also be called the second trajectory, and the second trajectory is sent to the screen projection device for reverse control.
As shown in Figure 6A, in the first trajectory (ABCDE) the valid trajectory is the trajectory (CDE); the portion of the first trajectory outside the first screen projection area, the trajectory (ABC), can be regarded as an invalid trajectory; therefore, in the process of reversely controlling the screen projection device, the first controller should discard the trajectory (ABC), take the extracted valid trajectory (CDE) as the second trajectory, and send the second trajectory to the first screen projection device.
The first trajectory (ABCDE) input by the user is effectively equivalent to the user inputting the second trajectory (CDE) on the first screen projection device; the dotted trajectory shown in the first user interface of the first screen projection device in Figure 6A indicates that the second trajectory is not displayed in solid form on the first user interface of the screen projection device, but is only injected as data into the system of the first screen projection device to realize operation of the first screen projection device.
Figure 6A shows the scenario interface in which the first trajectory (ABCDE) enters the first screen projection area from outside; correspondingly, when a first trajectory (EDCBA) moves from inside the first screen projection area to outside it, the second trajectory (EDC) can also be extracted according to the above scheme and sent to the first screen projection device, as shown in Figure 6B.
In some embodiments, in the process of reversely controlling the screen projection device, the position of the first screen projection area in the third user interface must first be determined and recorded, obtaining the relative positional relationship between the first screen projection area and the third user interface; since the first trajectory enters the first screen projection area from outside it, the first screen projection area, or the layer on which it is located, may fail to perceive the touch event, so capture processing is needed for the boundary region of the first screen projection area; when laying out the first screen projection area, boundary margin processing can be applied to it, that is, some area is reserved at the boundary in preparation for the user's touch trajectory entering from outside.
In some embodiments, after the display device receives the screen projection message sent by the first screen projection device, the first controller creates a first layer, which will carry the first screen projection area to display the cast first user interface.
For example, for the display device shown in Figure 6C, its third user interface includes a first layer and a second layer; the first layer is configured to be displayed above the second layer, i.e. the first layer is set at the top of the third user interface, to display the cast first user interface and prevent the first screen projection area from being blocked by elements of other layers during screen projection.
The second layer can be configured to display the other elements of the third user interface, including the content being played on the third user interface before screen projection and the system user interface, for example elements such as a home button, a search button, message prompts, and signal icons; the first layer and second layer, displayed in superposition, can show the entire content of the third user interface.
It should be noted that the content displayed by the second layer may, in some embodiments, also be displayed by multiple layers; that is, the third user interface is formed by multiple layers, but the first layer is always displayed on top during screen projection.
In some embodiments, according to the size of the first screen projection area to be loaded, the first layer is configured to fit that first screen projection area, i.e. the first layer and the first screen projection area occupy areas of the same size; the user can adjust the screen projection position by moving the first layer, and compared with adjusting the display position of the screen projection area within a layer, this mechanism adjusts the screen projection area more efficiently.
When the first layer is the same size as the first screen projection area and the second layer is displayed full-screen on the third user interface, the relative positional relationship between the first screen projection area and the second layer is equivalent to the relative positional relationship between the first screen projection area and the third user interface.
As shown in Figure 6C, the third user interface is composed of the first layer and the second layer, with the first layer set above the second layer, the first layer being the same size as the first screen projection area and the second layer the same size as the third user interface; the user inputs a touch operation, and the first controller obtains the first trajectory (ABCDE) input by the user;
The first trajectory starts at point A on the second layer, so the second layer monitors this touch trajectory event and the first layer cannot monitor it; the layer that first monitors the touch operation monitors the touch trajectory, and other layers no longer monitor this trajectory; this monitoring mechanism is what causes, in some embodiments, the insensitive-response problem for touch operations occurring at the boundary of the screen projection area; in this embodiment, the second layer can monitor and obtain the first trajectory (ABCDE), while the first layer cannot acquire it.
In some embodiments, based on the screen touch sampling frequency of the display device, the first controller can acquire the first touch point sequence contained in the first trajectory (ABCDE): point A, point B, point C, point D, point E;
Then, based on the acquired relative positional relationship, the touch points of the first touch point sequence lying within the first screen projection area are extracted to form the second touch point sequence: point C, point D, point E; the second touch point sequence forms the second trajectory (CDE), which is sent to the first screen projection device to realize reverse control, as shown in Figure 6C.
In some embodiments, the third user interface is configured to include multiple layers, for example a system layer, an activity layer, and a screen projection layer;
The system layer can be used to display the underlying system user interface, such as a menu interface or function icon interface; the activity layer can be used to display the foreground application, such as a currently playing video application or a smart doorbell application; the screen projection layer is used to display the screen projection interface, is usually configured at the top of the third user interface, and is sized to fit the first screen projection area carrying the first user interface; the layout of these layers in the third user interface can be configured as shown in Figure 8B.
It should be noted that, in some embodiments, when the display device has no active foreground application, the third user interface may consist of the screen projection layer and the system layer, where the screen projection layer may be implemented as the first layer and the system layer as the second layer. In some embodiments, when the display device has an active foreground application, the foreground application may be displayed on the activity layer; when the touch trajectory is input within the activity layer and the screen projection layer, the activity layer is equivalent to the second layer and the screen projection layer is equivalent to the first layer.
In some embodiments, the first layer is set larger than the first screen projection area, and the second layer is set to be displayed full-screen. When the first layer monitors the touch event first, the first controller acquires the first trajectory generated by the user operating the first layer.
For example, an input trajectory (EDCBA) starts at point E within the first screen projection area on the first layer, passes boundary point C of the first screen projection area, passes point B on the first layer, and ends at point A on the second layer;
For monitoring of the input trajectory, the first layer is the first to monitor the touch at point E, so the second layer cannot monitor this touch trajectory; according to the screen touch sampling frequency, the first touch point sequence contained in the first trajectory (EDCB) can be acquired: point E, point D, point C, point B;
The dotted point A and trajectory (BA) indicate that although the user inputs that part of the operation, the display device actually cannot monitor the touch operation at point A and along the trajectory (BA).
Based on the relative positional relationship between the first screen projection area and the second layer, the first controller discards touch point B of the first touch point sequence, which exceeds the first screen projection area; point E, point D, and point C serve as the second touch point sequence and form the second trajectory (EDC), which is sent to the first screen projection device, as shown in Figure 6D.
In some embodiments, as shown in Figure 6D, while the user inputs the first trajectory, the first controller synchronously acquires the first touch point sequence it contains, i.e. when the first trajectory is drawn from point E to point D, the first controller can synchronously acquire the first touch point sequence: point E, point D;
Based on the acquired relative positional relationship, upon detecting that the first touch point sequence contains the first touch point that exceeds the first screen projection area, the first controller generates the second touch point sequence.
For example, when the first trajectory goes from point E to point D, point C, and point B, the first controller synchronously acquires the first touch point sequence: point E, point D, point C, point B;
Based on the relative positional relationship, point B is detected to exceed the first screen projection area, and the first controller generates the second touch point sequence: point E, point D, point C; this second touch point sequence does not contain point B, the first touch point to exceed the first screen projection area, and it forms the second trajectory (EDC), as shown in Figure 6D. It can be seen that under this mechanism the first controller acquires the second trajectory in less time, so the user perceives the reverse control as more responsive, more sensitive, and faster.
In some embodiments, in the process of acquiring the second trajectory, the first controller acquires first coordinates and vertex coordinates, the vertex coordinates being the coordinates of two diagonal vertices of the first screen projection area in the third user interface, and the first coordinates being the coordinates in the third user interface of the touch points contained in the first trajectory.
When the first screen projection area is implemented as a rectangle, its lower-left vertex coordinates can be expressed as (X1, Y1) and its upper-right vertex coordinates as (X2, Y2); the first coordinates of the touch points contained in the first trajectory (ABCDE) can be expressed as: point A (XA, YA), point B (XB, YB), point C (XC, YC), point D (XD, YD), point E (XE, YE);
By comparing the first coordinates of the above touch points with the vertex coordinates, the touch points lying within the first screen projection area can be determined: point C, point D, point E; these touch points form the second trajectory (CDE), as shown in Figure 6E.
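The coordinate comparison just described, which keeps only the touch points inside the rectangular projection area, can be sketched as follows. The function name and tuple encoding are illustrative assumptions, not the claimed implementation:

```python
def extract_second_trajectory(points, bottom_left, top_right):
    """Keep only the touch points of the first trajectory that fall inside
    the screen projection area, given its two diagonal corner coordinates.
    `points` is a list of (x, y) tuples; corners are (x, y) tuples too."""
    (x1, y1), (x2, y2) = bottom_left, top_right
    # A point belongs to the area when both coordinates lie within the
    # rectangle spanned by the two diagonal vertices.
    return [(x, y) for (x, y) in points if x1 <= x <= x2 and y1 <= y <= y2]
```

The list that comes back is the second trajectory sent to the screen projection device; points outside the rectangle (the invalid trajectory) are simply dropped.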
In some embodiments, determining the second trajectory based on the relative positional relationship can be implemented by the following algorithm. The first trajectory is expressed as: first trajectory (point 1, point 2, point 3, point 4, point 5 ... point N); when the first trajectory enters the first screen projection area from outside, point 5 reaches the left boundary of the first screen projection area in the third user interface, and it can be determined that at this moment the touch trajectory has reached the first screen projection area;
Then, point 6 is taken as the trigger point at which the touch trajectory begins operating in the first screen projection area, i.e. as the starting point of the user's touch operation on the first user interface; points 6 to N are converted to obtain a new trajectory (new point 1, new point 2, new point 3, new point 4, new point 5 ... new point N), which is sent to the first screen projection device for reverse control, as shown in Figure 6F.
For example, the first screen projection area is configured as 656 pixels × 1080 pixels, with its upper and lower boundaries reaching the upper and lower boundaries of the third user interface respectively; the event information of the touch points contained in the first trajectory can be expressed as (x, y, KEY, extra);
where x is the abscissa, y is the ordinate, KEY is the direction parameter, and extra is the monitoring parameter; the touch points contained in the first trajectory can then be expressed as follows:
Point 1 (x=601.0, y=377.14832, DOWN, extra1),
Point 2 (x=608.3, y=374.80246, MOVE, extra1),
Point 3 (x=610.3, y=373.10000, MOVE, extra1),
Point 4 (x=615.3, y=372.80246, MOVE, extra1),
Point 5 (x=623.3, y=371.80246, MOVE, extra1),
Point 6 (x=631.3, y=371.80246, MOVE, extra1),
.....
Point N (x=670.3, y=370.0833, UP, extra1);
Based on the relative positional relationship, the relative-position starting point of the first screen projection area, i.e. its lower-left vertex, has coordinates (630, 0); since the y coordinate values of the above touch points are all greater than the ordinate 0 of the starting point, none of the ordinates y need processing;
In the abscissa direction, the above touch points need conversion; specifically, 630 pixels can be subtracted from each original abscissa to obtain the abscissa value of the touch point within the first screen projection area relative to the starting point;
Based on comparison of the touch point coordinates with the first screen projection area, all touch points whose abscissa is less than 630 pixels need to be discarded; touch point 6, the first touch point with an abscissa greater than 630 pixels, serves as the first touch point with which the user controls the first screen projection device, yielding a new trajectory whose touch points are expressed as follows:
Point 6 (x=631.3, y=371.80246, MOVE, extra1),
.....
Point N (x=670.3, y=370.08330, UP, extra1);
The above means that point 6, point 7, point 8, point 9, ... point N are regarded as the touch points contained in the new trajectory; based on this new trajectory, by converting the abscissa differences and changing the direction parameter of the first touch point from MOVE to DOWN, the new trajectory is converted into a trajectory relative to the first screen projection area, i.e. the second trajectory, expressed as:
second trajectory (new point 1, new point 2, new point 3, new point 4, ... new point N), whose touch points are expressed as follows:
New point 1 (x=1.3, y=371.80246, DOWN, extra1),
.....
New point N (x=40.3, y=370.0833, UP, extra1);
The second trajectory is sent to the first screen projection device, whose second controller receives the second trajectory sent by the display device; based on the second trajectory, the second controller controls its first user interface to update its display, and projects the updated first user interface to the first screen projection area again, realizing the display device's reverse control of the first screen projection device.
It should be noted that, to explain this technical solution vividly and clearly, the touch points and user interface elements in the drawings of this application are all schematic annotations; that is, the layout positions of the touch points, the relative distances between points, the screen projection areas, and the user interfaces in the drawings are all shown schematically, and their actual layout should be determined by the parameters of each display element.
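The conversion worked through above (shifting into area-relative coordinates, discarding out-of-area points, and re-tagging the endpoints) can be sketched as follows, under an assumed `(x, y, action)` tuple encoding; the function name is hypothetical:

```python
def to_region_relative(points, origin_x, origin_y):
    """Convert the in-area part of a trajectory to coordinates relative to
    the projection area's starting point (its lower-left vertex), discard
    points whose converted abscissa falls outside the area, and re-tag the
    first kept event as DOWN and the last as UP."""
    kept = []
    for x, y, action in points:
        rx = x - origin_x            # shift into area-relative coordinates
        if rx < 0:                   # outside the area: discard
            continue
        kept.append((rx, y - origin_y, action))
    if kept:
        kept[0] = (kept[0][0], kept[0][1], "DOWN")   # new starting point
        kept[-1] = (kept[-1][0], kept[-1][1], "UP")  # new endpoint
    return kept
```

With the area origin at (630, 0), points with abscissa below 630 would be dropped and the first remaining MOVE re-tagged as DOWN, as in the example.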
In some embodiments, the first trajectory is drawn from inside the first screen projection area to outside it, with the relative-position starting point of the first screen projection area at coordinates (630, 0); based on the coordinate value conversion scheme of the above embodiment, this first trajectory finally yields a new trajectory relative to the first screen projection area, whose touch points are expressed as follows:
Point 1 (x=10.3, y=370.0833, DOWN, extra1),
Point 2 (x=9.3, y=371.80246, MOVE, extra1),
Point 3 (x=8.3, y=371.80246, MOVE, extra1),
Point 4 (x=6.3, y=373.10000, MOVE, extra1),
Point 5 (x=0.3, y=372.80246, MOVE, extra1),
Point 6 (x=-3.3, y=372.80246, MOVE, extra1),
....
Point N (x=-8.0, y=377.14832, UP, extra1);
Since point 6 and the subsequent touch points all have abscissas less than 0, point 6 and the subsequent touch points are determined to be touch events occurring outside the first screen projection area and are discarded; meanwhile, the direction parameter of the last touch point within the screen projection area is changed from MOVE to UP, and the new trajectory thus acquired is the second trajectory, whose touch points are expressed as follows:
New point 1 (x=10.3, y=370.0833, DOWN, extra1),
New point 2 (x=9.3, y=371.80246, MOVE, extra1),
New point 3 (x=8.3, y=371.80246, MOVE, extra1),
New point 4 (x=6.3, y=373.10000, MOVE, extra1),
New point 5 (x=0.3, y=372.80246, UP, extra1);
The first controller sends the second trajectory to the first screen projection device; the second trajectory is injected as event information into the system of the first screen projection device to reversely control it, as shown in Figure 6G.
Figure 7A is a schematic diagram of user interfaces of a display device and a screen projection device provided by another embodiment of this application.
When multiple screen projection devices cast simultaneously and the screen projection areas in the third user interface keep being repositioned, the problem of user response at the boundaries of the different screen projection areas needs to be solved; a solution is proposed below.
Compared with the display device shown in Figure 5A, the display device of this embodiment can accept casting from multiple screen projection devices; its third user interface further includes a second screen projection area for synchronously displaying the second user interface of a second screen projection device; correspondingly, when the user operates the user interfaces within the first and second screen projection areas respectively, the corresponding first and second screen projection devices can be reversely controlled respectively.
When the first screen projection device casts to the third user interface, or when the first and second screen projection devices cast to the third user interface simultaneously, the first controller promptly creates a new layer; the new layer can be implemented, for example, as a transparent template layer, which is usually set at the top of the third user interface and used to monitor the touch events occurring on the third user interface, as shown in Figure 8C.
Since the new layer is set to be transparent and placed on top, it can monitor touch events occurring on the third user interface immediately, without blocking the content displayed by other layers.
Based on the full-screen transparent template layer, the first controller can acquire the touch events occurring on the third user interface; the transparent template layer is set above the layer where the whole screen projection area is located, and can directly monitor and acquire all touch events, enabling monitoring of touch events outside the different screen projection areas.
For example, when the user inputs a touch trajectory for controlling the first screen projection device or the second screen projection device, by monitoring the transparent template layer the first controller can acquire the first trajectory generated by the user on the transparent template layer, such as the first trajectory (ABCDE) for controlling the first user interface, or the first trajectory (FGHIJ) for controlling the second user interface;
Then, based on the screen touch sampling frequency, the first touch point sequence contained in the first trajectory is acquired; as in the preceding embodiments, when the first trajectory exceeds a screen projection area, the first controller determines, according to the relative positional relationships of the first screen projection area, the second screen projection area, and the transparent template layer in the third user interface, the second trajectory formed by the first trajectory in each screen projection area, and sends the determined second trajectory to the corresponding screen projection device;
For example, the second trajectory (CDE) is determined based on the first trajectory (ABCDE), and the second trajectory (HIJ) based on the first trajectory (FGHIJ); the second trajectories are then sent to the first screen projection device or the second screen projection device respectively; the second controller controls its user interface to update its display based on the second trajectory, and projects the updated user interface to the corresponding screen projection area of the third user interface.
The scenario described in the above embodiment is one in which the user inputs first trajectories successively to operate the first and second user interfaces, each input touch trajectory controlling a single screen projection device; in actual application scenarios, the user may also simultaneously input first trajectories for controlling different screen projection devices, in which case the first controller determines the second trajectories distributed over the different screen projection areas and sends the different parts of the second trajectory to the different screen projection devices respectively, enabling the display device to reversely control different screen projection devices simultaneously.
In some embodiments, after receiving screen projection requests from multiple screen projection devices, the display device establishes connections with them respectively; based on the screen projection protocol, the multiple screen projection devices cast their user interfaces to the display device, and layout settings concerning the screen projection areas are made on the display device's user interface.
In some embodiments, the first trajectory may run through both the first screen projection area and the second screen projection area, as in the user interface shown in Figure 7B.
In this scenario, according to the second-trajectory determination scheme of the preceding embodiments, the determined second trajectory comprises two parts: the trajectory (CDEF) within the first screen projection area and the trajectory (HIJK) within the second screen projection area;
After the second trajectory is determined, the first controller sends the trajectory within each screen projection area to its corresponding screen projection device, i.e. sends the trajectory (CDEF) to the first screen projection device and the trajectory (HIJK) to the second screen projection device, so that through one first trajectory the user can control the first and second screen projection devices simultaneously, as shown in Figure 7B.
In some embodiments, when the first trajectory passes through the first screen projection area and then the second screen projection area in sequence, the first trajectory is configured to take effect only for the first screen projection area.
For example, while the transparent template layer monitors the first trajectory, the first controller synchronously acquires the first touch point sequence it contains. As the first trajectory goes from touch point A through points B, C, D, E, and F and reaches point G, based on the relative positional relationships of the first screen projection area, the second screen projection area, and the transparent template layer, the first controller determines that the first screen projection area is the first screen projection area the first trajectory passes through, and that point G is the first touch point of the first trajectory that exits the first screen projection area after having passed through it;
Based on this determination, the second touch point sequence is generated: point C, point D, point E, point F; the second touch point sequence does not include point G, i.e. it does not contain the first touch point to exit the first screen projection area, so the second touch point sequence forms the final second trajectory. The second trajectory (CDEF) is sent only to the first screen projection device for reverse control, and this touch operation has no effect on the second screen projection device, as shown in Figure 7C.
In some embodiments, the third user interface also displays other applications during screen projection, and the first controller can create multiple layers to display different elements respectively; for example, after receiving the screen projection message of the first screen projection device or of multiple screen projection devices, the first controller creates the first layer, as shown in Figure 7D; the first layer can display the first and second screen projection areas, and is displayed above the second layer and below the transparent template layer.
In Figure 7D, the second layer of the third user interface can display active foreground applications other than the screen projection interface, such as a video playback application or a smart doorbell application; the second layer can also be used to display the system layer. When the first and second screen projection areas are displayed on the third user interface, as in the scheme described above, the first controller needs to determine the relative positional relationships among the first screen projection area, the second screen projection area, and the second layer;
The second layer and first layer generate the third user interface by superposed display, with the second layer laid flat across the third user interface. It should be noted that in some embodiments the second layer is configured for non-full-screen display, in which case the first touch point of the first trajectory may fall on a third layer; in that scenario the first trajectory is monitored and acquired by the third layer, and the corresponding scheme for determining the second trajectory is similar to that described above and is not repeated here.
In some embodiments, in the process of determining the relative positional relationships, the vertex coordinates of the first and second screen projection areas in the third user interface are first acquired, as well as the second coordinates in the third user interface of the touch points contained in the first trajectory, the vertex coordinates including the coordinates of diagonal vertices.
As in the device user interface shown in Figure 7E, when the first and second screen projection areas are implemented as rectangles, the lower-left vertex coordinates of the first screen projection area can be expressed as (X1, Y1) and its upper-right vertex coordinates as (X2, Y2); the lower-left vertex coordinates of the second screen projection area can be expressed as (X3, Y3) and its upper-right vertex coordinates as (X4, Y4);
The second coordinates of the touch points contained in the first trajectory (ABCDEFGHIJK) can be expressed as: point A (XA, YA), point B (XB, YB), point C (XC, YC), point D (XD, YD), point E (XE, YE), point F (XF, YF), point G (XG, YG), point H (XH, YH), point I (XI, YI), point J (XJ, YJ), point K (XK, YK);
By comparing the second coordinates with the vertex coordinates, the touch points within each vertex-coordinate region can be determined; the touch points determined to be within the first screen projection area include: point C, point D, point E, point F; the touch points determined to be within the second screen projection area include: point H, point I, point J, point K; the second trajectory comprises two parts, the trajectory (CDEF) and the trajectory (HIJK).
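The comparison above, which assigns each touch point of one trajectory to the projection area containing it, can be sketched as follows; the `regions` mapping and the function name are hypothetical illustrations of the scheme:

```python
def split_by_regions(points, regions):
    """Partition the touch points of one trajectory over several projection
    areas, so each sub-trajectory can be sent to its own projecting device.
    `regions` maps an area name to its (bottom_left, top_right) diagonal
    corners; `points` is a list of (x, y) tuples."""
    out = {name: [] for name in regions}
    for x, y in points:
        for name, ((x1, y1), (x2, y2)) in regions.items():
            if x1 <= x <= x2 and y1 <= y <= y2:
                out[name].append((x, y))
                break  # a point belongs to at most one projection area
    return out
```

Points falling in neither rectangle (such as the segments between the two areas) are dropped, and each per-area list becomes the second trajectory for that device.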
Based on the above descriptions of the display device, the screen projection device, the device control scheme based on trajectory extraction, the multi-channel device control scheme based on trajectory extraction, and the related drawings, this application also provides a device control method based on trajectory extraction and a multi-channel device control method based on trajectory extraction. The methods have been described in detail in the specific steps of the display device reversely controlling the screen projection device, and are not repeated here.
The beneficial effect of this part of the embodiments is that, by constructing the relative positional relationship, the position of the screen projection area in the display device's user interface can be determined; further, by constructing the first trajectory, the complete touch operation input by the user can be acquired; further, by constructing the second trajectory, the valid touch operation to be sent to the screen projection device can be determined, enabling touch operations occurring in the boundary region of the screen projection interface, or starting outside its boundary, to be responded to promptly by the screen projection device.
By constructing the transparent template layer, all touch operations occurring on the display device's user interface can be monitored; further, by constructing the first trajectory, the complete touch operation input by the user can be acquired; further, by constructing the second trajectory, the valid touch operations to be sent to the screen projection devices can be determined, enabling touch operations occurring in the boundary regions of multiple screen projection interfaces, or starting outside those boundaries, to be responded to promptly by the corresponding screen projection devices.
Figure 9A is a schematic diagram of the touch points contained in the respective touch trajectories of another display device and screen projection device according to an embodiment of this application.
In some embodiments, when the user reversely controls the screen projection device, the first controller acquires the touch trajectory generated by the user operating the second user interface; the touch trajectory contains a first touch point sequence collected based on the first touch sampling frequency of the display device;
It can be understood that if the user inputs the same touch trajectory on the screen projection device, the second controller of the screen projection device acquires the touch trajectory generated by the user operating the first user interface, and the touch point sequence it contains is collected based on the screen touch sampling frequency.
It can be understood that, since the touch sampling frequency determines the number of touch points the user interface can collect per unit time and the sampling event interval between adjacent touch points, when the first touch sampling frequency of the display device differs from the touch sampling frequency of the screen projection device and the user inputs the same touch trajectory on each, the display device and the screen projection device generate touch trajectories containing different numbers of touch points.
As in the comparison shown in Figure 9A, the user inputs the same touch trajectory on the display device and on the screen projection device, the touch operations taking the same time and following the same path; assuming the display device's first touch sampling frequency is 60 Hz and the screen projection device's touch sampling frequency is 120 Hz, the display device collects the touch points contained in the touch trajectory at an approximately average interval of 15 ms; for the 5 touch events at points A, B, C, D, and E, the interval between adjacent touch events is about 15 ms, yielding the touch trajectory (ABCDE);
Correspondingly, the first user interface of the screen projection device collects touch points of the input trajectory at an approximately average interval of 7.5 ms; that is, during the user's touch operation, for the 9 touch events at points A, B, C, D, E, F, G, H, and I, the interval between adjacent touch events is about 7.5 ms, yielding the screen projection device's touch trajectory (ABCDEFGHI).
It can be seen that a device's touch sampling frequency determines the touch points contained in the touch trajectory during the user's touch operation; the higher the touch sampling frequency, the shorter the interval at which touch points are collected and the more touch points are collected per unit time; if a touch trajectory acquired by the display device at a lower touch sampling frequency is injected directly into the screen projection device, the screen projection device will respond insensitively.
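The relationship between the sampling interval and the number of collected points can be checked with a small sketch; the function and its parameters are illustrative only:

```python
def expected_points(duration_ms, interval_ms):
    """Touch points collected over a gesture: one per sampling interval plus
    the starting point. With the ~15 ms interval of a 60 Hz screen, a 60 ms
    swipe yields 5 points; at ~7.5 ms (120 Hz) the same swipe yields 9."""
    return int(duration_ms / interval_ms) + 1
```

This matches the 5-point versus 9-point comparison of Figure 9A: the same gesture carries almost twice as much positional information at the higher sampling frequency.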
In some embodiments, after the display device receives a screen projection message containing the screen touch sampling frequency sent by the screen projection device, the first controller compares the screen touch sampling frequency with the first touch sampling frequency of the display device to determine whether the screen touch sampling frequency is greater than the first touch sampling frequency.
For example, the display device's first touch sampling frequency is 60 Hz, and the received screen projection message contains a screen touch sampling frequency of 120 Hz; after comparison, the first controller determines that the screen touch sampling frequency is greater than the first touch sampling frequency.
In some embodiments, when the first touch sampling frequency of the display device is less than the screen touch sampling frequency, after the user inputs on the display device a touch operation for controlling the screen projection device, the touch sampling frequency of the touch trajectory can be raised from the first touch sampling frequency to a second touch sampling frequency by adding correction points, to improve the response sensitivity of the screen projection device.
In theory, as the touch sampling frequency increases, the screen projection device's response sensitivity to the user's input touch operations increases accordingly; raising the touch sampling frequency of the touch trajectory can therefore to a certain extent solve the problem of low response sensitivity of the screen projection device caused by the difference in touch sampling frequencies; owing to visual limitations, users are not perceptually sensitive to particularly slight low latencies, so the raised second touch sampling frequency may be lower than, equal to, or higher than the screen touch sampling frequency, but must be greater than the first touch sampling frequency.
In some embodiments, when the first touch sampling frequency of the display device is less than the screen touch sampling frequency, after the user inputs on the display device a touch operation for controlling the screen projection device, the touch sampling frequency of the touch trajectory can be raised by adding correction points, so that the touch trajectory sent to the screen projection device can satisfy the screen projection device's touch sampling frequency requirement, e.g. the correction points make the second touch sampling frequency of the touch trajectory greater than or equal to the screen touch sampling frequency, to improve the response sensitivity of the screen projection device.
For example, the display device's first touch sampling frequency is 60 Hz and the screen touch sampling frequency is 120 Hz; the first touch point sequence contained in the touch trajectory (ABCDE) input by the user on the display device is: point A, point B, point C, point D, point E;
In the reverse control process, to solve the problem of insensitive response of the screen projection device caused by the difference in touch sampling frequencies, the first controller can add correction points to the first touch point sequence, the correction points including point 1, point 2, point 3, and point 4, thereby obtaining a second touch point sequence containing more touch points: point A, point 1, point B, point 2, point C, point 3, point D, point 4, point E;
For the touch trajectory after the correction points are added, its touch sampling frequency is corrected from the first touch sampling frequency to the second touch sampling frequency, where the number of correction points required must make the second touch sampling frequency greater than or equal to the screen touch sampling frequency of 120 Hz, as shown in Figure 9B.
In some embodiments, after the display device acquires the screen touch sampling frequency, it determines a first quantity of correction points based on a comparison of the first touch sampling frequency and the screen touch sampling frequency.
As shown in Figure 9B, the first touch sampling frequency is 60 Hz, with a collection interval of about 15 ms between adjacent touch events, so the display device takes about 60 ms in total to acquire the 5 touch points contained in the touch trajectory (ABCDE); the screen touch sampling frequency is 120 Hz, with a collection interval of about 7.5 ms between adjacent touch events, so the screen projection device can acquire about 9 touch points within that 60 ms period; it should be noted that, in some embodiments, subject to the constraints of the touch sampling frequency and the system sampling frequency, the first controller can preset the sampling interval of touch points within a certain range, for example setting the collection interval to 15 ms or 10 ms within the constrained range.
To make the second touch sampling frequency of the converted touch trajectory greater than or equal to the screen touch sampling frequency, the first quantity needs to be greater than or equal to 4, i.e. at least 4 correction points need to be added to the first touch point sequence;
Assuming the touch points contained in the first touch point sequence are evenly distributed, the first controller can insert the 4 correction points evenly into the first touch point sequence to obtain the second touch point sequence: point A, point 1, point B, point 2, point C, point 3, point D, point 4, point E, so that the second touch sampling frequency of the touch trajectory is greater than or equal to the screen touch sampling frequency, as shown in Figure 9B.
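For the uniform-spacing case just described, inserting one interpolated correction point between every adjacent pair doubles the effective sampling rate (e.g. 60 Hz to 120 Hz). A minimal sketch, assuming points are `(x, y)` tuples and a hypothetical function name:

```python
def upsample_by_midpoints(points):
    """Insert one linearly interpolated correction point between every pair
    of adjacent touch points, doubling the number of intervals and hence the
    effective touch sampling rate for a uniformly sampled trajectory."""
    out = [points[0]]
    for a, b in zip(points, points[1:]):
        mid = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
        out.append(mid)   # correction point between a and b
        out.append(b)
    return out
```

Applied to the 5-point sequence A..E this yields the 9-point sequence A, 1, B, 2, C, 3, D, 4, E described above.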
The first controller then sends the touch trajectory at the second touch sampling frequency to the screen projection device; the second controller of the screen projection device receives the touch trajectory containing the second touch point sequence sent by the display device, and the second touch sampling frequency of this trajectory is greater than or equal to the touch sampling frequency of the screen projection device itself.
Based on the touch trajectory containing the second touch point sequence, the second controller controls the first user interface of the screen projection device to update its display, and projects the updated first user interface to the display device again, so that the user controls the screen projection device by operating the second user interface of the display device. It should be noted that the dotted trajectory, touch points, and correction points shown in the first user interface of the screen projection device in Figure 9B indicate that the touch operation is injected directly into the screen projection device as control data, and the first user interface does not actually display the touch trajectory itself.
In some embodiments, in the process of acquiring the touch trajectory at the second touch sampling frequency, the correction points added to the first touch point sequence may come from sampling points between adjacent touch points, the sampling points being acquired based on the system sampling frequency.
The touch trajectory (ABCDE) shown in Figure 9C includes touch points shown as solid and sampling points shown as hollow; that is, the touch trajectory is composed of all the sampling points, and the sampling points include all the touch points.
The first controller acquires the solid touch points based on the first touch sampling frequency, i.e. the first touch sampling frequency determines the touch points contained in the touch trajectory during the user's manipulation; while the user inputs the touch operation, the first controller can also acquire all sampling points of the touch trajectory based on the system sampling frequency, and all the sampling points are recorded in the system database;
It can be understood that the display device can acquire and record, based on the system sampling frequency, all the sampling points contained in the touch trajectory, and these include all the touch points; that is, the touch points are a subset of the sampling points, and accordingly the display device's system sampling frequency is usually higher than its first touch sampling frequency.
In some embodiments, in the process of adding correction points to the first touch point sequence, the sampling points between adjacent touch points in the first touch point sequence are acquired based on the system sampling frequency; the first controller then filters these sampling points, takes those whose distances to the adjacent touch points on both sides are greater than or equal to a preset threshold as first sampling points, and adds the first sampling points to the first touch point sequence as correction points.
For example, for the touch trajectory (AE) acquired by the display device in Figure 9D, the first touch point sequence is: point A, point E; among the sampling points acquired based on the system sampling frequency, between touch point A and touch point E there are also a first sampling point C and a second sampling point D;
When only one correction point needs to be inserted, the sampling points can be filtered with the preset threshold L: the distance between a sampling point and a touch point is compared with the preset threshold L; when the distance is smaller than the preset threshold L, the sampling point and the touch point can be approximately considered to coincide; when the distance is greater than or equal to the preset threshold L, the sampling point can be inserted into the first touch point sequence as a correction point;
In the second user interface of the display device in Figure 9D, the second sampling point D is close to touch point E; when the distance is less than the preset threshold L, the second sampling point D and point E can be approximately considered to coincide, and the second sampling point D should be removed; the distances from the first sampling point C to touch point A and to touch point E are both greater than or equal to the preset threshold L, so the first sampling point C can be added to the first touch point sequence as a correction point;
The acquired second touch point sequence includes: point A, point C, point E; the touch sampling frequency of the touch trajectory is raised from the first touch sampling frequency to the second touch sampling frequency, which is greater than or equal to the screen touch sampling frequency of 120 Hz.
It should be noted that the preset threshold L can be set according to the screen touch sampling frequency; by adjusting the size of the preset threshold L, the number of correction points inserted between adjacent touch points can be adjusted, and the number of correction points contained in the touch trajectory affects its second touch sampling frequency, as shown in Figure 9D.
In some embodiments, in the step of adding correction points to the first touch point sequence, the first controller may acquire all sampling points between adjacent touch points in the first touch point sequence and add all of them to the first touch point sequence as correction points, obtaining a touch trajectory with the second touch point sequence. In some embodiments, the touch points contained in the touch trajectory can be acquired by the display device user interface monitoring the touch events (ACTION_MOVE) occurring on the responding layer; for the sampling points acquired and recorded between adjacent touch points based on the system sampling frequency, a set of sampling-point touch events can be obtained through a historical-data acquisition function, the set containing all sampling points existing between the touch points; the distance of a sampling point from other sampling points and touch points can be determined from its abscissa and ordinate; the sampling points are filtered based on this distance, and if there are multiple sampling points between adjacent touch points, the sampling points with relatively large spacing are inserted between the adjacent touch points as correction points, so that a sampling point is inserted after the preceding touch point is emitted and before the following touch point is emitted, increasing the amount of information contained in the touch trajectory.
For example, the first touch point sequence contained in the touch trajectory is represented as follows:
Point A (x=279.25854, y=216.84871, DOWN, extra1),
Point B (x=277.9558, y=216.84871, MOVE, extra1),
Point C (x=272.51184, y=217.24818, MOVE, extra1),
Point D (x=267.77722, y=217.64766, MOVE, extra1),
...
Point N (x=240.90857, y=220.37743, UP, extra1);
In the process of converting the touch trajectory from the first touch sampling frequency to the second touch sampling frequency, the historical information between touch point B and touch point C is queried, i.e. the sample points recorded at the system sampling frequency, yielding the following:
First sample point (x=274.524, y=217.11502);
Second sample point (x=272.51184, y=217.24818);
Comparing the above sample points with touch point B and touch point C, it can be determined that the first sample point is usable; the second sample point duplicates touch point C, or the distance between the second sample point and touch point C is extremely small, so that the difference between their coordinate values is negligible and the second sample point can be ignored. In some embodiments, this distance difference can be defined as the preset threshold L, which can be adjusted according to the specific situation to regulate the number of sample points added between touch points.
After the correction point is added to the first touch point sequence, the resulting second touch point sequence is represented as follows:
Point A (x=279.25854, y=216.84871, DOWN, extra1),
Point B (x=277.9558, y=216.84871, MOVE, extra1),
First correction point (x=274.524, y=217.11502, MOVE, extra1),
Point C (x=272.51184, y=217.24818, MOVE, extra1),
Point D (x=267.77722, y=217.64766, MOVE, extra1),
...
Point N (x=240.90857, y=220.37743, UP, extra1);
The touch trajectory containing the second touch point sequence is then sent to and injected into the screen projection device. Since the second touch sampling frequency of the trajectory is greater than or equal to the touch sampling frequency of the screen projection device itself, the control effect is consistent with the user operating the screen projection device directly, which to some extent solves the problem of sluggish response when reverse-controlling a device with a high touch sampling frequency.
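The filtering rule in the example above (keep a recorded sample point only when it lies at least the preset threshold L away from both adjacent touch points) can be sketched as follows. The function names, tuple layout, and the threshold value 0.5 are illustrative assumptions, not values from the patent.

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def filter_correction_points(prev_touch, next_touch, samples, threshold_l):
    """Keep only the sample points at least threshold_l away from
    both the preceding and the following touch point."""
    return [s for s in samples
            if dist(s, prev_touch) >= threshold_l
            and dist(s, next_touch) >= threshold_l]

# Coordinates taken from the B/C worked example above.
point_b = (277.9558, 216.84871)
point_c = (272.51184, 217.24818)
samples = [(274.524, 217.11502),      # first sample point: kept
           (272.51184, 217.24818)]    # second sample point: coincides with C, dropped
kept = filter_correction_points(point_b, point_c, samples, threshold_l=0.5)
print(kept)  # [(274.524, 217.11502)]
```

The kept points would then be spliced between points B and C as MOVE events, producing the second touch point sequence shown above.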
In some embodiments, in the process of adding the first sample point to the first touch point sequence as a correction point, the first controller is further configured to detect whether a second sample point exists, where the distance from the second sample point to the first sample point is greater than or equal to the preset threshold, and the distance from the second sample point to the adjacent touch point on its other side is also greater than or equal to the preset threshold; if such a second sample point exists, it is also added to the first touch point sequence as a correction point.
For example, for the touch trajectory (AE) acquired by the display device in FIG. 9E, between touch point A and touch point E there are also sample points acquired at the system sampling frequency: a first sample point B, a second sample point C, and a third sample point D.
For the first sample point B, its distances to touch point A and to touch point E are both greater than the preset threshold L, so the first sample point B is added to the first touch point sequence as a correction point. For the second sample point C, its distances to the first sample point B and to touch point E are both greater than the preset threshold L, so the second sample point C is also added as a correction point. For the third sample point D, it is close to touch point E, with a distance smaller than the preset threshold L; point D can be regarded as approximately coinciding with point E, so the third sample point D should be removed.
After the first sample point B and the second sample point C are added to the first touch point sequence as correction points, the second touch point sequence is obtained, represented as: point A, point B, point C, point E.
The sampling frequency of the touch trajectory containing the second touch point sequence is likewise raised from the first touch sampling frequency to the second touch sampling frequency, which is greater than or equal to the screen touch sampling frequency of 120 Hz.
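A minimal sketch of the chained variant above: walk the sample points in order and accept a sample only if it is at least the preset threshold L away from the last accepted point and from the trailing touch point. The function names and the straight-line coordinates are illustrative assumptions.

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def chain_filter(start, end, samples, threshold_l):
    """Greedily accept sample points that keep at least threshold_l of
    spacing from the previously accepted point and from the end point."""
    accepted = []
    last = start
    for s in samples:
        if dist(s, last) >= threshold_l and dist(s, end) >= threshold_l:
            accepted.append(s)
            last = s
    return [start] + accepted + [end]

# A straight-line A -> E trajectory with samples B, C, D as in FIG. 9E;
# D sits close to E and is dropped.
a, e = (0.0, 0.0), (10.0, 0.0)
b, c, d = (3.0, 0.0), (6.0, 0.0), (9.5, 0.0)
seq = chain_filter(a, e, [b, c, d], threshold_l=1.0)
print(seq)  # [(0.0, 0.0), (3.0, 0.0), (6.0, 0.0), (10.0, 0.0)]
```

This yields the A, B, C, E sequence of the example: B and C survive the threshold test while D is merged into E.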
As another example, the first touch point sequence contained in the touch trajectory is represented as follows:
Point A (x=279.25854, y=216.84871, DOWN, extra1),
Point B (x=277.9558, y=216.84871, MOVE, extra1),
Point C (x=272.51184, y=217.24818, MOVE, extra1),
Point D (x=267.77722, y=217.64766, MOVE, extra1),
...
Point N (x=240.90857, y=220.37743, UP, extra1);
Between touch point C and touch point D, the sample points recorded at the system sampling frequency are queried, yielding the following:
First sample point (x=270.26288, y=217.38135);
Second sample point (x=269.4099, y=217.51399);
Third sample point (x=267.77722, y=217.64766);
Multiple sample points exist between the adjacent touch points, and they can be filtered according to the actual situation. Following the sample-point filtering scheme described above, comparing the sample points with touch point C and touch point D shows that the first sample point and the second sample point are usable; the third sample point duplicates touch point D, or the distance between the third sample point and touch point D is extremely small, so the third sample point can be ignored.
After the correction points are added, the second touch point sequence of the touch trajectory is represented as follows:
Point A (x=279.25854, y=216.84871, DOWN, extra1),
Point B (x=277.9558, y=216.84871, MOVE, extra1),
Point C (x=272.51184, y=217.24818, MOVE, extra1),
First correction point (x=270.26288, y=217.38135, MOVE, extra1),
Second correction point (x=269.4099, y=217.51399, MOVE, extra1),
Point D (x=267.77722, y=217.64766, MOVE, extra1),
...
Point N (x=240.90857, y=220.37743, UP, extra1);
The touch trajectory containing the second touch point sequence is sent to and injected into the screen projection device.
In some embodiments, to make the control experience of input on the display device closer to operating the screen projection device directly, after the screen projection device receives the touch trajectory, the second controller removes the last touch point from its second touch point sequence to obtain a new touch trajectory.
This mechanism simulates the control effect of a user pressing the screen and moving a finger with acceleration when operating the screen projection device directly. The difference between the display device and the screen projection device lies in the number of touch events collected per unit time. The screen projection device applies response-acceleration processing to the touch events sent by the display device; this processing does not affect the user's touch trajectory, it merely discards the last touch event and treats the final touch event of the remaining touch point sequence as the touch end point (ACTION_UP event), simulating the acceleration-processing scenario of operating the screen projection device directly and making the user's control closer to direct operation.
The second controller of the screen projection device controls the first user interface to update its display based on the new touch trajectory, so that the screen projection device accelerates processing of the touch trajectory input by the user.
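The acceleration step above (drop the final touch point and re-mark the new tail as the touch end point) can be sketched as follows. The event-tuple layout mirrors the DOWN/MOVE/UP labels used in the listings above but is otherwise an assumption for illustration.

```python
ACTION_DOWN, ACTION_MOVE, ACTION_UP = "DOWN", "MOVE", "UP"

def accelerate(sequence):
    """Remove the last touch point and mark the new last event as the
    ACTION_UP terminator, leaving the input sequence unmodified."""
    if len(sequence) < 2:
        return list(sequence)
    trimmed = sequence[:-1]          # drop the final touch event
    x, y, _ = trimmed[-1]
    trimmed[-1] = (x, y, ACTION_UP)  # re-mark the new tail as the end point
    return trimmed

trajectory = [(0.0, 0.0, ACTION_DOWN),
              (1.0, 0.5, ACTION_MOVE),
              (2.0, 1.0, ACTION_MOVE),
              (3.0, 1.5, ACTION_UP)]
new_trajectory = accelerate(trajectory)
print(new_trajectory[-1])  # (2.0, 1.0, 'UP')
```

Because only the terminal event changes, the visible shape of the trajectory is preserved while the gesture ends one sample earlier, approximating the flick-style acceleration of direct operation.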
It should be noted that, in the reverse-control scenario, after the display device receives the screen projection message sent by the screen projection device, it establishes a connection with the screen projection device. Once the connection succeeds, the screen projection device records the first user interface according to the screen projection protocol and sends the data stream via the protocol transport to the second user interface of the display device, and the display device lays out and presents the screen projection area to be displayed, as shown in FIG. 10A.
In some embodiments, the screen projection area in the second user interface of the display device, or the layer containing it, can support user touch operations. During establishment of the screen projection, the screen projection device may be configured to send signaling to the display device to confirm whether the display device supports the reverse-control function; once the protocol confirms that the display device supports reverse control, the touch trajectory input by the user in the screen projection area is sent to the screen projection device, where it serves as input operation data.
The above display device or screen projection device may also be implemented as the same type of terminal device, such as a large-screen terminal, a mobile phone, a tablet computer, a computer, or another terminal. In addition, the overall flow of controlling a device based on sampling-frequency correction described in this application is applicable to any scenario where a difference in touch sampling frequency causes sluggish response, as shown in FIG. 10B.
Based on the above technical solutions for touch sampling frequency correction by the display device and the screen projection device, and the related drawings, this application further provides a device control method based on sampling-frequency correction. The method has been described in detail in the specific steps of touch sampling frequency correction implemented by the display device and the screen projection device, and is not repeated here.
The beneficial effect of the embodiments in this part is that, by acquiring the screen touch sampling frequency during screen projection, it can be determined whether the touch sampling frequency of the touch trajectory needs to be corrected; further, by constructing correction points, the second touch point sequence can be obtained; and by constructing the second touch sampling frequency, the sampling frequency of the touch trajectory can be raised, the amount of data contained in the touch trajectory increased, the response sensitivity of the screen projection device improved, and a control experience identical to operating the screen projection device directly achieved.
For convenience of explanation, the above description has been given with reference to specific embodiments. However, the exemplary discussion above is not intended to be exhaustive or to limit the embodiments to the specific forms disclosed. Many modifications and variations are possible in light of the above teachings. The embodiments were chosen and described in order to better explain the principles and practical applications, thereby enabling those skilled in the art to better use the described embodiments and the various modified embodiments suited to the particular use contemplated.

Claims (37)

  1. A display device, comprising:
    a display, configured to display a third user interface containing a first screen projection area, the first screen projection area being used to synchronously display a first user interface of a first screen projection device;
    a first controller, configured to:
    when the first user interface is displayed on the third user interface, determine a relative positional relationship between the first screen projection area and the third user interface;
    based on the third user interface listening for touch events, acquire a first trajectory generated when a user operates the third user interface;
    according to the relative positional relationship, determine a second trajectory formed by the first trajectory within the first screen projection area, and send the second trajectory to the first screen projection device;
    wherein the second trajectory is used by the first screen projection device to control the first user interface, so that the user controls the first user interface of the first screen projection device by operating the third user interface.
  2. The display device of claim 1, wherein, in the step of determining the relative positional relationship, the first controller is further configured to:
    after receiving a screen projection message sent by the first screen projection device, control a first layer to display the first screen projection area;
    when the first screen projection area is displayed on the third user interface, determine the relative positional relationship between the first screen projection area and a second layer, the second layer and the first layer being displayed in superposition to generate the third user interface.
  3. The display device of claim 2, wherein, in the step of determining the second trajectory according to the relative positional relationship, the first controller specifically:
    when the second layer first detects a touch event, acquires the first trajectory generated by the user operating the second layer;
    acquires, according to a screen touch sampling frequency, a first touch point sequence contained in the first trajectory;
    based on the relative positional relationship, extracts the touch points of the first touch point sequence that fall within the first screen projection area of the first layer to generate a second touch point sequence, the second touch point sequence constituting the second trajectory.
  4. The display device of claim 2, wherein, in the step of determining the second trajectory according to the relative positional relationship, the first controller specifically:
    when the first layer first detects a touch event, acquires the first trajectory generated by the user operating the first layer;
    acquires, according to a screen touch sampling frequency, a first touch point sequence contained in the first trajectory;
    based on the relative positional relationship, removes the touch points of the first touch point sequence that fall outside the first screen projection area in the first layer to generate a second touch point sequence, the second touch point sequence constituting the second trajectory.
  5. The display device of claim 4, wherein, in the step of determining the second trajectory according to the relative positional relationship, the first controller is further configured to:
    synchronously acquire the first touch point sequence contained in the first trajectory while acquiring the first trajectory;
    based on the relative positional relationship, upon detecting the first touch point of the first touch point sequence that falls outside the first screen projection area in the first layer, immediately generate the second touch point sequence, the second touch point sequence excluding that first touch point.
  6. The display device of claim 1, wherein, in the step of determining the second trajectory according to the relative positional relationship, the first controller specifically:
    acquires vertex coordinates of the first screen projection area in the third user interface, and acquires first coordinates, in the third user interface, of the touch points contained in the first trajectory;
    compares the first coordinates with the vertex coordinates to determine the touch points located within the region bounded by the vertex coordinates, these touch points constituting the second trajectory.
  7. The display device of claim 2, wherein, in the step of controlling the first layer to display the first screen projection area, the first controller is further configured to:
    determine the size of the first layer according to the size of the first screen projection area to be loaded, and adjust the position of the first layer so as to adjust the screen projection position of the first screen projection device in the third user interface.
  8. The display device of claim 1, wherein the third user interface further contains a second screen projection area, the second screen projection area being used to synchronously display a second user interface of a second screen projection device;
    the first controller is further configured to:
    when a screen projection device projects to the third user interface, create a transparent template layer that is placed on top of the third user interface and listens for touch events;
    based on the listening, acquire the first trajectory generated on the transparent template layer when the user operates the third user interface;
    according to the relative positional relationships of the first screen projection area, the second screen projection area, and the transparent template layer in the third user interface, determine the second trajectory formed by the first trajectory in a screen projection area, and send the second trajectory to the corresponding screen projection device;
    wherein the screen projection area includes the first screen projection area and the second screen projection area, and the second trajectory is used by the screen projection device to control its user interface, so that the user controls the screen projection device corresponding to the screen projection area by operating the third user interface.
  9. The display device of claim 8, wherein, in the step of sending the determined second trajectory to the corresponding screen projection device, the first controller is further configured to:
    when the second trajectory forms trajectories in both the first screen projection area and the second screen projection area, send the trajectory within each screen projection area to its corresponding screen projection device, so that the user simultaneously controls the first user interface and the second user interface through the input first trajectory.
  10. The display device of claim 8, wherein, in the step of determining the second trajectory according to the relative positional relationship, the first controller specifically:
    when the transparent template layer detects a touch event, acquires the first trajectory generated by the user operating the transparent template layer;
    acquires, according to a screen touch sampling frequency, a first touch point sequence contained in the first trajectory;
    based on the relative positional relationship, extracts the touch points of the first touch point sequence that fall within the first screen projection area or the second screen projection area to generate a second touch point sequence, the second touch point sequence constituting the second trajectory.
  11. The display device of claim 10, wherein, in the step of determining the second trajectory according to the relative positional relationship, the first controller is further configured to:
    synchronously acquire the first touch point sequence contained in the first trajectory while acquiring the first trajectory;
    based on the relative positional relationship, upon detecting that the first trajectory passes through any screen projection area and that a first touch point of its first touch point sequence falls outside the first screen projection area or the second screen projection area, immediately generate the second touch point sequence, the second touch point sequence excluding that first touch point.
  12. The display device of claim 8, wherein, in the step of determining the second trajectory according to the relative positional relationship, the first controller specifically:
    acquires vertex coordinates of the first screen projection area and the second screen projection area in the third user interface, the vertex coordinates including coordinates of diagonal vertices, and acquires second coordinates, in the third user interface, of the touch points contained in the first trajectory;
    compares the second coordinates with the vertex coordinates to determine the touch points located within the region bounded by the vertex coordinates, these touch points constituting the second trajectory.
  13. The display device of claim 9, wherein a user interface acquires, at a first touch sampling frequency, a touch trajectory containing a first touch point sequence; the user interface includes the first user interface, the second user interface, and the third user interface; a touch sampling frequency determines the touch points contained in a touch trajectory;
    the first controller is further configured to:
    after receiving the screen projection message sent by a screen projection device, acquire a screen touch sampling frequency;
    when the first touch sampling frequency is lower than the screen touch sampling frequency, add correction points to the first touch point sequence to obtain a touch trajectory containing a second touch point sequence, the touch trajectory corresponding to a second touch sampling frequency;
    send the touch trajectory at the second touch sampling frequency to the screen projection device, the touch trajectory being used to enable the user to control the screen projection device by operating the user interface;
    wherein the correction points added to the first touch point sequence are used to make the second touch sampling frequency of the touch trajectory greater than the first touch sampling frequency, so as to improve the response sensitivity of controlling the screen projection device.
  14. The display device of claim 13, wherein, in the step of adding correction points to the first touch point sequence, the first controller:
    acquires all sample points between adjacent touch points in the first touch point sequence;
    adds all the sample points to the first touch point sequence as correction points; wherein the display device acquires and records, at a system sampling frequency, all the sample points in the touch trajectory, all the sample points include all the touch points, and the system sampling frequency is higher than the first touch sampling frequency.
  15. The display device of claim 13, wherein, in the process of obtaining the touch trajectory at the second touch sampling frequency, the first controller is further configured such that:
    the correction points added to the first touch point sequence make the second touch sampling frequency of the touch trajectory greater than or equal to the screen touch sampling frequency.
  16. The display device of claim 13 or 15, wherein, in the step of adding correction points to the first touch point sequence, the first controller:
    acquires, at a system sampling frequency, a first sample point between adjacent touch points in the first touch point sequence;
    when the distances between the first sample point and both of its adjacent touch points are greater than or equal to a preset threshold, adds the first sample point to the first touch point sequence as a correction point;
    wherein the display device acquires and records, at the system sampling frequency, all sample points in the touch trajectory, all the sample points include all the touch points, and the system sampling frequency is higher than the first touch sampling frequency.
  17. The display device of claim 16, wherein, in the step of adding the first sample point to the first touch point sequence as a correction point, the first controller is further configured to:
    when the distances from a second sample point to the first sample point and to the adjacent touch point on its other side are both greater than or equal to the preset threshold, add the second sample point to the first touch point sequence as a correction point.
  18. The display device of claim 16, wherein, in the step of adding correction points to the first touch point sequence, the first controller:
    determines a first quantity of correction points based on a comparison between the first touch sampling frequency and the screen touch sampling frequency;
    inserts the first quantity of correction points evenly into the first touch point sequence, the first quantity making the second touch sampling frequency greater than or equal to the screen touch sampling frequency.
  19. A screen projection device, comprising:
    a display, configured to display a first user interface, the first user interface being synchronously displayed, during screen projection, in a first screen projection area of a third user interface of a display device;
    a second controller, configured to:
    receive a second trajectory sent by the display device, the second trajectory being a user's operation trajectory in the first screen projection area of the third user interface;
    control the first user interface to update its display based on the second trajectory, and project the updated first user interface to the first screen projection area of the third user interface.
  20. The screen projection device of claim 19, wherein the third user interface further includes a second screen projection area, the second screen projection area being used to synchronously display a second user interface of a second screen projection device;
    the second controller is further configured to:
    control the first user interface to update its display based on the second trajectory, and project the updated first user interface to the first screen projection area of the third user interface;
    wherein a transparent template layer, placed on top of the third user interface to listen for touch events, generates a first trajectory while the user operates the third user interface, and the display device determines the second trajectory formed by the first trajectory in the first screen projection area according to the relative positional relationships of the first screen projection area, the second screen projection area, and the transparent template layer in the third user interface.
  21. The screen projection device of claim 19 or 20, wherein a user interface determines, at a screen touch sampling frequency, the touch points contained in a touch trajectory; the user interface includes the first user interface, the second user interface, and the third user interface;
    the second controller is further configured to:
    receive a touch trajectory containing a second touch point sequence sent by the display device, the second touch point sequence being obtained by the display device adding correction points to a first touch point sequence, the correction points being used to raise the sampling frequency of the touch trajectory from a first touch sampling frequency to a second touch sampling frequency so as to improve the response sensitivity of the screen projection device;
    control the user interface to update its display based on the touch trajectory, and project the updated user interface to the display device.
  22. The screen projection device of claim 21, wherein, in the step of receiving the touch trajectory containing the second touch point sequence sent by the display device:
    the second touch sampling frequency of the touch trajectory is greater than or equal to the screen touch sampling frequency.
  23. The screen projection device of claim 21, wherein, after receiving the touch trajectory containing the second touch point sequence sent by the display device, the second controller is further configured to:
    remove the last touch point from the second touch point sequence to obtain a new touch trajectory;
    control the user interface to update its display based on the new touch trajectory, so that the screen projection device accelerates processing of the user's input touch trajectory.
  24. A device control method based on trajectory extraction, comprising:
    when a first user interface of a first screen projection device is displayed on a third user interface, determining a relative positional relationship between a first screen projection area and the third user interface, the first screen projection area being used to synchronously display the first user interface of the first screen projection device;
    based on the third user interface listening for touch events, acquiring a first trajectory generated when a user operates the third user interface;
    according to the relative positional relationship, determining a second trajectory formed by the first trajectory within the first screen projection area, and sending the second trajectory to the first screen projection device; wherein the second trajectory is used by the first screen projection device to control its first user interface, so that the user controls the first user interface of the first screen projection device by operating the third user interface.
  25. The method of claim 24, wherein, in the step of determining the relative positional relationship, the method further comprises:
    after receiving a screen projection message sent by the first screen projection device, controlling a first layer to display the first screen projection area;
    when the first screen projection area is displayed on the third user interface, determining the relative positional relationship between the first screen projection area and a second layer, the second layer and the first layer being displayed in superposition to generate the third user interface.
  26. The method of claim 25, wherein determining the second trajectory according to the relative positional relationship specifically comprises:
    when the second layer first detects a touch event, acquiring the first trajectory generated by the user operating the second layer;
    acquiring, according to a screen touch sampling frequency, a first touch point sequence contained in the first trajectory;
    based on the relative positional relationship, extracting the touch points of the first touch point sequence that fall within the first screen projection area in the first layer to generate a second touch point sequence, the second touch point sequence constituting the second trajectory.
  27. The method of claim 25, wherein determining the second trajectory according to the relative positional relationship specifically comprises:
    when the first layer first detects a touch event, acquiring the first trajectory generated by the user operating the first layer;
    acquiring, according to a screen touch sampling frequency, a first touch point sequence contained in the first trajectory;
    based on the relative positional relationship, removing the touch points of the first touch point sequence that fall outside the first screen projection area in the first layer to generate a second touch point sequence, the second touch point sequence constituting the second trajectory.
  28. The method of claim 25, further comprising:
    when a screen projection device projects to the third user interface, creating a transparent template layer that is placed on top of the third user interface and listens for touch events, the third user interface including a first screen projection area and a second screen projection area, the first screen projection area being used to synchronously display the first user interface of the first screen projection device, and the second screen projection area being used to synchronously display a second user interface of a second screen projection device;
    based on the listening, acquiring the first trajectory generated on the transparent template layer when the user operates the third user interface;
    according to the relative positional relationships of the first screen projection area, the second screen projection area, and the transparent template layer in the third user interface, determining the second trajectory formed by the first trajectory in a screen projection area, and sending the second trajectory to the corresponding screen projection device;
    wherein the screen projection area includes the first screen projection area and the second screen projection area, and the second trajectory is used by the screen projection device to control its user interface, so that the user controls the screen projection device corresponding to the screen projection area by operating the third user interface.
  29. The method of claim 28, wherein, in the step of sending the determined second trajectory to the corresponding screen projection device, the method further comprises:
    when the second trajectory forms trajectories in both the first screen projection area and the second screen projection area, sending the trajectory within each screen projection area to its corresponding screen projection device, so that the user simultaneously controls the first user interface and the second user interface through the input first trajectory.
  30. The method of any one of claims 24 to 29, further comprising:
    after receiving a screen projection message sent by the screen projection device, acquiring a screen touch sampling frequency;
    when a first touch sampling frequency is lower than the screen touch sampling frequency, adding correction points to a first touch point sequence to obtain a touch trajectory containing a second touch point sequence; wherein the first touch point sequence corresponds to the first touch sampling frequency, the second touch point sequence corresponds to a second touch sampling frequency, and the correction points added to the first touch point sequence are used to make the second touch sampling frequency of the touch trajectory greater than the first touch sampling frequency, so as to improve the response sensitivity of controlling the screen projection device;
    sending the touch trajectory at the second touch sampling frequency to the screen projection device, the touch trajectory being used to enable the user to control the screen projection device by operating the user interface.
  31. The method of claim 30, wherein, in the process of obtaining the touch trajectory at the second touch sampling frequency:
    the correction points added to the first touch point sequence make the second touch sampling frequency of the touch trajectory greater than or equal to the screen touch sampling frequency.
  32. The method of claim 30 or 31, wherein the step of adding correction points to the first touch point sequence comprises:
    acquiring, at a system sampling frequency, a first sample point between adjacent touch points in the first touch point sequence;
    when the distances between the first sample point and both of its adjacent touch points are greater than or equal to a preset threshold, adding the first sample point to the first touch point sequence as a correction point;
    wherein all sample points in the touch trajectory are acquired and recorded at the system sampling frequency, all the sample points include all the touch points, and the system sampling frequency is higher than the first touch sampling frequency.
  33. The method of claim 32, wherein, in the step of adding the first sample point to the first touch point sequence as a correction point, the method further comprises:
    when the distances from a second sample point to the first sample point and to the adjacent touch point on its other side are both greater than or equal to the preset threshold, adding the second sample point to the first touch point sequence as a correction point.
  34. A device control method based on trajectory extraction, comprising:
    receiving a second trajectory sent by a display device, the second trajectory being a user's operation trajectory in a first screen projection area of a third user interface of the display device;
    controlling a first user interface to update its display based on the second trajectory, and projecting the updated first user interface to the first screen projection area of the third user interface, the first user interface being synchronously displayed, during screen projection, in the first screen projection area of the third user interface of the display device.
  35. The method of claim 34, further comprising:
    receiving a second trajectory sent by the display device, the second trajectory being an operation trajectory formed by the user in the first screen projection area of the third user interface of the display device, the third user interface further including a second screen projection area, the second screen projection area being used to synchronously display a second user interface from another screen projection device;
    controlling the first user interface to update its display based on the second trajectory, and projecting the updated first user interface to the first screen projection area of the third user interface;
    wherein a transparent template layer, placed on top of the third user interface to listen for touch events, generates a first trajectory while the user operates the third user interface, and the display device determines the second trajectory formed by the first trajectory in the first screen projection area according to the relative positional relationships of the first screen projection area, the second screen projection area, and the transparent template layer in the third user interface.
  36. The method of claim 34 or 35, further comprising:
    receiving a touch trajectory containing a second touch point sequence sent by the display device, the second touch point sequence being obtained by the display device adding correction points to a first touch point sequence, the correction points being used to raise the sampling frequency of the touch trajectory from a first touch sampling frequency to a second touch sampling frequency so as to improve response sensitivity;
    controlling display updates based on the touch trajectory, and projecting the updated display content to the display device.
  37. The method of claim 36, wherein, in the step of receiving the touch trajectory containing the second touch point sequence sent by the display device:
    the second touch sampling frequency of the touch trajectory is greater than or equal to a screen touch sampling frequency.
PCT/CN2022/141150 2022-03-24 2022-12-22 Display device, screen projection device, and device control method based on trajectory extraction WO2023179129A1 (zh)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN202210303291.6 2022-03-24
CN202210303261.5 2022-03-24
CN202210303291 2022-03-24
CN202210303643.8 2022-03-24
CN202210303261 2022-03-24
CN202210303643 2022-03-24

Publications (1)

Publication Number Publication Date
WO2023179129A1 true WO2023179129A1 (zh) 2023-09-28

Family

ID=88099787

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/141150 WO2023179129A1 (zh) 2022-03-24 2022-12-22 显示设备、投屏设备及基于轨迹提取的设备控制方法

Country Status (1)

Country Link
WO (1) WO2023179129A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112000306A (zh) * 2020-10-28 2020-11-27 Shenzhen Lebo Technology Co., Ltd. Reverse control method, apparatus, device and storage medium for multi-terminal screen projection
CN112468863A (zh) * 2020-11-24 2021-03-09 Beijing ByteDance Network Technology Co., Ltd. Screen projection control method, device, and electronic device
CN112860207A (zh) * 2021-03-18 2021-05-28 Nubia Technology Co., Ltd. Screen projection method and system, screen-projection-initiating device, and storage medium
CN113556588A (zh) * 2020-04-23 2021-10-26 OnePlus Technology (Shenzhen) Co., Ltd. Reverse control method and apparatus, computer device, and storage medium
WO2022042656A1 (zh) * 2020-08-26 2022-03-03 Huawei Technologies Co., Ltd. Interface display method and device
CN114201130A (zh) * 2020-09-18 2022-03-18 Qingdao Hisense Mobile Communication Technology Co., Ltd. Screen projection method and apparatus, and storage medium



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22933177

Country of ref document: EP

Kind code of ref document: A1