CN111427501A - Touch control implementation method and interactive intelligent device - Google Patents

Touch control implementation method and interactive intelligent device

Info

Publication number
CN111427501A
Authority
CN
China
Prior art keywords
touch
coordinate
display
processing module
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010209572.6A
Other languages
Chinese (zh)
Other versions
CN111427501B (en)
Inventor
邱伟波 (Qiu Weibo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd, Guangzhou Shirui Electronics Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority to CN202010209572.6A priority Critical patent/CN111427501B/en
Publication of CN111427501A publication Critical patent/CN111427501A/en
Application granted granted Critical
Publication of CN111427501B publication Critical patent/CN111427501B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a touch control implementation method and an interactive intelligent device, relating to the field of interactive intelligent devices. The touch control implementation method is applied to an interactive intelligent device that at least comprises a touch screen, a first processing module and a second processing module, where a first display interface and a second display interface belonging to different display channels are displayed on the touch screen. The touch screen receives a touch operation; when the first processing module confirms that the first touch coordinate obtained from the touch operation acts on the second display interface, it converts the first touch coordinate into a second touch coordinate suitable for the second display interface and sends the second touch coordinate to the second processing module; and the second processing module responds according to the second touch coordinate. The scheme solves the technical problem that an interactive intelligent device in the prior art cannot perform touch control on the display data of other signal sources, and realizes touch operation on the display data of different display channels.

Description

Touch control implementation method and interactive intelligent device
Technical Field
The embodiment of the application relates to the technical field of interactive intelligent equipment, in particular to a touch control implementation method and interactive intelligent equipment.
Background
With the development of intelligent technology, the picture-in-picture mode is widely applied in interactive intelligent devices. Picture-in-picture can be understood as a content presentation mode in which one or more small-area interfaces are displayed within a full-screen interface, so that the user can view the full-screen interface and the small-area interfaces simultaneously. Generally, the full-screen interface and a small-area interface correspond to display data of different signal sources, where the signal source corresponding to the full-screen interface is the current main signal source, and the signal source corresponding to the small-area interface is another signal source. In the process of implementing the invention, the inventor found that the prior art has the following defect: in the picture-in-picture mode, when the interactive intelligent device detects a touch operation, the operation is by default treated as directed at the display data of the main signal source, and a response is made based on the display data of the main signal source. As a result, touch control cannot be realized for the display data of other signal sources, which reduces the use experience of the user.
Disclosure of Invention
The application provides a touch control implementation method and an interactive intelligent device, aiming to solve the technical problem that an interactive intelligent device in the prior art cannot perform touch control on the display data of other signal sources.
In a first aspect, an embodiment of the present application provides a touch implementation method, where the touch implementation method is applied to an interactive smart device, and the interactive smart device at least includes a touch screen, a first processing module and a second processing module; a first display interface and a second display interface are displayed on the touch screen, the first display interface corresponds to a first display channel of the first processing module, and the second display interface corresponds to a second display channel of the second processing module;
the touch screen receives touch operation;
the first processing module obtains a first touch coordinate according to the touch operation, converts the first touch coordinate into a second touch coordinate suitable for the second display interface when confirming that the touch operation acts on the second display interface according to the first touch coordinate, and sends the second touch coordinate to the second processing module;
and the second processing module responds according to the second touch coordinate.
Further, the second display interface is located within the first display interface.
Further, a first coordinate system in the first display interface is obtained through a coordinate domain of the first display channel, a second coordinate system in the second display interface is obtained through a coordinate domain of the second display channel, the first touch coordinate is a coordinate in the first coordinate system, and the second touch coordinate is a coordinate in the second coordinate system.
Further, before the touch screen receives a touch operation, the method further includes:
the first processing module receives an interface display instruction;
the first processing module responds to the interface display instruction and determines a coordinate area of a second display interface in the first coordinate system;
the first processing module receives the display data sent by the second processing module;
and the first processing module determines a second coordinate system according to the display data and indicates the touch screen to display the display data in the coordinate area.
Further, the first processing module comprises a touch processor and a main system processor, and the first display channel is a display channel of a main system;
the touch control processor obtains a first touch control coordinate according to the touch control operation and sends the first touch control coordinate to the main system processor;
and when the main system processor confirms that the touch operation acts on the second display interface according to the first touch coordinate, the first touch coordinate is converted into a second touch coordinate suitable for the second display interface, and the second touch coordinate is sent to the second processing module.
Further, the first processing module comprises a touch processor, a main system processor and a transparent transmission processor, and the first display channel is a display channel of the main system;
the touch control processor obtains a first touch control coordinate according to the touch control operation and sends the first touch control coordinate to the main system processor;
when the main system processor confirms that the touch operation acts on the second display interface according to the first touch coordinate, the first touch coordinate is converted into a second touch coordinate suitable for the second display interface, and the second touch coordinate is sent to the transparent transmission processor;
and the transparent transmission processor sends the second touch coordinate to the second processing module.
Further, the first processing module comprises a touch processor and a main system processor, and the first display channel is a display channel of a main system;
the main system processor sends the coordinate area and the second coordinate system to the touch processor;
and the touch processor obtains a first touch coordinate according to the touch operation, converts the first touch coordinate into a second touch coordinate suitable for the second display interface when confirming that the touch operation acts on the second display interface according to the first touch coordinate, and sends the second touch coordinate to the second processing module.
Further, the converting, by the first processing module, the first touch coordinate into a second touch coordinate suitable for the second display interface includes:
the first processing module substitutes the first touch coordinate into a set formula to calculate the second touch coordinate suitable for the second display interface;
the set formula comprises: x = x_max × (X - X_min)/(X_max - X_min), y = y_max × (Y - Y_min)/(Y_max - Y_min);
where (x, y) is the second touch coordinate, (X, Y) is the first touch coordinate, x_max and y_max are respectively the maximum value of the x axis and the maximum value of the y axis in the second coordinate system, X_min and X_max are respectively the minimum value and the maximum value of the second display interface on the X axis in the first coordinate system, and Y_min and Y_max are respectively the minimum value and the maximum value of the second display interface on the Y axis in the first coordinate system.
Further, the determining, by the first processing module according to the first touch coordinate, that the touch operation acts on the second display interface includes:
and when the first processing module confirms that the first touch coordinate falls into the coordinate area, determining that the touch operation acts on the second display interface.
Further, the method also comprises the following steps:
and the first processing module responds according to the first touch coordinate when confirming that the touch operation acts on the first display interface according to the first touch coordinate.
Further, the first display channel is an android display channel, and the second display channel and the first display channel have different data source types.
In a second aspect, an embodiment of the present application further provides an interactive intelligent device, where the interactive intelligent device at least includes a touch screen, a first processing module and a second processing module; a first display interface and a second display interface are displayed on the touch screen, the first display interface corresponds to a first display channel of the first processing module, and the second display interface corresponds to a second display channel of the second processing module;
the touch screen is used for receiving touch operation;
the first processing module is used for obtaining a first touch coordinate according to the touch operation, converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the second processing module;
and the second processing module is used for responding according to the second touch coordinate.
Further, a first coordinate system in the first display interface is obtained through a coordinate domain of the first display channel, a second coordinate system in the second display interface is obtained through a coordinate domain of the second display channel, the first touch coordinate is a coordinate in the first coordinate system, and the second touch coordinate is a coordinate in the second coordinate system.
Further, the first processing module is further configured to receive an interface display instruction, determine, in response to the interface display instruction, a coordinate area of the second display interface in the first coordinate system, receive the display data sent by the second processing module, determine the second coordinate system according to the display data, and instruct the touch screen to display the display data in the coordinate area.
Further, the first processing module comprises a touch processor and a main system processor, and the first display channel is a display channel of a main system;
the touch control processor is used for obtaining a first touch control coordinate according to the touch control operation and sending the first touch control coordinate to the main system processor;
and the main system processor is used for converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the second processing module.
Further, the first processing module comprises a touch processor, a main system processor and a transparent transmission processor, and the first display channel is a display channel of the main system;
the touch control processor is used for obtaining a first touch control coordinate according to the touch control operation and sending the first touch control coordinate to the main system processor;
the main system processor is used for converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the transparent transmission processor;
and the transparent transmission processor is used for sending the second touch coordinate obtained by the main system processor to the second processing module.
Further, the first processing module comprises a touch processor and a main system processor, and the first display channel is a display channel of a main system;
the main system processor is used for sending the coordinate area and the second coordinate system to the touch processor;
the touch processor is configured to obtain a first touch coordinate according to the touch operation, convert the first touch coordinate into a second touch coordinate suitable for the second display interface when it is determined that the touch operation acts on the second display interface according to the first touch coordinate, and send the second touch coordinate to the second processing module.
Further, when the first processing module is configured to convert the first touch coordinate into a second touch coordinate suitable for the second display interface, the first processing module is specifically configured to:
substituting the first touch coordinate into a set formula to calculate the second touch coordinate suitable for the second display interface;
the set formula comprises: x = x_max × (X - X_min)/(X_max - X_min), y = y_max × (Y - Y_min)/(Y_max - Y_min);
where (x, y) is the second touch coordinate, (X, Y) is the first touch coordinate, x_max and y_max are respectively the maximum value of the x axis and the maximum value of the y axis in the second coordinate system, X_min and X_max are respectively the minimum value and the maximum value of the second display interface on the X axis in the first coordinate system, and Y_min and Y_max are respectively the minimum value and the maximum value of the second display interface on the Y axis in the first coordinate system.
Further, the first display channel is an android display channel, and the second display channel and the first display channel have different data source types.
According to the touch control implementation method and the interactive intelligent device, the first processing module obtains the first touch control coordinate according to the touch control operation received by the touch screen, and when the touch control operation is confirmed to act on the second display interface according to the first touch control coordinate, the first touch control coordinate is converted into the second touch control coordinate suitable for the second display interface, and the second touch control coordinate is sent to the second processing module, so that the second processing module responds according to the second touch control coordinate. Particularly, in a picture-in-picture mode, touch operation on the two display interfaces can be realized by determining whether the first touch coordinate acts on the first display interface or the second display interface, and the use experience of a user is improved. Furthermore, the coordinate systems of the two display interfaces are related to the display channel, and when the first touch coordinate is converted into the second touch coordinate, the second touch coordinate is ensured to be more related to the display data of the second display channel, namely, the rationality of the second touch coordinate is ensured. Furthermore, the calculation mode of the second touch coordinate is simple, and the data processing pressure of the first processing module is reduced. Furthermore, by setting the connection relationship between each hardware in the first processing module and the second processing module, the number of interfaces and the types of interfaces, the touch implementation method can have different data streams, the limitation of the touch implementation method on the connection mode of hardware equipment is reduced, and the flexibility and compatibility of the touch implementation method are enhanced.
Drawings
Fig. 1 is a flowchart of a touch implementation method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a first structure of an interactive smart device according to an embodiment of the present application;
fig. 3 is a schematic view of a first display interface and a second display interface provided in an embodiment of the present application;
fig. 4 is a second schematic structural diagram of an interactive intelligent device provided in the embodiment of the present application;
FIG. 5 is a schematic diagram of a first data flow provided by an embodiment of the present application;
FIG. 6 is a third schematic structural diagram of an interactive smart device according to an embodiment of the present application;
FIG. 7 is a second data flow diagram provided in accordance with an embodiment of the present application;
fig. 8 is a fourth schematic structural diagram of an interactive intelligent device provided in the embodiment of the present application;
fig. 9 is a schematic diagram of a third data flow provided in the embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are for purposes of illustration and not limitation. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action or object from another entity or action or object without necessarily requiring or implying any actual such relationship or order between such entities or actions or objects. For example, "first" and "second" of the first display interface and the second display interface are used to distinguish between two different display interfaces.
In the embodiment of the application, the interactive intelligent device may be formed by two or more physical entities or may be formed by one physical entity. For example, the interactive smart device may be a smart device with a touch function, such as a mobile phone, a tablet or an interactive smart tablet. For convenience of understanding, the interactive smart tablet is exemplarily described as the interactive smart device in the embodiment. The interactive smart tablet can be an integrated device that controls the content displayed on the display panel and realizes human-machine interaction through touch technology, and can integrate one or more functions of a projector, an electronic whiteboard, a projection screen, a sound box, a television, a video conference terminal and the like.
The touch control implementation method provided by the embodiment of the application is applied to an interactive intelligent device, the interactive intelligent device at least comprises a touch screen, a first processing module and a second processing module, a first display interface and a second display interface are displayed in the touch screen, the first display interface corresponds to a first display channel of the first processing module, and the second display interface corresponds to a second display channel of the second processing module. Fig. 1 is a flowchart of a touch implementation method according to an embodiment of the present application, and as shown in fig. 1, the touch implementation method specifically includes:
and step 110, the touch screen receives touch operation.
Step 120, the first processing module obtains a first touch coordinate according to the touch operation, converts the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sends the second touch coordinate to the second processing module.
And step 130, the second processing module responds according to the second touch coordinate.
In one embodiment, the interactive smart device is described as including a touch screen, a first processing module, and a second processing module. In practical applications, the interactive intelligent device may further include a network interface, a network communication module, a memory, and other devices, where the memory may be one or more memories, and one or more programs are stored in the memories, and the one or more programs may be executed by other devices (such as the first processing module, the second processing module, and the like) of the interactive intelligent device to implement the touch implementation method and other functions of the interactive intelligent device provided in the embodiments of the present application. Fig. 2 is a schematic diagram of a first structure of an interactive intelligent device according to an embodiment of the present application. Referring to fig. 2, the interactive smart device includes a touch screen 11, a first processing module 12, and a second processing module 13.
Typically, a first display interface and a second display interface are displayed on the touch screen 11. The first display interface and the second display interface respectively display data of different signal sources, that is, the first display interface and the second display interface correspond to different display channels. In an embodiment, the display channel to which the first display interface belongs is a display channel corresponding to the first processing module 12, and this display channel is marked as the first display channel. The display channel to which the second display interface belongs is a display channel corresponding to the second processing module 13, and this display channel is marked as the second display channel. It can be understood that each processing module may correspond to multiple display channels. For example, when a certain operating system is installed in the first processing module 12, the first display channel may be a display channel of that operating system, and the display data in the first display channel is used to display the desktop of the operating system and the window interfaces of applications installed on the desktop. For another example, when the first processing module 12 is connected to an external data source through a given interface, the first display channel may be a display channel corresponding to that interface, and at this time the display data in the first display channel is used to display the content sent by the external data source. Optionally, the specific display contents of the first display interface and the second display interface are related to the display data generated by the first processing module 12 and the second processing module 13. For example, the android system is installed on the first processing module 12 and the first display channel is a display channel of the installed android system; at this time, the first display interface is the desktop of the android system and can display a window of at least one application installed in the android system, where the window of one application is the second display interface and displays the desktop of an operating system installed in the second processing module 13.
Optionally, the second display interface is located in the first display interface. In the embodiment, for example, the first display interface is displayed in a full screen mode, and at this time, the interactive intelligent device may be considered to start a picture-in-picture mode. In practical application, the first display interface and the second display interface can also respectively occupy half of the display screen or adopt other layout modes.
In one embodiment, the touch screen 11 includes a display screen and a touch device. The display screen realizes the display function, and the touch device realizes the touch control function. Further, the touch device may be at least one of an infrared touch device (infrared touch frame), a capacitive touch device (capacitive screen), an electromagnetic touch device (electromagnetic screen), and a resistive touch device (resistive screen), and the embodiment does not limit the type, model, and arrangement of components included in the touch device. When the user performs a touch operation, the touch device may detect the touch operation.
Typically, the first processing module 12 comprises at least one processor having data processing capabilities. The type of the processor can be set according to actual conditions. The first processing module 12 is provided with the main operating system of the interactive intelligent device, which is denoted as the main system in the embodiment. The main system may be an android system, a Windows system, or the like, and in one embodiment the android system is taken as an example. Further, the first display channel is an android display channel, and the data source type of the second display channel is different from that of the first display channel. For example, the data source of the second display channel may be a display channel of an operating system (e.g., a Windows system) installed in the second processing module 13; for another example, the second display channel is a display channel corresponding to a communication interface provided in the second processing module, where the communication interface may be at least one of a High Definition Multimedia Interface (HDMI), a VGA (Video Graphics Array) interface, and a USB Type-C interface. In this case, the second processing module 13 may be presented as an application of the first processing module 12, that is, the first processing module 12 obtains the display data of the second display channel through an application layer.
Further, the first processing module 12 is connected to the touch screen 11 and the second processing module 13, respectively. The embodiment does not limit the connection mode between the first processing module 12 and the touch screen 11. For example, they may be connected by a bus. Optionally, the first processing module 12 is connected to the display screen and the touch device in the touch screen 11, respectively. At this time, the first processing module 12 may send the display data to the display screen for displaying. Moreover, the first processing module 12 may receive data collected by each component in the touch device, where each component in the touch device detects a position area in the display screen. When a user executes a touch operation based on the content in the display screen, the data collected by the component corresponding to the position of the touch operation may change, and the first processing module 12 may determine the touch position of the touch operation according to the change in the data collected by each component. In an embodiment, the touch position determined by the first processing module 12 is recorded as the first touch coordinate. When the first display interface is generated, the first processing module 12 synchronously determines a coordinate system corresponding to the first display interface, which is recorded as the first coordinate system in the embodiment. The embodiment does not limit the determination method of the first coordinate system. For example, the first coordinate system is determined by the coordinate domain of the android system display channel, where the coordinate domain of the android system display channel is related to the pixel setting of the android system desktop, and the coordinate domains corresponding to chips of different android systems may be different. For another example, the first coordinate system is a customized view coordinate system. As a further example, the first coordinate system is determined by the pixels of the display screen, in which case each coordinate point corresponds to one pixel point. In any of these ways, the origin of the first coordinate system is located at the top left vertex of the first display interface. The first processing module 12 records a correspondence between the first coordinate system and the components in the touch device, and determines, according to this correspondence, the coordinate in the first coordinate system of the component whose collected data has changed, thereby obtaining the first touch coordinate.
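As an illustration of how a first touch coordinate might be derived from the touch device, the following sketch assumes an infrared touch frame whose blocked beam indices are mapped linearly into the first coordinate system; the beam counts and coordinate extents are invented example values and are not taken from the disclosure.

```c
/* Illustrative sketch only: the disclosure does not fix how touch-device
 * components map to coordinates. Here an infrared frame with
 * NUM_BEAMS_X x NUM_BEAMS_Y beam pairs is assumed, and the first
 * coordinate system is assumed to span X_MAX x Y_MAX points with its
 * origin at the top-left vertex of the first display interface. */
#include <stdio.h>

#define NUM_BEAMS_X 384   /* assumed number of horizontal beam pairs */
#define NUM_BEAMS_Y 216   /* assumed number of vertical beam pairs   */
#define X_MAX       1920  /* assumed extent of the first coordinate system */
#define Y_MAX       1080

typedef struct { int x; int y; } point_t;

/* Convert the indices of the blocked beams into a first touch coordinate. */
static point_t beams_to_first_coordinate(int blocked_x, int blocked_y)
{
    point_t p;
    p.x = blocked_x * X_MAX / (NUM_BEAMS_X - 1);
    p.y = blocked_y * Y_MAX / (NUM_BEAMS_Y - 1);
    return p;
}

int main(void)
{
    point_t p = beams_to_first_coordinate(192, 108);
    printf("first touch coordinate: (%d, %d)\n", p.x, p.y);
    return 0;
}
```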
In one embodiment, when the second processing module 13 sends the display data to the first processing module 12, the display data has its own coordinate system, which is determined by the corresponding signal source; this coordinate system is recorded as the second coordinate system. The embodiment likewise does not limit the determination manner of the second coordinate system; for example, the second coordinate system may be determined by the resolution of the display data in the second display channel. When the second processing module 13 responds to a touch operation, it determines the display content to which the touch operation is directed based on coordinates in the second coordinate system. Therefore, when the first processing module 12 confirms that the touch operation acts on the second display interface, it converts the first touch coordinate in the first coordinate system into a second touch coordinate in the second coordinate system, so that the second processing module 13 can locate the corresponding display content and respond.
Further, the second processing module 13 includes at least one processor having a data processing function, and the type of the processor may be set according to actual conditions, and is different from or partially the same as the type of the processor in the first processing module 12. The second processing module 13 may control the display data in the second display channel. At this time, the second processing module 13 may determine the second touch coordinate as the input instruction. At this time, the second processing module 13 determines the display content corresponding to the second touch coordinate according to the second coordinate system, further determines the instruction corresponding to the second touch coordinate according to the display content, and responds to the instruction. It is to be understood that, when the first processing module 12 sends the second touch coordinate to the second processing module 13, a specific package format embodiment is not limited, and when sending the second touch coordinate, other information may be sent together in combination with the actual situation, for example, when there are a plurality of second touch coordinates, information such as timestamps of the plurality of second touch coordinates may be sent together. It should be noted that, when there are two display channels, the second processing module 13 corresponding to the second display channel is integrated in the interactive intelligent device. For other cases, the second processing module 13 may be independent of the interactive smart device. I.e. the second processing module 13 can be understood as a pluggable module in the interactive intelligent device, which can be connected with the first processing module 12 according to the actual situation.
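Since the embodiment leaves the package format open, the following is one hypothetical layout, purely an assumption rather than anything defined by the disclosure, for reporting second touch coordinates together with timestamps when several touch points are sent at once:

```c
/* Hypothetical report format for second touch coordinates; the field
 * names, widths and the point limit are assumptions for illustration. */
#include <stdint.h>

#define MAX_POINTS 10  /* assumed upper bound on simultaneous touch points */

typedef struct {
    uint16_t x;          /* second touch coordinate, x axis */
    uint16_t y;          /* second touch coordinate, y axis */
    uint32_t timestamp;  /* e.g. milliseconds since boot (assumed unit) */
} touch_point_t;

typedef struct {
    uint8_t       count;               /* number of valid points */
    touch_point_t points[MAX_POINTS];  /* the reported points    */
} touch_report_t;
```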
Optionally, when the first processing module 12 confirms that the touch operation acts on the first display interface according to the first touch coordinate, the first processing module responds according to the first touch coordinate. That is, when the first processing module 12 determines that the first touch coordinate falls on the first display interface (the area other than the second display interface), it is determined that the touch operation is directed to the first display interface, and at this time, the first processing module 12 directly responds according to the first touch coordinate, that is, the first processing module 12 determines the instruction corresponding to the touch operation according to the display content corresponding to the first touch coordinate, and executes the instruction, so as to respond to the touch operation.
With the above technical means, the first processing module obtains the first touch coordinate according to the touch operation received by the touch screen, converts the first touch coordinate into a second touch coordinate suitable for the second display interface when confirming that the touch operation acts on the second display interface, and sends the second touch coordinate to the second processing module, so that the second processing module responds according to the second touch coordinate. In this way, touch operation on the second display interface can be realized when the touch screen of the interactive intelligent device displays a first display interface and a second display interface corresponding to the display channels of two different processing modules, which solves the technical problem that the interactive intelligent device in the prior art cannot perform touch control on the display data of other signal sources. Particularly, in the picture-in-picture mode, touch operations on both display interfaces can be realized by determining whether the first touch coordinate acts on the first display interface or the second display interface, improving the use experience of the user.
Further, in an embodiment, a first coordinate system in the first display interface is obtained through a coordinate field of the first display channel, a second coordinate system in the second display interface is obtained through a coordinate field of the second display channel, the first touch coordinate is a coordinate in the first coordinate system, and the second touch coordinate is a coordinate in the second coordinate system. The coordinate system corresponding to the display data sent by the first display channel is a first coordinate system. For example, when the first display channel is a display channel of an android system, the coordinate system corresponding to the display data may be understood as a coordinate system set for the desktop when the android system displays the desktop, and the coordinate system is the first coordinate system. Similarly, the corresponding coordinate system in the display data sent by the second display channel is the second coordinate system. For example, if the second display channel is an HDMI channel, the pixel coordinate system of the display data is the second coordinate system, and if the second display channel is a Windows channel, the coordinate system set for the desktop when the Windows channel displays the desktop is the second coordinate system.
In the above, when both coordinate systems are related to the display channel, it can be ensured that the second touch coordinate is more related to the display data of the display channel when the first touch coordinate is converted into the second touch coordinate, i.e. the rationality of the second touch coordinate is ensured.
Further, in an embodiment, the step 120 of converting the first touch coordinate into a second touch coordinate suitable for the second display interface by the first processing module includes: the first processing module substitutes the first touch coordinate into a set formula to calculate the second touch coordinate suitable for the second display interface; the set formula comprises: x = x_max × (X - X_min)/(X_max - X_min), y = y_max × (Y - Y_min)/(Y_max - Y_min); where (x, y) is the second touch coordinate, (X, Y) is the first touch coordinate, x_max and y_max are respectively the maximum value of the x axis and the maximum value of the y axis in the second coordinate system, X_min and X_max are respectively the minimum value and the maximum value of the second display interface on the X axis in the first coordinate system, and Y_min and Y_max are respectively the minimum value and the maximum value of the second display interface on the Y axis in the first coordinate system.
Fig. 3 is a schematic view of a first display interface and a second display interface provided in an embodiment of the present application. For ease of understanding, fig. 3 illustrates the coordinate ranges corresponding to the first coordinate system, the second coordinate system, and the coordinate region of the second display interface 22, where the second coordinate system and the coordinate region are illustrated within the second display interface 22. As can be seen from fig. 3, the origin of the first coordinate system corresponding to the first display interface 21 is located at its top left vertex, and the range of the first coordinate system is (0, 0) to (X_MAX, Y_MAX), where X_MAX is the maximum value on the X axis and Y_MAX is the maximum value on the Y axis of the first coordinate system. The range of the second coordinate system is (0, 0) to (x_max, y_max), where x_max is the maximum value on the x axis and y_max is the maximum value on the y axis of the second coordinate system. The coordinate region of the second display interface 22 in the first coordinate system is (X_min, Y_min) to (X_max, Y_max), where X_min and X_max are respectively the minimum and maximum values of the coordinate region on the X axis, and Y_min and Y_max are respectively the minimum and maximum values of the coordinate region on the Y axis. In this case, the second display interface is obtained by mapping display data whose coordinate range is (0, 0) to (x_max, y_max) onto the region whose coordinate range is (X_min, Y_min) to (X_max, Y_max). For any coordinate point (a, b) in the second coordinate system, the corresponding coordinate point in the coordinate region is (A, B), and the mapping relationship is: (A - X_min)/(X_max - X_min) = a/x_max, (B - Y_min)/(Y_max - Y_min) = b/y_max. Transforming this mapping relationship gives a = x_max × (A - X_min)/(X_max - X_min) and b = y_max × (B - Y_min)/(Y_max - Y_min). Accordingly, representing the first touch coordinate as (X, Y) and the second touch coordinate as (x, y), the set formula for coordinate conversion is x = x_max × (X - X_min)/(X_max - X_min), y = y_max × (Y - Y_min)/(Y_max - Y_min). Through the set formula, the second touch coordinate can be obtained simply and quickly, which reduces the data processing pressure of the first processing module.
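A minimal sketch of the set formula follows, assuming integer coordinates; the interface sizes and positions used below are invented example values rather than anything specified by the disclosure.

```c
/* Maps a first touch coordinate (X, Y) in the first coordinate system to a
 * second touch coordinate (x, y) in the second coordinate system, using the
 * set formula described in the text. All bounds are illustrative values. */
#include <stdio.h>

typedef struct {
    int x_min, x_max;   /* coordinate area of the second display interface */
    int y_min, y_max;   /* expressed in the first coordinate system        */
    int sec_x_max;      /* x_max of the second coordinate system           */
    int sec_y_max;      /* y_max of the second coordinate system           */
} mapping_t;

/* x = x_max * (X - X_min) / (X_max - X_min)
 * y = y_max * (Y - Y_min) / (Y_max - Y_min) */
static void first_to_second(const mapping_t *m, int X, int Y, int *x, int *y)
{
    *x = m->sec_x_max * (X - m->x_min) / (m->x_max - m->x_min);
    *y = m->sec_y_max * (Y - m->y_min) / (m->y_max - m->y_min);
}

int main(void)
{
    /* assumed example: the second display interface occupies
     * (600, 300)-(1560, 840) of the first coordinate system and the
     * second coordinate system spans 1920 x 1080 */
    mapping_t m = { 600, 1560, 300, 840, 1920, 1080 };
    int x, y;
    first_to_second(&m, 1080, 570, &x, &y);
    printf("second touch coordinate: (%d, %d)\n", x, y);  /* prints (960, 540) */
    return 0;
}
```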
Further, in an embodiment, before the touch screen 11 receives the touch operation, the interactive smart device further needs to perform steps 140 to 170:
step 140, the first processing module receives an interface display instruction.
And 150, the first processing module responds to the interface display instruction and determines a coordinate area of a second display interface in the first coordinate system.
Step 160, the first processing module receives the display data sent by the second processing module.
Step 170, the first processing module determines a second coordinate system according to the display data, and instructs the touch screen to display the display data in the coordinate area.
Specifically, the interface display instruction is an instruction for establishing an application window and displaying the display data of the second display channel through the application window. The interface display instruction may also be understood as an instruction to start the picture-in-picture mode when the second display interface is within the first display interface. The embodiment does not limit the specific manner in which the user issues the interface display instruction.
It will be appreciated that since the first processing module 12 has a host system installed, various instructions issued by the user are received by the first processing module 12. After the first processing module 12 receives the interface display instruction, it first determines a coordinate area of the window corresponding to the second display interface in the first coordinate system, where the range of the coordinate area may be preset by the host system and/or the user. Optionally, the first processing module 12 may obtain the coordinate area of the second display interface through a UI layout of the android system.
The first processing module 12 receives the display data sent by the second processing module 13, wherein the embodiment of the transmission form of the display data is not limited. The transmission path of the display data and the transmission path of the second touch coordinate may be the same or different, and in the embodiment, the transmission path is different, that is, the display data is transmitted through a separate path. Optionally, before the first processing module 12 receives the display data, the second processing module 13 is notified to send the display data, where the notification method embodiment is not limited. Optionally, when the first processing module 12 receives the display data, the second coordinate system is determined according to the display data. The second coordinate system is determined, for example, from pixels of the display data. Then, the first processing module 12 determines a mapping position of each coordinate point in the display data in the coordinate area according to the second coordinate system and the coordinate area of the second display interface, and displays the display data in the coordinate area according to the mapping position. It can be understood that the display data is transmitted in real time, and after the second processing module 13 responds according to the second touch coordinate, if the display data changes, the second display interface may display the changed display data in real time.
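A sketch of the mapping described above, using the same invented example bounds as before: a point of the display data in the second coordinate system is placed at the corresponding position inside the coordinate area of the second display interface.

```c
/* Places a display-data point (a, b) from the second coordinate system at
 * position (A, B) inside the coordinate area of the second display
 * interface. Bounds are the same illustrative values as in the
 * conversion example. */
#include <stdio.h>

static void second_to_area(int a, int b,
                           int x_min, int x_max, int y_min, int y_max,
                           int sec_x_max, int sec_y_max,
                           int *A, int *B)
{
    /* (A - X_min)/(X_max - X_min) = a/x_max, solved for A (same for B) */
    *A = x_min + a * (x_max - x_min) / sec_x_max;
    *B = y_min + b * (y_max - y_min) / sec_y_max;
}

int main(void)
{
    int A, B;
    second_to_area(960, 540, 600, 1560, 300, 840, 1920, 1080, &A, &B);
    printf("display position in coordinate area: (%d, %d)\n", A, B);  /* (1080, 570) */
    return 0;
}
```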
In the foregoing, before the first processing module displays the second display interface, the coordinate region and the second coordinate system are determined, so that accuracy of converting the subsequent first touch coordinate into the second touch coordinate can be ensured.
Further, in an embodiment, the step 120 of confirming, by the first processing module according to the first touch coordinate, that the touch operation is applied to the second display interface includes: and when the first processing module confirms that the first touch coordinate falls into the coordinate area, determining that the touch operation acts on the second display interface.
Typically, after obtaining the first touch coordinate, the first processing module determines whether the first touch coordinate falls in a coordinate area of the second display interface, if so, the touch operation is directed to the second display interface, and then, the first processing module converts the first touch coordinate into the second touch coordinate. If not, the touch operation is directed to the first display interface, and at this time, the first processing module responds according to the first touch coordinate.
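The hit test and dispatch described above could look roughly as follows; the helper functions and the rectangle bounds are placeholders for illustration, not APIs or values named in the disclosure.

```c
/* Illustrative dispatch logic for steps 120/130: if the first touch
 * coordinate falls inside the coordinate area of the second display
 * interface, convert it and forward it to the second processing module;
 * otherwise the first processing module responds itself. */
#include <stdbool.h>
#include <stdio.h>

typedef struct { int x_min, x_max, y_min, y_max; } area_t;

static bool in_area(const area_t *a, int X, int Y)
{
    return X >= a->x_min && X <= a->x_max && Y >= a->y_min && Y <= a->y_max;
}

/* Placeholder hand-off and local-response functions. */
static void send_to_second_module(int x, int y)      { printf("forward (%d, %d)\n", x, y); }
static void respond_on_first_interface(int X, int Y) { printf("handle (%d, %d) locally\n", X, Y); }

static void dispatch_touch(const area_t *area, int sec_x_max, int sec_y_max,
                           int X, int Y)
{
    if (in_area(area, X, Y)) {
        int x = sec_x_max * (X - area->x_min) / (area->x_max - area->x_min);
        int y = sec_y_max * (Y - area->y_min) / (area->y_max - area->y_min);
        send_to_second_module(x, y);      /* second processing module responds */
    } else {
        respond_on_first_interface(X, Y); /* first processing module responds  */
    }
}

int main(void)
{
    area_t area = { 600, 1560, 300, 840 };
    dispatch_touch(&area, 1920, 1080, 1080, 570);  /* inside: forwarded */
    dispatch_touch(&area, 1920, 1080, 100, 100);   /* outside: local    */
    return 0;
}
```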
Further, in an embodiment, the first processing module includes a touch processor and a main system processor, and the first display channel is a display channel of a main system; the touch control processor obtains a first touch control coordinate according to the touch control operation and sends the first touch control coordinate to the main system processor; and when the main system processor confirms that the touch operation acts on the second display interface according to the first touch coordinate, the first touch coordinate is converted into a second touch coordinate suitable for the second display interface, and the second touch coordinate is sent to the second processing module.
Fig. 4 is a second schematic structural diagram of the interactive intelligent device according to the embodiment of the present application, where the first processing module 12 includes a touch processor 14 and a main system processor 15, where the touch processor 14 is connected to the touch screen 11 and the main system processor 15, and the main system processor 15 is connected to the touch screen 11 and the second processing module 13.
Specifically, the touch processor 14 is connected to a touch device in the touch screen 11, and is configured to detect a touch operation and obtain a first touch coordinate. The touch processor may be a Micro Controller Unit (MCU) or other chip having a certain data processing function and integrating multiple I/O functions. Alternatively, the touch processor may be installed in the touch screen 11, or may be independent from the touch screen 11. For example, the touch device in the touch screen 11 is an infrared touch device, and in this case, the touch processor 14 is located on a Printed Circuit Board (PCB) of an infrared light bar in the infrared touch device, that is, the touch processor 14 is installed in the touch screen 11. For another example, the touch device in the touch screen 11 is a capacitive touch device, and at this time, the touch processor 14 is connected to components in the capacitive touch device, but is independent of the capacitive touch device, that is, the touch processor 14 is relatively independent of the touch screen 11. The touch processor 14 and the touch device may be connected through a bus or in other manners, the touch processor 14 may receive data collected by each component in the touch device, and a corresponding relationship between the first coordinate system and a position area detected by each component is further stored in the touch processor, so as to determine a position of the touch operation in the first coordinate system through the corresponding relationship. Wherein the first coordinate system may be transmitted by the main system processor 15 to the touch processor 14.
The main system processor 15 may be a Central Processing Unit (CPU), and its corresponding operating system is an android system. Further, the main system processor 15 and the touch processor 14 may be connected by a bus or a USB, etc. In one embodiment, when the main system processor 15 and the touch processor 14 are connected via USB, the USB interface of the main system processor 15 for connecting to the touch processor 14 is a host-function interface or an OTG-function interface. The USB interface in the touch processor 14 is an interface having a device function. At this time, when the touch processor 14 wants to send data to the main system processor 15, the data may be placed in a buffer corresponding to the touch processor 14, and when the main system processor 15 polls (actively queries), the main system processor 15 may read the buffer to obtain the data, thereby implementing that the touch processor 14 sends the data to the main system processor 15. In an embodiment, the touch processor 14 sends the first touch coordinate to the main system processor 15, so that the main system processor 15 converts the first touch coordinate into the second touch coordinate when determining that the first touch coordinate is corresponding to the second display interface. The format of the data transmission between the main system processor 15 and the touch control processor 14 is not limited. The main system processor 15 is also connected to a display screen in the touch screen 11 for sending display data to the display screen to instruct the display screen to display.
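The buffer-and-poll exchange described above can be modelled in a highly simplified way; the sketch below only mirrors the data flow (the touch processor writes into a buffer, the main system processor reads it on its next poll) and does not represent an actual USB host/device transfer.

```c
/* Simplified, self-contained model of the buffer-and-poll hand-off of the
 * first touch coordinate. Real USB transfers are far more involved; only
 * the data flow is illustrated here. */
#include <stdbool.h>
#include <stdio.h>

typedef struct { int x; int y; } coord_t;

static coord_t buffer;       /* buffer owned by the touch processor side */
static bool    buffer_full;

/* Touch processor side: place the first touch coordinate in the buffer. */
static void touch_processor_report(int x, int y)
{
    buffer.x = x;
    buffer.y = y;
    buffer_full = true;
}

/* Main system processor side: poll the buffer and read the data if any. */
static bool main_processor_poll(coord_t *out)
{
    if (!buffer_full)
        return false;
    *out = buffer;
    buffer_full = false;
    return true;
}

int main(void)
{
    coord_t c;
    touch_processor_report(1080, 570);
    if (main_processor_poll(&c))
        printf("main system processor read (%d, %d)\n", c.x, c.y);
    return 0;
}
```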
Typically, the main system processor 15 is connected to the second processing module 13, which may be connected by means of a bus and/or a USB or the like. When the main system processor 15 and the second processing module 13 are USB connected, the second processing module 13 may be considered a pluggable module. In one embodiment, the USB interface in the second processing module 13 is a host-enabled interface. Accordingly, the USB interface of the main system processor 15 for connecting to the second processing module 13 is an interface having a Device function. At this time, the main system processor 15 may be regarded as a slave device, and the second processing module 13 may be regarded as a master device. When the main system processor 15 sends data to the second processing module 13, the data needs to be first put into a buffer corresponding to the main system processor 15, and when the second processing module 13 polls, the second processing module 13 may read the buffer to obtain the data, thereby implementing bidirectional transmission between the main system processor 15 and the second processing module 13. In addition, the main system processor 15 needs to receive the display data transmitted by the second processing module 13. At this time, the main system processor 15 further includes an interface (including but not limited to HDMI, VGA, DisplayPort, TypeC, etc.) for transmitting display data, and is connected to the second processing module 13 through the interface to receive the display data. That is, at least two connection paths are included between the main system processor 15 and the second processing module 13, one connection path is connected in a USB manner and is used for sending the second touch coordinate to the second processing module 13, and the other connection path is connected in an interface manner and is used for receiving the display data sent by the second processing module 13.
In practical applications, a plurality of connection paths may exist between two devices; fig. 4 represents all the connection paths between two devices as a single connection path.
Optionally, fig. 5 is a schematic diagram of a first data flow provided in an embodiment of the present application, and is a schematic diagram of a data flow of the interactive intelligent device shown in fig. 4 when executing the touch implementation method. Referring to fig. 5, the host system processor receives an interface display instruction and determines a coordinate area of the second display interface in the first coordinate system. And then, the main system processor receives the display data sent by the second processing module, determines a second coordinate system according to the display data, and then sends the display data of the first display channel and the display data of the second display channel to the touch screen for display. Further, after the touch screen receives touch operation, the touch processor detects the change condition of the collected data of the component when the touch operation occurs to obtain a first touch coordinate. And then, the touch processor sends the first touch coordinate to the main system processor. And after receiving the first touch coordinate, the main system processor confirms whether the first touch coordinate falls in a coordinate area of the second display interface. And if so, determining that the touch operation acts on the second display interface, and at the moment, converting the first touch coordinate into a second touch coordinate by the main system processor and sending the second touch coordinate to the second processing module. And the second processing module responds according to the received second touch coordinate. Specifically, the sending of the second touch coordinate by the main system processor means that the second touch coordinate is placed in a buffer, and the second touch coordinate is obtained from the buffer when the second processing module polls. It can be understood that, in the above process, the second processing module sends the display data of the second display channel to the main system processor in real time, so that the main system processor sends the display data of the first display channel and the display data of the second display channel to the touch screen in real time for display. Further, if the main system processor confirms that the first touch coordinate is in the coordinate area of the first display interface, responding according to the first touch coordinate.
Further, in another embodiment, the first processing module includes a touch processor, a main system processor, and a pass-through processor, and the first display channel is a display channel of the main system; the touch control processor is used for obtaining a first touch control coordinate according to the touch control operation and sending the first touch control coordinate to the main system processor; the main system processor is used for converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the transparent transmission processor; and the transparent transmission processor is used for sending the second touch coordinate obtained by the main system processor to the second processing module.
Fig. 6 is a third structural schematic diagram of an interactive intelligent device according to an embodiment of the present disclosure, where the first processing module 12 includes a touch processor 16, a main system processor 17, and a pass-through processor 18; the touch processor 16 is connected to the touch screen 11 and the main system processor 17, the main system processor 17 is connected to the touch screen 11, the second processing module 13, and the pass-through processor 18, and the pass-through processor 18 is connected to the second processing module 13.
The touch processor 16 is the same as the touch processor 14 described above, with the same interfaces, connection relationships, and functions, which are therefore not described again here.
Further, the main system processor 17 may be a CPU, and its corresponding operating system may be an Android system. The connection and data transmission between the main system processor 17 and the touch processor 16 are the same as those between the main system processor 15 and the touch processor 14 and are not described again here. In this embodiment, the main system processor 17 may convert the first touch coordinate into the second touch coordinate when it determines, according to the first touch coordinate, that the touch operation acts on the second display interface. The main system processor 17 is also connected to the display screen in the touch screen 11 and sends display data to the display screen to instruct it to display. The main system processor 17 is further connected to the second processing module 13 via an interface for transmitting display data (including but not limited to HDMI, VGA, DisplayPort, TypeC, etc.) to receive display data.
Further, the transparent transmission processor 18 may be an MCU. Optionally, the transparent transmission processor 18 and the main system processor 17 may be disposed on the same motherboard or on different motherboards, and the transparent transmission processor 18 is connected to the main system processor 17 to receive the second touch coordinate sent by the main system processor 17. The two may be connected by a bus or by USB; when they are connected by USB, the corresponding USB interface in the main system processor 17 has the host function and the corresponding USB interface in the transparent transmission processor 18 has the device function. Further, the transparent transmission processor 18 is connected to the second processing module 13 by USB to transmit the second touch coordinate to the second processing module 13, where the corresponding USB interface in the transparent transmission processor 18 has the device function and the corresponding USB interface of the second processing module 13 has the host function. In this case, when the transparent transmission processor 18 sends the second touch coordinate to the second processing module 13, the second touch coordinate first needs to be placed in the corresponding buffer, from which the second processing module 13 obtains it when polling. It should be noted that the transparent transmission processor 18 may also perform other functions according to the actual situation. The advantage of providing the transparent transmission processor is that, when the USB interface having the host function in the main system processor cannot exchange data directly with the USB interface of the second processing module, which also has the host function, data transmission between the main system processor and the second processing module can still be realized through the transparent transmission processor.
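A minimal sketch of the forwarding role of the transparent transmission processor described above, reusing the illustrative buffer-and-poll idea from earlier; the class and method names are hypothetical and only indicate the direction of the data.

    class PassthroughProcessor:
        """Forwards second touch coordinates from the main system processor to a
        device-side buffer that the second processing module polls."""

        def __init__(self):
            self._outgoing = []   # stands in for the USB device-endpoint buffer

        def receive_from_main(self, coord):
            # The transparent transmission processor forwards the coordinate unchanged.
            self._outgoing.append(coord)

        def polled_by_second_module(self):
            # Returns the next coordinate when the second processing module polls.
            return self._outgoing.pop(0) if self._outgoing else None

    mcu = PassthroughProcessor()
    mcu.receive_from_main((960, 540))          # from the main system processor
    print(mcu.polled_by_second_module())       # read on the second module's poll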
In practical applications, a plurality of connection paths may exist between the devices; for simplicity, fig. 6 depicts these connection paths between each pair of devices as a single connection path.
Fig. 7 is a second data flow diagram provided in an embodiment of the present application, showing the data flow of the interactive intelligent device of fig. 6 when it executes the touch implementation method. Referring to fig. 7, the main system processor receives an interface display instruction and determines the coordinate area of the second display interface in the first coordinate system. The main system processor then receives the display data sent by the second processing module, determines the second coordinate system according to the display data, and sends the display data of the first display channel and the display data of the second display channel to the touch screen for display. Further, after the touch screen receives a touch operation, the touch processor detects the change in the data collected by the components when the touch operation occurs and thereby obtains a first touch coordinate. The touch processor then sends the first touch coordinate to the main system processor. After receiving the first touch coordinate, the main system processor confirms whether the first touch coordinate falls within the coordinate area of the second display interface. If so, it determines that the touch operation acts on the second display interface, converts the first touch coordinate into a second touch coordinate, and sends the second touch coordinate to the transparent transmission processor. After receiving the second touch coordinate, the transparent transmission processor forwards it to the second processing module, which responds according to the received second touch coordinate. It can be understood that, throughout this process, the second processing module sends the display data of the second display channel to the main system processor in real time, so that the main system processor sends the display data of the first display channel and the display data of the second display channel to the touch screen in real time for display. Further, if the main system processor confirms that the first touch coordinate falls within the coordinate area of the first display interface, it responds according to the first touch coordinate.
Further, in another embodiment, the first processing module includes a touch processor and a main system processor, and the first display channel is a display channel of a main system; the main system processor sends the coordinate area and the second coordinate system to the touch processor; and the touch processor obtains a first touch coordinate according to the touch operation, converts the first touch coordinate into a second touch coordinate suitable for the second display interface when confirming that the touch operation acts on the second display interface according to the first touch coordinate, and sends the second touch coordinate to the second processing module.
Fig. 8 is a fourth structural schematic diagram of an interactive intelligent device according to an embodiment of the present application, where the first processing module 12 includes a touch processor 19 and a main system processor 20; the touch processor 19 is connected to the touch screen 11, the main system processor 20, and the second processing module 13, and the main system processor 20 is connected to the touch screen 11 and the second processing module 13.
The touch processor 19 is connected to the touch device in the touch screen 11 and is configured to detect a touch operation and obtain a first touch coordinate. The touch processor 19 may be a chip, such as an MCU, that has certain data processing capability and integrates multiple I/O functions. Optionally, the touch processor may be installed in the touch screen 11 or be independent of the touch screen 11. The touch processor 19 and the touch device may be connected through a bus or other means; the touch processor 19 may receive the data collected by each component in the touch device, and it further stores a correspondence between the first coordinate system and the position area detected by each component, so that the position of the touch operation in the first coordinate system can be determined through this correspondence. The first coordinate system may be transmitted by the main system processor 20 to the touch processor 19. Further, the touch processor 19 is connected to the main system processor 20, and the two may be connected by a bus or by USB. In one embodiment, when the main system processor 20 and the touch processor 19 are connected by USB, the USB interface of the main system processor 20 used for connecting to the touch processor 19 is an interface having a host function or an OTG function, and the USB interface in the touch processor 19 is an interface having a device function. Typically, when the main system processor 20 instructs the touch screen 11 to display the second display interface, it may send the second coordinate system and the coordinate area of the second display interface to the touch processor 19; thereafter, the touch processor 19 can convert the first touch coordinate into the second touch coordinate when it detects the first touch coordinate. Further, the touch processor 19 is connected to the second processing module 13; the two may be connected by USB or by a bus, and the USB connection is taken as an example in this embodiment. When they are connected by USB, the USB interface of the second processing module 13 used for connecting to the touch processor 19 has the host function, and the USB interface in the touch processor 19 used for connecting to the second processing module 13 has the device function. In this case, when the touch processor 19 sends data to the second processing module 13, the data first needs to be placed in the buffer corresponding to the touch processor 19; when the second processing module 13 polls, it reads the buffer to obtain the data, thereby enabling the touch processor 19 to send data to the second processing module 13.
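The stored correspondence between sensing components and the first coordinate system can be illustrated with a deliberately simplified sketch: here each component is assumed to cover an equal-width strip of the X axis, which is a hypothetical layout chosen only to show how a touched component index maps to a position in the first coordinate system.

    # Hypothetical layout: 64 sensing components evenly covering an X axis of 0..3840.
    FIRST_X_MAX = 3840
    NUM_COMPONENTS = 64
    STRIP_WIDTH = FIRST_X_MAX / NUM_COMPONENTS

    def component_to_x(component_index: int) -> float:
        """Return the centre X position (first coordinate system) of a component's strip."""
        return (component_index + 0.5) * STRIP_WIDTH

    print(component_to_x(0))    # leftmost strip  -> 30.0
    print(component_to_x(31))   # near the middle -> 1890.0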
The main system processor 20 is connected to the second processing module 13 via an interface (including but not limited to HDMI, VGA, DisplayPort, TypeC, etc.) for transmitting display data, so as to receive the display data transmitted by the second processing module 13. The main system processor 20 is also coupled to a display screen of the touch screen 11 for sending display data to the display screen to instruct the display screen to display.
In practical applications, a plurality of connection paths may exist between the devices; for simplicity, fig. 8 depicts these connection paths between each pair of devices as a single connection path.
Specifically, fig. 9 is a third data flow diagram provided in an embodiment of the present application, showing the data flow of the interactive intelligent device of fig. 8 when it executes the touch implementation method. Referring to fig. 9, the main system processor receives an interface display instruction and determines the coordinate area of the second display interface in the first coordinate system. The main system processor then receives the display data sent by the second processing module, determines the second coordinate system according to the display data, sends the display data of the first display channel and the display data of the second display channel to the touch screen in real time for display, and sends the coordinate area, the second coordinate system, and related content to the touch processor. Further, after the touch screen receives a touch operation, the touch processor detects the change in the data collected by the components when the touch operation occurs and thereby obtains a first touch coordinate. The touch processor then determines whether the first touch coordinate falls within the coordinate area of the second display interface. If so, it determines that the touch operation acts on the second display interface, converts the first touch coordinate into a second touch coordinate, and sends the second touch coordinate to the second processing module, which responds according to the received second touch coordinate. It can be understood that, throughout this process, the second processing module sends the display data of the second display channel to the main system processor in real time, so that the main system processor sends the display data of the first display channel and the display data of the second display channel to the touch screen in real time for display. Further, if the touch processor determines that the first touch coordinate falls within the coordinate area of the first display interface, it sends the first touch coordinate to the main system processor, which responds according to the first touch coordinate. It should be noted that, after the touch processor obtains the first touch coordinate, the first touch coordinate needs to be sent to the main system processor regardless of whether it acts on the second display interface, so that the main system processor knows that a touch operation and its corresponding touch coordinate have been received.
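The routing performed by the touch processor in this third data flow can be sketched as below; the callables standing in for the main system processor and the second processing module are hypothetical placeholders, and the numeric values are illustrative.

    def touch_processor_route(first_coord, area, second_max, send_to_main, send_to_second):
        """Route a first touch coordinate as in the third data flow.

        The first touch coordinate is always reported to the main system processor;
        if it falls within the coordinate area of the second display interface it is
        also converted and sent to the second processing module.
        """
        x, y = first_coord
        x_min, y_min, x_max, y_max = area
        X_max, Y_max = second_max

        send_to_main(first_coord)                       # main system always sees the touch
        if x_min <= x <= x_max and y_min <= y <= y_max:
            X = X_max * (x - x_min) / (x_max - x_min)   # setting formula of the embodiments
            Y = Y_max * (y - y_min) / (y_max - y_min)
            send_to_second((X, Y))

    touch_processor_route((1760, 940), (800, 400, 2720, 1480), (1920, 1080),
                          send_to_main=print, send_to_second=print)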
The connection via the bus mentioned in the above embodiments includes, but is not limited to, a connection via at least one of an I2C bus, a Serial Peripheral Interface (SPI) bus, and a Universal Asynchronous Receiver/Transmitter (UART).
It should be noted that, in practical applications, the types of the interfaces mentioned above may be changed according to actual situations, and the execution bodies and the data flow directions of the steps in the touch implementation method may be changed according to the types of the interfaces, and the three embodiments described above are only three alternatives.
By configuring the connection relationships among the hardware in the first processing module and the second processing module, as well as the number and types of interfaces, the touch implementation method can follow different data flows. This reduces the restrictions that the touch implementation method imposes on how the hardware devices are connected, and enhances the flexibility and compatibility of the touch implementation method.
On the basis of the foregoing embodiments, an embodiment of the present application further provides an interactive intelligent device, where the interactive intelligent device at least includes: the touch screen is provided with a first display interface and a second display interface, the first display interface corresponds to a first display channel of the first processing module, and the second display interface corresponds to a second display channel of the second processing module;
the touch screen is used for receiving touch operation;
the first processing module is used for obtaining a first touch coordinate according to the touch operation, converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the second processing module;
and the second processing module is used for responding according to the second touch coordinate.
Further, in one embodiment, the second display interface is located within the first display interface.
Further, in an embodiment, a first coordinate system in the first display interface is obtained through a coordinate field of the first display channel, a second coordinate system in the second display interface is obtained through a coordinate field of the second display channel, the first touch coordinate is a coordinate in the first coordinate system, and the second touch coordinate is a coordinate in the second coordinate system.
Further, in an embodiment, the first processing module is further configured to receive an interface display instruction, determine, in response to the interface display instruction, a coordinate area of a second display interface in the first coordinate system, receive display data sent by the second processing module, determine a second coordinate system according to the display data, and instruct the touch screen to display the display data in the coordinate area.
Further, in an embodiment, the first processing module includes a touch processor and a main system processor, and the first display channel is a display channel of a main system; the touch control processor is used for obtaining a first touch control coordinate according to the touch control operation and sending the first touch control coordinate to the main system processor; and the main system processor is used for converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the second processing module.
Further, in an embodiment, the first processing module includes a touch processor, a main system processor, and a pass-through processor, and the first display channel is a display channel of the main system; the touch processor is used for obtaining a first touch coordinate according to the touch operation and sending the first touch coordinate to the main system processor; the main system processor is used for converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the transparent transmission processor; and the transparent transmission processor is used for sending the second touch coordinate obtained by the main system processor to the second processing module.
Further, in an embodiment, the first processing module includes a touch processor and a main system processor, and the first display channel is a display channel of a main system; the main system processor is used for sending the coordinate area and the second coordinate system to the touch processor; the touch processor is configured to obtain a first touch coordinate according to the touch operation, convert the first touch coordinate into a second touch coordinate suitable for the second display interface when it is determined that the touch operation acts on the second display interface according to the first touch coordinate, and send the second touch coordinate to the second processing module.
Further, in an embodiment, when the first processing module is configured to convert the first touch coordinate into a second touch coordinate suitable for the second display interface, the first processing module is specifically configured to: substitute the first touch coordinate into a set formula to calculate the second touch coordinate suitable for the second display interface; the set formula comprises: X = X_max × (x - x_min)/(x_max - x_min), Y = Y_max × (y - y_min)/(y_max - y_min); wherein (X, Y) is the second touch coordinate, (x, y) is the first touch coordinate, X_max and Y_max are respectively the maximum value of the X axis and the maximum value of the Y axis in the second coordinate system, x_min and x_max are respectively the minimum value and the maximum value of the second display interface on the X axis in the first coordinate system, and y_min and y_max are respectively the minimum value and the maximum value of the second display interface on the Y axis in the first coordinate system.
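As a worked example of the set formula under assumed values: suppose the second display interface occupies the region from x_min = 800 to x_max = 2720 and from y_min = 400 to y_max = 1480 in the first coordinate system, and the second coordinate system spans X_max = 1920 by Y_max = 1080. A first touch coordinate (x, y) = (1760, 940) then converts to X = 1920 × (1760 - 800)/(2720 - 800) = 960 and Y = 1080 × (940 - 400)/(1480 - 400) = 540, i.e. the centre of the second display interface. These numbers are illustrative only and are not taken from the embodiments.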
Further, in an embodiment, when the first processing module is configured to confirm that the touch operation is applied to the second display interface according to the first touch coordinate, the first processing module is specifically configured to: and when the first touch coordinate is confirmed to fall into the coordinate area, determining that the touch operation acts on the second display interface.
Further, in an embodiment, the first processing module is further configured to respond according to the first touch coordinate when it is determined that the touch operation is applied to the first display interface according to the first touch coordinate.
Further, in an embodiment, the first display channel is an android display channel, and the second display channel is different from the first display channel in data source type.
For the hardware structure of the interactive intelligent device provided in the embodiments of the present application and the functions specifically performed by each piece of hardware, reference may be made to the relevant description in the embodiments of the touch implementation method; the same beneficial effects are achieved.
It is understood that computer executable instructions (i.e. computer programs) used when executing the operations of the method in the embodiments of the present application may be stored in a storage medium, and a first processing module, a second processing module, etc. may implement a touch implementation method by executing corresponding computer executable instructions stored in the storage medium.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.

Claims (19)

1. A touch implementation method is applied to an interactive intelligent device, and the interactive intelligent device at least comprises: the touch screen is provided with a first display interface and a second display interface, the first display interface corresponds to a first display channel of the first processing module, and the second display interface corresponds to a second display channel of the second processing module;
the touch screen receives touch operation;
the first processing module obtains a first touch coordinate according to the touch operation, converts the first touch coordinate into a second touch coordinate suitable for the second display interface when confirming that the touch operation acts on the second display interface according to the first touch coordinate, and sends the second touch coordinate to the second processing module;
and the second processing module responds according to the second touch coordinate.
2. The method of claim 1, wherein the second display interface is located within the first display interface.
3. The method of claim 1, wherein a first coordinate system in the first display interface is obtained through a coordinate field of the first display channel, wherein a second coordinate system in the second display interface is obtained through a coordinate field of the second display channel, wherein the first touch coordinate is a coordinate in the first coordinate system, and wherein the second touch coordinate is a coordinate in the second coordinate system.
4. The method of claim 3, wherein before the touch screen receives the touch operation, the method further comprises:
the first processing module receives an interface display instruction;
the first processing module responds to the interface display instruction and determines a coordinate area of a second display interface in the first coordinate system;
the first processing module receives the display data sent by the second processing module;
and the first processing module determines a second coordinate system according to the display data and indicates the touch screen to display the display data in the coordinate area.
5. The method of claim 1, wherein the first processing module comprises a touch processor and a main system processor, and wherein the first display channel is a display channel of a main system;
the touch control processor obtains a first touch control coordinate according to the touch control operation and sends the first touch control coordinate to the main system processor;
and when the main system processor confirms that the touch operation acts on the second display interface according to the first touch coordinate, the first touch coordinate is converted into a second touch coordinate suitable for the second display interface, and the second touch coordinate is sent to the second processing module.
6. The method of claim 1, wherein the first processing module comprises a touch processor, a main system processor, and a pass-through processor, and wherein the first display channel is a display channel of a main system;
the touch control processor obtains a first touch control coordinate according to the touch control operation and sends the first touch control coordinate to the main system processor;
when the main system processor confirms that the touch operation acts on the second display interface according to the first touch coordinate, the first touch coordinate is converted into a second touch coordinate suitable for the second display interface, and the second touch coordinate is sent to the transparent transmission processor;
and the transparent transmission processor sends the second touch coordinate to the second processing module.
7. The method of claim 4, wherein the first processing module comprises a touch processor and a main system processor, and wherein the first display channel is a display channel of a main system;
the main system processor sends the coordinate area and the second coordinate system to the touch processor;
and the touch processor obtains a first touch coordinate according to the touch operation, converts the first touch coordinate into a second touch coordinate suitable for the second display interface when confirming that the touch operation acts on the second display interface according to the first touch coordinate, and sends the second touch coordinate to the second processing module.
8. The method of claim 3, wherein the first processing module converting the first touch coordinates into second touch coordinates suitable for the second display interface comprises:
the first processing module substitutes the first touch coordinate into a set formula to calculate a second touch coordinate suitable for the second display interface;
the setting formula comprises: X = X_max × (x - x_min)/(x_max - x_min), Y = Y_max × (y - y_min)/(y_max - y_min);
wherein (X, Y) is the second touch coordinate, (x, y) is the first touch coordinate, X_max and Y_max are respectively the maximum value of the X axis and the maximum value of the Y axis in the second coordinate system, x_min and x_max are respectively the minimum value and the maximum value of the second display interface on the X axis in the first coordinate system, and y_min and y_max are respectively the minimum value and the maximum value of the second display interface on the Y axis in the first coordinate system.
9. The method of claim 4, wherein the first processing module confirming that the touch operation is applied to the second display interface according to the first touch coordinate comprises:
and when the first processing module confirms that the first touch coordinate falls into the coordinate area, determining that the touch operation acts on the second display interface.
10. The method of claim 1 or 9, further comprising:
and the first processing module responds according to the first touch coordinate when confirming that the touch operation acts on the first display interface according to the first touch coordinate.
11. The method of claim 1, wherein the first display channel is an android display channel, and wherein the second display channel is of a different data source type than the first display channel.
12. An interactive smart device, characterized in that it comprises at least: the touch screen is provided with a first display interface and a second display interface, the first display interface corresponds to a first display channel of the first processing module, and the second display interface corresponds to a second display channel of the second processing module;
the touch screen is used for receiving touch operation;
the first processing module is used for obtaining a first touch coordinate according to the touch operation, converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the second processing module;
and the second processing module is used for responding according to the second touch coordinate.
13. The interactive smart device of claim 12, wherein a first coordinate system in the first display interface is obtained through a coordinate field of the first display channel, wherein a second coordinate system in the second display interface is obtained through a coordinate field of the second display channel, wherein the first touch coordinate is a coordinate in the first coordinate system, and wherein the second touch coordinate is a coordinate in the second coordinate system.
14. The interactive smart device of claim 13, further comprising:
the first processing module is further configured to receive an interface display instruction, determine, in response to the interface display instruction, a coordinate area of a second display interface in the first coordinate system, receive display data sent by the second processing module, determine a second coordinate system according to the display data, and instruct the touch screen to display the display data in the coordinate area.
15. The interactive smart device of claim 12, wherein the first processing module comprises a touch processor and a host system processor, and wherein the first display channel is a display channel of a host system;
the touch control processor is used for obtaining a first touch control coordinate according to the touch control operation and sending the first touch control coordinate to the main system processor;
and the main system processor is used for converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the second processing module.
16. The interactive smart device of claim 12, wherein the first processing module comprises a touch processor, a main system processor, and a pass-through processor, and the first display channel is a display channel of a main system;
the touch control processor is used for obtaining a first touch control coordinate according to the touch control operation and sending the first touch control coordinate to the main system processor;
the main system processor is used for converting the first touch coordinate into a second touch coordinate suitable for the second display interface when the touch operation is confirmed to act on the second display interface according to the first touch coordinate, and sending the second touch coordinate to the transparent transmission processor;
and the transparent transmission processor is used for sending the second touch coordinate obtained by the main system processor to the second processing module.
17. The interactive smart device of claim 13, wherein the first processing module comprises a touch processor and a host system processor, and wherein the first display channel is a display channel of a host system;
the main system processor is used for sending the coordinate area and the second coordinate system to the touch processor;
the touch processor is configured to obtain a first touch coordinate according to the touch operation, convert the first touch coordinate into a second touch coordinate suitable for the second display interface when it is determined that the touch operation acts on the second display interface according to the first touch coordinate, and send the second touch coordinate to the second processing module.
18. The interactive smart device of claim 12, wherein the first processing module, when converting the first touch coordinate into a second touch coordinate suitable for the second display interface, is specifically configured to:
substituting the first touch coordinate into a set formula to calculate a second touch coordinate suitable for the second display interface;
the setting formula comprises: X = X_max × (x - x_min)/(x_max - x_min), Y = Y_max × (y - y_min)/(y_max - y_min);
wherein (X, Y) is the second touch coordinate, (x, y) is the first touch coordinate, X_max and Y_max are respectively the maximum value of the X axis and the maximum value of the Y axis in the second coordinate system, x_min and x_max are respectively the minimum value and the maximum value of the second display interface on the X axis in the first coordinate system, and y_min and y_max are respectively the minimum value and the maximum value of the second display interface on the Y axis in the first coordinate system.
19. The interactive smart device of claim 12, wherein the first display channel is an android display channel, and wherein the second display channel is of a different data source type than the first display channel.
CN202010209572.6A 2020-03-23 2020-03-23 Touch control implementation method and interactive intelligent device Active CN111427501B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010209572.6A CN111427501B (en) 2020-03-23 2020-03-23 Touch control implementation method and interactive intelligent device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010209572.6A CN111427501B (en) 2020-03-23 2020-03-23 Touch control implementation method and interactive intelligent device

Publications (2)

Publication Number Publication Date
CN111427501A true CN111427501A (en) 2020-07-17
CN111427501B CN111427501B (en) 2021-07-23

Family

ID=71549246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010209572.6A Active CN111427501B (en) 2020-03-23 2020-03-23 Touch control implementation method and interactive intelligent device

Country Status (1)

Country Link
CN (1) CN111427501B (en)


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1855001A (en) * 2005-04-18 2006-11-01 乐金电子(沈阳)有限公司 Booknote computer with touching screen
US20110176732A1 (en) * 2010-01-18 2011-07-21 Go Maruyama Image processing method and device, and imaging apparatus using the image processing device
CN102129345A (en) * 2010-01-19 2011-07-20 Lg电子株式会社 Mobile terminal and control method thereof
EP2966558A1 (en) * 2013-04-07 2016-01-13 Guangzhou Shirui Electronics Co., Ltd. Multi-channel touch control method, device and computer storage media for integration machine
CN104820547A (en) * 2014-01-30 2015-08-05 三星显示有限公司 Touch-in-touch display apparatus
CN107229407A (en) * 2016-03-25 2017-10-03 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN106201413A (en) * 2016-06-24 2016-12-07 青岛海信电器股份有限公司 Touch-control distribution method, device and the liquid crystal indicator of one screen windows display
CN107340967A (en) * 2017-07-10 2017-11-10 广州视源电子科技股份有限公司 Intelligent device, operation method and device thereof and computer storage medium
CN108549500A (en) * 2018-04-16 2018-09-18 广州视源电子科技股份有限公司 Touch panel control circuit and touch panel
CN109782968A (en) * 2018-12-18 2019-05-21 维沃移动通信有限公司 A kind of interface method of adjustment and terminal device
CN110456998A (en) * 2019-07-31 2019-11-15 广州视源电子科技股份有限公司 display method and device, storage medium and processor

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115066892A (en) * 2020-12-30 2022-09-16 广州视源电子科技股份有限公司 Data transmission method and data transmission system
CN115167752A (en) * 2022-06-28 2022-10-11 华人运通(上海)云计算科技有限公司 Single-screen system and control method thereof

Also Published As

Publication number Publication date
CN111427501B (en) 2021-07-23

Similar Documents

Publication Publication Date Title
TWI400638B (en) Touch display device, touch display system, and method for adjusting touch area thereof
US10698530B2 (en) Touch display device
US9070321B2 (en) Tablet computer and method for controlling the same
IL200983A (en) System and method for driving and receiving data from multiple touch screen devices
CN111427501B (en) Touch control implementation method and interactive intelligent device
US20070126712A1 (en) Display apparatus, display system and control method thereof
WO2019200906A1 (en) Touch panel control circuit and touch panel
US8896611B2 (en) Bi-directional data transmission system and method
US20080136828A1 (en) Remote Access Device
US10699664B2 (en) Image display system and method of transforming display panels of mobile devices into being compatible with medical images display standard
US11144155B2 (en) Electronic device
CN113253877B (en) Electronic whiteboard system and control method thereof
KR100676366B1 (en) Method and system for controlling computer using touchscrren of portable wireless terminal
TW201306566A (en) Method and system for controlling multimedia monitor
KR102188870B1 (en) Interface device and method between wehicle terminal and external terminal
CN102810054B (en) The control method of display device and display device
CN107015935B (en) Docking apparatus and control method thereof
CN112004043B (en) Display control system and display device
CN115617294A (en) Screen display method, device and equipment and readable storage medium
CN108881800B (en) Display device, setting method of information terminal in display device, and display system
CN114115633A (en) Touch method and device of single-touch screen multi-touch receiving equipment and computer equipment
CN219642228U (en) Display and multi-screen display system
CN213122917U (en) KVM multi-system combined control device based on OSD
CN211375585U (en) Multi-computer switching device capable of adjusting picture size
US20060189271A1 (en) Display card with a wireless module

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant