CN113778310A - Cross-device control method and computer program product


Info

Publication number
CN113778310A
CN113778310A (application CN202110898684.1A)
Authority
CN
China
Prior art keywords
gesture
gesture operation
touch
parameter
touch point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110898684.1A
Other languages
Chinese (zh)
Inventor
吴惟惟
王斌
姜在斌
刘松
张浩普
倪斯文
巫丹丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Innovation Co
Original Assignee
Alibaba Singapore Holdings Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Singapore Holdings Pte Ltd filed Critical Alibaba Singapore Holdings Pte Ltd
Priority to CN202110898684.1A priority Critical patent/CN113778310A/en
Publication of CN113778310A publication Critical patent/CN113778310A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 - Scrolling or panning
    • G06F 3/04855 - Interaction with scrollbars
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 - Indexing scheme relating to G06F 3/048
    • G06F 2203/04806 - Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048 - Indexing scheme relating to G06F 3/048
    • G06F 2203/04808 - Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure disclose a cross-device control method and a computer program product. The method includes the following steps: in response to detecting a gesture operation, capturing a first gesture operation parameter from a gesture touch area of a graphical user interface of an application; and packaging the first gesture operation parameter together with a size parameter of the gesture touch area of the first device according to a preset protocol, and sending the packaged parameters to a second device, so that the second device controls information displayed in its corresponding display area based on the received packaged parameters. The technical solution responds effectively to user intent, conforms to user habits, realizes interaction between the user and the in-vehicle device, and can improve interaction efficiency and user experience.

Description

Cross-device control method and computer program product
Technical Field
The disclosed embodiments relate to the technical field of device control, and in particular, to a cross-device control method and a computer program product.
Background
With the development and progress of society, there are more and more vehicles on the road, and when traveling many users rely on in-vehicle devices (such as an in-vehicle multimedia system or head unit) to view information synchronized from a smart terminal (such as a smartphone), for example navigation information, because the screen of an in-vehicle device is usually larger than that of the smart terminal and is therefore more convenient for interacting with the information (display, viewing, operation, and so on). In the related art, the smartphone is connected to the in-vehicle device in a wired or wireless manner, and the information synchronized between them is operated through physical keys of the in-vehicle device or through its screen. However, the inventors found that, because cars are delivered at different times, the screens of some in-vehicle devices are not touch screens, or their touch screens have poor sensitivity, so that users cannot, or can only with difficulty, perform touch operations on the information displayed by the in-vehicle device through gestures. Operating the screen information through physical keys, on the one hand, limits the available operation modes; on the other hand, since most users are accustomed to gesture touch operations, it does not conform to user habits. Therefore, an interaction control mode is needed that can effectively respond to the user's intent, conform to user habits, realize interaction between the user and the in-vehicle device, and improve the user experience.
Disclosure of Invention
The embodiment of the disclosure provides a cross-device control method and a computer program product.
In a first aspect, an embodiment of the present disclosure provides a cross-device control method.
Specifically, the cross-device control method includes:
in response to detecting a gesture operation, capturing a first gesture operation parameter from a gesture touch area of a graphical user interface of an application; and packaging the first gesture operation parameter together with a size parameter of the gesture touch area of the first device according to a preset protocol, and sending the packaged parameters to a second device, so that the second device controls information displayed in its corresponding display area based on the received packaged parameters.
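The two steps above can be sketched in a few lines. This is an illustrative assumption only: the function name `package_gesture`, the JSON field layout, and the use of JSON itself are not specified by this disclosure, which leaves the "preset protocol" abstract.

```python
import json

def package_gesture(touch_points, zoom_factor, area_width, area_height):
    """Package captured first-gesture parameters together with the size
    parameter (width/height) of the first device's gesture touch area."""
    payload = {
        "touch_points": touch_points,   # [(x, y), ...] pixel positions
        "zoom_factor": zoom_factor,     # None for non-zoom gestures
        "area_size": {"width": area_width, "height": area_height},
    }
    # The resulting bytes would be sent to the second device over the
    # wired or wireless link mentioned in the background section.
    return json.dumps(payload).encode("utf-8")
```

The second device can then decode the same structure and use `area_size` to rescale the touch points to its own display area.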
With reference to the first aspect, in a first implementation manner of the first aspect, the first gesture operation parameter includes one or more of the following parameters: touch point position, zoom factor, touch point movement distance.
With reference to the first aspect and the foregoing implementation manner of the first aspect, in a second implementation manner of the first aspect, the application is an application supporting a map navigation function, the gesture operation is a map zooming gesture, and the captured first gesture operation parameters include the positions of the two touch points and the zoom factor; the packaged first gesture operation parameters likewise include the positions of the two touch points and the zoom factor.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in a third implementation manner of the first aspect, the application is an application supporting a map navigation function, the gesture operation is a map zooming gesture, and the captured first gesture operation parameters include the positions of the two touch points and the zoom factor. The method further includes: determining the position of an intermediate point located midway between the two touch points based on the positions of the two touch points. The packaged first gesture operation parameters include the position of the intermediate point and the zoom factor.
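The intermediate-point computation is straightforward; a minimal sketch (the function name is illustrative, not from the disclosure):

```python
def midpoint(p1, p2):
    """Position of the intermediate point located midway between
    the two touch points p1 = (x1, y1) and p2 = (x2, y2)."""
    return ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
```

Packaging only this single point and the zoom factor, rather than both touch points, reduces the payload while still giving the second device an anchor around which to zoom.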
With reference to the first aspect and the foregoing implementation manners of the first aspect, in a fourth implementation manner of the first aspect, the application is an application supporting a map navigation function. When the gesture operation is a map moving gesture, the captured first gesture operation parameters include the position of a touch point and the moving distance of the touch point; when the gesture operation is a map zoom-in (magnifying) gesture, the captured first gesture operation parameters include the position of one touch point; when the gesture operation is a map zoom-out gesture, the first gesture operation parameters include the positions of two touch points; when the gesture operation is a map pitch-angle control gesture, the first gesture operation parameters include the positions of two touch points and the moving distance of each touch point.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in a fifth implementation manner of the first aspect, the method further includes: in response to a touch operation on a virtual control, sending a function execution command corresponding to the virtual control, packaged according to the preset protocol, to the second device, or triggering the first device to execute the function execution command corresponding to the virtual control, wherein the virtual control is arranged in a virtual control area of the graphical user interface of the application, and the virtual control area is located above, below, or alongside the gesture touch area.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in a sixth implementation manner of the first aspect, the gesture touch area is provided with a slider control, and the method further includes: capturing the moving distance of the slider in response to an operation on the slider control; and packaging the moving distance of the slider together with the size parameter of the slider and sending the packaged parameters to the second device, so that the second device controls information displayed in its corresponding display area based on the received packaged parameters.
In a second aspect, an embodiment of the present disclosure provides a cross-device control method.
Specifically, the cross-device control method includes:
receiving packaged parameters sent by a first device, and decapsulating them according to a preset protocol to obtain a first gesture operation parameter and a size parameter of a gesture touch area of the first device; converting the first gesture operation parameter into a second gesture operation parameter suitable for operation on a second device according to the size parameter of the gesture touch area of the first device and a size parameter of a display area of the second device; and controlling information displayed in the corresponding display area of the second device according to the second gesture operation parameter.
With reference to the second aspect, in a first implementation manner of the second aspect, converting the first gesture operation parameter into a second gesture operation parameter suitable for operation on the second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device includes: acquiring the size parameter of the display area of the second device; calculating a size ratio between the first device and the second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device; and converting the first gesture operation parameter into the second gesture operation parameter according to the size ratio.
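The conversion step can be sketched as follows. This is a minimal illustration; computing one ratio per axis is an assumption, since the disclosure does not specify whether a single ratio or separate width and height ratios are used.

```python
def convert_parameter(x, y, first_size, second_size):
    """Convert a touch-point position captured in the first device's
    gesture touch area (first_size = (width, height)) into a position
    in the second device's display area (second_size = (width, height)),
    using the size ratio between the two areas."""
    width_ratio = second_size[0] / first_size[0]
    height_ratio = second_size[1] / first_size[1]
    return x * width_ratio, y * height_ratio
```

For example, a touch at (100, 200) in a 400x800 phone touch area maps to (300, 200) in a 1200x800 in-vehicle display area; moving distances can be rescaled the same way.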
With reference to the second aspect and the foregoing implementation manner of the second aspect, in a second implementation manner of the second aspect, the first gesture operation parameter includes one or more of the following parameters: touch point position, zoom factor, touch point moving distance, and touch point moving direction. Converting the first gesture operation parameter into the second gesture operation parameter according to the size ratio includes: converting the touch point position into a touch point position on the second device, converting the touch point moving distance into a touch point moving distance on the second device, and/or converting the touch point moving direction into a touch point moving direction on the second device according to the size ratio.
With reference to the second aspect and the foregoing implementation manner of the second aspect, in a third implementation manner of the second aspect, the controlling, according to the second gesture operation parameter, information displayed in a corresponding display area of the second device includes: calculating to obtain a gesture control parameter according to the second gesture operation parameter; and controlling the information displayed in the corresponding display area of the second equipment according to the gesture control parameter.
With reference to the second aspect and the foregoing implementation manners of the second aspect, in a fourth implementation manner of the second aspect, the display interface of the second device is a map navigation interface, and calculating the gesture control parameter according to the second gesture operation parameter includes: when the second gesture operation parameters include the positions of two touch points and a zoom factor, determining that the gesture operation is a map zooming gesture, calculating the position of the intermediate point between the two touch points according to their positions, and calculating the map zoom scale according to the zoom factor; when the second gesture operation parameters include the position of the intermediate point and a zoom factor, determining that the gesture operation is a map zooming gesture and calculating the map zoom scale according to the zoom factor; when the second gesture operation parameters include the position of one touch point and the moving distance of the touch point, determining that the gesture operation is a map moving gesture, calculating the moving direction of the information displayed in the display area of the second device according to the position change of the touch point, and calculating the moving distance of the displayed information according to the moving distance of the touch point; when the second gesture operation parameters include the position of one touch point, determining that the gesture operation is a map zoom-in gesture and calculating a magnification scale according to the number of touch-point positions; when the second gesture operation parameters include the positions of two touch points, determining that the gesture operation is a map zoom-out gesture and calculating a reduction scale according to the number of positions of each touch point; and when the second gesture operation parameters include the positions of two touch points and the moving distance of each touch point, determining that the gesture operation is a map pitch-angle control gesture, determining the moving direction of each touch point according to its position and moving distance, determining the pitch-angle control direction according to the moving direction of the touch points, and calculating the pitch-angle control angle according to the moving distance of the touch points.
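The rules above amount to a dispatch on which parameters are present in the unpacked payload. The sketch below assumes a hypothetical dictionary layout (keys such as `touch_points` and `move_distance` are illustrative, not from the disclosure):

```python
def classify_gesture(params):
    """Infer the gesture type from which second-gesture-operation
    parameters are present, following the rules stated above."""
    points = params.get("touch_points", [])
    if "zoom_factor" in params:
        return "map_zoom"                 # positions/midpoint + zoom factor
    if len(points) == 1 and "move_distance" in params:
        return "map_move"                 # one point + its moving distance
    if len(points) == 1:
        return "map_zoom_in"              # one point only
    if len(points) == 2 and "move_distances" in params:
        return "map_pitch"                # two points + per-point distances
    if len(points) == 2:
        return "map_zoom_out"             # two points only
    return "unknown"
```

Ordering matters: the zoom-factor check must come first, because a zooming payload also carries touch-point positions.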
In a third aspect, an embodiment of the present disclosure provides a cross-device control method.
Specifically, the cross-device control method includes:
the method comprises the steps that in response to detection of gesture operation, first gesture operation parameters are obtained by capturing from a gesture touch area of an application program graphical user interface through first equipment, the first gesture operation parameters and size parameters of the gesture touch area of the first equipment are packaged according to a preset protocol, and the packaged parameters are sent to second equipment; the method comprises the steps that a second device receives packaging parameters sent by a first device, decapsulates the packaging parameters according to a preset protocol to obtain first gesture operation parameters and size parameters of a gesture touch area of the first device, converts the first gesture operation parameters into second gesture operation parameters suitable for operation on the second device according to the size parameters of the gesture touch area of the first device and size parameters of a display area of the second device, and controls information displayed in the corresponding display area of the second device according to the second gesture operation parameters.
In a fourth aspect, a cross-device control system is provided in embodiments of the present disclosure.
Specifically, the cross-device control system includes:
the gesture control system comprises a first device, a second device and a third device, wherein the first device is configured to respond to detection of gesture operation, capture a first gesture operation parameter from a gesture touch area of an application program graphical user interface, package the first gesture operation parameter and a size parameter of the gesture touch area of the first device according to a preset protocol, and send the packaged parameter to the second device; the second device is configured to receive the encapsulation parameter sent by the first device, decapsulate the encapsulation parameter according to a preset protocol to obtain a first gesture operation parameter and a size parameter of a gesture touch area of the first device, convert the first gesture operation parameter into a second gesture operation parameter suitable for operating on the second device according to the size parameter of the gesture touch area of the first device and the size parameter of a display area of the second device, and control information displayed in a corresponding display area of the second device according to the second gesture operation parameter.
In a fifth aspect, the disclosed embodiments provide an electronic device, including a memory and a processor, wherein the memory is used to store one or more computer instructions that support a cross-device control apparatus or system in performing the above cross-device control method, and the processor is configured to execute the computer instructions stored in the memory. The cross-device control apparatus or system may further include a communication interface for the cross-device control apparatus or system to communicate with other devices or a communication network.
In a sixth aspect, the disclosed embodiments provide a computer-readable storage medium for storing computer instructions used by a cross-device control apparatus or system, the storage medium containing the computer instructions involved in executing the above cross-device control method.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the technical scheme, the interaction with the vehicle-mounted equipment is realized by setting the gesture touch area at the mobile phone end. The technical scheme can effectively respond to the user intention, accords with the habit of the user, realizes interaction between the user and the vehicle-mounted equipment, and can improve interaction efficiency and user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of embodiments of the disclosure.
Drawings
Other features, objects, and advantages of embodiments of the disclosure will become more apparent from the following detailed description of non-limiting embodiments when taken in conjunction with the accompanying drawings. In the drawings:
FIG. 1 shows a flow diagram of a cross-device control method according to an embodiment of the present disclosure;
FIGS. 2A-2E illustrate schematic diagrams of gesture touch area locations according to an embodiment of the present disclosure;
FIG. 3 illustrates a virtual control schematic according to an embodiment of the present disclosure;
FIG. 4 shows a flow diagram of a cross-device control method according to another embodiment of the present disclosure;
FIG. 5 shows a flow diagram of a cross-device control method according to yet another embodiment of the present disclosure;
FIGS. 6A-6E illustrate cross-device control schematics according to an embodiment of the present disclosure;
FIG. 7 shows a block diagram of a cross-device control apparatus according to an embodiment of the present disclosure;
FIG. 8 shows a block diagram of a cross-device control apparatus according to another embodiment of the present disclosure;
FIG. 9 shows a block diagram of a cross-device control system according to an embodiment of the present disclosure;
FIG. 10 shows a block diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 11 is a schematic block diagram of a computer system suitable for use in implementing a cross-device control method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, exemplary embodiments of the disclosed embodiments will be described in detail with reference to the accompanying drawings so that they can be easily implemented by those skilled in the art. Also, for the sake of clarity, parts not relevant to the description of the exemplary embodiments are omitted in the drawings.
In the disclosed embodiments, it is to be understood that terms such as "including" or "having," etc., are intended to indicate the presence of the disclosed features, numbers, steps, behaviors, components, parts, or combinations thereof, and are not intended to preclude the possibility that one or more other features, numbers, steps, behaviors, components, parts, or combinations thereof may be present or added.
It should be further noted that the embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict. The embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
According to the technical scheme, the interaction with the vehicle-mounted navigation equipment is realized by setting the gesture touch area at the mobile phone end. The technical scheme can effectively respond to the user intention, accords with the habit of the user, realizes interaction between the user and the vehicle-mounted equipment, and can improve interaction efficiency and user experience.
Fig. 1 shows a flowchart of a cross-device control method according to an embodiment of the present disclosure, as shown in fig. 1, the cross-device control method includes the following steps S101 to S102:
in step S101, in response to detecting a gesture operation, capturing a first gesture operation parameter from a gesture touch area of the graphical user interface (GUI) of the application;
in step S102, the first gesture operation parameter and the size parameter of the gesture touch area of the first device are packaged according to a preset protocol, and the packaged parameter is sent to the second device, so that the second device controls information displayed in the corresponding display area of the second device based on the received packaged parameter.
As mentioned above, the inventors found that, because cars are delivered at different times, the screens of some in-vehicle devices are not touch screens, or their touch screens have poor sensitivity, so that users cannot, or can only with difficulty, perform touch operations on the information displayed by the in-vehicle device through gestures. Operating the screen information through physical keys, on the one hand, limits the available operation modes; on the other hand, since most users are accustomed to gesture touch operations, it does not conform to user habits. Therefore, an interaction control mode is needed that can effectively respond to the user's intent, conform to user habits, realize interaction between the user and the in-vehicle device, and improve the user experience.
In view of the above problem, in this embodiment, a cross-device control method is provided, in which a gesture touch area is set at a mobile phone end to implement interaction with an in-vehicle device. The technical scheme can effectively respond to the user intention, accords with the habit of the user, realizes interaction between the user and the vehicle-mounted equipment, and can improve interaction efficiency and user experience.
In an embodiment of the present disclosure, the cross-device control method may be applied to a first device, such as a user terminal or another electronic device capable of performing cross-device control.
In an embodiment of the present disclosure, the gesture operation refers to a gesture performed by the user in the gesture touch area of the GUI of the first device. The gesture operation may be a map zooming gesture for zooming the map, such as a two-finger pinch; a map moving gesture for moving the map, such as a finger movement; a map zoom-in gesture for enlarging the map, such as a single-finger double-tap; a map zoom-out gesture for shrinking the map, such as a two-finger tap or a single-finger tap; a map pitch-angle control gesture for changing the pitch angle of the map, such as a two-finger slide; and so on.
In an embodiment of the present disclosure, the gesture touch area refers to a touchable area in the GUI of the first device that provides the user with gesture input operation. The gesture touch area may be placed at any position of the GUI; depending on the requirements of the actual application, the GUI may be set to occupy the entire display area of the first device, or the display area of the first device excluding a basic-function display area. FIGS. 2A to 2E are schematic diagrams illustrating positions of gesture touch areas according to an embodiment of the present disclosure, in which the GUI is the display area of the first device excluding the basic-function display area, and the basic-function display area is provided with basic-function virtual controls such as "problem help" and "end navigation". The gesture touch area may be disposed at the bottom of the GUI, as shown in FIG. 2A; at the top, as shown in FIG. 2B; on the left side, as shown in FIG. 2C; on the right side, as shown in FIG. 2D; or it may even occupy the entirety of the GUI, as shown in FIG. 2E.
In an embodiment of the present disclosure, the first gesture operation parameter may include one or more of the following parameters: touch point location, zoom factor, distance of movement corresponding to the touch point, and the like.
The touch point position refers to a pixel position of a point where a user contacts the first device gesture touch area, and can be used for representing a reference point for subsequently controlling display information of the second device display area. In an embodiment of the present disclosure, if the touch point moves, the touch point position may include touch point positions before and after the movement.
The zoom factor refers to a factor generated based on the movement of the touch-point positions and used to characterize the zoom type and zoom scale, where the zoom type is either a zoom-out or a zoom-in operation. For example, if the two touch points move toward each other (a pinch-in), the zoom type may be determined as a zoom-out operation; if they move away from each other (a pinch-out), the zoom type may be determined as a zoom-in operation.
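One plausible way to derive such a factor (an assumption for illustration; the disclosure does not give a formula) is the ratio of the distance between the two touch points after and before the movement: a value greater than 1 means the fingers moved apart (zoom in), a value less than 1 means they moved together (zoom out).

```python
import math

def zoom_factor(p1_start, p2_start, p1_end, p2_end):
    """Derive a zoom factor from the start and end positions of two
    touch points: > 1 for a pinch-out (zoom in), < 1 for a pinch-in
    (zoom out)."""
    d_start = math.dist(p1_start, p2_start)
    d_end = math.dist(p1_end, p2_end)
    return d_end / d_start
```

The single scalar both encodes the zoom type (via its comparison with 1) and drives the zoom scale on the second device.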
The moving distance corresponding to the touch point refers to a pixel distance of the touch point moving in the gesture touch area, and can be used for representing a control proportion for subsequently controlling information display of the second device display area, such as a scaling proportion, an amplification proportion, a reduction proportion, a pitch angle change angle and the like.
In another embodiment of the present disclosure, the first gesture operation parameter may further include a touch point moving direction. The touch point moving direction refers to the direction in which a given touch point moves, such as upward or downward, and may be used to characterize the control direction for subsequently controlling the information displayed in the second device's display area, for example whether the displayed information range is enlarged or reduced, the direction in which the displayed information moves, or whether the displayed information is controlled to present a downward or an upward view. When there are two or more touch points, the touch point moving direction refers to the moving direction of each touch point.
In an embodiment of the present disclosure, the size parameter of the gesture touch area of the first device refers to a height and a width of the gesture touch area of the first device.
In an embodiment of the present disclosure, the preset protocol refers to a preset protocol capable of implementing encapsulation of data. The first device and the second device agree on the preset protocol, that is, after the first device encapsulates data using the preset protocol, the second device can decapsulate the encapsulated data using the same protocol or a corresponding protocol.
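The disclosure does not fix a concrete wire format for the preset protocol. As one hedged illustration, a JSON-based encapsulation that both devices agree on might look like the following; all field names here are assumptions:

```python
import json

def encapsulate(gesture_params: dict, touch_area_size: tuple) -> bytes:
    """Pack the first gesture operation parameters together with the size
    (width, height) of the first device's gesture touch area."""
    message = {
        "gesture_params": gesture_params,
        "touch_area": {"width": touch_area_size[0], "height": touch_area_size[1]},
    }
    return json.dumps(message).encode("utf-8")

def decapsulate(payload: bytes):
    """Inverse operation, performed by the second device using the same
    agreed protocol."""
    message = json.loads(payload.decode("utf-8"))
    size = (message["touch_area"]["width"], message["touch_area"]["height"])
    return message["gesture_params"], size
```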
In an embodiment of the present disclosure, the second device is a device that receives the package parameter sent by the first device and controls information displayed in a corresponding display area according to the package parameter. The second device may be, for example, an in-vehicle device or the like.
In the foregoing embodiment, the first device, as the device that controls the information displayed in the display area of the second device, first captures, after detecting a gesture operation, a first gesture operation parameter generated by the user's gesture from the gesture touch area of the GUI interface. It then packages the first gesture operation parameter together with the size parameter of its gesture touch area and sends the packaged parameters to the second device, so that the second device controls the information displayed in its corresponding display area based on the received packaged parameters, thereby implementing effective interaction between a user terminal and a vehicle-mounted device.
In an embodiment of the present disclosure, the application is an application supporting a map navigation function, and the gesture operation is a map zoom gesture, such as a finger pinch gesture, that is, two fingers move toward or away from each other in the gesture touch area. Because the map zoom gesture is a two-finger operation, two touch points are captured, and the captured first gesture operation parameters include the positions of the two touch points and the zoom factor; the packaged first gesture operation parameters accordingly also include the positions of the two touch points and the zoom factor.
In another embodiment of the present disclosure, when the gesture operation is a map zoom gesture, the captured first gesture operation parameters likewise include the positions of the two touch points and the zoom factor. In this embodiment, however, the control reference point for subsequently controlling the display information of the second device display area may be calculated in advance by the first device from the positions of the two touch points and then sent to the second device, where the control reference point may be set as the middle point between the two touch points. Therefore, the method may further include the step of:
based on the positions of the two touch points, a position of an intermediate point located in the middle of the two touch points is determined.
Further in this embodiment, the packaged first gesture operation parameters include: the position of the intermediate point and the scaling factor.
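The middle-point calculation above is a one-line computation; a minimal sketch (the function name is assumed):

```python
def midpoint(p1, p2):
    """Middle point between two touch points, usable as the control
    reference point for the zoom operation."""
    return ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
```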
In an embodiment of the present disclosure, the application is an application supporting a map navigation function, and the gesture operation is a map moving gesture, such as a finger move gesture, that is, one finger moves a certain distance in the gesture touch area. In this case, the captured first gesture operation parameters, which are also the encapsulated first gesture operation parameters, include the position of one touch point and the moving distance of that touch point.
In an embodiment of the present disclosure, the application is an application supporting a map navigation function, and the gesture operation is a map zoom-in gesture, such as a single-finger double-click gesture, that is, one finger taps the gesture touch area twice in quick succession. In this case, the captured first gesture operation parameters, which are also the encapsulated first gesture operation parameters, include the touch positions of the touch point, of which there are at least two and always an even number: two positions if the user performs one single-finger double click, and four positions if the user performs two consecutive single-finger double clicks. The positions belonging to one double click are very close to each other, for example closer than a preset distance threshold.
In an embodiment of the present disclosure, the application is an application supporting a map navigation function, and the gesture operation is a map zoom-out gesture, such as a two-finger single-click gesture, that is, two fingers each touch the gesture touch area once at the same time. In this case, the captured first gesture operation parameters, which are also the encapsulated first gesture operation parameters, include the positions of the two touch points.
In an embodiment of the present disclosure, the application is an application supporting a map navigation function, and the gesture operation is a map pitch angle control gesture, such as a two-finger slide gesture, that is, two fingers slide in the gesture touch area at the same time, in the same direction, and over almost the same distance. In this case, the captured first gesture operation parameters, which are also the encapsulated first gesture operation parameters, include the positions of the two touch points and the moving distance corresponding to each touch point, where the moving distances comprise a first distance for one touch point and a second distance for the other touch point, the two distances being close to each other, for example with a difference smaller than a preset distance threshold.
As mentioned above, the first gesture operation parameters may further include the touch point moving direction. That is, after the first device captures the positions of a touch point, it can determine the moving direction of each touch point from the change of those positions, package the moving direction together with the other first gesture operation parameters, and send them to the second device.
In this embodiment:
when the gesture operation is a map moving gesture, such as a finger move gesture, that is, one finger moves a certain distance in the gesture touch area, there is one touch point, and the touch point moving direction is the moving direction of that touch point.
When the gesture operation is a map pitch angle control gesture, such as a two-finger slide gesture, that is, two fingers slide in the gesture touch area at the same time, in the same direction, and over almost the same distance, there are two touch points, and the touch point moving direction includes a first direction corresponding to one touch point and a second direction corresponding to the other touch point, where the first direction and the second direction are the same direction.
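Determining a per-touch-point direction from a position change can be sketched as follows; the two-value up/down classification and the function names are illustrative assumptions (screen pixel coordinates are taken to grow downward):

```python
def move_direction(start, end):
    """Classify one touch point's movement as 'up' or 'down' from the
    change of its vertical pixel coordinate (screen y grows downward)."""
    dy = end[1] - start[1]
    if dy < 0:
        return "up"
    return "down" if dy > 0 else "none"

def pitch_gesture_directions(starts, ends):
    """Per-touch-point directions for a two-finger slide; the first and
    second directions should agree for a pitch-control gesture."""
    d1 = move_direction(starts[0], ends[0])
    d2 = move_direction(starts[1], ends[1])
    return d1, d2, d1 == d2
```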
In an embodiment of the present disclosure, the method may further include the steps of:
a connection is established with the second device.
In this embodiment, a connection between the first device and the second device needs to be established first, and data can be transmitted between the first device and the second device. The connection between the first device and the second device may be a wired connection or a wireless connection.
In an embodiment of the present disclosure, the method may further include the steps of:
in response to a touch operation on a virtual control, sending a function execution command corresponding to the virtual control, encapsulated according to the preset protocol, to the second device, or triggering the first device to execute the function execution command corresponding to the virtual control, where the virtual control is arranged in a virtual control area of the graphical user interface of the application, and the virtual control area is located above, below, or beside the gesture touch area.
In this embodiment, to reduce the redundancy of the information displayed by the first device, the current display area of the first device does not display the specific navigation route and map information, or displays them only on a background display interface; the area outside the gesture touch area of the first device displays only virtual controls corresponding to preset navigation functions, so that when a virtual control is touched, the second device or the first device can be controlled to start the corresponding navigation function. The virtual controls may, for example, correspond to navigation functions such as "avoid the road ahead", "full route overview", "route refresh", "voice broadcast", "search along the route", "more", "information and voice interaction", or "tap to invoke Xiaode voice", as shown in fig. 3. If a touch operation on a virtual control is detected, a function execution command corresponding to the virtual control may be sent to the second device so that the second device executes the touched function, where the command sent to the second device may also be encapsulated according to the preset protocol; alternatively, the first device may be triggered to execute the function execution command corresponding to the virtual control. In this way, the problems of redundant information display and increased performance consumption on the first device can be effectively avoided.
In an embodiment of the present disclosure, the gesture touch area is further provided with a slider control, and in this embodiment, the method may further include:
capturing a movement distance of the slider in response to an operation of the slider control;
and packaging the moving distance of the slider and the size parameter of the slider, and sending the packaged parameters to the second device, so that the second device controls the information displayed in its corresponding display area based on the received packaged parameters.
Since the display area of a device is usually not large enough to display all information, and in order to facilitate control over information that is not currently displayed, in this embodiment the gesture touch area is further provided with a slider control, so that the display of information in the corresponding display area of the second device can be controlled according to operations on the slider control.
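One natural interpretation of a slider operation is to map the slider's travel, relative to its own length, onto the full scrollable extent of the content on the second device. This proportional mapping is an assumption for illustration, not stated by the disclosure:

```python
def slider_scroll(slider_move, slider_length, content_length):
    """Map the slider's movement, taken relative to its own length, onto
    the full scrollable extent of the content on the second device."""
    return content_length * (slider_move / slider_length)
```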
Fig. 4 shows a flowchart of a cross-device control method according to another embodiment of the present disclosure, which includes the following steps S401-S403, as shown in fig. 4:
in step S401, receiving a package parameter sent by a first device, and decapsulating it according to a preset protocol to obtain a first gesture operation parameter and a size parameter of a gesture touch area of the first device;
in step S402, converting the first gesture operation parameter into a second gesture operation parameter suitable for operation on the second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device;
in step S403, controlling information displayed in a corresponding display area of the second device according to the second gesture operation parameter.
As mentioned above, the inventors found that, because automobiles are delivered at different times, the screens of some vehicle-mounted devices are not touch screens, or their touch screens have poor sensitivity, so that users cannot, or can only with difficulty, perform gesture touch operations on the information displayed by the vehicle-mounted device and must instead operate the screen information through physical keys. On the one hand, this limits the available operation modes; on the other hand, since most users are accustomed to gesture touch operation, operating screen information through physical keys does not match user habits. An interaction control mode is therefore needed that can effectively respond to the user's intention, matches user habits, realizes interaction between the user and the vehicle-mounted device, and improves the user experience.
In view of the above problem, in this embodiment, a cross-device control method is provided, in which a gesture touch area is set at a mobile phone end to implement interaction with an in-vehicle device. The technical scheme can effectively respond to the user intention, accords with the habit of the user, realizes interaction between the user and the vehicle-mounted equipment, and can improve interaction efficiency and user experience.
In an embodiment of the present disclosure, the cross-device control method may be applied to a second device such as an electronic device that can perform cross-device control, for example, an in-vehicle device.
In an embodiment of the present disclosure, the second gesture operation parameter refers to a gesture operation parameter suitable for operation on the second device, obtained by converting the corresponding first gesture operation parameter based on the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device.
In the foregoing embodiment, the second device, as the device controlled by the first device, first decapsulates the received package parameter according to a preset protocol that is the same as, or corresponds to, that of the first device, obtaining the size parameter of the gesture touch area of the first device and the first gesture operation parameter. It then converts the first gesture operation parameter into a second gesture operation parameter suitable for operation on the second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device, and finally controls the information displayed in its corresponding display area according to the second gesture operation parameter, thereby implementing effective interaction between a user terminal and a vehicle-mounted device.
In an embodiment of the present disclosure, the step S402 of converting the first gesture operation parameter into a second gesture operation parameter suitable for being operated on the second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device may include the following steps:
acquiring a size parameter of a display area of second equipment;
calculating to obtain a size ratio between the first device and the second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device;
converting the first gesture operation parameter to a second gesture operation parameter suitable for operation on a second device according to the size ratio.
Wherein the size parameter of the display area of the second device refers to a height and a width of the display area of the second device.
Wherein the first gesture operation parameters include one or more of the following parameters: touch point position, zoom factor, touch point moving distance and touch point moving direction.
Because the gesture touch area of the first device and the controlled display area of the second device usually differ in size, in this embodiment, after the size parameter of the gesture touch area of the first device and the first gesture operation parameter are received, the size ratio between the first device and the second device is further calculated from the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device, and the first gesture operation parameter is then converted, according to this size ratio, into a second gesture operation parameter suitable for operation on the second device.
More specifically, when converting the first gesture operation parameter into the second gesture operation parameter according to the size ratio, the touch point position may be converted into a touch point position in the second device, the touch point moving distance into a touch point moving distance in the second device, and/or the touch point moving direction into a touch point moving direction in the second device. If the first gesture operation parameter includes a touch point moving direction that needs direction conversion, or does not include one and a moving direction needs to be generated in the second device, the start position and the end position of the touch point movement in the first device may first be converted, according to the size ratio, into a movement start position and a movement end position in the second device, and the touch point moving direction in the second device may then be determined from those converted start and end positions.
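The per-axis conversion can be sketched as follows; the function names are assumptions, and the width and height ratios are applied independently per coordinate:

```python
def convert_point(point, src_size, dst_size):
    """Map a touch point from the first device's gesture touch area
    (src_size = (width, height)) onto the second device's display area
    (dst_size) using the per-axis size ratios."""
    return (point[0] * dst_size[0] / src_size[0],
            point[1] * dst_size[1] / src_size[1])

def convert_move_vector(start, end, src_size, dst_size):
    """Convert the movement's start and end positions first, then derive
    the movement vector (and hence the direction) in the second device."""
    sx, sy = convert_point(start, src_size, dst_size)
    ex, ey = convert_point(end, src_size, dst_size)
    return (ex - sx, ey - sy)
```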
In an embodiment of the present disclosure, the step S403 of controlling information displayed in the corresponding display area of the second device according to the second gesture operation parameter may include the following steps:
calculating to obtain a gesture control parameter according to the second gesture operation parameter;
and controlling the information displayed in the corresponding display area of the second equipment according to the gesture control parameter.
Because the gesture operation parameters are merely operation parameters captured in the gesture touch area of the first device, even after conversion into parameters suitable for the display area of the second device they are often insufficient by themselves to control the information displayed there. In this embodiment, therefore, gesture control parameters suitable for controlling the information displayed in the display area of the second device are further calculated from the second gesture operation parameters, and the information displayed in the corresponding display area of the second device is then controlled according to those gesture control parameters.
In an embodiment of the present disclosure, assuming that the display interface of the second device is a map navigation interface, the step of calculating a gesture control parameter according to the second gesture operation parameter may include the following steps:
when the second gesture operation parameters comprise the positions of the two touch points and a zoom factor, determining that the gesture operation is a map zoom gesture, calculating the position of the middle point between the two touch points according to the positions of the two touch points, and calculating the map zoom scale according to the zoom factor;
when the second gesture operation parameters comprise the position of the middle point and the zoom factor, determining that the gesture operation is a map zoom gesture, and calculating the map zoom scale according to the zoom factor;
when the second gesture operation parameter comprises the position of a touch point and the moving distance of the touch point, determining that the gesture operation is a map moving gesture, calculating the moving direction of the display information of the display area of the second equipment according to the position change of the touch point, and calculating the moving distance of the display information of the display area of the second equipment according to the moving distance of the touch point;
when the second gesture operation parameter comprises the position of a touch point, determining that the gesture operation is a map magnifying gesture, and calculating according to the number of the touch point positions to obtain a magnifying scale;
when the second gesture operation parameters comprise the positions of two touch points, determining that the gesture operation is a map zooming-out gesture, and calculating according to the number of the positions of each touch point to obtain a zooming-out scale;
when the second gesture operation parameters comprise the positions of two touch points and the moving distance of each touch point, determining that the gesture operation is a map pitch angle control gesture, determining the moving direction of the corresponding touch point according to the positions of the touch points and the moving distance of the corresponding touch point, determining a pitch angle control direction according to the moving direction of the touch point, and calculating a pitch angle control angle according to the moving distance of the touch point.
In the embodiment, different gesture control parameter calculation modes are determined according to different obtained second gesture operation parameters. Specifically, the method comprises the following steps:
When the second gesture operation parameters include the positions of the two touch points and the zoom factor, it may be determined that the gesture operation is a map zoom gesture, such as a pinch gesture. In this case the touch point positions do not need further calculation, but the middle point position between the two touch points is calculated from the two touch point positions, and the map zoom scale is calculated from the zoom factor, for example according to a preset correspondence between zoom factors and map zoom scales.
When the second gesture operation parameters include the position of the middle point and the zoom factor, it may be determined that the gesture operation is also a map zoom gesture. In this case the middle point position does not need further calculation, and the map zoom scale is calculated from the zoom factor.
When the second gesture operation parameter includes the position of a touch point and the moving distance of the touch point, the gesture operation can be determined to be a map moving gesture, at this time, the moving direction of the display information of the display area of the second device can be calculated according to the position change of the touch point, and the moving distance of the display information of the display area of the second device can be calculated according to the moving distance of the touch point.
When the second gesture operation parameter includes the position of one touch point, it may be determined that the gesture operation is a map magnifying gesture. In this case the touch point position does not need further calculation, but the magnifying scale is calculated from the number of touch point positions. Assuming the map magnifying gesture is a single-finger double click, the current magnifying scale may be determined from the number of double clicks according to a preset magnifying scale increment step and a magnifying scale maximum. For example, suppose the magnifying scale has 10 levels in total and the current level is 6: if the number of single-finger double clicks is 1, that is, there are 2 touch point positions, the magnifying scale may be increased to level 7; if the number of double clicks is 2, that is, there are 4 touch point positions, it may be increased to level 8; but if the number of double clicks is 4, that is, there are 8 touch point positions, the magnifying scale can only be increased to the maximum level 10, since the highest level is 10.
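The level arithmetic in the 10-level example can be captured by one clamped increment; the function name, default step, and default maximum are assumptions matching that example:

```python
def next_magnifying_level(current, double_clicks, step=1, max_level=10):
    """Raise the magnifying scale by one step per single-finger double
    click, capped at the maximum level (10 in the example above)."""
    return min(current + double_clicks * step, max_level)
```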
When the second gesture operation parameters include the positions of two touch points, it may be determined that the gesture operation is a map zoom-out gesture. In this case the touch point positions do not need further calculation, but the reduction scale is calculated from the number of touch positions of each touch point. Assuming the map zoom-out gesture is a two-finger single click, the current reduction scale may be determined from the number of two-finger clicks according to a preset reduction scale increment step and a reduction scale extreme; the calculation of the reduction scale is similar to that of the magnifying scale and is not repeated here.
When the second gesture operation parameters include the positions of two touch points and the moving distance of each touch point, it may be determined that the gesture operation is a map pitch angle control gesture. In this case the touch point positions do not need further calculation, but the moving direction of each touch point is determined from the change of its position and its moving distance, and the pitch angle control direction is then determined from the touch point moving direction: for example, if the touch points move upward or obliquely upward, the pitch angle control direction is upward pitch, and if they move downward or obliquely downward, the pitch angle control direction is downward pitch. The pitch angle control angle is calculated from the touch point moving distance, and may be determined according to the ratio between the touch point moving distance and the size of the second device display area, or according to a preset correspondence between touch point moving distances and pitch angle control angles. The touch point moving distance used for this calculation may be set according to the needs of the practical application, for example as the larger of the two touch points' moving distances, the smaller of the two, or their average.
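A minimal sketch of the pitch-angle derivation, using the ratio-to-display-size option mentioned above; the averaging of the two distances, the display height, and the maximum angle are assumed example values, not specified by the disclosure:

```python
def pitch_control(starts, ends, display_height=400.0, max_angle=60.0):
    """Derive the pitch control direction and angle from a two-finger
    slide. The angle is taken proportional to the ratio between the
    average movement distance and the display height (assumed choice)."""
    dys = [e[1] - s[1] for s, e in zip(starts, ends)]
    avg_dy = sum(dys) / len(dys)  # screen y grows downward
    direction = "pitch_up" if avg_dy < 0 else "pitch_down"
    angle = max_angle * min(abs(avg_dy) / display_height, 1.0)
    return direction, angle
```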
In an embodiment of the present disclosure, the step of controlling information displayed in a corresponding display area of the second device according to the gesture control parameter may include the following steps:
when the gesture operation is determined to be a map zoom gesture, performing a zoom operation on the information displayed in the corresponding display area of the second device according to the map zoom scale, with the middle point of the two touch points as the reference position;
when the gesture operation is determined to be a map moving gesture, controlling the information displayed in the corresponding display area of the second device to move, with the touch point position as the reference position, along the calculated moving direction and by the calculated moving distance;
when the gesture operation is determined to be a map magnifying gesture, magnifying the information displayed in the display area of the second device by the magnifying scale, with the touch point position as the reference position;
when the gesture operation is determined to be a map zoom-out gesture, reducing the information displayed in the display area of the second device by the reduction scale, with the touch point position as the reference position;
and when the gesture operation is determined to be a map pitch angle control gesture, controlling the information displayed in the display area of the second device to change by the pitch angle control angle along the pitch angle control direction, with the touch point position as the reference position.
After the gesture control parameters are calculated, the information displayed in the corresponding display area of the second device is controlled according to the gesture control parameters. Specifically:
When the gesture operation is determined to be a map zoom gesture, a zoom operation is performed on the information displayed in the corresponding display area of the second device according to the determined map zoom scale, with the middle point of the two touch points as the control reference position.
When the gesture operation is determined to be a map moving gesture, the information displayed in the corresponding display area of the second device is controlled to move, with the touch point position as the control reference position, along the converted moving direction and by the converted moving distance.
When the gesture operation is determined to be a map magnifying gesture, the information displayed in the display area of the second device is magnified, with the touch point position as the control reference position.
When the gesture operation is determined to be a map zoom-out gesture, the information displayed in the display area of the second device is reduced, with the touch point position as the control reference position.
When the gesture operation is determined to be a map pitch angle control gesture, the information displayed in the display area of the second device is controlled to pitch up or pitch down along the determined pitch angle control direction, with the touch point position as the control reference position.
In an embodiment of the present disclosure, the method may further include the steps of:
a connection is established with the first device.
In this embodiment, the connection between the second device and the first device needs to be established first, so that the second device can receive the data sent by the first device. The connection between the second device and the first device may be a wired connection or a wireless connection.
In an embodiment of the present disclosure, the method may further include the steps of:
and responding to the received function execution command sent by the first equipment to execute the corresponding function.
As mentioned above, the first device may control the display information of the second device display area by means of touch operations on virtual controls; therefore, in this embodiment, if a function execution command sent by the first device is received, the corresponding function is executed according to the function execution command.
In an embodiment of the present disclosure, the method may further include the steps of:
and responding to the received packaging parameters which are sent by the first equipment and packaged with the sliding strip moving distance and the sliding strip size parameters, and controlling the information displayed in the corresponding display area of the second equipment according to the sliding strip moving distance and the sliding strip size parameters.
As mentioned above, the first device may control, by means of touch control on the slider control, information that is not currently displayed in the display area of the second device, and therefore, in this embodiment, if the package parameter in which the slider movement distance and the slider size parameter are packaged is received, the display of the information in the display area of the second device may be controlled according to the slider movement distance and the slider size parameter.
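The proportional mapping described above might be sketched as follows; the function name and the assumption that the scroll offset scales linearly with the slider's relative movement are illustrative, not taken from the disclosure:

```python
def scroll_offset_from_slider(slider_move: float, slider_size: float,
                              content_extent: float) -> float:
    """Map a slider movement on the first device to a scroll offset in the
    second device's display area, proportionally to the slider's size."""
    if slider_size <= 0:
        raise ValueError("slider size must be positive")
    # Moving the slider by half its length scrolls half of the content extent.
    return (slider_move / slider_size) * content_extent
```

Under this assumption, moving a 100-pixel slider by 50 pixels would scroll 400 pixels of an 800-pixel content extent on the second device.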
The technical terms and technical features involved in fig. 4 and the related embodiments are the same as or similar to those involved in figs. 1 to 3 and the related embodiments; for their explanation and description, reference may be made to the above description of the embodiments shown in figs. 1 to 3, which is not repeated here.
Fig. 5 shows a flowchart of a cross-device control method according to still another embodiment of the present disclosure. As shown in fig. 5, the method includes the following steps S501-S502:
in step S501, in response to detecting a gesture operation, the first device captures a first gesture operation parameter from a gesture touch area of the application graphical user interface, packages the first gesture operation parameter and a size parameter of the gesture touch area of the first device according to a preset protocol, and sends the packaged parameter to the second device;
in step S502, the second device receives the encapsulation parameter sent by the first device, decapsulates the encapsulation parameter according to a preset protocol to obtain a first gesture operation parameter and a size parameter of a gesture touch area of the first device, converts the first gesture operation parameter into a second gesture operation parameter suitable for operating on the second device according to the size parameter of the gesture touch area of the first device and the size parameter of a display area of the second device, and controls information displayed in a corresponding display area of the second device according to the second gesture operation parameter.
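The packaging and decapsulation in steps S501-S502 might be sketched as follows, assuming a simple JSON-based message layout; the preset protocol is not specified in the disclosure, so the field names here are illustrative:

```python
import json

def package_parameters(gesture_params, touch_area_size):
    """First device: package the gesture parameters together with the size
    of its gesture touch area into one protocol message."""
    message = {"gesture": gesture_params, "touch_area": touch_area_size}
    return json.dumps(message).encode("utf-8")

def unpackage_parameters(payload):
    """Second device: decapsulate the message back into the first gesture
    operation parameters and the touch-area size parameter."""
    message = json.loads(payload.decode("utf-8"))
    return message["gesture"], message["touch_area"]
```

A round trip then recovers exactly what the first device captured, after which the second device applies the size-based conversion before controlling its display.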
In an embodiment of the present disclosure, the first gesture operation parameter comprises one or more of the following parameters: touch point position, zoom factor, touch point movement distance.
In an embodiment of the present disclosure, the application is an application supporting a map navigation function, the gesture operation is a map zooming gesture, and the captured first gesture operation parameter includes: the positions and scaling factors of the two touch points;
the packaged first gesture operational parameters include: the position of the two touch points and the zoom factor.
In an embodiment of the present disclosure, the application is an application supporting a map navigation function, the gesture operation is a map zooming gesture, and the captured first gesture operation parameter includes: the position and scaling factor of two touch points, the method further comprising:
determining a position of an intermediate point located in the middle of the two touch points based on the positions of the two touch points;
the packaged first gesture operational parameters include: the position of the intermediate point and the scaling factor.
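Determining the intermediate point is a simple average of the two touch points; a minimal sketch, with (x, y) coordinate pairs assumed:

```python
def midpoint(p1, p2):
    """Intermediate point between two touch points (x, y), used as the
    zoom reference position instead of sending both points."""
    return ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
```

Packaging the midpoint and the zoom factor rather than both touch points halves the positional payload in this embodiment.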
In an embodiment of the present disclosure, the application program is an application program supporting a map navigation function, the gesture operation is a map moving gesture, and the captured first gesture operation parameters include: a position of a touch point and a moving distance of the touch point;
the gesture operation is a map zooming gesture, and the captured first gesture operation parameters comprise: the location of a touch point;
a map zoom-out gesture, the first gesture operation parameters comprising: the positions of the two touch points;
a map pitch angle control gesture, the first gesture operational parameters comprising: the positions of the two touch points and the moving distance of each touch point.
In an embodiment of the present disclosure, the method further includes:
and responding to a virtual control touch operation, sending a function execution command corresponding to the virtual control packaged according to a preset protocol to the second device or triggering the first device to execute the function execution command corresponding to the virtual control, wherein the virtual control is arranged in a virtual control area of the graphical user interface of the application program, and the virtual control area is positioned above or below the gesture touch area or is parallel to the gesture touch area.
In an embodiment of the present disclosure, the gesture touch area is provided with a slider control, and the method further includes:
capturing a movement distance of the slider in response to an operation of the slider control;
and packaging the movement distance of the slider and the size parameter of the slider and sending the packaged parameters to the second device, so that the second device controls information displayed in its corresponding display area based on the received package parameters.
In an embodiment of the present disclosure, the converting the first gesture operation parameter into a second gesture operation parameter suitable for operating on a second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device includes:
acquiring a size parameter of a display area of second equipment;
calculating to obtain a size ratio between the first device and the second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device;
converting the first gesture operation parameter to a second gesture operation parameter suitable for operation on a second device according to the size ratio.
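The three steps above can be sketched as follows, assuming sizes are given as (width, height) pairs and that the ratio is computed per axis (the disclosure does not specify whether a single ratio or per-axis ratios are used):

```python
def size_ratio(touch_area, display_area):
    """Per-axis ratio between the second device's display area and the
    first device's gesture touch area, both given as (width, height)."""
    return (display_area[0] / touch_area[0], display_area[1] / touch_area[1])

def convert_point(point, ratio):
    """Convert a touch-point position (or movement-distance vector) on the
    first device into the corresponding value on the second device."""
    return (point[0] * ratio[0], point[1] * ratio[1])
```

For example, with a 360x640 gesture touch area and a 1080x1920 display area, the ratio is (3, 3), so a touch at (120, 200) on the first device maps to (360, 600) on the second device.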
In an embodiment of the present disclosure, the first gesture operation parameter comprises one or more of the following parameters: touch point position, zoom factor, touch point moving distance and touch point moving direction;
the converting the first gesture operation parameter into a second gesture operation parameter suitable for operation on a second device according to the size ratio comprises:
and converting the position of the touch point into the position of the touch point in the second equipment, converting the movement distance of the touch point into the movement distance of the touch point in the second equipment and/or converting the movement direction of the touch point into the movement direction of the touch point in the second equipment according to the size proportion.
In an embodiment of the present disclosure, the controlling information displayed in a corresponding display area of the second device according to the second gesture operation parameter includes:
calculating to obtain a gesture control parameter according to the second gesture operation parameter; and controlling the information displayed in the corresponding display area of the second equipment according to the gesture control parameter.
In an embodiment of the present disclosure, the display interface of the second device is a map navigation interface, and the calculating of the gesture control parameter according to the second gesture operation parameter includes the following steps:
when the second gesture operation parameters comprise the positions of the two touch points and a zoom factor, determining that the gesture operation is a map zoom gesture, calculating the position of the middle point between the two touch points according to the positions of the two touch points, and calculating the map zoom scale according to the zoom factor;
when the second gesture operation parameters comprise the position of the middle point and a zooming coefficient, determining that the gesture operation is a map zooming gesture, and calculating according to the zooming coefficient to obtain a map zooming scale;
when the second gesture operation parameter comprises the position of a touch point and the moving distance of the touch point, determining that the gesture operation is a map moving gesture, calculating the moving direction of the display information of the display area of the second equipment according to the position change of the touch point, and calculating the moving distance of the display information of the display area of the second equipment according to the moving distance of the touch point;
when the second gesture operation parameter comprises the position of a touch point, determining that the gesture operation is a map magnifying gesture, and calculating according to the number of the touch point positions to obtain a magnifying scale;
when the second gesture operation parameters comprise the positions of two touch points, determining that the gesture operation is a map zooming-out gesture, and calculating according to the number of the positions of each touch point to obtain a zooming-out scale;
when the second gesture operation parameters comprise the positions of two touch points and the moving distance of each touch point, determining that the gesture operation is a map pitch angle control gesture, determining the moving direction of the corresponding touch point according to the positions of the touch points and the moving distance of the corresponding touch point, determining a pitch angle control direction according to the moving direction of the touch point, and calculating a pitch angle control angle according to the moving distance of the touch point.
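The branching above amounts to classifying the gesture by which second gesture operation parameters are present; a sketch under assumed key names (not taken from the disclosure):

```python
def classify_gesture(params):
    """Infer the gesture type from which second-gesture-operation
    parameters are present, following the branches described above."""
    points = params.get("points", [])
    if "zoom_factor" in params:
        return "map_zoom"       # two points (or their midpoint) plus a zoom factor
    if len(points) == 1 and "move_distance" in params:
        return "map_move"
    if len(points) == 1:
        return "map_zoom_in"    # single-finger double-click gesture
    if len(points) == 2 and "move_distances" in params:
        return "map_pitch"
    if len(points) == 2:
        return "map_zoom_out"   # double-finger single-click gesture
    raise ValueError("unrecognized parameter combination")
```

Each returned type then selects which control parameters (zoom scale, movement vector, or pitch angle) are computed next.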
In an embodiment of the present disclosure, the controlling information displayed in a corresponding display area of the second device according to the gesture control parameter includes:
when the gesture operation is determined to be a map zooming gesture, zooming operation is carried out on information displayed in a corresponding display area of the second equipment according to the map zooming scale by taking the middle point of the two touch points as a reference position;
when the gesture operation is determined to be a map moving gesture, the position of the touch point is taken as a reference position, and the information displayed in the corresponding display area of the second equipment is controlled to move along the moving direction of the information displayed in the display area of the second equipment by the moving distance of the information displayed in the display area of the second equipment;
when the gesture operation is determined to be a map magnifying gesture, magnifying the information displayed in the display area of the second device by taking the position of the touch point as a reference position;
when the gesture operation is determined to be a map zooming-out gesture, taking the position of the touch point as a reference position, and zooming out the information displayed in the display area of the second equipment by the zooming-out scale;
and when the gesture operation is determined to be a map pitch angle control gesture, controlling information displayed in the display area of the second equipment to change the pitch angle control angle along the pitch angle control direction by taking the position of the touch point as a reference position.
In an embodiment of the present disclosure, the method may further include the steps of:
a connection between the first device and the second device is established.
Based on the above technical solution, control of the information displayed in the second device's display area can be realized through gesture operations in the gesture touch area of the first device's GUI. For example, zoom control of the information displayed in the second device's display area can be realized based on a pinch gesture in the gesture touch area, as shown in fig. 6A; movement control can be realized based on a movement gesture, as shown in fig. 6B; magnification control can be realized based on a single-finger double-click gesture, as shown in fig. 6C; zoom-out control can be realized based on a double-finger single-click gesture, as shown in fig. 6D; and pitch control can be realized based on a double-finger swipe gesture, as shown in fig. 6E.
The technical terms and technical features involved in fig. 5 and the related embodiments are the same as or similar to those involved in figs. 1 to 4 and the related embodiments; for their explanation and description, reference may be made to the above description of the embodiments shown in figs. 1 to 4, which is not repeated here.
The following are embodiments of the disclosed apparatus or system that may be used to perform embodiments of the disclosed methods.
Fig. 7 shows a block diagram of a cross-device control apparatus according to an embodiment of the present disclosure, which may be implemented as part or all of an electronic device by software, hardware, or a combination of both. As shown in fig. 7, the cross-device control apparatus includes:
the capturing module 701 is configured to capture a first gesture operation parameter from a gesture touch area of an application graphical user interface in response to detecting a gesture operation;
the sending module 702 is configured to package the first gesture operation parameter and the size parameter of the gesture touch area of the first device according to a preset protocol, and send the packaged parameter to the second device, so that the second device controls information displayed in the corresponding display area of the second device based on the received packaged parameter.
Fig. 8 shows a block diagram of a cross-device control apparatus according to another embodiment of the present disclosure, which may be implemented as part or all of an electronic device by software, hardware, or a combination of both. As shown in fig. 8, the cross-device control apparatus includes:
the receiving module 801 is configured to receive the encapsulation parameter sent by the first device, and decapsulate the encapsulation parameter according to a preset protocol to obtain a first gesture operation parameter and a size parameter of a gesture touch area of the first device;
a conversion module 802, configured to convert the first gesture operation parameter into a second gesture operation parameter suitable for operating on a second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device;
and the control module 803 is configured to control information displayed in the corresponding display area of the second device according to the second gesture operation parameter.
Fig. 9 shows a block diagram of a cross-device control system according to an embodiment of the present disclosure, which may be implemented as part or all of an electronic device by software, hardware, or a combination of both. As shown in fig. 9, the cross-device control system includes:
the first device 901 is configured to, in response to detecting a gesture operation, capture a first gesture operation parameter from a gesture touch area of an application graphical user interface, package the first gesture operation parameter and a size parameter of the gesture touch area of the first device according to a preset protocol, and send the packaged parameter to a second device, so that the second device controls information displayed in a corresponding display area of the second device based on the received packaged parameter;
the second device 902 is configured to receive the encapsulation parameter sent by the first device, decapsulate the encapsulation parameter according to a preset protocol to obtain a first gesture operation parameter and a size parameter of a gesture touch area of the first device, convert the first gesture operation parameter into a second gesture operation parameter suitable for operating on the second device according to the size parameter of the gesture touch area of the first device and the size parameter of a display area of the second device, and control information displayed in a corresponding display area of the second device according to the second gesture operation parameter.
The technical terms and technical features involved in the embodiments shown in figs. 7 to 9 and the related embodiments are the same as or similar to those involved in the embodiments shown in figs. 1 to 6 and the related embodiments; for their explanation and description, reference may be made to the above description of the embodiments shown in figs. 1 to 6, which is not repeated here.
The present disclosure also discloses an electronic device, fig. 10 shows a block diagram of the electronic device according to an embodiment of the present disclosure, and as shown in fig. 10, the electronic device 1000 includes a memory 1001 and a processor 1002; the memory 1001 is used for storing one or more computer instructions, which are executed by the processor 1002 to implement the above-mentioned method steps.
FIG. 11 is a schematic block diagram of a computer system suitable for use in implementing a cross-device control method according to an embodiment of the present disclosure.
As shown in fig. 11, the computer system 1100 includes a processing unit 1101, which can execute various processes in the above-described embodiments according to a program stored in a Read Only Memory (ROM) 1102 or a program loaded from a storage section 1108 into a Random Access Memory (RAM) 1103. The RAM 1103 also stores various programs and data necessary for the operation of the system 1100. The processing unit 1101, the ROM 1102, and the RAM 1103 are connected to each other by a bus 1104. An input/output (I/O) interface 1105 is also connected to the bus 1104.
The following components are connected to the I/O interface 1105: an input portion 1106 including a keyboard, mouse, and the like; an output portion 1107 including a signal output unit such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 1108 including a hard disk and the like; and a communication section 1109 including a network interface card such as a LAN card, a modem, or the like. The communication section 1109 performs communication processing via a network such as the internet. A driver 1110 is also connected to the I/O interface 1105 as necessary. A removable medium 1111 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1110 as necessary, so that a computer program read out therefrom is mounted into the storage section 1108 as necessary. The processing unit 1101 may be implemented as a CPU, a GPU, a TPU, an FPGA, an NPU, or other processing units.
In particular, according to embodiments of the present disclosure, the methods described above may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a computer-readable medium, the computer program comprising program code for performing the cross-device control method. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 1109 and/or installed from the removable medium 1111.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules described in the embodiments of the present disclosure may be implemented by software or hardware. The units or modules described may also be provided in a processor, and the names of the units or modules do not in some cases constitute a limitation of the units or modules themselves.
As another aspect, the disclosed embodiment also provides a computer-readable storage medium, which may be the computer-readable storage medium included in the apparatus in the foregoing embodiment; or it may be a separate computer readable storage medium not incorporated into the device. The computer readable storage medium stores one or more programs for use by one or more processors in performing the methods described in the embodiments of the present disclosure.
The foregoing description covers only the preferred embodiments of the present disclosure and illustrates the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above features, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, technical solutions formed by mutually replacing the above features with (but not limited to) features having similar functions disclosed in the embodiments of the present disclosure.

Claims (15)

1. A cross-device control method, wherein the method is executed by a first device, the method comprising:
in response to the detection of the gesture operation, capturing a first gesture operation parameter from a gesture touch area of the graphical user interface of the application program;
and packaging the first gesture operation parameter and the size parameter of the gesture touch area of the first equipment according to a preset protocol, and sending the packaging parameter to second equipment so that the second equipment controls information displayed in a corresponding display area of the second equipment based on the received packaging parameter.
2. The method of claim 1, wherein the first gesture operation parameter comprises one or more of: touch point position, zoom factor, touch point movement distance.
3. The method of claim 2, wherein the application is a map navigation function enabled application, the gesture operation is a map zoom gesture, and the capturing the derived first gesture operation parameters comprises: the positions and scaling factors of the two touch points;
the packaged first gesture operational parameters include: the position of the two touch points and the zoom factor.
4. The method of claim 2, wherein the application is a map navigation function enabled application, the gesture operation is a map zoom gesture, and the capturing the derived first gesture operation parameters comprises: the position and scaling factor of two touch points, the method further comprising:
determining a position of an intermediate point located in the middle of the two touch points based on the positions of the two touch points;
the packaged first gesture operational parameters include: the position of the intermediate point and the scaling factor.
5. The method of claim 2, wherein the application is a map navigation function enabled application, the gesture operation is a map movement gesture, and the capturing the obtained first gesture operation parameters comprises: a position of a touch point and a moving distance of the touch point;
the gesture operation is a map zooming gesture, and the captured first gesture operation parameters comprise: the location of a touch point;
a map zoom-out gesture, the first gesture operation parameters comprising: the positions of the two touch points;
a map pitch angle control gesture, the first gesture operational parameters comprising: the positions of the two touch points and the moving distance of each touch point.
6. The method according to any one of claims 1-5, further comprising:
and responding to a virtual control touch operation, sending a function execution command corresponding to the virtual control packaged according to a preset protocol to the second device or triggering the first device to execute the function execution command corresponding to the virtual control, wherein the virtual control is arranged in a virtual control area of the graphical user interface of the application program, and the virtual control area is positioned above or below the gesture touch area or is parallel to the gesture touch area.
7. The method of any of claims 1-6, wherein the gesture touch area is provided with a slider control, the method further comprising:
capturing a movement distance of the slider in response to an operation of the slider control;
and packaging the movement distance of the slider and the size parameter of the slider and sending the packaged parameters to the second device, so that the second device controls information displayed in its corresponding display area based on the received package parameters.
8. A cross-device control method, wherein the method is executed by a second device, the method comprising:
receiving a packaging parameter sent by first equipment, and de-packaging according to a preset protocol to obtain a first gesture operation parameter and a size parameter of a gesture touch area of the first equipment;
converting the first gesture operation parameter into a second gesture operation parameter suitable for operating on second equipment according to the size parameter of the gesture touch area of the first equipment and the size parameter of the display area of the second equipment;
and controlling the information displayed in the corresponding display area of the second equipment according to the second gesture operation parameter.
9. The method of claim 8, wherein the converting the first gesture operation parameter into a second gesture operation parameter suitable for operating on a second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device comprises:
acquiring a size parameter of a display area of second equipment;
calculating to obtain a size ratio between the first device and the second device according to the size parameter of the gesture touch area of the first device and the size parameter of the display area of the second device;
converting the first gesture operation parameter to a second gesture operation parameter suitable for operation on a second device according to the size ratio.
10. The method of claim 9, wherein the first gesture operation parameter comprises one or more of: touch point position, zoom factor, touch point moving distance and touch point moving direction;
the converting the first gesture operation parameter into a second gesture operation parameter suitable for operation on a second device according to the size ratio comprises:
and converting the position of the touch point into the position of the touch point in the second equipment, converting the movement distance of the touch point into the movement distance of the touch point in the second equipment and/or converting the movement direction of the touch point into the movement direction of the touch point in the second equipment according to the size proportion.
11. The method of claim 10, wherein controlling the information displayed in the corresponding display area of the second device according to the second gesture operation parameter comprises:
calculating to obtain a gesture control parameter according to the second gesture operation parameter;
and controlling the information displayed in the corresponding display area of the second equipment according to the gesture control parameter.
12. The method of claim 11, wherein the display interface of the second device is a map navigation interface, and the calculating the gesture control parameter according to the second gesture operation parameter comprises:
when the second gesture operation parameters comprise the positions of the two touch points and a zoom factor, determining that the gesture operation is a map zoom gesture, calculating the position of the middle point between the two touch points according to the positions of the two touch points, and calculating the map zoom scale according to the zoom factor;
and when the second gesture operation parameters comprise the position of the middle point and the zoom coefficient, determining that the gesture operation is a map zooming gesture, and calculating to obtain a map zooming scale according to the zoom coefficient.
13. The method of claim 11, wherein,
when the second gesture operation parameter comprises the position of a touch point and the moving distance of the touch point, determining that the gesture operation is a map moving gesture, calculating the moving direction of the display information of the display area of the second equipment according to the position change of the touch point, and calculating the moving distance of the display information of the display area of the second equipment according to the moving distance of the touch point;
when the second gesture operation parameter comprises the position of a touch point, determining that the gesture operation is a map magnifying gesture, and calculating according to the number of the touch point positions to obtain a magnifying scale;
when the second gesture operation parameters comprise the positions of two touch points, determining that the gesture operation is a map zooming-out gesture, and calculating according to the number of the positions of each touch point to obtain a zooming-out scale;
when the second gesture operation parameters comprise the positions of two touch points and the moving distance of each touch point, determining that the gesture operation is a map pitch angle control gesture, determining the moving direction of the corresponding touch point according to the positions of the touch points and the moving distance of the corresponding touch point, determining a pitch angle control direction according to the moving direction of the touch point, and calculating a pitch angle control angle according to the moving distance of the touch point.
14. A cross-device control method, comprising:
in response to detecting a gesture operation, capturing, by a first device, first gesture operation parameters from a gesture touch area of an application graphical user interface, encapsulating the first gesture operation parameters and size parameters of the gesture touch area of the first device according to a preset protocol, and sending the encapsulated parameters to a second device;
receiving, by the second device, the encapsulated parameters sent by the first device, decapsulating the encapsulated parameters according to the preset protocol to obtain the first gesture operation parameters and the size parameters of the gesture touch area of the first device, converting the first gesture operation parameters into second gesture operation parameters suitable for operation on the second device according to the size parameters of the gesture touch area of the first device and size parameters of a display area of the second device, and controlling information displayed in the corresponding display area of the second device according to the second gesture operation parameters.
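The encapsulate/decapsulate-and-convert flow of claim 14 can be sketched as below. The patent does not specify the "preset protocol", so JSON stands in for it here, and the coordinate conversion is shown as a simple linear scaling from the first device's touch area to the second device's display area; all names are assumptions:

```python
import json

def encapsulate(gesture_params, touch_area_size):
    """First device: package gesture parameters and the touch-area size.
    JSON is a stand-in for the unspecified preset protocol."""
    return json.dumps({"gesture": gesture_params,
                       "touch_area": touch_area_size}).encode()

def decapsulate_and_convert(packet, display_size):
    """Second device: unpack the message, then scale touch-point coordinates
    from the first device's touch area into the second device's display area."""
    msg = json.loads(packet.decode())
    touch_w, touch_h = msg["touch_area"]
    disp_w, disp_h = display_size
    sx, sy = disp_w / touch_w, disp_h / touch_h
    return [(x * sx, y * sy) for (x, y) in msg["gesture"]["touch_points"]]

# A touch at (100, 50) in a 400x200 touch area maps to (200, 150) on an 800x600 display.
pkt = encapsulate({"touch_points": [(100, 50)]}, (400, 200))
print(decapsulate_and_convert(pkt, (800, 600)))  # [(200.0, 150.0)]
```

In a real deployment the wire format would likely be a compact binary layout rather than JSON, but the conversion step (touch-area size on one side, display-area size on the other) is the substance of the claim.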
15. A computer program product comprising computer programs/instructions, wherein the computer programs/instructions, when executed by a processor, implement the method steps of any of claims 1-14.
CN202110898684.1A 2021-08-05 2021-08-05 Cross-device control method and computer program product Pending CN113778310A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110898684.1A CN113778310A (en) 2021-08-05 2021-08-05 Cross-device control method and computer program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110898684.1A CN113778310A (en) 2021-08-05 2021-08-05 Cross-device control method and computer program product

Publications (1)

Publication Number Publication Date
CN113778310A true CN113778310A (en) 2021-12-10

Family

ID=78836786

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110898684.1A Pending CN113778310A (en) 2021-08-05 2021-08-05 Cross-device control method and computer program product

Country Status (1)

Country Link
CN (1) CN113778310A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114924677A (en) * 2022-02-09 2022-08-19 广州展讯信息科技有限公司 Map display method and system of resistance screen, storage medium and electronic equipment
WO2023174214A1 (en) * 2022-03-16 2023-09-21 华为技术有限公司 Universal device control method based on camera assembly, and device and system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104736969A (en) * 2012-10-16 2015-06-24 三菱电机株式会社 Information display device and display information operation method
CN104915173A (en) * 2015-06-29 2015-09-16 惠州华阳通用电子有限公司 Method for controlling interaction of double screens
CN106681552A (en) * 2016-11-18 2017-05-17 乐视控股(北京)有限公司 Method and device for controlling vehicle-mounted system, vehicle-mounted system and control method and device of vehicle-mounted system
CN107273083A (en) * 2017-06-30 2017-10-20 百度在线网络技术(北京)有限公司 Exchange method, device, equipment and storage medium between a kind of terminal device
CN108170358A (en) * 2017-12-27 2018-06-15 成都配天智能技术有限公司 Mobile phone and head-up display exchange method
CN109918012A (en) * 2019-03-11 2019-06-21 百度在线网络技术(北京)有限公司 A kind of control method of mobile terminal, device, equipment and storage medium
CN110543278A (en) * 2019-08-19 2019-12-06 广州点云科技有限公司 Cross-device screen coordinate adaptation method and device and storage medium
CN110865765A (en) * 2019-11-06 2020-03-06 青岛海信移动通信技术股份有限公司 Terminal and map control method
CN111078012A (en) * 2019-12-13 2020-04-28 钟林 Method and device for operating zooming function of intelligent terminal by using sliding-pressing gesture
CN111433832A (en) * 2017-12-29 2020-07-17 深圳市柔宇科技有限公司 Entity globe with touch function, display terminal and map display method
CN111497612A (en) * 2020-04-03 2020-08-07 广州小鹏汽车科技有限公司 Vehicle interaction method and device



Similar Documents

Publication Publication Date Title
JP5472256B2 (en) Vehicle display device and information display system
CN104335148B (en) Display device
US20140129941A1 (en) Information display processing device
US8493341B2 (en) Optical touch display device and method thereof
EP3046094A1 (en) Map information display device, map information display method, and map information display program
TWI450128B (en) Gesture detecting method, gesture detecting system and computer readable storage medium
US9684947B2 (en) Indicating availability of indoor content on a digital map
CN113778310A (en) Cross-device control method and computer program product
EP1887776A1 (en) Portable terminal and user interface control method thereof based on pattern recognition and analysis of image captured by camera
US20200272308A1 (en) Shake Event Detection System
CN110647286A (en) Screen element control method, device, equipment and storage medium
US11184302B2 (en) Method for transmitting content using message application and electronic device supporting the same
US20150186026A1 (en) Displaced double tap gesture
CN109724617A (en) A kind of method for drafting and relevant device of navigation routine
US20190277649A1 (en) Map display system and map display program
CN109857298A (en) Using starting method, apparatus, equipment and storage medium
CN112083871A (en) Method, device, terminal and storage medium for controlling electronic equipment
CN113204299B (en) Display method, display device, electronic equipment and storage medium
JP5884294B2 (en) Content display device, content display system, server, terminal, content display method, and computer program
EP2993555A1 (en) Screen operation method for electronic device based on electronic device and control action
US9753557B2 (en) Fast inking a touch display
WO2021068964A1 (en) Window arrangement method and device, terminal and storage medium
CN113253874A (en) Display device control method, device, terminal and storage medium
CN110162251B (en) Image scaling method and device, storage medium and electronic equipment
CN110618776B (en) Picture scaling method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240305

Address after: # 03-06, Lazada One Building, 51 Bras Basah Road, Singapore

Applicant after: Alibaba Innovation Co.

Country or region after: Singapore

Address before: Room 01, 45th Floor, AXA Tower, 8 Shenton Way, Singapore

Applicant before: Alibaba Singapore Holdings Ltd.

Country or region before: Singapore