CN114003154A - Control method and device for automobile display window and automobile

Control method and device for automobile display window and automobile

Info

Publication number
CN114003154A
Authority
CN
China
Prior art keywords: display, screen, touch, cross, touch operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111320405.XA
Other languages
Chinese (zh)
Inventor
周子韧
李微萌
沈芳朱
崔巍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhiji Automobile Technology Co Ltd
Original Assignee
Zhiji Automobile Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhiji Automobile Technology Co Ltd filed Critical Zhiji Automobile Technology Co Ltd
Priority to CN202111320405.XA
Publication of CN114003154A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G06F3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Abstract

The invention discloses a control method and device for an automobile display window, and an automobile. The control method comprises: acquiring a first touch operation acting on a touch screen, the first touch operation occurring at a first preset position of an application interface; presenting an application interface in a multitask state or performing cross-screen display according to the operation intention of the first touch operation; and acquiring a second touch operation acting on the touch screen and executing the operation intention corresponding to the second touch operation according to the current state of the touch screen. In the technical scheme provided by the invention, the user's intention is determined from the touch operation, human-machine interaction follows the user's actual operation requirements, and user interaction efficiency is improved.

Description

Control method and device for automobile display window and automobile
Technical Field
The invention relates to the technical field of intelligent driving, in particular to a method and a device for controlling a display window for an automobile and the automobile.
Background
With the development of intelligent automobile technology and the improvement of cabin processor performance, more and larger touch screens are appearing in the automobile, and the automobile control system can adapt to more and more application functions. At present it is common for an application to be displayed only on one particular screen or fixed area. A few vehicles are designed with a cross-screen function for application or video content, commonly implemented by clicking a single key to project the content. Such approaches are not flexible enough and lag behind the interactive gestures that are common on mobile phones. Moreover, functions such as video call and video recording often need to run simultaneously with tasks such as map navigation and music, so an interaction mode that supports multitasking and cross-screen display is needed.
The prior art is therefore still subject to further development.
Disclosure of Invention
In order to solve the technical problems, the invention provides a method and a device for controlling a display window for an automobile and the automobile.
In a first aspect of the present invention, there is provided a method for controlling a display window for an automobile, comprising:
acquiring a first touch operation acting on a touch screen, wherein the first touch operation occurs at a first preset position of an application interface;
executing a multi-task state of an application interface or cross-screen display according to the operation intention of the first touch operation;
and acquiring a second touch operation acting on the touch screen, and executing an operation intention corresponding to the second touch operation according to the current state of the touch screen.
Optionally, the acquiring a first touch operation applied to the touch screen includes:
and acquiring the operation duration and the operation position of the first touch operation, wherein the operation position is used for controlling an application interface displayed by a display window, and the operation position is positioned on the application interface.
Optionally, an operation control identifier or an operation control area is displayed on the application interface displayed by the display window.
Optionally, the executing a multitasking state of an application interface or executing a cross-screen operation according to the operation intention of the first touch operation includes:
controlling an application interface to execute a multi-task state if the first touch operation meets a preset condition;
controlling an application interface to perform cross-screen display in another display area according to the sliding state of the first touch operation or the second touch operation; the cross-screen display comprises multi-screen task interaction, cockpit cross-screen display, cross-device display and cross-region display within a screen.
Optionally, the controlling an application interface to execute a multitasking state according to whether the first touch operation meets a preset condition includes:
if the first touch operation is a pressing operation and the pressing duration meets a first time threshold, the application program running on the current touch screen is displayed as a preview interface so as to present the multitask state;
or if the first touch operation is a slide-down operation, the application program running on the current touch screen is displayed as a preview interface so as to present the multitask state.
Optionally, the displaying the application program currently running on the touch screen as a preview interface to display the application program as a multitask state includes:
the application program running on the current touch screen performs reduced display of its application interface; when a plurality of application programs exist, the reduced application interfaces are stacked in sequence, bounce back when the stack reaches the edge of the current touch screen, and are then displayed arranged in sequence.
Optionally, the obtaining a second touch operation applied to the touch screen, and executing an operation intention corresponding to the second touch operation according to the current state of the touch screen includes:
the second touch operation is a sliding operation for an application interface, the current state of the touch screen is a multitasking state, and the application interface is adsorbed to a second preset position of the current touch screen to be displayed according to the sliding speed of the second touch operation, or is displayed in other display areas in a cross-screen mode.
Optionally, the controlling, according to the sliding state of the first touch operation or the second touch operation, the application interface to perform cross-screen display in another display area includes:
if a plurality of application interfaces exist in the other display area that receives the cross-screen display, application preemption occurs in that display area and it automatically enters the multitask state.
Optionally, the performing cross-screen display according to the operation intention of the first touch operation includes:
and judging whether the application program is limited by cross-screen display or not according to the application program corresponding to the current application interface, limiting the cross-screen display of the application program limited by the cross-screen display, and executing the cross-screen display of the application program not limited.
Optionally, the determining, according to the application program corresponding to the current application interface, whether the application program is limited to cross-screen display includes:
distributing a cross-screen control ID to an application program, and judging whether the application program is limited to cross-screen display according to the cross-screen control ID corresponding to the application program.
Optionally, the number of vehicle-mounted display screens is two or more, and the performing of the cross-screen display according to the operation intention of the first touch operation includes:
and determining the vehicle-mounted display screen in the sliding direction according to the sliding direction of the first touch operation based on the relative direction position of the current touch screen on the vehicle-mounted display screen, and executing cross-screen display on the vehicle-mounted display screen, wherein the sliding direction corresponds to the direction position of the vehicle-mounted display screen executing cross-screen display.
Optionally, the executing of the multitask state or the cross-screen display of the application interface according to the operation intention of the first touch operation includes:
when the current touch screen is in a multitask state, adsorbing an application interface in the multitask state to a second preset position of the current touch screen for display;
when the current touch screen executes cross-screen display, adsorbing an application interface executing the cross-screen display to a second preset position of the current touch screen for display;
and displaying the application interface which executes the multitask state of the application interface or is displayed in a cross-screen mode as a preview interface.
Optionally, the method for controlling a display window for an automobile further includes: in the multitask state, if use of the application program ends or no touch operation occurs within a second time threshold, removing the multitask-state animation.
Optionally, the method for controlling a display window for an automobile further includes: and if the multitask frame is at least partially moved out of the display range of the touch screen in the multitask state, closing the running application program in the multitask state.
In a second aspect of the present invention, there is provided a method for controlling a display window for an automobile, including:
presenting a display interface into a multitasking state based on a first touch operation acting on a touch screen;
and acquiring a second touch operation acting on the touch screen, and performing cross-screen display or frame adsorption display or program closing according to the second touch operation.
Optionally, the first touch operation is long-time pressing or sliding down, an operation position of the first touch operation is located on an application interface, and the operation position is used for controlling the application interface; and an operation control identifier or an operation control area is displayed on the application interface.
Optionally, the second touch operation is a sliding operation for an application interface, and the application interface is adsorbed to a third preset position of the current touch screen according to the second touch operation for display, or displayed in other display areas across screens.
Optionally, the method further includes that a plurality of application interfaces exist in the current touch screen for cross-screen display, and if application preemption occurs in the current touch screen, the current touch screen automatically enters a multitask state.
In a third aspect of the present invention, there is provided a method for controlling a display window for an automobile, comprising:
acquiring a first touch operation acting on a touch screen, wherein the first touch operation occurs at a first preset position of an application interface;
executing a multitask state of an application interface or cross-screen display according to the operation intention of the first touch operation; the multitask state is that the application interfaces of one or more application programs are displayed as preview interfaces; the cross-screen display comprises cross-screen display on another display screen or display in another divided area of the touch screen.
In a fourth aspect of the present invention, there is provided a control device for a display window for an automobile, comprising:
the device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring a first touch operation acting on a touch screen, and the first touch operation occurs at a first preset position of an application interface;
the display module is used for presenting the application interface in a multitask state or performing cross-screen display according to the operation intention of the first touch operation;
the control module is used for acquiring a second touch operation acting on the touch screen and executing an operation intention corresponding to the second touch operation according to the current state of the touch screen.
In a fifth aspect of the present invention, there is provided a control device for a display window for an automobile, comprising:
the display module is used for presenting a display interface into a multitasking state based on first touch operation acting on the touch screen;
and the control module is used for acquiring a second touch operation acting on the touch screen, and performing cross-screen display or frame adsorption display or program closing according to the second touch operation.
A sixth aspect of the present invention provides a control device for a display window for an automobile, comprising:
the device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring a first touch operation acting on a touch screen, and the first touch operation occurs at a first preset position of an application interface;
the display module is used for presenting the application interface in a multitask state or performing cross-screen display according to the operation intention of the first touch operation; the multitask state is that the application interfaces of one or more application programs are displayed as preview interfaces; the cross-screen display comprises cross-screen display on another display screen or display in another divided area of the touch screen.
A seventh aspect of the present invention provides a vehicle comprising a processor, a memory, and a computer program stored on the memory and capable of running on the processor, wherein the computer program, when executed by the processor, implements the steps of the method for controlling an automotive display window as described above.
An eighth aspect of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the method for controlling a display window for an automobile as described above.
In the technical scheme provided by the invention, the user's intention is determined from the touch operation, human-machine interaction follows the user's actual operation requirements, and user interaction efficiency is improved.
Drawings
FIG. 1 is a flowchart illustrating a method for controlling a display window of an automobile according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a first touch operation display according to an embodiment of the invention;
FIG. 3 is a schematic diagram illustrating a cross-screen display of an application interface from a display screen A to a display screen B in a reduced size according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an application interface displayed across regions of a display screen, from display area 1 to display area 2, according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating a method for controlling a display window according to another embodiment of the present invention;
FIG. 6 is a flowchart illustrating a method for controlling a display window of a vehicle according to another embodiment of the present invention;
FIG. 7 is a flowchart illustrating a method for controlling a display window of a vehicle according to another embodiment of the present invention;
FIG. 8 is a block diagram of a control device for a display window of an automobile according to an embodiment of the present invention;
fig. 9 is a block diagram of a control device for a display window of an automobile according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Functions such as video call and video recording often run simultaneously with tasks such as map navigation and music, so a cabin interaction mode supporting multitasking and cross-screen capability is urgently needed. Projecting an application's display content to another screen by clicking a control button, however, gives a poor interaction effect and low interaction efficiency. Gesture operations such as sliding, pressing and dragging are comparatively convenient, so cross-screen display and program control in the cabin can be realized through gesture operations.
It should be understood that most vehicle-mounted display screens are touch screens, for example those of the main cockpit and the front passenger (co-driver) position; in some cars only part of the on-board display screens are touch screens, and the screen of the passenger cabin (rear row) is display-only or has limited touch capability. Therefore, the other display screens described in the present invention may be understood either as touch screens or as plain display screens, except where a touch operation is described, in which case a touch screen is meant. In addition, the cross-screen display mentioned in the invention may be understood either as display on screens at different positions in the vehicle cockpit or as display in different divided areas of the same screen; the latter is particularly suitable when the vehicle-mounted control screen is one large integrated display.
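As an illustration of this display topology, the following Kotlin sketch models physical screens and divided areas as display regions; the class and field names (DisplayRegion, CockpitLayout) and the region identifiers are assumptions introduced for illustration and are not taken from the patent.

```kotlin
// Minimal sketch of a cockpit display topology; names and layout are illustrative assumptions.
data class DisplayRegion(
    val id: String,            // e.g. "A", "B", or "A-1" for a divided area of screen A
    val isTouchEnabled: Boolean,
    val parentScreen: String   // physical screen this region belongs to
)

class CockpitLayout(private val regions: List<DisplayRegion>) {
    // Cross-screen display targets: every region except the source region itself.
    fun crossDisplayTargets(source: DisplayRegion): List<DisplayRegion> =
        regions.filter { it.id != source.id }

    // A rear-cabin region may be display-only (touch limited), as the description notes.
    fun touchRegions(): List<DisplayRegion> = regions.filter { it.isTouchEnabled }
}

fun main() {
    val main = DisplayRegion("A", isTouchEnabled = true, parentScreen = "A")      // main cockpit screen
    val layout = CockpitLayout(
        listOf(
            main,
            DisplayRegion("B", isTouchEnabled = true, parentScreen = "B"),        // co-driver screen
            DisplayRegion("C", isTouchEnabled = false, parentScreen = "C"),       // rear cabin, display only
            DisplayRegion("A-1", isTouchEnabled = true, parentScreen = "A")       // divided area of a large screen
        )
    )
    println(layout.crossDisplayTargets(main).map { it.id })   // [B, C, A-1]
}
```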
Referring to fig. 1, fig. 1 is a schematic flow chart illustrating a method for controlling a display window for an automobile according to an embodiment of the present invention. The control method of the display window for the automobile is applied to a software control system and comprises the following steps:
step S100: acquiring a first touch operation acting on a touch screen, wherein the first touch operation occurs at a first preset position of an application interface;
step S200: and executing a multi-task state of an application interface or cross-screen display according to the operation intention of the first touch operation.
Taking the main control screen of the automobile as an example, an application interface such as a navigation map is displayed on the touch screen of the main control screen. If the user then wants to start a video call application as well, cross-screen display of the application interfaces is needed: for example, the video call is displayed on the main control screen (the display screen of the main cockpit) and the navigation map is displayed in another display area, or of course the display positions may be swapped. In addition, a multitask state can be realized within the display screen, and this too can be triggered by the first touch operation, so that the user can control the software by operating according to need.
The first touch operation may be a pressing operation or a pull-down operation; its specific form is not limited and may also take the form of the second touch operation described below, or other gestures such as two-finger or three-finger taps. In some embodiments, if the first touch operation is a pressing operation whose duration meets a first time threshold, the application program currently running on the touch screen is displayed as a preview interface so as to present a multitask state; or if the first touch operation is a slide-down operation, the application program running on the current touch screen is likewise displayed as a preview interface so as to present a multitask state. The multitask state may present a single task state or a plurality of pending task states; details are described later.
Since the application interface of each application program has its own control areas and control buttons, an area or identifier that responds to the first touch operation can be preset so as not to interfere with normal use of the application program. In one embodiment, as shown in fig. 2, an operation control identifier is displayed on the application interface as the first preset position, rendered as a gray straight line; when the first touch operation occurs on the gray straight line, cross-screen display or position movement of the application interface can be performed according to the first touch operation, or the multitask state can be entered directly.
In other embodiments, the first preset position may instead be an operation control area, for example a partial area at the top of the application interface, such as the region where the gray line sits in fig. 2 but without a gray line being displayed.
Due to the diversity of the operation intentions of the user, it is possible to display the application programs across screens or switch the application programs, or move the application programs within the screen. Then, if the first touch operation occurs at the first preset position, the operation intention of the user may be to implement cross-screen display or switch to a multi-task state.
For example, if the first touch operation is a pressing operation, whether to perform cross-screen display or to switch to the multitask state can be decided from the pressing duration. If the user wants cross-screen display, the second touch operation following the first touch operation is a drag-and-slide operation; if the user wants to switch to the multitask state, a long press is required to enter it, after which the second touch operation is performed on the multitask-state display.
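A minimal sketch of this intent decision is given below in Kotlin; the threshold values, the 10-pixel movement tolerance and the enum names are assumptions introduced for illustration, not values defined by the patent.

```kotlin
// Hedged sketch: classifying the first touch operation into an operation intention.
enum class FirstTouchIntent { ENTER_MULTITASK, CROSS_SCREEN_DRAG, NONE }

data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

const val FIRST_TIME_THRESHOLD_MS = 600L   // "first time threshold t" (value assumed)
const val SLIDE_DOWN_MIN_PX = 80f          // minimum downward travel for a slide-down (assumed)

fun classifyFirstTouch(down: TouchSample, current: TouchSample): FirstTouchIntent {
    val heldMs = current.timeMs - down.timeMs
    val dx = current.x - down.x
    val dy = current.y - down.y
    return when {
        // Long press at the first preset position: show running apps as preview cards.
        heldMs >= FIRST_TIME_THRESHOLD_MS && kotlin.math.hypot(dx, dy) < 10f ->
            FirstTouchIntent.ENTER_MULTITASK
        // Slide-down on the operation control identifier: also enters the multitask state.
        dy >= SLIDE_DOWN_MIN_PX && dy > kotlin.math.abs(dx) ->
            FirstTouchIntent.ENTER_MULTITASK
        // A short press followed by dragging is treated as a cross-screen / move gesture.
        kotlin.math.hypot(dx, dy) >= 10f ->
            FirstTouchIntent.CROSS_SCREEN_DRAG
        else -> FirstTouchIntent.NONE
    }
}
```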
Step S300: and acquiring a second touch operation acting on the touch screen, and executing an operation intention corresponding to the second touch operation according to the current state of the touch screen.
In combination with the above, the second touch operation is a further operation based on the first touch operation. If the first touch operation is a press (selection) whose duration does not meet the first time threshold t, the selected application interface can be dragged within the touch screen or thrown to another display area by sliding, thereby realizing cross-screen display; alternatively the second touch operation continues the interaction, for example closing the application interface. That is, the first touch operation may directly carry out a cross-screen operation, and the second touch operation may then close the application interface, perform a second cross-screen display, or carry out some other operation. For example, the second touch operation may be the same as the first touch operation but occur on the screen that received the cross-screen display, such as the co-driver screen, with the application interface being displayed back across screens onto the main control screen. As another example, the second touch operation is a long-press operation, which reduces the application interface in size. As shown in fig. 3 and fig. 4, fig. 3 shows the interface displayed across screens from display screen A to display screen B, zoomed out and attached to the lower left corner. Fig. 4 shows that within any display screen the display area may have several divided areas, each of which can show an application interface; fig. 4 shows three display areas, with the application interface displayed across regions from display area 1 to display area 2.
Specifically, if the first touch operation is a pressing operation (selection operation) and the pressing duration meets the first time threshold, for example lasting time t, the display window (the display window of the entire touch screen) enters the multitask state and is shown as a multitask animation; the second touch operation can then select, move or close the application interfaces of the application programs in the multitask state. To close an application, the user can click its "x" control; of course, all applications can also be closed directly, for example by dragging.
According to the technical scheme provided by the invention, the intention mode of the touch operation is determined by judging the first touch operation, then the intention of the user operation is realized according to the second touch operation, projection display is not required to be carried out through keys, and cross-screen display or multi-task state control can be quickly realized. And man-machine interaction is realized according to the actual operation requirements of the user, so that the user interaction efficiency is improved.
A method for controlling a display window for an automobile according to an embodiment of the present invention will be described with reference to fig. 2, 3, 4, and 5. In the following embodiments, the first touch operation is a pressing operation, and the second touch operation is a sliding operation. The method specifically comprises the following steps:
step S501: acquiring the operation duration and the operation position of the first touch operation;
the operation position is used for controlling an application interface displayed by a display window, the operation position is positioned on the application interface, and an operation control identifier or an operation control area is displayed on the application interface displayed by the display window; the gray straight line as depicted in FIG. 2;
step S502: judging whether the first touch operation meets a preset condition or not;
if the first touch operation meets the preset condition, the application interface is controlled to present a multitasking state, which may specifically refer to the descriptions in steps S503 to S504. And if the preset condition is not met, directly responding to a second touch operation or directly executing the first touch operation.
Specifically, when the preset condition is not met, the first touch operation is a cross-screen operation, and the application interface can be slid directly to perform cross-screen display. The cross-screen display may be multi-screen task interaction, cockpit cross-screen display, cross-device display, and so on. Alternatively, the first touch operation is an operation that selects the application interface and the second touch operation is an operation that closes it, so the application interface can be closed directly after the second touch operation. Refer to the explanation of steps S505 to S506.
Step S503: and if the first touch operation meets the preset condition, controlling the application interface to present a multi-task state.
For example, the first touch operation is a pressing operation whose duration meets the first time threshold t, and the application program running on the current touch screen is displayed as a preview interface so as to present the multitask state; the multitask state may consist of one or more application interfaces. Specifically, the application program running on the current touch screen has its application interface displayed in reduced size: the picture content of the application interface is shrunk to a minimal card and, if the first touch operation is a long press, the interface shrinks toward the pressing position, or toward a fixed position, e.g. close to or away from the first preset position.
Further, if only one application program exists, its application interface can be displayed in reduced size at the pressing position or at a second preset position. The second preset position is, for example, a position near the four corners or four edges of the display window. A subsequent second touch operation can then display the application interface across screens.
As another example, if multiple application programs are running on the current touch screen, the reduced application interfaces are stacked in sequence, bounce back when the stack reaches the edge of the current touch screen, and are then displayed arranged in sequence. The application interfaces of the several application programs displayed in this arrangement can be selected: for example, by sliding up and down when they are arranged vertically, or left and right when they are arranged horizontally. If the application programs are distributed evenly over the touch screen, selection can be made by tapping directly. If there are too many application interfaces and they exceed the display boundary while being arranged, they are shown and then bounce back into an ordered arrangement.
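The following Kotlin sketch illustrates one possible way to lay out the shrunk preview cards and pull the row back inside the screen edge; the card size, spacing and single-row layout are assumptions made purely for illustration.

```kotlin
// Illustrative layout of shrunk preview cards in the multitask state (values assumed).
data class PreviewCard(val appId: String, var x: Float = 0f, var y: Float = 0f)

const val CARD_WIDTH = 320f
const val CARD_SPACING = 24f

// Arrange cards left to right from the press position; if the row would cross the
// right screen edge, pull the whole row back so it stays inside ("bounce back" arrangement).
fun arrangePreviews(cards: List<PreviewCard>, pressX: Float, pressY: Float, screenWidth: Float) {
    var x = pressX
    for (card in cards) {
        card.x = x
        card.y = pressY
        x += CARD_WIDTH + CARD_SPACING
    }
    val overflow = (cards.lastOrNull()?.x ?: 0f) + CARD_WIDTH - screenWidth
    if (overflow > 0f) {
        cards.forEach { it.x -= overflow }   // rebound: shift the sequence back inside the display
    }
}
```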
Further, if several application programs exist, their application interfaces can be reduced and the interfaces in the multitask state adsorbed to a second preset position of the current touch screen for display, for example near the four corners or four edges of the display window; the application interfaces can then be displayed across screens by a subsequent second touch operation.
Step S504: and acquiring a second touch operation acting on the touch screen, and executing an operation intention corresponding to the second touch operation according to the current state of the touch screen.
Specifically, the current state of the touch screen is a multitask state, the second touch operation is a sliding operation for an application interface, and the application interface is adsorbed to a second preset position of the current touch screen to be displayed according to the sliding speed of the second touch operation, or is displayed in other display areas across screens. As shown in fig. 3, the application interface is displayed in the display screen a to the display screen B across screens, and the preview interface is zoomed out in the illustration and is attached to the lower left corner. Of course, the display interface may be the same size as that of the display screen a. Or as shown in fig. 4, the display screen has 3 display areas, and the application interface is displayed in the display area 2 across screens from the display area 1.
For example, if the sliding speed of the second touch operation is greater than or equal to the preset sliding speed threshold, the application interface is displayed across the screen, and if the sliding speed of the second touch operation is less than the preset sliding speed threshold, the application interface is adsorbed to a second preset position of the current touch screen for display, where the second preset position is described above and is not described again.
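A minimal Kotlin sketch of this decision follows; the velocity threshold and the nearest-corner snapping rule are illustrative assumptions rather than values defined by the patent.

```kotlin
// Hedged sketch: resolving the second touch operation from its sliding speed.
sealed class SecondTouchResult {
    data class CrossScreen(val targetRegionId: String) : SecondTouchResult()
    data class SnapTo(val x: Float, val y: Float) : SecondTouchResult()
}

const val FLING_SPEED_PX_PER_S = 1200f   // "preset sliding speed threshold" (value assumed)

fun resolveSecondTouch(
    speedPxPerS: Float,
    releaseX: Float,
    releaseY: Float,
    screenW: Float,
    screenH: Float,
    targetRegionId: String
): SecondTouchResult =
    if (speedPxPerS >= FLING_SPEED_PX_PER_S) {
        // Fast slide ("throw"): display the application interface across screens.
        SecondTouchResult.CrossScreen(targetRegionId)
    } else {
        // Slow slide: adsorb the interface to the nearest corner (a second preset position).
        val snapX = if (releaseX < screenW / 2) 0f else screenW
        val snapY = if (releaseY < screenH / 2) 0f else screenH
        SecondTouchResult.SnapTo(snapX, snapY)
    }
```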
In the multitask state the task state may also end, or an application may need to be removed; such cases can be handled as follows (a dispatch sketch is given after this list):
1. If use of an application program ends in the multitask state, its task frame is removed from the multitask state. For example, in a multitask state containing video recording and map navigation, once the recording task has finished and no longer needs to run, it can be closed and the multitask state exited. The multitask state is displayed in the form of an animation;
2. If no touch operation occurs within the second time threshold in the multitask state, the multitask state is exited. That is, if the user triggers the multitask state but performs no further operation, the state is exited;
3. If the multitask frame is at least partially moved out of the display range of the touch screen while in the multitask state, the application programs running in the multitask state are closed. For example, the first touch operation triggers the multitask state and the second touch operation drags (moves slowly) the multitask frame beyond the boundary of the touch screen, whereupon the several application programs are closed;
4. If the multitask frame is at least partially thrown out of the display range of the touch screen while in the multitask state, cross-screen display is performed. For example, the first touch operation triggers the multitask state and the second touch operation throws (moves quickly) the multitask frame beyond the boundary of the touch screen, whereupon the multitask frame is displayed across screens. Besides displaying all of the tasks across screens, a single one of the tasks can also be made to display across screens.
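The sketch below, in Kotlin, dispatches the four outcomes listed above; the event model, the idle timeout value and the drag/throw flag are assumptions used purely for illustration.

```kotlin
// Sketch of dispatching the multitask-state outcomes described above (names and values assumed).
sealed class MultitaskEvent {
    object TaskFinished : MultitaskEvent()                           // 1. use of an application ends
    data class Idle(val idleMs: Long) : MultitaskEvent()             // 2. no touch within the second time threshold
    data class FrameMovedOut(val thrown: Boolean) : MultitaskEvent() // 3/4. frame partly outside the screen
}

const val SECOND_TIME_THRESHOLD_MS = 5_000L   // value assumed

fun handleMultitaskEvent(event: MultitaskEvent): String = when (event) {
    is MultitaskEvent.TaskFinished -> "remove the finished task's frame from the multitask animation"
    is MultitaskEvent.Idle ->
        if (event.idleMs >= SECOND_TIME_THRESHOLD_MS) "exit the multitask state"
        else "stay in the multitask state"
    is MultitaskEvent.FrameMovedOut ->
        if (event.thrown) "display the multitask frame across screens"      // thrown (fast move)
        else "close the applications running in the multitask state"        // dragged (slow move)
}
```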
Step S505: and if the first touch operation does not meet the preset condition, directly executing cross-screen display.
For example, the first touch operation is a sliding operation, and the application interface is thrown to other display areas for cross-screen display.
In some embodiments, the application interface currently displayed on the touch screen may also be dragged within the display window, or an interface or a button for closing the program may be provided after the application interface is clicked.
Step S506: and if the application preemption occurs, automatically entering a multitasking state.
If a plurality of application interfaces exist in the other display area that receives the cross-screen display, application preemption occurs in that display area and it automatically enters the multitask state. For example, when a music application interface is already shown in the other display area and a map navigation interface is displayed across screens from the main control screen, application preemption occurs and the two application interfaces are automatically shown in a task frame in the multitask state.
It should be understood that application preemption is not limited to the other display areas: when the current touch screen, acting as the main control screen, receives a cross-screen display from another display area, it likewise enters the multitask state. That is, multiple display screens can perform cross-screen display to one another.
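The following Kotlin sketch shows the preemption rule in isolation: when a cross-screen interface arrives at a region already showing another interface, the region switches to the multitask state. The class name, method name and app identifiers are assumptions for illustration.

```kotlin
// Illustrative sketch of application preemption on a display region (names assumed).
class DisplayRegionState(val regionId: String) {
    private val shownApps = mutableListOf<String>()
    var inMultitaskState = false
        private set

    fun receiveCrossScreen(appId: String) {
        if (shownApps.isNotEmpty()) {
            // Preemption: keep both interfaces, shown as preview cards in a multitask frame.
            inMultitaskState = true
        }
        shownApps.add(appId)
    }
}

fun main() {
    val coDriverScreen = DisplayRegionState("B")
    coDriverScreen.receiveCrossScreen("music")        // music already displayed on screen B
    coDriverScreen.receiveCrossScreen("navigation")   // navigation thrown over from the main screen
    println(coDriverScreen.inMultitaskState)          // true: both interfaces shown as a multitask frame
}
```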
In this step, whether different applications support cross-screen display or the multitask function needs to be considered. Specifically, whether the application program is restricted from cross-screen display is judged from the application program corresponding to the current application interface: cross-screen display is blocked for a restricted application program and performed for an unrestricted one. This can be realized as follows:
A cross-screen control ID is assigned to each application program, and whether the application program is restricted from cross-screen display is judged from the cross-screen control ID corresponding to it. For example, each APP is assigned an APP ID; the screens known to support cross-screen display are screens A and B; and, according to the operating system and the APP definitions, most content supports A-B cross-screen display while a few applications support the multitask capability.
The screens supporting cross-screen display may also be screens A, B, C and D. In that case, cross-screen display oriented by direction and position can be realized. For example, based on the relative position of the current touch screen among the vehicle-mounted display screens, the vehicle-mounted display screen lying in the sliding direction of the first touch operation is determined and the cross-screen display is performed on that screen; that is, the sliding direction corresponds to the position of the vehicle-mounted display screen on which cross-screen display is performed. In one embodiment, when the main cockpit slides the application interface to the right, the interface is displayed across screens on the co-driver's display screen; when the main cockpit slides the application interface to the left, the interface is displayed across screens on the display screen of the passenger cabin.
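A combined Kotlin sketch of the per-application cross-screen permission check and the direction-based choice of target screen is given below; the direction-to-screen map, the ID values and the allowed flag are all assumptions for illustration.

```kotlin
// Hedged sketch: cross-screen control ID check plus direction-based target selection (layout assumed).
enum class SlideDirection { LEFT, RIGHT, UP, DOWN }

data class AppPolicy(val appId: String, val crossScreenControlId: Int, val crossScreenAllowed: Boolean)

// Relative positions of the other screens as seen from the main cockpit screen (assumed).
val screenByDirection = mapOf(
    SlideDirection.RIGHT to "B",   // co-driver screen
    SlideDirection.LEFT to "C"     // passenger-cabin screen, for example
)

fun crossScreenTarget(policy: AppPolicy, direction: SlideDirection): String? {
    if (!policy.crossScreenAllowed) return null   // this application is restricted from cross-screen display
    return screenByDirection[direction]           // null when no screen lies in that direction
}

fun main() {
    val navigation = AppPolicy("navigation", crossScreenControlId = 1001, crossScreenAllowed = true)
    println(crossScreenTarget(navigation, SlideDirection.RIGHT))   // B
}
```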
As can be seen from the foregoing embodiments, the technical solution provided by the present invention can realize display control and cross-screen display of application programs based on the user's touch operations. In addition, cross-screen restrictions can be set for different application programs, and the display interfaces can be reduced and attached for display so that several application programs are shown within one screen. This facilitates the various controls of application interfaces in the multitask state, eases human-machine interaction, and improves control of the vehicle-mounted display window.
As shown in fig. 6, the present invention further provides a method for controlling a display window for an automobile, comprising the steps of:
Step 610: presenting a display interface in a multitask state based on a first touch operation acting on the touch screen.
For example, the touch operation is a pressing operation, the pressing duration meets a first preset time threshold, it is determined that the first touch operation starts, and the display interface of the current touch screen is controlled and displayed in a multitasking state, for example, a plurality of or a single display interface is displayed in a reduced mode.
Specifically, the first touch operation is a long press or a slide-down, its operation position is located on the application interface, and the operation position is used for controlling the application interface; an operation control identifier or an operation control area is displayed on the application interface. The first touch operation may also be triggered by preset conditions such as slide-down distance and slide-down speed; it can be understood together with the long-press operation described above and is not detailed again.
Step 620: and acquiring a second touch operation acting on the touch screen, and performing cross-screen display or frame adsorption display or program closing according to the second touch operation.
The second touch operation may be performed on a certain application interface, or may be performed on multiple application interfaces, for example, when the multi-task state includes multiple application interfaces.
For example, the second touch operation is a sliding operation for an application interface, and the application interface is adsorbed to a third preset position of the current touch screen according to the second touch operation for display, where the third preset position may be a position near four corners or four sides of a display window.
Or displaying in other display areas in a cross-screen mode based on the second touch operation. For example, the task box in the multitasking state is displayed on other screens in a cross-screen mode completely, or only part of the interface is displayed in a cross-screen mode.
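The Kotlin sketch below dispatches the three outcomes of step 620 for either a single selected interface or the whole multitask frame; the gesture-to-outcome mapping (thrown, dragged out, released in place) is an assumption drawn from the examples above, not a definitive implementation.

```kotlin
// Sketch of step 620: cross-screen display, border adsorption, or program closing (mapping assumed).
data class MultitaskFrame(val previews: MutableList<String>)

sealed class Outcome {
    data class CrossScreen(val apps: List<String>, val targetRegion: String) : Outcome()
    data class AdsorbToBorder(val apps: List<String>) : Outcome()
    data class Close(val apps: List<String>) : Outcome()
}

fun applySecondTouch(
    frame: MultitaskFrame,
    selectedApp: String?,        // null means the gesture acted on the whole multitask frame
    thrownOffScreen: Boolean,
    draggedOffScreen: Boolean,
    targetRegion: String
): Outcome {
    val apps = if (selectedApp != null) listOf(selectedApp) else frame.previews.toList()
    return when {
        thrownOffScreen -> Outcome.CrossScreen(apps, targetRegion)   // thrown past the edge: cross-screen display
        draggedOffScreen -> Outcome.Close(apps)                      // slowly dragged out of range: close programs
        else -> Outcome.AdsorbToBorder(apps)                         // released inside: snap to a preset position
    }
}
```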
In addition, if a plurality of application interfaces exist on the current touch screen for cross-screen display and application preemption occurs on the current touch screen, it automatically enters the multitask state.
Likewise, if a plurality of application interfaces exist in another display area that receives the cross-screen display, application preemption occurs in that display area and it automatically enters the multitask state. For example, when a music application interface is already shown in the other display area and a map navigation interface is displayed across screens from the main control screen, application preemption occurs and the two application interfaces are automatically shown in a task frame in the multitask state.
It should be understood that application preemption is not limited to the other display areas: when the current touch screen, acting as the main control screen, receives a cross-screen display from another display area, it likewise enters the multitask state. That is, multiple display screens can perform cross-screen display to one another.
In this step, whether different applications support cross-screen display or the multitask function needs to be considered. Specifically, whether the application program is restricted from cross-screen display is judged from the application program corresponding to the current application interface: cross-screen display is blocked for a restricted application program and performed for an unrestricted one. This can be realized as follows:
A cross-screen control ID is assigned to each application program, and whether the application program is restricted from cross-screen display is judged from the cross-screen control ID corresponding to it. For example, each APP is assigned an APP ID; the screens known to support cross-screen display are screens A and B; and, according to the operating system and the APP definitions, most content supports A-B cross-screen display while a few applications support the multitask capability.
The screens supporting cross-screen display may also be screens A, B, C and D. In that case, cross-screen display oriented by direction and position can be realized. For example, based on the relative position of the current touch screen among the vehicle-mounted display screens, the vehicle-mounted display screen lying in the sliding direction of the first touch operation is determined and the cross-screen display is performed on that screen; that is, the sliding direction corresponds to the position of the vehicle-mounted display screen on which cross-screen display is performed. In one embodiment, when the main cockpit slides the application interface to the right, the interface is displayed across screens on the co-driver's display screen; when the main cockpit slides the application interface to the left, the interface is displayed across screens on the display screen of the passenger cabin.
In the multitask state the task state may also end, or an application may need to be removed; such cases can be handled as follows:
1. If use of an application program ends in the multitask state, its task frame is removed from the multitask state. For example, in a multitask state containing video recording and map navigation, once the recording task has finished and no longer needs to run, it can be closed and the multitask state exited. The multitask state is displayed in the form of an animation;
2. If no touch operation occurs within the second time threshold in the multitask state, the multitask state is exited. That is, if the user triggers the multitask state but performs no further operation, the state is exited;
3. If the multitask frame is at least partially moved out of the display range of the touch screen while in the multitask state, the application programs running in the multitask state are closed. For example, the first touch operation triggers the multitask state and the second touch operation drags (moves slowly) the multitask frame beyond the boundary of the touch screen, whereupon the several application programs are closed;
4. If the multitask frame is at least partially thrown out of the display range of the touch screen while in the multitask state, cross-screen display is performed. For example, the first touch operation triggers the multitask state and the second touch operation throws (moves quickly) the multitask frame beyond the boundary of the touch screen, whereupon the multitask frame is displayed across screens. Besides displaying all of the tasks across screens, a single one of the tasks can also be made to display across screens.
As shown in fig. 7, the present invention further provides a method for controlling a display window for an automobile, which can implement preview display of one or more application interfaces according to user operations and provide further operations to the user, or directly realize display on different display screens or in different display areas of the same display screen. The method specifically comprises the following steps:
Step 710: acquiring a first touch operation acting on a touch screen, wherein the first touch operation occurs at a first preset position of an application interface;
Step 720: presenting an application interface in a multitask state or performing cross-screen display according to the operation intention of the first touch operation; the multitask state is that the application interfaces of one or more application programs are displayed as preview interfaces; the cross-screen display comprises cross-screen display on another display screen or display in another divided area of the touch screen.
The above method can refer to the above description of the method, and the specific implementation process is not repeated.
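As an illustration of the in-screen variant of cross-screen display referred to in step 720, the Kotlin sketch below moves an application interface between divided areas of one large touch screen, as in fig. 4; the area names and the map-based bookkeeping are assumptions made for illustration.

```kotlin
// Minimal sketch: moving an application interface between divided areas of one screen (names assumed).
class SplitScreen(private val areas: MutableMap<String, String?>) {  // area id -> app id (or null)
    fun moveApp(appId: String, fromArea: String, toArea: String) {
        if (areas[fromArea] == appId) {
            areas[fromArea] = null
            areas[toArea] = appId   // the interface is now shown in another divided area, as in Fig. 4
        }
    }
    fun snapshot(): Map<String, String?> = areas.toMap()
}

fun main() {
    val screen = SplitScreen(mutableMapOf("area1" to "navigation", "area2" to null, "area3" to null))
    screen.moveApp("navigation", fromArea = "area1", toArea = "area2")
    println(screen.snapshot())   // {area1=null, area2=navigation, area3=null}
}
```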
It should be noted that, to describe the control method of the display window in the multitask state, the embodiments shown in fig. 1 to 7 may be referred to and, where necessary, understood in combination.
As shown in fig. 8, the present invention also provides a control device for a display window for an automobile, comprising:
an obtaining module 810, configured to obtain a first touch operation applied to a touch screen, where the first touch operation occurs at a first preset position of an application interface;
a display module 820, configured to execute a multitask state or cross-screen display of an application interface according to the operation intention of the first touch operation;
the control module 830 is configured to obtain a second touch operation applied to the touch screen, and execute an operation intention corresponding to the second touch operation according to the current state of the touch screen.
Specifically, the three modules may control the display window according to the following content:
the first touch operation can be a pressing operation or a pulling-down operation; the specific operation mode of the first touch operation is not limited, and the specific operation mode includes the second touch operation described below, or other gesture operations, such as double-point or three-point clicking, and the like, and is not limited intentionally. In some embodiments, the first touch operation is a pressing operation, and the pressing duration meets a first time threshold, an application program currently running on the touch screen can be displayed as a preview interface to be displayed in a multi-task state; or the first touch operation is a downslide operation, and an application program operated by the current touch screen is displayed as a preview interface to be displayed as a multi-task state; details will be described later.
Since the application interface of each application program has its own control areas and control buttons, an area or identifier that responds to the first touch operation can be preset so as not to interfere with normal use of the application program. In one embodiment, as shown in fig. 2, an operation control identifier is displayed on the application interface as the first preset position, rendered as a gray straight line; when the first touch operation occurs on the gray straight line, cross-screen display or position movement of the application interface can be performed according to the first touch operation, or the multitask state can be entered directly.
In other embodiments, the first preset position may instead be an operation control area, for example a partial area at the top of the application interface, such as the region where the gray line sits in fig. 2 but without a gray line being displayed.
Due to the diversity of the operation intentions of the user, it is possible to display the application programs across screens or switch the application programs, or move the application programs within the screen. Then, if the first touch operation occurs at the first preset position, the operation intention of the user may be to implement cross-screen display or switch to a multi-task state.
For example, if the first touch operation is a pressing operation, whether to perform cross-screen display or to switch to the multitask state can be decided from the pressing duration. If the user wants cross-screen display, the second touch operation following the first touch operation is a drag-and-slide operation; if the user wants to switch to the multitask state, a long press is required to enter it, after which the second touch operation is performed on the multitask-state display.
And acquiring a second touch operation acting on the touch screen, and executing an operation intention corresponding to the second touch operation according to the current state of the touch screen.
In combination with the above, the second touch operation is a further operation based on the first touch operation. If the first touch operation is a press (selection) whose duration does not meet the first time threshold t, the selected application interface can be dragged within the touch screen or thrown to another display area by sliding, thereby realizing cross-screen display; alternatively the second touch operation continues the interaction, for example closing the application interface. That is, the first touch operation may directly carry out a cross-screen operation, and the second touch operation may then close the application interface, perform a second cross-screen display, or carry out some other operation. For example, the second touch operation may be the same as the first touch operation but occur on the screen that received the cross-screen display, such as the co-driver screen, with the application interface being displayed back across screens onto the main control screen. As another example, the second touch operation is a long-press operation, which reduces the application interface in size.
Specifically, if the first touch operation is a pressing (selection) operation whose pressing duration meets the first time threshold, for example a duration of t, the display window (the display window of the entire touch screen) enters the multi-task state and is shown as a multi-task animation; the second touch operation may then select, move, or close the application interfaces of the application programs in the multi-task state. To close an application, the user can tap the "x" (or similar close control) on its preview; of course, all applications can also be closed directly, for example by dragging.
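A minimal sketch of how the second touch operation might be dispatched in the multi-task state is given below; the action names and the controller structure are assumptions for illustration, not the patent's concrete design:

```kotlin
// Hypothetical sketch: handling the second touch operation while the display
// window is in the multi-task state (select / move / close / close-all).
sealed interface MultitaskAction {
    data class Select(val appId: String) : MultitaskAction
    data class Move(val appId: String, val dx: Float, val dy: Float) : MultitaskAction
    data class Close(val appId: String) : MultitaskAction   // e.g. tapping "x" on a preview
    object CloseAll : MultitaskAction                        // e.g. dragging the whole task box away
}

class MultitaskController(private val previewAppIds: MutableSet<String>) {
    fun handle(action: MultitaskAction) {
        when (action) {
            is MultitaskAction.Select -> println("bring ${action.appId} to the foreground")
            is MultitaskAction.Move -> println("move preview of ${action.appId} by (${action.dx}, ${action.dy})")
            is MultitaskAction.Close -> previewAppIds.remove(action.appId)
            MultitaskAction.CloseAll -> previewAppIds.clear()
        }
    }
}
```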
As shown in fig. 9, the present invention also provides a control device for a display window for an automobile, comprising:
the display module 910 is configured to present a display interface in a multitasking state based on a first touch operation applied to the touch screen;
the control module 920 is configured to acquire a second touch operation applied to the touch screen, and perform cross-screen display or border adsorption display or program shutdown according to the second touch operation.
The display module 910 and the control module 920 are described in more detail as follows:
If the touch operation is a pressing operation and the pressing duration meets a first preset time threshold, it is judged that the first touch operation has started, and the display interface of the current touch screen is controlled to be displayed in the multi-task state, for example by showing one or more display interfaces at reduced size.
Specifically, the first touch operation is a long press or a swipe-down; its operation position is located on the application interface and is used to control that application interface, and an operation control identifier or operation control area is displayed on the application interface. The first touch operation may also be triggered by preset conditions such as sliding distance and sliding speed, and can be understood together with the long-press operation instruction described above, which is not repeated here.
The second touch operation may act on a single application interface, or on multiple application interfaces, for example when the multi-task state contains multiple application interfaces.
For example, if the second touch operation is a sliding operation on an application interface, the application interface is adsorbed to a third preset position of the current touch screen for display according to the second touch operation, where the third preset position may be a position near one of the four corners or four sides of the display window.
Alternatively, based on the second touch operation, the interface is displayed across screens in another display area. For example, the task box in the multi-task state may be displayed in its entirety on another screen across screens, or only part of the interface may be displayed across screens.
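As a minimal sketch of the border-adsorption behavior just described (assuming the four corners of the display window as the preset positions; the margin value and type names are illustrative only):

```kotlin
// Hypothetical sketch: snapping a released application interface to the
// nearest preset position near the corners of the display window.
import kotlin.math.hypot

data class Position(val x: Float, val y: Float)

fun snapToPresetPosition(release: Position, screenW: Float, screenH: Float): Position {
    val margin = 16f                                     // assumed inset from the window border
    val presets = listOf(
        Position(margin, margin),                        // near the top-left corner
        Position(screenW - margin, margin),              // near the top-right corner
        Position(margin, screenH - margin),              // near the bottom-left corner
        Position(screenW - margin, screenH - margin)     // near the bottom-right corner
    )
    return presets.minByOrNull { hypot(it.x - release.x, it.y - release.y) }!!
}
```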
In addition, when a plurality of application interfaces are present on the current touch screen for cross-screen display and application preemption occurs on the current touch screen, the current touch screen automatically enters the multi-task state.
Likewise, if the other display area receiving the cross-screen display already has a plurality of application interfaces and application preemption occurs there, that display area automatically enters the multi-task state. For example, when a music application interface is being displayed in another display area and a map-navigation application interface is then displayed onto it across screens from the main control screen, application preemption occurs and the two application interfaces are automatically shown in a task box in the multi-task state.
It should be understood that application preemption is not limited to the other display areas: the current touch screen, acting as the main control screen, likewise enters the multi-task state when it receives a cross-screen display from another display area. That is, the multiple display screens can perform cross-screen display to one another.
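The preemption behavior can be pictured with the following sketch; the class names and the music/navigation example mirror the text above, while everything else is an illustrative assumption:

```kotlin
// Hypothetical sketch: a display area that switches itself into the multi-task
// state when a cross-screen interface arrives while another app is showing.
class DisplayArea(val name: String) {
    private val shownApps = mutableListOf<String>()
    var inMultitaskState = false
        private set

    fun receiveCrossScreen(appId: String) {
        if (shownApps.isNotEmpty()) {
            inMultitaskState = true   // application preemption: fall back to the preview task box
        }
        shownApps.add(appId)
    }
}

fun main() {
    val coDriverScreen = DisplayArea("co-driver display")
    coDriverScreen.receiveCrossScreen("music")        // shown full screen
    coDriverScreen.receiveCrossScreen("navigation")   // preemption -> multi-task state
    println(coDriverScreen.inMultitaskState)          // prints: true
}
```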
In this step, it also needs to be considered whether different applications support cross-screen display or the multi-task function. Specifically, whether cross-screen display is restricted for the application program is judged according to the application program corresponding to the current application interface: cross-screen display is blocked for application programs that are restricted and performed for application programs that are not. This can be realized as follows:
A cross-screen control ID is assigned to each application program, and whether cross-screen display is restricted for the application program is judged according to its cross-screen control ID. For example, each APP is assigned an APP ID; the screens known to support cross-screen display are the A and B screens; and, according to the operating system and APP definitions, most content supports cross-screen display between A and B while only a few applications support the multi-task capability.
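A minimal sketch of such a per-application policy lookup is shown below; the table entries, field names, and application IDs are invented for illustration and are not part of the original disclosure:

```kotlin
// Hypothetical sketch: looking up whether an app's cross-screen control ID
// permits display on a given target screen.
data class CrossScreenPolicy(
    val appId: String,
    val allowedScreens: Set<Char>,    // e.g. screens 'A' and 'B'
    val supportsMultitask: Boolean
)

val policies = mapOf(
    "navigation" to CrossScreenPolicy("navigation", setOf('A', 'B'), supportsMultitask = true),
    "music" to CrossScreenPolicy("music", setOf('A', 'B'), supportsMultitask = true),
    "reversing.camera" to CrossScreenPolicy("reversing.camera", emptySet(), supportsMultitask = false)
)

fun isCrossScreenAllowed(appId: String, targetScreen: Char): Boolean =
    policies[appId]?.allowedScreens?.contains(targetScreen) ?: false
```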
The screens supporting cross-screen display may also be the A, B, C and D screens. In that case, cross-screen display according to direction and position can be realized. For example, based on the relative position of the current touch screen among the vehicle-mounted display screens, the vehicle-mounted display screen lying in the sliding direction is determined from the sliding direction of the first touch operation, and cross-screen display is performed on that screen; the sliding direction thus corresponds to the position of the vehicle-mounted display screen on which the cross-screen display is performed. In one embodiment, sliding the application interface to the right on the main cockpit screen displays it across screens on the display screen of the co-driver cockpit, while sliding it to the left displays it across screens on a display screen of the passenger cabin.
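A sketch of this direction-to-screen mapping follows; the seat layout and screen names are assumptions based only on the example above:

```kotlin
// Hypothetical sketch: choosing the target on-board screen from the swipe
// direction on the main cockpit (driver's) screen.
enum class SwipeDirection { LEFT, RIGHT, UP, DOWN }

// Screens reachable from the main cockpit screen, keyed by relative direction.
val screensFromMainCockpit = mapOf(
    SwipeDirection.RIGHT to "co-driver display",
    SwipeDirection.LEFT to "passenger-cabin display"
)

// Returns null when no screen lies in that direction (no cross-screen display).
fun targetScreenFor(direction: SwipeDirection): String? = screensFromMainCockpit[direction]
```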
In the multi-task state, the task state may end or an application may need to be removed; for these cases, task control can be performed in the following ways (a short sketch of these rules is given after the list):
1. If the use of an application program ends in the multi-task state, the task box in the multi-task state is removed. For example, with video recording and map navigation in the multi-task state, once the video recording task has finished it can be closed and the multi-task state exited, without continuing to execute the video recording task. The multi-task state is displayed in the form of an animation;
2. If no touch operation on the application program occurs within a second time threshold in the multi-task state, the multi-task state is exited. That is, if the user triggers the multi-task state but performs no further operation, the multi-task state can be exited;
3. If the multi-task box is at least partially moved out of the display range of the touch screen in the multi-task state, the running application programs in the multi-task state are closed. For example, the first touch operation triggers the multi-task state and the second touch operation drags (moves slowly) the task box of the multi-task state beyond the boundary of the touch screen, whereupon the multiple application programs are closed;
4. If the multi-task box is at least partially moved out of the display range of the touch screen in the multi-task state, cross-screen display is performed. For example, the first touch operation triggers the multi-task state and the second touch operation throws (moves quickly) the task box of the multi-task state beyond the boundary of the touch screen, whereupon the task box is displayed across screens. Besides displaying all the tasks across screens, a single one of the tasks may also be made to display across screens.
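The sketch below summarizes the four rules above; the distinction between a slow drag and a fast throw is modelled with an assumed speed threshold, and the time and speed values are illustrative only:

```kotlin
// Hypothetical sketch: task control for the multi-task state task box.
sealed interface TaskBoxEvent {
    object AppFinished : TaskBoxEvent                          // e.g. video recording ends
    data class Idle(val idleMs: Long) : TaskBoxEvent           // no touch operation for a while
    data class MovedOffScreen(val speedPxPerS: Float) : TaskBoxEvent
}

enum class TaskBoxResult { REMOVE_TASK_BOX, EXIT_MULTITASK, CLOSE_ALL_APPS, CROSS_SCREEN_DISPLAY, KEEP }

fun handleTaskBoxEvent(
    event: TaskBoxEvent,
    secondTimeThresholdMs: Long = 5_000L,   // assumed value of the "second time threshold"
    throwSpeedThreshold: Float = 1_500f     // assumed px/s boundary between drag and throw
): TaskBoxResult = when (event) {
    TaskBoxEvent.AppFinished -> TaskBoxResult.REMOVE_TASK_BOX
    is TaskBoxEvent.Idle ->
        if (event.idleMs >= secondTimeThresholdMs) TaskBoxResult.EXIT_MULTITASK else TaskBoxResult.KEEP
    is TaskBoxEvent.MovedOffScreen ->
        if (event.speedPxPerS >= throwSpeedThreshold) TaskBoxResult.CROSS_SCREEN_DISPLAY   // fast throw
        else TaskBoxResult.CLOSE_ALL_APPS                                                   // slow drag
}
```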
In addition, the present invention also provides a control device for a display window for an automobile, comprising:
the device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring a first touch operation acting on the touch screen, the first touch operation occurring at a first preset position of an application interface; the display module is used for executing the multi-task state of the application interface or performing cross-screen display according to the operation intention of the first touch operation; the multi-task state means that the application interfaces of one or more application programs are displayed as preview interfaces; and the cross-screen display includes cross-screen display on other display screens or display in other divided areas of the touch screen.
In this way, one or more application interfaces can be displayed as previews according to the user's operation and then operated further by the user, or displayed directly on different display screens or in different display areas of the same display screen.
It should be understood that the control devices shown in figs. 8 and 9 correspond to the control methods described above, and the description of the control methods can be read as supplementing the description of the control devices.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of both; the components and steps of the examples have been described above in general functional terms in order to illustrate clearly the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints of the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as a departure from the scope of the present invention.
The invention also provides a vehicle comprising a processor, a memory and a computer program stored on the memory and capable of running on the processor, wherein the computer program, when executed by the processor, implements the steps of the method for controlling a display window for an automobile as described above.
The present invention also provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the steps of the method for controlling a display window for an automobile as described above.
It is understood that the computer-readable storage medium may include any entity or device capable of carrying computer program code, such as a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), or a software distribution medium. The computer program includes computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. In some embodiments of the present invention, the device may include a controller, for example a single chip integrating a processor, a memory, a communication module, and the like; the processor here may refer to the processor included in the controller. The processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, and so on.
Any process or method description in the flowcharts, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. Alternative implementations are included within the scope of the preferred embodiments of the present invention, in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processing module, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them.
The above embodiments are intended only to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (24)

1. A method for controlling a display window for an automobile, comprising:
acquiring a first touch operation acting on a touch screen, wherein the first touch operation occurs at a first preset position of an application interface;
executing a multi-task state of an application interface or cross-screen display according to the operation intention of the first touch operation;
and acquiring a second touch operation acting on the touch screen, and executing an operation intention corresponding to the second touch operation according to the current state of the touch screen.
2. The method of claim 1, wherein obtaining the first touch operation applied to the touch screen comprises:
and acquiring the operation duration and the operation position of the first touch operation, wherein the operation position is used for controlling an application interface displayed by a display window, and the operation position is positioned on the application interface.
3. The method of claim 2, wherein the application interface displayed by the display window displays an operation control identifier or an operation control area.
4. The method according to claim 1, wherein the executing a multitask state of an application interface or executing a cross-screen operation according to the operation intention of the first touch operation comprises:
controlling an application interface to execute a multi-task state if the first touch operation meets a preset condition;
controlling an application interface to perform cross-screen display in other display areas according to the sliding state of the first touch operation or the second touch operation; the cross-screen display comprises multi-screen task interaction, cross-screen display between cabins, cross-device display, and cross-region display within a screen.
5. The method according to claim 4, wherein the controlling an application interface to execute a multitasking state according to whether the first touch operation meets a preset condition comprises:
the first touch operation is a pressing operation, the pressing duration meets a first time threshold, and the application program running on the current touch screen is displayed as a preview interface so as to be displayed in a multi-task state;
or the first touch operation is a downslide operation, and the application program operated by the current touch screen is displayed as a preview interface so as to be displayed in a multi-task state.
6. The method of claim 5, wherein displaying the application currently running on the touch screen as a preview interface for display in a multitasking state comprises:
the application program running on the current touch screen displays its application interface at reduced size;
when a plurality of application programs exist, the plurality of reduced application interfaces are stacked in sequence, rebound from the corresponding touched edge of the current touch screen after being stacked, and are then displayed arranged in order.
7. The method according to claim 5, wherein the obtaining a second touch operation applied to the touch screen, and executing an operation intention corresponding to the second touch operation according to the current state of the touch screen comprises:
the second touch operation is a sliding operation for an application interface, the current state of the touch screen is a multitasking state, and the application interface is adsorbed to a second preset position of the current touch screen to be displayed according to the sliding speed of the second touch operation, or is displayed in other display areas in a cross-screen mode.
8. The method according to claim 4, wherein the controlling, according to the sliding state of the first touch operation or the second touch operation, the application interface to perform cross-screen display in other display areas comprises:
and if the other display area currently receiving the cross-screen display has a plurality of application interfaces and application preemption occurs in that display area, the other display area automatically enters a multitask state.
9. The method of claim 1, wherein performing a cross-screen display according to the operational intent of the first touch operation comprises:
and judging whether cross-screen display is restricted for the application program according to the application program corresponding to the current application interface, restricting cross-screen display for the application program for which it is restricted, and performing cross-screen display for the application program for which it is not restricted.
10. The method of claim 9, wherein the judging whether cross-screen display is restricted for the application program according to the application program corresponding to the current application interface comprises:
distributing a cross-screen control ID to the application program, and judging whether cross-screen display is restricted for the application program according to the cross-screen control ID corresponding to the application program.
11. The method according to claim 1, wherein more than two in-vehicle display screens are provided, and the performing of cross-screen display according to the operation intention of the first touch operation comprises:
and determining the vehicle-mounted display screen in the sliding direction according to the sliding direction of the first touch operation based on the relative direction position of the current touch screen on the vehicle-mounted display screen, and executing cross-screen display on the vehicle-mounted display screen, wherein the sliding direction corresponds to the direction position of the vehicle-mounted display screen executing cross-screen display.
12. The method according to any one of claims 1 to 11, wherein the performing of the multi-task state or cross-screen display of the application interface according to the operation intent of the first touch operation comprises:
when the current touch screen is in a multitask state, adsorbing an application interface in the multitask state to a second preset position of the current touch screen for display;
when the current touch screen executes cross-screen display, adsorbing an application interface executing the cross-screen display to a second preset position of the current touch screen for display;
and displaying, as a preview interface, the application interface that executes the multi-task state or is displayed across screens.
13. The method of any one of claims 1 to 11, further comprising: and if the use of the application program ends, or no touch operation occurs within the second time threshold in the multitasking state, removing the multitasking state animation.
14. The method of any one of claims 1 to 11, further comprising: and if the multitask frame is at least partially moved out of the display range of the touch screen in the multitask state, closing the running application program in the multitask state.
15. A method for controlling a display window for an automobile, comprising:
presenting a display interface into a multitasking state based on a first touch operation acting on a touch screen;
and acquiring a second touch operation acting on the touch screen, and performing cross-screen display or frame adsorption display or program closing according to the second touch operation.
16. The method according to claim 15, wherein the first touch operation is a long press or a slide down, an operation position of the first touch operation is located on an application interface, and the operation position is used for controlling the application interface; and an operation control identifier or an operation control area is displayed on the application interface.
17. The method according to claim 15, wherein the second touch operation is a sliding operation for an application interface, and the application interface is attached to a third preset position of a current touch screen according to the second touch operation for display, or displayed in other display areas across screens.
18. The method of claim 15, further comprising presenting a plurality of application interfaces on a current touch screen for cross-screen display, wherein if application preemption occurs on the current touch screen, the current touch screen automatically enters a multitasking state.
19. A method for controlling a display window for an automobile, comprising:
acquiring a first touch operation acting on a touch screen, wherein the first touch operation occurs at a first preset position of an application interface;
executing a multi-task state of an application interface or cross-screen display according to the operation intention of the first touch operation; the multitasking state is that application interfaces of one or more application programs are displayed as preview interfaces; the cross-screen display comprises cross-screen display on other display screens or display in other divided areas of the touch screen.
20. A control device for a display window for an automobile, comprising:
the device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring a first touch operation acting on a touch screen, and the first touch operation occurs at a first preset position of an application interface;
the display module is used for executing the multi-task state of the application interface or performing cross-screen display according to the operation intention of the first touch operation;
the control module is used for acquiring a second touch operation acting on the touch screen and executing an operation intention corresponding to the second touch operation according to the current state of the touch screen.
21. A control device for a display window for an automobile, comprising:
the display module is used for presenting a display interface into a multitasking state based on first touch operation acting on the touch screen;
and the control module is used for acquiring a second touch operation acting on the touch screen, and performing cross-screen display or frame adsorption display or program closing according to the second touch operation.
22. A control device for a display window for an automobile, comprising:
the device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring a first touch operation acting on a touch screen, and the first touch operation occurs at a first preset position of an application interface;
the display module is used for executing the multi-task state of the application interface or performing cross-screen display according to the operation intention of the first touch operation; the multitasking state is that application interfaces of one or more application programs are displayed as preview interfaces; the cross-screen display comprises cross-screen display on other display screens or display in other divided areas of the touch screen.
23. A vehicle characterized by comprising a processor, a memory, and a computer program stored on the memory and capable of running on the processor, the computer program, when executed by the processor, realizing the steps of the method for controlling a display window for an automobile according to any one of claims 1 to 19.
24. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, realizes the steps of the method for controlling a display window for an automobile according to any one of claims 1 to 19.
CN202111320405.XA 2021-11-09 2021-11-09 Control method and device for automobile display window and automobile Pending CN114003154A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111320405.XA CN114003154A (en) 2021-11-09 2021-11-09 Control method and device for automobile display window and automobile

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111320405.XA CN114003154A (en) 2021-11-09 2021-11-09 Control method and device for automobile display window and automobile

Publications (1)

Publication Number Publication Date
CN114003154A true CN114003154A (en) 2022-02-01

Family

ID=79928328

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111320405.XA Pending CN114003154A (en) 2021-11-09 2021-11-09 Control method and device for automobile display window and automobile

Country Status (1)

Country Link
CN (1) CN114003154A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115206197A (en) * 2022-07-22 2022-10-18 智己汽车科技有限公司 Spliced display screen group of vehicle-mounted front row
WO2024022177A1 (en) * 2022-07-29 2024-02-01 华为技术有限公司 Control method and electronic device

Similar Documents

Publication Publication Date Title
US10901515B2 (en) Vehicular interface system for launching an application
EP3000013B1 (en) Interactive multi-touch remote control
KR101651358B1 (en) User interface and a method for adjusting a view on a display unit
CN114003154A (en) Control method and device for automobile display window and automobile
CN109992193B (en) Touch screen flying interaction method in vehicle
US20100053221A1 (en) Information processing apparatus and operation method thereof
CN109933388B (en) Vehicle-mounted terminal equipment and display processing method of application components thereof
CN103593108A (en) Method for providing user interface having multi-tasking function, and mobile communication device
KR20180095849A (en) A vehicle having an image recording unit and an operating system for operating the devices of the vehicle and a method for operating the operating system
CN107225973A (en) A kind of vehicle-mounted multiple terminals control device and vehicle
EP3543061B1 (en) Vehicular display device
JP2021028660A (en) Display control apparatus of content, display control method and display control program
CN116521111A (en) Multi-screen display method and device for vehicle machine screen, vehicle and storage medium
DE102017219332A1 (en) HUMAN-VEHICLE INTERACTION
US11230189B2 (en) System and method for application interaction on an elongated display screen
CN111078068A (en) Vehicle-mounted control method and system and vehicle-mounted controller
JP2016103171A (en) Operation receiving system, method, and program
CN113791713B (en) Multi-screen display window sharing method and device applied to vehicle-mounted intelligent cabin
US11926258B2 (en) Vehicle ambient lighting
CN117413244A (en) Display control device and display control method
CN113791711A (en) Vehicle-mounted multi-screen display sharing method and device
CN117785283A (en) Vehicle-mounted application program management method, device, equipment and storage medium
CN113448469A (en) Vehicle-mounted multi-screen display diversified sharing interaction method and device
CN111625306A (en) Display control method and device for suspension icon in vehicle machine
CN115202526A (en) Window processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination