CN111309414A - User interface integration method and vehicle-mounted device - Google Patents


Info

Publication number: CN111309414A
Application number: CN201910803614.6A
Authority: CN (China)
Prior art keywords: vehicle, user interface, sub, handheld device, display area
Legal status: Granted; Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN111309414B (en)
Inventors: 池育阳, 林新杰, 黄柏融
Current Assignee: Dutch Mobile Drive Co (the listed assignee may be inaccurate)
Original Assignee: Shenzhen Chaojie Communication Co ltd
Application filed by: Shenzhen Chaojie Communication Co ltd
Priority to: US16/710,060 (published as US11356726B2)
Publication of application: CN111309414A
Publication of granted patent: CN111309414B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Abstract

The invention provides a user interface integration method, which comprises the following steps: the vehicle-mounted device displays related data of the vehicle-mounted device through a first user interface; when the vehicle-mounted device is in communication connection with the handheld device, acquiring the data access permission of the handheld device; after the vehicle-mounted device acquires the data access permission of the handheld device, acquiring related data of the handheld device; and the vehicle-mounted device displays the related data of the vehicle-mounted device and the acquired related data of the handheld device through a second user interface. The invention also provides a vehicle-mounted device for implementing the user interface integration method. The invention integrates the user interfaces, makes the vehicle-mounted device easier for the user to operate, and improves the user experience.

Description

User interface integration method and vehicle-mounted device
Technical Field
The invention relates to the technical field of user interface management, and in particular to a user interface integration method and a vehicle-mounted device.
Background
Most existing vehicle-mounted systems are designed by vehicle manufacturers or third-party vendors. Although some manufacturers design vehicle-mounted systems that can be connected to smart handheld devices, the user interface of the vehicle-mounted system differs from the user interface of the user's handheld device, so the user may be confused when the vehicle-mounted system is connected to the handheld device, resulting in a poor user experience.
Disclosure of Invention
In view of the above, a need exists for a user interface integration method and a vehicle-mounted device that integrate the user interfaces, make the vehicle-mounted device easier for the user to operate, and improve the user experience.
The invention provides a user interface integration method applied to a vehicle-mounted device, the method comprising the following steps: the vehicle-mounted device displays related data of the vehicle-mounted device through a first user interface; when the vehicle-mounted device establishes a communication connection with the handheld device, the vehicle-mounted device acquires the data access permission of the handheld device; after acquiring the data access permission of the handheld device, the vehicle-mounted device acquires related data of the handheld device; and the vehicle-mounted device displays the related data of the vehicle-mounted device and the acquired related data of the handheld device through a second user interface.
Preferably, the method acquires the relevant data of the handheld device according to the relevant data of the vehicle-mounted device displayed by the first user interface.
Preferably, when the first user interface displays the navigation interface of the first navigation application of the vehicle-mounted device, the acquired related data of the handheld device comprises navigation data of the second navigation application of the handheld device.
Preferably, the navigation data of the second navigation application includes destinations that the second navigation application has navigated to within a preset time period.
Preferably, when the first user interface displays a music playing interface of a first music application of the vehicle-mounted device, the acquired related data of the handheld device includes music data of a second music application of the handheld device, where the music data of the second music application includes the currently played music and its playing progress, an identifier of the user logged in to the second music application, the audio-related applications that the handheld device has run within a preset time period, and the covers corresponding to the favorite song lists and music albums played by the handheld device within the preset time period.
Preferably, when the first user interface displays icons of applications that the vehicle-mounted device has run within a preset time period, the acquired related data of the handheld device includes icons corresponding to the applications that the handheld device has run within the preset time period.
Preferably, when the first user interface displays a specific icon for triggering the display of icons of all applications of the vehicle-mounted device, the acquired related data of the handheld device includes icons corresponding to all applications of the handheld device, and the method further comprises: displaying the icons of all applications of the vehicle-mounted device and the icons of all applications of the handheld device on a second user interface.
Preferably, the acquired related data of the handheld device further includes identifiers of users who have logged in to the handheld device within a preset time period, and the method further comprises: grouping the icons corresponding to all applications of the vehicle-mounted device and the handheld device according to the frequency of use of each icon or the similarity of the icons' content, and then displaying the grouped icons on the second user interface.
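By way of illustration only, the following Kotlin sketch shows one possible way to perform such grouping. The AppIcon data model, the category labels, and the frequency thresholds are assumptions made for this example; the method itself does not prescribe a particular data structure or grouping algorithm.

// Hypothetical model of an icon coming from either device.
data class AppIcon(
    val appName: String,
    val source: String,      // "vehicle" or "handheld"
    val category: String,    // e.g. "navigation", "music", "communication"
    val useCount: Int        // how often the application has been used
)

// Group icons from both devices either by frequency of use or by similar content,
// then order each group so the most used icons appear first.
fun groupIcons(icons: List<AppIcon>, byFrequency: Boolean): List<List<AppIcon>> {
    val groups: Map<String, List<AppIcon>> = if (byFrequency) {
        // Coarse frequency buckets; the thresholds are purely illustrative.
        icons.groupBy { icon ->
            when {
                icon.useCount >= 10 -> "frequent"
                icon.useCount >= 3  -> "occasional"
                else                -> "rare"
            }
        }
    } else {
        // "Similar content" approximated here by a category label.
        icons.groupBy { it.category }
    }
    return groups.values.map { group -> group.sortedByDescending { it.useCount } }
}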
Preferably, on the second user interface, the color and/or shape of the related data of the vehicle-mounted device differ from the color and/or shape of the acquired related data of the handheld device.
Preferably, the first user interface comprises a first primary display area; the method also includes dividing the first main display area into a plurality of sub-areas according to a user input.
Preferably, the method further comprises:
displaying a specific icon in one of the sub-areas, wherein the specific icon is used for triggering the display of icons of all applications of the vehicle-mounted device; when a touch signal generated by the user touching the specific icon is received, the icons corresponding to all applications of the vehicle-mounted device are displayed to the user.
Preferably, the dividing the first main display area into a plurality of sub-areas according to the user's input includes:
presetting a plurality of user operations, wherein different user operations correspond to dividing the first main display area into different preset numbers of sub-areas; and
dividing the first main display area into the preset number of sub-areas corresponding to the received user operation.
Preferably, the method further comprises:
specifying the content displayed by each of the plurality of sub-regions in response to user input.
Preferably, the method further comprises:
determining, when a designation signal is received, one sub-area from the plurality of sub-areas as a target area, wherein the target area corresponds to the designation signal;
determining the content currently displayed in the target area;
combining the plurality of sub-areas into one area in response to the designation signal; and
hiding/closing the content displayed in the sub-areas other than the target area, and displaying, in the merged area, the content currently displayed in the target area and/or other content related to the content displayed in the target area.
Preferably, the method further comprises:
arranging and displaying, in the merged area, the icons respectively corresponding to all applications of the vehicle-mounted device according to a preset order, wherein the preset order refers to the order in which each application was installed on the vehicle-mounted device.
Preferably, the first user interface further comprises a first status display area and a first function control column; the first status display area is used for displaying the system time of the vehicle-mounted device and the signal strength of wireless communication; the first function control column is used for displaying function buttons of the vehicle-mounted device.
Preferably, the method maintains the size, position and shape of the area where the first status display area and the first function control column are located while combining the plurality of sub-areas into one area in response to the designation signal.
Preferably, in the first function control column, the brightness of the function button corresponding to the function currently in the activated state is higher than the brightness of the function button corresponding to the function currently in the inactivated state.
Preferably, the second user interface includes a second status display area, a second main display area, and a second function control column, where the second status display area is used to display the system time of the vehicle-mounted device, the signal strength of wireless communication, and the identifiers of a plurality of users logged in to the handheld device within a preset time period; the second main display area comprises a plurality of sub-areas, and the plurality of sub-areas included in the second user interface respectively correspond to the plurality of sub-areas included in the first user interface; the content displayed by the second function control column is the same as the content displayed by the first function control column.
Preferably, the method further comprises:
synchronizing the favorite song lists respectively corresponding to the plurality of users to the vehicle-mounted device.
A second aspect of the invention provides a vehicle-mounted device, which comprises a memory and a processor, wherein the memory is used for storing at least one instruction, and the processor is used for implementing the user interface integration method when executing the at least one instruction.
Compared with the prior art, in the user interface integration method and the vehicle-mounted device provided by the invention, the related data of the vehicle-mounted device is displayed through the first user interface of the vehicle-mounted device; when the vehicle-mounted device establishes a communication connection with the handheld device, the data access permission of the handheld device is acquired; after the vehicle-mounted device acquires the data access permission of the handheld device, the related data of the handheld device is acquired; and the vehicle-mounted device displays the related data of the vehicle-mounted device and the acquired related data of the handheld device through a second user interface. The invention integrates the user interfaces of the vehicle-mounted device and the handheld device, so that the user can conveniently operate the vehicle-mounted device, and the user experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
FIG. 1 is a diagram of an environment for operating a user interface integration system according to a preferred embodiment of the present invention.
FIG. 2 is a block diagram of a user interface integration system according to a preferred embodiment of the present invention.
FIG. 3 is a flow chart of a user interface integration method according to a preferred embodiment of the present invention.
FIG. 4A is a first exemplary diagram of displaying the related data of the in-vehicle device through the first user interface when the in-vehicle device and the handheld device are not connected.
FIG. 4B is a second exemplary diagram of displaying the related data of the in-vehicle device and the related data of the handheld device through the second user interface when the in-vehicle device and the handheld device have established a communication connection.
FIG. 4C illustrates displaying the data of the in-vehicle device after a plurality of sub-areas are combined into one area.
FIG. 5A is a third exemplary diagram of displaying the related data of the in-vehicle device through the first user interface when the in-vehicle device and the handheld device are not connected.
FIG. 5B is a fourth exemplary diagram of displaying the related data of the in-vehicle device and the related data of the handheld device through the second user interface when the in-vehicle device and the handheld device have established a communication connection.
FIG. 6A is a fifth exemplary diagram of displaying the related data of the in-vehicle device through the first user interface when the in-vehicle device and the handheld device are not connected.
FIG. 6B is a sixth exemplary diagram of displaying the related data of the in-vehicle device and the related data of the handheld device through the second user interface when the in-vehicle device and the handheld device have established a communication connection.
Description of the main elements
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a detailed description of the present invention will be given below with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments of the present invention and features of the embodiments may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention. The described embodiments are merely a subset of the embodiments of the present invention, rather than all of the embodiments. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Referring to fig. 1, the user interface integration system 20 operates in the in-vehicle apparatus 1. The user interface integration system 20 is configured to integrate the user interface of the in-vehicle device 1 and the user interface of the handheld device 2 when the in-vehicle device 1 and the handheld device 2 establish a communication connection. In the present embodiment, the in-vehicle apparatus 1 may be mounted on a vehicle 100. The vehicle 100 may be an automobile, a locomotive, or the like. The handheld device 2 may be a mobile phone, a tablet computer, a personal digital assistant, a notebook computer, or the like.
In this embodiment, the in-vehicle device 1 includes a first memory 11, a first processor 12, a first display 13, a first GPS (Global Positioning System) module 14, a first wireless communication module 15, and a first wired communication module 16. The handheld device 2 includes a second memory 21, a second processor 22, a second display 23, a second GPS module 24, a second wireless communication module 25, and a second wired communication module 26.
It should be noted that fig. 1 is only an example of the in-vehicle device 1 and the handheld device 2. In other embodiments, the in-vehicle device 1 and the handheld device 2 may each include more or fewer elements, or have different configurations of elements. For example, the in-vehicle device 1 may further include an operating system, and the handheld device 2 may further include an operating system and a battery. That is, the in-vehicle device 1 and the handheld device 2 shown in fig. 1 are not to be construed as limiting the invention.
In this embodiment, the in-vehicle device 1 and the handheld device 2 may establish a communication connection in a wired or wireless manner. Specifically, the vehicle-mounted device 1 and the handheld device 2 can establish wireless communication connection by using the first wireless communication module 15 and the second wireless communication module 25, respectively. The vehicle-mounted device 1 and the handheld device 2 can establish a wired communication connection by using the first wired communication module 16 and the second wired communication module 26, respectively.
In this embodiment, the first wireless communication module 15 and the second wireless communication module 25 may each be a Wi-Fi module, a Bluetooth module, or another wireless communication module, such as a Radio-Frequency Identification (RFID) module. In an embodiment, the first wireless communication module 15 and the second wireless communication module 25 may also be communication modules supporting communication protocols such as 3G, 4G, Long Term Evolution (LTE), Worldwide Interoperability for Microwave Access (WiMAX), and 5G, so that the in-vehicle device 1 and the handheld device 2 have the above-mentioned wireless communication functions.
In this embodiment, the first wired communication module 16 and the second wired communication module 26 may be Universal Serial Bus (USB) modules, so that the in-vehicle device 1 and the handheld device 2 can be connected through a cable compatible with the USB specification. In one embodiment, the first wired communication module 16 and the second wired communication module 26 may be USB Type-C modules.
In this embodiment, the first memory 11 and the second memory 21 may each include a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), a One-Time Programmable Read-Only Memory (OTPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disc storage, magnetic disk storage, tape storage, or any other non-volatile readable storage medium capable of carrying or storing data.
In some embodiments, the first memory 11 is used for storing program codes of computer programs and various data, such as a user interface integration system 20 installed in the in-vehicle apparatus 1, a first navigation application 17, a first music application 18, and one or more other applications and data related to the applications. In this embodiment, the first navigation application 17 is used for navigating the in-vehicle device 1, and the first music application 18 is used for playing music. The in-vehicle apparatus 1 may also include one or more other applications such as a radio or the like. In some embodiments, the second memory 21 is used for storing program codes of computer programs and various data, such as a second navigation application 27, a second music application 28 installed in the handheld device 2, and one or more other applications and data related to the applications. In this embodiment, the second navigation application 27 is used for navigating the handheld device 2, and the second music application 28 is used for playing music. The handheld device 2 may also include one or more other applications such as instant messaging software such as WeChat, QQ, etc.
In some embodiments, the first processor 12 and the second processor 22 may each be formed by an integrated circuit, for example, a single packaged integrated circuit, or a plurality of packaged integrated circuits with the same or different functions. The first processor 12 and the second processor 22 may each include one or more Central Processing Units (CPUs), microprocessors, digital processing chips, artificial intelligence chips, or Graphics Processing Units (GPUs), or a combination of various control chips. The first processor 12 is the control unit of the in-vehicle apparatus 1; it connects the various components of the entire in-vehicle apparatus 1 by using various interfaces and lines, and executes the various functions of the in-vehicle apparatus 1 and processes data by running or executing programs or modules stored in the first memory 11 and calling data stored in the first memory 11, for example, executing the user interface integration system 20 to integrate the user interface. The second processor 22 is the control core of the handheld device 2; it connects the various components of the entire handheld device 2 by using various interfaces and lines, and executes the various functions of the handheld device 2 and processes data by running or executing programs or modules stored in the second memory 21 and calling data stored in the second memory 21.
In this embodiment, the first display 13 and the second display 23 may be touch screens or non-touch screens. The first display 13 may be used to display various data of the in-vehicle apparatus 1, such as a navigation interface of the first navigation application 17, a music playing interface of the first music application 18, and the like. The second display 23 is used for displaying various data of the handheld device 2, such as displaying a navigation interface of the second navigation application 27, a music playing interface of the second music application 28, and the like.
In this embodiment, the first GPS module 14 may be configured to detect the current geographic location and moving speed of the in-vehicle apparatus 1 and transmit them to the first navigation application 17 of the in-vehicle apparatus 1, so that the first navigation application 17 may navigate the in-vehicle apparatus 1 according to the detected current geographic location and moving speed of the in-vehicle apparatus 1. The second GPS module 24 may be configured to detect the current geographic location and moving speed of the handheld device 2 and transmit them to the second navigation application 27 of the handheld device 2, so that the second navigation application 27 may navigate the handheld device 2 according to the detected current geographic location and moving speed of the handheld device 2.
In this embodiment, the user interface integration system 20 may include one or more modules, which are stored in the first memory 11 and executed by at least one processor (in this embodiment, the first processor 12) to implement the present invention. For example, referring to FIG. 2, the user interface integration system 20 may include, but is not limited to, a determination module 201, a display module 202, and an execution module 203. A module, as referred to in the present invention, is a series of computer program code segments that are stored in a memory and can be executed by at least one processor to perform a fixed function. Each module may include at least one instruction, and the first processor 12 implements the user interface integration method provided by the following embodiments of the present invention by executing the at least one instruction. The detailed functions of the modules will be described later in conjunction with the flowchart of fig. 3.
FIG. 3 is a flow chart of a user interface integration method according to a preferred embodiment of the present invention. The user interface integration method may be implemented by the at least one first processor 12 of the in-vehicle apparatus 1 executing computer-readable program code or instructions stored in a memory, such as the first memory 11.
Referring to FIG. 3, the user interface integration method may be implemented in a number of different ways. For example, the user interface integration method may be implemented using the configuration shown in fig. 1. Each step illustrated in fig. 3 represents the execution of one or more processes, methods, or subroutines that implement the user interface integration method. In addition, the sequence of each step shown in fig. 3 is only an example, and the execution sequence of each step may be changed according to different requirements according to the disclosure of the present invention. Steps may also be added or subtracted without departing from the present invention. The user interface integration method may be performed from step S301.
In step S301, the determining module 201 determines whether the in-vehicle device 1 and the handheld device 2 establish a communication connection. When the in-vehicle apparatus 1 does not establish a communication connection with the handheld apparatus 2, step S302 is executed. When the vehicle-mounted device 1 establishes a communication connection with the handheld device 2, step S303 is executed.
In this embodiment, when the first wireless communication module 15 of the in-vehicle device 1 establishes a communication connection with the second wireless communication module 25 of the handheld device 2, the determining module 201 may detect a wireless communication connection signal from the first wireless communication module 15. It is thus determined that the in-vehicle device 1 and the handheld device 2 have established a communication connection.
Similarly, when the first wired communication module 16 of the in-vehicle device 1 establishes a communication connection with the second wired communication module 26 of the handheld device 2, the determining module 201 may detect a wired communication connection signal from the first wired communication module 16. It is thus determined that the in-vehicle device 1 and the handheld device 2 have established a communication connection.
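As a purely illustrative sketch of the determination in step S301, the following Kotlin fragment shows one way the check could be structured. The WirelessModule and WiredModule interfaces and their method names are assumptions for the example; the embodiment only requires that a connection signal from the first wireless communication module 15 or the first wired communication module 16 can be detected.

// Hypothetical abstractions over the communication modules of the in-vehicle device.
interface WirelessModule { fun hasConnectionSignal(): Boolean }
interface WiredModule { fun hasConnectionSignal(): Boolean }

class DeterminationModule(
    private val wireless: WirelessModule,  // e.g. first wireless communication module 15
    private val wired: WiredModule         // e.g. first wired communication module 16
) {
    // True when either a wireless or a wired connection signal is detected, i.e. the
    // in-vehicle device and the handheld device have established a communication connection.
    fun isHandheldConnected(): Boolean =
        wireless.hasConnectionSignal() || wired.hasConnectionSignal()
}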
In step S302, when the in-vehicle device 1 has not established a communication connection with the handheld device 2, the display module 202 displays relevant data of the in-vehicle device 1 through the first user interface 3.
In one embodiment, referring to FIG. 4A, the first user interface 3 includes a first status display area 31, a first main display area 32, and a first function control column 33.
In one embodiment, the first status display area 31 is located above the first main display area 32, and the first function control column 33 is located below the first main display area 32. The area of the area where the first main display area 32 is located is larger than the area of the area where the first state display area 31 is located. The area of the first main display area 32 is larger than that of the first function control column 33.
In one embodiment, the first status display area 31 is used for displaying the system time of the in-vehicle apparatus 1, the signal strength of wireless communication, and other related information.
In one embodiment, the first function control column 33 is used for displaying function buttons of the in-vehicle apparatus 1. For example, function buttons corresponding to the common functions of the in-vehicle device 1 may be displayed. For example, referring to fig. 4A, a setting button 331 for setting the air-conditioning temperature of the front seat of the vehicle 100, a button 332 corresponding to the telephone function, a button 333 corresponding to the front defogging function, a button 334 corresponding to the fan adjusting function, a button 335 corresponding to the rear defogging function, a button 336 corresponding to the related function of the vehicle 100, and a setting button 337 for setting the air-conditioning temperature of the rear seat of the vehicle 100 may be displayed in the first function control column 33.
In one embodiment, in the first function control column 33, the brightness of the function button corresponding to the function currently in the activated state is higher than the brightness of the function button corresponding to the function currently in the inactivated state.
In one embodiment, in the first function control column 33, the color of the function button corresponding to the function currently in the activated state is different from the color of the function button corresponding to the function currently in the inactivated state.
In one embodiment, the display module 202 may divide the first main display area 32 into a plurality of sub-areas. In one embodiment, the plurality of sub-areas may be respectively used for displaying user interfaces corresponding to different applications of the in-vehicle apparatus 1.
In one embodiment, the display module 202 may divide the first main display area 32 into the plurality of sub-areas in response to a user input.
In one embodiment, the display module 202 may preset a plurality of user operations, and different user operations correspond to dividing the first main display area 32 into different preset numbers of sub-areas. That is, the number of sub-areas into which the first main display area 32 is divided is determined by which of the preset operations the received user operation matches. In one embodiment, the size of each of the plurality of sub-areas may be preset by the display module 202.
For example, the display module 202 may preset three user operations: a first user operation, a second user operation, and a third user operation. The first user operation may be that two of the user's fingers slide on the first main display area 32 at the same time while the distance between the two fingers gradually increases. The second user operation may be that three of the user's fingers slide on the first main display area 32 at the same time while the distances between the three fingers gradually increase. The third user operation may be that four of the user's fingers slide on the first main display area 32 at the same time while the distances between the four fingers gradually increase. When the display module 202 detects the first user operation, the display module 202 divides the first main display area 32 into three sub-areas. When the display module 202 detects the second user operation, it divides the first main display area 32 into four sub-areas. When the display module 202 detects the third user operation, it divides the first main display area 32 into five sub-areas.
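The following Kotlin fragment is a minimal sketch of this gesture-to-layout mapping, assuming a hypothetical SpreadGesture type. The finger counts and resulting sub-area counts follow the example above; the gesture recognition itself is not shown.

// Hypothetical summary of a detected multi-finger spread gesture.
data class SpreadGesture(val fingerCount: Int, val distancesIncreasing: Boolean)

// Returns the number of sub-areas the first main display area should be divided into,
// or null if the gesture does not match any preset user operation.
fun subAreaCountFor(gesture: SpreadGesture): Int? {
    if (!gesture.distancesIncreasing) return null
    return when (gesture.fingerCount) {
        2 -> 3  // first user operation: two-finger spread -> three sub-areas
        3 -> 4  // second user operation: three-finger spread -> four sub-areas
        4 -> 5  // third user operation: four-finger spread -> five sub-areas
        else -> null
    }
}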
In one embodiment, the display module 202 may further specify the content displayed by each of the plurality of sub-regions in response to a user input.
For example, when the display module 202 detects a long-press signal (e.g., a press lasting more than 5 seconds) in one of the plurality of sub-areas, the display module 202 displays the names of a plurality of applications for the user to select from, and then displays the user interface corresponding to the application selected by the user in that sub-area.
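A minimal Kotlin sketch of this interaction is shown below; the SubArea type and the pickApplication callback are hypothetical placeholders for the actual display and selection logic.

// Hypothetical model of one sub-area of the first main display area.
data class SubArea(val id: Int, var displayedApp: String? = null)

const val LONG_PRESS_THRESHOLD_MS = 5_000L  // "more than 5 seconds" in the example above

// On a long press in a sub-area, let the user pick an application and show
// that application's user interface in the pressed sub-area.
fun onPress(
    area: SubArea,
    pressDurationMs: Long,
    applicationNames: List<String>,
    pickApplication: (List<String>) -> String?
) {
    if (pressDurationMs <= LONG_PRESS_THRESHOLD_MS) return
    val chosen = pickApplication(applicationNames) ?: return
    area.displayedApp = chosen
}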
In this embodiment, the description takes as an example the case where the sub-areas include a first sub-display area 321, a second sub-display area 322, and a third sub-display area 323.
In one embodiment, the first sub-display area 321 may be used for displaying a navigation interface of the first navigation application 17. For example, the geographic location at which the vehicle 100 is currently located, the destination location, the suggested navigation route, and alternative navigation routes, etc. may be displayed. The second sub-display area 322 may be used to display a music playing interface of the first music application 18. For example, a name of music currently played or to be played by the first music application 18, a cover picture corresponding to the currently played or to be played music, and the like may be displayed. The third sub-display area 323 may be used to display user interfaces corresponding to other applications of the in-vehicle apparatus 1 or to display icons of respective applications used by the in-vehicle apparatus 1 within a preset time period (e.g., within the last week).
For example, referring to fig. 4A, the third sub-display area 323 displays an icon 3231 corresponding to the first music application 18, an icon 3232 corresponding to the parking function, an icon 3233 corresponding to the radio function, and an icon 3234 corresponding to the USB connection function, which have been run for the last week by the in-vehicle apparatus 1.
In one embodiment, the third sub-display area 323 may further display a specific icon 3230, and the specific icon 3230 is used for triggering display of icons of all applications of the in-vehicle apparatus 1. For example, when the display module 202 receives a touch signal generated by a user touching the specific icon 3230, the display module 202 may display icons corresponding to all applications of the in-vehicle device 1 to the user.
In one embodiment, when the display module 202 has divided the first main display area 32 into a plurality of sub-areas (for example, when the first main display area 32 is divided into the first sub-display area 321, the second sub-display area 322, and the third sub-display area 323), the display module 202 further receives a designation signal, and determines one of the plurality of sub-areas as a target area when receiving the designation signal, wherein the target area corresponds to the designation signal. The display module 202 determines what the target area is currently displaying. The display module 202 may further merge the sub-regions into one region in response to the designation signal (the merged region is the region where the first main display region 32 is located, i.e., the first main display region 32 is not divided into a plurality of regions). The display module 202 may further hide/close the content displayed in the sub-areas other than the target area, and display the content currently displayed in the target area and/or the other content related to the content currently displayed in the target area in the merged area. In one embodiment, the display module 202 displays the content currently displayed in the target area in the merged area after zooming in.
In one embodiment, the display module 202 combines the sub-areas into one area in response to the designation signal while maintaining the size, position, and shape of the area where the first status display area 31 and the first function control column 33 are located.
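Purely as an illustration of this merge behavior, the following Kotlin sketch models the layout with hypothetical Rect and ScreenLayout types. The embodiment only requires that the sub-areas collapse back into the undivided first main display area 32 while the first status display area 31 and the first function control column 33 keep their size, position, and shape; drawing and hiding the actual content is left to the caller.

data class Rect(val x: Int, val y: Int, val width: Int, val height: Int)

data class ScreenLayout(
    val statusArea: Rect,      // first status display area 31 (unchanged by the merge)
    val functionColumn: Rect,  // first function control column 33 (unchanged by the merge)
    val mainArea: Rect,        // first main display area 32
    val subAreas: List<Rect>   // empty when the main display area is not divided
)

// Merge all sub-areas back into the undivided main display area. The designated
// target sub-area is returned so the caller can redraw its content (and related
// content) in the merged area while hiding/closing the other sub-areas' content.
fun mergeSubAreas(layout: ScreenLayout, targetIndex: Int): Pair<ScreenLayout, Rect> {
    val target = layout.subAreas[targetIndex]
    val merged = layout.copy(subAreas = emptyList())
    return Pair(merged, target)
}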
In the first embodiment, the designation signal is a first signal, and the first signal may be a signal received from the first sub-display area 321 (for example, a sliding signal generated by sliding a preset distance, for example, 2 centimeters, from left to right or from right to left in the first sub-display area 321), or a signal generated by rotating a first designated hardware knob (not shown in the figure) on the in-vehicle device 1, or a signal generated by pressing a first designated hardware button (not shown in the figure) on the in-vehicle device 1.
In one embodiment, the display module 202 may associate the first signal with the first sub-display area 321 in advance, so that when the first signal is received, it may be determined that the first signal corresponds to the first sub-display area 321, and determine that the first sub-display area 321 is a target area from the plurality of sub-areas, and determine content currently displayed by the target area. The display module 202 further merges the sub-regions (such as the first sub-display region 321, the second sub-display region 322, and the third sub-display region 323) into a region (the merged region is the region where the first main display region 32 is located, i.e. the first main display region 32 is not divided into a plurality of regions). The display module 202 further hides/closes the contents displayed in the sub-areas (i.e., the second sub-display area 322 and the third sub-display area 323) except for the target area (i.e., the first sub-display area 321) from the plurality of sub-areas, and displays the contents currently displayed in the target area and/or the other contents related to the contents currently displayed in the target area in the merged area.
In one embodiment, the other content related to the content currently displayed in the first sub-display area 321 includes a shortcut bar of the in-vehicle device 1.
In one embodiment, the display module 202 displays the shortcut bar of the vehicle-mounted device 1 and also displays the detail information corresponding to one of the shortcut tools in the shortcut bar. In one embodiment, this shortcut tool may be the one located at the first position in the shortcut bar.
For example, when the display module 202 receives the first signal, referring to fig. 4C, the display module 202 merges the first sub-display area 321, the second sub-display area 322, and the third sub-display area 323 into one area (the merged area is the area where the first main display area 32 is located, that is, the first main display area 32 is not divided into a plurality of areas), and displays the shortcut bar of the in-vehicle device 1 in the merged area. The shortcut bar includes, but is not limited to, shortcut tools such as "quick setup", "vehicle information", "driving mode", "mobile phone connection", "in-vehicle control", "audio equipment", and "maintenance service". The display module 202 also displays the detail information corresponding to the shortcut tool located at the first position of the shortcut bar, i.e., "quick setup". For example, the detail information of the shortcut tool "quick setup" includes the status of the headlights of the vehicle 100, the status of the fog lights, the status of the front compartment, the status of the rear compartment, the status of the sunroof, and a control column 340 for adjusting the brightness of the first display 13. The control column 340 is striped.
In an embodiment, when the display module 202 receives a touch signal from a location of a certain shortcut tool in the shortcut bar, the display module 202 displays the detail information corresponding to the certain shortcut tool in the merged area. For example, when the display module 202 receives a touch signal generated when a user touches the location of the shortcut tool "audio device", the display module 202 displays the detail information corresponding to the shortcut tool "audio device", such as the current volume, in the merged area.
In the second embodiment, the designation signal is a second signal, and the second signal may be a signal received from the second sub-display area 322 (for example, a long-press signal generated by pressing any position of the second sub-display area 322 for a preset time period, for example, 3 seconds), or a signal generated by rotating a second designated hardware knob (not shown in the figure) on the in-vehicle apparatus 1, or a signal generated by pressing a second designated hardware button (not shown in the figure) on the in-vehicle apparatus 1.
In one embodiment, the display module 202 may associate the second signal with the second sub-display area 322 in advance, so that when the second signal is received, it may be determined that the second signal corresponds to the second sub-display area 322, and determine that the second sub-display area 322 is a target area from the plurality of sub-areas, and determine content currently displayed by the target area. The display module 202 further merges the sub-regions (such as the first sub-display region 321, the second sub-display region 322, and the third sub-display region 323) into a region (the merged region is the region where the first main display region 32 is located, i.e. the first main display region 32 is not divided into a plurality of regions). The display module 202 further hides/closes the contents displayed in the sub-areas (i.e., the first sub-display area 321 and the third sub-display area 323) except for the target area (i.e., the second sub-display area 322) from the plurality of sub-areas, and displays the contents currently displayed in the target area and/or the contents related to the contents currently displayed in the target area in the merged area.
In one embodiment, the other content related to the content currently displayed in the second sub-display area 322 includes: the icons corresponding to the audio-related applications that the in-vehicle device 1 has run within a preset time period (for example, within the last week), and the covers corresponding to the music albums and favorite song lists that the in-vehicle device 1 has played within the preset time period (for example, within the last week).
For example, referring to fig. 5A, when the display module 202 receives the second signal, the display module 202 merges the first sub-display area 321, the second sub-display area 322, and the third sub-display area 323 into one area (the merged area is the area where the first main display area 32 is located, that is, the first main display area 32 is not divided into a plurality of areas). The display module 202 further displays, in the merged area, the content currently displayed in the target area (i.e., the second sub-display area 322), such as the name of the currently played music, the cover picture corresponding to the currently played music, and the playing progress; the icons corresponding to the audio-related applications that the in-vehicle device 1 has run within the preset time period (for example, within the last week), such as the icon 3231 corresponding to the first music application 18, the icon 3232 corresponding to the "Pandora" application, the icon 3233 corresponding to the radio application, and the icon 3234 corresponding to the USB connection application; and the covers corresponding to the music albums and favorite song lists played by the in-vehicle device 1 within the preset time period (for example, within the last week).
In one embodiment, the display module 202 displays the icons corresponding to all audio-related applications that the in-vehicle device 1 has run within the preset time period at different brightness levels according to whether each application is currently running, where the brightness of the icon corresponding to an application currently in the running state is greater than the brightness of the icon corresponding to an application currently in the non-running state.
In one embodiment, the display module 202 arranges icons corresponding to all audio-related applications that have been run by the in-vehicle apparatus 1 within the preset time period (for example, within the last week) according to whether the applications are currently running, wherein the icons corresponding to the applications that are currently running are arranged in front, and the icons corresponding to the applications that are currently not running are arranged behind.
In an embodiment, the display module 202 further arranges the icons corresponding to the currently running applications, which are arranged in front, according to the order of the start execution times of those applications. In an embodiment, the display module 202 further arranges the icons corresponding to the currently non-running applications, which are arranged behind, either randomly or according to the order of the termination execution times of those applications. For example, the icon of the application with the latest termination execution time is arranged at the front of the non-running icons.
For example, referring to fig. 5A, the icon 3232 of the "Pandora" application, which is currently in the non-running state and has the latest termination execution time, is displayed at the front of the icons of the applications currently in the non-running state.
In this embodiment, the start execution time of each currently running application refers to the time at which the current run of that application started.
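For illustration, the following Kotlin sketch expresses this ordering rule. The AudioAppIcon fields are assumptions; where the description leaves the direction of the start-time ordering open, the sketch simply sorts running applications by their start execution time.

// Hypothetical model of an audio-related application icon and its run history.
data class AudioAppIcon(
    val appName: String,
    val isRunning: Boolean,
    val startTimeMs: Long = 0L,        // start of the current run (running applications)
    val terminationTimeMs: Long = 0L   // end of the most recent run (non-running applications)
)

// Running applications come first, ordered by the start time of their current run;
// non-running applications follow, with the most recently terminated one at the front.
fun orderAudioIcons(icons: List<AudioAppIcon>): List<AudioAppIcon> {
    val (running, notRunning) = icons.partition { it.isRunning }
    return running.sortedBy { it.startTimeMs } +
           notRunning.sortedByDescending { it.terminationTimeMs }
}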
In one embodiment, the display module 202 may further use an identification line to distinguish an icon corresponding to an application currently in a running state from an icon corresponding to an application currently in an non-running state.
For example, still referring to fig. 5A, the display module 202 uses the identification line 4 to distinguish the icon 3231 corresponding to the first music application 18 currently in the running state from icons corresponding to other applications currently in the non-running state (e.g., the icon 3232 corresponding to the "Pandora" application).
In the third embodiment, the designation signal is a third signal, and the third signal may be a signal received from the third sub-display area 323 (for example, a signal generated by pressing/touching the specific icon 3230 of the third sub-display area 323), or a signal generated by rotating a third designated hardware knob (not shown in the figure) on the in-vehicle apparatus 1, or a signal generated by pressing a third designated hardware button (not shown in the figure) on the in-vehicle apparatus 1.
It should be noted that the third designated hardware knob is different from the first designated hardware knob and the second designated hardware knob. The third designated hardware button is different from the first designated hardware button and the second designated hardware button.
In one embodiment, the display module 202 may associate the third signal with the third sub-display area 323 in advance, so that when the third signal is received, it may be determined that the third signal corresponds to the third sub-display area 323, and determine that the third sub-display area 323 is a target area from the plurality of sub-areas, and determine the content currently displayed by the target area. The display module 202 further merges the sub-regions (such as the first sub-display region 321, the second sub-display region 322, and the third sub-display region 323) into a region (the merged region is the region where the first main display region 32 is located, i.e. the first main display region 32 is not divided into a plurality of regions). The display module 202 further hides/closes the contents displayed in the sub-areas (i.e., the first sub-display area 321 and the second sub-display area 322) except for the target area (i.e., the third sub-display area 323) from the plurality of sub-areas, and displays the contents currently displayed in the target area and/or the contents related to the contents currently displayed in the target area in the merged area.
In one embodiment, the other content related to the content currently displayed in the third sub-display area 323 includes icons respectively corresponding to all applications of the in-vehicle apparatus 1.
For example, referring to fig. 6A, when the display module 202 receives the third signal, the display module 202 merges the first sub-display area 321, the second sub-display area 322, and the third sub-display area 323 into one area (the merged area is an area where the first main display area 32 is located, that is, the first main display area 32 is not divided into a plurality of areas), and the display module 202 further displays the currently displayed content (for example, icons corresponding to applications that have been run in the last week) of the target area (that is, the third sub-display area 323) and icons corresponding to all applications of the in-vehicle device 1 in the merged area.
In one embodiment, the display module 202 may display icons corresponding to all applications of the in-vehicle device 1 in the merged area randomly or in a preset order. In one embodiment, the preset order may refer to the order in which each application was installed on the in-vehicle device 1.
In one embodiment, referring to fig. 6A, the display module 202 may display icons corresponding to applications that have been run by the in-vehicle apparatus 1 within the preset time period (e.g., within the last week) and icons corresponding to all applications of the in-vehicle apparatus 1 in different lists respectively.
In step S303, when the vehicle-mounted device 1 and the handheld device 2 have established a communication connection, the execution module 203 acquires the data access permission of the handheld device 2. After acquiring the data access permission of the handheld device 2, the execution module 203 further acquires the related data of the handheld device 2.
In one embodiment, when the in-vehicle device 1 and the handheld device 2 establish a communication connection, the execution module 203 may send a request message to the handheld device 2 for obtaining the data access permission of the handheld device 2. After receiving the request message, the handheld device 2 may determine, according to the user's confirmation operation, whether to allow the in-vehicle device 1 to obtain the data access permission. For example, after receiving the request message, the handheld device 2 may display a dialog box on the second display 23 of the handheld device 2 to ask whether the user authorizes the in-vehicle device 1 to obtain the related data of the handheld device 2.
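A minimal Kotlin sketch of this exchange is shown below, assuming hypothetical AccessRequest and AccessResponse message types and a user-confirmation callback; the embodiment does not specify a message format or transport.

data class AccessRequest(val requester: String, val requestedData: List<String>)
data class AccessResponse(val granted: Boolean)

// The handheld side answers the request according to the user's confirmation,
// e.g. a dialog box shown on its own display.
class HandheldSide(private val askUser: (AccessRequest) -> Boolean) {
    fun handleRequest(request: AccessRequest): AccessResponse =
        AccessResponse(granted = askUser(request))
}

// The in-vehicle side sends the request and only proceeds to read the handheld
// device's related data when the permission has been granted.
class ExecutionModule(private val handheld: HandheldSide) {
    fun requestDataAccess(): Boolean {
        val request = AccessRequest(
            requester = "in-vehicle device 1",
            requestedData = listOf("navigation data", "music data", "application icons")
        )
        return handheld.handleRequest(request).granted
    }
}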
In one embodiment, the execution module 203 acquires the related data of the handheld device 2 according to the related data of the in-vehicle device 1 displayed on the first user interface.
Specifically, when the first sub-display area 321 included in the first main display area 32 displays the navigation interface of the first navigation application 17, the execution module 203 acquires the navigation data of the second navigation application 27 of the handheld device 2. In one embodiment, the navigation data of the second navigation application 27 includes destinations that the second navigation application 27 has navigated to within a preset time period (e.g., within the last week).
In one embodiment, when the second sub-display area 322 included in the first main display area 32 displays the music playing interface of the first music application 18, the execution module 203 acquires the music data of the second music application 28 of the handheld device 2. In one embodiment, the music data of the second music application 28 includes the currently played music and its playing progress, the identifier of the user logged in to the second music application 28, the audio-related applications that the handheld device 2 has run within a preset time period (e.g., within the last week), and the covers corresponding to the music albums and favorite song lists played by the handheld device 2 within a preset time period (e.g., within the last week).
In one embodiment, when the third sub-display area 323 included in the first main display area 32 displays icons of applications that the in-vehicle device 1 has run within a preset time period (for example, within the last week), the execution module 203 acquires the icons corresponding to the applications that the handheld device 2 has run within the preset time period.
In one embodiment, when the specific icon 3230 for triggering display of icons of all applications of the in-vehicle apparatus 1 is further displayed in the third sub-display area 323, the execution module 203 further acquires icons corresponding to all applications of the handheld apparatus 2.
In one embodiment, the related data of the handheld device 2 further includes identifiers of users who have logged in to the handheld device 2 within a preset time period (e.g., within the last week).
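The following Kotlin sketch illustrates, under assumed type names (VehicleUiContent, HandheldData, HandheldDataSource), how the acquired handheld data could depend on what the first user interface currently shows, mirroring the cases described above.

enum class VehicleUiContent { NAVIGATION_INTERFACE, MUSIC_PLAYING_INTERFACE, RECENT_APP_ICONS, ALL_APP_ICONS }

// Container for whatever subset of handheld-device data is acquired.
data class HandheldData(
    val recentDestinations: List<String> = emptyList(),
    val musicData: Map<String, String> = emptyMap(),
    val recentAppIcons: List<String> = emptyList(),
    val allAppIcons: List<String> = emptyList(),
    val recentUserIds: List<String> = emptyList()
)

// Hypothetical read interface exposed by the handheld device after permission is granted.
interface HandheldDataSource {
    fun navigationData(): List<String>
    fun musicData(): Map<String, String>
    fun recentAppIcons(): List<String>
    fun allAppIcons(): List<String>
    fun recentUserIds(): List<String>
}

// Acquire only the handheld data that corresponds to the content currently shown
// on the first user interface of the in-vehicle device.
fun acquireHandheldData(shown: VehicleUiContent, source: HandheldDataSource): HandheldData =
    when (shown) {
        VehicleUiContent.NAVIGATION_INTERFACE ->
            HandheldData(recentDestinations = source.navigationData(), recentUserIds = source.recentUserIds())
        VehicleUiContent.MUSIC_PLAYING_INTERFACE ->
            HandheldData(musicData = source.musicData(), recentUserIds = source.recentUserIds())
        VehicleUiContent.RECENT_APP_ICONS ->
            HandheldData(recentAppIcons = source.recentAppIcons(), recentUserIds = source.recentUserIds())
        VehicleUiContent.ALL_APP_ICONS ->
            HandheldData(allAppIcons = source.allAppIcons(), recentUserIds = source.recentUserIds())
    }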
In step S304, the display module 202 displays the related data of the in-vehicle device 1 and the acquired related data of the handheld device 2 through a second user interface 5.
In one embodiment, on the second user interface, the color and/or shape of the related data of the vehicle-mounted device differ from the color and/or shape of the acquired related data of the handheld device.
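As an illustration only, the following Kotlin sketch tags each piece of displayed data with its source and assigns a different color and shape per source; the specific colors and shapes are assumptions, since the embodiment only requires that they differ.

enum class DataSource { VEHICLE, HANDHELD }

data class DisplayStyle(val colorHex: String, val shape: String)

// Pick a distinct style depending on whether the data comes from the
// vehicle-mounted device or from the handheld device.
fun styleFor(source: DataSource): DisplayStyle = when (source) {
    DataSource.VEHICLE -> DisplayStyle(colorHex = "#FFFFFF", shape = "rectangle")
    DataSource.HANDHELD -> DisplayStyle(colorHex = "#80C0FF", shape = "rounded")
}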
In one embodiment, referring to fig. 4B, after the communication connection between the in-vehicle device 1 and the handheld device 2 is established, the display module 202 displays the related data of the in-vehicle device 1 and the acquired related data of the handheld device 2 through a second user interface 5.
In one embodiment, the second user interface 5 includes a second status display area 51, a second main display area 52, and a second function control column 53.
In one embodiment, the second status display area 51 is located above the second main display area 52, and the second function control column 53 is located below the second main display area 52. The area occupied by the second main display area 52 is larger than the area occupied by the second status display area 51, and is also larger than the area occupied by the second function control column 53.
In one embodiment, the second status display area 51 has the same size and shape as the first status display area 31, and the position of the second status display area 51 on the first display screen 13 is the same as the position of the first status display area 31 on the first display screen 13. The second main display area 52 has the same size and shape as the first main display area 32, and the position of the second main display area 52 on the first display 13 is the same as the position of the first main display area 32 on the first display 13. The second function control column 53 has the same size and shape as the first function control column 33, and the position of the second function control column 53 on the first display 13 is the same as the position of the first function control column 33 on the first display 13.
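A minimal sketch of this geometry reuse, assuming a simple rectangle type and hypothetical pixel values (none of which are specified by the disclosure), could look as follows.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    width: int
    height: int

# Hypothetical pixel values for the three areas of the first user interface.
first_ui = {
    "status_area":    Rect(0,   0, 1280,  60),
    "main_area":      Rect(0,  60, 1280, 600),
    "control_column": Rect(0, 660, 1280,  60),
}

# The second user interface reuses the same size, shape and position for each area.
second_ui = {name: Rect(r.x, r.y, r.width, r.height) for name, r in first_ui.items()}
```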
In one embodiment, the second status display area 51 is used for displaying the system time of the in-vehicle device 1, the signal strength of wireless communication, and the identities of a plurality of users logged in the handheld device 2 within a preset time period (e.g., within the last week).
For example, referring to fig. 4B, the display module 202 displays, in the second status display area 51, the system time of the in-vehicle device 1, the signal strength of the wireless communication, and the fact that the user 511 and the user 512 have logged in the handheld device 2 within the last week.
In one embodiment, the display module 202 also synchronizes favorite song lists respectively corresponding to the plurality of users to the in-vehicle device 1.
In one embodiment, the brightness of the identifier corresponding to the user who is currently logged in is greater than the brightness of the identifier corresponding to a user who has logged in to the handheld device 2 within the preset time period but is not currently logged in.
In one embodiment, the second function control column 53 displays the same content as the first function control column 33, and the second function control column 53 may also be used to display function buttons of the in-vehicle device 1. For example, function buttons corresponding to the common functions of the in-vehicle device 1 may be displayed. For example, a setting button 331 for setting the air-conditioning temperature of the front seat of the vehicle 100, a button 332 corresponding to the telephone function, a button 333 corresponding to the front defogging function, a button 334 corresponding to the fan adjustment function, a button 335 corresponding to the rear defogging function, a button 336 corresponding to related functions of the vehicle 100, and a setting button 337 for setting the air-conditioning temperature of the rear seat of the vehicle 100 may be displayed in the second function control column 53.
In one embodiment, in the second function control column 53, the brightness of the function button corresponding to a function currently in the activated state is higher than the brightness of the function button corresponding to a function currently in the inactivated state.
In one embodiment, the display module 202 may divide the second main display area 52 into a plurality of sub-areas. The plurality of sub-areas includes a fourth sub-display area 521, a fifth sub-display area 522, and a sixth sub-display area 523.
In one embodiment, the fourth sub display area 521 has the same shape and size as the first sub display area 321, and the position of the fourth sub display area 521 in the second main display area 52 is the same as the position of the first sub display area 321 in the first main display area 32. The fifth sub display area 522 has the same shape and size as the second sub display area 322, and the position of the fifth sub display area 522 in the second main display area 52 is the same as the position of the second sub display area 322 in the first main display area 32. The sixth sub display region 523 and the third sub display region 323 have the same shape and size, and the position of the sixth sub display region 523 in the second main display region 52 is the same as the position of the third sub display region 323 in the first main display region 32.
In one embodiment, the plurality of sub-areas may be respectively used to display user interfaces corresponding to different applications. In one embodiment, the fourth sub-display area 521 may be used to display the navigation interface of the first navigation application 17. For example, the geographic location at which the vehicle 100 is currently located, the destination location, the suggested navigation route, and alternative navigation routes, etc. may be displayed. The display module 202 further displays the navigation data of the second navigation application 27 in the fourth sub-display area 521, for example, destinations that the second navigation application 27 has navigated to within a preset time period (for example, within the last week).
In one embodiment, when the execution module 203 acquires the music data of the second music application 28 from the handheld device 2, as long as the second music application 28 of the handheld device 2 is currently in the running state, the display module 202 closes the first music application 18 of the in-vehicle device 1, regardless of whether the first music application 18 is currently in the running state or the non-running state, and displays the music playing interface of the second music application 28 in the fifth sub-display area 522. For example, the name of the music currently played by the second music application 28, the cover picture corresponding to the currently played music, and the playing progress may be displayed.
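One possible way to express this handoff logic is sketched below; the application objects and their is_running()/close()/playing_interface() methods are assumptions rather than an actual API of the in-vehicle device.

```python
# Illustrative sketch of the handoff described above; the method names are assumptions.
def show_music_interface(first_music_app, second_music_app, fifth_sub_area):
    """Prefer the handheld device's music application whenever it is currently running."""
    if second_music_app.is_running():
        # Close the in-vehicle music application regardless of whether it is running ...
        first_music_app.close()
        # ... and show the handheld application's playing interface in the fifth sub-display area.
        fifth_sub_area.show(second_music_app.playing_interface())
    # Behaviour when the handheld application is not running is not detailed in the text.
```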
In one embodiment, the display module 202 displays, in the sixth sub-display area 523, an icon corresponding to an application that the in-vehicle apparatus 1 has run within the preset time period (for example, within the last week) and an acquired icon corresponding to an application that the handheld apparatus 2 has run within the preset time period.
In one embodiment, the display module 202 may arrange icons corresponding to applications that have been run by the in-vehicle device 1 and the handheld device 2 within the preset time period according to the termination execution time corresponding to each application. For example, the display module 202 may arrange the icon 5231 corresponding to an application whose termination execution time is latest at the top.
In this embodiment, the execution termination time corresponding to each application refers to the time when the execution of each application is finished last time.
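Purely for illustration, the ordering by termination execution time described above could be sketched as follows; the field names, the dates, and the reference time are hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical entries; 'last_terminated' is the time the application last finished running.
recent_apps = [
    {"icon": "icon_5231", "source": "in-vehicle", "last_terminated": datetime(2019, 8, 27, 21, 0)},
    {"icon": "icon_A",    "source": "handheld",   "last_terminated": datetime(2019, 8, 26, 18, 5)},
    {"icon": "icon_B",    "source": "in-vehicle", "last_terminated": datetime(2019, 8, 25, 9, 30)},
]

def order_recent_icons(entries, now, days=7):
    """Keep applications run within the preset period, most recently terminated first."""
    cutoff = now - timedelta(days=days)
    recent = [e for e in entries if e["last_terminated"] >= cutoff]
    return sorted(recent, key=lambda e: e["last_terminated"], reverse=True)

ordered = order_recent_icons(recent_apps, now=datetime(2019, 8, 28))
```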
In one embodiment, the display module 202 further displays a specific icon 5230 in the sixth sub-display region 523, wherein the specific icon 5230 is used for triggering the display of icons of all applications of the in-vehicle device 1 and icons corresponding to all applications of the handheld device 2. For example, when the display module 202 receives a touch signal generated by a user touching the specific icon 5230, the display module 202 can display icons corresponding to all applications of the in-vehicle device 1 and the handheld device 2 to the user.
In one embodiment, the display module 202 may classify and display icons corresponding to all applications of the in-vehicle device 1 and the handheld device 2 to a user according to the frequency of use or similar contents of the icons corresponding to all applications of the in-vehicle device 1 and the handheld device 2.
For example, the display module 202 may place icons corresponding to applications whose usage frequency is greater than a preset value into one group and display that group to the user, and place icons corresponding to applications whose usage frequency is less than or equal to the preset value into another group and display that group to the user.
For another example, the display module 202 may classify icons corresponding to food-and-beverage-related applications into one group and display them to the user, and classify icons corresponding to game-related applications into another group and display them to the user.
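The two grouping strategies described above (by usage frequency and by similar content) could be sketched as follows; the frequency threshold, the field names, and the category labels are assumptions for illustration only.

```python
def group_by_frequency(apps, preset_value=10):
    """Split icons into a frequently used group and an infrequently used group."""
    frequent   = [a for a in apps if a["usage_count"] > preset_value]
    infrequent = [a for a in apps if a["usage_count"] <= preset_value]
    return {"frequent": frequent, "infrequent": infrequent}

def group_by_category(apps):
    """Group icons by similar content, e.g. food-and-beverage applications vs. game applications."""
    groups = {}
    for a in apps:
        groups.setdefault(a.get("category", "other"), []).append(a)
    return groups
```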
In one embodiment, when the display module 202 has divided the second main display area 52 into a plurality of sub-areas (for example, the second main display area 52 is divided into the fourth sub-display area 521, the fifth sub-display area 522, and the sixth sub-display area 523), the display module 202 further receives a predetermined signal, and determines one of the sub-areas as a target area when receiving the predetermined signal, wherein the target area corresponds to the predetermined signal. The display module 202 determines what the target area is currently displaying. The display module 202 may further merge the sub-regions into one region in response to the predetermined signal (the merged region is the region where the second main display region 52 is located, i.e., the second main display region 52 is not divided into multiple regions). The display module 202 may further hide/close the content displayed in the sub-areas other than the target area, and display the content currently displayed in the target area and/or the other content related to the content currently displayed in the target area in the merged area. In one embodiment, the display module 202 displays the content currently displayed in the target area in the merged area after zooming in.
In one embodiment, the display module 202 combines the sub-areas into one area in response to the predetermined signal while maintaining the size, position and shape of the area where the second status display area 51 and the second function control column 53 are located.
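A minimal sketch of this merge-on-signal behaviour, assuming a dictionary-based representation of the sub-areas, is shown below; the signal names, area identifiers, and data structures are illustrative only and do not represent a real display framework.

```python
# Illustrative sketch; the signal-to-area mapping and the area dictionaries are assumptions.
SIGNAL_TO_AREA = {
    "fourth_signal": "sub_area_521",
    "fifth_signal":  "sub_area_522",
    "sixth_signal":  "sub_area_523",
}

def related_content(content):
    """Other content related to the target content (e.g. a shortcut bar); placeholder."""
    return []

def handle_predetermined_signal(signal, sub_areas, main_area):
    """Merge the sub-areas into the second main display area, keeping only the target content."""
    target_id = SIGNAL_TO_AREA.get(signal)
    if target_id is None:
        return
    target_content = sub_areas[target_id]["content"]
    for area_id, area in sub_areas.items():
        if area_id != target_id:
            area["content"] = None          # hide/close content of the non-target sub-areas
    # The merged area is the whole second main display area; the second status display area
    # and the second function control column keep their size, position and shape.
    main_area["content"] = [target_content] + related_content(target_content)
```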
In the first embodiment, the predetermined signal is a fourth signal, and the fourth signal may be a signal received from the fourth sub-display area 521 (for example, a sliding signal generated by sliding the fourth sub-display area 521 from left to right or from right to left by a preset distance, for example, 2 centimeters), or a signal generated by rotating the first designated hardware knob on the in-vehicle apparatus 1, or a signal generated by pressing the first designated hardware button on the in-vehicle apparatus 1.
In one embodiment, the display module 202 may associate the fourth signal with the fourth sub-display area 521 in advance, so that when the fourth signal is received, it may be determined that the fourth signal corresponds to the fourth sub-display area 521, and determine that the fourth sub-display area 521 is a target area from the plurality of sub-areas, and determine content currently displayed by the target area. The display module 202 further merges the sub-regions (such as the fourth sub-display region 521, the fifth sub-display region 522, and the sixth sub-display region 523) into one region (the merged region is the region where the second main display region 52 is located, i.e. the second main display region 52 is not divided into a plurality of regions). The display module 202 further hides/closes the contents displayed in the sub-areas (i.e., the fifth sub-display area 522 and the sixth sub-display area 523) except for the target area (i.e., the fourth sub-display area 521), and displays the contents currently displayed in the target area and/or the other contents related to the contents currently displayed in the target area in the merged area.
In one embodiment, the other content related to the content currently displayed in the fourth sub-display area 521 includes a shortcut bar of the in-vehicle device 1.
In one embodiment, when displaying the shortcut bar of the vehicle-mounted device 1, the display module 202 also displays detail information corresponding to one of the shortcuts in the shortcut bar. In one embodiment, the one of the shortcuts may refer to the shortcut tool located at the first position in the shortcut bar. For the specific display manner, reference may be made to the manner in which the display module 202 combines the first sub-display area 321, the second sub-display area 322, and the third sub-display area 323 into one area and then displays the shortcut bar of the in-vehicle device 1 in the combined area, which is not described in detail herein.
In the second embodiment, the predetermined signal is a fifth signal, which may be a signal received from the fifth sub-display area 522 (for example, a long-press signal generated by pressing any position of the fifth sub-display area 522 for a preset time period, for example, 3 seconds), or a signal generated by rotating a second designated hardware knob on the in-vehicle apparatus 1, or a signal generated by pressing a second designated hardware button on the in-vehicle apparatus 1.
In one embodiment, the display module 202 may associate the fifth signal with the fifth sub-display area 522 in advance, so that when the fifth signal is received, it may be determined that the fifth signal corresponds to the fifth sub-display area 522, and determine that the fifth sub-display area 522 is a target area from the plurality of sub-areas, and determine content currently displayed by the target area. The display module 202 further merges the sub-regions (such as the fourth sub-display region 521, the fifth sub-display region 522, and the sixth sub-display region 523) into one region (the merged region is the region where the second main display region 52 is located, i.e. the second main display region 52 is not divided into a plurality of regions). The display module 202 further hides/closes the contents displayed in the sub-areas (i.e., the fourth sub-display area 521 and the sixth sub-display area 523) except for the target area (i.e., the fifth sub-display area 522) from the plurality of sub-areas, and displays the contents currently displayed in the target area and/or the other contents related to the contents currently displayed in the target area in the merged area.
In one embodiment, the other content related to the content currently displayed in the fifth sub-display area 522 includes: icons respectively corresponding to the audio-related applications that the vehicle-mounted device 1 has run within a preset time period (for example, within the last week), covers respectively corresponding to the music albums and the favorite song lists that the vehicle-mounted device 1 has played within the preset time period, icons respectively corresponding to the audio-related applications that the handheld device 2 has run within the preset time period, and covers respectively corresponding to the music albums and the favorite song lists that the handheld device 2 has played within the preset time period.
For example, when the display module 202 receives the fifth signal, as shown in fig. 5B, the display module 202 merges the fourth sub-display area 521, the fifth sub-display area 522, and the sixth sub-display area 523 into one area (the merged area is the area where the second main display area 52 is located, that is, the second main display area 52 is no longer divided into a plurality of areas). The display module 202 then displays, in the merged area: the content currently displayed in the target area (i.e., the fifth sub-display area 522), such as the name of the currently played music, the cover picture corresponding to the currently played music, and the playing progress; the icons corresponding to the audio-related applications that the in-vehicle device 1 has run within the preset time period (for example, within the last week), such as the icon 3231 corresponding to the first music application 18, the icon 3232 corresponding to the "panda" application, the icon 3233 corresponding to a radio application, and the icon 3234 corresponding to a USB connection application; the covers respectively corresponding to the music albums and the favorite song lists that the vehicle-mounted device 1 has played within the preset time period; the icons corresponding to the audio-related applications that the handheld device 2 has run within the preset time period (for example, the icon 5201 corresponding to the second music application 28 and the icons 5202 corresponding to other music applications); and the covers respectively corresponding to the music albums and the favorite song lists that the handheld device 2 has played within the preset time period.
In an embodiment, the display module 202 may display icons corresponding to applications related to audio, which are executed by the in-vehicle device 1 and the handheld device 2 within the preset time period, at different brightness levels according to whether the applications are currently in an operating state, where the brightness level of the icon corresponding to the application currently in the operating state is greater than the brightness level of the icon corresponding to the application currently in the non-operating state.
In one embodiment, the display module 202 may display icons corresponding to applications related to audio, which are executed by the in-vehicle device 1 and the handheld device 2 within the preset time period, in different colors according to whether the applications are currently in an operating state, where the color of the icon corresponding to the application currently in the operating state is different from the color of the icon corresponding to the application currently in an non-operating state.
In an embodiment, the display module 202 may also arrange the icons corresponding to all of the audio-related applications that the in-vehicle device 1 and the handheld device 2 have run within the preset time period according to whether each application is currently in a running state, where the display module 202 arranges the icons corresponding to the applications currently in the running state in front, and arranges the icons corresponding to the applications currently not in the running state behind them.
In an embodiment, the display module 202 further arranges the icons placed in front, which correspond to the applications currently in the running state, according to the order of the initial execution times of those applications. In an embodiment, the display module 202 arranges the icons placed behind, which correspond to the applications currently not in the running state, either randomly or according to the order of the termination execution times of those applications. For example, the icon of the application with the latest termination execution time is arranged at the forefront of that group.
In one embodiment, the display module 202 may further use an identification line 6 to distinguish an icon corresponding to an application currently in a running state from an icon corresponding to an application currently in an non-running state.
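For illustration, the brightness and ordering rules of the preceding paragraphs could be sketched as follows; the field names and brightness values are assumptions introduced for this sketch.

```python
def arrange_audio_icons(icons):
    """icons: dicts with 'running', 'started_at' and 'terminated_at' fields (assumed names)."""
    running = sorted((i for i in icons if i["running"]),
                     key=lambda i: i["started_at"])                    # by initial execution time
    stopped = sorted((i for i in icons if not i["running"]),
                     key=lambda i: i["terminated_at"], reverse=True)   # latest termination first
    for i in running:
        i["brightness"] = 1.0   # running applications shown brighter (or in a distinct colour)
    for i in stopped:
        i["brightness"] = 0.5   # non-running applications dimmed
    return running + stopped    # an identification line may separate the two groups
```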
In the third embodiment, the predetermined signal is a sixth signal, which may be a signal received from the sixth sub-display area 523 (for example, a signal generated by pressing/touching the specific icon 5230 in the sixth sub-display area 523), or a signal generated by rotating a third designated hardware knob on the in-vehicle device 1, or a signal generated by pressing a third designated hardware button on the in-vehicle device 1.
In an embodiment, the display module 202 may associate the sixth signal with the sixth sub-display area 523 in advance, so that when the sixth signal is received, it may be determined that the sixth signal corresponds to the sixth sub-display area 523, the sixth sub-display area 523 may be determined as the target area from the plurality of sub-areas, and the content currently displayed by the target area may be determined. The display module 202 further merges the sub-areas (such as the fourth sub-display area 521, the fifth sub-display area 522, and the sixth sub-display area 523) into one area (the merged area is the area where the second main display area 52 is located, i.e., the second main display area 52 is no longer divided into a plurality of areas). The display module 202 further hides/closes the content displayed in the sub-areas other than the target area (i.e., the fourth sub-display area 521 and the fifth sub-display area 522), and displays the content currently displayed in the target area and/or other content related to the content currently displayed in the target area in the merged area.
In one embodiment, the other content related to the content currently displayed in the sixth sub-display area 523 includes icons corresponding to all applications of the in-vehicle device 1 and icons corresponding to all applications of the handheld device 2.
For example, referring to fig. 6B, when the display module 202 receives the sixth signal, the display module 202 merges the fourth sub-display area 521, the fifth sub-display area 522, and the sixth sub-display area 523 into one area (the merged area is the area where the second main display area 52 is located, that is, the second main display area 52 is no longer divided into a plurality of areas), and the display module 202 further displays, in the merged area, the content currently displayed in the target area (that is, the sixth sub-display area 523) (for example, icons corresponding to applications that the in-vehicle device 1 and the handheld device 2 have run within the last week), icons corresponding to all applications of the in-vehicle device 1 (the pie icons shown in the area where the second main display area 52 is located), and icons corresponding to all applications of the handheld device 2 (the rectangular icons shown in the area where the second main display area 52 is located).
In one embodiment, the display module 202 may display icons corresponding to all applications of the in-vehicle device 1 and the handheld device 2 in the merged area at random or in a preset order. In one embodiment, the preset sequence may refer to a sequence of installation time of each application to the corresponding device (i.e., the in-vehicle device 1 or the handheld device 2).
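A short sketch of the random or installation-time ordering described above, with an assumed 'installed_at' field, is shown below.

```python
import random

def arrange_all_icons(icons, by_install_time=True):
    """Order icons of every application of both devices by installation time, or randomly."""
    if by_install_time:
        return sorted(icons, key=lambda i: i["installed_at"])
    shuffled = list(icons)
    random.shuffle(shuffled)
    return shuffled
```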
In one embodiment, referring to fig. 6B, the display module 202 may display icons corresponding to applications that have been run by the in-vehicle apparatus 1 and the handheld apparatus 2 within the preset time period (for example, within the last week) in one list (in-vehicle applications), and display icons corresponding to all applications of the in-vehicle apparatus 1 and the handheld apparatus 2 in another list (Stephen mobile phone applications).
In step S305, the determination module 201 determines whether the communication connection between the in-vehicle device 1 and the handheld device 2 is disconnected. If the communication connection between the in-vehicle device 1 and the handheld device 2 is disconnected, the process returns to step S302. If the communication connection between the in-vehicle device 1 and the handheld device 2 is not disconnected, the process returns to step S304.
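For illustration, the branching in step S305 could be sketched as follows; the three callables are placeholders for the corresponding steps of the method flow and are not defined by the disclosure.

```python
def step_S305(is_connected, step_S302, step_S304):
    """Route the flow according to the connection state, as described in step S305."""
    if not is_connected():
        return step_S302()   # connection lost: return to step S302
    return step_S304()       # still connected: return to step S304
```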
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (20)

1. A user interface integration method is applied to a vehicle-mounted device and is characterized by comprising the following steps:
the vehicle-mounted device displays related data of the vehicle-mounted device through a first user interface;
when the vehicle-mounted device is in communication connection with a handheld device, acquiring the data access authority of the handheld device;
after the vehicle-mounted device acquires the data access authority of the handheld device, acquiring related data of the handheld device; and
the vehicle-mounted device displays the related data of the vehicle-mounted device and the acquired related data of the handheld device through a second user interface.
2. The method as claimed in claim 1, wherein the method obtains the relevant data of the handheld device according to the relevant data of the in-vehicle device displayed by the first user interface.
3. The user interface integration method of claim 2, wherein when the first user interface displays a navigation interface of a first navigation application of the in-vehicle device, the acquired related data of the handheld device comprises navigation data of a second navigation application of the handheld device.
4. The user interface integration method of claim 3, wherein the navigation data of the second navigation application includes a destination reached by the second navigation application navigating within a preset time period.
5. The user interface integration method of claim 2, wherein the acquired related data of the handheld device comprises music data of a second music application of the handheld device when the first user interface displays a music playing interface of a first music application of the in-vehicle device.
6. The user interface integration method as claimed in claim 5, wherein the music data of the second music application comprises the currently played music and its playing progress, an identifier of a user logged in to the second music application, audio-related applications that the handheld device has run within a preset time period, covers corresponding to music albums played by the handheld device within the preset time period, and covers corresponding to favorite song lists.
7. The user interface integration method of claim 2, wherein when the first user interface displays an icon of an application that has been run by the in-vehicle device within a preset time period, the acquired related data of the handheld device includes an icon corresponding to the application that has been run by the handheld device within the preset time period.
8. The user interface integration method of claim 7, wherein when the first user interface displays a specific icon for triggering display of icons of all applications of the in-vehicle device, the acquired related data of the handheld device comprises icons corresponding to all applications of the handheld device, the method further comprising: and displaying icons of all applications of the vehicle-mounted device and icons of all applications of the handheld device by using a second user interface.
9. The user interface integration method of claim 8, wherein the acquired related data of the handheld device further comprises an identification of a user who has logged in to the handheld device within a preset time period, the method further comprising: classifying the icons corresponding to all applications of the vehicle-mounted device and the handheld device according to the usage frequency or similar content of each of those icons, and then displaying the classified icons on the second user interface.
10. The user interface integration method according to claim 1, wherein, on the second user interface, the color and/or shape of the related data of the in-vehicle device is different from the acquired related data of the handheld device.
11. The user interface integration method of claim 1, wherein the first user interface comprises a first primary display area; the method also includes dividing the first main display area into a plurality of sub-areas according to a user input.
12. The user interface integration method of claim 11, further comprising:
specifying the content displayed by each of the plurality of sub-regions in response to user input.
13. The user interface integration method of claim 12, further comprising:
determining one sub-area from the plurality of sub-areas as a target area when a specified signal is received, wherein the target area corresponds to the specified signal;
determining the content currently displayed by the target area;
combining the plurality of sub-regions into one region in response to the designation signal; and
hiding/closing the contents displayed by other sub-areas except the target area in the plurality of sub-areas, and displaying the currently displayed contents of the target area and/or other contents related to the contents displayed by the target area in the merged area.
14. The user interface integration method of claim 13, further comprising:
and arranging and displaying icons respectively corresponding to all applications of the vehicle-mounted device in the combined area according to a preset sequence, wherein the preset sequence refers to the sequence of the installation time of each application installed on the vehicle-mounted device.
15. The method of claim 13, wherein the first user interface further comprises a first status display area and a first function control column; the first state display area is used for displaying the system time of the vehicle-mounted device and the signal intensity of wireless communication; the first function control column is used for displaying function buttons of the vehicle-mounted device.
16. The user interface integration method of claim 15, wherein the method maintains the size, position, and shape of the area where the first status display area and the first function control column are located while combining the plurality of sub-areas into one area in response to the designation signal.
17. The method of claim 15, wherein in the first function control column, the brightness of the function button corresponding to the function currently in the activated state is higher than the brightness of the function button corresponding to the function currently in the inactivated state.
18. The user interface integration method according to claim 15, wherein the second user interface comprises a second status display area, a second main display area, and a second function control column, wherein the second status display area is used for displaying the system time of the in-vehicle device, the signal strength of the wireless communication, and the identities of a plurality of users logged in to the handheld device within a preset time period; the second main display area comprises a plurality of sub-areas, and the plurality of sub-areas included in the second user interface respectively correspond to the plurality of sub-areas included in the first user interface; the content displayed by the second function control column is the same as the content displayed by the first function control column.
19. The user interface integration method of claim 18, further comprising:
and synchronizing the favorite song lists respectively corresponding to the plurality of users to the vehicle-mounted device.
20. An in-vehicle device, comprising a memory for storing at least one instruction and a processor for implementing the user interface integration method of any one of claims 1 to 19 when executing the at least one instruction.
CN201910803614.6A 2018-12-12 2019-08-28 User interface integration method and vehicle-mounted device Active CN111309414B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/710,060 US11356726B2 (en) 2018-12-12 2019-12-11 In-vehicle device and method for managing user interfaces

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862778316P 2018-12-12 2018-12-12
US62/778316 2018-12-12

Publications (2)

Publication Number Publication Date
CN111309414A true CN111309414A (en) 2020-06-19
CN111309414B CN111309414B (en) 2023-07-18

Family

ID=71148772

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910803614.6A Active CN111309414B (en) 2018-12-12 2019-08-28 User interface integration method and vehicle-mounted device

Country Status (2)

Country Link
CN (1) CN111309414B (en)
TW (1) TWI742421B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112181249A (en) * 2020-09-25 2021-01-05 北京字节跳动网络技术有限公司 Play control method and device, electronic equipment and storage medium
WO2022052907A1 (en) * 2020-09-10 2022-03-17 华为技术有限公司 Display method and electronic device

Citations (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020115477A1 (en) * 2001-02-13 2002-08-22 Raja Singh Portable high speed internet access device with scrolling
US6606543B1 (en) * 2002-01-09 2003-08-12 Microsoft Corporation Method and apparatus for logging into a vehicle computer system
US20110051665A1 (en) * 2009-09-03 2011-03-03 Apple Inc. Location Histories for Location Aware Devices
CN102378402A (en) * 2010-08-04 2012-03-14 宏达国际电子股份有限公司 Method for carrying out multiple connections and related communications device
US20120271723A1 (en) * 2011-04-22 2012-10-25 Penilla Angel A Electric vehicle (ev) range extending charge systems, distributed networks of charge kiosks, and charge locating mobile apps
US20130227038A1 (en) * 2012-02-29 2013-08-29 Bradly Freeman Rich Mechanism for facilitating user-controlled features relating to media content in multiple online media communities and networks
US20130275994A1 (en) * 2012-04-11 2013-10-17 Nokia Corporation Method and apparatus for activity management across multiple devices
CN103425494A (en) * 2013-08-06 2013-12-04 惠州华阳通用电子有限公司 Information interaction system of vehicle-mounted terminal and smart mobile terminal
CN103780702A (en) * 2014-02-17 2014-05-07 重庆长安汽车股份有限公司 Vehicle-mounted amusement device and mobile phone interactive system and method
CN104142806A (en) * 2014-07-09 2014-11-12 常熟恒基科技有限公司 Vehicle-mounted mobile phone interconnection system
CN104169865A (en) * 2012-04-11 2014-11-26 丰田自动车工程及制造北美公司 Systems and methods for browsing mobile device with in-vehicle user interface
CN104301507A (en) * 2013-07-15 2015-01-21 Lg电子株式会社 Mobile terminal and control method thereof
CN104471353A (en) * 2012-04-16 2015-03-25 纽昂斯通信有限公司 Low-attention gestural user interface
CN104935741A (en) * 2015-06-05 2015-09-23 北京九五智驾信息技术股份有限公司 Interconnecting method and system of vehicle-mounted information and entertainment device and mobile terminal
CN104954859A (en) * 2014-03-28 2015-09-30 比亚迪股份有限公司 Mobile terminal control method through vehicle-mounted multimedia and device and system thereof
CN104977018A (en) * 2015-07-10 2015-10-14 钛马信息网络技术有限公司 Vehicle-mounted navigation equipment, mobile device, vehicle-mounted navigation equipment navigation method, mobile device navigation method, and navigation system
CN104991788A (en) * 2015-05-07 2015-10-21 奇瑞汽车股份有限公司 Application operation method and system
CN105094502A (en) * 2014-04-30 2015-11-25 阿尔派株式会社 Vehicle device system
CN105117146A (en) * 2014-05-26 2015-12-02 Lg电子株式会社 Information providing apparatus and method thereof
CN105283840A (en) * 2013-06-08 2016-01-27 苹果公司 Device, method, and graphical user interface for synchronizing two or more displays
CN105320429A (en) * 2014-08-01 2016-02-10 大众汽车有限公司 Mirroring deep links
CN105472115A (en) * 2014-09-05 2016-04-06 阿尔卑斯电气株式会社 Display device and display method
CN105516460A (en) * 2015-11-25 2016-04-20 奇瑞汽车股份有限公司 Method for realizing full-screen projection of single application function of mobile phone to vehicle-mounted display screen
CN105511945A (en) * 2014-11-14 2016-04-20 乔治·斯特彻夫 Application matching method for mobile device and accessory method
US20160125077A1 (en) * 2014-10-29 2016-05-05 Hyundai Motor Company Music recommendation system for vehicle and method thereof
CN105917320A (en) * 2013-12-03 2016-08-31 本田技研工业株式会社 Portable electronic device linking system
CN106462378A (en) * 2016-09-28 2017-02-22 北京小米移动软件有限公司 Display method, apparatus and automobile data recorder
KR101730315B1 (en) * 2015-11-05 2017-04-27 엘지전자 주식회사 Electronic device and method for image sharing
CN106790486A (en) * 2016-12-13 2017-05-31 百度在线网络技术(北京)有限公司 Display methods and device
WO2017101325A1 (en) * 2015-12-14 2017-06-22 乐视控股(北京)有限公司 Vehicle-mounted display control method and device thereof
CN107117114A (en) * 2015-09-10 2017-09-01 福特全球技术公司 In-car add-on module is integrated to driver's user interface
CN206820780U (en) * 2017-04-05 2017-12-29 惠州市凯越电子股份有限公司 A kind of car networking cross complaint system of vehicle intelligent terminal and hand-held mobile terminal
CN107610723A (en) * 2016-07-12 2018-01-19 福特全球技术公司 Use non-matching device access information entertainment systems
CN107608582A (en) * 2017-07-30 2018-01-19 魏新成 The method system and equipment of classification application download are carried out by mobile phone classification desktop
WO2018131908A1 (en) * 2017-01-12 2018-07-19 삼성전자 주식회사 Vehicle device, display method in vehicle device and electronic device, and information transmission method in electronic device
CN108349423A (en) * 2015-11-13 2018-07-31 哈曼国际工业有限公司 User interface for onboard system
CN108446140A (en) * 2017-02-15 2018-08-24 阿里巴巴集团控股有限公司 Interface display method, device, the device and operating system
CN208092708U (en) * 2017-12-15 2018-11-13 蔚来汽车有限公司 Vehicle-mounted terminal equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102297330B1 (en) * 2015-01-16 2021-09-02 삼성전자주식회사 Method for controlling display and an electronic device thereof
CN107506117A (en) * 2017-03-30 2017-12-22 宝沃汽车(中国)有限公司 display control method, device and vehicle
CN107256129A (en) * 2017-07-20 2017-10-17 广东欧珀移动通信有限公司 Switch method, device and its relevant device of application under span mode

Also Published As

Publication number Publication date
TW202022570A (en) 2020-06-16
CN111309414B (en) 2023-07-18
TWI742421B (en) 2021-10-11

Similar Documents

Publication Publication Date Title
EP3013076B1 (en) Mobile terminal and control method for the mobile terminal
EP3092559B1 (en) Presenting and interacting with audio-visual content in a vehicle
CN108284840B (en) Autonomous vehicle control system and method incorporating occupant preferences
JP6525888B2 (en) Reconfiguration of Vehicle User Interface Based on Context
US8175803B2 (en) Graphic interface method and apparatus for navigation system for providing parking information
US20160342406A1 (en) Presenting and interacting with audio-visual content in a vehicle
US11356726B2 (en) In-vehicle device and method for managing user interfaces
US10409449B2 (en) In-vehicle display apparatus and controlling program
US9057624B2 (en) System and method for vehicle navigation with multiple abstraction layers
US10019155B2 (en) Touch control panel for vehicle control system
WO2013074897A1 (en) Configurable vehicle console
US20150339007A1 (en) Portable information terminal
CN104346047A (en) Icon arrangement processing method, device and terminal
CN109002270A (en) A kind of multi-display method and system, car-mounted terminal
JP2016097928A (en) Vehicular display control unit
CN111309414B (en) User interface integration method and vehicle-mounted device
JP5494318B2 (en) Mobile terminal and communication system
CN104512322A (en) Intelligent switching of audio sources
JP2004251738A (en) Vehicle-mounted information display
WO2013190802A1 (en) On-vehicle map display device
CN111475075A (en) Vehicle-mounted screen control method, management system and computer-readable storage medium
CN104019826A (en) Automatic navigation method and system based on touch control
US20240111393A1 (en) Information apparatus and menu display method
JP6905665B2 (en) Electronics and programs
JP7111513B2 (en) Information processing device and information processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211015

Address after: 238 helicoburg Road, Amsterdam, the Netherlands

Applicant after: Dutch mobile drive Co.

Address before: 42nd floor, 63 Xuefu Road, Nanshan District, Shenzhen, Guangdong 518052

Applicant before: Shenzhen Chaojie Communication Co.,Ltd.

GR01 Patent grant