CN116061854A - Touch menu display method, device, terminal, medium and program product - Google Patents

Touch menu display method, device, terminal, medium and program product

Info

Publication number
CN116061854A
CN116061854A (application number CN202310084577.4A)
Authority
CN
China
Prior art keywords
display area
touch menu
information
vehicle
target
Prior art date
Legal status
Pending
Application number
CN202310084577.4A
Other languages
Chinese (zh)
Inventor
刘林欣
杜加价
袁敏敏
赛影辉
李涛
Current Assignee
Wuhu Automotive Prospective Technology Research Institute Co ltd
Chery Automobile Co Ltd
Original Assignee
Wuhu Automotive Prospective Technology Research Institute Co ltd
Chery Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhu Automotive Prospective Technology Research Institute Co ltd and Chery Automobile Co Ltd
Priority to CN202310084577.4A
Publication of CN116061854A
Legal status: Pending


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/037 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60Q - ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q3/00 - Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
    • B60Q3/80 - Circuits; Control arrangements
    • B60Q3/82 - Switches specially adapted for vehicle interior lighting, e.g. switching by tilting the lens
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 - Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 - Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 - Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 - Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B60R11/0229 - Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for displays, e.g. cathodic tubes
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 - Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 - Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B60R11/0258 - Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for navigation systems
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R13/00 - Elements for body-finishing, identifying, or decorating; Arrangements or adaptations for advertising purposes
    • B60R13/02 - Internal trim mouldings; Internal ledges; Wall liners for passenger compartments; Roof liners
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/005 - Electro-mechanical devices, e.g. switched
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 - Interaction between the driver and the control system
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/451 - Execution arrangements for user interfaces
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R13/00 - Elements for body-finishing, identifying, or decorating; Arrangements or adaptations for advertising purposes
    • B60R13/02 - Internal trim mouldings; Internal ledges; Wall liners for passenger compartments; Roof liners
    • B60R2013/0287 - Internal trim mouldings; Internal ledges; Wall liners for passenger compartments; Roof liners integrating other functions or accessories

Abstract

The application discloses a touch menu display method, device, terminal, medium and program product, and relates to the field of intelligent automobiles. The method includes: collecting biometric information; collecting seating position information; determining, from the different display areas corresponding to different seating positions, a target display area corresponding to the seating position information; performing identity recognition on the biometric information to obtain identification information corresponding to the occupant; and displaying the touch menu corresponding to the identification information in the target display area. The method and device can provide personalized touch menus for different occupants of the target vehicle based on their identification information and display those menus in a suitable target display area, which facilitates user operation, improves safety during driving, provides more diversified services, and improves the efficiency of interaction between occupants and the vehicle.

Description

Touch menu display method, device, terminal, medium and program product
Technical Field
The embodiments of the application relate to the field of intelligent automobiles, and in particular to a touch menu display method, device, terminal, medium and program product.
Background
Intelligent automobiles are being applied ever more widely, and the demand for intelligent features keeps growing. To create a "third space" beyond the "working space" and the "living space", so that on-board products can provide more services, the body of an intelligent automobile generally uses a light-transmitting skin as its surface.
The light-transmitting skin is used to display a luminous touch menu. The touch menu contains a number of different controls, and triggering a control in the touch menu causes the intelligent automobile to execute the corresponding in-vehicle function.
However, a conventional intelligent light-transmitting skin displays only a single, fixed set of content and cannot provide personalized services for different users, so the user experience is poor.
Disclosure of Invention
The embodiments of the application provide a touch menu display method, device, terminal, medium and program product, which can customize personalized touch menus and display them in different areas. The technical scheme is as follows:
In one aspect, a method for displaying a touch menu is provided, the method including:
collecting biometric information, the biometric information being collected when an occupant is present in a target vehicle equipped with the vehicle-mounted terminal, the vehicle-mounted terminal including different display areas corresponding to different seating positions of the target vehicle;
collecting seating position information indicating the seat occupied by the occupant in the target vehicle;
determining, from the different display areas corresponding to the different seating positions, a target display area corresponding to the seating position information;
performing identity recognition on the biometric information to obtain identification information corresponding to the occupant;
and displaying, in the target display area, the touch menu corresponding to the identification information, the touch menu including controls corresponding to various vehicle functions.
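By way of illustration only (not part of the original disclosure), the following minimal Python sketch shows one possible reading of this flow; all names (TouchMenu, SEAT_TO_DISPLAY_AREA, display_touch_menu, and so on) are hypothetical placeholders rather than anything defined by the patent.

    from dataclasses import dataclass, field

    @dataclass
    class TouchMenu:
        name: str
        controls: dict = field(default_factory=dict)   # control label -> vehicle function id

    # Hypothetical correspondence tables; in the disclosure they are built by the
    # position matching operation and the saving operation of FIG. 8.
    SEAT_TO_DISPLAY_AREA = {
        "main_driver_seat": "main_driving_display_area",
        "front_passenger_seat": "secondary_driving_display_area",
    }
    IDENTITY_TO_MENU = {"driver_0001": TouchMenu("first_touch_menu", {"navigation": "nav"})}
    DEFAULT_MENU = TouchMenu("default_touch_menu", {"music": "media_play"})

    def display_touch_menu(biometric_info, seat, identify, render):
        """One occupant: collect -> locate area -> recognize -> display."""
        target_area = SEAT_TO_DISPLAY_AREA[seat]                 # determine target display area
        identity = identify(biometric_info)                      # identity recognition
        menu = IDENTITY_TO_MENU.get(identity, DEFAULT_MENU)      # personalized or default menu
        render(target_area, menu)                                # display in the target area

    if __name__ == "__main__":
        display_touch_menu(
            biometric_info=b"face-template-bytes",
            seat="main_driver_seat",
            identify=lambda info: "driver_0001",                 # stand-in recognizer
            render=lambda area, menu: print(f"show {menu.name} in {area}"),
        )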
In another aspect, a touch menu display device is provided, the device including:
an acquisition module, configured to collect biometric information, the biometric information being collected when an occupant is present in a target vehicle equipped with the vehicle-mounted terminal, the vehicle-mounted terminal including different display areas corresponding to different seating positions of the target vehicle;
the acquisition module being further configured to collect seating position information indicating the seat occupied by the occupant in the target vehicle;
a determining module, configured to determine, from the different display areas corresponding to the different seating positions, a target display area corresponding to the seating position information;
a recognition module, configured to perform identity recognition on the biometric information to obtain identification information corresponding to the occupant;
and a display module, configured to display, in the target display area, the touch menu corresponding to the identification information, the touch menu including controls corresponding to various vehicle functions.
In another aspect, a vehicle-mounted terminal is provided, including a processor and a memory, the memory storing at least one instruction, at least one program, a code set or an instruction set, which is loaded and executed by the processor to implement the touch menu display method according to any one of the embodiments of the application.
In another aspect, a computer-readable storage medium is provided, storing at least one instruction, at least one program, a code set or an instruction set, which is loaded and executed by a processor to implement the touch menu display method according to any one of the embodiments of the application.
In another aspect, a computer program product or computer program is provided, including computer instructions stored in a computer-readable storage medium. A processor of a vehicle-mounted terminal reads the computer instructions from the computer-readable storage medium and executes them, so that the vehicle-mounted terminal performs the touch menu display method according to any one of the above embodiments.
The technical solutions provided in the embodiments of the application bring at least the following beneficial effects:
Biometric information of an occupant is collected and recognized to obtain identification information, the corresponding touch menu is obtained based on the identification information, and the touch menu is displayed in the target display area corresponding to the occupant's seating position. This improves the convenience and safety of operation, allows personalized touch menus to be provided based on different identification information, and displays different touch menus in different positions, enriching the display of touch menus, providing more diversified services and improving the efficiency of human-machine interaction.
Drawings
In order to describe the technical solutions of the embodiments of the application more clearly, the drawings needed for the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the application; other drawings can be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic diagram of a conventional intelligent light-transmitting skin displaying a touch menu for different users, provided in an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of personalized touch menus displayed for different users, provided in an exemplary embodiment of the present application;
FIG. 3 is a schematic diagram of an implementation environment provided in an exemplary embodiment of the present application;
FIG. 4 is a flowchart of a touch menu display method provided in an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of the correspondence between seating positions and display areas provided in an exemplary embodiment of the present application;
FIG. 6 is a flowchart of a touch menu display method based on FIG. 4, provided in an exemplary embodiment of the present application;
FIG. 7 is a schematic diagram of a main driving display area and a secondary driving display area provided in an exemplary embodiment of the present application;
FIG. 8 is a flowchart of a method for adjusting a default touch menu provided in an exemplary embodiment of the present application;
FIG. 9 is a block diagram of a touch menu display device provided in an exemplary embodiment of the present application;
FIG. 10 is a block diagram of a touch menu display device provided in another exemplary embodiment of the present application;
FIG. 11 is a block diagram of a vehicle-mounted terminal provided in an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, the terms involved in the embodiments of the application are briefly described:
the DMS system (Driver Monitor System) is a driving fatigue detection system, that is, an in-vehicle system that detects the state of a driver during driving, as the name implies. The DMS system comprises face ID recognition, fatigue detection, distraction detection, expression recognition, gesture recognition, dangerous action recognition, sight tracking and the like.
The implementation of the DMS system is identified through a non-wide-angle camera of the main driver seat and an in-vehicle wide-angle camera, the identification is transmitted to a fatigue detection main control unit, the fatigue detection main control unit performs certain algorithm calculation through the action of a driver, and the driver detection system transmits the information of the driver to a man-machine interaction interface for fatigue text or alarm reminding.
The intelligent light-transmitting surface is an intelligent interior trim in the intelligent automobile cockpit. It integrates a light-transmitting surface material into the surface of the interior trim, combining lighting, sensing, touch control and other functions into an intelligent surface. A luminous touch menu can be displayed on the light-transmitting skin when needed.
The touch menu contains a number of different controls, each corresponding to a different in-vehicle function. Triggering a control on the touch menu causes the vehicle head unit to execute the corresponding function, thereby providing different services to users. An illustrative representation of such a menu is sketched below.
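For illustration only, one way such a control-to-function menu could be represented in software is sketched below in Python; the control names and function identifiers are invented for the example and are not taken from the disclosure.

    from dataclasses import dataclass
    from typing import Callable, Dict

    @dataclass
    class MenuControl:
        label: str                      # text or icon shown on the light-transmitting skin
        action: Callable[[], None]      # vehicle function executed when the control is triggered

    @dataclass
    class TouchMenu:
        controls: Dict[str, MenuControl]

        def trigger(self, control_id: str) -> None:
            """Simulate a touch on one control: run the bound vehicle function."""
            self.controls[control_id].action()

    # Example menu with two invented controls.
    menu = TouchMenu(controls={
        "music": MenuControl("Play music", lambda: print("head unit: start playback")),
        "ambient": MenuControl("Ambient light", lambda: print("head unit: toggle ambient light")),
    })
    menu.trigger("music")   # -> head unit: start playback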
In the related art, the DMS and the intelligent light-transmitting skin are applied separately in the interior of an intelligent automobile, and are mainly used to provide in-vehicle services to the driver in the main driver's seat. A conventional intelligent light-transmitting skin can only display the same touch menu in a fixed area; that is, all drivers or passengers in the current vehicle can only use the same default touch menu. No personalized service can be provided, the display of the touch menu is monotonous, and the user experience is poor.
Schematically, FIG. 1 shows how a conventional intelligent light-transmitting skin displays a touch menu for different users.
For the same target vehicle 110, the intelligent light-transmitting skin inside the target vehicle 110 serves as a single overall display area and can only display one default touch menu 120.
When the first user 101 enters the target vehicle 110, the default touch menu 120 is displayed in the display area of the target vehicle 110; similarly, when the second user 102 and the third user 103 enter the target vehicle 110, the same default touch menu 120 is displayed in the display area.
In the embodiments of the application, the DMS is combined with the intelligent light-transmitting skin technology. Based on the DMS of the vehicle-mounted terminal, biometric information is collected for drivers or passengers at different seating positions, identity recognition is performed on the collected biometric information, and different touch menus are displayed based on the resulting identification information.
The DMS can also collect the seating position information of a driver or passenger in the vehicle. Based on the correspondence between seating position information and the different display areas in the vehicle, the intelligent light-transmitting skin can display touch menus in different display areas. Personalized touch menus can be customized according to the habits and preferences of different drivers or passengers, and once the DMS has obtained the identification information, the corresponding personalized touch menu is immediately displayed in the corresponding display area.
Schematically, FIG. 2 shows personalized touch menus displayed for different users.
Inside the target vehicle 200 there are a driver 201 and a passenger 202. The interior of the target vehicle 200 includes intelligent light-transmitting skins covering the surfaces of its interior components, which serve as display areas for personalized touch menus.
The driver 201 sits in the main driver's seat; the display area corresponding to the main driver's seat is a first target display area 211, in which a first touch menu 221 is displayed. The first touch menu 221 corresponds to the identification information of the driver 201.
The passenger 202 sits in the front passenger seat; the display areas corresponding to the front passenger seat are a second target display area 212 and a third target display area 213, in which a second touch menu 222 is displayed. The second touch menu 222 corresponds to the identification information of the passenger 202. The ways of displaying the second touch menu 222 in the second target display area 212 and the third target display area 213 include, but are not limited to: (1) separate display: a complete second touch menu 222 is displayed in the second target display area 212, and another complete second touch menu 222 is displayed in the third target display area 213; (2) combined display: the second target display area 212 and the third target display area 213 are treated as one overall display area that together displays the complete second touch menu 222. These two display modes are sketched below.
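Purely as an illustration of the two display modes just named (separate display in each area versus treating both areas as one combined area), a short Python sketch follows; the area identifiers and the render function are hypothetical.

    def render(area_id: str, menu_name: str) -> None:
        # Stand-in for driving the light-transmitting skin of one display area.
        print(f"{area_id}: showing {menu_name}")

    def show_in_areas(menu_name: str, areas: list[str], mode: str) -> None:
        if mode == "separate":
            # Mode (1): each area shows its own complete copy of the menu.
            for area in areas:
                render(area, menu_name)
        elif mode == "combined":
            # Mode (2): both areas are treated as one larger display surface.
            render("+".join(areas), menu_name)

    show_in_areas("second_touch_menu", ["area_212", "area_213"], mode="separate")
    show_in_areas("second_touch_menu", ["area_212", "area_213"], mode="combined")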
It should be noted that this embodiment is described by taking the second target display area 212 and the third target display area 213 as the areas corresponding to the front passenger seat. In some embodiments, the third target display area 213 may instead serve as a display area corresponding to the main driver's seat and display the first touch menu 221, which is not limited in this embodiment.
This embodiment is described by taking the case where two occupants are present inside the target vehicle 200 as an example. In some embodiments, when only one occupant is present inside the target vehicle 200, the first target display area 211, the second target display area 212 and the third target display area 213 may all serve as display areas corresponding to the seat in which that occupant sits; that is, the three areas jointly display the touch menu corresponding to that occupant's identification information, which is not limited in this embodiment.
Notably, the biometric information, identity information and the like are either data actively uploaded by the user or data collected after the user's individual authorization.
It should be noted that the information and data involved in this application are individually authorized by the user or fully authorized by all parties, and the collection, use and processing of the relevant data must comply with the relevant laws, regulations and standards of the relevant countries and regions. For example, the biometric information referred to in this application is obtained with full authorization.
Next, the implementation environment of the embodiments of the application is described. Schematically, as shown in FIG. 3, the environment involves a target vehicle 310, a vehicle-mounted terminal 320 of the target vehicle 310, a display area 330 of the target vehicle 310, and a server 340. The vehicle-mounted terminal 320 and the server 340 are connected through a communication network.
The vehicle-mounted terminal 320 and the display area 330 are both components of the interior of the target vehicle 310.
In some embodiments, the vehicle-mounted terminal 320 is configured to collect biometric information when an occupant is present in the target vehicle 310 and send the collected biometric information to the server 340; the server 340 recognizes the received biometric information to obtain the corresponding identification information. After the server 340 returns the identification information to the vehicle-mounted terminal 320, the vehicle-mounted terminal 320 controls the display area 330 of the target vehicle 310 to display the corresponding touch menu based on the identification information.
Optionally, the vehicle-mounted terminal 320 is further configured to collect, when an occupant is present in the target vehicle 310, the occupant's seat position in the target vehicle 310 to obtain the seating position information. The vehicle-mounted terminal 320 sends the collected seating position information to the server 340, and the server 340 determines, based on the received seating position information, the target display area within the display area 330 for displaying the touch menu corresponding to that occupant.
In some embodiments, after the vehicle-mounted terminal 320 of the target vehicle 310 collects the biometric information, it directly performs identity recognition on the biometric information to obtain the identification information, and then controls the display area 330 of the target vehicle 310 to display the corresponding touch menu based on the identification information. In this case, the biometric information does not need to be sent to the server 340 for recognition. A sketch of both variants is given below.
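The two variants described above (recognition on the server 340 versus recognition on the vehicle-mounted terminal 320 itself) could be organized as sketched below; this is an assumption-laden illustration, and the network transport shown is a placeholder rather than a real API.

    import json

    def recognize_locally(biometric_info: bytes, local_db: dict) -> str | None:
        # Variant 2: the terminal matches the sample against identities stored on board.
        for identity, template in local_db.items():
            if template == biometric_info:          # placeholder for a real matcher
                return identity
        return None

    def recognize_via_server(biometric_info: bytes, send) -> str | None:
        # Variant 1: the terminal uploads the sample and the server returns the identity.
        reply = send(json.dumps({"biometric": biometric_info.hex()}))
        return json.loads(reply).get("identity")

    # Example: a fake transport standing in for the communication network to server 340.
    fake_server = lambda payload: json.dumps({"identity": "occupant_42"})
    print(recognize_via_server(b"\x01\x02", fake_server))                 # occupant_42
    print(recognize_locally(b"\x01\x02", {"occupant_42": b"\x01\x02"}))   # occupant_42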
In some embodiments, the vehicle-mounted terminal 320 has an application program with an information collection function installed (including collection of an occupant's biometric information, such as facial feature information). Illustratively, the vehicle-mounted terminal 320 runs an application that collects the biometric information of occupants present inside the target vehicle 310, for example a face recognition application, a body temperature detection application, a location detection application, an instant messaging application, a music playback application, a navigation and positioning application or a news application, which is not limited in this embodiment.
In some embodiments, the application with the information collection function installed in the vehicle-mounted terminal 320 can also collect the seating position information of the occupants present in the target vehicle 310.
It should be noted that the above communication network may be implemented as a wired or wireless network, and may be any of a local area network, a metropolitan area network or a wide area network, which is not limited in the embodiments of the application.
It should be noted that the server 340 may be implemented as a cloud server. Cloud technology refers to a hosting technology that unifies hardware, software, network and other resources in a wide area network or a local area network to realize the computation, storage, processing and sharing of data. Cloud technology is the general term for the network, information, integration, management-platform and application technologies applied under the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing will become an increasingly important support: background services of technical network systems, such as video websites, image websites and portal websites, require large amounts of computing and storage resources. With the development of the internet industry, each item may in the future have its own identification mark, which needs to be transmitted to a background system for logical processing; data of different levels will be processed separately, and all kinds of industry data require strong back-end system support, which can only be realized through cloud computing.
In some embodiments, the server 340 may also be implemented as a node in a blockchain system. A blockchain is a new application mode of computer technologies such as distributed data storage, peer-to-peer transmission, consensus mechanisms and encryption algorithms. A blockchain is essentially a decentralized database: a chain of data blocks generated in association with each other using cryptographic methods, in which each data block contains the information of a batch of network transactions and is used to verify the validity (anti-counterfeiting) of the information and to generate the next block. A blockchain may include a blockchain underlying platform, a platform product services layer and an application services layer.
With the above terms and implementation environment in mind, the touch menu display method provided in the embodiments of the application is described below. FIG. 4 is a flowchart of a touch menu display method provided in an exemplary embodiment of the application. The method is performed by a vehicle-mounted terminal and, as shown in FIG. 4, includes the following steps.
In step 410, biometric information is collected.
The biometric information is collected when an occupant is present in a target vehicle equipped with the vehicle-mounted terminal, and the vehicle-mounted terminal includes different display areas corresponding to different seating positions of the target vehicle.
Biometric information refers to personal data generated by specific technical processing of the physical, physiological or behavioral characteristics of a natural person, which can uniquely identify that person. It includes, but is not limited to, facial feature information, fingerprint feature information, body temperature feature information and eyeball (iris) feature information.
The vehicle-mounted terminal of the target vehicle includes an information collection system, which is used to collect the biometric information.
In some embodiments, the biometric information includes at least one of:
First, the biometric information includes facial feature information.
Optionally, the information collection system is a driver fatigue detection system (Driver Monitor System, DMS).
Optionally, the biometric information is obtained by collecting the facial feature information of the driver when a driver is present in the main driver's seat of the target vehicle.
The facial feature information includes feature information of specific parts of the driver's face, for example the distribution of the driver's facial features (the "three sections and five eyes" facial proportions).
Second, the biometric information includes fingerprint feature information.
Optionally, the information collection system collects it through a fingerprint collection element installed on the target vehicle, for example a fingerprint collector mounted on the steering wheel or on a door handle of the target vehicle.
When an occupant pulls a door handle of the target vehicle and opens the door to enter, the occupant's fingerprint feature information can be collected; or, after the occupant enters the target vehicle and places a hand on the steering wheel, the fingerprint feature information can be collected.
Optionally, the biometric information is obtained by collecting the fingerprint information of the driver when a driver is present in the main driver's seat of the target vehicle.
The fingerprint feature information includes, but is not limited to, the driver's finger fingerprint features and palm texture features.
Third, the biometric information includes iris feature information.
Optionally, the information collection system collects it through an iris collection camera installed on the target vehicle, for example a digital camera mounted on the interior wall of the target vehicle.
After an occupant enters the target vehicle, the digital camera photographs and stores images of the occupant's entire eyes.
Optionally, the biometric information is obtained by collecting the eyeball feature information of the driver when a driver is present in the main driver's seat of the target vehicle.
The iris feature information refers to feature information of the driver's iris, including but not limited to visual features such as spots, stripes and filaments in the iris of the human eye. An illustrative record combining these modalities is sketched below.
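As an illustrative aside (not asserted by the disclosure), the three biometric modalities listed above could be carried in a single record like the following Python sketch; every field name here is invented.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class BiometricSample:
        """One collection event from the information collection system (e.g. the DMS)."""
        facial_features: Optional[bytes] = None       # from the driver-facing camera
        fingerprint_features: Optional[bytes] = None  # from a door-handle or steering-wheel sensor
        iris_features: Optional[bytes] = None         # from an in-cabin digital camera
        consent_granted: bool = False                 # collected only with the user's authorization

        def available_modalities(self) -> list[str]:
            return [name for name, value in (
                ("facial", self.facial_features),
                ("fingerprint", self.fingerprint_features),
                ("iris", self.iris_features),
            ) if value is not None]

    sample = BiometricSample(facial_features=b"...", consent_granted=True)
    print(sample.available_modalities())    # ['facial']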
The different seating positions of the target vehicle correspond to different display areas of the vehicle-mounted terminal; that is, different seating positions correspond to different display areas.
A display area is a region of the intelligent light-transmitting skin inside the target vehicle; the skin is divided into different display areas, and the display areas are used to display touch menus.
In some embodiments, the occupant of the target vehicle may be a passenger riding in the target vehicle rather than the driver, and the collected biometric information may then refer to the passenger's facial feature information, which is not limited in this embodiment.
Notably, the occupant's biometric information collected by the information collection system is either data actively uploaded by the user or data collected after the user's individual authorization.
It should be noted that the information involved in this application (including but not limited to an occupant's facial feature information, fingerprint feature information and iris feature information) and data (including but not limited to data for analysis, stored data and displayed data) are individually authorized by the user or fully authorized by all parties, and the collection, use and processing of the relevant data must comply with the relevant laws, regulations and standards of the relevant countries and regions. For example, the biometric information referred to in this application is obtained with full authorization.
In step 420, seating position information is collected.
The seating position information indicates the seat occupied by the occupant in the target vehicle; the target vehicle has at least one seat.
The information collection system is also used to collect the seating position information corresponding to the occupant.
Optionally, if the occupant present in the target vehicle is the driver, the seating position information indicates that the driver is in the main driver's seat of the target vehicle.
Optionally, if the occupant present in the target vehicle is a front passenger, the seating position information indicates that the passenger is in the front passenger seat of the target vehicle.
Optionally, if the occupant present in the target vehicle is another passenger in a rear seat, the seating position information indicates that this passenger is in another seat of the target vehicle.
In some embodiments, the seating position information differs depending on the in-vehicle position of the occupant present in the target vehicle.
In step 430, a target display area corresponding to the seating position information is determined from the different display areas corresponding to the different seating positions.
Each seating position corresponds to its own display area(s). Before the seating position information is collected, the vehicle-mounted system matches seating positions with display areas by receiving a position matching operation, thereby obtaining the correspondence between seating positions and display areas.
The correspondence between a seating position and display areas includes, but is not limited to, the following:
(1) one seating position corresponds to one display area;
(2) one seating position corresponds to several display areas.
That is, the position matching operation is mainly used to divide the display areas according to the different seating positions, so that a correspondence exists between seating positions and display areas and the target display area can be determined from the seating position information.
Schematically, FIG. 5 shows the correspondence between seating positions and display areas.
The display areas 500 include, but are not limited to: an instrument panel display area 501, a main driving door handle display area 502, a center control display area 503, a center control armrest display area 504, a front passenger screen display area 505 and a front passenger door handle display area 506.
The seating positions include, but are not limited to: a main driving position 511 and a front passenger position 512.
The main driving position 511 can be matched with any of the display areas 500, and the front passenger position 512 can likewise be matched with any of the display areas 500.
Optionally, the areas matched with the main driving position 511 are the main driving door handle display area 502, the center control display area 503 and the center control armrest display area 504; the areas matched with the front passenger position 512 are the front passenger screen display area 505 and the front passenger door handle display area 506.
Optionally, the areas matched with the main driving position 511 are the main driving door handle display area 502 and the center control display area 503; the areas matched with the front passenger position 512 are the center control armrest display area 504, the front passenger screen display area 505 and the front passenger door handle display area 506.
In some embodiments, if no correspondence between seating positions and display areas has been set, the main driving position may correspond to all display areas, which is not limited in this embodiment. A sketch of this mapping is given below.
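To make the correspondence of FIG. 5 concrete, a small Python sketch follows; the mapping shown is just the first optional assignment described above, and the identifiers are illustrative only.

    # One seating position may be matched with one or several display areas.
    POSITION_TO_AREAS = {
        "main_driving_position_511": [
            "main_driving_door_handle_display_area_502",
            "center_control_display_area_503",
            "center_control_armrest_display_area_504",
        ],
        "front_passenger_position_512": [
            "front_passenger_screen_display_area_505",
            "front_passenger_door_handle_display_area_506",
        ],
    }

    def target_areas(seating_position: str) -> list[str]:
        """Display areas matched with a seating position; per the text, when no
        correspondence has been set the main driving position may take all areas."""
        if seating_position in POSITION_TO_AREAS:
            return POSITION_TO_AREAS[seating_position]
        return [a for areas in POSITION_TO_AREAS.values() for a in areas]

    print(target_areas("front_passenger_position_512"))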
In step 440, identity recognition is performed on the biometric information to obtain the identification information corresponding to the occupant.
The driver's biometric information is collected by the information collection system and stored in a database of the target vehicle.
The biometric information is sent to a server, which performs identity recognition on it, obtains and stores the driver's identification information, and matches it against the other stored identification information.
When stored identification information identical to this identification information exists, the corresponding touch menu is displayed; when no identical identification information is stored, no corresponding touch menu exists.
Identity recognition of different biometric information yields different identification information, and the identification information is used to obtain the touch menu having a corresponding relationship with it. A sketch of this matching is given below.
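A minimal sketch of the matching step, under the assumption that recognition reduces to comparing a derived identifier against previously stored identifiers; the real recognition algorithm is not specified in the disclosure, and the hash used here is only a stand-in.

    import hashlib

    STORED_IDENTITIES: dict[str, str] = {}     # identification info -> menu id (or profile)

    def identification_info(biometric_info: bytes) -> str:
        # Placeholder: derive a stable identifier from the biometric sample.
        return hashlib.sha256(biometric_info).hexdigest()[:16]

    def lookup_menu(biometric_info: bytes) -> str | None:
        """Return the stored menu id when the identity is already known, else None."""
        identity = identification_info(biometric_info)
        return STORED_IDENTITIES.get(identity)   # None -> no corresponding touch menu yet

    # First visit: nothing stored, so the caller would fall back to the default menu.
    print(lookup_menu(b"face-template"))                     # None
    STORED_IDENTITIES[identification_info(b"face-template")] = "first_touch_menu"
    print(lookup_menu(b"face-template"))                     # first_touch_menu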
In step 450, the touch menu corresponding to the identification information is displayed in the target display area.
When a touch menu corresponding to the identification information exists, that touch menu is displayed in the target display area.
The touch menu is used to provide service options to the driver or passengers in the target vehicle and includes controls corresponding to various vehicle functions.
The touch menu contains several controls with different functions; triggering a control causes the target vehicle to execute the corresponding vehicle function.
Optionally, the functions corresponding to the controls include, but are not limited to: (1) music playback; (2) route navigation; (3) Bluetooth connection; (4) in-vehicle ambient lighting; (5) seat adjustment; (6) emergency call; and the like.
Because the target display area is part of the intelligent light-transmitting skin inside the target vehicle, when the touch menu is displayed in the target display area, the skin can switch on the in-vehicle ambient light so that the touch menu and its controls are displayed in a luminous manner.
In summary, in the method provided in this application, biometric information is collected and recognized to obtain identification information, seating position information is collected, and the touch menu corresponding to the identification information is displayed in the target display area corresponding to the seating position information. Personalized touch menus can thus be provided based on different identification information, and different touch menus can be displayed in different display areas, providing more diversified services.
In the method provided in this embodiment, the correspondence between seating positions and display areas is obtained by receiving a position matching operation. After the seating position information is collected, the touch menu can be displayed in a suitable display area, which makes the touch menu convenient to operate, improves the safety and efficiency of operating it, improves its display effect and improves the efficiency of human-machine interaction.
In the method provided in this embodiment, the driver fatigue detection system is used to collect and analyze the biometric information and seating position information of the occupants present in the target vehicle, so that the identification information corresponding to each occupant and the target display area for the touch menu can be obtained, improving the display effect of the touch menu and providing more diversified services.
Identity recognition is performed on the biometric information to obtain the corresponding identification information, and when a touch menu corresponding to the identification information exists, it is displayed in the target display area. In some embodiments, the vehicle-mounted terminal includes several display areas; when several pieces of biometric information from different occupants are collected, the different touch menus corresponding to each occupant's identification information can be displayed in different target display areas. As shown in FIG. 6, step 450 above can also be expressed as the following steps.
In step 451, in response to the target display area belonging to the main driving display area, a first touch menu corresponding to the identification information and the main driving display area is displayed in the main driving display area.
The vehicle-mounted terminal includes a main driving display area and a secondary driving display area. The main driving display area includes an instrument panel display area and a main driving door handle display area; the secondary driving display area includes a center control display area, a front passenger screen display area and a front passenger door handle display area.
Schematically, FIG. 7 shows the main driving display area and the secondary driving display area.
The main driving display area is the display area corresponding to the main driver's seat 710 in the target vehicle 700; the secondary driving display area is the display area corresponding to the front passenger seat 720 in the target vehicle 700.
They include the following display areas: an instrument panel display area 711, a main driving door handle display area 712, a center control display area 721, a front passenger screen display area 722, a front passenger door handle display area 723 and a center control armrest display area 724.
Optionally, the main driving display area and the secondary driving display area include both intelligent light-transmitting skin and display screens.
The instrument panel display area and the center control display area are display screens corresponding to the main driver's seat in the target vehicle; the main driving door handle display area and the center control armrest display area are intelligent light-transmitting skin areas corresponding to the main driver's seat; the front passenger screen display area is a display screen corresponding to the front passenger seat; and the front passenger door handle display area is an intelligent light-transmitting skin area corresponding to the front passenger seat.
Optionally, the main driving display area and the secondary driving display area are both intelligent light-transmitting skin inside the target vehicle.
That is, the main driving display area is an intelligent light-transmitting skin area corresponding to the main driver's seat in the target vehicle, and the secondary driving display area is an intelligent light-transmitting skin area corresponding to the front passenger seat.
Optionally, the identification information is obtained by collecting and recognizing the biometric information of the driver in the main driver's seat, and the seating position information corresponding to this identification information is the main driver's seat of the target vehicle.
The main driver's seat corresponds to the main driving display area of the vehicle-mounted terminal, so the first touch menu corresponding to the identification information includes driving assistance function items, which are used to assist the occupant during driving.
The driving assistance function items include, but are not limited to, the following:
(1) Speed assistance: prompt the driver with the maximum and minimum speed limits of the current road section, assisting the driver in accelerating or decelerating the target vehicle;
(2) Braking assistance: prompt the driver with the relative position of the target vehicle and obstacles, alleviating the problems caused by rear-view mirror blind spots and assisting the driver in braking;
(3) Navigation assistance: navigate while the driver drives toward the destination, remind the driver of the driving route in real time, and prompt turns at intersections where a turn is required;
(4) Emergency call: automatically dial an emergency call for help when the driver encounters danger while driving;
(5) Main driver's seat adjustment: when the driver's field of view is limited, the height of the main driver's seat can be adjusted so that the driver can adjust the driving posture, improving safety during driving; or, when the driver stops to rest because of fatigue, the main driver's seat can be adjusted so that the driver can rest lying down or on one side.
It should be noted that, besides the main driving display area, the display area corresponding to the main driver's seat may also be another display area; for example, the secondary driving display area may be set to correspond to the main driver's seat, which is not limited in this embodiment. A sketch of a first touch menu configured with such items is given below.
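By way of example only, a first touch menu populated with the driving assistance items listed above could look like the following sketch; the control and function identifiers are invented for illustration.

    # Hypothetical first touch menu for the main driving display area.
    FIRST_TOUCH_MENU = {
        "speed_assist":   "prompt current max/min speed limit",
        "brake_assist":   "prompt distance to obstacles in mirror blind spots",
        "nav_assist":     "real-time route guidance and turn prompts",
        "emergency_call": "auto-dial for help in an emergency",
        "seat_adjust":    "raise/lower or recline the main driver's seat",
    }

    for control, purpose in FIRST_TOUCH_MENU.items():
        print(f"[main_driving_display_area] {control}: {purpose}")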
In step 452, in response to the target display area belonging to the secondary driving display area, a second touch menu corresponding to the identification information and the secondary driving display area is displayed in the secondary driving display area.
The second touch menu includes entertainment function items, which are used to provide entertainment services to the occupant.
Optionally, the identification information is obtained by collecting and recognizing the biometric information of the passenger in the front passenger seat, and the seating position information corresponding to this identification information is the front passenger seat of the target vehicle.
The front passenger seat corresponds to the secondary driving display area of the vehicle-mounted terminal, so the second touch menu corresponding to the identification information includes entertainment function items, which are used to provide entertainment services to the occupant.
The entertainment function items include, but are not limited to, the following:
(1) Music playback: switch on the in-vehicle audio system to play music for the passenger;
(2) Radio playback: play news, audio programs and other broadcast content for the passenger;
(3) Front passenger seat adjustment: when the passenger needs to rest, adjust the height of the front passenger seat so that the passenger can rest lying down or on one side.
It should be noted that, besides the secondary driving display area, the display area corresponding to the front passenger seat may also be another display area; for example, the main driving display area may be set to correspond to the front passenger seat, which is not limited in this embodiment. A sketch of a second touch menu configured with such items is given below.
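Similarly, an illustrative second touch menu for the secondary driving display area (names invented) might look as follows.

    # Hypothetical second touch menu for the secondary driving display area.
    SECOND_TOUCH_MENU = {
        "music_play":  "play music through the in-vehicle audio system",
        "radio_play":  "play news and other broadcast content",
        "seat_adjust": "recline the front passenger seat for resting",
    }

    for control, purpose in SECOND_TOUCH_MENU.items():
        print(f"[secondary_driving_display_area] {control}: {purpose}")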
In some embodiments, when occupants are present in both the main driver's seat and the front passenger seat of the target vehicle, that is, when a driver and a passenger are present at the same time, the information collection system in the vehicle-mounted terminal collects the biometric information of the occupant in the main driver's seat and of the occupant in the front passenger seat separately, and collects their respective seating position information accordingly. After each occupant's biometric information is recognized, the respective identification information is obtained, the corresponding target display areas are determined based on the respective seating position information, and the corresponding touch menus are displayed in those target display areas.
It should be noted that the operations performed in steps 451 and 452 are parallel; that is, steps 451 and 452 may be performed simultaneously, or sequentially in either order, which is not limited in this embodiment.
Optionally, a first occupant is present in the main driver's seat of the target vehicle and a second occupant is present in the front passenger seat.
After the biometric information and seating position information of the first occupant and the second occupant are collected, identity recognition is performed on the biometric information, yielding the first identification information and first seating position information of the first occupant, and the second identification information and second seating position information of the second occupant.
The display area corresponding to the first seating position information is the main driving display area, and the display area corresponding to the second seating position information is the secondary driving display area. The first touch menu corresponding to the first identification information is displayed in the main driving display area, and the second touch menu corresponding to the second identification information is displayed in the secondary driving display area.
In some embodiments, after identity recognition is performed on the biometric information of an occupant of the target vehicle to obtain the corresponding identification information, it may happen that no touch menu corresponding to that identification information exists.
Optionally, in response to the absence of a touch menu corresponding to the identification information, a default touch menu is displayed in the target display area.
The default touch menu is a preset general-purpose touch menu.
The general-purpose touch menu also contains controls corresponding to different vehicle functions, each control located at a preset position; triggering these controls causes the target vehicle to execute different functions, providing different services to the occupants of the target vehicle. A sketch of this fallback is given below.
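A one-function Python sketch of the fallback just described; the menu contents and identity labels are invented for illustration.

    DEFAULT_TOUCH_MENU = {"music_play": "media", "route_nav": "nav", "bluetooth": "bt"}
    PERSONALIZED_MENUS = {"driver_0001": {"nav_assist": "nav", "seat_adjust": "seat"}}

    def menu_for(identity: str) -> dict:
        # When no touch menu corresponds to the identification information,
        # the preset general-purpose (default) menu is used instead.
        return PERSONALIZED_MENUS.get(identity, DEFAULT_TOUCH_MENU)

    print(menu_for("driver_0001"))       # personalized menu
    print(menu_for("first_time_guest"))  # default touch menu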
In summary, in the method provided in this application, by distinguishing the cases in which occupants are present in the target vehicle, touch menus are displayed in different target display areas when occupants are in the main driver's seat and the front passenger seat, so that personalized touch menus can be provided for different occupants, improving display diversity and user experience.
In the method provided in this embodiment, when several occupants are present in the target vehicle, the touch menus corresponding to the different occupants can be displayed in different target display areas at the same time, so that the on-board product can provide more services simultaneously, satisfy various operation requirements and improve the efficiency of interaction between occupants and the vehicle; while the driver in the main driver's seat is driving, passengers in other seats can operate their own touch menus, improving safety during driving.
In the method provided in this embodiment, the intelligent light-transmitting skin inside the target vehicle is divided into a main driving display area and a secondary driving display area, which are further subdivided: the main driving display area includes an instrument panel display area and a main driving door handle display area, and the secondary driving display area includes a center control display area, a front passenger screen display area and a front passenger door handle display area. Touch menus can thus be displayed in several display areas, improving convenience of operation; the correspondence between display areas and seating positions can also be changed manually, satisfying users' needs to change display areas and improving the utilization of the display areas.
In the method provided in this embodiment, when no touch menu corresponding to the identification information of an occupant of the target vehicle exists, a default touch menu is displayed in the target display area. This provides a preset touch menu scheme so that even a user entering the target vehicle for the first time can use the touch menu and experience its services, improving the efficiency of human-machine interaction.
In some embodiments, the touch menus corresponding to the identification information of the occupants of the target vehicle are obtained by adjusting and saving the default touch menu; that is, before a touch menu corresponding to identification information can be displayed in the target display area, the default touch menu needs to be adjusted. FIG. 8 is a flowchart of a method for adjusting a default touch menu provided in an exemplary embodiment of the application; as shown in FIG. 8, the method includes the following steps.
In step 810, a control adjustment operation for the default touch menu is received to obtain the touch menu.
The control adjustment operation is used for adjusting the positions and triggering modes of the controls, including but not limited to the following:
(1) Receiving a control dragging operation: the control dragging operation is used for adjusting the positions of various controls in the touch menu;
The touch menu comprises a plurality of controls corresponding to various vehicle functions, each with its own icon. The position of a control can be adjusted by long-pressing its icon with a finger and dragging it: after the control is dragged to the target position and released, its position is updated, and the updated position is saved automatically by the touch menu (see the sketch after this list).
(2) Receiving a gesture recognition operation: the gesture recognition operation is used for setting and storing the triggering modes of various controls.
Typically, a control in the touch menu is triggered by tapping or long-pressing the icon of the target control with a finger.
The touch menu provides a gesture recognition area. Receiving a gesture recognition operation means selecting, in the touch menu, the control for which a gesture is to be recorded, and then entering a custom sliding gesture or a patterned gesture in the gesture recognition area; the gesture is recognized and stored automatically by the touch menu. The next time the control needs to be triggered, the custom gesture corresponding to the control can be entered instead of tapping the control's icon, as illustrated in the sketch below.
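The two adjustment operations can be illustrated with a small data model. This is a hedged sketch under assumed names (`Control`, `TouchMenu`, `drag_control`, `register_gesture`); the application does not specify any particular data structure or API.

```python
# Illustrative data model for the control adjustment operations: dragging a
# control to a new slot, and binding a custom gesture as its trigger.
from dataclasses import dataclass, field


@dataclass
class Control:
    name: str
    position: tuple          # (row, column) slot in the menu grid
    trigger: str = "tap"     # "tap", "long_press", or a custom gesture id


@dataclass
class TouchMenu:
    controls: dict = field(default_factory=dict)   # control name -> Control
    gestures: dict = field(default_factory=dict)   # gesture id -> control name

    def add_control(self, control: Control) -> None:
        self.controls[control.name] = control

    def drag_control(self, name: str, target_position: tuple) -> None:
        # Control dragging operation: move the control and keep the new position.
        self.controls[name].position = target_position

    def register_gesture(self, name: str, gesture_id: str) -> None:
        # Gesture recognition operation: bind a custom gesture to the control.
        self.controls[name].trigger = gesture_id
        self.gestures[gesture_id] = name

    def control_for_gesture(self, gesture_id: str):
        # Triggering by gesture: return the bound control name, or None.
        return self.gestures.get(gesture_id)


# Example usage with hypothetical control and gesture names.
menu = TouchMenu()
menu.add_control(Control("seat_heating", position=(1, 0)))
menu.drag_control("seat_heating", (2, 0))             # control dragging operation
menu.register_gesture("seat_heating", "swipe_up")     # gesture recognition operation
assert menu.control_for_gesture("swipe_up") == "seat_heating"
```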
Step 820, receiving a storage operation for the touch menu and the identification information.
The storage operation is used for storing the corresponding relation between the identity identification information and the touch menu, including but not limited to the following:
(1) Receiving a storage operation of the corresponding relation between the touch menu and the identity identification information;
When the organism corresponding to the identity identification information performs a control adjustment operation on the default touch menu, the corresponding relation between the identity identification information and the adjusted touch menu is stored automatically, so that the touch menu corresponding to the identity identification information can be displayed immediately in the target display area the next time the identity identification information is recognized.
(2) Receiving a storage operation of the corresponding relation between the identity identification information and the carrying position information.
When the information acquisition system acquires the biometric information and the mounting position information of the living body at the same time, the mounting position of the living body in the target vehicle corresponding to the identification information can be obtained, and the correspondence between the identification information and the mounting position information can be stored.
Because a corresponding relation exists between the carrying position information and the display areas of the intelligent light-transmitting skin on the target vehicle, the next time the identity identification information is recognized, the corresponding target display area can be determined from the carrying position information and the corresponding touch menu displayed in that target display area, as sketched below.
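One possible way to persist these correspondences is sketched here; the JSON file, its layout, and the function names are illustrative assumptions, not the storage format of the application.

```python
# Hedged sketch: store identity -> adjusted touch menu and identity ->
# carrying position, so both can be restored the next time the identity is
# recognized. File name and layout are assumptions for illustration only.
import json
from pathlib import Path

PROFILE_PATH = Path("touch_menu_profiles.json")  # hypothetical profile store


def save_profile(identification_info, touch_menu, carrying_position):
    profiles = json.loads(PROFILE_PATH.read_text()) if PROFILE_PATH.exists() else {}
    profiles[identification_info] = {
        "touch_menu": touch_menu,
        "carrying_position": carrying_position,
    }
    PROFILE_PATH.write_text(json.dumps(profiles, indent=2))


def load_profile(identification_info):
    # Returns None when no profile was saved for this identity.
    if not PROFILE_PATH.exists():
        return None
    return json.loads(PROFILE_PATH.read_text()).get(identification_info)
```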
The above steps mainly describe how the touch menu corresponding to each piece of identity identification information is personalized by adjusting the default touch menu. In some embodiments, after the corresponding touch menu has been displayed in the target display area based on the identification information, the touch menu can continue to be adjusted, and the updated touch menu is obtained and stored.
It should be noted that the receiving order of the control dragging operation and the gesture recognition operation is arbitrary: the two operations may be received simultaneously, the control dragging operation may be received before the gesture recognition operation, or the gesture recognition operation may be received before the control dragging operation. The same applies to the storage operations, which is not limited in this embodiment.
In summary, according to the method provided by the present application, the default touch menu is adjusted by receiving control adjustment operations, so that personalized touch menus corresponding to different identity identification information are obtained, providing more choices and a better user experience.
According to the method provided by this embodiment, by receiving storage operations, the corresponding relations among the identity identification information, the touch menu and the carrying position information are stored respectively. When the information acquisition system acquires the biological characteristic information of an organism and identity recognition is performed on it, the touch menu corresponding to the identity identification information can be displayed immediately in the target display area, which improves matching efficiency, provides different touch menus, and yields high human-machine interaction efficiency.
According to the method provided by this embodiment, the personalized touch menu can continue to be adjusted to obtain an updated touch menu, so that the display effect of the touch menu can be changed in real time and the touch menu can be displayed in diverse forms, providing users with more choices and satisfying the requirements for diverse and rich display effects.
Fig. 9 is a block diagram of a touch menu display device according to an exemplary embodiment of the present application; as shown in fig. 9, the device includes the following parts.
The collection module 920 is configured to collect biometric information, where the biometric information is collected when an organism is present on a target vehicle on which the vehicle-mounted terminal is mounted, and the vehicle-mounted terminal includes different display areas corresponding to different mounting positions of the target vehicle;
the collection module 920 is further configured to collect mounting position information, where the mounting position information is used to indicate a mounting seat position of the living body on the target vehicle;
a determining module 930, configured to determine a target display area corresponding to the loading position information from different display areas corresponding to the different loading positions;
the identification module 940 is configured to identify the biometric information to obtain identification information corresponding to the organism;
The display module 960 is configured to display, in the target display area, the touch menu corresponding to the identification information, where the touch menu includes controls corresponding to multiple functions of the vehicle.
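Read together, the modules form a simple pipeline. The sketch below wires them up with placeholder callables; it is an illustration of the data flow in fig. 9 under assumed names, not the application's implementation.

```python
# Illustrative data flow for fig. 9: acquisition -> determination ->
# identification -> display. Every callable passed in is a placeholder.

def display_touch_menu_for_occupant(collect_biometrics, collect_position,
                                    position_to_area, identify, resolve_menu,
                                    render):
    biometric_info = collect_biometrics()               # acquisition module 920
    carrying_position = collect_position()              # acquisition module 920
    target_area = position_to_area(carrying_position)   # determining module 930
    identification_info = identify(biometric_info)      # identification module 940
    touch_menu = resolve_menu(identification_info)      # saved or default menu
    render(target_area, touch_menu)                     # display module 960
```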
In an alternative embodiment, as shown in fig. 10, the vehicle-mounted terminal includes a main driving display area and a secondary driving display area;
the display module 960 is further configured to: in response to the target display area belonging to the main driving display area, display, in the main driving display area, a first touch menu corresponding to the identification information and the main driving display area, where the first touch menu includes a driving assistance function item used to assist the driving process of the living body; and, in response to the target display area belonging to the secondary driving display area, display, in the secondary driving display area, a second touch menu corresponding to the identification information and the secondary driving display area, where the second touch menu includes an entertainment function item used to provide entertainment services for the living body.
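A minimal branch over the two area groups might look as follows; the function item names are assumptions used only to distinguish a driving-assistance menu from an entertainment menu.

```python
# Hedged sketch: choose the first or second touch menu depending on whether
# the target display area belongs to the main or the secondary driving
# display area. Item names are illustrative assumptions.

def select_touch_menu(identification_info: str, target_area_group: str) -> dict:
    if target_area_group == "main_driving":
        # First touch menu: driving assistance function items.
        return {"owner": identification_info,
                "items": ["lane_keeping", "adaptive_cruise", "navigation"]}
    if target_area_group == "secondary_driving":
        # Second touch menu: entertainment function items.
        return {"owner": identification_info,
                "items": ["music", "video", "ambient_light", "seat_massage"]}
    raise ValueError("unknown display area group: " + target_area_group)
```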
In an optional embodiment, the display module 960 is further configured to display a default touch menu in the target display area in response to the absence of the touch menu corresponding to the identification information, where the default touch menu is a preset general touch menu.
In an alternative embodiment, before the display module 960, the apparatus further comprises:
the receiving module 950 is configured to receive a control adjustment operation for the default touch menu, to obtain the touch menu, where the control adjustment operation is used to adjust a position and a triggering manner of the control;
the receiving module 950 is further configured to receive a storage operation for the touch menu and the identification information, where the storage operation is used to store a correspondence between the identification information and the touch menu.
In an optional embodiment, the receiving module 950 is further configured to receive a control dragging operation, where the control dragging operation is used to adjust positions of the plurality of controls in the touch menu; and receiving gesture recognition operation, wherein the gesture recognition operation is used for setting and storing the triggering modes of the plurality of controls.
In an optional embodiment, the receiving module 950 is further configured to receive a storage operation for a correspondence between the touch menu and the identification information; and receiving a storage operation of the corresponding relation between the identity identification information and the carrying position information.
In an alternative embodiment, before the acquisition module 920, the apparatus further includes:
a matching module 910, configured to receive a position matching operation, where the position matching operation is used to match the mounting position with the display area to obtain a correspondence between the mounting position and the display area.
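The position matching operation amounts to maintaining an editable mapping from carrying positions to display areas; a minimal sketch under assumed names follows, with default entries that are assumptions rather than values from the application.

```python
# Hedged sketch of the matching module 910: an editable correspondence between
# carrying positions and display areas.

position_to_display_area = {
    "main_driving_seat": "dashboard_display_area",
    "secondary_driving_seat": "co_driver_display_area",
}


def rebind_display_area(carrying_position: str, display_area: str) -> None:
    # Position matching operation: overwrite the stored correspondence.
    position_to_display_area[carrying_position] = display_area


def display_area_for(carrying_position: str) -> str:
    return position_to_display_area[carrying_position]
```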
In an alternative embodiment, the main driving display area comprises a dashboard display area and a main driving door handle display area; the secondary driving display area comprises a central control display area, a secondary driving machine display area and a secondary driving door handle display area.
In an optional embodiment, the vehicle-mounted terminal comprises an information acquisition system, where the information acquisition system is used to acquire the biometric information and the carrying position information; the main driving display area is an intelligent light-transmitting skin area corresponding to the main driving seat in the target vehicle; and the secondary driving display area is an intelligent light-transmitting skin area corresponding to the secondary driving seat in the target vehicle.
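The subdivision of the two areas in this embodiment can be written down as a small configuration; the identifiers below merely mirror the prose above and are not defined by the application.

```python
# Illustrative configuration mirroring the subdivision described above: each
# group lists its sub display areas on the intelligent light-transmitting skin.

DISPLAY_AREAS = {
    "main_driving": ["dashboard_display_area", "main_door_handle_display_area"],
    "secondary_driving": ["central_control_display_area",
                          "co_driver_display_area",
                          "secondary_door_handle_display_area"],
}


def group_of(display_area: str) -> str:
    # Returns "main_driving" or "secondary_driving" for a known sub area.
    for group, areas in DISPLAY_AREAS.items():
        if display_area in areas:
            return group
    raise KeyError(display_area)
```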
In summary, the embodiment of the present application provides a touch menu display device. Biological characteristic information of an organism is collected and identified to obtain identity identification information, the corresponding touch menu is obtained based on the identity identification information, and the touch menu is displayed in the target display area corresponding to the carrying position based on the carrying position information of the organism. This improves safety during driving, allows personalized touch menus to be provided based on different identity identification information and different touch menus to be displayed at different positions, offers more diverse services, and improves the interaction efficiency between occupants and the vehicle.
It should be noted that the touch menu display device provided in the above embodiment is described using the division into the above functional modules merely as an example; in practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the touch menu display device provided in the above embodiment and the embodiments of the touch menu display method belong to the same concept; the detailed implementation process is described in the method embodiments and is not repeated here.
Fig. 11 shows a block diagram of a vehicle-mounted terminal 1100 provided by an exemplary embodiment of the present application. The vehicle-mounted terminal 1100 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The vehicle-mounted terminal 1100 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, the vehicle-mounted terminal 1100 includes a processor 1101 and a memory 1102.
The processor 1101 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 1101 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 1101 may also include a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 1101 may integrate a GPU (Graphics Processing Unit) responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1101 may also include an AI processor for handling computing operations related to machine learning.
The memory 1102 may include one or more computer-readable storage media, which may be non-transitory. The memory 1102 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1102 stores at least one instruction, which is executed by the processor 1101 to implement the touch menu display method provided by the method embodiments of the present application.
In some embodiments, the vehicle-mounted terminal 1100 also includes other components. Those skilled in the art will appreciate that the structure shown in fig. 11 does not limit the terminal 1100, which may include more or fewer components than illustrated, combine certain components, or adopt a different arrangement of components.
The embodiment of the present application also provides a vehicle-mounted terminal, which can be implemented as the terminal or the server shown in fig. 3. The vehicle-mounted terminal comprises a processor and a memory; at least one instruction, at least one program, a code set or an instruction set is stored in the memory, and the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor to implement the touch menu display method provided by each of the above method embodiments.
The embodiment of the present application also provides a computer-readable storage medium, on which at least one instruction, at least one program, a code set or an instruction set is stored; the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by a processor to implement the touch menu display method provided by each of the above method embodiments.
Embodiments of the present application also provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the vehicle-mounted terminal reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the vehicle-mounted terminal executes the touch menu display method according to any one of the above embodiments.
Optionally, the computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a solid-state drive (SSD), an optical disk, or the like. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM), among others. The foregoing embodiment numbers of the present application are merely for description and do not represent the relative merits of the embodiments.
It will be appreciated by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.
The foregoing description covers only preferred embodiments of the invention and is not intended to limit the invention; any modifications, equivalent replacements, improvements, and the like made within the spirit and scope of the invention shall fall within its protection scope.

Claims (13)

1. A method for displaying a touch menu, the method being performed by a vehicle-mounted terminal, the method comprising:
collecting biological characteristic information, wherein the biological characteristic information is information collected under the condition that an organism exists on a target vehicle on which the vehicle-mounted terminal is mounted, and the vehicle-mounted terminal comprises different display areas corresponding to different mounting positions of the target vehicle;
collecting mounting position information for indicating a mounting seat position of the living body on the target vehicle;
determining a target display area corresponding to the carrying position information from different display areas corresponding to the different carrying positions;
carrying out identity recognition on the biological characteristic information to obtain identity identification information corresponding to the organism;
and displaying the touch menu corresponding to the identity identification information in the target display area, wherein the touch menu comprises controls corresponding to various vehicle functions.
2. The method of claim 1, wherein the vehicle-mounted terminal comprises a primary driving display area and a secondary driving display area;
the displaying the touch menu corresponding to the identification information in the target display area includes:
responding to the target display area belonging to the main driving display area, and displaying a first touch menu corresponding to the identity identification information and the main driving display area in the main driving display area, wherein the first touch menu comprises driving auxiliary function items, and the driving auxiliary function items are used for assisting the driving process of the organism;
and responding to the target display area belonging to the auxiliary driving display area, and displaying a second touch menu corresponding to the identification information and the auxiliary driving display area in the auxiliary driving display area, wherein the second touch menu comprises entertainment function items, and the entertainment function items are used for providing entertainment services for the living body.
3. The method according to claim 2, wherein after the identifying the biometric information to obtain the identification information corresponding to the organism, further comprises:
displaying a default touch menu in the target display area in response to an absence of a touch menu corresponding to the identity identification information, wherein the default touch menu is a preset general touch menu.
4. A method according to any one of claims 1 to 3, wherein before the step of displaying the touch menu corresponding to the identification information in the target display area, the method further comprises:
receiving a control adjustment operation aiming at the default touch menu to obtain the touch menu, wherein the control adjustment operation is used for adjusting the position and the triggering mode of the control;
and receiving a storage operation of the touch menu and the identity identification information, wherein the storage operation is used for storing the corresponding relation between the identity identification information and the touch menu.
5. The method of claim 4, wherein the receiving a control adjustment operation for the default touch menu, obtaining the touch menu, comprises:
receiving a control dragging operation to obtain the touch menu, wherein the control dragging operation is used for adjusting the positions of the plurality of controls in the touch menu; or
and receiving gesture recognition operation to obtain the touch menu, wherein the gesture recognition operation is used for setting and storing the triggering modes of the plurality of controls.
6. The method of claim 4, wherein the receiving a storage operation for the touch menu and the identification information comprises:
receiving a storage operation of the corresponding relation between the touch menu and the identity identification information;
and receiving a storage operation of the corresponding relation between the identity identification information and the carrying position information.
7. A method according to any one of claims 1 to 3, further comprising, prior to the collecting of the mounting position information:
and receiving a position matching operation, wherein the position matching operation is used for matching the carrying position with the display area to obtain the corresponding relation between the carrying position and the display area.
8. The method of claim 2, wherein:
the main driving display area comprises an instrument panel display area, a main driving door handle display area, a central control display area and a central control handrail display area;
The secondary driving display area comprises a secondary driving machine display area and a secondary driving door handle display area.
9. The method according to claim 8, wherein the vehicle-mounted terminal includes an information acquisition system for acquiring the biometric information and the mounting position information;
the instrument panel display area and the central control display area are display screens corresponding to a main driving seat in the target vehicle;
the main driving door handle display area and the central control handrail display area are intelligent light-transmitting skin areas corresponding to a main driving seat in the target vehicle;
the secondary driving machine display area refers to a display screen corresponding to a secondary driving seat in the target vehicle;
the secondary driving door handle display area is an intelligent light-transmitting skin area corresponding to a secondary driving seat in the target vehicle.
10. A display device for a touch menu, the device comprising:
an acquisition module, configured to acquire biological characteristic information, wherein the biological characteristic information is acquired under the condition that an organism exists on a target vehicle on which the vehicle-mounted terminal is mounted, and the vehicle-mounted terminal comprises different display areas corresponding to different mounting positions of the target vehicle;
the acquisition module is further configured to acquire carrying position information, wherein the carrying position information is used for indicating the carrying seat position of the organism on the target vehicle;
a determining module configured to determine a target display area corresponding to the mounting position information from among different display areas corresponding to the different mounting positions;
the identification module is used for carrying out identity identification on the biological characteristic information to obtain the identity identification information corresponding to the organism;
and the display module is used for displaying the touch menu corresponding to the identity identification information in the target display area, wherein the touch menu comprises controls corresponding to various vehicle functions.
11. A vehicle-mounted terminal, characterized in that the vehicle-mounted terminal comprises a processor and a memory, wherein at least one section of program is stored in the memory, and the at least one section of program is loaded and executed by the processor to realize the touch menu display method according to any one of claims 1 to 9.
12. A computer-readable storage medium, wherein at least one program is stored in the storage medium, and the at least one program is loaded and executed by a processor to implement the method for displaying a touch menu according to any one of claims 1 to 9.
13. A computer program product comprising a computer program which, when executed by a processor, implements a method of displaying a touch menu as claimed in any one of claims 1 to 9.
CN202310084577.4A 2023-01-17 2023-01-17 Touch menu display method, device, terminal, medium and program product Pending CN116061854A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310084577.4A CN116061854A (en) 2023-01-17 2023-01-17 Touch menu display method, device, terminal, medium and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310084577.4A CN116061854A (en) 2023-01-17 2023-01-17 Touch menu display method, device, terminal, medium and program product

Publications (1)

Publication Number Publication Date
CN116061854A true CN116061854A (en) 2023-05-05

Family

ID=86181712

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310084577.4A Pending CN116061854A (en) 2023-01-17 2023-01-17 Touch menu display method, device, terminal, medium and program product

Country Status (1)

Country Link
CN (1) CN116061854A (en)

Similar Documents

Publication Publication Date Title
US11249544B2 (en) Methods and systems for using artificial intelligence to evaluate, correct, and monitor user attentiveness
CN109416733B (en) Portable personalization
US10317900B2 (en) Controlling autonomous-vehicle functions and output based on occupant position and attention
CN109552340B (en) Gesture and expression control for vehicles
US10894476B2 (en) Display system in a vehicle and a method for control thereof
JP6761967B2 (en) Driving support method and driving support device, automatic driving control device, vehicle, program using it
US9977593B2 (en) Gesture recognition for on-board display
US20180345909A1 (en) Vehicle with wearable for identifying one or more vehicle occupants
EP3072444B1 (en) Display apparatus and vehicle
US20220203996A1 (en) Systems and methods to limit operating a mobile phone while driving
US11364926B2 (en) Method for operating a motor vehicle system of a motor vehicle depending on the driving situation, personalization device, and motor vehicle
US20170351990A1 (en) Systems and methods for implementing relative tags in connection with use of autonomous vehicles
CN104691449A (en) Vehicle control apparatus and method thereof
CN114746311B (en) Management system and method for identification and biological monitoring in a vehicle
EP4042322A1 (en) Methods and systems for using artificial intelligence to evaluate, correct, and monitor user attentiveness
US11572039B2 (en) Confirmed automated access to portions of vehicles
CN108351886A (en) The system for determining vehicle driver common interest
CN108216087B (en) Method and apparatus for identifying a user using identification of grip style of a door handle
CN113895364A (en) Vehicle-mounted information entertainment system based on double asynchronous displays
KR102286569B1 (en) Smart car see-through display control system and method thereof
CN109562740A (en) For remotely accessing the fingerprint device and method of the individual function profile of vehicle
US20230347903A1 (en) Sensor-based in-vehicle dynamic driver gaze tracking
CN116061854A (en) Touch menu display method, device, terminal, medium and program product
US20230211790A1 (en) Multi-function input devices for vehicles
CN115805956A (en) Danger prompting method and device, vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination