CN114610208A - Display control method and device - Google Patents

Display control method and device

Info

Publication number
CN114610208A
Authority
CN
China
Prior art keywords
function item
user
function
information
target
Prior art date
Legal status: Pending (assumed; not a legal conclusion)
Application number
CN202210277569.7A
Other languages
Chinese (zh)
Inventor
张宇 (Zhang Yu)
Current Assignee (the listed assignee may be inaccurate)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN202210277569.7A
Publication of CN114610208A

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/03547 Touch pads, in which fingers can move on a surface
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • B PERFORMING OPERATIONS; TRANSPORTING > B60 VEHICLES IN GENERAL > B60K35/00 Arrangement of adaptations of instruments

Abstract

The application provides a display control method and a display control apparatus, wherein the method comprises the following steps: determining usage scenario information of an electronic device; determining, based on the relative spatial position of a user's finger with respect to a menu interface presented by a display unit, a target function item that the user needs to operate from the function items of the menu interface; determining, according to the usage scenario information, at least one target sub-function item that the user needs to operate from a sub-function item set associated with the target function item, wherein the sub-function item set associated with the target function item comprises the sub-function items in all levels of menus associated with the target function item; and outputting a function selection area containing the at least one target sub-function item. The solution of the present application improves the convenience of operating an electronic device and reduces erroneous operation.

Description

Display control method and device
Technical Field
The present application relates to the field of interactive control technologies, and in particular, to a display control method and apparatus.
Background
Electronic devices are increasingly varied, and their application scenarios are increasingly broad.
In many cases, operating an electronic device is complicated by its usage scenario. For example, with some vehicle-mounted devices, a driver often needs to perform operations such as map navigation or turning on a fog light while driving; however, when the vehicle is traveling on a bumpy road or in bad weather, the driver must open multiple levels of menus in sequence to find the function to be operated, which is difficult and prone to erroneous operation. Therefore, how to improve the operating convenience of an electronic device so as to reduce erroneous operation is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The application provides a display control method and device.
The display control method is applied to electronic equipment and comprises the following steps:
determining the use scene information of the electronic equipment;
determining a target function item required to be operated by a user from the function items of the menu interface based on the relative spatial position of the finger of the user relative to the menu interface presented by the display unit;
determining at least one target sub-function item with operation requirements of a user from a sub-function item set associated with the target function item according to the use scene information, wherein the sub-function item set associated with the target function item comprises sub-function items in menus at different levels associated with the target function item;
outputting a function selection area containing the at least one target sub-function item.
In one possible implementation manner, the determining, based on the relative spatial position of the finger of the user with respect to the menu interface presented by the display unit, a target function item that is required to be operated by the user from the function items of the menu interface includes:
determining a relative spatial position of a user's finger with respect to a menu interface presented by a display unit;
and determining that the relative spatial position meets a set condition, and determining, based on the relative spatial position, a target function item to be operated by the user from the function items of the menu interface, wherein the relative spatial position meeting the set condition indicates that the user has an operation requirement on the menu interface.
In another possible implementation manner, the determining, from the function items of the menu interface, a target function item to be operated by the user based on the relative spatial position includes:
determining a target function item to be operated by the user from the function items of the menu interface based on the relative spatial position and auxiliary decision information, wherein the auxiliary decision information comprises user operation habit information, and the user operation habit information at least comprises information on function items historically selected by the user of the electronic device in different usage scenarios.
In another possible implementation manner, the determining, from the set of sub-function items associated with the target function item according to the usage scenario information, at least one target sub-function item for which an operation requirement exists by a user includes:
determining, according to the user operation habit information, at least one target sub-function item that the user may operate in the usage scenario corresponding to the usage scenario information from the sub-function item set associated with the target function item, wherein the user operation habit information at least comprises information on function items historically selected by the user of the electronic device in different usage scenarios.
In another possible implementation manner, after the outputting the function selection area containing the at least one target sub-function item, the method further includes:
if it is determined, based on the user's operation on the function selection area, that the target sub-function items in the function selection area do not include the sub-function item the user needs to select, determining the function item actually selected by the user in the usage scenario corresponding to the usage scenario information;
and updating the stored user operation habit information based on the function item actually selected by the user.
In another possible implementation manner, the determining the usage scenario information where the electronic device is located includes:
determining road environment information and/or weather environment information where the electronic equipment is located.
In yet another possible implementation manner, the determining the road environment information and/or the weather environment information where the electronic device is located includes:
acquiring acceleration information sensed by a gravity sensor;
obtaining the vehicle running state of the vehicle where the electronic equipment is located;
acquiring geographic environment information and weather condition information of the electronic equipment;
determining road environment information where the electronic equipment is located by combining the acceleration information, the vehicle running state and the geographic environment information;
and determining weather environment information where the electronic equipment is located by combining the vehicle running state and the weather condition information.
In yet another possible implementation manner, before the outputting the function selection area containing the at least one target sub-function item, the method further includes:
acquiring acceleration information sensed by a gravity sensor;
determining a layout mode of at least one target sub-function item in combination with the acceleration information, wherein the layout mode is a layout mode suitable for a user to operate the target sub-function item;
and generating a function selection area containing at least one target sub-function item according to the layout mode of the target sub-function item.
In yet another possible implementation manner, the determining, from the function items of the menu interface, a target function item that a user needs to operate based on a relative spatial position of a finger of the user with respect to the menu interface presented by the display unit includes:
determining at least one candidate function item in a user operation range from the function items of the menu interface based on a first relative spatial position of the finger of the user relative to the menu interface presented by the display unit;
switching the display effect of the at least one candidate function item from a first display effect to a second display effect, wherein the size of the candidate function item displayed by the second display effect is larger than the size of the candidate function item displayed by the first display effect;
and determining a target function item required to be operated by the user from at least one candidate function item displayed by adopting a second display effect based on a second relative space position of the finger of the user relative to the menu interface displayed by the display unit.
The display control device is applied to electronic equipment and comprises:
the scene determining unit is used for determining the use scene information of the electronic equipment;
the first determination unit is used for determining a target function item required to be operated by a user from the function items of the menu interface based on the relative spatial position of the finger of the user relative to the menu interface presented by the display unit;
a second determining unit, configured to determine, according to the usage scenario information, at least one target sub-function item for which an operation requirement exists by a user from a sub-function item set associated with the target function item, where the sub-function item set associated with the target function item includes sub-function items in menus at different levels associated with the target function item;
a selection area output unit for outputting a function selection area including the at least one target sub-function item.
As can be seen from the above, in the embodiments of the present application, after the target function item that the user needs to operate is determined from the function items of the menu interface, the target sub-function items that the user needs to operate are determined from the menus at each level associated with the target function item in combination with the usage scenario information of the electronic device, and a function selection area containing these sub-function items is output directly. The user can therefore select the required sub-function item directly in the function selection area, without querying each level of menu of the target function item in turn. This improves the convenience of operating the electronic device and reduces erroneous operation caused by inconvenient operation when the electronic device is in a complex usage environment.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are merely embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart illustrating a display control method according to an embodiment of the present application;
fig. 2 is a schematic flow chart illustrating a display control method provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of the interfaces at each level that a user must operate to select a function item in a conventional manner;
FIG. 4 is a schematic interface diagram illustrating operations required to select a function item in an embodiment of the present application;
fig. 5 is a schematic flowchart illustrating a display control method provided in an embodiment of the present application;
fig. 6 is a schematic flowchart illustrating a display control method provided in an embodiment of the present application in an application scenario;
fig. 7 is a schematic diagram illustrating a structural configuration of a display control apparatus according to an embodiment of the present application;
fig. 8 shows a schematic diagram of a component architecture of an electronic device according to an embodiment of the present application.
Detailed Description
The display control method and apparatus of the present application can reduce the complexity of operating an electronic device and reduce erroneous operation. In particular, for electronic devices that are used in relatively complex operating environments and are difficult to control accurately, for example a vehicle-mounted terminal or a mobile phone in a moving vehicle, the solution of the present application can effectively improve operating convenience and reduce erroneous operation caused by the operating environment.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without inventive step, are within the scope of the present disclosure.
Referring to fig. 1, which shows a flowchart of a display control method provided in an embodiment of the present application. The method of this embodiment is applied to an electronic device, which may be a vehicle-mounted terminal, a mobile phone, or a notebook computer, and may include:
s101, determining the use scene information of the electronic equipment.
The usage scenario information represents the scenario in which the electronic device is currently being used.
For example, the usage scenario information may include information in one or more of several dimensions, such as the geographic location of the electronic device, the geographic environment (e.g., mountains, plains, or a rugged mountain road), weather conditions, and motion state.
The specific information type included in the usage scenario information may be set as needed, and is not limited herein.
There are various ways to determine the usage scenario information. For example, the motion state information of the electronic device (such as speed, direction, and acceleration) and its external environmental conditions (such as humidity, or whether there is rain or snow) can be obtained through the device's own sensors. As another example, the geographic location, geographic environment, and weather condition information of the electronic device can be obtained from a server via a communication network.
It is understood that, in practical applications, the usage scenario information may be determined in one of these ways or in a combination of them, depending on the type of usage scenario information to be obtained. For example, while the traveling speed and traveling condition of the electronic device are determined with the aid of the gravity sensor, usage scenario information such as the traveling state of the user or the vehicle carrying the electronic device, the road section being traveled, and the weather conditions can be determined in combination with weather condition information obtained from the server.
It can be understood that the usage scenario information of the electronic device can reflect the user's current requirements for operating the device.
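As an illustration of S101, the following sketch shows one way such usage scenario information might be assembled. It is a minimal sketch only: the `sensors` and `server` interfaces and all field names are hypothetical stand-ins introduced for illustration, not components defined by this application.

```python
from dataclasses import dataclass

@dataclass
class UsageScenario:
    speed_kmh: float        # motion state sensed on the device itself
    acceleration: tuple     # (x, y, z) from the gravity/acceleration sensor
    humidity: float         # external environmental condition from a sensor
    geo_environment: str    # e.g. "mountain_road", obtained from a server
    weather: str            # e.g. "snow", obtained from a server

def determine_usage_scenario(sensors, server, location) -> UsageScenario:
    """Combine on-device sensing with server-side data, per S101."""
    return UsageScenario(
        speed_kmh=sensors.speed(),
        acceleration=sensors.acceleration(),
        humidity=sensors.humidity(),
        geo_environment=server.geo_environment(location),
        weather=server.weather(location),
    )
```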
S102, determining a target function item that the user needs to operate from the function items of the menu interface based on the relative spatial position of the user's finger with respect to the menu interface presented by the display unit.
It is understood that, in the case that the display unit of the electronic device presents a menu interface, the user may select a function item in the menu interface as desired to select a desired operation function through the menus of respective levels associated with the function item.
The function items in the menu interface are a plurality of selectable operation items presented in the menu interface, and an operation function can be triggered and started or an operation interface can be presented through the operation items.
For example, a "common" function item, a "setting" function item, a "light" function item, and the like may be presented in the menu interface, where selecting the "common" function item may trigger a secondary menu interface showing some common functions, selecting the "setting" function item may show an operation interface related to settings of the electronic device, and selecting the "light" function item may trigger the electronic device to turn on a flashlight or a controlled light, and the like.
The target function item that the user needs to operate may be a target function item selected by the user, determined based on the relative spatial position of the user's finger and the menu interface; alternatively, before the user's finger touches the display unit, the target function item may be predicted based on the relative spatial position of the finger with respect to the menu interface. The process of determining the target function item is described below with reference to the embodiments, and is not detailed here.
S103, determining, according to the usage scenario information, at least one target sub-function item that the user needs to operate from the sub-function item set associated with the target function item.
The sub-function item set associated with the target function item includes the sub-function items in the menus at different levels associated with the target function item.
The target function item may include one or more levels of menus, and each level of menu may include at least one function item.
For example, if the target function item is the "commonly used" function item, the menu of the "commonly used" function item may be a first-level menu, and the first-level menu may include: a windshield function, light control, and the like. The windshield function may in turn include a further level of menu, which is the second-level menu of the "commonly used" function item and may include: front windshield heating (on/off), rear windshield heating, fog lights, and the like. Based on this, the sub-function items of the target function item may include: the windshield function, light control, front windshield heating, rear windshield heating, fog lights, and the like.
It can be understood that, since the target function item may include a plurality of sub-function items located on different levels of menus, if a user needs to select a certain sub-function item, the user may need to sequentially open the menus of different levels to select the corresponding sub-function item, which is relatively complicated to operate.
Considering that the usage scenario information represents the current usage scenario of the electronic device, and that the operation items a user is likely to operate differ between usage scenarios, after the target function item is determined, the sub-function items that the user needs to operate can be screened from the menus at each level under the target function item in combination with the usage scenario information of the electronic device.
For the convenience of distinction, the sub-function item in the sub-function item set of the target function item, in which the user has an operation requirement, is referred to as a target sub-function item.
In a possible implementation manner, when the usage scenario of the electronic device has been determined, and considering that different users have different usage habits, in order to more accurately screen out the target sub-function items the user needs to operate, the application may further determine, according to user operation habit information, at least one target sub-function item that the user may operate in the usage scenario corresponding to the usage scenario information from the sub-function item set associated with the target function item.
The user operation habit information at least includes information on function items historically selected by the user of the electronic device in different usage scenarios. For example, the user operation habit information may record one or more of: the function items selected by the user of the electronic device in different usage scenarios, the number of times each function item was selected, and the selection path (e.g., which function items were selected in sequence before the final selection), and the like.
It can be understood that, in order to more comprehensively reflect the function items that may be selected by the user in a certain usage scenario, or in order to make up for the insufficiency of the historical operating habit data of the user of the electronic device, the user operating habit information may further include: and selecting the information of the function item by other users except the user of the electronic equipment under different use scenes.
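To make the relationship between a function item, its multi-level menus, and the screened target sub-function items concrete, the following sketch flattens a menu tree into the sub-function item set and filters it by recorded habits. The data structures and the habit-count mapping are assumptions made for illustration; the application does not prescribe a representation.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class FunctionItem:
    name: str
    children: List["FunctionItem"] = field(default_factory=list)

def sub_function_set(item: FunctionItem) -> List[FunctionItem]:
    """Collect the sub-function items from every menu level under `item` (S103)."""
    found = []
    for child in item.children:
        found.append(child)
        found.extend(sub_function_set(child))   # recurse into deeper menus
    return found

def target_sub_functions(item: FunctionItem, scenario_key: str,
                         habits: Dict[Tuple[str, str], int]) -> List[FunctionItem]:
    """Keep only the sub-function items the user has historically selected
    in the scenario identified by `scenario_key`."""
    return [sub for sub in sub_function_set(item)
            if habits.get((scenario_key, sub.name), 0) > 0]
```

With the "commonly used" example above, `sub_function_set` would return the windshield function, light control, front windshield heating, rear windshield heating, fog lights, and so on.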
S104, outputting a function selection area containing the at least one target sub-function item.
It can be understood that, by separately outputting the target sub-function items screened from the sub-function item set of the target function item, i.e. those the user has an operation requirement for, the user can directly select a target sub-function item without searching for the required sub-function item level by level. This reduces operation complexity and also reduces the risk of erroneous operation caused by searching for sub-function items level by level in a complex operating environment.
The function selection area may be presented in the form of a window or a menu, which is not limited in this respect.
It is understood that if the carrier on which the electronic device is mounted is moving or bumping, the user cannot operate the display unit reliably and accurately. For example, the vehicle-mounted terminal may not be operated stably and accurately while the vehicle is bumping; similarly, a user holding a phone may not be able to operate it accurately in a bumpy vehicle or while walking.
Based on this, in a possible implementation manner, in order to enable a user to conveniently and accurately select a desired target sub-function item in the function selection area, the application may further display each target sub-function item in the function selection area by using a set display effect.
The set display effect is different from the first display effect with which the function items are presented on the current menu interface: a function item presented with the set display effect has a larger size than a function item presented with the first display effect.
The set display effect may further differ in color from the first display effect. For example, the color of a function item displayed with the set display effect may be more conspicuous and eye-catching than that of a function item displayed with the first display effect.
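A minimal sketch of applying the set display effect follows; the scale factor and color are arbitrary illustrations, since the application only requires that the items be larger and more conspicuous, and the `size`/`color` attributes are assumed properties of a renderable item.

```python
def apply_set_display_effect(items, base_size: float) -> None:
    """Render target sub-function items larger and more conspicuous than
    items shown with the first display effect (values are illustrative)."""
    for item in items:
        item.size = base_size * 1.5    # larger than the menu-interface size
        item.color = "#FF8800"         # more eye-catching than the default
```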
In yet another possible implementation manner, the present application may also obtain acceleration information of the electronic device, for example, obtain acceleration information sensed by a gravity sensor in the electronic device. On the basis, the layout mode of the at least one target sub-function item can be determined by combining the acceleration information, and then, a function selection area containing the at least one target sub-function item is generated according to the layout mode of the target sub-function item.
The layout mode determined in combination with the acceleration information is one suitable for the user to operate the target sub-function items.
It is understood that the acceleration information of the electronic device may reflect the direction and magnitude of the shaking or swinging of the electronic device. In order to accommodate the shaking or swinging of the electronic device so that the user can select a certain target sub-function item from the function selection area more conveniently and accurately, the present application may determine a layout mode suitable for operating the at least one target sub-function item in combination with acceleration information of the electronic device.
For example, if the acceleration information indicates that the electronic device is moving left and right, so that the user's finger may shake left and right, it may be determined that arranging the at least one target sub-function item from top to bottom is more suitable for the user to operate; in that case, even if the user's finger shakes left and right while selecting a target sub-function item, the selection of other target sub-function items is not triggered.
Of course, in practical applications, the layout mode of the at least one target sub-function item may also be determined comprehensively by combining the acceleration information of the electronic device and the user interface usage habit information. The user interface use habit information may include information of interface presentation or layout mode selected or set by the user in different acceleration modes.
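The following sketch illustrates one plausible rule for choosing the layout; the axis convention and the decision rule are assumptions, since the application leaves the exact mapping from acceleration to layout open.

```python
def choose_layout(acceleration) -> str:
    """Pick an arrangement so that finger shake along the dominant axis does
    not slide onto a neighbouring item (x = left/right, y = up/down assumed)."""
    ax, ay, _ = acceleration
    # Left/right shake: stack items top to bottom; up/down shake: use a row.
    return "vertical_column" if abs(ax) >= abs(ay) else "horizontal_row"
```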
As can be seen from the above, after the target function item that the user needs to operate is determined from the function items in the menu interface, the target sub-function items that the user needs can be determined from the menus at different levels associated with the target function item in combination with the usage scenario information of the electronic device, and a function selection area containing these sub-function items is output directly. The user can then select the required sub-function item directly in the function selection area without querying the menus at each level of the target function item in turn, which improves the convenience of operating the electronic device and reduces erroneous operation caused by inconvenient operation when the electronic device is in a complex usage environment.
In the embodiment of the present application, there are many possibilities for determining the target function item required to be operated by the user based on the relative spatial position of the finger of the user with respect to the menu interface:
in one possible case, upon detecting that the user's finger contacts the display unit, the target function item selected by the user is determined based on the relative position of the user's finger with respect to the menu interface in the display unit.
It can be understood that, in some complicated usage scenarios of the electronic device, it is difficult for the user's finger to click and select a function item on the display unit accurately. For example, with a vehicle-mounted terminal, when the vehicle is bumping, the user may not be able to select the target function item quickly and accurately.
To reduce the difficulty of selecting function items in a complex usage environment, in yet another possible implementation manner, after the relative spatial position of the user's finger with respect to the menu interface presented by the display unit is determined, if the relative spatial position meets a set condition, the application may further determine, based on the relative spatial position, a target function item to be operated by the user from the function items of the menu interface. The relative spatial position meeting the set condition indicates that the user has an operation requirement on the menu interface.
That is to say, when it is determined, based on the relative spatial position of the user's finger with respect to the menu interface, that the user has a requirement to operate the menu interface, the target function item to be operated can be determined from the relative spatial position without the finger touching the menu interface, thereby reducing the complexity of selecting a function item by clicking.
Furthermore, in order to more accurately determine the target function item to be operated by the user, when the relative spatial position meets the set condition, the target function item can be comprehensively determined by considering information such as user habits on the basis of combining the relative spatial position. This is illustrated below with reference to examples.
As shown in fig. 2, which shows another schematic flow chart of a display control method according to the present application, the method of the present embodiment may include:
s201, determining the use scene information of the electronic equipment.
For this step, reference may be made to the related description of the previous embodiment; it is not repeated here.
S202, determining the relative spatial position of the finger of the user relative to the menu interface presented by the display unit.
The relative spatial position may be determined by a sensor of the electronic device or a sensor connected to the electronic device.
For example, the image of the finger of the user may be captured by a camera on the electronic device, or the relative position information of the finger of the user may be sensed by an infrared sensor associated with the electronic device, and on this basis, the relative spatial position of the finger with respect to the menu interface may be determined by combining one or both of the image of the finger and the relative position information.
Of course, this is merely an example, and in practical applications, the relative spatial position of the finger with respect to the menu interface output by the electronic device may also be determined in other ways, which is not limited herein.
The relative spatial position reflects the spatial position of the finger relative to the menu interface; from it, the position of the projection point of the finger on the menu interface can be derived, as well as information such as the vertical distance between the finger and the menu interface.
S203, determining that the relative spatial position meets the set condition, and determining a target function item to be operated by the user from the function items of the menu interface based on the relative spatial position and the auxiliary decision information.
The relative spatial position meeting the set condition indicates that the user has an operation requirement on the menu interface.
For example, the setting condition may be that a vertical distance between the finger of the user and the menu interface is less than a set first threshold, and a distance between a projection point position of the finger of the user on the menu interface and at least one function item in the menu interface is less than a set second threshold. Wherein, the first threshold value and the second threshold value can be set according to requirements.
It can be understood that when a user needs to operate a function item in a menu interface output by an electronic device, a finger of the user necessarily approaches the menu interface and is located in a spatial range corresponding to one or more function items, and therefore, whether the user has an operation requirement for operating the menu interface can be determined by analyzing whether the relative spatial position meets a set condition.
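The set condition just described can be sketched as follows, assuming the menu interface lies in the z = 0 plane of the sensed coordinate system; `d_max` and `r_max` play the roles of the first and second thresholds and are set as required.

```python
def meets_set_condition(finger_xyz, items, d_max: float, r_max: float) -> bool:
    """True when the user is judged to have an operation requirement: the
    vertical distance is below the first threshold AND the projection point
    is within the second threshold of at least one function item."""
    x, y, z = finger_xyz
    if abs(z) >= d_max:                  # first threshold: vertical distance
        return False
    for item in items:                   # second threshold: projected proximity
        ix, iy = item.center             # item centers are assumed 2D points
        if ((x - ix) ** 2 + (y - iy) ** 2) ** 0.5 < r_max:
            return True
    return False
```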
The auxiliary decision information at least includes the usage scenario information of the electronic device.
It can be understood that the user's operation requirements differ between usage scenarios; therefore, combining the relative spatial position of the user's finger with respect to the menu interface with the usage scenario information helps to judge more accurately which function item the user intends to operate.
For example, when the user holds the electronic device while walking, or the electronic device is in a bumpy vehicle, the shaking of the device and of the user's finger makes it difficult to position the finger accurately on a particular function item of the menu interface. By additionally considering the usage scenario information on top of the relative position of the finger with respect to the menu interface, the application can exclude function items that are unlikely to be selected in the current usage scenario, effectively filtering out function items that might otherwise be selected by mistake because finger shake shifts the finger position, and thus accurately determine the target function item the user intends to select.
In a possible implementation manner, the auxiliary decision information may further include user operation habit information. The user operation habit information at least includes information on function items historically selected by the user of the electronic device in different usage scenarios.
Of course, the user operation habit information may also include function item information historically used by a plurality of different users in different usage scenarios; or it may be function item information determined to be suitable for different usage scenarios based on the function items used by a plurality of different users in those scenarios.
It can be understood that, after at least one candidate function item within the user's operation range is determined from the menu interface based on the relative spatial position of the finger with respect to the menu interface, combining the current usage scenario information with the user's operation habits in the corresponding usage scenario makes it easier to accurately determine the target function item the user wants to operate, further reducing misjudgment.
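One way to combine the two signals is sketched below: each candidate is scored by its habit frequency in the current scenario minus a distance penalty. The linear score and its weight are purely illustrative; the application does not prescribe a scoring formula.

```python
def pick_target_item(candidates, finger_xy, scenario_key, habits,
                     distance_weight: float = 0.5):
    """Rank candidate function items by habit frequency and projected
    proximity; the nearest, most habitual candidate wins."""
    def score(item):
        ix, iy = item.center
        dist = ((finger_xy[0] - ix) ** 2 + (finger_xy[1] - iy) ** 2) ** 0.5
        freq = habits.get((scenario_key, item.name), 0)
        return freq - distance_weight * dist
    return max(candidates, key=score)
```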
S204, determining, according to the user operation habit information, at least one target sub-function item that the user may operate in the usage scenario corresponding to the usage scenario information from the sub-function item set associated with the target function item.
The sub-function item set associated with the target function item includes the sub-function items in the menus at different levels associated with the target function item.
The user operation habit information can be referred to the above description, and is not described herein again.
In this embodiment, on the basis of combining the usage scenario information, the target sub-function item that the user may operate is determined comprehensively in combination with the user operation habit information, so that the accuracy of determining the sub-function item can be further improved, and the misjudgment can be reduced.
For example, assume the target function item is the "commonly used" function item and the current usage scenario is rainy or snowy weather at a highway exit at location A. The sub-function items under the "commonly used" function item that may need to be operated in this usage scenario may include: the map function item, map navigation, the fog light function item, and, under the windshield function, front mirror heating, rear mirror heating, and front windshield heating. If, according to the user's usage habit information, it is determined that the user of the electronic device does not start the map function item and the related navigation at location A, then the sub-function items "front windshield heating", "front mirror heating", and "rear mirror heating" under the windshield function, together with the "fog light" sub-function item, can be directly determined as the target sub-function items.
It is to be understood that, for ease of understanding, this embodiment is described using the example of determining the target sub-function item by combining the user's operation habits with the usage scenario information. In practical applications, the manner of determining the target sub-function item based on the usage scenario information alone is also applicable to this embodiment, and is not detailed again here.
S205, outputting a function selection area containing the at least one target sub-function item.
For step S205, reference may be made to the related description of the previous embodiment.
It can be understood that, in the embodiment of the present application, when the relative spatial position of the user's finger with respect to the menu interface output by the electronic device satisfies the set condition, the application can determine the target function item to be operated in the menu interface, further determine at least one target sub-function item of the target function item that the user may operate, and then directly output a function selection area containing the at least one target sub-function item. Each target sub-function item the user may operate is thus output directly, so the user can select the desired target sub-function item in the function selection area and activate it with a single selection operation. This avoids complex operations in which the user first selects a function item, then opens the menus at different levels associated with it in sequence, and finally selects the desired sub-function item, and it greatly improves the convenience of operating the electronic device.
Moreover, because the user does not need to select the function item and then navigate through its menus at each level, erroneous operations caused by shaking of the electronic device and of the user's finger during that navigation are reduced, which helps the user select function items more accurately and quickly.
Similar to the previous embodiment, to make it easier for the user to select a target sub-function item in the function selection area, and considering that the number of target sub-function items in the function selection area is relatively limited, the size of each sub-function item presented in the function selection area may be significantly larger than the size of the function items presented in the menu interface of the electronic device and of the sub-function items presented in the menus at each level under those function items.
Of course, the function selection area may also highlight the region of each sub-function item in other ways, or otherwise make it easier for the user to select a sub-function item.
To facilitate an understanding of the benefits of the present embodiments, the following description is made in connection with an application scenario:
for example, take an electronic device used as a vehicle-mounted terminal (the same applies to an electronic device such as a mobile phone used in a vehicle), and assume that its usage scenario information is: rainy or snowy weather, within the highway exit range of location A.
It can be understood that while driving, and in the usage scenario corresponding to the current usage scenario information, the user's eyes cannot watch the display unit of the electronic device for long, and the user's fingers are likely to shake, so it is difficult to reliably and accurately select the required function item from the function items and menus at each level of the menu interface.
It is assumed that in the current usage scenario, what the user actually needs to turn on is the front windshield heating function.
Then, if operated in accordance with conventional practice, it may be as shown in fig. 3.
Fig. 3 shows the interfaces at each level that the user must operate one by one to select the front windshield heating function in the conventional manner.
As shown in fig. 3, after the display unit of the electronic device presents the menu interface 31, in order to enable the front windshield heating function, the user needs to click the "commonly used" function item 311 in the menu interface 31, and then trigger the electronic device to output the secondary menu 32.
In the secondary menu 32 may be included: a "map" function item 321, a "windshield function" function item 322, and a "fog light" function item 323, among others. To select the front windshield heating function, the user calls up the menu of "windshield function" in the second-level menu interface to cause the electronic device to output the third-level menu 33.
The front windshield heating function 331 is displayed only in the third-level menu 33, where the user can select to turn it on.
It can be seen that, with the conventional operation mode shown in fig. 3, the user must click many times through the menus at different levels to select the required function item, so the operation complexity is high. Moreover, when the user is driving and the weather and environment are complex, the user cannot operate the electronic device accurately, so each act of opening a menu level or clicking a function item may go wrong due to accidental touches; the operation is difficult and the accuracy is low.
In the present application, a schematic process of the interface operation for selecting the front windshield heating function in the usage scenario may be as shown in fig. 4.
As can be seen from fig. 4, after the display unit of the electronic device presents the menu interface 41, if the user's finger is above the menu interface and close to the "commonly used" function item 411, then even if the finger cannot be positioned accurately on the "commonly used" function item because of shaking or the like, the electronic device can predict that the user currently intends to operate the "commonly used" function item according to the current usage scenario, the user's operation habits, and the current relative spatial position of the finger with respect to the menu interface.
On this basis, the electronic device determines, from the sub-function items in the menus at each level associated with the "commonly used" function item and in combination with the user's usage habit information, that the sub-function items the user may operate in the current usage scenario are the "fog light" sub-function item under the "commonly used" function item and the sub-function items associated with its "windshield function". The electronic device can then directly pop up the function selection area 42 containing these sub-function items.
As can be seen from fig. 4, the function selection area 42 directly displays sub-function items such as "fog light", "front windshield heating", "front mirror heating", and "rear mirror heating". Moreover, the icons of the respective sub-function items in the function selection area are significantly larger than the icon size of the function item in the menu interface 41. In this way, the user can select the sub-function item "front windshield heating" in the function selection area 42 more accurately and conveniently.
As can be seen from fig. 4, without the user's finger touching the menu interface, as long as the finger is close to the "commonly used" function item, the electronic device can automatically determine the sub-function items the user may need according to the usage scenario and the user operation habit information. The user can therefore see each sub-function item directly in the function selection area 42 and select the required one, so that selecting "front windshield heating" takes only a single click. This simplifies the selection of the "front windshield heating" function item, avoids erroneous operations caused by repeatedly opening menus at each level and performing selection operations, and improves operation accuracy.
It can be understood that, after the function selection area containing the at least one target sub-function item is output, the application can also monitor the user's operations on the function selection area, so that the user operation habit information can be updated in time when the recommended sub-function items in the function selection area do not suit the user.
Specifically, if it is determined, based on the user's operation on the function selection area, that the target sub-function items in the function selection area do not include the sub-function item the user needs to select, the function item actually selected by the user in the usage scenario corresponding to the usage scenario information is determined. Accordingly, the stored user operation habit information can be updated based on the function item actually selected.
If the user does not select any target sub-function item in the function selection area, but instead closes or exits it, this indicates that none of the target sub-function items in the function selection area is the sub-function item the user needs, i.e. the prediction of the sub-function items the user might select was wrong. In this case, the function item actually selected by the user can be monitored.
The function item information actually selected by the user may be, on the premise that the user selected no sub-function item in the function selection area, the function item the user then selected from the menu interface and the sub-function item finally selected from the menus at each level under that function item.
Updating the user operation habit information may mean adding, to the user operation habit information, the function item information that may be selected in that usage scenario; it may also mean changing, based on the function item the user selected, which function items are recorded as likely selections in that usage scenario, or updating the selection probability of the function item information in the user operation habit information, and the like, which is not limited herein.
It can be understood that the continuous updating of the user operation habit information is beneficial to more accurately determining the information of the function items and the sub-function items under the function items, which can be selected by the user under different use scenes.
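A minimal sketch of the habit update just described, reusing the habit-count mapping assumed earlier; a real implementation might instead store selection probabilities or selection paths, as the description allows.

```python
def update_habits(habits, scenario_key: str, offered_names, selected_name: str):
    """When the user dismisses the function selection area and picks a
    function item through the menus instead, record that actual choice."""
    if selected_name not in offered_names:     # the recommendation was wrong
        key = (scenario_key, selected_name)
        habits[key] = habits.get(key, 0) + 1   # raise its selection count
    return habits
```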
It can be understood that, in the embodiments of the present application, accurately determining from the menu interface the target function item that the user needs to operate is key to accurately recommending sub-function items to the user. On this basis, to further improve the accuracy of determining the target function item, the display size of some function items in the menu interface can be adjusted in real time based on the relative spatial position of the user's finger and the menu interface, thereby reducing misjudgment.
Referring to fig. 5, which shows a schematic flow chart of an embodiment of a display control method according to the present application, the method of the present embodiment may include:
s501, determining the use scene information of the electronic equipment.
For this step, reference may be made to the related description of the previous embodiments; it is not repeated here.
S502, determining at least one candidate function item within the user's operation range from the function items of the menu interface based on the first relative spatial position of the user's finger with respect to the menu interface presented by the display unit.
For the sake of easy distinction, the function items within the user operation range are referred to as candidate function items.
For example, an area in the menu interface that belongs to the user operation range may be determined based on a projection coordinate point of a finger of the user on the menu interface, for example, an area having a distance from the projection coordinate point smaller than a set threshold value may be used as the area of the user operation range. Accordingly, candidate function items in the menu interface within the user operation range can be determined.
Wherein, as the first relative spatial position changes, the candidate function item may also change.
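A sketch of S502 under the projection-radius reading described above; `range_radius` plays the role of the set threshold and item centers are assumed 2D points.

```python
def candidate_items(items, projection_xy, range_radius: float):
    """Function items whose centers fall within `range_radius` of the
    finger's projection point form the user operation range (S502)."""
    px, py = projection_xy
    return [item for item in items
            if ((px - item.center[0]) ** 2
                + (py - item.center[1]) ** 2) ** 0.5 < range_radius]
```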
S503, the display effect of the at least one candidate function item is switched from the first display effect to the second display effect.
The first display effect may be considered as an original display effect of the menu interface presentation function item.
And the size of the candidate function item displayed by adopting the second display effect is larger than that of the candidate function item displayed by adopting the first display effect.
Of course, to make it easier for the user to distinguish the candidate function items, the display frame of a candidate function item presented with the second display effect may be brighter in color or thicker, and the like, which is not limited herein.
S504, determining, based on the second relative spatial position of the user's finger with respect to the menu interface presented by the display unit, the target function item that the user needs to operate from the at least one candidate function item displayed with the second display effect.
The second relative spatial position is the relative spatial position of the finger with respect to the menu interface after the candidate function items are displayed with the second display effect.
It is understood that, to reduce erroneous operation, after the candidate function items are presented with the second display effect, if it is detected that the relative spatial position of the user's finger with respect to the menu interface is maintained for a set time period (e.g., 2 ms), that relative spatial position may be determined as the second relative spatial position.
The target function item may be determined from the second relative spatial position in any one of the foregoing implementations for determining the target function item.
For example, if the relative spatial position of the user's finger with respect to the menu interface satisfies the set condition, that relative spatial position may be determined as the second relative spatial position, and the target function item to be operated by the user may then be determined based on it. Of course, the target function item may also be determined comprehensively by combining the second relative spatial position, the current usage scenario information of the electronic device, and the user operation habit information; a scoring sketch of this comprehensive determination follows the next example.
For another example, if the user's finger touches the operation interface, the candidate function item selected by the user may be determined in combination with the second relative spatial position and taken as the target function item.
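The comprehensive determination mentioned above might, for example, score each enlarged candidate by proximity and by habit. The weighting factor and the habit-score lookup below are illustrative assumptions, and FunctionItem is reused from the earlier sketch:

    import math

    def pick_target(candidates, finger_x, finger_y, scenario, habit_scores):
        """Choose the candidate that best combines closeness to the finger's
        second relative spatial position with how often the user picked it
        in the current usage scenario."""
        def score(item):
            dist = math.hypot(item.x - finger_x, item.y - finger_y)
            habit = habit_scores.get((item.name, scenario), 0.0)
            return -dist + 50.0 * habit    # closer and more habitual is better
        return max(candidates, key=score)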
S505, determining, according to the user operation habit information, at least one target sub-function item that the user may operate in the usage scenario corresponding to the usage scenario information from the set of sub-function items associated with the target function item.
The set of sub-function items associated with the target function item includes the sub-function items in the menus of different levels associated with the target function item.
It should be noted that the same applies where the target sub-function items are determined based on the usage scenario information alone. One possible ranking of the sub-function items is sketched below.
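A minimal sketch of such a ranking, assuming the user operation habit information is kept as per-scenario selection counts (the nested-dictionary layout and the top_k cutoff are assumptions of this illustration):

    def select_target_subitems(sub_items, scenario, history, top_k=3):
        """Rank the sub-function items associated with the target function
        item by how often the user chose them in this usage scenario, and
        keep the most likely ones for the function selection area."""
        counts = history.get(scenario, {})   # {sub_item_name: count}
        ranked = sorted(sub_items, key=lambda s: counts.get(s, 0), reverse=True)
        return ranked[:top_k]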
S506, outputting a function selection area containing the at least one target sub-function item.
For the above steps S504 and S506, reference may be made to the foregoing related description; details are not repeated here.
In the present application, the candidate function items within the user's operation range can be displayed in an enlarged manner based on the first relative spatial position of the user's finger with respect to the menu interface. On this basis, even if the user's finger shakes, the possibility that the finger drifts onto a candidate function item the user did not intend to select is reduced, so the user can more accurately select or indicate the target function item from among the candidate function items. This, in turn, helps to accurately determine, in combination with the scenario information and the target function item, the sub-function items that the user may operate.
It can be understood that, in the present application, the usage scenario information in which the electronic device is located may be determined in many ways.
The following description addresses one possible case.
In one possible case, one or more of the road environment information and the weather environment information in which the electronic device is located may be determined.
It can be understood that, for vehicle-mounted terminals and for electronic devices used in moving vehicles, road bumpiness or weather conditions make it more likely that the driver or a passenger cannot operate the device steadily. In such scenarios, determining the function item to be operated in combination with the road environment information and the weather environment information of the electronic device is more conducive to improving operating convenience and reducing misoperation.
The road environment information may be determined comprehensively by combining the sensed acceleration information of the electronic device, the vehicle running state obtained by the electronic device, the obtained geographic environment information, the road condition information, and the like.
The weather environment information may be determined by combining the vehicle running state with the acquired weather condition information; of course, it may also be determined comprehensively by further combining data sensed by sensors of the electronic device, such as a humidity sensor. This is not limited here.
The following describes the scheme of the present application with reference to one possible implementation in which the electronic device is a vehicle-mounted terminal. Fig. 6 shows another flowchart of the display control method provided in the embodiments of the present application; the method of this embodiment may include:
S601, obtaining the road environment information and the weather environment information in which the vehicle-mounted terminal is located.
The road environment information may reflect information related to road driving, such as the road type, the driving environment, and the driving speed of the vehicle in which the vehicle-mounted terminal is located. For example, it may reflect the degree of bumpiness of the vehicle, whether the road is an urban road or a rugged mountain road, the current road location, and the like.
When the road environment information is determined, information of multiple dimensions may be combined. For example, the acceleration information sensed by a gravity sensor on the vehicle-mounted terminal, the vehicle running state of the vehicle in which the terminal is located, and the geographic environment information of the terminal may be obtained, and the road environment information may be determined by combining the acceleration information, the vehicle running state, and the geographic environment information.
The weather environment information may reflect the weather conditions at the current position of the vehicle in which the vehicle-mounted terminal is located: for example, whether the vehicle is currently in fog, whether there is rain or snow in the current environment, the visibility around the vehicle, and the like.
For example, the weather environment information of the electronic device may be determined by combining the vehicle running state with the weather condition information acquired by the vehicle-mounted device. The vehicle running state may include the vehicle speed, whether the windshield wipers are on, whether the fog lamps are on, and other running-state information that assists in judging the weather environment. One possible combination of these signals is sketched below.
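The following hedged sketch fuses these signals in one possible way; the thresholds, the VehicleState fields, and the category labels are assumptions for illustration, not values fixed by the embodiment.

    from dataclasses import dataclass
    from statistics import pstdev

    @dataclass
    class VehicleState:
        speed_kmh: float
        wipers_on: bool
        fog_lamps_on: bool

    def road_environment(vertical_accel_samples, state, road_type):
        """Classify ride roughness from the spread of the sensed vertical
        acceleration, the vehicle speed, and the map-reported road type."""
        bumpy = pstdev(vertical_accel_samples) > 1.5 or road_type == "mountain"
        return "bumpy" if bumpy and state.speed_kmh > 10 else "smooth"

    def weather_environment(state, reported_weather):
        """Combine the reported weather conditions with on-vehicle cues
        such as the wipers and fog lamps."""
        if state.wipers_on or reported_weather in ("rain", "snow"):
            return "precipitation"
        if state.fog_lamps_on or reported_weather == "fog":
            return "low_visibility"
        return "clear"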
S602, determining the relative spatial position of the finger of the user relative to the menu interface presented by the display unit.
S603, determining that the relative spatial position meets the set condition, and determining the target function item to be operated by the user from the function items of the menu interface based on the relative spatial position, the geographic environment information, the weather condition information, and the user operation habit information.
The relative spatial position meeting the set condition represents that the user has an operation requirement on the menu interface.
S604, determining, according to the user operation habit information, at least one target sub-function item that the user may operate in the usage scenario corresponding to the geographic environment information and the weather condition information from the set of sub-function items associated with the target function item.
The set of sub-function items associated with the target function item includes the sub-function items in the menus of different levels associated with the target function item.
S605, outputting a function selection area containing the at least one target sub-function item.
For this step S605, reference may be made to the related description in the foregoing embodiment; details are not repeated here.
Corresponding to the display control method provided in the embodiments of the present application, an embodiment of the present application further provides a display control device. Fig. 7 is a schematic structural diagram of the display control device provided in an embodiment of the present application; the device can be applied to an electronic device.
The device may include:
a scene determining unit 701, configured to determine usage scenario information of the electronic device;
a first determining unit 702, configured to determine, based on a relative spatial position of a finger of a user with respect to a menu interface presented by a display unit, a target function item that is required to be operated by the user from among function items of the menu interface;
a second determining unit 703, configured to determine, according to the usage scenario information, at least one target sub-function item for which an operation requirement exists by a user from a set of sub-function items associated with the target function item, where the set of sub-function items associated with the target function item includes sub-function items in menus of different levels associated with the target function item;
a selection area output unit 704 for outputting a function selection area including the at least one target sub-function item.
In one possible implementation manner, the first determining unit includes:
the position determining subunit is used for determining the relative spatial position of the finger of the user relative to the menu interface presented by the display unit;
a first determining subunit, configured to determine that the relative spatial position meets the set condition, and determine, based on the relative spatial position, the target function item to be operated by the user from the function items of the menu interface, where the relative spatial position meeting the set condition represents that the user has an operation requirement on the menu interface.
In a possible implementation manner, the first determining subunit is specifically configured to determine that the relative spatial position satisfies the set condition, and determine, based on the relative spatial position and auxiliary decision information, the target function item to be operated by the user from the function items of the menu interface, where the auxiliary decision information includes: user operation habit information, and the user operation habit information at least includes information on the function items historically selected by the user of the electronic device in different usage scenarios.
In one possible implementation manner, the second determining unit includes:
a second determining subunit, configured to determine, according to the user operation habit information, at least one target sub-function item that the user may operate in the usage scenario corresponding to the usage scenario information from the set of sub-function items associated with the target function item, where the user operation habit information at least includes: information on the function items historically selected by the user of the electronic device in different usage scenarios.
In one possible implementation, the apparatus further includes:
a unit configured to, after the selection area output unit outputs the function selection area containing the at least one target sub-function item, determine the function item information actually selected by the user in the usage scenario corresponding to the usage scenario information if it is determined, based on the user's operation on the function selection area, that the target sub-function items in the function selection area do not belong to the sub-function items the user needs to select;
and a habit information updating unit, configured to update the stored user operation habit information based on the function item information actually selected by the user (one possible update path is sketched below).
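One possible shape of this update path, assuming the habit information is stored as per-scenario selection counts as in the earlier ranking sketch:

    def update_habits(history, scenario, offered_items, actually_selected):
        """If the user's real choice was not among the offered target
        sub-function items, record it so that later predictions for this
        usage scenario improve."""
        if actually_selected not in offered_items:
            scenario_counts = history.setdefault(scenario, {})
            scenario_counts[actually_selected] = (
                scenario_counts.get(actually_selected, 0) + 1
            )
        return history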
In another possible implementation manner, the scene determining unit includes:
and the scene determining subunit is used for determining the road environment information and/or the weather environment information where the electronic equipment is located.
In yet another possible implementation manner, the scene determining subunit includes:
the acceleration determining subunit is used for obtaining acceleration information sensed by the gravity sensor;
the state determining subunit is used for obtaining the vehicle running state of the vehicle where the electronic equipment is located;
the information acquisition subunit is used for acquiring the geographical environment information and the weather condition information of the electronic equipment;
the road condition determining subunit is used for determining road environment information where the electronic equipment is located by combining the acceleration information, the vehicle running state and the geographic environment information;
and the weather environment determining subunit is used for determining the weather environment information where the electronic equipment is located by combining the vehicle running state and the weather condition information.
In yet another possible implementation manner, the apparatus further includes:
the acceleration obtaining unit is used for obtaining acceleration information sensed by the gravity sensor before the function selection area containing the at least one target sub-function item is output by the selection area output unit;
a layout mode determining unit, configured to determine, in combination with the acceleration information, a layout mode of the at least one target sub-function item, where the layout mode is one suitable for the user to operate the target sub-function items;
and a selection area generating unit, configured to generate, according to the layout mode of the target sub-function items, a function selection area containing the at least one target sub-function item (one possible layout choice is sketched below).
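An illustrative layout choice along these lines; the variance threshold and the two layout presets are assumptions of this sketch, not values from the embodiment:

    from statistics import pstdev

    def choose_layout(accel_samples):
        """Pick a layout mode for the function selection area: when the
        sensed acceleration fluctuates strongly (a bumpy ride), use larger,
        more widely spaced items that are easier to hit."""
        if pstdev(accel_samples) > 1.5:
            return {"item_scale": 1.5, "spacing_px": 24, "columns": 1}
        return {"item_scale": 1.0, "spacing_px": 8, "columns": 3}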
In another possible implementation manner, the first determining unit includes:
the candidate determining subunit is used for determining at least one candidate function item in a user operation range from the function items of the menu interface based on a first relative spatial position of the finger of the user relative to the menu interface presented by the display unit;
the effect switching subunit is configured to switch the display effect of the at least one candidate function item from a first display effect to a second display effect, where a size of the candidate function item displayed with the second display effect is larger than a size of the candidate function item displayed with the first display effect;
and the target determining subunit is used for determining a target function item required to be operated by the user from at least one candidate function item displayed by adopting a second display effect based on a second relative spatial position of the finger of the user relative to the menu interface presented by the display unit.
In yet another aspect, the present application further provides an electronic device. Fig. 8 shows a schematic structural diagram of the electronic device, which may be any type of electronic device and includes at least a memory 801 and a processor 802;
wherein the processor 802 is configured to execute the display control method of any one of the above embodiments.
The memory 801 is configured to store the programs needed by the processor to perform its operations.
It is to be understood that the electronic device may further include a display unit 803 and an input unit 804.
It will be appreciated that the electronic device may also include other components such as an acceleration sensor.
Of course, the electronic device may have more or less components than those shown in fig. 8, which is not limited thereto.
In another aspect, the present application further provides a computer-readable storage medium having at least one instruction, at least one program, a set of codes, or a set of instructions stored therein, which is loaded and executed by a processor to implement the display control method according to any one of the above embodiments.
The present application also proposes a computer program comprising computer instructions stored in a computer-readable storage medium. When the computer program runs on an electronic device, it performs the display control method of any one of the above embodiments.
It should be noted that the embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts the embodiments may be referred to one another. The features described in the embodiments of this specification may also be replaced or combined with one another, so that those skilled in the art can implement or use the present application. As the device embodiments are substantially similar to the method embodiments, their description is brief; for relevant details, reference may be made to the description of the method embodiments.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.

Claims (10)

1. A display control method, applied to an electronic device, the method comprising:
determining usage scenario information of the electronic device;
determining a target function item required to be operated by a user from the function items of the menu interface based on the relative spatial position of the finger of the user relative to the menu interface presented by the display unit;
determining, according to the usage scenario information, at least one target sub-function item for which the user has an operation requirement from a set of sub-function items associated with the target function item, wherein the set of sub-function items associated with the target function item comprises sub-function items in menus at different levels associated with the target function item;
outputting a function selection area containing the at least one target sub-function item.
2. The method of claim 1, wherein the determining a target function item required to be operated by a user from the function items of the menu interface based on the relative spatial position of the finger of the user relative to the menu interface presented by the display unit comprises:
determining a relative spatial position of a user's finger with respect to a menu interface presented by a display unit;
and determining that the relative spatial position meets a set condition, and determining the target function item to be operated by the user from the function items of the menu interface based on the relative spatial position, wherein the relative spatial position meeting the set condition represents that the user has an operation requirement on the menu interface.
3. The method of claim 2, wherein determining a target function item to be operated by a user from the function items of the menu interface based on the relative spatial position comprises:
determining the target function item to be operated by the user from the function items of the menu interface based on the relative spatial position and auxiliary decision information, wherein the auxiliary decision information comprises: user operation habit information, and the user operation habit information at least comprises information on the function items historically selected by the user of the electronic device in different usage scenarios.
4. The method according to claim 1, wherein the determining, according to the usage scenario information, at least one target sub-function item for which the user has an operation requirement from the set of sub-function items associated with the target function item comprises:
determining, according to user operation habit information, at least one target sub-function item that the user may operate in the usage scenario corresponding to the usage scenario information from the set of sub-function items associated with the target function item, wherein the user operation habit information at least comprises: information on the function items historically selected by the user of the electronic device in different usage scenarios.
5. The method of claim 4, further comprising, after said outputting a function selection area containing the at least one target sub-function item:
if it is determined, based on the user's operation on the function selection area, that the target sub-function items in the function selection area do not belong to the sub-function items the user needs to select, determining the function item information actually selected by the user in the usage scenario corresponding to the usage scenario information;
and updating the stored user operation habit information based on the function item information actually selected by the user.
6. The method of claim 1, wherein the determining usage scenario information of the electronic device comprises:
determining road environment information and/or weather environment information in which the electronic device is located.
7. The method of claim 6, wherein the determining road environment information and/or weather environment information in which the electronic device is located comprises:
acquiring acceleration information sensed by a gravity sensor;
obtaining a vehicle running state of a vehicle in which the electronic device is located;
acquiring geographic environment information and weather condition information of the electronic device;
determining the road environment information in which the electronic device is located by combining the acceleration information, the vehicle running state, and the geographic environment information;
and determining the weather environment information in which the electronic device is located by combining the vehicle running state and the weather condition information.
8. The method of claim 1, further comprising, prior to said outputting a function selection region containing said at least one target sub-function item:
acquiring acceleration information sensed by a gravity sensor;
determining, in combination with the acceleration information, a layout mode of the at least one target sub-function item, wherein the layout mode is one suitable for the user to operate the target sub-function item;
and generating a function selection area containing at least one target sub-function item according to the layout mode of the target sub-function item.
9. The method of claim 1, wherein the determining a target function item required to be operated by a user from the function items of the menu interface based on the relative spatial position of the finger of the user relative to the menu interface presented by the display unit comprises:
determining at least one candidate function item in a user operation range from the function items of the menu interface based on a first relative spatial position of the finger of the user relative to the menu interface presented by the display unit;
switching the display effect of the at least one candidate function item from a first display effect to a second display effect, wherein the size of the candidate function item displayed by the second display effect is larger than the size of the candidate function item displayed by the first display effect;
and determining a target function item required to be operated by the user from at least one candidate function item displayed by adopting a second display effect based on a second relative space position of the finger of the user relative to the menu interface displayed by the display unit.
10. A display control device, applied to an electronic device, the device comprising:
a scene determining unit, configured to determine usage scenario information of the electronic device;
a first determining unit, configured to determine, based on a relative spatial position of a finger of a user with respect to a menu interface presented by a display unit, a target function item to be operated by the user from function items of the menu interface;
a second determining unit, configured to determine, according to the usage scenario information, at least one target sub-function item for which the user has an operation requirement from a set of sub-function items associated with the target function item, wherein the set of sub-function items associated with the target function item comprises sub-function items in menus at different levels associated with the target function item;
and a selection area output unit, configured to output a function selection area containing the at least one target sub-function item.
CN202210277569.7A 2022-03-21 2022-03-21 Display control method and device Pending CN114610208A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210277569.7A CN114610208A (en) 2022-03-21 2022-03-21 Display control method and device

Publications (1)

Publication Number Publication Date
CN114610208A true CN114610208A (en) 2022-06-10

Family

ID=81865960

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210277569.7A Pending CN114610208A (en) 2022-03-21 2022-03-21 Display control method and device

Country Status (1)

Country Link
CN (1) CN114610208A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117369885A (en) * 2023-10-11 2024-01-09 广州文石信息科技有限公司 Interface configuration method, device and storage medium for editing application

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103620541A (en) * 2011-07-11 2014-03-05 Kddi株式会社 User interface device capable of execution of input by finger contact in plurality of modes, input operation assessment method, and program
CN106161729A (en) * 2015-03-24 2016-11-23 联想(北京)有限公司 A kind of information processing method and device
US20180300031A1 (en) * 2016-12-23 2018-10-18 Realwear, Incorporated Customizing user interfaces of binary applications
CN109814775A (en) * 2017-11-22 2019-05-28 腾讯科技(深圳)有限公司 Menu item method of adjustment, device and terminal
CN110652724A (en) * 2019-10-31 2020-01-07 网易(杭州)网络有限公司 Display control method and device in game


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination