CN112558752A - Method for operating display content of head-up display, operating system and vehicle - Google Patents

Method for operating display content of head-up display, operating system and vehicle

Info

Publication number
CN112558752A
CN112558752A (application CN201910912290.XA)
Authority
CN
China
Prior art keywords
display
gesture
display elements
vehicle
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910912290.XA
Other languages
Chinese (zh)
Inventor
J·费德勒
S·利兹卡诺
郁飞
刘恒华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bayerische Motoren Werke AG
Original Assignee
Bayerische Motoren Werke AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke AG
Priority to CN201910912290.XA
Publication of CN112558752A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 11/00: Arrangements for holding or mounting articles, not otherwise provided for
    • B60R 11/02: Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B60R 11/0229: ... for displays, e.g. cathodic tubes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Instrument Panels (AREA)

Abstract

The invention relates to a method for operating the display content of a head-up display of a vehicle, comprising: providing a multi-layer array arrangement of all display elements in the display content, wherein one or more display elements each constitute one layer of the multi-layer array, the layers being arranged along a first direction; where a layer comprises a plurality of display elements, these are at least partially arranged along a second direction orthogonal to the first direction, and the remaining display elements are in turn arranged along a third direction, orthogonal to the first and second directions, relative to at least one of the display elements arranged along the second direction; displaying the display elements at least partially on the head-up display, the second and third directions extending within its display plane; detecting and recognizing gestures of occupants in the vehicle; and switching between layers and/or switching between display elements within a layer according to a mobility gesture that characterizes directional movement in space. The invention further relates to an operating system for a vehicle and to a vehicle.

Description

Method for operating display content of head-up display, operating system and vehicle
Technical Field
The invention relates to a method for operating the display content of a head-up display of a vehicle, to an operating system for a vehicle and to a vehicle comprising said operating system.
Background
Head-up displays are increasingly used in vehicles today, in particular head-up displays that cover a large part of the windscreen area. A head-up display can present information to occupants in the vehicle in a more ergonomic manner.
However, such a head-up display cannot be operated by touch, since it is difficult for an occupant in the vehicle to reach. When operation or input is instead performed via a conventional touch screen or via a scroll-press or rotary-press device on the steering wheel or center console, the operation is inefficient and inconvenient, and the range of operable objects is rather limited. Human-computer interaction with the head-up display therefore requires further improvement.
Disclosure of Invention
An object of the present invention is to provide a method and an operating system capable of intuitively and efficiently operating the display content of a head-up display of a vehicle, as well as a vehicle equipped therewith.
One aspect of the invention relates to a method for operating display content of a head-up display of a vehicle, the method comprising:
providing a multi-layer array arrangement of all display elements in the display content, wherein one or more display elements each constitute one layer of the multi-layer array, the plurality of layers being arranged along a first direction, in case the layer comprises a plurality of display elements, the plurality of display elements being at least partially arranged along a second direction orthogonal to the first direction, and one or more of the remaining display elements being in turn arranged along a third direction orthogonal to the first and second directions with respect to at least one of the display elements arranged along the second direction;
displaying the display element at least partially on the head-up display with mutually orthogonal second and third directions extending within a general display plane of the head-up display;
collecting and identifying gestures of passengers in the vehicle;
switching between layers and/or switching between display elements within a layer according to a mobility gesture that characterizes directional movement within a space.
According to the invention, a head-up display refers to a display which is capable of displaying, in particular projecting, display content in front of the driver, preferably on the windshield. Head-up displays within the scope of the invention include flat display devices, curved display devices, flexible display devices, projection display devices and holographic display devices, and in particular also encompass augmented reality (AR) displays that superimpose display content on the scene outside the vehicle, as well as rear-view displays projected onto the rear window glass. To ensure the display effect, it can be provided that the display plane of the head-up display covers 50%, preferably 80%, of the entire surface of the windshield. Particularly preferably, the head-up display in the present invention is a panoramic head-up display covering substantially the entire windscreen. In addition, the head-up display can also be used together with dimming glass, so that clearer display content can be provided to occupants in the vehicle without interference from external light.
The display content referred to in the present invention is presented as a set of display elements. Such a set may be a menu, a directory, a list or the like; by way of example, the display content may include an information directory, a settings menu, a song list or a video list. The display elements provide information to occupants of the vehicle through graphics and/or text, and are in particular each presented in the form of a single frame, in which one item of data may be displayed alone or several items of data may be displayed in combination. The information provided to occupants within the vehicle may include driving data, such as vehicle operation data, driving data, vehicle environment data, vehicle exterior image data, blind-spot image data, obstacle data, vehicle component operation data, mileage data, fuel and/or power consumption data, fuel tank and/or accumulator data and the like; navigation data, such as destination data, current location data, route data, remaining distance data, projected remaining time data, traffic data, point-of-interest data, along-the-road facility data and the like; and convenience data, such as multimedia data, social data, mail data, office data, weather data, information data, shopping data, business data, web page data, document data, game data, schedule data, travel data, financial data, educational data and the like.
According to the invention, a special multi-layer array-like arrangement of the display elements is provided. It should be noted that the meaning of an array is to be understood broadly, and it may include not only regular matrices, such as 2 x 2 matrices, 2 x 3 matrices or 3 x 3 matrices, but also any arrangement ordered in two or more orthogonal parallel directions. In particular, a multi-layer array arrangement of all display elements in a display content is provided, wherein one or more display elements each constitute one layer of the multi-layer array, the plurality of layers being arranged along a first direction, in case the layer comprises a plurality of display elements, the plurality of display elements being at least partially arranged along a second direction orthogonal to the first direction, while one or more of the remaining display elements are in turn arranged along a third direction orthogonal to the first direction and the second direction with respect to at least one of these display elements arranged along the second direction. This means that the plurality of display elements in the display content form a hierarchical array arrangement in three-dimensional space. The display elements may first be distributed in a plurality of layers arranged one behind the other in a first direction. Each of the plurality of layers may in turn include one or more display elements. 
For example, a display element may itself constitute a layer; two display elements may be arranged in a layer in a row along a second direction orthogonal to the first direction; more than two display elements may be arranged in a layer in a row or in an array, wherein all of them may be arranged along the second direction orthogonal to the first direction, or a portion of them may be arranged along the second direction while another portion branches off from one or more of that portion along a third direction orthogonal to the first and second directions. The invention thus forms a novel arrangement of a plurality of display elements in three-dimensional space, generally with at least two, in particular three, different directions of arrangement. For example, the arrangement directions may correspond to the x-axis, y-axis and z-axis of a Cartesian coordinate system. In particular, with the occupant in the seated position, the first direction may be defined as the front-back direction of the occupant, the second direction as the left-right direction of the occupant, and the third direction as the up-down direction of the occupant. The invention is not limited thereto; it is also conceivable to take the up-down direction of the occupant as the second direction and the left-right direction as the third direction.
The multi-layer array arrangement according to the invention is able to accommodate more display elements than the one-dimensional or two-dimensional menus of the prior art. Moreover, these more display elements can also be arranged in a more logical hierarchical manner, facilitating the understanding and operation thereof. Furthermore, since there are more directional relationships between the individual display elements, more versatile and faster operation possibilities are provided. In particular, switching between display elements can be achieved with fewer operations than in conventional one-dimensional or two-dimensional arrangements.
According to the invention, it is further provided that the display elements are displayed at least partially on the head-up display, the mutually orthogonal second and third directions extending in the general display plane of the head-up display. On the one hand, all display elements of the display content can be displayed in their entirety on the head-up display, partially superimposed on one another; on the other hand, only a part of the display elements may be displayed for a good visual effect. In the present invention, the plane spanned by the second and third directions may correspond to the display plane of the head-up display; that is, the individual layers of the multi-layer array are arranged one behind the other, parallel to the display plane.
In order to operate the display content of the head-up display of a vehicle, the invention provides for gestures of an occupant in the vehicle to be detected and recognized. The occupants here include at least the driver, but may also be the front passenger or even rear passengers. An occupant can make an operating gesture with the hand in the vehicle interior. The gestures can be detected and recognized in particular by a sensor device and a gesture recognition device. A gesture is not limited to a movement of the left or right hand; it may also be a movement made by both hands together, such as clapping or clasping both hands. In addition, a gesture can also encompass movements of the occupant's hand together with the upper limb, such as crossing both forearms; it is even possible for the hands of two occupants to form a gesture together.
In the present invention, a distinctive mode of operating the display elements arranged in the multi-layer array is defined: switching between layers and/or switching between display elements within a layer according to a mobility gesture that characterizes directional movement in space. Based on the spatial arrangement of the display elements, the invention thereby enables operation in more directions than the one-dimensional or two-dimensional arrangements of the prior art. A mobility gesture in the first direction (e.g. bring-back and push-out) switches between layers; a mobility gesture in the second direction (e.g. left slide and right slide) switches between display elements in the second direction within one layer; and a mobility gesture in the third direction (e.g. up slide and down slide) switches between display elements in the third direction within one layer. Within the scope of the invention, a gesture characterizing movement in a direction in space refers to the movement of at least one hand of an occupant, in particular the entire palm, from one point in space to another in a main direction within a certain time. The main directions include, but are not limited to, the front-back, left-right and up-down directions relative to the occupant, and even oblique directions such as upper left, upper right, lower left and lower right. Such a movement of the hand may indicate that the occupant wants to move or switch the display element in the direction of the gesture. For example, an occupant sweeping the upright palm, fingertips forward, from right to left (i.e. a left slide) indicates switching the display elements to the left.
For example, when the current display element is moved to the left, the display element originally to its right becomes the new current display element.
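The switching behavior described here can be sketched as a small navigation model. This is an illustrative sketch, not code from the patent: the grid layout, class name and gesture strings are assumptions, with elements indexed as (layer, row, column) along the first, third and second directions, and a swipe selecting the neighbor opposite the hand's motion, as in the example above.

```python
class MultiLayerMenu:
    """Illustrative navigator over a multi-layer array of display elements."""

    def __init__(self, layers):
        # layers: list of 2D grids (list of rows of element labels),
        # stacked along the first direction
        self.layers = layers
        self.layer = 0   # position along the first direction
        self.row = 0     # position along the third direction
        self.col = 0     # position along the second direction

    def current(self):
        return self.layers[self.layer][self.row][self.col]

    def apply(self, gesture):
        # One possible interpretation: a swipe moves the elements, so the
        # neighbor opposite the hand's motion becomes the new current element
        # (e.g. a left slide selects the element originally to the right).
        grid = self.layers[self.layer]
        if gesture == "left_slide" and self.col + 1 < len(grid[self.row]):
            self.col += 1
        elif gesture == "right_slide" and self.col > 0:
            self.col -= 1
        elif gesture == "up_slide" and self.row + 1 < len(grid):
            self.row += 1
        elif gesture == "down_slide" and self.row > 0:
            self.row -= 1
        elif gesture == "bring_back" and self.layer + 1 < len(self.layers):
            self.layer += 1   # switch forward by one layer
        elif gesture == "push_out" and self.layer > 0:
            self.layer -= 1   # switch back by one layer
        return self.current()
```

For instance, starting at the front-left element, `menu.apply("left_slide")` selects the element originally to its right.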
The invention is the first to provide a multi-layer array arrangement of all display elements of display content in a vehicle, and combines this arrangement with an intuitive mode of operation. With the invention, occupants can perform intuitive, easy-to-learn operations guided by what is shown on the head-up display. This offers great convenience for operating the vehicle, obtaining information and enjoying social entertainment in the vehicle. The multi-layer array arrangement of display elements in space gives occupants a more global view of the functions, information, items and so on provided by the display elements, and presents the logical relationships between display elements in a higher dimension. Furthermore, gesture operation dispenses with physical operating devices such as keys, touch screens, rotary-press devices and scroll-press devices, so that occupants can operate without being tied to operating hardware in the vehicle. Occupants can operate conveniently and logically following the arrangement of the display elements, optimizing the human-machine interaction between occupant and vehicle. Moreover, because a higher dimensionality is provided than in traditional in-vehicle menus, switching operations are accomplished more efficiently and quickly.
The method and the operating system according to the invention are particularly suitable for use in autonomous vehicles. An autonomous vehicle is understood to be a vehicle capable of automated longitudinal and transverse guidance, i.e. of autonomous driving. The concept of autonomous driving includes highly automated, fully automated and driverless driving. In highly automated driving, the vehicle assumes longitudinal and transverse guidance without the driver having to continuously monitor the system; however, the driver must be able to take over vehicle guidance within a certain period of time. In fully automated driving, the vehicle completes driving automatically in all situations for specific applications, in which the driver is no longer required. In driverless driving, the vehicle completes the entire journey automatically in all situations, and a driver is generally no longer needed.
According to one embodiment of the invention, the mobility gestures comprise at least a left slide, a right slide, an up slide, a down slide, a bring-back and a push-out. For example, it can be provided that a left slide means that the occupant sweeps the palm, held upright (preferably facing left) with fingertips pointing forward, from right to left; a right slide means that the occupant sweeps the palm, held upright (preferably facing right) with fingertips pointing forward, from left to right; an up slide means that the occupant sweeps the palm, held horizontally (preferably facing up) with fingertips pointing forward, from bottom to top; a down slide means that the occupant sweeps the palm, held horizontally (preferably facing down) with fingertips pointing forward, from top to bottom; a bring-back means that the occupant sweeps the palm, held upright (preferably facing backward) with fingertips pointing upward, from front to back; and a push-out means that the occupant sweeps the palm, held upright (preferably facing forward) with fingertips pointing upward, from back to front. According to the invention, the left slide represents a switch to the left, the right slide a switch to the right, the up slide a switch upward, the down slide a switch downward, the bring-back a switch forward by one layer (i.e. a switch to the layer below that of the current display element), and the push-out a switch backward by one layer (i.e. a switch back to the layer above that of the current display element). Here, switching in a certain direction may mean moving the current display element in that direction, or making the adjacent display element (or layer) opposite to the direction of hand movement the new current display element (or layer).
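In an implementation, a recognized hand movement must first be classified as one of these six mobility gestures. The following sketch is one plausible approach under assumptions not stated in the patent (the axis convention, a minimum travel threshold, and the sign convention for each direction are all illustrative):

```python
def classify_mobility_gesture(dx, dy, dz, min_travel=0.10):
    """Map a palm displacement (in meters) to a mobility gesture name.

    Assumed axes for a seated occupant: x = front-back (first direction,
    positive forward), y = left-right (second direction, positive right),
    z = up-down (third direction, positive up).
    """
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    dominant = max(ax, ay, az)
    if dominant < min_travel:
        return None  # movement too small to count as a gesture
    if dominant == ay:
        return "left_slide" if dy < 0 else "right_slide"
    if dominant == az:
        return "up_slide" if dz > 0 else "down_slide"
    # x axis: toward the occupant = bring-back, away from the occupant = push-out
    return "bring_back" if dx < 0 else "push_out"
```

A gesture recognition device would feed the palm's start-to-end displacement into such a classifier and forward the resulting gesture name to the data processing device.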
According to one embodiment of the invention, one of the display elements displayed on the head-up display serves as the active display element and is displayed in the foreground; the active display element is hidden, popped up, confirmed, rejected, zoomed in or zoomed out by means of manipulative gestures. This embodiment realizes specific operations on the current display element. In the present invention, the active display element, for example an active frame, is in the foreground or unobstructed, analogous to the active window in a computer operating system. The active display element may be distinguished from the other, inactive display elements by color, brightness, size and/or position. For example, the active display element may be displayed in a larger size directly in front of the occupant for easy reading and viewing, whereas inactive display elements are placed to the side of or behind the active display element with a smaller size, lower brightness and/or partial occlusion; it is even conceivable to hide inactive display elements entirely. The occupant can hide, pop up, confirm, reject, zoom in and zoom out the active display element by manipulative gestures. For example, the active display element, or indeed all display elements, may be hidden when a clear view of the road ahead is required, and the hidden element(s) can be popped back up onto the head-up display when attention to the road ahead is not needed. Options on the active display element may be confirmed or rejected. The active display element may be enlarged when part of it needs to be viewed in detail and reduced when the whole element needs to be seen.
According to an embodiment of the invention, the manipulative gestures comprise at least: a press-down gesture, a lift-up gesture, a thumbs-up gesture (four fingers clenched, thumb extended upward), a fist-making gesture, a gesture of widening the distance between index finger and thumb, and a gesture of narrowing the distance between index finger and thumb. The display element is hidden by the press-down gesture, which can be defined as the palm facing downward with fingertips pointing toward a side window of the vehicle and the hand moving from top to bottom; the display element is popped up by the lift-up gesture, which can be defined as the palm facing upward with fingertips pointing toward a side window of the vehicle and the hand moving from bottom to top; the current option is confirmed by the thumbs-up gesture; the current option is rejected by the fist-making gesture; the active display element is zoomed in by the gesture of widening the distance between index finger and thumb; and the active display element is zoomed out by the gesture of narrowing the distance between index finger and thumb.
According to one embodiment of the present invention, the display elements arranged in the multi-layer array form a spatial menu of at least three levels, each of the directions forming one level of the spatial menu; the next level of the spatial menu is expanded by a gesture of gradually opening the five fingers, and the current level is collapsed by a gesture of gradually closing the five fingers. A spatial menu for operating the vehicle is thus specified, divided into at least three levels, each of the aforementioned directions (first, second and third direction) forming one level of the spatial menu. For example, in the first level, the layers arranged along the first direction may represent driving data, navigation data, convenience data and so on. In the second level, for example, vehicle operation data, driving data, vehicle interior environment data, vehicle exterior image data, blind-spot image data, obstacle data, vehicle component operation data, mileage data, fuel and/or power consumption data, and fuel tank and/or accumulator data may be arranged along the second direction within the layer of driving data. In the third level, for example, vehicle speed, engine speed, cooling water temperature, battery temperature and the like may be expanded along the third direction from the vehicle operation data. The spatial menu of the invention provides multi-level information to occupants in an orderly manner. Occupants can operate the spatial menu intuitively and efficiently with the mobility gestures according to the invention and conveniently find the required information. The operating logic and the human-machine interaction are thereby optimized relative to the vehicle menus of the prior art. In particular, the invention also offers the possibility of expanding and collapsing the individual menu levels, i.e. expanding the next level of the spatial menu by a gesture in which the five fingers are gradually opened, and collapsing the current level by a gesture in which the five fingers are gradually closed. Information at all levels is thus displayed simply and clearly to the occupants, providing a novel user interface with excellent operability.
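The three-level spatial menu described in this example could be represented as a simple nested structure. The concrete layout and the `expand` helper below are illustrative assumptions, populated with the entries named in the text:

```python
# First level: layers along the first direction; second level: entries along
# the second direction; third level: entries branching along the third direction.
SPACE_MENU = {
    "driving data": {
        "vehicle operation data": ["vehicle speed", "engine speed",
                                   "cooling water temperature", "battery temperature"],
        "mileage data": [],
        "fuel/power consumption data": [],
    },
    "navigation data": {
        "destination data": [],
        "route data": [],
    },
    "convenience data": {
        "multimedia data": [],
        "weather data": [],
    },
}

def expand(menu, path):
    """Return the entries revealed by a five-finger 'open' gesture at `path`."""
    node = menu
    for key in path:
        node = node[key]
    return list(node) if isinstance(node, dict) else node
```

For example, `expand(SPACE_MENU, ["driving data", "vehicle operation data"])` yields the third-level entries, starting with the vehicle speed.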
According to one embodiment of the invention, with the index finger extended and the remaining four fingers clenched, the intersection point of the extension direction of the index finger with the display content is calculated and highlighted on the head-up display. In this embodiment, the occupant's index finger can be used like a laser pointer; this gesture may be referred to as a pointing gesture. The extension line of the index finger points at the part of the head-up display that is to be operated. Because the intersection point is highlighted on the head-up display, its position can be adjusted intuitively, so that the operation can be performed more precisely.
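Computing the intersection of the index finger's extension line with the display content amounts to a standard ray-plane intersection. The following sketch assumes the display is approximated by a plane given by a point and a normal vector; the coordinate convention and function names are illustrative:

```python
def finger_display_intersection(finger_base, finger_tip,
                                plane_point, plane_normal, eps=1e-9):
    """Return the intersection of the finger ray with the display plane,
    or None if the finger points away from (or parallel to) the plane."""
    # Ray direction from the finger base through the fingertip
    d = [t - b for t, b in zip(finger_tip, finger_base)]
    denom = sum(n * di for n, di in zip(plane_normal, d))
    if abs(denom) < eps:
        return None  # finger parallel to the display plane
    # Ray parameter t such that base + t*d lies on the plane
    t = sum(n * (p - b)
            for n, p, b in zip(plane_normal, plane_point, finger_base)) / denom
    if t < 0:
        return None  # intersection lies behind the hand
    return [b + t * di for b, di in zip(finger_base, d)]
```

The returned point would then be mapped into display coordinates and highlighted; in practice the finger base and tip would come from the hand landmarks produced by the sensor and gesture recognition devices.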
It should be noted that the present invention is not limited to the above-mentioned gestures, and other gestures, voice commands, expression commands, and physical operating devices, which are not mentioned above, may be used to implement operations on the display contents of the head-up display.
Another aspect of the invention relates to an operating system for a vehicle, the operating system comprising:
a data processing device configured to provide a multi-layer array arrangement of all display elements in the display content, wherein one or more display elements each constitute one layer of the multi-layer array, the plurality of layers being arranged along a first direction, in case the layer comprises a plurality of display elements, the plurality of display elements being at least partially arranged along a second direction orthogonal to the first direction, and one or more of the remaining display elements being in turn arranged along a third direction orthogonal to the first and second directions with respect to at least one of the display elements arranged along the second direction;
a head-up display for projecting the display element at least partially onto a vehicle windscreen, and mutually orthogonal second and third directions extending within a substantial display plane of the head-up display;
a sensor device for capturing the hand movements of occupants in the vehicle;
gesture recognition means for recognizing a gesture from the acquired hand motion;
wherein the data processing device switches between layers and/or between display elements within a layer according to a mobility gesture characterizing directional movement in space.
The operating system according to the invention can be designed in particular for carrying out the method for operating the display content of a head-up display of a vehicle as described above.
According to one aspect of the invention, the sensor device is an RGB camera, a structured light camera, a binocular camera, a time-of-flight camera or a depth camera, in particular an RGB depth camera.
Another aspect of the invention relates to a vehicle comprising an operating system according to the invention.
It should be understood that the term "vehicle" or "vehicular" or other similar terms as used herein generally includes motor vehicles such as sport utility vehicles (SUVs), buses, vans, commercial vehicles and passenger vehicles, as well as watercraft including a variety of boats and ships, aircraft and the like, and includes hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles and other alternative-fuel vehicles (i.e. fuels derived from sources other than petroleum). As referred to herein, a hybrid vehicle is a vehicle having two or more power sources, for example a vehicle that is both gasoline-powered and electric-powered.
In accordance with the present invention, the "operating system for a vehicle" may be integrated with an in-vehicle system, an in-vehicle information system, a telematics service, an entertainment system and/or a navigation system. In recent years, wireless communication connections such as Bluetooth and WiFi have been applied to in-vehicle systems; it is therefore possible to transmit data to the operating system for the vehicle even when a mobile phone remains in, for example, a bag or a pocket, such as to make calls or to use or play applications and audio or video content stored in advance on a smartphone.
Although the present invention has been described with reference to the preferred embodiments, it is not intended to limit the present invention, and any person skilled in the art can make possible variations and modifications of the present invention using the methods and technical contents disclosed above without departing from the spirit and scope of the present invention, and therefore, any simple modification, equivalent change and modification made to the above embodiments according to the technical essence of the present invention fall within the scope of protection of the present invention.
Drawings
Fig. 1 shows a schematic perspective view of an arrangement of a multi-layer array of display elements in a display content according to the invention.
Fig. 2 shows a schematic representation of a layer of a plurality of display elements according to the invention.
Fig. 3 shows a scenario in which an occupant in the vehicle uses the method according to the invention and/or the operating system according to the invention in the vehicle interior.
Fig. 4 shows an exemplary display situation on a head-up display.
Fig. 5a to 5f each show exemplary mobility gestures.
Fig. 6a to 6c each show an exemplary manipulative gesture.
FIG. 7 illustrates a schematic diagram of a point gesture.
Fig. 8 shows a flow chart of a method according to the invention for operating the display content of a head-up display of a vehicle.
Fig. 9 shows a schematic view of an operating system for a vehicle according to the invention.
Detailed Description
Fig. 1 shows a schematic perspective view of an arrangement of a multi-layer array of display elements in a display content according to the invention. In fig. 1, three layers of nine display elements each are shown: the display elements indicated by solid lines constitute a first layer 1, those indicated by dotted lines a second layer 2, and those indicated by dash-dotted lines a third layer 3. The first layer 1, the second layer 2 and the third layer 3 are arranged along a first direction x. For the sake of clarity, fig. 1 shows only the case in which each layer consists of nine display elements. It is conceivable, however, that the number of layers is not limited to three but may be two or more; that each layer consists of one or any number of display elements; and that the individual display elements in the various layers are arranged differently from the manner shown in fig. 1. In the case shown in fig. 1, taking the first layer 1 as an example, the nine display elements are arranged in rows and columns in a 3 × 3 matrix. The three display elements in each row are arranged along a second direction y orthogonal to the first direction x, and the three display elements in each column are arranged along a third direction z orthogonal to the first direction x and the second direction y.
The display elements shown in fig. 1 can be displayed at least partially, preferably completely, on the head-up display. The plane spanned by the mutually orthogonal second direction y and third direction z may correspond approximately to the display plane of the head-up display. An arrangement of display elements like that of fig. 1 can thus be viewed by an occupant of the vehicle on the head-up display. In this way, the vehicle occupant intuitively understands the distribution of the display elements that can be manipulated on the head-up display, so that switching between the layers and/or between the display elements within one layer along the three illustrated directions by gestures is easy to imagine without an additional learning process. For example, the occupant brings the next layer closer, i.e. switches the next layer to become the current layer, by the recall gesture. Preferably, the layers are arranged cyclically: the recall gesture moves the current layer to the back and switches the next layer to become the current layer, which facilitates operation by the vehicle occupant.
Particularly preferably, the display elements arranged in the multi-layer array form, as a whole, a spatial menu of at least three levels, each of the directions constituting one level of the spatial menu. As shown in fig. 1, the spatial menu is divided into at least three levels, and each direction (the first direction x, the second direction y and the third direction z) constitutes one level. For example, in the first level, a first layer 1 showing driving data, a second layer 2 showing navigation data and a third layer 3 showing convenience data are arranged along the first direction. In the second level, for example, vehicle operating data, driving data and vehicle environment data can be arranged from left to right along the second direction y in the middle row of the first layer 1 showing driving data. In the third level, the vehicle speed and the engine speed may be displayed in the display elements above and below the vehicle operating data on the left side, along the third direction z. The spatial menu of the invention thus presents multi-level information to the vehicle occupant in an orderly fashion. Moreover, the occupant can operate the spatial menu intuitively and efficiently by means of the mobility gestures according to the invention and conveniently find the required information.
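The cyclic layer arrangement and the spatial menu described above can be sketched as a small data structure. This is an illustrative sketch only; the names `SpatialMenu`, `recall` and `push_out` are assumptions for this example, not identifiers from the patent.

```python
from collections import deque

class SpatialMenu:
    """Illustrative sketch: layers are stacked along the first direction x
    and cycle when the occupant recalls or pushes them out."""

    def __init__(self, layers):
        # each entry represents one layer, e.g. a label or a 2D grid of elements
        self._layers = deque(layers)

    @property
    def current_layer(self):
        return self._layers[0]

    def recall(self):
        # recall gesture: the current layer moves to the back of the cycle
        # and the next layer becomes the current layer
        self._layers.rotate(-1)

    def push_out(self):
        # push-out gesture: the previously last layer returns to the front
        self._layers.rotate(1)
```

For instance, with the layers of fig. 1 as `["driving data", "navigation data", "convenience data"]`, one recall makes the navigation layer current, and because the arrangement is cyclic, three recalls return to the driving-data layer.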
Fig. 2 shows a schematic representation of a layer of a plurality of display elements according to the invention. There are a total of 17 display elements in this layer. Some of them, a1, a2, a3, a4, a5, a6, a7, are aligned along the second direction y, and one or more of the remaining display elements may be aligned along the third direction z with respect to at least one of the display elements aligned along the second direction y. Specifically, one display element b1 is located above display element a2 along the third direction z. Four display elements c1, c2, c3, c4 are located above and below display element a4 along the third direction z. Two display elements d1, d2 are located only above display element a6 along the third direction z. Two display elements e1, e2 are located only below display element a7 along the third direction z. For the sake of clarity, fig. 2 shows only one exemplary arrangement of one layer. It is also conceivable to use the z-axis direction as the second direction and the y-axis direction as the third direction. Furthermore, the layer shown in fig. 2 may constitute any one of the three layers in fig. 1, and the other layers in fig. 1 may likewise be arranged as in fig. 2, without limitation as to structure, orientation and number of elements.
Fig. 3 shows a scenario in which an occupant uses the method according to the invention and/or the operating system according to the invention in the vehicle interior. The vehicle occupant 4 (here, by way of example, a driver located at the steering wheel 7) is inside the vehicle. The head-up display 6 of the vehicle shows a display content in which the individual display elements are arranged in a multi-layer array according to the invention. For the sake of simplicity, only two layers are schematically illustrated here, each comprising four display elements. According to the invention, the occupant 4 can operate the display elements and/or layers with gestures of the hand 5. In order to detect the gestures of the hand 5 of the vehicle occupant 4, a sensor device 8 is provided on the vehicle roof. According to the invention, the gestures of the vehicle occupant 4 are acquired and recognized, and switching between layers and/or between display elements within a layer is performed according to mobility gestures that characterize directional movement in space. The mobility gestures include at least swipe left, swipe right, swipe up, swipe down, recall and push out; they are further described below.
Fig. 4 shows an exemplary display situation on the head-up display. Here too, the display elements indicated by solid lines constitute the first layer 1, those indicated by broken lines the second layer 2, and those indicated by dash-dotted lines the third layer 3. Unlike the illustrations of figs. 1 to 3, only one display element of each layer is displayed on the head-up display for the vehicle occupant 4 to view and read; the remaining display elements of each layer remain hidden. With such a display, the vehicle occupant easily understands that switching between the layers can be performed by the recall and push-out gestures. To optimize the visual effect, even only the single shown display element of the first layer 1 may be displayed.
Furthermore, the display element indicated by the solid line in fig. 4 may be defined as the active display element and displayed in the foreground, i.e. frontmost and unoccluded. Although the other display elements are not shown, directional arrows as in fig. 1 can prompt the vehicle occupant 4 that the display elements can be switched by swiping left, right, up or down. The occupant 4 may also hide, pop up, confirm, reject and resize the active display element via manipulative gestures, which are likewise described in detail below.
Figs. 5a to 5f each show a mobility gesture; the mobility gestures include at least swipe left, swipe right, swipe up, swipe down, recall and push out. Fig. 5a shows the swipe-left gesture according to the invention, in which the occupant sweeps the hand from right to left with the palm upright facing left and the fingertips pointing forward. The swipe-left gesture may be used to switch the current (or active) display element to the display element originally on the right. Fig. 5b shows the swipe-right gesture, in which the occupant sweeps the hand from left to right with the palm upright facing right and the fingertips pointing forward. The swipe-right gesture may be used to switch to the display element originally on the left. Fig. 5c shows the swipe-up gesture, in which the occupant sweeps the hand from bottom to top with the palm horizontal facing up and the fingertips pointing forward. The swipe-up gesture may be used to switch to the display element originally below. Fig. 5d shows the swipe-down gesture, in which the occupant sweeps the hand from top to bottom with the palm horizontal facing down and the fingertips pointing forward. The swipe-down gesture may be used to switch to the display element originally above. Fig. 5e shows the recall gesture, in which the occupant sweeps the hand from front to back with the palm upright facing backward and the fingertips pointing upward. The recall gesture may be used to switch the current layer to the originally next layer. Fig. 5f shows the push-out gesture, in which the occupant sweeps the hand from back to front with the palm upright facing forward and the fingertips pointing upward. The push-out gesture may be used to switch the current layer to the originally previous layer.
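The swipe semantics of figs. 5a to 5d — a swipe brings in the neighboring element from the opposite side — can be sketched as cursor arithmetic on the row/column grid of the current layer. The gesture labels and the cyclic wrap-around at the edges are illustrative assumptions, not mandated by the patent.

```python
# hypothetical gesture labels; (row, col) indexes the current layer's grid,
# with rows counted downward and columns counted to the right
SWIPE_MOVES = {
    "swipe_left":  (0, +1),  # fig. 5a: bring in the element originally on the right
    "swipe_right": (0, -1),  # fig. 5b: bring in the element originally on the left
    "swipe_up":    (+1, 0),  # fig. 5c: bring in the element originally below
    "swipe_down":  (-1, 0),  # fig. 5d: bring in the element originally above
}

def apply_swipe(position, gesture, rows, cols):
    """Return the new active-element position after a swipe,
    wrapping cyclically at the edges of the layer."""
    dr, dc = SWIPE_MOVES[gesture]
    r, c = position
    return ((r + dr) % rows, (c + dc) % cols)
```

On the 3 × 3 layer of fig. 1, a swipe left from the center cell `(1, 1)` activates the cell to its right, `(1, 2)`, and a further swipe left wraps around to `(1, 0)`.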
Figs. 6a to 6c each show an exemplary manipulative gesture. Fig. 6a shows a gesture in which four fingers are curled into a fist and the thumb is extended upward; with this gesture the current selection can be confirmed. Fig. 6b shows a fist-making gesture, by which the current option can be rejected. Fig. 6c shows a gesture in which the distance between index finger and thumb is enlarged, by which the active display element is enlarged; conversely, the active display element may be reduced by a gesture in which the distance between index finger and thumb is shortened. Although not shown in figs. 6a to 6c, the manipulative gestures further include at least a press-down gesture and a lift-up gesture. The press-down gesture may be defined as the palm facing down with the fingertips pointing toward the side window of the vehicle, the palm moving from top to bottom; with this gesture the display element is hidden. The lift-up gesture may be defined as the palm facing up with the fingertips pointing toward the side window of the vehicle, the palm moving from bottom to top; with this gesture the display element is popped up.
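The manipulative gestures can be summarized as a small dispatch over the state of the active display element. Everything here — the gesture labels, the state dictionary and the 1.25 scale step — is an illustrative assumption, not the patent's implementation.

```python
def handle_manipulative(gesture, element):
    """Sketch: apply one manipulative gesture (figs. 6a-6c plus the
    press-down and lift-up gestures) to the active display element,
    represented as a plain dict for illustration."""
    if gesture == "thumb_up":         # fig. 6a: confirm the current selection
        element["confirmed"] = True
    elif gesture == "fist":           # fig. 6b: reject the current option
        element["confirmed"] = False
    elif gesture == "pinch_open":     # fig. 6c: enlarge the element
        element["scale"] *= 1.25
    elif gesture == "pinch_close":    # reverse of fig. 6c: reduce the element
        element["scale"] /= 1.25
    elif gesture == "press_down":     # hide the element
        element["visible"] = False
    elif gesture == "lift_up":        # pop the element up again
        element["visible"] = True
    return element
```

A pinch-open followed by a pinch-close thus returns the element to its original size, mirroring the symmetry of the two resize gestures described above.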
Fig. 7 shows a schematic diagram of a pointing gesture. When the index finger of the hand 5 of the vehicle occupant 4 is extended and the remaining four fingers are clenched, the intersection of the extension direction of the index finger with the display content can be calculated and highlighted on the head-up display. As shown in fig. 7, the direction in which the index finger of the vehicle occupant 4 points is shown by a broken line; this broken line intersects the display content, here the display element 9, and the intersection point is highlighted by a circle 10. With this embodiment, the index finger of the occupant can be used like a laser pointer: its extension line points at the portion of the head-up display to be operated, and since the intersection point is highlighted on the head-up display, its position can be adjusted intuitively, so that operations are performed more precisely.
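The intersection of the index-finger direction with the display content amounts to a simple ray-plane intersection. As a sketch only — not the patent's implementation — assume vehicle coordinates in which the display plane lies at a fixed coordinate along the first direction x and is spanned by the second direction y and third direction z:

```python
def finger_ray_hit(origin, direction, plane_x):
    """Intersect the index-finger ray with the display plane x = plane_x.

    origin, direction: 3-tuples (x, y, z) for the fingertip position and
    pointing direction. Returns the (y, z) hit point on the display plane,
    or None if the finger does not point toward the plane."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if abs(dx) < 1e-9:
        return None  # ray runs parallel to the display plane
    t = (plane_x - ox) / dx
    if t <= 0:
        return None  # the plane lies behind the pointing direction
    # parametric point on the ray, projected into display coordinates
    return (oy + t * dy, oz + t * dz)
```

The returned (y, z) pair is where the circle 10 would be drawn; comparing it against the extents of each display element identifies the element 9 being pointed at.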
Fig. 8 shows a flow chart of the method according to the invention for operating the display content of a head-up display of a vehicle. After the method is started, a multi-layer array arrangement of all display elements in the display content is first provided in step 101: one or more display elements each form a layer of the multi-layer array, and the layers are arranged along a first direction; where a layer comprises a plurality of display elements, these are at least partially arranged along a second direction orthogonal to the first direction, while one or more of the remaining display elements are in turn arranged, relative to at least one of the display elements arranged along the second direction, along a third direction orthogonal to the first and second directions. Subsequently, in step 102, the display elements are displayed at least partially on the head-up display, the mutually orthogonal second and third directions extending in the approximate display plane of the head-up display. In step 103, gestures of the vehicle occupant are acquired and recognized. In step 105, switching between layers and/or between display elements within a layer is performed according to a mobility gesture that characterizes directional movement in space.
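Steps 103 and 105 of fig. 8 can be glued together as a small dispatch loop over the stream of recognized gestures. The gesture labels and the `menu` interface with its `apply` method are illustrative assumptions for this sketch.

```python
# hypothetical labels for the mobility gestures of figs. 5a-5f
MOBILITY_GESTURES = {"swipe_left", "swipe_right", "swipe_up", "swipe_down",
                     "recall", "push_out"}

def operate_display(recognized_gestures, menu):
    """Sketch of steps 103-105: consume recognized gestures and apply each
    mobility gesture to the menu; other gestures (e.g. manipulative ones)
    are ignored here for brevity. Returns the gestures that were applied."""
    applied = []
    for gesture in recognized_gestures:   # step 103: acquired and recognized
        if gesture in MOBILITY_GESTURES:  # step 105: switch layers/elements
            menu.apply(gesture)
            applied.append(gesture)
    return applied
```

In a real system the gesture stream would come from the roof-mounted sensor device and the gesture recognition device, while `menu` would drive what the head-up display renders.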
Fig. 9 shows an operating system 11 for a vehicle according to the invention. The operating system 11 includes:
a data processing device 12 configured to provide a multi-layer array arrangement of all display elements in the display content, wherein one or more display elements each constitute one layer of the multi-layer array, the plurality of layers being arranged along a first direction x, in case the layer comprises a plurality of display elements, the plurality of display elements being at least partially arranged along a second direction y orthogonal to the first direction x, and one or more of the remaining display elements being in turn arranged along a third direction z orthogonal to the first direction x and the second direction y with respect to at least one of these display elements arranged along the second direction y;
a head-up display 13 for projecting the display elements at least partially onto a vehicle windscreen, and mutually orthogonal second and third directions extending within a substantial display plane of the head-up display;
a sensor device 8 for acquiring the movements of the hand of the vehicle occupant;
a gesture recognition device 14 for recognizing a gesture from the acquired hand motion;
wherein the data processing device 12 switches between layers and/or between display elements within a layer according to a mobility gesture that characterizes a directional movement within the space.
In particular, the sensor device 8 is an RGB camera, a structured light camera, a binocular camera, a time-of-flight camera or a depth camera.
The present invention may also be a computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions thereon for causing a processor to perform various aspects of the invention.
The computer readable storage medium may be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In some embodiments, electronic circuitry, including, for example, programmable logic circuitry, Field Programmable Gate Arrays (FPGAs), or Programmable Logic Arrays (PLAs), may execute computer-readable program instructions to perform aspects of the present invention by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, systems, and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
The invention is not limited to the embodiments shown but comprises or extends to all technical equivalents that may fall within the scope and spirit of the appended claims. The positional references selected in the description, such as, for example, upper, lower, left, right, etc., refer to the direct description and to the illustrated drawings and can be transferred to new positions in the event of a change in position.
The features disclosed in the present document can be essential for the implementation of the embodiments in terms of different embodiments and can be implemented both individually and in any combination.
Although some aspects are described in association with a device, it should be understood that: these aspects are also descriptions of corresponding methods, so that a component of a module or a device of a system can also be understood as a corresponding method step or as a feature of a method step. Similarly, an aspect described in connection with or as a method step is also a description of a corresponding module or detail or feature of a corresponding device.
Thus, a storage medium may be machine-readable or computer-readable. In some embodiments, a computer-readable storage medium comprises a data carrier having executable instructions that can cooperate with a programmable computer system or programmable hardware component such that one of the methods described herein is performed. An embodiment is thus a data carrier, a digital storage medium or a computer-readable storage medium on which a program for carrying out one of the methods described herein is recorded.
Furthermore, another embodiment is a data stream or a sequence of signals that represents a program for carrying out one of the methods described herein. The data stream or signal sequence may, for example, be arranged for transmission via a data communication connection, for instance via the Internet or another network. An embodiment may thus also be a signal sequence representing data suitable for transmission via a network or a data communication connection, wherein the data represent a program.
Although the present invention has been described with reference to the preferred embodiments, it is not intended to limit the present invention, and those skilled in the art can make variations and modifications of the present invention without departing from the spirit and scope of the present invention by using the methods and technical contents disclosed above.

Claims (15)

1. Method for operating the display content of a head-up display of a vehicle, the method comprising:
providing a multi-layer array arrangement of all display elements in the display content, wherein one or more display elements each constitute one layer of the multi-layer array, the plurality of layers being arranged along a first direction, in case the layer comprises a plurality of display elements, the plurality of display elements being at least partially arranged along a second direction orthogonal to the first direction, and one or more of the remaining display elements being in turn arranged along a third direction orthogonal to the first and second directions with respect to at least one of the display elements arranged along the second direction;
displaying the display element at least partially on the head-up display with mutually orthogonal second and third directions extending within a general display plane of the head-up display;
collecting and identifying gestures of passengers in the vehicle;
switching between layers and/or switching between display elements within a layer according to a mobility gesture that characterizes directional movement within a space.
2. The method of claim 1, wherein the mobility gestures include at least swipe left, swipe right, swipe up, swipe down, recall and push out.
3. Method according to claim 1 or 2, characterized in that one of the display elements displayed on the head-up display is taken as active display element and placed in the foreground display; the active display element is hidden, popped up, confirmed, rejected, and resized by manipulative gestures.
4. The method of claim 3,
hiding the display element by a press-down gesture;
popping up the display element by a lift-up gesture;
confirming the current option by a gesture in which four fingers form a fist and the thumb extends upward;
rejecting the current option by a fist-making gesture;
enlarging the active display element by a gesture in which the distance between index finger and thumb is increased;
reducing the active display element by a gesture in which the distance between index finger and thumb is shortened.
5. The method according to any one of the preceding claims, wherein the display elements arranged in the multi-layer array form a spatial menu of at least three levels, each of the directions forming one level of the spatial menu, the next level of the spatial menu being expanded by a gesture of gradually opening five fingers and the current level of the spatial menu being retracted by a gesture of gradually closing five fingers.
6. Method according to one of the preceding claims, characterized in that in the case of an extended index finger and a fist with the remaining four fingers, the intersection of the extension direction of the index finger with the display is calculated and highlighted on the head-up display.
7. An operating system for a vehicle, the operating system comprising:
a data processing device configured to provide a multi-layer array arrangement of all display elements in the display content, wherein one or more display elements each constitute one layer of the multi-layer array, the plurality of layers being arranged along a first direction, in case the layer comprises a plurality of display elements, the plurality of display elements being at least partially arranged along a second direction orthogonal to the first direction, and one or more of the remaining display elements being in turn arranged along a third direction orthogonal to the first and second directions with respect to at least one of the display elements arranged along the second direction;
a head-up display for projecting the display element at least partially onto a vehicle windscreen, and mutually orthogonal second and third directions extending within a substantial display plane of the head-up display;
the sensing device is used for acquiring the actions of the hands of passengers in the vehicle;
gesture recognition means for recognizing a gesture from the acquired hand motion;
the data processing device switches among layers and/or switches among display elements in one layer according to the mobility gesture for representing the direction movement in the space.
8. The operating system of claim 7, wherein the mobility gestures include at least swipe left, swipe right, swipe up, swipe down, recall and push out.
9. Operating system according to claim 7 or 8, characterized in that the data processing means take one of the display elements displayed on the head-up display as an active display element and place it in the foreground display, which active display element is hidden, popped up, confirmed, rejected and resized by means of manipulative gestures.
10. The operating system of claim 9,
hiding the display element by a press-down gesture;
popping up the display element by a lift-up gesture;
confirming the current option by a gesture in which four fingers form a fist and the thumb extends upward;
rejecting the current option by a fist-making gesture;
enlarging the active display element by a gesture in which the distance between index finger and thumb is increased;
reducing the active display element by a gesture in which the distance between index finger and thumb is shortened.
11. The operating system according to any one of claims 7 to 10, wherein the display elements arranged in the multi-layer array form a spatial menu of at least three levels, each orthogonal direction forming one level of the spatial menu, the next level of the spatial menu being expanded by a gesture of gradually opening five fingers and the current level of the spatial menu being retracted by a gesture of gradually closing five fingers.
12. Operating system according to one of claims 7 to 11, characterized in that, in the case of a sensor device which detects an extension of the index finger and a fist with the remaining four fingers, the data processing device calculates the intersection of the extension direction of the index finger with the display and highlights this intersection on the head-up display.
13. Operating system according to one of claims 7 to 12, characterised in that the sensor device is an RGB camera, a structured light camera, a binocular camera, a time-of-flight camera or a depth camera.
14. Operating system according to one of claims 7 to 13, characterized in that the operating system is configured for carrying out a method according to one of claims 1 to 6.
15. Vehicle comprising an operating system according to one of the claims 7 to 14.
CN201910912290.XA 2019-09-25 2019-09-25 Method for operating display content of head-up display, operating system and vehicle Pending CN112558752A (en)

Publications (1)

Publication Number Publication Date
CN112558752A true CN112558752A (en) 2021-03-26

Family

ID=75029388


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022247123A1 (en) * 2021-05-26 2022-12-01 京东方科技集团股份有限公司 Display device, container system, and method for controlling display device

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103246351A (en) * 2013-05-23 2013-08-14 刘广松 User interaction system and method
EP2661091A1 (en) * 2012-05-04 2013-11-06 Novabase Digital TV Technologies GmbH Controlling a graphical user interface
US20140351770A1 (en) * 2013-05-24 2014-11-27 Atheer, Inc. Method and apparatus for immersive system interfacing
US20150363057A1 (en) * 2013-01-10 2015-12-17 Volkswagen Aktiengesellschaft Method and device for providing a user interface in a vehicle
US20160109952A1 (en) * 2014-10-17 2016-04-21 Top Victory Investments Ltd. Method of Controlling Operating Interface of Display Device by User's Motion
CN105653034A (en) * 2015-12-31 2016-06-08 北京小鸟看看科技有限公司 Content switching method and device achieved in three-dimensional immersive environment
CN106293127A (en) * 2016-08-10 2017-01-04 北京英梅吉科技有限公司 Array input method in three dimensions and system
US20190073040A1 (en) * 2017-09-05 2019-03-07 Future Mobility Corporation Limited Gesture and motion based control of user interfaces
CN110045825A (en) * 2018-03-27 2019-07-23 杭州凌感科技有限公司 Gesture recognition system for vehicle interaction control




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination