CN115328376A - Display interaction method and electronic device - Google Patents
Display interaction method and electronic device
- Publication number
- CN115328376A CN202110440467.8A
- Authority
- CN
- China
- Prior art keywords
- electronic device
- angle
- steering wheel
- gesture
- data
- Prior art date
- Legal status
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the application provide an electronic device and a display interaction method. The electronic device displays a first graphical object combination, where the first graphical object combination comprises a plurality of graphical objects. Each of the plurality of graphical objects in the first graphical object combination is linked to the display screen of the electronic device; that is, a deflection of the display screen by a first angle in a first direction causes each of the plurality of graphical objects in the first graphical object combination to deflect by a second angle in a second direction about a first central axis, where, for example, the second direction is opposite to the first direction and the second angle is the same as the first angle. Each of the plurality of graphical objects displayed by the electronic device is configured to be displayed at a different layer of the display screen. The electronic device detects a gesture it receives and responds to the detected gesture. A steering wheel incorporating the electronic device can provide an intelligent and convenient interactive interface for a vehicle driver.
Description
Technical Field
The embodiments relate generally to display interaction methods and electronic devices, and more particularly to methods and devices for use in vehicle steering wheels, smart cockpits, automotive dashboards, and automotive control panels.
Background
An automobile generally has a separate steering wheel and a smart display screen, and the two are separated by some distance, so it is difficult for a driver to operate the smart display screen while operating the steering wheel to drive. There is a clear need to improve the operating efficiency of both.
Disclosure of Invention
Therefore, more efficient display interaction methods and electronic devices are needed. Various embodiments related to display interaction methods and electronic devices are described herein.
According to some embodiments, an electronic device includes a display unit, a data receiving unit, and a processing unit. The display unit is configured to display a first graphical object combination, wherein the first graphical object combination includes a plurality of graphical objects. The processing unit is coupled to the display unit and the data receiving unit and is configured to: detect the data received by the data receiving unit to obtain a first direction and a first angle; and, in response to the detected first direction and first angle, deflect each of the plurality of graphical objects of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle).
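To make the counter-rotation concrete, a minimal sketch follows; the class, function, and field names are assumptions made for illustration and are not part of the claimed device:

```python
# Sketch of the counter-rotation described above. All names are
# illustrative assumptions; the patent does not prescribe code.
from dataclasses import dataclass

@dataclass
class GraphicalObject:
    name: str
    rotation_deg: float = 0.0  # rotation about the first central axis

def on_deflection(objects: list[GraphicalObject], first_angle_deg: float) -> None:
    """Deflect every object in the combination in the second (opposite)
    direction by the same magnitude, so the objects stay upright while
    the display screen turns."""
    second_angle_deg = -first_angle_deg  # opposite direction, equal angle
    for obj in objects:
        obj.rotation_deg = second_angle_deg

# Display deflected 30 degrees in the first direction: each object is
# deflected 30 degrees the other way about the first central axis.
combo = [GraphicalObject("speedometer"), GraphicalObject("media")]
on_deflection(combo, 30.0)
assert all(obj.rotation_deg == -30.0 for obj in combo)
```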
According to some embodiments, an electronic device includes a display unit, a data receiving unit, and a processing unit, wherein each of a plurality of graphical objects in the display unit is displayed at a configured layer.
According to some embodiments, an electronic device includes a display unit, a data receiving unit, a processing unit, and a touch-sensitive surface unit. The touch-sensitive surface unit is coupled to the processing unit and is configured to receive a gesture. The processing unit is configured to detect the gesture and to respond to the detected gesture.
According to some embodiments, an electronic device includes a display unit, a data receiving unit, a processing unit, a touch-sensitive surface unit, and an interaction data channel. The touch-sensitive surface unit is coupled to the processing unit and is configured to receive a gesture; the processing unit is configured to detect the gesture and to respond to the detected gesture. The interaction data channel is coupled to the processing unit and is configured to: receive image data sent by a first external device, which the electronic device displays as a first graphical object in a first area of the display unit; and send gesture data detected by the electronic device within the first area to the first external device. The interaction data channel may be wireless (e.g., IEEE 802.11a, 802.11b, 802.11g, 802.11n, 802.11ac, and/or 802.11ax) or wired (e.g., USB).
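A hedged sketch of this two-way channel follows; the transport interface, region coordinates, and message format are assumptions for illustration, since the description only fixes what flows in each direction:

```python
# Sketch of the interaction data channel. Transport, region, and message
# formats are assumptions; the channel itself may ride on Wi-Fi or USB.
from dataclasses import dataclass

@dataclass
class GestureEvent:
    x: float
    y: float
    kind: str  # e.g., "tap" or "swipe"

class InteractionDataChannel:
    def __init__(self, display, transport, first_region=(0, 0, 400, 300)):
        self.display = display            # assumed to offer draw_in_region(region, data)
        self.transport = transport        # assumed to offer send(bytes)
        self.first_region = first_region  # (x, y, width, height) of the first area

    def on_image_data(self, image_data: bytes) -> None:
        # Show the external device's image as the first graphical object
        # in the first area of the display unit.
        self.display.draw_in_region(self.first_region, image_data)

    def on_gesture(self, event: GestureEvent) -> None:
        # Only gestures detected within the first area are sent back.
        x0, y0, w, h = self.first_region
        if x0 <= event.x < x0 + w and y0 <= event.y < y0 + h:
            self.transport.send(f"{event.kind}:{event.x},{event.y}".encode())
```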
According to some embodiments, a method of displaying graphical objects and interacting is performed at an electronic device and includes displaying a first graphical object combination on a display unit of the electronic device, wherein the first graphical object combination includes a plurality of graphical objects. The method also includes detecting the data received by the data receiving unit to obtain a first direction and a first angle, and, in response to the detected first direction and first angle, deflecting each of the plurality of graphical objects of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle). Optionally, the method further includes detecting a gesture and responding to the detected gesture. Optionally, the method further includes receiving image data sent by a first external device and displaying the image data as a first graphical object in a first area of the display unit. Optionally, the method further includes sending gesture data detected by the electronic device in the first area to the first external device.
According to some embodiments, a method of displaying graphical objects and interacting is performed at an electronic device and includes: detecting a continuous tap gesture of a first finger; and, in response to detecting the continuous tap gesture of the first finger: setting an operating mode (e.g., a screen locking mode, an unlocking mode) of the electronic device; and, optionally, issuing an alert tone (e.g., a lock screen alert tone, an unlock alert tone).
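A minimal sketch of this behavior follows, assuming a hypothetical tap window, tap count, and tone names (none of which are specified above):

```python
# Sketch of continuous-tap detection and mode switching. The window,
# required tap count, and tone names are assumptions for illustration.
TAP_WINDOW_S = 0.4   # taps closer together than this count as "continuous"
TAPS_REQUIRED = 2

class OperatingModeController:
    def __init__(self):
        self.mode = "unlocked"
        self._tap_times: list[float] = []

    def on_finger_tap(self, timestamp_s: float) -> None:
        # Keep only taps that fall inside the continuous-tap window.
        self._tap_times = [t for t in self._tap_times
                           if timestamp_s - t <= TAP_WINDOW_S]
        self._tap_times.append(timestamp_s)
        if len(self._tap_times) >= TAPS_REQUIRED:
            self._tap_times.clear()
            self.mode = "locked" if self.mode == "unlocked" else "unlocked"
            self.play_alert_tone(self.mode)

    def play_alert_tone(self, mode: str) -> None:
        print(f"alert tone: {mode}")  # placeholder for the optional alert tone
```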
According to some embodiments, a steering wheel includes a steering wheel grip, a rotation angle sensor, a display unit, a data receiving unit, a processing unit, and a touch-sensitive surface unit.
The steering wheel grip is linked with the display unit: rotation of the steering wheel grip drives the display unit to rotate. The rotation angle sensor is coupled to the data receiving unit and is configured to detect the deflection of the steering wheel grip or the display unit.
The display unit is configured to display a first graphical object combination, wherein the first graphical object combination includes a plurality of graphical objects. Optionally, each of the plurality of graphical objects in the display unit is displayed at a configured layer.
The processing unit is coupled to the display unit and the data receiving unit and is configured to: detect the data of the rotation angle sensor received by the data receiving unit to obtain a first direction and a first angle; and, in response to the detected first direction and first angle, deflect each of the plurality of graphical objects of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle).
The touch-sensitive surface unit is coupled to the processing unit and is configured to receive a gesture. The processing unit is configured to detect the gesture and to respond to the detected gesture.
According to some embodiments, a method of displaying graphical objects and interacting is performed at a steering wheel in which the steering wheel grip is linked with the display unit: rotation of the steering wheel grip drives the display unit to rotate, and the rotation angle sensor is coupled to the data receiving unit and is configured to detect the deflection of the steering wheel grip or the display unit. The method includes displaying a first graphical object combination on the display unit, wherein the first graphical object combination includes a plurality of graphical objects; optionally, each of the plurality of graphical objects in the display unit is displayed at a configured layer. The method also includes detecting the data of the rotation angle sensor received by the data receiving unit to obtain a first direction and a first angle, and, in response to the detected first direction and first angle, deflecting each of the plurality of graphical objects of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle). The method also includes receiving a gesture at the touch-sensitive surface unit, detecting the gesture, and responding to the detected gesture.
According to some embodiments, a method of displaying graphical objects and interacting is performed at a steering wheel and includes: displaying a virtual input keyboard on the steering wheel; detecting character data, a password, or a passcode entered by a first finger tapping on the virtual keyboard; and, in response to detecting the character data, password, or passcode entered by the first finger on the virtual keyboard: judging whether the character data, password, or passcode conforms to a first rule; and configuring a state of the steering wheel (e.g., a drivable state, a parked state).
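A short sketch of this flow follows; the description leaves the "first rule" open, so the comparison against a stored passcode below is purely an assumption:

```python
# Sketch of the virtual-keyboard passcode flow. The "first rule" and the
# stored value are assumptions; the states mirror the examples above.
STORED_PASSCODE = "1234"  # hypothetical stored credential

def conforms_to_first_rule(entered: str) -> bool:
    # Assumption: the first rule is a simple match against a stored passcode.
    return entered == STORED_PASSCODE

def on_passcode_entered(entered: str) -> str:
    """Return the configured steering wheel state."""
    return "drivable" if conforms_to_first_rule(entered) else "parked"

assert on_passcode_entered("1234") == "drivable"
assert on_passcode_entered("0000") == "parked"
```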
According to some embodiments, a steering wheel includes a steering wheel grip, a steering wheel grip touch-sensitive surface module, a display unit, a data receiving unit, a processing unit, and a touch-sensitive surface unit.
The steering wheel grip touch-sensitive surface module is coupled to the data receiving unit and is configured to detect a hand swipe gesture on the steering wheel grip.
The display unit is configured to display a first graphical object combination, wherein the first graphical object combination includes a plurality of graphical objects. Optionally, each of the plurality of graphical objects in the display unit is displayed at a configured layer.
The processing unit is coupled to the display unit and the data receiving unit and is configured to: detect the swipe gesture data of the steering wheel grip touch-sensitive surface module received by the data receiving unit to obtain a first direction and a first angle; and, in response to the detected first direction and first angle, deflect each of the plurality of graphical objects of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle).
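The description does not fix how a grip swipe maps to the first direction and first angle; the sketch below assumes a simple linear gain as one possibility:

```python
# Sketch of turning a grip swipe into a (direction, angle) pair. The
# linear gain is an assumption; any monotonic mapping would do.
DEGREES_PER_MM = 0.5  # hypothetical gain from swipe distance to angle

def swipe_to_deflection(start_mm: float, end_mm: float) -> tuple[str, float]:
    """Return (first_direction, first_angle_deg) for a swipe along the grip."""
    delta_mm = end_mm - start_mm
    first_direction = "clockwise" if delta_mm >= 0 else "counterclockwise"
    first_angle_deg = abs(delta_mm) * DEGREES_PER_MM
    return first_direction, first_angle_deg

# A 40 mm swipe maps to a 20 degree deflection in this sketch; the
# graphical objects are then counter-deflected as described earlier.
assert swipe_to_deflection(0.0, 40.0) == ("clockwise", 20.0)
```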
The touch-sensitive surface unit is coupled to the processing unit and is configured to receive a gesture. The processing unit is configured to detect the gesture and to respond to the detected gesture.
According to some embodiments, a method of displaying graphical objects and interacting is performed at a steering wheel in which the steering wheel grip touch-sensitive surface module is coupled to the data receiving unit and is configured to detect a hand swipe gesture on the steering wheel grip. The method includes displaying a first graphical object combination on the display unit, wherein the first graphical object combination includes a plurality of graphical objects; optionally, each of the plurality of graphical objects in the display unit is displayed at a configured layer. The method also includes detecting the swipe gesture data of the steering wheel grip touch-sensitive surface module received by the data receiving unit to obtain a first direction and a first angle, and, in response to the detected first direction and first angle, deflecting each of the plurality of graphical objects of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle). The method also includes receiving a gesture at the touch-sensitive surface unit, detecting the gesture, and responding to the detected gesture.
According to some embodiments, the device has a touch-sensitive display (also referred to as a "touch screen" or "touch screen display") comprising a display unit and a touch-sensitive surface unit. The touch screen displays visual output to a user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively, "graphical objects").
Drawings
For a better understanding of the foregoing embodiments of the invention, as well as additional embodiments thereof, reference should be made to the following detailed description read in conjunction with the accompanying drawings in which like reference numerals refer to corresponding parts throughout the various views. Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. In the drawings:
FIG. 1 illustrates a block diagram of an electronic device having a touch screen in accordance with some embodiments.
FIG. 2 illustrates an electronic device display screen and graphical object display interaction, in accordance with some embodiments.
FIG. 3A illustrates an electronic device display screen and graphical object display interaction, in accordance with some embodiments.
FIG. 3B illustrates an electronic device display screen and graphical object display interaction in accordance with some embodiments.
FIG. 4 illustrates an electronic device display screen and graphical object display interaction, in accordance with some embodiments.
FIG. 5 illustrates a block diagram of an electronic device having a touch screen and an angle sensor, according to some embodiments.
FIG. 6 illustrates a steering wheel display screen and graphical object display interaction, in accordance with some embodiments.
FIG. 7 illustrates a steering wheel display screen and graphical object display interaction, in accordance with some embodiments.
FIG. 8 illustrates a steering wheel display screen and graphical object display interaction in accordance with some embodiments.
FIG. 9 illustrates a vehicle steering apparatus according to some embodiments.
FIG. 10 illustrates a block diagram of an electronic device with a touch screen and a steering wheel grip touch-sensitive surface module, according to some embodiments.
FIG. 11 illustrates a steering wheel display screen and graphical object display interaction, in accordance with some embodiments.
Detailed Description
Representative applications of the methods and apparatus according to the present patent application are described in this section. These examples are provided solely to add context and aid in the understanding of the described embodiments. It will thus be apparent to one skilled in the art that the embodiments may be practiced without some or all of these specific details. In other instances, well known process steps have not been described in detail in order to not unnecessarily obscure the embodiments. Other applications are possible so that the following examples should not be considered limiting.
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific embodiments in accordance with the embodiments. Although these embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, it is understood that these examples are not limiting; such that other embodiments may be used and changes may be made without departing from the spirit and scope of the described embodiments.
To help a person skilled in the art understand the embodiments of the present application in depth, the terms used in the embodiments of the present application are defined first.
FIG. 1 illustrates a block diagram of an electronic device having a touch screen in accordance with some embodiments.
FIG. 2 illustrates an electronic device display screen and graphical object display interaction, in accordance with some embodiments.
In some embodiments, the electronic device 101 includes a display unit 102, a data receiving unit 202, and a processing unit 204. The display unit 102 is configured to display a first combination of graphical objects 106 on the display unit 102, wherein the first combination of graphical objects 106 includes a plurality of graphical objects (e.g., graphical object 108 and graphical object 109). The display unit 102 is configured to display a plurality of graphical objects (e.g., graphical object 110 and graphical object 111) on the display unit 102.
FIG. 3A illustrates an electronic device display screen and graphical object display interaction, in accordance with some embodiments.
In some specific embodiments, the processing unit 204 of the electronic device 101 is coupled to the display unit 102 and the data receiving unit 202 and is configured to: detect data received by the data receiving unit 202 to obtain a first direction and a first angle α; and, in response to the detected first direction and first angle, deflect each of the plurality of graphical objects (e.g., graphical object 108 and graphical object 109) of the first graphical object combination 106 about the first central axis 105 in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle).
FIG. 3B illustrates an electronic device display screen and graphical object display interaction, in accordance with some embodiments.
In some embodiments, each of a plurality of graphical objects in the display unit of the electronic device 101 is displayed at a configured layer, for example graphical object 111 in a first layer, graphical object 109 in a second layer, graphical object 110 in a third layer, and graphical object 108 in a fourth layer.
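For illustration only, this layer assignment can be represented as a draw-order map, assuming that a higher layer number is drawn later and therefore appears on top (the description does not fix a stacking rule):

```python
# Sketch of the configured layers from the example above; assumes that a
# higher layer number is drawn later and therefore appears on top.
layer_of = {"object_111": 1, "object_109": 2, "object_110": 3, "object_108": 4}

for name in sorted(layer_of, key=layer_of.get):
    print(f"draw {name} on layer {layer_of[name]}")
```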
FIG. 5 illustrates a block diagram of an electronic device having a touch screen and an angle sensor, according to some embodiments.
In some embodiments, the electronic device 120 includes a display unit 102, a data receiving unit 202, a processing unit 204, and a touch-sensitive surface unit 206. The touch-sensitive surface unit 206 is coupled to the processing unit 204 and is configured to receive gestures. The processing unit 204 is configured to detect a gesture and to respond to the detected gesture.
In some embodiments, the electronic device 120 includes a display unit 102, a data receiving unit 202, a processing unit 204, a touch-sensitive surface unit 206, and an interaction data channel 203. The touch-sensitive surface unit 206 is coupled to the processing unit 204 and is configured to receive gestures; the processing unit 204 is configured to detect a gesture and to respond to the detected gesture. The interaction data channel 203 is coupled to the processing unit 204 and is configured to: receive image data sent by a first external device 131, which the electronic device 120 displays as a first graphical object in a first region of the display unit 102; and send gesture data detected by the electronic device 120 within the first region to the first external device 131. The interaction data channel 203 may be wireless (e.g., IEEE 802.11a, 802.11b, 802.11g, 802.11n, 802.11ac, and/or 802.11ax) or wired (e.g., USB).
In some embodiments, a method of displaying graphical objects and interacting is performed at an electronic device 120 and includes displaying a first graphical object combination 106 on a display unit of the electronic device 120, wherein the first graphical object combination 106 includes a plurality of graphical objects. The method further includes detecting the data received by the data receiving unit 202 to obtain a first direction and a first angle, and, in response to the detected first direction and first angle, deflecting each of the plurality of graphical objects of the first graphical object combination 106 about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle). Optionally, the method further includes detecting a gesture and responding to the detected gesture. Optionally, the method further includes receiving image data sent by the first external device 131 and displaying the image data as a first graphical object in the first area of the display unit 102. Optionally, the method further includes sending gesture data detected by the electronic device 120 in the first area to the first external device 131.
In some embodiments, the touch screen 207 includes a display unit 102 and a touch-sensitive surface unit 206.
In some embodiments, a method of displaying graphical objects and interacting is performed at the electronic device 120 and includes: detecting a continuous tap gesture of a first finger; and, in response to detecting the continuous tap gesture of the first finger: setting an operating mode (e.g., a screen locking mode, an unlocking mode) of the electronic device 120; and, optionally, issuing an alert tone (e.g., a lock screen alert tone, an unlock alert tone).
FIG. 6 illustrates a steering wheel display screen and graphical object display interaction, in accordance with some embodiments.
FIG. 7 illustrates a steering wheel display screen and graphical object display interaction in accordance with some embodiments.
In some embodiments, the steering wheel 121 includes a steering wheel grip 104, a rotation angle sensor 209, a touch screen 207, a data receiving unit 202, a processing unit 204, and a touch-sensitive surface unit 206, wherein the touch screen 207 includes the display unit 102 and the touch-sensitive surface unit 206. The steering wheel grip 104 and the touch screen 207 are linked: rotation of the steering wheel grip 104 drives the touch screen 207 to rotate, and the rotation angle sensor 209 is coupled to the data receiving unit 202 and is configured to detect the deflection of the steering wheel grip 104 or the touch screen 207. The touch screen 207 is configured to display a first graphical object combination, wherein the first graphical object combination includes a plurality of graphical objects (e.g., graphical object 112, graphical object 113, graphical object 114). Optionally, each of a plurality of graphical objects (e.g., graphical object 112, graphical object 113, graphical object 114, graphical object 115, graphical object 116) in the touch screen 207 is displayed at a configured layer. The processing unit 204 is coupled to the touch screen 207 and the data receiving unit 202 and is configured to: detect the data of the rotation angle sensor 209 received by the data receiving unit 202 to obtain a first direction and a first angle; and, in response to the detected first direction and first angle, deflect each of the plurality of graphical objects (e.g., graphical object 112, graphical object 113, graphical object 114) of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle). The touch screen 207 is configured to receive gestures, and the processing unit 204 is configured to detect a gesture and to respond to the detected gesture.
In some embodiments, the steering wheel 121 further includes an interaction data channel 203. The touch-sensitive surface unit 206 is coupled to the processing unit 204 and is configured to receive gestures; the processing unit 204 is configured to detect a gesture and to respond to the detected gesture. The interaction data channel 203 is coupled to the processing unit 204 and is configured to: receive image data sent by a first external device 131, which the electronic device 120 displays as a first graphical object 112 in a first region of the display unit 102; and send gesture data detected by the electronic device 120 within the first region to the first external device 131. The interaction data channel 203 may be wireless (e.g., IEEE 802.11a, 802.11b, 802.11g, 802.11n, 802.11ac, and/or 802.11ax) or wired (e.g., USB).
In some embodiments, a method of displaying graphical objects and interacting is performed at the steering wheel 121 and includes: displaying a virtual input keyboard on the steering wheel; detecting character data, a password, or a passcode entered by a first finger tapping on the virtual keyboard; and, in response to detecting the character data, password, or passcode entered by the first finger on the virtual keyboard: judging whether the character data, password, or passcode conforms to a first rule; and configuring a state of the steering wheel (e.g., a drivable state, a parked state).
FIG. 8 illustrates a steering wheel display screen and graphical object display interaction in accordance with some embodiments.
In some embodiments, the graphical object 115 and the graphical object 116 are configured to join the first graphical object combination.
FIG. 9 illustrates a vehicle steering apparatus according to some embodiments.
In some embodiments, the steering wheel 121 is coupled to the vehicle 301, and a user of the vehicle 301 can control the vehicle 301 by operating the touch screen 207 of the steering wheel 121, for example, to navigate, operate a cell phone, view vehicle conditions, sound the horn, control the lights, the locks, the windows, and the wipers, play music, play the radio, and make and receive calls.
FIG. 10 illustrates a block diagram of an electronic device with a touch screen and a steering wheel grip touch-sensitive surface module, according to some embodiments.
FIG. 11 illustrates a steering wheel display screen and graphical object display interaction in accordance with some embodiments.
In some embodiments, the steering wheel 123 includes a steering wheel grip 104, a steering wheel grip touch-sensitive surface module 210, a display unit 102, a data receiving unit 202, a processing unit 204, and a touch-sensitive surface unit 206. The steering wheel grip touch-sensitive surface module 210 is coupled to the data receiving unit 202 of the steering wheel 123 and is configured to detect a swipe gesture of a hand on the steering wheel grip 104. The display unit 102 is configured to display a first graphical object combination, wherein the first graphical object combination includes a plurality of graphical objects. Optionally, each of the plurality of graphical objects (e.g., graphical object 112, graphical object 113, graphical object 114, graphical object 115, graphical object 116) in the display unit 102 is displayed at a configured layer. The processing unit 204 is coupled to the display unit 102 and the data receiving unit 202 and is configured to: detect the swipe gesture data of the steering wheel grip touch-sensitive surface module 210 received by the data receiving unit 202 to obtain a first direction and a first angle; and, in response to the detected first direction and first angle, deflect each of the plurality of graphical objects (e.g., graphical object 112, graphical object 113, graphical object 114) of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle). The touch-sensitive surface unit 206 is coupled to the processing unit and is configured to receive gestures. The processing unit 204 is configured to detect a gesture and to respond to the detected gesture.
Although certain examples have been illustrated and described for purposes of description, a wide variety of alternate and/or equivalent implementations may be substituted to achieve the same objectives without departing from the scope of the present application. This application is intended to cover any adaptations or variations of the embodiments discussed herein. Therefore, it is manifestly intended that the embodiments described herein be limited only by the claims and the equivalents thereof.
Claims (10)
1. An electronic device, comprising:
a display unit configured to display, on the display unit:
a first graphical object combination, wherein:
the first graphical object combination comprises a plurality of graphical objects;
a data receiving unit; and
a processing unit coupled to the display unit and the data receiving unit, the processing unit configured to:
detect the data received by the data receiving unit to obtain a first direction and a first angle; and
in response to the detected first direction and first angle:
deflect each of the plurality of graphical objects of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle).
2. The electronic device of claim 1, wherein each of a plurality of graphical objects in the display unit is displayed at a configured layer.
3. The electronic device of any of claims 1-2, further comprising:
a touch-sensitive surface unit coupled to the processing unit and configured to receive a gesture; and
the processing unit is configured to:
detect a gesture; and
respond to the detected gesture.
4. The electronic device of any of claims 1-2, further comprising:
a touch-sensitive surface unit coupled to the processing unit and configured to receive a gesture; and
the processing unit is configured to:
detect a gesture; and
respond to the detected gesture;
an interaction data channel coupled to the processing unit configured to:
receive image data sent by a first external device, wherein the electronic device displays the image data as a first graphical object in a first area of the display unit; and
send gesture data detected by the electronic device in the first area to the first external device; and
the interaction data channel may be wireless or wired.
5. A method for displaying graphical objects and interacting at an electronic device, comprising:
displaying a first graphical object combination on a display unit of the electronic device, wherein the first graphical object combination comprises a plurality of graphical objects; and
detecting the data received by the data receiving unit to obtain a first direction and a first angle;
in response to the detected first direction and first angle, deflecting each of the plurality of graphical objects of the first graphical object combination about a first central axis in a second direction by a second angle (e.g., the second direction is opposite the first direction and the second angle is the same as the first angle); and
optionally, detecting a gesture and responding to the detected gesture; and
optionally, receiving image data sent by a first external device, and displaying the image data as a first graphical object in a first area of the display unit; and
optionally, sending the gesture data detected by the electronic device in the first area to the first external device.
6. The method of claim 5, further comprising:
detecting a continuous tap gesture of a first finger; and
in response to detecting the continuous tap gesture of the first finger:
setting an operating mode (e.g., a screen locking mode, an unlocking mode) of the electronic device; and
optionally, an alert tone (e.g., lock screen alert tone, unlock alert tone) is issued.
7. A steering wheel, comprising:
the electronic device of any one of claims 1-4; and
a steering wheel grip to which the electronic device is attached; and
a rotation angle sensor coupled to a data receiving unit of the electronic device and configured to detect deflection of the steering wheel grip or a display unit of the electronic device; and
the steering wheel grip is linked with a display unit of the electronic device; and
rotation of the steering wheel grip drives the display unit of the electronic device to rotate; and
wherein the processing unit of the electronic device detects the data of the rotation angle sensor received by the data receiving unit of the electronic device to obtain a first direction and a first angle.
8. A method for displaying graphical objects and interacting at a steering wheel, comprising:
the method of any one of claims 5-6; and
a steering wheel grip attached to the electronic device; and
a rotation angle sensor coupled to a data receiving unit of the electronic device and configured to detect deflection of the steering wheel grip or a display unit of the electronic device; and
the steering wheel grip is linked with a display unit of the electronic device; and
rotation of the steering wheel grip drives the display unit of the electronic device to rotate; and
wherein the processing unit of the electronic device detects the data of the rotation angle sensor received by the data receiving unit of the electronic device to obtain a first direction and a first angle.
9. The method of claim 8, further comprising:
displaying a virtual input keyboard; and
detecting character data entered by a first finger tapping on the virtual keyboard; and
in response to detecting the character data entered by the first finger on the virtual keyboard:
judging whether the character data conforms to a first rule; and
configuring a state (e.g., a drivable state, a parked state) of the steering wheel.
10. A steering wheel, comprising:
the electronic device of any one of claims 1-4; and
a steering wheel grip; and
a steering wheel grip touch-sensitive surface module coupled to a data receiving unit of the electronic device and configured to detect a swipe gesture of a hand on the steering wheel grip; and
wherein the processing unit of the electronic device detects the swipe gesture data received by the data receiving unit of the electronic device to obtain a first direction and a first angle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110440467.8A CN115328376A (en) | 2021-04-23 | 2021-04-23 | Display interaction method and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110440467.8A CN115328376A (en) | 2021-04-23 | 2021-04-23 | Display interaction method and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115328376A true CN115328376A (en) | 2022-11-11 |
Family
ID=83912790
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110440467.8A Pending CN115328376A (en) | 2021-04-23 | 2021-04-23 | Display interaction method and electronic device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115328376A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130024071A1 (en) * | 2011-07-22 | 2013-01-24 | Clas Sivertsen | Steering Wheel Input Device Having Gesture Recognition and Angle Compensation Capabilities |
FR2978942A1 (en) * | 2011-08-08 | 2013-02-15 | Optopartner | Display device for e.g. displaying information in vehicle, has display screen arranged in/on rotating unit, where display screen is arranged to perform display in predetermined direction regardless of angle of rotation of rotating unit |
CN104854553A (en) * | 2012-11-27 | 2015-08-19 | 内奥诺德公司 | Light-based touch controls on a steering wheel and dashboard |
CN109131529A (en) * | 2017-09-05 | 2019-01-04 | 南京知行新能源汽车技术开发有限公司 | Transfer, steering column, system and vehicle for vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||