CN112445323B - Data processing method, device, equipment and machine-readable medium - Google Patents
Data processing method, device, equipment and machine-readable medium
- Publication number
- CN112445323B (application CN201910810762.0A)
- Authority
- CN
- China
- Prior art keywords
- display layer
- interface
- interface element
- movement
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Abstract
The embodiment of the application provides a data processing method, a device, equipment and a machine-readable medium, wherein the method comprises the following steps: determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying static interface elements; the second display layer is used for displaying interface elements in a motion state; and controlling the target interface element to move between the first display layer and the second display layer according to the first movement information of the hand of the user in the three-dimensional space. According to the embodiments of the application, the memory cost and the learning cost of gesture interaction can be reduced, and the user experience is improved.
Description
Technical Field
The present application relates to the field of computer technology, and in particular, to a data processing method, a data processing apparatus, a device, and a machine readable medium.
Background
With the development of society, science and technology, gesture interaction, as a contactless man-machine interaction mode, is gradually being applied to fields such as man-machine interaction games and computer control, owing to its realistic sense of experience and flexible ease of use.
A gesture interaction scheme may provide N gestures for a user, where different gestures represent different functions, so that the user realizes different functions through different gestures. However, in practical applications, the value of N generally matches the number of functions and is generally greater than 5, which increases the memory cost and learning cost of the gestures.
Disclosure of Invention
The technical problem to be solved by the embodiment of the application is to provide a data processing method, which can reduce the memory cost and the learning cost of gesture interaction and improve the user experience.
Correspondingly, the embodiment of the application also provides a data processing device, a device and a machine-readable medium, which are used for guaranteeing the implementation and application of the method.
In order to solve the above problems, an embodiment of the present application discloses a data processing method, including:
determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying static interface elements; the second display layer is used for displaying interface elements in a motion state;
and controlling the target interface element to move between the first display layer and the second display layer according to the first movement information of the hand of the user in the three-dimensional space.
On the other hand, the embodiment of the application also discloses a data processing method, which comprises the following steps:
determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying unselected interface elements; the second display layer is used for displaying the selected interface element;
and controlling the target interface element to move between the first display layer and the second display layer according to the first movement information of the hand of the user in the three-dimensional space.
On the other hand, the embodiment of the application also discloses a data processing device, which comprises:
the determining module is used for determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying static interface elements; the second display layer is used for displaying interface elements in a motion state; and
and the control module is used for controlling the target interface element to move between the first display layer and the second display layer according to the first movement information of the hand of the user in the three-dimensional space.
On the other hand, the embodiment of the application also discloses a data processing device, which comprises:
The determining module is used for determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying unselected interface elements; the second display layer is used for displaying the selected interface element; and
and the control module is used for controlling the target interface element to move between the first display layer and the second display layer according to the first movement information of the hand of the user in the three-dimensional space.
In yet another aspect, an embodiment of the present application further discloses an apparatus, including:
one or more processors; and
one or more machine-readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform one or more of the methods described previously.
In yet another aspect, embodiments of the present application disclose one or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause an apparatus to perform one or more of the methods described previously.
Embodiments of the present application include the following advantages:
in this embodiment, the interface is divided into a plurality of display layers for stacked display, where the plurality of display layers may include: a first display layer and a second display layer; the first display layer is used for displaying static interface elements, and the second display layer is used for displaying interface elements in a motion state.
According to the first motion information of the hand of the user in the three-dimensional space, the embodiment of the application controls the target interface element to move between the first display layer and the second display layer. In this way, the interface element in the motion state can be displayed in a vertically stacked manner with the static interface element, so that a three-dimensional effect realized by the two-dimensional display device can be presented, and the target interface element moves between the first display layer and the second display layer along with the hand of the user; therefore, the function corresponding to the target interface element can be realized through the three-dimensional effect of the interface. For example, the method can be used for adjusting the position of the target interface element in the interface, and for example, a function corresponding to the target interface element can be triggered. The embodiment of the application can support the user to realize the function corresponding to the target interface element through the movement of the user hand, so that the user can execute the movement of the user hand according to the actual function requirement, the memory cost and the learning cost of gesture interaction can be reduced, and the user experience is improved.
Drawings
FIG. 1 is a schematic representation of a three-dimensional coordinate system according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating steps of a first embodiment of a data processing method according to the present application;
FIG. 3 is a schematic illustration of an interface of an embodiment of the present application;
FIG. 4 is a schematic illustration of a process for adjusting interface layout by spatial gesture according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating steps of a second embodiment of a data processing method according to the present application;
FIG. 6 is a flowchart illustrating steps of a third embodiment of a data processing method of the present application;
FIG. 7 is a flowchart illustrating steps of a fourth embodiment of a data processing method according to the present application;
FIG. 8 is a block diagram of an embodiment of a data processing apparatus of the present application; and
FIG. 9 is a schematic structural diagram of an apparatus according to an embodiment of the present application.
Detailed Description
In order that the above-recited objects, features and advantages of the present application will become more readily apparent, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments that are illustrated in the appended drawings.
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The concepts of the present application are susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the description herein of specific embodiments is not intended to limit the concepts of the present application to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the scope of the present application.
Reference in the specification to "one embodiment," "an embodiment," "one particular embodiment," etc., means that a particular feature, structure, or characteristic may be included in the described embodiments, but every embodiment may or may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, where a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the purview of one skilled in the art to effect such feature, structure, or characteristic in connection with other ones of the embodiments whether or not explicitly described. In addition, it should be understood that the items in the list included in this form of "at least one of A, B and C" may include the following possible items: (A); (B); (C); (A and B); (A and C); (B and C); or (A, B and C). Likewise, an item listed in this form of "at least one of A, B or C" may mean (A); (B); (C); (A and B); (A and C); (B and C); or (A, B and C).
In some cases, the disclosed embodiments may be implemented as hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried on or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be executed by one or more processors. A machine-readable storage medium may be implemented as a storage device, mechanism, or other physical structure (e.g., volatile or non-volatile memory, a media disc, or another physical-structure device) for storing or transmitting information in a form readable by a machine.
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or ordering. However, such specific arrangement and/or ordering may not be required. Rather, in some embodiments, such features may be arranged in a different manner and/or order than as shown in the drawings. Furthermore, the inclusion of a structural or methodological feature in a particular figure is not meant to imply that such feature is required in all embodiments; in some embodiments, it may not be included or may be combined with other features.
Aiming at the technical problems of high memory cost and high learning cost of gestures, the embodiment of the application provides a data processing scheme, which specifically can comprise: determining a target interface element corresponding to the hand of the user from the interface; the interface may include: a first display layer and a second display layer; the first display layer is used for displaying static interface elements; the second display layer is used for displaying interface elements in a motion state; and controlling the target interface element to move between the first display layer and the second display layer according to the first movement information of the hand of the user in the three-dimensional space.
In the embodiment of the present application, the Interface may be a UI (User Interface), which may implement interaction between a user and an application program through displayed content. Examples of interfaces may include: pages of a web page, or interfaces provided by an application; it will be appreciated that embodiments of the present application are not limited to a specific interface. Interface elements refer to a series of elements included in a software or system interface that can meet user interaction requirements.
In this embodiment, the interface is divided into a plurality of display layers for stacked display, where the plurality of display layers may include: a first display layer and a second display layer; the first display layer is used for displaying static interface elements, and the second display layer is used for displaying interface elements in a motion state.
According to the first motion information of the hand of the user in the three-dimensional space, the embodiment of the application controls the target interface element to move between the first display layer and the second display layer. In this way, the interface element in the motion state can be displayed in a vertically layered manner with the static interface element, so that a three-dimensional effect achieved by the two-dimensional display device can be presented, and the target interface element moves between the first display layer and the second display layer along with the hand of the user.
Referring to fig. 1, a schematic diagram of a three-dimensional coordinate system according to an embodiment of the present application is shown, where the XOY plane is a plane in which the display device is located or is parallel to the plane in which the display device is located; the Z axis is perpendicular to the plane of the interface. The embodiment of the application introduces a Z-axis space to realize a three-dimensional effect through a two-dimensional display device.
In the embodiment of the present application, the target interface element is controlled to move between the first display layer and the second display layer according to the first motion information of the hand of the user in the three-dimensional space, and the function corresponding to the target interface element can be achieved through the three-dimensional effect of the interface. For example, the method can be used to adjust the position of the target interface element in the interface; for another example, a function corresponding to the target interface element can be triggered. The embodiment of the application can support the user in realizing the function corresponding to the target interface element through the movement of the user's hand, so that the user can perform the hand movement according to the actual function requirement; the memory cost and the learning cost of gesture interaction can thus be reduced, and the user experience improved.
According to an embodiment, for adjusting the position of the target interface element in the interface, the first motion information of the user's hand can be used to pick up the target interface element from the source position in the first display layer, move it through the second display layer to the target position in the first display layer, and put it down; the logic of the first motion information is equivalent to moving an object in reality, so that the memory cost and the learning cost of gesture interaction can be reduced, and the user experience is improved.
According to another embodiment, when the function corresponding to the target interface element is triggered, the first motion information of the user's hand may be used to press the target interface element, and in this case, the target interface element may be controlled to move toward a display layer (such as a desktop layer) lower than the first display layer, and the logic of the first motion information is simple, so that the memory cost and the learning cost of gesture interaction may be reduced, and the user experience may be improved.
The data processing method provided by the embodiment of the invention can be applied to application environments corresponding to the client and the server, wherein the client and the server are located in a wired or wireless network, and the client and the server perform data interaction through the wired or wireless network.
Alternatively, the client may run on the device, for example, the client may be an APP (Application program) running on the device, such as a voice assistant APP, a remote control APP, an instant messaging APP, or an APP carried by an operating system, etc., which is not limited by the specific APP corresponding to the client in the embodiments of the present Application.
Alternatively, the device may have a built-in or externally connected display device, such as a screen, and the display device is used for displaying the interface.
Alternatively, the device may have a built-in or externally connected microphone, which is used to collect voice information of the user. The device may also have a built-in or externally connected speaker for playing information.
Alternatively, the device may have a built-in or externally connected three-dimensional sensor, and the three-dimensional sensor may be used to detect position information of the hand of the user in the three-dimensional space, so as to obtain first motion information of the hand of the user in the three-dimensional space.
Alternatively, the three-dimensional sensor emits radiation of a known frequency, detects energy returned by the surface of an object (user's hand) within the observed light field, and determines positional information of the user's hand in three-dimensional space from the energy. The three-dimensional sensor may include: low frequency magnetic field sensors (e.g., infrared sensors), ultrasonic sensors, and the like.
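For illustration only, the following TypeScript sketch shows one possible way to derive a coarse movement direction from two sampled hand positions in the coordinate system of FIG. 1; the HandSample type, the sampling interface and the noise threshold are assumptions and do not correspond to any specific sensor API.

```typescript
// Illustrative type: a sampled hand position in the coordinate system of FIG. 1,
// where the XOY plane is the display plane and the Z axis is perpendicular to it.
interface HandSample {
  x: number;
  y: number;
  z: number;
  timestamp: number; // milliseconds
}

// Derive a coarse movement direction from two consecutive samples.
// Movement mostly along Z is treated as "perpendicular to the interface";
// movement mostly within the XOY plane is treated as "parallel to the interface".
function movementDirection(prev: HandSample, curr: HandSample): "perpendicular" | "parallel" | "static" {
  const dx = curr.x - prev.x;
  const dy = curr.y - prev.y;
  const dz = curr.z - prev.z;
  const planar = Math.hypot(dx, dy);
  if (planar < 1 && Math.abs(dz) < 1) return "static"; // below a small, assumed noise threshold
  return Math.abs(dz) > planar ? "perpendicular" : "parallel";
}
```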
Optionally, the device may have a built-in or externally connected image capture device, and the image capture device is configured to capture image data, so as to determine a gesture according to the image data and further implement the gesture interaction function.
Such devices may include, but are not limited to: smart phones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop portable computers, in-vehicle devices, PCs (Personal Computers), set-top boxes, smart televisions, wearable devices, smart home devices, and the like. The smart home devices may include: smart speakers, smart door locks, smart access control, etc. It can be appreciated that the embodiments of the present application are not limited to specific equipment.
The embodiment of the application can be applied to equipment supporting gesture interaction, so that the application can be applied to scenes needing gesture interaction, such as an on-vehicle scene, an intelligent television scene, a somatosensory game scene and the like, which are inconvenient for touch operation.
Examples of the in-vehicle apparatus may include: a HUD (Head Up Display), which is usually installed in front of the driver and can provide some necessary information for the driver during driving, such as vehicle speed, fuel consumption, navigation, and even mobile phone calls, message reminders, entertainment information, etc.; in other words, the HUD can integrate multiple functions, which is convenient for the driver to pay attention to driving road conditions while also meeting the entertainment requirements of the driver.
Method embodiment one
Referring to fig. 2, a flowchart illustrating steps of a first embodiment of a data processing method of the present application may specifically include the following steps:
step 201, determining a target interface element corresponding to the hand of a user from an interface; the interface may include: a first display layer and a second display layer; the first display layer is used for displaying static interface elements; the second display layer is used for displaying interface elements in a motion state;
step 202, controlling the target interface element to move between the first display layer and the second display layer according to the first movement information of the hand of the user in the three-dimensional space.
At least one step included in the data processing method shown in fig. 2 may be performed by a client on the device.
In step 201, at least one interface element may be included in the interface. The target interface element may refer to a user-selected interface element.
Optionally, the interface element may include at least one of the following elements: icons, cards, and lists. It is understood that the interface elements described above are merely alternative embodiments; in fact, the interface elements of the embodiments of the present application may further include: text, audio, animation, video, etc. It will be appreciated that embodiments of the present application are not limited to particular interface elements.
A card may include: information such as pictures and text. Alternatively, the card may be a rectangle containing rich picture and text content, which in practical applications may serve as an entry to a detail page.
Alternatively, the interface may be a card interface. The card interface is a display interface containing one or more cards, and the number of cards included in the card interface is not limited in the embodiment of the present application. In the case where a plurality of cards are included on the card interface, a single card may have a corresponding display area, and information contained in the card may be displayed in the display area corresponding to the card. For example, the card interface includes three cards, namely, card A, card B, and card C, where information included in card A is denoted as information A, information included in card B is denoted as information B, and information included in card C is denoted as information C.
In the embodiment of the present application, the target card corresponding to the hand of the user can be determined from the card interface, so as to realize the function corresponding to the target card.
In the embodiment of the present application, the target interface element corresponding to the hand of the user can be determined through the position information of the hand of the user. Optionally, step 201 of determining a target interface element corresponding to the hand of the user from the interface may specifically include: determining a target interface element corresponding to the hand of the user from the interface according to the matching condition between the projection object of the hand of the user on the display device and the interface elements, so that the target interface element matches the projection object. The matching conditions may include: overlapping, etc., and are not described in detail herein.
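As an illustration of the matching condition described above (not part of the claimed method), the following sketch performs a simple overlap test between the projection point of the user's hand and rectangular element bounds; the ElementBounds type and the point-in-rectangle criterion are assumptions.

```typescript
// Hypothetical element bounds on the display plane (XOY).
interface ElementBounds {
  id: string;
  x: number;      // left
  y: number;      // top
  width: number;
  height: number;
}

// Determine the target interface element from the projection of the user's hand
// onto the display device: here the matching condition is a simple containment
// test of the projected point, one possible realization of "overlapping".
function pickTargetElement(
  projection: { x: number; y: number },
  elements: ElementBounds[],
): ElementBounds | undefined {
  return elements.find(
    (e) =>
      projection.x >= e.x &&
      projection.x <= e.x + e.width &&
      projection.y >= e.y &&
      projection.y <= e.y + e.height,
  );
}
```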
The embodiment of the application does not limit specific gestures corresponding to the hands of the user. The gestures corresponding to the user's hand may include: a fist-making gesture, a palm-stretching gesture, a palm-half-stretching gesture, etc.
In step 202, all or part of the target interface element may follow the user's hand between the first display layer and the second display layer, i.e., the range of motion of the target interface element in the Z-axis may be determined by the first display layer and the second display layer.
The target interface element may correspond to second motion information, and the first motion information may be matched with the second motion information. The first motion information or the second motion information may include: type of movement and direction of movement.
The above-described motion types may include at least one of the following types: flipping, moving, and rotating.
The rotation may be any angular transformation of the target interface element on the corresponding plane.
Flipping may be any rotation of the target interface element about a flipping axis toward a plane perpendicular to the corresponding plane. The flipping axis corresponding to the target interface element may include: an outer edge of the interface element, or a central axis of the interface element. Taking a rectangular interface element as an example, the outer edge may be: the upper edge, the lower edge, the left edge, the right edge, etc.
Movement may be used to change the position of the target interface element; it is understood that the motion direction corresponding to movement may include: one movement direction or a plurality of movement directions.
Optionally, the motion direction of the user's hand or the target interface element may include: a vertical movement direction perpendicular to the interface, and/or a parallel movement direction parallel to the interface. It can be appreciated that the embodiment of the application can determine the motion direction of the hand of the user according to the change of the position information of the hand of the user in the three-dimensional space. The motion direction of the user's hand may include: any one or a combination of the vertical movement direction and the parallel movement direction.
It should be noted that, in the embodiment of the present application, the displacement of the user hand or the target interface element may include: displacement in one direction of movement or in a plurality of directions of movement. Taking the three-dimensional coordinate system shown in fig. 1 as an example, in the embodiment of the present application, the motion direction of the user's hand may be parallel to the Z axis, or may have an included angle greater than 0 degrees with the Z axis.
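For illustration, the motion information described above can be modeled roughly as follows; the field names and the encoding of the two direction kinds are assumptions, not a prescribed data format.

```typescript
// Illustrative encoding of the first/second motion information described above.
type MotionType = "flip" | "move" | "rotate";

type MotionDirection =
  | { kind: "perpendicular"; sign: 1 | -1 }        // along the Z axis, toward or away from the interface
  | { kind: "parallel"; dx: number; dy: number };  // within the XOY plane

interface MotionInfo {
  type: MotionType;
  directions: MotionDirection[]; // one movement direction or several movement directions
}
```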
Optionally, in the embodiment of the present application, when the position of the target interface element between the first display layer and the second display layer is adjusted, the first motion information of the user's hand may be used to pick up the target interface element from the source position of the first display layer, move the target interface element to the target position of the first display layer through the second display layer, and put down, where the logic of the first motion information is equivalent to moving an object in reality.
According to one embodiment, the track points of the target interface element may include, in order from first to last: a source location in a first display layer, a location in a second display layer, and a target location in the first display layer.
According to another embodiment, the moving direction of the target interface element may sequentially include, in order from first to last: a first direction of movement, a second direction of movement, and a third direction of movement;
the second movement direction is parallel to the interface; the first movement direction and the third movement direction are perpendicular to the interface; the first movement direction may be a direction from the first display layer to the second display layer; the third movement direction may be a direction from the second display layer to the first display layer.
According to still another embodiment, the track points of the target interface element may include, in order from first to last: a location in the first display layer, a location in the second display layer, and a target location in the third display layer. The second display layer may be located between the first display layer and the third display layer.
The embodiment of the application can control the movement of the target interface element between the first display layer and the second display layer by using an interface processing technology. For example, animation technology or image processing technology may be used to control the target interface element to flip in the space corresponding to the first display layer and the second display layer.
In this embodiment of the present application, optionally, the movement of the target interface element in a direction perpendicular to the interface may be controlled by changing a display layer corresponding to the target interface element.
It is understood that the number of second display layers may be one or more. The second display layer may be located above the first display layer, or the second display layer may be located below the first display layer. For example, the target interface element may be moved downward from the first display layer, and then approach the second display layer, and finally reach the third display layer, where the third display layer is configured to display a preset interface corresponding to the target interface element, and the preset interface corresponding to the target interface element may include: music interface, video interface, etc. For example, if the target interface element is a card corresponding to the music APP, the preset interface may be an interface corresponding to the music APP.
Alternatively, the second display layer may be determined according to the displacement of the user's hand in the first movement direction.
For example, as the displacement of the user's hand in the first direction of motion increases, the distance between the second display layer and the first display layer increases. For example, the number of the first display layer is 1, the number of the second display layer is i, i may be a natural number greater than 1, and i may increase as the displacement of the user's hand in the first movement direction increases.
As another example, as the displacement of the user's hand in the first direction of motion decreases, the distance between the second display layer and the first display layer decreases. For example, the number of the first display layer is 1, the number of the second display layer is i, i may be a natural number greater than 1, and i may decrease as the displacement of the user's hand in the first movement direction decreases.
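A minimal sketch of this relation, assuming the second display layer is addressed by an index i that starts at 1 for the first display layer; the layerStep and maxLayer parameters are illustrative assumptions.

```typescript
// The index i of the second display layer grows with the hand's displacement along
// the first movement direction (here the Z axis), and shrinks as that displacement
// decreases. `layerStep` is the assumed displacement needed to advance one layer.
function secondLayerIndex(zDisplacement: number, layerStep = 20, maxLayer = 5): number {
  const i = 1 + Math.floor(Math.max(0, zDisplacement) / layerStep);
  return Math.min(i, maxLayer); // i exceeds 1 once the displacement exceeds layerStep
}
```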
In an embodiment of the present application, optionally, the interface may include: a control corresponding to the target interface element. A control refers to an encapsulation of data and methods. The control may have its own properties and methods, where the properties are simple accessors to the control's data and the methods are some simple and visible functions of the control.
In this embodiment of the present application, optionally, the frequency of use corresponding to the control corresponding to the target interface element in the interface may meet a preset condition. For example, the preset condition may include: the frequency of use exceeding a threshold value, so that high-frequency controls corresponding to the target interface element are presented, and the operation efficiency corresponding to the target interface element can thus be improved.
In an alternative embodiment of the present application, the interface may specifically include: a plurality of display layers arranged in a stack, and from bottom to top, the plurality of display layers may sequentially include:
a desktop layer, for displaying the desktop;
a first display layer, for displaying static cards; and
a second display layer, for displaying the card in a motion state.
The multiple display layers are arranged in a stacked mode, the corresponding function of the card can be achieved through the movement of the card in the space, and the depth of field effect of the interface can be improved.
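For illustration only, the stacked structure described above might be modeled as follows; the type names, the single moving card, and the liftCard helper are assumptions rather than the patented implementation.

```typescript
// Illustrative model of the stacked interface: a desktop layer at the bottom,
// a first display layer for static cards, and a second display layer for the
// card that is currently moving with the user's hand.
interface Card {
  id: string;
  position: { x: number; y: number };
}

interface StackedInterface {
  desktopLayer: { wallpaper?: string };
  firstLayer: { cards: Card[] };            // static cards
  secondLayer: { movingCard: Card | null }; // at most one card in motion in this sketch
}

// Lift a card from the first (static) layer into the second (motion) layer.
function liftCard(ui: StackedInterface, cardId: string): void {
  const index = ui.firstLayer.cards.findIndex((c) => c.id === cardId);
  if (index === -1 || ui.secondLayer.movingCard) return;
  ui.secondLayer.movingCard = ui.firstLayer.cards.splice(index, 1)[0];
}
```

A matching helper that drops the card back into the first layer at its target position would complete the pick-up/put-down cycle described above.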
Referring to FIG. 3, a schematic illustration of an interface according to an embodiment of the present application is shown. The interface may include a plurality of cards 301, which may include: target card 301A selected by the user's hand, and unselected card 301B, card 301C, card 301D, and card 301E. The interface may further include: control 302 corresponding to target card 301A. Alternatively, the display layer corresponding to the plurality of cards may be located below the display layer corresponding to the control 302. Optionally, the interface may further include: a desktop layer, and the display layer corresponding to the desktop layer may be located below the display layer corresponding to the plurality of cards.
In an alternative embodiment of the present application, interface elements such as cards, icons may be used as portals to the preset interface. The preset interface may be an interface of an application program. The categories of applications may include: music category, address book category, navigation category, video category, audio reading category, etc.
For example, the control functions corresponding to the music categories may include: play function, forward function, or backward function, etc. The control functions corresponding to the address book category can comprise: short message function, dial function, contact function, etc. The control functions corresponding to the navigation category may include: navigation functions, home functions, assistant functions, etc. The control functions corresponding to the video category can comprise: play function, comment function, search function, etc. The control functions corresponding to the audio reading category can comprise: play function, forward function, or backward function, etc. It can be appreciated that, a person skilled in the art may determine, according to actual application requirements, a control function corresponding to an interface element, and the embodiment of the present application does not limit a specific control function corresponding to the interface element.
In an optional embodiment of the present application, in the case that the user's hand presents the first gesture, the user's hand may be supported to move in the three-dimensional space. In this case, the first motion information may include: a movement direction perpendicular to the interface. The first gesture may include: a fist-making gesture, etc. The embodiment of the present application may control the movement of the target interface element between the first display layer and the second display layer through the movement direction of the user's hand perpendicular to the interface. Of course, in the case that the hand of the user presents the first gesture, the first motion information may also include: a movement direction parallel to the interface.
In an optional embodiment of the present application, in the case where the second gesture is presented by the hand of the user, only movement of the hand of the user in the two-dimensional plane corresponding to the interface may be supported. In this case, the first motion information may include: a movement direction parallel to the interface, so that the target interface element is updated with the motion of the user's hand. The second gesture may include: a palm-stretching gesture, a palm-half-stretching gesture, etc., and the updating of the target interface element can be achieved through the second movement direction of the user's hand. Taking the interface shown in FIG. 3 as an example, when the user's hand corresponding to the second gesture moves from card 301A to card 301B, the target interface element is updated to card 301B, and thus the control corresponding to card 301B may be displayed.
Of course, in the case that the hand of the user presents the second gesture, the first motion information may also include: a movement direction perpendicular to the interface. For example, if the user's hand corresponding to the second gesture moves in the direction from the first display layer to the desktop layer, the target interface element may move in the direction from the first display layer to the desktop layer. For example, the target interface element may be moved downward from the first display layer, then approach the second display layer, and finally reach the third display layer, where the third display layer is configured to display a preset interface corresponding to the target interface element, and the preset interface corresponding to the target interface element may include: a music interface, a video interface, etc.
For another example, if all or part of the second gesture is flipped in three-dimensional space, the target interface element may be flipped in the interface, and the flipping information of the target interface element may be matched with the flipping information of the second gesture, where the flipping information may include: a turnover shaft, a turnover direction, a turnover angle, or the like.
In this embodiment of the present application, optionally, in a case where the hand of the user presents a first gesture, the first motion information may include: perpendicular to the direction of movement of the interface.
In this embodiment of the present application, optionally, in a case where the hand of the user presents the second gesture, the first motion information may include: parallel to the direction of movement of the interface;
the method may further include:
and under the condition that the hand of the user presents a second gesture, updating the target interface element according to the movement direction parallel to the interface.
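As an illustrative sketch of this gesture-dependent behavior (taking the fist-making gesture as the first gesture and the palm gesture as the second gesture, an assumption based on the examples above), the dispatch below routes hand motion either to layer movement or to target re-selection.

```typescript
type Gesture = "fist" | "palm"; // assumed stand-ins for the first and second gestures

// With the first gesture, motion perpendicular to the interface drives the target
// element between the display layers; with the second gesture, motion parallel to
// the interface re-selects the target element.
function handleHandMotion(
  gesture: Gesture,
  direction: "perpendicular" | "parallel",
  moveTargetBetweenLayers: () => void,
  updateTargetSelection: () => void,
): void {
  if (gesture === "fist" && direction === "perpendicular") {
    moveTargetBetweenLayers();
  } else if (gesture === "palm" && direction === "parallel") {
    updateTargetSelection();
  }
}
```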
In summary, according to the data processing method of the embodiment of the present application, the movement of the target interface element between the first display layer and the second display layer is controlled according to the first movement information of the hand of the user in the three-dimensional space. In this way, the interface element in the motion state can be displayed in a vertically stacked manner with the static interface element, so that a three-dimensional effect realized by the two-dimensional display device can be presented, and the target interface element moves between the first display layer and the second display layer along with the hand of the user; therefore, the function corresponding to the target interface element can be realized through the three-dimensional effect of the interface. For example, the method can be used for adjusting the position of the target interface element in the interface, and for example, a function corresponding to the target interface element can be triggered. The embodiment of the application can support the user to realize the function corresponding to the target interface element through the movement of the user hand, so that the user can execute the movement of the user hand according to the actual function requirement, the memory cost and the learning cost of gesture interaction can be reduced, and the user experience is improved.
Specific examples are provided herein for a better understanding of embodiments of the present application by those skilled in the art.
In this example, the interface may include a plurality of interface elements, which may be disposed laterally and/or longitudinally.
The embodiment of the present application can support the user in selecting the target interface element through the second gesture. Optionally, according to the matching condition between the projection object of the user's hand on the display device and the interface elements, the target interface element corresponding to the user's hand is determined from the interface. Optionally, if the second gesture moves in a second movement direction parallel to the interface, the target interface element is updated along with the movement of the second gesture.
According to one embodiment, if all or part of the second gesture is flipped in three-dimensional space, the target interface element may be flipped in the interface, the flipping information of the target interface element may match the flipping information of the second gesture, and the flipping information may include: a turnover shaft, a turnover direction, a turnover angle, or the like.
According to another embodiment, if the user's hand corresponding to the second gesture moves in the direction from the first display layer to the desktop layer, the target interface element may move in the direction from the first display layer to the desktop layer, for example, the target interface element may be moved from the first display layer to a third display layer located below the first display layer, in which case, a preset interface corresponding to the target interface element may be entered, for example, a music interface, a video interface, etc.
According to yet another embodiment, the method is used for adjusting the position of the target interface element in the interface. Specifically, after determining the target interface element by the second gesture, if the user's hand changes from the second gesture to the first gesture and moves in the first movement direction, the target interface element moves between the first display layer and the second display layer in the first movement direction, for example, the target interface element may be moved from the first display layer to the second display layer located above the first display layer. Then, if the first gesture of the user's hand moves in the second direction of movement, the target interface element moves between the first display layer and the second display layer in the second direction of movement. Then, if the first gesture of the user's hand moves in the third movement direction, the target interface element moves in the three-dimensional space in the third movement direction until the user's hand changes from the first gesture to the second gesture, that is, the target interface element stops moving in the case where the user's hand changes from the first gesture to the second gesture.
This embodiment may be used to move the position of a target interface element in an interface. Taking the interface shown in fig. 3 as an example, if the position of card 301A in the interface needs to be adjusted, card 301A may be picked up from the source position in the first display layer, passed through the second display layer, and placed in the target position in the first display layer based on the first motion information of the user's hand. Alternatively, if a first non-target interface element is provided at a target location in the interface (the non-target interface element may refer to an interface element other than the target interface element), the first non-target interface element may be moved away. For example, a first non-target interface element may be moved to the source location, and as another example, a first non-target interface element may be moved to a location corresponding to a second non-target interface element, and a second non-target interface element may be moved to the source location.
Taking the interface shown in FIG. 3 as an example, card 301A may be picked up from a source location in the first display layer, passed through the second display layer, and placed at a target location in the first display layer based on the first motion information of the user's hand. Assume that the target position is the position of card 301C. Alternatively, in the case where the projection object of the user's hand on the display device matches the position of card 301C and the user's hand moves in the third movement direction, card 301B and card 301C may be controlled to move to the left so as to free the target position for card 301A. Alternatively, in the case where the projection object of the user's hand on the display device matches the position of card 301C and the displacement of the user's hand moving in the third movement direction exceeds a displacement threshold, card 301B and card 301C may be controlled to move leftward.
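For illustration, one possible way to shift the non-target cards when the lifted card is dropped is sketched below; the slot array, the indices, and the handling of the displacement threshold are assumptions.

```typescript
// Shift the cards between the source slot and the target slot by one position so
// that the target slot is freed for the lifted card (e.g. in FIG. 3, cards 301B
// and 301C move left when card 301A is dropped at card 301C's position).
// `slots` holds card ids in left-to-right order; the source slot is null because
// the lifted card has already been moved to the second display layer.
function shiftForDrop(slots: (string | null)[], sourceIndex: number, targetIndex: number): void {
  if (sourceIndex === targetIndex) return;
  const step = sourceIndex < targetIndex ? 1 : -1;
  for (let i = sourceIndex; i !== targetIndex; i += step) {
    slots[i] = slots[i + step]; // non-target cards move toward the freed source slot
  }
  slots[targetIndex] = null; // the target slot is now free for the lifted card
}
```

In practice the shift would only be triggered once the hand's displacement in the third movement direction exceeds the displacement threshold mentioned above.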
In summary, the first motion information of the user's hand may be used to pick up the target interface element from the source position in the first display layer, pass through the second display layer, and move the target interface element to the target position in the first display layer and put it down; the logic of the first motion information is equivalent to moving an object in reality, so that the memory cost and learning cost of gesture interaction may be reduced, and the user experience is improved.
Referring to fig. 4, a schematic diagram of a process of adjusting an interface layout through a spatial gesture according to an embodiment of the present application is shown, where a first display layer of an interface may include: cards such as card 1, card 2, card 3, card 4 and card 5, and the spatial gestures are used to adjust the position of the cards in the interface.
Fig. 4 may include the following steps:
and S1, determining a target card (a card 3 in FIG. 4) corresponding to the palm gesture from the interface according to the matching condition between the projection object of the palm gesture on the display device and the card, so that a user can move left and right through the palm, and the purpose of selecting the card is realized.
And S2, responding to the motion of the fist-making gesture on the Z axis, and entering an adjustment layout mode, so that a user can enter the adjustment layout mode by making a fist and dragging backwards.
In particular, the target card is controlled to move between the first display layer and the second display layer, for example, the target card may be controlled to move in the first direction of movement.
And S3, responding to the movement of the fist-making gesture parallel to the interface, and controlling the target card to move along the direction parallel to the interface so as to achieve the purposes of moving the fist left and right and dragging the card to adjust the position.
Alternatively, in the process of moving the target card left and right, the non-target card in the first display layer, which matches the projection object of the target card, may be moved accordingly, for example, the target card 3 moves to the position corresponding to the card 4 in fig. 4, and then the card 4 moves left accordingly.
It will be appreciated that if there is no non-target card in the first display layer that matches the projected object of the target card, then no movement may be performed with respect to the non-target card.
And S4, responding to the change from the fist making gesture to the palm gesture, and controlling the target card to move along the third movement direction so as to place the target card at the target position in the first display layer. That is, the user can exit the adjustment layout mode by releasing the fist.
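The S1-S4 flow can be read as a small state machine; the sketch below is illustrative only, and the event names, gesture labels and state names are assumptions rather than the claimed implementation.

```typescript
type LayoutState = "selecting" | "adjusting";

interface LayoutController {
  state: LayoutState;
  targetCardId: string | null;
}

type HandEvent =
  | { type: "palmMove"; hoveredCardId: string | null } // S1: palm moves left/right to select a card
  | { type: "fistPullBack" }                           // S2: fist plus motion along Z enters adjust-layout mode
  | { type: "fistMoveParallel"; dx: number }           // S3: fist drags the card parallel to the interface
  | { type: "fistReleased" };                          // S4: releasing the fist drops the card

function onHandEvent(ctrl: LayoutController, ev: HandEvent): void {
  switch (ev.type) {
    case "palmMove":
      if (ctrl.state === "selecting") ctrl.targetCardId = ev.hoveredCardId;
      break;
    case "fistPullBack":
      if (ctrl.state === "selecting" && ctrl.targetCardId) ctrl.state = "adjusting"; // lift the card to the second layer
      break;
    case "fistMoveParallel":
      // in "adjusting", dx would drive the card's horizontal position and any shifting of non-target cards
      break;
    case "fistReleased":
      if (ctrl.state === "adjusting") ctrl.state = "selecting"; // drop the card back into the first layer
      break;
  }
}
```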
Method embodiment two
Referring to fig. 5, a flowchart illustrating steps of a second embodiment of a data processing method of the present application may specifically include the following steps:
step 501, determining the residence time of a gesture on an interface; the interface may include: a plurality of cards;
and 502, switching positions of at least two cards in the interface according to the stay time.
In the embodiment of the present application, the layout of the interface can be adjusted according to the residence time of the gesture on the interface, that is, the positions of at least two cards in the interface are adjusted, so that the memory cost and the learning cost of gesture interaction can be reduced, and the user experience is improved.
The layout of the interface is adjusted, which specifically comprises: and switching the positions of at least two cards in the interface. For example, the positions of card 1 and card 2 are interchanged; for another example, after the positions of the card 1 and the card 2 are interchanged, the positions of the card 1 and the card 3 are interchanged, and so on.
In this embodiment of the present application, at least two cards (hereinafter referred to as target cards) corresponding to the position switch may be obtained according to the residence time.
Alternatively, different dwell times may correspond to different numbers of target cards. For example, the longer the dwell time, the greater the number of target cards.
Alternatively, the target card may be determined according to the matching condition between the projected object of the gesture on the display device and the cards. For example, the matching condition may be a matching degree, and at least two target cards may be selected from the interface in descending order of matching degree. For example, the interface includes, in order from left to right: cards 1 to 5; if the gesture is located in the middle of card 3 and card 4, then card 3 and card 4 can be taken as target cards. As another example, the interface includes, in order from left to right: cards 1 to 5; if the gesture is located above card 3, then card 2, card 3 and card 4 can be taken as target cards.
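For illustration, the selection and switching described in this embodiment might look roughly as follows; the dwell-time-to-count mapping, the use of horizontal card centers as the matching degree, and the pairwise swap are all assumptions.

```typescript
// Choose target cards by dwell time (longer dwell, more target cards) and by
// proximity of each card's center to the gesture's projected x position, then
// swap the two best-matching cards as a minimal example of a position switch.
function switchCardsByDwell(
  cards: { id: string; centerX: number }[],
  gestureX: number,
  dwellMs: number,
): void {
  const count = Math.min(cards.length, 2 + Math.floor(dwellMs / 1000)); // assumed mapping
  const targets = [...cards]
    .sort((a, b) => Math.abs(a.centerX - gestureX) - Math.abs(b.centerX - gestureX))
    .slice(0, count);
  if (targets.length >= 2) {
    const [a, b] = targets;
    [a.centerX, b.centerX] = [b.centerX, a.centerX]; // interchange the two positions
  }
}
```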
Method embodiment three
Referring to fig. 6, a flowchart illustrating steps of a third embodiment of a data processing method of the present application may specifically include the following steps:
step 601, determining the stay time of the gesture on the interface; the interface may include: a plurality of cards;
step 602, moving at least two cards in the interface according to the residence time.
In the embodiment of the present application, the layout of the interface can be adjusted according to the residence time of the gesture on the interface, that is, the positions of at least two cards in the interface are adjusted, so that the memory cost and the learning cost of gesture interaction can be reduced, and the user experience is improved.
The layout of the interface is adjusted, which specifically comprises: at least two cards in the interface are moved.
The above movement allows switching the position of different cards, with particular reference to the embodiment shown in fig. 4. For example, the card 1 is moved from a first position to a second position, and the card 2 is moved from a third position to a fourth position, wherein the second position is the same as the third position, or the first position is the same as the fourth position.
Alternatively, the movement may not involve switching the positions of different cards. For example, the card 1 is moved from a first position to a second position, and the card 2 is moved from a third position to a fourth position, wherein the second position is different from the third position, and the first position is different from the fourth position, and so on.
Alternatively, different dwell times may correspond to different numbers of target cards. For example, the longer the dwell time, the greater the number of target cards.
Alternatively, the target card may be determined according to the matching between the projected object of the gesture on the display device and the card.
Alternatively, different dwell times may correspond to different travel distances. For example, the longer the stay length, the longer the moving distance.
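A minimal sketch of these two mappings, with purely illustrative scale factors:

```typescript
// Both the number of cards to move and the moving distance grow with the dwell
// time; the constants below are assumptions, not values from the patent.
function dwellToMovePlan(dwellMs: number): { cardCount: number; distancePx: number } {
  return {
    cardCount: 2 + Math.floor(dwellMs / 1000), // longer dwell, more target cards
    distancePx: Math.round(dwellMs * 0.1),     // longer dwell, longer moving distance
  };
}
```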
Method example IV
Referring to fig. 7, a flowchart illustrating steps of a fourth embodiment of a data processing method of the present application may specifically include the following steps:
step 701, determining a target interface element corresponding to the hand of a user from an interface; the interface may include: a first display layer and a second display layer; the first display layer is used for displaying unselected interface elements; the second display layer is used for displaying the selected interface element;
step 702, controlling the target interface element to move between the first display layer and the second display layer according to the first movement information of the hand of the user in the three-dimensional space.
In this embodiment of the present application, the first display layer may display unselected interface elements (referred to as non-target interface elements for short), so that the user can select the target interface element from the unselected interface elements. The second display layer may display selected interface elements, such as the target interface element. The embodiment of the present application can control the selected interface element to move between the first display layer and the second display layer.
In the embodiment of the present application, the selected interface element can be determined by means of gestures, voice, touch and the like.
In the embodiment of the present application, the unselected interface elements and the selected interface element can be displayed in vertically stacked layers, so that a three-dimensional effect can be presented on a two-dimensional display device, with the target interface element moving between the first display layer and the second display layer following the hand of the user; the function corresponding to the target interface element can therefore be realized through this three-dimensional effect of the interface. For example, the movement can be used to adjust the position of the target interface element in the interface, or to trigger a function corresponding to the target interface element. The embodiment of the present application allows the user to realize the function corresponding to the target interface element through the movement of the user's hand, so that the user can move the hand according to the actual functional requirement; this can reduce the memory cost and the learning cost of gesture interaction and improve the user experience.
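One possible way to picture the layer transition described above is to map the displacement of the user's hand along the axis perpendicular to the screen onto a switch between the two display layers. The following sketch is a simplified, assumption-based model: the layer names, the thresholds, and the motion-event format are illustrative only and are not taken from the disclosure.

```python
FIRST_LAYER = "first"    # unselected interface elements
SECOND_LAYER = "second"  # selected interface elements


def update_layer(element, hand_dz, lift_threshold=0.05, drop_threshold=-0.05):
    """Move the target element between display layers based on the hand's
    displacement along the axis perpendicular to the screen (in meters).

    A sufficiently positive dz lifts the element from the first display layer
    to the second display layer; a sufficiently negative dz puts it back.
    """
    if element["layer"] == FIRST_LAYER and hand_dz > lift_threshold:
        element["layer"] = SECOND_LAYER     # element becomes "selected"
    elif element["layer"] == SECOND_LAYER and hand_dz < drop_threshold:
        element["layer"] = FIRST_LAYER      # element is placed at its target
    return element


card = {"id": 3, "layer": FIRST_LAYER}
update_layer(card, hand_dz=0.08)   # lifted to the second display layer
update_layer(card, hand_dz=-0.09)  # placed back on the first display layer
```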
In this embodiment, optionally, the second display layer is located above the first display layer, or the second display layer is located below the first display layer.
In this embodiment of the present application, optionally, the track points of the target interface element may sequentially include, in order from first to last: a source location in a first display layer, a location in a second display layer, and a target location in the first display layer.
In this embodiment of the present application, optionally, the track points of the target interface element may sequentially include, in order from first to last: a position in the first display layer, a position in the second display layer, and a position in the third display layer. Optionally, the second display layer is located between the first display layer and the third display layer.
In this embodiment of the present application, optionally, the moving direction of the target interface element may sequentially include, in order from first to last: a first direction of movement, a second direction of movement, and a third direction of movement;
the second movement direction is parallel to the interface; the first movement direction and the third movement direction are perpendicular to the interface; the first movement direction is a direction from the first display layer to the second display layer; the third movement direction may be a direction from the second display layer to the first display layer.
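Read together, the three movement directions form a lift-drag-drop sequence. The short sketch below classifies each step of the hand trajectory into one of these phases; the axis convention (z perpendicular to the screen, increasing from the first display layer toward the second display layer) and the thresholds are assumptions made for the example.

```python
def classify_phase(prev, curr, eps=0.01):
    """Classify one trajectory step into the three directions described above.

    prev/curr: (x, y, z) hand positions; z is perpendicular to the screen.
    Returns "lift" (first direction, first -> second display layer),
            "drag" (second direction, parallel to the interface), or
            "drop" (third direction, second -> first display layer).
    """
    dx, dy, dz = (c - p for c, p in zip(curr, prev))
    if dz > eps:
        return "lift"
    if dz < -eps:
        return "drop"
    if abs(dx) > eps or abs(dy) > eps:
        return "drag"
    return "idle"


# A lift, two parallel drags, then a drop.
track = [(0, 0, 0), (0, 0, 0.06), (0.10, 0, 0.06), (0.20, 0, 0.06), (0.20, 0, 0.0)]
print([classify_phase(a, b) for a, b in zip(track, track[1:])])
# ['lift', 'drag', 'drag', 'drop']
```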
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are all preferred embodiments and that the acts referred to are not necessarily required by the embodiments of the present application.
The embodiment of the application also provides a data processing device.
With reference to fig. 8, a block diagram of an embodiment of a data processing apparatus of the present application is shown, where the apparatus may specifically include the following modules:
a determining module 801, configured to determine a target interface element corresponding to a user hand from an interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying static interface elements; the second display layer is used for displaying interface elements in a motion state; and
a control module 802, configured to control movement of the target interface element between the first display layer and the second display layer according to the first movement information of the user's hand in the three-dimensional space.
Optionally, the interface may further include: a desktop layer positioned below the first display layer.
Optionally, the track points of the target interface element may sequentially include, in order from first to last: a source location in a first display layer, a location in a second display layer, and a target location in the first display layer.
Optionally, the motion type of the target interface element may include at least one of the following types: flipping, moving, and rotating.
Optionally, the moving direction of the target interface element may sequentially include, in order from first to last: a first direction of movement, a second direction of movement, and a third direction of movement;
the second movement direction is parallel to the interface; the first movement direction and the third movement direction are perpendicular to the interface; the first movement direction is a direction from the first display layer to the second display layer; the third movement direction is a direction from the second display layer to the first display layer.
Optionally, the interface element may include at least one of the following elements: icons, cards, and listings.
Optionally, the determining module 801 may include:
a matching determination module, configured to determine the target interface element corresponding to the user's hand from the interface according to the matching condition between the projection of the user's hand on the display device and the interface elements.
Optionally, in a case where the user's hand presents a first gesture, the first motion information may include: a movement direction perpendicular to the interface.
Optionally, in a case where the user's hand presents a second gesture, the first motion information may include: a movement direction parallel to the interface;
the apparatus may further include: an updating module, configured to update the target interface element according to the movement direction parallel to the interface in a case where the user's hand presents the second gesture.
The embodiment of the present application also provides a data processing apparatus, which may specifically include the following modules:
the determining module is used for determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying unselected interface elements; the second display layer is used for displaying the selected interface element; and
the control module is used for controlling the target interface element to move between the first display layer and the second display layer according to the first movement information of the hand of the user in the three-dimensional space.
In this embodiment, optionally, the second display layer is located above the first display layer, or the second display layer is located below the first display layer.
In this embodiment of the present application, optionally, the track points of the target interface element may sequentially include, in order from first to last: a source location in a first display layer, a location in a second display layer, and a target location in the first display layer.
In this embodiment of the present application, optionally, the track points of the target interface element may sequentially include, in order from first to last: a position in the first display layer, a position in the second display layer, and a position in the third display layer. Optionally, the second display layer is located between the first display layer and the third display layer.
In this embodiment of the present application, optionally, the moving direction of the target interface element may sequentially include, in order from first to last: a first direction of movement, a second direction of movement, and a third direction of movement;
the second movement direction is parallel to the interface; the first movement direction and the third movement direction are perpendicular to the interface; the first movement direction is a direction from the first display layer to the second display layer; the third movement direction may be a direction from the second display layer to the first display layer.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described by differences from other embodiments, and identical and similar parts between the embodiments are all enough to be referred to each other.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method, and will not be described in detail herein.
Embodiments of the present application may be implemented as a system or device configured as desired using any suitable hardware and/or software. Fig. 9 schematically illustrates an exemplary device 1300 that may be used to implement various embodiments described above in this application.
For one embodiment, fig. 9 illustrates an exemplary device 1300, and the device 1300 may include: one or more processors 1302, a system control module (chipset) 1304 coupled to at least one of the processors 1302, a system memory 1306 coupled to the system control module 1304, a non-volatile memory (NVM)/storage 1308 coupled to the system control module 1304, one or more input/output devices 1310 coupled to the system control module 1304, and a network interface 1312 coupled to the system control module 1304. The system memory 1306 may include: instructions 1362, the instructions 1362 being executable by the one or more processors 1302.
The processor 1302 may include one or more single-core or multi-core processors, and the processor 1302 may include any combination of general-purpose or special-purpose processors (e.g., graphics processors, application processors, baseband processors, etc.). In some embodiments, the device 1300 can be a server, a target device, a wireless device, etc. as described in the embodiments of the present application.
In some embodiments, the device 1300 may include one or more machine-readable media (e.g., the system memory 1306 or the NVM/storage 1308) having instructions stored thereon, and one or more processors 1302 configured, in combination with the one or more machine-readable media, to execute the instructions so as to implement the modules included in the foregoing apparatus and thereby perform the actions described above in the embodiments of the present application.
The system control module 1304 of an embodiment may include any suitable interface controller for providing any suitable interface to at least one of the processors 1302 and/or any suitable device or component in communication with the system control module 1304.
The system control module 1304 of an embodiment may include one or more memory controllers to provide an interface to the system memory 1306. The memory controller may be a hardware module, a software module, and/or a firmware module.
The system memory 1306 of one embodiment may be used to load and store data and/or instructions 1362. For one embodiment, the system memory 1306 may include any suitable volatile memory, such as suitable dynamic random access memory (DRAM). In some embodiments, the system memory 1306 may include: double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
The system control module 1304 of an embodiment may include one or more input/output controllers to provide interfaces to the NVM/storage 1308 and the input/output device(s) 1310.
NVM/storage 1308 for one embodiment may be used to store data and/or instructions 1382. NVM/storage 1308 may include any suitable nonvolatile memory (e.g., flash memory, etc.) and/or may include any suitable nonvolatile storage device(s), such as, for example, one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives, etc.
NVM/storage 1308 may include storage resources that are physically part of the device on which device 1300 is installed, or which may be accessed by the device without being part of the device. For example, NVM/storage 1308 may be accessed over a network via network interface 1312 and/or through input/output devices 1310.
Input/output device(s) 1310 for one embodiment may provide an interface for the device 1300 to communicate with any other suitable device; the input/output devices 1310 may include a communication component, an audio component, a sensor component, and the like.
The network interface 1312 for one embodiment may provide an interface for the device 1300 to communicate over one or more networks and/or with any other suitable device, and the device 1300 may communicate wirelessly with one or more components of a wireless network in accordance with any of one or more wireless network standards and/or protocols, for example accessing a wireless network based on a communication standard such as WiFi, 2G, 3G or 4G, or a combination thereof.
For one embodiment, at least one of the processors 1302 may be packaged together with logic of one or more controllers (e.g., memory controllers) of the system control module 1304. For one embodiment, at least one of the processors 1302 may be packaged together with logic of one or more controllers of the system control module 1304 to form a System in Package (SiP). For one embodiment, at least one of the processors 1302 may be integrated on the same die as logic of one or more controllers of the system control module 1304. For one embodiment, at least one of the processors 1302 may be integrated on the same chip with logic of one or more controllers of the system control module 1304 to form a system on chip (SoC).
In various embodiments, the device 1300 may include, but is not limited to: a desktop computing device or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.), among others. In various embodiments, the device 1300 may have more or fewer components and/or different architectures. For example, in some embodiments, the device 1300 may include one or more cameras, keyboards, liquid crystal display (LCD) screens (including touch screen displays), non-volatile memory ports, multiple antennas, graphics chips, application-specific integrated circuits (ASICs), and speakers.
If the display includes a touch panel, the display screen may be implemented as a touch screen display to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation.
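As a loose illustration of how duration and displacement reported by such touch sensors might be combined, the following sketch distinguishes a tap, a long press, and a swipe; the thresholds and the event format are assumptions made for the example and do not describe any particular touch panel.

```python
def classify_touch(down, up, swipe_threshold=30, long_press_ms=500):
    """Very rough classification of a touch sequence (illustrative only).

    down/up: dicts with "x", "y" (pixels) and "t" (milliseconds) captured at
    touch-down and touch-up by the touch sensors described above.
    """
    dx, dy = up["x"] - down["x"], up["y"] - down["y"]
    duration_ms = up["t"] - down["t"]
    if abs(dx) > swipe_threshold or abs(dy) > swipe_threshold:
        return "swipe"
    return "long-press" if duration_ms > long_press_ms else "tap"


print(classify_touch({"x": 10, "y": 10, "t": 0}, {"x": 90, "y": 12, "t": 180}))  # swipe
```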
The embodiment of the present application also provides a non-volatile readable storage medium in which one or more modules (programs) are stored; when the one or more modules are applied to a device, the device may be caused to execute instructions of the methods in the embodiments of the present application.
In one example, an apparatus is provided, including: one or more processors; and one or more machine-readable media having instructions stored thereon which, when executed by the one or more processors, cause the apparatus to perform a method as in the embodiments of the present application; the method may include the method shown in fig. 1, fig. 2 or fig. 3.
One or more machine-readable media are also provided in one example, having instructions stored thereon that, when executed by one or more processors, cause an apparatus to perform a method as in the embodiments of the present application; the method may include the method shown in fig. 1, fig. 2 or fig. 3.
The specific manner in which the operations of the respective modules are performed in the apparatus of the above embodiments has been described in detail in the embodiments related to the method and will not be described in detail herein; reference may be made to the relevant parts of the description of the method embodiments.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described by differences from other embodiments, and identical and similar parts between the embodiments are all enough to be referred to each other.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications to those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the appended claims be interpreted as including the preferred embodiments and all such variations and modifications as fall within the scope of the embodiments of the present application.
Finally, it is further noted that relational terms such as first and second are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The data processing method, data processing apparatus, device, and machine-readable medium provided by the present application have been described in detail above; specific examples are used herein to illustrate the principles and embodiments of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core ideas. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope in accordance with the ideas of the present application; in view of the above, the content of this description should not be construed as limiting the present application.
Claims (14)
1. A method of data processing, the method comprising:
determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying static interface elements; the second display layer is used for displaying interface elements in a motion state;
controlling the target interface element to move between the first display layer and the second display layer according to first movement information of the hand of the user in a three-dimensional space;
wherein, in the case of being used for adjusting the position of the target interface element in the interface, the track point of the target interface element sequentially comprises, in order from first to last: a source location in a first display layer, a location in a second display layer, and a target location in the first display layer; the movement direction of the target interface element sequentially comprises the following steps from first to last: a first direction of movement, a second direction of movement, and a third direction of movement; the second direction of motion is parallel to the interface; the first and third directions of motion are perpendicular to the interface; the first movement direction is a direction from the first display layer to the second display layer; the third direction of motion is a direction from the second display layer to the first display layer.
2. The method of claim 1, wherein the interface further comprises: a desktop layer positioned below the first display layer.
3. The method of claim 1, wherein the type of motion of the target interface element comprises at least one of the following types: flipping, moving, and rotating.
4. A method according to any one of claims 1 to 3, wherein the interface element comprises at least one of the following elements: icons, cards, and listings.
5. A method according to any one of claims 1 to 3, wherein determining a target interface element corresponding to a user's hand from the interface comprises:
and determining a target interface element corresponding to the hand of the user from the interface according to the matching condition between the projection object of the hand of the user on the display device and the interface element.
6. A method according to any one of claims 1 to 3, wherein in the event that the user's hand presents a first gesture, the first motion information comprises: a movement direction perpendicular to the interface.
7. A method according to any one of claims 1 to 3, wherein in the event that the user's hand presents a second gesture, the first motion information comprises: a movement direction parallel to the interface;
The method further comprises the steps of:
and under the condition that the hand of the user presents a second gesture, updating the target interface element according to the movement direction parallel to the interface.
8. A method of data processing, the method comprising:
determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying unselected interface elements; the second display layer is used for displaying the selected interface element;
controlling the target interface element to move between the first display layer and the second display layer according to first movement information of the hand of the user in a three-dimensional space;
wherein, in the case of being used for adjusting the position of the target interface element in the interface, the track point of the target interface element sequentially comprises, in order from first to last: a source location in a first display layer, a location in a second display layer, and a target location in the first display layer; the movement direction of the target interface element sequentially comprises the following steps from first to last: a first direction of movement, a second direction of movement, and a third direction of movement; the second direction of motion is parallel to the interface; the first and third directions of motion are perpendicular to the interface; the first movement direction is a direction from the first display layer to the second display layer; the third direction of motion is a direction from the second display layer to the first display layer.
9. A data processing apparatus, the apparatus comprising:
the determining module is used for determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying static interface elements; the second display layer is used for displaying interface elements in a motion state; and
the control module is used for controlling the target interface element to move between the first display layer and the second display layer according to first movement information of the hand of the user in a three-dimensional space;
wherein, in the case of being used for adjusting the position of the target interface element in the interface, the track point of the target interface element sequentially comprises, in order from first to last: a source location in a first display layer, a location in a second display layer, and a target location in the first display layer; the movement direction of the target interface element sequentially comprises the following steps from first to last: a first direction of movement, a second direction of movement, and a third direction of movement; the second direction of motion is parallel to the interface; the first and third directions of motion are perpendicular to the interface; the first movement direction is a direction from the first display layer to the second display layer; the third direction of motion is a direction from the second display layer to the first display layer.
10. A data processing apparatus, the apparatus comprising:
the determining module is used for determining a target interface element corresponding to the hand of the user from the interface; the interface comprises: a first display layer and a second display layer; the first display layer is used for displaying unselected interface elements; the second display layer is used for displaying the selected interface element; and
the control module is used for controlling the target interface element to move between the first display layer and the second display layer according to first movement information of the hand of the user in a three-dimensional space;
wherein, in the case of being used for adjusting the position of the target interface element in the interface, the track point of the target interface element sequentially comprises, in order from first to last: a source location in a first display layer, a location in a second display layer, and a target location in the first display layer; the movement direction of the target interface element sequentially comprises the following steps from first to last: a first direction of movement, a second direction of movement, and a third direction of movement; the second direction of motion is parallel to the interface; the first and third directions of motion are perpendicular to the interface; the first movement direction is a direction from the first display layer to the second display layer; the third direction of motion is a direction from the second display layer to the first display layer.
11. An apparatus, comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform the method of any of claims 1-7.
12. One or more machine readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform the method of any of claims 1-7.
13. An apparatus, comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform the method of claim 8.
14. One or more machine readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform the method of claim 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910810762.0A CN112445323B (en) | 2019-08-29 | 2019-08-29 | Data processing method, device, equipment and machine-readable medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910810762.0A CN112445323B (en) | 2019-08-29 | 2019-08-29 | Data processing method, device, equipment and machine-readable medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112445323A CN112445323A (en) | 2021-03-05 |
CN112445323B true CN112445323B (en) | 2023-12-29 |
Family
ID=74742318
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910810762.0A Active CN112445323B (en) | 2019-08-29 | 2019-08-29 | Data processing method, device, equipment and machine-readable medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112445323B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114840092A (en) * | 2022-05-31 | 2022-08-02 | 上海商汤临港智能科技有限公司 | Method, device, equipment and storage medium for interacting with vehicle-mounted display equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080009597A (en) * | 2006-07-24 | 2008-01-29 | 삼성전자주식회사 | User interface device and embodiment method thereof |
CN107562323A (en) * | 2017-09-08 | 2018-01-09 | 广东欧珀移动通信有限公司 | icon moving method, device and terminal |
CN107943360A (en) * | 2017-11-13 | 2018-04-20 | 星潮闪耀移动网络科技(中国)有限公司 | A kind of methods of exhibiting and device of multi views element |
CN109753326A (en) * | 2017-11-06 | 2019-05-14 | 阿里巴巴集团控股有限公司 | Processing method, device, equipment and machine readable media |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8334847B2 (en) * | 2007-10-19 | 2012-12-18 | Qnx Software Systems Limited | System having user interface using object selection and gestures |
US20120218395A1 (en) * | 2011-02-25 | 2012-08-30 | Microsoft Corporation | User interface presentation and interactions |
US9030487B2 (en) * | 2011-08-01 | 2015-05-12 | Lg Electronics Inc. | Electronic device for displaying three-dimensional image and method of using the same |
US20130111382A1 (en) * | 2011-11-02 | 2013-05-02 | Microsoft Corporation | Data collection interaction using customized layouts |
US10025378B2 (en) * | 2013-06-25 | 2018-07-17 | Microsoft Technology Licensing, Llc | Selecting user interface elements via position signal |
US9423927B2 (en) * | 2013-12-04 | 2016-08-23 | Cellco Partnership | Managing user interface elements using gestures |
- 2019-08-29 CN CN201910810762.0A patent/CN112445323B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN112445323A (en) | 2021-03-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |