CN115113796B - Article processing method and device based on man-machine interaction and terminal equipment - Google Patents

Article processing method and device based on man-machine interaction and terminal equipment

Info

Publication number
CN115113796B
CN115113796B (Application CN202210753870.0A)
Authority
CN
China
Prior art keywords
area
touch operation
key
target
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210753870.0A
Other languages
Chinese (zh)
Other versions
CN115113796A (en)
Inventor
寇睿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Geely Holding Group Co Ltd
Zhejiang Zeekr Intelligent Technology Co Ltd
Original Assignee
Zhejiang Geely Holding Group Co Ltd
Zhejiang Zeekr Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Geely Holding Group Co Ltd, Zhejiang Zeekr Intelligent Technology Co Ltd filed Critical Zhejiang Geely Holding Group Co Ltd
Priority to CN202210753870.0A priority Critical patent/CN115113796B/en
Publication of CN115113796A publication Critical patent/CN115113796A/en
Application granted granted Critical
Publication of CN115113796B publication Critical patent/CN115113796B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The application provides an article processing method and device based on man-machine interaction, and a terminal device. A control interface is displayed through the terminal device and includes at least one article. The method includes: in response to a first touch operation acting on a target article on the control interface, determining that the target article indicated by the first touch operation is in a selected state; in response to a second touch operation acting on a first area or on a second area, adjusting the quantity of the target article and displaying the adjusted quantity. The first area and the second area are respectively located on two boundary areas of the article display area of the target article on the control interface. The method improves the accuracy with which the terminal device responds to touch operations and improves the user experience.

Description

Article processing method and device based on man-machine interaction and terminal equipment
Technical Field
The present disclosure relates to the field of man-machine interaction technologies, and in particular, to a method and an apparatus for processing an article based on man-machine interaction, and a terminal device.
Background
With the development of computer technology, man-machine interaction is present in many aspects of daily life. In particular, users typically need to control the quantity of articles when purchasing them.
In the prior art, an increase button and a decrease button for adjusting the quantity of an article are arranged in the article display area; when a user needs to increase or decrease the quantity, the user directly touches the increase button or the decrease button.
However, because the two buttons are arranged in adjacent areas, the distance between their touch areas is too short and each area is too small, so false touches occur easily and the accuracy with which the terminal device responds to the user's touch operation is low.
Disclosure of Invention
The application provides an article processing method and device based on man-machine interaction, and a terminal device, which are used to solve the problem of low response accuracy when the terminal device adjusts the quantity of articles.
In one aspect, the present application provides a method for processing an article based on man-machine interaction, where a control interface is displayed through a terminal device, where the control interface includes at least one article, and the method includes:
responding to a first touch operation of a target object acting on the control interface, and determining that the target object indicated by the first touch operation is in a selected state;
adjusting the number of the target objects in response to a second touch operation acting on the first area or a second touch operation acting on the second area, and displaying the adjusted number; the first area and the second area are respectively positioned on two boundary areas of the object display area of the target object on the control interface.
Optionally, a first key is arranged in the first area, and a second key is arranged in the second area; the adjusting the number of the target objects in response to the second touch operation acting on the first area or the second touch operation acting on the second area includes:
responding to a second touch operation acted on a first key in the first area, and increasing the number of the target objects;
and responding to a second touch operation acted on a second key in the second area, and reducing the number of the target objects.
Optionally, the first key is located on the right side of the article display region, and the second key is located on the left side of the article display region;
alternatively, the first key is located on the left side of the item display area and the second key is located on the right side of the item display area.
Optionally, if the second touch operation is a click operation, the adjusting the number of the target objects in response to the second touch operation acting on the first area or the second touch operation acting on the second area includes:
and responding to the clicking operation, and determining the adjustment quantity of the target object according to the times of the clicking operation so as to adjust the quantity of the target object.
Optionally, the method further comprises:
and deleting the target object in response to a third touch operation acting on the object display area.
Optionally, the third touch operation is a long press followed by a drag to a designated area, and deleting the target object in response to the third touch operation acting on the object display area includes:
and deleting the target object in response to a long-press-and-drag operation that acts on the object display area and ends in the designated area.
Optionally, the third touch operation is an operation of touching a preset button of the article display area.
On the other hand, the application provides an article processing device based on man-machine interaction, the device is arranged on a terminal device, a control interface is displayed through the terminal device, the control interface comprises at least one article, and the device comprises:
the determining unit is used for responding to a first touch operation of the target object acting on the control interface and determining that the target object indicated by the first touch operation is in a selected state;
the processing unit is used for responding to the second touch operation acting on the first area or the second touch operation acting on the second area, adjusting the quantity of the target objects and displaying the adjusted quantity; the first area and the second area are respectively positioned on two boundary areas of the object display area of the target object on the control interface.
Optionally, a first key is arranged in the first area, a second key is arranged in the second area, and the processing unit comprises an increasing module and a decreasing module;
the adding module is used for responding to a second touch operation acted on the first key in the first area and adding the number of the target objects;
the reduction module is used for responding to a second touch operation acted on a second key in the second area and reducing the number of the target objects.
Optionally, the first key is located on the right side of the article display region, and the second key is located on the left side of the article display region;
alternatively, the first key is located on the left side of the item display area and the second key is located on the right side of the item display area.
Optionally, the second touch operation is a clicking operation, and the processing module is specifically configured to determine, according to the number of times of the clicking operation, an adjustment quantity of the target object to adjust the quantity of the target object in response to the clicking operation.
Optionally, the apparatus further comprises: a deletion unit; the deleting unit is used for deleting the target object in response to a third touch operation acting on the object display area.
Optionally, the third touch operation is an operation of long pressing and dragging to a designated area, and the deleting unit is specifically configured to delete the target object in response to an operation of long pressing and dragging to the designated area acting on the object display area.
Optionally, the third touch operation is an operation of touching a preset button of the article display area.
In yet another aspect, the present application provides a terminal device, including: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory to implement the method as described in any one of the above.
In yet another aspect, the present application also provides a computer-readable storage medium having stored therein computer-executable instructions for implementing the method of any one of the above when executed by a processor.
According to the article processing method and device based on man-machine interaction and the terminal device provided by the present application, the target article indicated by a first touch operation acting on the control interface is determined to be in a selected state; the quantity of the target article is adjusted in response to a second touch operation acting on the first area or on the second area, and the adjusted quantity is displayed; the first area and the second area are respectively located on two boundary areas of the article display area of the target article on the control interface. When the user adjusts the quantity of the target article, the user acts on two areas that lie on opposite boundaries of the article display area rather than at adjacent positions, so false touches cannot occur. The terminal device therefore gives a correct response to the user's touch operation, which improves its response accuracy and the user experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of an interface for adjusting the quantity of an article according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of an article processing method based on man-machine interaction according to an embodiment of the present application;
fig. 3 is a schematic diagram of a control interface of an article processing method based on man-machine interaction according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of another method for processing an article based on human-computer interaction according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a control interface of another method for processing an article based on human-computer interaction according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a control interface of another method for processing an article based on human-computer interaction according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an article processing device based on man-machine interaction according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of another article handling apparatus based on human-computer interaction according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
Specific embodiments thereof have been shown by way of example in the drawings and will herein be described in more detail. These drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but to illustrate the concepts of the present application to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The Internet not only changes the aspects of human life, but also subverts the production and life modes of human beings. With the rapid development of internet technology, online shopping is also an indispensable part of people's life.
When shopping online, people also need to determine the quantity of the articles they purchase. In the prior art, the quantity is increased or decreased by touching quantity-adjustment buttons arranged in the article display area, and an article is deleted by touching a delete button.
Illustratively, fig. 1 is a schematic diagram of an interface for adjusting the quantity of an article according to an embodiment of the present application. As shown in fig. 1, in this shopping interface both the "+" button and the "-" button are placed in the lower-right corner of the article display area; the user increases the quantity by clicking the "+" button and decreases it by clicking the "-" button. Because the touch areas of the two buttons are so close together, the user easily touches the "-" button by mistake when aiming for "+", and vice versa. The terminal device then cannot accurately locate the user's touch position and gives an incorrect response, so its response accuracy is low.
To solve the above problem, the present application proposes an article processing method based on man-machine interaction in which the control areas for adjusting the quantity of an article are placed at different positions of the article display area. A user touching one area therefore cannot mistakenly touch the other, the terminal device does not give an incorrect response to the touch operation, and its response accuracy is improved.
The following describes the technical solutions of the present application and how the technical solutions of the present application solve the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 2 is a schematic flow chart of an article processing method based on man-machine interaction according to an embodiment of the present application. The execution body of the embodiment may be an article processing device based on man-machine interaction, where the device is located on a terminal device, and the terminal device may be a mobile phone, a tablet, a computer, or the like. And displaying a control interface through the terminal equipment, wherein the control interface at least comprises one article. As shown in fig. 2, the method for processing an article based on man-machine interaction provided in this embodiment includes:
s201, responding to a first touch operation of a target object on a control interface, and determining that the target object indicated by the first touch operation is in a selected state.
The control interface displayed by the terminal device includes at least one item, and the user selects one item in the control interface as a target item through a first touch operation. The terminal equipment responds to a first touch operation of a user on a target object on the control interface, so that the target object is in a selected state on the control interface.
The specific operation form and operation position of the first touch operation are not limited in this application. For example, the first touch operation may be any common selection operation such as clicking, touching, long pressing, or ticking. The first touch operation may act at any position in the article display area of the target article, for example its middle, right, or left region.
The article display area is used for displaying basic information of the article, such as pictures, names, models, brands, introduction information, prices, sales and the like of the article, and the application is not limited.
S202, responding to a second touch operation acting on the first area or a second touch operation acting on the second area, adjusting the number of target objects, and displaying the adjusted number; the first area and the second area are respectively positioned on two boundary areas of the object display area of the target object on the control interface.
When the target article is in the selected state, the control interface may further display a first area and a second area located on the two boundary areas of the article display area of the target article. When the user touches the first area or the second area, the terminal device responds to the second touch operation, adjusts the quantity of the target article, and displays the adjusted quantity.
The specific positions and sizes of the first area and the second area are not limited in this application, provided that a user touching one of the areas cannot mistakenly touch the other. Illustratively, the first and second areas may be located on the two lateral sides of the article display area, e.g. on its left and right sides respectively, or on its upper and lower sides respectively.
For example, when the user touches the first area, the terminal device responds to a second touch operation of the user on the first area to adjust the number of the target objects and display the adjusted number; or when the user touches the second area, the terminal device responds to a second touch operation of the user on the second area to adjust the number of the target objects and display the adjusted number.
The specific operation mode of the second touch operation is not limited in this embodiment. Illustratively, the second touch operation may be any of clicking, touching, sliding, and the like.
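Routing a second touch operation to the first or second area amounts to a hit test against the two boundary regions of the article display area. The sketch below is illustrative only: the region names, coordinates, and the `area_width` parameter are assumptions, not part of the claimed method.

```python
# Minimal hit-test sketch: decide which boundary area of the article
# display area a touch falls in. Coordinates and region widths are
# hypothetical; the application does not fix specific positions or sizes.

def hit_test(touch_x, item_left, item_right, area_width=60):
    """Return 'first' if the touch lands in the right boundary area,
    'second' if it lands in the left boundary area, else None."""
    if item_right - area_width <= touch_x <= item_right:
        return "first"   # e.g. the quantity-increase area
    if item_left <= touch_x <= item_left + area_width:
        return "second"  # e.g. the quantity-decrease area
    return None  # touch in the middle: treat as selection, not adjustment
```

Because the two areas are separated by the full width of the display area, a single touch can never land in both, which is the property the method relies on to avoid false touches.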
In one example, the second touch operation is a click operation, and adjusting the number of target items in response to the second touch operation acting on the first area or the second touch operation acting on the second area includes: in response to the clicking operation, an adjustment amount of the target item is determined according to the number of clicking operations to adjust the amount of the target item.
For example, when the second touch operation is a click operation, the terminal device may determine the adjustment amount of the target item according to the number of times the user clicks in response to the click operation of the user. For example, when the user clicks 1 time, the terminal device determines the adjustment number to be 1, and when clicks 2 times, the terminal device determines the adjustment number to be 2. Of course, other rules related to the number of clicks and the adjustment number may be provided, which is not limited in this application.
In response to the second touch operation applied to the first area or the second area, which area is specifically used for increasing and/or decreasing the number is not limited in this application. Illustratively, the number of target items is increased in response to a second touch operation applied to the first region, and the number of target items is decreased in response to a second touch operation applied to the second region.
Illustratively, when the user clicks the first area, the terminal device increases the number of the target items according to the number of times the user clicks the first area, and displays the increased number; when the user clicks the second area, the terminal device reduces the number of the target articles according to the number of times the user clicks the second area, and displays the reduced number.
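The click-count rule above can be sketched as follows. The one-click-equals-one-unit mapping is only one of the possible rules the application mentions, and the function name and non-negative floor are illustrative assumptions.

```python
def adjust_quantity(current, clicks, area):
    """Adjust the quantity of the target article by the number of clicks.

    area is 'first' (increase) or 'second' (decrease); the quantity is
    kept non-negative. The 1:1 click-to-unit rule is an assumption.
    """
    delta = clicks if area == "first" else -clicks
    return max(0, current + delta)

print(adjust_quantity(1, 2, "first"))   # two clicks on the first area → 3
print(adjust_quantity(3, 1, "second"))  # one click on the second area → 2
```

Other rules relating click count to adjustment amount (e.g. doubling per click) would simply replace the `delta` computation.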
In yet another example, the second touch operation is a sliding operation. The user slides within the first area or the second area, and the terminal device increases or decreases the quantity of the target article as the slide proceeds, displaying the adjusted quantity. When the displayed quantity reaches the desired quantity, the user stops sliding and the adjustment stops.
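A sliding-based adjustment can likewise be sketched as a running update driven by incoming slide events; the event encoding (one ±1 step per event) and the step size are hypothetical parameters, not specified in the application.

```python
def slide_adjust(start_qty, slide_events, units_per_step=1):
    """Update the quantity as slide events arrive.

    Each event is a +1 or -1 step; the displayed value follows the
    slide, and the final quantity is whatever is shown when the user
    stops sliding. The event-to-step mapping is an assumption.
    """
    qty = start_qty
    shown = []
    for step in slide_events:
        qty = max(0, qty + step * units_per_step)
        shown.append(qty)  # quantity displayed while sliding
    return qty, shown

final, displayed = slide_adjust(2, [+1, +1, -1])  # final → 3
```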
Fig. 3 is a schematic diagram of a control interface of an article processing method based on man-machine interaction according to an embodiment of the present application. As shown in diagram (a) of fig. 3, the terminal device initially displays only the basic information of each article in its article display area on the control interface, and responds to a first touch operation of the user on a target article. As shown in diagram (b) of fig. 3, for the target article the terminal device additionally displays one control based on a first area A on the right side of the article display area and another control based on a second area B on the left side; the user adjusts the quantity of the article by touching the first area A or the second area B.
Illustratively, the terminal device determines to increase the quantity of the article in response to the user touching the first area, and the control on the first area displays the current quantity; it determines to decrease the quantity in response to the user touching the second area, for example by touching a control on the second area, which may be a "-" symbol control.
Alternatively, the roles may be swapped: the terminal device increases the quantity in response to the user touching the second area, whose control displays the quantity, and decreases the quantity in response to the user touching a control on the first area, which may be a "-" symbol control.
In addition, each region of the layout in the present application can be assigned a corresponding id, so that when a user touches one of the regions the terminal device can intercept the touch event and prevent it from being passed on to a sub-layout, ensuring that touch events are dispatched accurately.
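The id-based interception can be modelled as a parent layout that consumes any touch whose coordinates fall in a registered region, so the event never reaches child layouts. On Android this would correspond to overriding `ViewGroup.onInterceptTouchEvent`; the Python model below only illustrates the dispatch logic, and all class and region names are assumptions.

```python
class Region:
    """A touchable region identified by an id and a coordinate range."""
    def __init__(self, region_id, x0, x1):
        self.region_id, self.x0, self.x1 = region_id, x0, x1

    def contains(self, x):
        return self.x0 <= x <= self.x1

class ParentLayout:
    """Dispatches a touch to the matching region and stops there,
    never forwarding an intercepted event to sub-layouts (children)."""
    def __init__(self, regions):
        self.regions = regions
        self.child_received = []  # events that fell through to children

    def dispatch(self, x):
        for r in self.regions:
            if r.contains(x):
                return r.region_id     # intercepted: handled at this level
        self.child_received.append(x)  # no region matched: pass down
        return None

# Hypothetical geometry: increase area on the right, decrease on the left.
layout = ParentLayout([Region("first_area", 940, 1000),
                       Region("second_area", 0, 60)])
```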
According to the article processing method based on man-machine interaction provided by this embodiment, the target article indicated by a first touch operation acting on the control interface is determined to be in a selected state; the quantity of the target article is adjusted in response to a second touch operation acting on the first area or on the second area, and the adjusted quantity is displayed; the first area and the second area are respectively located on two boundary areas of the article display area of the target article on the control interface. Because the user adjusts the quantity by acting on two areas that lie on opposite boundaries of the article display area rather than at adjacent positions, false touches cannot occur. The terminal device therefore gives a correct response to the user's touch operation, which improves its response accuracy and the user experience.
Fig. 4 is a schematic flow chart of another method for processing an article based on man-machine interaction according to an embodiment of the present application. The execution body of the embodiment may be an article processing device based on man-machine interaction, where the device is located on a terminal device, and the terminal device may be a mobile phone, a tablet, a computer, or the like. And displaying a control interface through the terminal equipment, wherein the control interface at least comprises one article. As shown in fig. 4, the method for processing an article based on man-machine interaction provided in this embodiment includes:
s401, responding to a first touch operation of a target object on a control interface, and determining that the target object indicated by the first touch operation is in a selected state.
Illustratively, the specific implementation of step S401 is similar to step S201, and will not be described herein.
When the target object is in the selected state, a first area and a second area which are positioned on two boundary areas of the object display area of the target object are displayed on the control interface for the target object, and the first area and the second area are used for adjusting the quantity of the target object. The first area may be provided with a first key, and the second area may be provided with a second key, for realizing the adjustment of the number by means of the touch keys.
In one example, the first key is located on the right side of the item display area and the second key is located on the left side of the item display area; alternatively, the first key is located on the left side of the item display area and the second key is located on the right side of the item display area.
Because the first key and the second key are completely separated, on the left and right sides of the article display area respectively, the user will not mistakenly touch the second key when touching the first key, nor the first key when touching the second key. The terminal device can therefore respond accurately to the user's touch operation, improving its response accuracy.
Fig. 5 is a schematic diagram of a control interface of yet another article processing method based on man-machine interaction according to an embodiment of the present application. As shown in diagram (a) of fig. 5, the terminal device initially displays only the basic information of each article in its article display area on the control interface, and responds to a first touch operation of the user on a target article. As shown in diagram (b) of fig. 5, for the target article the control interface further displays a first key on the right side of the article display area and a second key on the left side. The first key is displayed as a number: it is used to increase the quantity of the target article, and the number shown is the current quantity. The second key is displayed as a "-" symbol, used to decrease the quantity. The user adjusts the quantity of the target article by touching the first key or the second key, so the terminal device can accurately determine and adjust the quantity when responding to a touch on either key.
S402, responding to a second touch operation acted on a first key in the first area, and increasing the number of target objects.
In an exemplary embodiment, the terminal device increases the number of the target items in response to a second touch operation of the user on the first key in the first area, and displays the increased number of the target items at the first key.
S403, responding to a second touch operation acted on a second key in the second area, and reducing the number of target objects.
The terminal device, in response to a second touch operation of the user on the second key in the second area, reduces the number of target items and displays the reduced number of target items at the first key.
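Steps S402 and S403 can be sketched as a single handler: a touch on the first key increases the count, a touch on the second key decreases it, and in both cases the new count is what gets displayed at the first key. This is an assumed implementation; in particular, the zero floor on decrease is the author's assumption — the patent does not state what happens when the quantity reaches zero.

```python
# Hedged sketch of S402/S403. The dict shape and the floor at zero are
# assumptions not specified in the patent.
def on_second_touch(item, key):
    """Adjust item['quantity'] for a second touch operation on the first
    or second key, and return the label now shown at the first key."""
    if key == "first":
        item["quantity"] += 1
    elif key == "second":
        item["quantity"] = max(0, item["quantity"] - 1)  # assumed floor at 0
    return str(item["quantity"])  # the quantity is displayed at the first key

item = {"name": "coffee", "quantity": 1}
assert on_second_touch(item, "first") == "2"
assert on_second_touch(item, "second") == "1"
```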
In one example, the second touch operation is a click operation.
For example, when the user clicks the first key on the right side of the target item in fig. 5, the terminal device, in response to the click operation, identifies the number of clicks, determines the increased quantity of the target item, adjusts the quantity accordingly, and displays the increased quantity at the first key. Similarly, when the user clicks the second key on the left side, the terminal device identifies the number of clicks, determines the reduced quantity, and displays the reduced quantity of the target item at the first key.
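The click-counting behavior above can be sketched as one function: the number of click operations is the adjustment amount applied in a single step. The function name, signature, and the non-negative floor are illustrative assumptions, not details from the patent.

```python
# Illustrative sketch: the adjustment amount equals the number of clicks.
def adjust_by_clicks(quantity, clicks, increase=True):
    """Return the adjusted quantity after `clicks` click operations on the
    first key (increase=True) or the second key (increase=False)."""
    delta = clicks if increase else -clicks
    return max(0, quantity + delta)  # non-negative floor is an assumption

assert adjust_by_clicks(1, 3, increase=True) == 4   # 3 clicks on the first key
assert adjust_by_clicks(4, 2, increase=False) == 2  # 2 clicks on the second key
```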
S404, deleting the target object in response to the third touch operation on the object display area.
Illustratively, the user may also need to delete the target item. The terminal device deletes the target item in response to a third touch operation of the user on the item display area in the control interface. The specific form of the third touch operation is not limited in this application.
In one example, the third touch operation is a long-press-and-drag operation to a designated area, and deleting the target item in response to the third touch operation acting on the item display area includes: deleting the target item in response to a long-press-and-drag operation, acting on the item display area, to the designated area.
Illustratively, the user long-presses the target item and drags it to the designated area, and the terminal device deletes the target item in response to this touch operation. The designated area is not limited in this application: for example, the terminal device may delete the target item in response to the user long-pressing and dragging it onto a trash-bin icon, or in response to the user long-pressing and dragging it out of the control interface at any position, and so on.
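A minimal sketch of the long-press-and-drag deletion check in S404, under stated assumptions: the 500 ms long-press threshold and the rectangular region coordinates are invented for the example, since the patent deliberately leaves the designated area (trash-bin icon, off-screen, etc.) unspecified.

```python
# Illustrative deletion check for S404 (thresholds and geometry assumed).
LONG_PRESS_MS = 500  # assumed long-press threshold

def should_delete(press_duration_ms, drop_point, designated_area):
    """True if the press was long enough and the item was dropped inside
    the designated area, given as an (x0, y0, x1, y1) rectangle."""
    x0, y0, x1, y1 = designated_area
    x, y = drop_point
    long_pressed = press_duration_ms >= LONG_PRESS_MS
    in_area = x0 <= x <= x1 and y0 <= y <= y1
    return long_pressed and in_area

trash_bin = (0, 900, 100, 1000)  # hypothetical trash-bin region
assert should_delete(800, (50, 950), trash_bin) is True
assert should_delete(200, (50, 950), trash_bin) is False  # press too short
```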
In one example, the third touch operation is an operation of touching a preset button in the article display area. Illustratively, on the control interface, the item display area of the target item is provided with a delete button; when the user clicks the delete button, the terminal device deletes the target item in response to that operation.
According to the article processing method based on man-machine interaction provided above, the terminal device determines, in response to a first touch operation on a target item on the control interface, that the target item indicated by the first touch operation is in the selected state; it then increases the number of target items in response to a second touch operation on the first key in the first area, reduces the number in response to a second touch operation on the second key in the second area, and deletes the target item in response to a third touch operation on the item display area. Because the first key and the second key are arranged in the first area and the second area respectively, completely separated on the left and right sides of the item display area, touching one key cannot accidentally trigger the other, and the terminal device will not respond incorrectly when the user touches either key. This improves the accuracy with which the terminal device responds to the user's touch operations and improves the user experience.
In yet another example, fig. 6 is a schematic diagram of a control interface of another article processing method based on man-machine interaction according to an embodiment of the present application. As shown in fig. 6 (a), the terminal device displays only the basic information of the target item in the target item's display area on the control interface, and the terminal device may respond to a first touch operation of the user on the target item. As shown in fig. 6 (b), the terminal device may determine to increase the quantity in response to a second touch operation touching the right half of the target item's display area, thereby increasing the number of target items; or it may determine to decrease the number of target items in response to a second touch operation touching the left half of the display area, and then display the adjusted quantity in the lower right corner. Because the touch areas of the left and right halves are large enough, touching one half cannot accidentally trigger the other; the terminal device therefore gives a correct response to the user's touch operation, improving its response accuracy.
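The Fig. 6 variant — splitting the item display area into two halves rather than using separate keys — reduces to a midpoint test. The sketch below is an assumed geometry (left edge plus width, touch x-coordinate); the patent does not give coordinates, and the choice that the exact midpoint counts as the right half is the author's.

```python
# Sketch (assumed geometry) of the Fig. 6 half-area touch determination.
def half_area_action(touch_x, area_left, area_width):
    """Return 'increase' if the touch lands in the right half of the item
    display area, 'decrease' if it lands in the left half."""
    midpoint = area_left + area_width / 2
    return "increase" if touch_x >= midpoint else "decrease"

assert half_area_action(touch_x=700, area_left=100, area_width=800) == "increase"
assert half_area_action(touch_x=200, area_left=100, area_width=800) == "decrease"
```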
The following are device embodiments of the present application, which may be used to perform method embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments of the present application.
Fig. 7 is a schematic structural diagram of an article processing device based on man-machine interaction according to an embodiment of the present application. The device is provided on a terminal device, which displays a control interface including at least one article. As shown in fig. 7, the article processing device 70 based on man-machine interaction of the present embodiment includes: a determination unit 701 and a processing unit 702.
The determining unit 701 is configured to determine, in response to a first touch operation applied to a target object on the control interface, that the target object indicated by the first touch operation is in a selected state.
A processing unit 702, configured to adjust the number of target objects in response to a second touch operation acting on the first area or a second touch operation acting on the second area, and display the adjusted number; the first area and the second area are respectively positioned on two boundary areas of the object display area of the target object on the control interface.
Fig. 8 is a schematic structural diagram of another article processing device based on man-machine interaction according to an embodiment of the present application. The device is provided on a terminal device, which displays a control interface including at least one article. As shown in fig. 8, the article processing device 80 based on man-machine interaction of the present embodiment includes: a determination unit 801 and a processing unit 802.
The determining unit 801 is configured to determine, in response to a first touch operation applied to a target object on the control interface, that the target object indicated by the first touch operation is in a selected state.
A processing unit 802, configured to adjust the number of target objects in response to a second touch operation acting on the first area or a second touch operation acting on the second area, and display the adjusted number; the first area and the second area are respectively positioned on two boundary areas of the object display area of the target object on the control interface.
In one example, a first key is disposed in a first region and a second key is disposed in a second region, and the processing unit 802 includes an increasing module 8021 and a decreasing module 8022.
The increasing module 8021 is configured to increase the number of target objects in response to a second touch operation applied to the first key in the first area.
The reducing module 8022 is configured to reduce the number of target objects in response to a second touch operation applied to the second key in the second area.
In one example, the first key is located on the right side of the item display area and the second key is located on the left side of the item display area.
Alternatively, the first key is located on the left side of the item display area and the second key is located on the right side of the item display area.
In one example, if the second touch operation is a click operation, the processing unit 802 is specifically configured to determine, in response to the click operation, an adjustment amount of the target item according to the number of click operations, so as to adjust the quantity of the target item.
In one example, the apparatus 80 further comprises: and a deletion unit 803.
And a deleting unit 803 for deleting the target article in response to a third touch operation acting on the article display region.
In one example, the third touch operation is an operation of long pressing and dragging to the designated area, and the deleting unit 803 is specifically configured to delete the target item in response to the operation of long pressing and dragging to the designated area acting on the item display area.
In one example, the third touch operation is an operation of touching a preset button of the article display area.
It should be understood that the division of the above apparatus into modules is merely a division of logical functions; in an actual implementation, the modules may be fully or partially integrated into one physical entity or physically separated. These modules may all be implemented in software invoked by a processing element, all in hardware, or partly in software invoked by a processing element and partly in hardware. The functions of the above data processing modules may be stored as program code in a memory of the apparatus and invoked and executed by a processing element of the apparatus; the other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element here may be an integrated circuit with signal processing capability. In an implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit in hardware in a processor element or by instructions in the form of software.
Fig. 9 is a schematic structural diagram of a terminal device provided in an embodiment of the present application. As shown in fig. 9, the terminal device 90 includes: a processor 901, and a memory 902 communicatively coupled to the processor.
Wherein the memory 902 stores computer-executable instructions; processor 901 executes computer-executable instructions stored in memory 902 to implement a method as in any of the preceding claims.
In the specific implementation of the terminal device described above, it should be understood that the processor may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The method disclosed in connection with the embodiments of the present application may be embodied directly in hardware processor execution or in a combination of hardware and software modules in a processor.
Embodiments of the present application also provide a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, are configured to implement a method as in any of the preceding claims.
Those of ordinary skill in the art will appreciate that all or part of the steps for implementing the above method embodiments may be completed by hardware related to program instructions. The foregoing program may be stored in a computer-readable storage medium; when the program is executed, it performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
Embodiments of the present application also provide a computer program product comprising a computer program for implementing a method as in any of the preceding claims when executed by a processor.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (4)

1. An article processing method based on man-machine interaction, which is characterized in that a control interface is displayed through a terminal device, wherein the control interface comprises at least one article, and the method comprises the following steps:
responding to a first touch operation of a target object acting on the control interface, and determining that the target object indicated by the first touch operation is in a selected state; when the target object is in a non-selected state, the object display area of the target object in the control interface only displays basic information of the target object, and when the target object is in a selected state, a first area and a second area which are positioned on two boundary areas of the object display area of the target object are displayed on the control interface aiming at the target object, wherein each area is provided with a corresponding id so as to prevent a touch event from being transmitted to a sub-layout when a user touches any one area;
adjusting the number of the target objects in response to a second touch operation acting on the first area or a second touch operation acting on the second area, and displaying the adjusted number; wherein the first region and the second region are respectively positioned on two boundary regions of an article display region of the target article on the control interface;
a first key is arranged in the first area, and a second key is arranged in the second area; the adjusting the number of the target objects in response to the second touch operation acting on the first area or the second touch operation acting on the second area includes:
responding to a second touch operation acted on a first key in the first area, and increasing the number of the target objects;
responding to a second touch operation acting on a second key in a second area, and reducing the number of the target objects;
the first key is positioned on the right side of the article display area, and the second key is positioned on the left side of the article display area;
or the first key is positioned on the left side of the article display area, and the second key is positioned on the right side of the article display area;
the second touch operation is a click operation, and the adjusting the number of the target objects in response to the second touch operation acting on the first area or the second touch operation acting on the second area includes:
responding to the clicking operation, and determining the adjustment quantity of the target object according to the times of the clicking operation so as to adjust the quantity of the target object;
deleting the target object in response to a third touch operation acting on the object display area;
the third touch operation is long-press and drag operation to the designated area, and the deleting the target object in response to the third touch operation acting on the object display area includes:
and deleting the target object in response to a long press and drag operation acting on the object display area to a designated area.
2. An article processing device based on man-machine interaction, which is characterized in that the device is arranged on a terminal device, a control interface is displayed through the terminal device, at least one article is included in the control interface, and the device comprises:
the determining unit is used for responding to a first touch operation of the target object acting on the control interface and determining that the target object indicated by the first touch operation is in a selected state; when the target object is in a non-selected state, the object display area of the target object in the control interface only displays basic information of the target object, and when the target object is in a selected state, a first area and a second area which are positioned on two boundary areas of the object display area of the target object are displayed on the control interface aiming at the target object, wherein each area is provided with a corresponding id so as to prevent a touch event from being transmitted to a sub-layout when a user touches any one area;
the processing unit is used for responding to the second touch operation acting on the first area or the second touch operation acting on the second area, adjusting the quantity of the target objects and displaying the adjusted quantity; wherein the first region and the second region are respectively positioned on two boundary regions of an article display region of the target article on the control interface;
a first key is arranged in the first area, and a second key is arranged in the second area; the first key is positioned on the right side of the article display area, and the second key is positioned on the left side of the article display area; or the first key is positioned on the left side of the article display area, and the second key is positioned on the right side of the article display area;
the processing unit is specifically configured to increase the number of the target objects in response to a second touch operation that acts on the first key in the first area; responding to a second touch operation acting on a second key in a second area, and reducing the number of the target objects;
the second touch operation is a clicking operation;
the processing unit is further used for responding to the clicking operation, and determining the adjustment quantity of the target articles according to the times of the clicking operation so as to adjust the quantity of the target articles;
the processing unit is further used for deleting the target object in response to the operation of long-pressing and dragging the object to the designated area, wherein the operation acts on the object display area.
3. A terminal device, characterized in that the terminal device comprises: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the method of any one of claims 1-2.
4. A computer readable storage medium having stored therein computer executable instructions which when executed by a processor are adapted to carry out the method of any of claims 1-2.
CN202210753870.0A 2022-06-29 2022-06-29 Article processing method and device based on man-machine interaction and terminal equipment Active CN115113796B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210753870.0A CN115113796B (en) 2022-06-29 2022-06-29 Article processing method and device based on man-machine interaction and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210753870.0A CN115113796B (en) 2022-06-29 2022-06-29 Article processing method and device based on man-machine interaction and terminal equipment

Publications (2)

Publication Number Publication Date
CN115113796A CN115113796A (en) 2022-09-27
CN115113796B true CN115113796B (en) 2023-12-29

Family

ID=83330625

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210753870.0A Active CN115113796B (en) 2022-06-29 2022-06-29 Article processing method and device based on man-machine interaction and terminal equipment

Country Status (1)

Country Link
CN (1) CN115113796B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011058736A1 (en) * 2009-11-12 2011-05-19 京セラ株式会社 Portable terminal, input control program and input control method
CN102279707A (en) * 2011-08-07 2011-12-14 孙强国 Method for inputting characters through enlarging key spacing of soft keyboard
CN103049199A (en) * 2012-12-14 2013-04-17 中兴通讯股份有限公司 Touch screen terminal, control device and working method of touch screen terminal
JP2014211726A (en) * 2013-04-18 2014-11-13 三菱電機株式会社 Quantity selection device and terminal
KR102404095B1 (en) * 2021-07-30 2022-06-02 쿠팡 주식회사 Method for providing information on items to customer and apparatus thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100136616A (en) * 2009-06-19 2010-12-29 삼성전자주식회사 Method and apparatus for reducing the multi touch input error in portable communication system
US8803825B2 (en) * 2011-09-27 2014-08-12 Carefusion 303, Inc. System and method for filtering touch screen inputs

Also Published As

Publication number Publication date
CN115113796A (en) 2022-09-27

Similar Documents

Publication Publication Date Title
CN101553775B (en) Operating touch screen interfaces
EP2825944B1 (en) Touch screen hover input handling
KR101872533B1 (en) Three-state touch input system
US9292161B2 (en) Pointer tool with touch-enabled precise placement
US20210096886A1 (en) Account management user interfaces
US20150143285A1 (en) Method for Controlling Position of Floating Window and Terminal
US20160132205A1 (en) System and method for linking applications
CN104007894A (en) Portable device and method for operating multiapplication thereof
CN105378635A (en) Multi-region touchpad
US20120179963A1 (en) Multi-touch electronic device, graphic display interface thereof and object selection method of multi-touch display
CN108553894A (en) Display control method and device, electronic equipment, storage medium
US8558806B2 (en) Information processing apparatus, information processing method, and program
CN106020698A (en) Mobile terminal and realization method of single-hand mode
CN108764873B (en) Service processing method, device and equipment
KR20160004590A (en) Method for display window in electronic device and the device thereof
WO2022252748A1 (en) Method and apparatus for processing virtual object, and device and storage medium
CN112799571B (en) Display control method, device, terminal and storage medium for secondary confirmation
CN110417984B (en) Method, device and storage medium for realizing operation in special-shaped area of screen
US11144178B2 (en) Method for providing contents for mobile terminal on the basis of user touch and hold time
CN115113796B (en) Article processing method and device based on man-machine interaction and terminal equipment
JPS63226716A (en) Touch input detecting system
CN111127780B (en) PIN input detection method of full-touch POS terminal
KR101260016B1 (en) Method and touch-screen device for implementing pointer interface using skin-type interface
US20210048937A1 (en) Mobile Device and Method for Improving the Reliability of Touches on Touchscreen
CN104461240A (en) Focus selection method and terminal device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant