CN113126863B - Object selection implementation method and device, storage medium and electronic equipment - Google Patents

Object selection implementation method and device, storage medium and electronic equipment

Info

Publication number
CN113126863B
CN113126863B (application CN202110425671.2A)
Authority
CN
China
Prior art keywords
main control
option
control object
selection
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110425671.2A
Other languages
Chinese (zh)
Other versions
CN113126863A (en)
Inventor
Tong Shuai (佟帅)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Jizhi Digital Technology Co Ltd
Original Assignee
Shenzhen Jizhi Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Jizhi Digital Technology Co Ltd
Priority to CN202110425671.2A
Publication of CN113126863A
Application granted
Publication of CN113126863B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides an object selection implementation method and apparatus, an electronic device, and a storage medium, and relates to the field of computer technology. The method includes the following steps: displaying a main control object and a first option object; moving the main control object in response to a user's drag operation on it; when the main control object interacts with the first option object, adsorbing the first option object onto the main control object as a selected object; and, in response to a received first end instruction, displaying the main control object and all the selected objects adsorbed onto it. The method provides a convenient selection interaction to improve the user experience.

Description

Object selection implementation method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an object selection implementation method and apparatus, a storage medium, and an electronic device.
Background
With the development of science and technology, browsing houses online has become an important way to view and choose housing. In an online house-browsing scenario, a user can make selections according to his or her housing requirements, and houses that meet those conditions can then be recommended to the user for online browsing.
In the related art, when a user specifies the various requirements for a house, the requirements and the options under them are usually presented as menu list items, and the user must select, modify, and submit them one by one. This monotonous presentation and cumbersome selection process easily leads to a poor user experience.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The disclosure aims to provide an object selection implementation method, an object selection implementation device, electronic equipment and a storage medium, so as to solve the problem of poor user experience.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to one aspect of the present disclosure, there is provided an object selection implementation method, including:
displaying a main control object and a first option object; moving the main control object in response to a drag operation of the main control object by a user; when the main control object interacts with the first option object, the first option object is taken as a selected object and adsorbed on the main control object; and displaying the main control object and all selected objects adsorbed by the main control object in response to the received first end instruction.
In an embodiment of the present disclosure, the method for implementing object selection further includes: responding to the received second ending instruction, and displaying the main control object and all selected objects adsorbed by the main control object in a first display mode; and receiving a submission instruction in the first display mode, and generating selection result data in response to the submission instruction.
In one embodiment of the present disclosure, before receiving the commit instruction, the method further includes: receiving a deleting instruction of the selected object; and in response to the deleting instruction, removing the selected object from the main control object, and displaying the first option object.
In one embodiment of the present disclosure, receiving a delete instruction for the selected object includes: displaying a deletion mark of the selected object; receiving click operation of a deletion mark on the selected object; and generating the deleting instruction based on the clicking operation.
In one embodiment of the present disclosure, displaying the main control object and all selected objects adsorbed by the main control object includes: displaying a combined object of the main control object and the selected object; wherein the combined object comprises: displaying the thumbnail of the selected object around the edge of the main control object; or, on one side of the main control object, displaying the thumbnails of the selected objects according to the selected sequence.
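As a purely illustrative sketch (not part of the patent), the "around the edge" arrangement described above can be computed by placing thumbnail centers evenly on a circle around the main control object; all names below are hypothetical:

```python
import math

def thumbnail_positions(center_x, center_y, radius, count):
    """Hypothetical helper: place `count` thumbnail centers evenly on a
    circle of `radius` around the main control object's center."""
    positions = []
    for i in range(count):
        # Evenly spaced angles, starting at the 3 o'clock position.
        angle = 2 * math.pi * i / count
        positions.append((center_x + radius * math.cos(angle),
                          center_y + radius * math.sin(angle)))
    return positions
```

The alternative layout in the same paragraph (thumbnails listed on one side in selection order) would instead simply offset each thumbnail by a fixed step along one axis.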
In an embodiment of the present disclosure, the method for implementing object selection further includes: and in response to the received continuous selection instruction, presenting the combination object and the second option object.
In one embodiment of the present disclosure, presenting the second option object includes: acquiring portrait data selected by a user; determining a distribution probability of the second option object from the user-selected portrait data based on the selected object; and displaying the second option object by using a second display mode based on the distribution probability.
In one embodiment of the present disclosure, before responding to a drag operation of the user on the main control object, the method further includes: responding to a starting operation instruction, and controlling the main control object to enter an operation state; wherein the start operation instruction comprises: and performing long-time pressing operation on the main control object, or performing clicking operation on a preset button, or performing double-click operation on the main control object.
In an embodiment of the present disclosure, before presenting the main control object and the first option object, the method further includes: responding to the selection operation of a display theme, and acquiring a theme template of the display theme; the displaying the main control object and the first option object comprises: rendering the main control object and the first option object according to the theme template; displaying the rendered main control object and the first option object; and, said presenting said combined object and second option object comprises: rendering the combined object and a second option object according to the theme template; and displaying the rendered combined object and the second option object.
In one embodiment of the present disclosure, the interaction of the main control object with the first option object includes: the overlapping state of the main control object and the first option object meets a preset condition;
wherein the overlapping state satisfying a preset condition comprises: the overlapping area is greater than an area threshold, or the overlapping duration is greater than an overlapping duration threshold.
According to another aspect of the present disclosure, there is provided an object selection implementing apparatus, including:
the first display module is used for displaying the main control object and the first option object; the response module is used for responding to the dragging operation of the user on the main control object and moving the main control object; the selection module is used for taking the first option object as a selected object to be adsorbed on the main control object when the main control object interacts with the first option object; and the second display module is used for responding to the received first end instruction and displaying the main control object and all the selected objects adsorbed by the main control object.
In an embodiment of the present disclosure, the response module is further configured to: responding to the received second ending instruction, and displaying the main control object and all selected objects adsorbed by the main control object by using a first display mode; and receiving a commit instruction in the first display mode, and generating selection result data in response to the commit instruction.
In an embodiment of the disclosure, before receiving the commit instruction, the response module is further configured to: receiving a deleting instruction of the selected object; and in response to the deleting instruction, removing the selected object from the main control object, and displaying the first option object.
In an embodiment of the present disclosure, the receiving, by the response module, a delete instruction for the selected object includes: displaying a deletion mark of the selected object; receiving click operation of a deletion mark on the selected object; and generating the deleting instruction based on the clicking operation.
In an embodiment of the disclosure, the displaying, by the second display module, the main control object and all selected objects adsorbed by the main control object includes: displaying a combined object of the main control object and the selected object; wherein the combined object comprises: displaying the thumbnail of the selected object around the edge of the main control object; or, on one side of the main control object, displaying the thumbnails of the selected objects according to the selected sequence.
In an embodiment of the present disclosure, the second display module is further configured to: and in response to the received continuous selection instruction, presenting the combined object and the second option object.
In an embodiment of the disclosure, the displaying the second option object by the second displaying module includes: acquiring portrait data selected by a user; determining a distribution probability of the second option object from the user-selected portrait data based on the selected object; and displaying the second option object by using a second display mode based on the distribution probability.
In an embodiment of the disclosure, before responding to the dragging operation of the main control object by the user, the first presentation module is further configured to: responding to a starting operation instruction, and controlling the main control object to enter an operation state; wherein the start operation instruction comprises: and performing long-time pressing operation on the main control object, or performing clicking operation on a preset button, or performing double-click operation on the main control object.
In an embodiment of the disclosure, before the first display module displays the main control object and the first option object, the first display module further includes: responding to the selection operation of a display theme, and acquiring a theme template of the display theme; the displaying the main control object and the first option object comprises: rendering the main control object and the first option object according to the theme template; displaying the rendered main control object and the first option object; and, said presenting said combined object and second option object comprises: rendering the combined object and a second option object according to the theme template; and displaying the rendered combined object and the second option object.
In an embodiment of the present disclosure, the selecting module is further configured to determine that the main control object interacts with the first option object, and includes: determining that the overlapping state of the main control object and the first option object meets a preset condition; wherein the overlapping state satisfying a preset condition comprises: the overlapping area is greater than an area threshold, or the overlapping duration is greater than an overlapping duration threshold.
According to yet another aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described object selection implementation method.
According to still another aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the above object selection implementation via execution of the executable instructions.
According to the object selection implementation method provided by the embodiments of the present disclosure, by dragging the main control object to interact with an option object, the option object can be adsorbed onto the main control object as a selected object, and all selected objects can be displayed on the main control object, providing a convenient selection interaction that improves the user experience.
Furthermore, the object selection implementation method provided by the embodiments of the present disclosure can combine the user's selection portrait data to obtain an option distribution probability for accurate pushing, so that the option object can be displayed in a preset display mode, further improving the user experience.
Furthermore, the object selection implementation method provided by the embodiment of the disclosure can also provide different display themes for the user to select, and then obtain corresponding theme templates to render the main control object and the option objects in the interaction, so that the display content manner is diversified, and the user experience is further improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which an object selection implementation of an embodiment of the present disclosure may be applied;
FIG. 2 illustrates a flow diagram of an object selection implementation method of one embodiment of the present disclosure;
FIG. 3A is a diagram illustrating an application of an object selection implementation of one embodiment of the present disclosure;
FIG. 3B is a diagram illustrating an application of an object selection implementation of one embodiment of the present disclosure;
FIG. 3C is a diagram illustrating an application of an object selection implementation of one embodiment of the present disclosure;
FIG. 4 is a diagram illustrating an application of an object selection implementation method according to an embodiment of the present disclosure;
FIG. 5 illustrates a flow diagram of an object selection implementation method of one embodiment of the present disclosure;
FIG. 6 is a flowchart illustrating a method for displaying a second option object in an object selection implementation method according to an embodiment of the disclosure;
fig. 7 is a schematic application diagram of a presentation method for a second option object in an object selection implementation method according to an embodiment of the present disclosure;
FIG. 8 illustrates a flow diagram of an object selection implementation method of one embodiment of the present disclosure;
FIG. 9 is a diagram illustrating an application of an object selection implementation of one embodiment of the present disclosure;
FIG. 10 shows a block diagram of an object selection implementation apparatus of one embodiment of the present disclosure;
FIG. 11 is a block diagram illustrating an architecture of an object selection implementation computer device in an embodiment of the present disclosure; and
fig. 12 shows a program product for implementing the above method according to an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present disclosure, "a plurality" means at least two, e.g., two, three, etc., unless explicitly and specifically limited otherwise.
In view of the above technical problems in the related art, embodiments of the present disclosure provide an object selection implementation method for solving at least one or all of the above technical problems.
FIG. 1 depicts a schematic diagram of an exemplary system architecture to which an object selection implementation of an embodiment of the disclosure may be applied; as shown in fig. 1:
the system architecture may include a server 101, a network 102, and a client 103. Network 102 serves as a medium for providing communication links between clients 103 and server 101. Network 102 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The server 101 may be a server that provides various services, and for example, may receive an operation performed by a user using the client 103, may generate an instruction based on the operation, and feed back a processing result to the client 103.
The client 103 may be a mobile terminal such as a mobile phone, a game console, a tablet computer, an electronic book reader, smart glasses, a smart home device, an AR (Augmented Reality) device, a VR (Virtual Reality) device, or the client 103 may also be a personal computer such as a laptop computer, a desktop computer, and the like.
In some optional embodiments, the user may perform a dragging operation in the interface provided by the client 103, and the server 101 may control the main control object to interact with the first option object in the interface provided by the client 103 according to the dragging operation to determine the selected object, and record the selected object by the server 101; the user may also perform other operations (e.g., a click operation or a long-press operation) in the interface provided by the client 103, and the server 101 may generate other instructions (e.g., a first end instruction, a submit instruction, etc.) according to the operations, and further display the operation result to the user in the interface provided by the client 103 for the user to use.
Before the second option object is displayed to the user in the interface of the client 103, the server 101 may further obtain the user-selected portrait data, calculate a distribution probability of the second option based on the selected object selected by the user, and then display the second option object to the user in the interface of the client 103 in a second display mode based on the distribution probability.
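The text does not specify how the distribution probability is computed. The following is one hypothetical sketch in which co-selection counts derived from portrait data score each candidate second option, and the scores are normalized into a probability distribution (all names and the counting scheme are assumptions, not the patented method):

```python
def option_distribution(selected, candidates, cooccurrence):
    """Hypothetical sketch: score each candidate second option by how
    often users who chose the same selected objects also chose it
    (co-selection counts), then normalize into a distribution."""
    scores = {c: sum(cooccurrence.get((s, c), 0) for s in selected)
              for c in candidates}
    total = sum(scores.values())
    if total == 0:
        # No portrait data available: fall back to a uniform distribution.
        return {c: 1.0 / len(candidates) for c in candidates}
    return {c: scores[c] / total for c in candidates}
```

The resulting probabilities could then drive the "second display mode", e.g. by ordering or sizing the second option objects.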
It should be understood that the numbers of clients, networks, and servers in fig. 1 are merely illustrative. The server 101 may be a physical server, a server cluster composed of a plurality of servers, or a cloud server, and there may be any number of clients, networks, and servers according to actual needs.
Hereinafter, each step of the object selection implementation method in the exemplary embodiment of the present disclosure will be described in more detail with reference to the drawings and the embodiments.
FIG. 2 shows a flowchart of an object selection implementation method of one embodiment of the present disclosure. The method provided by the embodiment of the present disclosure may be executed by a server or a client as shown in fig. 1, but the present disclosure is not limited thereto.
In the following description, the server 101 is used as the execution subject for illustration.
As shown in fig. 2, the method for implementing object selection provided by the embodiment of the present disclosure may include the following steps:
step S201, displaying a main control object and a first option object;
step S202, responding to the dragging operation of the user to the main control object, and moving the main control object;
step S203, when the main control object interacts with the first option object, the first option object is taken as a selected object to be adsorbed on the main control object;
and step S204, responding to the received first end instruction, and displaying the main control object and all the selected objects adsorbed by the main control object.
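As a purely illustrative sketch (not the patented implementation), the four steps above can be modeled as a minimal interaction loop; all class and function names are hypothetical, and `interacts_with` stands in for the overlap test described later in this description:

```python
class MainControlObject:
    """Minimal sketch of the drag-select flow in steps S201-S204."""

    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y
        self.selected = []  # option objects adsorbed so far

    def drag_to(self, x, y, options, interacts_with):
        # Step S202: move the main control object with the drag operation.
        self.x, self.y = x, y
        # Step S203: any option object the move interacts with becomes a
        # selected object and is adsorbed onto the main control object.
        for option in list(options):
            if interacts_with(self, option):
                options.remove(option)
                self.selected.append(option)

    def end_round(self):
        # Step S204: on the first end instruction, report the main control
        # object together with all adsorbed selected objects.
        return {"position": (self.x, self.y), "selected": list(self.selected)}
```

Step S201 corresponds to the initial rendering of the main control object and the `options` list before the loop begins.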
The embodiments of the present disclosure can be applied to online house-browsing scenarios, as well as to other scenarios that require selection. In one scenario, one or more rounds of selection, each with one or more options, may be preset, so that the options are displayed to the user through the client in fig. 1, and the user can control the main control object through the client in fig. 1 to interact with them, thereby realizing object selection.
In step S201, a first option object in a round of selection may be displayed to a user, so that the user may operate the main control object to interact with the first option object, thereby implementing object selection; or simultaneously showing a plurality of first option objects in a round of selection to the user, so that the user can operate the main control object to interact with one object, or sequentially interact with a plurality of objects, thereby realizing object selection.
The main control object and the first option object may be presented statically or dynamically on a two-dimensional plane. For example, they may be displayed as planar images (such as circles or squares) on a display panel (such as an LED or liquid-crystal display) or a touch panel (such as the screen of a touch phone or tablet), where a preset graphic indicates the information corresponding to each object. For example, the main control object may be labeled with the word "main" or "U", and the first option object may be labeled with its specific option content.
Similarly, the main control object and the first option object may also be presented in a three-dimensional space, such as: the presentation is performed in the form of holographic projection, and accordingly, the presentation forms of the main control object and the first option object may be stereoscopic images (e.g., a sphere, a cube, etc.).
In some embodiments, when the main control object and the first option object are displayed, a prompt may also be shown to guide the user to make a selection, for example through voice, text, or image information.
In step S202, the client may receive the user's drag operation on the main control object, under which the user can move it. For example, the user can drag the main control object to move on a two-dimensional plane, or drag it to move in a three-dimensional space.
In step S203, the user may drag the main control object to interact with the first option object through the client (for example, to overlap the main control object with the first option object, or to make a distance between the main control object and the first option object smaller than a preset distance), so as to attach the first option object, as the selected object, to the main control object.
In some embodiments, during the process of dragging the main control object by the user, the user may interact with one first option object to adsorb the first option object, or sequentially interact with a plurality of first option objects to adsorb the plurality of first option objects sequentially.
In step S204, the first end instruction may indicate that the user has finished selecting for the moment, or temporarily does not need to manipulate the main control object. The first end instruction may be, for example, detecting on the touch screen that the user has ended the drag operation on the main control object, or that the user has clicked a button such as "end operation for now". When the client detects the first end instruction, it can display the main control object and all the selected objects adsorbed onto it.
The object selection implementation method provided by the disclosure can be applied to selection scenes in different services, and the disclosure does not limit the selection scenes.
Fig. 3A is an application diagram of the object selection implementation method according to an embodiment of the present disclosure, taking a house-viewing scenario as an example. As shown in fig. 3A, the display interface shows a main control object and four first option objects, specifically: a main control object 301, first option objects 302 to 305, and a prompt message 306. The main control object 301 may be labeled with the word "main" and is dragged and controlled by the user; the first option object 302 may be labeled "one person", indicating an occupancy requirement of one person; the first option object 303 may be labeled "two people", indicating an occupancy requirement of two people; the first option object 304 may be labeled "three people", indicating an occupancy requirement of three people; the first option object 305 may be labeled "four and more", indicating an occupancy requirement of four or more people; and the prompt message 306 may remind the user that the selection of the "number of occupants" is currently in progress.
In one embodiment of the present disclosure, the interaction of the main control object with the first option object includes: the overlapping state of the main control object and the first option object meets a preset condition; wherein the overlapping state satisfying a preset condition comprises: the overlapping area is greater than an area threshold, or the overlapping duration is greater than an overlapping duration threshold.
The user's selection intention can be recognized by setting preset conditions. For example, the overlap duration threshold may be set to 0.5 seconds; when the user drags the main control object to overlap a first option object, if the overlap duration does not reach 0.5 seconds, the overlapped first option object is not adsorbed to the main control object and no selection result is generated. In this way, accidental operations by some users can be avoided and user experience improved.
In some practical applications, the overlapping state satisfying the preset condition may instead require both: the overlapping area is greater than the area threshold and the overlapping duration is greater than the overlapping duration threshold; other conditions may also be set.
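As an illustrative sketch only (not taken from the present disclosure), the overlap test described above could be implemented as follows; all names and threshold values are assumptions:

```typescript
// Axis-aligned bounding box, the usual simplification for hit-testing draggable UI objects.
interface Rect { x: number; y: number; w: number; h: number; }

// Area of the intersection of two rectangles; 0 when they do not overlap.
function overlapArea(a: Rect, b: Rect): number {
  const w = Math.min(a.x + a.w, b.x + b.w) - Math.max(a.x, b.x);
  const h = Math.min(a.y + a.h, b.y + b.h) - Math.max(a.y, b.y);
  return w > 0 && h > 0 ? w * h : 0;
}

// Preset condition from the text: overlap area above a threshold OR overlap sustained
// longer than a duration threshold (an AND variant is equally possible).
function meetsPresetCondition(
  area: number,
  durationMs: number,
  areaThreshold: number,
  durationThresholdMs: number,
): boolean {
  return area > areaThreshold || durationMs > durationThresholdMs;
}
```

With a duration threshold of 500 ms (the 0.5-second example above), a brief accidental overlap with a small area would not trigger adsorption.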
Fig. 3B is an application diagram of the object selection implementation method according to an embodiment of the present disclosure, taking a house-viewing scenario as an example. Fig. 3B shows a display interface in which a dragged main control object and 4 first option objects are displayed, specifically including: a main control object 307, a first option object 308, a first option object 309, a first option object 310, and a first option object 311. The main control object interacts with a first option object as follows: the user can drag the main control object 307 to move in the interface; during the movement, the main control object 307 and the first option object 309 can interact through image overlap, so that the first option object 309 is adsorbed on the main control object 307.
In some embodiments, after the first option object is adsorbed on the main control object, the first option object may move along with the movement of the main control object; after the main control object adsorbs one first option object in the moving process, the main control object can also continuously move to adsorb other first option objects, so that adsorption of a plurality of first option objects is realized.
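A minimal sketch of this adsorb-and-follow behavior, with illustrative names that are not from the source:

```typescript
// The main control object accumulates adsorbed options and carries them as it moves.
class MainControl {
  private selected: string[] = [];

  constructor(public x: number, public y: number) {}

  // Adsorb a first option object as a selected object; repeated adsorption is ignored.
  adsorb(optionId: string): void {
    if (!this.selected.includes(optionId)) this.selected.push(optionId);
  }

  // Moving the main control object implicitly moves every adsorbed thumbnail with it,
  // since thumbnails are rendered relative to (x, y).
  moveTo(x: number, y: number): void {
    this.x = x;
    this.y = y;
  }

  getSelected(): string[] {
    return [...this.selected];
  }
}
```

Continuing to move after one adsorption simply triggers `adsorb` again for each further option the object interacts with.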
In one embodiment of the present disclosure, displaying the main control object and all selected objects adsorbed by the main control object includes: displaying a combined object of the main control object and the selected objects; wherein displaying the combined object includes: displaying thumbnails of the selected objects around the edge of the main control object; or, on one side of the main control object, displaying the thumbnails of the selected objects in the order of selection.
By the method in the embodiment, the combination object can avoid occupying too much space when displaying the complete information on the combination object, so that the user experience is improved.
Fig. 3C illustrates an application diagram of the object selection implementation method according to an embodiment of the present disclosure, taking a house-viewing scenario as an example. As shown in fig. 3C, the display interface presents a main control object with an adsorbed first option object after a first end instruction has been received, together with 3 first option objects, specifically including: a combined object 312 formed by the main control object and the first option object, a first option object 313, a first option object 314, and a first option object 315. The main control object and the first option object it has adsorbed may be displayed jointly as the combined object 312, where the first option object, having become a selected object, may be displayed as a thumbnail on the main control object. In this embodiment, the selected object may display the label "two people" before being selected, and after being adsorbed on the main control object as a selected object may display the same label in a smaller form.
In some embodiments, when the user drags the main control object to interact with and adsorb a first option object, the first option object may keep its original form; after the first end instruction takes effect, the first option object may be presented on the main control object in thumbnail form.
In an embodiment of the present disclosure, the method for implementing object selection further includes: responding to the received second ending instruction, and displaying the main control object and all selected objects adsorbed by the main control object by using a first display mode; and receiving a submission instruction in the first display mode, and generating selection result data in response to the submission instruction.
The second ending instruction may be used to indicate ending selection, and the main control object is not dragged any more, and the second ending instruction may be generated in response to an ending button clicked by a user, or may be generated when the operation duration of the user reaches an operation duration threshold, or may be generated in response to an external instruction from another system; the first display mode may enlarge the main control object and all the selected objects adsorbed by the main control object to a preset degree, or may change the colors or the presentation effects of the main control object and all the selected objects adsorbed by the main control object.
In the first display mode, the client may receive a submission instruction from the user; for example, the user can generate a submission instruction by clicking a "submit" button. After receiving the submission instruction, the server side can generate the user's selection result data based on the selected objects on the main control object. In a house-viewing scenario, the selection result data can be used to recommend houses meeting the selection result for the user to browse. For example, after the user performs multiple rounds of selection using the object selection implementation method provided by the present disclosure, the selection result may include the following information: one occupant, facing south or west, whole-house rental, and a lease term of half a year or less; houses meeting these conditions can then be determined for the user to browse.
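The accumulation of multi-round selections into selection result data might be sketched as follows; the topic names and value strings are hypothetical, not taken from the disclosure:

```typescript
// One round of selection: which topic was being chosen, and the label that was selected.
interface RoundSelection {
  topic: string;
  selected: string;
}

// Collapse all rounds into a topic → selection map; if a topic is reselected in a
// later round, the later selection overwrites the earlier one.
function buildSelectionResult(rounds: RoundSelection[]): Record<string, string> {
  const result: Record<string, string> = {};
  for (const round of rounds) {
    result[round.topic] = round.selected;
  }
  return result;
}
```

Such a map can then be matched against house attributes server-side to filter the recommendations.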
Fig. 4 illustrates an application diagram of the object selection implementation method according to an embodiment of the present disclosure, taking a house-viewing scenario as an example. As shown in fig. 4, the display interface presents a combined object 401, formed by the main control object enlarged to a preset degree together with all of its adsorbed selected objects after the second end instruction has been received; the combined object 401 shown in this embodiment can present the selection result obtained after the user performs multiple rounds of selection.
In one embodiment of the present disclosure, before responding to the user's drag operation on the main control object, the method further includes: responding to a start operation instruction, and controlling the main control object to enter an operation state; wherein the start operation instruction includes: a long-press operation on the main control object, a click operation on a preset button, or a double-click operation on the main control object.
The preset button can be a button at a fixed position in the interface, or can be arranged on the main control object. After the start operation instruction takes effect, the client allows the user to drag the main control object; this embodiment can confirm the user's operation intention and avoid accidental operations by some users.
In some practical embodiments, when it is detected that the user touches the main control object, the user may be considered to have an operation intention, and the main control object can then be dragged by the user.
In an embodiment of the present disclosure, before presenting the main control object and the first option object, the method further includes: responding to the selection operation of the display theme, and acquiring a theme template of the display theme; the displaying the main control object and the first option object comprises: rendering the main control object and the first option object according to the theme template; displaying the rendered main control object and the first option object; and, said presenting said combined object and second option object comprises: rendering the combined object and a second option object according to the theme template; and displaying the rendered combined object and the second option object.
By rendering with different theme templates, main control objects and first option objects of different styles can be displayed in the client interface; in some practical applications the whole interface may be rendered, or auditory effects, tactile effects, and the like may be added. The theme template can be matched to the scenario in which the method is applied, diversifying the available display modes and improving user experience.
Fig. 5 is a flowchart illustrating an object selection implementation method according to an embodiment of the present disclosure, and as shown in fig. 5, the method includes:
Step S501, receiving a theme template selected by a user, and rendering an interface on the client; the interface on the client may display an interface background, prompt information, the main control object, and the first option objects of the first selection round;
Step S502, receiving a click of the start operation button by the user; after the user clicks the start operation button, the main control object can be dragged;
Step S503, in response to the user's drag operation on the main control object, moving the main control object, and when the main control object interacts with a first option object, adsorbing the first option object on the main control object as a selected object;
Step S504, in response to a first end instruction, displaying a combined object of the main control object and the selected object; the selected object may be presented around the main control object in thumbnail style;
Step S505, in response to a second end instruction, displaying the main control object and all its adsorbed selected objects in a first display mode;
Step S506, in response to a submission instruction, generating selection result data;
Step S507, displaying the houses meeting the conditions according to the selection result data.
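Steps S501–S507 can be sketched as a small state reducer; this is a hedged illustration with assumed event and state names, not the disclosure's implementation:

```typescript
interface FlowState {
  dragEnabled: boolean;   // set by the start operation button (S502)
  selected: string[];     // options adsorbed so far (S503)
  submitted: boolean;     // set once the submission instruction arrives (S506)
}

type FlowEvent =
  | { kind: "startOperation" }            // S502: start operation button clicked
  | { kind: "interact"; option: string }  // S503: main control object meets an option
  | { kind: "submit" };                   // S506: generate selection result data

function reduce(state: FlowState, event: FlowEvent): FlowState {
  switch (event.kind) {
    case "startOperation":
      return { ...state, dragEnabled: true };
    case "interact":
      // Only adsorb while dragging is enabled, and never adsorb the same option twice.
      return state.dragEnabled && !state.selected.includes(event.option)
        ? { ...state, selected: [...state.selected, event.option] }
        : state;
    case "submit":
      return { ...state, submitted: true };
  }
}
```

Gating `interact` on `dragEnabled` mirrors the requirement that dragging is only possible after the start operation button is clicked.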
In an embodiment of the present disclosure, the method for implementing object selection further includes: and in response to the received continuous selection instruction, presenting the combination object and the second option object.
Similar to dragging the main control object, the user can drag the combined object to move it; when the combined object interacts with a second option object, the second option object can be adsorbed on the combined object as a selected object. In response to receiving the first end instruction, the combined object and all selected objects adsorbed by it may be displayed.
In some practical applications, after the combined object and the second option objects are displayed, the user may choose not to make a selection and skip the current round by issuing a continue selection instruction.
In one embodiment of the present disclosure, displaying the second option object includes: acquiring user-selection portrait data; determining a distribution probability of the second option object from the user-selection portrait data based on the selected object; and displaying the second option object in a second display mode based on the distribution probability.
The second display mode may display the options in different sizes, colors, or shapes, or control the distance between an option and the main control object. In this embodiment, the user-selection portrait data and the object selected by the user can be combined to determine the distribution probability of each second option object, and the second option objects can then be displayed in the second display mode based on that probability to meet the user's needs.
The following takes a house-viewing scenario as an example: suppose a user has selected "one person" and is now selecting the house area. Combining big-data user-selection portrait data, it is calculated that among users who selected "one person", 80% chose a house area of ≤ 50, 15% chose 50–100, and the remaining 5% chose other areas. The method in this embodiment can then, in the interface where the user selects the house area, maximize the graphic area of the "≤ 50" second option object, make the "50–100" second option object the second largest, and minimize the graphic areas of the second option objects for the other areas, thereby accurately pushing option objects for display and matching user needs more precisely.
In some practical applications, the options can also be displayed in the second display mode according to a specific service push plan, so as to adapt to service requirements.
In some practical applications, the user's attribute information (such as stored age, gender, hobbies, and the like) can be queried before the user makes a selection, enabling accurate pushing for the user and improving user experience.
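The probability-weighted sizing in the second display mode could be sketched as below; the linear scaling rule and the area bounds are assumptions, not specified by the disclosure:

```typescript
// Map each option's distribution probability to a graphic area: the most probable
// option gets maxArea, and the others shrink proportionally toward minArea.
function optionAreas(
  probabilities: number[],
  minArea: number,
  maxArea: number,
): number[] {
  const pMax = Math.max(...probabilities);
  return probabilities.map(p => minArea + (maxArea - minArea) * (p / pMax));
}
```

With the 80% / 15% / 5% example above, the "≤ 50" option gets the full `maxArea`, the "50–100" option a much smaller area, and the remainder close to `minArea`.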
Fig. 6 is a flowchart illustrating a method for displaying a second option object in an object selection implementation method according to an embodiment of the present disclosure, and as shown in fig. 6, the method includes:
Step S601, displaying a main control object and first option objects;
Step S602, in response to the user's drag operation on the main control object, moving the main control object, and when the main control object interacts with a first option object, adsorbing the first option object on the main control object as a selected object;
Step S603, in response to a first end instruction, displaying a combined object of the main control object and the selected object;
Step S604, in response to a received continue selection instruction, acquiring user-selection portrait data;
Step S605, based on the selected object, determining the distribution probability of the second option objects from the user-selection portrait data;
Step S606, displaying the combined object, and displaying the second option objects in a second display mode based on the distribution probability.
Fig. 7 is an application diagram of the presentation method for the second option object in the object selection implementation method according to an embodiment of the present disclosure. Fig. 7 shows a display interface including: a combined object 701, second option objects 702, 703, 704, and 705 for selecting the house area requirement, and prompt information 706. The graphic area of the second option object 702, indicating a house area of ≤ 50, is the largest; the graphic area of the second option object 703, indicating a house area of 50–100, is the second largest; and the graphic areas of the second option object 704 (house area of 100–150) and the second option object 705 (house area greater than 150) are smaller. Through the display interface in this embodiment, the "≤ 50" second option object that the user is more likely to select can be highlighted, improving user experience.
In one embodiment of the present disclosure, before receiving the commit instruction, the method further includes: receiving a deleting instruction of the selected object; and in response to the deleting instruction, removing the selected object from the main control object, and displaying the first option object.
Further, in an embodiment of the present disclosure, receiving a deletion instruction for the selected object includes: displaying a deletion mark of the selected object; receiving click operation of a deletion mark on the selected object; and generating the deleting instruction based on the clicking operation.
The deletion mark can be presented as a small "×" button or a button labeled "delete", and can be arranged on the selected object or at a preset position in the interface. The user may issue a deleting instruction at any time before issuing the submission instruction; for example, the deleting instruction can be issued immediately after the selected object is adsorbed, or after the combined object is displayed in the first display mode. After the selected object is removed from the main control object by the deleting instruction and the first option object is displayed again, the other first option objects of the selection round to which it belongs can be displayed at the same time, so that the user can drag the main control object to interact with another first option object, realizing reselection.
For example: when selecting the number of occupants, the user drags the main control object and selects "one person" from "one person", "two people", and "three people"; the "one person" option object is then adsorbed on the main control object as a selected object, and the interface displays the combined object of the main control object and the "one person" selected object, together with the "two people" and "three people" option objects. After the user clicks the small "×" button on the "one person" selected object of the combined object, the "one person" selected object can be removed from the main control object and displayed again as an option object; since the "two people" and "three people" option objects are still displayed in the interface, the user can drag the main control object to reselect.
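The deletion-and-reselection flow in the example above might be sketched as follows (function and list names are illustrative assumptions):

```typescript
// Remove a selected object from the main control object's adsorbed list and return
// it to the displayed option objects, so the round can be selected again.
function deleteSelected(
  adsorbed: string[],
  displayedOptions: string[],
  target: string,
): { adsorbed: string[]; displayedOptions: string[] } {
  return {
    adsorbed: adsorbed.filter(id => id !== target),
    displayedOptions: displayedOptions.includes(target)
      ? displayedOptions
      : [...displayedOptions, target],
  };
}
```

Applied to the "one person" example: deleting "one person" empties the adsorbed list and restores "one person" alongside the still-displayed "two people" and "three people".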
Fig. 8 is a flowchart illustrating an object selection implementation method according to an embodiment of the present disclosure, and as shown in fig. 8, the method includes:
Step S801, receiving a click of the start operation button by the user;
Step S802, in response to the user's drag operation on the main control object, moving the main control object, and when the main control object interacts with a first option object, adsorbing the first option object on the main control object as a selected object;
Step S803, in response to a first end instruction, displaying a combined object of the main control object and the selected object;
Step S804, in response to a click operation on the deletion mark of the selected object, generating a deleting instruction;
Step S805, in response to the deleting instruction, removing the selected object from the main control object and displaying the first option object; the other option objects may be displayed at the same time, so that after the first option object and the other option objects are displayed, the user can drag the main control object to interact with any option object and adsorb it, realizing reselection.
FIG. 9 is an application diagram of the object selection implementation method according to an embodiment of the present disclosure, taking a house-viewing scenario as an example. As shown in FIG. 9, the display interface may be the one that appears after a deleting instruction for the selected object "≦ 50" is received in the display interface of FIG. 4, so that the user may reselect the house area requirement. The display interface shown in FIG. 9 may specifically include: a combined object 901, first option objects 902, 903, 904, and 905 for selecting the house area requirement, and prompt information 906.
It is to be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to an exemplary embodiment of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
FIG. 10 shows a block diagram of an object selection implementation 1000 of one embodiment of the present disclosure; as shown in fig. 10, includes:
a first display module 1001 for displaying a main control object and a first option object;
a response module 1002, configured to move the main control object in response to a dragging operation of the main control object by a user;
a selecting module 1003, configured to, when the main control object interacts with the first option object, adsorb the first option object as a selected object on the main control object;
a second presentation module 1004, configured to present the main control object and all the selected objects adsorbed by the main control object in response to the received first end instruction.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Accordingly, various aspects of the present invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
Fig. 11 shows a block diagram of an object selection implementation computer device in an embodiment of the present disclosure. It should be noted that the electronic device shown in fig. 11 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiment of the present invention.
An electronic device 1100 according to this embodiment of the invention is described below with reference to fig. 11.
As shown in fig. 11, electronic device 1100 is embodied in the form of a general purpose computing device. The components of the electronic device 1100 may include, but are not limited to: the at least one processing unit 1110, the at least one memory unit 1120, and a bus 1130 that couples various system components including the memory unit 1120 and the processing unit 1110.
Wherein the memory unit stores program code that may be executed by the processing unit 1110 to cause the processing unit 1110 to perform steps according to various exemplary embodiments of the present invention as described in the "exemplary methods" section above in this specification. For example, the processing unit 1110 may perform step S201 as shown in fig. 2, exposing the main control object and the first option object; step S202, responding to the dragging operation of the user to the main control object, and moving the main control object; step S203, when the main control object interacts with the first option object, the first option object is taken as a selected object to be adsorbed on the main control object; and step S204, responding to the received first end instruction, and displaying the main control object and all the selected objects adsorbed by the main control object.
The storage unit 1120 may include a readable medium in the form of a volatile memory unit, such as a random access memory unit (RAM) 11201 and/or a cache memory unit 11202, and may further include a read only memory unit (ROM) 11203.
Storage unit 1120 may also include a program/utility 11204 having a set (at least one) of program modules 11205, such program modules 11205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1130 may be representative of one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1100 may also communicate with one or more external devices 1000 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1100, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 1100 to communicate with one or more other computing devices. Such communication can occur via an input/output (I/O) interface 1150. Also, the electronic device 1100 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 1160. As shown, the network adapter 1160 communicates with the other modules of the electronic device 1100 over the bus 1130. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1100, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Fig. 12 shows a program product for implementing the above method according to an embodiment of the present invention. As shown in fig. 12, the program product may employ a portable compact disc read only memory (CD-ROM) and include program codes, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions for causing a computing device (which may be a personal computer, a server, a mobile terminal, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.

Claims (11)

1. An object selection implementation method, comprising:
displaying a main control object and a first option object, wherein the first option object is a conditional option used to generate selection result data;
moving the main control object in response to a drag operation performed on the main control object by a user;
when the main control object interacts with the first option object, taking the first option object as a selected object of the current selection round and attaching it to the main control object;
in response to receiving a first end instruction for the current selection round, displaying a combined object of the main control object and the selected objects of the current selection round, wherein displaying the combined object comprises: displaying thumbnails of the selected objects of the current selection round around the edge of the main control object, or displaying, on one side of the main control object, the thumbnails of the selected objects of the current selection round in the order of selection;
in response to receiving a second end instruction indicating the end of selection, displaying, in a first display mode, the main control object and all selected objects of all selection rounds attached to the main control object, wherein the first display mode comprises: enlarging the main control object and all the selected objects attached to it to a preset degree; and
receiving a submission instruction in the first display mode, and generating the selection result data based on all the selected objects attached to the main control object in response to the submission instruction.
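As a non-authoritative illustration, the selection flow recited in claim 1 can be sketched in Python. The class, its method names, and the dict-based result format are all assumptions made for illustration; the claim does not prescribe any concrete data structure.

```python
class MainControlObject:
    """Minimal sketch of the claim-1 flow: attach selected options,
    end rounds, and generate selection result data on submission."""

    def __init__(self):
        self.rounds = []   # one list of selected objects per finished round
        self.current = []  # objects attached during the current round

    def attach(self, option):
        """The option interacted with the main control object:
        take it as a selected object of the current round."""
        self.current.append(option)

    def end_round(self):
        """First end instruction: freeze the current round; at this point
        thumbnails of its selected objects would be displayed around or
        beside the main control object."""
        self.rounds.append(self.current)
        self.current = []

    def submit(self):
        """Second end instruction plus submission instruction: generate
        the selection result data from all rounds' selected objects."""
        if self.current:
            self.end_round()
        return {"selected": [o for rnd in self.rounds for o in rnd]}
```

For example, attaching two options, ending the round, attaching one more, and submitting yields a result built from all three selected objects across both rounds.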
2. The method of claim 1, further comprising, prior to receiving the submission instruction:
receiving a deletion instruction for the selected object; and
in response to the deletion instruction, detaching the selected object from the main control object and redisplaying the first option object.
3. The method of claim 2, wherein receiving the deletion instruction for the selected object comprises:
displaying a deletion mark on the selected object;
receiving a click operation on the deletion mark of the selected object; and
generating the deletion instruction based on the click operation.
4. The method of claim 1, further comprising:
in response to receiving a continue-selection instruction, displaying the combined object and a second option object.
5. The method of claim 4, wherein displaying the second option object comprises:
acquiring profile data selected by the user;
determining, from the profile data and based on the selected object, a distribution probability of the second option object; and
displaying the second option object in a second display mode based on the distribution probability, wherein the second display mode displays the second option object according to a distance between the second option object and the main control object.
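A minimal sketch of such a probability-dependent display mode follows, assuming a linear mapping from distribution probability to on-screen distance (the mapping function, its name, and the default distance are illustrative assumptions; the claim does not specify a concrete mapping):

```python
def place_by_probability(options, max_distance=300.0):
    """Place options so that a higher distribution probability means a
    shorter distance to the main control object.

    `options` maps an option id to a probability in [0, 1]; the return
    value maps each id to an illustrative on-screen distance in pixels.
    """
    return {name: round((1.0 - p) * max_distance, 1)
            for name, p in options.items()}
```

With this assumed mapping, an option with probability 1.0 sits directly at the main control object, while an option with probability 0.0 is placed at the maximum distance.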
6. The method according to any one of claims 1-4, further comprising, prior to responding to the drag operation performed on the main control object by the user:
in response to a start-operation instruction, controlling the main control object to enter an operable state;
wherein the start-operation instruction comprises: a long-press operation on the main control object, a click operation on a preset button, or a double-click operation on the main control object.
7. The method of claim 6, further comprising, prior to displaying the main control object and the first option object: in response to a selection operation on a display theme, acquiring a theme template of the display theme;
wherein displaying the main control object and the first option object comprises: rendering the main control object and the first option object according to the theme template, and displaying the rendered main control object and first option object; and
displaying the combined object and the second option object comprises: rendering the combined object and the second option object according to the theme template, and displaying the rendered combined object and second option object.
8. The method of any of claims 1-4, wherein the main control object interacting with the first option object comprises:
an overlapping state of the main control object and the first option object satisfying a preset condition;
wherein the overlapping state satisfying the preset condition comprises: an overlap area being greater than an area threshold, or an overlap duration being greater than a duration threshold.
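The preset condition of claim 8 can be sketched as an axis-aligned bounding-box test. The `Rect` type, function names, and the default threshold values are illustrative assumptions, not part of the claim:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned bounding box: (x, y) is the top-left corner."""
    x: float
    y: float
    w: float
    h: float

def overlap_area(a: Rect, b: Rect) -> float:
    """Area of the intersection of two rectangles (0.0 if disjoint)."""
    dx = min(a.x + a.w, b.x + b.w) - max(a.x, b.x)
    dy = min(a.y + a.h, b.y + b.h) - max(a.y, b.y)
    return dx * dy if dx > 0 and dy > 0 else 0.0

def interacts(main: Rect, option: Rect, overlap_seconds: float,
              area_threshold: float = 100.0,
              duration_threshold: float = 0.5) -> bool:
    """Claim 8's preset condition: the overlap area exceeds the area
    threshold, OR the overlap has lasted longer than the duration
    threshold (both threshold values are illustrative)."""
    return (overlap_area(main, option) > area_threshold
            or overlap_seconds > duration_threshold)
```

In a real UI, `overlap_seconds` would be accumulated per frame while the dragged main control object stays over the option object.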
9. An object selection implementation apparatus, comprising:
a first display module, configured to display a main control object and a first option object;
a response module, configured to move the main control object in response to a drag operation performed on the main control object by a user;
a selection module, configured to, when the main control object interacts with the first option object, take the first option object as a selected object of the current selection round and attach it to the main control object; and
a second display module, configured to, in response to receiving a first end instruction for the current selection round, display a combined object of the main control object and the selected objects of the current selection round, wherein the combined object comprises: thumbnails of the selected objects of the current selection round displayed around the edge of the main control object, or thumbnails of the selected objects of the current selection round displayed, in the order of selection, on one side of the main control object;
wherein the apparatus is further configured to, in response to receiving a second end instruction indicating the end of selection, display, in a first display mode, the main control object and all selected objects of all selection rounds attached to the main control object, the first display mode comprising: enlarging the main control object and all the selected objects attached to it to a preset degree;
and to receive a submission instruction in the first display mode and, in response to the submission instruction, generate selection result data.
10. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the object selection implementation method according to any one of claims 1 to 8.
11. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the object selection implementation method as claimed in any one of claims 1 to 8.
CN202110425671.2A 2021-04-20 2021-04-20 Object selection implementation method and device, storage medium and electronic equipment Active CN113126863B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110425671.2A CN113126863B (en) 2021-04-20 2021-04-20 Object selection implementation method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113126863A CN113126863A (en) 2021-07-16
CN113126863B true CN113126863B (en) 2023-02-17

Family

ID=76778282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110425671.2A Active CN113126863B (en) 2021-04-20 2021-04-20 Object selection implementation method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113126863B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113743061B (en) * 2021-09-08 2024-04-30 深圳集智数字科技有限公司 Numerical range adjustment method, device, electronic equipment and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109087162A * 2018-07-05 2018-12-25 杭州朗和科技有限公司 Data processing method, system, medium and computing device
CN110308843A * 2018-03-27 2019-10-08 阿里巴巴集团控股有限公司 Object processing method and device
CN112116679A * 2020-08-25 2020-12-22 通号城市轨道交通技术有限公司 Train operation diagram generation method and device and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017058665A1 (en) * 2015-10-01 2017-04-06 Vid Scale, Inc. Methods and systems for client interpretation and presentation of zoom-coded content

Similar Documents

Publication Publication Date Title
CN101669165B (en) Apparatus, system, and method for presenting images in a multiple display environment
US11604580B2 (en) Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device
Lal Digital design essentials: 100 ways to design better desktop, web, and mobile interfaces
TWI728419B (en) Interactive method, device and equipment
WO2022205772A1 (en) Method and apparatus for displaying page element of live-streaming room
US20120249542A1 (en) Electronic apparatus to display a guide with 3d view and method thereof
CN111324252B (en) Display control method and device in live broadcast platform, storage medium and electronic equipment
WO2014078804A2 (en) Enhanced navigation for touch-surface device
US20020067378A1 (en) Computer controlled user interactive display interfaces with three-dimensional control buttons
CN111432264A (en) Content display method, device and equipment based on media information stream and storage medium
CN111221456A (en) Interactive panel display method, device, equipment and storage medium thereof
CA3055683A1 (en) Reader mode for presentation slides in a cloud collaboration platform
CN107819930A (en) A kind of function prompt method and system
US11099731B1 (en) Techniques for content management using a gesture sensitive element
US20230054388A1 (en) Method and apparatus for presenting audiovisual work, device, and medium
CN108121581B (en) User interface for self-learning
CN114116098B (en) Application icon management method and device, electronic equipment and storage medium
CN115269886A (en) Media content processing method, device, equipment and storage medium
CN113126863B (en) Object selection implementation method and device, storage medium and electronic equipment
US11397511B1 (en) System and method for implementing improved user interface
CN113536755A (en) Method, device, electronic equipment, storage medium and product for generating poster
US8413062B1 (en) Method and system for accessing interface design elements via a wireframe mock-up
CN109416638B (en) Customizable compact overlay window
CN116192789A (en) Cloud document processing method and device and electronic equipment
CN113360064A (en) Method and device for searching local area of picture, medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant