CN113126866B - Object determination method, device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113126866B
Authority
CN
China
Prior art keywords
objects
relative distance
layer
determining
target
Prior art date
Legal status
Active
Application number
CN202110464010.0A
Other languages
Chinese (zh)
Other versions
CN113126866A
Inventor
陈娜
张军
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110464010.0A priority Critical patent/CN113126866B/en
Publication of CN113126866A publication Critical patent/CN113126866A/en
Application granted granted Critical
Publication of CN113126866B publication Critical patent/CN113126866B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/26 Visual data mining; Browsing structured data

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an object determining method, an object determining device, electronic equipment and a storage medium, belongs to the technical field of cloud computing, and further relates to the technical field of big data cloud. The specific implementation scheme of the object determination method is as follows: acquiring first position information of an operating body pointing to a display interface, wherein the display interface comprises a plurality of objects with overlapping areas, and each object has second position information; determining the relative distance between the operating body and each object according to the first position information and the second position information of each object; and determining a target object pointed by the operation body from the plurality of objects according to the relative distance between the operation body and each object.

Description

Object determination method, device, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of cloud computing, and further relates to the technical field of big data cloud, in particular to an object determining method, an object determining device, electronic equipment and a storage medium.
Background
With the rapid development of computers, the reconstruction and in-depth analysis of massive data is no longer an insurmountable difficulty. By combining data analysis with visual presentation using data processing software or platforms, dry data can be converted into intuitive images, analyzed in real time, and displayed visually, enabling integrated analysis of massive data, providing a basis for enterprise decision making, and supporting intelligent business operation strategies.
Disclosure of Invention
The disclosure provides an object determination method, an object determination device, electronic equipment and a storage medium.
According to an aspect of the present disclosure, there is provided an object determining method including:
acquiring first position information of an operating body pointing to a display interface, wherein the display interface comprises a plurality of objects with overlapping areas, and each object has second position information;
determining the relative distance between the operating body and each object according to the first position information and the second position information of each object; and
determining a target object pointed to by the operating body from the plurality of objects according to the relative distance between the operating body and each object.
According to another aspect of the present disclosure, there is provided an object determining apparatus including: the device comprises an acquisition module, a first determination module and a second determination module.
The acquisition module is configured to acquire first position information of an operating body pointing to a display interface, wherein the display interface comprises a plurality of objects having overlapping areas, and each object has second position information;
the first determining module is used for determining the relative distance between the operating body and each object according to the first position information and the second position information of each object; and
the second determining module is configured to determine a target object pointed to by the operating body from the plurality of objects according to the relative distance between the operating body and each object.
According to another aspect of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as described above.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing the computer to perform the method as described above.
According to another aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method as described above.
By utilizing the embodiment of the disclosure, in the case that the display interface comprises a plurality of objects with overlapping areas, the target object pointed by the operation body is automatically determined from the plurality of objects through the relative distance between the operation body and each object on the display interface, so that the operation experience is improved. The method and the device solve the problem that when a plurality of objects with overlapping areas exist on the display interface, the target object cannot be accurately determined.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 schematically illustrates an exemplary system architecture to which object determination methods and apparatus may be applied, according to embodiments of the present disclosure;
FIG. 2 schematically illustrates a flow chart of an object determination method according to an embodiment of the disclosure;
FIG. 3 schematically illustrates a schematic view of a plurality of objects having overlapping regions according to an embodiment of the disclosure;
FIG. 4 schematically illustrates a schematic diagram of an object determination method provided in accordance with an embodiment of the present disclosure;
FIG. 5 schematically illustrates a schematic diagram of an object determination method provided in accordance with another embodiment of the present disclosure;
FIG. 6 schematically illustrates a schematic diagram of an object determination method provided in accordance with another embodiment of the present disclosure;
FIG. 7 schematically illustrates a block diagram of an object determination apparatus according to an embodiment of the disclosure; and
FIG. 8 schematically illustrates a block diagram of an electronic device adapted to implement an object determination method according to an embodiment of the disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
With the continuous development of computer technology, big data processing and analysis technologies are also on the rise. These technologies support effective data mining and data analysis and help enterprises make strategic decisions, enabling better progress and development.
However, data processing and analysis usually generate a large number of data charts. When many charts are present on a display interface, they may overlap or sit on different layers, so a target object cannot be accurately determined or selected, subsequent editing operations become cumbersome, and the user experience suffers.
The present disclosure provides an object determination method, an object determination device, an electronic device, and a storage medium. The object determining method comprises the following steps: acquiring first position information of an operating body pointing to a display interface, wherein the display interface comprises a plurality of objects with overlapping areas, and each object has second position information; determining the relative distance between the operating body and each object according to the first position information and the second position information of each object; and determining a target object pointed by the operation body from the plurality of objects according to the relative distance between the operation body and each object.
Fig. 1 schematically illustrates an exemplary system architecture to which object determination methods and apparatuses may be applied according to embodiments of the present disclosure.
It should be noted that fig. 1 is only an example of a system architecture to which embodiments of the present disclosure may be applied to assist those skilled in the art in understanding the technical content of the present disclosure, but does not mean that embodiments of the present disclosure may not be used in other devices, systems, environments, or scenarios. For example, in another embodiment, an exemplary system architecture to which the object determining method and apparatus may be applied may include a terminal device, but the terminal device may implement the object determining method and apparatus provided by the embodiments of the present disclosure without interacting with a server.
As shown in fig. 1, a system architecture 100 according to this embodiment may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired and/or wireless communication links, and the like.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103, to receive or send messages and the like. Various client applications may be installed on the terminal devices 101, 102, 103, such as knowledge-reading applications, web browser applications, search applications, instant messaging tools, email clients, and/or social platform software (as examples only).
The terminal devices 101, 102, 103 may be a variety of electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablets, laptop and desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (by way of example only) providing support for content browsed by the user using the terminal devices 101, 102, 103. The background management server may analyze and process the received data such as the user request, and feed back the processing result (e.g., the web page, information, or data obtained or generated according to the user request) to the terminal device.
It should be noted that, the object determining method provided by the embodiments of the present disclosure may be generally performed by the terminal device 101, 102, or 103. Accordingly, the object determining apparatus provided by the embodiments of the present disclosure may also be provided in the terminal device 101, 102, or 103.
Alternatively, the object determination method provided by the embodiments of the present disclosure may be generally performed by the server 105. Accordingly, the object determining apparatus provided by the embodiments of the present disclosure may be generally provided in the server 105. The object determination method provided by the embodiments of the present disclosure may also be performed by a server or a server cluster that is different from the server 105 and is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105. Accordingly, the object determining apparatus provided by the embodiments of the present disclosure may also be provided in a server or a server cluster different from the server 105 and capable of communicating with the terminal devices 101, 102, 103 and/or the server 105.
For example, when the user performs data processing and analysis, the terminal devices 101, 102, 103 may acquire the first position information pointed to by the user on the display interface using an operating body such as a mouse, and then send the acquired first position information to the server 105. The server 105 analyzes the first position information together with the second position information of each of the plurality of objects on the display interface to determine the relative distance between the operating body and each object, and determines the target object pointed to by the operating body according to those relative distances. Alternatively, the first position information pointed to by the operating body on the display interface may be acquired by a server or server cluster capable of communicating with the terminal devices 101, 102, 103 and/or the server 105, which then determines the target object pointed to by the operating body.
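Since the embodiments monitor DOM-style cursor properties such as offsetX and offsetY, the sketches in this description use TypeScript. As a minimal illustration of the client/server split above, the sketch below posts the first position information from the terminal device to a hypothetical endpoint and receives the id of the determined target object; the endpoint path and the payload/response field names are assumptions, not part of the disclosed embodiments.

```typescript
// Hypothetical sketch: the terminal device sends the first position information to the
// server, which replies with the id of the determined target object.
// The endpoint "/api/determine-target" and the field names are assumptions.
async function requestTargetObject(event: MouseEvent): Promise<string> {
  const firstPosition = { offsetX: event.offsetX, offsetY: event.offsetY };
  const response = await fetch("/api/determine-target", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(firstPosition),
  });
  const result = (await response.json()) as { targetObjectId: string };
  return result.targetObjectId; // id of the target object determined on the server side
}
```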
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Fig. 2 schematically illustrates a flowchart of an object determination method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S201 to S203.
In operation S201, first position information of an operating body pointing to a display interface is acquired, wherein the display interface includes a plurality of objects having overlapping areas, each object having second position information.
In operation S202, a relative distance of the operation body from each object is determined according to the first position information and the second position information of each object.
In operation S203, a target object to which the operation body points is determined from among the plurality of objects according to the relative distance of the operation body from each object.
According to the embodiments of the present disclosure, the type of the operation body is not limited. For example, the operating body includes, but is not limited to, a mouse, a user's finger, a stylus, and the like.
According to the embodiments of the present disclosure, the type of the display interface is not limited. For example, the display interface may be, but is not limited to, a display interface on a BI (Business Intelligence) platform, a display interface on a Hadoop platform, a display interface on an HPCC (High Performance Computing and Communications) platform, and the like. The above display interfaces are only exemplary; other display interfaces of big data processing and analysis tools known in the art may also be used, as long as big data processing and analysis results can be displayed. Optionally, the display interface may be a large visualization screen of a BI platform.
According to the embodiment of the disclosure, the first position information of the operating body pointing to the display interface may be the position information of the cursor corresponding to a mouse on the display interface, or the position information of the point directly touched by the user's finger or stylus. The cursor may be dot-shaped, a vertical bar, or the like, and the first position information may refer to the position of the center point of the operating body or of any point on the operating body.
According to the embodiments of the present disclosure, the type of the object on the display interface is not limited. For example, the object may be an image directly acquired with an image acquisition device, a data table, a ring chart formed from data, a histogram, a graph, or the like. But is not limited thereto, for example, the object may also be: a layer with at least one image directly acquired with the image acquisition device, a layer with at least one data table, a layer with at least one ring graph, histogram, graph formed from the data, etc.
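For the illustrative sketches in this description, such an object can be modelled as a rectangle attached to a layer; the interface and field names below are assumptions used only for illustration, not terms defined by the disclosure.

```typescript
// Illustrative data model (assumed field names): each object on the display interface
// is a chart or a layer occupying a rectangular region of the editing area.
interface DisplayObject {
  id: string;      // e.g. "chart-301" or "layer-304"
  layerId: string; // layer the object belongs to, e.g. "layer-305"
  x: number;       // left edge in screen coordinates of the editing area
  y: number;       // top edge
  width: number;   // width of the object's bounding rectangle
  height: number;  // height of the bounding rectangle
}

// Example loosely corresponding to fig. 3: two partially overlapping charts on layer 305.
const objects: DisplayObject[] = [
  { id: "chart-301", layerId: "layer-305", x: 40, y: 60, width: 200, height: 150 },
  { id: "chart-302", layerId: "layer-305", x: 180, y: 120, width: 200, height: 150 },
];
```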
According to the embodiments of the present disclosure, the display positions of the plurality of objects on the display interface are not limited. Such as a spacing distribution, a partial overlap distribution, a coverage distribution, etc.
According to an embodiment of the present disclosure, the second position information of each object is not limited. For example, the location information of the geometric center point of each object, the location information of a plurality of different points on the boundary of each object, the location information of the vertex of each object, and so on.
According to the embodiment of the present disclosure, the relative distance of the operation body from each object may be determined according to the first position information and the second position information of each object. The relative distance in the embodiment of the present disclosure may be a linear distance between any point on the operation body and a geometric center point of each object, but is not limited thereto, and may be a linear distance between any point on the operation body and a plurality of points of each object.
According to the embodiment of the disclosure, the relative distance between the operating body and each object can be used for characterizing the relative position relationship between the operating body and each object, and the target object specifically pointed by the operating body can be determined by using the relative distance.
For example, the display interface is provided with a first object and a second object, by using the method provided by the embodiment of the disclosure, a first relative distance between the operating body and a certain point of the first object and a second relative distance between the operating body and a corresponding certain point on the second object can be determined, and by using the first relative distance and the second relative distance, whether the operating body points to the first object or the second object can be determined, and further, whether the operating body points to the target object can be determined. However, the present invention is not limited to this, and it is also possible to determine a first relative distance between the operating body and a plurality of points on the boundary of the first object and a second relative distance between the operating body and a corresponding plurality of points on the boundary of the second object, and to determine whether the operating body is directed to the first object or the second object, and thus to determine the target object to which the operating body is directed, using the first relative distance and the second relative distance.
By using the object determination method provided by the embodiments of the present disclosure, the target object pointed to by the operating body can be automatically determined from the plurality of objects through the relative distance between the operating body and each object on the display interface, which improves the operating experience. This solves the problem that the target object cannot be accurately determined when a plurality of objects with overlapping areas exist on the display interface.
The method shown in fig. 2 is further described below with reference to fig. 3-6, in conjunction with the exemplary embodiment.
Fig. 3 schematically illustrates a schematic view of a plurality of objects having overlapping regions according to an embodiment of the present disclosure.
As shown in fig. 3, the plurality of objects on the display interface of the embodiments of the present disclosure may include a graph 301, a graph 302, and a graph 303, but are not limited thereto, and may also include a layer 304 and a layer 305. According to an embodiment of the present disclosure, layer 304 may be a background layer, while graphs 301, 302, and 303 are charts such as bar charts or ring charts distributed on layer 305. Layer 304 is combined with layer 305, on which graphs 301, 302, and 303 are distributed, to form a set of analyzable reports.
According to embodiments of the present disclosure, the plurality of objects having overlapping regions on the display interface may be a plurality of objects having partially overlapping regions, such as the graph 301 and the graph 302. But is not limited thereto, and may be an object covering an area where other objects are located, such as layer 304 that completely covers layer 305. In addition, the plurality of objects having overlapping regions on the display interface may also include a chart, such as chart 303, that is spaced apart from other charts.
In the process of implementing the present disclosure, it was found that if layer 304 is located above layer 305, on which graph 301, graph 302, and graph 303 are located, that is, if layer 304 is the layer in an editable state, an operating body such as a mouse cannot determine graph 303 as the target object even when pointing at the center of graph 303, and therefore cannot edit it. Moreover, even if layer 304 is absent and only layer 305 is present, when layer 305 is the editable layer and graphs 301 and 302 partially overlap, the object pointed to by the mouse cannot be accurately determined once the mouse points at the overlapping area of graphs 301 and 302. The object the user intends to edit therefore cannot be edited, which makes processing the data table or image difficult and degrades the editing experience.
In the related art, as shown in fig. 3, a management menu may be provided on the display interface, for example on the left side outside the layer editing area. When the user cannot select an object for editing by pointing at it with the operating body in the editable area, the corresponding chart or layer can be activated by pointing at or selecting its name in the management menu. However, when there are many charts, for example hundreds of them, the management menu contains hundreds of corresponding chart names, and selecting or pointing at the right name in the menu takes time, which reduces operating efficiency.
With the object determination method provided by the embodiments of the present disclosure, only the first position information of the operating body pointing to the display interface needs to be acquired; the relative distance between the operating body and each object is determined from the first position information and the second position information of each object, and the target object pointed to by the operating body is determined according to those relative distances. The target object is thus determined automatically, which is more flexible and practical in operation.
According to other embodiments of the present disclosure, the plurality of objects having overlapping areas include a first object and a second object, the first object being an object covering the area where the second object is located, the first object being located in a first layer, and the second object being located in a second layer. It should be noted that, in the embodiments of the present disclosure, an object may refer to a chart distributed on a layer.
In implementing the present disclosure, it was found that when the first object covers the area where the second object is located, the first object is on the first layer, the second object is on the second layer, the first layer is in an editable state, and the first layer sits above the second layer, the operating body cannot penetrate the first layer to select the second object on the second layer. The second object therefore cannot be edited, which degrades the editing experience.
By using the object determination method provided by the embodiments of the present disclosure, the second object can be edited when the target object is the second object in the second layer. It should be noted that the method applies not only to two objects but also to more than two objects, for example three or four. The object determination method provided by the embodiments of the present disclosure determines the target object pointed to by the operating body according to the relative distance between the operating body and each object.
The object determination method provided by the embodiments of the present disclosure thus solves the problem in the prior art that, among multiple layers, an object cannot be edited through the other layers at will, and achieves the effect of automatically determining the target object and automatically switching the editing state of objects on different layers.
According to an embodiment of the present disclosure, editing the second object may be, for example, moving the second object from the second layer to the first layer so as to edit the second object at the first layer.
According to the embodiments of the present disclosure, a target object on a layer that is not in the editing state can be moved to the layer in the editing state, so that the user can edit the target object on the editable layer; after editing is completed, the layer parameters of the target object may later be restored to their original state, although this is not required and they may also be left unrestored. Editing the target object by changing its layer parameters, as provided by the embodiments of the present disclosure, effectively converts multiple objects to the same layer, so that layer integration is completed in the course of editing the objects.
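A minimal sketch of this "move to the editable layer, edit, optionally restore" flow is shown below, assuming a simple layerId field on each object; all names are illustrative, not part of the disclosed embodiments.

```typescript
// Hypothetical sketch of temporarily lifting the target object onto the editable layer,
// editing it there, and optionally restoring its original layer parameter afterwards.
interface LayeredObject { id: string; layerId: string; }

function editOnEditableLayer<T extends LayeredObject>(
  target: T,
  editableLayerId: string,
  edit: (obj: T) => void,
  restoreAfterEdit: boolean = true,
): void {
  const originalLayerId = target.layerId; // remember which layer the object came from
  target.layerId = editableLayerId;       // move it onto the layer in the editing state
  try {
    edit(target);                         // the user edits the object on the editable layer
  } finally {
    if (restoreAfterEdit) {
      target.layerId = originalLayerId;   // restore the original layer parameter
    }
  }
}
```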
According to another embodiment of the present disclosure, the editing of the second object may further be, for example, editing the second object in the second layer through the first layer.
According to other embodiments of the present disclosure, the target object may also be edited across layers. Editing the target object in the cross-layer manner provided by the embodiments of the present disclosure does not change the layer to which the object belongs, and the operation and implementation are simple and feasible.
According to the embodiment of the disclosure, in the case where the target object to which the operation body is directed is determined from among the plurality of objects, the target object is displayed in the display interface in the target display manner, wherein the target display manner is different from the display manner of the other objects than the target object among the plurality of objects.
According to the embodiments of the present disclosure, the target display manner is not particularly limited. It may be, but is not limited to, highlighting the target object, reducing its brightness, or making it flash in some way. Any manner that makes the target object look different from the other objects among the plurality of objects will do, so no further details are given here.
In the embodiments of the present disclosure, displaying the target object in the target display manner indicates to the user which object has been determined as the one the operating body points to.
According to the embodiments of the present disclosure, when the target object is displayed in the target display manner, a confirmation operation by the user may be required before the target object is edited. The type of confirmation operation is not particularly limited; for example, the target object may be selected and activated with the operating body, such as by double-clicking the left mouse button, single-clicking the left button, or double-clicking the right button.
In the embodiments of the present disclosure, the user is given the opportunity to confirm again, and the operation flexibility can be further improved.
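A minimal DOM sketch of the target display manner plus the confirmation step follows, assuming the target object is rendered as an HTML element and that a CSS class named target-highlight provides the distinct display manner; both are assumptions for illustration.

```typescript
// Hypothetical sketch: show the target object in a distinct display manner (a CSS class)
// and wait for a confirming double-click before entering edit mode.
// The class name "target-highlight" and the startEditing callback are assumptions.
function highlightAndAwaitConfirmation(
  targetElement: HTMLElement,
  startEditing: (el: HTMLElement) => void,
): void {
  targetElement.classList.add("target-highlight"); // display manner differs from other objects

  const onConfirm = (): void => {
    targetElement.removeEventListener("dblclick", onConfirm);
    targetElement.classList.remove("target-highlight");
    startEditing(targetElement);                   // user confirmed, begin editing
  };
  targetElement.addEventListener("dblclick", onConfirm);
}
```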
Fig. 4 schematically illustrates a schematic diagram of an object determination method provided according to an embodiment of the present disclosure.
As shown in fig. 4, there are a first object 401 and a second object 402 on the display interface, where the first object 401 completely covers the area of the second object 402. On the display interface, a screen coordinate system can be set virtually, with the top-left vertex of the editing area as the origin (0, 0); the method is not limited thereto, however, and any vertex may be used as the origin, which is not described in detail here.
On the display interface, the first position information of the operating body pointing to the display interface, for example the position information (offsetX, offsetY) of the operating body 403 relative to the coordinate origin, can be acquired by monitoring the cursor of the operating body, for example a mouse. It should be noted that the first position information may be acquired in real time, or only after the operating body has remained still for a preset time, that is, once it is determined that its position no longer changes.
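A minimal DOM sketch of this monitoring step, assuming the editing area is an HTML element and using a 300 ms settle delay as an arbitrary example of the preset time:

```typescript
// Hypothetical sketch: monitor the mouse and report the first position information
// (offsetX, offsetY) only after the cursor has stayed still for a preset time.
// The 300 ms default and the callback name are assumptions.
function watchFirstPosition(
  editingArea: HTMLElement,
  onFirstPosition: (offsetX: number, offsetY: number) => void,
  settleMs: number = 300,
): void {
  let timer: number | undefined;
  editingArea.addEventListener("mousemove", (event: MouseEvent) => {
    if (timer !== undefined) {
      window.clearTimeout(timer); // cursor moved again, restart the settle timer
    }
    timer = window.setTimeout(() => {
      // position has not changed for settleMs: treat it as the first position information
      onFirstPosition(event.offsetX, event.offsetY);
    }, settleMs);
  });
}
```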
According to an embodiment of the present disclosure, the second position information of the first object may be the coordinates of any position on the boundary of the first object, for example the coordinates of the top-left vertex of the boundary; the present disclosure is not limited thereto, however, and the coordinates of the geometric center point of the first object may also be used. It should be noted that any choice is acceptable as long as the second positions of the first object and the second object are selected so as to correspond to each other; this is not repeated here.
According to an alternative embodiment of the present disclosure, the position information of the geometric center point of the first object 401 may be selected as the second position information of the first object 401, and the position information of the geometric center point of the second object 402 may be selected as the second position information of the second object 402. For example, the second position information of the first object 401 is a first center point center1 in fig. 4, and the second position information of the second object 402 is a second center point center2 in fig. 4.
According to the embodiments of the present disclosure, as shown in fig. 4, the first object 401 and the second object 402 are both rectangular. The geometric center point of each object can be calculated from four parameters, namely the coordinates (x, y) of the object's top-left vertex relative to the coordinate origin, the width (width) of the object, and the height (height) of the object; the coordinates of the geometric center point (that is, the second position information) are given by (x + width/2, y + height/2).
It should be noted that the calculation manner and formula for the second position information can be adjusted according to the geometric shape of the objects on the display interface, which is not described in detail here.
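A minimal sketch of this calculation for rectangular objects (the Rect field names are assumptions used only for illustration):

```typescript
// Geometric center of a rectangular object, following the formula
// (x + width / 2, y + height / 2) from the description above.
interface Rect { x: number; y: number; width: number; height: number; }

function geometricCenter(obj: Rect): { cx: number; cy: number } {
  return {
    cx: obj.x + obj.width / 2,  // horizontal center
    cy: obj.y + obj.height / 2, // vertical center
  };
}
```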
According to an embodiment of the present disclosure, the relative distance 1 between the operating body 403 and the first object 401 and the relative distance 2 between the operating body 403 and the second object 402 can be calculated by the Pythagorean theorem.
According to an embodiment of the present disclosure, determining the target object pointed to by the operating body from the plurality of objects according to the relative distance between the operating body and each object may include, for example, determining the object with the smallest relative distance among the plurality of objects according to the relative distance between the operating body and each object, and taking the object with the smallest relative distance as the target object pointed to by the operating body. Alternatively, however, the object with the largest relative distance could be taken as the target object pointed to by the operating body.
According to an alternative embodiment of the present disclosure, the object with the smallest relative distance may be selected as the target object to which the operation body is directed by comparing the relative distance 1 and the relative distance 2.
Selecting the object with the smallest relative distance as the target object pointed to by the operating body, as in this optional embodiment, matches users' common intuition, is practical, and reduces the difficulty of operation.
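A short sketch of this selection step, assuming the same rectangular object model as above (field names are illustrative):

```typescript
// Sketch: compute the relative distance from the operating body to each object's
// geometric center (Pythagorean theorem) and keep the object(s) with the smallest
// distance; a single element means a unique target object.
interface RectObject { id: string; x: number; y: number; width: number; height: number; }

function relativeDistance(offsetX: number, offsetY: number, obj: RectObject): number {
  const cx = obj.x + obj.width / 2;
  const cy = obj.y + obj.height / 2;
  return Math.hypot(offsetX - cx, offsetY - cy); // straight-line distance to the center
}

function closestObjects(offsetX: number, offsetY: number, objects: RectObject[]): RectObject[] {
  const distances = objects.map((obj) => relativeDistance(offsetX, offsetY, obj));
  const minDistance = Math.min(...distances);
  // All objects tied for the minimum distance are returned so that later
  // tie-breaking conditions can be applied when there is more than one.
  return objects.filter((_, i) => distances[i] === minDistance);
}
```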
Fig. 5 schematically illustrates a schematic diagram of an object determination method provided according to another embodiment of the present disclosure.
In the process of implementing the present disclosure, it was found that more than one object may have the smallest relative distance. As shown in fig. 5, the relative distance between the operating body 503 and the first object 501 may equal the relative distance between the operating body 503 and the second object 502, for example when the geometric center points of the first object 501 and the second object 502 coincide. In that case several objects satisfy the minimum relative distance condition, and that condition alone cannot determine a unique target object.
According to the embodiments of the present disclosure, the area of each object with the smallest relative distance in the display interface can then be calculated, and the target object can be determined according to the sizes of those areas.
As shown in fig. 5, the area of the first object 501 in the display interface and the area of the second object 502 in the display interface may be calculated, and the target object may be determined by comparing the sizes of the areas.
According to the embodiments of the present disclosure, comparing areas provides a secondary screening of the objects with the smallest relative distance, which handles the case in which the relative distance alone cannot determine the target object and improves the accuracy of the determination.
According to an optional embodiment of the present disclosure, among the objects with the smallest relative distance, the one with the smallest area may be determined as the target object. Screening by the minimum-area condition, combined with the minimum relative distance condition, helps determine the target object, which improves accuracy, better matches the user's intuition, and improves the user experience.
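A minimal sketch of this secondary screening, assuming the candidates are the objects already tied for the smallest relative distance (the Sized shape is an illustrative assumption):

```typescript
// Sketch of the secondary screening: among the objects tied for the smallest relative
// distance, keep those with the smallest display area. A single survivor is the target.
interface Sized { id: string; width: number; height: number; }

function smallestAreaObjects<T extends Sized>(candidates: T[]): T[] {
  const areas = candidates.map((obj) => obj.width * obj.height);
  const minArea = Math.min(...areas);
  return candidates.filter((_, i) => areas[i] === minArea);
}
```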
Fig. 6 schematically illustrates a schematic diagram of an object determination method provided according to another embodiment of the present disclosure.
As shown in fig. 6, several of the objects with the smallest relative distance may also have the same area. In fig. 6, for example, the relative distances between the operating body 603 and the first object 601 and the second object 602 are equal, and the area of the first object 601 equals the area of the second object 602. In this case the two conditions, minimum relative distance and minimum area, still cannot determine a unique target object.
According to the embodiments of the present disclosure, the target object may then be determined with the help of a management menu as shown in fig. 6. Alternatively, a first distance between the operating body 603 and the boundary of the first object 601 and a second distance between the operating body 603 and the boundary of the second object 602 may be determined, and the object corresponding to the smaller of the two distances taken as the target object. With the object determination method provided by the embodiments of the present disclosure, the minimum relative distance serves as the first determination condition, the minimum area as the second determination condition, and the management menu as the third determination manner, so that a wide variety of scenarios is covered while accuracy is improved; this priority order of conditions also matches the user's reasoning and improves the user experience.
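Putting the three conditions together, the sketch below applies them in the stated priority order; the pickFromManagementMenu callback stands in for the management-menu choice and is an assumed hook, not an interface defined by the disclosure.

```typescript
// Sketch of the overall priority order: (1) smallest relative distance,
// (2) smallest area among the tied objects, (3) fall back to a user choice
// from the management menu.
interface Obj { id: string; x: number; y: number; width: number; height: number; }

function determineTargetObject(
  offsetX: number,
  offsetY: number,
  objects: Obj[],
  pickFromManagementMenu: (tied: Obj[]) => Obj,
): Obj {
  if (objects.length === 0) {
    throw new Error("no objects on the display interface");
  }

  // 1. Smallest relative distance to each object's geometric center.
  const distances = objects.map((o) =>
    Math.hypot(offsetX - (o.x + o.width / 2), offsetY - (o.y + o.height / 2)),
  );
  const minDistance = Math.min(...distances);
  let candidates = objects.filter((_, i) => distances[i] === minDistance);
  if (candidates.length === 1) return candidates[0];

  // 2. Smallest area among the objects tied for the smallest distance.
  const areas = candidates.map((o) => o.width * o.height);
  const minArea = Math.min(...areas);
  candidates = candidates.filter((_, i) => areas[i] === minArea);
  if (candidates.length === 1) return candidates[0];

  // 3. Still tied: let the user pick the object by name in the management menu.
  return pickFromManagementMenu(candidates);
}
```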
Fig. 7 schematically shows a block diagram of an object determining apparatus according to an embodiment of the present disclosure.
As shown in fig. 7, an embodiment of the present disclosure provides an object determining apparatus 700, which includes an acquisition module 701, a first determining module 702, and a second determining module 703.
An obtaining module 701, configured to obtain first position information of an operation body pointing to a display interface, where the display interface includes a plurality of objects having overlapping areas, and each object has second position information;
A first determining module 702, configured to determine a relative distance between the operating body and each object according to the first position information and the second position information of each object; and
the second determining module 703 is configured to determine, from the plurality of objects, a target object pointed by the operation body according to a relative distance between the operation body and each object.
By using the object determination apparatus provided by the embodiments of the present disclosure, the target object pointed to by the operating body can be automatically determined from the plurality of objects through the relative distance between the operating body and each object on the display interface, which improves the operating experience. This solves the problem that the target object cannot be accurately determined when a plurality of objects with overlapping areas exist on the display interface.
According to an embodiment of the present disclosure, the plurality of objects having overlapping regions includes a plurality of objects having partially overlapping regions and/or objects covering regions where other objects are located.
According to the embodiment of the disclosure, the plurality of objects with the overlapping area include a first object and a second object, the first object is an object covering an area where the second object is located, the first object is located in a first layer, and the second object is located in a second layer;
The object determination apparatus 700 further includes an editing module.
The editing module is configured to edit the second object in the case that the target object is the second object in the second layer.
According to an embodiment of the disclosure, the editing module comprises a first editing unit.
The first editing unit is configured to move the second object from the second layer to the first layer so as to edit the second object in the first layer.
According to an embodiment of the disclosure, the editing module comprises a second editing unit.
The second editing unit is configured to edit the second object in the second layer through the first layer.
According to an embodiment of the present disclosure, the second determination module includes a first determination unit and a second determination unit.
A first determining unit configured to determine an object having a smallest relative distance from among the plurality of objects according to the relative distance between the operation body and each object; and
the second determining unit is configured to take the object with the smallest relative distance as the target object pointed to by the operating body.
According to an embodiment of the present disclosure, the object determination apparatus 700 further comprises a calculation module and a third determination module.
The computing module is used for computing the area of each object with the minimum relative distance in the display interface when the object with the minimum relative distance comprises a plurality of objects; and
the third determining module is configured to determine the target object according to the area of each object with the smallest relative distance in the display interface.
According to an embodiment of the disclosure, the third determination module comprises a third determination unit.
The third determining unit is configured to determine, among the objects with the smallest relative distance, the object with the smallest area as the target object.
According to an embodiment of the present disclosure, the object determination apparatus 700 further comprises a presentation module.
The presentation module is configured to display the target object in the display interface in a target display manner, wherein the target display manner is different from the display manner of the objects other than the target object among the plurality of objects.
According to an embodiment of the present disclosure, the second position information of each object is position information of a geometric center point of each object.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
According to an embodiment of the present disclosure, an electronic device includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executable by the at least one processor to enable the at least one processor to perform the method as described above.
According to an embodiment of the present disclosure, a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method as described above.
According to an embodiment of the present disclosure, a computer program product comprising a computer program which, when executed by a processor, implements a method as described above.
Fig. 8 illustrates a schematic block diagram of an example electronic device 800 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 8, the apparatus 800 includes a computing unit 801 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 802 or a computer program loaded from a storage unit 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the device 800 can also be stored. The computing unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Various components in device 800 are connected to I/O interface 805, including: an input unit 806 such as a keyboard, mouse, etc.; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, etc.; and a communication unit 809, such as a network card, modem, wireless communication transceiver, or the like. The communication unit 809 allows the device 800 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 801 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 801 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 801 performs the respective methods and processes described above, for example, the object determination method. For example, in some embodiments, the object determination method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 808. In some embodiments, part or all of the computer program may be loaded and/or installed onto device 800 via ROM 802 and/or communication unit 809. When a computer program is loaded into RAM 803 and executed by computing unit 801, one or more steps of the object determination method described above may be performed. Alternatively, in other embodiments, the computing unit 801 may be configured to perform the object determination method by any other suitable means (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general purpose programmable processor and may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel or sequentially or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (13)

1. An object determination method, comprising:
acquiring first position information of an operating body pointing to a display interface, wherein the display interface comprises a plurality of objects with overlapping areas, and each object has second position information;
determining the relative distance between the operating body and each object according to the first position information and the second position information of each object;
determining a target object pointed by the operating body from the plurality of objects according to the relative distance between the operating body and each object; and
editing a second object in a second layer in a case where the target object is the second object, wherein the plurality of objects with overlapping areas comprise a first object and the second object, the first object is an object covering an area where the second object is located, the first object is located in a first layer, and the second object is located in the second layer;
wherein the determining, from the plurality of objects, the target object pointed by the operating body according to the relative distance between the operating body and each object comprises:
determining an object with the smallest relative distance from among the plurality of objects according to the relative distance between the operating body and each object;
in a case where the object with the smallest relative distance includes one object, taking the object with the smallest relative distance as the target object pointed by the operating body;
in a case where the object with the smallest relative distance includes a plurality of objects, calculating an area of each object with the smallest relative distance in the display interface;
in a case where the object with the smallest area includes one object, determining the object with the smallest area as the target object; and
in a case where a plurality of objects have both the smallest area and the smallest relative distance, determining the target object according to an object name selected by the operating body in a management menu.
2. The method of claim 1, wherein the plurality of objects with overlapping areas comprises a plurality of objects having partially overlapping areas and/or objects covering areas in which other objects are located.
3. The method of claim 1, wherein the editing of the second object comprises:
moving the second object from the second layer to the first layer, so as to edit the second object in the first layer.
4. The method of claim 1, wherein the editing of the second object comprises:
editing the second object in the second layer through the first layer.
5. The method of claim 1, further comprising:
displaying the target object in the display interface in a target display mode, wherein the target display mode is different from a display mode of objects, other than the target object, among the plurality of objects.
6. The method of claim 1, wherein the second position information of each object is position information of a geometric center point of the object.
7. An object determining apparatus comprising:
an acquisition module, configured to acquire first position information of an operating body pointing to a display interface, wherein the display interface comprises a plurality of objects with overlapping areas, and each object has second position information;
a first determining module, configured to determine a relative distance between the operating body and each object according to the first position information and the second position information of each object;
a second determining module, configured to determine a target object pointed by the operating body from the plurality of objects according to the relative distance between the operating body and each object; and
an editing module, configured to edit a second object in a second layer in a case where the target object is the second object, wherein the plurality of objects with overlapping areas comprise a first object and the second object, the first object is an object covering an area where the second object is located, the first object is located in a first layer, and the second object is located in the second layer;
wherein the second determining module includes:
a first determining unit, configured to determine an object with the smallest relative distance from among the plurality of objects according to the relative distance between the operating body and each object;
a second determining unit, configured to, in a case where the object with the smallest relative distance includes one object, take the object with the smallest relative distance as the target object pointed by the operating body;
wherein the object determining apparatus is further configured to:
in a case where the object with the smallest relative distance includes a plurality of objects, calculate an area of each object with the smallest relative distance in the display interface;
in a case where the object with the smallest area includes one object, determine the object with the smallest area as the target object; and
in a case where a plurality of objects have both the smallest area and the smallest relative distance, determine the target object according to an object name selected by the operating body in a management menu.
8. The apparatus of claim 7, wherein the plurality of objects with overlapping areas comprises a plurality of objects having partially overlapping areas and/or objects covering areas in which other objects are located.
9. The apparatus of claim 7, wherein the editing module comprises:
a first editing unit, configured to move the second object from the second layer to the first layer, so as to edit the second object in the first layer.
10. The apparatus of claim 7, wherein the editing module comprises:
a second editing unit, configured to edit the second object in the second layer through the first layer.
11. The apparatus of claim 7, wherein the second position information of each object is position information of a geometric center point of the object;
the apparatus further comprises:
a display module, configured to display the target object in the display interface in a target display mode, wherein the target display mode is different from a display mode of objects, other than the target object, among the plurality of objects.
12. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-6.
13. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1-6.
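
As an illustrative, non-authoritative reading of the claim language, the selection cascade recited in claims 1 and 7 (nearest object by relative distance, smallest area as a tie-breaker, and a management-menu choice when both criteria still tie) can be sketched in a few lines of Python. The names SceneObject, pick_target, and choose_from_menu are hypothetical and do not appear in the patent; the snippet is a sketch of the claimed selection logic, not the patented implementation.

# Illustrative sketch only: one way to realize the selection cascade recited in
# claims 1 and 7. All identifiers here are hypothetical and not from the patent.
from dataclasses import dataclass
from math import hypot
from typing import Callable, List, Optional, Tuple


@dataclass
class SceneObject:
    name: str
    center: Tuple[float, float]  # geometric center point (the "second position information")
    area: float                  # area the object occupies in the display interface


def pick_target(
    pointer: Tuple[float, float],                  # "first position information" of the operating body
    objects: List[SceneObject],
    choose_from_menu: Callable[[List[str]], str],  # stands in for the management-menu selection
) -> Optional[SceneObject]:
    if not objects:
        return None

    # Relative distance between the operating body and each object's geometric center.
    def distance(obj: SceneObject) -> float:
        return hypot(pointer[0] - obj.center[0], pointer[1] - obj.center[1])

    # Step 1: keep the object(s) with the smallest relative distance.
    # (Exact float equality is used for brevity; a real implementation would use a tolerance.)
    min_dist = min(distance(o) for o in objects)
    nearest = [o for o in objects if distance(o) == min_dist]
    if len(nearest) == 1:
        return nearest[0]

    # Step 2: several objects tie on distance, so compare their areas.
    min_area = min(o.area for o in nearest)
    smallest = [o for o in nearest if o.area == min_area]
    if len(smallest) == 1:
        return smallest[0]

    # Step 3: still ambiguous, so fall back to the object name chosen in a management menu.
    chosen_name = choose_from_menu([o.name for o in smallest])
    return next((o for o in smallest if o.name == chosen_name), None)


if __name__ == "__main__":
    objs = [
        SceneObject("first object", center=(10.0, 10.0), area=400.0),   # covering object (first layer)
        SceneObject("second object", center=(10.0, 10.0), area=100.0),  # covered object (second layer)
    ]
    target = pick_target((12.0, 9.0), objs, choose_from_menu=lambda names: names[0])
    print(target.name)  # prints "second object": the smaller, covered object wins the distance tie

In this usage example both objects share a geometric center, so the distance tie is broken by area and the smaller, covered object is selected, which mirrors the claimed scenario of editing a second object that lies underneath a covering first object.
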
CN202110464010.0A 2021-04-27 2021-04-27 Object determination method, device, electronic equipment and storage medium Active CN113126866B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110464010.0A CN113126866B (en) 2021-04-27 2021-04-27 Object determination method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113126866A (en) 2021-07-16
CN113126866B (en) 2023-07-21

Family

ID=76780524

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110464010.0A Active CN113126866B (en) 2021-04-27 2021-04-27 Object determination method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113126866B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115097980B (en) * 2022-08-24 2022-12-02 成都智暄科技有限责任公司 Small-area overlapping transparent control selection method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100478881C (en) * 2007-05-16 2009-04-15 珠海金山软件股份有限公司 Device and method for user operating sheltered area
JP4605279B2 (en) * 2008-09-12 2011-01-05 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5556758B2 (en) * 2011-07-22 2014-07-23 日本電気株式会社 Information display device, information display method, and program
CN107704162A (en) * 2016-08-08 2018-02-16 法乐第(北京)网络科技有限公司 One kind mark object control method
CN107704163A (en) * 2016-08-08 2018-02-16 法乐第(北京)网络科技有限公司 One kind mark object control device
CN112035271B (en) * 2019-06-04 2023-10-10 杭州海康威视数字技术股份有限公司 User operation information processing method and device, electronic equipment and storage medium
CN111737800B (en) * 2020-06-23 2024-01-12 广联达科技股份有限公司 Primitive selection method and device and electronic equipment

Also Published As

Publication number Publication date
CN113126866A (en) 2021-07-16

Similar Documents

Publication Publication Date Title
CN107193911B (en) BIM model-based three-dimensional visualization engine and WEB application program calling method
US11741272B2 (en) Interpreter framework for a computer file
JP4821000B2 (en) Object display processing device, object display processing method, and object display processing program
KR20170037957A (en) Presenting dataset of spreadsheet in form based view
KR20170030529A (en) Visualization suggestions
CN114648615B (en) Method, device and equipment for controlling interactive reproduction of target object and storage medium
US20180189988A1 (en) Chart-type agnostic scene graph for defining a chart
CN113126866B (en) Object determination method, device, electronic equipment and storage medium
CN114063858A (en) Image processing method, image processing device, electronic equipment and storage medium
CN112862016A (en) Method, device and equipment for labeling objects in point cloud and storage medium
CN113837194B (en) Image processing method, image processing apparatus, electronic device, and storage medium
CN110309239B (en) Visual map editing method and device
CN111966767B (en) Track thermodynamic diagram generation method, device, electronic equipment and storage medium
US11086498B2 (en) Server-side chart layout for interactive web application charts
US10395412B2 (en) Morphing chart animations in a browser
CN115933949A (en) Coordinate conversion method and device, electronic equipment and storage medium
WO2014138271A2 (en) Cloud-based real-time value stream mapping system and method
CN114564268A (en) Equipment management method and device, electronic equipment and storage medium
CN114797109A (en) Object editing method and device, electronic equipment and storage medium
CN116860222B (en) Algorithm flow editing method and related device
US10102652B2 (en) Binning to prevent overplotting for data visualization
CN114596637B (en) Image sample data enhancement training method and device and electronic equipment
CN114554089B (en) Video processing method, device, equipment and storage medium
CN114741072B (en) Page generation method, device, equipment and storage medium
CN113986112B (en) Soft keyboard display method, related device and computer program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant