CN113126866A - Object determination method and device, electronic equipment and storage medium - Google Patents

Object determination method and device, electronic equipment and storage medium

Info

Publication number
CN113126866A
Authority
CN
China
Prior art keywords
objects
relative distance
operation body
layer
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110464010.0A
Other languages
Chinese (zh)
Other versions
CN113126866B (en)
Inventor
陈娜
张军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110464010.0A priority Critical patent/CN113126866B/en
Publication of CN113126866A publication Critical patent/CN113126866A/en
Application granted granted Critical
Publication of CN113126866B publication Critical patent/CN113126866B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/26 Visual data mining; Browsing structured data

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides an object determination method and apparatus, an electronic device, and a storage medium, and relates to the technical field of cloud computing, in particular to big data cloud technology. The object determination method is implemented as follows: acquiring first position information of an operation body pointing to a display interface, wherein the display interface comprises a plurality of objects having overlapping areas, and each object has second position information; determining the relative distance between the operation body and each object according to the first position information and the second position information of each object; and determining a target object pointed to by the operation body from the plurality of objects according to the relative distance between the operation body and each object.

Description

Object determination method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the technical field of cloud computing, in particular to big data cloud technology, and more specifically to an object determination method and apparatus, an electronic device, and a storage medium.
Background
With the rapid development of computers, reconstructing and deeply analyzing massive data is no longer an insurmountable difficulty. By combining data analysis with visual presentation through computer data processing software or platforms, dry data can be converted into intuitive images, so that data is analyzed and displayed visually in real time. This enables integrated analysis of massive data, provides a basis for enterprise decision making, and facilitates the formulation of intelligent business operation strategies.
Disclosure of Invention
The disclosure provides an object determination method, an object determination device, an electronic device and a storage medium.
According to an aspect of the present disclosure, there is provided an object determination method including:
acquiring first position information of an operation body pointing to a display interface, wherein the display interface comprises a plurality of objects with overlapping areas, and each object has second position information;
determining the relative distance between the operation body and each object according to the first position information and the second position information of each object; and
determining a target object pointed to by the operation body from the plurality of objects according to the relative distance between the operation body and each object.
According to another aspect of the present disclosure, there is provided an object determination apparatus including: the device comprises an acquisition module, a first determination module and a second determination module.
The acquisition module is configured to acquire first position information of an operation body pointing to a display interface, wherein the display interface comprises a plurality of objects having overlapping areas, and each object has second position information;
the first determining module is configured to determine the relative distance between the operation body and each object according to the first position information and the second position information of each object; and
the second determining module is configured to determine a target object pointed to by the operation body from the plurality of objects according to the relative distance between the operation body and each object.
According to another aspect of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as described above.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method as described above.
According to another aspect of the present disclosure, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the method as described above.
With the technical solution of the present disclosure, when the display interface comprises a plurality of objects having overlapping areas, the target object pointed to by the operation body is automatically determined from the plurality of objects according to the relative distance between the operation body and each object on the display interface, which improves the operating experience. This solves the problem that the target object cannot be determined accurately when a plurality of objects having overlapping areas exist on the display interface.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
fig. 1 schematically illustrates an exemplary system architecture to which the object determination method and apparatus may be applied, according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow chart of an object determination method according to an embodiment of the present disclosure;
FIG. 3 schematically shows a schematic view of a plurality of objects having overlapping regions according to an embodiment of the disclosure;
fig. 4 schematically shows a schematic diagram of an object determination method provided according to an embodiment of the present disclosure;
fig. 5 schematically shows a schematic diagram of an object determination method provided according to another embodiment of the present disclosure;
fig. 6 schematically shows a schematic diagram of an object determination method provided according to another embodiment of the present disclosure;
fig. 7 schematically shows a block diagram of an object determination apparatus according to an embodiment of the present disclosure; and
fig. 8 schematically shows a block diagram of an electronic device adapted to implement an object determination method according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings. Various details of the embodiments are included to assist understanding and should be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
With the continuous development of computer technology, big data processing and analysis technologies have gradually emerged. Using these technologies, data mining and data analysis can be carried out effectively, helping an enterprise make strategic decisions based on its data and thereby achieve better progress and development.
However, data processing and analysis typically generate a large number of data charts, so a plurality of charts may exist on a display interface, overlapping one another or located on different layers. As a result, a target object cannot be determined or selected accurately, which complicates subsequent operations such as editing and affects the user experience.
The present disclosure provides an object determination method, an object determination apparatus, an electronic device, and a storage medium. The object determination method comprises the following steps: acquiring first position information of an operation body pointing to a display interface, wherein the display interface comprises a plurality of objects having overlapping areas, and each object has second position information; determining the relative distance between the operation body and each object according to the first position information and the second position information of each object; and determining a target object pointed to by the operation body from the plurality of objects according to the relative distance between the operation body and each object.
Fig. 1 schematically illustrates an exemplary system architecture to which the object determination method and apparatus may be applied, according to an embodiment of the present disclosure.
It should be noted that fig. 1 is only an example of a system architecture to which the embodiments of the present disclosure may be applied, provided to help those skilled in the art understand the technical content of the present disclosure; it does not mean that the embodiments cannot be used in other devices, systems, environments, or scenarios. For example, in another embodiment, the system architecture may include only a terminal device, with the terminal device implementing the object determination method and apparatus provided in the embodiments of the present disclosure without interacting with a server.
As shown in fig. 1, the system architecture 100 according to this embodiment may include terminal devices 101, 102, 103, a network 104 and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired and/or wireless communication links, and so forth.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have installed thereon various communication client applications, such as a knowledge reading application, a web browser application, a search application, an instant messaging tool, a mailbox client, and/or social platform software, etc. (by way of example only).
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (for example only) supporting content browsed by the user on the terminal devices 101, 102, 103. The background management server may analyze and otherwise process received data such as user requests, and feed back processing results (e.g., web pages, information, or data obtained or generated according to the user request) to the terminal device.
It should be noted that the object determination method provided by the embodiment of the present disclosure may be generally executed by the terminal device 101, 102, or 103. Accordingly, the object determination apparatus provided by the embodiment of the present disclosure may also be disposed in the terminal device 101, 102, or 103.
Alternatively, the object determination method provided by the embodiment of the present disclosure may also be generally executed by the server 105. Accordingly, the object determination apparatus provided by the embodiments of the present disclosure may be generally disposed in the server 105. The object determination method provided by the embodiments of the present disclosure may also be performed by a server or a server cluster that is different from the server 105 and is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105. Accordingly, the object determination apparatus provided in the embodiments of the present disclosure may also be disposed in a server or a server cluster different from the server 105 and capable of communicating with the terminal devices 101, 102, 103 and/or the server 105.
For example, when a user performs data processing and analysis, the terminal devices 101, 102, and 103 may acquire the first position information the user points to on the display interface with an operation body such as a mouse, and send it to the server 105. The server 105 analyzes the first position information together with the second position information of each of the plurality of objects on the display interface to determine the relative distance between the operation body and each object, and then determines the target object pointed to by the operation body according to those relative distances. Alternatively, a server or server cluster capable of communicating with the terminal devices 101, 102, 103 and/or the server 105 may acquire the first position information pointed to by the operation body and ultimately determine the target object.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Fig. 2 schematically shows a flow chart of an object determination method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S201 to S203.
In operation S201, first position information of an operator pointing to a display interface is acquired, where the display interface includes a plurality of objects having overlapping regions, and each object has second position information.
In operation S202, a relative distance of the operation body from each object is determined according to the first position information and the second position information of each object.
In operation S203, a target object to which the manipulation body is directed is determined from the plurality of objects according to a relative distance of the manipulation body from each object.
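As a minimal illustrative sketch (not the patented implementation; the function names and the choice of center-point distance as the metric are assumptions), operations S201 to S203 can be expressed as follows:

```python
import math
from typing import Dict, Tuple

Point = Tuple[float, float]


def relative_distance(cursor: Point, center: Point) -> float:
    """Straight-line distance between the operation body (first position
    information) and an object's center point (second position information)."""
    return math.hypot(cursor[0] - center[0], cursor[1] - center[1])


def determine_target(cursor: Point, objects: Dict[str, Point]) -> str:
    """S201-S203: from the cursor position and each object's center,
    pick the object with the smallest relative distance as the target."""
    distances = {name: relative_distance(cursor, c) for name, c in objects.items()}
    return min(distances, key=distances.get)
```

For example, `determine_target((10, 10), {"chart301": (12, 11), "chart302": (40, 5)})` selects `"chart301"`, the object whose center is nearest the cursor.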
According to the embodiment of the present disclosure, the type of the operation body is not limited. For example, the operation body includes, but is not limited to, a mouse, a user's finger, a stylus, and the like.
According to the embodiment of the present disclosure, the type of the display interface is not limited. For example, the display interface may be, but is not limited to, a display interface on a Business Intelligence (BI) platform, on a Hadoop platform, on a High Performance Computing and Communications (HPCC) platform, and the like. These display interfaces are merely exemplary; the display interface of any other big data processing and analysis tool known in the art may be used, as long as it can display big data processing and analysis results. Optionally, the display interface may be a large visualization screen of a BI platform.
According to the embodiment of the disclosure, the first position information of the operation body pointing to the display interface may be the position of the cursor corresponding to a mouse on the display interface, or the position directly pointed to by a user's finger or a stylus. The cursor may be dot-shaped, a vertical line, or the like, and the first position information may refer to the position of the center point of the operation body or of any point on the operation body.
According to an embodiment of the present disclosure, the type of the object on the display interface is not limited. For example, the object may be an image directly acquired with an image acquisition apparatus, a data table, a ring chart formed from data, a histogram, a graph, or the like. But is not limited thereto, and for example, the object may be: layers with at least one image acquired directly with the image acquisition device, layers with at least one data table, layers with at least one ring graph, bar graph, graph formed from data, etc.
According to the embodiment of the present disclosure, the display positions of the plurality of objects on the display interface are not limited; for example, the objects may be distributed at intervals, partially overlapping, and the like.
According to an embodiment of the present disclosure, the second position information of each object is not limited. For example, position information of a geometric center point of each object, position information of a plurality of different points on a boundary of each object, position information of a vertex of each object, and the like.
According to the embodiment of the present disclosure, the relative distance between the operation body and each object can be determined from the first position information and the second position information of that object. The relative distance may be the straight-line distance between any point on the operation body and the geometric center point of each object, but is not limited thereto; it may also be the straight-line distances between any point on the operation body and a plurality of points of each object.
According to the embodiment of the disclosure, the relative distance between the operation body and each object can represent the relative position relationship between the operation body and each object, and by using the relative distance, the target object specifically pointed by the operation body can be determined.
For example, suppose a first object and a second object are arranged on the display interface. With the method provided by the embodiment of the disclosure, a first relative distance between the operation body and a certain point of the first object, and a second relative distance between the operation body and the corresponding point on the second object, can be determined; comparing these two relative distances reveals whether the operation body points to the first object or the second object, and thereby determines the target object. The method is not limited to single points: first relative distances between the operation body and a plurality of points on the boundary of the first object, and second relative distances to the corresponding points on the boundary of the second object, may likewise be determined and compared to identify the target object pointed to by the operation body.
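The multi-point variant described above can be sketched as follows; aggregating the per-point straight-line distances with a mean, and the function names, are illustrative assumptions rather than details fixed by the patent:

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]


def mean_boundary_distance(cursor: Point, boundary: List[Point]) -> float:
    # Combine the straight-line distances from the cursor to several sampled
    # boundary points into one relative distance (mean is an assumed choice).
    return sum(math.hypot(cursor[0] - x, cursor[1] - y) for x, y in boundary) / len(boundary)


def closer_object(cursor: Point,
                  first_boundary: List[Point],
                  second_boundary: List[Point]) -> str:
    """Compare the first and second relative distances to decide which
    object the operation body points to."""
    d1 = mean_boundary_distance(cursor, first_boundary)
    d2 = mean_boundary_distance(cursor, second_boundary)
    return "first" if d1 <= d2 else "second"
```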
By using the object determination method provided by the embodiment of the disclosure, the target object pointed to by the operation body can be automatically determined from the plurality of objects according to the relative distance between the operation body and each object on the display interface, improving the operating experience. This solves the problem that the target object cannot be determined accurately when a plurality of objects having overlapping areas exist on the display interface.
The method shown in fig. 2 is further described with reference to fig. 3-6 in conjunction with specific embodiments.
Fig. 3 schematically shows a schematic diagram of a plurality of objects with overlapping regions according to an embodiment of the disclosure.
As shown in fig. 3, the plurality of objects on the display interface of the embodiment of the present disclosure may include chart 301, chart 302, and chart 303, and may further include layer 304 and layer 305. According to an embodiment of the present disclosure, layer 304 may be a background layer, and charts 301, 302, and 303 are charts such as histograms or ring charts distributed on layer 305. Layer 304, combined with layer 305 on which charts 301, 302, and 303 are distributed, forms a set of analyzable reports.
According to an embodiment of the present disclosure, the plurality of objects having overlapping areas on the display interface may be objects with a partial overlapping area, such as chart 301 and chart 302; they may also include an object that covers the entire area where another object is located, such as layer 304 completely covering layer 305. In addition, the plurality of objects may include charts spaced apart from the others, such as chart 303.
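For axis-aligned rectangular bounding boxes (an assumption for illustration; the patent does not fix an object shape), the partial overlap and full coverage relationships described above can be tested like this:

```python
from typing import NamedTuple


class Rect(NamedTuple):
    left: float
    top: float
    width: float
    height: float


def overlaps(a: "Rect", b: "Rect") -> bool:
    """True if two objects share any area (partial overlap or full cover)."""
    return (a.left < b.left + b.width and b.left < a.left + a.width
            and a.top < b.top + b.height and b.top < a.top + a.height)


def covers(a: "Rect", b: "Rect") -> bool:
    """True if object a completely covers the area where object b is located,
    as layer 304 covers layer 305 in fig. 3."""
    return (a.left <= b.left and a.top <= b.top
            and a.left + a.width >= b.left + b.width
            and a.top + a.height >= b.top + b.height)
```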
In the process of implementing the present disclosure, it was found that if layer 304 lies on top of layer 305, on which charts 301, 302, and 303 are located (that is, layer 304 is the layer in the editable state), then even if an operation body such as a mouse points at the center point of chart 303, chart 303 cannot be determined as the target object and edited. Even without layer 304, with only layer 305 present and in the editable state, when charts 301 and 302 partially overlap and the mouse points at their overlapping area, the target object pointed to by the mouse cannot be determined accurately; the user's intended target then cannot be edited, making it difficult to process a data table or image and degrading the editing experience.
In the related art, as shown in fig. 3, a management menu may be provided on the display interface, for example on the left side outside the layer editing area. When the user cannot select an object for editing by pointing at it in the editable area with the operation body, the user may instead point to or select the corresponding chart name or layer name in the management menu to activate that chart or layer in the editable area. However, when there are many charts, for example hundreds, the management menu contains hundreds of corresponding chart names; selecting or pointing to the right name then takes time and reduces operating efficiency.
By contrast, with the object determination method provided by the embodiment of the disclosure, only the first position information of the operation body pointing to the display interface needs to be acquired: the relative distance between the operation body and each object is determined from the first position information and each object's second position information, and the target object pointed to by the operation body is determined from those relative distances. The target object is thus determined automatically, making the operation more flexible and practical.
According to other embodiments of the present disclosure, the plurality of objects having the overlapping area include a first object and a second object, the first object is an object covering an area where the second object is located, the first object is located in the first layer, and the second object is located in the second layer. It should be noted that, in the embodiment of the present disclosure, an object of the embodiment of the present disclosure may refer to a chart distributed on a layer.
In the process of implementing the present disclosure, it was found that when the first object covers the area where the second object is located, with the first object in the first layer and the second object in the second layer, and the first layer is in the editable state and lies on the second layer, the operation body cannot penetrate the first layer to select the second object on the second layer. The second object then cannot be edited, which affects the editing experience.
By using the object determination method provided by the embodiment of the disclosure, the second object can be edited when the target object is the second object in the second layer. It should be noted that the method is not limited to two objects; it applies equally to three, four, or more. In all cases, the target object pointed to by the operation body is determined according to the relative distance between the operation body and each object.
By using the object determination method provided by the embodiment of the disclosure, the prior-art problem that an object cannot be edited through other layers among multiple layers is solved, achieving the effects of automatically determining the target object and automatically switching the editing states of objects on different layers.
According to an embodiment of the present disclosure, the editing of the second object may be, for example, moving the second object from the second layer to the first layer, so as to edit the second object in the first layer.
According to the embodiment of the disclosure, the target object on a non-editable layer can be moved to the editable layer, so that the user edits it there; after editing is completed, the layer parameters of the target object can be restored to their original values. Editing the target object by changing its layer parameters in this way can achieve the effect of converting a plurality of objects onto the same layer, so that layer integration is completed at the same time as multi-object editing.
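The move-edit-restore flow of layer parameters described above can be sketched as follows; the `DisplayObject` structure and the integer layer index are hypothetical names introduced only for illustration:

```python
from dataclasses import dataclass


@dataclass
class DisplayObject:
    name: str
    layer: int  # layer parameter; which layer the object currently sits on


def edit_on_editable_layer(obj: DisplayObject, editable_layer: int) -> None:
    """Temporarily move the target object to the editable layer, let the
    user edit it there, then restore its original layer parameter."""
    original_layer = obj.layer
    obj.layer = editable_layer   # move target from non-editable to editable layer
    # ... user edits the object here ...
    obj.layer = original_layer   # restore the layer parameter after editing
```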
According to another embodiment of the present disclosure, the second object may alternatively be edited in the second layer through the first layer.
According to other embodiments of the present disclosure, the target object may also be edited across layers. The target object is edited by using the cross-layer editing mode provided by the embodiment of the disclosure, the conversion of the layer of the object is not caused, and the operation and implementation mode is simple and easy.
According to the embodiment of the disclosure, in the case that the target object pointed by the operation body is determined from the plurality of objects, the target object is displayed in the display interface in a target display mode, wherein the target display mode is different from the display mode of other objects except the target object in the plurality of objects.
According to the embodiments of the present disclosure, the target display mode is not particularly limited. It may be, but is not limited to, highlighting the target object, reducing its brightness, or flashing it in some manner, as long as the target object is displayed differently from the other objects among the plurality of objects; details are not repeated here.
By displaying the target object in the display interface in the target display mode, the embodiment of the disclosure indicates to the user which target object the operation body has been determined to point at.
According to the embodiment of the disclosure, in the case of displaying the target object in the target display manner, a user-confirmation operation can be designed before the user edits the target object. According to the embodiment of the present disclosure, the type of the confirmation operation is not particularly limited. The target object may be, but is not limited to, activated by being selected with the operation body, for example by a double click of the left mouse button, a single click of the left mouse button, or a double click of the right mouse button.
In the embodiment of the present disclosure, the user is given an opportunity to confirm again, which can further improve operational flexibility.
Fig. 4 schematically illustrates a schematic diagram of an object determination method provided according to an embodiment of the present disclosure.
As shown in fig. 4, there are a first object 401 and a second object 402 on a display interface, wherein the first object 401 completely covers the area of the second object 402. A screen coordinate system may be set virtually on the display interface, with the vertex at the upper left corner of the editing area as the origin of coordinates (0, 0); however, the present disclosure is not limited thereto, and any vertex may be used as the origin, which is not described herein again.
On the display interface, first position information of the operation body pointing to the display interface, for example, the position information (offset x, offset y) of the operation body 403 relative to the origin of coordinates, may be acquired by listening to the cursor of the operation body, for example, a mouse. It should be noted that the first position information of the operation body pointing to the display interface may be obtained in real time, or may be obtained after waiting a preset time and determining that the operation body is no longer moving.
According to an embodiment of the present disclosure, the second position information of the first object may be coordinate information of any position on the boundary of the first object, for example, the coordinates of the upper-left vertex of the boundary; it is not limited to this, and may also be coordinate information of the geometric center point of the first object. It should be noted that any choice is acceptable as long as the second positions of the first object and the second object are selected in a corresponding manner, which is not described herein again.
According to an alternative embodiment of the present disclosure, the position information of the geometric center point of the first object 401 may be selected as the second position information of the first object 401, and the position information of the geometric center point of the second object 402 may be selected as the second position information of the second object 402. For example, the second position information of the first object 401 is the first center point center1 in fig. 4, and the second position information of the second object 402 is the second center point center2 in fig. 4.
According to the embodiment of the present disclosure, as shown in fig. 4, the first object 401 and the second object 402 both have rectangular structures. The coordinates of the geometric center point of the first object (i.e., its second position information) can be calculated from the vertex coordinates (X, Y) at the upper left corner of the first object relative to the origin of coordinates, the width (width) of the first object, and the height (height) of the first object; the calculation formula may be (X + width/2, Y + height/2).
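As a minimal sketch of this formula, assuming an axis-aligned rectangle anchored at its top-left vertex (the `Rect` and `center_of` names are illustrative, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float       # top-left vertex X, relative to the (0, 0) origin
    y: float       # top-left vertex Y
    width: float
    height: float

def center_of(r: Rect) -> tuple[float, float]:
    """Second position information: the geometric center (X + width/2, Y + height/2)."""
    return (r.x + r.width / 2, r.y + r.height / 2)
```

For a non-rectangular object the same role would be played by whatever center formula fits its graph structure, as the following paragraph notes.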
It should be noted that the determination method of the second position information may adjust the calculation method and the calculation formula according to the graph structure of the object on the display interface, which is not described herein again.
According to the embodiment of the present disclosure, the relative distance 1 between the operating body 403 and the first object 401 and the relative distance 2 between the operating body 403 and the second object 402 can be calculated by the Pythagorean theorem.
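The relative-distance computation described here is the Euclidean distance between the operation body's first position information and an object's center; a sketch (the function name is illustrative):

```python
import math

def relative_distance(cursor: tuple[float, float],
                      center: tuple[float, float]) -> float:
    """Pythagorean distance between the cursor position and an object's
    geometric center point (its second position information)."""
    dx = cursor[0] - center[0]
    dy = cursor[1] - center[1]
    return math.hypot(dx, dy)  # sqrt(dx*dx + dy*dy)
```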
According to the embodiment of the present disclosure, determining the target object pointed to by the operation body from the plurality of objects according to the relative distance of the operation body to each object may, for example, comprise: determining the object with the smallest relative distance from the plurality of objects according to the relative distance of the operation body to each object; and taking the object with the smallest relative distance as the target object pointed to by the operation body. However, the present disclosure is not limited to this; the object with the largest relative distance may also be taken as the target object pointed to by the operation body.
According to an alternative embodiment of the present disclosure, the object with the smallest relative distance may be selected as the target object pointed to by the operating body by comparing the relative distance 1 with the relative distance 2.
Selecting the object with the smallest relative distance as the target object pointed to by the operation body, as in this optional embodiment of the disclosure, matches users' everyday intuition and common practice, and reduces the difficulty of operation for the user.
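The smallest-distance selection above can be sketched in a few lines; the `(name, center)` pair structure is an assumption for illustration only:

```python
import math

def pick_nearest(cursor: tuple[float, float], objects):
    """Return the object whose geometric center is closest to the cursor.
    `objects` is a list of (name, center) pairs — a hypothetical structure."""
    return min(
        objects,
        key=lambda obj: math.hypot(cursor[0] - obj[1][0], cursor[1] - obj[1][1]),
    )
```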
Fig. 5 schematically illustrates a schematic diagram of an object determination method provided according to another embodiment of the present disclosure.
In implementing the present disclosure, it is found that the object with the smallest relative distance may comprise a plurality of objects. As shown in fig. 5, when the relative distance of the operating body 503 from the first object 501 is determined to be equal to its relative distance from the second object 502, the geometric center points of the first object 501 and the second object 502 may coincide. In this case the target object cannot be determined by the minimum-relative-distance condition alone, since more than one object satisfies it and no unique target object can be determined.
According to the embodiment of the disclosure, the area of each object with the smallest relative distance in the display interface can be calculated; and determining a target object according to the area size of each object with the minimum relative distance in the display interface.
As shown in fig. 5, the area of the first object 501 in the display interface and the area of the second object 502 in the display interface may be calculated, and the target object may be determined by comparing the sizes of the areas.
According to the embodiment of the disclosure, performing secondary screening on the objects with the smallest relative distance by comparing their area sizes further constrains the case in which the relative distance alone cannot determine the target object, and improves the accuracy of determining the target object.
According to an alternative embodiment of the present disclosure, the object with the smallest area among those with the smallest relative distance may be determined as the target object. In this embodiment, screening with the smallest-area condition, combined with the smallest relative distance, helps determine the target uniquely; this improves accuracy while better matching the user's way of thinking, improving the user experience.
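The two-stage screening (smallest relative distance, then smallest area) can be sketched as follows; the dict keys `center` and `area` and the tolerance `eps` are illustrative assumptions, not from the patent text:

```python
import math

def pick_by_distance_then_area(cursor, objects, eps=1e-9):
    """Among objects tied for the smallest relative distance to the cursor,
    choose the one with the smallest on-screen area (secondary screening)."""
    def dist(o):
        cx, cy = o["center"]
        return math.hypot(cursor[0] - cx, cursor[1] - cy)
    d_min = min(dist(o) for o in objects)
    tied = [o for o in objects if abs(dist(o) - d_min) <= eps]
    return min(tied, key=lambda o: o["area"])
```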
Fig. 6 schematically shows a schematic diagram of an object determination method provided according to another embodiment of the present disclosure.
As shown in fig. 6, a plurality of objects with the same area may still exist among the plurality of objects with the smallest relative distance. For example, in fig. 6, the relative distances between the operating body 603 and the first object 601 and second object 602 are equal, and the area of the first object 601 equals the area of the second object 602. In this case, the target object cannot be determined by the two conditions of smallest relative distance and smallest area alone, since more than one object satisfies both and no unique target object can be determined.
According to an embodiment of the present disclosure, the determination of the target object may be implemented in conjunction with a management menu as shown in fig. 6. Alternatively, a first distance between the operating body 603 and the boundary of the first object 601 and a second distance between the operating body 603 and the boundary of the second object 602 may be determined, the smaller of the first distance and the second distance found, and the object corresponding to that minimum distance taken as the target object. With the object determination method provided by the embodiment of the disclosure, the smallest relative distance serves as the first determination condition, the smallest area as the second determination condition, and the management menu (or boundary distance) as the third determination mode. On the premise of improving accuracy, this covers a variety of different scenarios and has a wide application range; moreover, the priority order of the conditions matches the user's logical thinking, improving the user experience.
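A minimal sketch of the full three-stage screening described above (center distance, then area, then boundary distance), assuming axis-aligned rectangular objects and an object structure with `center`, `area`, and `rect` fields — all assumptions for illustration, not the patent's data model:

```python
import math

def boundary_distance(p, rect):
    """Distance from point p = (px, py) to the boundary of an axis-aligned
    rectangle rect = (x, y, w, h); a hypothetical helper for the third stage."""
    px, py = p
    x, y, w, h = rect
    dx = max(x - px, 0.0, px - (x + w))
    dy = max(y - py, 0.0, py - (y + h))
    if dx > 0.0 or dy > 0.0:            # cursor lies outside the rectangle
        return math.hypot(dx, dy)
    # cursor lies inside: distance to the nearest edge
    return min(px - x, x + w - px, py - y, y + h - py)

def determine_target(cursor, objects, eps=1e-9):
    """Three-stage screening: smallest center distance, then smallest area,
    then smallest distance from the cursor to the object's boundary."""
    def center_dist(o):
        cx, cy = o["center"]
        return math.hypot(cursor[0] - cx, cursor[1] - cy)
    d = min(center_dist(o) for o in objects)
    cands = [o for o in objects if abs(center_dist(o) - d) <= eps]
    if len(cands) > 1:                   # tie on distance: compare areas
        a = min(o["area"] for o in cands)
        cands = [o for o in cands if abs(o["area"] - a) <= eps]
    if len(cands) > 1:                   # tie on area: compare boundary distances
        cands.sort(key=lambda o: boundary_distance(cursor, o["rect"]))
    return cands[0]
```

In fig. 6's situation (equal center distances and equal areas), only the boundary-distance stage discriminates between the two objects.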
Fig. 7 schematically shows a block diagram of an object determination apparatus according to an embodiment of the present disclosure.
As shown in fig. 7, an embodiment of the present disclosure provides an object determining apparatus 700, which includes an obtaining module 701, a first determining module 702, and a second determining module 703.
An obtaining module 701, configured to obtain first position information that an operator points to a display interface, where the display interface includes a plurality of objects having overlapping areas, and each object has second position information;
a first determining module 702, configured to determine a relative distance between the operation body and each object according to the first position information and the second position information of each object; and
a second determining module 703, configured to determine, according to the relative distance between the operation body and each object, a target object pointed by the operation body from the multiple objects.
According to the embodiment of the disclosure, with the object determination device provided by the embodiment of the disclosure, the target object pointed to by the operation body can be automatically determined from a plurality of objects through the relative distance between the operation body and each object on the display interface, improving the operation experience and solving the problem that a target object cannot be accurately determined when a plurality of objects with overlapping areas exist on the display interface.
According to the embodiment of the present disclosure, the plurality of objects having the overlapping area includes a plurality of objects having a partial overlapping area and/or objects covering an area where other objects are located.
According to the embodiment of the disclosure, the plurality of objects having the overlapping area include a first object and a second object, the first object is an object covering an area where the second object is located, the first object is located in a first layer, and the second object is located in a second layer;
the object determination apparatus 700 further includes an editing module.
And the editing module is used for editing the second object under the condition that the target object is the second object in the second layer.
According to an embodiment of the present disclosure, wherein the editing module includes a first editing unit.
And the first editing unit is used for moving the second object from the second layer to the first layer so as to edit the second object in the first layer.
According to an embodiment of the present disclosure, wherein the editing module includes a second editing unit.
And the second editing unit is used for editing a second object in the second layer by passing through the first layer.
According to an embodiment of the present disclosure, the second determination module includes a first determination unit and a second determination unit.
A first determination unit configured to determine an object having a smallest relative distance from the plurality of objects, based on the relative distance of the operation body from each object; and
and the second determining unit is used for taking the object with the minimum relative distance as the target object pointed by the operation body.
According to an embodiment of the present disclosure, the object determination apparatus 700 further includes a calculation module and a third determination module.
The calculation module is used for calculating the area of each object with the minimum relative distance in the display interface under the condition that the object with the minimum relative distance comprises a plurality of objects; and
and the third determining module is used for determining the target object according to the area size of each object with the minimum relative distance in the display interface.
According to an embodiment of the present disclosure, wherein the third determining module includes a third determining unit.
And a third determination unit configured to determine an object having a smallest area and a smallest relative distance as the target object.
According to an embodiment of the present disclosure, the object determination apparatus 700 further includes a presentation module.
And the display module is used for displaying the target object in a target display mode in the display interface, wherein the target display mode is different from the display mode of other objects except the target object in the plurality of objects.
According to an embodiment of the present disclosure, wherein the second position information of each object is position information of a geometric center point of each object.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
According to an embodiment of the present disclosure, an electronic device includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as described above.
According to an embodiment of the present disclosure, a non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method as described above.
According to an embodiment of the disclosure, a computer program product comprising a computer program which, when executed by a processor, implements the method as described above.
FIG. 8 illustrates a schematic block diagram of an example electronic device 800 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 8, the apparatus 800 includes a computing unit 801 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 802 or a computer program loaded from a storage unit 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data required for the operation of the device 800 can also be stored. The computing unit 801, the ROM 802, and the RAM 803 are connected to each other by a bus 804. An input/output (I/O) interface 805 is also connected to bus 804.
A number of components in the device 800 are connected to the I/O interface 805, including: an input unit 806, such as a keyboard, a mouse, or the like; an output unit 807 such as various types of displays, speakers, and the like; a storage unit 808, such as a magnetic disk, optical disk, or the like; and a communication unit 809 such as a network card, modem, wireless communication transceiver, etc. The communication unit 809 allows the device 800 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Computing unit 801 may be a variety of general and/or special purpose processing components with processing and computing capabilities. Some examples of the computing unit 801 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and the like. The calculation unit 801 executes the respective methods and processes described above, such as the object determination method. For example, in some embodiments, the object determination method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as storage unit 808. In some embodiments, part or all of the computer program can be loaded and/or installed onto device 800 via ROM 802 and/or communications unit 809. When the computer program is loaded into the RAM 803 and executed by the computing unit 801, one or more steps of the object determination method described above may be performed. Alternatively, in other embodiments, the computing unit 801 may be configured to perform the object determination method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server combined with a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (21)

1. An object determination method, comprising:
acquiring first position information of an operation body pointing to a display interface, wherein the display interface comprises a plurality of objects with overlapping areas, and each object has second position information;
determining the relative distance between the operation body and each object according to the first position information and the second position information of each object; and
and determining a target object pointed by the operation body from the plurality of objects according to the relative distance between the operation body and each object.
2. The method of claim 1, wherein the plurality of objects having overlapping regions comprises a plurality of objects having partially overlapping regions and/or objects that cover regions in which other objects are located.
3. The method according to claim 1 or 2, wherein the plurality of objects having the overlapping area include a first object and a second object, the first object is an object covering an area where the second object is located, the first object is located in a first layer, and the second object is located in a second layer;
the method further comprises the following steps:
and editing the second object under the condition that the target object is the second object in the second layer.
4. The method of claim 3, wherein the editing the second object comprises:
and moving the second object from the second layer to the first layer so as to edit the second object in the first layer.
5. The method of claim 3, wherein the editing the second object comprises:
and editing a second object in the second image layer through the first image layer.
6. The method of claim 1, wherein the determining a target object from the plurality of objects to which the operator is directed based on the relative distance of the operator from each of the objects comprises:
determining an object with the smallest relative distance from the plurality of objects according to the relative distance between the operation body and each object; and
and taking the object with the minimum relative distance as a target object pointed by the operation body.
7. The method of claim 6, further comprising:
in the case that the object with the minimum relative distance comprises a plurality of objects, calculating the area of each object with the minimum relative distance in the display interface; and
and determining the target object according to the area size of each object with the minimum relative distance in the display interface.
8. The method of claim 7, wherein the determining the target object according to the area size of each object with the minimum relative distance in the display interface comprises:
determining the object with the smallest area and the smallest relative distance as the target object.
9. The method of claim 1, further comprising:
and displaying the target object in the display interface in a target display mode, wherein the target display mode is different from the display mode of other objects except the target object in the plurality of objects.
10. The method of claim 1, wherein the second location information of each of the objects is location information of a geometric center point of each of the objects.
11. An object determination apparatus, comprising:
the device comprises an acquisition module, a display module and a control module, wherein the acquisition module is used for acquiring first position information of an operation body pointing to a display interface, the display interface comprises a plurality of objects with overlapping areas, and each object has second position information;
the first determining module is used for determining the relative distance between the operation body and each object according to the first position information and the second position information of each object; and
and the second determination module is used for determining a target object pointed by the operation body from the plurality of objects according to the relative distance between the operation body and each object.
12. The apparatus of claim 11, wherein the plurality of objects with overlapping regions comprises a plurality of objects with partially overlapping regions and/or objects that cover regions where other objects are located.
13. The apparatus according to claim 11 or 12, wherein the plurality of objects having overlapping areas include a first object and a second object, the first object is an object covering an area where the second object is located, the first object is located in a first layer, and the second object is located in a second layer;
the device further comprises:
and the editing module is used for editing the second object under the condition that the target object is the second object in the second layer.
14. The apparatus of claim 13, wherein the editing module comprises:
and the first editing unit is used for moving the second object from the second layer to the first layer so as to edit the second object in the first layer.
15. The apparatus of claim 13, wherein the editing module comprises:
and the second editing unit is used for passing through the first image layer and editing a second object in the second image layer.
16. The apparatus of claim 11, wherein the second determining means comprises:
a first determination unit configured to determine an object having a smallest relative distance from the plurality of objects according to a relative distance between the operation body and each of the objects; and
and a second determination unit, configured to use the object with the smallest relative distance as a target object pointed by the operation body.
17. The apparatus of claim 16, further comprising:
a calculation module, configured to calculate an area of each object with the smallest relative distance in the display interface if the object with the smallest relative distance includes a plurality of objects; and
and the third determining module is used for determining the target object according to the area size of each object with the minimum relative distance in the display interface.
18. The apparatus according to claim 11, wherein the second position information of each of the objects is position information of a geometric center point of each of the objects;
the device further comprises:
and the display module is used for displaying the target object in a target display mode in the display interface, wherein the target display mode is different from the display mode of other objects except the target object in the plurality of objects.
19. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-10.
20. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-10.
21. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-10.
CN202110464010.0A 2021-04-27 2021-04-27 Object determination method, device, electronic equipment and storage medium Active CN113126866B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110464010.0A CN113126866B (en) 2021-04-27 2021-04-27 Object determination method, device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110464010.0A CN113126866B (en) 2021-04-27 2021-04-27 Object determination method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113126866A true CN113126866A (en) 2021-07-16
CN113126866B CN113126866B (en) 2023-07-21

Family

ID=76780524

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110464010.0A Active CN113126866B (en) 2021-04-27 2021-04-27 Object determination method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113126866B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101059756A (en) * 2007-05-16 2007-10-24 珠海金山软件股份有限公司 Device and method for user operation of an occluded area
US20100141680A1 (en) * 2008-09-12 2010-06-10 Tatsushi Nashida Information processing apparatus and information processing method
JP2013025622A (en) * 2011-07-22 2013-02-04 Nec Corp Information display device, information display method and program
CN107704162A (en) * 2016-08-08 2018-02-16 法乐第(北京)网络科技有限公司 Annotation object control method
CN107704163A (en) * 2016-08-08 2018-02-16 法乐第(北京)网络科技有限公司 Annotation object control device
CN111737800A (en) * 2020-06-23 2020-10-02 广联达科技股份有限公司 Primitive selection method and device and electronic equipment
CN112035271A (en) * 2019-06-04 2020-12-04 杭州海康威视数字技术股份有限公司 User operation information processing method and device, electronic equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115097980A (en) * 2022-08-24 2022-09-23 成都智暄科技有限责任公司 Method for selecting small-area overlapped transparent control
CN115097980B (en) * 2022-08-24 2022-12-02 成都智暄科技有限责任公司 Small-area overlapping transparent control selection method

Also Published As

Publication number Publication date
CN113126866B (en) 2023-07-21

Similar Documents

Publication Publication Date Title
CN112801164A (en) Training method, device and equipment of target detection model and storage medium
US20170277381A1 (en) Cross-platform interactivity architecture
US20230085732A1 (en) Image processing
CN112862016A (en) Method, device and equipment for labeling objects in point cloud and storage medium
CN114648615A (en) Method, device and equipment for controlling interactive reproduction of target object and storage medium
CN113837194B (en) Image processing method, image processing apparatus, electronic device, and storage medium
CN113126866B (en) Object determination method, device, electronic equipment and storage medium
CN110309239B (en) Visual map editing method and device
CN114327057A (en) Object selection method, device, equipment, medium and program product
CN112784588B (en) Method, device, equipment and storage medium for labeling text
US20140245195A1 (en) Duplicating graphical widgets
WO2019125716A1 (en) System and method for drawing beautification
CN113552988A (en) Interface focus control method and device, electronic equipment and storage medium
CN112947916A (en) Method, device, equipment and storage medium for realizing online canvas
CN115756471A (en) Page code generation method and device, electronic equipment and storage medium
US20220156418A1 (en) Progress tracking with automatic symbol detection
CN111796736B (en) Application sharing method and device and electronic equipment
CN114564268A (en) Equipment management method and device, electronic equipment and storage medium
US20210303744A1 (en) Computer aided design (cad) model connection propagation
CN113885978A (en) Element screenshot method and device combining RPA and AI
CN113138760A (en) Page generation method and device, electronic equipment and medium
CN110851521B (en) Method, device and storage medium for data visualization
CN113986112B (en) Soft keyboard display method, related device and computer program product
CN112507671B (en) Method, apparatus, and readable medium for adjusting text distance
CN116860222B (en) Algorithm flow editing method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant