WO2017092584A1 - Method and device for manipulating an operation object - Google Patents

Method and device for manipulating an operation object

Info

Publication number
WO2017092584A1
Authority
WO
WIPO (PCT)
Prior art keywords
operation object
information
indication information
application window
visual association
Prior art date
Application number
PCT/CN2016/106677
Other languages
English (en)
French (fr)
Inventor
毛信良
周田伟
陈二喜
Original Assignee
上海逗屋网络科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海逗屋网络科技有限公司
Publication of WO2017092584A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • The present invention relates to the field of computer technology, and in particular to a technique for manipulating an operation object.
  • In the prior art, the way an operation object is manipulated depends on the supporting hardware.
  • On a PC, for example, manipulation relies on traditional hardware such as a mouse and keyboard: the user controls the operation object on the screen by mouse clicks, keyboard input, and so on. With other external devices, such as a PS3, PS4, or XBOX360 console, the user manipulates the operation object on the screen through a gamepad.
  • On a touch device, however, the user can only manipulate an operation object by touching or tapping the touch screen.
  • When one or more operation objects move frequently on the screen, or several operation objects must be operated at the same time, it is difficult to select an operation object quickly and accurately, or to switch targets quickly, by finger taps alone. The result is inconvenient operation and reduced processing efficiency.
  • According to one aspect of the present invention, a method for manipulating an operation object is provided, the method comprising: detecting whether an operation object enters a corresponding application window; and, when an operation object enters the application window, presenting in the application window the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object.
  • According to another aspect of the present invention, an apparatus for manipulating an operation object is provided, the apparatus comprising:
  • a first device configured to detect whether an operation object enters a corresponding application window;
  • a second device configured to, when an operation object enters the application window, present in the application window the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object.
  • Compared with the prior art, the present invention detects whether an operation object enters a corresponding application window and, when an operation object enters the application window, presents in the application window the object indication information corresponding to the operation object together with the visual association information between the object indication information and the operation object. By establishing a visual association between the operation object and its object indication information, the invention helps the user identify, select, and manipulate the operation object, supports convenient operations on the object and thus more complex human-computer interaction, allows the user to perform complex operations more easily, improves human-computer interaction efficiency, and enhances the user experience.
  • Moreover, the present invention may also present, in the application window, the operation object itself together with its corresponding object indication information and the visual association information between the object indication information and the operation object, so that the related objects are displayed more clearly and completely on the screen, further facilitating identification, selection, and manipulation by the user.
  • Moreover, the present invention may determine the visual association information according to the object indication information and the operation object; further, it may determine the visual association information between the object indication information and the operation object according to object-related information of the operation object, or according to visual association information already present in the application window, so that the newly determined visual association information is visually distinguishable from the existing visual association information. The present invention thus provides a variety of easily recognized visual association information, which further helps the user identify, select, and manipulate the operation object, improves human-computer interaction efficiency, and enhances the user experience.
  • Moreover, the present invention may select the operation object according to the user's selection of its object indication information; further, it may hide the visual association information and/or the object indication information according to one or more operations subsequent to the selection. The operation object can thus be operated conveniently and efficiently, more complex human-computer interaction is supported, a clearer display is provided, human-computer interaction efficiency is improved, and the user experience is enhanced.
  • Moreover, the present invention can be applied to a touch terminal, with the positions of the object indication information and the operation object set according to the characteristics of the touch screen; further, the object indication information in the application window can be moved according to the user's sliding operation on an object manipulation button, and the display position of the operation object and/or the visual association information in the application window can be adjusted according to the moved object indication information. The present invention thus suits the way users operate a touch screen: the operation object can be controlled with only a small movement of a single finger, which makes operation more convenient and efficient, improves human-computer interaction efficiency, and enhances the user experience.
  • FIG. 1 shows a schematic diagram of an apparatus for manipulating an operating object in accordance with an aspect of the present invention
  • FIG. 2 shows a schematic diagram of an apparatus for manipulating an operating object in accordance with a preferred embodiment of the present invention
  • FIG. 3 shows a flow chart of a method for manipulating an operating object in accordance with another aspect of the present invention
  • FIG. 4 shows a flow chart of a method for manipulating an operating object in accordance with a preferred embodiment of the present invention
  • FIG. 5 illustrates a schematic diagram of the information presented within an application window for manipulating an operation object in accordance with another preferred embodiment of the present invention.
  • In a typical configuration of the present application, the terminal, the devices of the service network, and the trusted party each include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • The memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer-readable medium, such as read-only memory (ROM) or flash memory.
  • Memory is an example of a computer readable medium.
  • Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
  • As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
  • As shown in FIG. 1, the apparatus includes a first device 1 and a second device 2.
  • Specifically, the first device 1 detects whether an operation object enters a corresponding application window; when an operation object enters the application window, the second device 2 presents, in the application window, the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object.
  • Here, the device includes, but is not limited to, any mobile and/or non-mobile device. The device either has its own presentation window for presenting operation objects and/or other related information, or it can interact with other devices that have a presentation window in order to control the operation objects and/or other related information presented within that window.
  • Those skilled in the art will appreciate that the present invention can be applied to various types of devices, such as mobile and non-mobile devices; preferably, it is applicable to touch terminals, including mobile and non-mobile touch terminals; more preferably, it is applicable to a mobile device having a touch terminal.
  • the first device 1 detects whether an operation object enters a corresponding application window.
  • Specifically, the first device 1 may detect whether an operation object enters the corresponding application window based on periodic monitoring or on a specific triggering event.
  • the operation object includes, but is not limited to, one or more operation targets that the user is operating, or one or more operation targets that are to be operated by the user.
  • The operation target may be fully controlled by the user, or jointly controlled by the current user and one or more other users. When at least two users control the operation object, the current user may be the primary controller, for example controlling the object's movement, main actions, and state changes, or an auxiliary controller, for example controlling only the object's state changes.
  • Those skilled in the art will appreciate that these control modes are only examples; other ways of controlling the operation object, where applicable to the present invention, also fall within its scope of protection and are incorporated herein by reference.
  • The application window is the content currently displayed on the terminal screen and visible to the operating user; those skilled in the art will understand that the application window may occupy the entire screen or only part of it.
  • Here, the detection condition includes at least one of the following: the operation object has completely entered the application window; the operation object has partially entered the application window; a specific position of the operation object has entered the application window; or the operation object is about to enter the application window (for example, the time until the operation object enters the application window is below a predetermined threshold, such as entering within the next second).
  • Preferably, when operation objects differ in attributes such as category or controller, different detection methods may be applied based on those attributes. For example, if the operation object's category is a character, or its primary controller is not the current user, the operation object is detected as entering the corresponding application window as soon as any part of the character enters the window; if the operation object's category is not a character, or its primary controller is the current user, the operation object is detected as entering the corresponding application window only when a specific position of the object (such as its center of gravity) enters the window.
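  • Purely as an illustration (not part of the patent text), the following TypeScript sketch shows one way the attribute-dependent entry detection described above could be expressed. The rectangle geometry, the field names, and the choice of the center of gravity as the "specific position" are assumptions made for this example.

```typescript
// Illustrative sketch only: axis-aligned rectangles stand in for the
// application window and the operation object; names are not from the patent.
interface Rect { x: number; y: number; width: number; height: number; }

interface OperationObject {
  id: string;
  category: "character" | "non-character";
  primaryControllerIsCurrentUser: boolean;
  bounds: Rect;                              // on-screen bounding box
  centerOfGravity: { x: number; y: number }; // assumed "specific position"
}

function contains(outer: Rect, p: { x: number; y: number }): boolean {
  return p.x >= outer.x && p.x <= outer.x + outer.width &&
         p.y >= outer.y && p.y <= outer.y + outer.height;
}

function intersects(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

// One possible reading of the rule: characters (or objects mainly controlled
// by someone else) count as "entered" as soon as any part overlaps the window;
// other objects only once the center of gravity lies inside the window.
function hasEnteredWindow(obj: OperationObject, window: Rect): boolean {
  const lenientRule = obj.category === "character" || !obj.primaryControllerIsCurrentUser;
  return lenientRule ? intersects(obj.bounds, window)
                     : contains(window, obj.centerOfGravity);
}
```

A caller would evaluate hasEnteredWindow for each candidate object on every monitoring tick or triggering event.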
  • When an operation object enters the application window, the second device 2 presents, in the application window, the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object.
  • Specifically, when the first device 1 detects that an operation object has entered the application window, the second device 2 first determines the style and position of the object indication information and of the visual association information, and then renders them within the application window based on the determined style and position.
  • Here, the determination may be based on, but is not limited to: default settings; attribute settings such as the category and/or controller of the operation object; the current position information of the operation object; or the various kinds of information already presented in the current application window (including, but not limited to, any of the operation objects, object indication information, and visual association information, or other information).
  • For example, by default the object indication information may be set as an icon identifier smaller than the operation object, so that the object indication information occupies a smaller display area and improves screen resource utilization, while the visual association information may be set as a connecting line between the icon identifier and the operation object, or the icon identifier and the operation object may be given the same color, shape, and so on, to show the association between the object indication information and the operation object.
  • Alternatively, for example, the icon identifiers of operation objects belonging to the same category or the same controller may be given similar shapes or similar colors; or the display position of the object indication information on the screen may be set according to the current position information of the operation object; or the newly added visual association information may be set, according to the color or display mode (such as the connecting-line style or a similar-shape style) of the visual association information already presented in the current application window, to a similar display mode with a different color, and so on.
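  • For illustration only, a minimal sketch of the default style rule just described (a badge smaller than the object, reusing its color and shape). The field names and the 40% size ratio are assumptions made for this example, not values defined by the patent.

```typescript
// Hypothetical badge-style derivation; every name and number is illustrative.
interface BadgeStyle {
  size: number;        // edge length of the badge icon, in px
  color: string;       // color shared with (or derived from) the object
  shape: "box" | "circle";
}

interface ObjectAppearance {
  iconSize: number;    // on-screen size of the operation object's icon, in px
  borderColor: string; // e.g. "red" for a red-bordered operation object
  shape: "box" | "circle";
}

// Default rule sketched in the text: the indication badge is smaller than the
// operation object (saving screen area) and reuses the object's color/shape,
// so the correspondence between badge and object is visible at a glance.
function deriveBadgeStyle(obj: ObjectAppearance): BadgeStyle {
  return {
    size: Math.round(obj.iconSize * 0.4), // 0.4 is an arbitrary example ratio
    color: obj.borderColor,
    shape: obj.shape,
  };
}
```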
  • Preferably, when an operation object enters the application window, the second device 2 may present in the application window the operation object itself, the object indication information corresponding to the operation object, and the visual association information between the object indication information and the operation object.
  • Specifically, the second device 2 may additionally present the operation object in the application window, so that the operation object, its corresponding object indication information, and the visual association information between the object indication information and the operation object are displayed in the application window at the same time.
  • Those skilled in the art will understand that if the first device 1 detects the operation object only after it has already entered the application window, the operation object is displayed in the application window directly once the first device 1 confirms its entry; if the first device 1 can detect that the operation object is going to enter the application window before it actually enters, the operation object is displayed in the application window once it has entered.
  • Here, the manner in which the second device 2 presents the object indication information and the visual association information is the same as or similar to that of the second device 2 in FIG. 1, and is therefore not repeated here but incorporated herein by reference.
  • The second device 2 may determine the style and position of the object indication information and the visual association information according to the operation object. Conversely, and preferably, the second device 2 may also re-determine the style, position, and other properties of the operation object according to the style and position of the object indication information and the visual association information, so that the application window is displayed more clearly and is easier to operate.
  • Preferably, the visual association information comprises at least one of the following (an illustrative data-model sketch follows this list):
  • display color information shared by the object indication information and the operation object: for example, all or part of the operation object and the object indication information are set to the same color or color family to indicate the correspondence between them, e.g. if the border of the operation object is red, the main color of the object indication information is also red;
  • display shape information shared by the object indication information and the operation object: for example, the object indication information and the operation object are the same box, circle, or other shape; here, "shared" includes, but is not limited to, the two being fully or partially identical in shape and size, or merely carrying a similar shape marker (such as a small triangle shared in the upper-right corner of both);
  • display icon information shared by the object indication information and the operation object: for example, the object indication information carries the same avatar or other identification information as the operation object;
  • connecting-line information between the object indication information and the operation object: for example, the object indication information and the operation object are connected by lines having the same or different colors, line types, or combinations thereof, where the line types include, but are not limited to, solid lines, dashed lines, dash-dot lines, dotted lines, and the like.
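  • As an illustration only, the four kinds of visual association information listed above could be modeled as a small discriminated union; the type and field names below are assumptions for this sketch, not terms defined by the patent.

```typescript
// A possible data model for the visual association information.
type LineType = "solid" | "dashed" | "dash-dot" | "dotted";

type VisualAssociation =
  | { kind: "shared-color"; color: string }                   // same color/family
  | { kind: "shared-shape"; shape: string }                   // same or similar shape marker
  | { kind: "shared-icon"; iconUrl: string }                  // same avatar/identifier
  | { kind: "connector"; color: string; lineType: LineType }; // line between badge and object

// Example roughly mirroring the FIG. 5 description: object 1 uses a matching
// color, objects 2 and 3 use connector lines with different line types.
const associations: Record<string, VisualAssociation> = {
  "object-1": { kind: "shared-color", color: "red" },
  "object-2": { kind: "connector", color: "gray", lineType: "dashed" },
  "object-3": { kind: "connector", color: "gray", lineType: "dotted" },
};
```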
  • Preferably, the apparatus further includes a third device (not shown), which selects the operation object according to the user's selection operation on the object indication information.
  • Specifically, the third device interacts with the user, or with other devices capable of capturing the user's operations, to obtain one or more selection operations performed by the user on the object indication information; the selection operations include, but are not limited to, selection by click, touch, button press, and the like.
  • The third device then takes the operation object corresponding to the selected object indication information as the selected target, which makes the user's operation more convenient and efficient and improves the user experience.
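  • As a minimal illustration of this badge-to-object selection mapping (names are assumptions for the sketch, not from the patent):

```typescript
// Tapping/clicking a badge selects the operation object it points to.
interface Badge { id: string; targetObjectId: string; }

class SelectionController {
  private selected: string | null = null;

  constructor(private readonly badges: Map<string, Badge>) {}

  // Called with the badge id carried by a click/touch/button event.
  onBadgeSelected(badgeId: string): string | null {
    const badge = this.badges.get(badgeId);
    if (!badge) return null;      // unknown badge: ignore the event
    this.selected = badge.targetObjectId;
    return this.selected;         // this operation object is now the selected target
  }
}
```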
  • More preferably, the apparatus further comprises a fourth device (not shown) used for at least one of the following (an illustrative sketch of these hiding rules follows the list):
  • hiding the visual association information and/or the object indication information when the operation object is selected: that is, when the operation object is selected via its object indication information, the visual association information and/or the object indication information are hidden on the screen, which both indicates to the user that the specific operation object has been selected and stops occluding the display content of the application window, improving screen resource utilization;
  • hiding the visual association information and/or the object indication information when the operation object is moved out of the application window: that is, when the operation object is no longer displayed in the application window, the visual association information and/or the object indication information are hidden as well, to avoid cluttering the display;
  • hiding the visual association information and/or the object indication information after the operation object has been selected and operated on: that is, the visual association information and/or the object indication information are hidden only after the operation object has been selected and one or more operations have been performed on it; the hiding methods include, but are not limited to, hiding upon a specific operation, hiding after a certain number of operations have been performed, or hiding once the performed operations satisfy a certain condition (for example, the change in the operation object's state value reaches a certain threshold).
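  • For illustration only, the optional hiding rules above could be expressed as predicates over a small state snapshot; the field names and thresholds below are assumptions made for this sketch.

```typescript
// Hypothetical state snapshot and configuration for the hiding rules.
interface ObjectUiState {
  selected: boolean;
  insideWindow: boolean;
  operationsSinceSelection: number;
  stateValueChange: number;        // e.g. accumulated change in a status value
}

interface HideConfig {
  hideOnSelect: boolean;
  hideWhenOutsideWindow: boolean;
  hideAfterOperations?: number;    // hide after N operations on the object
  hideAfterStateChange?: number;   // hide once the state change reaches a threshold
}

function shouldHideIndication(s: ObjectUiState, cfg: HideConfig): boolean {
  if (cfg.hideOnSelect && s.selected) return true;
  if (cfg.hideWhenOutsideWindow && !s.insideWindow) return true;
  if (cfg.hideAfterOperations !== undefined &&
      s.selected && s.operationsSinceSelection >= cfg.hideAfterOperations) return true;
  if (cfg.hideAfterStateChange !== undefined &&
      s.selected && s.stateValueChange >= cfg.hideAfterStateChange) return true;
  return false;
}
```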
  • Preferably, the present invention is applicable to a touch terminal, with the application window displayed on the touch screen of the touch terminal; the object indication information is closer than the operation object to the object manipulation button in the application window, or the distance between the object indication information and the object manipulation button is less than or equal to a predetermined distance threshold.
  • As shown in FIG. 5, operation object 1 to operation object 3 are connected to their corresponding object indication information by arrow lines of different line types; operation object 1 and its corresponding object indication information additionally share a corresponding color to show the correspondence between the two.
  • Object indication information 1 to 4 is placed adjacent to the object manipulation button, i.e. the control button used to perform specific operations on the operation object, such as moving it or changing its state. As a result, when the user selects or moves an operation object, the finger travels a smaller distance, the operation is more convenient and efficient, and the user experience is better.
  • Alternatively, the distance between the object indication information and the object manipulation button is less than or equal to a predetermined distance threshold, such as 3 cm, so that the user can move multiple operation objects with only small movements of a single finger, again making operation more convenient and efficient and improving the user experience.
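  • Purely as an illustration of keeping the badges within reach of the manipulation button, the sketch below arranges them on a small arc around the button within a maximum distance. All numeric values and names are assumptions for the example, not parameters defined by the patent.

```typescript
// Hypothetical badge layout near the object manipulation button.
interface Point { x: number; y: number; }

function layoutBadges(button: Point, count: number, maxDistancePx = 180): Point[] {
  const radius = Math.min(maxDistancePx, 120); // stay within the distance threshold
  const positions: Point[] = [];
  for (let i = 0; i < count; i++) {
    // spread badges over a quarter circle above and to the left of the button
    const angle = Math.PI / 2 + (i / Math.max(count - 1, 1)) * (Math.PI / 2);
    positions.push({
      x: button.x + radius * Math.cos(angle),
      y: button.y - radius * Math.sin(angle),
    });
  }
  return positions;
}
```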
  • More preferably, the apparatus further includes a fifth device (not shown) and a sixth device (not shown), wherein the fifth device moves the object indication information in the application window according to the user's sliding operation on the object manipulation button, and the sixth device adjusts the display position of the operation object and/or the visual association information in the application window according to the moved object indication information.
  • Specifically, the fifth device moves one or more pieces of object indication information in the application window according to the user's upward, downward, or other sliding operation on the object manipulation button; the movement may be proportional to the direction and displacement of the sliding operation, or may follow a default setting that moves the object indication information in the same direction and/or by the same distance.
  • The sixth device then adjusts the display position of the operation object and/or the visual association information in the application window according to the moved object indication information, for example according to a predetermined spacing or the overall layout of the application window.
  • Preferably, the operation object and/or the visual association information may be moved out of, or moved into, the current application window.
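  • For illustration only, a minimal sketch of the fifth/sixth-device behaviour: a slide on the manipulation button moves the badges, and the objects or associations are re-laid-out from the moved badges. The proportional factor and the fixed spacing are assumptions for this example.

```typescript
// Hypothetical slide handling for badges and the objects they point to.
interface Vec { x: number; y: number; }

function moveBadges(badges: Vec[], slideDelta: Vec, factor = 1.0): Vec[] {
  // proportional movement: every badge follows the slide, scaled by `factor`
  return badges.map(b => ({ x: b.x + slideDelta.x * factor, y: b.y + slideDelta.y * factor }));
}

function adjustObjectPositions(movedBadges: Vec[], spacing = 48): Vec[] {
  // e.g. keep each operation object at a fixed vertical spacing above its badge
  return movedBadges.map(b => ({ x: b.x, y: b.y - spacing }));
}
```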
  • As shown in FIG. 2, the apparatus includes a first device 1' and a second device 2', and the second device 2' includes a first unit 21', a second unit 22', and a third unit 23'.
  • Specifically, the first device 1' detects whether an operation object enters a corresponding application window; when an operation object enters the application window, the first unit 21' presents, in the application window, the object indication information corresponding to the operation object; the second unit 22' determines the visual association information between the object indication information and the operation object; and the third unit 23' presents that visual association information within the application window.
  • The first device 1' of this apparatus is the same as or substantially the same as the corresponding device shown in FIG. 1, and is therefore not described again here but incorporated herein by reference.
  • Specifically, when an operation object enters the application window, the first unit 21' determines the shape, color, and other properties of the object indication information corresponding to the operation object according to default settings or to object indication information already present in the current application window, and presents that object indication information in the application window; the second unit 22' then determines the visual association information between the object indication information and the operation object according to the object indication information or to visual association information already present in the current application window; and the third unit 23' presents the visual association information within the application window.
  • Those skilled in the art will understand that the first unit 21' and the second unit 22' may operate in the order shown in FIG. 2, may operate in parallel, or the second unit 22' may operate before the first unit 21'.
  • Here, determining new object indication information or visual association information based on that already existing in the current application window includes, but is not limited to, making the shape, color, and/or position of the new object indication information or visual association information the same as, similar to, or contrasting with the shape, color, and/or position of the existing object indication information or visual association information.
  • Preferably, the second unit 22' may determine the visual association information between the object indication information and the operation object according to object-related information of the operation object, such as its name, rank, weight, category, or current state. For example, when there are multiple operation objects, operation objects of the same category may use the same visual association information, while operation objects of different ranks may use visual association information of different colors.
  • Preferably, the second unit 22' may determine the visual association information between the object indication information and the operation object according to the visual association information already present in the application window, such that the new visual association information is visually distinguishable from the existing visual association information. Specifically, the second unit 22' may give the newly determined visual association information a visual effect different from that of the existing information, for example a different color, a different shape, or a different form of expression (such as a connecting line versus a shared color), so that each piece of visual association information in the application window remains distinguishable, which facilitates recognition and operation by the user.
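  • As an illustration only, one simple way to keep a new association visually distinct from those already shown is to pick an unused colour and, if the palette is exhausted, vary the line type instead. The palette, names, and fallback order are assumptions for this sketch.

```typescript
// Hypothetical "pick a distinct style" helper for new visual associations.
const PALETTE = ["red", "blue", "green", "orange", "purple"];

interface AssociationStyle { color: string; lineType: "solid" | "dashed" | "dotted"; }

function pickDistinctStyle(existing: AssociationStyle[]): AssociationStyle {
  const usedColors = new Set(existing.map(a => a.color));
  const color = PALETTE.find(c => !usedColors.has(c)) ?? PALETTE[0];
  // if a colour had to be reused, at least vary the line type for that colour
  const usedLineTypes = new Set(existing.filter(a => a.color === color).map(a => a.lineType));
  const lineType = (["solid", "dashed", "dotted"] as const).find(t => !usedLineTypes.has(t)) ?? "solid";
  return { color, lineType };
}
```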
  • In step S1, the device detects whether an operation object enters a corresponding application window; when an operation object enters the application window, in step S2 the device presents, in the application window, the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object.
  • In step S1, the device detects whether an operation object enters a corresponding application window. Specifically, the device may perform this detection based on periodic monitoring or on a specific triggering event.
  • the operation object includes, but is not limited to, one or more operation targets that the user is operating, or one or more operation targets that are to be operated by the user.
  • the operation target may be completely controlled by the user, or may be jointly controlled by the current user and one or more other users.
  • When at least two users control the operation object, the current user may be the primary controller, for example controlling the object's movement, main actions, and state changes, or an auxiliary controller, for example controlling only the object's state changes.
  • Those skilled in the art will appreciate that these control modes are only examples; other ways of controlling the operation object, where applicable to the present invention, also fall within its scope of protection and are incorporated herein by reference.
  • The application window is the content currently displayed on the terminal screen and visible to the operating user; those skilled in the art will understand that the application window may occupy the entire screen or only part of it.
  • Here, the detection condition includes at least one of the following: the operation object has completely entered the application window; the operation object has partially entered the application window; a specific position of the operation object has entered the application window; or the operation object is about to enter the application window (for example, the time until the operation object enters the application window is below a predetermined threshold, such as entering within the next second).
  • Preferably, when operation objects differ in attributes such as category or controller, different detection methods may be applied based on those attributes. For example, if the operation object's category is a character, or its primary controller is not the current user, the operation object is detected as entering the corresponding application window as soon as any part of the character enters the window; if the operation object's category is not a character, or its primary controller is the current user, the operation object is detected as entering the corresponding application window only when a specific position of the object (such as its center of gravity) enters the window.
  • When an operation object enters the application window, in step S2 the device presents, in the application window, the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object.
  • Specifically, when step S1 detects that an operation object has entered the application window, in step S2 the device first determines the style and position of the object indication information and of the visual association information, and then renders them within the application window based on the determined style and position.
  • Here, the determination may be based on, but is not limited to: default settings; attribute settings such as the category and/or controller of the operation object; the current position information of the operation object; or the various kinds of information already presented in the current application window (including, but not limited to, any of the operation objects, object indication information, and visual association information, or other information).
  • For example, by default the object indication information may be set as an icon identifier smaller than the operation object, so that the object indication information occupies a smaller display area and improves screen resource utilization, while the visual association information may be set as a connecting line between the icon identifier and the operation object, or the icon identifier and the operation object may be given the same color, shape, and so on, to show the association between the object indication information and the operation object.
  • Alternatively, for example, the icon identifiers of operation objects belonging to the same category or the same controller may be given similar shapes or similar colors; or the display position of the object indication information on the screen may be set according to the current position information of the operation object; or the newly added visual association information may be set, according to the color or display mode (such as the connecting-line style or a similar-shape style) of the visual association information already presented in the current application window, to a similar display mode with a different color, and so on.
  • Preferably, when an operation object enters the application window, in step S2 the device may present in the application window the operation object itself, the object indication information corresponding to the operation object, and the visual association information between the object indication information and the operation object.
  • Specifically, in step S2 the device may additionally present the operation object in the application window, so that the operation object, its corresponding object indication information, and the visual association information between the object indication information and the operation object are displayed in the application window at the same time.
  • Those skilled in the art will understand that if, in step S1, the device detects the operation object only after it has already entered the application window, the operation object is displayed in the application window directly once step S1 confirms its entry; if, in step S1, the device can detect that the operation object is going to enter the application window before it actually enters, the operation object is displayed in the application window once it has entered.
  • Here, the manner in which the device presents the object indication information and the visual association information in step S2 is the same as or similar to that described for step S2 of FIG. 3, and is therefore not repeated here but incorporated herein by reference.
  • In step S2, the device may determine the style and position of the object indication information and the visual association information according to the operation object. Conversely, and preferably, in step S2 the device may also re-determine the style, position, and other properties of the operation object according to the style and position of the object indication information and the visual association information, so that the application window is displayed more clearly and is easier to operate.
  • Preferably, the visual association information comprises at least one of the following:
  • display color information shared by the object indication information and the operation object: for example, all or part of the operation object and the object indication information are set to the same color or color family to indicate the correspondence between them, e.g. if the border of the operation object is red, the main color of the object indication information is also red;
  • display shape information shared by the object indication information and the operation object: for example, the object indication information and the operation object are the same box, circle, or other shape; here, "shared" includes, but is not limited to, the two being fully or partially identical in shape and size, or merely carrying a similar shape marker (such as a small triangle shared in the upper-right corner of both);
  • display icon information shared by the object indication information and the operation object: for example, the object indication information carries the same avatar or other identification information as the operation object;
  • connecting-line information between the object indication information and the operation object: for example, the object indication information and the operation object are connected by lines having the same or different colors, line types, or combinations thereof, where the line types include, but are not limited to, solid lines, dashed lines, dash-dot lines, dotted lines, and the like.
  • Preferably, the method further includes a step S3 (not shown), in which the device selects the operation object according to the user's selection operation on the object indication information.
  • Specifically, in step S3 the device interacts with the user, or with other devices capable of capturing the user's operations, to obtain one or more selection operations performed by the user on the object indication information; the selection operations include, but are not limited to, selection by click, touch, button press, and the like.
  • Then, in step S3, the device takes the operation object corresponding to the selected object indication information as the selected target, which makes the user's operation more convenient and efficient and improves the user experience.
  • More preferably, the method further comprises a step S4 (not shown), in which the device performs at least one of the following:
  • hiding the visual association information and/or the object indication information when the operation object is selected: that is, when the operation object is selected via its object indication information, the visual association information and/or the object indication information are hidden on the screen, which both indicates to the user that the specific operation object has been selected and stops occluding the display content of the application window, improving screen resource utilization;
  • hiding the visual association information and/or the object indication information when the operation object is moved out of the application window: that is, when the operation object is no longer displayed in the application window, the visual association information and/or the object indication information are hidden as well, to avoid cluttering the display;
  • hiding the visual association information and/or the object indication information after the operation object has been selected and operated on: that is, the visual association information and/or the object indication information are hidden only after the operation object has been selected and one or more operations have been performed on it; the hiding methods include, but are not limited to, hiding upon a specific operation, hiding after a certain number of operations have been performed, or hiding once the performed operations satisfy a certain condition (for example, the change in the operation object's state value reaches a certain threshold).
  • Preferably, the present invention is applicable to a touch terminal, with the application window displayed on the touch screen of the touch terminal; the object indication information is closer than the operation object to the object manipulation button in the application window, or the distance between the object indication information and the object manipulation button is less than or equal to a predetermined distance threshold.
  • As shown in FIG. 5, operation object 1 to operation object 3 are connected to their corresponding object indication information by arrow lines of different line types; operation object 1 and its corresponding object indication information additionally share a corresponding color to show the correspondence between the two.
  • Object indication information 1 to 4 is placed adjacent to the object manipulation button, i.e. the control button used to perform specific operations on the operation object, such as moving it or changing its state. As a result, when the user selects or moves an operation object, the finger travels a smaller distance, the operation is more convenient and efficient, and the user experience is better.
  • Alternatively, the distance between the object indication information and the object manipulation button is less than or equal to a predetermined distance threshold, such as 3 cm, so that the user can move multiple operation objects with only small movements of a single finger, again making operation more convenient and efficient and improving the user experience.
  • More preferably, the method further includes a step S5 (not shown) and a step S6 (not shown): in step S5, the device moves the object indication information in the application window according to the user's sliding operation on the object manipulation button; in step S6, the device adjusts the display position of the operation object and/or the visual association information in the application window according to the moved object indication information.
  • Specifically, in step S5 the device moves one or more pieces of object indication information in the application window according to the user's upward, downward, or other sliding operation on the object manipulation button; the movement may be proportional to the direction and displacement of the sliding operation, or may follow a default setting that moves the object indication information in the same direction and/or by the same distance.
  • Then, in step S6, the device adjusts the display position of the operation object and/or the visual association information in the application window according to the moved object indication information, for example according to a predetermined spacing or the overall layout of the application window.
  • Preferably, the operation object and/or the visual association information may be moved out of, or moved into, the current application window.
  • In step S1', the device detects whether an operation object enters a corresponding application window; when an operation object enters the application window, in step S21' the device presents, in the application window, the object indication information corresponding to the operation object; in step S22', the device determines the visual association information between the object indication information and the operation object; and in step S23', the device presents that visual association information in the application window.
  • Step S1' of this method is the same as or substantially the same as the corresponding step shown in FIG. 3, and is therefore not described again here but incorporated herein by reference.
  • Specifically, when an operation object enters the application window, in step S21' the device determines the shape, color, and other properties of the object indication information corresponding to the operation object according to default settings or to object indication information already present in the current application window, and presents that object indication information in the application window; then, in step S22', the device determines the visual association information between the object indication information and the operation object according to the object indication information or to visual association information already present in the current application window; and in step S23', the device presents the visual association information in the application window.
  • Those skilled in the art will understand that step S21' and step S22' may be performed in the order shown in FIG. 4, may be performed in parallel, or step S22' may be performed before step S21'.
  • Here, determining new object indication information or visual association information based on that already existing in the current application window includes, but is not limited to, making the shape, color, and/or position of the new object indication information or visual association information the same as, similar to, or contrasting with the shape, color, and/or position of the existing object indication information or visual association information.
  • Preferably, the device may determine the visual association information between the object indication information and the operation object according to object-related information of the operation object, such as its name, rank, weight, category, or current state. For example, when there are multiple operation objects, operation objects of the same category may use the same visual association information, while operation objects of different ranks may use visual association information of different colors.
  • Preferably, in step S22' the device may determine the visual association information between the object indication information and the operation object according to the visual association information already present in the application window, such that the newly determined visual association information is visually distinguishable from the existing visual association information. Specifically, the device may give the newly determined visual association information a visual effect different from that of the existing information, for example a different color, a different shape, or a different form of expression (such as a connecting line versus a shared color), so that each piece of visual association information in the application window remains distinguishable, which facilitates recognition and operation by the user.
  • the present invention can be implemented in software and/or a combination of software and hardware.
  • the various devices of the present invention can be implemented using an application specific integrated circuit (ASIC) or any other similar hardware device.
  • the software program of the present invention may be executed by a processor to implement the steps or functions described above.
  • the software program (including related data structures) of the present invention can be stored in a computer readable recording medium such as a RAM memory, a magnetic or optical drive or a floppy disk and the like.
  • some of the steps or functions of the present invention may be implemented in hardware, for example, as a circuit that cooperates with a processor to perform various steps or functions.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and device for manipulating an operation object: detect whether an operation object enters a corresponding application window (S1); when an operation object enters the application window, present in the application window the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object (S2). Compared with the prior art, this solution establishes a visual association between the operation object and its object indication information, which helps the user identify, select, and manipulate the operation object, supports convenient operations on the object and thus more complex human-computer interaction, allows the user to perform complex operations more easily, improves human-computer interaction efficiency, and enhances the user experience.

Description

一种用于操控操作对象的方法与设备 技术领域
本发明涉及计算机技术领域,尤其涉及一种用于操控操作对象的技术。
背景技术
在现有技术中,对操作对象的操控方法依赖于配套的硬件设施。如在PC端对操作对象的操控方法中,需要采用传统的鼠标、键盘等硬件进行支持,用户通过鼠标点击、键盘按键进行输入等方式,实现对屏幕上操作对象的操控;在基于其他外接设备,如PS3、PS4、XBOX360等主机时,用户可通过操控手柄来操作屏幕上的操作对象。
而在触摸设备上,用户仅能通过触摸或点击触摸屏来实现对操作对象的操控。当一个或多个操作对象在屏幕上频繁移动或需要同时对多个操作对象进行操作时,仅依靠用户手指在屏幕上进行点选则很难实现对操作对象的快速准确选择与快速目标切换,从而导致操作不便,影响了处理效率。
发明内容
本发明的目的是提供一种用于操控操作对象的方法与设备。
根据本发明的一个方面,提供了一种用于操控操作对象的方法,其中,该方法包括:
a检测是否有操作对象进入对应的应用窗口;
b当存在操作对象进入所述应用窗口,在所述应用窗口内呈现所述操作对象对应的对象指示信息,以及所述对象指示信息与所述操作对象的视觉关联信息。
根据本发明的另一方面,还提供了一种用于操控操作对象的设备,其中,该设备包括:
第一装置,用于检测是否有操作对象进入对应的应用窗口;
第二装置,用于当存在操作对象进入所述应用窗口,在所述应用窗口内呈现所述操作对象对应的对象指示信息,以及所述对象指示信息与所述操作对象的视觉关联信息。
与现有技术相比,本发明检测操作对象是否进入对应的应用窗口,当存在操作对象进入所述应用窗口时,则在所述应用窗口内呈现所述操作对象对应的对象指示信息、以及所述对象指示信息与所述操作对象的视觉关联信息;从而通过在操作对象与对象指示信息间建立视觉关联的方式,方便用户识别、选择和操控所述操作对象,支持用户便捷地对该操作对象执行操作,进而支持更为复杂的人机交互,以便用户更为便捷地执行复杂操作,提高人机交互效率并提升用户的使用体验。
而且,本发明还可以在所述应用窗口内呈现所述操作对象、所述操作对象对应的对象指示信息,以及所述对象指示信息与所述操作对象的视觉关联信息,从而能够在屏幕上更加清晰完整的显示相关对象,进一步方便用户的识别、选择和操控。
而且,本发明还可以根据所述对象指示信息与所述操作对象来确定视觉关联信息;进一步地,还可以根据所述操作对象的对象相关信息确定所述对象指示信息与所述操作对象的视觉关联信息;或者,进一步地,还可以根据所述应用窗口内的已有视觉关联信息确定所述对象指示信息与所述操作对象的视觉关联信息,使得所确定的视觉关联信息与所述已有视觉关联信息在视觉上相区分。从而,本发明提供了多种易于识别的视觉关联信息,进一步方便用户识别、选择和操控所述操作对象,提高人机交互效率并提升用户的使用体验。
而且,本发明还可以根据用户对所述对象指示信息的选中操作,选中所述操作对象;进一步地,还可以根据所述选中操作的一种或多种后续操作,对所述视觉关联信息和/或所述对象指示信息执行隐藏操作。从而,本发明能够便捷、高效地对所述操作对象进行操作,支持更为复杂的人机交互,进一步地还能够提供更为清晰的显示效果,提高人机交互效率并提升用户的使用体验。
而且,本发明还可以在触摸终端上进行应用,并根据触摸终端触摸屏的特点来设置所述对象指示信息、所述操作对象的位置;进一步地,还可以根据用户对对象操控按钮的滑动操作移动所述应用窗口中的所述对象指示信息,并根据移动后的所述对象指示信息调整所述操作对象和/或所述视觉关联信息在所述应用窗口中的显示位置。从而,本发明能够适应用户操作触摸屏的操作特点,即使仅采用单指的小幅移动也可以实现对所述操作对象的控制,因此,操作更为便捷、高效,提高人机交互效率并提升用户的使用体验。
附图说明
通过阅读参照以下附图所作的对非限制性实施例所作的详细描述,本发明的其它特征、目的和优点将会变得更明显:
图1示出根据本发明一个方面的一种用于操控操作对象的设备示意图;
图2示出根据本发明一个优选实施例的一种用于操控操作对象的设备示意图;
图3示出根据本发明另一个方面的一种用于操控操作对象的方法流程图;
图4示出根据本发明一个优选实施例的一种用于操控操作对象的方法流程图;
图5示出根据本发明另一个优选实施例的一种用于操控操作对象的应用窗口内的呈现信息示意图。
附图中相同或相似的附图标记代表相同或相似的部件。
具体实施方式
下面结合附图对本发明作进一步详细描述。
在更加详细地讨论示例性实施例之前应当提到的是,一些示例性实施例被描述成作为流程图描绘的处理或方法。虽然流程图将各项操作描述成顺序的处理,但是其中的许多操作可以被并行地、并发地或者同时实施。 此外,各项操作的顺序可以被重新安排。当其操作完成时所述处理可以被终止,但是还可以具有未包括在附图中的附加步骤。所述处理可以对应于方法、函数、规程、子例程、子程序等等。
在本申请一个典型的配置中,终端、服务网络的设备和可信方均包括一个或多个处理器(CPU)、输入/输出接口、网络接口和内存。
内存可能包括计算机可读介质中的非永久性存储器,随机存取存储器(RAM)和/或非易失性内存等形式,如只读存储器(ROM)或闪存(flash RAM)。内存是计算机可读介质的示例。
计算机可读介质包括永久性和非永久性、可移动和非可移动媒体可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括,但不限于相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带,磁带磁盘存储或其他磁性存储设备或任何其他非传输介质,可用于存储可以被计算设备访问的信息。按照本文中的界定,计算机可读介质不包括非暂存电脑可读媒体(transitory media),如调制的数据信号和载波。
后面所讨论的方法(其中一些通过流程图示出)可以通过硬件、软件、固件、中间件、微代码、硬件描述语言或者其任意组合来实施。当用软件、固件、中间件或微代码来实施时,用以实施必要任务的程序代码或代码段可以被存储在机器或计算机可读介质(比如存储介质)中。(一个或多个)处理器可以实施必要的任务。
这里所公开的具体结构和功能细节仅仅是代表性的,并且是用于描述本发明的示例性实施例的目的。但是本发明可以通过许多替换形式来具体实现,并且不应当被解释成仅仅受限于这里所阐述的实施例。
应当理解的是,虽然在这里可能使用了术语“第一”、“第二”等等来描述各个单元,但是这些单元不应当受这些术语限制。使用这些术语仅仅 是为了将一个单元与另一个单元进行区分。举例来说,在不背离示例性实施例的范围的情况下,第一单元可以被称为第二单元,并且类似地第二单元可以被称为第一单元。这里所使用的术语“和/或”包括其中一个或更多所列出的相关联项目的任意和所有组合。
这里所使用的术语仅仅是为了描述具体实施例而不意图限制示例性实施例。除非上下文明确地另有所指,否则这里所使用的单数形式“一个”、“一项”还意图包括复数。还应当理解的是,这里所使用的术语“包括”和/或“包含”规定所陈述的特征、整数、步骤、操作、单元和/或组件的存在,而不排除存在或添加一个或更多其他特征、整数、步骤、操作、单元、组件和/或其组合。
下面结合附图对本发明作进一步详细描述。
图1示出根据本发明一个方面的一种用于操控操作对象的设备示意图;其中,该设备包括第一装置1和第二装置2。具体地,所述第一装置1检测是否有操作对象进入对应的应用窗口;当存在操作对象进入所述应用窗口,所述第二装置2在所述应用窗口内呈现所述操作对象对应的对象指示信息,以及所述对象指示信息与所述操作对象的视觉关联信息。
其中,所述设备包括但不限于任意移动设备和/或非移动设备。所述设备自带用于呈现操作对象和/或其他相关信息的呈现窗口;或者所述设备能够与其他具有呈现窗口的设备相交互,以控制该呈现窗口内所呈现的操作对象和/或其他相关信息。
在此,本领域技术人员应能理解,本发明可以应用于各类设备,如移动设备、非移动设备;优选地,本发明可应用于触摸终端,所述触摸终端包括移动触摸终端与非移动触摸终端;更优选地,本发明可应用于具有触摸终端的移动设备。
所述第一装置1检测是否有操作对象进入对应的应用窗口。
具体地,所述第一装置1可以基于定时监测或基于特定事件触发的方式,检测是否有操作对象进入对应的应用窗口。
其中,所述操作对象包括但不限于用户正在操作中的一个或多个操作目标,或即将被用户操作的一个或多个操作目标。所述操作目标可以被用 户完全操控,也可以是由当前用户与其他一名或多名用户来联合操控。当至少两名用户操控所述操作对象时,当前用户可以是主要控制者,如控制该操作对象的移动、主要动作、状态变更等,也可以是辅助控制者,如仅控制该操作对象的状态变更等。在此,本领域技术人员应能理解,所述控制方式仅为举例,其他对操作对象的控制方式如能适用于本发明,同样包含在本发明的保护范围内,并以引用的方式包含于此。
其中,所述应用窗口即为在终端屏幕中当前所显示的、为操作用户可见的内容;本领域技术人员应能理解,所述应用窗口可以呈现于整个屏幕上,也可呈现于部分屏幕上。
在此,所述检测方法包括以下至少任一项:当所述操作对象完全进入所述应用窗口时;当所述操作对象部分进入所述应用窗口时;当所述操作对象的特定位置进入所述应用窗口时;当所述操作对象即将进入所述应用窗口时(如所述操作对象进入所述应用窗口的时间小于预定阈值,如在下一秒就会进入所述应用窗口)。
优选地,当所述操作对象所对应的类别、控制者等属性不同时,可以基于其属性的不同,对所述操作对象采取不同的检测方法。例如,若所述操作对象的类别为人物,或者该操作对象的主要控制者非当前用户时,则当该人物的任一部分进入所述应用窗口时即可检测到有操作对象进入所述对应的应用窗口;若所述操作对象的类别为非人物,或者该操作对象的主要控制者为当前用户时,则当该操作对象的特定位置(如重心位置)进入所述应用窗口时才将其检测为有操作对象进入所述对应的应用窗口。
在此,本领域技术人员应能理解,上述检测方法仅为示例,并非对本发明的限制,其他检测是否有操作对象进入对应的应用窗口的方法,同样适用于本发明,在此不再赘述,同样包含于本发明的保护范围中。
当存在操作对象进入所述应用窗口,所述第二装置2在所述应用窗口内呈现所述操作对象对应的对象指示信息,以及所述对象指示信息与所述操作对象的视觉关联信息。
具体地,当所述第一装置1检测到存在操作对象进入所述应用窗口时,所述第二装置2首先确定所述对象指示信息以及所述视觉关联信息的样式 与位置;然后,基于所确定的样式与位置,在所述应用窗口内进行展现。
在此,所述确定方式包括但不限于:基于缺省设置,基于所述操作对象的类别和/或控制者等属性设置、基于所述操作对象的当前位置信息、基于当前应用窗口内已呈现的各类信息(包括但不限于操作对象、对象指示信息以及视觉关联信息中的任意一种或其他)等。
例如,在缺省设置中,可设置所述对象指示信息为小于所述操作对象的图标标识,从而使得所述对象指示信息占用更小的显示面积,以提高屏幕资源利用率,而所述视觉关联信息则可设置为在所述图标标识以及所述操作对象之间的连线,或是将所述图标标识与所述操作对象设置为同样的颜色、形状等,以显示所述对象指示信息与所述操作对象的关联性。
或者,例如,可将属于同一类别或属于同一控制者的操作对象的图标标识设置成近似形状或具有近似的颜色;或者,例如,可根据所述操作对象的当前位置信息来设置所述对象指示信息在屏幕上的显示位置;或者,例如,可根据当前应用窗口内已呈现的视觉关联信息的颜色或显示方式(如连线方式或近似形状的显示方式),将新设置的视觉关联信息设置为与已呈现的视觉关联信息近似的显示方式以及不同的颜色等等。
在此,本领域技术人员应能理解,上述显示方式仅为示例,并非对本发明的限制,其他能够指示所述操作对象对应的对象指示信息,以及所述对象指示信息与所述操作对象的视觉关联信息的方式同样适用于本发明,在此不再赘述,同样包含于本发明的保护范围中。
优选地,当存在操作对象进入所述应用窗口,所述第二装置2可以在所述应用窗口内呈现所述操作对象、所述操作对象对应的对象指示信息,以及所述对象指示信息与所述操作对象的视觉关联信息。
具体地,所述第二装置2还可以在所述应用窗口内呈现所述操作对象,使得所述应用窗口内同时呈现所述操作对象、所述操作对象对应的对象指示信息,以及所述对象指示信息与所述操作对象的视觉关联信息。在此,本领域技术人员应能理解,若所述第一装置1在所述操作对象进入所述应用窗口内之后才检测到有所述操作对象进入所述应用窗口,则当所述第一装置1确认有该操作对象进入时,则直接在所述应用窗口内显示所述操作 对象;若所述第一装置1在所述操作对象进入所述应用窗口内前便可检测到有所述操作对象进入所述应用窗口,则当该操作对象进入所述应用窗口之后,在所述应用窗口内显示所述操作对象。
在此,所述第二装置2呈现所述对象指示信息以及所述视觉关联信息的方式与图1中所述第二装置2相同或相似,故在此不再赘述,并以引用的方式包含于此。
所述第二装置2可以根据所述操作对象来确定所述对象指示信息以及所述视觉关联信息的样式与位置。反之,优选地,所述第二装置2也可以根据所述对象指示信息以及所述视觉关联信息的样式与位置来重新确定所述操作对象的样式与位置等信息,以使得所述应用窗口显示更为清晰,便于操作。
优选地,所述视觉关联信息包括以下至少任一项:
-所述对象指示信息与所述操作对象共用的显示色彩信息,例如,将所述操作对象与所述对象指示信息的全部或部分设置为相同的色彩或色系,以表示两者间的对应关系,如所述操作对象的边框为红色,则所述对象指示信息的主色调也为红色等;
-所述对象指示信息与所述操作对象共用的显示形状信息,例如,所述对象指示信息与所述操作对象为相同的方框、圆形或其他形状,在此,所述“共用”包括但不限于两者的形状大小等完全或部分一致,或是仅是包含了相似的显示形状标识(如两者在右上角共用三角形状的小标识)等;
-所述对象指示信息与所述操作对象共用的显示图标信息,例如,所述对象指示信息与所述操作对象具有相同的头像信息或其他标识信息等;
-所述对象指示信息与所述操作对象之间的连线信息,例如,通过具有相同或不同颜色、线型或其组合的连线,在所述对象指示信息以及所述操作对象之间进行连接。其中,所述线型包括但不限于实线、虚线或如点画线、圆点线等。
优选地,所述设备还包括第三装置(未示出),其中,所述第三装置根据用户对所述对象指示信息的选中操作,选中所述操作对象。
具体地,所述第三装置通过与所述用户进行交互,或者与能够获取所 述用户操作的其他装置进行交互,以获取用户的一个或多个对所述对象指示信息的选中操作;其中,所述选中操作包括但不限于基于点击、触摸、按键等方式进行的选中。
然后,所述第三装置根据所述对象指示信息所对应的操作对象,将所述操作对象作为所选中的目标,从而使得用户的操作更加便捷、高效,用户的使用体验更好。
更优选地,所述设备还包括第四装置(未示出),其中,所述第四装置用于以下至少任一项:
-当所述操作对象被选中时,隐藏所述视觉关联信息和/或所述对象指示信息:即当所述操作对象通过所述对象指示信息被选中时,则在屏幕上隐藏所述视觉关联信息和/或所述对象指示信息,从而既能向用户表明已选中特定的操作对象,同时也不会继续遮挡所述应用窗口的显示内容,提高了屏幕资源利用率;
-当所述操作对象被移出所述应用窗口时,隐藏所述视觉关联信息和/或所述对象指示信息:即当所述操作对象已不再显示于该应用窗口时,则同时将所述视觉关联信息和/或所述对象指示信息进行隐藏,以避免所述应用窗口的显示混乱;
-当所述操作对象被选中且被执行操作后,隐藏所述视觉关联信息和/或所述对象指示信息:即当所述操作对象被选中且执行一个或多个操作后,再隐藏所述视觉关联信息和/或所述对象指示信息;在此,隐藏方法包括但不限于基于特定操作进行隐藏,或者在执行特定数量后的操作后隐藏,或者在所执行的操作满足一定条件后隐藏(如对所述操作对象的状态值的改变满足某一阈值等)等。
优选地,本发明可应用于触摸终端,所述应用窗口显示于触摸终端的触屏上;所述对象指示信息比所述操作对象接近所述应用窗口内的对象操控按钮,或者所述对象指示信息与所述对象操控按钮的距离小于或等于预定的距离阈值信息。
如图5所示,所述操作对象1至操作对象3与其对应的对象指示信息通过箭头线相连,所述箭头线的线型各不相同;所述操作对象1与其对应 的对象指示信息通过相应的颜色相对应,以示出二者的对应关系。所述对象指示信息1至4邻近所述对象操控按钮,所述对象操控按钮即用于对所述操作对象执行具体操作的控制按钮,如使得所述操作对象的位置移动或状态改变等。从而所述用户在选择或移动所述操作对象时,手指移动的范围更小,操作更加便捷、高效,用户体验更好。
或者,所述对象指示信息与所述对象操控按钮的距离小于等于预定的距离阈值信息,如3cm,使得用户通过单指在小幅度内移动即可移动多个所述操作对象,同样实现了操作更加便捷、高效,用户体验更好。
更优选地,所述设备还包括第五装置(未示出)和第六装置(未示出),其中,所述第五装置根据用户对所述对象操控按钮的滑动操作移动所述应用窗口中的所述对象指示信息;所述第六装置根据移动后的所述对象指示信息调整所述操作对象和/或所述视觉关联信息在所述应用窗口中的显示位置。
具体地,所述第五装置根据所述用户对所述对象操控按钮的上滑、下滑或其他方向的滑动操作,对所述应用窗口中的一个或多个所述对象指示信息进行移动;其中,所述移动可以根据所述滑动操作的方向与移动位置来等比例移动,也可以按照缺省设置向同一方向和/或以同一距离移动。
然后,所述第六装置根据移动后的所述对象指示信息,例如根据预定的间距或根据所述应用窗口上的整体排布,相应地调整所述操作对象和/或所述视觉关联信息在所述应用窗口中的显示位置。
优选地,所述操作对象和/或所述视觉关联信息可以被移除或移入当前应用窗口。
图2示出根据本发明一个优选实施例的一种用于操控操作对象的设备示意图;其中,所述设备包括第一装置1’和第二装置2’,所述第二装置2’包括第一单元21’、所述第二单元22’、所述第三单元23’。具体地,所述第一装置1’检测是否有操作对象进入对应的应用窗口;当存在操作对象进入所述应用窗口,所述第一单元21’在所述应用窗口内呈现所述操作对象对应的对象指示信息;所述第二单元22’确定所述对象指示信息与所述操作对象的视觉关联信息;所述第三单元23’在所述应用窗口内呈现所述视 觉关联信息。
其中,所述设备的第一装置1’与图1所示对应装置相同或基本相同,故此处不再赘述,并通过引用的方式包含于此。
具体地,当存在操作对象进入所述应用窗口,所述第一单元21’根据缺省设置,或基于当前应用窗口内所存在的对象指示信息,确定所述操作对象所对应的对象指示信息的形状、颜色等信息,并在在所述应用窗口内呈现所述操作对象对应的对象指示信息;然后,所述第二单元22’根据所述对象指示信息,或者根据当前应用窗口内已经存在的视觉关联信息等,确定所述对象指示信息与所述操作对象的视觉关联信息;所述第三单元23’在所述应用窗口内呈现所述视觉关联信息。
在此,本领域技术人员应能理解,所述第一单元21’与所述第二单元22’可以按照图2所示的顺序执行,也可以并行,或先执行所述第二单元22’的操作后再执行所述第一单元21’的操作。
在此,基于当前应用窗口内已经存在的对象指示信息或视觉关联信息,来确定新的对象指示信息或视觉关联信息的方式包括但不限于将新的对象指示信息或视觉关联信息的形状、颜色和/或位置等与已经存在的对象指示信息或视觉关联信息的形状、颜色和/或位置相同、相近或相反。
优选地,所述第二单元22’可以根据所述操作对象的对象相关信息确定所述对象指示信息与所述操作对象的视觉关联信息。
具体地,所述第二单元22’可以根据所述操作对象的对象相关信息,如名称、等级、权重、类别、当前状态等内容,确定所述视觉关联信息。例如,若存在多个操作对象,则属于相同类别的操作对象则采用同样的视觉关联信息;不同等级的操作对象则采用不同颜色的视觉关联信息等。
优选地,所述第二单元22’可以根据所述应用窗口内的已有视觉关联信息确定所述对象指示信息与所述操作对象的视觉关联信息,其中,所述视觉关联信息与所述已有视觉关联信息在视觉上相区分。
具体地,所述第二单元22’可根据所述应用窗口内已有的视觉关联信息,将新确定的视觉关联信息确定为与其具有不同视觉效果的信息;例如,采用不同的色彩、不同的形状、不同的表达方式(如连线或同型色等)。 从而使得所述应用窗口内的各个视觉关联信息具有区分度,便于用户的辨识与操作。
在此,本领域技术人员应能理解,上述示例仅为举例,并非对本发明的限制,其他的确定所述视觉关联信息的方式如能适用于本发明,同样包含在本发明的保护范围内。
FIG. 3 shows a flowchart of a method for manipulating an operation object according to another aspect of the present invention. Specifically, in step S1, the device detects whether an operation object enters a corresponding application window; when an operation object enters the application window, in step S2, the device presents, in the application window, the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object.
In step S1, the device detects whether an operation object enters the corresponding application window.
Specifically, in step S1, the device may detect whether an operation object enters the corresponding application window based on periodic monitoring or triggered by a specific event.
The operation object includes, but is not limited to, one or more operation targets that the user is currently operating, or one or more operation targets that are about to be operated by the user. An operation target may be fully controlled by the user, or may be jointly controlled by the current user and one or more other users. When at least two users control the operation object, the current user may be the primary controller, for example controlling the movement, main actions and state changes of the operation object, or may be an auxiliary controller, for example controlling only the state changes of the operation object. Here, those skilled in the art should understand that these control modes are merely examples; other ways of controlling the operation object, if applicable to the present invention, are also included within the scope of protection of the present invention and are incorporated herein by reference.
The application window is the content currently displayed on the terminal screen and visible to the operating user; those skilled in the art should understand that the application window may be presented on the entire screen or on part of the screen.
Here, the detection method includes at least any one of the following: when the operation object has fully entered the application window; when the operation object has partially entered the application window; when a specific position of the operation object has entered the application window; or when the operation object is about to enter the application window (for example, the time until the operation object enters the application window is less than a predetermined threshold, such as entering within the next second).
Preferably, when operation objects differ in attributes such as category or controller, different detection methods may be applied to them based on these attributes. For example, if the category of the operation object is a character, or the primary controller of the operation object is not the current user, an operation object is detected as having entered the corresponding application window as soon as any part of the character enters the application window; if the category of the operation object is not a character, or the primary controller of the operation object is the current user, it is detected as having entered the corresponding application window only when a specific position of the operation object (such as its center of gravity) enters the application window.
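The detection alternatives and the attribute-dependent choice of rule described above could be sketched in Kotlin as follows; this is illustrative only, and the window and object geometry, the "character" category string and the one-second threshold are assumptions:

```kotlin
// Illustrative sketch only: the detection alternatives described above, plus the
// attribute-dependent choice between them. Geometry, the "character" category
// string and the one-second threshold are assumptions.

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
    fun intersects(o: Rect) = left < o.right && o.left < right && top < o.bottom && o.top < bottom
}

data class Obj(
    val bounds: Rect,
    val gravityX: Float,            // "specific position", e.g. the centre of gravity
    val gravityY: Float,
    val category: String,
    val controlledByCurrentUser: Boolean,
    val secondsUntilEntry: Float?,  // null when the object is not approaching the window
)

fun fullyInside(o: Obj, w: Rect) =
    w.contains(o.bounds.left, o.bounds.top) && w.contains(o.bounds.right, o.bounds.bottom)

fun partlyInside(o: Obj, w: Rect) = o.bounds.intersects(w)

fun gravityInside(o: Obj, w: Rect) = w.contains(o.gravityX, o.gravityY)

fun aboutToEnter(o: Obj, thresholdSeconds: Float = 1f) =
    (o.secondsUntilEntry ?: Float.MAX_VALUE) < thresholdSeconds

// Attribute-dependent rule from the example above: characters, or objects whose
// primary controller is not the current user, count as soon as any part enters;
// otherwise the centre of gravity must enter.
fun hasEntered(o: Obj, window: Rect): Boolean =
    if (o.category == "character" || !o.controlledByCurrentUser) partlyInside(o, window)
    else gravityInside(o, window)

fun main() {
    val window = Rect(0f, 0f, 1080f, 1920f)
    val enemy = Obj(Rect(-50f, 100f, 30f, 180f), -10f, 140f, "character", false, null)
    println(hasEntered(enemy, window))  // true: part of the character is already inside
}
```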
Here, those skilled in the art should understand that the above detection methods are merely examples and do not limit the present invention; other methods for detecting whether an operation object enters the corresponding application window are likewise applicable to the present invention, are not described again here, and are also included within the scope of protection of the present invention.
When an operation object enters the application window, in step S2, the device presents, in the application window, the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object.
Specifically, when step S1 detects that an operation object enters the application window, in step S2 the device first determines the style and position of the object indication information and of the visual association information, and then presents them in the application window based on the determined style and position.
Here, the determination methods include, but are not limited to: based on default settings; based on attribute settings of the operation object, such as its category and/or controller; based on the current position information of the operation object; or based on the various kinds of information already presented in the current application window (including, but not limited to, any of the operation objects, object indication information and visual association information, or other information).
For example, in the default settings, the object indication information may be set as an icon identifier smaller than the operation object, so that the object indication information occupies a smaller display area and screen resource utilization is improved; the visual association information may be set as a connection line between the icon identifier and the operation object, or the icon identifier and the operation object may be set to the same color, shape and the like, to show the association between the object indication information and the operation object.
Alternatively, for example, the icon identifiers of operation objects belonging to the same category or to the same controller may be set to similar shapes or similar colors; or, for example, the display position of the object indication information on the screen may be set according to the current position information of the operation object; or, for example, according to the color or display mode (such as a connection-line mode or a similar-shape mode) of the visual association information already presented in the current application window, newly set visual association information may be given a display mode similar to that of the presented visual association information but a different color, and so on.
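For illustration, the determination of the indicator's style and position from the sources listed above (default size, category, current object position, and the colours already in use) might look like the following Kotlin sketch; the 40% size factor, the shape mapping, the palette and the fixed indicator row are assumptions:

```kotlin
// Illustrative sketch only: choosing the indicator's style and position from the
// sources listed above. The 40% size factor, the shape mapping, the palette and
// the fixed indicator row are assumptions.

data class Obj(val id: Int, val x: Float, val y: Float, val category: String)
data class IndicatorStyle(val sizePx: Float, val shape: String, val color: String, val x: Float, val y: Float)

private val palette = listOf("red", "blue", "green", "orange")

fun determineIndicator(obj: Obj, objectSizePx: Float, usedColors: Set<String>): IndicatorStyle {
    // Default setting: an icon smaller than the operation object, to save screen area.
    val size = objectSizePx * 0.4f
    // Shape follows the category, so objects of one category get similar indicators.
    val shape = if (obj.category == "character") "circle" else "square"
    // Colour: the first palette entry not already used by a presented association.
    val color = palette.firstOrNull { it !in usedColors } ?: palette.first()
    // Position: follows the object's current horizontal position, on a fixed indicator row.
    return IndicatorStyle(size, shape, color, x = obj.x, y = 40f)
}

fun main() {
    val style = determineIndicator(Obj(1, 320f, 180f, "character"), objectSizePx = 96f, usedColors = setOf("red"))
    println(style)  // a 38.4 px blue circle at (320, 40)
}
```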
Here, those skilled in the art should understand that the above display modes are merely examples and do not limit the present invention; other ways of indicating the object indication information corresponding to the operation object and the visual association information between the object indication information and the operation object are likewise applicable to the present invention, are not described again here, and are also included within the scope of protection of the present invention.
Preferably, when an operation object enters the application window, in step S2 the device may present, in the application window, the operation object, the object indication information corresponding to the operation object, and the visual association information between the object indication information and the operation object.
Specifically, in step S2 the device may also present the operation object in the application window, so that the application window simultaneously presents the operation object, the object indication information corresponding to the operation object, and the visual association information between the object indication information and the operation object. Here, those skilled in the art should understand that if, in step S1, the device detects the operation object only after it has entered the application window, the operation object is displayed in the application window directly once step S1 confirms that it has entered; if, in step S1, the device can detect the operation object before it has entered the application window, the operation object is displayed in the application window after it has entered.
Here, the manner in which the device presents the object indication information and the visual association information in step S2 is the same as or similar to that of step S2 described above with reference to FIG. 3, so it is not described again here and is incorporated herein by reference.
In step S2, the device may determine the style and position of the object indication information and the visual association information according to the operation object. Conversely, and preferably, in step S2 the device may also re-determine information such as the style and position of the operation object according to the style and position of the object indication information and the visual association information, so that the application window is displayed more clearly and is easier to operate.
Preferably, the visual association information includes at least any one of the following:
- display color information shared by the object indication information and the operation object, for example, setting all or part of the operation object and the object indication information to the same color or color family to indicate the correspondence between the two; for instance, if the border of the operation object is red, the main tone of the object indication information is also red;
- display shape information shared by the object indication information and the operation object, for example, the object indication information and the operation object being the same box, circle or other shape; here, "shared" includes, but is not limited to, the two being completely or partially identical in shape and size, or merely carrying a similar shape identifier (e.g., both bearing a small triangular badge in the upper-right corner);
- display icon information shared by the object indication information and the operation object, for example, the object indication information and the operation object having the same avatar or other identifying information;
- connection line information between the object indication information and the operation object, for example, connecting the object indication information and the operation object with lines of the same or different colors, line styles, or combinations thereof, where the line styles include, but are not limited to, solid lines, dashed lines, dash-dot lines, dotted lines, and the like.
Preferably, the method further comprises step S3 (not shown), wherein in step S3 the device selects the operation object according to the user's selection operation on the object indication information.
Specifically, in step S3 the device interacts with the user, or with another apparatus capable of obtaining the user's operations, to obtain one or more selection operations performed by the user on the object indication information; the selection operations include, but are not limited to, selection by clicking, touching, pressing a key, and the like.
Then, in step S3, the device takes the operation object corresponding to the object indication information as the selected target, which makes the user's operations more convenient and efficient and improves the user experience.
More preferably, the method further comprises step S4 (not shown), wherein in step S4 the device performs at least any one of the following:
- hiding the visual association information and/or the object indication information when the operation object is selected: that is, when the operation object is selected via the object indication information, the visual association information and/or the object indication information is hidden on the screen, which both indicates to the user that a specific operation object has been selected and avoids continuing to block the display content of the application window, thereby improving screen resource utilization;
- hiding the visual association information and/or the object indication information when the operation object is moved out of the application window: that is, when the operation object is no longer displayed in the application window, the visual association information and/or the object indication information is hidden at the same time, so as to avoid cluttering the display of the application window;
- hiding the visual association information and/or the object indication information after the operation object has been selected and operated on: that is, the visual association information and/or the object indication information is hidden only after the operation object has been selected and one or more operations have been performed on it; here, the hiding methods include, but are not limited to, hiding upon a specific operation, hiding after a specific number of operations have been performed, or hiding after the performed operations satisfy a certain condition (e.g., the change in the state value of the operation object reaches a certain threshold).
Preferably, the present invention may be applied to a touch terminal, where the application window is displayed on the touch screen of the touch terminal; the object indication information is closer than the operation object to an object manipulation button in the application window, or the distance between the object indication information and the object manipulation button is less than or equal to predetermined distance threshold information.
As shown in FIG. 5, operation objects 1 to 3 are connected to their corresponding object indication information by arrow lines, and the line styles of the arrow lines differ from one another; operation object 1 and its corresponding object indication information are matched by corresponding colors to show the correspondence between the two. Object indication information 1 to 4 is adjacent to the object manipulation button, which is the control button used to perform specific operations on the operation object, such as moving the position of the operation object or changing its state. As a result, when selecting or moving the operation object, the user's finger moves over a smaller range, the operation is more convenient and efficient, and the user experience is better.
Alternatively, the distance between the object indication information and the object manipulation button is less than or equal to predetermined distance threshold information, such as 3 cm, so that the user can move multiple operation objects by moving a single finger within a small range, which likewise makes the operation more convenient and efficient and improves the user experience.
More preferably, the method further comprises step S5 (not shown) and step S6 (not shown), wherein in step S5 the device moves the object indication information in the application window according to the user's sliding operation on the object manipulation button, and in step S6 the device adjusts the display position of the operation object and/or the visual association information in the application window according to the moved object indication information.
Specifically, in step S5 the device moves one or more pieces of the object indication information in the application window according to the user's upward, downward or other sliding operation on the object manipulation button; the movement may be proportional to the direction and displacement of the sliding operation, or may follow a default setting by moving in the same direction and/or by the same distance.
Then, in step S6, the device adjusts the display position of the operation object and/or the visual association information in the application window accordingly, according to the moved object indication information, for example according to a predetermined spacing or according to the overall layout of the application window.
Preferably, the operation object and/or the visual association information may be moved out of or into the current application window.
FIG. 4 shows a flowchart of a method for manipulating an operation object according to a preferred embodiment of the present invention. Specifically, in step S1', the device detects whether an operation object enters a corresponding application window; when an operation object enters the application window, in step S21', the device presents, in the application window, the object indication information corresponding to the operation object; in step S22', the device determines the visual association information between the object indication information and the operation object; and in step S23', the device presents the visual association information in the application window.
Step S1' of this method is identical or substantially identical to the corresponding step shown in FIG. 3, so it is not described again here and is incorporated herein by reference.
Specifically, when an operation object enters the application window, in step S21' the device determines the shape, color and other information of the object indication information corresponding to the operation object according to a default setting, or based on the object indication information already present in the current application window, and presents the object indication information corresponding to the operation object in the application window; then, in step S22', the device determines the visual association information between the object indication information and the operation object according to the object indication information, or according to the visual association information already present in the current application window; and in step S23', the device presents the visual association information in the application window.
Here, those skilled in the art should understand that step S21' and step S22' may be performed in the order shown in FIG. 4, may be performed in parallel, or the operation of step S22' may be performed before that of step S21'.
Here, determining new object indication information or visual association information based on the object indication information or visual association information already present in the current application window includes, but is not limited to, making the shape, color and/or position of the new object indication information or visual association information the same as, similar to, or opposite to the shape, color and/or position of the existing object indication information or visual association information.
Preferably, in step S22', the device may determine the visual association information between the object indication information and the operation object according to object-related information of the operation object.
Specifically, in step S22', the device may determine the visual association information according to the object-related information of the operation object, such as its name, level, weight, category, current state, and the like. For example, if there are multiple operation objects, operation objects belonging to the same category use the same visual association information, while operation objects of different levels use visual association information of different colors.
Preferably, in step S22', the device may determine the visual association information between the object indication information and the operation object according to existing visual association information in the application window, where the visual association information is visually distinguished from the existing visual association information.
Specifically, in step S22', the device may, based on the existing visual association information in the application window, determine the newly determined visual association information as information with a different visual effect, for example using different colors, different shapes, or different forms of expression (such as connection lines or matching colors), so that the individual pieces of visual association information in the application window are distinguishable, which facilitates the user's recognition and operation.
Here, those skilled in the art should understand that the above examples are merely illustrative and do not limit the present invention; other ways of determining the visual association information, if applicable to the present invention, are also included within the scope of protection of the present invention.
It should be noted that the present invention may be implemented in software and/or a combination of software and hardware; for example, the devices of the present invention may be implemented using an application-specific integrated circuit (ASIC) or any other similar hardware device. In one embodiment, the software program of the present invention may be executed by a processor to implement the steps or functions described above. Likewise, the software program of the present invention (including related data structures) may be stored in a computer-readable recording medium, for example a RAM memory, a magnetic or optical drive, a floppy disk, or a similar device. In addition, some steps or functions of the present invention may be implemented in hardware, for example as a circuit that cooperates with a processor to perform the individual steps or functions.
It is apparent to those skilled in the art that the present invention is not limited to the details of the above exemplary embodiments and that it can be implemented in other specific forms without departing from the spirit or essential characteristics of the present invention. The embodiments should therefore be regarded in all respects as exemplary and non-restrictive, and the scope of the present invention is defined by the appended claims rather than by the above description; it is therefore intended that all changes falling within the meaning and range of equivalency of the claims be embraced within the present invention. No reference sign in the claims should be construed as limiting the claim concerned. Moreover, it is clear that the word "comprise" does not exclude other units or steps, and the singular does not exclude the plural. Multiple units or devices recited in a system claim may also be implemented by a single unit or device through software or hardware. Words such as "first" and "second" are used to denote names and do not denote any particular order.

Claims (20)

  1. A method for manipulating an operation object, wherein the method comprises:
    a. detecting whether an operation object enters a corresponding application window;
    b. when an operation object enters the application window, presenting, in the application window, object indication information corresponding to the operation object and visual association information between the object indication information and the operation object.
  2. The method according to claim 1, wherein the step b comprises:
    when an operation object enters the application window, presenting, in the application window, the operation object, the object indication information corresponding to the operation object, and the visual association information between the object indication information and the operation object.
  3. The method according to claim 1 or 2, wherein the step b comprises:
    b1. when an operation object enters the application window, presenting, in the application window, the object indication information corresponding to the operation object;
    b2. determining the visual association information between the object indication information and the operation object;
    b3. presenting the visual association information in the application window.
  4. The method according to claim 3, wherein the step b2 comprises:
    determining the visual association information between the object indication information and the operation object according to object-related information of the operation object.
  5. The method according to claim 3 or 4, wherein the step b2 comprises:
    determining the visual association information between the object indication information and the operation object according to existing visual association information in the application window, wherein the visual association information is visually distinguished from the existing visual association information.
  6. The method according to any one of claims 1 to 5, wherein the method further comprises:
    selecting the operation object according to a user's selection operation on the object indication information.
  7. The method according to claim 6, wherein the method further comprises:
    hiding the visual association information and/or the object indication information when the operation object is selected; or
    hiding the visual association information and/or the object indication information when the operation object is moved out of the application window; or
    hiding the visual association information and/or the object indication information after the operation object is selected and operated on.
  8. The method according to any one of claims 1 to 7, wherein the visual association information comprises at least any one of the following:
    display color information shared by the object indication information and the operation object;
    display shape information shared by the object indication information and the operation object;
    display icon information shared by the object indication information and the operation object;
    connection line information between the object indication information and the operation object.
  9. The method according to any one of claims 1 to 8, wherein the application window is displayed on a touch screen of a touch terminal; the object indication information is closer than the operation object to an object manipulation button in the application window, or a distance between the object indication information and the object manipulation button is less than or equal to predetermined distance threshold information.
  10. The method according to claim 9, wherein the method further comprises:
    moving the object indication information in the application window according to a user's sliding operation on the object manipulation button;
    adjusting a display position of the operation object and/or the visual association information in the application window according to the moved object indication information.
  11. A device for manipulating an operation object, wherein the device comprises:
    a first device, configured to detect whether an operation object enters a corresponding application window;
    a second device, configured to, when an operation object enters the application window, present, in the application window, object indication information corresponding to the operation object and visual association information between the object indication information and the operation object.
  12. The device according to claim 11, wherein the second device is configured to:
    when an operation object enters the application window, present, in the application window, the operation object, the object indication information corresponding to the operation object, and the visual association information between the object indication information and the operation object.
  13. The device according to claim 11 or 12, wherein the second device comprises:
    a first unit, configured to, when an operation object enters the application window, present, in the application window, the object indication information corresponding to the operation object;
    a second unit, configured to determine the visual association information between the object indication information and the operation object;
    a third unit, configured to present the visual association information in the application window.
  14. The device according to claim 13, wherein the second unit is configured to:
    determine the visual association information between the object indication information and the operation object according to object-related information of the operation object.
  15. The device according to claim 13 or 14, wherein the second unit is configured to:
    determine the visual association information between the object indication information and the operation object according to existing visual association information in the application window, wherein the visual association information is visually distinguished from the existing visual association information.
  16. The device according to any one of claims 11 to 15, wherein the device further comprises:
    a third device, configured to select the operation object according to a user's selection operation on the object indication information.
  17. The device according to claim 16, wherein the device further comprises a fourth device configured to:
    hide the visual association information and/or the object indication information when the operation object is selected; or
    hide the visual association information and/or the object indication information when the operation object is moved out of the application window; or
    hide the visual association information and/or the object indication information after the operation object is selected and operated on.
  18. The device according to any one of claims 11 to 17, wherein the visual association information comprises at least any one of the following:
    display color information shared by the object indication information and the operation object;
    display shape information shared by the object indication information and the operation object;
    display icon information shared by the object indication information and the operation object;
    connection line information between the object indication information and the operation object.
  19. The device according to any one of claims 11 to 18, wherein the application window is displayed on a touch screen of a touch terminal; the object indication information is closer than the operation object to an object manipulation button in the application window, or a distance between the object indication information and the object manipulation button is less than or equal to predetermined distance threshold information.
  20. The device according to claim 19, wherein the device further comprises:
    a fifth device, configured to move the object indication information in the application window according to a user's sliding operation on the object manipulation button;
    a sixth device, configured to adjust a display position of the operation object and/or the visual association information in the application window according to the moved object indication information.
PCT/CN2016/106677 2015-12-01 2016-11-21 Method and device for manipulating an operation object WO2017092584A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510868886.6A CN105278840A (zh) 2015-12-01 2015-12-01 一种用于操控操作对象的方法与设备
CN201510868886.6 2015-12-01

Publications (1)

Publication Number Publication Date
WO2017092584A1 true WO2017092584A1 (zh) 2017-06-08

Family

ID=55147937

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/106677 WO2017092584A1 (zh) 2015-12-01 2016-11-21 一种用于操控操作对象的方法与设备

Country Status (2)

Country Link
CN (1) CN105278840A (zh)
WO (1) WO2017092584A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105278840A (zh) * 2015-12-01 2016-01-27 上海逗屋网络科技有限公司 Method and device for manipulating an operation object
CN106528032A (zh) * 2016-12-05 2017-03-22 上海逗屋网络科技有限公司 Object display method and device
CN111552429B (zh) * 2020-04-29 2021-07-23 杭州海康威视数字技术股份有限公司 Graphic selection method and apparatus, and electronic device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793164A (zh) * 2012-10-31 2014-05-14 国际商业机器公司 Method, apparatus and browser for touch screen display processing
CN104462418A (zh) * 2014-12-11 2015-03-25 小米科技有限责任公司 Page display method and apparatus, and electronic device
CN104699399A (zh) * 2015-02-16 2015-06-10 上海逗屋网络科技有限公司 Method and device for determining a target operation object on a touch terminal
CN105278840A (zh) * 2015-12-01 2016-01-27 上海逗屋网络科技有限公司 Method and device for manipulating an operation object

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101398087B1 (ko) * 2012-11-08 2014-05-27 (주)위메이드엔터테인먼트 Method and apparatus for correcting user input on a touch screen, and method for correcting user input in an online game
CN103389867A (zh) * 2013-07-03 2013-11-13 珠海金山办公软件有限公司 Method and system for switching a selected target on a mobile device
CN103927094B (zh) * 2014-03-28 2017-03-01 联想(北京)有限公司 Information processing method, apparatus and electronic device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793164A (zh) * 2012-10-31 2014-05-14 国际商业机器公司 Method, apparatus and browser for touch screen display processing
CN104462418A (zh) * 2014-12-11 2015-03-25 小米科技有限责任公司 Page display method and apparatus, and electronic device
CN104699399A (zh) * 2015-02-16 2015-06-10 上海逗屋网络科技有限公司 Method and device for determining a target operation object on a touch terminal
CN105278840A (zh) * 2015-12-01 2016-01-27 上海逗屋网络科技有限公司 Method and device for manipulating an operation object

Also Published As

Publication number Publication date
CN105278840A (zh) 2016-01-27

Similar Documents

Publication Publication Date Title
US10656821B2 (en) Moving an object by drag operation on a touch panel
US10684768B2 (en) Enhanced target selection for a touch-based input enabled user interface
US10627990B2 (en) Map information display device, map information display method, and map information display program
JP6037973B2 (ja) Automatic switching between input modes for a user interface
US10042546B2 (en) Systems and methods to present multiple frames on a touch screen
US10444951B2 (en) Method and device for identifying a left-hand or a right-hand mode of operation on a user handheld device
US20160004373A1 (en) Method for providing auxiliary information and touch control display apparatus using the same
US20120054671A1 (en) Multi-touch interface gestures for keyboard and/or mouse inputs
US9632693B2 (en) Translation of touch input into local input based on a translation profile for an application
US9207767B2 (en) Guide mode for gesture spaces
EP2960763A1 (en) Computerized systems and methods for cascading user interface element animations
WO2017092584A1 (zh) Method and device for manipulating an operation object
KR20160083691A (ko) 컨텐츠 선택 방법 및 그 전자 장치
US10698566B2 (en) Touch control based application launch
WO2017107725A1 (zh) Method and device for controlling an operation interface
US10394442B2 (en) Adjustment of user interface elements based on user accuracy and content consumption
JP2016085523A (ja) Method for displaying nodes, and computer and computer program for displaying nodes
US20090213067A1 (en) Interacting with a computer via interaction with a projected image
WO2018098960A1 (zh) Touch screen device operation method and touch screen device
CN108920230B (zh) Method, apparatus, device and storage medium for responding to a mouse hover operation
CN104346095A (zh) Information processing method and electronic device
US20130067408A1 (en) Contextually applicable commands
JP6662861B2 (ja) Hit testing to determine enabling of direct manipulation in response to user actions
WO2016081280A1 (en) Method and system for mouse pointer to automatically follow cursor
JP2014137616A (ja) Display control device, display control system and display control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16869894

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 091018)

122 Ep: pct application non-entry in european phase

Ref document number: 16869894

Country of ref document: EP

Kind code of ref document: A1