WO2020192544A1 - Method for selecting an interactive object on a display medium of a device - Google Patents


Info

Publication number
WO2020192544A1
Authority
WO
WIPO (PCT)
Prior art keywords
interactive object
interactive
reference position
display medium
display
Prior art date
Application number
PCT/CN2020/080161
Other languages
English (en)
Chinese (zh)
Inventor
牛旭恒
方俊
李江亮
Original Assignee
北京外号信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京外号信息技术有限公司
Publication of WO2020192544A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques using icons
    • G06F 3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Definitions

  • The present invention relates to an interaction method and, more specifically, to a method for selecting an interactive object on a display medium of a device.
  • One aspect of the present invention relates to a method for selecting an interactive object on a display medium of a device, wherein a position on the display medium is set as a reference position, one or more interactive objects are displayed on the display medium and can be selected, and the display position of the one or more interactive objects on the display medium changes as the position or posture of the device changes. The method includes: obtaining the display position of the one or more interactive objects on the display medium; and automatically selecting an interactive object from the one or more interactive objects according to the obtained display position and the reference position.
  • Obtaining the display position of the one or more interactive objects on the display medium may include: determining the display position of the one or more interactive objects on the display medium during and/or after the change of the position or posture of the device.
  • Obtaining the display position of the one or more interactive objects on the display medium may include: repeatedly obtaining the display position of the one or more interactive objects on the display medium.
  • Automatically selecting an interactive object from the one or more interactive objects according to the obtained display position and the reference position may include: determining the distance between each interactive object and the reference position according to the last obtained display position of the interactive object and the reference position; and automatically selecting an interactive object according to the determined distance.
  • Automatically selecting an interactive object according to the determined distance may include: selecting an interactive object if the distance between the interactive object and the reference position is less than a threshold.
  • In one case, the interactive object closest to the reference position is selected.
  • Determining the distance between each interactive object and the reference position according to the last obtained display position of the interactive object and the reference position may include: repeatedly determining the distance between the interactive object and the reference position according to the last obtained display position of the interactive object and the reference position.
  • Automatically selecting an interactive object according to the determined distance may include: selecting an interactive object if the distance between the interactive object and the reference position keeps decreasing and the current distance is less than a threshold.
  • Automatically selecting an interactive object from the one or more interactive objects according to the obtained display position and the reference position may include: determining the direction of the reference position relative to each interactive object according to the display position of the interactive object on the display medium; determining the moving direction of each interactive object on the display medium according to the change of the display position of the interactive object on the display medium; and automatically selecting an interactive object according to the direction of the reference position relative to the interactive object and the moving direction of the interactive object on the display medium.
  • Automatically selecting an interactive object according to the direction of the reference position relative to the interactive object and the moving direction of the interactive object on the display medium may include: selecting the interactive object whose moving direction on the display medium is closest to the direction of the reference position relative to that interactive object.
  • Automatically selecting an interactive object from the one or more interactive objects according to the obtained display position and the reference position may include: determining the direction of the reference position relative to each interactive object according to the display position of the interactive object on the display medium; determining the moving direction of each interactive object on the display medium according to the change of its display position on the display medium; determining the distance between each interactive object and the reference position according to the last obtained display position of the interactive object and the reference position; and automatically selecting an interactive object according to the direction of the reference position relative to the interactive object, the moving direction of the interactive object on the display medium, and the distance between the interactive object and the reference position.
  • the reference position is a fixed position on the display medium.
  • A position on the display medium may be set as the reference position in one of the following ways: setting a preset or default position on the display medium as the reference position; setting the reference position according to a user instruction; or setting the reference position according to the display position of the currently selected interactive object on the display medium.
  • Another aspect of the present invention relates to a storage medium in which a computer program is stored, and when the computer program is executed by a processor, it can be used to implement the above method.
  • Another aspect of the present invention relates to an electronic device, which includes a processor and a memory, wherein a computer program is stored in the memory, and when the computer program is executed by the processor, it can be used to implement the above method.
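The distance-based selection summarized in the claims above can be sketched as follows. This is a minimal illustration only, not the patent's implementation; the function and parameter names are hypothetical, and display positions are assumed to be given in screen coordinates:

```python
import math

def select_interactive_object(display_positions, reference_position, threshold):
    """Select the interactive object whose display position is nearest to the
    reference position, provided it lies within the threshold distance.

    display_positions: dict mapping object id -> (x, y) display position
    reference_position: (x, y) reference position on the display medium
    Returns the selected object id, or None if no object qualifies.
    """
    best_id, best_dist = None, None
    for obj_id, (x, y) in display_positions.items():
        dist = math.hypot(x - reference_position[0], y - reference_position[1])
        if dist < threshold and (best_dist is None or dist < best_dist):
            best_id, best_dist = obj_id, dist
    return best_id
```

Letting the nearest qualifying object win also covers the case where several objects fall within the threshold at once.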
  • Figure 1 shows an exemplary optical label;
  • Figure 2 shows an exemplary optical label network;
  • Figure 3 shows a screenshot of the display medium of a device;
  • Figure 4 schematically shows a display medium of a device according to an embodiment;
  • Figure 5 shows a method for selecting interactive objects on the display medium shown in Figure 4 according to an embodiment;
  • Figure 6 schematically shows the display medium after changing the position or posture of the device according to an embodiment;
  • Figure 7 schematically shows the display medium shown in Figure 6 with a virtual circle;
  • Figure 8 schematically shows the display medium after changing the position or posture of the device according to an embodiment;
  • Figure 9 schematically shows a diagram of the change of the display positions of the interactive objects on the display medium when the position or posture of the device changes according to an embodiment;
  • Figure 10 schematically shows a diagram of the change of the display position of an interactive object on a display medium when the position or posture of the device changes according to an embodiment;
  • Figure 11 schematically shows a display medium according to an embodiment;
  • Figure 12 schematically shows a diagram of the change of the display position of the interactive object on the display medium when the position or posture of the device changes according to an embodiment;
  • Figure 13 schematically shows a display medium according to an embodiment;
  • Figure 14 shows a method for selecting interactive objects on a display medium according to an embodiment.
  • interactive objects may refer to various objects displayed on the display medium of the device that can be operated by the user.
  • the display position of the interactive objects on the display medium will change with the change of the position or posture of the device.
  • the device may be various computing devices or electronic devices that have a display medium or can communicate with the display medium, such as mobile phones, tablet computers, smart glasses, smart helmets, smart watches, and so on.
  • the display medium can be integrated into the device and moved with the device.
  • The display medium can also be a component separate from the device that can communicate with the device (for example, the display medium can receive information about the scene to be presented from the device). In this case, when the device's position or posture changes, the scene presented on the display medium will change accordingly, but the display medium itself can keep its position or posture unchanged.
  • The display medium may be, for example, an electronic screen, a projection screen, etc., but it is not limited thereto; it may be any of various media that can be used to display interactive objects. For example, in some smart glasses, an image containing an interactive object is projected through a projector onto a prism, lens, mirror, etc., and finally projected into the human eye; these prisms, lenses, mirrors, etc. are also considered display media.
  • The interactive objects on the display medium of the device may be associated with actual objects in the real-world scene, but may also be virtual objects displayed on the display medium of the device (for example, virtual objects displayed in virtual reality or augmented reality applications).
  • the expression form of the interactive object may be, for example, icons, signs, texts, graphics, virtual characters, virtual objects, etc. displayed on the display medium.
  • the following uses the icon of the optical communication device displayed on the device display medium as the interactive object for illustration.
  • Optical communication devices are also called optical tags, and these two terms can be used interchangeably in this article.
  • Optical tags can transmit information by emitting different light. They have the advantages of a long recognition distance, relaxed requirements on visible-light conditions, and strong directivity, and the information transmitted by an optical tag can change over time, providing large information capacity and flexible configuration capability. Compared with the traditional two-dimensional code, the optical label has a longer recognition distance and stronger information interaction capability, which can provide users with great convenience.
  • the optical tag usually includes a controller and at least one light source, and the controller can drive the light source through different driving modes to transmit different information outward.
  • Fig. 1 shows an exemplary optical label 100, which includes three light sources (respectively a first light source 101, a second light source 102, and a third light source 103).
  • the optical label 100 also includes a controller (not shown in FIG. 1) for selecting a corresponding driving mode for each light source according to the information to be transmitted.
  • The controller can use different driving signals to control the light-emitting mode of each light source, so that when the optical label 100 is photographed by a device with an imaging function, the image of the light source therein can present different appearances (for example, different colors, patterns, or brightness).
  • By analyzing the imaging of the light sources, the driving mode of each light source at a given moment can be determined, and thus the information transmitted by the optical label 100 at that moment can be parsed.
  • Each optical tag can be assigned identification information (ID), which is used by the manufacturer, manager, or user of the optical tag to uniquely identify the optical tag.
  • The controller in the optical tag can drive the light source to transmit the identification information outward, and the user can use the device to perform image capture on the optical tag to obtain the identification information transmitted by it, so that a corresponding service can be accessed based on the identification information, for example, accessing a webpage associated with the identification information of the optical tag, obtaining other information associated with the identification information (for example, the location information of the optical tag corresponding to the identification information), and so on.
  • The device can obtain multiple images containing the optical label through continuous image acquisition of the optical label with its camera, and identify the information transmitted by the optical label by analyzing the imaging of the optical label (or of each light source in the optical label) in each image.
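The frame-by-frame decoding described above might be sketched as follows. This is a deliberately simplified illustration, not the patent's scheme: it assumes a single light source transmitting one bit per frame through its imaged brightness, and the threshold and helper names are hypothetical:

```python
def decode_optical_label(frame_brightness, threshold=128):
    """Decode a bit sequence from the imaged brightness of a single light
    source across consecutive frames: a bright frame -> 1, a dark frame -> 0."""
    return [1 if b >= threshold else 0 for b in frame_brightness]

def bits_to_id(bits):
    """Interpret the decoded bit sequence as an integer identifier (ID)."""
    value = 0
    for bit in bits:
        value = (value << 1) | bit
    return value
```

A real decoder would additionally handle frame synchronization, multiple light sources, and appearance attributes other than brightness (colors, patterns, etc.).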
  • the identification information (ID) of the optical tag and other information can be stored in the server.
  • Fig. 2 shows an exemplary optical label network.
  • the optical label network includes a plurality of optical labels and at least one server, wherein the information related to each optical label can be stored on the server.
  • The identification information (ID) of each optical label and other information can be saved on the server, such as service information related to the optical label and description information or attributes related to the optical label, for example its location information, physical size information, physical shape information, orientation information, etc.
  • the device can use the identified identification information of the optical label to query the server to obtain other information related to the optical label.
  • the location information of the optical tag may refer to the actual location of the optical tag in the physical world, which may be indicated by geographic coordinate information.
  • the server may be a software program running on a computing device, a computing device, or a cluster composed of multiple computing devices.
  • Fig. 3 shows a screenshot of the display medium of the device.
  • The upper part of the display medium of the device is used to display icons corresponding to optical labels (as interactive objects). In the screenshot, two circular icons are shown in the upper part of the display medium, corresponding to two optical labels in the field of view of the camera of the device.
  • Fig. 4 schematically shows a display medium of a device according to an embodiment.
  • the four interactive objects 41, 42, 43, 44 are shown as four circular icons on the display medium.
  • a cross-shaped mark is also displayed on the display medium, which is used to indicate a reference position on the display medium.
  • In this embodiment, the reference position is the center position of the display medium; however, it is understandable that the reference position can also be another position on the display medium.
  • the reference position is preset or default, and can always remain unchanged.
  • the reference position may also be specified by the user of the device. For example, the user may specify a certain position on the display medium as the reference position by clicking on it. Once the reference position is determined and not specified again, it will not change as the position or posture of the device changes.
  • The four interactive objects 41, 42, 43, 44 can be associated with actual objects (such as optical tags) in the real-world scene, or can be virtual objects displayed on the device's display medium (such as virtual objects shown in virtual reality or augmented reality applications).
  • The display position of the interactive objects 41, 42, 43, 44 on the display medium will change as the position or posture of the device changes. For example, when the interactive objects 41, 42, 43, 44 correspond to actual objects in the real-world scene, the field of view of the device's camera will change as the position or posture of the device changes, so that the display positions of those actual objects on the device's display medium will change; accordingly, the display positions of the interactive objects 41, 42, 43, 44 corresponding to these actual objects will also change on the display medium.
  • When the interactive objects are virtual objects, their display positions will change with changes in the position or posture of the device (such as smart glasses, a smart helmet, etc.).
  • the mark for indicating the reference position may not be displayed on the display medium to avoid interference with normal display.
  • Fig. 5 shows a method for selecting an interactive object on the display medium shown in Fig. 4 according to an embodiment, wherein a position in the display medium is determined as a reference position.
  • the center position of the display medium of the device is preset as the reference position.
  • the method shown in Figure 5 includes:
  • Step 501: Obtain the display position of each interactive object on the display medium.
  • the device can determine the display position of each interactive object on the display medium during or after the change of its position or posture.
  • the display position of the interactive object on the display medium can be determined according to the position of the actual object detected by the device in the field of view of the device.
  • When the interactive object is a virtual object displayed on the display medium of the device (for example, a virtual object displayed in a virtual reality or augmented reality application), the position or posture changes of the device (such as smart glasses or a smart helmet) can be tracked based on various sensors built into the device (for example, accelerometers, gyroscopes, etc.), so as to obtain the display position of the interactive object on the display medium of the device.
  • An interactive object usually occupies an area (such as a circular area) on the display medium; therefore, in order to conveniently indicate its display position, in one embodiment its center position can be selected as its display position, although other methods are also feasible.
  • Fig. 6 schematically shows the display medium after changing the position or posture of the device according to an embodiment.
  • the device may also periodically or repeatedly obtain the display position of each interactive object on the display medium.
  • The device can repeatedly obtain the display position of each interactive object on the display medium at a certain time interval (for example, every 0.05 seconds, every 0.1 seconds, every 0.2 seconds, every 0.3 seconds, every 0.5 seconds, every 1 second, etc.).
  • During the change of the position or posture of the device, some interactive objects may leave the display range of the display medium of the device (that is, leave the field of view of the device); in this case, the display positions of these departing interactive objects may no longer be determined. Other interactive objects may enter the display range of the display medium of the device (that is, enter the field of view of the device); in this case, the display positions of these entering interactive objects can be obtained.
  • Step 502: Select an interactive object according to the obtained display position and the reference position.
  • the device may select the interactive object according to the display position and the reference position. After determining the display position of the interactive object on the display medium each time, the device may select the interactive object according to the determined display position and the reference position. The device may also periodically or repeatedly select the interaction object according to the determined display position and the reference position.
  • the display position may be, for example, the final display position of the interactive object, or may be one, more, a part, or all of a series of display positions of the interactive object determined within a period of time.
  • In one embodiment, the distance between each interactive object and the reference position can be determined according to its final display position and the reference position; if the distance between an interactive object and the reference position is less than a certain preset threshold, that interactive object is selected.
  • Fig. 7 schematically shows the display medium shown in Fig. 6 with a virtual circle.
  • the virtual circle 701 takes the reference position as the center and the preset threshold as the radius.
  • the virtual circle 701 may be displayed on the display medium to help the user better realize the selection of interactive objects, or it may not be displayed on the display medium.
  • In one embodiment, the distance between the center of each interactive object and the reference position can be determined. If the distance between the center of an interactive object and the reference position is less than the preset threshold, the center enters the range of the circle 701. As shown in Figure 7, the center of the interactive object 42 is within the range of the circle 701, which means that the distance between the interactive object 42 and the reference position is less than the preset threshold, so the device can automatically select the interactive object 42 for various subsequent operations. After the device selects an interactive object, the selected interactive object can be specially marked (for example, highlighted) to inform the user of the currently selected interactive object.
  • When an interactive object is selected by judging whether the distance between the interactive object and the reference position is less than a certain threshold, it is possible that two or more interactive objects satisfy the condition at the same time.
  • the interactive object closest to the reference position can be selected.
  • these interactive objects can be selected at the same time.
  • The user of the device can also be asked to choose among the interactive objects that meet the condition; for example, a prompt box containing a list of the multiple qualifying interactive objects can be displayed to the user, and the user can select an interactive object from the list.
  • In one embodiment, the device may further consider the changing trend of the distance. Specifically, the device may periodically or repeatedly determine the display position of each interactive object on the display medium, and repeatedly determine the distance between the interactive object and the reference position according to the final display position of the interactive object and the reference position. If the distance between an interactive object and the reference position keeps decreasing and the current distance is less than the threshold, that interactive object is automatically selected. For example, if the user wants to select the interactive object 41 after the interactive object 42 has been selected (as shown in the display medium in Figure 7), the user can translate or rotate the device so that the display position of the interactive object 41 on the display medium moves closer to the reference position.
  • FIG. 8 schematically shows the display medium after changing the position or posture of the device according to an embodiment.
  • The display medium of Figure 8 shows that the interactive object 41 is gradually approaching the reference position but its distance from the reference position is still greater than the preset threshold, while the interactive object 42 is gradually moving away from the reference position but its distance from the reference position is still less than the preset threshold.
  • the interactive objects 43 and 44 have left the display range of the display medium and are no longer displayed.
  • Because the distance between the interactive object 42 and the reference position keeps increasing, the device does not select the interactive object 42.
  • As the device continues to move, the distance between the interactive object 41 and the reference position keeps decreasing and becomes less than the preset threshold, at which point the device can select the interactive object 41.
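A minimal sketch of this trend-based rule follows. It is an illustration under stated assumptions rather than the patent's implementation: display positions are assumed to be sampled elsewhere at a fixed interval, and all names are hypothetical:

```python
import math

def distance(p, q):
    """Euclidean distance between two (x, y) points on the display medium."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def select_by_trend(position_history, reference_position, threshold):
    """Select an object only if its distance to the reference position has been
    continuously decreasing over its recorded positions and the current
    (most recent) distance is below the threshold.

    position_history: dict mapping object id -> list of (x, y) display
    positions, ordered from oldest to newest.
    """
    for obj_id, positions in position_history.items():
        dists = [distance(p, reference_position) for p in positions]
        decreasing = all(a > b for a, b in zip(dists, dists[1:]))
        if len(dists) >= 2 and decreasing and dists[-1] < threshold:
            return obj_id
    return None
```

This captures why the object moving away is skipped even while it is still inside the threshold circle, while the approaching object is selected as soon as it crosses the threshold.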
  • In one embodiment, the device may select the interactive object according to the moving direction of the interactive object. Specifically, the device can determine the direction of the reference position relative to each interactive object according to the display position of the interactive object on the display medium; determine the moving direction of the interactive object on the display medium according to the change of its display position on the display medium; and select an interactive object according to the direction of the reference position relative to the interactive object and the moving direction of the interactive object on the display medium.
  • When determining the direction of the reference position relative to an interactive object, the device can select any one of a series of display positions of the interactive object on the display medium within a period of time as the display position of the interactive object on the display medium, for example, the display position at the beginning of the period, the display position at the end of the period, or the display position in the middle of the period.
  • the device may also select the average value of a part or all of the above-mentioned series of display positions as the display position of the interactive object on the display medium.
  • When determining the moving direction, the device can select any two or more of a series of display positions of the interactive object on the display medium within a period of time. If, among the one or more interactive objects, the moving direction of an interactive object on the display medium is closest to the direction of the reference position relative to that interactive object, then that interactive object can be selected.
  • FIG. 9 schematically shows a display position change diagram of interactive objects on the display medium when the position or posture of the device is changed according to an embodiment, wherein circles drawn by dotted lines are used to represent the interactive objects 41, 42 The initial display positions of, 43, and 44 use a circle drawn by a solid line to represent the display position of the interactive objects 41, 42, 43, 44 after the device changes its position or posture.
  • The moving direction of each interactive object is shown with a solid arrow, and the direction of the reference position relative to the initial display position of each interactive object is shown with a dotted arrow. It can be seen from FIG. 9 that the moving direction of the interactive object 42 is closest to the direction of the reference position relative to it, so the interactive object 42 can be selected and corresponding operations performed on it.
  • The proximity between two directions can be measured, for example, by the angle between them, and selection may additionally require that a predetermined condition be met, for example, that the angle be less than a preset angle value, such as 10°.
  • Selecting the interactive object according to the direction of the reference position relative to the interactive object and the moving direction of the interactive object on the display medium means that the device does not have to move significantly during selection, thereby greatly improving the efficiency of selecting the interactive object and the user's interactive experience.
  • the device may select the interactive object according to the moving direction of the interactive object and its distance from the reference position. Specifically, the device can determine the direction of the reference position relative to each interactive object according to that object's display position on the display medium; determine the moving direction of each interactive object on the display medium according to the change in its display position; determine the distance between each interactive object and the reference position according to the final display position of the object and the reference position; and select the interactive object according to the direction of the reference position relative to the interactive object, the moving direction of the interactive object on the display medium, and the distance between the interactive object and the reference position.
  • In FIG. 10, the display medium also has an interactive object 45. The direction of the reference position relative to the interactive object 45 is basically the same as the direction of the reference position relative to the interactive object 42, and the moving directions of the two objects are also basically the same. In this case, it may be difficult to determine whether the interactive object 42 or the interactive object 45 should be selected by considering only how close each object's moving direction is to the direction of the reference position relative to it (for example, the two proximities may be the same or differ only slightly). The distance between each interactive object and the reference position can then be further considered: since the interactive object 42 is closer to the reference position, the interactive object 42 can be selected and corresponding operations performed on it.
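  • The combined direction-and-distance selection can be sketched as follows. This is an illustrative sketch under assumptions (the function name, 2-D tuple positions, the 10° default threshold, and the distance-first tie-break are not specified by the patent):

```python
import math

def select_by_direction_and_distance(objects, reference, max_angle_deg=10.0):
    """When several objects move in (nearly) the same direction toward
    the reference position, prefer the one whose final display position
    is closest to the reference position.

    objects: dict id -> (start_xy, end_xy) display positions
    reference: (x, y) reference position on the display medium
    """
    def angle_to_ref(start, end):
        move = (end[0] - start[0], end[1] - start[1])
        to_ref = (reference[0] - start[0], reference[1] - start[1])
        nu, nv = math.hypot(*move), math.hypot(*to_ref)
        if nu == 0 or nv == 0:
            return math.pi
        cos = max(-1.0, min(1.0, (move[0] * to_ref[0] + move[1] * to_ref[1]) / (nu * nv)))
        return math.acos(cos)

    candidates = []
    for oid, (start, end) in objects.items():
        a = angle_to_ref(start, end)
        if a <= math.radians(max_angle_deg):
            # Distance from the final display position to the reference position.
            dist = math.hypot(end[0] - reference[0], end[1] - reference[1])
            candidates.append((dist, a, oid))
    if not candidates:
        return None
    # Sort by distance first, then by angular proximity.
    return min(candidates)[2]
```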
  • the reference position can be set according to the display position of the currently selected interactive object on the display medium. For example, after the interactive object is selected, the reference position can be changed to the display position of the currently selected interactive object on the display medium.
  • the selected interactive object may be specially marked (for example, highlighted) to inform the user that it is the currently selected interactive object and the reference position is currently located at the display position of the interactive object.
  • the reference position on the display medium may remain unchanged (for example, it does not change with the change of the position and posture of the device) until the next interactive object is selected.
  • FIG. 11 schematically shows the display medium after the interactive object 42 is selected. A hexagon can be used to mark the currently selected interactive object 42, and the current display position of the interactive object 42 (or of the hexagon) on the display medium is set as the new reference position.
  • FIG. 12 schematically shows a change in the display positions of the interactive objects on the display medium when the position or posture of the device is changed, according to an embodiment, where, after selecting the interactive object 42, the user wants to select the interactive object 41. To this end, the user can change the position or posture of the device so that the interactive object 41 moves toward the reference position on the display medium. According to the proximity between the moving direction of each interactive object on the display medium and the direction of the reference position relative to that interactive object, it can be determined that the interactive object 41 should be selected.
  • FIG. 13 schematically shows the display medium after the interactive object 41 is selected.
  • A hexagon can be used to mark the currently selected interactive object 41, and the current display position of the interactive object 41 (or of the hexagon) on the display medium is set as the new reference position. This is very advantageous, because the user usually gazes at the currently selected interactive object, so the user can easily know the current reference position and move only slightly toward the desired next interactive object (for example, rotate the device slightly in the direction of the next interactive object) to select it, thereby improving interaction efficiency and the user's interactive experience.
  • the amount of change in the distance between the display position of each interactive object and the reference position can be determined, and the interactive object whose distance decreases the fastest or the most can be selected.
  • This method is actually similar to the method, shown in FIGS. 9 and 12, of selecting the interactive object according to the direction of the reference position relative to the interactive object and the moving direction of the interactive object on the display medium, since when an interactive object moves toward the reference position, its distance from the reference position decreases the fastest or the most.
  • a distance reduction threshold or a distance reduction speed threshold may be set to avoid possible misoperation.
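  • The distance-decrease variant, including the misoperation threshold just mentioned, can be sketched as follows. This is an illustrative sketch: the function name, 2-D tuple positions, and the numeric value of the decrease threshold are assumptions, not values from the patent.

```python
import math

def select_by_distance_decrease(objects, reference, min_decrease=1.0):
    """Select the object whose distance to the reference position has
    decreased the most; the decrease must exceed a threshold
    (min_decrease) to avoid accidental selection from small jitters.

    objects: dict id -> (start_xy, end_xy) display positions
    reference: (x, y) reference position on the display medium
    """
    best_id, best_drop = None, min_decrease
    for oid, (start, end) in objects.items():
        d0 = math.hypot(start[0] - reference[0], start[1] - reference[1])
        d1 = math.hypot(end[0] - reference[0], end[1] - reference[1])
        drop = d0 - d1  # positive when the object moved closer
        if drop >= best_drop:
            best_id, best_drop = oid, drop
    return best_id
```

A small movement that reduces the distance by less than the threshold selects nothing, which realizes the distance-reduction threshold mentioned above.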
  • the reference position on the display medium can also be changed according to the user's instruction. For example, the user can click on the display medium to determine a new reference position.
  • the interactive objects presented on the display medium of the device may be objects with three-dimensional spatial positions, such as some virtual objects arranged in virtual reality or augmented reality scenes.
  • the positions of such interactive objects may be determined by three-dimensional coordinates, for example. Since objects in three-dimensional space can have different depths or distances relative to the device, it may sometimes be difficult to select interactive objects based only on their presentation positions on the device's display medium (for example, some interactive objects may be far apart in space but appear close together on the display medium). This situation is especially serious when there are many interactive objects in the space.
  • a method is provided for selecting interactive objects presented on the display medium of a device, wherein one or more interactive objects are located in the space around the device and can be presented on the display medium, and wherein the position of the one or more interactive objects relative to the device changes as the position or posture of the device changes.
  • the method is shown in FIG. 14 and can include the following steps:
  • Step 601 Set a selection area in a space around the device, wherein the position of the selection area relative to the device remains unchanged before being reset.
  • the selection area may be a three-dimensional structure, such as a sphere, a cylinder, a cube, or other regular or irregular three-dimensional structures.
  • Setting the selection area may include setting the position of the selection area relative to the device (for example, the distance and direction relative to the device), setting the shape of the selection area, setting the size of the selection area, and so on.
  • the shape, size, or position relative to the device of the selection area may also be preset or default.
  • the position of the selection area relative to the device remains unchanged before being reset; therefore, the position of the selection area in space will change as the position or posture of the device changes, as if the selection area were fixedly coupled to the device.
  • the direction of the selection area relative to the device can be set by selecting a position on the display medium of the device, and the distance of the selection area relative to the device can be set by dragging a slider, entering a value, and so on.
  • the position of the selected area relative to the device can be set according to the position of the currently selected interactive object in space.
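  • Step 601 can be sketched as follows, assuming a spherical selection area described by an offset in the device's own coordinate frame, so that it follows the device as the device moves or rotates. The class name, the default offset (1 m straight ahead along −z) and radius are illustrative assumptions, not the patent's values.

```python
import math

class SelectionArea:
    """A spherical selection area fixed relative to the device: it is
    described by an offset in the device's own coordinate frame, so it
    follows the device as the device moves or rotates (until reset).
    """
    def __init__(self, offset_in_device_frame=(0.0, 0.0, -1.0), radius=0.2):
        self.offset = offset_in_device_frame  # e.g. 1 m straight ahead of the device
        self.radius = radius

    def contains(self, point_in_device_frame):
        # True if a point (given in device-frame coordinates) lies inside the sphere.
        dx = point_in_device_frame[0] - self.offset[0]
        dy = point_in_device_frame[1] - self.offset[1]
        dz = point_in_device_frame[2] - self.offset[2]
        return math.sqrt(dx * dx + dy * dy + dz * dz) <= self.radius
```

Because the offset is expressed in the device frame, no update is needed when the device moves; only a reset (new offset, shape, or size) changes the area relative to the device.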
  • Step 602 Obtain the position of the one or more interactive objects relative to the device according to the position and posture of the device and the position of the one or more interactive objects.
  • the position and/or posture changes of the device can be measured or tracked in various known ways.
  • the built-in acceleration sensor, magnetic sensor, direction sensor, gravity sensor, gyroscope can be used.
  • Cameras, etc. through methods known in the art (for example, inertial navigation, visual odometry, SLAM, VSLAM, SFM, etc.) to measure or track its position change and/or posture change.
  • the position of the one or more interactive objects relative to the device can be obtained.
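  • A minimal sketch of the pose transform underlying Step 602: expressing a world-coordinate object position in the device's coordinate frame given the tracked device pose. The function name and the matrix convention (the rows of R are the device axes expressed in world coordinates) are assumptions made for illustration.

```python
def world_to_device(point_world, device_position, device_rotation):
    """Express a world-coordinate point in the device's coordinate frame.

    device_position: device location in world coordinates (translation t)
    device_rotation: 3x3 rotation matrix R whose rows are the device
                     axes expressed in world coordinates
    Computes p_device = R * (p_world - t).
    """
    # Translate so the device sits at the origin.
    d = [point_world[i] - device_position[i] for i in range(3)]
    # Rotate into the device frame.
    return tuple(sum(device_rotation[r][c] * d[c] for c in range(3))
                 for r in range(3))
```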
  • Step 603 Automatically select an interactive object from the one or more interactive objects according to the location of the selection area relative to the device and the location of the one or more interactive objects relative to the device.
  • for example, an interactive object whose position falls within the selection area may be selected.
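  • Step 603 can then be sketched as a containment test in the device frame; when several objects fall inside the selection area, this sketch prefers the one closest to the area's center (an assumption for illustration — the patent does not specify a tie-break).

```python
import math

def auto_select(objects_in_device_frame, area_center, area_radius):
    """Automatically select the interactive object inside the selection
    area (a sphere fixed in the device frame); if several objects are
    inside, the one closest to the area's center wins.

    objects_in_device_frame: dict id -> (x, y, z) relative to the device
    area_center, area_radius: the selection area in the device frame
    """
    best_id, best_dist = None, area_radius
    for oid, p in objects_in_device_frame.items():
        dist = math.sqrt(sum((p[i] - area_center[i]) ** 2 for i in range(3)))
        if dist <= best_dist:
            best_id, best_dist = oid, dist
    return best_id
```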
  • the present invention can be implemented in the form of a computer program.
  • the computer program can be stored in various storage media (for example, a hard disk, an optical disk, a flash memory, etc.), and when the computer program is executed by a processor, it can be used to implement the method of the present invention.
  • the present invention can be implemented in the form of an electronic device.
  • the electronic device includes a processor and a memory, and a computer program is stored in the memory; when the computer program is executed by the processor, it can be used to implement the method of the present invention.
  • References to "various embodiments", "some embodiments", "one embodiment", or "an embodiment" herein mean that a specific feature, structure, or property described in connection with the embodiment is included in at least one embodiment. Therefore, the appearances of the phrases "in various embodiments", "in some embodiments", "in one embodiment", or "in an embodiment" in various places throughout this document do not necessarily refer to the same embodiment. Furthermore, specific features, structures, or properties can be combined in any suitable manner in one or more embodiments. Therefore, a specific feature, structure, or property shown or described in connection with one embodiment can be combined, in whole or in part, with the features, structures, or properties of one or more other embodiments without limitation, as long as the combination is not illogical or inoperative.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for selecting an interactive object on a display medium of a device, wherein a position on the display medium is set as a reference position, one or more interactive objects are displayed on the display medium and can be selected, and the display positions of the one or more interactive objects change according to a change in the position or posture of the device. The method comprises: obtaining the display positions, on the display medium, of the one or more interactive objects; and automatically selecting, according to the obtained display positions and the reference position, an interactive object from among the one or more interactive objects.
PCT/CN2020/080161 2019-03-27 2020-03-19 Procédé de sélection d'objet interactif sur un support d'affichage d'un dispositif WO2020192544A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910237492.9 2019-03-27
CN201910237492.9A CN111752425B (zh) 2019-03-27 2019-03-27 用于选择在设备的显示媒介上的交互对象的方法

Publications (1)

Publication Number Publication Date
WO2020192544A1 true WO2020192544A1 (fr) 2020-10-01

Family

ID=72610900

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/080161 WO2020192544A1 (fr) 2019-03-27 2020-03-19 Procédé de sélection d'objet interactif sur un support d'affichage d'un dispositif

Country Status (3)

Country Link
CN (1) CN111752425B (fr)
TW (1) TWI766258B (fr)
WO (1) WO2020192544A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114089879B (zh) * 2021-11-15 2022-08-05 北京灵犀微光科技有限公司 一种增强现实显示设备的光标控制方法

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101644987A (zh) * 2008-08-08 2010-02-10 深圳富泰宏精密工业有限公司 移动终端及其菜单选择的方法
CN107402685A (zh) * 2016-05-18 2017-11-28 中兴通讯股份有限公司 移动终端及其操作方法和操作装置
CN107957774A (zh) * 2016-10-18 2018-04-24 阿里巴巴集团控股有限公司 虚拟现实空间环境中的交互方法及装置
CN109062476A (zh) * 2018-08-01 2018-12-21 Oppo(重庆)智能科技有限公司 应用的菜单处理方法、移动终端及计算机可读存储介质
US10162483B1 (en) * 2008-10-22 2018-12-25 D.R. Systems, Inc. User interface systems and methods
CN109196447A (zh) * 2016-03-31 2019-01-11 奇跃公司 使用姿势和多dof控制器与3d虚拟对象的交互
CN305453172S (fr) * 2018-11-06 2019-11-22
US20190362557A1 (en) * 2018-05-22 2019-11-28 Magic Leap, Inc. Transmodal input fusion for a wearable system

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1049290A (ja) * 1996-08-05 1998-02-20 Sony Corp 情報処理装置および方法
US7688306B2 (en) * 2000-10-02 2010-03-30 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
CN102902310B (zh) * 2004-03-01 2016-01-20 苹果公司 基于加速度计操作便携式设备的方法及装置
US20100192100A1 (en) * 2009-01-23 2010-07-29 Compal Electronics, Inc. Method for operating a space menu and electronic device with operating space menu
CN201766640U (zh) * 2010-08-26 2011-03-16 北京播思软件技术有限公司 一种根据姿态实现屏幕内容或菜单滚动的移动通信终端
US9201467B2 (en) * 2011-01-26 2015-12-01 Sony Corporation Portable terminal having user interface function, display method, and computer program
EP3146729A4 (fr) * 2014-05-21 2018-04-11 Millennium Three Technologies Inc. Motifs de repère de cadre, leur détection automatique dans des images et leurs applications
US10068373B2 (en) * 2014-07-01 2018-09-04 Samsung Electronics Co., Ltd. Electronic device for providing map information
CN106970734B (zh) * 2016-01-13 2020-12-18 阿里巴巴集团控股有限公司 一种显示设备的任务启动方法和装置
CN105718840B (zh) * 2016-01-27 2018-07-24 西安小光子网络科技有限公司 一种基于光标签的信息交互系统及方法
CN106446737B (zh) * 2016-08-30 2019-07-09 西安小光子网络科技有限公司 一种多个光标签的快速识别方法
CN106527903B (zh) * 2016-12-08 2020-04-07 青岛海信电器股份有限公司 触摸控制方法及装置
CN107562312B (zh) * 2017-08-25 2019-12-31 维沃移动通信有限公司 一种图标移动方法和移动终端
CN108897881B (zh) * 2018-07-05 2023-08-22 腾讯科技(深圳)有限公司 交互式图像显示方法、装置、设备和可读存储介质
CN109298813A (zh) * 2018-08-02 2019-02-01 珠海格力电器股份有限公司 一种应用展示方法、装置、终端及可读存储介质

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101644987A (zh) * 2008-08-08 2010-02-10 深圳富泰宏精密工业有限公司 移动终端及其菜单选择的方法
US10162483B1 (en) * 2008-10-22 2018-12-25 D.R. Systems, Inc. User interface systems and methods
CN109196447A (zh) * 2016-03-31 2019-01-11 奇跃公司 使用姿势和多dof控制器与3d虚拟对象的交互
CN107402685A (zh) * 2016-05-18 2017-11-28 中兴通讯股份有限公司 移动终端及其操作方法和操作装置
CN107957774A (zh) * 2016-10-18 2018-04-24 阿里巴巴集团控股有限公司 虚拟现实空间环境中的交互方法及装置
US20190362557A1 (en) * 2018-05-22 2019-11-28 Magic Leap, Inc. Transmodal input fusion for a wearable system
CN109062476A (zh) * 2018-08-01 2018-12-21 Oppo(重庆)智能科技有限公司 应用的菜单处理方法、移动终端及计算机可读存储介质
CN305453172S (fr) * 2018-11-06 2019-11-22

Also Published As

Publication number Publication date
TW202044007A (zh) 2020-12-01
CN111752425A (zh) 2020-10-09
CN111752425B (zh) 2022-02-15
TWI766258B (zh) 2022-06-01

Similar Documents

Publication Publication Date Title
US11500473B2 (en) User-defined virtual interaction space and manipulation of virtual cameras in the interaction space
US20230306688A1 (en) Selecting two-dimensional imagery data for display within a three-dimensional model
US10928975B2 (en) On-the-fly adjustment of orientation of virtual objects
US9224237B2 (en) Simulating three-dimensional views using planes of content
US9591295B2 (en) Approaches for simulating three-dimensional views
US9437038B1 (en) Simulating three-dimensional views using depth relationships among planes of content
JP6013583B2 (ja) 有効インターフェース要素の強調のための方式
US10297088B2 (en) Generating accurate augmented reality objects in relation to a real-world surface via a digital writing device
KR102539579B1 (ko) 정보의 표시 영역을 적응적으로 변경하기 위한 전자 장치 및 그의 동작 방법
JP2023531728A (ja) 人工現実感対話モードの統合
KR20150116871A (ko) Hdm에 대한 인간―신체―제스처―기반 영역 및 볼륨 선택
US11954268B2 (en) Augmented reality eyewear 3D painting
US11573627B2 (en) Method of controlling device and electronic device
WO2018025511A1 (fr) Dispositif de traitement d'informations, procédé et programme informatique
US11297244B2 (en) Click-and-lock zoom camera user interface
WO2020192544A1 (fr) Procédé de sélection d'objet interactif sur un support d'affichage d'un dispositif
US20230410441A1 (en) Generating user interfaces displaying augmented reality graphics
KR20240037800A (ko) 전자 장치 및 전자 장치의 제어 방법
KR20220074543A (ko) 실감형 3차원 디스플레이의 사용자 인터페이스 제공 장치 및 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20779427

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20779427

Country of ref document: EP

Kind code of ref document: A1