SE544511C2 - Improved manner of adapting size of an object such as a detected tree
Improved manner of adapting size of an object such as a detected tree
- Publication number
- SE544511C2 (application SE1730065A)
- Authority
- SE
- Sweden
- Prior art keywords
- change
- graphical indication
- size
- controller
- user equipment
- Prior art date
- 2017-03-15
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
A user equipment (100) comprising a touch display (110) and a controller (CPU), wherein the controller is configured to: display a graphical indication of an object; receive an input indicating a change in size of the object; receive two touch points (F1,F2) through said touch display (110); detect a change in distance between the two touch points; and change the size of the object according to the change in distance between the two touch points.
Description
IMPROVED MANNER OF ADAPTING SIZE OF AN OBJECT SUCH AS A DETECTED TREE

TECHNICAL FIELD
The present invention generally relates to methods, apparatus and computer programs for forest inventory management. In particular, the invention relates to a method, apparatus and computer program for adapting determined parameters of objects, such as the size of a tree, in a given area.
BACKGROUND
Computing devices such as computers, and even tablets and smartphones as proposed by the Swedish application KT002, are used to survey forest areas. Such computerized forestry surveying is of course not always exact and sometimes needs to be adapted. However, as the inventors of the Swedish application KT002 have realized, special problems exist when such adaptations are to be performed in the field, where forest surveys are by their very nature performed. The teachings herein provide an insightful manner of overcoming these problems realized by the inventors.
SUMMARY
As mentioned in the background section, the inventors have realized that when adapting determined parameters of detected trees, such as their width and/or height, special considerations with regard to visibility and precision must be taken.
To overcome or at least mitigate such problems as discussed herein, the inventors provide a user equipment comprising a touch display and a controller, wherein the controller is configured to: display a graphical indication of an object; receive an input indicating a change in size of the object; receive two touch points through said touch display; detect a change in distance between the two touch points; and change the size of the object according to the change in distance between the two touch points.
To overcome or at least mitigate such problems as discussed herein, the inventors also provide a method for use in a user equipment comprising a touch display, wherein the method comprises: displaying a graphical indication of an object; receiving an input indicating a change in size of the object; receiving two touch points through said touch display; detecting a change in distance between the two touch points; and changing the size of the object according to the change in distance between the two touch points.
To overcome or at least mitigate such problems as discussed herein, the inventors also provide a computer-readable medium comprising computer readable instructions that, when loaded into a controller, cause the controller to execute the method described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
The above, as well as additional objects, features and advantages of the present invention, will be better understood through the following illustrative and non-limiting detailed description of preferred embodiments of the present invention, with reference to the appended drawings, wherein:
Figure 1A is a schematic view of a user equipment configured according to an embodiment of the teachings herein;
Figure 1B is a schematic view of the components of a user equipment configured according to an embodiment of the teachings herein;
Figures 2A, 2B and 2C show a user equipment according to one embodiment of the teachings herein being used to survey a forest area using image analysis, through various stages of the image analysis;
Figures 3A, 3B, 3C, 3D, 3E, 3F, 3G and 3H each show a user equipment according to the teachings herein being used according to one or more embodiments of the teachings herein;
Figure 4 shows a flowchart for a general method of the teachings herein; and
Figure 5 shows a schematic view of a computer-readable product 10 according to one embodiment of the teachings herein.
DESCRIPTION
Figure 1A shows an example of a User Equipment 100, in this embodiment a smartphone 100. Another example of a UE is a tablet computer. Figure 1B shows a schematic view of the components of a UE 100. The UE 100 comprises a user interface (UI), which in the example of figures 1A and 1B comprises a display 110 and one or more physical buttons 120. The display 110 may be a touch display and the user interface may thus also comprise virtual keys (not shown). The UI is connected to a controller CPU which is configured for controlling the overall operation of the UE 100. The controller may be a processor or other programmable logical unit. The controller may also be one or more such programmable logical units, but for the purposes of this application the controller will be exemplified as being a Central Processing Unit (CPU). The controller CPU is connected to or arranged to carry a computer readable memory for storing instructions and also for storing data. The memory MEM may comprise several memory circuits that may be local to the UE or remote. Local memories are examples of non-transitory mediums. Remote memories are non-transitory in themselves, but present themselves to the UE as transitory mediums.
The UE 100 further comprises or is arranged to be connected to a camera 130 for receiving an image stream, which image stream is to be processed by the controller CPU and at least temporarily stored in the memory MEM. As the camera 130 records a video sequence, the video sequence may simultaneously be displayed on the display 110.
The UE 100 may also comprise sensors, such as an accelerometer 140, configured to provide the controller with sensor data, either to be processed by the controller or (at least partially) pre-processed. This enables the controller to determine or follow movements of the camera, both as regards lateral movements and changes in angles, that is, the pose of the camera. A pose is thus a position and a direction or angle of a camera, resulting in six (6) degrees of freedom indicating how a camera is moved and/or rotated, making it possible to determine how the camera is moved and/or rotated from one pose to another pose.
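As a minimal illustration of the six degrees of freedom mentioned above, a camera pose can be thought of as a position plus an orientation; the Kotlin sketch below shows one such representation. The type and field names are illustrative assumptions, not taken from the application.

```kotlin
// Minimal sketch of a 6-DOF camera pose: 3 translational + 3 rotational degrees of freedom.
// Names (Pose, roll/pitch/yaw) are illustrative assumptions, not taken from the application.
data class Pose(
    val x: Double, val y: Double, val z: Double,          // position (3 DOF)
    val roll: Double, val pitch: Double, val yaw: Double  // orientation in radians (3 DOF)
)

// The relative motion between two poses describes how the camera was moved and/or rotated.
// Subtracting Euler angles is a simplification (ignores wrap-around), kept for brevity.
fun relativeMotion(from: Pose, to: Pose): Pose = Pose(
    to.x - from.x, to.y - from.y, to.z - from.z,
    to.roll - from.roll, to.pitch - from.pitch, to.yaw - from.yaw
)
```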
The UE 100 may also comprise positional sensors, such as a global navigational system sensor, configured to provide the controller with position data, either to be processed by the controller or (at least partially) pre-processed. This enables the controller to determine or follow the actual position of the camera.
Figure 2A shows a UE 100, such as the one disclosed with reference to figures 1A and 1B, being used to survey a forest area. The forest area is recorded in some manner and an image analysis is performed to provide estimations of detected tree stems 210. Possibly the recording is made by video recording and the detection is made by use of SLAM, as is disclosed in the Swedish application KT002.
Graphical indications 220 may be used to more clearly indicate which trees T have been identified, by simply overlaying the recording of the forest area with the graphical indications 220 at the location of the detected tree(s) T.
As the detection of the tree stems' width may be incorrect, and as the consequences of an incorrect determination may lead to a rather large error in any calculations, such as financial calculations, it is important to provide a user with capabilities to adapt or amend the width of the detected trees, especially since the number of trees, and therefore the total volume, may be very large in a survey and a small error may thus be multiplied to result in a large error.
As forest surveying is by its very nature performed in the field, operators will probably be wearing protective gloves, either for weather protection or for protection during manual labor; in addition, forestry personnel are known to have rather large hands and fingers. The inventors have therefore realized that it may be difficult for an operator to properly align the graphical indicators, as the gloves or hands may cover or obscure the displayed tree(s) during manipulation.
The inventors have therefore devised a simple but highly useful manner that overcomes the shortcomings of having large hands or wearing gloves, and also alleviates any clumsiness on behalf of the operator, and which is beneficial for overcoming shortcomings in human perception and dexterity.
The manner proposed is illustrated with reference to figures 3 and 4, where a UE having surveyed a single tree T is shown in figure 3A. The reason only one tree is shown is to make the illustrations clearer, but it should be understood that the manner taught herein may also be used for surveying forest areas where many trees are located and thus detected. As can be seen, the detected tree stem or trunk is indicated using a graphical indicator 220, in this example being two diverging lines. The graphical indicator 220 may of course also indicate the height of the tree(s).
In figure 3B it is shown how the detected tree T is selected by the user tapping (or rather providing touch input that is received by the UE 100) on the tree T with a finger F, or possibly a stylus (not shown). As the UE is arranged with a touch display, touch input is the most beneficial to use, but it should be noted that other manners of selecting the tree T are also possible, including mouse or joystick selection, or a stepwise toggle from tree to tree using a toggle button.
In one embodiment it is also possible to select an undetected tree by tapping (or rather providing touch input that is received by the UE 100) on the location of the tree. The UE is then configured to generate a detected tree at the location tapped, showing graphical indicators 220 for the tree T. The width of the specified tree may be set to a preset standard width. In one embodiment the width of the specified tree may be set to the width of surrounding trees, such as the average of the widths of the detected trees or the average of the widths of the adjacent trees.
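A minimal sketch of how such a default width could be chosen is given below, assuming the detected trees are available as a simple list. The type name, field names, the fallback value and the use of a plain average are assumptions for illustration, not the application's actual implementation.

```kotlin
// Hypothetical tree record produced by the image analysis; names are illustrative.
data class DetectedTree(val x: Float, val y: Float, var widthMeters: Float)

// Default width for a manually added (undetected) tree: the average width of the
// already detected trees, falling back to a preset standard width if none exist.
fun defaultWidth(detected: List<DetectedTree>, presetStandardWidth: Float = 0.3f): Float =
    if (detected.isEmpty()) presetStandardWidth
    else detected.map { it.widthMeters }.average().toFloat()
```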
As a tree is selected, the UE may change the appearance of the graphical indicator 220 to indicate which tree is selected. In the example of figure 3B, the graphical indicator 220 has become bolder, but other manners, such as changing colors, blinking effects etc., are also possible.
The inventors are therefore proposing a clever manner of increasing the size of the detected tree width by enabling the operator to perform a pinching operation anywhere on the display 110. A pinching operation is where an operator touches the display 110 at two (distinct) points, such as by touching with two fingers F1 and F2, as is shown in figure 3C, and either moves the fingers (and therefore the touching points) apart or together. As is clearly shown in figure 3C, the two fingers touch the display 110 at a location offset from that of the selected tree T, as is indicated by one of the two dashed arrows. This enables the operator to clearly see the selected tree while making any changes to it, so that the operator can clearly see the effects of his commands. To further accommodate the operator, who may be working in an impractical or uncomfortable position or environment, the inventors have realized that by simply focusing on the change of distance between the two touching points or fingers F1, F2, the operator is able to adapt the detected tree in a most convenient manner.
The UE 100 is thus configured to detect a change in the distance between the two touching points, indicated by the two fingers F1, F2, and, irrespective of the direction of such a change, change the width of the selected tree. Figure 3D clearly shows that the direction of the fingers' movements is different from that of any change in width.
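A minimal sketch of this direction-independent behaviour is given below, assuming the two touch points are delivered as plain display coordinates. The names and the use of a simple Euclidean distance are assumptions for illustration rather than the application's actual implementation.

```kotlin
import kotlin.math.hypot

// A touch point on the display; the name and fields are illustrative assumptions.
data class TouchPoint(val x: Float, val y: Float)

// Euclidean distance between the two touch points F1 and F2.
fun distance(f1: TouchPoint, f2: TouchPoint): Float = hypot(f2.x - f1.x, f2.y - f1.y)

// Change the selected tree's width by the change in distance between the two touch
// points, irrespective of the direction in which the fingers moved on the display.
fun updateWidth(
    currentWidth: Float,
    previousF1: TouchPoint, previousF2: TouchPoint,
    currentF1: TouchPoint, currentF2: TouchPoint
): Float {
    val delta = distance(currentF1, currentF2) - distance(previousF1, previousF2)
    // Moving the fingers apart (delta > 0) increases the width, pinching decreases it.
    return (currentWidth + delta).coerceAtLeast(0f)
}
```

The point of the sketch is that only the scalar distance between the two points matters, so the fingers can move along any axis, well away from the displayed tree.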
Figure 3E shows the effect on the width of the selected tree T when the movement of the fingers increases the distance between them, whereby the width of the selected tree T is increased, as is indicated by the two diverging lines of the graphical indicator 220 being further spaced apart.
Figure 3F shows the effect on the width of the selected tree T when the movement of the fingers decreases the distance between them, whereby the width of the selected tree T is decreased, as is indicated by the two diverging lines of the graphical indicator 220 being closer to one another.
In one embodiment, the teachings herein as they relate to changing the width of a tree may also be used to change the height of the tree, whereby a tree is selected, possibly in a different manner or in a sequence, and the height of the tree is changed according to the movement of the fingers.
Figure 4 shows a flowchart for a general method of performing the manner taught herein. A forest area has been surveyed through image or other recordings on which image analysis has been performed to detect trees and their widths, which are displayed 400 on a display 110 of a UE 100.
The controller CPU of the UE 100 is configured to receive input 410 indicating a tree to be selected. The input is in one embodiment received as touch input on the location of the tree. The controller thereby selects 413 the indicated tree. In one embodiment the selected tree is a detected tree. In one embodiment the selected tree is an undetected tree, whereby the controller is configured to generate 416 a detected tree located at the location of the received input.
As a tree has been selected, the controller is configured to receive 420 input indicating two points. In one embodiment, the two points are indicated by receiving two touch inputs at two points (indicated in figures 3D, 3E and 3F by the fingers F1, F2, hereafter being representative of the two touch points). The controller is further configured to detect 430 a change in distance between the two points (F1, F2) and adapt 435 the width of the selected tree accordingly. In one embodiment, the change in distance is the change in absolute distance, irrespective of which direction or angle on the display 110 the change is detected at.
In one embodiment a tree may be selected for change of one of height or width by one form of input and be selected for change of the other of height or width by another form of input. In one embodiment, the first form of input is a single tap or shortpress tap (a single tap being a shortpress tap). In such an embodiment, the second form of input may be a double tap or longpress tap.
If a tree has been selected for changing the width, a detected change in distance between the two points will then result in a corresponding change in width. Similarly, if a tree has been selected for changing the height, a detected change in distance between the two points will then result in a corresponding change in height.
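The sketch below illustrates one way such a mode selection could be wired up, assuming the tap classification (shortpress versus double tap or longpress) is already available. The enum and function names are illustrative assumptions, not the application's actual implementation.

```kotlin
// Which size dimension the selected tree has been put in edit mode for;
// the enum and function names are illustrative assumptions.
enum class SizeMode { WIDTH, HEIGHT }

// A single (shortpress) tap selects the tree for width changes, a double tap or
// longpress selects it for height changes, as in the embodiment described above.
fun modeForInput(isDoubleTapOrLongPress: Boolean): SizeMode =
    if (isDoubleTapOrLongPress) SizeMode.HEIGHT else SizeMode.WIDTH

// Apply a detected change in distance to the dimension the tree was selected for.
fun applyChange(widthAndHeight: Pair<Float, Float>, mode: SizeMode, delta: Float): Pair<Float, Float> {
    val (width, height) = widthAndHeight
    return when (mode) {
        SizeMode.WIDTH -> (width + delta).coerceAtLeast(0f) to height
        SizeMode.HEIGHT -> width to (height + delta).coerceAtLeast(0f)
    }
}
```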
Figure 3G shows an example where a tree has been selected for changing the height and where the two-finger input indicates a change in distance, whereby the height of the graphic indication of the detected tree's height and the detected tree's height are adjusted accordingly.
Returning to figure 4, the method may (being optional as indicated by the dashed lines) thus continue with receiving a further input 440 indicating that the other of height and width is also to be changed. This further input may be received already in the first input, in the form of an input associated with changing first one of width and height and then changing the other, possibly after an acceptance has been issued, such as releasing the two points on the touch display. The further input may also or alternatively be received as an input indicating a change of the other of width and height, as discussed above. As the further input is received, two (touch) points are again received 445, a change in distance between the two points is detected 450, and the other of width and height is adjusted accordingly 455.
In one embodiment, as in figure 3H where both the width and the height of the detected tree are indicated by the graphical indicator 220, the controller is configured to detect 430-1 a change in distance in a longitudinal direction relative to the display, and adapt 435-1 a width of the selected tree accordingly. In such an embodiment, as in figure 3G, the controller is also configured to detect 430-2 a change in distance in a vertical direction relative to the display, and adapt 435-2 a height of the selected tree accordingly, substantially simultaneously with the adaptation of the width, if the movement of the fingers indicates such a change.
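A minimal sketch of this simultaneous adaptation is shown below, decomposing the separation between the two touch points into a horizontal and a vertical component. It reuses the illustrative TouchPoint type from the earlier sketch, and the names are assumptions rather than the application's actual implementation.

```kotlin
import kotlin.math.abs

// A touch point on the display; illustrative, matching the earlier sketch.
data class TouchPoint(val x: Float, val y: Float)

// Adapt the width from the change in horizontal separation and the height from the
// change in vertical separation between the two touch points, substantially simultaneously.
fun updateWidthAndHeight(
    width: Float, height: Float,
    previousF1: TouchPoint, previousF2: TouchPoint,
    currentF1: TouchPoint, currentF2: TouchPoint
): Pair<Float, Float> {
    val horizontalDelta = abs(currentF2.x - currentF1.x) - abs(previousF2.x - previousF1.x)
    val verticalDelta = abs(currentF2.y - currentF1.y) - abs(previousF2.y - previousF1.y)
    val newWidth = (width + horizontalDelta).coerceAtLeast(0f)
    val newHeight = (height + verticalDelta).coerceAtLeast(0f)
    return newWidth to newHeight
}
```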
The size of a tree thus relates to the height of a tree, the width of a tree, or the height and width of a tree.
In one embodiment the change in size, i.e. width and/or height, equals the change in distance.
In one embodiment the change in size, i.e. width and/or height, is proportional to the change in distance, where a scaling factor is applied by the controller. This enables a more precise alignment by making the scaling factor less than 1, or a faster alignment by making the scaling factor larger than 1.
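A minimal sketch of such a proportional mapping is given below; the default factor value is an assumption chosen only for illustration.

```kotlin
// Map a change in distance between the two touch points to a change in size using a
// scaling factor: a factor below 1 gives finer, more precise adjustment, a factor
// above 1 gives faster adjustment. The default value 0.5f is an illustrative assumption.
fun scaledSizeChange(distanceChange: Float, scalingFactor: Float = 0.5f): Float =
    distanceChange * scalingFactor

// Example: a 40-unit increase in finger separation grows the size by 20 units.
// val widthDelta = scaledSizeChange(40f)  // = 20f
```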
It should be noted that even though the description herein is focused on and exemplified by receiving input through finger touch, other manners of touch input, such as stylus input, are to be considered included as alternatives for the teachings herein.
Figure 5 shows a schematic view of a computer-readable product 10 according to one embodiment of the teachings herein. The computer-readable product is configured to carry or store a computer program or computer program instructions 11 along with application related data. The computer-readable product 10 may be a data disc as in figure 5, or a Universal Serial Bus, a memory card or other commonly known computer readable products, these being examples of transitory mediums. The computer-readable product 10 may be inserted or plugged in or otherwise connected to a computer-readable product reader 12 configured to read the information, such as the program instructions 11, stored on the computer-readable product 10 and possibly execute the instructions, or to connect to a device configured to execute the instructions, such as a UE 100, as the one disclosed in figures 1A and 1B. The UE 100 may thus connect wirelessly or through a wired connection to a computer-readable product reader 12 (this being an example of a non-transitory medium) to receive the computer instructions 11. The UE 100 may in one embodiment comprise the computer-readable product reader 12 to receive the computer instructions 11.
In this manner a smartphone of standardized model may be upgraded to incorporate the teachings herein, by loading the computer program instructions into the controller (and/or memory) of the smartphone (or other UE) and causing the controller to execute the computer program instructions.
Claims (11)
1. A user equipment (100) for forest inventory management comprising a touch display (110) and a controller (CPU) and further comprising or being arranged to be connected to a camera (130), wherein the controller is configured to:
receive a video recording of a forest area from the camera (130);
display the video recording of the forest area on the touch display (110);
detect tree stems (210) in the forest area by performing image analysis on the video recording of the forest area to provide estimations of locations and widths of the detected tree stems (210);
display a graphical indication (220) of an object on the touch display (110), the object being a detected tree stem (210), wherein the graphical indication (220) is based on the estimations and overlaying the video recording of the forest area;
receive an input indicating a selection of the graphical indication (220) of the object in order to change the size of the graphical indication (220) of the object, wherein the change in the size is at least a change in width of the graphical indication (220) of the object;
receive two touch points (F1, F2) through said touch display (110);
detect a change in distance between the two touch points; and
change the width of the graphical indication (220) of the object according to the change in distance between the two touch points, indicated by two diverging lines of the graphical indication (220).
2. The user equipment (100) of claim 1, wherein the controller is further configured to determine the change in distance as being an absolute change in any direction.
3. The user equipment (100) of claim 1 or 2, wherein the controller is further configured to determine that the input indicating a change in size of the graphical indication (220) of the object indicates a simultaneous change in two sizes, wherein a change in distance in one direction indicates a change of a first size and a change in distance in a second direction indicates a change of a second size.
4. The user equipment (100) of any previous claim, wherein the controller is further configured to:
receive a further input indicating a change in a second size of the graphical indication (220) of the object;
receive two touch points (F1, F2) through said touch display (110);
detect a second change in distance between the two touch points; and
change the second size of the graphical indication (220) of the object according to the change in distance between the two touch points.
5. The user equipment (100) of any previous claim, wherein the controller is further configured to receive the input indicating a change in size of the graphical indication (220) of the object at a first location and to receive the two touch points at a second location, whereby the two touch points are remote from the object.
6. The user equipment (100) of any of claims 1 to 5, wherein the controller is further configured to receive the input indicating a change in size of the graphical indication (220) of the object as comprised in receiving the two touch points.
7. The user equipment (100) of any previous claim, wherein the graphical indication (220) of the object indicates the size of the object.
8. The user equipment (100) of any previous claim, wherein the user equipment (100) is a smartphone or an internet tablet.
9. The user equipment (100) of any previous claim, wherein the user equipment (100) is configured to be used to survey a forest area using image analysis.
10. A method for use in a user equipment (100) for forest inventory management comprising a controller, a touch display (110) and further comprising or being arranged to be connected to a camera (130), wherein the method comprises:
receiving a video recording of a forest area from the camera (130);
displaying the video recording of the forest area on the touch display (110);
detecting tree stems (210) in the forest area by performing image analysis on the video recording of the forest area to provide estimations of locations and widths of the detected tree stems (210);
displaying a graphical indication (220) of an object on the touch display (110), the object being a detected tree stem (210), wherein the graphical indication (220) is based on the estimations and overlaying the video recording of the forest area;
receiving an input indicating a selection of the graphical indication (220) of the object in order to change the size of the graphical indication (220) of the object, wherein the change in the size is at least a change in width of the graphical indication (220) of the object;
receiving two touch points (F1, F2) through said touch display (110);
detecting a change in distance between the two touch points; and
changing the width of the graphical indication (220) of the object according to the change in distance between the two touch points, indicated by two diverging lines of the graphical indication (220).

11. A computer-readable medium (10) comprising computer readable instructions (11) that when loaded into a controller cause the controller to execute the method according to claim 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1730065A SE544511C2 (en) | 2017-03-15 | 2017-03-15 | Improved manner of adapting size of an object such as a detected tree |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1730065A SE544511C2 (en) | 2017-03-15 | 2017-03-15 | Improved manner of adapting size of an object such as a detected tree |
Publications (2)
Publication Number | Publication Date |
---|---|
SE1730065A1 SE1730065A1 (en) | 2018-09-16 |
SE544511C2 true SE544511C2 (en) | 2022-06-28 |
Family
ID=63792061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
SE1730065A SE544511C2 (en) | 2017-03-15 | 2017-03-15 | Improved manner of adapting size of an object such as a detected tree |
Country Status (1)
Country | Link |
---|---|
SE (1) | SE544511C2 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100205219A1 (en) * | 2009-09-30 | 2010-08-12 | Adam Robert Rousselle | Method and system for locating a stem of a target tree |
US20110041098A1 (en) * | 2009-08-14 | 2011-02-17 | James Thomas Kajiya | Manipulation of 3-dimensional graphical objects or view in a multi-touch display |
WO2013060085A1 (en) * | 2011-10-27 | 2013-05-02 | The Hong Kong University Of Science And Technology | System and method for constrained manipulations of 3d objects by multitouch inputs |
-
2017
- 2017-03-15 SE SE1730065A patent/SE544511C2/en unknown
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110041098A1 (en) * | 2009-08-14 | 2011-02-17 | James Thomas Kajiya | Manipulation of 3-dimensional graphical objects or view in a multi-touch display |
US20100205219A1 (en) * | 2009-09-30 | 2010-08-12 | Adam Robert Rousselle | Method and system for locating a stem of a target tree |
WO2013060085A1 (en) * | 2011-10-27 | 2013-05-02 | The Hong Kong University Of Science And Technology | System and method for constrained manipulations of 3d objects by multitouch inputs |
Non-Patent Citations (1)
Title |
---|
Pfeuffer, K., Alexander, J., Gellersen, H., "Gaze+touch vs. Touch: What's the Trade-off When Using Gaze to Extend Touch to Remote Displays?", Lecture Notes in Computer Science, 2015-08-30. doi:10.1007/978-3-319-22668-2_27 * |
Also Published As
Publication number | Publication date |
---|---|
SE1730065A1 (en) | 2018-09-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10019074B2 (en) | Touchless input | |
AU2014382730B2 (en) | Method and device for detecting a touch between a first object and a second object | |
US9678639B2 (en) | Virtual mouse for a touch screen device | |
KR102165444B1 (en) | Apparatus and Method for Portable Device displaying Augmented Reality image | |
US10318152B2 (en) | Modifying key size on a touch screen based on fingertip location | |
US20050088409A1 (en) | Method of providing a display for a gui | |
KR102237363B1 (en) | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element | |
JP2013186540A (en) | Information processing apparatus and information processing method | |
JP2019518259A5 (en) | ||
KR101749070B1 (en) | Apparatus and method for assessing user interface | |
KR102224932B1 (en) | Apparatus for processing user input using vision sensor and method thereof | |
CN105739835A (en) | Setting a parameter | |
CN106406572A (en) | Cursor control method and device | |
RU2018115965A (en) | AIRCRAFT VERIFICATION SYSTEM WITH VISUALIZATION AND RECORDING | |
JP2014524170A5 (en) | ||
US8826192B1 (en) | Graphical method of inputting parameter ranges | |
SE544511C2 (en) | Improved manner of adapting size of an object such as a detected tree | |
CN110799916B (en) | Method and system for facilitating user navigation among multiple operator workstation screens | |
CN105739837A (en) | Setting a parameter | |
JP7544484B2 (en) | Graphical user interface for indicating off-screen points of interest - Patents.com | |
US20160188175A1 (en) | Selection of a graphical element | |
US20150042621A1 (en) | Method and apparatus for controlling 3d object | |
WO2018161421A1 (en) | Performance test method and performance test apparatus for touch display screen of terminal device | |
WO2023062792A1 (en) | Image processing device, image processing method, and storage medium | |
US11243663B2 (en) | Method for operating an information device |