US20180039400A1 - Multichannel controller - Google Patents

Multichannel controller

Info

Publication number
US20180039400A1
Authority
US
United States
Prior art keywords
control mode
icon
user
user interface
control
Prior art date
Legal status
Abandoned
Application number
US15/785,910
Inventor
Brian T. Pettey
Christopher L. Holt
Current Assignee
Robotzone LLC
Original Assignee
Robotzone LLC
Priority date
Filing date
Publication date
Application filed by Robotzone LLC filed Critical Robotzone LLC
Priority to US 15/785,910
Publication of US20180039400A1
Current legal status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0485 - Scrolling or panning
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

A non-transitory computer-readable storage medium having computer executable instructions is presented. When executed, the instructions cause a computing device to execute operations comprising transitioning a displayed user interface presented on a display of the computing device from a first control mode for controlling a moveable device to a second control mode for controlling the moveable device. The user interface comprises a control mode selector and a functionality limitation setting mechanism. The control mode selector is configured to display a plurality of control mode selector icons, each corresponding to one of a plurality of control modes. The plurality of control mode selector icons includes a first control mode selector icon, corresponding to the first control mode, and a second control mode selector icon, corresponding to the second control mode. The control mode selector is configured to receive a user input selection of the second control mode selector icon. The control mode selector is configured to display the second control mode by displaying a user-controllable icon corresponding to the second control mode in a main portion of the user interface. The functionality limitation setting mechanism is configured to receive a user input selection of a functionality limitation applied to the second control mode. The user input is received through a touch screen interface of the computing device. The functionality limitation comprises a position lock function that is configured to, when actuated, lock a position of the controlled moveable device with respect to a selected axis of motion.

Description

    REFERENCE TO RELATED CASE
  • The present application is a continuation of and claims priority to U.S. application Ser. No. 14/303,894, filed Jun. 13, 2014, which is a continuation of U.S. application Ser. No. 13/083,912, filed Apr. 11, 2011, now U.S. Pat. No. 8,791,911, which is based on and claims the benefit of U.S. provisional application Ser. No. 61/441,113, filed Feb. 9, 2011, the contents of which are hereby incorporated by reference in their entirety.
  • BACKGROUND
  • Multichannel controllers are commonly used to control a wide variety of systems. For example, a multichannel controller can be used to control a pan and tilt camera system. In such a case, one channel of the multichannel controller may be used to control pan motion of the pan and tilt camera system, and another channel of the multichannel controller may be used to control tilt motion of the pan and tilt camera system. One method of providing multichannel control has included using controllers with physical joysticks. Positioning of the physical joysticks causes signals to be sent to the system being controlled.
  • SUMMARY
  • A non-transitory computer-readable storage medium having computer executable instructions is presented. When executed, the instructions cause a computing device to execute operations comprising transitioning a displayed user interface presented on a display of the computing device from a first control mode for controlling a moveable device to a second control mode for controlling the moveable device. The user interface comprises a control mode selector and a functionality limitation setting mechanism. The control mode selector is configured to display a plurality of control mode selector icons, each corresponding to one of a plurality of control modes. The plurality of control mode selector icons includes a first control mode selector icon, corresponding to the first control mode, and a second control mode selector icon, corresponding to the second control mode. The control mode selector is configured to receive a user input selection of the second control mode selector icon. The control mode selector is configured to display the second control mode by displaying a user-controllable icon corresponding to the second control mode in a main portion of the user interface. The functionality limitation setting mechanism is configured to receive a user input selection of a functionality limitation applied to the second control mode. The user input is received through a touch screen interface of the computing device. The functionality limitation comprises a position lock function that is configured to, when actuated, lock a position of the controlled moveable device with respect to a selected axis of motion.
  • These and various other features and advantages that characterize the claimed embodiments will become apparent upon reading the following detailed description and upon reviewing the associated drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a multichannel controller implemented using a handheld device.
  • FIG. 2 is a control mode selector user interface of a multichannel controller.
  • FIG. 3 is a touchpad control mode user interface of a multichannel controller.
  • FIG. 4 is a joystick control mode user interface of a multichannel controller.
  • FIG. 5 is a trackball control mode user interface of a multichannel controller.
  • FIG. 6 is a touchpad/sliders combination control mode user interface of a multichannel controller.
  • FIG. 7 is a touchpad/wheels combination control mode user interface of a multichannel controller.
  • FIG. 8 is a joystick/wheels combination control mode user interface of a multichannel controller.
  • FIG. 9 is a trackball/sliders combination control mode user interface of a multichannel controller.
  • FIG. 10 is an orientation selector user interface of a multichannel controller.
  • FIG. 11 is an inverted axis selector user interface of a multichannel controller.
  • FIG. 12 is a maximum rotational speed selector user interface of a multichannel controller.
  • FIG. 13-1 is a sensitivity selector user interface of a multichannel controller.
  • FIG. 13-2 is a custom sensitivity user interface of a multichannel controller.
  • FIG. 14 is a position lock user interface of a multichannel controller.
  • FIG. 15 is a rotation lock user interface of a multichannel controller.
  • FIG. 16 is an accelerometer selector user interface of a multichannel controller.
  • FIGS. 17-1, 17-2, 17-3, and 17-4 are user interfaces of a multichannel controller associated with managing user profiles.
  • FIGS. 18-1, 18-2, 18-3, and 18-4 are user interfaces of a multichannel controller associated with managing predefined motions.
  • FIG. 19 is an operating environment of a multichannel controller.
  • DETAILED DESCRIPTION
  • Embodiments of the present disclosure include multichannel controllers. In certain embodiments, multichannel controllers are used to control motion of pan and tilt camera systems. Embodiments are not, however, limited to any particular setting. Those skilled in the art will appreciate that although some embodiments are described in the context of pan and tilt camera systems, embodiments are not limited to pan and tilt systems and can be used in other settings. Additionally, the present disclosure presents several examples of user interfaces that can be used to implement multichannel controllers. Those skilled in the art will appreciate that embodiments are not limited to the specific user interfaces shown in the figures and may include any combination of one or more features shown in the example interfaces.
  • Embodiments of multichannel controllers are implemented using any suitable computing device. In one configuration, a controller is implemented using a smart phone such as an Android based phone or an iPhone. Alternatively, a controller can be implemented using a device designed specifically for the purpose.
  • In one embodiment, a multichannel controller sends signals to pan and tilt motors to control pan and tilt motions of a camera. In one configuration, each motor is capable of receiving a signal that indicates a direction of rotation (e.g. clockwise or counterclockwise) and a speed of rotation (e.g. 0-100% of the maximum rotational speed of the motor).
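  • As an illustration of the command model described above (a direction of rotation plus a speed expressed as a percentage of the motor's maximum rate), the following Python sketch shows one way such a command could be represented and clamped before transmission. The MotorCommand class, the field names, and the signed-speed convention are illustrative assumptions, not the patent's actual protocol.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MotorCommand:
    """Hypothetical command for a single pan or tilt motor."""
    channel: str        # "pan" or "tilt"
    clockwise: bool     # direction of rotation
    speed_pct: float    # 0-100% of the motor's maximum rotational speed

def make_command(channel: str, signed_speed: float) -> MotorCommand:
    """Map a signed speed (-100..100) to a direction plus unsigned percentage."""
    clamped = max(-100.0, min(100.0, signed_speed))
    return MotorCommand(channel=channel,
                        clockwise=clamped >= 0,
                        speed_pct=abs(clamped))

# Example: pan clockwise at 40% of maximum speed, tilt counterclockwise at 15%.
print(make_command("pan", 40.0))
print(make_command("tilt", -15.0))
```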
  • FIG. 1 shows a handheld device 100 that is used to implement a multichannel controller. Handheld device 100 includes a touchscreen 102 that displays user interfaces of the controller. Each of the user interfaces includes a main portion 104 and an icons portion 106 (e.g. a scrollable icons taskbar). Icons portion 106 includes icons 108 that are used to configure various control modes and settings of a multichannel device. As will be described in greater detail below, the application may include more icons 108 than can be shown in icons portion 106. In such a case, a user can scroll the icons to the left or right to view additional icons. For instance, in the example shown in FIG. 1, only five icons 108 are shown in icons portion 106. A user can view icons to the left of the five icons 108 by touching any part of icons portion 106 and moving it to the right. Similarly, a user can view icons to the right of the five icons 108 by touching any part of icons portion 106 and moving it to the left. The left and right motion capability of icons portion 106 is represented by arrow 110.
  • One of the icons 108 is a Control Mode Selector icon. Upon the Control Mode Selector icon being selected (e.g. by being touched), a Control Mode Selector interface is displayed in the main portion 104 of the user interface. FIG. 2 shows one example of a Control Mode Selector interface 200. Interface 200 includes a touchpad icon 202, a joystick icon 204, a trackball icon 206, a touchpad/slider icon 208, a touchpad/wheels icon 210, a joystick/wheels icon 212, and a trackball/sliders icon 214. Selection of one of icons 202, 204, 206, 208, 210, 212, or 214 puts the controller into the corresponding control mode. An optional confirmation step may be implemented after selection of one of the icons. For instance, upon joystick icon 204 being selected, a window may be displayed that states "Do you want to enter into the Joystick Control Mode? Yes/No." The user can select "Yes" to enter the joystick control mode, or select "No" to return to the Control Mode Selector interface 200.
  • FIG. 3 shows an example of a controller in a touchpad control mode (e.g. after selecting touchpad icon 202 in FIG. 2). In the touchpad control mode, main portion 104 of the user interface is one solid color (e.g. white). Alternatively, main portion 104 may receive and display video from the camera in the pan and tilt system. A user is able to control the motors of the pan and tilt system by making touch gestures in the main portion 104. Left-to-right and right-to-left touch gestures send signals to the pan motor to rotate, and up-to-down and down-to-up touch gestures send signals to the tilt motor to rotate. The rotational speed of the motors is dependent upon the speed of the touch gesture. For instance, a quick touch gesture sends a signal to the motor to rotate quickly, and a slow touch gesture sends a signal to the motor to rotate slowly. If a touch gesture includes a combination of up/down and left/right motions (e.g. a diagonal touch gesture), signals are sent to both the pan and tilt motors. In an embodiment, the controller is also able to control a zoom (e.g. optical or digital magnification of the camera). For instance, a user can make touch gestures associated with making an object larger to zoom in on an object, and can make touch gestures associated with making an object smaller to zoom out on an object. Additionally, a user can set the control mode of the controller to moving object track control mode. For instance, a user can select a moving object being displayed in main portion 104, and the controller controls the camera to keep the moving object within the camera's field of view.
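  • A minimal Python sketch of the touchpad mapping described above converts the displacement and duration of a touch gesture into pan and tilt speed percentages, so that fast gestures produce fast rotation and diagonal gestures drive both channels. The gain, the clamping range, and the function name are assumptions for illustration only.

```python
def touch_gesture_to_speeds(dx_px: float, dy_px: float, duration_s: float,
                            gain: float = 0.2, max_pct: float = 100.0):
    """Convert a touch gesture (pixel deltas over a duration) into signed
    pan/tilt speed percentages. A quick gesture yields a fast rotation,
    a slow gesture a slow one; a diagonal gesture drives both channels."""
    if duration_s <= 0:
        return 0.0, 0.0
    vx = dx_px / duration_s          # horizontal gesture speed (px/s) -> pan
    vy = dy_px / duration_s          # vertical gesture speed (px/s) -> tilt
    clamp = lambda v: max(-max_pct, min(max_pct, v))
    return clamp(vx * gain), clamp(vy * gain)

# A quick diagonal swipe: both pan and tilt receive non-zero speeds.
print(touch_gesture_to_speeds(dx_px=300, dy_px=-150, duration_s=0.25))
```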
  • FIG. 4 shows an example of a controller in a joystick control mode (e.g. after selecting joystick icon 204 in FIG. 2). In the joystick control mode, main portion 104 of the user interface includes a joystick icon 402. A user is able to move the position of the joystick icon 402 by touching the icon and moving it in any direction (e.g. up, down, left, right, diagonally). The joystick icon 402 returns to its original position in the middle of main portion 104 when the user releases touch of the icon. In the joystick control mode, any left or right movement of the joystick icon sends a signal to rotate the pan motor, and any up or down movement of the joystick sends a signal to rotate the tilt motor. A combination of left/right and up/down movements (e.g. moving the joystick icon diagonally) sends signals to both the pan and tilt motors. The rotational speed of the motors is dependent upon the distance the joystick icon 402 is moved from its center/home position. For instance, moving the joystick icon 402 a small distance from its home position causes slow rotation, while moving the joystick icon 402 a greater distance from its home position causes faster rotation. In the joystick control mode, main portion 104 of the user interface is one solid color (e.g. white). Alternatively, main portion 104 may receive and display video from the camera in the pan and tilt system. In such a case, joystick icon 402 may be presented in transparent or translucent graphics such that a user can see video from the camera behind the joystick icon.
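  • The joystick mapping can be sketched as a linear scaling of the icon's displacement from its home position into a speed percentage, as in the example below. The maximum radius value and the linear scaling are illustrative assumptions.

```python
import math

def joystick_to_speeds(x_px: float, y_px: float, max_radius_px: float = 150.0):
    """Map the joystick icon's offset from its home position (in pixels)
    to signed pan/tilt speed percentages. Larger displacement -> faster
    rotation; releasing the icon (offset 0, 0) stops both motors."""
    r = math.hypot(x_px, y_px)
    if r == 0:
        return 0.0, 0.0
    scale = min(r, max_radius_px) / max_radius_px * 100.0   # 0-100%
    return scale * (x_px / r), scale * (y_px / r)            # pan, tilt

print(joystick_to_speeds(75.0, 0.0))     # half deflection to the right: pan only
print(joystick_to_speeds(100.0, 100.0))  # diagonal: both pan and tilt
```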
  • FIG. 5 shows an example of a controller in a trackball control mode (e.g. after selecting trackball icon 206 in FIG. 2). In the trackball control mode, main portion 104 of the user interface includes a trackball icon 502. A user is able to simulate rotating trackball icon 502 by touching the icon and moving it in any direction (e.g. up, down, left, right, diagonally). The trackball icon 502 has simulated momentum. For instance, the trackball icon 502 will continue "rotating" for a brief period of time after the user has released touch of the icon 502. In the trackball control mode, any left or right rotation of the trackball icon sends a signal to rotate the pan motor, and any up or down rotation of the trackball sends a signal to rotate the tilt motor. A combination of up/down and left/right rotations (e.g. rotating the trackball diagonally) sends signals to both the pan and tilt motors. The rotational speed of the motors is dependent upon the rotational speed of the trackball. Fast rotation of the trackball icon 502 sends signals to the motors to rotate quickly, and slow rotation of the trackball icon 502 sends signals to the motors to rotate slowly. In the trackball control mode, main portion 104 of the user interface is one solid color (e.g. white). Alternatively, main portion 104 may receive and display video from the camera in the pan and tilt system. In such a case, trackball icon 502 may be presented in transparent or translucent graphics such that a user can see video from the camera behind the trackball icon.
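  • The trackball's simulated momentum can be sketched as an exponentially decaying velocity that keeps emitting commands briefly after the finger lifts, as below. The decay constant, time step, and stop threshold are illustrative assumptions.

```python
import math

def trackball_coast(vx_pct: float, vy_pct: float,
                    decay_per_s: float = 3.0, dt_s: float = 0.05,
                    stop_below_pct: float = 1.0):
    """Yield (pan, tilt) speed percentages after touch release, decaying the
    trackball's last velocity toward zero so the motors coast to a stop."""
    factor = math.exp(-decay_per_s * dt_s)   # per-step exponential decay
    while abs(vx_pct) >= stop_below_pct or abs(vy_pct) >= stop_below_pct:
        yield vx_pct, vy_pct
        vx_pct *= factor
        vy_pct *= factor
    yield 0.0, 0.0                           # final stop command

# Flick the trackball to the right at 60% and let it coast.
steps = list(trackball_coast(60.0, 0.0))
print(len(steps), steps[0], steps[-1])       # coast step count, first and last commands
```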
  • FIG. 6 shows an example of a controller in a touchpad/sliders combination control mode (e.g. after selecting touchpad/sliders icon 208 in FIG. 2). In the touchpad/sliders combination control mode, main portion 104 of the user interface includes a touchpad section 602, a pan slider section 604, and a tilt slider section 606. Touchpad section 602 functions in the same manner as the touchpad mode shown in FIG. 3. For instance, touchpad section 602 either shows a solid color or shows video from the pan and tilt system camera. A user can send signals to rotate the pan and tilt motors by making touch gestures in the touchpad section 602.
  • Pan slider section 604 includes a moveable slider icon 608 that is able to be moved left and right within slider slot 610. Tilt slider section 606 includes a moveable slider icon 612 that is able to be moved up and down within slider slot 614. Movement of the pan slider icon 608 sends signals to the pan motor, and movement of the tilt slider icon 612 sends signals to the tilt motor. The rotational speed of the motors is dependent on how far the slider icons are moved from their center/home positions. Moving a slider icon further away from its center/home position causes faster rotation than a smaller move away from the center/home position. Additionally, similar to the joystick icon in the joystick mode, slider icons 608 and 612 move back to their center/home positions when touch is released.
  • FIG. 7 shows an example of a controller in a touchpad/wheels combination control mode (e.g. after selecting touchpad/wheels icon 210 in FIG. 2). In the touchpad/wheels combination control mode, main portion 104 of the user interface includes a touchpad section 702, a pan wheel icon 704, and a tilt wheel icon 706. Touchpad section 702 functions similarly to the touchpad modes shown in FIGS. 3 and 6. For instance, touchpad section 702 either shows a solid color or shows video from the pan and tilt system camera. A user can send signals to rotate the pan and tilt motors by making touch gestures in touchpad section 702.
  • Pan wheel icon 704 and tilt wheel icon 706 are able to be rotated in either direction. Rotation of pan wheel icon 704 sends signals to the pan motor to rotate, and rotation of tilt wheel icon 706 sends signals to the tilt motor to rotate. The speed and direction of rotation of the wheel icons determine the speed and direction of rotation of the pan and tilt motors. Fast rotation of the wheel icons sends signals to the motors to rotate quickly, and slower rotation of the wheel icons sends signals to the motors to rotate slower. The wheel icons have momentum such that the wheel icons will continue to rotate after a user has released touch of the icons.
  • FIG. 8 shows an example of a controller in a joystick/wheels combination control mode (e.g. after selecting joystick/wheels icon 212 in FIG. 2). In the joystick/wheels combination control mode, main portion 104 of the user interface includes a joystick icon 802, a pan wheel icon 804, and a tilt wheel icon 806. Joystick icon 802 functions the same as joystick icon 402 in FIG. 4. Pan and tilt wheel icons 804 and 806 function the same as pan and tilt wheel icons 704 and 706 in FIG. 7. Similar to each of the other control mode user interfaces, in the joystick/wheels combination control mode, main portion 104 of the user interface is in one embodiment one solid color (e.g. white). Alternatively, main portion 104 may receive and display video from the camera in the pan and tilt system. In such a case, icons 802, 804, and 806 may be presented in transparent or translucent graphics such that a user can see video from the camera behind the icons.
  • FIG. 9 shows an example of a controller in a trackball/sliders combination control mode (e.g. after selecting trackball/sliders icon 214 in FIG. 2). In the trackball/sliders combination control mode, main portion 104 of the user interface includes a trackball icon 902, a pan slider icon 904, and a tilt slider icon 906. Trackball icon 902 functions the same as trackball icon 502 in FIG. 5. Pan and tilt slider icons 904 and 906 function the same as pan and tilt slider icons 608 and 612 in FIG. 6. Again, the background of the user interface may be one solid color or may display video from the camera. In such a case, icons 902, 904, and 906 may be presented in transparent or translucent graphics such that a user can see camera video behind the icons.
  • Another one of the icons 108 in icons portion 106 in FIG. 1 is an Orientation Selector icon. FIG. 10 shows an example of an Orientation Selector user interface 1002 that is displayed after the Orientation Selector icon is selected. Interface 1002 includes four icons 1004, 1006, 1008, and 1010 that represent the four orientations that the controller may be positioned in (e.g. device bottom down, bottom to the right, bottom up, and bottom to the left). Selection of one of icons 1004, 1006, 1008, or 1010 determines the orientation in which the controller presents the user interfaces.
  • FIG. 11 shows an example of an Inverted Axis Selector user interface 1102 that is displayed after an Inverted Axis Selector icon 108 is selected from icons portion 106 in FIG. 1. Interface 1102 includes a pan icon 1104 that can be toggled between on and off positions to invert control of the pan axis, and a tilt icon 1106 that can be toggled between on and off positions to invert control of the tilt axis. Toggling either icon 1104 or 1106 causes the direction of rotation for the axis to be reversed.
  • FIG. 12 shows an example of a Maximum Rotational Speed Selector user interface 1202 that is displayed after a Maximum Rotational Speed Selector icon 108 is selected from icons portion 106 in FIG. 1. Interface 1202 includes four icons 1204, 1206, 1208, and 1210 that can be set between 0 and 100% to set the maximum rotational speed of the motors.
  • FIG. 13-1 shows an example of a Sensitivity Selector user interface 1302 that is displayed after a Sensitivity Selector icon 108 is selected from icons portion 106 in FIG. 1. Interface 1302 includes a pan axis portion 1306 and a tilt axis portion 1308. Each portion 1306 and 1308 includes three radio buttons. A user can set the sensitivity of each axis to linear, non-linear, or custom. Additionally, a user can select edit buttons 1310 and 1312 to edit the customized sensitivity. FIG. 13-2 shows an example of a Custom Sensitivity user interface 1322 that is displayed after one of the edit buttons 1310 or 1312 is selected. User interface 1322 includes a user editable sensitivity response line 1324. A user can move response line 1324 up and down along the entire length of the line to set a custom sensitivity response. User interface 1322 includes a cancel button 1326 and a save button 1328. A user can press the cancel button 1326 to undo any changes to response line 1324 and return to the previous screen, and a user can press the save button 1328 to save changes to response line 1324 and return to the previous screen.
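  • To illustrate the linear, non-linear, and custom sensitivity options described above, the sketch below applies a response curve to a normalized input before it is scaled into a speed command. The squared curve for the non-linear mode and the piecewise-linear representation of the user-drawn response line are assumptions for illustration.

```python
import bisect

def apply_sensitivity(value: float, mode: str = "linear",
                      custom_points=((0.0, 0.0), (0.5, 0.2), (1.0, 1.0))):
    """Shape a normalized input in [-1, 1] according to the selected
    sensitivity mode and return a shaped value in [-1, 1]."""
    sign = -1.0 if value < 0 else 1.0
    x = min(abs(value), 1.0)
    if mode == "linear":
        y = x
    elif mode == "non-linear":
        y = x ** 2                       # gentle near center, steep near the edge
    elif mode == "custom":               # piecewise-linear user-drawn response line
        xs = [p[0] for p in custom_points]
        ys = [p[1] for p in custom_points]
        i = min(max(1, bisect.bisect_left(xs, x)), len(xs) - 1)
        t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
        y = ys[i - 1] + t * (ys[i] - ys[i - 1])
    else:
        raise ValueError(f"unknown sensitivity mode: {mode}")
    return sign * y

print(apply_sensitivity(0.5, "linear"))      # 0.5
print(apply_sensitivity(0.5, "non-linear"))  # 0.25
print(apply_sensitivity(0.75, "custom"))     # interpolated from the drawn curve
```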
  • FIG. 14 shows an example of a Position Lock user interface 1402 that is displayed after a Position Lock icon 108 is selected from icons portion 106 in FIG. 1. Interface 1402 includes a pan position lock icon 1404 and a tilt position lock icon 1406. Toggling either icon 1404 or 1406 from the off to the on position locks the corresponding motor at its current position.
  • FIG. 15 shows an example of a Rotation Lock user interface 1502 that is displayed after a Rotation Lock icon 108 is selected from icons portion 106 in FIG. 1. Interface 1502 includes a pan axis portion 1504 and a tilt axis portion 1506. Each axis portion includes an icon to toggle the rotation lock from the off to the on position. Each axis portion also includes a radio button to indicate the direction of rotation and a speed selector to select from 0 to 100% of the maximum rotation speed.
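  • As a brief sketch of how the position lock and rotation lock settings described in FIGS. 14 and 15 might gate the outgoing commands, the example below ignores user input on a position-locked axis and emits a constant direction and speed on a rotation-locked axis. The filtering function and the setting names are assumptions, not the patent's implementation.

```python
def apply_locks(pan_pct: float, tilt_pct: float, settings: dict):
    """Filter user-derived speed commands through hypothetical lock settings.
    Example settings:
      {"pan_position_lock": False, "tilt_position_lock": True,
       "pan_rotation_lock": (True, 25.0),   # (clockwise, % of max speed)
       "tilt_rotation_lock": None}
    """
    def for_axis(user_pct, pos_lock, rot_lock):
        if pos_lock:
            return 0.0                         # hold position: ignore user input
        if rot_lock is not None:
            clockwise, pct = rot_lock
            return pct if clockwise else -pct  # constant commanded rotation
        return user_pct
    return (for_axis(pan_pct, settings.get("pan_position_lock", False),
                     settings.get("pan_rotation_lock")),
            for_axis(tilt_pct, settings.get("tilt_position_lock", False),
                     settings.get("tilt_rotation_lock")))

print(apply_locks(40.0, -20.0, {"tilt_position_lock": True}))          # (40.0, 0.0)
print(apply_locks(40.0, -20.0, {"pan_rotation_lock": (False, 30.0)}))  # (-30.0, -20.0)
```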
  • FIG. 16 shows an example of an Accelerometer Selector user interface 1602 that is displayed after an Accelerometer Selector icon 108 is selected from icons portion 106 in FIG. 1. Interface 1602 includes a pan icon 1604 and a tilt icon 1606. Toggling pan icon 1604 from the off to the on position causes rotation of the pan motor to be controlled by accelerometer feedback, and toggling icon 1606 from the off to the on position causes rotation of the tilt motor to be controlled by accelerometer feedback. In accelerometer control mode, the rotational speed of the motors is dependent upon the angle of the controller from a center/home position.
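  • A sketch of the accelerometer mode described above, assuming the handset's tilt angle away from a home orientation is scaled linearly into a speed percentage for one axis. The dead band and the angle at which full speed is reached are illustrative assumptions.

```python
def accel_angle_to_speed(angle_deg: float, home_deg: float = 0.0,
                         dead_band_deg: float = 2.0, full_speed_deg: float = 30.0):
    """Map the controller's tilt angle away from its home orientation to a
    signed speed percentage for one axis (pan or tilt)."""
    delta = angle_deg - home_deg
    if abs(delta) < dead_band_deg:               # ignore small hand tremor
        return 0.0
    pct = (abs(delta) - dead_band_deg) / (full_speed_deg - dead_band_deg) * 100.0
    return max(-100.0, min(100.0, pct)) * (1 if delta > 0 else -1)

print(accel_angle_to_speed(15.0))    # moderate tilt -> moderate speed
print(accel_angle_to_speed(-45.0))   # past the full-speed angle -> clamped at -100%
```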
  • FIGS. 17-1, 17-2, 17-3, and 17-4 show examples of user interfaces associated with managing user profiles. FIG. 17-1 shows a Manage Profiles user interface 1702 that is displayed after a Manage Profile icon 108 is selected from icons portion 106 in FIG. 1. Interface 1702 includes a save profile icon 1704, a load profile icon 1706, and a delete profile icon 1708. Selection of save profile icon 1704 causes user interface 1712 in FIG. 17-2 to be displayed. In interface 1712, a user can save the current controller settings as a new profile by selecting the yes radio button 1714, or save the current controller settings to an existing profile by selecting the yes radio button 1716. If the save as a new profile button 1714 is selected, the user is presented with a screen that enables the user to type in a name for the new profile. If the save to an existing profile button 1716 is selected, the user is presented with a screen that enables the user to select one of the previously saved profiles.
  • FIG. 17-3 shows a Load Saved Profile user interface 1722 that is displayed upon a user selecting the load profile icon 1706 in FIG. 17-1. Interface 1722 shows icons 1724 that represent the previously saved profiles. Selection of one of icons 1724 loads the controller with the previously saved settings. A confirmation step is optionally displayed prior to changing the controller settings.
  • FIG. 17-4 shows a Delete Saved Profile user interface 1732 that is displayed upon a user selecting the delete profile icon 1708 in FIG. 17-1. Interface 1732 shows icons 1734 that represent the previously saved profiles. Selection of one of icons 1734 deletes the previously saved profile. A confirmation step is optionally displayed prior to deleting the selected profile.
  • FIGS. 18-1, 18-2, 18-3, and 18-4 show examples of user interfaces associated with managing predefined motions. FIG. 18-1 shows a Manage Motions user interface 1802 that is displayed after a Manage Motion icon 108 is selected from icons portion 106 in FIG. 1. Interface 1802 includes a record motion icon 1804, a perform saved motion icon 1806, and a delete saved motion icon 1808. Selection of record motion icon 1804 causes user interface 1812 in FIG. 18-2 to be displayed. In interface 1812, a user can record a motion by toggling icon 1814 to the on position, and a user can enter a name for the recorded motion by selecting icon 1816. In one embodiment, a user is able to record a motion by drawing a shape on the user interface. For instance, a user can draw and record a spiral motion, a circle, a line, a rectangle, or any other shape. The controller illustratively records both the shape of the drawing and the speed at which each portion of the shape is drawn. The controller translates the shape and speed into control commands for the multichannel device. For example, slowly drawn portions cause the multichannel device to rotate more slowly, and quickly drawn portions cause the multichannel device to rotate more quickly.
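  • The recorded-motion feature described above can be sketched as storing a time-stamped trace of touch points and replaying it as a stream of speed commands, so that slowly drawn portions replay slowly and quickly drawn portions replay quickly. The sampling representation and the gain below are assumptions, not the patent's storage format.

```python
def record_trace(samples):
    """samples: list of (t_seconds, x_px, y_px) captured while the user draws.
    Returns (duration_s, pan_speed_px_s, tilt_speed_px_s) segments that preserve
    both the shape and the speed at which each portion was drawn."""
    segments = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt > 0:
            segments.append((dt, (x1 - x0) / dt, (y1 - y0) / dt))
    return segments

def replay(segments, gain: float = 0.2):
    """Yield (duration_s, pan_pct, tilt_pct) commands for the saved motion."""
    clamp = lambda v: max(-100.0, min(100.0, v))
    for dt, vx, vy in segments:
        yield dt, clamp(vx * gain), clamp(vy * gain)

# A short diagonal stroke drawn faster toward the end.
trace = [(0.0, 0, 0), (0.5, 50, 20), (0.7, 150, 60)]
for cmd in replay(record_trace(trace)):
    print(cmd)
```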
  • FIG. 18-3 shows a Perform Saved Motion user interface 1822 that is displayed after the Perform Saved Motion icon 1806 in FIG. 18-1 is selected. Interface 1822 includes icons 1824 that correspond to previously recorded/saved motions. Selection of one of icons 1824 causes the motors to perform the previously recorded/saved motion.
  • FIG. 18-4 shows a Delete Saved Motion user interface 1832 that is displayed after the Delete Saved Motion icon 1808 in FIG. 18-1 is selected. Interface 1832 includes icons 1834 that correspond to previously recorded/saved motions. Selection of one of icons 1834 causes the selected motion to be deleted. A confirmation step is optionally displayed prior to deleting the motion.
  • FIG. 19 shows one illustrative operating environment of a multichannel controller 1902. Multichannel controller 1902 illustratively includes a touchscreen 1904, input keys 1906, a controller/processor 1908, memory 1910, a communications module/communications interface 1912, and a housing/case 1914. Touchscreen 1904 illustratively includes any type of single touch or multitouch screen (e.g. capacitive touchscreen, vision based touchscreen, etc.). Touchscreen 1904 is able to detect a user's finger, stylus, etc. contacting touchscreen 1904 and generate input data (e.g. x and y coordinates) based on the detected contact. Input keys 1906 include buttons or other mechanical devices that a user is able to press or otherwise actuate to input data. For instance, input keys 1906 may include a home button, a back button, 0-9 number keys, a QWERTY keyboard, etc. Memory 1910 includes volatile, non-volatile or a combination of volatile and non-volatile memory. Memory 1910 may be implemented using more than one type of memory. For example, memory 1910 may include any combination of flash memory, magnetic hard drives, RAM, etc. Memory 1910 stores the computer executable instructions that are used to implement the multichannel controllers. Memory 1910 also stores user saved data such as programmed maneuvers or profile settings. Controller/processor 1908 can be implemented using any type of controller/processor (e.g. ASIC, RISC, ARM, etc.) that can process user inputs and the stored instructions to generate commands for controlling systems such as, but not limited to, pan and tilt camera systems. The generated commands are sent to communications module/communications interface 1912, which transmits the commands to the controlled systems. Finally with respect to multichannel controller 1902, the controller housing 1914 can be any suitable housing. In one embodiment, housing 1914 has a form factor such that controller 1902 is able to fit within a user's hand. Housing 1914 may however be larger (e.g. tablet sized) and is not limited to any particular form factor.
  • As is shown in FIG. 19, multichannel controller 1902 optionally communicates (e.g. wirelessly or over a wired connection) with a command translation unit 1920. Command translation unit 1920 converts or transforms commands received from multichannel controller 1902 into a format that can be processed by the systems being controlled. It should be noted however that not all implementations include a command translation unit 1920, and that in other embodiments, multichannel controller 1902 instead sends commands directly to the systems being controlled.
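  • As an illustration of the optional command translation step, the sketch below converts an abstract (channel, direction, speed) command into a simple serial-style text message. The message format is purely an assumption for illustration, not the format used by any particular pan and tilt system.

```python
def translate(channel: str, clockwise: bool, speed_pct: float) -> bytes:
    """Convert an abstract controller command into a hypothetical wire format
    understood by the controlled system, e.g. b'PAN,CW,040\n'."""
    direction = "CW" if clockwise else "CCW"
    speed = max(0, min(100, round(speed_pct)))   # clamp to 0-100% of max speed
    return f"{channel.upper()},{direction},{speed:03d}\n".encode("ascii")

print(translate("pan", True, 40.2))    # b'PAN,CW,040\n'
print(translate("tilt", False, 150))   # clamped: b'TILT,CCW,100\n'
```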
  • FIG. 19 shows multichannel controller 1902 controlling N systems 1932, 1934, and 1936. In an embodiment, multichannel controller 1902 can control any number of systems (e.g. 1, 2, 3, 4 etc.). A user is illustratively able to choose which system is being controlled by selecting the system from a user interface. For instance, controller 1902 illustratively includes a user interface that shows the user all of the systems that can be controlled (e.g. represents each system as a separate icon), and the user selects one of the systems. Alternatively, in certain embodiments, a user is able to control multiple systems at the same time. For example, in one embodiment, controller 1902 is able to control systems 1932, 1934, and 1936 at variable levels of autonomy (e.g. manual, semi-autonomous, or fully autonomous). Controller 1902 is able to control more than one system at a time and is able to control the systems at different levels of autonomy. One controlled system may for instance be performing a programmed maneuver or tracking an object, while another system may be in fully manual control mode.
  • In an embodiment, multichannel controller 1902 is able to control systems 1932, 1934, and 1936 in either an open loop mode or in a closed loop mode. In open loop mode, controller 1902 does not receive feedback from the controlled systems. For instance, controller 1902 does not necessarily know the position, speed, etc. of the controlled systems. However, in closed loop mode, controller 1902 does receive feedback from one or more of the controlled systems. Controller 1902 may for instance receive feedback indicating a position (e.g. angular position), speed, etc. of a pan and/or tilt motor. In such cases, controller 1902 is able to use the feedback in generating new commands for the systems. For instance, a user may wish to set a speed, position, etc. of a controlled system. Controller 1902 illustratively receives feedback from the controlled system indicating its current speed, position, etc., and the controller adjusts the command signal based on the current speed, position, etc. and based on the speed, position, etc. that is intended/desired by a user.
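  • The closed-loop behavior described above can be sketched as a simple proportional controller that compares the position reported back by a pan or tilt axis with the user's target and adjusts the next speed command accordingly. The proportional gain and the crude simulated plant below are illustrative assumptions rather than the patent's control law.

```python
def closed_loop_step(target_deg: float, feedback_deg: float,
                     kp: float = 2.0, max_pct: float = 100.0) -> float:
    """One proportional-control update: turn the position error reported back by
    the controlled system into a signed speed percentage for the next command."""
    error = target_deg - feedback_deg
    return max(-max_pct, min(max_pct, kp * error))

# Simulate driving a pan axis from 0 degrees toward a 90 degree target.
position = 0.0
for _ in range(50):
    speed_pct = closed_loop_step(90.0, position)
    position += speed_pct * 0.05      # crude plant model: the axis moves each step
print(round(position, 1))             # approaches 90 as the feedback closes the loop
```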
  • Finally, it is to be understood that even though numerous characteristics and advantages of various embodiments have been set forth in the foregoing description, together with details of the structure and function of various embodiments, this detailed description is illustrative only, and changes may be made in detail, especially in matters of structure and arrangements of parts within the principles of the present disclosure to the full extent indicated by the broad general meaning of the terms in which the appended claims are expressed. In addition, although certain embodiments described herein are directed to pan and tilt systems, it will be appreciated by those skilled in the art that the teachings of the disclosure can be applied to other types of multichannel control systems, without departing from the scope and spirit of the disclosure.

Claims (20)

What is claimed is:
1. A non-transitory computer-readable storage medium having computer executable instructions that, when executed, cause a computing device to execute operations comprising transitioning a displayed user interface presented on a display of the computing device from a first control mode for controlling a moveable device, to a second control mode for controlling the moveable device, the user interface comprising:
a control mode selector configured to:
display a plurality of control mode selector icons, each corresponding to one of a plurality of control modes, wherein the plurality of control mode selector icons includes a first control mode selector icon, corresponding to the first control mode, and a second control mode selector icon, corresponding to the second control mode;
receive a user input selection of the second control mode selector icon; and
display the second control mode by displaying a user-controllable icon corresponding to the second control mode in a main portion of the user interface; and
a functionality limitation setting mechanism configured to:
receive a user input selection of a functionality limitation applied to the second control mode, wherein the user input is received through a touch screen interface of the computing device; and
wherein the functionality limitation comprises a position lock function that is configured to, when actuated, lock a position of the controlled moveable device with respect to a selected axis of motion.
2. The user interface of claim 1, wherein the second control mode comprises a touchpad mode, wherein the touchpad mode displays, in the main portion of the user interface, a solid color or a video from the controlled moving device and enables a user to control the moveable device by making touch gestures in the main portion of the user interface.
3. The user interface of claim 1, wherein the second control mode comprises a trackball mode, wherein the trackball mode displays, in the main portion of the user interface, a user controllable trackball icon and enables a user to control the moveable device by rotating the trackball icon.
4. The user interface of claim 1, wherein the second control mode comprises a joystick mode, wherein the joystick mode displays, in the main portion of the user interface, a user controllable joystick icon and enables a user to control the moveable device by moving a position of the joystick icon.
5. The user interface of claim 2, wherein the touchpad mode also comprises a displayed slider icon, in the main portion of the user interface.
6. The user interface of claim 2, wherein the touchpad mode also comprises a displayed wheel icon, in the main portion of the user interface.
7. The user interface of claim 3, wherein the trackball mode also comprises a displayed slider icon, in the main portion of the user interface.
8. The user interface of claim 3, wherein the trackball mode also comprises a displayed wheel icon, in the main portion of the user interface.
9. The user interface of claim 1, wherein the functionality limitation setting mechanism further comprises a speed sensitivity setting mechanism for the moveable device.
10. The user interface of claim 1, wherein the functionality limitation setting mechanism further comprises a zoom setting mechanism for the moveable device.
11. A device comprising a controller coupled to one or more processors, a display, and a touchscreen interface on the display, the controller comprising:
a display configured to display a plurality of control mode selector icons, each corresponding to a control mode for a controllable device, wherein one of the plurality of control mode selector icons comprises a first control mode and a second control mode;
a user input mechanism configured to receive a user selection of the second control mode, and an indication to change a selected control mode from the first control mode to the second control mode, wherein the second control mode comprises a position lock functionality mechanism configured to lock the moveable device into a position along an axis of motion;
a signal generator configured to receive a user input selection of a functionality limitation applied to an axis of motion of a controllable device, the functionality limitation selected on the second user controllable icon such that movement of the controllable device is limited with respect to movement of the second user controllable icon on the user interface; and
a processor that is configured to transmit the control signal to a motor of the controllable device.
12. The device of claim 11, wherein the position lock functionality mechanism is selectable through the user input mechanism.
13. The device of claim 11, wherein the controller is a computing device.
14. The device of claim 11, wherein the controller is a phone.
15. The user interface of claim 11, wherein the second control mode comprises a touchpad mode, wherein the touchpad mode displays, in the main portion of the user interface, a solid color or a video from the controlled moving device and enables a user to control the moveable device by making touch gestures in the main portion of the user interface.
16. The user interface of claim 11, wherein the second control mode comprises a trackball mode, wherein the trackball mode displays, in the main portion of the user interface, a user controllable trackball icon and enables a user to control the moveable device by rotating the trackball icon.
17. The user interface of claim 11, wherein the second control mode comprises a joystick mode, wherein the joystick mode displays, in the main portion of the user interface, a user controllable joystick icon and enables a user to control the moveable device by moving a position of the joystick icon.
18. A method for controlling a moveable device, the method comprising:
displaying a plurality of control mode selector icons on a touchscreen display of a computing device, each of the plurality of control mode selector icons corresponding to a control mode of a plurality of control modes;
receiving a user input selection of one of the plurality of control mode selector icons;
displaying a control mode corresponding to the selected control mode selector icon, wherein displaying the control mode comprises displaying, in the main portion of the user interface, a user controllable icon corresponding to the control mode;
receiving a user input selection of a position on an axis of motion for the moveable device, wherein the moveable device is locked in the selected position on the axis of motion;
receiving a user input indicative of a direction and speed of movement for the moveable device on the user controllable icon;
translating the received user input into a control signal comprising the indicated direction and speed, modified by the selected position on the axis of motion; and
transmitting the control signal to the moveable device.
19. The method of claim 18, wherein the computing device is a tablet computer.
20. The method of claim 18, wherein the computing device is a phone.
US 15/785,910 | Priority date 2011-02-09 | Filing date 2017-10-17 | Multichannel controller | Abandoned | US20180039400A1 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US 15/785,910 (US20180039400A1) | 2011-02-09 | 2017-10-17 | Multichannel controller

Applications Claiming Priority (4)

Application Number | Priority Date | Filing Date | Title
US 61/441,113 (provisional) | 2011-02-09 | 2011-02-09 |
US 13/083,912 (US8791911B2) | 2011-02-09 | 2011-04-11 | Multichannel controller
US 14/303,894 (US9823825B2) | 2011-02-09 | 2014-06-13 | Multichannel controller
US 15/785,910 (US20180039400A1) | 2011-02-09 | 2017-10-17 | Multichannel controller

Related Parent Applications (1)

Application Number | Relation | Title | Priority Date | Filing Date
US 14/303,894 (US9823825B2) | Continuation | Multichannel controller | 2011-02-09 | 2014-06-13

Publications (1)

Publication Number | Publication Date
US20180039400A1 | 2018-02-08

Family

ID=46600330

Family Applications (3)

Application Number | Title | Priority Date | Filing Date
US 13/083,912 (US8791911B2, Expired - Fee Related) | Multichannel controller | 2011-02-09 | 2011-04-11
US 14/303,894 (US9823825B2, Active 2031-06-27) | Multichannel controller | 2011-02-09 | 2014-06-13
US 15/785,910 (US20180039400A1, Abandoned) | Multichannel controller | 2011-02-09 | 2017-10-17

Family Applications Before (2)

Application Number | Title | Priority Date | Filing Date
US 13/083,912 (US8791911B2, Expired - Fee Related) | Multichannel controller | 2011-02-09 | 2011-04-11
US 14/303,894 (US9823825B2, Active 2031-06-27) | Multichannel controller | 2011-02-09 | 2014-06-13

Country Status (1)

Country | Link
US (3) | US8791911B2 (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030093430A1 (en) * 2000-07-26 2003-05-15 Mottur Peter A. Methods and systems to control access to network devices
US20060250357A1 (en) * 2005-05-04 2006-11-09 Mammad Safai Mode manager for a pointing device
US20080027591A1 (en) * 2006-07-14 2008-01-31 Scott Lenser Method and system for controlling a remote vehicle
US20080084481A1 (en) * 2006-10-06 2008-04-10 The Vitec Group Plc Camera control interface
US20080126981A1 (en) * 2006-05-30 2008-05-29 Nike, Inc. Custom ordering of an article
US20110285636A1 (en) * 2010-05-20 2011-11-24 Howard John W Touch screen with virtual joystick and methods for use therewith
US20120030626A1 (en) * 2010-07-30 2012-02-02 Apple Inc. Hybrid Knob/Slider Control
US20120192078A1 (en) * 2011-01-26 2012-07-26 International Business Machines Method and system of mobile virtual desktop and virtual trackball therefor

Family Cites Families (84)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1320234A (en) 1919-10-28 Johnson
US405523A (en) 1889-06-18 Half to Edwin Churchill
US1371622A (en) 1919-10-28 1921-03-15 William F Hudson Crank-case repair-arm
US2420425A (en) 1945-10-11 1947-05-13 Christopher L Hardwick Spacing bracket for crated stoves
US3145960A (en) 1962-03-08 1964-08-25 Gen Electric Motor mounting arrangement
CA997413A (en) 1972-12-05 1976-09-21 Fujitsu Limited Control method of a DC motor
USD243929S (en) 1976-04-05 1977-04-05 Utility Hardware, Inc. Utility pole slanted insulator bracket
US4033531A (en) 1976-04-27 1977-07-05 Fred Levine Mounting assembly with selectively used one-piece or two-piece brackets
US4044978A (en) 1976-05-10 1977-08-30 Williams James F Boat motor display and work stand
DE3005093C2 (en) 1980-02-12 1985-02-14 Klein, Schanzlin & Becker Ag, 6710 Frankenthal Bracket for centrifugal pumps
USD296075S (en) 1986-05-15 1988-06-07 Jones Robert E Adjustable height mount for pump motor
ES2030103T3 (en) 1987-06-16 1992-10-16 Marposs Societa' Per Azioni Caliber to check linear measures
US5053685A (en) 1990-01-31 1991-10-01 Kensington Laboratories, Inc. High precision linear actuator
USD327518S (en) 1990-12-04 1992-06-30 Interlego, A.G. Toy construction piece
US5557154A (en) 1991-10-11 1996-09-17 Exlar Corporation Linear actuator with feedback position sensor device
USD342011S (en) 1991-11-26 1993-12-07 Maguire James V Foundation bolt mounting bracket
US6121966A (en) * 1992-11-02 2000-09-19 Apple Computer, Inc. Navigable viewing system
JPH09501333A (en) 1993-07-21 1997-02-10 Klieman, Charles H. Surgical instruments for endoscopy and surgery
SG67927A1 (en) 1993-10-20 1999-10-19 Videoconferencing Sys Inc Adaptive videoconferencing system
US5526041A (en) 1994-09-07 1996-06-11 Sensormatic Electronics Corporation Rail-based closed circuit T.V. surveillance system with automatic target acquisition
US5806402A (en) 1995-09-06 1998-09-15 Henry; Michael F. Regulated speed linear actuator
US6281930B1 (en) 1995-10-20 2001-08-28 Parkervision, Inc. System and method for controlling the field of view of a camera
US6624846B1 (en) * 1997-07-18 2003-09-23 Interval Research Corporation Visual user interface for use in controlling the interaction of a device with a spatial region
US6396961B1 (en) * 1997-11-12 2002-05-28 Sarnoff Corporation Method and apparatus for fixating a camera on a target point using image alignment
US7755668B1 (en) * 1998-04-09 2010-07-13 Johnston Gregory E Mobile surveillance system
US7270589B1 (en) 1999-05-14 2007-09-18 Carnegie Mellon University Resilient leg design for hopping running and walking machines
MY126873A (en) 2000-01-07 2006-10-31 Vasu Tech Ltd Configurable electronic controller for appliances
US6249091B1 (en) * 2000-05-08 2001-06-19 Richard S. Belliveau Selectable audio controlled parameters for multiparameter lights
US7796162B2 (en) * 2000-10-26 2010-09-14 Front Row Technologies, Llc Providing multiple synchronized camera views for broadcast from a live venue activity to remote viewers
US7782363B2 (en) * 2000-06-27 2010-08-24 Front Row Technologies, Llc Providing multiple video perspectives of activities through a data network to a remote multimedia server for selective display by remote viewing audiences
US7812856B2 (en) * 2000-10-26 2010-10-12 Front Row Technologies, Llc Providing multiple perspectives of a venue activity to electronic wireless hand held devices
US7149549B1 (en) * 2000-10-26 2006-12-12 Ortiz Luis M Providing multiple perspectives for a venue activity through an electronic hand held device
JP3632644B2 (en) 2001-10-04 2005-03-23 ヤマハ株式会社 Robot and robot motion pattern control program
US20030174242A1 (en) * 2002-03-14 2003-09-18 Creo Il. Ltd. Mobile digital camera control
JP2004077706A (en) * 2002-08-14 2004-03-11 Fuji Photo Optical Co Ltd Lens system
US20040184798A1 (en) * 2003-03-17 2004-09-23 Dumm Mark T. Camera control system and associated pan/tilt head
US7336009B2 (en) 2003-06-19 2008-02-26 Btr Robotics Limited Liability Company Hobby servo enhancements
US20050110634A1 (en) * 2003-11-20 2005-05-26 Salcedo David M. Portable security platform
US7527439B1 (en) * 2004-05-06 2009-05-05 Dumm Mark T Camera control system and associated pan/tilt head
US8453987B2 (en) 2004-06-30 2013-06-04 Robotzone, Llc Pan and tilt systems
US7934691B2 (en) 2004-06-30 2011-05-03 Robotzone Llc Pan systems
US7559129B2 (en) 2004-06-30 2009-07-14 Btr Robotics Limited Liability Company Apparatus for enhancing hobby servo performance
US20060082662A1 (en) * 2004-10-15 2006-04-20 Brian Isaacson System and process for digitizing and tracking audio, video and text information
US20060114322A1 (en) * 2004-11-30 2006-06-01 Romanowich John F Wide area surveillance system
DE102005007327B4 (en) 2005-02-17 2010-06-17 Continental Automotive Gmbh Circuit arrangement and method for operating an injector arrangement
US7501731B2 (en) 2005-03-09 2009-03-10 Btr Robotics Limited Liability Company Apparatus for enhancing hobby servo performance
US20080088089A1 (en) 2005-04-18 2008-04-17 James Carl Bliehall Electronically controlled target positioning system for training in marksmanship and target identification
US7750944B2 (en) 2005-05-02 2010-07-06 Ge Security, Inc. Methods and apparatus for camera operation
US20060248210A1 (en) * 2005-05-02 2006-11-02 Lifesize Communications, Inc. Controlling video display mode in a video conferencing system
US20060269264A1 (en) * 2005-05-10 2006-11-30 Stafford Gregory R Method, device and system for capturing digital images in a variety of settings and venues
DE112006003044T5 (en) 2005-10-21 2008-10-23 Deere & Company, Moline Versatile robot control module
JP4201025B2 (en) * 2006-06-30 2008-12-24 ソニー株式会社 Monitoring device, monitoring system, filter setting method, and monitoring program
USD571643S1 (en) 2007-02-12 2008-06-24 Trelleborg Industrial Avs Ltd. Vibration and shock absorber
US20090322866A1 (en) 2007-04-19 2009-12-31 General Electric Company Security checkpoint systems and methods
US20110248448A1 (en) 2010-04-08 2011-10-13 Bruce Hodge Method and apparatus for determining and retrieving positional information
US7891902B2 (en) 2007-06-19 2011-02-22 Robotzone, Llc Hobby servo shaft adapter
US7859151B2 (en) 2007-08-09 2010-12-28 Robotzone, Llc Hobby servo shaft attachment mechanism
US7795768B2 (en) 2007-08-09 2010-09-14 Btr Robotics Limited Liability Company Mechanisms and gears for attachment to a hobby servo output shaft
US8324773B2 (en) 2007-08-09 2012-12-04 Robotzone Llc Hobby servo shaft attachment mechanisms having textured surfaces
US7900927B1 (en) 2007-12-31 2011-03-08 James Bliehall Portable, carriage driven, moving target system for training in marksmanship and target identification
US9550130B2 (en) 2008-03-28 2017-01-24 Robotzone, Llc Kits and components for modular hobby mechanical and robotic construction
US8130275B2 (en) * 2008-06-13 2012-03-06 Nintendo Co., Ltd. Information-processing apparatus, and storage medium storing a photographing application launch program executed by information-processing apparatus
US20100045666A1 (en) * 2008-08-22 2010-02-25 Google Inc. Anchored Navigation In A Three Dimensional Environment On A Mobile Device
US8488001B2 (en) * 2008-12-10 2013-07-16 Honeywell International Inc. Semi-automatic relative calibration method for master slave camera control
EP2401525B1 (en) 2009-02-25 2018-12-05 Exlar Corporation Actuation system and method of generating a rotary output
JP5453953B2 (en) * 2009-06-24 2014-03-26 ソニー株式会社 Movable mechanism control device, movable mechanism control method, program
JP2011009929A (en) * 2009-06-24 2011-01-13 Sony Corp Movable-mechanical-section controlling device, method of controlling movable mechanical section, and program
US8655257B2 (en) 2009-08-24 2014-02-18 Daniel Spychalski Radio controlled combat training device and method of using the same
US20110045445A1 (en) 2009-08-24 2011-02-24 Daniel Joseph Spychalski Officer under fire perpetrator machine
JP5347847B2 (en) 2009-08-31 2013-11-20 株式会社リコー Image capturing apparatus, communication establishment method, program
CN102231802A (en) * 2009-10-14 2011-11-02 鸿富锦精密工业(深圳)有限公司 Camera switching system and method thereof
US20110089639A1 (en) 2009-10-16 2011-04-21 Jason Earl Bellamy Remote control target base
JP5517668B2 (en) 2010-02-19 2014-06-11 Canon Kabushiki Kaisha Communication device, imaging device, control method thereof, program, and storage medium
US20110267462A1 (en) 2010-04-29 2011-11-03 Fred Cheng Versatile remote video monitoring through the internet
US20120139468A1 (en) 2010-12-02 2012-06-07 Robotzone, Llc Multi-rotation hobby servo motors
US8791911B2 (en) 2011-02-09 2014-07-29 Robotzone, Llc Multichannel controller
US8712602B1 (en) 2011-05-24 2014-04-29 Timothy S. Oliver Mobile target system
US9390617B2 (en) 2011-06-10 2016-07-12 Robotzone, Llc Camera motion control system with variable autonomy
US8816553B2 (en) 2011-10-24 2014-08-26 Robotzone, Llc Hobby servo blocks for use with hobby servo motors
AU2011250746A1 (en) 2011-11-13 2013-05-30 Hex Systems Pty Ltd Projectile Target System
US20130341869A1 (en) 2012-01-18 2013-12-26 Jonathan D. Lenoff Target Shot Placement Apparatus and Method
WO2013123547A1 (en) 2012-02-23 2013-08-29 Marathon Robotics Pty Ltd Systems and methods for arranging firearms training scenarios
US8791663B2 (en) 2012-10-19 2014-07-29 Robotzone, Llc Hobby servo motor linear actuator systems
US9726463B2 (en) 2014-07-16 2017-08-08 Robotzone, LLC Multichannel controller for target shooting range

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220208053A1 (en) * 2020-12-31 2022-06-30 Lenovo (Beijing) Limited Information processing method and device
US11462143B2 (en) * 2020-12-31 2022-10-04 Lenovo (Beijing) Limited Area brightness adjusting method and display therefor
CN113741743A (en) * 2021-01-04 2021-12-03 北京沃东天骏信息技术有限公司 Display method and device, equipment and storage medium

Also Published As

Publication number Publication date
US8791911B2 (en) 2014-07-29
US20140298233A1 (en) 2014-10-02
US9823825B2 (en) 2017-11-21
US20120200510A1 (en) 2012-08-09

Similar Documents

Publication Publication Date Title
US20180039400A1 (en) Multichannel controller
US10754517B2 (en) System and methods for interacting with a control environment
US10852913B2 (en) Remote hover touch system and method
TWI461973B (en) Method, system, and computer-readable medium for visual feedback display
KR101379398B1 (en) Remote control method for a smart television
US20210240332A1 (en) Cursor integration with a touch screen user interface
EP3000013B1 (en) Interactive multi-touch remote control
KR102143584B1 (en) Display apparatus and method for controlling thereof
EP3087456B1 (en) Remote multi-touch control
CN103718150B (en) Electronic equipment with the task management based on posture
US20110296329A1 (en) Electronic apparatus and display control method
US10599317B2 (en) Information processing apparatus
JP2008146243A (en) Information processor, information processing method and program
CN103914258A (en) Mobile terminal and method for operating same
JP3982288B2 (en) 3D window display device, 3D window display method, and 3D window display program
CN103294337A (en) Electronic apparatus and control method
JP2007334870A (en) Method and system for mapping position of direct input device
KR20140035870A (en) Smart air mouse
CN104736969A (en) Information display device and display information operation method
CN110096207B (en) Display device, operation method of display device, and computer-readable non-volatile storage medium
US20130021367A1 (en) Methods of controlling window display on an electronic device using combinations of event generators
CN104699228A (en) Mouse realization method and system for intelligent TV screen terminal
KR101574752B1 (en) Touch Mouse Device For Controlling Remote Access
KR102482630B1 (en) Method and apparatus for displaying user interface
KR100974910B1 (en) Method for controlling touch-sensing devices, and touch-sensing devices using the same

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION