US8854485B1 - Methods and systems for providing functionality of an interface to include an artificial horizon - Google Patents
- Publication number
- US8854485B1 (Application No. US13/213,678)
- Authority
- US
- United States
- Prior art keywords
- camera
- interface
- artificial horizon
- range
- motion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires 2033-03-14
Classifications
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
- (all under H—Electricity › H04—Electric communication technique › H04N—Pictorial communication, e.g. television › H04N23/00—Cameras or camera modules comprising electronic image sensors; control thereof › H04N23/60—Control of cameras or camera modules)
Definitions
- a user interface provides functionality to enable interaction between humans and machines.
- a goal of interaction between a human and a machine at the user interface is generally effective operation and control of the machine, and feedback from the machine that aids the user in making operational decisions.
- Examples of user interfaces include interactive aspects of computer operating systems, hand tools, heavy machinery operator controls, process controls, etc.
- Design considerations applicable when creating user interfaces may be related to or involve ergonomics and psychology.
- user interfaces can be designed so as to be associated with the functionalities of a product, such as to enable intended uses of the product by users with efficiency, effectiveness, and satisfaction, taking into account requirements from context of use.
- a user interface includes hardware and software components.
- User interfaces exist for various systems, and provide a manner to receive inputs allowing users to manipulate a system, and/or receive outputs allowing the system to indicate effects of the users' manipulation.
- Many types of user interfaces exist.
- One example user interface includes a graphical user interface (GUI) that is configured to accept inputs via devices such as a computer keyboard and mouse and provide graphical outputs on a display.
- Another example user interface includes touchscreens that include displays that accept input by touch of fingers or a stylus.
- This disclosure may disclose, inter alia, methods and systems for providing functionality of an interface to include an artificial horizon.
- a method comprises receiving information indicating a range of motion of a camera on a device, and providing an interface on a second device remote from the device.
- the interface may be configured to receive an input indicating a command for an orientation of the camera on the device.
- the method may also comprise based on the information indicating the range of motion of the camera, providing an artificial horizon at a fixed position on the interface that indicates the range of motion of the camera on either side of the artificial horizon.
- the fixed position of the artificial horizon may be associated with an orientation of the camera having a tilt value of about zero or having a pan value of about zero.
- Any of the methods described herein may be provided in a form of instructions stored on a non-transitory, computer readable medium, that when executed by a computing device, cause the computing device to perform functions of the method. Further examples may also include articles of manufacture including tangible computer-readable media that have computer-readable instructions encoded thereon, and the instructions may comprise instructions to perform functions of the methods described herein.
- the computer readable medium may include non-transitory computer readable medium, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache and Random Access Memory (RAM).
- the computer readable medium may also include non-transitory media, such as secondary or persistent long term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example.
- the computer readable media may also be any other volatile or non-volatile storage systems.
- the computer readable medium may be considered a computer readable storage medium, for example, or a tangible storage medium.
- a non-transitory computer readable medium having stored thereon instructions executable by a computing device to cause the computing device to perform functions comprises receiving information indicating a range of motion of a camera on a device, and providing an interface on the computing device.
- the interface may be configured to receive an input indicating a command for an orientation of the camera on the device.
- the functions may also comprise based on the information indicating the range of motion of the camera, providing an artificial horizon at a fixed position on the interface that indicates the range of motion of the camera on either side of the artificial horizon.
- the fixed position of the artificial horizon may be associated with an orientation of the camera having a tilt value of about zero or having a pan value of about zero.
- circuitry may be provided that is wired to perform logical functions in any processes or methods described herein.
- any type of devices may be used or configured to perform logical functions in any processes or methods described herein.
- a device is provided that comprises a processor and memory including instructions stored therein, executable by the processor to perform functions.
- the functions may comprise receiving information indicating a range of motion of a camera on a device, and providing an interface.
- the interface may be configured to receive an input indicating a command for an orientation of the camera on the device.
- the functions may further comprise based on the information indicating the range of motion of the camera, providing an artificial horizon at a fixed position on the interface that indicates the range of motion of the camera on either side of the artificial horizon.
- the fixed position of the artificial horizon is associated with an orientation of the camera having a tilt value of about zero or having a pan value of about zero.
- any type of devices may be used or configured as means for performing functions of any of the methods described herein (or any portions of the methods described herein).
- FIG. 1 is an example system for generating and modifying a view of a robotic device.
- FIGS. 2A-2C are example illustrations of robotic devices.
- FIG. 3 is a block diagram of an example method of providing functionality of an interface to include an artificial horizon.
- FIGS. 4A-4B are conceptual illustrations of example orientations of a camera on a device.
- FIGS. 5A-5B are example illustrations of an interface to receive inputs to control orientations of the camera on the device.
- FIG. 6 is another example interface.
- FIG. 7 illustrates still another example of the interface.
- FIG. 8 illustrates another example of the interface in FIG. 6 including an artificial horizon that represents an orientation of the camera having a pan value of about zero.
- FIG. 9 illustrates still another example of the interface in FIG. 6 including both horizontal and vertical artificial horizons.
- a method includes receiving information indicating a range of motion of a camera on a device, and providing an interface on a second device remote from the device.
- the interface may be configured to receive an input indicating a command for an orientation of the camera on the device.
- the method may further include based on the information indicating the range of motion of the camera, providing an artificial horizon at a fixed position on the interface that indicates the range of motion of the camera on either side of the artificial horizon.
- the fixed position of the artificial horizon may be associated with an orientation of the camera having a tilt value of about zero or having a pan value of about zero.
- FIG. 1 is an example system 100 for generating and modifying a view of a robotic device, and for controlling operation of the robotic device.
- the system 100 includes a robotic device 102 coupled to a network 104 , and a server 106 and a client device 108 also coupled to the network 104 .
- the robotic device 102 may further be coupled directly (or indirectly) to the server 106 and the client device 108 as shown.
- the system 100 may include more or fewer components, and each of the robotic device 102 , the server 106 , and the client device 108 may comprise multiple elements as well.
- one or more of the described functions of the system 100 may be divided up into additional functional or physical components, or combined into fewer functional or physical components.
- additional functional and/or physical components may be added to the examples illustrated by FIG. 1 .
- Cloud-based computing generally refers to networked computer architectures in which application execution and storage may be divided, to some extent, between client and server devices.
- a “cloud” may refer to a service or a group of services accessible over a network (e.g., Internet) by client and server devices, for example.
- Cloud-based computing can also refer to distributed computing architectures in which data and program logic for a cloud-based application are shared between one or more client devices and/or server devices on a near real-time basis. Parts of this data and program logic may be dynamically delivered, as needed or otherwise, to various clients accessing the cloud-based application. Details of the architecture may be transparent to users of client devices. Thus, a PC user or robot client device accessing a cloud-based application may not be aware that the PC or robot downloads program logic and/or data from the server devices, or that the PC or robot offloads processing or storage functions to the server devices, for example.
- the system 100 includes a number of devices coupled to or configured to be capable of communicating with the network 104 .
- client devices may be coupled to the network 104 .
- different types of devices may be coupled to the network 104 .
- any of the devices may generally comprise a display system, memory, and a processor.
- any of the devices shown in FIG. 1 may be coupled to the network 104 or to each other using wired or wireless communications.
- communication links between the network 104 and devices may include wired connections, such as a serial or parallel bus.
- Communication links may also be wireless links, which may include Bluetooth, IEEE 802.11 (IEEE 802.11 may refer to IEEE 802.11-2007, IEEE 802.11n-2009, or any other IEEE 802.11 revision), or other wireless based communication links.
- the system 100 may include access points through which the devices may communicate with the network 104. Access points may take various forms; for example, an access point may take the form of a wireless access point (WAP) or wireless router.
- an access point may be a base station in a cellular network that provides Internet connectivity via the cellular network.
- the robotic device 102 , the server 106 , and the client device 108 may include a wired or wireless network interface through which the devices can connect to the network 104 (or access points).
- the devices may be configured to use one or more protocols such as 802.11, 802.16 (WiMAX), LTE, GSM, GPRS, CDMA, EV-DO, and/or HSDPA, among others.
- the client devices may be configured to use multiple wired and/or wireless protocols, such as "3G" or "4G" data connectivity using a cellular communication protocol (e.g., CDMA, GSM, or WiMAX), as well as "WiFi" connectivity using 802.11. Other examples are also possible.
- the network 104 may represent a networked computer architecture, and in one example, the network 104 represents a queue for handling requests from client devices.
- the network 104 may further include any of a local area network (LAN), wide area network (WAN), wireless network (Wi-Fi), or Internet, for example.
- the server 106 may be a component coupled to the network 104 (as shown), or a component of the network 104 depending on a configuration of the system 100 .
- the server 106 may include a processor and memory including instructions executable by the processor to perform functions as described herein.
- the client device 108 may include any type of computing device (e.g., PC, laptop computer, etc.), or any type of mobile computing device (e.g., laptop, mobile telephone, cellular telephone, etc.).
- the client device 108 may include a processor and memory including instructions executable by the processor to perform functions as described herein.
- the robotic device 102 may comprise any computing device that may include connection abilities to the network 104 and that has an actuation capability (e.g., electromechanical capabilities).
- a robotic device may further be a combination of computing devices.
- the robotic device 102 may collect data and upload the data to the network 104 .
- the network 104 may be configured to perform calculations or analysis on the data and return processed data to the robotic device 102 .
- the robotic device 102 may include one or more sensors, such as a gyroscope, an accelerometer, or distance sensors to measure movement of the robotic device 102 .
- Other sensors may further include any of Global Positioning System (GPS) receivers, infrared sensors, optical sensors, biosensors, Radio Frequency identification (RFID) systems, wireless sensors, and/or compasses, among others, for example.
- any of the robotic device 102 , the server 106 , and the client device 108 may include an integrated user-interface (UI) that allows a user to interact with the device.
- the robotic device 102 may include various buttons and/or a touchscreen interface that allow a user to provide input.
- the robotic device 102 may include a microphone configured to receive voice commands from a user.
- the robotic device 102 may include one or more interfaces that allow various types of user-interface devices to be connected to the robotic device 102 .
- FIG. 2A illustrates an example robotic device 200 .
- the robotic device 200 is configured as a robot.
- a robot may contain computer hardware, such as a processor 202 , memory or data storage 204 , and one or more sensors 206 .
- a robot controller (e.g., the processor 202 , computing system, and sensors 206 ) may be custom designed for a specific robot.
- the robot may have a link to access cloud servers (as shown in FIG. 1 ).
- a wired link may include, for example, a parallel bus or a serial bus such as a Universal Serial Bus (USB).
- a wireless link may include, for example, Bluetooth, IEEE 802.11, Cellular (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or LTE), or Zigbee, among other possibilities.
- the storage 204 may be used for compiling data from various sensors 206 of the robotic device 200 and storing program instructions.
- the processor 202 may be coupled to the storage 204 and may be configured to control the robotic device 200 based on the program instructions.
- the processor 202 may also be able to interpret data from the various sensors 206 on the robot.
- Example sensors may include smoke sensors, light sensors, radio sensors, infrared sensors, microphones, speakers, gyroscopes, accelerometers, distance sensors, a camera, radar, capacitive sensors, touch sensors, etc.
- Example distance sensors include infrared ranging sensors, photoelectric distance sensors, proximity sensors, ultrasonic sensors, radar, or other types of sensors that may provide outputs used to determine a distance of the robotic device 200 to an object.
- the robotic device 200 may also have components or devices that allow the robotic device 200 to interact with an environment (e.g., surrounding or ambient environment).
- the robotic device 200 may have a camera to provide images of a field of view of the environment as well as mechanical actuators 208 , such as motors, wheels, movable arms, etc., that enable the robotic device 200 to move or interact with the environment.
- various sensors and devices on the robotic device 200 may be modules. Different modules may be added or removed from the robotic device 200 depending on requirements. For example, in a low power situation, a robot may have fewer modules to reduce power usage. However, additional sensors may be added as needed, such as to increase an amount of data the robot is able to collect.
- the robotic device 200 may be configured to receive a device, such as device 210 , that includes the processor 202 , the storage 204 , and the sensors 206 .
- the robotic device 200 may be a robot that has a number of mechanical actuators (e.g., a movable base), and the robot may be configured to receive a mobile telephone to function as the “brains” or control components of the robot.
- the device 210 may be considered a module of the robot.
- the device 210 may be physically attached to the robot or in communication with the robot. For example, a mobile phone may sit on a robot's “chest” and form an interactive display.
- the device 210 may provide a robot with sensors, a wireless link, and processing capabilities, for example.
- the device 210 may allow a user to download new routines for his or her robot from the cloud.
- a laundry folding routine may be stored on the cloud, and a user may be able to select this routine using a mobile phone to download the routine from the cloud, and when the mobile phone is placed into or coupled to the robot, the robot would be able to perform the downloaded action.
- the robotic device 200 may be coupled to a mobile or cellular telephone to provide additional sensing capabilities.
- the cellular phone may not be physically attached to the robot, but may be coupled to the robot wirelessly.
- a low cost robot may omit a direct connection to the internet.
- This robot may be able to connect to a user's cellular phone via a wireless technology (e.g., Bluetooth) to be able to access the internet.
- the robot may be able to access various sensors and communication means of the cellular phone.
- the robot may not need as many sensors to be physically provided on the robot; however, the robot may be able to keep the same or similar functionality.
- the robotic device 200 may include mechanical robot features, and may be configured to receive the device 210 (e.g., a mobile phone), which can provide additional peripheral components to the robotic device 200 , such as any of an accelerometer, gyroscope, compass, GPS, camera, WiFi connection, a touch screen, etc., that are included within the device 210 .
- FIG. 2B illustrates a graphical example of a robot 212 .
- the robot 212 is shown as a mechanical form of a person including arms, legs, and a head.
- the robot 212 may be configured to receive any number of modules or components, such as a mobile phone, which may be configured to operate the robot.
- Other types of devices that have connectivity to the Internet can be coupled to robot 212 to provide additional functions on the robot 212 .
- the device 210 may be separate from the robot 212 and can be attached or coupled to the robot 212 .
- the robot 212 may be a toy with only limited mechanical functionality, and by connecting device 210 to the robot 212 , the toy robot 212 may now be capable of performing a number of functions with the aid of the device 210 and/or the cloud.
- the robot 212 (or components of a robot) can be attached to a mobile phone to transform the mobile phone into a robot (e.g., with legs/arms) that is connected to a server to cause operation/functions of the robot.
- FIG. 2C illustrates another example of a robot 214 .
- the robot 214 includes a computing device 216 , sensors 218 , and a mechanical actuator 220 .
- the computing device 216 may be a laptop computer, which may be coupled to the sensors 218 .
- the sensors 218 may include a camera, infrared projectors, and other motion sensing or vision sensing elements.
- the sensors 218 may be included within a tablet device, which may also function as the computing device 216 .
- the mechanical actuator 220 may include a base, wheels, and a motor upon which the computing device 216 and the sensors 218 can be positioned, for example.
- Any of the robots illustrated in FIGS. 2A-2C may be configured to operate according to a robot operating system (e.g., an operating system designed for specific functions of the robot).
- a robot operating system may provide libraries and tools (e.g., hardware abstraction, device drivers, visualizers, message-passing, package management, etc.) to enable robot applications.
- robot operating systems include open source software such as ROS (robot operating system), DROS, or ARCOS (advanced robotics control operating system); proprietary software such as the robotic development platform ESRP from Evolution Robotics® and MRDS (Microsoft® Robotics Developer Studio); and other examples such as ROSJAVA.
- a robot operating system may include publish and subscribe functionality, and may also include functionality to control components of the robot, such as head tracking, base movement (e.g., velocity control, navigation framework), etc.
- any of the robots illustrated in FIGS. 2A-2C may be configured to operate according to example methods described herein, or according to instructions received from devices that may be configured to operate according to example methods described herein.
- FIG. 3 is a block diagram of an example method of providing functionality of an interface to include an artificial horizon.
- Method 300 shown in FIG. 3 presents an embodiment of a method that, for example, could be used with the system 100 , for example, and may be performed by a device, such as any devices illustrated in FIGS. 1-2 , or components of the device.
- Method 300 may include one or more operations, functions, or actions as illustrated by one or more of blocks 302 - 310 .
- although the blocks are illustrated in a sequential order, these blocks may also be performed in parallel, and/or in a different order than those described herein.
- the various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based upon the desired implementation.
- each block may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor or computing device for implementing specific logical functions or steps in the process.
- the program code may be stored on any type of computer readable medium, for example, such as a storage device including a disk or hard drive.
- the computer readable medium may include non-transitory computer readable medium, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache and Random Access Memory (RAM).
- the computer readable medium may also include non-transitory media, such as secondary or persistent long term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example.
- the computer readable media may also be any other volatile or non-volatile storage systems.
- the computer readable medium may be considered a computer readable storage medium, for example, or a tangible storage device.
- each block in FIG. 3 may represent circuitry that is wired to perform the specific logical functions in the process.
- the method 300 includes receiving information indicating a range of motion of a camera on a device.
- the information may be received at a server, at a second device remote from the device, or at the device itself.
- a second device may be remote from the device when the second device is not physically coupled to the device.
- the second device may be remote from the device, and may also be in proximity to the device or a large distance away from the device.
- the second device may be remote from the device and operationally coupled to the device or in communication with the device.
- the information may be received at any time, such as during initialization of the device, or during initialization of the device that receives the information, for example.
- the information may further be received during initialization of an interface on a given device.
- the information may include one or more of a range of tilt values or a range of panning values of the camera.
- the camera may be mounted on a pan/tilt unit capable of adjusting a pan or a tilt orientation of the camera. Capabilities of the pan/tilt unit may be provided to indicate the range of tilt or pan values of the camera.
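- as a non-authoritative sketch (the patent does not define a data format), the range-of-motion information received at block 302 could be represented as a simple structure; the field names and degree units below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class RangeOfMotion:
    """Hypothetical container for range-of-motion information.

    Angles are in degrees on either side of the zero (level, forward-facing)
    orientation; all names here are illustrative, not taken from the patent.
    """
    pan_left: float    # maximum panning to the left of zero
    pan_right: float   # maximum panning to the right of zero
    tilt_up: float     # maximum tilting above the artificial horizon
    tilt_down: float   # maximum tilting below the artificial horizon

# For example, a pan/tilt unit that tilts 25 degrees up and 75 degrees down:
rom = RangeOfMotion(pan_left=90.0, pan_right=90.0, tilt_up=25.0, tilt_down=75.0)
```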
- the method 300 includes providing an interface on a second device remote from the device.
- the interface may be provided on the device that receives the information at block 302 .
- the interface is provided on a device that is remote from the robotic device.
- the device may be operated to control the robotic device from a remote location.
- the interface may be configured to receive an input indicating a command for an orientation of the camera on the device.
- the interface may be a user interface that can receive inputs from a user.
- the interface may be provided on a touchscreen display of a device, and the device may receive an input to the interface via a touch/contact to the touchscreen by a user or stylus.
- the interface may include a graphical user interface (GUI) of a device, and the device may receive an input to the interface via an input to a keyboard or mouse coupled to the device.
- the interface may be a general interface to a device, and the device may receive an input via receiving a signal (via a wireless or wired connection) to the interface (e.g., which may take the form of a receiver, a port, etc.).
- the interface may further be provided via a Web-based interface as well.
- the interface may be provided so as to overlay data, such as to overlay a map display or a video/image stream received from the camera.
- portions of the interface may be transparent, or semi-transparent.
- a video stream from a camera may be provided on a display of a device in a background, and the interface may be provided overlaid onto the video stream in a foreground of the display.
- the interface may be configured to indicate or receive a command for an orientation of the camera on a robotic device.
- inputs to the interface may be associated with orientations of the camera on the robotic device including directional orientations used to control movement of the camera, and the interface may be configured to associate inputs with corresponding commands that can be used to control operation of the camera and/or the robotic device.
- the interface may thus take the form of an interface enabling a user to remotely control the camera on the robotic device.
- the camera on the robotic device may have a range of motion, and inputs received on the interface can be associated with commands for controlling motion of the camera.
- the interface may be configured in a number of ways, and may include a shape (e.g., rectangular shape) configured to overlay an x-y axis.
- the x-axis can be configured to represent a pan value for the orientation of the camera on the device and the y-axis can be configured to represent a tilt value for the orientation of the camera on the device (other examples include opposite configurations).
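- as one illustrative way to realize this mapping (an assumption, not a method specified by the patent), interface coordinates can be converted to pan/tilt values around the zero-pan/zero-tilt point, which need not be the center of the interface when the range of motion is unequal on either side:

```python
def point_to_pan_tilt(x, y, width, height, rom):
    """Map an interface point (screen coordinates, y growing downward) to
    (pan, tilt) in degrees; `rom` is the hypothetical RangeOfMotion sketched
    earlier, assumed to have a nonzero range on each side of zero.
    """
    # Where the artificial horizons sit, offset by the asymmetry of the range.
    zero_x = width * rom.pan_left / (rom.pan_left + rom.pan_right)
    zero_y = height * rom.tilt_up / (rom.tilt_up + rom.tilt_down)
    if x >= zero_x:
        pan = (x - zero_x) / (width - zero_x) * rom.pan_right    # pan right
    else:
        pan = (x - zero_x) / zero_x * rom.pan_left               # pan left (negative)
    if y <= zero_y:
        tilt = (zero_y - y) / zero_y * rom.tilt_up               # tilt up
    else:
        tilt = (zero_y - y) / (height - zero_y) * rom.tilt_down  # tilt down (negative)
    return pan, tilt
```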
- the command for the orientation of the camera on the device can indicate a fixed orientation of the camera on the device that includes a position of the camera with respect to the device. For instance, the command may indicate to move the camera an amount in a horizontal or vertical direction with respect to a frame of reference of the device (or to capture images, or to generate images with respect to a frame of reference of the robotic device).
- the command may indicate a dynamic orientation of the camera on the device that is associated with geographic position coordinates.
- the orientation of the camera may be configured with respect to a general frame of reference related to geographic position coordinates.
- the device may be configured such that the camera is facing downward due to an orientation of the device (e.g., a robotic device is tilting/leaning forward) and/or due to placement of the camera on the leaning robotic device (e.g., not due to orientation of the camera itself).
- the command may be indicative of an orientation of the camera taking into account the orientation of the device by using geographic position coordinates.
- the command indicating panning or tilting of the camera may be performed by maintaining the camera stationary and digitally processing captured images so as to generate digitally processed images from a viewpoint of a camera in an orientation according to the pan/tilt values in the command.
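- a minimal sketch of such digital pan/tilt, assuming a simple linear pixels-per-degree model (real lens geometry would require a proper projection) and an output window that fits inside the captured frame:

```python
import numpy as np

def digital_pan_tilt(frame, pan_deg, tilt_deg, fov_h_deg, fov_v_deg, out_w, out_h):
    """Crop a window from a stationary wide frame to approximate a pan/tilt view."""
    h, w = frame.shape[:2]
    # Shift the window center away from the frame center by the commanded angles.
    cx = w / 2 + pan_deg * (w / fov_h_deg)
    cy = h / 2 - tilt_deg * (h / fov_v_deg)  # tilting up moves the window up
    x0 = int(np.clip(cx - out_w / 2, 0, w - out_w))
    y0 = int(np.clip(cy - out_h / 2, 0, h - out_h))
    return frame[y0:y0 + out_h, x0:x0 + out_w]
```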
- the interface may further include an indicator representing a current orientation of the camera on the device, and the interface can change a position of the indicator on the interface corresponding to movement of the camera on the device.
- the interface may further provide a second indicator representing a location of an input, where the second indicator is representative of a command for the orientation of the camera on the device (e.g., based on a configuration of the interface, the second indicator can be representative of the command, such as indicating to pan right/left, tilt up/down, etc.).
- the second indicator may be provided on the interface at or near the location of the input. In instances in which the interface is provided on a touchscreen, the second indicator may be provided at a location surrounding or adjacent a location of the input.
- This may enable a user to view the second indicator, since an indicator placed at the location of the input could be placed underneath a user's finger or stylus that provides contact with the touchscreen, and which may block or obstruct a view of an indicator placed at the location of the input.
- the second indicator may be placed at the location of the input (such as underneath a user's finger on a touchscreen display).
- a user may visualize a current orientation of the camera when providing an input to request a change to the orientation of the camera.
- the interface may also include a first group of indicators on the interface along a perimeter of the interface at locations substantially matching x-axis coordinates of the position of the indicator, and a second group of indicators on the interface along the perimeter of the interface at locations substantially matching y-axis coordinates of the position of the indicator.
- the perimeter indicators may be useful to visualize a range of motion of the camera.
- a display of indicators in the first group of indicators and indicators in the second group of indicators can be configured to fade-in or fade-out as the input on the interface changes.
- An additional indicator may also be positioned at a location of the input, for example.
- multiple inputs may be received on the interface over time indicating commands for the orientation of the camera on the robotic device, and multiple indicators may be provided on the interface, each of which is representative of a command.
- the interface may be presented on a touchscreen and the user may provide an initial input by contacting the touchscreen using a finger, and then slide the finger across the interface to provide multiple inputs.
- Indicators may be provided to represent locations of some or all received inputs.
- One or more indicators may represent a previous location of an input on the interface, and one or more indicators may represent a current location of the input on the interface. Further, one or more indicators may represent a future location of a projected future input on the interface.
- a display of one or more of the indicators that represents the previous location of the input on the interface may fade-out over time.
- a display of an indicator that represents the future location of the projected future input on the interface may fade-in over time.
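- one simple way to implement such fading (the linear ramps and time constants here are arbitrary choices, not specified by the patent) is to derive a trail indicator's opacity from its age and a projected indicator's opacity from how soon it is expected:

```python
def trail_alpha(age_s, fade_out_s=1.0):
    """Opacity of an indicator for a previous input: fades out as it ages."""
    return max(0.0, 1.0 - age_s / fade_out_s)

def projected_alpha(time_until_s, fade_in_s=0.5):
    """Opacity of an indicator for a projected future input: fades in as it nears."""
    return max(0.0, min(1.0, 1.0 - time_until_s / fade_in_s))
```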
- the interface may be configured in other manners as well, and may take the form of other geometric shapes or designs based on applications.
- the method 300 includes based on the information indicating the range of motion of the camera, providing an artificial horizon at a fixed position on the interface that indicates the range of motion of the camera on either side of the artificial horizon.
- the fixed position of the artificial horizon may be associated with an orientation of the camera having a tilt value of about zero or having a pan value of about zero.
- the artificial horizon may be associated with an orientation of the camera being substantially parallel to a ground plane.
- the position of the artificial horizon may be fixed on the interface, and indicators for the camera or inputs may change.
- the artificial horizon may indicate that the range of motion of the camera on either side of the artificial horizon is unequal such that the camera has a larger range of motion to one side of the artificial horizon.
- the artificial horizon may be positioned at positions on the interface other than a center, for example.
- the artificial horizon may be representative of a plane associated with a field of view of the camera.
- the artificial horizon may be specific to the camera, and may be configured according to capabilities of the camera.
- the artificial horizon may be provided as a semi-transparent component of the interface.
- the artificial horizon may overlay a portion of the interface from the fixed position to about a perimeter of the interface, and due to transparency, indicators of the camera or of inputs can be displayed in the artificial horizon.
- the method 300 may optionally include providing a second artificial horizon at a second fixed position on the interface that indicates the range of panning values of the camera.
- the second fixed position of the artificial horizon may be associated with an orientation of the camera having a pan value of about zero.
- the artificial horizon may indicate a range of tilt values of the camera such that the fixed position of the artificial horizon is associated with the orientation of the camera having the tilt value of about zero, and the second artificial horizon is associated with a panning range of the camera.
- the artificial horizon and the second artificial horizon may be configured on the interface to be substantially perpendicular.
- the method 300 may optionally include receiving a given input on the interface, and generating a control signal indicating a given orientation of the camera on the device according to a location of the input on the interface.
- the location of the input on the interface may be associated with a tilt and/or a pan value of the camera.
- the remote device may generate the control signal and may provide the control signal to the device.
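- a sketch of generating and sending such a control signal from the remote device, reusing the point_to_pan_tilt sketch above; the JSON message format and the use of a connected socket are assumptions, since the patent specifies no wire format:

```python
import json

def send_orientation_command(sock, x, y, width, height, rom):
    """Convert an input location into a pan/tilt command and send it to the
    device over `sock`, a connected socket.socket."""
    pan, tilt = point_to_pan_tilt(x, y, width, height, rom)
    msg = {"type": "camera_orientation", "pan_deg": pan, "tilt_deg": tilt}
    sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))
```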
- the method 300 may include receiving information indicating movement of the device, and determining changes to the range of motion of the camera on the device. For example, if the device is a robotic device that includes a camera on a portion of the robotic device and the robotic device is leaning forward such that the range of motion of the camera is now restricted, a change to the range of motion can be determined. Based on the changes to the range of motion of the camera on the device, the fixed position of the artificial horizon on the interface can be adjusted in real-time.
- the range of motion of the camera may be altered or modified due to many reasons including movement of components of the robotic device or a position of the robotic device (e.g., the robotic device may be under a table restricting movement of a camera positioned on top of the robotic device).
- the changes to the range of motion of the camera may be determined in many ways, such as, by calculating available distances surrounding the camera in an ambient environment using previous and/or current images/video stream of the camera.
- Information indicating the changes to the range of motion can be calculated by a server or the robotic device and provided to the remote device controlling the robotic device, for example.
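- a sketch of the real-time adjustment, assuming a hypothetical `ui` object with setter methods: when updated range-of-motion information arrives, the fixed positions of both artificial horizons are recomputed from the new ranges:

```python
def on_range_of_motion_update(new_rom, ui):
    """Move the artificial horizons when the device reports a changed range of
    motion (e.g., a raised arm now blocks part of the camera's travel)."""
    # Fraction of the interface height, from the top, at which tilt is zero.
    ui.set_horizontal_horizon(new_rom.tilt_up / (new_rom.tilt_up + new_rom.tilt_down))
    # Fraction of the interface width, from the left, at which pan is zero.
    ui.set_vertical_horizon(new_rom.pan_left / (new_rom.pan_left + new_rom.pan_right))
```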
- the method 300 may include providing one or more of an audio or vibration indicator that is indicative of repositioning the indicator on the interface to a pre-set location.
- the audio or vibration signal may provide feedback to a user indicating that the change on the interface has been made.
- the method 300 may include providing on the interface text that indicates the command for the orientation of the camera on the robotic device.
- the text may provide further feedback to the user indicating the command that corresponds to a received input.
- the method 300 may be performed by a second device remote from the device to control operation of a device (e.g., to control operation of a robotic device).
- the second device may include a processor and memory including instructions stored therein executable by the processor to perform functions of the method 300 .
- the second device may be remote from the robotic device, and may send signals (either via a wired or wireless connection) to the robotic device.
- the interface may be provided by or on the second device.
- the second device may include a touchscreen display configured to receive the input on the interface (e.g., via a contact with the touchscreen), and based on the input, the second device may be configured to generate a control signal for the command for the orientation of the camera on the robotic device. The second device may subsequently provide the control signal to the robotic device.
- the method 300 may be performed to operate any type of robotic device, including robotic devices that may be configured to turn in place or not, that may be stationary or mobile, or that may have other functionality or limitations.
- the method 300 may further include receiving a double-tap or double-click input on an interface, and generating a control signal indicating the orientation of the camera on the robotic device to be reset to a default orientation (e.g., facing forward).
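- a sketch of that reset behavior, again using the assumed JSON message format from the earlier sketch:

```python
import json

def on_double_tap(sock):
    """Command the camera back to a default, forward-facing orientation."""
    msg = {"type": "camera_orientation", "pan_deg": 0.0, "tilt_deg": 0.0, "reset": True}
    sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))
```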
- FIGS. 4A-4B are conceptual illustrations of example orientations of a camera on a device, and FIGS. 5A-5B are example illustrations of an interface 500 to receive inputs to control those orientations.
- FIG. 4A illustrates an example birds-eye view of a camera.
- a camera may be controlled to move about a pivot point in a horizontal direction so as to pan the camera from left to right.
- the camera may be mounted on a pan/tilt unit that can receive commands indicating directions to move the camera.
- the panning may include horizontal movement or rotation of the camera (e.g., still or video camera), or scanning of a subject horizontally on video or a display device.
- the camera may be configured to pan in any amount (such as in the range of 0 degrees to 360 degrees).
- FIG. 4B illustrates an example side-view of a camera.
- the camera may be controlled to move about a pivot point in a vertical direction so as to tilt the camera up and down.
- the camera may be configured to tilt in any amount (such as in the range of 0 degrees to 360 degrees).
- panning or tilting of the camera may be performed by maintaining the camera stationary and digitally processing captured images.
- FIGS. 5A-5B are example illustrations of an interface 500 .
- the interface 500 may be provided on a display of a device, and may be configured to receive inputs and correlate the inputs to commands for an orientation of the camera on a robotic device that is remote from the device.
- the interface 500 is shown to include two concentric circles 502 and 504 that may be representative of a range of motion of the camera on the robotic device, or a range of orientation of the camera.
- An input on the interface 500 may be associated with coordinates on an x-y axis, and the x-axis may correspond to values for the panning of the camera and the y-axis may correspond to values for the tilting of the camera.
- the interface 500 may receive an input from a user's finger or from a stylus at a location 506 .
- the interface 500 may be configured to provide an indicator 508 at a location representing the location of the received input.
- the interface 500 may be configured to be provided on a touchscreen, and a user may provide an input at the location 506 .
- the indicator 508 is provided adjacent the location 506 so as to represent the input. In this way, a user may view the indicator 508 , rather than providing the indicator exactly at the location 506 in which the indicator 508 would be provided underneath the user's finger.
- the indicator 508 may be provided at the location 506 and may be of a size such that a user can view the indicator 508 .
- the example interface 500 is shown to further include an artificial horizon 510 illustrated at a center position or illustrated dividing the interface 500 down the middle.
- the artificial horizon 510 may be associated with an orientation of the camera having a pan value of zero.
- the camera is shown to have an equal amount of panning available to the left and to the right, such that a pan value of zero is at a center.
- the artificial horizon 510 is shown as a center line of the interface along the y-axis.
- the interface 500 may further include an artificial horizon 512 , as shown in FIG. 5B , to correspond to a horizon for the tilting values of the camera.
- the camera is shown to have an equal (or about equal) amount of tilting available up and down, such that a tilt value of zero is at a center.
- the artificial horizon 512 is shown as a center line of the interface along the x-axis.
- FIG. 6 is another example interface 600 .
- the interface 600 is shown to be a substantially rectangular shape, and may be structured as two rectangles 602 and 604 .
- the interface 600 may receive an input 606 at any location within the rectangle 602 .
- the input 606 may be associated with pan and tilt values of a camera on a robotic device.
- the interface 600 may be associated with an x-y axis such that an x-axis coordinate of the input 606 is associated with a pan value, and a y-axis coordinate of the input 606 is associated with a tilt value.
- the interface 600 may be configured to provide an indicator (shown as a circle enclosing arrows) at a location of the input 606 .
- the interface 600 may further include an indicator 608 representing a current orientation of the camera.
- the indicator 608 may represent current pan/tilt values of the camera on the robotic device to provide feedback to the user.
- a position of the indicator 608 on the interface 600 may be adjusted as inputs are received due to movement of the camera, for example.
- the interface 600 may be further configured to provide additional indicators that are representative of the current orientation of the camera. As shown, indicators 610 , 612 , 614 , 616 , and 618 may be provided along a perimeter of the rectangle 602 at x-axis coordinates that substantially match an x-axis coordinate of the camera indicator 608 . Similarly, the indicators 620 , 622 , 624 , and 626 may be provided along a perimeter of the rectangle 602 at y-axis coordinates that substantially match a y-axis coordinate of the camera indicator 608 .
- the groups of indicators along the x-axis and y-axis perimeter of the rectangle 602 may alternatively be provided at x- and y-axis coordinates that substantially match the location of the input 606 , for example. Groups of indicators provided at coordinates that substantially match the input 606 could be provided (or displayed) at times when an input is being provided.
- the interface 600 further includes an artificial horizon 628 provided at a position that indicates a zero tilt value of the camera (e.g., indicates a plane parallel to ground with respect to the camera).
- a range of motion 630 of the camera below the artificial horizon 628 and a range of motion 632 of the camera above the artificial horizon 628 are shown in FIG. 6 .
- the range of vertical motion of the camera is about equal above and below the ground plane, and thus, the artificial horizon 628 is provided at about center of the interface 600 .
- FIG. 7 illustrates an example of the interface 600 in which the artificial horizon 628 is shown to be lower than that in FIG. 6 .
- the range of motion of the camera is less with respect to motion below the artificial horizon 628 than with respect to motion above the artificial horizon 628 .
- the camera may tilt upward more degrees than the camera may tilt downward in this example.
- FIG. 8 illustrates another example of the interface 600 including an artificial horizon 634 that represents an orientation of the camera having a pan value of about zero.
- a range of motion to the left 636 for the camera is shown, as well as a range of motion to the right 638 for the camera.
- the camera may pan to the right by more degrees than the camera may pan to the left.
- FIG. 9 illustrates still another example of the interface 600 including both the horizontal and vertical artificial horizons 628 and 634 as shown in FIG. 6 and FIG. 8 .
- a portion of the artificial horizon 628 and the artificial horizon 634 may overlap to indicate a range of motion of the camera including both pan and tilt ranges.
- a user may be able to determine a pan/tilt of the camera relative to the ground.
- the fixed position of the artificial horizon indicates an amount of available motion, and is useful in instances in which a pan/tilt unit does not tilt the same amount above and below 0 degrees, for example.
- a display of the artificial horizon illustrates a level for 0 degrees, which may not be a center of the interface 600 .
- a pan/tilt unit may be capable of tilting upward 25 degrees and downward 75 degrees.
- the artificial horizon would be provided on the interface 600 at a position about 25% from a top of the interface 600 .
- a pan/tilt unit may be capable of tilting upward 50 degrees and downward 50 degrees, and in this example, the artificial horizon would be provided on the interface 600 at about a center of the interface 600 .
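- the placement just described reduces to a one-line fraction, checked here against the numbers above (25 degrees up and 75 degrees down puts the horizon 25% from the top; 50/50 centers it):

```python
def horizon_fraction_from_top(tilt_up_deg, tilt_down_deg):
    """Fraction of the interface height, from the top, for the zero-tilt horizon."""
    return tilt_up_deg / (tilt_up_deg + tilt_down_deg)

assert horizon_fraction_from_top(25, 75) == 0.25  # horizon 25% from the top
assert horizon_fraction_from_top(50, 50) == 0.50  # horizon at the center
```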
- the camera may be provided on a right shoulder of a robotic device (e.g., the robotic device 212 in FIG. 2B ), and the pan/tilt unit of the camera may be capable of panning a few degrees to the left (due to interference with a head of the robotic device 212 ) and a large number of degrees to the right (e.g., so as to rotate about the shoulder and to provide a rear view of the robotic device 212 ).
- if the robotic device 212 raises a right arm, the right arm may then obstruct or interfere with movement of the camera to the right, in which case the range of motion of the camera may change.
- a new range of motion may be provided to the interface, and a position of the artificial horizon on the interface can be adjusted.
- the range of motion of the camera can be determined in real-time, and the position of the artificial horizon can also be adjusted in real-time.
- a camera may be provided on a head of a robotic device (e.g., the robotic device 212 in FIG. 2B ), and may include a range of motion that can be illustrated by the artificial horizon on the interface 600 .
- the head of the robotic device may also be capable of rotating, which also provides a further range of motion of the camera.
- the range of motion of the camera may also take into account a range of motion of a base or component of the device upon which the camera is mounted (in addition to a pan/tilt unit of the camera).
- interfaces are provided that may be configured to both receive inputs as well as provide outputs (e.g., touchscreen displays).
- an interface may be provided on a handheld computer that can receive an input and provide a display representative of the output.
- a motion-detection device may be configured to receive an input and to provide the input to a display device which displays an output representative of the input.
- the motion-detection device may include a camera, a depth sensor, microphones, etc., and may be configured to provide motion capture, facial recognition, and voice recognition capabilities.
- the depth sensor may be configured to include an infrared laser projector and a monochrome CMOS sensor that can capture video data in 3D under ambient light conditions.
- the motion-detection device may be configured to provide an interface using the infrared laser projector, for example, to receive inputs from users.
- the inputs can be associated with indicating a command for an orientation of the camera on a device that is remote from the motion-detection device.
- the interface may be viewable by a user, such as a laser projected interface, or may be a conceptual interface in which inputs are received due to motion of the user and the interface is not visible to the user.
- the motion-detection device may be coupled to a display device, and may provide outputs to the display device.
- the motion-detection device may generate a display representative of the interface or representative of inputs to the interface, and provide the display to the display device (or may provide information associated with the inputs to the display device and the display device can generate the display).
- the display may include an indicator representing a location of a received input, and the indicator may be representative of the command for the orientation of the camera on the device.
- the location of the received input can be associated with a physical or geographic location, or can be associated with a location on the display representative of the interface that maps to the location of the received input. For instance, a user may provide an input to the interface provided by the motion-detection device at a physical location, and the physical location can be mapped to a position on a display representative of the interface.
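- a minimal sketch of that mapping, assuming a plain linear scaling from the sensed interaction area to display pixels; the extent of the sensed area and its units are hypothetical:

```python
def physical_to_display(px, py, field, disp_w, disp_h):
    """Map a physical input location (e.g., a hand position seen by a
    motion-detection device) onto display coordinates.

    `field` is the (x_min, y_min, x_max, y_max) extent of the sensed
    interaction area in the sensor's units.
    """
    x_min, y_min, x_max, y_max = field
    u = (px - x_min) / (x_max - x_min)  # normalized horizontal position
    v = (py - y_min) / (y_max - y_min)  # normalized vertical position
    return int(u * disp_w), int(v * disp_h)
```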
- the motion-detection device or the display device may further be configured to provide an artificial horizon at a fixed position on the interface that indicates a range of motion of the camera on either side of the artificial horizon.
- a first device may be configured to receive an input at an interface that may be provided by or on the first device, and a second device different from the first device may be configured to provide an output based on the input.
- a motion-detection device may receive an input, and an output can be provided on a display device coupled (either wired or wirelessly) to the motion-detection device.
- a user may provide an input on a device (e.g., a keyboard, mobile phone, computing device, etc.) that is coupled to a separate device (e.g., a display) on which an output is provided.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/213,678 US8854485B1 (en) | 2011-08-19 | 2011-08-19 | Methods and systems for providing functionality of an interface to include an artificial horizon |
Publications (1)
Publication Number | Publication Date |
---|---|
US8854485B1 (en) | 2014-10-07 |
Family
ID=51626972
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/213,678 (US8854485B1, Active, expires 2033-03-14) | Methods and systems for providing functionality of an interface to include an artificial horizon | 2011-08-19 | 2011-08-19 |
Country Status (1)
Country | Link |
---|---|
US (1) | US8854485B1 (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7535493B2 (en) * | 1996-07-22 | 2009-05-19 | Canon Kabushiki Kaisha | Method, apparatus, and system for controlling a camera, and a storage medium storing a program used with the method, apparatus and/or system |
US20030216834A1 (en) * | 2000-05-01 | 2003-11-20 | Allard James R. | Method and system for remote control of mobile robot |
US20050036036A1 (en) * | 2001-07-25 | 2005-02-17 | Stevenson Neil James | Camera control apparatus and method |
US20050071046A1 (en) * | 2003-09-29 | 2005-03-31 | Tomotaka Miyazaki | Surveillance system and surveillance robot |
US20110025861A1 (en) * | 2004-05-06 | 2011-02-03 | Dumm Mark T | Camera control system and associated pan/tilt head |
US20100073490A1 (en) * | 2004-07-13 | 2010-03-25 | Yulun Wang | Mobile robot with a head-based movement mapping scheme |
US8401275B2 (en) * | 2004-07-13 | 2013-03-19 | Intouch Technologies, Inc. | Mobile robot with a head-based movement mapping scheme |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150201160A1 (en) * | 2014-01-10 | 2015-07-16 | Revolve Robotics, Inc. | Systems and methods for controlling robotic stands during videoconference operation |
US9615053B2 (en) * | 2014-01-10 | 2017-04-04 | Revolve Robotics, Inc. | Systems and methods for controlling robotic stands during videoconference operation |
Legal Events
- AS (Assignment): Owner GOOGLE INC., CALIFORNIA. Assignment of assignors interest; assignors: DESAI, MUNJAL; HICKMAN, RYAN; LEWIS, THOR; and others; signing dates from 20110819 to 20110824; reel/frame: 027001/0698
- STCF (Information on status): patent grant — PATENTED CASE
- AS (Assignment): Owner X DEVELOPMENT LLC, CALIFORNIA. Assignment of assignors interest; assignor: GOOGLE INC.; reel/frame: 039900/0610; effective date: 20160901
- AS (Assignment): Owner GOOGLE LLC, CALIFORNIA. Change of name; assignor: GOOGLE INC.; reel/frame: 044142/0357; effective date: 20170929
- MAFP (Maintenance fee payment): Payment of maintenance fee, 4th year, large entity (original event code: M1551); year of fee payment: 4
- AS (Assignment): Owner X DEVELOPMENT LLC, CALIFORNIA. Assignment of assignors interest; assignor: GOOGLE INC.; reel/frame: 047631/0671; effective date: 20160901
- AS (Assignment): Owner GOOGLE LLC, CALIFORNIA. Corrective assignment (by nullification) to correct incorrectly recorded application numbers previously recorded on reel 044142, frame 0357; confirms the change of name; assignor: GOOGLE INC.; reel/frame: 047837/0678; effective date: 20170929
- MAFP (Maintenance fee payment): Payment of maintenance fee, 8th year, large entity (original event code: M1552); entity status of patent owner: large entity; year of fee payment: 8
- AS (Assignment): Owner GOOGLE LLC, CALIFORNIA. Assignment of assignors interest; assignor: X DEVELOPMENT LLC; reel/frame: 064658/0001; effective date: 20230401