WO2018208981A1 - Device stabilization - Google Patents

Device stabilization

Info

Publication number
WO2018208981A1
Authority
WO
WIPO (PCT)
Prior art keywords
computer vision
stabilization system
images
gimbal assembly
Application number
PCT/US2018/031892
Other languages
French (fr)
Inventor
Eirik DYRSETH
Lars FLESLAND
Matias HOLSVE
Original Assignee
Flowmotion Technologies AS
Application filed by Flowmotion Technologies AS
Publication of WO2018208981A1


Classifications

    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00: Details of cameras or camera bodies; Accessories therefor
    • G03B17/56: Accessories
    • G03B17/561: Support related camera accessories
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion
    • G06T7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/68: Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681: Motion detection
    • H04N23/6811: Motion detection based on the image signal
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/68: Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682: Vibration or motion blur correction
    • H04N23/685: Vibration or motion blur correction performed by mechanical compensation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10016: Video; Image sequence
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30244: Camera pose

Definitions

  • FIG. 1 depicts a diagram of an example of a system for stabilizing a media capturing device using computer vision.
  • FIG. 2 depicts a flowchart of an example of a method for stabilizing a device based on detectable stimuli.
  • FIG. 3 depicts a diagram of a system for stabilizing a device based on detectable stimuli in an environment at the device.
  • FIG. 4 depicts a flow chart of an example of a method for stabilizing a device using computer vision.
  • FIG. 5 depicts a diagram of a system for directly controlling drive mechanisms of a gimbal assembly for purposes of stabilizing a device using computer vision.
  • FIG. 6 depicts a diagram of a computer vision-based reference orientation identification system.
  • FIG. 7 depicts a flowchart of an example of a method for using computer vision to determine a reference orientation of a device for purposes of stabilizing the device.
  • FIG. 8 depicts a diagram of an example of a device stabilization control system.
  • FIG. 1 depicts a diagram 100 of an example of a system for stabilizing a media capturing device using computer vision.
  • the system of the example of FIG. 1 includes a stabilized device 102 and a computer vision-based device stabilization system 104.
  • Either or both the stabilized device 102 and the computer vision-based device stabilization system 104 can include a computer-readable medium.
  • a computer-readable medium, as discussed in this paper, is intended to include all mediums that are statutory (e.g., in the United States, under 35 U.S.C. 101), and to specifically exclude all mediums that are non-statutory in nature to the extent that the exclusion is necessary for a claim that includes the computer-readable medium to be valid.
  • Known statutory computer-readable mediums include hardware (e.g., registers, random access memory (RAM), non-volatile (NV) storage, to name a few), but may or may not be limited to hardware.
  • a computer-readable medium is intended to represent a variety of potentially applicable technologies.
  • a computer-readable medium can be used to form a network or part of a network.
  • a computer-readable medium can include a bus or other data conduit or plane.
  • where a first component is co-located on one device and a second component is located on a different device, a computer-readable medium can include a wireless or wired back-end network or LAN.
  • a computer-readable medium can also encompass a relevant portion of a WAN or other network, if applicable.
  • a computer system will include a processor, memory, non-volatile storage, and an interface.
  • a typical computer system will usually include at least a processor, memory, and a device (e.g., a bus) coupling the memory to the processor.
  • the processor can be, for example, a general-purpose central processing unit (CPU), such as a microprocessor, or a special-purpose processor, such as a microcontroller.
  • the memory can include, by way of example but not limitation, random access memory (RAM), such as dynamic RAM (DRAM) and static RAM (SRAM).
  • the memory can be local, remote, or distributed.
  • the bus can also couple the processor to non-volatile storage.
  • the non-volatile storage is often a magnetic floppy or hard disk, a magnetic-optical disk, an optical disk, a read-only memory (ROM), such as a CD-ROM, EPROM, or EEPROM, a magnetic or optical card, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory during execution of software on the computer system.
  • the non-volatile storage can be local, remote, or distributed.
  • the non-volatile storage is optional because systems can be created with all applicable data available in memory.
  • a software program is assumed to be stored at an applicable known or convenient location (from nonvolatile storage to hardware registers) when the software program is referred to as “implemented in a computer-readable storage medium.”
  • a processor is considered to be “configured to execute a program” when at least one value associated with the program is stored in a register readable by the processor.
  • operating system software is a software program that includes a file management system, such as a disk operating system.
  • One example of operating system software with associated file management system software is the family of operating systems known as Windows® from Microsoft Corporation of Redmond, Washington, and their associated file management systems.
  • the file management system is typically stored in the non-volatile storage and causes the processor to execute the various acts required by the operating system to input and output data and to store data in the memory, including storing files on the non-volatile storage.
  • the bus can also couple the processor to the interface.
  • the interface can include one or more input and/or output (I/O) devices.
  • the I/O devices can include, by way of example but not limitation, a keyboard, a mouse or other pointing device, disk drives, printers, a scanner, and other I/O devices, including a display device.
  • the display device can include, by way of example but not limitation, a cathode ray tube (CRT), liquid crystal display (LCD), or some other applicable known or convenient display device.
  • the interface can include one or more of a modem or network interface. It will be appreciated that a modem or network interface can be considered to be part of the computer system.
  • the interface can include an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g. "direct PC"), or other interfaces for coupling a computer system to other computer systems. Interfaces enable computer systems and other devices to be coupled together in a network.
  • the computer systems can be compatible with or implemented as part of or through a cloud-based computing system.
  • a cloud-based computing system is a system that provides virtualized computing resources, software and/or information to end user devices.
  • the computing resources, software and/or information can be virtualized by maintaining centralized services and resources that the edge devices can access over a communication interface, such as a network.
  • "Cloud” may be a marketing term and for the purposes of this paper can include any of the networks described herein.
  • the cloud-based computing system can involve a subscription for services or use a utility pricing model. Users can access the protocols of the cloud-based computing system through a web browser or other container application located on their end user device.
  • a computer system can be implemented as an engine, as part of an engine or through multiple engines.
  • an engine includes one or more processors or a portion thereof.
  • a portion of one or more processors can include some portion of hardware less than all of the hardware comprising any given one or more processors, such as a subset of registers, the portion of the processor dedicated to one or more threads of a multi-threaded processor, a time slice during which the processor is wholly or partially dedicated to carrying out part of the engine's functionality, or the like.
  • a first engine and a second engine can have one or more dedicated processors or a first engine and a second engine can share one or more processors with one another or other engines.
  • an engine can be centralized or its functionality distributed.
  • An engine can include hardware, firmware, or software embodied in a computer-readable medium for execution by the processor.
  • the processor transforms data into new data using implemented data structures and methods, such as is described with reference to the FIGS. in this paper.
  • the engines described in this paper, or the engines through which the systems and devices described in this paper can be implemented, can be cloud-based engines.
  • a cloud-based engine is an engine that can run applications and/or functionalities using a cloud-based computing system. All or portions of the applications and/or functionalities can be distributed across multiple computing devices, and need not be restricted to only one computing device.
  • the cloud-based engines can execute functionalities and/or modules that end users access through a web browser or container application without having the functionalities and/or modules installed locally on the end-users' computing devices.
  • datastores are intended to include repositories having any applicable organization of data, including tables, comma-separated values (CSV) files, traditional databases (e.g., SQL), or other applicable known or convenient organizational formats.
  • Datastores can be implemented, for example, as software embodied in a physical computer-readable medium on a specific-purpose machine, in firmware, in hardware, in a combination thereof, or in an applicable known or convenient device or system.
  • Datastore-associated components, such as database interfaces, can be considered "part of" a datastore, part of some other system component, or a combination thereof, though the physical location and other characteristics of datastore-associated components are not critical for an understanding of the techniques described in this paper.
  • Datastores can include data structures.
  • a data structure is associated with a particular way of storing and organizing data in a computer so that it can be used efficiently within a given context.
  • Data structures are generally based on the ability of a computer to fetch and store data at any place in its memory, specified by an address, a bit string that can be itself stored in memory and manipulated by the program.
  • some data structures are based on computing the addresses of data items with arithmetic operations; while other data structures are based on storing addresses of data items within the structure itself.
  • Many data structures use both principles, sometimes combined in non-trivial ways.
  • the implementation of a data structure usually entails writing a set of procedures that create and manipulate instances of that structure.
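  • The two addressing principles above can be made concrete with a short sketch (illustrative only, not part of the disclosed system): an array-backed structure locates items by arithmetic on indices, while a linked structure stores the "address" of the next item inside each node.

```python
# Illustrative sketch (not from the patent) of the two addressing
# principles: address arithmetic vs. addresses stored in the structure.

class ArrayStack:
    """Address-arithmetic style: item i lives at computed index i."""
    def __init__(self):
        self._items = []            # contiguous, indexable storage

    def push(self, value):
        self._items.append(value)

    def pop(self):
        return self._items.pop()


class LinkedStack:
    """Stored-address style: each node records where the next node is
    (a Python reference standing in for a memory address)."""
    class _Node:
        def __init__(self, value, next_node):
            self.value, self.next = value, next_node

    def __init__(self):
        self._top = None

    def push(self, value):
        self._top = self._Node(value, self._top)

    def pop(self):
        node, self._top = self._top, self._top.next
        return node.value
```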
  • the datastores described in this paper can be cloud-based datastores.
  • a cloud-based datastore is a datastore that is compatible with cloud-based computing systems and engines.
  • the stabilized device 102 is intended to represent a device capable of being stabilized.
  • the stabilized device 102 can be an applicable device to be stabilized in functioning to perform an applicable function.
  • the stabilized device 102 can include a media capturing device.
  • the stabilized device 102 can be a smart phone with a camera configured to capture a video.
  • the stabilized device 102 can be one of a light, a sonar system, a gun, a laser system, applicable devices used on an airplane or a boat, a headlamp, and a tray used by a waiter to carry food and beverage.
  • the stabilized device 102 can be a laser system used in fish farming to kill lice.
  • the stabilized device 102 is a light on a boat.
  • the stabilized device 102 can be portable. In being portable, the stabilized device 102 can be detached from an applicable device for stabilizing the stabilized device, such as the computer vision-based device stabilization systems described in this paper.
  • the computer vision-based device stabilization system 104 is intended to represent a system 104 that functions to stabilize a device based on computer perception.
  • the computer vision-based device stabilization system 104 is physically, and potentially removably, connected to an applicable device, such as the stabilized devices described in this paper, for purposes of stabilizing the device.
  • the computer vision-based device stabilization system 104 can be coupled to a stabilized device through an applicable physical connection.
  • the computer vision-based device stabilization system 104 can include a frame and clamps for physically affixing the computer vision-based device stabilization system 104 to a stabilized device.
  • the computer vision-based device stabilization system 104 can include a frame sized for a stabilized device to physically affix the computer vision-based device stabilization system 104 to the stabilized device.
  • the computer vision-based device stabilization system 104 includes one or a plurality of gimbals.
  • the computer vision-based device stabilization system 104 can include one or a plurality of gimbals to allow a stabilized device affixed to the computer vision-based device stabilization system 104 to rotate about one or a plurality of axes.
  • the device can be stabilized.
  • a gimbal of the computer vision-based device stabilization system 104 can be driven with a motor to cause a device affixed to the computer vision-based device stabilization system 104 to rotate about an axis for purposes of stabilizing the device.
  • the computer vision-based device stabilization system 104 can include a three-axis gimbal for use in stabilizing a device affixed to the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can include one or a plurality of motors for causing a three-axis gimbal to move along one or a plurality of axes for purposes of stabilizing a device affixed to the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 includes a sensor that functions to operate for purposes of stabilizing a device affixed to the computer vision-based device stabilization system 104.
  • a sensor included as part of the computer vision-based device stabilization system 104 can be an applicable sensor for sensing electromagnetic radiation, sound waves, pressure, or other detectable stimuli in an environment surrounding the computer vision-based device stabilization system 104.
  • a sensor included as part of the computer vision-based device stabilization system 104 can be a camera configured to capture images or video of an environment at the computer vision-based device stabilization system 104 for purposes of stabilizing a device affixed to the computer vision- based device stabilization system 104.
  • a sensor included as part of the computer vision-based device stabilization system 104 can be a camera configured to capture images within a field of view of an environment at the computer vision-based device stabilization system 104.
  • a sensor included as part of the computer vision-based device stabilization system 104 can be a 360° camera configured to capture a 360° field of view of the environment at the computer vision-based device stabilization system 104.
  • a sensor of the computer vision-based device stabilization system 104 can be integrated as part of the stabilized device 102.
  • a sensor of the computer vision-based device stabilization system 104 can be a camera of a smartphone being stabilized.
  • the computer vision-based device stabilization system 104 functions to stabilize a device based on detectable stimuli in an environment surrounding the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can stabilize a device based on stimuli detected by an applicable sensor at the computer vision-based device stabilization system 104 such as the sensors described in this paper.
  • the computer vision- based device stabilization system 104 can control displacement of one or a plurality of gimbals to move a device physically connected to the computer vision-based device stabilization system 104 for purposes of stabilizing the device.
  • for example, upon detecting a person in the environment, the computer vision-based device stabilization system 104 can displace a device towards the person.
  • the computer vision-based device stabilization system 104 functions to stabilize a device based on images of a field of view at the computer vision- based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can apply computer vision to images.
  • the computer vision-based device stabilization system 104 can apply computer vision to a changing field of view of an environment to determine whether either or both the computer vision-based device stabilization system 104 and a device affixed to it are moving.
  • the computer vision-based device stabilization system 104 can subsequently correct movement of the device to stabilize it by causing one or more gimbals to displace, subsequently causing the device to displace along one or more axes.
  • the computer vision-based device stabilization system 104 can use an applicable computer-vision method for purposes of stabilizing a device affixed to it.
  • the computer vision-based device stabilization system 104 can use one or an applicable combination of image processing, machine vision, pattern recognition, machine learning, and photogrammetry to stabilize a device affixed to it through computer vision, as in the sketch below.
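  • As a hedged illustration of this kind of computer-vision measurement (the patent does not specify an algorithm), the following sketch uses OpenCV feature tracking and a rigid-transform fit to estimate how a device moved between two frames; the function name and parameter values are assumptions for illustration only.

```python
# Hedged sketch: estimate apparent device motion between two grayscale
# video frames; the inverse of this motion is a correction a stabilizer
# could apply. One plausible instance, not the patented method.
import cv2
import numpy as np

def estimate_frame_motion(prev_gray, curr_gray):
    """Return (dx, dy, d_theta): translation and rotation between frames."""
    # Detect corner features in the previous frame.
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=10)
    if prev_pts is None:
        return 0.0, 0.0, 0.0
    # Track the features into the current frame (Lucas-Kanade flow).
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, prev_pts, None)
    mask = status.flatten() == 1
    good_prev, good_curr = prev_pts[mask], curr_pts[mask]
    if len(good_prev) < 3:
        return 0.0, 0.0, 0.0
    # Fit a rigid (rotation + translation + uniform scale) transform
    # between the two point sets.
    m, _inliers = cv2.estimateAffinePartial2D(good_prev, good_curr)
    if m is None:
        return 0.0, 0.0, 0.0
    dx, dy = float(m[0, 2]), float(m[1, 2])
    d_theta = float(np.arctan2(m[1, 0], m[0, 0]))
    return dx, dy, d_theta
```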
  • the computer vision-based device stabilization system 104 functions to stabilize a device based on objects in images of an environment at the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can identify the objects in the images.
  • the computer vision-based device stabilization system 104 can apply computer vision to images of an environment to identify objects in the images at the computer vision- based device stabilization system 104.
  • the computer vision-based device stabilization system 104 functions to track movement of objects in a field of view of an environment captured at the computer vision-based device stabilization system 104 for purposes of stabilizing a device.
  • the computer vision-based device stabilization system 104 can track movement of objects in a field of view captured at the computer vision-based device stabilization system 104 using computer vision.
  • the computer vision-based device stabilization system 104 can identify objects that are moving in a captured field of view, e.g. from multiple images of the field of view.
  • the computer vision-based device stabilization system 104 can track movement of identified moving objects in a captured field of view at the computer vision-based device stabilization system 104 to identify a reference orientation for purposes of stabilizing a device affixed to the computer vision-based device stabilization system 104. For example, if a painting in a field of view captured at the computer vision-based device stabilization system 104 is identified as a moving object, then the computer vision-based device stabilization system 104 can track movements of the painting in the field of view to determine a reference orientation for purposes of stabilizing a device affixed to the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 functions to stabilize a device using long term tracking.
  • the computer vision-based device stabilization system 104 can use long term tracking of an object in an image of a field of view at the computer vision-based device stabilization system 104 to stabilize a device affixed to the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can use tracking of an object in a field of view or absent from a field of view over time to track movements of either or both the computer vision-based device stabilization system 104 and a device to subsequently stabilize the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can track an object in an image of a field of view using point tracking. For example, the computer vision-based device stabilization system 104 can identify a point in an object in an image of an environment at the computer vision-based device stabilization system 104. Further in the example, the computer vision-based device stabilization system 104 can subsequently follow movement of the object in the field of view in subsequent images of the environment by tracking movement of the point in the object in the subsequent images of the environment (see the sketch below).
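  • A minimal sketch of the point-tracking idea, assuming OpenCV's Lucas-Kanade optical flow as one possible realization (the patent does not name a specific tracker; the class name is an illustrative assumption):

```python
# Minimal single-point tracker built on Lucas-Kanade optical flow.
import cv2
import numpy as np

class PointTracker:
    def __init__(self, first_gray, point_xy):
        # Store the single tracked point in the (1, 1, 2) shape LK expects.
        self.prev_gray = first_gray
        self.point = np.array([[point_xy]], dtype=np.float32)

    def update(self, curr_gray):
        """Follow the point into the new frame; return (x, y) or None if lost."""
        new_point, status, _err = cv2.calcOpticalFlowPyrLK(
            self.prev_gray, curr_gray, self.point, None)
        self.prev_gray = curr_gray
        if status[0][0] == 1:
            self.point = new_point
            return float(new_point[0, 0, 0]), float(new_point[0, 0, 1])
        return None   # point left the field of view or could not be tracked
```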
  • the computer vision-based device stabilization system 104 functions to use predictive tracking models in tracking objects in a captured field of view at the computer vision-based device stabilization system 104 for purposes of stabilizing a device.
  • the computer vision-based device stabilization system 104 can use a predictive tracking model to stabilize a device if a tracked object in a field of view captured at the computer vision-based device stabilization system 104 used in stabilizing the device moves out of the view.
  • a predictive tracking model includes applicable data used in determining an expected movement of a device.
  • a predictive tracking model can specify a pattern of movement of a device that the device will continue to follow as it moves.
  • a predictive tracking model can be generated through either or both machine learning and tracked movements of a device.
  • a predictive tracking model can be generated based on a pattern of movement of a device as determined by tracking an object in a field of view of an environment captured at the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 functions to maintain predictive tracking models.
  • the computer vision-based device stabilization system 104 can generate and update predictive tracking models for use in stabilizing a device.
  • the computer vision-based device stabilization system 104 can maintain predictive tracking models that are specific to one or a combination of characteristics of a stabilized device, a device type of a stabilized device, characteristics of an environment surrounding a stabilized device, and characteristics of a user of a stabilized device.
  • the computer vision-based device stabilization system 104 can create a predictive tracking model including patterns of involuntary movements made by the user due to Parkinson's disease.
  • the computer vision-based device stabilization system 104 can generate a predictive tracking model based on a weight of a device and physical laws governing movements of the device based on its weight.
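  • As a hedged sketch of a predictive tracking model, the following assumes a simple constant-velocity motion pattern that coasts when the tracked object leaves the field of view; the patent leaves the model open (e.g. machine-learned patterns, user-specific movement patterns, weight-based physics), so this is one illustrative instance:

```python
# Constant-velocity predictor: observe while the object is visible,
# predict while it is lost. Illustrative, not the disclosed method.
import numpy as np

class ConstantVelocityPredictor:
    def __init__(self):
        self.position = None          # last known (x, y)
        self.velocity = np.zeros(2)   # per-frame displacement

    def observe(self, xy):
        """Update the model with a real measurement of the object."""
        xy = np.asarray(xy, dtype=float)
        if self.position is not None:
            self.velocity = xy - self.position
        self.position = xy

    def predict(self):
        """Expected position one frame ahead, used while the object is lost."""
        if self.position is None:
            return None               # nothing observed yet
        self.position = self.position + self.velocity
        return tuple(self.position)
```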
  • the computer vision-based device stabilization system 104 functions to stabilize a device based on compensation parameters.
  • Compensation parameters include applicable parameters for compensating stabilization of a device based on detectable stimuli in an environment of the computer vision-based device stabilization system 104.
  • An example of a compensation parameter is intended movement of either or both a device affixed to the computer vision-based device stabilization system 104 or the computer vision- based device stabilization system 104 itself.
  • a compensation parameter can include linear movements of a user operating the computer vision-based device stabilization system 104 to stabilize an affixed device.
  • a compensation parameter can include linear forward and backward movements of a boat upon which the computer vision- based device stabilization system 104 is operating to stabilize an affixed device.
  • the computer vision-based device stabilization system 104 can determine compensation parameters based on input received from an applicable source.
  • the computer vision-based device stabilization system 104 can determine linear movements of a user based on either or both images of a field of view at the computer vision-based device stabilization system 104 and an accelerometer integrated at the computer vision-based device stabilization system 104 as part of a sensor.
  • the computer vision-based device stabilization system 104 can determine linear movements based on position data received over time from a global positioning system (hereinafter "GPS") sensor integrated at the computer vision-based device stabilization system 104.
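  • A hedged sketch of the compensation idea: measured motion can be treated as intended motion (e.g. a walking user, an advancing boat) plus unwanted shake, and a smoothed estimate of the intended component, e.g. derived from GPS positions or accelerometer data over time, can be subtracted so only the disturbance is corrected. The helper names below are illustrative assumptions, not terms from the patent:

```python
# Separate intentional motion from shake by low-pass filtering.
import numpy as np

def smooth_velocity(samples, alpha=0.1):
    """Low-pass filter raw 2-D velocity samples to estimate the slow,
    intentional part of the motion (e.g. from GPS positions over time)."""
    estimate = np.zeros(2)
    for s in samples:
        estimate = (1 - alpha) * estimate + alpha * np.asarray(s, dtype=float)
    return estimate

def shake_component(measured_velocity, intended_velocity):
    """Velocity the stabilizer should cancel: measured minus intended."""
    return np.asarray(measured_velocity, dtype=float) - np.asarray(
        intended_velocity, dtype=float)
```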
  • the computer vision-based device stabilization system 104 includes a battery for use in powering the computer vision-based device stabilization system 104 to stabilize an affixed device.
  • a battery of the computer vision-based device stabilization system 104 can be a battery of a device the computer vision-based device stabilization system 104 is operating to stabilize.
  • a battery of the computer vision-based device stabilization system 104 can be a battery of a smart phone the computer vision-based device stabilization system 104 is operating to stabilize.
  • a battery can be integrated as part of the computer vision-based device stabilization system 104 separate from a device the computer vision-based device stabilization system 104 is operating to stabilize.
  • the computer vision-based device stabilization system 104 can be integrated with its own battery, thereby allowing the computer vision-based device stabilization system 104 to stabilize devices that lack their own power source.
  • the computer vision-based device stabilization system 104 includes a battery separate from an affixed device and secured in close proximity to the affixed device. In securing a battery separate from an affixed device in close proximity to the affixed device, a total size of the computer vision-based device stabilization system 104 can decrease.
  • fewer or shorter wires are needed to power the computer vision-based device stabilization system 104, leading to one or a combination of: longer battery life in operation of the computer vision-based device stabilization system 104; reduced assembly costs in assembling the computer vision-based device stabilization system 104; achievement of a smaller actual size of the computer vision-based device stabilization system 104; and reduced heat output of the computer vision-based device stabilization system 104 in operation.
  • an inertia of an inner portion of the computer vision-based device stabilization system 104 increases while an inertia of an outer portion of the computer vision- based device stabilization system 104 decreases leading to greater ease in stabilizing the affixed device.
  • the computer vision-based device stabilization system 104 includes a handle.
  • a handle included as part of the computer vision-based device stabilization system 104 can be removable from the computer vision-based device stabilization system 104.
  • a handle can be removable from the computer vision-based device stabilization system 104 to allow for easier transportation and storage of the computer vision- based device stabilization system 104.
  • a handle included as part of the computer vision-based device stabilization system 104 can include an attachment mechanism for coupling the handle to the computer vision-based device stabilization system 104.
  • a handle included as part of the computer vision-based device stabilization system 104 can include a steel insert with threads.
  • a handle included as part of the computer vision-based device stabilization system 104 can be foldable and portable to allow for easy transport and storage of the handle. Additionally, a handle included as part of the computer vision-based device stabilization system 104 can be modular to allow for easy transport and storage of the handle.
  • the computer vision-based device stabilization system 104 includes a handle attachment mechanism for securing a removable handle to the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can include a threaded recess for attaching a removable handle to the computer vision-based device stabilization system 104.
  • a handle attachment mechanism of the computer vision-based device stabilization system 104 can be configured to receive different types of handles.
  • the computer vision-based device stabilization system 104 can include a handle attachment mechanism capable of securing both a gripped handle and a handle with a clamp to the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 includes a handle that is coupled to one or more gimbals of the computer vision-based device stabilization system 104.
  • a handle can be physically coupled to one or more gimbals of the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can include a handle with a large pitch, thereby requiring fewer revolutions of one or more gimbals in attaching the one or more gimbals to the handle.
  • a handle coupled to one or more gimbals of the computer vision-based device stabilization system 104 can include a dampener between the handle and the one or more gimbals.
  • a dampener between a handle and one or more gimbals of the computer vision-based device stabilization system 104 can dampen translational movement of either or both the handle and the one or more gimbals along one or a plurality of translational axes.
  • the computer vision-based device stabilization system 104 includes a handle integrated with electrical components to facilitate transmission of electrical signals through the handle.
  • the computer vision-based device stabilization system 104 can include a handle having a grip with electrically conductive pins to allow for transferring electrical signals through the handle.
  • the computer vision- based device stabilization system 104 can include a handle with an attachment mechanism with electrical connections for electrically coupling the handle to the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can include a handle with an electrical connector separate from an attachment mechanism and configured to electrically couple the handle to the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 includes a handle through which a user can provide user input for purposes of controlling the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can include a handle through which a user can provide input to control either or both moving an affixed device and stabilizing an affixed device through the computer vision-based device stabilization system 104.
  • the computer vision-based device stabilization system 104 can include a handle with one or a plurality of actuators or applicable mechanisms through which a user can provide input to control either or both movement of an affixed device and stabilization of an affixed device.
  • the computer vision-based device stabilization system 104 can include a handle with a button that when activated causes an affixed device to be displaced along a specific axis.
  • User input provided through a handle can be provided to a control circuit, e.g. residing on a main printed circuit board (hereinafter referred to as "PCB") of the computer vision-based device stabilization system 104, which can subsequently control operation of one or more gimbals of the computer vision-based device stabilization system 104 based on the input.
  • the stabilized device 102 is affixed to one or more gimbals of the computer vision-based device stabilization system 104 for purposes of stabilizing the stabilized device 102.
  • the computer vision-based device stabilization system 104 senses stimuli in an environment surrounding the computer vision-based device stabilization system 104. Further, in the example of operation of the example system shown in FIG. 1, the computer vision-based device stabilization system 104 stabilizes the stabilized device 102 through the one or more gimbals based on the detectable stimuli in the environment surrounding the computer vision-based device stabilization system 104.
  • FIG. 2 depicts a flowchart 200 of an example of a method for stabilizing a device based on detectable stimuli.
  • the flowchart 200 begins at module 202, where a user affixes a device to one or more gimbals of a computer vision-based device stabilization system.
  • a device affixed to one or more gimbals of a computer vision-based device stabilization system can be an applicable portable device capable of being stabilized, such as a smart phone, a light, a sonar system, a gun, a laser system, a utensil, and a tray.
  • a device affixed to one or more gimbals of a computer vision-based device stabilization system can be a light on a ship rocking back and forth.
  • the flowchart 200 continues to module 204, where stimuli in an environment surrounding the computer vision-based device stabilization system are sensed.
  • one or more sensors detect stimuli in an environment surrounding the computer vision-based device stabilization system and within range of the one or more sensors.
  • a camera can generate images of a field of view in an environment surrounding the computer vision-based device stabilization system.
  • a transducer can translate audio noises made in an environment surrounding the computer vision-based device stabilization system into an electrical signal as part of sensing stimuli in the environment.
  • the flowchart 200 continues to module 206, where the device is stabilized through the one or more gimbals based on the detectable stimuli in the environment surrounding the computer vision-based device stabilization system.
  • the computer vision-based device stabilization system can cause displacement of the gimbals to displace the device along one or more axes for purposes of stabilizing the device.
  • the device can be stabilized by applying computer vision to images of a field of view in the environment for purposes of stabilizing the device based on the detectable stimuli in the environment. For example, objects in images of a field of view in the environment can be identified and tracked for purposes of stabilizing the device.
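  • The FIG. 2 flow can be summarized in a short control-loop sketch; the sensor, vision, and gimbal objects are placeholders for whatever hardware and algorithms an implementation supplies, not interfaces defined by the patent:

```python
# Compact sketch of the FIG. 2 flow (modules 204 and 206).
def stabilization_loop(sensor, vision, gimbal_assembly):
    reference = None
    while True:
        image = sensor.capture()                 # module 204: sense stimuli
        orientation = vision.estimate_orientation(image)
        if reference is None:
            reference = orientation              # first frame sets the reference
        error = reference - orientation          # drift from the reference
        gimbal_assembly.displace(error)          # module 206: correct via gimbals
```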
  • FIG. 3 depicts a diagram 300 of a system for stabilizing a device based on detectable stimuli in an environment at the device.
  • the system shown in FIG. 3 can be implemented as part of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • the system shown in FIG. 3 includes a computer-readable medium 302, a frame 304, a gimbal assembly 306, optionally a handle 308, a sensor 310, and a computer vision-based stabilization control system 312.
  • the gimbal assembly 306, the sensor 310, and the computer vision-based stabilization control system 312 are coupled to each other through the computer-readable medium 302.
  • the gimbal assembly 306 can be physically coupled to or implemented as part of the frame 304.
  • the frame 304 is intended to represent a frame configured to receive and physically affix a device to itself for purposes of stabilizing the device.
  • the frame 304 can be of a size and shape to physically secure a device to itself for purposes of stabilizing the device.
  • the frame 304 can be of a size and shape to physically secure a smart phone or camera that falls within a range of dimensions.
  • the frame 304 can include securing mechanisms for securing a device to itself for purposes of stabilizing the device.
  • the frame 304 can include clips for securing a device to the frame 304 for purposes of stabilizing the device.
  • the frame 304 can be configured to rigidly secure a device to itself for purposes of causing the device to move in the same movements as the frame 304 is moved.
  • the frame 304 can be configured to rigidly secure a device such that as the frame moves specific distances along three axes, the device moves the same specific distances along the three axes.
  • the frame 304 functions to contain a battery.
  • a battery contained within the frame 304 can be used to power one or an applicable combination of the components shown in FIG. 3 or all or portions of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • a form factor of an applicable device for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper, can be decreased, leading to greater portability of both devices.
  • the battery does not need to be contained within a handle of an applicable device for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper, allowing for device compatibility with a removable handle.
  • in being compatible with a removable handle, the handle can be removed and the computer vision-based device stabilization system along with a stabilized device can be mounted, e.g. on a helmet of a user.
  • the frame 304 utilizes a claw mechanism as a securing mechanism for purposes of affixing a device to itself for purposes of stabilizing the device.
  • a claw mechanism can be configured to allow for a battery to be stored in the frame 304.
  • a claw mechanism can eliminate a need to thread a rod through at least a portion of the frame 304, which would otherwise prevent the frame 304 from containing a battery.
  • a claw mechanism included as part of the frame 304 can include gears with clockwise threads and counterclockwise threads that displace opposing members of the claw mechanism towards and away from each other for purposes of physically engaging and subsequently securing a device to the frame 304.
  • the gimbal assembly 306 is intended to represent an assembly to cause a device affixed to the frame 304 to be displaced for purposes of stabilizing the device. Specifically, the gimbal assembly 306 can be operated to cause the frame 304 to displace along one or a plurality of axes for purposes of stabilizing a device affixed to the frame 304.
  • the gimbal assembly 306 can include one or more gimbals to cause displacement of the frame 304 and subsequently a device affixed to the frame 304 along one or more axes.
  • the gimbal assembly 306 can include three gimbals to cause the frame 304 and a device affixed to the frame to displace along one of three corresponding axes for purposes of stabilizing the device.
  • the gimbal assembly 306 includes one or more pivot mechanisms allowing one or more gimbals in the gimbal assembly 306 to pivot about one or more axes for purposes of stabilizing a device.
  • the gimbal assembly 306 can include a pivot mechanism to facilitate a corresponding gimbal of the gimbal assembly 306 to displace along a specific axis.
  • the corresponding gimbal of the gimbal assembly can displace along the specific axis to cause the frame 304 to displace and subsequently stabilize a device affixed to the frame.
  • a pivot mechanism can correspond to one or more gimbals on a 1:1 or a 1:n basis.
  • each gimbal of the gimbal assembly 306 can have its own corresponding pivot mechanism that allows each gimbal to pivot about a corresponding axis. Further in the example, in having one corresponding pivot mechanism for each gimbal of the gimbal assembly, each gimbal can pivot independently of the others to allow for displacement of the frame 304 along any direction.
  • the gimbal assembly 306 includes drive mechanisms for driving displacement of a gimbal of the gimbal assembly 306 through a pivot mechanism along an axis.
  • a drive mechanism can include an applicable mechanism for introducing mechanical force to cause displacement of a gimbal about a pivot mechanism.
  • a drive mechanism can include an electric motor, e.g. a brushed or brushless electric motor.
  • the gimbal assembly 306 can include a separate drive mechanism for each gimbal in the gimbal assembly 306.
  • by including a separate drive mechanism for each gimbal in the gimbal assembly 306, displacement of each gimbal can be controlled separately from the other gimbals to allow for displacement of the frame in any direction for purposes of stabilizing a device affixed to the frame.
  • Characteristics of different drive mechanisms of the gimbal assembly can vary and be dependent upon the specific gimbal the drive mechanism displaces. For example, if one gimbal of the gimbal assembly 306 supports the largest amount of weight of an affixed device compared to the other gimbals of the gimbal assembly 306, then the drive mechanism for that gimbal can be stronger than the drive mechanisms for the other gimbals.
  • the gimbal assembly 306 includes one or more encoders for measuring performance characteristics of drive mechanisms.
  • One or more encoders included in the gimbal assembly 306 can be used to measure performance characteristics of drive mechanisms, e.g. motors, in displacing one or more gimbals of the gimbal assembly 306 around a pivot mechanism.
  • Performance characteristics measured by one or more encoders of the gimbal assembly 306 include applicable performance characteristics related to a drive mechanism of the gimbal assembly.
  • performance characteristics measured by one or more encoders of the gimbal assembly can include an angular position of a drive mechanism, e.g. with respect to a fixed reference point.
  • performance characteristics measured by one or more encoders of the gimbal assembly can include an angular speed at which a drive mechanism is moving.
  • performance characteristics of drive mechanisms measured by one or more encoders of the gimbal assembly 306 are used to determine one or a combination of an actual orientation of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304.
  • an angular position and an angular speed of a drive mechanism, as determined by an encoder of the gimbal assembly can be used to determine an actual orientation of one or more gimbals of the gimbal assembly 306, which can subsequently be used to stabilize a device affixed to the frame 304.
  • an actual orientation of the frame 304 corresponding to an actual orientation of one or more gimbals in the gimbal assembly 306, can be determined, which can be used to displace the one or more gimbals to a new position.
  • the frame 304 can be displaced to a reference orientation corresponding to the new position of the one or more gimbals for purposes of stabilizing a device affixed to the frame 304.
  • a reference orientation of the frame 304 can correspond to a reference orientation of a device affixed to the frame 304. For example, moving the frame 304 to a reference orientation can subsequently displace a device affixed to the frame 304 to a reference orientation of the device for purposes of stabilizing the device.
  • An encoder of the gimbal assembly 306 is an applicable encoder for measuring performance of a drive mechanism of the gimbal assembly.
  • an encoder of the gimbal assembly 306 can be an incremental angular encoder or an absolute angular encoder.
  • One or more encoders of the gimbal assembly 306 can be implemented on a main PCB of an applicable system for controlling stabilization of a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • one or more encoders of the gimbal assembly 306 can generate performance characteristics data indicating performance characteristics of a drive mechanism and subsequently provide the performance characteristics data to a main PCB of an applicable system for controlling stabilization of a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • an encoder of the gimbal assembly 306 can generate performance characteristics data for a drive mechanism at an applicable frequency for stabilizing a device using computer vision.
  • an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at an applicable frequency for stabilizing a device based on computer vision using the drive mechanism.
  • an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency between 400Hz and 1600Hz.
  • An encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency to reduce or otherwise eliminate shaking in a video or a series of images captured at an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency greater than or equal to 400Hz.
  • an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency to limit triggering of image stabilization at a media capturing device being stabilized by an applicable system for stabilizing a device, such as the computer vision-based device stabilization systems described in this paper.
  • an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency of 1600Hz or greater.
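  • As an illustration of these encoder measurements, the arithmetic below converts raw incremental-encoder counts into an angular position, and successive samples at the polling rate into an angular speed; the resolution and rate constants are example assumptions, with the rate chosen inside the 400Hz to 1600Hz band discussed above:

```python
# Illustrative encoder arithmetic (example constants, not patent values).
import math

COUNTS_PER_REV = 4096   # example encoder resolution (counts per revolution)
SAMPLE_HZ = 800         # example polling rate within the discussed band

def angle_rad(count):
    """Angular position of the drive mechanism from a raw count."""
    return (count % COUNTS_PER_REV) * 2.0 * math.pi / COUNTS_PER_REV

def angular_speed_rad_s(prev_count, curr_count):
    """Angular speed between two consecutive samples
    (wraparound handling omitted for brevity)."""
    delta = curr_count - prev_count
    return delta * 2.0 * math.pi / COUNTS_PER_REV * SAMPLE_HZ
```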
  • an encoder of the gimbal assembly 306 can generate and provide performance characteristics data of a drive mechanism according to an arbitration-free protocol. In generating and providing performance characteristics data according to an arbitration-free protocol, an encoder of the gimbal assembly 306 can utilize short wires and refrain from using time on a bus, thereby reducing times required to generate and communicate the data. Further, an encoder of the gimbal assembly 306 can act as a wireless encoder in either or both generating and sending performance characteristics data.
  • in a specific implementation, the gimbal assembly 306 includes a first, second, and third drive mechanism positioned at increasing distances from a proximal end towards a distal end of the frame 304.
  • the first drive mechanism can be positioned closer to a proximal end of the frame 304, while the third drive mechanism can be positioned closer to a distal end of the frame 304, and the second drive mechanism can be positioned between the first drive mechanism and the third drive mechanism.
  • the gimbal assembly 306 includes a specific number of wires physically passing through specific drive mechanisms for purposes of controlling operation of three drive mechanisms at progressive distances away from a proximal end of the frame 304. Specifically, at a first drive mechanism closest to the proximal end of the gimbal assembly 306, six wires can be split into ten wires for purposes of driving three drive mechanisms of the gimbal assembly 306. Further, at a second drive mechanism second closest to the proximal end of the gimbal assembly 306, six wires can be split into seven wires for purposes of driving three drive mechanisms of the gimbal assembly 306. Additionally, at a third drive mechanism furthest from the proximal end and closest to the distal end of the gimbal assembly 306, six wires can be split into five wires for purposes of driving all three drive mechanisms of the gimbal assembly 306.
  • the gimbal assembly 306 is designed to reduce friction of wires through drive mechanisms included in the gimbal assembly 306.
  • the gimbal assembly 306 can couple as few as three wired cables to each of the drive mechanisms for purposes of controlling the drive mechanisms.
  • the gimbal assembly 306 can include wires for controlling the drive mechanisms with an increased group winding through each drive mechanism leading to a smaller overall core. As a result, reduced wire friction can be observed through the gimbal assembly 306 and in particular drive mechanisms of the assembly 306.
  • the gimbal assembly 306 is optionally coupled to a handle 308.
  • the handle 308 can be an applicable handle for use with an applicable system for stabilizing a device based on computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • the handle 308 is capable of being removably coupled to the gimbal assembly, thereby allowing for the handle 308 to be optional.
  • when the handle 308 is removed from the gimbal assembly 306, the gimbal assembly 306 and corresponding frame 304 can be mounted, e.g. on a helmet or body part of a user.
  • the handle 308 can include one or a plurality of actuators or applicable mechanisms through which a user can provide user input for controlling either or both movement of a device being stabilized and actual stabilization of the device.
  • the handle 308 can be used to provide user input indicating to rotate a stabilized device, while the device is being stabilized.
  • the sensor 310 is intended to represent an applicable sensor for detecting stimuli in an environment at either or both the gimbal assembly 306 and a device affixed to the frame 304, such as the sensors described in this paper.
  • the sensor 310 can be a camera configured to capture images of a field of view of an environment at either or both the gimbal assembly 306 and a device affixed to the frame 304.
  • the sensor 310 can be integrated as part of a device affixed to the gimbal assembly 306.
  • the sensor 310 can be a camera of a smart phone affixed to the frame 304 for purposes of stabilizing the smart phone.
  • the sensor 310 can be separate from a device affixed to the frame 304 for purposes of stabilizing the device.
  • the sensor 310 can be a 360° camera.
  • the sensor 310 can be positioned on or near the frame.
  • the sensor 310 can capture images of a field of view in an environment at a frequency to limit or eliminate shaking in the images captured by the sensor by stabilizing the sensor 310.
  • the sensor 310 can capture images at a frequency fast enough to allow the sensor 310 to be stabilized, as part of it being integrated with a device affixed to the frame 304, the frame itself, or the gimbal assembly 306.
  • the sensor 310 can capture images of a field of view in an environment at a rate of 240 frames-per-second or greater.
  • the computer vision-based stabilization control system 312 is intended to represent a system that functions to control stabilization of a device affixed to the frame 304 using computer vision.
  • the computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 based on an actual orientation of one or a combination of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 based on one or a combination of a reference orientation of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can determine how to operate the gimbal assembly to move a device affixed to the frame 304 from its actual orientation to its reference orientation for purposes of stabilizing the device based on an actual orientation and a reference orientation of the device.
  • the computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 based on detectable stimuli detected by an applicable sensor, such as the sensors described in this paper.
  • the computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 based on images of a field of view in an environment at either or both the gimbal assembly 306 and a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 by applying computer vision to the images of the field of view in an environment at either or both the gimbal assembly 306 and the device affixed to the frame 304 to determine one or a combination of an actual orientation of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 functions to control operation of the gimbal assembly 306 for purposes of stabilizing the device. Specifically, the computer vision-based stabilization control system 312 can control actuation of drive mechanisms of the gimbal assembly 306 to control movement of the gimbal assembly 306 and subsequently the frame 304 for purposes of stabilizing a device affixed to the frame 304. For example, the computer vision-based stabilization control system 312 can control actuation of one or more motors of the gimbal assembly 306 to control displacement of one or more gimbals of the gimbal assembly 306 to subsequently stabilize a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can send an actuation signal to the drive mechanisms to cause the drive mechanisms to operate in a specific way.
  • the computer vision-based stabilization control system 312 can send an actuation signal to cause a drive mechanism of the gimbal assembly to generate a specific amount of linear force or torque in a specific direction to move a device affixed to the frame 304 from its actual orientation to a reference orientation.
  • the computer vision-based stabilization control system 312 can control actuation of a motor by controlling its magnetic field using an actuation signal, transmitted either wired or wirelessly.
  • the computer vision-based stabilization control system 312 functions to control operation of the gimbal assembly 306 for stabilizing a device based on performance characteristics of one or more drive mechanisms of the gimbal assembly 306.
  • the computer vision-based stabilization control system 312 can control operation of the gimbal assembly 306 based on performance characteristics data received from one or more encoders of the gimbal assembly 306.
  • the computer vision-based stabilization control system 312 can control operation of the gimbal assembly 306 based on performance characteristics data of one or more drive mechanisms of the gimbal assembly determined at a frequency of 1600Hz.
  • the computer vision-based stabilization control system 312 can use either or both an angular position of a drive mechanism of the gimbal assembly 306 and an angular speed at which a drive mechanism is moving to control operation of the gimbal assembly 306. For example, the computer vision-based stabilization control system 312 can determine a current position, e.g. angle position, of a drive mechanism of the gimbal assembly and a desired orientation of a drive mechanism for purposes of stabilizing a device affixed to the frame 304.
  • a desired orientation of a drive mechanism can correlate to a desired orientation of the device being stabilized, such that subsequent actuation of the drive mechanism, by displacing one or more gimbals in the gimbal assembly 306, displaces the device to the desired orientation.
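By way of illustration only (this paper discloses no source code), a control law using both the encoder-reported angular position and angular speed could take a PD form as sketched below in Python; the gain values, units, and function names are assumptions:

```python
# Illustrative PD-style control law; gains and units are assumptions,
# not values disclosed in this paper.
KP = 8.0   # proportional gain on angular error
KD = 0.4   # damping gain on measured angular speed

def actuation_command(actual_angle, desired_angle, angular_speed):
    """Compute a drive-mechanism command that moves the device toward
    the desired orientation while damping motion with the angular speed."""
    error = desired_angle - actual_angle
    return KP * error - KD * angular_speed
```

A command of this form could then be encoded into the actuation signal sent to the drive mechanism.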
  • the computer vision-based stabilization control system 312 functions to use computer vision to stabilize a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can apply computer vision to images of a field of view in an environment at either or both a device affixed to the frame 304 and the gimbal assembly 306 to control stabilization of the device using the gimbal assembly 306.
  • In applying computer vision to images of an environment at either or both a device affixed to the frame 304 and the gimbal assembly 306, the computer vision-based stabilization control system 312 can determine movement of the device or the gimbal assembly 306.
  • the computer vision-based stabilization control system 312 can determine a reference orientation to which to displace the device for purposes of stabilizing the device, e.g. an original position of the device. For example, if, based on computer vision, the computer vision-based stabilization control system 312 determines a device affixed to the frame 304 has fallen 1mm from an old orientation to a new orientation, then the computer vision-based stabilization control system 312 can determine the reference orientation of the device for purposes of stabilizing the device is at the coordinates in space of the old orientation.
  • the computer vision-based stabilization control system 312 functions to generate an actuation signal for purposes of stabilizing a device based on a determined actual orientation and reference orientation.
  • the computer vision-based stabilization control system 312 can generate an actuation signal based on an actual orientation and a reference orientation of one or a combination of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can determine the device has moved 1mm to the left of its reference orientation.
  • the computer vision-based stabilization control system 312 can generate an actuation signal to cause the gimbal assembly to move the device 1mm to the right to its reference orientation for stabilizing the device. Additionally, in generating an actuation signal based on actual and reference orientations for purposes of stabilizing a device, the computer vision-based stabilization control system 312 can factor in compensation parameters.
  • For example, if the computer vision-based stabilization control system 312 determines a device affixed to the frame 304 has been displaced 1mm to the right while linearly travelling at a speed of twenty miles per hour, then the computer vision-based stabilization control system 312 can generate an actuation signal to cause the gimbal assembly 306 to displace the device 1mm to the left while it is still travelling at a speed of twenty miles per hour.
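How such compensation parameters might be factored in is sketched below: the reference is advanced along the intended motion so that only the unwanted offset is corrected. The coordinate convention and numbers are assumptions, not details from this paper:

```python
import numpy as np

def compensated_correction(reference_xyz, velocity_xyz, dt, observed_xyz):
    """Advance the reference along the intended linear motion, then
    correct only the residual (unwanted) displacement."""
    reference = np.asarray(reference_xyz, dtype=float) \
        + np.asarray(velocity_xyz, dtype=float) * dt
    correction = reference - np.asarray(observed_xyz, dtype=float)
    return reference, correction

# e.g. a device travelling forward at ~8.9 m/s (twenty miles per hour)
# that has drifted 1mm to the right over a 10 ms interval:
ref, corr = compensated_correction([0.0, 0.0, 0.0], [8.9, 0.0, 0.0], 0.01,
                                   [0.089, 0.001, 0.0])
# corr is approximately [0, -0.001, 0]: move 1mm left,
# leaving the intended forward travel intact
```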
  • the computer vision-based stabilization control system 312 can be integrated as part of a main PCB of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • the computer vision-based stabilization control system 312 can send actuation signals directly to drive mechanisms of the gimbal assembly 306 without passing through an encoder of the gimbal assembly 306.
  • drive mechanisms can be controlled faster leading to increased stabilization speeds and decreased de-stabilization of a device affixed to the frame 304. Further, this can reduce shaking in a video captured by a device affixed to the frame 304 as a result of the observed decreased de-stabilization of a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 functions to control operation of a drive mechanism of the gimbal assembly 306 at an applicable frequency for stabilizing a device using computer vision.
  • the computer vision-based stabilization control system 312 can either or both generate and send actuation signals to a drive mechanism at an applicable frequency for stabilizing a device using the drive mechanism based on computer vision.
  • the computer vision-based stabilization control system 312 can generate and send actuation signals at a frequency between 400Hz and 1600Hz.
  • the computer vision-based stabilization control system 312 can generate and send actuation signals at a frequency to reduce or otherwise eliminate shaking in a video or a series of images captured at an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • the computer vision-based stabilization control system 312 can generate and send actuation signals to a drive mechanism at a frequency greater than or equal to 400Hz.
  • the computer vision-based stabilization control system 312 can generate and send actuation signals to a drive mechanism at a frequency to limit triggering of image stabilization at a stabilized media capturing device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can generate and send actuation signals to a drive mechanism at a frequency of 1600Hz or greater.
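As an illustrative sketch only (real firmware would more likely use a hardware timer interrupt; the function names are assumptions), a fixed-rate software loop at the 1600Hz figure from the text could look like:

```python
import time

CONTROL_HZ = 1600              # actuation frequency from the text
PERIOD = 1.0 / CONTROL_HZ

def run_control_loop(compute_and_send, duration_s):
    """Call compute_and_send at CONTROL_HZ for duration_s seconds."""
    next_tick = time.monotonic()
    deadline = next_tick + duration_s
    while next_tick < deadline:
        compute_and_send()
        next_tick += PERIOD
        sleep_for = next_tick - time.monotonic()
        if sleep_for > 0:
            time.sleep(sleep_for)
```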
  • the computer vision-based stabilization control system 312 functions to disable a stabilizer on a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can function to disable a software-based stabilizer operating on a smart phone or camera affixed to the frame 304 for purposes of eliminating interference in stabilization of the phone or camera caused by the stabilizer operating on the phone or camera.
  • the computer vision-based stabilization control system 312 functions to be implemented, at least in part, at a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can be implemented as a native application or a web-based application at a device affixed to the frame 304.
  • the computer vision-based stabilization control system 312 can function to synchronize an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper, to actually stabilize the device.
  • the computer vision-based stabilization control system 312 can be implemented at a smart phone and configure the smart phone to provide a video feed captured at the phone to the computer vision-based stabilization control system 312 for purposes of stabilizing the phone. Additionally, in being implemented, at least in part, at a device affixed to the frame 304, the computer vision-based stabilization control system 312 can disable a stabilizer on the device. For example, the computer vision-based stabilization control system 312 can enable applications at a device to disable a software-based stabilizer of the device.
  • the computer vision-based stabilization control system 312 is implemented as part of a 2-differential bus. In being implemented as part of a 2-differential bus, the computer vision-based stabilization control system 312 can use a first bus to send a signal transmission window to the sensor 310 and subsequently receive data, e.g. performance characteristics data. Additionally, the computer vision-based stabilization control system 312 can use a different bus to send actuation signals to a drive mechanism of the gimbal assembly 306.
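A rough sketch of such a two-bus split, with sensor traffic on one bus and actuation traffic on another; the port names, baud rate, and framing below are assumptions, not details from this paper:

```python
import serial  # pyserial; port names, baud rate, and framing are assumptions

sensor_bus = serial.Serial('/dev/ttyS0', baudrate=115200, timeout=0.001)
drive_bus = serial.Serial('/dev/ttyS1', baudrate=115200, timeout=0.001)

def poll_sensor():
    """Grant the sensor a transmission window on the first bus, then
    read back data such as performance characteristics."""
    sensor_bus.write(b'TX_WINDOW\n')
    return sensor_bus.readline()

def send_actuation(command: bytes):
    """Actuation signals travel on the second bus, so they never
    contend with sensor traffic."""
    drive_bus.write(command)
```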
  • FIG. 4 depicts a flowchart 400 of an example of a method for stabilizing a device using computer vision.
  • the flowchart 400 begins at module 402, where a device is affixed to a frame of a computer vision-based device stabilization system.
  • a device can be affixed to a frame of a computer vision-based device stabilization system through an applicable securing mechanism.
  • a device can be affixed to a frame of a computer vision-based device stabilization system through a claw mechanism. Further in the example, the claw mechanism extends either not at all or only partially into the frame to allow a battery to be contained within the frame for purposes of reducing a form factor of the computer vision-based device stabilization system.
  • performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system are received.
  • Performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system can be received from one or a plurality of encoders corresponding to the drive mechanisms.
  • Performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system can be either or both generated and transmitted according to an arbitration- free protocol.
  • Performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system can include either or both an angular position of the drive mechanisms and an angular speed at which the drive mechanisms are being displaced.
  • Images of a captured field of view of an environment at the computer vision-based device stabilization system can be received from an applicable sensing source, such as the sensors described in this paper.
  • images of a captured field of view of an environment at the computer vision-based device stabilization system can be received from a camera of the device affixed to the frame of the computer vision-based device stabilization system.
  • images of a captured field of view of an environment at the computer vision-based device stabilization system can be received from a sensor integrated as part of the frame and separate from the device affixed to the frame.
  • an actual orientation of the device can be determined from the performance characteristics data. For example, an actual orientation of the device can be determined from an actual orientation of one or a combination of the drive mechanisms of the gimbal assembly, gimbals of the gimbal assembly, and the frame, as indicated by the performance characteristics data of the drive mechanisms of the gimbal assembly. Additionally, an actual orientation of the device can be determined from one or a combination of an accelerometer, a gyroscope, and a GPS sensor integrated as part of the computer vision-based device stabilization system. For example, data generated by an accelerometer, a gyroscope, and a GPS sensor integrated on a main PCB of the computer vision-based device stabilization system can be used to determine an actual orientation of the device affixed to the frame.
  • the flowchart 400 continues to module 410, where computer vision is applied to the images of the captured field of view of the environment to determine a reference orientation of the device.
  • one or a plurality of objects in the captured field of view can be identified using computer vision.
  • an identified object can be tracked in the images using computer vision to determine a reference orientation. For example, if an object at an initial location in an image is at a specific coordinate in space, a reference orientation of the device can be determined to be the specific coordinate.
  • the flowchart 400 continues to module 412, where the gimbal assembly is controlled based on the performance characteristics to stabilize the device by correlating the actual orientation of the device with the reference orientation of the device. For example, if the actual orientation is at a first coordinate in space and the reference orientation of the device is at a second coordinate in space, then the gimbal assembly can be controlled to move the device from the first coordinate to the second coordinate. In controlling the gimbal assembly by correlating the actual orientation with the reference orientation, the drive mechanisms of the gimbal assembly can be actuated to cause the gimbal to move the device from the actual orientation to the reference orientation for purposes of stabilizing the device.
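By way of illustration (the paper discloses no source code), module 412 reduced to a single axis might look like the following sketch; the angle representation and the send_actuation callback are assumptions:

```python
def stabilize_step(actual_angle, reference_angle, send_actuation):
    """Correlate the actual orientation with the reference orientation
    and actuate the gimbal assembly by the difference."""
    correction = reference_angle - actual_angle
    send_actuation(correction)
    return correction

# usage with a stubbed drive mechanism:
sent = []
stabilize_step(actual_angle=1.2, reference_angle=1.0, send_actuation=sent.append)
# sent == [-0.2]: rotate back toward the reference orientation
```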
  • FIG. 5 depicts a diagram 500 of a system for directly controlling drive mechanisms of a gimbal assembly for purposes of stabilizing a device using computer vision.
  • the system shown in FIG. 5 includes a drive mechanism 502, an encoder 504, and a main PCB 506. Further, in the system shown in FIG. 5, the main PCB 506 includes a computer vision-based stabilization control system 508.
  • the components shown in FIG. 5 can be integrated as part of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • the drive mechanism 502 is intended to represent a drive mechanism of a gimbal assembly for purposes of stabilizing a device coupled to the gimbal assembly.
  • the drive mechanism 502 can be an applicable mechanism for driving one or a plurality of gimbals in a gimbal assembly.
  • the drive mechanism 502 can be a brushless electric motor.
  • the drive mechanism 502 can be controlled through actuation signals sent to the drive mechanism 502 for purposes of stabilizing a device.
  • the drive mechanism 502 can be actuated to move a device affixed to a frame coupled to a gimbal assembly from an actual orientation to a reference orientation for purposes of stabilizing the device.
  • the encoder 504 is intended to represent an applicable mechanism for measuring performance characteristics of the drive mechanism 502. While the encoder 504 is shown to be separate from the main PCB 506, in various implementations, the encoder 504 can be implemented at the main PCB 506.
  • the encoder can be an applicable encoder for measuring performance characteristics of the drive mechanism 502.
  • the encoder 504 can be an incremental angular encoder configured to measure an angular position of the drive mechanism 502 with respect to a fixed reference point.
  • the encoder 504 can determine performance characteristics at an applicable frequency for stabilizing a device using computer vision. For example, the encoder 504 can determine performance characteristics of the drive mechanism 502 at a frequency between 400Hz and 1600Hz.
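For illustration, an incremental angular encoder of this kind might be read as in the sketch below; the count resolution is an assumption, not a value from this paper:

```python
import math

COUNTS_PER_REV = 4096   # assumed encoder resolution

def encoder_angle(count):
    """Angle in radians relative to the fixed reference (index) point."""
    return (count % COUNTS_PER_REV) * 2.0 * math.pi / COUNTS_PER_REV

def encoder_speed(prev_count, count, dt):
    """Angular speed from two samples taken 1/1600 s to 1/400 s apart."""
    return (count - prev_count) * 2.0 * math.pi / (COUNTS_PER_REV * dt)
```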
  • the main PCB 506 is intended to represent a main PCB of a system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
  • the main PCB 506 can include one or a combination of an accelerometer, a gyroscope, and a GPS sensor.
  • One or a combination of an accelerometer, a gyroscope, and a GPS sensor integrated as part of the main PCB 506 can be used to determine an actual orientation of a device for purposes of stabilizing the device.
  • an accelerometer and a gyroscope on the main PCB 506 can be used to determine changes to yaw, pitch, and roll of a frame supporting a device, which can subsequently be used to determine an actual orientation of the device as it is affixed to the frame.
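A common way to fuse these sensors is a complementary filter, sketched below for pitch (roll is analogous; yaw additionally needs a heading reference such as GPS). This is an illustrative technique, not one the paper specifies, and the filter weight is an assumption:

```python
import math

ALPHA = 0.98   # assumed filter weight favoring the gyroscope

def complementary_pitch(prev_pitch, gyro_rate, accel_x, accel_z, dt):
    """Fuse integrated gyroscope rate with the accelerometer's gravity
    direction into a drift-corrected pitch estimate (radians)."""
    gyro_pitch = prev_pitch + gyro_rate * dt          # fast but drifts
    accel_pitch = math.atan2(accel_x, accel_z)        # noisy but absolute
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch
```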
  • the main PCB 506 includes a computer vision-based stabilization control system 508.
  • the computer vision-based stabilization control system 508 is intended to represent an applicable system for controlling stabilization of a device using computer vision, such as the computer vision-based stabilization control systems described in this paper.
  • the computer vision-based stabilization control system 508 can control a gimbal assembly for purposes of stabilizing a device using computer vision.
  • the computer vision-based stabilization control system 508 can apply computer vision to images of a field of view of an environment at a device to determine a reference orientation of the device for purposes of stabilizing the device. Additionally, the computer vision-based stabilization control system 508 can calculate an actual orientation of a device for purposes of stabilizing the device.
  • the computer vision-based stabilization control system 508 can calculate an actual orientation of a device from either or both performance characteristics data received from the encoder 504 and position data generated by one or a combination of an accelerometer, a gyroscope, and a GPS sensor integrated at the main PCB 506.
  • the computer vision-based stabilization control system 508 functions to send an actuation signal to the drive mechanism 502 for purposes of actuating the drive mechanism 502 in stabilizing a device.
  • the computer vision-based stabilization control system 508 can send the actuation signal directly to the drive mechanism 502 instead of sending the actuation signal to the drive mechanism 502 indirectly through the encoder 504.
  • the computer vision-based stabilization control system 508 can generate the actuation signal based on an actual orientation of the device, a reference orientation of the device, and performance characteristics data of the drive mechanism 502 received from the encoder 504. For example, the computer vision-based stabilization control system 508 can determine how to operate the drive mechanism 502 to move a device from its actual orientation to a reference orientation based on the performance characteristics data of the drive mechanism 502.
  • FIG. 6 depicts a diagram of a computer vision-based reference orientation identification system 602.
  • the computer vision-based reference orientation identification system 602 is intended to represent a system that functions to determine a reference orientation for use in stabilizing a device.
  • the computer vision-based reference orientation identification system 602 can be implemented as part of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. Additionally, the computer vision-based reference orientation identification system 602 can be implemented as part of an applicable system for controlling a gimbal assembly for purposes of stabilizing a device, such as the computer vision-based stabilization control systems described in this paper.
  • the computer vision-based reference orientation identification system 602 functions to identify a reference orientation using applicable input received at the computer vision-based reference orientation identification system 602.
  • the computer vision-based reference orientation identification system 602 can identify a reference orientation based on input received from a user. For example, the computer vision-based reference orientation identification system 602 can identify a reference orientation by tracking an object within a portion of an image of a field of view of an environment identified by a user. Additionally, the computer vision-based reference orientation identification system 602 can identify a reference orientation based on input stimuli. For example, the computer vision-based reference orientation identification system 602 can apply computer vision to captured images of a field of view of an environment at a device for purposes of stabilizing the device. In another example, the computer vision-based reference orientation identification system 602 can use input from an acoustic sensor to determine distances in images of a field of view of an environment at a device for purposes of stabilizing the device.
  • the computer vision-based reference orientation identification system 602 shown in FIG. 6 includes an input engine 604, a predictive tracking model maintenance engine 606, a predictive tracking model datastore 608, a tracking object identification engine 610, and a reference orientation identification engine 612.
  • the input engine 604 is intended to represent an engine that functions to receive input for purposes of stabilizing a device using computer vision.
  • the input engine 604 can receive input from a user operating an applicable system for stabilizing a device, such as the computer vision-based device stabilization systems described in this paper.
  • the input engine 604 can receive input indicating a portion of a frame of a field of view of an environment at a device to stabilize, for use in actually stabilizing the device using computer vision.
  • the input engine 604 can receive user input indicating an object to track in a field of view of an environment at a device for purposes of stabilizing the device using computer vision.
  • the input engine 604 functions to receive input indicating detectable stimuli in an environment at a device for purposes of stabilizing the device using computer vision.
  • the input engine 604 can receive input indicating detectable stimuli in an environment at a device for purposes of stabilizing the device from an applicable sensor for detecting stimuli.
  • the input engine 604 can receive images of a field of view of an environment at a device from a camera, either or both implemented as part of the device or separate from the device.
  • the input engine 604 can receive from an acoustic sensor input indicating distances a device or a system for controlling stabilization of the device are away from objects in an environment at the device.
  • the predictive tracking model maintenance engine 606 is intended to represent an engine that functions to maintain a predictive tracking model for use in determining a reference orientation of a device using computer vision.
  • a predictive tracking model maintained by the predictive tracking model maintenance engine 606 can be applied to determine a reference orientation of a device for purposes of stabilizing the device. For example, if an object in images of a field of view of an environment at a device is being tracked to determine a reference orientation, and the object disappears from the images, then a predictive tracking model can be applied to determine the reference orientation without needing to track the object.
  • the predictive tracking model maintenance engine 606 can generate and continuously update a predictive tracking model.
  • the predictive tracking model maintenance engine 606 can maintain a predictive tracking model in real time as a device is stabilized.
  • a predictive tracking model maintained by the predictive tracking model maintenance engine 606 can be specific to a device, characteristics of a device, and characteristics of operation of a device by a user.
  • a predictive tracking model can be unique to a specific device or unique to a specific type of device.
  • a predictive tracking model can be used to determine a reference orientation of all smart phones of a specific make and model.
  • the predictive tracking model maintenance engine 606 functions to maintain a predictive tracking model based on historical movements of a device. Specifically, the predictive tracking model maintenance engine 606 can update a predictive tracking model in real time to indicate how orientation of a device has changed to build a historical movement pattern of the device. For example, if a device on a ship is constantly swaying back and forth in rhythm with waves, then the predictive tracking model maintenance engine 606 can maintain a predictive tracking model to indicate the constant swaying back and forth.
  • the predictive tracking model maintenance engine 606 can maintain a predictive tracking model based on historical movements determined through computer vision. Specifically, the predictive tracking model maintenance engine 606 can update a predictive tracking model to include movements of a device determined by applying computer vision to images of a field of an environment.
  • the predictive tracking model maintenance engine 606 functions to maintain a predictive tracking model based on either or both physical and natural laws.
  • the predictive tracking model maintenance engine 606 can apply physical or natural laws based on characteristics of a device or an operator of the device. For example, if a human is operating a device, then the predictive tracking model maintenance engine 606 can build a predictive tracking model indicating the device cannot fly as a human cannot fly. In another example, if a device has a specific mass, then the predictive tracking model maintenance engine 606 can build a predictive tracking model indicating if a device is accelerating at a specific rate, then it will have a specific increasing velocity based on the mass of the object.
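A minimal predictive tracking model along these lines is sketched below; the constant-velocity assumption and the speed bound standing in for a physical law are illustrative choices, not details from this paper:

```python
class ConstantVelocityPredictor:
    """Extrapolate the last observed motion, clamped to a physically
    plausible speed (e.g. a hand-held device moves at human speeds)."""

    def __init__(self, max_speed=5.0):   # m/s, assumed physical bound
        self.pos = 0.0
        self.vel = 0.0
        self.max_speed = max_speed

    def observe(self, pos, dt):
        """Update the model from a position measured via computer vision."""
        raw_vel = (pos - self.pos) / dt
        self.vel = max(-self.max_speed, min(self.max_speed, raw_vel))
        self.pos = pos

    def predict(self, dt):
        """Estimate position when the tracked object is not visible."""
        self.pos += self.vel * dt
        return self.pos
```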
  • the predictive tracking model datastore 608 is intended to represent a datastore that functions to store predictive tracking model data indicating predictive tracking models. Predictive tracking models indicated by predictive tracking model data stored in the predictive tracking model datastore 608 can be used to determine a reference orientation of a device for purposes of stabilizing the device. For example, if a tracked object falls out of images of a field of view of an environment at a device, then a predictive tracking model indicated by predictive tracking model data stored in the predictive tracking model datastore 608 can be applied to determine a reference orientation of the device for purposes of stabilizing it.
  • the tracking object identification engine 610 is intended to represent an engine that functions to identify an object in images to track for purposes of determining a reference orientation of a device.
  • a reference orientation determined by tracking an object identified by the tracking object identification engine 610 can be used to stabilize a device.
  • the tracking object identification engine 610 can identify an object to track in images of a field of view of an environment at a device for purposes of stabilizing the device.
  • the tracking object identification engine 610 can identify an object to track in images captured by a camera of a device of a field of view of an environment at the device for purposes of stabilizing it.
  • the tracking object identification engine 610 can identify an object to track in images captured by a camera separate from a device of a field of view of an environment at the device for purposes of stabilizing the device.
  • the tracking object identification engine 610 can apply computer vision to identify objects to track in images of a field of view of an environment.
  • the tracking object identification engine 610 can identify objects to track in images using applicable computer vision techniques.
  • the tracking object identification engine 610 can use either or both a scale-invariant feature transform (hereinafter referred to as "SIFT") object detection method or a speeded up robust features (hereinafter referred to as "SURF") object detection method to identify objects to track in images.
  • the tracking object identification engine 610 can apply computer vision to identify key points in images and match the key points in the images as part of point tracking to identify objects to track in the images.
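As a concrete but illustrative example of key point detection and matching, using OpenCV's SIFT implementation (available in the main opencv-python package from version 4.4; the ratio-test threshold is an assumption):

```python
import cv2

def match_keypoints(img_a, img_b):
    """Detect SIFT key points in two frames and match them, a typical
    basis for point tracking across images."""
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(gray_a, None)
    kp_b, des_b = sift.detectAndCompute(gray_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des_a, des_b, k=2)
    # Lowe's ratio test keeps only distinctive matches
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]
    return kp_a, kp_b, good
```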
  • the tracking object identification engine 610 functions to create a model of an object to track by applying computer vision to images including the object.
  • the tracking object identification engine 610 can use a point correlation computer vision method to create a model of an object to track.
  • the tracking object identification engine 610 can slide an object around in a frame or an image to identify key points, e.g. through application of either or both SIFT or SURF object detection methods.
  • the tracking object identification engine 610 can identify key points through application of computer vision across a plurality of images.
  • the tracking object identification engine 610 can match the key points to build a model of the object.
  • the tracking object identification engine 610 functions to select an object to track based on characteristics of the object. For example, if an object is moving across a plurality of images of a field of view of an environment, then the tracking object identification engine 610 can select the object for purposes of tracking the object. In another example, if an object is of a specific size in an image, then the tracking object identification engine 610 can select the object for purposes of tracking the object.
  • the tracking object identification engine 610 functions to identify an object to track based on user input. For example, the tracking object identification engine 610 can identify an object to track in a portion of an image of a captured field of view of an environment. Further in the example, the tracking object identification engine 610 can center the portion of the image of the captured field of view of the environment for purposes of tracking the identified object in the portion of the image. Alternatively, the tracking object identification engine 610 can select a portion of an image of a captured field of view of an environment independent from user input and subsequently identify an object to track in the portion of the image. In tracking an object in only a portion of an image, computational resources needed to identify and track the object are reduced. This reduces consumed or required processing power and conserves battery power. As a result, the tracking object identification engine 610 and other applicable parts of the computer vision-based reference orientation identification system 602 can be implemented on a non-industrial device.
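Searching only a region of interest might look like the following template-matching sketch; the ROI layout and the matching method are assumptions rather than details from this paper:

```python
import cv2

def track_in_roi(frame, roi, template):
    """Search for the tracked object only inside a region of interest,
    reducing per-frame computation relative to a full-frame search."""
    x, y, w, h = roi
    patch = frame[y:y + h, x:x + w]
    scores = cv2.matchTemplate(patch, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    # convert the best match back to full-frame coordinates
    return (x + best_loc[0], y + best_loc[1]), best_score
```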
  • the reference orientation identification engine 612 is intended to represent an engine that functions to determine a reference position of a device for purposes of stabilizing a device.
  • the reference orientation identification engine 612 can determine a reference position of a device by tracking objects in images of a field of view of an environment at the device. For example, the reference orientation identification engine 612 can determine a changing reference position of a device has moved to the left 1mm based on objects tracked in images of a field of view of an environment of the device moving 1mm to the left.
  • the reference orientation identification engine 612 can apply computer vision to images of a field of view of an environment at a device to track objects in the images. For example, the reference orientation identification engine 612 can apply either or both SIFT and SURF feature recognition mechanisms to track movement of an object across a plurality of images of a field of view.
  • the reference orientation identification engine 612 functions to apply a predictive tracking model for purposes of tracking objects.
  • the reference orientation identification engine 612 can apply a predictive tracking model to predict a position an object should be at for purposes of identifying a reference orientation. For example, if a predictive tracking model indicates an object should have moved 1mm down with respect to a point in space, then the reference orientation identification engine 612 can predict a changing reference orientation as 1mm down with respect to the point.
  • the reference orientation identification engine 612 can apply a predictive tracking model based on whether it is currently able to track an object in images of a field of view at a device. For example, if a tracked object disappears from images of a field of view, e.g. a user obscures a sensor for capturing the images, then the reference orientation identification engine 612 can apply a predictive tracking model to determine a reference orientation of a device.
  • the reference orientation identification engine 612 functions to use compensation parameters to determine a reference orientation of a device. For example, if compensation parameters indicate a device is being moved linearly forward at a speed of ten miles per hour, then the reference orientation identification engine 612 can factor in that a sensor used to capture images is moving at ten miles per hour as part of tracking an object in the captured images.
  • the reference orientation identification engine 612 can receive input from an applicable sensor for use in applying compensation parameters to determine a reference orientation.
  • the reference orientation identification engine 612 can determine a linear speed of a device from an accelerometer and subsequently use the speed as a compensation parameter.
  • the reference orientation identification engine 612 can determine an orientational speed of a gimbal from an accelerometer and subsequently use the orientational speed as a compensation parameter in determining a reference orientation of a device.
  • the reference orientation identification engine 612 functions to manipulate images to change a size of tracked objects in the images. In manipulating images to change sizes of tracked objects, the reference orientation identification engine 612 can use changes to zoom level or aspect ratios resulting from changing sizes of images and corresponding tracked objects to determine a reference position. In changing sizes of tracked objects in images, the objects can be more easily identified and tracked without risk of losing the tracked objects. The reference orientation identification engine 612 can use an applicable image processing technique to manipulate images to change sizes of tracked objects.
  • the reference orientation identification engine 612 functions to determine a reference position based on input received from an acoustic sensor.
  • the reference orientation identification engine 612 can use input received from an acoustic sensor to determine a measure of distance between a sensor and an object, a device being stabilized and an object, or different objects. Further in the example, the reference orientation identification engine 612 can use a measured distance to determine dimensions of a tracked object, a captured field of view, or other features in a captured field of view for purposes of determining a reference orientation of a device.
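The underlying arithmetic is time-of-flight: distance is the speed of sound times the echo delay, halved for the round trip. A minimal sketch, assuming an airborne echo-style acoustic sensor:

```python
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C

def acoustic_distance(echo_delay_s):
    """Distance to an object from a round-trip echo delay; halved
    because the pulse travels out and back."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# e.g. a 5.83 ms echo corresponds to roughly 1 metre
assert abs(acoustic_distance(0.00583) - 1.0) < 0.01
```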
  • the input engine 604 receives input from a sensor for use in determining a reference orientation of a device for purposes of stabilizing a device.
  • the predictive tracking model maintenance engine 606 maintains a predictive tracking model for use in determining a reference orientation of the device.
  • the predictive tracking model datastore 608 stores predictive tracking model data indicating the predictive tracking model maintained by the predictive tracking model maintenance engine 606.
  • the tracking object identification engine 610 identifies an object to track in images, received as input by the input engine 604, of a field of view of an environment at the device.
  • the reference orientation identification engine 612 tracks the object in the images to determine a reference orientation. In the example of operation of the example system shown in FIG. 6, the reference orientation identification engine 612 uses the predictive tracking model to further determine the reference orientation of the device.
  • FIG. 7 depicts a flowchart 700 of an example of a method for using computer vision to determine a reference orientation of a device for purposes of stabilizing the device.
  • the flowchart 700 begins at module 702, where input including images of a field of view of an environment at a device is received for purposes of stabilizing the device.
  • An applicable engine for receiving perceivable stimuli, such as the input engines described in this paper, can receive input including images of a field of view of an environment at a device for purposes of stabilizing the device.
  • Images of a field of view of an environment at a device can be received from an applicable source, such as the sensors described in this paper.
  • images of a field of view of an environment at a device can be received from a camera integrated as part of the device.
  • images of a field of view of an environment at a device can be received from a camera integrated as part of a frame of a computer vision-based device stabilization system.
  • the flowchart 700 continues to module 704, where an object to track in the images is identified to determine a reference orientation of the device for purposes of stabilizing the device.
  • An applicable engine for identifying objects to track in identifying a reference orientation of a device, such as the tracking object identification engines described in this paper, can identify an object to track in the images to determine a reference orientation of the device.
  • An object to track in the images can be identified using computer vision. For example, either or both a SIFT object detection method and a SURF object detection method can be used to identify an object to track in the images of the field of view of the environment at the device.
  • the flowchart 700 continues to module 706, where the object is tracked to determine the reference orientation of the device for purposes of stabilizing the device.
  • An applicable engine for tracking an object in images for purposes of determining a reference orientation of a device can track the object in the images of the field of view of the environment at the device.
  • the object can be tracked to determine the reference orientation by applying computer vision to the images of the field of view of the environment at the device.
  • the reference orientation can be determined based on input received from an acoustic sensor.
  • the flowchart 700 optionally continues to module 708, where a predictive tracking model is used to track the object to determine the reference orientation of the device for purposes of stabilizing the device.
  • a predictive tracking model can be applied by an applicable engine for tracking an object in images for purposes of determining a reference orientation of a device, such as the reference orientation identification engines described in this paper. For example, a predictive tracking model can be applied to determine an expected position of the device.
  • a predictive tracking model can be applied if the object can no longer be tracked or otherwise disappears from the images of the field of view of the environment at the device. For example, if a mountain no longer appears in the images of the field of view of the environment at the device, then the predictive tracking model can be applied to determine the reference orientation.
  • An applicable engine for maintaining predictive tracking models, such as the predictive tracking model maintenance engines described in this paper, can maintain a predictive tracking model for use in determining the reference orientation of the device.
  • FIG. 8 depicts a diagram 800 of an example of a device stabilization control system 802.
  • the device stabilization control system 802 is intended to represent a system that functions to control stabilization of a device based on a determined actual orientation of the device and a determined reference orientation of the device.
  • the device stabilization control system 802 can control operation of an applicable system for stabilizing a device, such as the computer vision-based device stabilization systems described in this paper.
  • the device stabilization control system 802 can generate and send actuation signals to a drive mechanism of a gimbal assembly for purposes of causing the gimbal assembly to displace a device from an actual orientation to a reference orientation.
  • the device stabilization control system 802 can be implemented as part of an applicable system for controlling a gimbal assembly for purposes of stabilizing a device, such as the computer vision-based stabilization control systems described in this paper.
  • the device stabilization control system 802 functions to control stabilization of a device through application of computer vision to determine a reference orientation. Specifically, the device stabilization control system 802 can apply computer vision to images of a field of view of an environment at a device to determine a reference orientation of the device for purposes of stabilizing the device. Further, the device stabilization control system 802 can compare a reference orientation of a device to an actual orientation of the device to determine how to actuate driving mechanisms to cause the device to be moved from its actual orientation to its reference orientation. The device stabilization control system 802 can determine an actual orientation of a device based on either or both performance characteristics data of a drive mechanism and data received from an accelerometer, a GPS sensor, and a gyroscope.
  • the device stabilization control system 802 shown in FIG. 8 includes an input engine 804, an actual orientation identification engine 806, a reference orientation identification engine 808, and a stabilization control engine 810.
  • the input engine 804 functions according to an applicable engine for receiving input for use in identifying a reference orientation of a device, such as the input engines described in this paper.
  • the input engine 804 can receive input for identifying a reference orientation of a device from an applicable source.
  • the input engine 804 can receive input including images of a field of view of an environment from a camera integrated as part of a device to be stabilized.
  • the input engine 804 can receive input indicating pitch, roll, and yaw from one or a combination of an accelerometer, a gyroscope, and a GPS sensor.
  • the actual orientation identification engine 806 is intended to represent an engine that functions to determine an actual orientation of a device for purposes of stabilizing the device.
  • the actual orientation identification engine 806 can determine an actual orientation of a device based on performance characteristics of a drive mechanism for a gimbal assembly. For example, an actual position of a device can be determined based on angular positions of a drive mechanism indicated by performance characteristics data of gimbals affixed to the device.
  • the actual orientation identification engine 806 can determine an actual orientation of a device based on data received from one or a combination of a GPS sensor, an accelerometer, and a gyroscope. For example, an actual orientation of a device can be determined based on measurements made by an accelerometer of a frame of a computer vision-based device stabilization system.
  • the reference orientation identification engine 808 is intended to represent an engine that functions to determine a reference orientation of a device for purposes of stabilizing the device, such as the reference orientation identification engines described in this paper.
  • the reference orientation identification engine 808 can determine a reference orientation using computer vision. For example, the reference orientation identification engine 808 can track an identified object in images of a field of view of a device to determine a reference orientation of the device. Additionally, the reference orientation identification engine 808 can determine a reference orientation based on data received from one or a combination of an accelerometer, a GPS sensor, and a gyroscope.
  • the stabilization control engine 810 functions to control stabilization of a device based on a determined actual orientation of the device and a reference orientation of the device.
  • the stabilization control engine 810 can control a drive mechanism of a gimbal assembly to cause a device to displace from an actual orientation to a reference orientation.
  • the stabilization control engine 810 can generate and send an actuation signal to cause drive mechanisms to displace a device from its actual orientation to a new orientation.
  • the stabilization control engine 810 can control a drive mechanism of a gimbal assembly based on performance characteristics data for the drive mechanism and received from an applicable source, such as the encoders described in this paper.
  • the stabilization control engine 810 can determine an actual orientation of a drive mechanism from performance characteristics data and subsequently determine how much to displace the drive mechanism toward a desired position in order to position a device at a reference orientation based on the actual position. Further in the example, the stabilization control engine 810 can generate and send an actuation signal to cause the drive mechanism to displace to the desired position, subsequently positioning the device at a reference orientation.
  • the input engine 804 receives input for use in stabilizing a device.
  • the actual orientation identification engine 806 determines an actual orientation of a device based on the input received by the input engine 804.
  • the reference orientation identification engine 808 determines a reference orientation of a device based on computer vision.
  • the stabilization control engine 810 actuates a drive mechanism of a gimbal assembly to cause the device to be moved from its actual orientation to its reference orientation based on performance characteristics of the drive mechanism received by the input engine 804.

Abstract

Techniques for stabilizing a device. A system utilizing such techniques can include a gimbal assembly, drive mechanisms, a frame, and a computer vision-based stabilization control system. A method utilizing such techniques can include identification of an object in images of a field of view of an environment at a device and tracking of the object in the images to identify a reference orientation of the device.

Description

DEVICE STABILIZATION
BRIEF DESCRIPTION OF THE DRAWINGS
[0001] FIG. 1 depicts a diagram of an example of a system for stabilizing a media capturing device using computer vision.
[0002] FIG. 2 depicts a flowchart of an example of a method for stabilizing a device based on detectable stimuli.
[0003] FIG. 3 depicts a diagram of a system for stabilizing a device based on detectable stimuli in an environment at the device.
[0004] FIG. 4 depicts a flow chart of an example of a method for stabilizing a device using computer vision.
[0005] FIG. 5 depicts a diagram of a system for directly controlling drive mechanisms of a gimbal assembly for purposes of stabilizing a device using computer vision.
[0006] FIG. 6 depicts a diagram of a computer vision-based reference orientation identification system.
[0007] FIG. 7 depicts a flowchart of an example of a method for using computer vision to determine a reference orientation of a device for purposes of stabilizing the device.
[0008] FIG. 8 depicts a diagram of an example of a device stabilization control system.
DETAILED DESCRIPTION
[0009] FIG. 1 depicts a diagram 100 of an example of a system for stabilizing a media capturing device using computer vision. The system of the example of FIG. 1 includes a stabilized device 102 and a computer vision-based device stabilization system 104.
[0010] Either or both the stabilized device 102 and the computer vision-based device stabilization system 104 can include a computer-readable medium. A computer-readable medium, as discussed in this paper, is intended to include all mediums that are statutory (e.g., in the United States, under 35 U.S.C. 101), and to specifically exclude all mediums that are non-statutory in nature to the extent that the exclusion is necessary for a claim that includes the computer-readable medium to be valid. Known statutory computer-readable mediums include hardware (e.g., registers, random access memory (RAM), non-volatile (NV) storage, to name a few), but may or may not be limited to hardware.
[0011] A computer-readable medium, as discussed in this paper, is intended to represent a variety of potentially applicable technologies. For example, a computer-readable medium can be used to form a network or part of a network. Where two components are co-located on a device, a computer-readable medium can include a bus or other data conduit or plane. Where a first component is co-located on one device and a second component is located on a different device, a computer-readable medium can include a wireless or wired back-end network or LAN. A computer-readable medium can also encompass a relevant portion of a WAN or other network, if applicable.
[0012] The devices, systems, and computer-readable mediums described in this paper can be implemented as a computer system or parts of a computer system or a plurality of computer systems. In general, a computer system will include a processor, memory, non-volatile storage, and an interface. A typical computer system will usually include at least a processor, memory, and a device (e.g., a bus) coupling the memory to the processor. The processor can be, for example, a general-purpose central processing unit (CPU), such as a microprocessor, or a special-purpose processor, such as a microcontroller.
[0013] The memory can include, by way of example but not limitation, random access memory (RAM), such as dynamic RAM (DRAM) and static RAM (SRAM). The memory can be local, remote, or distributed. The bus can also couple the processor to non-volatile storage. The non-volatile storage is often a magnetic floppy or hard disk, a magnetic-optical disk, an optical disk, a read-only memory (ROM), such as a CD-ROM, EPROM, or EEPROM, a magnetic or optical card, or another form of storage for large amounts of data. Some of this data is often written, by a direct memory access process, into memory during execution of software on the computer system. The non-volatile storage can be local, remote, or distributed. The non-volatile storage is optional because systems can be created with all applicable data available in memory.
[0014] Software is typically stored in the non-volatile storage. Indeed, for large programs, it may not even be possible to store the entire program in the memory. Nevertheless, it should be understood that for software to run, if necessary, it is moved to a computer-readable location appropriate for processing, and for illustrative purposes, that location is referred to as the memory in this paper. Even when software is moved to the memory for execution, the processor will typically make use of hardware registers to store values associated with the software, and local cache that, ideally, serves to speed up execution. As used herein, a software program is assumed to be stored at an applicable known or convenient location (from nonvolatile storage to hardware registers) when the software program is referred to as "implemented in a computer-readable storage medium." A processor is considered to be "configured to execute a program" when at least one value associated with the program is stored in a register readable by the processor.
[0015] In one example of operation, a computer system can be controlled by operating system software, which is a software program that includes a file management system, such as a disk operating system. One example of operating system software with associated file management system software is the family of operating systems known as Windows® from Microsoft Corporation of Redmond, Washington, and their associated file management systems. Another example of operating system software with its associated file management system software is the Linux operating system and its associated file management system. The file management system is typically stored in the non-volatile storage and causes the processor to execute the various acts required by the operating system to input and output data and to store data in the memory, including storing files on the non-volatile storage.
[0016] The bus can also couple the processor to the interface. The interface can include one or more input and/or output (I/O) devices. Depending upon implementation-specific or other considerations, the I/O devices can include, by way of example but not limitation, a keyboard, a mouse or other pointing device, disk drives, printers, a scanner, and other I/O devices, including a display device. The display device can include, by way of example but not limitation, a cathode ray tube (CRT), liquid crystal display (LCD), or some other applicable known or convenient display device. The interface can include one or more of a modem or network interface. It will be appreciated that a modem or network interface can be considered to be part of the computer system. The interface can include an analog modem, ISDN modem, cable modem, token ring interface, satellite transmission interface (e.g. "direct PC"), or other interfaces for coupling a computer system to other computer systems. Interfaces enable computer systems and other devices to be coupled together in a network.
[0017] The computer systems can be compatible with or implemented as part of or through a cloud-based computing system. As used in this paper, a cloud-based computing system is a system that provides virtualized computing resources, software and/or information to end user devices. The computing resources, software and/or information can be virtualized by maintaining centralized services and resources that the edge devices can access over a communication interface, such as a network. "Cloud" may be a marketing term and for the purposes of this paper can include any of the networks described herein. The cloud-based computing system can involve a subscription for services or use a utility pricing model. Users can access the protocols of the cloud-based computing system through a web browser or other container application located on their end user device.
[0018] A computer system can be implemented as an engine, as part of an engine or through multiple engines. As used in this paper, an engine includes one or more processors or a portion thereof. A portion of one or more processors can include some portion of hardware less than all of the hardware comprising any given one or more processors, such as a subset of registers, the portion of the processor dedicated to one or more threads of a multi-threaded processor, a time slice during which the processor is wholly or partially dedicated to carrying out part of the engine's functionality, or the like. As such, a first engine and a second engine can have one or more dedicated processors or a first engine and a second engine can share one or more processors with one another or other engines. Depending upon implementation-specific or other considerations, an engine can be centralized or its functionality distributed. An engine can include hardware, firmware, or software embodied in a computer-readable medium for execution by the processor. The processor transforms data into new data using implemented data structures and methods, such as is described with reference to the FIGS. in this paper.
[0019] The engines described in this paper, or the engines through which the systems and devices described in this paper can be implemented, can be cloud-based engines. As used in this paper, a cloud-based engine is an engine that can run applications and/or functionalities using a cloud-based computing system. All or portions of the applications and/or functionalities can be distributed across multiple computing devices, and need not be restricted to only one computing device. In some embodiments, the cloud-based engines can execute functionalities and/or modules that end users access through a web browser or container application without having the functionalities and/or modules installed locally on the end-users' computing devices.
[0020] As used in this paper, datastores are intended to include repositories having any applicable organization of data, including tables, comma-separated values (CSV) files, traditional databases (e.g., SQL), or other applicable known or convenient organizational formats. Datastores can be implemented, for example, as software embodied in a physical computer-readable medium on a specific-purpose machine, in firmware, in hardware, in a combination thereof, or in an applicable known or convenient device or system. Datastore-associated components, such as database interfaces, can be considered "part of" a datastore, part of some other system component, or a combination thereof, though the physical location and other characteristics of datastore-associated components are not critical for an understanding of the techniques described in this paper.
[0021] Datastores can include data structures. As used in this paper, a data structure is associated with a particular way of storing and organizing data in a computer so that it can be used efficiently within a given context. Data structures are generally based on the ability of a computer to fetch and store data at any place in its memory, specified by an address, a bit string that can be itself stored in memory and manipulated by the program. Thus, some data structures are based on computing the addresses of data items with arithmetic operations, while other data structures are based on storing addresses of data items within the structure itself. Many data structures use both principles, sometimes combined in non-trivial ways. The implementation of a data structure usually entails writing a set of procedures that create and manipulate instances of that structure. The datastores described in this paper can be cloud-based datastores. A cloud-based datastore is a datastore that is compatible with cloud-based computing systems and engines.
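By way of a non-limiting illustration, the following Python sketch contrasts the two addressing principles described above; the names are invented for illustration, and Python object references stand in for stored addresses.

```python
class Node:
    """Linked structure: each element stores a reference to the next."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

def nth_by_arithmetic(array, n):
    # Array-style access: the position is computed directly (O(1)),
    # analogous to computing an address with arithmetic operations.
    return array[n]

def nth_by_following_links(head, n):
    # Linked-style access: stored references are followed one by one (O(n)).
    node = head
    for _ in range(n):
        node = node.next
    return node.value

items = [10, 20, 30]
linked = Node(10, Node(20, Node(30)))
assert nth_by_arithmetic(items, 2) == nth_by_following_links(linked, 2) == 30
```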
[0022] The stabilized device 102 is intended to represent a device capable of being stabilized. The stabilized device 102 can be any applicable device that is stabilized while functioning to perform an applicable function. The stabilized device 102 can include a media capturing device. For example, the stabilized device 102 can be a smart phone with a camera configured to capture a video. Additionally, the stabilized device 102 can be one of a light, a sonar system, a gun, a laser system, applicable devices used on an airplane or a boat, a headlamp, and a tray used by a waiter to carry food and beverage. For example, the stabilized device 102 can be a laser system used in fish farming to kill lice. In another example, the stabilized device 102 is a light on a boat. The stabilized device 102 can be portable. In being portable, the stabilized device 102 can be detached from an applicable device for stabilizing the stabilized device, such as the computer vision-based device stabilization systems described in this paper.
[0023] The computer vision-based device stabilization system 104 is intended to represent a system that functions to stabilize a device based on computer perception. The computer vision-based device stabilization system 104 is physically, and potentially removably, connected to an applicable device, such as the stabilized devices described in this paper, for purposes of stabilizing the device. The computer vision-based device stabilization system 104 can be coupled to a stabilized device through an applicable physical connection. For example, the computer vision-based device stabilization system 104 can include a frame and clamps for physically affixing the computer vision-based device stabilization system 104 to a stabilized device. In another example, the computer vision-based device stabilization system 104 can include a frame sized for a stabilized device to physically affix the computer vision-based device stabilization system 104 to the stabilized device. [0024] In a specific implementation, the computer vision-based device stabilization system 104 includes one or a plurality of gimbals. Specifically, the computer vision-based device stabilization system 104 can include one or a plurality of gimbals to allow a stabilized device affixed to the computer vision-based device stabilization system 104 to rotate about one or a plurality of axes. In providing one or more gimbals for purposes of allowing a device affixed to the computer vision-based device stabilization system 104 to rotate about one or more axes, the device can be stabilized. For example, a gimbal of the computer vision-based device stabilization system 104 can be driven with a motor to cause a device affixed to the computer vision-based device stabilization system 104 to rotate about an axis for purposes of stabilizing the device. The computer vision-based device stabilization system 104 can include a three-axis gimbal for use in stabilizing a device affixed to the computer vision-based device stabilization system 104. For example, the computer vision-based device stabilization system 104 can include one or a plurality of motors for causing a three-axis gimbal to move along one or a plurality of axes for purposes of stabilizing a device affixed to the computer vision-based device stabilization system 104.
[0025] In a specific implementation, the computer vision-based device stabilization system 104 includes a sensor that functions to operate for purposes of stabilizing a device affixed to the computer vision-based device stabilization system 104. A sensor included as part of the computer vision-based device stabilization system 104 can be an applicable sensor for sensing electromagnetic radiation, sound waves, pressure, or other detectable stimuli in an environment surrounding the computer vision-based device stabilization system 104. Specifically, a sensor included as part of the computer vision-based device stabilization system 104 can be a camera configured to capture images or video of an environment at the computer vision-based device stabilization system 104 for purposes of stabilizing a device affixed to the computer vision-based device stabilization system 104. For example, a sensor included as part of the computer vision-based device stabilization system 104 can be a camera configured to capture images within a field of view of an environment at the computer vision-based device stabilization system 104. In another example, a sensor included as part of the computer vision-based device stabilization system 104 can be a 360° camera configured to capture a 360° field of view of the environment at the computer vision-based device stabilization system 104. A sensor of the computer vision-based device stabilization system 104 can be integrated as part of the stabilized device 102. For example, a sensor of the computer vision-based device stabilization system 104 can be a camera of a smartphone being stabilized.
[0026] In a specific implementation, the computer vision-based device stabilization system 104 functions to stabilize a device based on detectable stimuli in an environment surrounding the computer vision-based device stabilization system 104. The computer vision-based device stabilization system 104 can stabilize a device based on stimuli detected by an applicable sensor at the computer vision-based device stabilization system 104, such as the sensors described in this paper. In stabilizing a device based on detectable stimuli, the computer vision-based device stabilization system 104 can control displacement of one or a plurality of gimbals to move a device physically connected to the computer vision-based device stabilization system 104 for purposes of stabilizing the device. For example, based on the detected sound of a person speaking, the computer vision-based device stabilization system 104 can displace a device towards the person. In another example, based on recognition of a person in images captured at the computer vision-based device stabilization system 104, the computer vision-based device stabilization system 104 can displace a device towards the person.
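By way of a non-limiting illustration, the following Python sketch shows one way recognition of a person in a captured image could be turned into a displacement command toward the person; it assumes OpenCV's stock face detector, an invented degrees-per-pixel scale, and frame acquisition and motor control existing elsewhere, and is a sketch rather than the system's actual implementation.

```python
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def yaw_offset_toward_person(frame_bgr, degrees_per_pixel=0.1):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return 0.0  # no person detected; hold the current orientation
    x, y, w, h = faces[0]
    face_center_x = x + w / 2.0
    frame_center_x = frame_bgr.shape[1] / 2.0
    # Positive result: person is right of center; pan right by that many degrees.
    return (face_center_x - frame_center_x) * degrees_per_pixel
```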
[0027] In a specific implementation, the computer vision-based device stabilization system 104 functions to stabilize a device based on images of a field of view at the computer vision-based device stabilization system 104. In stabilizing a device based on images of a field of view, the computer vision-based device stabilization system 104 can apply computer vision to images. For example, the computer vision-based device stabilization system 104 can apply computer vision to a changing field of view of an environment to determine whether either the computer vision-based device stabilization system 104 or a device affixed to it is moving. Further in the example, the computer vision-based device stabilization system 104 can subsequently correct movement of the device to stabilize it by causing one or more gimbals to displace, subsequently causing the device to displace along one or more axes. The computer vision-based device stabilization system 104 can use an applicable computer-vision method for purposes of stabilizing a device affixed to it. For example, the computer vision-based device stabilization system 104 can use one or an applicable combination of image processing, machine vision, pattern recognition, machine learning, and photogrammetry to stabilize a device affixed to it through computer vision.
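By way of a non-limiting illustration, the following Python sketch shows one applicable computer-vision method for deciding from a changing field of view whether the system is moving: matching features between consecutive images and reading the apparent shift from a fitted homography. The feature type, match count, and RANSAC threshold are illustrative assumptions, not the patent's specification.

```python
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def estimated_shift(prev_gray, curr_gray):
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None  # not enough texture to detect features
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:100]
    if len(matches) < 4:
        return None  # too few correspondences to fit a homography
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None
    return H[0, 2], H[1, 2]  # apparent x/y shift of the scene, in pixels
```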
[0028] In a specific implementation, the computer vision-based device stabilization system 104 functions to stabilize a device based on objects in images of an environment at the computer vision-based device stabilization system 104. In stabilizing a device based on objects in images of an environment at the computer vision-based device stabilization system 104, the computer vision-based device stabilization system 104 can identify the objects in the images. For example, the computer vision-based device stabilization system 104 can apply computer vision to images of an environment to identify objects in the images at the computer vision-based device stabilization system 104.
[0029] In a specific implementation, the computer vision-based device stabilization system 104 functions to track movement of objects in a field of view of an environment captured at the computer vision-based device stabilization system 104 for purposes of stabilizing a device. The computer vision-based device stabilization system 104 can track movement of objects in a field of view captured at the computer vision-based device stabilization system 104 using computer vision. In tracking moving objects in a field of view captured at the computer vision-based device stabilization system 104, the computer vision-based device stabilization system 104 can identify objects that are moving in a captured field of view, e.g. from multiple images of the field of view. Further, the computer vision-based device stabilization system 104 can track movement of identified moving objects in a captured field of view at the computer vision-based device stabilization system 104 to identify a reference orientation for purposes of stabilizing a device affixed to the computer vision-based device stabilization system 104. For example, if a painting in a field of view captured at the computer vision-based device stabilization system 104 is identified as a moving object, then the computer vision-based device stabilization system 104 can track movements of the painting in the field of view to determine a reference orientation for purposes of stabilizing a device affixed to the computer vision-based device stabilization system 104.
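By way of a non-limiting illustration, the following Python sketch flags moving objects across successive images using frame differencing; it assumes an approximately stabilized field of view, OpenCV 4.x, and illustrative thresholds.

```python
import cv2

def moving_object_boxes(prev_gray, curr_gray, min_area=500):
    # Pixels that changed between the two images suggest motion.
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)  # close small gaps
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Bounding boxes of regions large enough to be treated as moving objects.
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```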
[0030] In a specific implementation, the computer vision-based device stabilization system 104 functions to stabilize a device using long term tracking. Specifically, the computer vision-based device stabilization system 104 can use long term tracking of an object in an image of a field of view at the computer vision-based device stabilization system 104 to stabilize a device affixed to the computer vision-based device stabilization system 104. For example, the computer vision-based device stabilization system 104 can use tracking of an object in a field of view or absent from a field of view over time to track movements of either or both the computer vision-based device stabilization system 104 and a device to subsequently stabilize the computer vision-based device stabilization system 104. In using long term tracking to stabilize a device, the computer vision-based device stabilization system 104 can track an object in an image of a field of view using point tracking. For example, the computer vision-based device stabilization system 104 can identify a point in an object in an image of an environment at the computer vision-based device stabilization system 104. Further in the example, the computer vision-based device stabilization system 104 can subsequently follow movement of the object in the field of view in subsequent images of the environment by tracking movement of the point in the object in the subsequent images of the environment.
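By way of a non-limiting illustration, the following Python sketch shows the point-tracking idea: distinctive points are selected once on an object and then followed from image to image with pyramidal Lucas-Kanade optical flow. A production long-term tracker would additionally re-detect points that are lost.

```python
import cv2
import numpy as np

def init_points(gray, object_mask=None):
    # Choose distinctive points to follow (optionally limited to the object).
    return cv2.goodFeaturesToTrack(gray, maxCorners=50, qualityLevel=0.01,
                                   minDistance=7, mask=object_mask)

def track_points(prev_gray, curr_gray, points):
    new_points, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, points, None)
    ok = status.reshape(-1) == 1
    return new_points[ok]  # keep only points that were successfully followed

# Per-image object motion can then be summarized, e.g., as the median
# displacement of the surviving points relative to their previous positions.
```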
[0031] In a specific implementation, the computer vision-based device stabilization system 104 functions to use predictive tracking models in tracking objects in a captured field of view at the computer vision-based device stabilization system 104 for purposes of stabilizing a device. For example, the computer vision-based device stabilization system 104 can use a predictive tracking model to stabilize a device if a tracked object in a field of view captured at the computer vision-based device stabilization system 104 used in stabilizing the device moves out of the field of view. A predictive tracking model includes applicable data used in determining an expected movement of a device. For example, a predictive tracking model can specify a pattern of movement of a device that the device will continue to follow as it moves. A predictive tracking model can be generated through either or both machine learning and tracked movements of a device. For example, a predictive tracking model can be generated based on a pattern of movement of a device as determined by tracking an object in a field of view of an environment captured at the computer vision-based device stabilization system 104.
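By way of a non-limiting illustration, the following Python sketch implements a minimal predictive tracking model, a constant-velocity extrapolation of an object's last observed positions; it stands in for the richer learned models described above while a tracked object is out of view, and the window size is an assumption.

```python
import numpy as np

class ConstantVelocityPredictor:
    def __init__(self):
        self.history = []  # (time, position) observations

    def observe(self, t, position):
        self.history.append((t, np.asarray(position, dtype=float)))
        self.history = self.history[-10:]  # keep a short window

    def predict(self, t):
        if len(self.history) < 2:
            return None  # not enough data to estimate a velocity
        (t0, p0), (t1, p1) = self.history[-2], self.history[-1]
        velocity = (p1 - p0) / (t1 - t0)
        return p1 + velocity * (t - t1)  # extrapolated position at time t
```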
[0032] In a specific implementation, the computer vision-based device stabilization system 104 functions to maintain predictive tracking models. In maintaining predictive tracking models, the computer vision-based device stabilization system 104 can generate and update predictive tracking models for use in stabilizing a device. The computer vision-based device stabilization system 104 can maintain predictive tracking models that are specific to one or a combination of characteristics of a stabilized device, a device type of a stabilized device, characteristics of an environment surrounding a stabilized device, and characteristics of a user of a stabilized device. For example, if a user of the computer vision-based device stabilization system 104 has Parkinson's disease, then the computer vision-based device stabilization system 104 can create a predictive tracking model including patterns of involuntary movements made by the user due to Parkinson's disease. In another example, the computer vision-based device stabilization system 104 can generate a predictive tracking model based on a weight of a device and physical laws governing movements of the device based on its weight.
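By way of a non-limiting illustration, the following Python sketch estimates one user-specific model term, a dominant tremor frequency, from a recorded motion trace; the 1 Hz cutoff separating slow intentional movement from tremor is an assumption.

```python
import numpy as np

def dominant_tremor_hz(motion_trace, sample_rate_hz):
    """motion_trace: 1-D sequence of angular positions sampled uniformly."""
    trace = np.asarray(motion_trace, dtype=float)
    trace = trace - np.mean(trace)  # remove the DC offset
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / sample_rate_hz)
    spectrum[freqs < 1.0] = 0.0  # ignore slow, intentional movement
    return freqs[np.argmax(spectrum)]  # frequency with the most energy
```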
[0033] In a specific implementation, the computer vision-based device stabilization system 104 functions to stabilize a device based on compensation parameters. Compensation parameters include applicable parameters for compensating stabilization of a device based on detectable stimuli in an environment of the computer vision-based device stabilization system 104. An example of a compensation parameter is intended movement of either or both a device affixed to the computer vision-based device stabilization system 104 or the computer vision-based device stabilization system 104 itself. For example, a compensation parameter can include linear movements of a user operating the computer vision-based device stabilization system 104 to stabilize an affixed device. In another example, a compensation parameter can include linear forward and backward movements of a boat upon which the computer vision-based device stabilization system 104 is operating to stabilize an affixed device. The computer vision-based device stabilization system 104 can determine compensation parameters based on input received from an applicable source. For example, the computer vision-based device stabilization system 104 can determine linear movements of a user based on either or both images of a field of view at the computer vision-based device stabilization system 104 and an accelerometer integrated at the computer vision-based device stabilization system 104 as part of a sensor. In another example, the computer vision-based device stabilization system 104 can determine linear movements based on position data received over time from a global positioning system (hereinafter "GPS") sensor integrated at the computer vision-based device stabilization system 104.
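By way of a non-limiting illustration, the following Python sketch separates intended movement (a compensation parameter) from unintended shake by low-pass filtering a measured motion trace; the smoothing constant is an assumption.

```python
import numpy as np

def split_motion(positions, alpha=0.05):
    """positions: 1-D array of positions along one axis over time."""
    positions = np.asarray(positions, dtype=float)
    intended = np.empty_like(positions)
    intended[0] = positions[0]
    for i in range(1, len(positions)):
        # An exponential moving average tracks slow, deliberate movement.
        intended[i] = (1 - alpha) * intended[i - 1] + alpha * positions[i]
    shake = positions - intended  # high-frequency residual to be corrected
    return intended, shake
```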
[0034] In a specific implementation, the computer vision-based device stabilization system 104 includes a battery for use in powering the computer vision-based device stabilization system 104 to stabilize an affixed device. A battery of the computer vision-based device stabilization system 104 can be a battery of a device the computer vision-based device stabilization system 104 is operating to stabilize. For example, a battery of the computer vision-based device stabilization system 104 can be a battery of a smart phone the computer vision-based device stabilization system 104 is operating to stabilize. Further, a battery can be integrated as part of the computer vision-based device stabilization system 104 separate from a device the computer vision-based device stabilization system 104 is operating to stabilize. For example, the computer vision-based device stabilization system 104 can be integrated with its own battery, thereby allowing the computer vision-based device stabilization system 104 to stabilize devices that lack their own power source.
[0035] In a specific implementation, the computer vision-based device stabilization system 104 includes a battery separate from an affixed device and secured in close proximity to the affixed device. In securing a battery separate from an affixed device in close proximity to the affixed device, a total size of the computer vision-based device stabilization system 104 can decrease. Additionally, in securing a battery separate from an affixed device in close proximity to the affixed device, fewer or shorter wires are needed to power the computer vision-based device stabilization system 104, leading to one or a combination of: longer battery life in operation of the computer vision-based device stabilization system 104; reduced assembly costs in assembling the computer vision-based device stabilization system 104; achievement of a smaller actual size of the computer vision-based device stabilization system 104; and reduced heat output of the computer vision-based device stabilization system 104 in operation. Further, in securing a battery separate from an affixed device in close proximity to the affixed device, an inertia of an inner portion of the computer vision-based device stabilization system 104 increases while an inertia of an outer portion of the computer vision-based device stabilization system 104 decreases, leading to greater ease in stabilizing the affixed device.
[0036] In a specific implementation, the computer vision-based device stabilization system 104 includes a handle. A handle included as part of the computer vision-based device stabilization system 104 can be removable from the computer vision-based device stabilization system 104. For example, a handle can be removable from the computer vision-based device stabilization system 104 to allow for easier transportation and storage of the computer vision-based device stabilization system 104. A handle included as part of the computer vision-based device stabilization system 104 can include an attachment mechanism for coupling the handle to the computer vision-based device stabilization system 104. For example, a handle included as part of the computer vision-based device stabilization system 104 can include a steel insert with threads. Further, a handle included as part of the computer vision-based device stabilization system 104 can be foldable and portable to allow for easy transport and storage of the handle. Additionally, a handle included as part of the computer vision-based device stabilization system 104 can be modular to allow for easy transport and storage of the handle.
[0037] In a specific implementation, the computer vision-based device stabilization system 104 includes a handle attachment mechanism for securing a removable handle to the computer vision-based device stabilization system 104. For example, the computer vision-based device stabilization system 104 can include a threaded recess for attaching a removable handle to the computer vision-based device stabilization system 104. A handle attachment mechanism of the computer vision-based device stabilization system 104 can be configured to receive different types of handles. For example, the computer vision-based device stabilization system 104 can include a handle attachment mechanism capable of securing both a gripped handle and a handle with a clamp to the computer vision-based device stabilization system 104. [0038] In a specific implementation, the computer vision-based device stabilization system 104 includes a handle that is coupled to one or more gimbals of the computer vision-based device stabilization system 104. A handle can be physically coupled to one or more gimbals of the computer vision-based device stabilization system 104. For example, the computer vision-based device stabilization system 104 can include a handle with a large thread pitch, thereby requiring fewer revolutions of one or more gimbals in attaching the one or more gimbals to the handle. Further, a handle coupled to one or more gimbals of the computer vision-based device stabilization system 104 can include a dampener between the handle and the one or more gimbals. Specifically, a dampener between a handle and one or more gimbals of the computer vision-based device stabilization system 104 can dampen translational movement of either or both the handle and the one or more gimbals along one or a plurality of translational axes.
[0039] In a specific implementation, the computer vision-based device stabilization system 104 includes a handle integrated with electrical components to facilitate transmission of electrical signals through the handle. For example, the computer vision-based device stabilization system 104 can include a handle having a grip with electrically conductive pins to allow for transferring electrical signals through the handle. Additionally, the computer vision-based device stabilization system 104 can include a handle with an attachment mechanism with electrical connections for electrically coupling the handle to the computer vision-based device stabilization system 104. Further, the computer vision-based device stabilization system 104 can include a handle with an electrical connector separate from an attachment mechanism and configured to electrically couple the handle to the computer vision-based device stabilization system 104.
[0040] In a specific implementation, the computer vision-based device stabilization system 104 includes a handle through which a user can provide user input for purposes of controlling the computer vision-based device stabilization system 104. For example, the computer vision-based device stabilization system 104 can include a handle through which a user can provide input to control either or both moving an affixed device and stabilizing an affixed device through the computer vision-based device stabilization system 104. Specifically, the computer vision-based device stabilization system 104 can include a handle with one or a plurality of actuators or applicable mechanisms through which a user can provide input to control either or both movement of an affixed device and stabilization of an affixed device. For example, the computer vision-based device stabilization system 104 can include a handle with a button that when activated causes an affixed device to be displaced along a specific axis. User input provided through a handle can be provided to a control circuit, e.g. residing on a main printed circuit board (hereinafter referred to as "PCB") of the computer vision-based device stabilization system 104, which can subsequently control operation of one or more gimbals of the computer vision-based device stabilization system 104 based on the input.
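By way of a non-limiting illustration, the following Python sketch maps handle inputs to gimbal commands of the kind a control circuit on a main PCB might apply; the button and axis names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class GimbalCommand:
    axis: str               # "yaw", "pitch", or "roll"
    rate_deg_per_s: float   # signed rotation rate along that axis

# Hypothetical button-to-command mapping.
BUTTON_MAP = {
    "pan_left": GimbalCommand("yaw", -30.0),
    "pan_right": GimbalCommand("yaw", 30.0),
    "tilt_up": GimbalCommand("pitch", 20.0),
}

def handle_input(button_id):
    # Unmapped buttons produce no movement; stabilization continues unchanged.
    return BUTTON_MAP.get(button_id)
```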
[0041] In an example of operation of the example system shown in FIG. 1, the stabilized device 102 is affixed to one or more gimbals of the computer vision-based device stabilization system 104 for purposes of stabilizing the stabilized device 102. In the example of operation of the example system shown in FIG. 1, the computer vision-based device stabilization system 104 senses stimuli in an environment surrounding the computer vision-based device stabilization system 104. Further, in the example of operation of the example system shown in FIG. 1, the computer vision-based device stabilization system 104 stabilizes the stabilized device 102 through the one or more gimbals based on the detectable stimuli in an environment surrounding the computer vision-based device stabilization system 104.
[0042] FIG. 2 depicts a flowchart 200 of an example of a method for stabilizing a device based on detectable stimuli. The flowchart 200 begins at module 202, where a user affixes a device to one or more gimbals of a computer vision-based device stabilization system. A device affixed to one or more gimbals of a computer vision-based device stabilization system can be an applicable portable device capable of being stabilized, such as a smart phone, a light, a sonar system, a gun, a laser system, a utensil, and a tray. For example, a device affixed to one or more gimbals of a computer vision-based device stabilization system can be a light on a ship rocking back and forth.
[0043] The flowchart 200 continues to module 204, where stimuli in an environment surrounding the computer vision-based device stabilization system are sensed. In a specific implementation, one or more sensors detect stimuli in an environment surrounding the computer vision-based device stabilization system and within range of the one or more sensors. For example, a camera can generate images of a field of view in an environment surrounding the computer vision-based device stabilization system. In another example, a transducer can translate audio noises made in an environment surrounding the computer vision-based device stabilization system into an electrical signal as part of sensing stimuli in the environment.
[0044] The flowchart 200 continues to module 206, where the device is stabilized through the one or more gimbals based on the detectable stimuli in the environment surrounding the computer vision-based device stabilization system. In stabilizing the device based on the detectable stimuli in the environment, the computer vision-based device stabilization system can cause displacement of the gimbals to displace the device along one or more axes for purposes of stabilizing the device. The device can be stabilized by applying computer vision to images of a field of view in the environment for purposes of stabilizing the device based on the detectable stimuli in the environment. For example, objects in images of a field of view in the environment can be identified and tracked for purposes of stabilizing the device. Further, sounds in the environment surrounding the computer vision-based device stabilization system can be used to stabilize the device. For example, the device can be displaced to face a direction from which a voice is emanating in the environment surrounding the computer vision-based device stabilization system. [0045] FIG. 3 depicts a diagram 300 of a system for stabilizing a device based on detectable stimuli in an environment at the device. The system shown in FIG. 3 can be implemented as part of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. The system shown in FIG. 3 includes a computer-readable medium 302, a frame 304, a gimbal assembly 306, optionally a handle 308, a sensor 310, and a computer vision-based stabilization control system 312. In the example system shown in FIG. 3, the gimbal assembly 306, the sensor 310, and the computer vision-based stabilization control system 312 are coupled to each other through the computer-readable medium 302. Additionally, in the example system shown in FIG. 3, the gimbal assembly 306 can be physically coupled to or implemented as part of the frame 304.
[0046] The frame 304 is intended to represent a frame configured to receive and physically affix a device to itself for purposes of stabilizing the device. The frame 304 can be of a size and shape to physically secure a device to itself for purposes of stabilizing the device. For example, the frame 304 can be of a size and shape to physically secure a smart phone or camera that falls within a range of dimensions. Additionally, the frame 304 can include securing mechanisms for securing a device to itself for purposes of stabilizing the device. For example, the frame 304 can include clips for securing a device to the frame 304 for purposes of stabilizing the device. The frame 304 can be configured to rigidly secure a device to itself for purposes of causing the device to move in the same movements as the frame 304 is moved. For example, the frame 304 can be configured to rigidly secure a device such that as the frame moves specific distances along three axes, the device moves the same specific distances along the three axes.
[0047] In a specific implementation, the frame 304 functions to contain a battery. A battery contained within the frame 304 can be used to power one or an applicable combination of the components shown in FIG. 3 or all or portions of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. In including a battery in the frame 304, a form factor of an applicable device for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper, can be decreased, leading to greater portability of both devices. Further, in including a battery in the frame 304, the battery does not need to be contained within a handle of an applicable device for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper, allowing for device compatibility with a removable handle. Further, in being compatible with a removable handle, the handle can be removed and the computer vision-based device stabilization system along with a stabilized device can be mounted, e.g. on a helmet of a user.
[0048] In a specific implementation, the frame 304 utilizes a claw mechanism as a securing mechanism for purposes of affixing a device to itself for purposes of stabilizing the device. Advantageously, a claw mechanism can be configured to allow for a battery to be stored in the frame 304. For example, a claw mechanism can eliminate a need to thread a rod through at least a portion of the frame 304, where such a rod would otherwise prevent the frame 304 from containing a battery. A claw mechanism included as part of the frame 304 can include gears with clockwise threads and counterclockwise threads that displace opposing members of the claw mechanism towards and away from each other for purposes of physically engaging and subsequently securing a device to the frame 304. Specifically, when the claw mechanism is rotated to cause threads to engage each other as the mechanism is rotated, the opposing members of the claw mechanism can displace towards or away from each other. Opposing members of a claw mechanism of the frame 304 can include rubber for further use in securing a device between the opposing members and subsequently to the frame 304 through the claw mechanism. [0049] The gimbal assembly 306 is intended to represent an assembly to cause a device affixed to the frame 304 to be displaced for purposes of stabilizing the device. Specifically, the gimbal assembly 306 can be operated to cause the frame 304 to displace along one or a plurality of axes for purposes of stabilizing a device affixed to the frame 304. The gimbal assembly 306 can include one or more gimbals to cause displacement of the frame 304 and subsequently a device affixed to the frame 304 along one or more axes. For example, the gimbal assembly 306 can include three gimbals to cause the frame 304 and a device affixed to the frame to displace along one of three corresponding axes for purposes of stabilizing the device.
[0050] In a specific implementation, the gimbal assembly 306 includes one or more pivot mechanisms allowing one or more gimbals in the gimbal assembly 306 to pivot about one or more axes for purposes of stabilizing a device. For example, the gimbal assembly 306 can include a pivot mechanism to facilitate a corresponding gimbal of the gimbal assembly 306 to displace along a specific axis. Further in the example, the corresponding gimbal of the gimbal assembly can displace along the specific axis to cause the frame 304 to displace and subsequently stabilize a device affixed to the frame. A pivot mechanism can correspond to one or more gimbals on a 1:1 or a 1:n basis. For example, each gimbal of the gimbal assembly 306 can have its own corresponding pivot mechanism that allows each gimbal to pivot about a corresponding axis. Further in the example, in having one corresponding pivot mechanism for each gimbal of the gimbal assembly, each gimbal can pivot independently of each other to allow for displacement of the frame 304 along any direction.
[0051] In a specific implementation, the gimbal assembly 306 includes drive mechanisms for driving displacement of a gimbal of the gimbal assembly 306 through a pivot mechanism along an axis. A drive mechanism can include an applicable mechanism for introducing mechanical force to cause displacement of a gimbal about a pivot mechanism. For example, a drive mechanism can include an electric motor, e.g. a brushed or brushless electric motor. The gimbal assembly 306 can include a separate drive mechanism for each gimbal in the gimbal assembly 306. By including a separate drive mechanism for each gimbal in the gimbal assembly 306, displacement of each gimbal can be controlled separately from the other gimbals to allow for displacement of the frame in any direction for purposes of stabilizing a device affixed to the frame. Characteristics of different drive mechanisms of the gimbal assembly can vary and be dependent upon a specific gimbal the drive mechanism displaces. For example, if one gimbal of the gimbal assembly 306 supports the largest amount of weight of an affixed device compared to the other gimbals of the gimbal assembly 306, then a drive mechanism for that gimbal can be stronger than drive mechanisms for the other gimbals.
[0052] In a specific implementation, the gimbal assembly 306 includes one or more encoders for measuring performance characteristics of drive mechanisms. One or more encoders included in the gimbal assembly 306 can be used to measure performance characteristics of drive mechanisms, e.g. motors, in displacing one or more gimbals of the gimbal assembly 306 around a pivot mechanism. Performance characteristics measured by one or more encoders of the gimbal assembly 306 include applicable performance characteristics related to a drive mechanism of the gimbal assembly. For example, performance characteristics measured by one or more encoders of the gimbal assembly can include an angular position of a drive mechanism, e.g. with respect to a fixed reference point. In another example, performance characteristics measured by one or more encoders of the gimbal assembly can include an angular speed at which a drive mechanism is moving.
[0053] In a specific implementation, performance characteristics of drive mechanisms measured by one or more encoders of the gimbal assembly 306 are used to determine an actual orientation of one or a combination of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304. Specifically, either or both an angular position and an angular speed of a drive mechanism, as determined by an encoder of the gimbal assembly, can be used to determine an actual orientation of one or more gimbals of the gimbal assembly 306, which can subsequently be used to stabilize a device affixed to the frame 304. For example, based on an angular position of a drive mechanism, an actual orientation of the frame 304, corresponding to an actual orientation of one or more gimbals in the gimbal assembly 306, can be determined, which can be used to displace the one or more gimbals to a new position. Further in the example, by displacing the one or more gimbals to a new position, the frame 304 can be displaced to a reference orientation corresponding to the new position of the one or more gimbals for purposes of stabilizing a device affixed to the frame 304. Further, a reference orientation of the frame 304 can correspond to a reference orientation of a device affixed to the frame 304. For example, moving the frame 304 to a reference orientation can subsequently displace a device affixed to the frame 304 to a reference orientation of the device for purposes of stabilizing the device.
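By way of a non-limiting illustration, the following Python sketch recovers an actual orientation from encoder-reported gimbal angles by composing one rotation per axis; the yaw-pitch-roll convention is an assumption rather than a mandated one.

```python
import numpy as np

def rot_z(a):  # yaw
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_y(a):  # pitch
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_x(a):  # roll
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def frame_orientation(yaw, pitch, roll):
    # Each angle (radians) comes from the encoder of the corresponding
    # drive mechanism; the product gives the frame's actual orientation.
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)
```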
[0054] An encoder of the gimbal assembly 306 is an applicable encoder for measuring performance of a drive mechanism of the gimbal assembly. For example, an encoder of the gimbal assembly 306 can be an incremental angular encoder or an absolute angular encoder. One or more encoders of the gimbal assembly 306 can be implemented on a main PCB of an applicable system for controlling stabilization of a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. Further, one or more encoders of the gimbal assembly 306 can generate performance characteristics data indicating performance characteristics of a drive mechanism and subsequently provide the performance characteristics data to a main PCB of an applicable system for controlling stabilization of a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. [0055] In a specific implementation, an encoder of the gimbal assembly 306 can generate performance characteristics data of a drive mechanism at an applicable frequency for stabilizing a device using computer vision. Specifically, an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at an applicable frequency for stabilizing a device based on computer vision using the drive mechanism. For example, an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency between 400Hz and 1600Hz. An encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency to reduce or otherwise eliminate shaking in a video or a series of images captured at an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. For example, an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency greater than or equal to 400Hz. Further, an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency to limit triggering of image stabilization at a media capturing device being stabilized by an applicable system for stabilizing a device, such as the computer vision-based device stabilization systems described in this paper. For example, an encoder of the gimbal assembly 306 can measure performance characteristics of a drive mechanism at a frequency of 1600Hz or greater.
[0056] In a specific implementation, an encoder of the gimbal assembly 306 can generate and provide performance characteristics data of a drive mechanism according to an arbitration-free protocol. In generating and providing performance characteristics data according to an arbitration-free protocol, an encoder of the gimbal assembly 306 can utilize short wires and refrain from using time on a bus, thereby reducing times required to generate and communicate the data. Further, an encoder of the gimbal assembly 306 can act as a wireless encoder in either or both generating and sending performance characteristics data. [0057] In a specific implementation, the gimbal assembly 306 includes a first, second, and third drive mechanism positioned in terms of distance in ascending order away from a proximal end towards a distal end of the frame 304. For example, the first drive mechanism can be positioned closer to a proximal end of the frame 304, while the third drive mechanism can be positioned closer to a distal end of the frame 304, and the second drive mechanism can be positioned between the first drive mechanism and the third drive mechanism.
[0058] In a specific implementation, the gimbal assembly 306 includes a specific number of wires physically passing through specific drive mechanisms for purposes of controlling operation of three drive mechanisms at progressive distances away from a proximal end of the frame 304. Specifically, at a first drive mechanism closest to the proximal end of the gimbal assembly 306, six wires can be split into ten wires for purposes of driving three drive mechanisms of the gimbal assembly 306. Further, at a second drive mechanism second closest to the proximal end of the gimbal assembly 306, six wires can be split into seven wires for purposes of driving three drive mechanisms of the gimbal assembly 306. Additionally, at a third drive mechanism furthest from the proximal end and closest to the distal end of the gimbal assembly 306, six wires can be split into five wires for purposes of driving all three drive mechanisms of the gimbal assembly 306.
[0059] In a specific implementation, the gimbal assembly 306 is designed to reduce friction of wires through drive mechanisms included in the gimbal assembly 306. In reducing friction of wires through drive mechanisms included in the gimbal assembly 306, the gimbal assembly 306 can couple as few as three wired cables to each of the drive mechanisms for purposes of controlling the drive mechanisms. Further, the gimbal assembly 306 can include wires for controlling the drive mechanisms with an increased group winding through each drive mechanism, leading to a smaller overall core. As a result, reduced wire friction can be observed through the gimbal assembly 306 and in particular drive mechanisms of the assembly 306. [0060] The gimbal assembly 306 is optionally coupled to a handle 308. The handle 308 can be an applicable handle for use with an applicable system for stabilizing a device based on computer vision, such as the computer vision-based device stabilization systems described in this paper. The handle 308 is capable of being removably coupled to the gimbal assembly 306, thereby allowing for the handle 308 to be optional. When the handle 308 is removed from the gimbal assembly 306, the gimbal assembly 306 and corresponding frame 304 can be mounted, e.g. on a helmet or body part of a user. The handle 308 can include one or a plurality of actuators or applicable mechanisms through which a user can provide user input for controlling either or both movement of a device being stabilized and actual stabilization of the device. For example, the handle 308 can be used to provide user input indicating to rotate a stabilized device, while the device is being stabilized.
[0061] The sensor 310 is intended to represent an applicable sensor for detecting stimuli in an environment at either or both the gimbal assembly 306 and a device affixed to the frame 304, such as the sensors described in this paper. For example, the sensor 310 can be a camera configured to capture images of a field of view of an environment at either or both the gimbal assembly 306 and a device affixed to the frame 304. The sensor 310 can be integrated as part of a device affixed to the gimbal assembly 306. For example, the sensor 310 can be a camera of a smart phone affixed to the frame 304 for purposes of stabilizing the smart phone. Additionally, the sensor 310 can be separate from a device affixed to the frame 304 for purposes of stabilizing the device. For example, the sensor 310 can be a 360° camera. In being separate from a device affixed to the frame 304 for purposes of stabilizing the device, the sensor 310 can be positioned on or near the frame.
[0062] In a specific implementation, the sensor 310 can capture images of a field of view in an environment at a frequency that limits or eliminates shaking in the images captured by the sensor as the sensor 310 is stabilized. Specifically, the sensor 310 can capture images at a frequency fast enough to allow the sensor 310 to be stabilized, as part of it being integrated with a device affixed to the frame 304, the frame itself, or the gimbal assembly 306. For example, the sensor 310 can capture images of a field of view in an environment at a rate of 240 frames per second or greater.
[0063] The computer vision-based stabilization control system 312 is intended to represent a system that functions to control stabilization of a device affixed to the frame 304 using computer vision. The computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 based on an actual orientation of one or a combination of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304. Further, the computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 based on a reference orientation of one or a combination of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304. For example, the computer vision-based stabilization control system 312 can determine how to operate the gimbal assembly to move a device affixed to the frame 304 from its actual orientation to its reference orientation for purposes of stabilizing the device based on an actual orientation and a reference orientation of the device.
[0064] The computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 based on detectable stimuli detected by an applicable sensor, such as the sensors described in this paper. For example, the computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 based on images of a field of view in an environment at either or both the gimbal assembly 306 and a device affixed to the frame 304. Further in the example, the computer vision-based stabilization control system 312 can control stabilization of a device affixed to the frame 304 by applying computer vision to the images of the field of view in an environment at either or both the gimbal assembly 306 and the device affixed to the frame 304 to determine an actual orientation of one or a combination of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304.
[0065] In controlling stabilization of a device, the computer vision-based stabilization control system 312 functions to control operation of the gimbal assembly 306 for purposes of stabilizing the device. Specifically, the computer vision-based stabilization control system 312 can control actuation of drive mechanisms of the gimbal assembly 306 to control movement of the gimbal assembly 306 and subsequently the frame 304 for purposes of stabilizing a device affixed to the frame 304. For example, the computer vision-based stabilization control system 312 can control actuation of one or more motors of the gimbal assembly 306 to control displacement of one or more gimbals of the gimbal assembly 306 to subsequently stabilize a device affixed to the frame 304. In controlling actuation of drive mechanisms of the gimbal assembly 306, the computer vision-based stabilization control system 312 can send an actuation signal to the drive mechanisms to cause the drive mechanisms to operate in a specific way. For example, the computer vision-based stabilization control system 312 can send an actuation signal to cause a drive mechanism of the gimbal assembly to generate a specific amount of linear force or torque in a specific direction to move a device affixed to the frame 304 from its actual orientation to a reference orientation. In the case where a drive mechanism is an electric motor, the computer vision-based stabilization control system 312 can control actuation based on a magnetic field of the motor using an actuation signal, either wired or wireless.
[0066] In a specific implementation, the computer vision-based stabilization control system 312 functions to control operation of the gimbal assembly 306 for stabilizing a device based on performance characteristics of one or more drive mechanisms of the gimbal assembly 306. Specifically, the computer vision-based stabilization control system 312 can control operation of the gimbal assembly 306 based on performance characteristics data received from one or more encoders of the gimbal assembly 306. For example, the computer vision-based stabilization control system 312 can control operation of the gimbal assembly 306 based on performance characteristics data of one or more drive mechanisms of the gimbal assembly determined at a frequency of 1600Hz. The computer vision-based stabilization control system 312 can use either or both an angular position of a drive mechanism of the gimbal assembly 306 and an angular speed at which a drive mechanism is moving to control operation of the gimbal assembly 306. For example, the computer vision-based stabilization control system 312 can determine a current position, e.g. angular position, of a drive mechanism of the gimbal assembly and a desired orientation of the drive mechanism for purposes of stabilizing a device affixed to the frame 304. Further in the example, a desired orientation of a drive mechanism can correlate to a desired orientation of the device in stabilizing it, and subsequent actuation of the drive mechanism, causing displacement of one or more gimbals in the gimbal assembly 306, displaces the device to the desired orientation.
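By way of a non-limiting illustration, the following Python sketch computes an actuation value from an encoder's angular position and angular speed using proportional-derivative feedback; the gains are placeholders rather than tuned values from this paper.

```python
def actuation_signal(desired_angle, measured_angle, measured_speed,
                     kp=4.0, kd=0.3):
    """Compute a drive-mechanism actuation value from encoder readings."""
    error = desired_angle - measured_angle
    # The proportional term corrects position error; the derivative term
    # (fed by the encoder's angular speed) damps the resulting motion.
    return kp * error - kd * measured_speed
```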
[0067] In a specific implementation, the computer vision-based stabilization control system 312 functions to use computer vision to stabilize a device affixed to the frame 304. The computer vision-based stabilization control system 312 can apply computer vision to images of a field of view in an environment at either or both a device affixed to the frame 304 and the gimbal assembly 306 to control stabilization of the device using the gimbal assembly 306. In applying computer vision to images of an environment at either or both a device affixed to the frame 304 and the gimbal assembly 306, the computer vision-based stabilization control system 312 can determine movement of the device or the gimbal assembly 306. Based on determined movement of a device affixed to the frame 304 or the gimbal assembly 306, the computer vision-based stabilization control system 312 can determine a reference orientation to which to displace the device for purposes of stabilizing the device, e.g. an original position of the device. For example, if, based on computer vision, the computer vision-based stabilization control system 312 determines a device affixed to the frame 304 has fallen 1 mm from an old orientation to a new orientation, then the computer vision-based stabilization control system 312 can determine a reference orientation of the device for purposes of stabilizing the device is at coordinates in space of the old orientation.
[0068] In a specific implementation, the computer vision-based stabilization control system 312 functions to generate an actuation signal for purposes of stabilizing a device based on a determined actual orientation and reference orientation. The computer vision-based stabilization control system 312 can generate an actuation signal based on an actual orientation and a reference orientation of one or a combination of a drive mechanism of the gimbal assembly 306, one or more gimbals of the gimbal assembly 306, the frame 304, and a device affixed to the frame 304. For example, in comparing an actual orientation and a reference orientation of a device affixed to the frame 304, the computer vision-based stabilization control system 312 can determine the device has moved 1 mm to the left of its reference orientation. Further in the example, the computer vision-based stabilization control system 312 can generate an actuation signal to cause the gimbal assembly to move the device 1 mm to the right to its reference orientation for stabilizing the device. Additionally, in generating an actuation signal based on actual and reference orientations for purposes of stabilizing a device, the computer vision-based stabilization control system 312 can factor in compensation parameters. For example, if the computer vision-based stabilization control system 312 determines a device affixed to the frame 304 has been displaced 1 mm to the right while linearly travelling at a speed of twenty miles per hour, then the computer vision-based stabilization control system 312 can generate an actuation signal to cause the gimbal assembly 306 to displace the device 1 mm to the left while it is still travelling at a speed of twenty miles per hour.
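By way of a non-limiting illustration, the following Python sketch computes a correction from actual and reference orientations while factoring in an intended displacement as a compensation parameter; units and names are illustrative.

```python
import numpy as np

def correction_vector(actual, reference, intended_displacement):
    """Return the displacement that moves the device back to reference."""
    actual = np.asarray(actual, dtype=float)
    reference = np.asarray(reference, dtype=float)
    intended = np.asarray(intended_displacement, dtype=float)
    # The reference is advanced by the intended (deliberate) movement, so
    # only the unintended part of the displacement is corrected; e.g., a
    # device 1 mm right of its reference yields a 1 mm leftward correction
    # without cancelling ongoing forward travel.
    return (reference + intended) - actual
```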
[0069] In a specific implementation, the computer vision-based stabilization control system 312 can be integrated as part of a main PCB of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. In being implemented as part of a main PCB of an applicable system for stabilizing a device, the computer vision-based stabilization control system 312 can send actuation signals directly to drive mechanisms of the gimbal assembly 306 without passing through an encoder of the gimbal assembly 306. As a result, drive mechanisms can be controlled faster, leading to increased stabilization speeds and decreased de-stabilization of a device affixed to the frame 304. Further, this can reduce shaking in a video captured by a device affixed to the frame 304 as a result of the observed decreased de-stabilization of a device affixed to the frame 304.
[0070] In a specific implementation, the computer vision-based stabilization control system 312 functions to control operation of a drive mechanism of the gimbal assembly 306 at an applicable frequency for stabilizing a device using computer vision. Specifically, the computer vision-based stabilization control system 312 can either or both generate and send actuation signals to a drive mechanism at an applicable frequency for stabilizing a device using the drive mechanism based on computer vision. For example, the computer vision-based stabilization control system 312 can generate and send actuation signals at a frequency between 400Hz and 1600Hz. Further, the computer vision-based stabilization control system 312 can generate and send actuation signals at a frequency to reduce or otherwise eliminate shaking in a video or a series of images captured at an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. For example, the computer vision-based stabilization control system 312 can generate and send actuation signals to a drive mechanism at a frequency greater than or equal to 400Hz. Further, the computer vision-based stabilization control system 312 can generate and send actuation signals to a drive mechanism at a frequency to limit triggering of image stabilization at a stabilized media capturing device affixed to the frame 304. For example, the computer vision-based stabilization control system 312 can generate and send actuation signals to a drive mechanism at a frequency of 1600Hz or greater.
[0071] In a specific implementation, the computer vision-based stabilization control system 312 functions to disable a stabilizer on a device affixed to the frame 304. In particular, the computer vision-based stabilization control system 312 can function to disable a software-based stabilizer operating on a smart phone or camera affixed to the frame 304 for purposes of eliminating interference in stabilization of the phone or camera caused by the stabilizer operating on the phone or camera.
[0072] In a specific implementation, the computer vision-based stabilization control system 312 functions to be implemented, at least in part, at a device affixed to the frame 304. For example, the computer vision-based stabilization control system 312 can be implemented as a native application or a web-based application at a device affixed to the frame 304. In being implemented, at least in part, at a device affixed to the frame 304, the computer vision-based stabilization control system 312 can function to synchronize an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper, to actually stabilize the device. For example, the computer vision-based stabilization control system 312 can be implemented at a smart phone and configure the smart phone to provide a video feed captured at the phone to the computer vision-based stabilization control system 312 for purposes of stabilizing the phone. Additionally, in being implemented, at least in part, at a device affixed to the frame 304, the computer vision-based stabilization control system 312 can disable a stabilizer on the device. For example, the computer vision-based stabilization control system 312 can enable applications at a device to disable a software-based stabilizer of the device.
[0073] In a specific implementation, the computer vision-based stabilization control system 312 is implemented as part of a 2-differential bus. In being implemented as part of a 2-differential bus, the computer vision-based stabilization control system 312 can send, using a first bus, a signal transmission window to the sensor 310 and subsequently receive data, e.g. performance characteristics data. Additionally, using a different bus, the computer vision-based stabilization control system 312 can send actuation signals to a drive mechanism of the gimbal assembly 306.
[0074] FIG. 4 depicts a flowchart 400 of an example of a method for stabilizing a device using computer vision. The flowchart 400 begins at module 402, where a device is affixed to a frame of a computer vision-based device stabilization system. A device can be affixed to a frame of a computer vision-based device stabilization system through an applicable securing mechanism. For example, a device can be affixed to a frame of a computer vision-based device stabilization system through a claw mechanism. Further in the example, the claw mechanism extends either not at all or only a portion of the way into the frame to allow a battery to be contained within the frame for purposes of reducing a form factor of the computer vision-based device stabilization system.
[0075] The flowchart 400 continues to module 404, where performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system are received. Performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system can be received from one or a plurality of encoders corresponding to the drive mechanisms. Performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system can be either or both generated and transmitted according to an arbitration-free protocol. Performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system can include either or both an angular position of the drive mechanisms and an angular speed at which the drive mechanisms are being displaced.
[0076] The flowchart 400 continues to module 406, where images of a captured field of view of an environment at the computer vision-based device stabilization system are received. Images of a captured field of view of an environment at the computer vision-based device stabilization system can be received from an applicable sensing source, such as the sensors described in this paper. For example, images of a captured field of view of an environment at the computer vision-based device stabilization system can be received from a camera of the device affixed to the frame of the computer vision-based device stabilization system. In another example, images of a captured field of view of an environment at the computer vision-based device stabilization system can be received from a sensor integrated as part of the frame and separate from the device affixed to the frame.
[0077] The flowchart 400 continues to module 408, where an actual orientation of the device is determined. An actual orientation of the device can be determined from the performance characteristics data. For example, an actual orientation of the device can be determined from an actual orientation of one or a combination of the drive mechanisms of the gimbal assembly, gimbals of the gimbal assembly, and the frame, as indicated by the performance characteristics data of the drive mechanisms of the gimbal assembly. Additionally, an actual orientation of the device can be determined from one or a combination of an accelerometer, a gyroscope, and a GPS sensor integrated as part of the computer vision-based device stabilization system. For example, data generated by an accelerometer, a gyroscope, and a GPS sensor integrated on a main PCB of the computer vision-based device stabilization system can be used to determine an actual orientation of the device affixed to the frame.
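One way the accelerometer and gyroscope data mentioned above could be fused into an actual-orientation estimate is a complementary filter. The sketch below is illustrative only and is not prescribed by this disclosure; the blending constant `alpha` is an assumed value.

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch angle implied by the gravity vector from a 3-axis accelerometer."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def fuse_pitch(prev_pitch, gyro_rate, ax, ay, az, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth but drifting) with the
    accelerometer pitch (noisy but drift-free) into one actual-orientation
    estimate; alpha is an assumed blending constant."""
    gyro_pitch = prev_pitch + gyro_rate * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch(ax, ay, az)
```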
[0078] The flowchart 400 continues to module 410, where computer vision is applied to the images of the captured field of view of the environment to determine a reference orientation of the device. In applying computer vision to images of the captured field of view of the environment to determine a reference orientation of the device, one or a plurality of objects in the captured field of view can be identified using computer vision. Further, in applying computer vision to images of the captured field of view of the environment to determine a reference orientation of the device, an identified object can be tracked in the images using computer vision to determine a reference orientation. For example, if an object at an initial location in an image is at a specific coordinate in space, then a reference orientation of the device can be determined to be that specific coordinate.
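As a hedged sketch of module 410, the snippet below tracks an identified object with OpenCV template matching — a simpler stand-in for the computer vision techniques described in this paper — and reports how far the view has drifted from the stored reference location.

```python
import cv2

def measure_drift(frame, reference_patch, reference_xy):
    """Find the tracked object's patch in the current frame and return its
    pixel displacement from the stored reference location."""
    scores = cv2.matchTemplate(frame, reference_patch, cv2.TM_CCOEFF_NORMED)
    _, _, _, best_xy = cv2.minMaxLoc(scores)        # best-match top-left corner
    dx = best_xy[0] - reference_xy[0]
    dy = best_xy[1] - reference_xy[1]
    return dx, dy                                   # the drift the gimbal must undo
```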
[0079] The flowchart 400 continues to module 412, where the gimbal assembly is controlled based on the performance characteristics to stabilize the device by correlating the actual orientation of the device with the reference orientation of the device. For example, if the actual orientation is at a first coordinate in space and the reference orientation of the device is at a second coordinate in space, then the gimbal assembly can be controlled to move the device from the first coordinate to the second coordinate. In controlling the gimbal assembly by correlating the actual orientation with the reference orientation, the drive mechanisms of the gimbal assembly can be actuated to cause the gimbal assembly to move the device from the actual orientation to the reference orientation for purposes of stabilizing the device. For example, an actuation signal can be generated and sent to the drive mechanisms of the gimbal assembly to cause the drive mechanisms to actuate and subsequently displace the device from the actual orientation to the reference orientation for purposes of stabilizing the device.

[0080] FIG. 5 depicts a diagram 500 of a system for directly controlling drive mechanisms of a gimbal assembly for purposes of stabilizing a device using computer vision. The system shown in FIG. 5 includes a drive mechanism 502, an encoder 504, and a main PCB 506. Further, in the system shown in FIG. 5, the main PCB 506 includes a computer vision-based stabilization control system 508. The components shown in FIG. 5 can be integrated as part of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper.
[0081] The drive mechanism 502 is intended to represent a drive mechanism of a gimbal assembly for purposes of stabilizing a device coupled to the gimbal assembly. The drive mechanism 502 can be an applicable mechanism for driving one or a plurality of gimbals in a gimbal assembly. For example, the drive mechanism 502 can be a brushless electric motor. The drive mechanism 502 can be controlled through actuation signals sent to the drive mechanism 502 for purposes of stabilizing a device. For example, the drive mechanism 502 can be actuated to move a device affixed to a frame coupled to a gimbal assembly from an actual orientation to a reference orientation for purposes of stabilizing the device.
[0082] The encoder 504 is intended to represent an applicable mechanism for measuring performance characteristics of the drive mechanism 502. While the encoder 504 is shown to be separate from the main PCB 506, in various implementations, the encoder 504 can be implemented at the main PCB 506. The encoder 504 can be an applicable encoder for measuring performance characteristics of the drive mechanism 502. For example, the encoder 504 can be an incremental angular encoder configured to measure an angular position of the drive mechanism 502 with respect to a fixed reference point. The encoder 504 can determine performance characteristics at an applicable frequency for stabilizing a device using computer vision. For example, the encoder 504 can determine performance characteristics of the drive mechanism 502 at a frequency between 400Hz and 1600Hz.

[0083] The main PCB 506 is intended to represent a main PCB of a system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. The main PCB 506 can include one or a combination of an accelerometer, a gyroscope, and a GPS sensor. One or a combination of an accelerometer, a gyroscope, and a GPS sensor integrated as part of the main PCB 506 can be used to determine an actual orientation of a device for purposes of stabilizing the device. For example, an accelerometer and a gyroscope on the main PCB 506 can be used to determine changes to yaw, pitch, and roll of a frame supporting a device, which can subsequently be used to determine an actual orientation of the device as it is affixed to the frame.
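Returning to the encoder 504, an incremental encoder's pulse counts might map to the angular position and angular speed discussed above roughly as follows. This is a sketch only; the resolution of 4096 counts per revolution is an assumption, not a figure from this disclosure.

```python
def encoder_angle_deg(count, counts_per_rev=4096):
    """Angular position (degrees) relative to the fixed reference point the
    incremental encoder was indexed at; counts_per_rev is assumed."""
    return (count % counts_per_rev) * 360.0 / counts_per_rev

def encoder_speed_dps(count_now, count_prev, dt, counts_per_rev=4096):
    """Angular speed (degrees per second) from two successive counts."""
    return (count_now - count_prev) * 360.0 / counts_per_rev / dt
```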
[0084] The main PCB 506 includes a computer vision-based stabilization control system 508. The computer vision-based stabilization control system 508 is intended to represent an applicable system for controlling stabilization of a device using computer vision, such as the computer vision-based stabilization control systems described in this paper. The computer vision-based stabilization control system 508 can control a gimbal assembly for purposes of stabilizing a device using computer vision. The computer vision-based stabilization control system 508 can apply computer vision to images of a field of view of an environment at a device to determine a reference orientation of the device for purposes of stabilizing the device. Additionally, the computer vision-based stabilization control system 508 can calculate an actual orientation of a device for purposes of stabilizing the device. The computer vision-based stabilization control system 508 can calculate an actual orientation of a device from either or both performance characteristics data received from the encoder 504 and position data generated by one or a combination of an accelerometer, a gyroscope, and a GPS sensor integrated at the main PCB 506.
[0085] The computer vision-based stabilization control system 508 functions to send an actuation signal to the drive mechanism 502 for purposes of actuating the drive mechanism 502 in stabilizing a device. The computer vision-based stabilization control system 508 can send the actuation signal directly to the drive mechanism 502 instead of sending the actuation signal to the drive mechanism 502 indirectly through the encoder 504. As a result, faster actuation times are achieved, corresponding to faster stabilization of a device, potentially faster than a human eye is capable of detecting that the device has actually moved. The computer vision-based stabilization control system 508 can generate the actuation signal based on an actual orientation of the device, a reference orientation of the device, and performance characteristics data of the drive mechanism 502 received from the encoder 504. For example, the computer vision-based stabilization control system 508 can determine how to operate the drive mechanism 502 to move a device from its actual orientation to a reference orientation based on the performance characteristics data of the drive mechanism 502.
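This disclosure does not specify how the actuation signal is computed; a PID controller is one conventional choice, sketched below with purely illustrative gains, turning the orientation error (reference minus actual) into a drive-mechanism command.

```python
class PidActuation:
    """Hedged sketch: convert the orientation error into an actuation signal.
    The gains are illustrative, not values from this disclosure."""

    def __init__(self, kp=2.0, ki=0.1, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, reference, actual, dt):
        error = reference - actual          # how far the device is from stable
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```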
[0086] FIG. 6 depicts a diagram of a computer vision-based reference orientation identification system 602. The computer vision-based reference orientation identification system 602 is intended to represent a system that functions to determine a reference orientation for use in stabilizing a device. The computer vision-based reference orientation identification system 602 can be implemented as part of an applicable system for stabilizing a device using computer vision, such as the computer vision-based device stabilization systems described in this paper. Additionally, the computer vision-based reference orientation identification system 602 can be implemented as part of an applicable system for controlling a gimbal assembly for purposes of stabilizing a device, such as the computer vision-based stabilization control systems described in this paper.
[0087] In a specific implementation, the computer vision-based reference orientation identification system 602 functions to identify a reference orientation using applicable input received at the computer vision-based reference orientation identification system 602. The computer vision-based reference orientation identification system 602 can identify a reference orientation based on input received from a user. For example, the computer vision-based reference orientation identification system 602 can identify a reference orientation by tracking an object within a portion, identified by a user, of an image of a field of view of an environment. Additionally, the computer vision-based reference orientation identification system 602 can identify a reference orientation based on input stimuli. For example, the computer vision-based reference orientation identification system 602 can apply computer vision to captured images of a field of view of an environment at a device for purposes of stabilizing the device. In another example, the computer vision-based reference orientation identification system 602 can use input from an acoustic sensor to determine distances in images of a field of view of an environment at a device for purposes of stabilizing the device.
[0088] The computer vision-based reference orientation identification system 602 shown in FIG. 6 includes an input engine 604, a predictive tracking model maintenance engine 606, a predictive tracking model datastore 608, a tracking object identification engine 610, and a reference orientation identification engine 612. The input engine 604 is intended to represent an engine that functions to receive input for purposes of stabilizing a device using computer vision. The input engine 604 can receive input from a user operating an applicable system for stabilizing a device, such as the computer vision-based device stabilization systems described in this paper. For example, the input engine 604 can receive input indicating a portion of a frame of a field of view of an environment at a device, for use in actually stabilizing the device using computer vision. In another example, the input engine 604 can receive user input indicating an object to track in a field of view of an environment at a device for purposes of stabilizing the device using computer vision.
[0089] In a specific implementation, the input engine 604 functions to receive input indicating detectable stimuli in an environment at a device for purposes of stabilizing the device using computer vision. The input engine 604 can receive input indicating detectable stimuli in an environment at a device for purposes of stabilizing the device from an applicable sensor for detecting stimuli. For example, the input engine 604 can receive images of a field of view of an environment at a device from a camera implemented either as part of the device or separate from the device. In another example, the input engine 604 can receive, from an acoustic sensor, input indicating distances between objects in an environment at a device and either the device or a system for controlling stabilization of the device.
[0090] The predictive tracking model maintenance engine 606 is intended to represent an engine that functions to maintain a predictive tracking model for use in determining a reference orientation of a device using computer vision. A predictive tracking model maintained by the predictive tracking model maintenance engine 606 can be applied to determine a reference orientation of a device for purposes of stabilizing the device. For example, if an object in images of a field of view of an environment at a device is being tracked to determine a reference orientation, and the object disappears from the images, then a predictive tracking model can be applied to determine the reference orientation without needing to track the object. In maintaining a predictive tracking model, the predictive tracking model maintenance engine 606 can generate and continuously update a predictive tracking model. For example, the predictive tracking model maintenance engine 606 can maintain a predictive tracking model in real time as a device is stabilized. A predictive tracking model maintained by the predictive tracking model maintenance engine 606 can be specific to a device, characteristics of a device, and characteristics of operation of a device by a user. For example, a predictive tracking model can be unique to a specific device or unique to a specific type of device. Further in the example, a predictive tracking model can be used to determine a reference orientation of all smart phones of a specific make and model.
[0091] In a specific implementation, the predictive tracking model maintenance engine 606 functions to maintain a predictive tracking model based on historical movements of a device. Specifically, the predictive tracking model maintenance engine 606 can update a predictive tracking model in real time to indicate how orientation of a device has changed to build a historical movement pattern of the device. For example, if a device on a ship is constantly swaying back and forth in rhythm with waves, then the predictive tracking model maintenance engine 606 can maintain a predictive tracking model to indicate the constant swaying back and forth. The predictive tracking model maintenance engine 606 can maintain a predictive tracking model based on historical movements determined through computer vision. Specifically, the predictive tracking model maintenance engine 606 can update a predictive tracking model to include movements of a device determined by applying computer vision to images of a field of view of an environment.
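A predictive tracking model of this kind could be as simple as the constant-velocity sketch below, which folds each computer-vision observation into a running estimate and extrapolates when the tracked object is lost. The blending constant is an assumption, and a Kalman filter would be a common heavier-weight alternative; neither is mandated by this disclosure.

```python
class ConstantVelocityPredictor:
    """Minimal predictive tracking model: maintain position/velocity from
    observed movements and extrapolate when the object disappears."""

    def __init__(self, position=0.0, velocity=0.0, blend=0.5):
        self.position, self.velocity, self.blend = position, velocity, blend

    def update(self, observed_position, dt):
        """Fold a new computer-vision observation into the model."""
        new_velocity = (observed_position - self.position) / dt
        self.velocity = (1 - self.blend) * self.velocity + self.blend * new_velocity
        self.position = observed_position

    def predict(self, dt):
        """Extrapolated reference position while the tracked object is lost."""
        return self.position + self.velocity * dt
```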
[0092] In a specific implementation, the predictive tracking model maintenance engine 606 functions to maintain a predictive tracking model based on either or both physical and natural laws. In maintaining a predictive tracking model based on either or both physical and natural laws, the predictive tracking model maintenance engine 606 can apply physical or natural laws based on characteristics of a device or an operator of the device. For example, if a human is operating a device, then the predictive tracking model maintenance engine 606 can build a predictive tracking model indicating the device cannot fly, as a human cannot fly. In another example, if a device has a specific mass, then the predictive tracking model maintenance engine 606 can build a predictive tracking model indicating that if the device is accelerating at a specific rate, then it will have a specific increasing velocity based on the mass of the device.
[0093] The predictive tracking model datastore 608 is intended to represent a datastore that functions to store predictive tracking model data indicating predictive tracking models. Predictive tracking models indicated by predictive tracking model data stored in the predictive tracking model datastore 608 can be used to determine a reference orientation of a device for purposes of stabilizing the device. For example, if a tracked object falls out of images of a field of view of an environment at a device, then a predictive tracking model indicated by predictive tracking model data stored in the predictive tracking model datastore 608 can be applied to determine a reference orientation of the device for purposes of stabilizing it.
[0094] The tracking object identification engine 610 is intended to represent an engine that functions to identify an object in images to track for purposes of determining a reference orientation of a device. A reference orientation determined by tracking an object identified by the tracking object identification engine 610 can be used to stabilize a device. The tracking object identification engine 610 can identify an object to track in images of a field of view of an environment at a device for purposes of stabilizing the device. For example, the tracking object identification engine 610 can identify an object to track in images, captured by a camera of a device, of a field of view of an environment at the device for purposes of stabilizing the device. In another example, the tracking object identification engine 610 can identify an object to track in images, captured by a camera separate from a device, of a field of view of an environment at the device for purposes of stabilizing the device.
[0095] In a specific implementation, the tracking object identification engine 610 can apply computer vision to identify objects to track in images of a field of view of an environment. The tracking object identification engine 610 can identify objects to track in images using applicable computer vision techniques. For example, the tracking object identification engine 610 can use either or both a scale-invariant feature transform (hereinafter referred to as "SIFT") object detection method or a speeded up robust features (hereinafter referred to as "SURF") object detection method to identify objects to track in images. The tracking object identification engine 610 can apply computer vision to identify key points in images and match the key points in the images as part of point tracking to identify objects to track in the images.

[0096] In a specific implementation, the tracking object identification engine 610 functions to create a model of an object to track by applying computer vision to images including the object. The tracking object identification engine 610 can use a point correlation computer vision method to create a model of an object to track. For example, the tracking object identification engine 610 can slide an object around in a frame or an image to identify key points, e.g. through application of either or both SIFT or SURF object detection methods. Further in the example, the tracking object identification engine 610 can identify key points through application of computer vision across a plurality of images. Still further in the example, the tracking object identification engine 610 can match the key points to build a model of the object.
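A hedged sketch of this key-point approach using OpenCV's SIFT implementation follows; it assumes an OpenCV build that ships SIFT (e.g. opencv-python 4.4 or later), while SURF is typically only available in non-free builds. Lowe's ratio test keeps only confident matches.

```python
import cv2

sift = cv2.SIFT_create()

def build_model(object_image):
    """Key points plus descriptors form a reusable model of the object."""
    return sift.detectAndCompute(object_image, None)

def match_model(model_descriptors, frame):
    """Match the object model against a new frame, keeping only confident
    matches via Lowe's ratio test."""
    _, frame_descriptors = sift.detectAndCompute(frame, None)
    matcher = cv2.BFMatcher()
    good = []
    for pair in matcher.knnMatch(model_descriptors, frame_descriptors, k=2):
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
    return good
```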
[0097] In a specific implementation, the tracking object identification engine 610 functions to select an object to track based on characteristics of the object. For example, if an object is moving across a plurality of images of a field of view of an environment, then the tracking object identification engine 610 can select the object for purposes of tracking the object. In another example, if an object is of a specific size in an image, then the tracking object identification engine 610 can select the object for purposes of tracking the object.
[0098] In a specific implementation, the tracking object identification engine 610 functions to identify an object to track based on user input. For example, the tracking object identification engine 610 can identify an object to track in a portion of an image of a captured field of view of an environment. Further in the example, the tracking object identification engine 610 can center the portion of the image of the captured field of view of the environment for purposes of tracking the identified object in the portion of the image. Alternatively, the tracking object identification engine 610 can select a portion of an image of a captured field of view of an environment independent from user input and subsequently identify an object to track in the portion of the image. In tracking an object in only a portion of an image, computational resources needed to identify and track the object are reduced, as sketched below. This reduces consumed or required processing power and conserves battery power. As a result, the tracking object identification engine 610 and other applicable parts of the computer vision-based reference orientation identification system 602 can be implemented on a non-industrial device.
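The computational saving from tracking in only a portion of an image can be illustrated as below; `locate` is a hypothetical routine (for instance, the key-point matching above) run on the cropped region rather than the full frame.

```python
def track_in_roi(frame, roi, locate):
    """Run the (hypothetical) locate() routine only inside a region of
    interest, cutting the pixels processed per frame; coordinates of a hit
    are mapped back into full-frame space."""
    x, y, w, h = roi
    found = locate(frame[y:y + h, x:x + w])
    if found is None:
        return None
    fx, fy = found
    return fx + x, fy + y
```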
[0099] The reference orientation identification engine 612 is intended to represent an engine that functions to determine a reference position of a device for purposes of stabilizing the device. The reference orientation identification engine 612 can determine a reference position of a device by tracking objects in images of a field of view of an environment at the device. For example, the reference orientation identification engine 612 can determine that a changing reference position of a device has moved 1mm to the left based on objects tracked in images of a field of view of an environment of the device moving 1mm to the left. The reference orientation identification engine 612 can apply computer vision to images of a field of view of an environment at a device to track objects in the images. For example, the reference orientation identification engine 612 can apply either or both SIFT and SURF feature recognition mechanisms to track movement of an object across a plurality of images of a field of view.
[0100] In a specific implementation, the reference orientation identification engine 612 functions to apply a predictive tracking model for purposes of tracking objects. The reference orientation identification engine 612 can apply a predictive tracking model to predict a position an object should be at for purposes of identifying a reference orientation. For example, if a predictive tracking model indicates an object should have moved 1mm down with respect to a point in space, then the reference orientation identification engine 612 can predict a changing reference orientation as 1mm down with respect to the point. The reference orientation identification engine 612 can apply a predictive tracking model based on whether it is currently able to track an object in images of a field of view at a device. For example, if a tracked object disappears from images of a field of view, e.g. a user obscures a sensor for capturing the images, then the reference orientation identification engine 612 can apply a predictive tracking model to determine a reference orientation of a device.
[0101] In a specific implementation, the reference orientation identification engine 612 functions to use compensation parameters to determine a reference orientation of a device. For example, if compensation parameters indicate a device is being moved linearly forward at a speed of ten miles per hour, then the reference orientation identification engine 612 can factor in that a sensor used to capture images is moving at ten miles per hour as part of tracking an object in the captured images. The reference orientation identification engine 612 can receive input from an applicable sensor for use in applying compensation parameters to determine a reference orientation. For example, the reference orientation identification engine 612 can determine a linear speed of a device from an accelerometer and subsequently use the speed as a compensation parameter. In another example, the reference orientation identification engine 612 can determine an orientational speed of a gimbal from an accelerometer and subsequently use the orientational speed as a compensation parameter in determining a reference orientation of a device.
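As a hedged sketch of applying such a compensation parameter, the image motion expected from the device's own linear travel can be subtracted from the observed drift; the pixels-per-meter scale below is an assumed calibration, not a value from this disclosure.

```python
def compensated_drift(observed_dx_px, linear_speed_mps, dt, px_per_meter=500.0):
    """Remove the image motion expected from the device's own travel (a
    compensation parameter, e.g. from an accelerometer) so only genuine
    displacement drives stabilization; px_per_meter is an assumed scale."""
    expected_dx_px = linear_speed_mps * dt * px_per_meter
    return observed_dx_px - expected_dx_px
```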
[0102] In a specific implementation, the reference orientation identification engine 612 functions to manipulate images to change a size of tracked objects in the images. In manipulating images to change sizes of tracked objects, the reference orientation identification engine 612 can use changes to zoom level or aspect ratios resulting from changing sizes of images and corresponding tracked objects to determine a reference position. In changing sizes of tracked objects in images, the objects can be more easily identified and tracked without risk of losing a tracked object. The reference orientation identification engine 612 can use an applicable image processing technique to manipulate images to change sizes of tracked objects.

[0103] In a specific implementation, the reference orientation identification engine 612 functions to determine a reference position based on input received from an acoustic sensor. For example, the reference orientation identification engine 612 can use input received from an acoustic sensor to determine a measure of distance between a sensor and an object, a device being stabilized and an object, or different objects. Further in the example, the reference orientation identification engine 612 can use a measured distance to determine dimensions of a tracked object, a captured field of view, or other features in a captured field of view for purposes of determining a reference orientation of a device.
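The acoustic distance measurement can be illustrated with the usual round-trip echo computation; this sketch assumes sound travelling in air at roughly 20 °C.

```python
SPEED_OF_SOUND_MPS = 343.0   # in air at roughly 20 degrees C (assumed)

def acoustic_distance_m(echo_round_trip_s):
    """Distance from an acoustic sensor to an object, given the measured
    round-trip time of the echo."""
    return SPEED_OF_SOUND_MPS * echo_round_trip_s / 2.0
```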
[0104] In an example of operation of the example system shown in FIG. 6, the input engine 604 receives input from a sensor for use in determining a reference orientation of a device for purposes of stabilizing a device. In the example of operation of the example system shown in FIG. 6, the predictive tracking model maintenance engine 606 maintains a predictive tracking model for use in determining a reference orientation of the device. Further, in the example of operation of the example system shown in FIG. 6, the predictive tracking model datastore 608 stores predictive tracking model data indicating the predictive tracking model maintained by the predictive tracking model maintenance engine 606. In the example of operation of the example system shown in FIG. 6, the tracking object identification engine 610 identifies an object to track in images, received as input by the input engine 604, of a field of view of an environment at the device. Additionally, in the example of operation of the example system shown in FIG. 6, the reference orientation identification engine 612 tracks the object in the images to determine a reference orientation. In the example of operation of the example system shown in FIG. 6, the reference orientation identification engine 612 uses the predictive tracking model to further determine the reference orientation of the device.
[0105] FIG. 7 depicts a flowchart 700 of an example of a method for using computer vision to determine a reference orientation of a device for purposes of stabilizing the device. The flowchart 700 begins at module 702, where input including images of a field of view of an environment at a device is received for purposes of stabilizing the device. An applicable engine for receiving perceivable stimuli, such as the input engines described in this paper, can receive input including images of a field of view of an environment at a device for purposes of stabilizing the device. Images of a field of view of an environment at a device can be received from an applicable source, such as the sensors described in this paper. For example, images of a field of view of an environment at a device can be received from a camera integrated as part of the device. Additionally, images of a field of view of an environment at a device can be received from a camera integrated as part of a frame of a computer vision-based device stabilization system.
[0106] The flowchart 700 continues to module 704, where an object to track in the images is identified to determine a reference orientation of the device for purposes of stabilizing the device. An applicable engine for identifying objects to track in identifying a reference orientation of a device, such as the tracking object identification engines described in this paper, can identify an object to track in the images to determine a reference orientation of the device. An object to track in the images can be identified using computer vision. For example, either or both a SIFT object detection method and a SURF object detection method can be used to identify an object to track in the images of the field of view of the environment at the device.
[0107] The flowchart 700 continues to module 706, where the object is tracked to determine the reference orientation of the device for purposes of stabilizing the device. An applicable engine for tracking an object in images for purposes of determining a reference orientation of a device, such as the reference orientation identification engines described in this paper, can track the object in the images of the field of view of the environment at the device. The object can be tracked to determine the reference orientation by applying computer vision to the images of the field of view of the environment at the device. The reference orientation can be determined based on input received from an acoustic sensor.
[0108] The flowchart 700 optionally continues to module 708, where a predictive tracking model is used to track the object to determine the reference orientation of the device for purposes of stabilizing the device. A predictive tracking model can be applied by an applicable engine for tracking an object in images for purposes of determining a reference orientation of a device, such as the reference orientation identification engines described in this paper. For example, a predictive tracking model can be applied to determine an expected position of the device. A predictive tracking model can be applied if the object can no longer be tracked or otherwise disappears from the images of the field of view of the environment at the device. For example, if a mountain no longer appears in the images of the field of view of the environment at the device, then the predictive tracking model can be applied to determine the reference orientation. An applicable engine for maintaining predictive tracking models, such as the predictive tracking model maintenance engines described in this paper can maintain a predictive tracking model for use in determining the reference orientation of the device.
[0109] FIG. 8 depicts a diagram 800 of an example of a device stabilization control system 802. The device stabilization control system 802 is intended to represent a system that functions to control stabilization of a device based on a determined actual orientation of the device and a determined reference orientation of the device. In controlling device stabilization, the device stabilization control system 802 can control operation of an applicable system for stabilizing a device, such as the computer vision-based device stabilization systems described in this paper. For example, the device stabilization control system 802 can generate and send actuation signals to a drive mechanism of a gimbal assembly for purposes of causing the gimbal assembly to displace a device from an actual orientation to a reference orientation. The device stabilization control system 802 can be implemented as part of an applicable system for controlling a gimbal assembly for purposes of stabilizing a device, such as the computer vision-based stabilization control systems described in this paper.
[0110] In a specific implementation, the device stabilization control system 802 functions to control stabilization of a device through application of computer vision to determine a reference orientation. Specifically, the device stabilization control system 802 can apply computer vision to images of a field of view of an environment at a device to determine a reference orientation of the device for purposes of stabilizing the device. Further, the device stabilization control system 802 can compare a reference orientation of a device to an actual orientation of the device to determine how to actuate driving mechanisms to cause the device to be moved from its actual orientation to its reference orientation. The device stabilization control system 802 can determine an actual orientation of a device based on either or both performance characteristics data of a drive mechanism and data received from an accelerometer, a GPS sensor, and a gyroscope.
[0111] The device stabilization control system 802 shown in FIG. 8 includes an input engine 804, an actual orientation identification engine 806, a reference orientation identification engine 808, and a stabilization control engine 810. The input engine 804 functions according to an applicable engine for receiving input for use in identifying a reference orientation of a device, such as the input engines described in this paper. The input engine 804 can receive input for identifying a reference orientation of a device from an applicable source. For example, the input engine 804 can receive input including images of a field of view of an environment from a camera integrated as part of a device to be stabilized. In another example, the input engine 804 can receive input indicating pitch, roll, and yaw from one or a combination of an accelerometer, a gyroscope, and a GPS sensor.

[0112] The actual orientation identification engine 806 is intended to represent an engine that functions to determine an actual orientation of a device for purposes of stabilizing the device. The actual orientation identification engine 806 can determine an actual orientation of a device based on performance characteristics of a drive mechanism for a gimbal assembly. For example, an actual position of a device can be determined based on angular positions of a drive mechanism indicated by performance characteristics data of gimbals affixed to the device. Additionally, the actual orientation identification engine 806 can determine an actual orientation of a device based on data received from one or a combination of a GPS sensor, an accelerometer, and a gyroscope. For example, an actual orientation of a device can be determined based on measurements made by an accelerometer of a frame of a computer vision-based device stabilization system.
[0113] The reference orientation identification engine 808 is intended to represent an engine that functions to determine a reference orientation of a device for purposes of stabilizing the device, such as the reference orientation identification engines described in this paper. The reference orientation identification engine 808 can determine a reference orientation using computer vision. For example, the reference orientation identification engine 808 can track an identified object in images of a field of view of an environment at a device to determine a reference orientation of the device. Additionally, the reference orientation identification engine 808 can determine a reference orientation based on data received from one or a combination of an accelerometer, a GPS sensor, and a gyroscope.
[0114] The stabilization control engine 810 functions to control stabilization of a device based on a determined actual orientation of the device and a reference orientation of the device. In controlling stabilization of a device, the stabilization control engine 810 can control a drive mechanism of a gimbal assembly to cause a device to displace from an actual orientation to a reference orientation. For example, the stabilization control engine 810 can generate and send an actuation signal to cause drive mechanisms to displace a device from its actual orientation to a new orientation. Additionally, the stabilization control engine 810 can control a drive mechanism of a gimbal assembly based on performance characteristics data for the drive mechanism received from an applicable source, such as the encoders described in this paper. For example, the stabilization control engine 810 can determine an actual orientation of a drive mechanism from performance characteristics data and subsequently determine how much to displace the drive mechanism to a desired position in order to position a device at a reference orientation based on the actual position. Further in the example, the stabilization control engine 810 can generate and send an actuation signal to cause the drive mechanism to displace to the desired position, subsequently positioning the device at the reference orientation.
[0115] In an example of operation of the example system shown in FIG. 8, the input engine 804 receives input for use in stabilizing a device. In the example of operation of the example system shown in FIG. 8, the actual orientation identification engine 806 determines an actual orientation of a device based on the input received by the input engine 804. Further, in the example of operation of the example system shown in FIG. 8, the reference orientation identification engine 808 determines a reference orientation of a device based on computer vision. In the example of operation of the example system shown in FIG. 8, the stabilization control engine 810 actuates a drive mechanism of a gimbal assembly to cause the device to be moved from its actual orientation to its reference orientation based on performance characteristics of the drive mechanism received by the input engine 804.
[0116] These and other examples provided in this paper are intended to illustrate but not necessarily to limit the described implementation. As used herein, the term "implementation" means an implementation that serves to illustrate by way of example but not limitation. The techniques described in the preceding text and figures can be mixed and matched as circumstances demand to produce alternative implementations.

Claims

We claim:
1. A method comprising:
affixing a device to a frame of a computer vision-based device stabilization system through a securing mechanism;
receiving performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system;
receiving images of a captured field of view of an environment at the computer vision- based device stabilization system;
determining an actual orientation of the device;
identifying a reference orientation of the device by applying computer vision to the images of the captured field of view of the environment at the computer vision-based device stabilization system;
controlling the drive mechanisms of the gimbal assembly by correlating the actual orientation and the reference orientation of the device to move the device from the actual orientation to the reference orientation for purposes of stabilizing the device.
2. The method of claim 1, further comprising:
applying computer vision to identify at least one object in the images of the captured field of view of the environment at the computer vision-based device stabilization system;
tracking the object in the images of the captured field of view of the environment at the computer vision-based device stabilization system using computer vision to identify the reference orientation of the device.
3. The method of claim 1, further comprising:
applying either or both a scale invariant object transform objects detection method and a speeded up robust objects detection method to identify at least one object in the images of the captured field of view of the environment at the computer vision-based device stabilization system;
tracking the object in the images of the captured field of view of the environment at the computer vision-based device stabilization system using either or both the scale invariant object transform objects detection method and the speeded up robust objects detection method to identify the reference orientation of the device.
4. The method of claim 1, wherein the securing mechanism used to affix the device to the frame of the computer vision-based device stabilization system is a claw mechanism with gears that is configured to allow space in the frame for storage of a battery to power the computer vision-based device stabilization system.
5. The method of claim 1, wherein the drive mechanisms of the gimbal assembly are controlled by sending actuation signals directly to the drive mechanisms from a main printed circuit board of the computer vision-based device stabilization system.
6. The method of claim 1, further comprising identifying the actual orientation from the performance characteristics data of the drive mechanisms of the gimbal assembly of the computer vision-based device stabilization system.
7. The method of claim 1, further comprising identifying the actual orientation from data received from one or a combination of an accelerometer, a gyroscope, and a GPS sensor integrated at the computer vision-based device stabilization system.
8. The method of claim 1, wherein the performance characteristics data of the drive mechanisms of the gimbal assembly is generated at a frequency between 400Hz and 1600Hz, and actuation signals are sent to the drive mechanisms of the gimbal assembly to control the drive mechanisms of the gimbal assembly at a frequency between 400Hz and 1600Hz in response to the performance characteristics data.
9. The method of claim 1, further comprising a handle without a battery affixed to the computer vision-based device stabilization system, the handle removable to allow for mounting of the computer vision-based device stabilization system.
10. The method of claim 1, further comprising:
maintaining a predictive tracking model based on historical movements of the device;
identifying the reference orientation of the device by applying the predictive tracking model when a tracked object in the images of the field of view of the environment at the computer vision-based device stabilization system can no longer be tracked in the images.
11. A system comprising:
a frame of a computer vision-based device stabilization system with a securing mechanism configured to receive and affix a device to a frame of the computer vision-based device stabilization system;
an input engine configured to:
receive performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system;
receive images of a captured field of view of an environment at the computer vision-based device stabilization system;
an actual orientation identification engine configured to determine an actual orientation of the device;
a computer vision-based reference orientation identification system configured to identify a reference orientation of the device by applying computer vision to the images of the captured field of view of the environment at the computer vision-based device stabilization system;
a stabilization control engine configured to control the drive mechanisms of the gimbal assembly by correlating the actual orientation and the reference orientation of the device to move the device from the actual orientation to the reference orientation for purposes of stabilizing the device.
12. The system of claim 11, further comprising:
a tracking object identification engine configured to apply computer vision to identify at least one object in the images of the captured field of view of the environment at the computer vision-based device stabilization system;
a reference orientation identification engine configured to track the object in the images of the captured field of view of the environment at the computer vision-based device stabilization system using computer vision to identify the reference orientation of the device.
13. The system of claim 11, further comprising:
a tracking object identification engine configured to apply either or both a scale invariant object transform objects detection method and a speeded up robust objects detection method to identify at least one object in the images of the captured field of view of the environment at the computer vision-based device stabilization system;
a reference orientation identification engine configured to track the object in the images of the captured field of view of the environment at the computer vision-based device stabilization system using either or both the scale invariant object transform objects detection method and the speeded up robust objects detection method to identify the reference orientation of the device.
14. The system of claim 11, wherein the securing mechanism used to affix the device to the frame of the computer vision-based device stabilization system is a claw mechanism with gears that is configured to allow space in the frame for storage of a battery to power the computer vision-based device stabilization system.
15. The system of claim 11, wherein the stabilization control engine is configured to
control the drive mechanisms of the gimbal assembly by sending actuation signals directly to the drive mechanisms from a main printed circuit board of the computer vision-based device stabilization system.
16. The system of claim 11, wherein the actual orientation identification engine is
configured to identify the actual orientation from the performance characteristics data of the drive mechanisms of the gimbal assembly of the computer vision-based device stabilization system.
17. The system of claim 11, wherein the actual orientation identification engine is
configured to identify the actual orientation from data received from one or a combination of an accelerometer, a gyroscope, and a GPS sensor integrated at the computer vision-based device stabilization system.
18. The system of claim 11, further comprising a handle without a battery affixed to the computer vision-based device stabilization system, the handle removable to allow for mounting of the computer vision-based device stabilization system.
19. The system of claim 11, further comprising:
a predictive tracking model maintenance engine configured to maintain a predictive tracking model based on historical movements of the device;
a reference orientation identification engine configured to identify the reference orientation of the device by applying the predictive tracking model when a tracked object in the images of the field of view of the environment at the computer vision-based device stabilization system can no longer be tracked in the images.
20. A system comprising:
means for affixing a device to a frame of a computer vision-based device stabilization system through a securing mechanism;
means for receiving performance characteristics data of drive mechanisms of a gimbal assembly of the computer vision-based device stabilization system;
means for receiving images of a captured field of view of an environment at the computer vision-based device stabilization system;
means for determining an actual orientation of the device;
means for identifying a reference orientation of the device by applying computer vision to the images of the captured field of view of the environment at the computer vision-based device stabilization system;
means for controlling the drive mechanisms of the gimbal assembly by correlating the actual orientation and the reference orientation of the device to move the device from the actual orientation to the reference orientation for purposes of stabilizing the device.
PCT/US2018/031892 2017-05-10 2018-05-09 Device stabilization WO2018208981A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762504416P 2017-05-10 2017-05-10
US62/504,416 2017-05-10

Publications (1)

Publication Number Publication Date
WO2018208981A1 true WO2018208981A1 (en) 2018-11-15

Family

ID=64105004

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/031892 WO2018208981A1 (en) 2017-05-10 2018-05-09 Device stabilization

Country Status (1)

Country Link
WO (1) WO2018208981A1 (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101782714B (en) * 2009-12-07 2011-07-06 杨子良 Multiaxial intelligent balance adjusting camera stabilizer
US20110188847A1 (en) * 2010-02-04 2011-08-04 Mckay Thomas Hand-held image stabilization and balancing system for cameras
US20140267778A1 (en) * 2013-03-15 2014-09-18 Freefly Systems, Inc. Apparatuses and methods for controlling a gimbal and other displacement systems
US20150071627A1 (en) * 2013-09-12 2015-03-12 Chi Khai Hoang Automated Stabilizing Apparatus
US20160171330A1 (en) * 2014-12-15 2016-06-16 Reflex Robotics, Inc. Vision based real-time object tracking system for robotic gimbal control
US20160352992A1 (en) * 2015-05-27 2016-12-01 Gopro, Inc. Image Stabilization Mechanism


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18797608

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26-02-2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18797608

Country of ref document: EP

Kind code of ref document: A1