US20230384863A1 - Electronic Device with an Input Device Having a Haptic Engine - Google Patents


Info

Publication number
US20230384863A1
Authority
US
United States
Prior art keywords
input
haptic
input device
electronic device
haptic output
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/233,112
Inventor
Tyler S. Bushnell
Benjamin J. Kallman
David M. Pelletier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Application filed by Apple Inc
Priority to US18/233,112
Publication of US20230384863A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/0202 Constructional details or processes of manufacture of the input device
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0362 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts

Definitions

  • the described embodiments relate generally to electronic devices. More particularly, the present embodiments relate to an electronic device that includes a haptic engine that detects an input action associated with an input device and provides haptic feedback based on the detected input action.
  • Portable electronic devices have become increasingly popular, and the features and functionality provided by portable electronic devices continue to expand to meet the needs and expectations of many consumers.
  • some portable electronic devices include features such as touch sensors, a display, various input devices, speakers, and microphones.
  • the electronic device may take on a small form factor. In such cases, it can be challenging to include all of the components in the electronic device that are needed to provide the various functionalities in the smallest space.
  • Embodiments disclosed herein provide an electronic device that is configured to provide haptic feedback to a user based on an input action associated with an input device.
  • a haptic engine is configured to detect an input action associated with the input device (e.g., a translational input) and produce a haptic output based on the detected input action.
  • the haptic output may be perceived by a user as haptic feedback.
  • the haptic feedback can indicate to the user that the user input has been received by the electronic device.
  • an input device includes a haptic engine operably connected to or mechanically coupled to an input surface of the input device and to a processing device.
  • the haptic engine may include an electromagnetic actuator, such as a linear actuator, that detects a user input or input action associated with the input surface and provides a haptic output based on the detected input action.
  • the electromagnetic actuator includes a magnet assembly and a coil assembly adjacent to the magnet assembly. For example, the coil assembly can at least partially surround the magnet assembly.
  • the haptic engine detects the input action based on a first movement between the magnet assembly and the coil assembly with respect to each other, the first movement inducing an input device signal in the coil assembly.
  • the processing device is configured to receive or respond to the input device signal and to responsively cause a haptic output signal to be transmitted to the haptic engine.
  • the haptic output signal produces a second movement between the magnet assembly and the coil assembly with respect to each other to produce a haptic output that is applied or transferred to the input surface.
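  • By way of a hedged illustration of the flow described above, the following sketch models a processing device that reads the voltage induced in the coil assembly as the input device signal and, when an input action is detected, transmits a haptic output signal back to the same coil. The protocol, names, units, and threshold values are hypothetical and are not taken from the specification.

    import Foundation

    // Illustrative only: one coil assembly is used both to sense and to actuate.
    protocol CoilAssembly {
        /// Voltage induced by relative movement of the magnet assembly (the input device signal).
        func inducedVoltage() -> Double
        /// Drives the coil so the magnet assembly moves and produces a haptic output.
        func drive(current: Double, duration: TimeInterval)
    }

    struct HapticEngineController {
        let coil: any CoilAssembly
        /// Induced-voltage level treated as a registered input action (assumed value).
        let detectionThreshold = 0.05

        /// One polling pass: detect an input action, then transmit a haptic output signal.
        func poll() {
            let inputDeviceSignal = coil.inducedVoltage()
            guard abs(inputDeviceSignal) > detectionThreshold else { return }
            // Haptic output signal: a brief drive pulse whose momentum is transferred
            // to the input surface and perceived by the user as haptic feedback.
            coil.drive(current: 0.2, duration: 0.015)
        }
    }
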
  • the input device is an input button in an electronic watch (e.g., a smart watch).
  • the electronic watch also includes a display and a processing device operably connected to the display and to the electromagnetic actuator.
  • An input action (e.g., a translational input action) provided to the input button is detected by the electromagnetic actuator, which produces a haptic output based on the detected input action.
  • the display is configured to display a user interface screen associated with an application program, and the input action also causes a change in the user interface screen displayed on the display. For example, an icon may be selected and a different user interface screen displayed based on the selected icon, or the digits in the time displayed on the display are changed based on the input action.
  • an electronic device includes an input device configured to receive a user input, a haptic device operably connected to the input device, and a processing device operably connected to the haptic device.
  • the processing device is configured to receive or respond to an input device signal from the haptic device based on the user input.
  • the processing device is configured to cause a haptic output signal to be transmitted to the haptic device.
  • the haptic output signal causes the haptic device to produce a haptic output.
  • an electronic device includes an input device configured to receive a user input and a haptic engine operably connected to the input device.
  • the haptic engine is configured to detect the user input and produce a haptic output based on the detected user input.
  • the haptic engine is further configured to operate in a first mode in which the haptic engine engages the input device, and in a second mode in which the haptic engine is not engaged with the input device.
  • an electronic watch includes an electromagnetic actuator operably connected to an input button, and a processing device operably connected to the electromagnetic actuator.
  • the electromagnetic actuator includes a magnet assembly and a coil assembly adjacent the magnet assembly.
  • the electromagnetic actuator is configured to detect an input action (e.g., a translational input action) provided to the input button based on a first movement between the magnet assembly and the coil assembly. The first movement induces an input device signal in the coil assembly.
  • the processing device is configured to receive or respond to the input device signal and to cause a haptic output signal to be transmitted to the electromagnetic actuator to cause a second movement between the magnet assembly and the coil assembly to produce a haptic output.
  • the haptic output may be applied to the input button and/or to another region or surface of the electronic device.
  • an electronic watch includes a linear actuator operably connected to a crown and a processing device operably connected to the linear actuator.
  • the linear actuator includes a magnet assembly movably coupled to a shaft and a coil assembly adjacent the magnet assembly.
  • the linear actuator is configured to detect an input action (e.g., a translational input action) provided to the crown based on a first movement between the magnet assembly and the coil assembly, the first movement inducing an input device signal.
  • the processing device is configured to receive or respond to the input device signal and to cause a haptic output signal to be transmitted to the linear actuator.
  • the haptic output signal causes a second movement between the magnet assembly and the coil assembly to produce a haptic output.
  • the haptic output may be applied to the crown and/or to another region or surface of the electronic device.
  • FIG. 1 shows one example of an electronic device that can include a haptic engine configured to provide haptic output based on an input action associated with an input device;
  • FIG. 2 depicts a simplified schematic of the electronic device taken along line A-A in FIG. 1 ;
  • FIG. 3 shows a schematic cross-sectional view of a first example of the electronic device taken along line A-A in FIG. 1 ;
  • FIG. 4 depicts a simplified cross-sectional view of a second example of the electronic device taken along line B-B in FIG. 1 ;
  • FIG. 5 shows a simplified cross-sectional view of a third example of the electronic device taken along line A-A in FIG. 1 ;
  • FIGS. 6 A- 6 B show a simplified cross-sectional view of a fourth example of the electronic device taken along line A-A in FIG. 1 ;
  • FIGS. 7 A- 7 B depict a simplified cross-sectional view of a fifth example of the electronic device taken along line A-A in FIG. 1 ;
  • FIG. 8 depicts one example of the coil and magnet assemblies that are suitable for use in the haptic devices shown in FIGS. 2 - 7 ;
  • FIG. 9 shows a flowchart of a method of operating an electronic device.
  • FIG. 10 is an illustrative block diagram of the electronic device shown in FIG. 1 .
  • cross-hatching in the figures is provided to distinguish the elements or components from one another.
  • the cross-hatching is not intended to indicate a type of material or materials or the nature of the material(s).
  • a haptic device may be configured to produce a mechanical movement or vibration that may be transmitted through the enclosure and/or an input device of the electronic device. In some cases, the movement or vibration is transmitted to the skin of the user and perceived as a stimulus or haptic feedback by the user.
  • the haptic feedback may be coupled to an input action on an input device.
  • an input action is the pressing of an input button. The haptic feedback can indicate to a user that the input action has been received or registered by the input device and/or the electronic device.
  • the electronic device includes an input device and a haptic engine operably connected to the input device.
  • a haptic engine may include an electromechanical assembly that is capable of producing a change in momentum using a moving mass that results in a haptic output.
  • the haptic engine is configured to function as both an input sensor and a haptic device.
  • the input sensor may be integrated within the haptic device in that the same electromechanical components produce and receive the signals of both the haptic device and the input sensor.
  • an input action (e.g., a button press) can cause the magnet assembly and the coil assembly to move with respect to each other.
  • This movement induces a current (or “input device signal”) in the coil assembly.
  • the input device signal indicates an input action associated with the input device has occurred.
  • a processing device may be responsive to the input device signal, and may cause a haptic output signal to be transmitted to the coil assembly.
  • the haptic output signal may cause the haptic device to produce a haptic output.
  • an input device can be an input button in an electronic device, and one input action associated with the input button is a button press or translational input.
  • the button press may cause the input button, or components within the input button, to translate or move in the same direction as the direction of the button press (e.g., horizontal direction).
  • an input action can include a force input where an amount of force, or varying amounts of force, is applied to an input device.
  • a processing device operably connected to the haptic engine is configured to detect the applied force on the input device.
  • the processing device can be configured to detect motion or a rotation of the input device (or of a component in the input device).
  • the input device may be a crown of an electronic watch (e.g., a smart watch) that a user can rotate to provide one or more inputs to the smart watch.
  • a haptic engine may produce one or more types of haptic output, such as vibrations, an applied force, movement, and combinations thereof.
  • the haptic output may be transmitted through the enclosure and/or an input device of the electronic device and detected by a user.
  • the movement, force, and/or vibration is transmitted to the skin of the user and perceived as a stimulus or haptic feedback by the user.
  • a user can press an input button and the haptic engine can apply a force to the input button in a direction opposite the direction of the button press.
  • the applied force may be perceived by the user as a “tap” or “knock” that indicates the input device and/or the electronic device registered the button press.
  • the haptic engine may move a mass in one direction or in opposing directions in response to the button press. The movement may be perceived by the user as a vibration that indicates the input device and/or the electronic device registered the button press.
  • the position of the haptic engine can be adjusted so that a haptic output can be applied to different regions of an electronic device.
  • a haptic engine may be positioned at a first position to apply a haptic output directly to an input device (e.g., an exterior surface of an input button).
  • the haptic engine may also be positioned at a second position to apply a haptic output to a different region of the electronic device (e.g., an exterior surface of an enclosure).
  • the position of the haptic engine can be adjusted using any suitable method. For example, in one embodiment an electromagnet or switch may position the haptic device.
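  • The mode switching just described might be modeled as follows; this is an illustrative sketch only, and the biasing-mechanism interface and polarity convention are assumptions rather than details of the specification.

    /// Where the haptic output is delivered, per the two positions described above.
    enum HapticEngineMode {
        case engagedWithInputDevice   // first position: output applied directly to the input device
        case disengaged               // second position: output applied to another region of the enclosure
    }

    protocol BiasingMechanism {
        /// For example, an electromagnet that attracts or repels a magnet on the shaft.
        func setPolarity(attract: Bool)
    }

    struct HapticEnginePositioner {
        let bias: any BiasingMechanism

        func enter(_ mode: HapticEngineMode) {
            switch mode {
            case .engagedWithInputDevice:
                // Move the contact area toward the interior surface of the input device.
                bias.setPolarity(attract: true)
            case .disengaged:
                // Move the contact area away so the output is produced within the enclosure instead.
                bias.setPolarity(attract: false)
            }
        }
    }
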
  • These and other embodiments are discussed below with reference to FIGS. 1 - 10 . However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these Figures is for explanatory purposes only and should not be construed as limiting.
  • FIG. 1 shows one example of an electronic device that can include a haptic engine configured to produce haptic output based on an input action associated with an input device.
  • the electronic device 100 may be implemented as an electronic or smart watch that is adapted to be worn by a user.
  • a different type of electronic device can be used in other embodiments.
  • the electronic device can be a gaming device, a digital music player, a sports accessory device, a medical device, a health assistant, a tablet computing device, a notebook computer, a smart phone, and other types of electronic devices that provide, or are suitable to provide, haptic feedback to a user.
  • the electronic device 100 includes input devices 106 , 108 .
  • one or both of the input devices 106 , 108 may be configured as input/output devices.
  • the term “input device” is intended to be construed broadly to include both input and input/output devices.
  • An input device may include an input component, such as a button, knob, dial, crown, and the like. Although shown on a side of the electronic device 100 , the input devices 106 , 108 can be positioned substantially anywhere on the electronic device 100 .
  • the electronic device 100 includes at least one haptic engine (see e.g., FIG. 2 ) operably connected to one or both input devices.
  • the haptic engine is configured to detect an input action associated with one or both input devices 106 , 108 and provide haptic feedback to a user when an input action is detected.
  • the haptic engine functions as both an input sensor and a haptic device. In some embodiments, at least some of the components of the haptic engine can be used as the input sensor.
  • an input action (e.g., a translation of input device 108 ) can cause a magnet assembly and a coil assembly of the electromagnetic actuator to move with respect to each other. This movement induces a current (or “input device signal”) in the coil assembly.
  • the input device signal may indicate that an input action associated with the input device has occurred.
  • a processing device may be responsive to the input device signal and may, in turn, cause a haptic output signal to be transmitted to the coil assembly.
  • the haptic output signal causes the electromagnetic actuator to produce a haptic output.
  • the haptic output may be perceived by the user as haptic feedback that indicates the input action has been registered or entered by the electronic device 100 and/or the input device(s) 106 , 108 .
  • the input device 106 is a crown and the input device 108 an input button.
  • Input devices in other embodiments are not limited to these configurations.
  • an input device may be a rocker switch, a portion of the enclosure 102 , one or more keys in a keyboard, a slide switch, a virtual icon or image on a display, or any other suitable input device.
  • the input device 106 (e.g., crown) is configured to receive translational and rotational input actions.
  • the input device 106 may include a shaft that extends into the electronic device 100 . Pressing the input device 106 can cause the shaft, or components coupled to the shaft, to move or translate a given distance. Additionally or alternatively, the shaft may rotate when a user rotates the input device 106 . The amount of shaft rotation can be detected and measured by an optical encoder positioned adjacent to the shaft. The amount of shaft rotation may be used as an input to the electronic device 100 and/or to an application program running on the electronic device 100 .
  • One or more functions can be performed when the input device 106 is rotated and/or pressed. For example, if the display 104 of the electronic device 100 is displaying a time keeping application, the input device 106 may be rotated in either direction to change or adjust the position of the hands or the digits that are displayed for the time keeping application. Additionally or alternatively, the input device 106 may be rotated to move a cursor or other type of selection mechanism from a first displayed location to a second displayed location in order to select an icon or move the selection mechanism between various icons that are presented on the display 104 .
  • the input device 106 may be pressed to perform various functions, such as changing the image on a display, waking the electronic device 100 from a sleep state, and/or to select or activate an application.
  • the input device 106 can be rotated or pressed to disable an application or function.
  • the input device 106 may be pressed to disable an alert produced by an application on the electronic device 100 or received by the electronic device 100 .
  • the input device 108 (e.g., an input component or input button) can be configured to be pressed to cause various functions to be performed and/or disabled.
  • the input device 108 may include a shaft that extends into the electronic device 100 . Pressing the input device 108 can cause the shaft, or components coupled to the shaft, to move or translate a given distance.
  • a single press can activate an application and/or display a particular image or screen on the display. Additionally or alternatively, a single press may disable or delay an alert.
  • a multiple press (e.g., a double press or double click) can activate an application and a component within the electronic device 100 .
  • a double click may access an application that uses a wireless communication network to transmit data associated with the application (e.g., an electronic payment application).
  • a press-hold may operate to turn on and turn off the electronic device 100 or to place the electronic device 100 in a power saving mode (e.g., a mode where minimal functions and applications operate and other applications and functions are disabled).
  • pressing both of the input devices 106 , 108 in various combinations can cause one or more functions to be performed. For example, pressing the input device 106 and then immediately pressing the input device 108 can cause an action to be performed on the electronic device 100 . Additionally or alternatively, simultaneous press-holds on the input devices 106 , 108 can cause another action to be performed on the electronic device 100 .
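  • The press patterns described above (single press, double press or double click, and press-hold) can be distinguished from press and release timestamps alone; the sketch below is illustrative only, and the timing windows are assumed values rather than values from the specification.

    import Foundation

    enum PressGesture {
        case singlePress, doublePress, pressHold
    }

    /// Classifies one completed press sequence from its press/release timestamps.
    struct PressClassifier {
        let holdThreshold: TimeInterval = 0.8      // press-hold if held at least this long (assumed)
        let doublePressWindow: TimeInterval = 0.3  // second press must start within this window (assumed)

        func classify(pressTimes: [TimeInterval], releaseTimes: [TimeInterval]) -> PressGesture {
            if let firstPress = pressTimes.first, let firstRelease = releaseTimes.first,
               firstRelease - firstPress >= holdThreshold {
                return .pressHold
            }
            if pressTimes.count >= 2, let firstRelease = releaseTimes.first,
               pressTimes[1] - firstRelease <= doublePressWindow {
                return .doublePress
            }
            return .singlePress
        }
    }

    // Example: a short press followed 0.15 s later by a second press reads as a double press.
    let classifier = PressClassifier()
    print(classifier.classify(pressTimes: [0.00, 0.25], releaseTimes: [0.10, 0.35]))  // doublePress
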
  • the electronic device 100 further includes an enclosure 102 that forms an outer surface or partial outer surface for the internal components of the electronic device 100 .
  • the enclosure 102 defines openings and/or apertures that receive and/or support a display 104 and the input devices 106 , 108 .
  • the enclosure 102 can be formed of one or more components operably connected together, such as a front piece and a back piece. Alternatively, the enclosure 102 can be formed of a single piece operably connected to the display 104 .
  • the enclosure 102 is formed into a substantially rectangular shape, although this configuration is not required. For example, certain embodiments may include a substantially circular enclosure 102 .
  • the display 104 can provide a visual output for the electronic device 100 and/or function to receive user inputs to the electronic device 100 .
  • the display 104 may incorporate an input device configured to receive touch input, force input, temperature input, and the like.
  • the display 104 may be substantially any size and may be positioned substantially anywhere on the electronic device 100 .
  • the display 104 can be implemented with any suitable display, including, but not limited to, a multi-touch sensing touchscreen device that uses a liquid crystal display (LCD) element, a light emitting diode (LED) element, an organic light-emitting display (OLED) element, or an organic electro luminescence (OEL) element.
  • FIG. 2 shows a simplified schematic of the electronic device taken along line A-A in FIG. 1 .
  • the electronic device 100 can include a haptic engine 200 operably connected to a processing device 202 .
  • the haptic engine 200 is configured to detect an input action associated with the input device 108 and produce a haptic output based on a detected input action.
  • the input device 108 may also be referred to as an input component, which may include an input button, push button, or actuator.
  • the haptic engine 200 may produce one or more types of haptic output, such as vibrations, an applied force, movement, and combinations thereof.
  • the haptic output may be applied or transferred to at least one surface of the electronic device 100 and/or to the input device 108 .
  • the haptic output can be transmitted through the input device and/or the enclosure and perceived as haptic feedback by a user.
  • the haptic engine 200 is configured to apply a haptic output to a bottom surface of the enclosure 102 (e.g., the momentum of the haptic output can be transferred to the bottom surface).
  • the haptic output may be detected by the user as haptic feedback because the bottom surface of the electronic device 100 is in contact with the wrist.
  • the haptic output may be applied or transferred to a side of the electronic device 100 , a top surface of the electronic device 100 , multiple surfaces of the electronic device 100 , and combinations thereof.
  • the haptic engine 200 may be configured to produce a haptic output that is applied or transferred to the input device 108 .
  • the haptic engine 200 may be mechanically or structurally coupled to the input surface 210 (of the input device 108 ) to receive movement from and/or transmit movement to the input surface 210 .
  • movement of the input surface 210 results in movement of one or more components of the haptic engine 200 .
  • the haptic engine 200 can detect such input action and produce a first signal (“input device signal”) that is received by the processing device 202 . Based on the input device signal, the processing device 202 can cause a second signal (“haptic output signal”) to be transmitted to the haptic engine 200 that causes the haptic engine 200 to produce a haptic output (e.g., a vibration or an applied force). A user may then detect the haptic output as haptic feedback when the user's finger is in contact with the input surface 210 .
  • the haptic engine 200 can be configured to operate in two or more modes. For example, in a first mode, the haptic engine 200 may be positioned at a first position to apply a haptic output directly to the interior surface of an input device (e.g., input device 108 ). In a second mode, the haptic engine 200 can be positioned at a second position to produce a haptic output within the electronic device 100 and/or to apply to one or more non-input-device surfaces or regions of the electronic device 100 .
  • a second haptic device 204 may be operably connected to the processing device 202 .
  • the haptic engine 200 can produce a haptic output for the input device 108 while the second haptic device 204 may produce a haptic output for one or more different surfaces (non-input-device surfaces) or regions of the electronic device 100 .
  • Any suitable haptic device can be used in the haptic engine 200 and/or the second haptic device 204.
  • Example haptic devices include, but are not limited to, actuators, vibrators, and other types of motors.
  • a haptic device and haptic engine may produce one or more types of haptic output, such as movement, vibrations, transfer of momentum, and other actions that may produce a perceptible or tactile output.
  • an input sensor and a haptic device are separate components within the electronic device 100 .
  • the input sensor can detect or sense an input action using any suitable sensing technology, such as capacitive, piezoelectric, piezoresistive, electromagnetic, ultrasonic, and magnetic sensing technologies.
  • a capacitive input sensor can be used to detect the presence of a user's finger on the input device.
  • a capacitive sensor may be used to detect a user applying a force on the input device. For example, when the input device is an input button, the input sensor can detect the presence of a user's finger on the button and/or the user pressing the input button.
  • FIG. 3 shows a schematic cross-sectional view of a first example of the electronic device taken along line A-A in FIG. 1 .
  • the haptic engine 200 is mechanically coupled to the input device 108 .
  • a mechanical coupling between the haptic engine 200 and the input device 108 facilitates a transfer of motion between the two components.
  • the haptic engine 200 may be mechanically coupled to the input device 108 such that a translational input to the input device 108 is transferred or structurally communicated to the haptic engine 200 .
  • translational (e.g., vibrational) output from the haptic engine 200 may be transferred or mechanically communicated to the input device 108 .
  • the haptic engine 200 includes an electromagnetic actuator or linear actuator that uses a moving mass to create a haptic output (e.g., movement, applied force, and/or vibration).
  • the moving mass may be one or more magnets that move(s) in one direction or in an oscillating manner in response to a current passing through a coil that is adjacent to the magnet(s).
  • the moving magnet(s) can produce a vibration or applied force that is perceived as haptic feedback by a user.
  • the haptic engine 200 includes a magnet assembly 300 coupled to and/or movably positioned about a shaft 302 .
  • the magnet assembly 300 can include one or more magnets.
  • the magnet assembly 300 includes two magnets 300 a, 300 b of opposite polarities.
  • the magnets 300 a, 300 b can be made of any suitable ferromagnetic material, such as neodymium.
  • the shaft 302 may be formed from one or more components that are fixed with respect to each other or may be separated to allow for decoupling of the haptic engine 200 from other elements of the device.
  • the shaft 302 can be made of a non-ferrous material such as tungsten, titanium, stainless steel, or the like.
  • a coil assembly 304 at least partially surrounds the magnet assembly 300 and/or the shaft 302 .
  • the coil assembly 304 includes one or more coils. Each coil can be formed with a winding of a conductive material, such as a metal.
  • the width of the coil assembly 304 can be less than or substantially equal to the width of the magnet assembly 300 . In other embodiments, the width of the coil assembly 304 may be greater than the width of the magnet assembly 300 .
  • a frame 306 can be positioned at least partially around the coil assembly 304 , the magnet assembly 300 , and/or the shaft 302 to increase the momentum of the linear actuator.
  • the frame 306 can be made of any suitable material.
  • the frame 306 is made of a metal, such as tungsten.
  • the coil assembly 304 and the magnet assembly 300 are positioned such that a first air gap separates the coil assembly 304 from the magnet assembly 300 .
  • the coil assembly 304 and the frame 306 are positioned such that a second air gap separates the coil assembly 304 from the frame 306 .
  • the first and second air gaps are located on opposing sides of the coil assembly 304 .
  • the frame 306 can be disengaged from the input device 108 .
  • the shaft 302 extends through a bearing 308 and a collar 310 which support the frame 306 .
  • the collar 310 allows the shaft 302 to pass the frame 306 in only one direction.
  • the collar 310 may permit the shaft 302 to only move in a direction away from the input device 108 .
  • the coil assembly 304 may be energized by transmitting a current along a length of a wire that forms a coil in the coil assembly 304 .
  • a direction of the current along the wire of the coil determines a direction of a magnetic field that emanates from the coil assembly 304 .
  • the opposing polarities of the magnets 300 a, 300 b generate a radial magnetic field that interacts with the magnetic field of the coil assembly 304 .
  • the Lorentz force resulting from the interaction of the magnetic fields causes the frame 306 and the magnet assembly 300 to move in a first direction aligned with the axis of the shaft 302 . Reversing the current flow through the coil assembly 304 reverses the Lorentz force.
  • the magnetic field or force on the magnet assembly 300 is also reversed and the frame 306 and the magnet assembly 300 move in an opposing second direction.
  • the frame 306 and the magnet assembly 300 can move in one direction or in an oscillating manner depending on the direction of the current flow through the coil assembly 304 .
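  • As an illustrative aside, the oscillating movement described above corresponds to a drive current whose sign alternates, so the resulting Lorentz force (and therefore the frame and magnet assembly) reverses direction each half cycle; the waveform parameters below are assumptions for illustration only.

    import Foundation

    /// Builds a drive-current waveform whose sign alternates each half cycle, reversing
    /// the Lorentz force so the moving mass oscillates along the shaft axis.
    /// Amplitude, frequency, duration, and sample rate are assumed values.
    func oscillatingDriveCurrent(amplitude: Double = 0.2,
                                 frequency: Double = 150.0,
                                 duration: Double = 0.05,
                                 sampleRate: Double = 10_000.0) -> [Double] {
        let sampleCount = Int(duration * sampleRate)
        return (0..<sampleCount).map { n in
            let t = Double(n) / sampleRate
            // Positive samples push the mass in one direction along the shaft,
            // negative samples push it in the opposite direction.
            return amplitude * sin(2.0 * Double.pi * frequency * t)
        }
    }

    // A bidirectional (vibratory) waveform versus a single-polarity "tap" pulse.
    let vibration = oscillatingDriveCurrent()               // alternating-sign samples
    let tap = [Double](repeating: 0.2, count: 100)          // constant-sign drive current
    print(vibration.count, tap.count)                       // 500 100
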
  • the frame 306 includes one or more magnets 312 that assist in moving the frame 306 and produce increased momentum when a current passes through the coil assembly 304 .
  • the shaft 302 , the magnet assembly 300 , and the frame 306 can move a given distance into the electronic device 100 .
  • This movement induces a current (“input device signal”) in the coil assembly 304 .
  • a processing device (e.g., the processing device 202 in FIG. 2 ) may receive or be responsive to the input device signal and cause a haptic output signal to be transmitted to the coil assembly 304 .
  • a haptic output is produced by the magnet assembly 300 and the frame 306 moving in one direction or in an oscillating manner based on the haptic output signal passing through the coil assembly 304 .
  • a housing 314 may be attached to the enclosure 102 and positioned at least partially around the frame 306 , the magnet assembly 300 , the coil assembly 304 , and the shaft 302 .
  • the shaft 302 extends through the housing 314 with a contact area 316 attached to the interior surface of the input device 108 .
  • the momentum of a haptic output can be transferred to the input device 108 using the contact area 316 .
  • a bracket 322 can at least partially surround the housing 314 and attach to an interior surface of the enclosure 102 using one or more fasteners 324 .
  • the bracket 322 fixes the housing 314 to the enclosure 102 .
  • Any suitable fastener may be used, such as screws, welding, and/or an adhesive.
  • the shaft 302 can extend into and/or pass through an opening in the bracket 322 . This allows the shaft 302 to move in or through the opening when a force is applied to the input device 108 .
  • the coil assembly 304 is fixed to the housing 314 .
  • the frame 306 and the magnet assembly 300 move with respect to the coil assembly 304 .
  • the coil assembly 304 may not contact any portion of the frame 306 even when the frame 306 and the magnet assembly 300 are maximally displaced within the housing 314 (e.g., to one end of the shaft 302 ).
  • the coil assembly 304 may move instead of, or in addition to, the frame 306 and the magnet assembly 300 .
  • the coil assembly 304 can be electrically connected to a power source using a flexible circuit or other signal line.
  • a compliant element 318 can be positioned on each side of the frame 306 to bias the frame 306 towards the center region of the travel.
  • the compliant elements 318 provide a return force or local biasing to the frame 306 .
  • the compliant elements 318 may be any suitable compliant element such as a leaf spring, beehive spring, and the like.
  • the haptic engine 200 can function as a force sensor. Using the known characteristics of the input device signal and the linear actuator, such as the mass of the magnet assembly 300 and the spring coefficients of the compliant elements 318 , the acceleration of the movement of the input device 108 can be correlated to an amount of force.
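  • A simplified, illustrative version of that correlation treats the moving parts as a single mass-spring system, so the applied force is approximately the moving mass times the inferred acceleration plus the spring force at the measured displacement; the model and all numeric values below are assumptions, not the specification's method.

    /// Rough estimate: force ≈ m·a + k·x, where a is inferred from the induced input
    /// device signal and x is the displacement of the input device. Assumed values only.
    struct ForceEstimator {
        let movingMass: Double        // kg, magnet assembly plus frame
        let springConstant: Double    // N/m, combined compliant elements

        func estimateForce(acceleration: Double, displacement: Double) -> Double {
            movingMass * acceleration + springConstant * displacement
        }
    }

    // Example with made-up numbers: a 3 g moving mass and 500 N/m of spring stiffness,
    // with 20 m/s^2 of inferred acceleration over 0.2 mm of travel.
    let estimator = ForceEstimator(movingMass: 0.003, springConstant: 500.0)
    print(estimator.estimateForce(acceleration: 20.0, displacement: 0.0002))  // ≈ 0.16 N
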
  • a compliant structure 320 can be positioned between the input device 108 and the enclosure 102 to allow travel between the input device 108 and the enclosure 102 and to return the input device 108 to a resting position. In one embodiment, the compliant structure 320 is positioned around an interior perimeter of the input device 108 . In other embodiments, one or more discrete compliant structures 320 may be positioned around an interior perimeter of the input device 108 .
  • the magnet assembly 300 and the coil assembly 304 can be used as an input sensor.
  • the shaft 302 , the magnet assembly 300 , and the frame 306 can move inward, which in turn induces a current (“input device signal”) in the coil assembly 304 .
  • a processing device (not shown) operably connected to the coil assembly 304 may receive or be responsive to the input device signal and cause a haptic output signal to be transmitted to the coil assembly 304 .
  • the haptic output signal is transmitted along the length of a wire in a coil in the coil assembly 304 , which in turn produces a magnetic field that causes the frame 306 and the magnets 300 a, 300 b to move and produce a haptic output (an applied force, movement, and/or vibration).
  • the movement, vibration, and/or applied force may be perceived by a user as haptic feedback.
  • the input action is sensed through the movement of the frame 306 and the magnets 300 a, 300 b with respect to the coil assembly 304 , and a haptic output is produced by the movement of the frame 306 and the magnets 300 a, 300 b with respect to the coil assembly 304 .
  • a discrete input sensor can be included in the electronic device.
  • the compliant structure 320 can be formed as a force sensing layer configured to detect an amount of force applied to the input device 108 .
  • the force sensing layer can include two electrode layers separated by a dielectric or compliant material (e.g., air, foam, silicone, and the like). Each electrode layer can include one or more electrodes that are aligned in at least one direction to produce one or more capacitors.
  • when a force is applied to the input device 108 (e.g., when a user presses the input device 108 ), the distance between the electrodes in at least one capacitor changes, which changes a capacitance of the capacitor.
  • a processing device (not shown) can receive or be responsive to a signal from each capacitor representing the capacitance of that capacitor.
  • the processing device may be configured to correlate the signal(s) to an amount of force that was applied to the input device 108 .
  • the force sensing layer provides for a range of force input values that can be used to control a variety of functions. For example, a user can press the input device with a first force to perform a scrolling action at a first speed and press the input device with a second force to perform a scrolling action at a different second speed (e.g., a faster speed).
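  • As a hedged sketch of the readout just described, a parallel-plate approximation relates the measured capacitance to the gap between the electrode layers, the gap change to an applied force, and the force to a scroll speed; the constants and the speed mapping below are invented for illustration.

    /// Parallel-plate approximation for one capacitor in the force sensing layer:
    /// C = ε·A/d, so a smaller gap d (more applied force) gives a larger capacitance.
    /// All constants are assumed values.
    struct ForceSensingCapacitor {
        let epsilon = 8.85e-12 * 3.0      // permittivity of an assumed dielectric, F/m
        let plateArea = 4.0e-6            // m^2
        let restingGap = 50.0e-6          // m
        let layerStiffness = 200_000.0    // N/m, compression stiffness of the layer

        func capacitance(gap: Double) -> Double { epsilon * plateArea / gap }

        /// Infers the applied force from the measured capacitance via the implied gap change.
        func force(measuredCapacitance: Double) -> Double {
            let gap = epsilon * plateArea / measuredCapacitance
            return layerStiffness * max(0.0, restingGap - gap)
        }
    }

    /// The two-speed scrolling behavior described above: a firmer press scrolls faster.
    func scrollSpeed(for force: Double) -> Double {
        force < 1.0 ? 1.0 : 3.0
    }

    // Example: a 10 µm compression reads as about 2 N, which selects the faster speed.
    let sensor = ForceSensingCapacitor()
    let estimatedForce = sensor.force(measuredCapacitance: sensor.capacitance(gap: 40.0e-6))
    print(estimatedForce, scrollSpeed(for: estimatedForce))  // ≈ 2.0, 3.0
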
  • a different type of input sensor can be used.
  • the input sensor can be configured to detect any suitable characteristic or property.
  • the input sensor may be an image sensor, a light or optical sensor, a proximity sensor, a magnet, a biometric sensor, a touch sensor, an accelerometer, and so on.
  • the input sensor can include one or more strain gauges, a tactile or reed switch, or a capacitive touch sensor.
  • the capacitive touch sensor may include a first electrode disposed within the enclosure 102 adjacent the input device 108 and a second electrode attached to or embedded in the input device 108 .
  • FIG. 4 depicts a simplified cross-sectional view of a second example of the electronic device taken along line B-B in FIG. 1 . Similar to the previous examples, the electronic device includes a haptic engine 200 , the description of which is provided above with respect to FIG. 3 .
  • the input device 106 , in this case a crown, is configured to receive both translational and rotational input actions from a user.
  • An optical encoder 402 can be used to determine an amount of rotation for a rotational input.
  • the optical encoder 402 can convert the angular motion of the shaft 400 to an analog or digital code.
  • the shaft 400 includes a pattern (not shown) formed in or on the shaft 400 that selectively reflects light toward an optical sensor (not shown). The reflected light is used to determine the amount of rotation.
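  • One common way such a reflective pattern is read out is as a two-channel quadrature signal, from which both the amount and the direction of rotation can be recovered; the decoder below is a generic illustration of that idea, and the channel arrangement and counts-per-revolution value are assumptions rather than details from the specification.

    /// Generic quadrature decoding: two offset light/dark channels read from the shaft
    /// pattern yield both the amount and the direction of rotation.
    struct QuadratureDecoder {
        var lastState: (a: Bool, b: Bool) = (false, false)
        var count = 0
        let countsPerRevolution = 96   // assumed pattern resolution

        /// Feed one sample of the two reflected-light channels; the count steps up or
        /// down depending on which channel changed, i.e., the direction of rotation.
        mutating func update(a: Bool, b: Bool) {
            if a != lastState.a {
                count += (a == b) ? -1 : 1
            } else if b != lastState.b {
                count += (a == b) ? 1 : -1
            }
            lastState = (a, b)
        }

        /// Accumulated rotation in degrees.
        var degrees: Double { Double(count) * 360.0 / Double(countsPerRevolution) }
    }

    // Example: two successive samples in the same rotational direction.
    var decoder = QuadratureDecoder()
    decoder.update(a: true, b: false)
    decoder.update(a: true, b: true)
    print(decoder.degrees)   // 7.5 degrees for the assumed 96 counts per revolution
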
  • the input device 106 is operably coupled to the haptic engine 200 .
  • the haptic engine includes a magnet assembly 300 coupled to a frame 306 , which is coupled to a shaft 400 .
  • the shaft 400 may be formed from one or more components that are fixed with respect to each other or may be separated to allow for decoupling of the haptic engine 200 from other elements of the device.
  • a coil 304 is positioned adjacent to the magnet assembly 300 and is configured to produce a current in response to a movement of the magnet assembly 300 .
  • the coil 304 may also induce movement of the magnet assembly 300 when driven by a current or electrical signal.
  • the haptic engine 200 is mechanically or structurally coupled to the input device 106 such that motion of the input device 106 may be mechanically transferred to the haptic engine 200 and, similarly, motion produced by the haptic engine 200 may be mechanically transferred to the input device 106 .
  • the haptic engine 200 is configured to detect a translational input action (e.g., a press) associated with the input device 106 and produce a haptic output based on a detected input action.
  • the haptic output may be applied to an exterior surface of the input device 106 (e.g., crown) and/or to another region or exterior surface of the electronic device.
  • the haptic engine 200 provides a haptic output based on a rotational input action.
  • the optical encoder 402 may produce an input device signal when the input device 106 is rotated.
  • the processing unit may receive or be responsive to the input device signal and, in turn, cause a haptic output signal to be transmitted to the haptic engine 200 .
  • the haptic output signal may cause the haptic engine 200 to produce a haptic output that can be perceived by a user as haptic feedback indicating the rotational input action has been received by the electronic device.
  • the positions of the haptic engine 200 and the optical encoder 402 shown in FIG. 4 are for illustration purposes only.
  • the optical encoder 402 may be situated at any location adjacent the shaft 400 to allow the optical encoder 402 to emit light toward the shaft 400 and receive the light reflected by the shaft 400 .
  • FIG. 5 shows a simplified cross-sectional view of a third example of the electronic device taken along line A-A in FIG. 1 .
  • the haptic engine 200 is mechanically or structurally coupled to the input device 108 .
  • the coil assembly 304 is movably coupled or positioned about the shaft 302 and the magnet assembly 300 is attached to the housing 314 .
  • a current can be applied to the coil assembly 304 to move the frame 306 and the coil assembly 304 in one direction or in an oscillating manner (as represented by arrow 500 ) to produce a haptic output within the enclosure 102 .
  • the haptic output may be applied directly to the input device 108 through the contact area 316 of the shaft 302 .
  • a vibrational haptic output produced by the moving magnet assembly 300 and the frame 306 can be transferred through the contact area 316 to the input device 108 .
  • a haptic engine can be configured to operate in two or more modes. For example, in a first mode, the haptic engine may be positioned at a first position to apply a haptic output directly to the interior surface of an input device (e.g., input device 108 ). In a second mode, the haptic engine can be positioned at a second position to produce a haptic output within the electronic device and/or to apply a haptic output to one or more non-input-device surfaces or regions of the electronic device.
  • FIGS. 6 A- 7 B illustrate two embodiments where the haptic engine operates in two modes.
  • FIGS. 6 A- 6 B depict a simplified cross-sectional view of a fourth example of the electronic device taken along line A-A in FIG. 1 .
  • the haptic engine can be positioned in two different positions.
  • the shaft 600 includes a body 601 and a contact area 602 .
  • the body 601 and the contact area 602 may be fixed to each other or may be configured to separate to decouple the haptic engine from the input device 108 .
  • the position of the body 601 and the contact area 602 is adjustable via a biasing mechanism. Any suitable biasing mechanism can be used.
  • the biasing mechanism includes a magnet 604 attached to, or embedded in, the end of the shaft 600 that is opposite the contact area 602 .
  • An electromagnet 606 can be positioned within the enclosure 102 a given distance from the magnet 604 .
  • a current can be received by the electromagnet 606 that produces an attracting or repelling magnetic field with respect to the magnet 604 , depending on whether the contact area 602 is to be moved away from or toward the interior surface of the input device 108 .
  • the shaft 600 can be situated in one of three or more positions using the electromagnet 606 and the magnet 604 .
  • one or more magnets and electromagnets can be included in, or attached to, the contact area 602 and the input device 108 , respectively.
  • one or more electromagnets may be included in, or attached to, the input device 108 .
  • the electromagnet(s) are used to produce a magnetic field that attracts or repels the magnet assembly 300 .
  • the electromagnet(s) can be activated to move the magnet assembly 300 to a given position.
  • the one or more electromagnets are deactivated when the magnet assembly 300 is at the given position.
  • a current applied to the coil assembly 304 can then be used to move the magnet assembly 300 and produce a haptic output.
  • the biasing mechanism may be a mechanical switch that is configured to position the shaft 600 in at least two different positions. For example, the switch may adjust the position of a movable arm that is attached to the shaft 600 .
  • a current can be applied to the coil assembly 304 to move the frame 306 and the magnet assembly 300 in one direction or in an oscillating manner (as represented by arrow 608 ) along the body 601 to produce haptic output within the enclosure 102 .
  • the haptic output is not applied or transferred directly to the input device 108 because the contact area 602 is not in contact with the input device 108 .
  • the biasing mechanism adjusts the position of the shaft 600 so that the contact area 602 contacts the interior surface of the input device 108 (see FIG. 6 B ).
  • a current is applied to the coil assembly 304 to move the magnet assembly 300 and the frame 306 in one direction or in an oscillating manner (in two opposing directions) along the body 601 .
  • the haptic output is applied (or the momentum of the haptic output is transferred) directly to the input device 108 through the contact area 602 .
  • the haptic output created by the moving mass of the frame 306 and the magnet assembly 300 may be perceived by a user as haptic feedback on the input device 108 .
  • FIGS. 7 A- 7 B show an alternate haptic engine that can be situated in multiple positions.
  • the width (W) of the coil assembly 700 is greater than the width of the magnet assembly 300 .
  • the coil assembly 700 can include one or more coils.
  • a single coil having a width (W) can be used in some embodiments.
  • two coils having a combined width (W) can be positioned side-by-side about the shaft 302 and magnet assembly 300 .
  • Each coil can be energized independently to move the frame 306 and the magnet assembly 300 to a given position along the shaft 302 and to produce a haptic output once the magnet assembly 300 and the frame 306 are situated at the given location.
  • both coils may be energized to move the magnet assembly 300 and the frame 306 to produce the haptic output.
  • the shaft 302 includes a contact area 316 that is attached to the input device 108 and a collar 310 that is disengagably coupled to the frame 306 .
  • a current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 to move in a direction away from the input device 108 .
  • the magnet assembly 300 can be positioned at a first location within the housing 314 ( FIG. 7 A ). In this position, the collar 310 of the shaft 302 is disengaged with respect to the frame 306 .
  • the magnet assembly 300 After the magnet assembly 300 is situated at the first position, another current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 and the frame 306 to move in one direction or in an oscillating manner (in two opposing directions) to produce the haptic output.
  • the magnet assembly 300 can move a length L 1 along the shaft 302 when moving in an oscillating manner to produce the haptic output within the enclosure 102 .
  • a current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 to move in a direction toward the input device 108 .
  • the magnet assembly may be positioned at a second location within the housing 314 ( FIG. 7 B ). In this position, the collar 310 of the shaft 302 is engaged with the frame 306 . After the magnet assembly 300 is situated at the second position, another current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 and the frame 306 to move in one direction or in an oscillating manner (in two opposing directions) to produce the haptic output.
  • the magnet assembly 300 can move a length L 1 when moving in an oscillating manner to produce the haptic output.
  • FIG. 8 depicts one example of the coil and magnet assemblies that are suitable for use in the haptic engines shown in FIGS. 2 - 7 .
  • the first and second magnets 300 a, 300 b of the magnet assembly 300 are positioned about the shaft 302 .
  • the magnets 300 a , 300 b have opposite polarities.
  • the north pole of magnet 300 a can be adjacent to the south pole of magnet 300 b.
  • a coil assembly 304 is formed with a coil 800 that encircles the magnets 300 a, 300 b .
  • the magnets 300 a, 300 b move in a direction aligned with the shaft 302 when a haptic output signal is transmitted through the coil 800 .
  • the coil 800 can be stationary or move with respect to the magnets 300 a, 300 b. Additionally, a width of the coil 800 can be greater than, less than, or substantially the same as the width of the magnet assembly 300 .
  • FIG. 9 shows a flowchart of a method of operating an electronic device.
  • the method of FIG. 9 may be applied using, for example, haptic engines described above with respect to FIGS. 2 - 8 .
  • the method of FIG. 9 may be used to operate a haptic engine in two (or more) modes.
  • a first mode may allow the haptic output to be applied directly to an exterior surface of the input device.
  • a second mode may allow the haptic output to be delivered via another exterior surface of the device.
  • the operations of the method of FIG. 9 may be performed by a processing unit or other logical element of the electronic device.
  • the device may determine whether a haptic output is to be produced in response to a user input (received at an input device) or in response to another event.
  • Other events include, for example, notifications, alerts, alarms, and other events that may be signaled to a user. If the determination is negative, the method returns to block 900 .
  • the device makes a determination as to whether a haptic engine or haptic device should be adjusted. For example, the mode of the haptic engine or the haptic device may be changed so that a shaft or other element of the haptic engine engages the input device. Additionally or alternatively, one or more characteristics of a haptic output signal that is received by the haptic engine or the haptic device can be adjusted. For example, a frequency, amplitude, and/or a phase of the haptic output signal may be changed. Adjusting one or more of the characteristics of the haptic output signal can adjust the magnitude and/or the type of haptic output. If the haptic engine or haptic device will not be adjusted, the method passes to block 904 where the haptic output is produced.
  • the haptic output of block 908 may correspond to a first mode in which a localized haptic output is delivered to the input device.
  • the localized haptic output may be concentrated or focused on an exterior surface of the input device and used to provide haptic feedback or acknowledgement of a user action received using the input device.
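  • The flow just described can be sketched as follows; the block references follow the description of FIG. 9, while the parameter names and types are placeholders rather than a definitive implementation.

    /// Events that may trigger a haptic output (block 900).
    enum HapticTrigger {
        case userInput            // received at an input device
        case notification, alert, alarm
    }

    /// Optional adjustments made before the output is produced.
    struct HapticAdjustment {
        var engageInputDevice = false   // change the engine's mode so it engages the input device
        var frequency: Double? = nil    // Hz; nil leaves the haptic output signal unchanged
        var amplitude: Double? = nil
        var phase: Double? = nil
    }

    protocol AdjustableHapticEngine {
        func setEngaged(_ engaged: Bool)
        func produceOutput(frequency: Double?, amplitude: Double?, phase: Double?)
    }

    /// One pass through the method (illustrative only).
    func handle(trigger: HapticTrigger?,
                adjustment: HapticAdjustment?,
                engine: any AdjustableHapticEngine) {
        // Block 900: if no haptic output is to be produced, keep waiting.
        guard trigger != nil else { return }

        if let adjustment = adjustment {
            // Adjust the engine and/or the characteristics of the haptic output signal first.
            engine.setEngaged(adjustment.engageInputDevice)
            // Block 908 (first mode): a localized haptic output delivered to the input device.
            engine.produceOutput(frequency: adjustment.frequency,
                                 amplitude: adjustment.amplitude,
                                 phase: adjustment.phase)
        } else {
            // Block 904: produce the haptic output without adjustment.
            engine.produceOutput(frequency: nil, amplitude: nil, phase: nil)
        }
    }
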
  • FIG. 10 is an illustrative block diagram of the electronic device shown in FIG. 1 .
  • the electronic device 100 can include the display 104 , one or more processing units 1000 , memory 1002 , one or more I/O devices 1004 , one or more sensors 1006 , a power source 1008 , a network communications interface 1010 , and an input device 1012 that includes a haptic engine 1014 .
  • the processing unit(s) 1000 can control or coordinate some or all of the operations of the electronic device 100 .
  • the processing unit(s) 1000 can communicate, either directly or indirectly, with substantially all of the components of the electronic device 100 .
  • a system bus or signal line 1016 or other communication mechanism can provide communication between the processing unit(s) 1000 , the memory 1002 , the I/O device(s) 1004 , the sensor(s) 1006 , the power source 1008 , the network communications interface 1010 , and/or the input device 1012 and haptic engine 1014 .
  • the one or more processing units 1000 can be implemented as any electronic device capable of processing, receiving, or transmitting data, instructions, and/or program code.
  • the processing unit(s) 1000 can each be a microprocessor, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), or combinations of such devices.
  • the term “processing unit” is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, or other suitably configured computing element or elements.
  • the processing unit(s) 1000 and the processing device 202 are the same processing unit.
  • data processing, inputs, and outputs can be distributed between the processing unit(s) 1000 and the processing device 202 .
  • the processing unit(s) 1000 is configured to receive or be responsive to an input device signal from the input device 1012 .
  • the input device signal indicates that an input action associated with the input device 1012 has occurred.
  • the processing unit(s) 1000 is configured to cause a haptic output signal to be transmitted to a coil assembly in the haptic engine 1014 .
  • the haptic engine 1014 produces a haptic output based on the haptic output signal.
  • The memory 1002 can store electronic data that can be used by the electronic device 100 and instructions and/or program code that is executed by the processing unit(s) 1000.
  • A memory can store electrical data or content such as, for example, audio and video files, documents and applications, device settings and user preferences, timing and control signals or data for the haptic device 1012 (or one or more components included therein), data structures or databases, and so on.
  • The memory 1002 can be configured as any type of memory. By way of example only, the memory can be implemented as random access memory, read-only memory, Flash memory, removable memory, or other types of storage elements, or combinations of such devices.
  • The one or more I/O devices 1004 can transmit and/or receive data to and from a user or another electronic device.
  • The I/O device(s) 1004 can include a touch sensing input surface such as a track pad, one or more buttons, one or more microphones or speakers, one or more ports such as a microphone port, and/or a keyboard.
  • The electronic device 100 may also include one or more sensors 1006 positioned substantially anywhere on the electronic device 100.
  • The sensor(s) 1006 may be configured to sense substantially any type of characteristic, such as, but not limited to, images, pressure, light, touch, force, biometric data, temperature, position, motion, and so on.
  • The sensor(s) 1006 may be an image sensor, a temperature sensor, a light or optical sensor, an atmospheric pressure sensor, a proximity sensor, a humidity sensor, a magnet, a gyroscope, a biometric sensor, an accelerometer, and so on.
  • The power source 1008 can be implemented with one or more devices capable of providing energy to the electronic device 100.
  • The power source 1008 can be one or more batteries or rechargeable batteries.
  • The power source 1008 may be a connection cable that connects the electronic device to another power source, such as a wall outlet or another electronic device.
  • The network communications interface 1010 can facilitate transmission of data to or from other electronic devices.
  • The network communications interface 1010 can transmit electronic signals via a wireless and/or wired network connection.
  • Wireless and wired network connections include, but are not limited to, cellular, Wi-Fi, Bluetooth, infrared, and Ethernet.
  • The input device 1012 can be any suitable input device that is configured to provide haptic feedback to a user in response to an input action.
  • For example, the input device 1012 can be an input button, a crown, a section of the enclosure, and/or a display.
  • The haptic engine 1014 can be implemented as any suitable device configured to provide force feedback, vibratory feedback, tactile sensations, and the like.
  • For example, the haptic engine 1014 can be implemented as an electromagnetic actuator (e.g., a linear actuator) configured to provide punctuated haptic feedback, such as a tap or a knock.
  • The electromagnetic actuator may be configured to translate in two directions to provide vibratory haptic feedback.
  • FIG. 10 is exemplary only. In other examples, an electronic device may include fewer or more components than those shown in FIG. 10. Additionally or alternatively, an electronic device can be included in a system in which one or more components shown in FIG. 10 are separate from the electronic device but in communication with the electronic device. For example, an electronic device may be operatively connected to, or in communication with, a separate display. As another example, one or more applications or data can be stored in a memory separate from an electronic device. In some embodiments, the separate memory can be in a cloud-based system or in an associated electronic device.

Abstract

An electronic device is configured to provide haptic feedback to a user based on an input action associated with an input device. The electronic device includes a haptic engine operably connected to a processing device. The haptic engine includes an electromagnetic actuator that detects an input action associated with the input device. The electromagnetic actuator also produces a haptic output in response to the detection of the input action.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 17/145,115, filed Jan. 8, 2021, which is a continuation of U.S. Nonprovisional patent application Ser. No. 16/800,723, filed Feb. 25, 2020, now U.S. Pat. No. 10,890,978, which is a continuation of U.S. Nonprovisional patent application Ser. No. 15/366,674, filed Dec. 1, 2016, now U.S. Pat. No. 10,585,480, which is a nonprovisional patent application of and claims the benefit of U.S. Provisional Patent Application No. 62/334,036, filed May 10, 2016, the disclosures of which are hereby incorporated by reference herein in their entirety.
  • FIELD
  • The described embodiments relate generally to electronic devices. More particularly, the present embodiments relate to an electronic device that includes a haptic engine that detects an input action associated with an input device and provides haptic feedback based on the detected input action.
  • BACKGROUND
  • Portable electronic devices have become increasingly popular, and the features and functionality provided by portable electronic devices continue to expand to meet the needs and expectations of many consumers. For example, some portable electronic devices include features such as touch sensors, a display, various input devices, speakers, and microphones. In some cases, the electronic device may take on a small form factor. In such cases, it can be challenging to fit all of the components needed to provide the various functionalities into the limited space of the electronic device.
  • SUMMARY
  • Embodiments disclosed herein provide an electronic device that is configured to provide haptic feedback to a user based on an input action associated with an input device. A haptic engine is configured to detect an input action associated with the input device (e.g., a translational input) and produce a haptic output based on the detected input action. The haptic output may be perceived by a user as haptic feedback. The haptic feedback can indicate to the user that the user input has been received by the electronic device.
  • In some embodiments, an input device includes a haptic engine operably connected to or mechanically coupled to an input surface of the input device and to a processing device. The haptic engine may include an electromagnetic actuator, such as a linear actuator, that detects a user input or input action associated with the input surface and provides a haptic output based on the detected input action. The electromagnetic actuator includes a magnet assembly and a coil assembly adjacent to the magnet assembly. For example, the coil assembly can at least partially surround the magnet assembly. The haptic engine detects the input action based on a first movement between the magnet assembly and the coil assembly with respect to each other, the first movement inducing an input device signal in the coil assembly. The processing device is configured to receive or respond to the input device signal and to responsively cause a haptic output signal to be transmitted to the haptic engine. The haptic output signal produces a second movement between the magnet assembly and the coil assembly with respect to each other to produce a haptic output that is applied or transferred to the input surface.
  • In some embodiments, the input device is an input button in an electronic watch (e.g., a smart watch). The electronic watch also includes a display and a processing device operably connected to the display and to the electromagnetic actuator. An input action (e.g., a translational input action) received by the input button causes the processing device to receive or respond to an input device signal from the electromagnetic actuator and a haptic output signal to be transmitted to the electromagnetic actuator. The display is configured to display a user interface screen associated with an application program, and the input action also causes a change in the user interface screen displayed on the display. For example, an icon may be selected and a different user interface screen displayed based on the selected icon, or the digits in the time displayed on the display are changed based on the input action.
  • In some embodiments, an electronic device includes an input device configured to receive a user input, a haptic device operably connected to the input device, and a processing device operably connected to the haptic device. The processing device is configured to receive or respond to an input device signal from the haptic device based on the user input. In response to the input device signal, the processing device is configured to cause a haptic output signal to be transmitted to the haptic device. The haptic output signal causes the haptic device to produce a haptic output.
  • In some embodiments, an electronic device includes an input device configured to receive a user input and a haptic engine operably connected to the input device. The haptic engine is configured to detect the user input and produce a haptic output based on the detected user input. The haptic engine is further configured to operate in a first mode in which the haptic engine engages the input device, and in a second mode in which the haptic engine is not engaged with the input device.
  • In some embodiments, an electronic watch includes an electromagnetic actuator operably connected to an input button, and a processing device operably connected to the electromagnetic actuator. The electromagnetic actuator includes a magnet assembly and a coil assembly adjacent the magnet assembly. The electromagnetic actuator is configured to detect an input action (e.g., a translational input action) provided to the input button based on a first movement between the magnet assembly and the coil assembly. The first movement induces an input device signal in the coil assembly. The processing device is configured to receive or respond to the input device signal and to cause a haptic output signal to be transmitted to the electromagnetic actuator to cause a second movement between the magnet assembly and the coil assembly to produce a haptic output. The haptic output may be applied to the input button and/or to another region or surface of the electronic device.
  • In some embodiments, an electronic watch includes a linear actuator operably connected to a crown and a processing device operably connected to the linear actuator. The linear actuator includes a magnet assembly movably coupled to a shaft and a coil assembly adjacent the magnet assembly. The linear actuator is configured to detect an input action (e.g., a translational input action) provided to the crown based on a first movement between the magnet assembly and the coil assembly, the first movement inducing an input device signal. The processing device is configured to receive or respond to the input device signal and to cause a haptic output signal to be transmitted to the linear actuator. The haptic output signal causes a second movement between the magnet assembly and the coil assembly to produce a haptic output. The haptic output may be applied to the crown and/or to another region or surface of the electronic device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
  • FIG. 1 shows one example of an electronic device that can include a haptic engine configured to provide haptic output based on an input action associated with an input device;
  • FIG. 2 depicts a simplified schematic of the electronic device taken along line A-A in FIG. 1 ;
  • FIG. 3 shows a schematic cross-sectional view of a first example of the electronic device taken along line A-A in FIG. 1 ;
  • FIG. 4 depicts a simplified cross-sectional view of a second example of the electronic device taken along line B-B in FIG. 1 ;
  • FIG. 5 shows a simplified cross-sectional view of a third example of the electronic device taken along line A-A in FIG. 1 ;
  • FIGS. 6A-6B show a simplified cross-sectional view of a fourth example of the electronic device taken along line A-A in FIG. 1 ;
  • FIGS. 7A-7B depict a simplified cross-sectional view of a fifth example of the electronic device taken along line A-A in FIG. 1 ;
  • FIG. 8 depicts one example of the coil and magnet assemblies that are suitable for use in the haptic devices shown in FIGS. 2-7 ;
  • FIG. 9 shows a flowchart of a method of operating an electronic device; and
  • FIG. 10 is an illustrative block diagram of the electronic device shown in FIG. 1 .
  • The cross-hatching in the figures is provided to distinguish the elements or components from one another. The cross-hatching is not intended to indicate a type of material or materials or the nature of the material(s).
  • DETAILED DESCRIPTION
  • Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following descriptions are not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
  • The following disclosure relates to an electronic device that is configured to provide haptic feedback to a user. In general, a haptic device may be configured to produce a mechanical movement or vibration that may be transmitted through the enclosure and/or an input device of the electronic device. In some cases, the movement or vibration is transmitted to the skin of the user and perceived as a stimulus or haptic feedback by the user. In some embodiments, the haptic feedback may be coupled to an input action on an input device. One example of an input action is the pressing of an input button. The haptic feedback can indicate to a user that the input action has been received or registered by the input device and/or the electronic device.
  • In a particular embodiment, the electronic device includes an input device and a haptic engine operably connected to the input device. A haptic engine may include an electromechanical assembly that is capable of producing a change in momentum using a moving mass that results in a haptic output. In the embodiments described herein, the haptic engine is configured to function as both an input sensor and a haptic device. In particular, the input sensor may be integrated within the haptic device in that shared electromechanical components produce and receive the signals of both the haptic device and the input sensor. For example, when the haptic device is an electromagnetic actuator, an input action (e.g., button press) can cause the magnet assembly and the coil assembly to move with respect to each other. This movement induces a current (or “input device signal”) in the coil assembly. The input device signal indicates that an input action associated with the input device has occurred. A processing device may be responsive to the input device signal, and may cause a haptic output signal to be transmitted to the coil assembly. The haptic output signal may cause the haptic device to produce a haptic output.
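  • As an illustration of this dual role, the following sketch shows one way a single coil might be polled for an induced (back-EMF) signal and then driven with a haptic output signal; the threshold value and the read_coil_voltage/drive_coil hooks are hypothetical placeholders, not part of the disclosure.

```python
PRESS_THRESHOLD_V = 0.05  # hypothetical back-EMF level treated as an input action


def read_coil_voltage():
    """Placeholder for an ADC read of the voltage induced in the coil assembly."""
    return 0.0  # replace with real hardware access


def drive_coil(waveform):
    """Placeholder for transmitting a haptic output signal to the coil assembly."""
    pass  # replace with real hardware access


def haptic_feedback_pulse():
    """A brief alternating drive signal that would be perceived as a tap."""
    return [0.8 if i % 2 == 0 else -0.8 for i in range(16)]


def sense_then_respond():
    """One pass of the cycle in which the same coil both senses and actuates."""
    induced = read_coil_voltage()            # movement of the magnet assembly induces this signal
    if abs(induced) >= PRESS_THRESHOLD_V:    # treat it as an input action (e.g., a button press)
        drive_coil(haptic_feedback_pulse())  # respond with a haptic output signal
        return True
    return False
```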
  • As used herein, the term “input action” may be construed broadly to include any type of user input associated with an input device, or with one or more components in an input device. Example input actions include, but are not limited to, a translational input, touch input, force input, motion input, acceleration input, pressure input, velocity input, rotational input, and combinations thereof. In a non-limiting example, an input device can be an input button in an electronic device, and one input action associated with the input button is a button press or translational input. The button press may cause the input button, or components within the input button, to translate or move in the same direction as the direction of the button press (e.g., horizontal direction).
  • Additionally or alternatively, an input action can include a force input where an amount of force, or varying amounts of force, is applied to an input device. In such embodiments, a processing device operably connected to the haptic engine is configured to detect the applied force on the input device. Additionally or alternatively, the processing device can be configured to detect motion or a rotation of the input device (or of a component in the input device). In a non-limiting example, the input device may be a crown of an electronic watch (e.g., a smart watch) that a user can rotate to provide one or more inputs to the smart watch.
  • In general, a haptic engine may produce one or more types of haptic output, such as vibrations, an applied force, movement, and combinations thereof. The haptic output may be transmitted through the enclosure and/or an input device of the electronic device and detected by a user. In some cases, the movement, force, and/or vibration is transmitted to the skin of the user and perceived as a stimulus or haptic feedback by the user. In one non-limiting example, a user can press an input button and the haptic engine can apply a force to the input button in a direction opposite the direction of the button press. The applied force may be perceived by the user as a “tap” or “knock” that indicates the input device and/or the electronic device registered the button press. Alternatively, the haptic engine may move a mass in one direction or in opposing directions in response to the button press. The movement may be perceived by the user as a vibration that indicates the input device and/or the electronic device registered the button press.
  • In some embodiments, the position of the haptic engine can be adjusted so that a haptic output can be applied to different regions of an electronic device. For example, a haptic engine may be positioned at a first position to apply a haptic output directly to an input device (e.g., an exterior surface of an input button). The haptic engine may also be positioned at a second position to apply a haptic output to a different region of the electronic device (e.g., an exterior surface of an enclosure). The position of the haptic engine can be adjusted using any suitable method. For example, in one embodiment an electromagnet or switch may position the haptic device.
  • Directional terminology, such as “top”, “bottom”, “front”, “back”, “leading”, “trailing”, etc., is used with reference to the orientation of the Figure(s) being described. Because components of embodiments described herein can be positioned in a number of different orientations, the directional terminology is used for purposes of illustration only and is in no way limiting. When used in conjunction with the components of an input device and of an electronic device, the directional terminology is intended to be construed broadly, and therefore should not be interpreted to preclude the presence of one or more intervening components or other intervening features or elements.
  • These and other embodiments are discussed below with reference to FIGS. 1-10 . However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these Figures is for explanatory purposes only and should not be construed as limiting.
  • FIG. 1 shows one example of an electronic device that can include a haptic engine configured to produce haptic output based on an input action associated with an input device. In the illustrated embodiment, the electronic device 100 may be implemented as an electronic or smart watch that is adapted to be worn by a user. A different type of electronic device can be used in other embodiments. For example, the electronic device can be a gaming device, a digital music player, a sports accessory device, a medical device, a health assistant, a tablet computing device, a notebook computer, a smart phone, and other types of electronic devices that provide, or are suitable to provide, haptic feedback to a user.
  • The electronic device 100 includes input devices 106, 108. In some embodiments, one or both of the input devices 106, 108 may be configured as input/output devices. The term “input device” is intended to be construed broadly to include both input and input/output devices. An input device may include an input component, such as a button, knob, dial, crown, and the like. Although shown on a side of the electronic device 100, the input devices 106, 108 can be positioned substantially anywhere on the electronic device 100.
  • As will be described in more detail later, the electronic device 100 includes at least one haptic engine (see e.g., FIG. 2 ) operably connected to one or both input devices. The haptic engine is configured to detect an input action associated with one or both input devices 106, 108 and provide haptic feedback to a user when an input action is detected. The haptic engine functions as both an input sensor and a haptic device. In some embodiments, at least some of the components of the haptic engine can be used as the input sensor. For example, when the haptic engine is an electromagnetic actuator (e.g., a linear actuator), an input action (e.g., a translation of input device 108) can cause a magnet assembly and a coil assembly of the electromagnetic actuator to move with respect to each other. This movement induces a current (or “input device signal”) in the coil assembly. The input device signal may indicate that an input action associated with the input device has occurred. A processing device may be responsive to the input device signal and may, in turn, cause a haptic output signal to be transmitted to the coil assembly. The haptic output signal causes the electromagnetic actuator to produce a haptic output. The haptic output may be perceived by the user as haptic feedback that indicates the input action has been registered or entered by the electronic device 100 and/or the input device(s) 106, 108.
  • In the illustrated embodiment, the input device 106 is a crown and the input device 108 an input button. Input devices in other embodiments are not limited to these configurations. For example, an input device may be a rocker switch, a portion of the enclosure 102, one or more keys in a keyboard, a slide switch, a virtual icon or image on a display, or any other suitable input device.
  • The input device 106 (e.g., crown) is configured to receive translational and rotational input actions. For example, the input device 106 may include a shaft that extends into the electronic device 100. Pressing the input device 106 can cause the shaft, or components coupled to the shaft, to move or translate a given distance. Additionally or alternatively, the shaft may rotate when a user rotates the input device 106. The amount of shaft rotation can be detected and measured by an optical encoder positioned adjacent to the shaft. The amount of shaft rotation may be used as an input to the electronic device 100 and/or to an application program running on the electronic device 100.
  • One or more functions can be performed when the input device 106 is rotated and/or pressed. For example, if the display 104 of the electronic device 100 is displaying a time keeping application, the input device 106 may be rotated in either direction to change or adjust the position of the hands or the digits that are displayed for the time keeping application. Additionally or alternatively, the input device 106 may be rotated to move a cursor or other type of selection mechanism from a first displayed location to a second displayed location in order to select an icon or move the selection mechanism between various icons that are presented on the display 104. Additionally or alternatively, the input device 106 may be pressed to perform various functions, such as changing the image on a display, waking the electronic device 100 from a sleep state, and/or to select or activate an application. In some embodiments, the input device 106 can be rotated or pressed to disable an application or function. For example, the input device 106 may be pressed to disable an alert produced by an application on the electronic device 100 or received by the electronic device 100.
  • In some embodiments, the input device 108 (e.g., an input component or input button) can be configured to be pressed to cause various functions to be performed and/or disabled. The input device 108 may include a shaft that extends into the electronic device 100. Pressing the input device 108 can cause the shaft, or components coupled to the shaft, to move or translate a given distance. For example, a single press can activate an application and/or display a particular image or screen on the display. Additionally or alternatively, a single press may disable or delay an alert. A multiple press (e.g., a double press or double click) can activate an application and a component within the electronic device 100. For example, a double click may access an application that uses a wireless communication network to transmit data associated with the application (e.g., an electronic payment application). Additionally or alternatively, a press-hold may operate to turn on and turn off the electronic device 100 or to place the electronic device 100 in a power saving mode (e.g., a mode where minimal functions and applications operate and other applications and functions are disabled).
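  • For illustration, press patterns such as a single press, a double press, and a press-hold might be distinguished from timestamped press and release events as in the sketch below; the 0.4-second double-press window and 1-second hold threshold are assumed values, not taken from the disclosure.

```python
def classify_presses(press_times, release_times,
                     double_press_window=0.4, hold_threshold=1.0):
    """Classify button activity as 'single', 'double', or 'press-hold'.

    press_times and release_times are matching lists of timestamps in seconds.
    The 0.4 s double-press window and 1.0 s hold threshold are illustrative.
    """
    if not press_times:
        return None
    if len(press_times) >= 2 and (press_times[1] - press_times[0]) <= double_press_window:
        return "double"
    if release_times and (release_times[0] - press_times[0]) >= hold_threshold:
        return "press-hold"
    return "single"


assert classify_presses([0.0, 0.25], [0.1, 0.35]) == "double"
assert classify_presses([0.0], [1.5]) == "press-hold"
assert classify_presses([0.0], [0.1]) == "single"
```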
  • In some embodiments, pressing both of the input devices 106, 108 in various combinations can cause one or more functions to be performed. For example, pressing the input device 106 and then immediately pressing the input device 108 can cause an action to be performed on the electronic device 100. Additionally or alternatively, simultaneous press-holds on the input devices 106, 108 can cause another action to be performed on the electronic device 100.
  • The electronic device 100 further includes an enclosure 102 that forms an outer surface or partial outer surface for the internal components of the electronic device 100. The enclosure 102 defines openings and/or apertures that receive and/or support a display 104 and the input devices 106, 108. The enclosure 102 can be formed of one or more components operably connected together, such as a front piece and a back piece. Alternatively, the enclosure 102 can be formed of a single piece operably connected to the display 104. In the illustrated embodiment, the enclosure 102 is formed into a substantially rectangular shape, although this configuration is not required. For example, certain embodiments may include a substantially circular enclosure 102.
  • The display 104 can provide a visual output for the electronic device 100 and/or function to receive user inputs to the electronic device 100. For example, the display 104 may incorporate an input device configured to receive touch input, force input, temperature input, and the like. The display 104 may be substantially any size and may be positioned substantially anywhere on the electronic device 100. The display 104 can be implemented with any suitable display, including, but not limited to, a multi-touch sensing touchscreen device that uses a liquid crystal display (LCD) element, a light emitting diode (LED) element, an organic light-emitting display (OLED) element, or an organic electro luminescence (OEL) element.
  • FIG. 2 shows a simplified schematic of the electronic device taken along line A-A in FIG. 1 . The electronic device 100 can include a haptic engine 200 operably connected to a processing device 202. The haptic engine 200 is configured to detect an input action associated with the input device 108 and produce a haptic output based on a detected input action. As used herein, the input device 108 may also be referred to as an input component, which may include an input button, push button, or actuator. As described earlier, the haptic engine 200 may produce one or more types of haptic output, such as vibrations, an applied force, movement, and combinations thereof. The haptic output may be applied or transferred to at least one surface of the electronic device 100 and/or to the input device 108. For example, the haptic output can be transmitted through the input device and/or the enclosure and perceived as haptic feedback by a user.
  • For example, in one embodiment the haptic engine 200 is configured to apply a haptic output to a bottom surface of the enclosure 102 (e.g., the momentum of the haptic output can be transferred to the bottom surface). When a user is wearing the electronic device 100 on his or her wrist, the haptic output may be detected by the user as haptic feedback because the bottom surface of the electronic device 100 is in contact with the wrist. In other embodiments, the haptic output may be applied or transferred to a side of the electronic device 100, a top surface of the electronic device 100, multiple surfaces of the electronic device 100, and combinations thereof.
  • Additionally or alternatively, the haptic engine 200 may be configured to produce a haptic output that is applied or transferred to the input device 108. The haptic engine 200 may be mechanically or structurally coupled to the input surface 210 (of the input device 108) to receive movement from and/or transmit movement to the input surface 210. By mechanically coupling the haptic engine 200 to the input surface 210, movement of the input surface 210 results in movement of one or more components of the haptic engine 200. In one non-limiting example, when a user applies a force to the input surface 210 of the input device 108 (e.g., presses the input surface 210), the haptic engine 200 can detect such input action and produce a first signal (“input device signal”) that is received by the processing device 202. Based on the input device signal, the processing device 202 can cause a second signal (“haptic output signal”) to be transmitted to the haptic engine 200 that causes the haptic engine 200 to produce a haptic output (e.g., a vibration or an applied force). A user may then detect the haptic output as haptic feedback when the user's finger is in contact with the input surface 210.
  • In some embodiments, the haptic engine 200 can be configured to operate in two or more modes. For example, in a first mode, the haptic engine 200 may be positioned at a first position to apply a haptic output directly to the interior surface of an input device (e.g., input device 108). In a second mode, the haptic engine 200 can be positioned at a second position to produce a haptic output within the electronic device 100 and/or to apply to one or more non-input-device surfaces or regions of the electronic device 100.
  • Additionally or alternatively, a second haptic device 204 may be operably connected to the processing device 202. In such embodiments, the haptic engine 200 can produce a haptic output for the input device 108 while the second haptic device 204 may produce a haptic output for one or more different surfaces (non-input-device surfaces) or regions of the electronic device 100.
  • Any suitable type of haptic device can be used in the haptic engine 200 and/or the second haptic device 204. Example haptic devices include, but are not limited to, actuators, vibrators, and other types of motors. As described earlier, a haptic device and haptic engine may produce one or more types of haptic output, such as movement, vibrations, transfer of momentum, and other actions that may produce a perceptible or tactile output.
  • In some embodiments, an input sensor and a haptic device are separate components within the electronic device 100. In such embodiments, the input sensor can detect or sense an input action using any suitable sensing technology, such as capacitive, piezoelectric, piezoresistive, electromagnetic, ultrasonic, and magnetic sensing technologies. For example, in one embodiment a capacitive input sensor can be used to detect the presence of a user's finger on the input device. Additionally or alternatively, a capacitive sensor may be used to detect a user applying a force on the input device. For example, when the input device is an input button, the input sensor can detect the presence of a user's finger on the button and/or the user pressing the input button.
  • Example embodiments of a haptic engine will now be discussed. FIG. 3 shows a schematic cross-sectional view of a first example of the electronic device taken along line A-A in FIG. 1 . In the illustrated embodiment, the haptic engine 200 is mechanically coupled to the input device 108. In some examples, a mechanical coupling between the haptic engine 200 and the input device 108 facilitates a transfer of motion between the two components. In particular, the haptic engine 200 may be mechanically coupled to the input device 108 such that a translational input to the input device 108 is transferred or structurally communicated to the haptic engine 200. Additionally, translational (e.g., vibrational) output from the haptic engine 200 may be transferred or mechanically communicated to the input device 108.
  • In the example of FIG. 3 , the haptic engine 200 includes an electromagnetic actuator or linear actuator that uses a moving mass to create a haptic output (e.g., movement, applied force, and/or vibration). In one non-limiting example, the moving mass may be one or more magnets that move(s) in one direction or in an oscillating manner in response to a current passing through a coil that is adjacent to the magnet(s). The moving magnet(s) can produce a vibration or applied force that is perceived as haptic feedback by a user.
  • In the illustrated embodiment, the haptic engine 200 includes a magnet assembly 300 coupled to and/or movably positioned about a shaft 302. The magnet assembly 300 can include one or more magnets. In the illustrated embodiment, the magnet assembly 300 includes two magnets 300 a, 300 b of opposite polarities. The magnets 300 a, 300 b can be made of any suitable ferromagnetic material, such as neodymium. The shaft 302 may be formed from one or more components that are fixed with respect to each other or may be separated to allow for decoupling of the haptic engine 200 from other elements of the device. The shaft 302 can be made of a non-ferrous material such as tungsten, titanium, stainless steel, or the like.
  • A coil assembly 304 at least partially surrounds the magnet assembly 300 and/or the shaft 302. The coil assembly 304 includes one or more coils. Each coil can be formed with a winding of a conductive material, such as a metal. In one embodiment, the width of the coil assembly 304 can be less than or substantially equal to the width of the magnet assembly 300. In other embodiments, the width of the coil assembly 304 may be greater than the width of the magnet assembly 300.
  • In some embodiments, a frame 306 can be positioned at least partially around the coil assembly 304, the magnet assembly 300, and/or the shaft 302 to increase the momentum of the linear actuator. The frame 306 can be made of any suitable material. In one embodiment, the frame 306 is made of a metal, such as tungsten.
  • The coil assembly 304 and the magnet assembly 300 are positioned such that a first air gap separates the coil assembly 304 from the magnet assembly 300. Similarly, the coil assembly 304 and the frame 306 are positioned such that a second air gap separates the coil assembly 304 from the frame 306. In the illustrated embodiment, the first and second air gaps are located on opposing sides of the coil assembly 304.
  • In some embodiments, the frame 306 can be disengaged from the input device 108. The shaft 302 extends through a bearing 308 and a collar 310 which support the frame 306. The collar 310 allows the shaft 302 to pass the frame 306 in only one direction. For example, the collar 310 may permit the shaft 302 to only move in a direction away from the input device 108.
  • The coil assembly 304 may be energized by transmitting a current along a length of a wire that forms a coil in the coil assembly 304. A direction of the current along the wire of the coil determines a direction of a magnetic field that emanates from the coil assembly 304. The opposing polarities of the magnets 300 a, 300 b generate a radial magnetic field that interacts with the magnetic field of the coil assembly 304. The Lorentz force resulting from the interaction of the magnetic fields causes the frame 306 and the magnet assembly 300 to move in a first direction aligned with the axis of the shaft 302. Reversing the current flow through the coil assembly 304 reverses the Lorentz force. As a result, the magnetic field or force on the magnet assembly 300 is also reversed and the frame 306 and the magnet assembly 300 move in an opposing second direction. Thus, the frame 306 and the magnet assembly 300 can move in one direction or in an oscillating manner depending on the direction of the current flow through the coil assembly 304. In some embodiments, the frame 306 includes one or more magnets 312 that assist in moving the frame 306 and produce increased momentum when a current passes through the coil assembly 304.
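  • A toy simulation can make this behavior concrete: treating the moving mass as a damped spring-mass system driven by a Lorentz force proportional to the coil current shows that an alternating current produces an oscillation along the shaft axis. All parameter values below are hypothetical and chosen only for illustration.

```python
import math

# Hypothetical actuator parameters (not taken from the disclosure).
MASS = 0.002        # kg, moving mass (frame plus magnet assembly)
STIFFNESS = 400.0   # N/m, combined stiffness of the compliant elements
DAMPING = 0.05      # N*s/m
BL = 1.5            # N/A, force factor from the radial field and coil geometry


def simulate(drive_current, dt=1e-4, steps=2000):
    """Integrate m*x'' = BL*i(t) - k*x - c*v and return the displacement history (m)."""
    x, v, history = 0.0, 0.0, []
    for n in range(steps):
        force = BL * drive_current(n * dt) - STIFFNESS * x - DAMPING * v
        v += (force / MASS) * dt
        x += v * dt
        history.append(x)
    return history


# An alternating drive current reverses the Lorentz force each half cycle,
# so the mass oscillates back and forth along the shaft axis.
oscillation = simulate(lambda t: 0.2 * math.sin(2.0 * math.pi * 150.0 * t))
print(f"peak displacement: {max(abs(x) for x in oscillation):.4e} m")
```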
  • When a user provides an input action to the input button 108 (e.g., a button press), the shaft 302, the magnet assembly 300, and the frame 306 can move a given distance into the electronic device 100. This movement induces a current (“input device signal”) in the coil assembly 304. A processing device (e.g., processing device 202 in FIG. 2 ) can receive or be responsive to the input device signal and cause a haptic output signal to be transmitted to the coil assembly 304. A haptic output is produced by the magnet assembly 300 and the frame 306 moving in one direction or in an oscillating manner based on the haptic output signal passing through the coil assembly 304.
  • A housing 314 may be attached to the enclosure 102 and positioned at least partially around the frame 306, the magnet assembly 300, the coil assembly 304, and the shaft 302. In the illustrated embodiment, the shaft 302 extends through the housing 314 with a contact area 316 attached to the interior surface of the input device 108. The momentum of a haptic output can be transferred to the input device 108 using the contact area 316.
  • A bracket 322 can at least partially surround the housing 314 and attach to an interior surface of the enclosure 102 using one or more fasteners 324. The bracket 322 fixes the housing 314 to the enclosure 102. Any suitable fastener may be used, such as screws, welding, and/or an adhesive. In some embodiments, the shaft 302 can extend into and/or pass through an opening in the bracket 322. This allows the shaft 302 to move in or through the opening when a force is applied to the input device 108.
  • In the example embodiment, the coil assembly 304 is fixed to the housing 314. The frame 306 and the magnet assembly 300 move with respect to the coil assembly 304. In such embodiments, the coil assembly 304 may not contact any portion of the frame 306 even when the frame 306 and the magnet assembly 300 are maximally displaced within the housing 314 (e.g., to one end of the shaft 302). It should be appreciated that in other embodiments the coil assembly 304 may move instead of, or in addition to, the frame 306 and the magnet assembly 300. However, it may be easier to provide the interconnections for the coil assembly 304 when the coil assembly 304 is fixed to the housing 314. For example, the coil assembly 304 can be electrically connected to a power source using a flexible circuit or other signal line.
  • A compliant element 318 can be positioned on each side of the frame 306 to bias the frame 306 towards the center region of the travel. The compliant elements 318 provide a return force or local biasing to the frame 306. The compliant elements 318 may be any suitable compliant element such as a leaf spring, beehive spring, and the like.
  • In some embodiments, the haptic engine 200 can function as a force sensor. Using the known characteristics of the input device signal and the linear actuator, such as the mass of the magnet assembly 300 and the spring coefficients of the compliant elements 318, the acceleration of the movement of the input device 108 can be correlated to an amount of force.
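  • For example, with an assumed mass, stiffness, and damping, the applied force could be estimated from sampled displacement using finite differences, as in the following sketch; the model F = m·a + c·v + k·x and the numeric values are illustrative assumptions.

```python
def estimate_input_force(displacements, dt, mass, stiffness, damping):
    """Estimate the externally applied force from sampled shaft displacement.

    Uses the actuator model F_ext = m*a + c*v + k*x with central differences;
    mass, stiffness, and damping are assumed to be known from the design.
    """
    forces = []
    for i in range(1, len(displacements) - 1):
        x = displacements[i]
        v = (displacements[i + 1] - displacements[i - 1]) / (2.0 * dt)
        a = (displacements[i + 1] - 2.0 * x + displacements[i - 1]) / (dt * dt)
        forces.append(mass * a + damping * v + stiffness * x)
    return forces


# Example: a steady 0.1 mm deflection against a 400 N/m spring implies about 0.04 N.
samples = [0.0001] * 5
print(estimate_input_force(samples, dt=0.001, mass=0.002, stiffness=400.0, damping=0.05))
```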
  • A compliant structure 320 can be positioned between the input device 108 and the enclosure 102 to allow travel between the input device 108 and the enclosure 102 and to return the input device 108 to a resting position. In one embodiment, the compliant structure 320 is positioned around an interior perimeter of the input device 108. In other embodiments, one or more discrete compliant structures 320 may be positioned around an interior perimeter of the input device 108.
  • As discussed earlier, in some embodiments at least some of the components of the haptic engine 200 are shared and form both an input sensor and a haptic device. In the illustrated embodiment, the magnet assembly 300 and the coil assembly 304 can be used as an input sensor. When a user performs an input action on the input device 108 (e.g., by pressing), the shaft 302, the magnet assembly 300, and the frame 306 can move inward, which in turn induces a current (“input device signal”) in the coil assembly 304. A processing device (not shown) operably connected to the coil assembly 304 may receive or be responsive to the input device signal and cause a haptic output signal to be transmitted to the coil assembly 304. The haptic output signal is transmitted along the length of a wire in a coil in the coil assembly 304, which in turn produces a magnetic field that causes the frame 306 and the magnets 300 a, 300 b to move and produce a haptic output (an applied force, movement, and/or vibration). The movement, vibration, and/or applied force may be perceived by a user as haptic feedback. Thus, the input action is sensed through the movement of the frame 306 and the magnets 300 a, 300 b with respect to the coil assembly 304, and a haptic output is produced by the movement of the frame 306 and the magnets 300 a, 300 b with respect to the coil assembly 304.
  • Additionally or alternatively, a discrete input sensor can be included in the electronic device. For example, in one embodiment, the compliant structure 320 can be formed as a force sensing layer configured to detect an amount of force applied to the input device 108. In one example, the force sensing layer can include two electrode layers separated by a dielectric or compliant material (e.g., air, foam, silicone, and the like). Each electrode layer can include one or more electrodes that are aligned in at least one direction to produce one or more capacitors. When a force is applied to the input device 108 (e.g., when a user presses the input device 108), the distance between the electrodes in at least one capacitor changes, which changes a capacitance of the capacitor. A processing device (not shown) can receive or be responsive to a signal from each capacitor representing the capacitance of that capacitor. The processing device may be configured to correlate the signal(s) to an amount of force that was applied to the input device 108.
  • The force sensing layer provides for a range of force input values that can be used to control a variety of functions. For example, a user can press the input device with a first force to perform a scrolling action at a first speed and press the input device with a second force to perform a scrolling action at a different second speed (e.g., a faster speed).
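  • As an illustration, a parallel-plate model can relate the measured capacitance to the electrode gap and, through an assumed layer stiffness, to an applied force that is then mapped to a scroll speed; the electrode area, stiffness, and force thresholds below are hypothetical.

```python
EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m


def force_from_capacitance(c_measured, c_rest, area_m2, layer_stiffness_n_per_m, rel_permittivity=1.0):
    """Infer the applied force from the capacitance change of one sensing capacitor.

    Parallel-plate model: C = eps0 * eps_r * A / d, so the electrode gap is
    d = eps0 * eps_r * A / C. The compliant layer between the electrodes is
    treated as a simple spring, F = k * (d_rest - d).
    """
    gap_rest = EPSILON_0 * rel_permittivity * area_m2 / c_rest
    gap_now = EPSILON_0 * rel_permittivity * area_m2 / c_measured
    return layer_stiffness_n_per_m * max(0.0, gap_rest - gap_now)


def scroll_speed(force_n, light_press_n=0.5, firm_press_n=2.0):
    """Map the force reading onto a scroll speed: a light press scrolls slowly, a firm press quickly."""
    if force_n < light_press_n:
        return 0.0
    return 1.0 if force_n < firm_press_n else 4.0


# Pressing harder reduces the gap and raises the capacitance, which maps to a larger force.
force = force_from_capacitance(c_measured=1.2e-12, c_rest=1.0e-12,
                               area_m2=2.0e-5, layer_stiffness_n_per_m=20000.0)
print(scroll_speed(force))
```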
  • In some embodiments, a different type of input sensor can be used. The input sensor can be configured to detect any suitable characteristic or property. For example, the input sensor may be an image sensor, a light or optical sensor, a proximity sensor, a magnet, a biometric sensor, a touch sensor, an accelerometer, and so on. In an example embodiment, the input sensor can include one or more strain gauges, a tactile or reed switch, or a capacitive touch sensor. For example, the capacitive touch sensor may include a first electrode disposed within the enclosure 102 adjacent the input device 108 and a second electrode attached to or embedded in the input device 108.
  • FIG. 4 depicts a simplified cross-sectional view of a second example of the electronic device taken along line B-B in FIG. 1 . Similar to the previous examples, the electronic device includes a haptic engine 200, the description of which is provided above with respect to FIG. 3 .
  • With respect to the example of FIG. 4 , the input device 106, in this case a crown, is configured to receive both translational and rotational input actions from a user. An optical encoder 402 can be used to determine an amount of rotation for a rotational input. The optical encoder 402 can convert the angular motion of the shaft 400 to an analog or digital code. Typically, the shaft 400 includes a pattern (not shown) formed in or on the shaft 400 that selectively reflects light toward an optical sensor (not shown). The reflected light is used to determine the amount of rotation.
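  • The disclosure does not specify the encoder output format; assuming a conventional two-channel quadrature output, the rotation angle might be recovered as in the following sketch.

```python
# Transition table for quadrature decoding: (previous AB, current AB) -> step of -1, 0, or +1.
_QUAD_STEPS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}


def decode_rotation(ab_samples, counts_per_rev=360):
    """Convert a stream of 2-bit A/B samples into a signed rotation angle in degrees."""
    position = 0
    for prev, curr in zip(ab_samples, ab_samples[1:]):
        position += _QUAD_STEPS.get((prev, curr), 0)  # 0 for no change or an invalid jump
    return 360.0 * position / counts_per_rev


# Four forward transitions equal four counts, i.e., 4 degrees at 360 counts per revolution.
print(decode_rotation([0b00, 0b01, 0b11, 0b10, 0b00]))  # 4.0
```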
  • As shown in FIG. 4, the input device 106 is operably coupled to the haptic engine 200. As described above with respect to FIG. 3, the haptic engine includes a magnet assembly 300 coupled to a frame 306, which is coupled to a shaft 400. The shaft 400 may be formed from one or more components that are fixed with respect to each other or may be separated to allow for decoupling of the haptic engine 200 from other elements of the device. A coil 304 is positioned adjacent to the magnet assembly 300 and is configured to produce a current in response to a movement of the magnet assembly 300. The coil 304 may also induce movement of the magnet assembly 300 when driven by a current or electrical signal.
  • In the example of FIG. 4 , the haptic engine 200 is mechanically or structurally coupled to the input device 106 such that motion of the input device 106 may be mechanically transferred to the haptic engine 200 and, similarly, motion produced by the haptic engine 200 may be mechanically transferred to the input device 106. In this example, the haptic engine 200 is configured to detect a translational input action (e.g., a press) associated with the input device 106 and produce a haptic output based on a detected input action. The haptic output may be applied to an exterior surface of the input device 106 (e.g., crown) and/or to another region or exterior surface of the electronic device.
  • Additionally, in some embodiments the haptic engine 200 provides a haptic output based on a rotational input action. The optical encoder 402 may produce an input device signal when the input device 106 is rotated. The processing unit may receive or be responsive to the input device signal and, in turn, cause a haptic output signal to be transmitted to the haptic engine 200. The haptic output signal may cause the haptic engine 200 to produce a haptic output that can be perceived by a user as haptic feedback indicating the rotational input action has been received by the electronic device.
  • It should be noted that the positions of the haptic engine 200 and the optical encoder 402 shown in FIG. 4 are for illustration purposes only. For example, the optical encoder 402 may be situated at any location adjacent the shaft 400 to allow the optical encoder 402 to emit light toward the shaft 400 and receive the light reflected by the shaft 400.
  • FIG. 5 shows a simplified cross-sectional view of a third example of the electronic device taken along line A-A in FIG. 1. Similar to the previous examples, the haptic engine 200 is mechanically or structurally coupled to the input device 108. In the illustrated embodiment, the coil assembly 304 is movably coupled or positioned about the shaft 302 and the magnet assembly 300 is attached to the housing 314. A current can be applied to the coil assembly 304 to move the frame 306 and the coil assembly 304 in one direction or in an oscillating manner (as represented by arrow 500) to produce a haptic output within the enclosure 102. The haptic output may be applied directly to the input device 108 through the contact area 316 of the shaft 302. For example, a vibrational haptic output produced by the moving coil assembly 304 and the frame 306 can be transferred through the contact area 316 to the input device 108.
  • As discussed earlier, a haptic engine can be configured to operate in two or more modes. For example, in a first mode, the haptic engine may be positioned at a first position to apply a haptic output directly to the interior surface of an input device (e.g., input device 108). In a second mode, the haptic engine can be positioned at a second position to produce a haptic output within the electronic device and/or to apply a haptic output to one or more non-input-device surfaces or regions of the electronic device. FIGS. 6A-7B illustrate two embodiments where the haptic engine operates in two modes.
  • FIGS. 6A-6B depict a simplified cross-sectional view of a fourth example of the electronic device taken along line A-A in FIG. 1. In the illustrated embodiment, the haptic engine can be situated in two different positions. In this example, the shaft 600 includes a body 601 and a contact area 602. The body 601 and the contact area 602 may be fixed to each other or may be configured to separate to decouple the haptic engine from the input device 108. The positions of the body 601 and the contact area 602 are adjustable via a biasing mechanism. Any suitable biasing mechanism can be used. For example, in one embodiment the biasing mechanism includes a magnet 604 attached to, or embedded in, the end of the shaft 600 that is opposite the contact area 602. An electromagnet 606 can be positioned within the enclosure 102 a given distance from the magnet 604. A current can be received by the electromagnet 606 that produces an attracting or repelling magnetic field with respect to the magnet 604, depending on whether the contact area 602 is to be moved away from or toward the interior surface of the input device 108. In some embodiments, the shaft 600 can be situated in one of three or more positions using the electromagnet 606 and the magnet 604.
  • Other embodiments can use different types of biasing mechanisms. For example, one or more magnets and electromagnets can be included in, or attached to, the contact area 602 and the input device 108, respectively. Alternatively, one or more electromagnets may be included in, or attached to, the input device 108. The electromagnet(s) are used to produce a magnetic field that attracts or repels the magnet assembly 300. The electromagnet(s) can be activated to move the magnet assembly 300 to a given position. The one or more electromagnets are deactivated when the magnet assembly 300 is at the given position. A current applied to the coil assembly 304 can then be used to move the magnet assembly 300 and produce a haptic output. In some embodiments, the biasing mechanism may be a mechanical switch that is configured to position the shaft 600 in at least two different positions. For example, the switch may adjust the position of a movable arm that is attached to the shaft 600.
  • When the shaft 600 is positioned a given distance from the interior surface of the input device 108 (FIG. 6A), a current can be applied to the coil assembly 304 to move the frame 306 and the magnet assembly 300 in one direction or in an oscillating manner (as represented by arrow 608) along the body 601 to produce haptic output within the enclosure 102. The haptic output is not applied or transferred directly to the input device 108 because the contact area 602 is not in contact with the input device 108.
  • When a haptic output is to be applied directly to the input device 108, the biasing mechanism adjusts the position of the shaft 600 so that the contact area 602 contacts the interior surface of the input device 108 (see FIG. 6B). A current is applied to the coil assembly 304 to move the magnet assembly 300 and the frame 306 in one direction or in an oscillating manner (in two opposing directions) along the body 601. The haptic output is applied (or the momentum of the haptic output is transferred) directly to the input device 108 through the contact area 602. The haptic output created by the moving mass of the frame 306 and the magnet assembly 300 may be perceived by a user as haptic feedback on the input device 108.
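  • A control sequence for these two modes might look like the sketch below; set_electromagnet and drive_coil are hypothetical driver hooks, and the sign convention for engaging or retracting the shaft is assumed for illustration.

```python
def set_electromagnet(polarity):
    """Placeholder driver hook: +1 attracts the shaft magnet (retracting the contact
    area from the input device), -1 repels it (pressing the contact area into contact),
    and 0 de-energizes the electromagnet. The sign convention is assumed."""
    pass


def drive_coil(waveform):
    """Placeholder driver hook: transmit a haptic output signal to the coil assembly."""
    pass


def produce_haptic_output(localized, waveform):
    """Engage or retract the shaft before driving the actuator.

    localized=True presses the contact area against the input device so the haptic
    output is applied directly to it; localized=False keeps the shaft retracted so
    the output is felt through the enclosure instead.
    """
    set_electromagnet(-1 if localized else +1)  # move the shaft to the desired position
    drive_coil(waveform)                        # produce the haptic output
    set_electromagnet(0)                        # release the biasing field afterwards
```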
  • FIGS. 7A-7B show an alternate haptic engine that can be situated in multiple positions. In this embodiment, the width (W) of the coil assembly 700 is greater than the width of the magnet assembly 300. The coil assembly 700 can include one or more coils. For example, a single coil having a width (W) can be used in some embodiments. Alternatively, two coils having a combined width (W) can be positioned side-by-side about the shaft 302 and magnet assembly 300. Each coil can be energized independently to move the frame 306 and the magnet assembly 300 to a given position along the shaft 302 and to produce a haptic output once the magnet assembly 300 and the frame 306 are situated at the given location. Alternatively, in some embodiments both coils may be energized to move the magnet assembly 300 and the frame 306 to produce the haptic output.
  • In this example, the shaft 302 includes a contact area 316 that is attached to the input device 108 and a collar 310 that is disengagably coupled to the frame 306. When a haptic output is to be produced within the electronic device, but not applied or transferred directly to the input device 108, a current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 to move in a direction away from the input device 108. The magnet assembly 300 can be positioned at a first location within the housing 314 (FIG. 7A). In this position, the collar 310 of the shaft 302 is disengaged with respect to the frame 306.
  • After the magnet assembly 300 is situated at the first position, another current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 and the frame 306 to move in one direction or in an oscillating manner (in two opposing directions) to produce the haptic output. For example, the magnet assembly 300 can move a length L1 along the shaft 302 when moving in an oscillating manner to produce the haptic output within the enclosure 102.
  • When a haptic output is to be applied directly to the input device 108, a current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 to move in a direction toward the input device 108. The magnet assembly may be positioned at a second location within the housing 314 (FIG. 7B). In this position, the collar 310 of the shaft 302 is engaged with the frame 306. After the magnet assembly 300 is situated at the second position, another current is passed through the coil assembly 700 to produce a magnetic field that causes the magnet assembly 300 and the frame 306 to move in one direction or in an oscillating manner (in two opposing directions) to produce the haptic output. For example, as shown in FIG. 7B, the magnet assembly 300 can move a length L1 when moving in an oscillating manner to produce the haptic output.
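  • For illustration, the same coil assembly could be driven in two phases, first with a steady positioning current and then with an alternating current, as sketched below; the current levels, frequency, and sample rate are assumed values.

```python
import math


def positioning_current(engage, level=0.1):
    """A steady current whose sign pulls the magnet assembly toward (engage=True)
    or away from (engage=False) the input device along the shaft."""
    return level if engage else -level


def oscillation_currents(frequency_hz=150.0, amplitude=0.2, cycles=5, sample_rate_hz=8000):
    """An alternating current that moves the magnet assembly back and forth over a length L1."""
    n = int(cycles * sample_rate_hz / frequency_hz)
    return [amplitude * math.sin(2.0 * math.pi * frequency_hz * i / sample_rate_hz)
            for i in range(n)]


def two_phase_drive(engage, settle_samples=400):
    """Phase one translates the magnet assembly to the engaged or disengaged position;
    phase two oscillates it there to produce the haptic output."""
    return [positioning_current(engage)] * settle_samples + oscillation_currents()
```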
  • FIG. 8 depicts one example of the coil and magnet assemblies that are suitable for use in the haptic engines shown in FIGS. 2-7 . The first and second magnets 300 a, 300 b of the magnet assembly 300 are positioned about the shaft 302. As described earlier, the magnets 300 a, 300 b have opposite polarities. For example, the north pole of magnet 300 a can be adjacent to the south pole of magnet 300 b.
  • A coil assembly 304 is formed with a coil 800 that encircles the magnets 300 a, 300 b. As described earlier, in one embodiment the magnets 300 a, 300 b move in a direction aligned with the shaft 302 when a haptic output signal is transmitted through the coil 800. The coil 800 can be stationary or move with respect to the magnets 300 a, 300 b. Additionally, a width of the coil 800 can be greater than, less than, or substantially the same as the width of the magnet assembly 300.
  • FIG. 9 shows a flowchart of a method of operating an electronic device. The method of FIG. 9 may be applied using, for example, haptic engines described above with respect to FIGS. 2-8 . In particular, the method of FIG. 9 may be used to operate a haptic engine in two (or more) modes. A first mode may allow the haptic output to be applied directly to an exterior surface of the input device. A second mode may allow the haptic output to be delivered via another exterior surface of the device. In general, the operations of the method of FIG. 9 may be performed by a processing unit or other logical element of the electronic device.
  • At block 900, a determination is made as to whether a haptic output is to be produced. In particular, the device may determine whether a haptic output is to be produced in response to a user input (received at an input device) or in response to another event. Other events include, for example, notifications, alerts, alarms, and other events that may be signaled to a user. If the determination is negative, the method returns to block 900.
  • In response to a positive determination at block 900, the process continues at block 902. At block 902, a determination is made as to whether the haptic output is to be applied directly to an input device. If the haptic output will not be applied directly to the input device, the method passes to block 904 where the device produces the haptic output. The device may determine that the haptic output is not to be applied directly to the input device because the haptic output is associated with one or more non-user input events, including, for example, a notification, alert, or alarm. In some cases, the haptic output of block 904 corresponds to a second mode in which a general haptic output is delivered to an exterior surface of the electronic device. The exterior surface may include, but is not limited to, an exterior surface of the input device.
  • If the haptic output will be applied directly to the input device (and/or the momentum of the haptic output transferred directly to the input device), the process continues at block 906. At block 906, the device makes a determination as to whether a haptic engine or haptic device should be adjusted. For example, the mode of the haptic engine or the haptic device may be changed so that a shaft or other element of the haptic engine engages the input device. Additionally or alternatively, one or more characteristics of a haptic output signal that is received by the haptic engine or the haptic device can be adjusted. For example, a frequency, amplitude, and/or a phase of the haptic output signal may be changed. Adjusting one or more of the characteristics of the haptic output signal can adjust the magnitude and/or the type of haptic output. If the haptic engine or haptic device will not be adjusted, the method passes to block 904 where the haptic output is produced.
  • If the haptic engine or haptic device will be adjusted, the process continues at block 908 where the haptic engine is adjusted and the haptic output is produced. The haptic output of block 908 may correspond to a first mode in which a localized haptic output is delivered to the input device. For example, the localized haptic output may be concentrated or focused on an exterior surface of the input device and used to provide haptic feedback or acknowledgement of a user action received using the input device.
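  • The decision flow of blocks 900-908 can be rendered as a short control routine. The sketch below is a hypothetical illustration of the method of FIG. 9; the type names, the criterion used at block 902 (user input versus non-user input), and the signal parameters are assumptions and do not appear in the disclosure.

```swift
// Hypothetical rendering of the FIG. 9 flow (blocks 900-908).
// HapticEvent, HapticEngine, and the helper methods are assumptions for illustration.

enum HapticEvent {
    case userInput                      // received at the input device
    case notification, alert, alarm     // other events that may be signaled to a user
}

struct HapticEngine {
    var needsAdjustment = false
    mutating func engageShaft() { /* move the shaft so it engages the input device */ }
    mutating func adjustSignal(frequency: Double, amplitude: Double, phase: Double) {
        /* change one or more characteristics of the haptic output signal */
    }
    func produceOutput() { /* transmit the haptic output signal to the coil assembly */ }
}

struct HapticController {
    var engine: HapticEngine

    mutating func handle(_ event: HapticEvent?) {
        // Block 900: determine whether a haptic output is to be produced at all.
        guard let event = event else { return }

        // Block 902: is the output to be applied directly to the input device?
        if event != .userInput {
            engine.produceOutput()        // Block 904: general (second-mode) output
            return
        }

        // Block 906: should the haptic engine or its drive signal be adjusted?
        if engine.needsAdjustment {
            // Block 908: adjust the engine, then produce the localized (first-mode) output.
            engine.engageShaft()
            engine.adjustSignal(frequency: 175, amplitude: 0.8, phase: 0)   // assumed values
        }
        engine.produceOutput()
    }
}

// Example: a user input with the engine flagged for adjustment.
var controller = HapticController(engine: HapticEngine(needsAdjustment: true))
controller.handle(.userInput)
```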
  • FIG. 10 is an illustrative block diagram of the electronic device shown in FIG. 1. The electronic device 100 can include the display 104, one or more processing units 1000, memory 1002, one or more I/O devices 1004, one or more sensors 1006, a power source 1008, a network communications interface 1010, and an input device 1012 that includes a haptic engine 1014. The processing unit(s) 1000 can control or coordinate some or all of the operations of the electronic device 100. The processing unit(s) 1000 can communicate, either directly or indirectly, with substantially all of the components of the electronic device 100. For example, a system bus or signal line 1016 or other communication mechanism can provide communication between the processing unit(s) 1000, the memory 1002, the I/O device(s) 1004, the sensor(s) 1006, the power source 1008, the network communications interface 1010, and/or the input device 1012 and haptic engine 1014. The one or more processing units 1000 can be implemented as any electronic device capable of processing, receiving, or transmitting data, instructions, and/or program code. For example, the processing unit(s) 1000 can each be a microprocessor, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), or combinations of such devices. As described herein, the term “processing unit” is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, or other suitably configured computing element or elements.
  • In some embodiments, the processing unit(s) 1000 and the processing device 202 (FIG. 2) are the same processing unit. Alternatively, data processing, inputs, and outputs can be distributed between the processing unit(s) 1000 and the processing device 202. In embodiments where the one or more processing units 1000 and the processing device 202 are the same processing unit(s), the processing unit(s) 1000 is configured to receive or be responsive to an input device signal from the input device 1012. The input device signal indicates that an input action associated with the input device 1012 has occurred. Additionally, in response to the receipt of the input device signal, the processing unit(s) 1000 is configured to cause a haptic output signal to be transmitted to a coil assembly in the haptic engine 1014. The haptic engine 1014 produces a haptic output based on the haptic output signal.
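  • The same coil assembly thus both senses the input (by inducing the input device signal) and reproduces the haptic output. The sketch below is an illustrative wiring of that signal path only; the class name, the sensing threshold, and the amplitude scaling are assumptions and are not part of the disclosure.

```swift
// Illustrative signal path: an induced input device signal from the coil is
// forwarded to a handler, which responds by transmitting a haptic output signal
// back to the coil assembly. All names and numeric values are assumptions.

final class HapticInputPipeline {
    /// Invoked when movement of the magnet assembly induces a signal in the coil.
    var onInputDeviceSignal: ((Double) -> Void)?

    /// Simulates the coil reporting an induced voltage caused by an input action.
    func coilDidSense(inducedVoltage: Double) {
        // Treat any induced voltage above an assumed threshold as an input action.
        if inducedVoltage > 0.05 {
            onInputDeviceSignal?(inducedVoltage)
        }
    }
}

// Processing unit's response: on receipt of the input device signal, cause a
// haptic output signal to be transmitted to the coil assembly.
let pipeline = HapticInputPipeline()
pipeline.onInputDeviceSignal = { voltage in
    let amplitude = min(1.0, voltage * 10)   // assumed scaling
    print("transmit haptic output signal, normalized amplitude \(amplitude)")
}
pipeline.coilDidSense(inducedVoltage: 0.12)
```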
  • The memory 1002 can store electronic data that can be used by the electronic device 100 and instructions and/or program code that is executed by the processing unit(s) 1000. For example, a memory can store electronic data or content such as, for example, audio and video files, documents and applications, device settings and user preferences, timing and control signals or data for the input device 1012 (or one or more components included therein), data structures or databases, and so on. The memory 1002 can be configured as any type of memory. By way of example only, the memory can be implemented as random access memory, read-only memory, Flash memory, removable memory, or other types of storage elements, or combinations of such devices. The one or more I/O devices 1004 can transmit and/or receive data to and from a user or another electronic device. The I/O device(s) 1004 can include a touch sensing input surface such as a track pad, one or more buttons, one or more microphones or speakers, one or more ports such as a microphone port, and/or a keyboard.
  • The electronic device 100 may also include one or more sensors 1006 positioned substantially anywhere on the electronic device 100. The sensor(s) 1006 may be configured to sense substantially any type of characteristic, such as, but not limited to, images, pressure, light, touch, force, biometric data, temperature, position, motion, and so on. For example, the sensor(s) 1006 may be an image sensor, a temperature sensor, a light or optical sensor, an atmospheric pressure sensor, a proximity sensor, a humidity sensor, a magnet, a gyroscope, a biometric sensor, an accelerometer, and so on.
  • The power source 1008 can be implemented with one or more devices capable of providing energy to the electronic device 100. For example, the power source 1008 can be one or more batteries or rechargeable batteries. Additionally or alternatively, the power source 1008 may be a connection cable that connects the electronic device to another power source, such as a wall outlet or another electronic device.
  • The network communication interface 1010 can facilitate transmission of data to or from other electronic devices. For example, a network communication interface 1010 can transmit electronic signals via a wireless and/or wired network connection. Examples of wireless and wired network connections include, but are not limited to, cellular, Wi-Fi, Bluetooth, infrared, and Ethernet.
  • The input device 1012 can be any suitable input device that is configured to provide a haptic feedback to a user in response to an input action. For example, the input device 1012 can be an input button, a crown, a section of the enclosure, and/or a display.
  • The haptic engine 1014 can be implemented as any suitable device configured to provide force feedback, vibratory feedback, tactile sensations, and the like. For example, in one embodiment, the haptic engine 1014 can be implemented as an electromagnetic actuator (e.g., linear actuator) configured to provide a punctuated haptic feedback, such as a tap or a knock. Additionally or alternatively, the electromagnetic actuator may be configured to translate in two directions to provide a vibratory haptic feedback.
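  • The two feedback styles mentioned above, a punctuated tap versus a two-direction vibratory output, can be contrasted with a simple drive-waveform sketch. The FeedbackStyle and driveWaveform names and the normalized values below are assumptions introduced for illustration and are not part of the disclosure.

```swift
// Sketch of the two feedback styles: a punctuated "tap" (single pulse) versus a
// vibratory output in which the actuator translates in two directions repeatedly.

enum FeedbackStyle {
    case tap                        // single punctuated pulse (e.g., a tap or knock)
    case vibration(cycles: Int)     // alternate directions to produce vibration
}

/// Returns a normalized drive waveform (values in -1...1) for the electromagnetic actuator.
func driveWaveform(for style: FeedbackStyle) -> [Double] {
    switch style {
    case .tap:
        return [1.0, 0.0]                                   // push once, then release
    case .vibration(let cycles):
        return (0..<(2 * cycles)).map { $0 % 2 == 0 ? 1.0 : -1.0 }
    }
}

print(driveWaveform(for: .tap))                    // [1.0, 0.0]
print(driveWaveform(for: .vibration(cycles: 3)))   // [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
```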
  • It should be noted that FIG. 10 is exemplary only. In other examples, an electronic device may include fewer or more components than those shown in FIG. 10. Additionally or alternatively, an electronic device can be included in a system in which one or more of the components shown in FIG. 10 are separate from, but in communication with, the electronic device. For example, an electronic device may be operatively connected to, or in communication with, a separate display. As another example, one or more applications or data can be stored in a memory separate from an electronic device. In some embodiments, the separate memory can be in a cloud-based system or in an associated electronic device.
  • The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.

Claims (20)

What is claimed is:
1. An electronic device comprising:
an input device having an input surface and configured to receive a translational input; and
a haptic engine mechanically coupled to the input device and configured to:
produce an input device signal in response to the translational input; and
produce a haptic output along the input surface.
2. The electronic device of claim 1, wherein the haptic engine comprises:
a shaft coupled to the input device;
a frame coupled to the shaft and configured to move in response to the translational input;
a magnet assembly coupled to the frame; and
a coil assembly at least partially surrounding the magnet assembly and configured to:
generate the input device signal in response to movement of the magnet assembly; and
receive a haptic output signal to generate the haptic output.
3. The electronic device of claim 2, further comprising a housing at least partially surrounding the frame, wherein the coil assembly is attached to the housing.
4. The electronic device of claim 3, wherein the haptic engine further comprises a compliant element positioned between the frame and the housing and configured to compress in response to the translational input.
5. The electronic device of claim 2, wherein the haptic engine is configured to operate in at least two modes, the at least two modes comprising:
a first mode in which the shaft engages the input device, wherein the haptic output is provided to the input surface when the haptic engine is operating in the first mode; and
a second mode in which the shaft is not engaged with the input device.
6. The electronic device of claim 5, wherein:
the electronic device further comprises a housing forming at least a portion of an exterior surface of the device; and
the haptic output is provided to the exterior surface when the haptic engine is operating in the second mode.
7. The electronic device of claim 1, wherein the input surface comprises an input button in an electronic watch and the translational input comprises a translational movement of the input button.
8. The electronic device of claim 7, wherein:
the electronic watch further comprises:
a display configured to display a user interface screen associated with an application program; and
a processing device operably connected to the haptic engine and to the display; and
in response to the input device signal, the processing device causes a haptic output signal to be transmitted to the haptic engine causing the haptic engine to produce the haptic output; and
in response to the input device signal, the display displays a change in the user interface screen.
9. The electronic device of claim 1, wherein:
the input surface comprises a crown in an electronic watch;
the translational input comprises a translational movement of the crown; and
the crown is further configured to receive a rotational input.
10. An electronic device, comprising:
an input device configured to receive a user input;
a linear actuator coupled to the input device and configured to generate an input device signal in response to the user input; and
a processing device operably connected to the linear actuator and configured to cause a haptic output signal to be transmitted to the linear actuator, in response to the input device signal, wherein the haptic output signal causes the linear actuator to produce a haptic output.
11. The electronic device of claim 10, further comprising a force sensor disposed at least partially around an interior perimeter of the input device and positioned between the input device and an enclosure of the electronic device.
12. The electronic device of claim 10, wherein the electronic device comprises an electronic watch and the input device includes a crown.
13. The electronic device of claim 10, wherein the electronic device comprises an electronic watch and the input device includes an input button.
14. An electronic watch, comprising:
an input button;
an electromagnetic actuator mechanically coupled to the input button, the electromagnetic actuator comprising:
a magnet assembly; and
a coil assembly adjacent the magnet assembly; and
a processing device operably coupled to the electromagnetic actuator, wherein
the electromagnetic actuator is configured to detect an input action provided to the input button based on a first movement between the magnet assembly and the coil assembly, the first movement inducing an input device signal; and
the processing device is configured to cause a haptic output signal to be transmitted to the electromagnetic actuator, in response to the input device signal, the haptic output signal causing a second movement between the magnet assembly and the coil assembly to produce a haptic output.
15. The electronic watch of claim 14, wherein the coil assembly is fixed and the magnet assembly moves in response to the input action.
16. The electronic watch of claim 14, wherein the magnet assembly is fixed and the coil assembly moves in response to the input action.
17. The electronic watch of claim 14, wherein:
the electronic watch further comprises a frame at least partially surrounding the magnet and coil assemblies;
the magnet assembly is attached to the frame; and
a shaft configured to mechanically couple the frame to the input button.
18. The electronic watch of claim 17, further comprising a biasing mechanism configured to select an operating mode of the electromagnetic actuator by moving the frame.
19. The electronic watch of claim 18, wherein the operating mode comprises:
a first mode in which the shaft engages the input button; and
a second mode in which the shaft is not engaged with the input button.
20. The electronic watch of claim 19, wherein the biasing mechanism comprises:
a magnet attached to the shaft; and
an electromagnet included in an enclosure of the electronic watch and configured to produce an attracting or repelling force with respect to the magnet.
US18/233,112 2016-05-10 2023-08-11 Electronic Device with an Input Device Having a Haptic Engine Pending US20230384863A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/233,112 US20230384863A1 (en) 2016-05-10 2023-08-11 Electronic Device with an Input Device Having a Haptic Engine

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201662334036P 2016-05-10 2016-05-10
US15/366,674 US10585480B1 (en) 2016-05-10 2016-12-01 Electronic device with an input device having a haptic engine
US16/800,723 US10890978B2 (en) 2016-05-10 2020-02-25 Electronic device with an input device having a haptic engine
US17/145,115 US11762470B2 (en) 2016-05-10 2021-01-08 Electronic device with an input device having a haptic engine
US18/233,112 US20230384863A1 (en) 2016-05-10 2023-08-11 Electronic Device with an Input Device Having a Haptic Engine

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/145,115 Continuation US11762470B2 (en) 2016-05-10 2021-01-08 Electronic device with an input device having a haptic engine

Publications (1)

Publication Number Publication Date
US20230384863A1 (en) 2023-11-30

Family

ID=69723571

Family Applications (4)

Application Number Title Priority Date Filing Date
US15/366,674 Active 2038-01-06 US10585480B1 (en) 2016-05-10 2016-12-01 Electronic device with an input device having a haptic engine
US16/800,723 Active US10890978B2 (en) 2016-05-10 2020-02-25 Electronic device with an input device having a haptic engine
US17/145,115 Active 2037-07-31 US11762470B2 (en) 2016-05-10 2021-01-08 Electronic device with an input device having a haptic engine
US18/233,112 Pending US20230384863A1 (en) 2016-05-10 2023-08-11 Electronic Device with an Input Device Having a Haptic Engine

Country Status (1)

Country Link
US (4) US10585480B1 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8405618B2 (en) * 2006-03-24 2013-03-26 Northwestern University Haptic device with indirect haptic feedback
US8487759B2 (en) 2009-09-30 2013-07-16 Apple Inc. Self adapting haptic device
US10585480B1 (en) 2016-05-10 2020-03-10 Apple Inc. Electronic device with an input device having a haptic engine
CN106020559B (en) * 2016-06-30 2018-05-29 华为技术有限公司 Pressure sensitive detection device, electronic equipment and touch display screen
US10768747B2 (en) 2017-08-31 2020-09-08 Apple Inc. Haptic realignment cues for touch-input displays
US11054932B2 (en) 2017-09-06 2021-07-06 Apple Inc. Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module
US10942571B2 (en) 2018-06-29 2021-03-09 Apple Inc. Laptop computing device with discrete haptic regions
US10936071B2 (en) 2018-08-30 2021-03-02 Apple Inc. Wearable electronic device with haptic rotatable input
US10966007B1 (en) 2018-09-25 2021-03-30 Apple Inc. Haptic output system
TWI696104B (en) * 2019-03-15 2020-06-11 致伸科技股份有限公司 Touch pad module and computer using the same
US11205106B2 (en) 2019-09-19 2021-12-21 Sensormatic Electronics, LLC Self-detaching anti-theft device with energy limit
US12101138B2 (en) * 2019-09-19 2024-09-24 Sensormatic Electronics, LLC Self-detaching anti-theft device using direct and harvested resonant energy
US11156022B2 (en) 2019-09-20 2021-10-26 Sensormatic Electronics, LLC Tack with free spinning feature
US11380470B2 (en) 2019-09-24 2022-07-05 Apple Inc. Methods to control force in reluctance actuators based on flux related parameters
US10976824B1 (en) * 2019-09-26 2021-04-13 Apple Inc. Reluctance haptic engine for an electronic device
CN115003424B (en) * 2020-02-05 2023-08-01 阿尔卑斯阿尔派株式会社 Input device and input module
US11024135B1 (en) 2020-06-17 2021-06-01 Apple Inc. Portable electronic device having a haptic button assembly
US11977683B2 (en) 2021-03-12 2024-05-07 Apple Inc. Modular systems configured to provide localized haptic feedback using inertial actuators
US11809631B2 (en) * 2021-09-21 2023-11-07 Apple Inc. Reluctance haptic engine for an electronic device
US11868553B2 (en) 2022-03-11 2024-01-09 Meta Platforms Technologies, Llc Pressure sensing for user interactions
WO2023172604A1 (en) * 2022-03-11 2023-09-14 Meta Platforms Technologies, Llc Pressure sensing for user interactions
WO2024073836A1 (en) * 2022-10-04 2024-04-11 Titan Haptics Inc. Linear actuator and method of operation

Family Cites Families (392)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE214030C (en) 1908-01-07 1909-10-08
WO1991020136A1 (en) 1990-06-18 1991-12-26 Motorola, Inc. Selective call receiver having a variable frequency vibrator
US5196745A (en) 1991-08-16 1993-03-23 Massachusetts Institute Of Technology Magnetic positioning device
EP0580117A3 (en) 1992-07-20 1994-08-24 Tdk Corp Moving magnet-type actuator
US5739759A (en) 1993-02-04 1998-04-14 Toshiba Corporation Melody paging apparatus
US5424756A (en) 1993-05-14 1995-06-13 Ho; Yung-Lung Track pad cursor positioning device and method
US5436622A (en) 1993-07-06 1995-07-25 Motorola, Inc. Variable frequency vibratory alert method and structure
US5999168A (en) 1995-09-27 1999-12-07 Immersion Corporation Haptic accelerator for force feedback computer peripherals
US5825308A (en) * 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US5668423A (en) 1996-03-21 1997-09-16 You; Dong-Ok Exciter for generating vibration in a pager
US5842967A (en) 1996-08-07 1998-12-01 St. Croix Medical, Inc. Contactless transducer stimulation and sensing of ossicular chain
US6084319A (en) 1996-10-16 2000-07-04 Canon Kabushiki Kaisha Linear motor, and stage device and exposure apparatus provided with the same
US6707443B2 (en) 1998-06-23 2004-03-16 Immersion Corporation Haptic trackball device
US6717573B1 (en) 1998-06-23 2004-04-06 Immersion Corporation Low-cost haptic mouse implementations
US6429846B2 (en) 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
FI981469A (en) 1998-06-25 1999-12-26 Nokia Mobile Phones Ltd Integrated motion detector in a mobile telecommunications device
US6373465B2 (en) 1998-11-10 2002-04-16 Lord Corporation Magnetically-controllable, semi-active haptic interface system and apparatus
GB2344888A (en) 1998-12-18 2000-06-21 Notetry Ltd Obstacle detection system
DE20022244U1 (en) 1999-07-01 2001-11-08 Immersion Corp Control of vibrotactile sensations for haptic feedback devices
US6693622B1 (en) 1999-07-01 2004-02-17 Immersion Corporation Vibrotactile haptic feedback devices
US8169402B2 (en) 1999-07-01 2012-05-01 Immersion Corporation Vibrotactile haptic feedback devices
JP4058223B2 (en) 1999-10-01 2008-03-05 日本碍子株式会社 Piezoelectric / electrostrictive device and manufacturing method thereof
JP3344385B2 (en) 1999-10-22 2002-11-11 ヤマハ株式会社 Vibration source drive
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
JP3380873B2 (en) 2000-04-28 2003-02-24 昭彦 米谷 Data input device
WO2001091100A1 (en) 2000-05-24 2001-11-29 Immersion Corporation Haptic devices using electroactive polymers
US6445093B1 (en) 2000-06-26 2002-09-03 Nikon Corporation Planar motor with linear coil arrays
US6388789B1 (en) 2000-09-19 2002-05-14 The Charles Stark Draper Laboratory, Inc. Multi-axis magnetically actuated device
WO2002027705A1 (en) 2000-09-28 2002-04-04 Immersion Corporation Directional tactile feedback for haptic feedback interface devices
US7370289B1 (en) 2001-03-07 2008-05-06 Palmsource, Inc. Method and apparatus for notification on an electronic handheld device using an attention manager
US7202851B2 (en) 2001-05-04 2007-04-10 Immersion Medical Inc. Haptic interface for palpation simulation
EP1440429B1 (en) 2001-10-23 2012-12-12 Immersion Corporation Method of using tactile feedback to deliver silent status information to a user of a handheld weapon
JP2003154315A (en) 2001-11-22 2003-05-27 Matsushita Electric Ind Co Ltd Vibratory linear actuator
US20030117132A1 (en) 2001-12-21 2003-06-26 Gunnar Klinghult Contactless sensing input device
US6952203B2 (en) 2002-01-08 2005-10-04 International Business Machines Corporation Touchscreen user interface: Bluetooth™ stylus for performing right mouse clicks
US7063671B2 (en) 2002-06-21 2006-06-20 Boston Scientific Scimed, Inc. Electronically activated capture device
US7336006B2 (en) 2002-09-19 2008-02-26 Fuji Xerox Co., Ltd. Magnetic actuator with reduced magnetic flux leakage and haptic sense presenting device
JP3988608B2 (en) 2002-10-07 2007-10-10 日本電気株式会社 Radio telephone with vibrator control function and vibrator control method for radio telephone
GB2410316B (en) 2002-10-20 2007-03-21 Immersion Corp System and method for providing rotational haptic feedback
US7798982B2 (en) 2002-11-08 2010-09-21 Engineering Acoustics, Inc. Method and apparatus for generating a vibrational stimulus
JP2004236202A (en) 2003-01-31 2004-08-19 Nec Commun Syst Ltd Portable phone, call arrival information control method to be used for the portable phone and call arrival information control program
US7080271B2 (en) 2003-02-14 2006-07-18 Intel Corporation Non main CPU/OS based operational environment
JP3891947B2 (en) 2003-03-07 2007-03-14 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Magnetic resonance imaging device
DE10319319A1 (en) 2003-04-29 2005-01-27 Infineon Technologies Ag Sensor device with magnetostrictive force sensor
US7567243B2 (en) 2003-05-30 2009-07-28 Immersion Corporation System and method for low power haptic feedback
US7130664B1 (en) 2003-06-12 2006-10-31 Williams Daniel P User-based signal indicator for telecommunications device and method of remotely notifying a user of an incoming communications signal incorporating the same
US20050036603A1 (en) 2003-06-16 2005-02-17 Hughes David A. User-defined ring tone file
WO2005008798A1 (en) 2003-07-22 2005-01-27 Ngk Insulators, Ltd. Actuator device
EP1513032A1 (en) 2003-09-02 2005-03-09 The Swatch Group Management Services AG Object with a metallic case comprising an electronic module suitable for the memorization of information, and electronic module compatible with such an object
KR20050033909A (en) 2003-10-07 2005-04-14 조영준 Key switch using magnetic force
WO2005050683A1 (en) 2003-11-20 2005-06-02 Preh Gmbh Control element with programmable haptics
US7355305B2 (en) 2003-12-08 2008-04-08 Shen-Etsu Chemical Co., Ltd. Small-size direct-acting actuator
US20050191604A1 (en) 2004-02-27 2005-09-01 Allen William H. Apparatus and method for teaching dyslexic individuals
US20060209037A1 (en) 2004-03-15 2006-09-21 David Wang Method and system for providing haptic effects
JP2005301900A (en) 2004-04-15 2005-10-27 Alps Electric Co Ltd On-vehicle tactile force applying type input device
US7508382B2 (en) 2004-04-28 2009-03-24 Fuji Xerox Co., Ltd. Force-feedback stylus and applications to freeform ink
GB2414309B (en) 2004-05-18 2009-02-25 Simon Richard Daniel Spherical display and control device
US7392066B2 (en) 2004-06-17 2008-06-24 Ixi Mobile (R&D), Ltd. Volume control system and method for a mobile communication device
US20060017691A1 (en) 2004-07-23 2006-01-26 Juan Manuel Cruz-Hernandez System and method for controlling audio output associated with haptic effects
US8002089B2 (en) 2004-09-10 2011-08-23 Immersion Corporation Systems and methods for providing a haptic device
US8106888B2 (en) 2004-10-01 2012-01-31 3M Innovative Properties Company Vibration sensing touch input device
JP4860625B2 (en) 2004-10-08 2012-01-25 イマージョン コーポレーション Haptic feedback for simulating buttons and scrolling motion on touch input devices
WO2006051581A1 (en) 2004-11-09 2006-05-18 Takahiko Suzuki Tactile feedback controller, method of controlling the feedback controller, and method of transmitting message using tactile feedback controller
US7068168B2 (en) 2004-11-12 2006-06-27 Simon Girshovich Wireless anti-theft system for computer and other electronic and electrical equipment
DE102005009110A1 (en) 2005-01-13 2006-07-27 Siemens Ag Device for communicating environmental information to a visually impaired person
DE602005027850D1 (en) 2005-01-31 2011-06-16 Research In Motion Ltd User hand recognition and display lighting customization for wireless terminal
US8290192B2 (en) 2005-02-03 2012-10-16 Nokia Corporation Gaming headset vibrator
CN101160104B (en) 2005-02-22 2012-07-04 马科外科公司 Haptic guidance system and method
EP1857425B1 (en) 2005-03-08 2014-12-17 NGK Insulators, Ltd. Piezoelectric ceramic composition and method for producing same
JP2006260179A (en) 2005-03-17 2006-09-28 Matsushita Electric Ind Co Ltd Trackball device
US20060223547A1 (en) 2005-03-31 2006-10-05 Microsoft Corporation Environment sensitive notifications for mobile devices
TWI260151B (en) 2005-05-06 2006-08-11 Benq Corp Mobile phone
US7825903B2 (en) 2005-05-12 2010-11-02 Immersion Corporation Method and apparatus for providing haptic effects to a touch panel
DE102005043587B4 (en) 2005-06-02 2009-04-02 Preh Gmbh Turntable with programmable feel
US7710397B2 (en) 2005-06-03 2010-05-04 Apple Inc. Mouse with improved input mechanisms using touch sensors
EP1907086B1 (en) 2005-06-27 2011-07-20 Coactive Drive Corporation Synchronized vibration device for haptic feedback
US8981682B2 (en) 2005-06-27 2015-03-17 Coactive Drive Corporation Asymmetric and general vibration waveforms from multiple synchronized vibration actuators
US7234379B2 (en) 2005-06-28 2007-06-26 Ingvar Claesson Device and a method for preventing or reducing vibrations in a cutting tool
US7633076B2 (en) 2005-09-30 2009-12-15 Apple Inc. Automated response to and sensing of user activity in portable devices
WO2007049253A2 (en) 2005-10-28 2007-05-03 Koninklijke Philips Electronics N.V. Display system with a haptic feedback via interaction with physical objects
JP5208362B2 (en) 2005-10-28 2013-06-12 ソニー株式会社 Electronics
US20070106457A1 (en) 2005-11-09 2007-05-10 Outland Research Portable computing with geospatial haptic compass
US8639485B2 (en) 2005-11-14 2014-01-28 Immersion Medical, Inc. Systems and methods for editing a model of a physical system for a simulation
US9182837B2 (en) 2005-11-28 2015-11-10 Synaptics Incorporated Methods and systems for implementing modal changes in a device in response to proximity and force indications
GB2433351B (en) 2005-12-16 2009-03-25 Dale Mcphee Purcocks Keyboard
KR100877067B1 (en) 2006-01-03 2009-01-07 삼성전자주식회사 Haptic button, and haptic device using it
US7667691B2 (en) 2006-01-19 2010-02-23 International Business Machines Corporation System, computer program product and method of preventing recordation of true keyboard acoustic emanations
US8405618B2 (en) 2006-03-24 2013-03-26 Northwestern University Haptic device with indirect haptic feedback
WO2007114631A2 (en) 2006-04-03 2007-10-11 Young-Jun Cho Key switch using magnetic force
US7708478B2 (en) 2006-04-13 2010-05-04 Nokia Corporation Actuator mechanism and a shutter mechanism
US7360446B2 (en) 2006-05-31 2008-04-22 Motorola, Inc. Ceramic oscillation flow meter having cofired piezoresistive sensors
US8174512B2 (en) 2006-06-02 2012-05-08 Immersion Corporation Hybrid haptic device utilizing mechanical and programmable haptic effects
US7326864B2 (en) 2006-06-07 2008-02-05 International Business Machines Corporation Method and apparatus for masking keystroke sounds from computer keyboards
JP2008033739A (en) 2006-07-31 2008-02-14 Sony Corp Touch screen interaction method and apparatus based on tactile force feedback and pressure measurement
US7675414B2 (en) 2006-08-10 2010-03-09 Qualcomm Incorporated Methods and apparatus for an environmental and behavioral adaptive wireless communication device
US20080062624A1 (en) 2006-09-13 2008-03-13 Paul Regen Transformable Mobile Computing Device
US7414351B2 (en) 2006-10-02 2008-08-19 Robert Bosch Gmbh Energy harvesting device manufactured by print forming processes
US7890863B2 (en) 2006-10-04 2011-02-15 Immersion Corporation Haptic effects with proximity sensing
US20080084384A1 (en) 2006-10-05 2008-04-10 Immersion Corporation Multiple Mode Haptic Feedback System
US20080111791A1 (en) 2006-11-15 2008-05-15 Alex Sasha Nikittin Self-propelled haptic mouse system
GB2446428B (en) 2006-12-02 2010-08-04 Nanomotion Ltd Controllable coupling force
JP2008158909A (en) 2006-12-25 2008-07-10 Pro Tech Design Corp Tactile feedback controller
KR101515767B1 (en) 2006-12-27 2015-04-28 임머숀 코퍼레이션 Virtual detents through vibrotactile feedback
US7893922B2 (en) 2007-01-15 2011-02-22 Sony Ericsson Mobile Communications Ab Touch sensor with tactile feedback
CN201044066Y (en) 2007-04-06 2008-04-02 深圳市顶星数码网络技术有限公司 Notebook computer with touch panel dividing strip
US8378965B2 (en) 2007-04-10 2013-02-19 Immersion Corporation Vibration actuator with a unidirectional drive
US8072418B2 (en) * 2007-05-31 2011-12-06 Disney Enterprises, Inc. Tactile feedback mechanism using magnets to provide trigger or release sensations
US7956770B2 (en) 2007-06-28 2011-06-07 Sony Ericsson Mobile Communications Ab Data input device and portable electronic device
WO2009006318A1 (en) 2007-06-29 2009-01-08 Artificial Muscle, Inc. Electroactive polymer transducers for sensory feedback applications
US8154537B2 (en) 2007-08-16 2012-04-10 Immersion Corporation Resistive actuator with dynamic variations of frictional forces
KR101425222B1 (en) 2007-08-22 2014-08-04 삼성전자주식회사 Apparatus and method for vibration control in mobile phone
US8432365B2 (en) 2007-08-30 2013-04-30 Lg Electronics Inc. Apparatus and method for providing feedback for three-dimensional touchscreen
US8084968B2 (en) 2007-09-17 2011-12-27 Sony Ericsson Mobile Communications Ab Use of an accelerometer to control vibrator performance
US7667371B2 (en) 2007-09-17 2010-02-23 Motorola, Inc. Electronic device and circuit for providing tactile feedback
US9110590B2 (en) 2007-09-19 2015-08-18 Typesoft Technologies, Inc. Dynamically located onscreen keyboard
US20090085879A1 (en) 2007-09-28 2009-04-02 Motorola, Inc. Electronic device having rigid input surface with piezoelectric haptics and corresponding method
CN101409164A (en) 2007-10-10 2009-04-15 唐艺华 Key-press and keyboard using the same
US20090115734A1 (en) 2007-11-02 2009-05-07 Sony Ericsson Mobile Communications Ab Perceivable feedback
DE602007011948D1 (en) 2007-11-16 2011-02-24 Research In Motion Ltd Touch screen for an electronic device
US9058077B2 (en) 2007-11-16 2015-06-16 Blackberry Limited Tactile touch screen for electronic device
US7911328B2 (en) 2007-11-21 2011-03-22 The Guitammer Company Capture and remote reproduction of haptic events in synchronous association with the video and audio capture and reproduction of those events
CA2706469A1 (en) 2007-11-21 2009-05-28 Artificial Muscle, Inc. Electroactive polymer transducers for tactile feedback devices
US8253686B2 (en) 2007-11-26 2012-08-28 Electronics And Telecommunications Research Institute Pointing apparatus capable of providing haptic feedback, and haptic interaction system and method using the same
US8265308B2 (en) 2007-12-07 2012-09-11 Motorola Mobility Llc Apparatus including two housings and a piezoelectric transducer
US8836502B2 (en) 2007-12-28 2014-09-16 Apple Inc. Personal media device input and output control based on associated conditions
US9857872B2 (en) 2007-12-31 2018-01-02 Apple Inc. Multi-touch display screen with localized tactile feedback
US8138896B2 (en) 2007-12-31 2012-03-20 Apple Inc. Tactile feedback in an electronic device
US20090166098A1 (en) 2007-12-31 2009-07-02 Apple Inc. Non-visual control of multi-touch device
US20090167702A1 (en) 2008-01-02 2009-07-02 Nokia Corporation Pointing device detection
US20090174672A1 (en) 2008-01-03 2009-07-09 Schmidt Robert M Haptic actuator assembly and method of manufacturing a haptic actuator assembly
US8004501B2 (en) 2008-01-21 2011-08-23 Sony Computer Entertainment America Llc Hand-held device with touchscreen and digital tactile pixels
US20090207129A1 (en) 2008-02-15 2009-08-20 Immersion Corporation Providing Haptic Feedback To User-Operated Switch
GB2458146B (en) 2008-03-06 2013-02-13 Nanomotion Ltd Ball-mounted mirror moved by piezoelectric motor
KR100952698B1 (en) 2008-03-10 2010-04-13 한국표준과학연구원 Tactile transmission method using tactile feedback apparatus and the system thereof
US9513704B2 (en) 2008-03-12 2016-12-06 Immersion Corporation Haptically enabled user interface
US7904210B2 (en) 2008-03-18 2011-03-08 Visteon Global Technologies, Inc. Vibration control system
KR101003609B1 (en) 2008-03-28 2010-12-23 삼성전기주식회사 Vibrator and controlling method thereof and portable terminal having the same
US9274601B2 (en) 2008-04-24 2016-03-01 Blackberry Limited System and method for generating a feedback signal in response to an input signal provided to an electronic device
US20090267892A1 (en) 2008-04-24 2009-10-29 Research In Motion Limited System and method for generating energy from activation of an input device in an electronic device
US8217892B2 (en) 2008-05-06 2012-07-10 Dell Products L.P. Tactile feedback input device
JP5199371B2 (en) 2008-05-26 2013-05-15 デースン エレクトリック シーオー エルティーディー Haptic steering wheel switch unit and haptic steering wheel switch system including the same
US8604670B2 (en) 2008-05-30 2013-12-10 The Trustees Of The University Of Pennsylvania Piezoelectric ALN RF MEM switches monolithically integrated with ALN contour-mode resonators
US8345025B2 (en) 2008-06-05 2013-01-01 Dell Products, Lp Computation device incorporating motion detection and method thereof
US9733704B2 (en) 2008-06-12 2017-08-15 Immersion Corporation User interface impact actuator
DE102008030404A1 (en) 2008-06-26 2009-12-31 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Hearing aid device and method
FR2934066B1 (en) 2008-07-21 2013-01-25 Dav HAPTIC RETURN CONTROL DEVICE
KR100973979B1 (en) 2008-08-22 2010-08-05 한국과학기술원 Electromagnetic Multi-axis Actuator
JPWO2010026883A1 (en) 2008-09-05 2012-02-02 三洋電機株式会社 Linear motor and portable device equipped with linear motor
DE102008046102B4 (en) 2008-09-05 2016-05-12 Lisa Dräxlmaier GmbH Control element with specific feedback
US8749495B2 (en) 2008-09-24 2014-06-10 Immersion Corporation Multiple actuation handheld device
US10289199B2 (en) 2008-09-29 2019-05-14 Apple Inc. Haptic feedback system
US20100116629A1 (en) 2008-11-12 2010-05-13 Milo Borissov Dual action push-type button
DE102008061205A1 (en) 2008-11-18 2010-05-20 Institut für Luft- und Kältetechnik gemeinnützige Gesellschaft mbH Electrodynamic linear vibration motor
US8217910B2 (en) 2008-12-19 2012-07-10 Verizon Patent And Licensing Inc. Morphing touch screen layout
US8686952B2 (en) 2008-12-23 2014-04-01 Apple Inc. Multi touch with multi haptics
EP2202619A1 (en) 2008-12-23 2010-06-30 Research In Motion Limited Portable electronic device including tactile touch-sensitive input device and method of controlling same
US20130207793A1 (en) 2009-01-21 2013-08-15 Bayer Materialscience Ag Electroactive polymer transducers for tactile feedback devices
US20100225600A1 (en) 2009-03-09 2010-09-09 Motorola Inc. Display Structure with Direct Piezoelectric Actuation
WO2010104953A1 (en) 2009-03-10 2010-09-16 Artificial Muscle, Inc. Electroactive polymer transducers for tactile feedback devices
US10564721B2 (en) 2009-03-12 2020-02-18 Immersion Corporation Systems and methods for using multiple actuators to realize textures
KR101973918B1 (en) 2009-03-12 2019-04-29 임머숀 코퍼레이션 Systems and methods for providing features in a friction display
US8653785B2 (en) 2009-03-27 2014-02-18 Qualcomm Incorporated System and method of managing power at a portable computing device and a portable computing device docking station
DE102009015991A1 (en) 2009-04-02 2010-10-07 Pi Ceramic Gmbh Keramische Technologien Und Bauelemente Device for generating a haptic feedback of a keyless input unit
US9459734B2 (en) 2009-04-06 2016-10-04 Synaptics Incorporated Input device with deflectable electrode
WO2010119397A2 (en) 2009-04-15 2010-10-21 Koninklijke Philips Electronics N.V. A foldable tactile display
KR101553842B1 (en) 2009-04-21 2015-09-17 엘지전자 주식회사 Mobile terminal providing multi haptic effect and control method thereof
JP5707606B2 (en) 2009-04-22 2015-04-30 株式会社フコク Rotary input device and electronic device
CN103955131B (en) 2009-04-26 2017-04-12 耐克创新有限合伙公司 GPS features and functionality in an athletic watch system
CN102422244A (en) 2009-05-07 2012-04-18 伊梅森公司 Method and apparatus for providing a haptic feedback shape-changing display
US20100313425A1 (en) 2009-06-11 2010-12-16 Christopher Martin Hawes Variable amplitude vibrating personal care device
US20100328229A1 (en) 2009-06-30 2010-12-30 Research In Motion Limited Method and apparatus for providing tactile feedback
US8378797B2 (en) 2009-07-17 2013-02-19 Apple Inc. Method and apparatus for localization of haptic feedback
KR101993848B1 (en) 2009-07-22 2019-06-28 임머숀 코퍼레이션 Interactive touch screen gaming metaphors with haptic feedback across platforms
US8730182B2 (en) 2009-07-30 2014-05-20 Immersion Corporation Systems and methods for piezo-based haptic feedback
US8654524B2 (en) 2009-08-17 2014-02-18 Apple Inc. Housing as an I/O device
US8390594B2 (en) 2009-08-18 2013-03-05 Immersion Corporation Haptic feedback using composite piezoelectric actuator
KR101016208B1 (en) 2009-09-11 2011-02-25 한국과학기술원 Hybrid actuator using vibrator and solenoid, apparatus providing passive haptic feedback using the same, display unit using the same and control method
FR2950166B1 (en) 2009-09-16 2015-07-17 Dav ROTARY CONTROL DEVICE WITH HAPTIC RETURN
US9424444B2 (en) 2009-10-14 2016-08-23 At&T Mobility Ii Llc Systems, apparatus, methods and computer-readable storage media for facilitating integrated messaging, contacts and social media for a selected entity
CN201708677U (en) 2009-10-19 2011-01-12 常州美欧电子有限公司 Flat linear vibration motor
US8262480B2 (en) 2009-11-12 2012-09-11 Igt Touch screen displays with physical buttons for gaming devices
WO2011062910A1 (en) 2009-11-17 2011-05-26 Immersion Corporation Systems and methods for a friction rotary device for haptic feedback
US20110132114A1 (en) 2009-12-03 2011-06-09 Sony Ericsson Mobile Communications Ab Vibration apparatus for a hand-held mobile device, hand-held mobile device comprising the vibration apparatus and method for operating the vibration apparatus
US8633916B2 (en) 2009-12-10 2014-01-21 Apple, Inc. Touch pad with force sensors and actuator feedback
US8773247B2 (en) 2009-12-15 2014-07-08 Immersion Corporation Haptic feedback device using standing waves
US9436280B2 (en) 2010-01-07 2016-09-06 Qualcomm Incorporated Simulation of three-dimensional touch sensation using haptics
JP5385165B2 (en) 2010-01-15 2014-01-08 ホシデン株式会社 Input device
US8493177B2 (en) 2010-01-29 2013-07-23 Immersion Corporation System and method of haptically communicating vehicle information from a vehicle to a keyless entry device
US9870053B2 (en) 2010-02-08 2018-01-16 Immersion Corporation Systems and methods for haptic feedback using laterally driven piezoelectric actuators
US9092056B2 (en) 2010-02-22 2015-07-28 Panasonic Corporation Of North America Keyboard having selectively viewable glyphs
KR101487944B1 (en) 2010-02-24 2015-01-30 아이피플렉 홀딩스 코포레이션 Augmented reality panorama supporting visually imparired individuals
GB201003136D0 (en) 2010-02-24 2010-04-14 Rue De Int Ltd Optically variable security device comprising a coloured cast cured hologram
US9361018B2 (en) 2010-03-01 2016-06-07 Blackberry Limited Method of providing tactile feedback and apparatus
US9535500B2 (en) 2010-03-01 2017-01-03 Blackberry Limited Method of providing tactile feedback and apparatus
JP5847407B2 (en) 2010-03-16 2016-01-20 イマージョン コーポレーションImmersion Corporation System and method for pre-touch and true touch
WO2011116929A1 (en) 2010-03-22 2011-09-29 Fm Marketing Gmbh Input apparatus with haptic feedback
KR101735297B1 (en) 2010-03-30 2017-05-16 (주)멜파스 Panel and device for sensing touch input
WO2011129475A1 (en) 2010-04-16 2011-10-20 엘지이노텍 주식회사 Linear vibrator having a broad bandwidth, and mobile device
JP2011242386A (en) 2010-04-23 2011-12-01 Immersion Corp Transparent compound piezoelectric material aggregate of contact sensor and tactile sense actuator
US20110267294A1 (en) * 2010-04-29 2011-11-03 Nokia Corporation Apparatus and method for providing tactile feedback for user
US20110267181A1 (en) * 2010-04-29 2011-11-03 Nokia Corporation Apparatus and method for providing tactile feedback for user
US20120327006A1 (en) 2010-05-21 2012-12-27 Disney Enterprises, Inc. Using tactile feedback to provide spatial awareness
US8628173B2 (en) 2010-06-07 2014-01-14 Xerox Corporation Electrical interconnect using embossed contacts on a flex circuit
US8836643B2 (en) 2010-06-10 2014-09-16 Qualcomm Incorporated Auto-morphing adaptive user interface device and methods
US9904393B2 (en) 2010-06-11 2018-02-27 3M Innovative Properties Company Positional touch sensor with force measurement
US9086727B2 (en) 2010-06-22 2015-07-21 Microsoft Technology Licensing, Llc Free space directional force feedback apparatus
US8411874B2 (en) 2010-06-30 2013-04-02 Google Inc. Removing noise from audio
US8798534B2 (en) 2010-07-09 2014-08-05 Digimarc Corporation Mobile devices and methods employing haptics
US20120038469A1 (en) 2010-08-11 2012-02-16 Research In Motion Limited Actuator assembly and electronic device including same
KR101187980B1 (en) 2010-08-13 2012-10-05 삼성전기주식회사 Haptic feedback device and electronic device including the same
US8576171B2 (en) 2010-08-13 2013-11-05 Immersion Corporation Systems and methods for providing haptic feedback to touch-sensitive input devices
KR101197861B1 (en) 2010-08-13 2012-11-05 삼성전기주식회사 Haptic feedback actuator, haptic feedback device and electronic device
JP5343946B2 (en) 2010-08-25 2013-11-13 株式会社デンソー Tactile presentation device
FR2964761B1 (en) 2010-09-14 2012-08-31 Thales Sa HAPTIC INTERACTION DEVICE AND METHOD FOR GENERATING HAPTIC AND SOUND EFFECTS
CN103189820B (en) 2010-10-21 2016-04-20 京瓷株式会社 Touch panel equipment
KR101259683B1 (en) 2010-10-27 2013-05-02 엘지이노텍 주식회사 Horizental vibration motor
US20120113008A1 (en) 2010-11-08 2012-05-10 Ville Makinen On-screen keyboard with haptic effects
US8878401B2 (en) 2010-11-10 2014-11-04 Lg Innotek Co., Ltd. Linear vibrator having a trembler with a magnet and a weight
JP6258035B2 (en) 2010-11-18 2018-01-10 グーグル エルエルシー Orthogonal dragging on the scroll bar
US10120446B2 (en) 2010-11-19 2018-11-06 Apple Inc. Haptic input device
WO2012067370A2 (en) 2010-11-19 2012-05-24 (주)하이소닉 Haptic module using piezoelectric element
CN201897778U (en) 2010-11-23 2011-07-13 英业达股份有限公司 Touch pad and electronic display device using same
CN201945951U (en) 2011-01-22 2011-08-24 苏州达方电子有限公司 Soft protecting cover and keyboard
KR101580022B1 (en) 2011-03-04 2015-12-23 애플 인크. Linear vibrator providing localized and generalized haptic feedback
DE102011014763A1 (en) 2011-03-22 2012-09-27 Fm Marketing Gmbh Input device with haptic feedback
US20120249461A1 (en) * 2011-04-01 2012-10-04 Analog Devices, Inc. Dedicated user interface controller for feedback responses
US9448713B2 (en) 2011-04-22 2016-09-20 Immersion Corporation Electro-vibrotactile display
US9557857B2 (en) 2011-04-26 2017-01-31 Synaptics Incorporated Input device with force sensing and haptic response
US9218727B2 (en) 2011-05-12 2015-12-22 Apple Inc. Vibration in portable devices
US8717151B2 (en) 2011-05-13 2014-05-06 Qualcomm Incorporated Devices and methods for presenting information to a user on a tactile output surface of a mobile device
US8681130B2 (en) 2011-05-20 2014-03-25 Sony Corporation Stylus based haptic peripheral for touch screen and tablet devices
JP5459795B2 (en) 2011-06-06 2014-04-02 株式会社ワコム Electronics
US9563274B2 (en) 2011-06-10 2017-02-07 Sri International Adaptable input/output device
US9710061B2 (en) 2011-06-17 2017-07-18 Apple Inc. Haptic feedback device
US8780074B2 (en) 2011-07-06 2014-07-15 Sharp Kabushiki Kaisha Dual-function transducer for a touch panel
US20130016042A1 (en) 2011-07-12 2013-01-17 Ville Makinen Haptic device with touch gesture interface
WO2013021835A1 (en) 2011-08-11 2013-02-14 株式会社村田製作所 Touch panel
WO2013031224A1 (en) 2011-08-30 2013-03-07 京セラ株式会社 Haptic feedback device
US20130076635A1 (en) 2011-09-26 2013-03-28 Ko Ja (Cayman) Co., Ltd. Membrane touch keyboard structure for notebook computers
US8723824B2 (en) 2011-09-27 2014-05-13 Apple Inc. Electronic devices with sidewall displays
US9582178B2 (en) * 2011-11-07 2017-02-28 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US10082950B2 (en) 2011-11-09 2018-09-25 Joseph T. LAPP Finger-mapped character entry systems
CN104169847B (en) 2011-11-18 2019-03-12 森顿斯公司 Localized haptic feedback
US9286907B2 (en) 2011-11-23 2016-03-15 Creative Technology Ltd Smart rejecter for keyboard click noise
US20130154996A1 (en) 2011-12-16 2013-06-20 Matthew Trend Touch Sensor Including Mutual Capacitance Electrodes and Self-Capacitance Electrodes
WO2013099743A1 (en) 2011-12-27 2013-07-04 株式会社村田製作所 Tactile presentation device
EP2618564A1 (en) 2012-01-18 2013-07-24 Harman Becker Automotive Systems GmbH Method for operating a conference system and device for a conference system
JP5942152B2 (en) 2012-01-20 2016-06-29 パナソニックIpマネジメント株式会社 Electronics
US8890824B2 (en) 2012-02-07 2014-11-18 Atmel Corporation Connecting conductive layers using in-mould lamination and decoration
US8872448B2 (en) 2012-02-24 2014-10-28 Nokia Corporation Apparatus and method for reorientation during sensed drop
US9134807B2 (en) 2012-03-02 2015-09-15 Microsoft Technology Licensing, Llc Pressure sensitive key normalization
US9539164B2 (en) 2012-03-20 2017-01-10 Xerox Corporation System for indoor guidance with mobility assistance
US9977499B2 (en) 2012-05-09 2018-05-22 Apple Inc. Thresholds for determining feedback in computing devices
WO2013170099A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Calibration of haptic feedback systems for input devices
KR102124297B1 (en) 2012-06-04 2020-06-19 홈 컨트롤 싱가포르 피티이. 엘티디. User-interface for entering alphanumerical characters
GB201212046D0 (en) 2012-07-06 2012-08-22 Rue De Int Ltd Security devices
US9996199B2 (en) 2012-07-10 2018-06-12 Electronics And Telecommunications Research Institute Film haptic system having multiple operation points
KR101242525B1 (en) * 2012-07-19 2013-03-12 (주)엠투시스 Haptic actuator
US9466783B2 (en) 2012-07-26 2016-10-11 Immersion Corporation Suspension element having integrated piezo material for providing haptic effects to a touch screen
KR101934310B1 (en) 2012-08-24 2019-01-03 삼성디스플레이 주식회사 touch display apparatus sensing touch force
US9116546B2 (en) 2012-08-29 2015-08-25 Immersion Corporation System for haptically representing sensor input
KR101975596B1 (en) 2012-08-29 2019-05-07 삼성전자주식회사 Touch screen device for compensating distortion of input sensing signal
EP2912644A4 (en) 2012-10-23 2016-05-25 Univ New York Somatosensory feedback wearable object
TW201416726A (en) 2012-10-26 2014-05-01 Dongguan Masstop Liquid Crystal Display Co Ltd Color filter substrate having touch-sensing function
US9319150B2 (en) 2012-10-29 2016-04-19 Dell Products, Lp Reduction of haptic noise feedback in system
US9122330B2 (en) 2012-11-19 2015-09-01 Disney Enterprises, Inc. Controlling a user's tactile perception in a dynamic physical environment
KR20140073398A (en) 2012-12-06 2014-06-16 삼성전자주식회사 Display apparatus and method for controlling thereof
US20140168175A1 (en) 2012-12-13 2014-06-19 Research In Motion Limited Magnetically coupling stylus and host electronic device
EP2743798A1 (en) 2012-12-13 2014-06-18 BlackBerry Limited Magnetically coupling stylus and host electronic device
KR20140138224A (en) 2013-01-06 2014-12-03 인텔 코오퍼레이션 A method, apparatus, and system for distributed pre-processing of touch data and display region control
TW201430623A (en) 2013-01-30 2014-08-01 Hon Hai Prec Ind Co Ltd Electronic device and human-computer interaction method
US9024738B2 (en) 2013-02-01 2015-05-05 Blackberry Limited Apparatus, systems and methods for mitigating vibration of an electronic device
US9304587B2 (en) 2013-02-13 2016-04-05 Apple Inc. Force sensing mouse
US9442570B2 (en) 2013-03-13 2016-09-13 Google Technology Holdings LLC Method and system for gesture recognition
US9285905B1 (en) 2013-03-14 2016-03-15 Amazon Technologies, Inc. Actuator coupled device chassis
US20150182163A1 (en) 2013-12-31 2015-07-02 Aliphcom Wearable device to detect inflamation
US9557830B2 (en) 2013-03-15 2017-01-31 Immersion Corporation Programmable haptic peripheral
US9707593B2 (en) 2013-03-15 2017-07-18 uBeam Inc. Ultrasonic transducer
US9405369B2 (en) 2013-04-26 2016-08-02 Immersion Corporation, Inc. Simulation of tangible user interface interactions and gestures using array of haptic cells
US9519346B2 (en) 2013-05-17 2016-12-13 Immersion Corporation Low-frequency effects haptic conversion system
KR20240065191A (en) 2013-06-11 2024-05-14 애플 인크. Wearable electronic device
US9753436B2 (en) * 2013-06-11 2017-09-05 Apple Inc. Rotary input mechanism for an electronic device
US8867757B1 (en) 2013-06-28 2014-10-21 Google Inc. Microphone under keyboard to assist in noise cancellation
US9874980B2 (en) 2013-07-31 2018-01-23 Atmel Corporation Dynamic configuration of touch sensor electrode clusters
KR102231031B1 (en) 2013-08-09 2021-03-23 애플 인크. Tactile switch for an electronic device
EP3340025B1 (en) 2013-09-03 2019-06-12 Apple Inc. User interface for manipulating user interface objects with magnetic properties
US9558637B2 (en) * 2013-09-10 2017-01-31 Immersion Corporation Systems and methods for performing haptic conversion
US9607491B1 (en) 2013-09-18 2017-03-28 Bruce J. P. Mortimer Apparatus for generating a vibrational stimulus using a planar reciprocating actuator
US20150084909A1 (en) 2013-09-20 2015-03-26 Synaptics Incorporated Device and method for resistive force sensing and proximity sensing
WO2015045063A1 (en) 2013-09-26 2015-04-02 富士通株式会社 Drive control apparatus, electronic device, and drive control method
WO2015047343A1 (en) 2013-09-27 2015-04-02 Honessa Development Laboratories Llc Polarized magnetic actuators for haptic response
US10282014B2 (en) 2013-09-30 2019-05-07 Apple Inc. Operating multiple functions in a display of an electronic device
US9921649B2 (en) 2013-10-07 2018-03-20 Immersion Corporation Electrostatic haptic based user input elements
US8860284B1 (en) 2013-10-08 2014-10-14 19th Space Electronics Piezoelectric multiplexer
US10120478B2 (en) 2013-10-28 2018-11-06 Apple Inc. Piezo based force sensing
US20150126070A1 (en) 2013-11-05 2015-05-07 Sony Corporation Apparatus for powering an electronic device in a secure manner
US9514902B2 (en) 2013-11-07 2016-12-06 Microsoft Technology Licensing, Llc Controller-less quick tactile feedback keyboard
CN203630729U (en) 2013-11-21 2014-06-04 联想(北京)有限公司 Glass keyboard
CN105765504A (en) 2013-11-21 2016-07-13 3M创新有限公司 Touch systems and methods employing force direction determination
US9639158B2 (en) 2013-11-26 2017-05-02 Immersion Corporation Systems and methods for generating friction and vibrotactile effects
US20150185842A1 (en) 2013-12-31 2015-07-02 Microsoft Corporation Haptic feedback for thin user interfaces
US9448631B2 (en) 2013-12-31 2016-09-20 Microsoft Technology Licensing, Llc Input device haptics and pressure sensing
US8977376B1 (en) 2014-01-06 2015-03-10 Alpine Electronics of Silicon Valley, Inc. Reproducing audio signals with a haptic apparatus on acoustic headphones and their calibration and measurement
AU2015100011B4 (en) 2014-01-13 2015-07-16 Apple Inc. Temperature compensating transparent force sensor
US9632583B2 (en) 2014-01-21 2017-04-25 Senseg Ltd. Controlling output current for electrosensory vibration
US9396629B1 (en) 2014-02-21 2016-07-19 Apple Inc. Haptic modules with independently controllable vertical and horizontal mass movements
US10203762B2 (en) 2014-03-11 2019-02-12 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
US9594429B2 (en) 2014-03-27 2017-03-14 Apple Inc. Adjusting the level of acoustic and haptic output in haptic devices
CN106133650B (en) 2014-03-31 2020-07-07 索尼公司 Haptic reproduction device, signal generation apparatus, haptic reproduction system, and haptic reproduction method
KR20150118813A (en) 2014-04-15 2015-10-23 삼성전자주식회사 Providing Method for Haptic Information and Electronic Device supporting the same
US10133351B2 (en) 2014-05-21 2018-11-20 Apple Inc. Providing haptic output based on a determined orientation of an electronic device
DE102015209639A1 (en) 2014-06-03 2015-12-03 Apple Inc. Linear actuator
US9886090B2 (en) 2014-07-08 2018-02-06 Apple Inc. Haptic notifications utilizing haptic input devices
US9489049B2 (en) 2014-08-26 2016-11-08 Samsung Electronics Co., Ltd. Force simulation finger sleeve using orthogonal uniform magnetic field
WO2016036671A2 (en) 2014-09-02 2016-03-10 Apple Inc. Haptic notifications
US10656719B2 (en) 2014-09-30 2020-05-19 Apple Inc. Dynamic input surface for electronic devices
WO2016091944A1 (en) 2014-12-09 2016-06-16 Agfa Healthcare System to deliver alert messages from at least one critical service running on a monitored target system to a wearable device
WO2016092313A1 (en) * 2014-12-10 2016-06-16 Hci Viocare Technologies Ltd. Force sensing device
US10915161B2 (en) 2014-12-11 2021-02-09 Intel Corporation Facilitating dynamic non-visual markers for augmented reality on computing devices
US9589432B2 (en) * 2014-12-22 2017-03-07 Immersion Corporation Haptic actuators having programmable magnets with pre-programmed magnetic surfaces and patterns for producing varying haptic effects
US9762236B2 (en) 2015-02-02 2017-09-12 Uneo Inc. Embedded button for an electronic device
EP3250997A1 (en) 2015-03-08 2017-12-06 Apple Inc. User interface using a rotatable input mechanism
WO2016158518A1 (en) 2015-03-27 2016-10-06 Fujifilm Corporation Electroacoustic transducer
US20160293829A1 (en) 2015-04-01 2016-10-06 19th Space Electronics Piezoelectric switch with lateral moving beams
KR20160131275A (en) * 2015-05-06 2016-11-16 LG Electronics Inc. Watch type terminal
US20160334901A1 (en) 2015-05-15 2016-11-17 Immersion Corporation Systems and methods for distributing haptic effects to users interacting with user interfaces
ES2778935T3 (en) 2015-05-28 2020-08-12 Nokia Technologies Oy Rendering a notification on a head-mounted display
US20160379776A1 (en) 2015-06-27 2016-12-29 Intel Corporation Keyboard for an electronic device
DE102015008537A1 (en) * 2015-07-02 2017-01-05 Audi Ag Motor vehicle operating device with haptic feedback
KR102373491B1 (en) 2015-07-15 2022-03-11 Samsung Electronics Co., Ltd. Method for sensing a rotation of rotation member and an electronic device thereof
US20170024010A1 (en) 2015-07-21 2017-01-26 Apple Inc. Guidance device for the sensory impaired
CN108472686B (en) 2015-09-16 2020-05-12 Taction Technology, Inc. Apparatus and method for audio-haptic spatialization of sound and perception of bass
US9886057B2 (en) 2015-09-22 2018-02-06 Apple Inc. Electronic device with enhanced pressure resistant features
WO2017053430A1 (en) * 2015-09-22 2017-03-30 Immersion Corporation Pressure-based haptics
US9990040B2 (en) * 2015-09-25 2018-06-05 Immersion Corporation Haptic CAPTCHA
US20170090655A1 (en) 2015-09-29 2017-03-30 Apple Inc. Location-Independent Force Sensing Using Differential Strain Measurement
US9971407B2 (en) 2015-09-30 2018-05-15 Apple Inc. Haptic feedback for rotary inputs
EP3157266B1 (en) 2015-10-16 2019-02-27 Nxp B.V. Controller for a haptic feedback element
CN105446646B (en) 2015-12-11 2019-01-11 Xiaomi Technology Co., Ltd. Content input method and device based on virtual keyboard, and touch control device
US9875625B2 (en) * 2015-12-18 2018-01-23 Immersion Corporation Systems and methods for multifunction haptic output devices
US9927887B2 (en) * 2015-12-31 2018-03-27 Synaptics Incorporated Localized haptics for two fingers
US10061385B2 (en) 2016-01-22 2018-08-28 Microsoft Technology Licensing, Llc Haptic feedback for a touch input device
US20170249024A1 (en) 2016-02-27 2017-08-31 Apple Inc. Haptic mouse
US10025399B2 (en) * 2016-03-16 2018-07-17 Lg Electronics Inc. Watch type mobile terminal and method for controlling the same
JP6681765B2 (en) 2016-03-29 2020-04-15 Japan Display Inc. Detector
US10373381B2 (en) 2016-03-30 2019-08-06 Microsoft Technology Licensing, Llc Virtual object manipulation within physical environment
US10209821B2 (en) 2016-04-05 2019-02-19 Google Llc Computing devices having swiping interfaces and methods of operating the same
KR102625859B1 (en) 2016-04-19 2024-01-17 Samsung Display Co., Ltd. Display, electronic watch having the same and electronic device having the same
KR102498502B1 (en) 2016-04-20 2023-02-13 Samsung Electronics Co., Ltd. Cover device and electronic device including the cover device
US10585480B1 (en) 2016-05-10 2020-03-10 Apple Inc. Electronic device with an input device having a haptic engine
US10078483B2 (en) 2016-05-17 2018-09-18 Google Llc Dual screen haptic enabled convertible laptop
US9829981B1 (en) * 2016-05-26 2017-11-28 Apple Inc. Haptic output device
WO2017212630A1 (en) 2016-06-10 2017-12-14 Mitsubishi Electric Corporation Vehicle air-conditioning device and obstruction detection system for vehicle air-conditioning device
US20170357325A1 (en) 2016-06-14 2017-12-14 Apple Inc. Localized Deflection Using a Bending Haptic Actuator
US20170364158A1 (en) 2016-06-20 2017-12-21 Apple Inc. Localized and/or Encapsulated Haptic Actuators and Elements
US10649529B1 (en) 2016-06-28 2020-05-12 Apple Inc. Modification of user-perceived feedback of an input device using acoustic or haptic output
US20180005496A1 (en) * 2016-07-01 2018-01-04 Intel Corporation Distributed haptics for wearable electronic devices
US20180015362A1 (en) 2016-07-13 2018-01-18 Colopl, Inc. Information processing method and program for executing the information processing method on computer
US10845878B1 (en) 2016-07-25 2020-11-24 Apple Inc. Input device with tactile feedback
JP2018019065A (en) 2016-07-27 2018-02-01 Moda-Innochips Co., Ltd. Piezoelectric vibration device and electronic apparatus including the same
US10152182B2 (en) 2016-08-11 2018-12-11 Microsoft Technology Licensing, Llc Touch sensor having jumpers
US10397686B2 (en) 2016-08-15 2019-08-27 Bragi GmbH Detection of movement adjacent an earpiece device
WO2018038367A1 (en) 2016-08-26 2018-03-01 HiDeep Inc. Touch input device including display panel having strain gauge, and method for manufacturing display panel having strain gauge
US10372214B1 (en) 2016-09-07 2019-08-06 Apple Inc. Adaptable user-selectable input area in an electronic device
US10122184B2 (en) 2016-09-15 2018-11-06 Blackberry Limited Application of modulated vibrations in docking scenarios
US10591993B2 (en) 2016-09-21 2020-03-17 Apple Inc. Haptic structure for providing localized haptic output
KR102564349B1 (en) 2016-09-30 2023-08-04 LG Display Co., Ltd. Organic light emitting display apparatus
US10346117B2 (en) 2016-11-09 2019-07-09 Microsoft Technology Licensing, Llc Device having a screen region on a hinge coupled between other screen regions
CN206339935U (en) 2016-11-16 2017-07-18 Gansu Industry Polytechnic College Keyboard with touch pad
US10037660B2 (en) 2016-12-30 2018-07-31 Immersion Corporation Flexible haptic actuator
US10437359B1 (en) 2017-02-28 2019-10-08 Apple Inc. Stylus with external magnetic influence
US10110986B1 (en) 2017-03-28 2018-10-23 Motorola Mobility Llc Haptic feedback for head-wearable speaker mount such as headphones or earbuds to indicate ambient sound
US10032550B1 (en) 2017-03-30 2018-07-24 Apple Inc. Moving-coil haptic actuator for electronic devices
US20180284894A1 (en) 2017-03-31 2018-10-04 Intel Corporation Directional haptics for immersive virtual reality
KR101886683B1 (en) 2017-05-22 2018-08-09 HiDeep Inc. Touch input apparatus including light block layer and method for making the same
IT201700072559A1 (en) 2017-06-28 2017-09-28 Trama S R L Haptic interface
CN207115337U (en) 2017-07-04 2018-03-16 Huizhou TCL Mobile Communication Co., Ltd. Keyboard with touch panel and electronic device
US10622538B2 (en) 2017-07-18 2020-04-14 Apple Inc. Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body
US10775889B1 (en) 2017-07-21 2020-09-15 Apple Inc. Enclosure with locally-flexible regions
US10768747B2 (en) 2017-08-31 2020-09-08 Apple Inc. Haptic realignment cues for touch-input displays
US11054932B2 (en) 2017-09-06 2021-07-06 Apple Inc. Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module
US10556252B2 (en) 2017-09-20 2020-02-11 Apple Inc. Electronic device having a tuned resonance haptic actuation system
US10768738B1 (en) 2017-09-27 2020-09-08 Apple Inc. Electronic device having a haptic actuator with magnetic augmentation
US10235849B1 (en) 2017-12-22 2019-03-19 Immersion Corporation Haptic delivery cluster for providing a haptic effect
US11073712B2 (en) 2018-04-10 2021-07-27 Apple Inc. Electronic device display for through-display imaging
US10942571B2 (en) 2018-06-29 2021-03-09 Apple Inc. Laptop computing device with discrete haptic regions
US10936071B2 (en) 2018-08-30 2021-03-02 Apple Inc. Wearable electronic device with haptic rotatable input
US11188151B2 (en) 2018-09-25 2021-11-30 Apple Inc. Vibration driven housing component for audio reproduction, haptic feedback, and force sensing
US10966007B1 (en) 2018-09-25 2021-03-30 Apple Inc. Haptic output system
US10599223B1 (en) 2018-09-28 2020-03-24 Apple Inc. Button providing force sensing and/or haptic output
US10691211B2 (en) 2018-09-28 2020-06-23 Apple Inc. Button providing force sensing and/or haptic output
US11024135B1 (en) 2020-06-17 2021-06-01 Apple Inc. Portable electronic device having a haptic button assembly

Also Published As

Publication number Publication date
US20200233495A1 (en) 2020-07-23
US20210157411A1 (en) 2021-05-27
US11762470B2 (en) 2023-09-19
US10890978B2 (en) 2021-01-12
US10585480B1 (en) 2020-03-10

Similar Documents

Publication Title
US11762470B2 (en) Electronic device with an input device having a haptic engine
US10936071B2 (en) Wearable electronic device with haptic rotatable input
AU2020281126B2 (en) Rotary input mechanism for an electronic device
JP6903726B2 (en) Input device with force sensor
US9971407B2 (en) Haptic feedback for rotary inputs
JP6878502B2 (en) Devices and methods that provide localized haptic effects on the display screen
US10372214B1 (en) Adaptable user-selectable input area in an electronic device
US20170108931A1 (en) Multiple mode haptic feedback system
US9772688B2 (en) Haptic feedback assembly
US10976824B1 (en) Reluctance haptic engine for an electronic device
US11150731B2 (en) Multi-modal haptic feedback for an electronic device using a single haptic actuator
JP7210603B2 (en) Systems and methods for detecting and responding to touch input using haptic feedback
KR20050044412A (en) Method and device for generating feedback
US10312039B2 (en) Generator button for electronic devices
US11809631B2 (en) Reluctance haptic engine for an electronic device
US12073710B2 (en) Portable electronic device having a haptic button assembly
WO2017048364A1 (en) Force feedback surface for an electronic device
US10698489B1 (en) Compact pivoting input device
EP3454181A1 (en) Haptic actuation systems for a touch surface
US11803243B2 (en) Electronic device having a haptic device with an actuation member and a restoration mechanism
US11099635B2 (en) Blow event detection and mode switching with an electronic device

Legal Events

Code Title Description
STPP Information on status: patent application and granting procedure in general; Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION