WO2017051263A2 - Robot arm for testing of touchscreen applications - Google Patents

Robot arm for testing of touchscreen applications

Info

Publication number
WO2017051263A2
WO2017051263A2 (PCT/IB2016/053292)
Authority
WO
WIPO (PCT)
Prior art keywords
stylus
robotic arm
touch
touchscreen
linear actuator
Prior art date
Application number
PCT/IB2016/053292
Other languages
French (fr)
Other versions
WO2017051263A3 (en)
Inventor
Achu WILSON
Aronin P
Akhil A
Original Assignee
Sastra Robotics India Private Limited
Priority date
Filing date
Publication date
Application filed by Sastra Robotics India Private Limited filed Critical Sastra Robotics India Private Limited
Publication of WO2017051263A2
Publication of WO2017051263A3

Links

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01R: MEASURING ELECTRIC VARIABLES; MEASURING MAGNETIC VARIABLES
    • G01R 1/00: Details of instruments or arrangements of the types included in groups G01R 5/00 - G01R 13/00 and G01R 31/00
    • G01R 1/02: General constructional details
    • G01R 1/06: Measuring leads; Measuring probes
    • G01R 1/067: Measuring probes
    • G01R 1/06705: Apparatus for holding or moving single probes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/22: Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
    • G06F 11/2205: Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing using arrangements specific to the hardware being tested
    • G06F 11/2221: Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing using arrangements specific to the hardware being tested to test input/output devices or peripheral units

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • User Interface Of Digital Computer (AREA)
  • Manipulator (AREA)

Abstract

The present invention pertains to a robotic arm to test the functionality of a touchscreen panel of a computing device. It consists of a stylus which is adapted to move in three-dimensional space for emulating various touch-based movements on the touchscreen panel to provide commands to the computing device, and the stylus further comprises a stylus tip. One or more rotating motors are adapted to move the stylus in a plane, and a linear actuator moves the stylus along an axis perpendicular to the plane.

Description

Title of Invention
Robot Arm for Testing of Touchscreen Applications

Field of Invention
The present invention relates to a robotic arm; more specifically, it relates to a robotic arm for testing the functionality of a touchscreen panel of a computing device.

Background of Invention
There is a constant increase in touch screen technologies across various devices. The touch screen display and touch screen are among the main user interfaces in many electronic devices, including smart phones, personal computers, ATMs, gaming equipment, medical equipment, voting equipment, and other machinery, and proper functionality is essential in order to guarantee the best user experience for the end users. It has therefore become essential for touch screen and device manufacturers to test touch screen functionality. Display and touch screen testing is often performed manually, which results in uneven quality and lengthened testing time. Testing displays and touch screens with an automated system offers valuable benefits: automated testing guarantees continuous and steady quality, fast and efficient test cycles, and comprehensive defect coverage during a single test phase.
Object of the Invention
The object of the present invention is to provide a robotic arm to test the functionality of a touchscreen panel of a computing device.
Summary of Invention
The present invention pertains to a robotic arm to test the functionality of a touchscreen panel of a computing device. It consists of a stylus which is adapted to move in three-dimensional space for emulating various touch-based movements on the touchscreen panel to provide commands to the computing device, and the stylus further comprises a stylus tip. One or more rotating motors move the stylus in a plane, and a linear actuator moves the stylus along an axis perpendicular to the plane.
According to one of the embodiments of the robotic arm, the stylus tip is replaceable. According to a further embodiment of the robotic arm, the stylus tip has at least a shape, a size, or a material appropriate for emulating human touch as required by the touchscreen panel of the computing device, wherein the material of the stylus tip can be at least one of cotton, leather, resin, polymer, or a combination thereof.
According to a preferred embodiment of the robotic arm, it consists of more than one stylus tip which will perform simultaneous touches on the touchscreen panel.
According to yet another embodiment of the robotic arm, the linear actuator moves the stylus and further applies precise and controllable force onto the stylus for emulating a press by a human.
According to another embodiment of the robotic arm, the linear actuator controls the speed of the stylus for emulating a movement by a human. According to a further embodiment of the robotic arm, it consists of a base system which controls the functionality of the robotic arm, and the base system also comprises a controller which controls the motion of the robotic arm, the rotating motor, and the linear actuator.
As per another embodiment of the robotic arm, the base system further has a test setup which calibrates a position of the robot arm with respect to the touchscreen panel of a computing device.
According to yet another embodiment of the robotic arm, the controller further determines the path of the stylus according to the test setup, and the robotic arm effectuates at least one of the rotating motor or the linear actuator to move the stylus according to the determined path.
According to a preferred embodiment of the robotic arm, it consists of a camera which provides visual feedback.

Brief Description of the Drawings
FIG. 1 schematically illustrates an isometric view of the robot.

Detailed Description
The best and other modes for carrying out the present invention are presented in terms of the embodiments, herein depicted in FIG. 1. The embodiments are described herein for illustrative purposes and are subject to many variations. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but are intended to cover the application or implementation without departing from the spirit or scope of the present invention. Further, it is to be understood that the phraseology and terminology employed herein are for the purpose of the description and should not be regarded as limiting. Any heading utilized within this description is for convenience only and has no legal or limiting effect.
The terms "a" and "an" herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. The present invention relates to a robotic arm which can make human like touch and swipe actions on touchscreen panels. The robotic arm is applicable to various fields like testing of touchscreen applications, testing and calibration of touch screen response, latency etc. It can be used for repeatable testing of touchscreen enabled devices like smartphones, tablet computers, car infotainment systems etc.
As illustrated in FIG. 1, the invention is based on a robotic arm to test the functionality of a touchscreen panel of a computing device. It consists of a stylus (8) which is adapted to move in three-dimensional space for emulating various touch-based movements on the touchscreen panel to provide commands to the computing device. This gives the robotic arm three degrees of freedom, allowing the stylus to be positioned along all three axes. One or more rotating motors (4, 6) are adapted to move the stylus (8) in a plane, and a linear actuator (2) is adapted to move the stylus (8) along an axis perpendicular to the plane. In other words, two axes of the robot are rotary joints, with which the robot can reach any point over the plane of the touchscreen, while the third axis moves linearly, allowing the stylus to touch the screen. This in turn aids in emulating the touch and swipe actions of a human.
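The arrangement described above, two rotary joints covering the touchscreen plane plus one linear vertical axis, is essentially a SCARA-style kinematic chain. The following is a minimal forward-kinematics sketch of that geometry, given for illustration only; the link lengths, the `Pose` type, and the function names are assumptions made for the example and are not specified in the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float  # mm, in the robot base frame
    y: float  # mm, in the robot base frame
    z: float  # mm, height of the stylus tip above the touchscreen plane

def forward_kinematics(theta1: float, theta2: float, z: float,
                       l1: float = 200.0, l2: float = 150.0) -> Pose:
    """Stylus-tip pose for the two rotary joint angles (radians) and the
    linear-axis extension z. Link lengths l1 and l2 are placeholder values."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return Pose(x, y, z)

# Example: both joints at 45 degrees, stylus 5 mm above the screen.
tip = forward_kinematics(math.radians(45), math.radians(45), 5.0)
```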
The stylus (8) further comprises a stylus tip (10), which is replaceable. The stylus (8) can accommodate tips of different sizes, shapes, and materials. For example, each finger on a person's hand is a different size and therefore has a different contact surface area: the little finger has a smaller contact surface and the thumb a larger one. Another aspect relates to people of different ages and physiques; finger size varies from person to person. Even within the same age group, some people may have a larger finger contact surface because they are taller or heavier, and some a smaller one because they are shorter or thinner. To accommodate all sizes of finger contact surface, different sizes of stylus tips (10) are used, and different types of stylus tip (10) can be attached to the stylus (8).
Some touchscreen panel technologies may require tips of different sizes and material properties. In some cases, touchscreen panel devices are operated with gloved hands. The material of the glove ranges over many fabric types: cotton, leather, resins, polymers, and so on. Such materials can either be added on top of the stylus tips (10) or be used to make a custom stylus tip (10) of a particular fabric. With such modularity, tips can be chosen for use with any type of touchscreen panel.
According to one of the preferred embodiments, the robotic arm also has a provision for more than one stylus tip (10), which can perform simultaneous touches on the touchscreen panel. Some touchscreen panel gestures involve two-finger or multi-finger touch. Such multi-finger gestures can be emulated using different types of stylus tips (10) having differently shaped contact surfaces. The multi-touch functionality can also be implemented by using multiple tips.
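As one illustration of how simultaneous touches might be scripted when two tips are fitted, the short sketch below derives the two contact points of a pinch gesture from its centre, spread, and orientation. This geometry is an assumption made for the example; the patent does not prescribe how multi-touch gestures are computed.

```python
import math

def pinch_contacts(center_x: float, center_y: float,
                   spread: float, angle: float):
    """Return the two contact points (in screen pixels) for a pinch gesture
    centred at (center_x, center_y), with the given tip spread and axis angle."""
    dx = 0.5 * spread * math.cos(angle)
    dy = 0.5 * spread * math.sin(angle)
    return (center_x - dx, center_y - dy), (center_x + dx, center_y + dy)

# Example: a 300-pixel-wide horizontal pinch about the screen centre.
p1, p2 = pinch_contacts(540, 960, spread=300, angle=0.0)
```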
The linear axis, which may also be called the linear actuator (2), is adapted to move the stylus (8) and further to apply a precise and controllable force onto the stylus (8) for emulating a press by a human. The linear actuator (2) can be set up for a particular pressing force; in other words, the amount of force exerted on the touchscreen panel while the tip is in contact with it can be determined. Each touch gesture can be set to have the same or a different pressing force, which can mimic a user who touches the touchscreen panel either hard or lightly. The maximum and minimum force that can be exerted on the touchscreen panel depends on the model of the linear actuator (2) used.
The linear actuator (2) is also adapted to control the speed of the stylus (8) for emulating a movement by a human. The linear actuator (2) can be set up for a particular speed of movement; that is, the speed at which it moves towards and away from the touchscreen panel can be determined. The forward and reverse strokes of the linear actuator (2) can be set to the same or different speeds. Coupled with the physical properties and design of the stylus tip (10), this can mimic a user who touches fast or slowly. The maximum and minimum speed with which the linear actuator (2) can move towards or away from the touchscreen panel depends on the model of the linear actuator (2) used.
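Press force, stroke speeds, and contact duration are therefore per-gesture parameters bounded by the chosen actuator model. Below is a minimal configuration sketch of such parameters; the `TouchPress` structure, field names, and limit values are illustrative assumptions and do not come from the patent.

```python
from dataclasses import dataclass

# Illustrative working range; real limits depend on the linear actuator model used.
MIN_FORCE_N = 0.1
MAX_FORCE_N = 5.0
MAX_SPEED_MM_S = 100.0

@dataclass
class TouchPress:
    force_n: float            # force applied while the tip is in contact
    approach_mm_s: float      # forward stroke speed, towards the screen
    retract_mm_s: float       # reverse stroke speed, away from the screen
    duration_s: float = 0.1   # contact time; a longer value emulates a long press

    def clamped(self) -> "TouchPress":
        """Clamp the requested values to the actuator's working range."""
        return TouchPress(
            force_n=min(max(self.force_n, MIN_FORCE_N), MAX_FORCE_N),
            approach_mm_s=min(self.approach_mm_s, MAX_SPEED_MM_S),
            retract_mm_s=min(self.retract_mm_s, MAX_SPEED_MM_S),
            duration_s=self.duration_s,
        )

# Example: a light, slow press with a quick retract, as a light human tap might feel.
light_tap = TouchPress(force_n=0.3, approach_mm_s=20.0, retract_mm_s=80.0).clamped()
```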
The robotic arm consists of a base system (12) which controls the functionality of the robotic arm; the base system further comprises a controller which controls the motion of the robotic arm, the rotating motor(s) (4, 6), and the linear actuator (2). It has a test setup which calibrates the position of the robot arm with respect to the touchscreen panel of a computing device. The robot can be mounted in horizontal, vertical, or even tilted planes, so that devices with touch screens in any orientation can be tested. Prior to initializing the test procedure, the test setup may be calibrated to establish a fixed coordinate system between the robot and the touchscreen panel. Once the calibration is done, the robot can move to any point on the touchscreen, defined by its pixel coordinates, to make a touch or swipe action. The calibration data is stored in non-volatile memory inside the controller, so that recalibration is required only when the physical mounting of the robot relative to the touchscreen panel under test is changed.
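One common way to realise such a calibration is to touch a few known reference points on the screen and fit an affine map from touchscreen pixel coordinates to robot plane coordinates. The sketch below shows that idea with a least-squares fit; the use of NumPy, the three-point procedure, and the sample values are assumptions for illustration, since the patent does not specify the calibration mathematics.

```python
import numpy as np

def fit_pixel_to_robot(pixel_pts, robot_pts):
    """Fit an affine transform mapping (px, py) pixels to (x, y) robot-plane
    coordinates from at least three non-collinear calibration touches."""
    P = np.array([[px, py, 1.0] for px, py in pixel_pts])  # N x 3
    R = np.array(robot_pts, dtype=float)                   # N x 2
    X, *_ = np.linalg.lstsq(P, R, rcond=None)              # solves P @ X = R
    return X.T                                             # 2 x 3 affine matrix

def pixel_to_robot(A, px, py):
    """Map a touchscreen pixel coordinate to robot plane coordinates."""
    return A @ np.array([px, py, 1.0])

# Example calibration with three reference touches (all values illustrative).
A = fit_pixel_to_robot(
    pixel_pts=[(0, 0), (1080, 0), (0, 1920)],
    robot_pts=[(120.0, 40.0), (200.0, 40.0), (120.0, 180.0)],
)
target = pixel_to_robot(A, 540, 960)  # screen centre, expressed in robot millimetres
```

The fitted matrix is the kind of data that could be stored in the controller's non-volatile memory, so recalibration is needed only after the mounting changes.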
The controller of the robotic arm determines the path of the stylus (8) according to the test setup, and the robotic arm effectuates at least one of the rotating motor(s) (4, 6) or the linear actuator (2) to move the stylus according to the determined path. The controller of the test setup accepts motion commands from an external PC or a USB memory drive and controls the motion of the robot. Recurring test patterns can be programmed onto a USB drive and then plugged into the robot controller, so that a dedicated external PC can be avoided. Once the system is calibrated, commands can be sent from the external PC to make a touch or swipe operation on an interactive element such as a button or slider on the touchscreen. The controller then calculates the inverse-kinematics solution and plans the path from the current position of the stylus (8) to the requested touch position. The duration of the touch can also be commanded from the PC, thereby emulating the short-press and long-press functionality of the same element in some touch screen applications.
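For the two rotary joints described above, the inverse-kinematics step reduces to the standard two-link planar solution. A hedged sketch follows; the link lengths and the elbow-up branch are assumptions, and a real controller would add joint limits, path interpolation, and error handling.

```python
import math

def inverse_kinematics(x: float, y: float,
                       l1: float = 200.0, l2: float = 150.0):
    """Joint angles (theta1, theta2) in radians that place the stylus at (x, y)
    in the robot base frame. Raises ValueError if the target is out of reach.
    Link lengths are placeholders; the elbow-up solution is chosen arbitrarily."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target outside the arm's workspace")
    theta2 = math.acos(c2)  # elbow-up branch
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

A touch command from the PC would then chain the calibration map, this inverse-kinematics step, and a timed linear-actuator stroke whose contact duration distinguishes a short press from a long press.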
The robotic arm further consists of a camera to provide visual feedback. A camera can be mounted on the robot to provide visual feedback of the touchscreen device under test. The images from the camera can then be processed using optical character recognition to recognize the text on buttons and plan the next touch action. Image recognition can also be used to recognize elements that are not text.
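As a hedged illustration of the visual-feedback loop, the sketch below uses OpenCV and Tesseract (through pytesseract) to locate a button by its label and return its pixel centre, which could then be fed through the calibration map above. The patent does not name these libraries; their use here is an assumption.

```python
import cv2
import pytesseract

def find_button_center(image_path: str, label: str):
    """Return the pixel centre of the first OCR word matching `label`,
    or None if the label is not found in the camera frame."""
    img = cv2.imread(image_path)
    data = pytesseract.image_to_data(img, output_type=pytesseract.Output.DICT)
    for i, word in enumerate(data["text"]):
        if word.strip().lower() == label.lower():
            cx = data["left"][i] + data["width"][i] // 2
            cy = data["top"][i] + data["height"][i] // 2
            return cx, cy
    return None

# Example: locate the "OK" button in the latest camera frame, then convert its
# pixel centre to robot coordinates and command a touch.
center = find_button_center("frame.png", "OK")
```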
The above-mentioned description represents merely the exemplary embodiment of the present disclosure, without any intention to limit the scope of the present disclosure thereto. Various equivalent changes, alterations or modifications based on the claims of present disclosure are all consequently viewed as being embraced by the scope of the present disclosure.

Claims

We Claim:
1. A robotic arm to test functionality of a touchscreen panel of a computing device, comprising:
a stylus adapted to move in three-dimensional space for emulating various touch-based movements on the touchscreen panel to provide commands to the computing device, wherein the stylus further comprises a stylus tip;
one or more rotating motors adapted to move the stylus in a plane; and
a linear actuator adapted to move the stylus along an axis perpendicular to the plane.
2. The robotic arm according to claim 1, wherein the stylus tip is replaceable.
3. The robotic arm according to any of claims 1 or 2, wherein the stylus tip has at least a shape, a size, or a material appropriate for emulating human touch as required by the touchscreen panel of the computing device, wherein the material of the stylus tip can be at least one of cotton, leather, resin, polymer, or a combination thereof.
4. The robotic arm according to any of claims 1 to 3, comprising more than one stylus tip adapted to perform simultaneous touches on the touchscreen panel.
5. The robotic arm according to any of claims 1 to 4, wherein the linear actuator is further adapted to move the stylus and to apply a precise and controllable force onto the stylus for emulating a press by a human.
6. The robotic arm according to any of claims 1 to 5, wherein the linear actuator is further adapted to control the speed of the stylus for emulating a movement by a human.
7. The robotic arm according to any of claims 1 to 6, comprising a base system adapted to control functionality of the robotic arm, wherein the base system further comprises a controller adapted to control the motion of the robotic arm, the rotating motor, and the linear actuator.
8. The robotic arm according to claim 7, wherein the base system further comprises a test setup adapted to calibrate a position of the robot arm with respect to the touchscreen panel of the computing device.
9. The robotic arm according to claim 8, wherein the controller is further adapted to determine the path of the stylus according to the test setup, and the robotic arm is adapted to effectuate at least one of the rotating motor or the linear actuator to move the stylus according to the determined path.
PCT/IB2016/053292 2015-06-04 2016-06-04 Robot arm for testing of touchscreen applications WO2017051263A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN2801CH2015 2015-06-04
IN2801/CHE/2015 2015-06-04

Publications (2)

Publication Number Publication Date
WO2017051263A2 true WO2017051263A2 (en) 2017-03-30
WO2017051263A3 WO2017051263A3 (en) 2018-03-29

Family

ID=58386267

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/053292 WO2017051263A2 (en) 2015-06-04 2016-06-04 Robot arm for testing of touchscreen applications

Country Status (1)

Country Link
WO (1) WO2017051263A2 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107228996A (en) * 2017-06-15 2017-10-03 北京日扬弘创科技有限公司 test device
WO2017051263A3 (en) * 2015-06-04 2018-03-29 Sastra Robotics India Private Limited Robot arm for testing of touchscreen applications
FR3086406A1 (en) * 2018-09-24 2020-03-27 Ponant Technologies AUTOMATED, NON-INTRUSIVE TEST BENCH FOR MECHANICAL AND / OR SOFTWARE AND / OR VISUAL AND / OR SOUND MAN-MACHINE INTERFACE TESTING OF A DEVICE / EQUIPMENT
CN114052915A (en) * 2021-11-02 2022-02-18 武汉联影智融医疗科技有限公司 Method and system for testing positioning accuracy of surgical robot and mold body
CN114393576A (en) * 2021-12-27 2022-04-26 江苏明月智能科技有限公司 Four-axis mechanical arm clicking and position calibrating method and system based on artificial intelligence
CN114754677A (en) * 2022-04-14 2022-07-15 平方和(北京)科技有限公司 Device and method for automatic accurate positioning in touch screen and touch pen test equipment
DE102021101621A1 (en) 2021-01-26 2022-07-28 Valeo Schalter Und Sensoren Gmbh Device and method for checking an operating device with a touch-sensitive control panel
EP4092536A1 (en) 2021-05-19 2022-11-23 Leonardo S.p.a. Testing system for a touchscreen device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109459627B (en) * 2018-09-18 2020-01-21 华中科技大学 Device and method for testing robotized touch control pen

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9652077B2 (en) * 2010-12-09 2017-05-16 T-Mobile Usa, Inc. Touch screen testing platform having components for providing conductivity to a tip
US20120280934A1 (en) * 2011-05-04 2012-11-08 Apple Inc. Simulating Single and Multi-Touch Events for Testing A Touch Panel
US9481084B2 (en) * 2012-06-22 2016-11-01 Microsoft Technology Licensing, Llc Touch quality test robot
US9317147B2 (en) * 2012-10-24 2016-04-19 Microsoft Technology Licensing, Llc. Input testing tool
US10261611B2 (en) * 2012-12-03 2019-04-16 Apkudo, Llc System and method for objectively measuring user experience of touch screen based devices
WO2017051263A2 (en) * 2015-06-04 2017-03-30 Sastra Robotics India Private Limited Robot arm for testing of touchscreen applications

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017051263A3 (en) * 2015-06-04 2018-03-29 Sastra Robotics India Private Limited Robot arm for testing of touchscreen applications
CN107228996A (en) * 2017-06-15 2017-10-03 北京日扬弘创科技有限公司 test device
FR3086406A1 (en) * 2018-09-24 2020-03-27 Ponant Technologies AUTOMATED, NON-INTRUSIVE TEST BENCH FOR MECHANICAL AND / OR SOFTWARE AND / OR VISUAL AND / OR SOUND MAN-MACHINE INTERFACE TESTING OF A DEVICE / EQUIPMENT
WO2020064653A1 (en) * 2018-09-24 2020-04-02 Ponant Technologies Non-intrusive automated test bench, intended to perform mechanical and/or software and/or visual and/or audio tests on the human-machine interface of an apparatus/device
DE102021101621A1 (en) 2021-01-26 2022-07-28 Valeo Schalter Und Sensoren Gmbh Device and method for checking an operating device with a touch-sensitive control panel
EP4092536A1 (en) 2021-05-19 2022-11-23 Leonardo S.p.a. Testing system for a touchscreen device
CN114052915A (en) * 2021-11-02 2022-02-18 武汉联影智融医疗科技有限公司 Method and system for testing positioning accuracy of surgical robot and mold body
CN114052915B (en) * 2021-11-02 2023-11-21 武汉联影智融医疗科技有限公司 Method, system and die body for testing positioning accuracy of surgical robot
CN114393576A (en) * 2021-12-27 2022-04-26 江苏明月智能科技有限公司 Four-axis mechanical arm clicking and position calibrating method and system based on artificial intelligence
CN114754677A (en) * 2022-04-14 2022-07-15 平方和(北京)科技有限公司 Device and method for automatic accurate positioning in touch screen and touch pen test equipment
CN114754677B (en) * 2022-04-14 2022-10-14 平方和(北京)科技有限公司 Device and method for automatic accurate positioning in touch screen and touch pen test equipment

Also Published As

Publication number Publication date
WO2017051263A3 (en) 2018-03-29

Similar Documents

Publication Publication Date Title
WO2017051263A2 (en) Robot arm for testing of touchscreen applications
US9652077B2 (en) Touch screen testing platform having components for providing conductivity to a tip
US8996166B2 (en) Touch screen testing platform
EP2987063B1 (en) Virtual tools for use with touch-sensitive surfaces
JP6539816B2 (en) Multi-modal gesture based interactive system and method using one single sensing system
EP3557379A1 (en) Systems, devices and methods for providing immersive reality interface modes
US20190073112A1 (en) Dynamic user interactions for display control and measuring degree of completeness of user gestures
Gustafson et al. Imaginary phone: learning imaginary interfaces by transferring spatial memory from a familiar device
US20200310561A1 (en) Input device for use in 2d and 3d environments
WO2015200025A1 (en) Touch screen testing platform having components for providing conductivity to a tip
US9501147B2 (en) Haptic device incorporating stretch characteristics
US20160378293A1 (en) System and methods for touch target presentation
US20130343607A1 (en) Method for touchless control of a device
US20140253446A1 (en) Mechanical Actuator Apparatus for a Touchscreen
US10372223B2 (en) Method for providing user commands to an electronic processor and related processor program and electronic circuit
JP5374564B2 (en) Drawing apparatus, drawing control method, and drawing control program
CN105183091A (en) Electronic device and information processing method
Matlani et al. Virtual mouse using hand gestures
Wilson et al. Flowmouse: A computer vision-based pointing and gesture input device
Olafsdottir et al. Multi-touch gestures for discrete and continuous control
KR101360980B1 (en) Writing utensil-type electronic input device
US20110221690A1 (en) Input device and pointing device
KR20140106996A (en) Method and apparatus for providing haptic
CN107367966B (en) Man-machine interaction method and device
CN104461369A (en) Operational input method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16848216

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16848216

Country of ref document: EP

Kind code of ref document: A2

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 07/06/2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16848216

Country of ref document: EP

Kind code of ref document: A2