CN105511780A - Test method and device - Google Patents

Test method and device

Info

Publication number
CN105511780A
CN105511780A
Authority
CN
China
Prior art keywords
gesture
terminal
event
coordinate
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510846499.2A
Other languages
Chinese (zh)
Inventor
于淼
周秀虎
张文瓅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Technology Co Ltd
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaomi Inc filed Critical Xiaomi Inc
Priority to CN201510846499.2A priority Critical patent/CN105511780A/en
Publication of CN105511780A publication Critical patent/CN105511780A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a test method and device. When applied to a first terminal, the test method comprises: receiving a gesture input by a user on a touchscreen; obtaining gesture description information corresponding to the gesture; and sending the gesture description information to a second terminal, so that the second terminal simulates the track of the gesture on its own touchscreen according to the gesture description information. According to the invention, a first terminal with a touchscreen is used as the master control device. The user (i.e., the tester) no longer inputs coordinate points, but instead inputs the gesture directly on the touchscreen. Compared with inputting coordinate points via a mouse and keyboard, inputting the gesture directly on the master control device is intuitive and clear, is convenient to operate, and greatly improves testing efficiency. Moreover, the input track is no longer confined to straight lines or polylines: complicated curved gestures such as pattern unlocks and game gestures can be simulated, which greatly increases the test depth.

Description

Test method and device
Technical field
The present disclosure relates to the field of automated testing, and in particular to a test method and device.
Background
Terminal devices with touchscreens, such as mobile phones, usually need to be tested before leaving the factory. To improve testing efficiency and reduce testing cost, the industry generally adopts automated remote, batch testing of phones. In automated testing, simulating single-point or multi-point touch gestures is an essential step.
In the related art, the phone under test is connected to a computer via a data cable. The tester generates coordinate points, such as a start point and an end point, by operating a mouse and keyboard on the computer. The computer then converts these coordinate points into a simulated track, such as a straight line or a polyline, and transfers it to the phone under test, which converts the track into a corresponding touchscreen gesture operation and displays it on the screen, thereby testing the touch function of the phone's screen.
Summary of the invention
To overcome the problems existing in the related art, the present disclosure provides a test method and device, so as to improve the testing efficiency and test depth when testing the gesture touch function of a terminal device.
According to a first aspect of the embodiments of the present disclosure, a test method is provided. The method is applied to a first terminal and comprises:
receiving a gesture input by a user on a touchscreen;
obtaining gesture description information corresponding to the gesture;
sending the gesture description information to a second terminal, so that the second terminal simulates the track of the gesture on the touchscreen of the second terminal according to the gesture description information.
Optionally, obtaining the gesture description information corresponding to the gesture comprises:
drawing, on the touchscreen, the track corresponding to the gesture;
converting the track into coordinate points;
generating the gesture description information according to the coordinate points.
Optionally, generating the gesture description information according to the coordinate points comprises:
obtaining a start coordinate point, an end coordinate point and inflection coordinate points among the coordinate points;
building a track event set according to the start coordinate point, the end coordinate point and the inflection coordinate points, and using the track event set as the gesture description information, wherein the track event set comprises a start point event, an end point event and inflection point events; the start point event comprises a finger-press event and the coordinates of the start coordinate point; the end point event comprises a finger-lift event and the coordinates of the end coordinate point; and each inflection point event comprises an event of the finger moving linearly on the touchscreen, the coordinates of the inflection coordinate point at which the linear movement starts, and the coordinates of the inflection coordinate point at which the linear movement ends.
Optionally, the first terminal is wirelessly connected to the second terminal, and the gesture description information is sent to the second terminal wirelessly.
Optionally, before receiving the gesture input by the user on the touchscreen, the method further comprises:
setting the resolution of the touchscreen of the first terminal according to a setting instruction from the user, the resolution being the resolution of the display screen of the second terminal.
According to a second aspect of the embodiments of the present disclosure, a test method is provided. The method is applied to a second terminal and comprises:
receiving data information sent by a first terminal;
detecting the data information;
when the data information is confirmed to be gesture description information, simulating, on the touchscreen of the second terminal, the track of the gesture contained in the gesture description information according to the gesture description information.
Optionally, the gesture description information comprises a track event set, the track event set comprising a start point event, an end point event and inflection point events; the start point event comprises a finger-press event and the coordinates of the start coordinate point; the end point event comprises a finger-lift event and the coordinates of the end coordinate point; and each inflection point event comprises an event of the finger moving linearly on the touchscreen, the coordinates of the inflection coordinate point at which the linear movement starts, and the coordinates of the inflection coordinate point at which the linear movement ends.
Simulating, on the touchscreen of the second terminal, the track of the gesture contained in the gesture description information according to the gesture description information comprises:
parsing the gesture description information to obtain the start point event, the end point event and the inflection point events;
executing the start point event, the inflection point events and the end point event, so as to simulate the track of the gesture contained in the gesture description information.
According to a third aspect of the embodiments of the present disclosure, a testing device is provided. The device is applied to a first terminal and comprises:
a gesture receiving module, configured to receive a gesture input by a user on a touchscreen;
an information obtaining module, configured to obtain gesture description information corresponding to the gesture received by the gesture receiving module;
an information sending module, configured to send the gesture description information obtained by the information obtaining module to a second terminal, so that the second terminal simulates the track of the gesture on the touchscreen of the second terminal according to the gesture description information.
Optionally, the information obtaining module comprises:
a track drawing submodule, configured to draw, on the touchscreen, the track corresponding to the gesture;
a coordinate point conversion submodule, configured to convert the track into coordinate points;
an information generating submodule, configured to generate the gesture description information according to the coordinate points.
Optionally, the information generating submodule is configured to:
obtain a start coordinate point, an end coordinate point and inflection coordinate points among the coordinate points; build a track event set according to the start coordinate point, the end coordinate point and the inflection coordinate points; and use the track event set as the gesture description information, wherein the track event set comprises a start point event, an end point event and inflection point events; the start point event comprises a finger-press event and the coordinates of the start coordinate point; the end point event comprises a finger-lift event and the coordinates of the end coordinate point; and each inflection point event comprises an event of the finger moving linearly on the touchscreen, the coordinates of the inflection coordinate point at which the linear movement starts, and the coordinates of the inflection coordinate point at which the linear movement ends.
Optionally, the first terminal is wirelessly connected to the second terminal, and the gesture description information is sent to the second terminal wirelessly.
Optionally, the device further comprises:
a resolution setting module, configured to set the resolution of the touchscreen of the first terminal according to a setting instruction from the user, the resolution being the resolution of the display screen of the second terminal.
According to a fourth aspect of the embodiments of the present disclosure, a testing device is provided. The device is applied to a second terminal and comprises:
an information receiving module, configured to receive data information sent by a first terminal;
an information detecting module, configured to detect the data information received by the information receiving module;
a track simulating module, configured to, when the information detecting module confirms that the data information is gesture description information, simulate, on the touchscreen of the second terminal, the track of the gesture contained in the gesture description information according to the gesture description information.
Optionally, the gesture description information comprises a track event set, the track event set comprising a start point event, an end point event and inflection point events; the start point event comprises a finger-press event and the coordinates of the start coordinate point; the end point event comprises a finger-lift event and the coordinates of the end coordinate point; and each inflection point event comprises an event of the finger moving linearly on the touchscreen, the coordinates of the inflection coordinate point at which the linear movement starts, and the coordinates of the inflection coordinate point at which the linear movement ends.
When simulating the track of the gesture contained in the gesture description information on the touchscreen of the second terminal according to the gesture description information, the track simulating module is configured to:
parse the gesture description information to obtain the start point event, the end point event and the inflection point events; and execute the start point event, the inflection point events and the end point event, so as to simulate the track of the gesture contained in the gesture description information.
According to a fifth aspect of the embodiments of the present disclosure, a testing device is provided, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
receive a gesture input by a user on a touchscreen;
obtain gesture description information corresponding to the gesture;
send the gesture description information to a second terminal, so that the second terminal simulates the track of the gesture on the touchscreen of the second terminal according to the gesture description information.
According to a sixth aspect of the embodiments of the present disclosure, a testing device is provided, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
receive data information sent by a first terminal;
detect the data information;
when the data information is confirmed to be gesture description information, simulate, on the touchscreen of the testing device, the track of the gesture contained in the gesture description information according to the gesture description information.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
In the disclosed embodiments, a first terminal with a touchscreen is used as the master control device. The user (i.e., the tester) no longer inputs coordinate points, but instead inputs the gesture directly on the touchscreen. The master control device obtains the gesture description information of the gesture and sends it to the second terminal serving as the device under test; the gesture touch function of the device under test is then checked by inspecting the simulated gesture on the device under test. Compared with inputting coordinate points via a mouse and keyboard, inputting the gesture directly on the master control device is intuitive and clear, is convenient to operate, and greatly improves testing efficiency. Moreover, the input track is no longer confined to straight lines or polylines: complicated curved gestures such as pattern unlocks and game gestures can be simulated, which also greatly increases the test depth.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and, together with the specification, serve to explain the principles of the invention.
Fig. 1 is a flowchart of a test method according to an exemplary embodiment;
Fig. 2 is a schematic diagram of the connection between the first and second terminals according to an exemplary embodiment;
Fig. 3 is a flowchart of a test method according to an exemplary embodiment;
Fig. 4 is a schematic diagram of a track according to an exemplary embodiment;
Fig. 5 is a flowchart of a test method according to an exemplary embodiment;
Fig. 6 is a schematic diagram of a track according to an exemplary embodiment;
Fig. 7 is a flowchart of a test method according to an exemplary embodiment;
Fig. 8 is a data flow diagram from the first terminal to the second terminal according to an exemplary embodiment;
Fig. 9 is a block diagram of a testing device according to an exemplary embodiment;
Fig. 10 is a block diagram of a testing device according to an exemplary embodiment;
Fig. 11 is a block diagram of a testing device according to an exemplary embodiment;
Fig. 12 is a block diagram of a testing device according to an exemplary embodiment;
Fig. 13 is a block diagram of a device for testing according to an exemplary embodiment.
Detailed description
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. When the following description refers to the drawings, unless otherwise indicated, the same numerals in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the invention; rather, they are merely examples of apparatuses and methods consistent with some aspects of the invention as detailed in the appended claims.
A terminal device herein may be a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, or the like.
Fig. 1 is a flowchart of a test method according to an exemplary embodiment. The method may be applied to a first terminal. In the present embodiment, the first terminal is the master control terminal and the second terminal is the terminal under test; both the first and second terminals may be mobile phones, tablet computers, etc.
In step S101, a gesture input by a user on the touchscreen is received.
The user is the tester. The tester can input the gesture to be tested directly on the touchscreen of the first terminal. The gesture may be, for example, one or more straight lines or curves, or a complicated gesture such as a pattern unlock or a game gesture, rather than coordinate points generated with a mouse and keyboard.
In step S102, gesture description information corresponding to the gesture is obtained.
The gesture description information is information from which the track image of the gesture can be reproduced on the second terminal. The user's gesture is drawn on the touchscreen in the form of a track image, and the first terminal can convert this track image into information such as a series of coordinates to serve as the gesture description information.
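The patent does not prescribe a concrete encoding for the gesture description information; as a rough illustration, it could be as simple as the ordered series of sampled coordinates. The following Python sketch models it that way (the class and field names are hypothetical, not from the patent):

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical sketch: gesture description information modeled as the
# ordered series of (x, y) coordinate points sampled from the drawn track.
@dataclass
class GestureDescription:
    points: List[Tuple[int, int]]  # sampled track coordinates, in draw order

    def start(self) -> Tuple[int, int]:
        return self.points[0]

    def end(self) -> Tuple[int, int]:
        return self.points[-1]

g = GestureDescription(points=[(10, 10), (40, 10), (40, 50)])
print(g.start(), g.end())  # -> (10, 10) (40, 50)
```

Any representation from which the second terminal can redraw the same track would serve equally well.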
In step S103, the gesture description information is sent to a second terminal, so that the second terminal simulates the track of the gesture on the touchscreen of the second terminal according to the gesture description information.
As an example, the first terminal may be wirelessly connected to the second terminal, and the gesture description information is sent to the second terminal wirelessly.
The wireless mode may be, for example, Bluetooth, WiFi, etc.; the present embodiment does not limit it. As shown in Fig. 2, 201 is the first terminal, i.e., the master control terminal, and 202, 203 and 204 are second terminals, i.e., terminals under test. The first terminal establishes a wireless connection with each second terminal via Bluetooth, and the second terminals can be tested in turn or simultaneously.
As shown in Fig. 3, in the present embodiment or some other embodiments of the present disclosure, obtaining the gesture description information corresponding to the gesture may comprise:
In step S301, the track corresponding to the gesture is drawn on the touchscreen according to the gesture.
In step S302, the track is converted into coordinate points.
As an example, see Fig. 4, a schematic diagram of a track. The track can be decomposed into a number of coordinate points such as a, b and c, where a is the start point of the track and k is its end point.
In step S303, gesture description information is generated according to the coordinate points.
As shown in Fig. 5, in the present embodiment or some other embodiments of the present disclosure, generating the gesture description information according to the coordinate points may comprise:
In step S501, a start coordinate point, an end coordinate point and inflection coordinate points among the coordinate points are obtained.
A track may include one start point, one end point and some inflection points. A track is a curve, and a curve is in essence composed of many small line segments; the intersection of two adjacent small line segments is an inflection point, meaning the track turns there. As an example, still referring to Fig. 4, the track consists of a start point (a), an end point (k) and all the inflection points (b to j) between them, with ab, bc, jk and so on being the small line segments.
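The inflection-point extraction described above can be sketched as follows: walk the sampled coordinate points and keep every interior point where the movement direction changes, i.e., where two adjacent small line segments meet at an angle. This is only an illustrative Python sketch; the collinearity test (an exact cross-product check, with no tolerance for noisy input) is an assumption, not the patent's method.

```python
# Hypothetical sketch: find the points between start and end where the
# track turns. Two consecutive segments are collinear exactly when the
# cross product of their direction vectors is zero.
def inflection_points(points):
    """Return the interior points at which the track changes direction."""
    turns = []
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        v1 = (cur[0] - prev[0], cur[1] - prev[1])  # segment prev -> cur
        v2 = (nxt[0] - cur[0], nxt[1] - cur[1])    # segment cur -> nxt
        cross = v1[0] * v2[1] - v1[1] * v2[0]
        if cross != 0:                             # non-zero => a turn
            turns.append(cur)
    return turns

# An L-shaped track sampled at five points: it turns only at (40, 10).
track = [(10, 10), (25, 10), (40, 10), (40, 30), (40, 50)]
print(inflection_points(track))  # -> [(40, 10)]
```

A production implementation would likely tolerate small angular deviations rather than require exact collinearity.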
In step S502, a track event set is built according to the start coordinate point, the end coordinate point and the inflection coordinate points, and the track event set is used as the gesture description information. The track event set comprises a start point event, an end point event and inflection point events; the start point event comprises a finger-press event and the coordinates of the start coordinate point; the end point event comprises a finger-lift event and the coordinates of the end coordinate point; and each inflection point event comprises an event of the finger moving linearly on the touchscreen, the coordinates of the inflection coordinate point at which the linear movement starts, and the coordinates of the inflection coordinate point at which the linear movement ends.
Three classes of events can be defined: UP (the finger leaves the touchscreen), DOWN (the finger presses the touchscreen) and MOVE (the finger moves on the touchscreen). The start coordinate corresponds to a DOWN event, the end coordinate corresponds to an UP event, and the inflection coordinates correspond to MOVE events. As an example, see Fig. 6, in which each arrow represents a MOVE event.
The start point event comprises two parameters, the DOWN event plus the start coordinates, i.e. (down, (Xa, Ya)), where Xa is the X coordinate of point a and Ya is its Y coordinate.
The end point event comprises two parameters: the UP event plus the end coordinates, i.e. (up, (Xk, Yk)).
An inflection point event comprises three parameters: the MOVE event plus the start coordinates of the movement plus the end coordinates of the movement, for example (move, (Xa, Ya), (Xb, Yb)).
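Putting the three event classes together, the track event set for a track sampled as points a, b, …, k is one DOWN at the start, one MOVE per straight segment between consecutive points, and one UP at the end. The following Python sketch builds it with the same tuple shapes as above; the in-memory tuple encoding is an illustrative assumption, not the patent's wire format.

```python
# Hypothetical sketch of step S502: build the track event set from the
# ordered coordinate points (start, inflection points, end).
def build_event_set(points):
    events = [("down", points[0])]              # finger presses at the start
    for src, dst in zip(points, points[1:]):
        events.append(("move", src, dst))       # linear movement src -> dst
    events.append(("up", points[-1]))           # finger lifts at the end
    return events

track = [(10, 10), (40, 10), (40, 50)]          # a -> b -> c
print(build_event_set(track))
# -> [('down', (10, 10)), ('move', (10, 10), (40, 10)),
#     ('move', (40, 10), (40, 50)), ('up', (40, 50))]
```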
In the present embodiment or some other embodiments of the present disclosure, before receiving the gesture input by the user on the touchscreen, the method may further comprise:
setting the resolution of the touchscreen of the first terminal according to a setting instruction from the user, the resolution being the resolution of the display screen of the second terminal.
In this way, the resolution of the first terminal's touchscreen is set according to the resolution of the terminal under test, i.e., the second terminal, which removes part of the workload of converting coordinates between different resolutions and improves efficiency.
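To see what matching the resolutions saves: if the two screens differed, every coordinate would need a per-axis scaling before replay, as in this illustrative Python sketch (the linear scaling formula is an assumption; the patent only notes that the conversion workload is avoided):

```python
# Hypothetical sketch: scale a point from the master touchscreen's
# resolution to the tested display's resolution. When the two resolutions
# are set equal, this becomes the identity mapping.
def scale_point(point, src_res, dst_res):
    sx, sy = src_res   # (width, height) of the master touchscreen
    dx, dy = dst_res   # (width, height) of the tested display
    x, y = point
    return (round(x * dx / sx), round(y * dy / sy))

# A 1080x1920 master driving a 720x1280 device under test:
print(scale_point((540, 960), (1080, 1920), (720, 1280)))  # -> (360, 640)
```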
In the present embodiment, a first terminal with a touchscreen is used as the master control device. The user (i.e., the tester) no longer inputs coordinate points, but instead inputs the gesture directly on the touchscreen. The master control device obtains the gesture description information of the gesture and sends it to the second terminal serving as the device under test; the gesture touch function of the device under test is then checked by inspecting the simulated gesture on the device under test. Compared with inputting coordinate points via a mouse and keyboard, inputting the gesture directly on the master control device is intuitive and clear, is convenient to operate, and greatly improves testing efficiency. Moreover, the input track is no longer confined to straight lines or polylines: complicated curved gestures such as pattern unlocks and game gestures can be simulated, which also greatly increases the test depth.
Fig. 7 is a flowchart of a test method according to an exemplary embodiment. The method may be applied to a second terminal.
In step S701, data information sent by a first terminal is received.
In step S702, the data information is detected.
In step S703, when the data information is confirmed to be gesture description information, the track of the gesture contained in the gesture description information is simulated on the touchscreen of the second terminal according to the gesture description information.
As an example, Fig. 8 shows the data flow from the first terminal to the second terminal. In Fig. 8, the master control phone 801 receives the multi-touch track graphic to be tested; after processing by its internal track receiving program (comprising, for example, graphic adaptation, inflection-point statistics and conversion into a track event set), it obtains the multi-touch track event set and sends it as a data packet to the tested phone 802 through the Bluetooth data sending end of the first terminal. After receiving the packet at its Bluetooth data receiving end, the tested phone 802 draws the track graphic on the screen through its event processing center, for the tester to inspect.
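The receive-detect-replay path on the tested terminal might be sketched as below. The packet layout, the JSON encoding and the type marker are all illustrative assumptions; the patent only requires that incoming data be detected as gesture description information before the track is simulated.

```python
# Hypothetical sketch of steps S701-S703: a received packet is parsed,
# checked for a gesture-descriptor marker, and only then handed to the
# event-replay routine.
import json

GESTURE_TYPE = "track_event_set"  # assumed type tag, not from the patent

def handle_packet(raw, replay):
    msg = json.loads(raw)
    if msg.get("type") == GESTURE_TYPE:   # "detect the data information"
        replay(msg["events"])             # simulate the gesture track
        return True
    return False                          # not gesture description information

seen = []
packet = json.dumps({"type": "track_event_set",
                     "events": [["down", [10, 10]], ["up", [10, 10]]]})
handle_packet(packet, seen.extend)
print(seen)  # -> [['down', [10, 10]], ['up', [10, 10]]]
```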
In the present embodiment or some other embodiments of the present disclosure, the gesture description information comprises a track event set, the track event set comprising a start point event, an end point event and inflection point events; the start point event comprises a finger-press event and the coordinates of the start coordinate point; the end point event comprises a finger-lift event and the coordinates of the end coordinate point; and each inflection point event comprises an event of the finger moving linearly on the touchscreen, the coordinates of the inflection coordinate point at which the linear movement starts, and the coordinates of the inflection coordinate point at which the linear movement ends.
Simulating, on the touchscreen of the second terminal, the track of the gesture contained in the gesture description information according to the gesture description information may comprise:
parsing the gesture description information to obtain the start point event, the end point event and the inflection point events;
executing the start point event, the inflection point events and the end point event, so as to simulate the track of the gesture contained in the gesture description information.
For example, if a multi-finger operation is involved, the second terminal can execute it in a pseudo-concurrent manner, that is, achieve the appearance of concurrent execution by executing the individual events linearly. Here "pseudo-concurrent" refers to concurrency on a single-core processor.
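The pseudo-concurrent idea can be sketched as interleaving the per-finger event lists into one linear sequence and executing the events strictly one at a time, which is enough to look simultaneous on screen. The round-robin interleaving policy is an illustrative assumption; real input systems would typically order events by timestamp instead.

```python
# Hypothetical sketch of pseudo-concurrent multi-finger replay: merge the
# event lists of several fingers round-robin and execute them on a single
# thread, one event after another.
from itertools import chain, zip_longest

def interleave(per_finger_events):
    """Round-robin merge of per-finger event lists into one linear sequence."""
    merged = chain.from_iterable(zip_longest(*per_finger_events))
    return [e for e in merged if e is not None]

finger1 = [("down", (10, 10)), ("move", (10, 10), (20, 10)), ("up", (20, 10))]
finger2 = [("down", (50, 50)), ("up", (50, 50))]
for event in interleave([finger1, finger2]):
    print(event)  # each event is executed strictly after the previous one
```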
In the present embodiment, a first terminal with a touchscreen is used as the master control device. The user (i.e., the tester) no longer inputs coordinate points, but instead inputs the gesture directly on the touchscreen. The master control device obtains the gesture description information of the gesture and sends it to the second terminal serving as the device under test; the gesture touch function of the device under test is then checked by inspecting the simulated gesture on the device under test. Compared with inputting coordinate points via a mouse and keyboard, inputting the gesture directly on the master control device is intuitive and clear, is convenient to operate, and greatly improves testing efficiency. Moreover, the input track is no longer confined to straight lines or polylines: complicated curved gestures such as pattern unlocks and game gestures can be simulated, which also greatly increases the test depth.
The following are device embodiments of the present disclosure, which may be used to perform the method embodiments of the present disclosure. For details not disclosed in the device embodiments, please refer to the method embodiments of the present disclosure.
Fig. 9 is a block diagram of a testing device according to an exemplary embodiment. The device may be applied to a first terminal. In the present embodiment, the first terminal is the master control terminal and the second terminal is the terminal under test; both may be mobile phones, tablet computers, etc.
The gesture receiving module 901 is configured to receive a gesture input by a user on the touchscreen.
The user is the tester. The tester can input the gesture to be tested directly on the touchscreen of the first terminal. The gesture may be, for example, one or more straight lines or curves, or a complicated gesture such as a pattern unlock or a game gesture, rather than coordinate points generated with a mouse and keyboard.
The information obtaining module 902 is configured to obtain the gesture description information corresponding to the gesture received by the gesture receiving module 901.
The gesture description information is information from which the track image of the gesture can be reproduced on the second terminal. The user's gesture is drawn on the touchscreen in the form of a track image, and the first terminal can convert this track image into information such as a series of coordinates to serve as the gesture description information.
The information sending module 903 is configured to send the gesture description information obtained by the information obtaining module 902 to the second terminal, so that the second terminal simulates the track of the gesture on the touchscreen of the second terminal according to the gesture description information.
As an example, the first terminal may be wirelessly connected to the second terminal, and the gesture description information is sent to the second terminal wirelessly.
The wireless mode may be, for example, Bluetooth, WiFi, etc.; the present embodiment does not limit it. As shown in Fig. 2, 201 is the first terminal, i.e., the master control terminal, and 202, 203 and 204 are second terminals, i.e., terminals under test. The first terminal establishes a wireless connection with each second terminal via Bluetooth, and the second terminals can be tested in turn or simultaneously.
As shown in Fig. 10, in this embodiment or some other embodiments of the disclosure, the data obtaining module 902 may comprise:
Trajectory drawing submodule 9021, configured to draw the trajectory corresponding to the gesture on the touch screen.
Coordinate point conversion submodule 9022, configured to convert the trajectory into coordinate points.
As an example, see Fig. 4, which is a schematic diagram of a trajectory. The trajectory can be decomposed into a number of coordinate points such as a, b, and c, where a is the start of the trajectory and k is its end.
Information generating submodule 9023, configured to generate the gesture descriptor from the coordinate points.
In this embodiment or some other embodiments of the disclosure, the information generating submodule 9023 may be configured to:
obtain the start coordinate point, end coordinate point, and inflection coordinate points among the coordinate points; build a trajectory event set from the start coordinate point, end coordinate point, and inflection coordinate points; and use the trajectory event set as the gesture descriptor. The trajectory event set comprises a start event, an end event, and inflection events: the start event comprises the event of a finger pressing the touch screen and the coordinate of the start point; the end event comprises the event of the finger leaving the touch screen and the coordinate of the end point; and each inflection event comprises the event of the finger moving in a straight line on the touch screen, the coordinate of the inflection point at which the straight-line movement starts, and the coordinate of the inflection point at which it ends.
A trajectory includes one start point, one end point, and possibly several inflection points. A trajectory is a curve, and a curve is essentially composed of many small line segments; the intersection of two adjacent segments is an inflection point, where the trajectory turns. As an example, see Fig. 4 again: the trajectory consists of the start point (a), the end point (k), and the inflection points between them (b through j), with ab, bc, jk, and so on being the small line segments.
Three event types can be defined: DOWN (finger presses the touch screen), MOVE (finger moves on the touch screen), and UP (finger leaves the touch screen). The start coordinate corresponds to a DOWN event, the end coordinate to an UP event, and the inflection coordinates to MOVE events. As an example, see Fig. 6, in which each arrow represents one MOVE event.
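To make the segmentation concrete, the detection of inflection points in a sampled trajectory can be sketched in Python as follows. The angle-threshold heuristic, the function name, and the sample track are illustrative assumptions, not part of the patent.

```python
import math

def find_inflection_points(points, angle_threshold_deg=15.0):
    """Return indices of points where the trajectory turns by more than the
    threshold angle -- one possible way to pick the inflection points.
    points: list of (x, y) tuples sampled along the drawn trajectory."""
    inflections = []
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        a1 = math.atan2(y1 - y0, x1 - x0)          # heading of incoming segment
        a2 = math.atan2(y2 - y1, x2 - x1)          # heading of outgoing segment
        turn = abs(math.degrees(a2 - a1))
        turn = min(turn, 360.0 - turn)             # account for angle wrap-around
        if turn > angle_threshold_deg:
            inflections.append(i)
    return inflections

# An L-shaped track: straight to the right, then straight down.
track = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
print(find_inflection_points(track))  # -> [2]
```

The straight runs produce no inflections; only the corner (index 2) is reported, matching the idea that a trajectory is small line segments joined at turning points.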
The start event comprises two parameters: the DOWN event plus the start coordinate, i.e. (down, (Xa, Ya)), where Xa is the X coordinate of point a and Ya is its Y coordinate.
The end event comprises two parameters: the UP event plus the end coordinate, i.e. (up, (Xk, Yk)).
An inflection event comprises three parameters: the MOVE event plus the start coordinate of the move plus the end coordinate of the move, for example (move, (Xa, Ya), (Xb, Yb)).
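Assembling the three event types into a trajectory event set can be sketched as follows. The tuple layout mirrors the (down, ...), (move, ...), (up, ...) notation above, but the function name and the list representation are assumptions for illustration.

```python
def build_event_set(points, inflection_indices):
    """Build a trajectory event set: one DOWN at the start point, one MOVE
    per straight segment between key points, and one UP at the end point."""
    events = [("down", points[0])]                       # finger presses screen
    keys = [0] + list(inflection_indices) + [len(points) - 1]
    for a, b in zip(keys, keys[1:]):                     # each straight segment
        events.append(("move", points[a], points[b]))
    events.append(("up", points[-1]))                    # finger leaves screen
    return events

# L-shaped sample track with one inflection point at index 2.
track = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
print(build_event_set(track, [2]))
```

For this track the result is one DOWN, two MOVEs (one per straight segment), and one UP, i.e. a complete description from which the trajectory can be replayed.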
In this embodiment or some other embodiments of the disclosure, the first terminal is wirelessly connected to the second terminal, and the gesture descriptor is sent to the second terminal over this wireless connection.
As shown in Fig. 11, in this embodiment or some other embodiments of the disclosure, the device may further comprise:
Resolution setting module 904, configured to set the resolution of the touch screen of the first terminal according to a setting instruction from the user, the resolution being the resolution of the display screen of the second terminal.
In this way, the resolution of the touch screen of the first terminal is set to the resolution of the terminal under test, i.e. the second terminal, which removes part of the work of converting coordinates between different resolutions and improves efficiency.
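When the two screens do differ in resolution, a simple proportional mapping is one possible conversion. The helper below is a sketch under that assumption; the patent does not specify a conversion formula.

```python
def scale_point(point, src_res, dst_res):
    """Map a coordinate from the master terminal's (width, height) resolution
    to the resolution of the terminal under test by proportional scaling."""
    x, y = point
    src_w, src_h = src_res
    dst_w, dst_h = dst_res
    return (x * dst_w / src_w, y * dst_h / src_h)

# A point at the centre of a 1080x1920 screen maps to the centre of 720x1280.
print(scale_point((540, 960), (1080, 1920), (720, 1280)))  # -> (360.0, 640.0)
```

Setting both screens to the same resolution, as the resolution setting module 904 does, makes this mapping the identity and avoids the conversion entirely.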
As for the devices in the embodiments above, the specific way in which each module performs its operations has been described in detail in the method embodiments and will not be elaborated here.
In this embodiment, a first terminal with a touch screen serves as the master control device. The user (i.e., the tester) no longer inputs coordinate points but instead inputs a gesture directly on the touch screen. The master control device obtains the gesture descriptor of this gesture and sends it to a second terminal serving as the device under test; the gesture touch function of the device under test is then checked by inspecting the simulated gesture on the device under test. Compared with entering coordinate points by mouse and keyboard, inputting the gesture directly on the master control device is simple and convenient and significantly improves testing efficiency. Moreover, the input trajectory is no longer confined to straight lines or polylines: complex curved gestures such as pattern unlock and game gestures can be simulated, which also greatly expands the scope of testing.
Fig. 12 is a block diagram of a test device according to an exemplary embodiment. The device may be used in a second terminal.
Information receiving module 1201, configured to receive data information sent by the first terminal.
Information detecting module 1202, configured to examine the data information received by the information receiving module 1201.
Trace simulation module 1203, configured to, when the information detecting module 1202 confirms that the data information is a gesture descriptor, simulate the trajectory of the gesture contained in the gesture descriptor on the touch screen of the second terminal according to the gesture descriptor.
As an example, the data flow from the first terminal to the second terminal is shown in Fig. 8. The master control phone 801 receives the multi-touch trajectory graphic to be tested and processes it with an internal trajectory reception program (including, for example, graphic adaptation, inflection point statistics, and conversion into a trajectory event set) to obtain a multi-touch trajectory event set, which is then sent as a data packet to the phone under test 802 through the Bluetooth transmitter of the first terminal. After receiving the packet through its Bluetooth receiver, the phone under test 802 draws the trajectory graphic on its screen via its event handling center, for the tester to inspect.
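The detection step on the receiving side (confirming that incoming data is a gesture descriptor) implies some framing of the packet. A JSON wrapper with a type tag is one hypothetical wire format; the patent does not fix one, so both function names and the packet layout here are assumptions.

```python
import json

def pack_gesture(events):
    """Wrap a trajectory event set in a tagged packet for transmission."""
    return json.dumps({"type": "gesture", "events": events}).encode("utf-8")

def unpack(packet):
    """Return the event set if the packet is a gesture descriptor, else None."""
    message = json.loads(packet.decode("utf-8"))
    if message.get("type") != "gesture":
        return None  # not a gesture descriptor; the receiver ignores it
    return message["events"]

# Round-trip: what the master sends is what the terminal under test parses.
events = [["down", [0, 0]], ["move", [0, 0], [2, 0]], ["up", [2, 0]]]
assert unpack(pack_gesture(events)) == events
```

The same check gives the information detecting module a cheap way to distinguish gesture packets from any other traffic on the Bluetooth link.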
In this embodiment or some other embodiments of the disclosure, the gesture descriptor comprises a trajectory event set. The trajectory event set comprises a start event, an end event, and inflection events: the start event comprises the event of a finger pressing the touch screen and the coordinate of the start point; the end event comprises the event of the finger leaving the touch screen and the coordinate of the end point; and each inflection event comprises the event of the finger moving in a straight line on the touch screen, the coordinate of the inflection point at which the straight-line movement starts, and the coordinate of the inflection point at which it ends.
When simulating the trajectory of the gesture contained in the gesture descriptor on the touch screen of the second terminal, the trace simulation module 1203 is configured to:
parse the gesture descriptor to obtain the start event, end event, and inflection events; and then execute the start event, inflection events, and end event to simulate the trajectory of the gesture contained in the gesture descriptor.
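Executing the events on the terminal under test can be sketched as a dispatcher that hands each event to an injection callback (for example, one that synthesizes touch events on the device). The callback interface is an assumption for illustration; a recording callback stands in for real hardware here.

```python
def replay_events(events, inject):
    """Execute a trajectory event set in order by delegating each event to
    the given injection callback, reproducing the gesture's trajectory."""
    for event in events:
        kind = event[0]
        if kind in ("down", "up"):
            inject(kind, event[1])            # press or release at a point
        elif kind == "move":
            inject(kind, event[1], event[2])  # straight move between points
        else:
            raise ValueError(f"unknown trajectory event: {kind!r}")

# Record the injected calls instead of touching real hardware.
calls = []
replay_events(
    [("down", (0, 0)), ("move", (0, 0), (2, 0)), ("up", (2, 0))],
    lambda *args: calls.append(args),
)
print(calls)
```

Because the events carry their own order (down, moves, up), the replay side needs no knowledge of how the trajectory was originally drawn.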
As for the device in the embodiment above, the specific way in which each module performs its operations has been described in detail in the method embodiments and will not be elaborated here.
In this embodiment, a first terminal with a touch screen serves as the master control device. The user (i.e., the tester) no longer inputs coordinate points but instead inputs a gesture directly on the touch screen. The master control device obtains the gesture descriptor of this gesture and sends it to a second terminal serving as the device under test; the gesture touch function of the device under test is then checked by inspecting the simulated gesture on the device under test. Compared with entering coordinate points by mouse and keyboard, inputting the gesture directly on the master control device is simple and convenient and significantly improves testing efficiency. Moreover, the input trajectory is no longer confined to straight lines or polylines: complex curved gestures such as pattern unlock and game gestures can be simulated, which also greatly expands the scope of testing.
The disclosure also discloses a test device, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
receive a gesture input by the user on the touch screen;
obtain the gesture descriptor corresponding to the gesture; and
send the gesture descriptor to a second terminal, so that the second terminal simulates the trajectory of the gesture on the touch screen of the second terminal according to the gesture descriptor.
The disclosure also discloses a test device, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
receive data information sent by a first terminal;
examine the data information; and
when the data information is confirmed to be a gesture descriptor, simulate the trajectory of the gesture contained in the gesture descriptor on the touch screen of the test device according to the gesture descriptor.
The disclosure also discloses a non-transitory computer-readable storage medium. When instructions in the storage medium are executed by the processor of a first device, the first device can perform a test method comprising:
receiving a gesture input by the user on the touch screen;
obtaining the gesture descriptor corresponding to the gesture; and
sending the gesture descriptor to a second terminal, so that the second terminal simulates the trajectory of the gesture on the touch screen of the second terminal according to the gesture descriptor.
The disclosure also discloses a non-transitory computer-readable storage medium. When instructions in the storage medium are executed by the processor of a second device, the second device can perform a test method comprising:
receiving data information sent by a first terminal;
examining the data information; and
when the data information is confirmed to be a gesture descriptor, simulating the trajectory of the gesture contained in the gesture descriptor on the touch screen of the second terminal according to the gesture descriptor.
Figure 13 is a block diagram of a device 1300 for testing according to an exemplary embodiment. The device 1300 may be, for example, a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
With reference to Figure 13, the device 1300 may comprise one or more of the following components: a processing component 1302, a memory 1304, a power component 1306, a multimedia component 1308, an audio component 1310, an input/output (I/O) interface 1312, a sensor component 1314, and a communication component 1316.
The processing component 1302 typically controls the overall operation of the device 1300, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 1302 may comprise one or more processors 1320 to execute instructions so as to perform all or part of the steps of the methods above. In addition, the processing component 1302 may comprise one or more modules to facilitate interaction between itself and other components, for example a multimedia module to facilitate interaction between the multimedia component 1308 and the processing component 1302.
The memory 1304 is configured to store various types of data to support operation of the device 1300. Examples of such data include instructions for any application or method operated on the device 1300, contact data, phonebook data, messages, pictures, video, and so on. The memory 1304 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disc.
The power component 1306 provides power to the various components of the device 1300. It may comprise a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1300.
The multimedia component 1308 includes a screen that provides an output interface between the device 1300 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP); if it includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the panel. The touch sensors may sense not only the boundary of a touch or swipe action but also the duration and pressure associated with it. In some embodiments, the multimedia component 1308 includes a front camera and/or a rear camera. When the device 1300 is in an operating mode, such as a shooting mode or a video mode, the front and/or rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 1310 is configured to output and/or input audio signals. For example, the audio component 1310 includes a microphone (MIC), which is configured to receive external audio signals when the device 1300 is in an operating mode such as a call mode, a recording mode, or a speech recognition mode. The received audio signal may be further stored in the memory 1304 or transmitted via the communication component 1316. In some embodiments, the audio component 1310 also includes a loudspeaker for outputting audio signals.
The I/O interface 1312 provides an interface between the processing component 1302 and peripheral interface modules such as a keyboard, a click wheel, or buttons. These buttons may include, but are not limited to, a home button, volume buttons, a start button, and a lock button.
The sensor component 1314 includes one or more sensors for providing status assessments of various aspects of the device 1300. For example, the sensor component 1314 can detect the open/closed state of the device 1300 and the relative positioning of components (such as the display and keypad of the device 1300); it can also detect a change in position of the device 1300 or one of its components, the presence or absence of user contact with the device 1300, the orientation or acceleration/deceleration of the device 1300, and changes in its temperature. The sensor component 1314 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, and may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1314 may also include an accelerometer, a gyroscope, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1316 is configured to facilitate wired or wireless communication between the device 1300 and other devices. The device 1300 can access a wireless network based on a communication standard such as WiFi, 2G, or 3G, or a combination of them. In one exemplary embodiment, the communication component 1316 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1316 also includes a near-field communication (NFC) module to facilitate short-range communication; the NFC module may be implemented based on radio-frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In exemplary embodiments, the device 1300 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the terminal-side methods described above.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following its general principles and including such departures from the present disclosure as come within known or customary practice in the art. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the invention being indicated by the appended claims.
It should be understood that the invention is not limited to the precise constructions described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the invention is limited only by the appended claims.

Claims (16)

1. A test method, characterized in that the method is used in a first terminal and comprises:
receiving a gesture input by a user on the touch screen;
obtaining a gesture descriptor corresponding to the gesture; and
sending the gesture descriptor to a second terminal, so that the second terminal simulates the trajectory of the gesture on the touch screen of the second terminal according to the gesture descriptor.
2. The method according to claim 1, characterized in that obtaining the gesture descriptor corresponding to the gesture comprises:
drawing, on the touch screen, the trajectory corresponding to the gesture;
converting the trajectory into coordinate points; and
generating the gesture descriptor from the coordinate points.
3. The method according to claim 2, characterized in that generating the gesture descriptor from the coordinate points comprises:
obtaining the start coordinate point, end coordinate point, and inflection coordinate points among the coordinate points; and
building a trajectory event set from the start coordinate point, end coordinate point, and inflection coordinate points, and using the trajectory event set as the gesture descriptor, wherein the trajectory event set comprises a start event, an end event, and inflection events; the start event comprises the event of a finger pressing the touch screen and the coordinate of the start point; the end event comprises the event of the finger leaving the touch screen and the coordinate of the end point; and each inflection event comprises the event of the finger moving in a straight line on the touch screen, the coordinate of the inflection point at which the straight-line movement starts, and the coordinate of the inflection point at which it ends.
4. The method according to claim 1, characterized in that the first terminal is wirelessly connected to the second terminal, and the gesture descriptor is sent to the second terminal over this wireless connection.
5. The method according to claim 1, characterized in that, before receiving the gesture input by the user on the touch screen, the method further comprises:
setting the resolution of the touch screen of the first terminal according to a setting instruction from the user, the resolution being the resolution of the display screen of the second terminal.
6. A test method, characterized in that the method is used in a second terminal and comprises:
receiving data information sent by a first terminal;
examining the data information; and
when the data information is confirmed to be a gesture descriptor, simulating the trajectory of the gesture contained in the gesture descriptor on the touch screen of the second terminal according to the gesture descriptor.
7. The method according to claim 6, characterized in that the gesture descriptor comprises a trajectory event set, the trajectory event set comprising a start event, an end event, and inflection events, wherein the start event comprises the event of a finger pressing the touch screen and the coordinate of the start point, the end event comprises the event of the finger leaving the touch screen and the coordinate of the end point, and each inflection event comprises the event of the finger moving in a straight line on the touch screen, the coordinate of the inflection point at which the straight-line movement starts, and the coordinate of the inflection point at which it ends; and
simulating the trajectory of the gesture contained in the gesture descriptor on the touch screen of the second terminal according to the gesture descriptor comprises:
parsing the gesture descriptor to obtain the start event, end event, and inflection events; and
executing the start event, inflection events, and end event to simulate the trajectory of the gesture contained in the gesture descriptor.
8. A test device, characterized in that the device is used in a first terminal and comprises:
a gesture receiver module, configured to receive a gesture input by a user on the touch screen;
a data obtaining module, configured to obtain the gesture descriptor corresponding to the gesture received by the gesture receiver module; and
an information sending module, configured to send the gesture descriptor obtained by the data obtaining module to a second terminal, so that the second terminal simulates the trajectory of the gesture on the touch screen of the second terminal according to the gesture descriptor.
9. The device according to claim 8, characterized in that the data obtaining module comprises:
a trajectory drawing submodule, configured to draw the trajectory corresponding to the gesture on the touch screen;
a coordinate point conversion submodule, configured to convert the trajectory into coordinate points; and
an information generating submodule, configured to generate the gesture descriptor from the coordinate points.
10. The device according to claim 9, characterized in that the information generating submodule is configured to:
obtain the start coordinate point, end coordinate point, and inflection coordinate points among the coordinate points, build a trajectory event set from them, and use the trajectory event set as the gesture descriptor, wherein the trajectory event set comprises a start event, an end event, and inflection events; the start event comprises the event of a finger pressing the touch screen and the coordinate of the start point; the end event comprises the event of the finger leaving the touch screen and the coordinate of the end point; and each inflection event comprises the event of the finger moving in a straight line on the touch screen, the coordinate of the inflection point at which the straight-line movement starts, and the coordinate of the inflection point at which it ends.
11. The device according to claim 8, characterized in that the first terminal is wirelessly connected to the second terminal, and the gesture descriptor is sent to the second terminal over this wireless connection.
12. The device according to claim 8, characterized in that the device further comprises:
a resolution setting module, configured to set the resolution of the touch screen of the first terminal according to a setting instruction from the user, the resolution being the resolution of the display screen of the second terminal.
13. A test device, characterized in that the device is used in a second terminal and comprises:
an information receiving module, configured to receive data information sent by a first terminal;
an information detecting module, configured to examine the data information received by the information receiving module; and
a trace simulation module, configured to, when the information detecting module confirms that the data information is a gesture descriptor, simulate the trajectory of the gesture contained in the gesture descriptor on the touch screen of the second terminal according to the gesture descriptor.
14. The device according to claim 13, characterized in that the gesture descriptor comprises a trajectory event set, the trajectory event set comprising a start event, an end event, and inflection events, wherein the start event comprises the event of a finger pressing the touch screen and the coordinate of the start point, the end event comprises the event of the finger leaving the touch screen and the coordinate of the end point, and each inflection event comprises the event of the finger moving in a straight line on the touch screen, the coordinate of the inflection point at which the straight-line movement starts, and the coordinate of the inflection point at which it ends; and
when simulating the trajectory of the gesture contained in the gesture descriptor on the touch screen of the second terminal according to the gesture descriptor, the trace simulation module is configured to:
parse the gesture descriptor to obtain the start event, end event, and inflection events; and execute the start event, inflection events, and end event to simulate the trajectory of the gesture contained in the gesture descriptor.
15. A test device, characterized by comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
receive a gesture input by a user on the touch screen;
obtain the gesture descriptor corresponding to the gesture; and
send the gesture descriptor to a second terminal, so that the second terminal simulates the trajectory of the gesture on the touch screen of the second terminal according to the gesture descriptor.
16. A test device, characterized by comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
receive data information sent by a first terminal;
examine the data information; and
when the data information is confirmed to be a gesture descriptor, simulate the trajectory of the gesture contained in the gesture descriptor on the touch screen of the test device according to the gesture descriptor.
CN201510846499.2A 2015-11-26 2015-11-26 Test method and device Pending CN105511780A (en)

Publication: CN105511780A, published 2016-04-20.

Family

ID=55719801

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510846499.2A Pending CN105511780A (en) 2015-11-26 2015-11-26 Test method and device

Country Status (1)

Country Link
CN (1) CN105511780A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107861848A (en) * 2017-11-14 2018-03-30 东软集团股份有限公司 Gesture password method of testing, device, readable storage medium storing program for executing and electronic equipment
CN109324741A (en) * 2018-09-30 2019-02-12 广州云测信息技术有限公司 A kind of method of controlling operation thereof, device and system
CN109670292A (en) * 2018-12-12 2019-04-23 北京云测信息技术有限公司 A kind of ios device gesture operation implementation method for prototype test
CN114115563A (en) * 2021-11-30 2022-03-01 南京星云数字技术有限公司 Operation track acquisition method, operation track playback method and operation track playback device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101577044A (en) * 2008-05-09 2009-11-11 海信集团有限公司 Control device and control method thereof
CN102860034A (en) * 2010-04-28 2013-01-02 Lg电子株式会社 Image display apparatus and method for operating the same
US20130222229A1 (en) * 2012-02-29 2013-08-29 Tomohiro Kanda Display control apparatus, display control method, and control method for electronic device
CN103455265A (en) * 2012-06-01 2013-12-18 腾讯科技(深圳)有限公司 Controlled equipment, control method and control system thereof and mobile terminal

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107861848A (en) * 2017-11-14 2018-03-30 Neusoft Corporation Gesture password testing method and device, readable storage medium and electronic equipment
CN107861848B (en) * 2017-11-14 2021-06-04 Neusoft Corporation Gesture password testing method and device, readable storage medium and electronic equipment
CN109324741A (en) * 2018-09-30 2019-02-12 Guangzhou Yunce Information Technology Co., Ltd. Operation control method, device and system
CN109670292A (en) * 2018-12-12 2019-04-23 Beijing Yunce Information Technology Co., Ltd. iOS device gesture operation implementation method for prototype testing
CN114115563A (en) * 2021-11-30 2022-03-01 Nanjing Xingyun Digital Technology Co., Ltd. Operation track acquisition method, operation track playback method and device

Similar Documents

Publication Publication Date Title
CN105446646A (en) Virtual keyboard based content input method, apparatus and touch device
CN104598130A (en) Mode switching method, terminal, wearable equipment and device
CN104571923A (en) Touch feedback method, device and terminal
CN104536638A (en) Touch key and fingerprint identification implementation method and device and terminal equipment
CN104732201A (en) Touch key press and fingerprint identification implementation device and method, and terminal device
CN104598076A (en) Method and device for shielding touch messages
CN105242870A False touch prevention method and device for a terminal with a touch screen
CN104536684A (en) Interface displaying method and device
CN105160320A (en) Fingerprint identification method and apparatus, and mobile terminal
CN104899610A (en) Picture classification method and device
CN103995666A (en) Method and device for setting work mode
CN103916692A (en) Video playing method and device and playing terminal
CN105159496A (en) Touch event response method and mobile terminal
CN104536935A (en) Calculation displaying method, calculation editing method and device
CN105511780A (en) Test method and device
CN103986999A (en) Method, device and terminal equipment for detecting earphone impedance
CN105426042A (en) Icon position exchange method and apparatus
CN104407924A (en) Method and device for optimizing internal memory
CN104571709A (en) Mobile terminal and processing method of virtual keys
CN105323152A (en) Message processing method, device and equipment
CN106303896A Method and apparatus for playing audio
CN104299016A (en) Object location method and device
CN105159709A (en) Application starting method and apparatus, and intelligent terminal
CN106325621B Mobile terminal and touch response method
CN204214891U (en) Test circuit board and terminal device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2016-04-20