WO2019136961A1 - User interface display identification method, terminal device, storage medium and apparatus - Google Patents

User interface display identification method, terminal device, storage medium and apparatus

Info

Publication number
WO2019136961A1
WO2019136961A1 (PCT/CN2018/097518)
Authority
WO
WIPO (PCT)
Prior art keywords
screenshot
feature vector
target
user interface
normal
Prior art date
Application number
PCT/CN2018/097518
Other languages
English (en)
Chinese (zh)
Inventor
彭远杰
刘慧众
Original Assignee
深圳壹账通智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳壹账通智能科技有限公司
Publication of WO2019136961A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Definitions

  • The present application belongs to the field of mobile terminal technologies, and in particular relates to a user interface display identification method, a terminal device, a computer readable storage medium, and an apparatus.
  • Mobile phones and other mobile terminals run a variety of operating systems, such as the Android operating system. Since different phone models have different screen resolutions, software deployed across them must be compatible with, and display correctly at, each of these resolutions.
  • Existing user interface display detection relies on test scripts: the entire test script is executed, and after it completes the test result is compared with a preset fixed value. If the two match, the user interface under test is considered to display normally; otherwise its display is considered abnormal.
  • This existing detection method suffers from low script maintenance efficiency: whenever the user interface is adjusted, the entire test case must be rewritten and the fixed value reset, so the test cost is high and the approach is unsuitable for practical application.
  • The embodiments of the present application provide a user interface display identification method, a terminal device, a computer readable storage medium, and an apparatus, so as to solve the low script maintenance efficiency of existing user interface display detection methods.
  • A first aspect of the embodiments of the present application provides a user interface display identification method, including: acquiring a target interface screenshot of a user interface to be detected and extracting a feature vector of the target interface screenshot; acquiring a normal interface screenshot of a pre-stored normal user interface and extracting a feature vector of the normal interface screenshot; determining, according to the two feature vectors, a cosine distance between the target interface screenshot and the normal interface screenshot; determining whether the cosine distance exceeds a preset threshold; if it does, acquiring the level and position of each element control of the user interface to be detected; detecting whether the level and position of each element control are abnormal; and, if they are normal, determining that the user interface to be detected is displayed normally.
  • A second aspect of the embodiments of the present application provides a user interface display identification terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the above user interface display identification method when executing the computer program.
  • A third aspect of the embodiments of the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above user interface display identification method.
  • A fourth aspect of the embodiments of the present application provides a user interface display identification apparatus, which may include modules for implementing the steps of the above user interface display identification method.
  • The embodiments of the present application acquire a target interface screenshot of the user interface to be detected, determine the cosine distance between the target interface screenshot and a normal interface screenshot, and, upon determining that the cosine distance exceeds a preset threshold, detect whether the level and position of each element control of the user interface to be detected are abnormal. If the detection result is normal, the user interface to be detected is determined to display normally, which solves the low script maintenance efficiency of existing user interface display detection methods.
  • FIG. 1 is a schematic flowchart of a user interface display identification method provided by an embodiment of the present application;
  • FIG. 2 is a schematic flowchart of a user interface display identification method according to another embodiment of the present application;
  • FIG. 3 is a schematic flowchart of a user interface display identification method according to still another embodiment of the present application;
  • FIG. 4 is a schematic flowchart of a user interface display identification method according to yet another embodiment of the present application;
  • FIG. 5 is a schematic flowchart of a user interface display identification method according to a further embodiment of the present application;
  • FIG. 6 is a schematic block diagram of a user interface display recognition program according to an embodiment of the present application;
  • FIG. 7 is a schematic block diagram of a user interface display identification terminal device provided by an embodiment of the present application.
  • FIG. 1 is a schematic flowchart of a user interface display identification method according to an embodiment of the present application, described here from the perspective of the terminal as the executing entity.
  • The terminal may be a mobile terminal such as a smartphone or a tablet computer.
  • The processing performed by the terminal may include the following steps:
  • S101 Obtain a target interface screenshot of the user interface to be detected, and extract a feature vector of the target interface screenshot.
  • The user interface to be detected is any user interface that needs to be checked. A user interface (also known as a human-machine interface) is the medium for interaction and information exchange between a system and its user, converting between the internal form of information and a form acceptable to humans.
  • The target interface screenshot of the user interface to be detected may be acquired in real time or at a preset time; the specific acquisition mode may be set according to actual needs.
  • The obtained target interface screenshot may be displayed together with a prompt offering to re-acquire it; if a re-acquisition request is received, the target interface screenshot of the user interface to be detected is re-acquired according to the request.
  • The screenshot may also be obtained by capturing the target interface of the user interface to be detected several times and selecting, from the resulting screenshots, one that meets the requirements, for example the clearest one.
  • the extracted feature vector may be saved to facilitate subsequent data query.
  • S102 Obtain a normal interface screenshot of the pre-stored normal user interface, and extract a feature vector of the normal interface screenshot.
  • Ways of obtaining the pre-stored normal user interface include: pre-storing a correspondence between user interface types and normal user interfaces, obtaining the type of the user interface to be detected, and determining, from the correspondence, the normal user interface for that type.
  • The pre-stored normal user interface may also be determined from a capture taken while the user interface to be detected was displaying normally.
  • the manner of obtaining the screenshot of the normal interface of the pre-stored normal user interface is the same as the way of obtaining the screenshot of the target interface of the user interface to be detected, which may be acquired in real time or acquired at a preset time.
  • S103 Determine a cosine distance between the target interface screenshot and the normal interface screenshot according to the feature vector of the target interface screenshot and the feature vector of the normal interface screenshot.
  • The cosine distance, also called cosine similarity, measures the difference between two individuals by the cosine of the angle between their vectors in a vector space.
  • A vector is a directed line segment in multidimensional space. If two vectors point in the same direction, that is, the angle between them is close to zero, the two vectors are similar.
  • S104 Determine whether the cosine distance exceeds a preset threshold.
  • The preset threshold is set according to actual requirements. If the cosine distance exceeds the preset threshold, step S105 is performed; if it does not, the user interface to be detected may be directly determined to display abnormally and an alarm may be raised.
  • The anomaly that this cosine-distance judgment detects is a relatively large one: the judgment is made from the feature vector extracted from the target interface screenshot, and that feature vector is a global feature of the image, so only gross display anomalies move it below the threshold.
  • When the display anomaly of the user interface to be detected is relatively small, the cosine distance still exceeds the preset threshold, and the subsequent, finer detection step is then performed.
  • The user interface to be detected may be that of an application on a Mac, a browser, an iOS app, and the like.
  • Obtaining the hierarchy and location of each element control of the user interface to be detected may be obtained during an automated testing process or during a multi-machine compatibility testing process.
  • A correspondence between user interface types and abnormal-control detection rules may be preset; the type of the user interface to be detected is obtained, the abnormal-control detection rule corresponding to that type is determined from the correspondence, and whether the level and position of each element control are abnormal is detected according to that rule.
  • If it is detected that the level of one or more element controls is abnormal, it is determined that the user interface to be detected displays abnormally; first abnormality information may be generated, first alarm information may further be generated from it, and an alarm raised accordingly. Likewise, if it is detected that the position of one or more element controls is abnormal, it is determined that the user interface to be detected displays abnormally; second abnormality information may be generated, second alarm information generated from it, and an alarm raised accordingly.
  • The user interface display identification method of this embodiment solves the low script maintenance efficiency of existing user interface display detection methods. At the same time, large display anomalies of the user interface to be detected can be caught from its screenshot, and, combined with the level and position of each element control, smaller display anomalies can be detected as well, improving the accuracy of user interface display recognition, reducing test cost, and making the method suitable for practical application.
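Read as a whole, S101 through S107 form a two-stage check: a coarse screenshot-level comparison, then a per-control inspection. The following is a minimal sketch of that flow, not the patent's implementation; the threshold value, vector contents, and helper names are illustrative assumptions.

```python
import math

def cosine_similarity(x, y):
    """Cosine of the angle between two feature vectors (1.0 = same direction)."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return dot / (nx * ny)

def ui_displays_normally(target_vec, normal_vec, control_checks, threshold=0.95):
    """Stage 1: screenshot-level comparison catches large display anomalies.
    Stage 2: per-control level/position checks catch the small ones.

    `control_checks` is an iterable of booleans, one per element control,
    standing in for the real level-and-position detection.
    """
    if cosine_similarity(target_vec, normal_vec) <= threshold:
        return False  # gross anomaly: the screenshots differ too much
    return all(control_checks)  # fine-grained per-control check
```

For example, a screenshot identical in direction to the normal one, with every control check passing, yields a normal verdict, while a near-identical screenshot with one failing control check is flagged abnormal.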
  • FIG. 2 is a schematic flowchart of a user interface display identification method according to another embodiment of the present application.
  • Acquiring a target interface screenshot of the user interface to be detected and extracting the feature vector of the target interface screenshot may include S201.
  • S202 to S207 are the same as S102 to S107 in the previous embodiment.
  • S201 may include S2011 to S2014:
  • S2011 Obtain a target interface screenshot of the user interface to be detected, select a first preset number of samples from the target interface screenshot, and extract a second preset number of features for each sample.
  • the first preset number and the second preset number may be set according to actual needs, for example, m samples may be selected from the acquired screenshots, and n features are extracted for each sample.
  • S2012 Form a first matrix according to the first preset number of samples and a second preset number of features extracted by each sample.
  • each sample may be taken as one row to form a first matrix A of m*n.
  • S2013 Obtain a second matrix according to the first matrix and the transposed matrix of the first matrix.
  • S2014 Calculate a feature vector of the second matrix, and determine a feature vector of the target interface screenshot according to the feature vector of the second matrix.
  • The second matrix C is obtained from the first matrix A and its transpose; the feature vector (eigenvector) of C is then computed, and this eigenvector is taken as the feature vector of the screenshot.
  • The method is simple, its results are accurate, and it is suitable for practical application.
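Steps S2011 through S2014 amount to a PCA-style extraction: stack the m samples of n features into a first matrix A, form a second matrix from A and its transpose, and take that matrix's dominant eigenvector as the screenshot's feature vector. Below is a pure-Python sketch under the assumptions that the second matrix is C = A·Aᵀ and that the dominant eigenvector is the one wanted (found here by power iteration); all names are illustrative, not the patent's.

```python
import math

def transpose(A):
    """Columns of A become rows."""
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    """Plain matrix product of two nested-list matrices."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def principal_eigenvector(C, iters=200):
    """Power iteration: repeatedly apply C and renormalise to converge
    on the eigenvector with the largest-magnitude eigenvalue."""
    v = [1.0] * len(C)
    for _ in range(iters):
        w = [sum(c * x for c, x in zip(row, v)) for row in C]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def screenshot_feature_vector(samples):
    """samples: m rows of n features taken from the screenshot (first matrix A)."""
    A = samples
    C = matmul(A, transpose(A))  # second matrix: A times its transpose (m x m)
    return principal_eigenvector(C)
```

With real screenshots the samples would be pixel blocks or channel statistics; here any m×n list of numbers works.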
  • FIG. 3 is a schematic flowchart of a user interface display identification method according to another embodiment of the present application.
  • The difference from the embodiment corresponding to FIG. 1 is that determining the cosine distance between the target interface screenshot and the normal interface screenshot according to their feature vectors may include S303.
  • S301 to S302 are the same as S101 to S102 in the previous embodiment, and S304 to S307 are the same as S104 to S107 in the previous embodiment.
  • S303 may include S3031 to S3033:
  • S3031 Calculate a feature vector of the target interface screenshot and a length of a feature vector of the normal interface screenshot, respectively.
  • S3032 Obtain an inner product of a feature vector of the target interface screenshot and a feature vector of the normal interface screenshot.
  • S3033 Determine a cosine distance between the target interface screenshot and the normal interface screenshot according to the length of the feature vector of the target interface screenshot, the length of the feature vector of the normal interface screenshot, and the inner product.
  • The law of cosines relates each angle of a triangle to its three sides: given the three side lengths, every angle can be computed. If the sides are a, b and c, with corresponding opposite angles A, B and C, the cosine of angle A is cos A = (b² + c² − a²) / (2bc).
  • Viewing b and c as two vectors drawn from the vertex of angle A, this rearranges to cos A = (b · c) / (|b| |c|): the denominator is the product of the lengths of the two vectors and the numerator is their inner product.
  • Let the feature vector of the target interface screenshot X and the feature vector of the normal interface screenshot Y be (x1, x2, ..., x10) and (y1, y2, ..., y10) respectively.
  • The cosine distance between them can be expressed by the cosine of the angle between them: cos θ = (x1·y1 + x2·y2 + ... + x10·y10) / (√(x1² + ... + x10²) · √(y1² + ... + y10²)).
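The three quantities of S3031 through S3033 combine into the familiar formula cos θ = ⟨X, Y⟩ / (|X| · |Y|). A small self-contained sketch of the three steps follows; the vector values used in the usage note are purely illustrative.

```python
import math

def vector_length(v):
    """S3031: Euclidean length |v| of a feature vector."""
    return math.sqrt(sum(x * x for x in v))

def inner_product(x, y):
    """S3032: inner product <x, y> of the two feature vectors."""
    return sum(a * b for a, b in zip(x, y))

def cosine_distance(x, y):
    """S3033: cos(theta) = <x, y> / (|x| * |y|)."""
    return inner_product(x, y) / (vector_length(x) * vector_length(y))
```

Identical ten-component vectors give a cosine distance of 1.0, while orthogonal vectors give 0.0, matching the intuition that an angle near zero means similar screenshots.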
  • FIG. 4 is a schematic flowchart of a user interface display and identification method according to another embodiment of the present disclosure.
  • The difference from the embodiment corresponding to FIG. 1 is that, if it is determined that the cosine distance exceeds the preset threshold, acquiring the level and position of each element control of the user interface to be detected may include S405.
  • S401 to S404 are the same as S101 to S104 in the previous embodiment, and S406 to S407 are the same as S106 to S107 in the previous embodiment.
  • S405 may include S4051 to S4052:
  • S4051 If it is determined that the cosine distance exceeds the preset threshold, generate a running state acquisition command through the SDK.
  • S4052 Acquire the level and position of each element control of the user interface to be detected in real time according to the running state acquisition command.
  • The running state acquisition command is generated through the SDK, and the level and position information of each element control of the user interface to be detected is obtained in real time according to that command (each control's position is returned as the upper-left and lower-right coordinates of its rectangle). This is simple and convenient, and speeds up subsequent processing.
  • FIG. 5 is a schematic flowchart of a user interface display and identification method according to another embodiment of the present application.
  • Detecting whether the level and position of each element control are abnormal may include S506.
  • S501 to S505 are the same as S101 to S105 in the previous embodiment, and S507 is the same as S107 in the previous embodiment.
  • S506 may include S5061 to S5062:
  • S5061 Determine, according to a level of each of the element controls, a child control in each of the element controls, and a parent control corresponding to the child control.
  • S5062 Detect whether the target child control is positioned at the front end of its corresponding target parent control, and whether the range occupied by the target child control exceeds the range occupied by the target parent control, where the target child control is any one of the child controls determined among the element controls.
  • Taking control 1 and control 2 as an example: control 2 is a child control of control 1, and control 1 is the parent control of control 2; control 2 is rendered at the front end of control 1. Control 1 is a rectangle whose position is (x1, y1) (x2, y2), the upper-left and lower-right corners of the rectangle, and control 2 is likewise a rectangle with position (x3, y3) (x4, y4).
  • The detection rule corresponding to the user interface to be detected may be: a child control is set at the front end of its parent control, and the range of the child control cannot exceed the range occupied by the parent control. That is, control 2 is at the front end of control 1, and x3 > x1, x4 < x2, y3 > y1, y4 < y2 satisfy the rule; otherwise the display of the user interface to be detected has a problem.
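Under the coordinate convention above (rectangles given by their upper-left and lower-right corners, with y growing downward on screen), the containment part of the rule can be sketched as follows; the tuple layout is an assumption for illustration only.

```python
def child_within_parent(parent, child):
    """Each rect is (x_ul, y_ul, x_lr, y_lr) in screen coordinates (y grows downward).

    Returns True when the child rectangle lies strictly inside the parent,
    i.e. the rule x3 > x1, x4 < x2, y3 > y1, y4 < y2 from the text.
    """
    x1, y1, x2, y2 = parent
    x3, y3, x4, y4 = child
    return x3 > x1 and x4 < x2 and y3 > y1 and y4 < y2
```

For instance, a child at (10, 10, 90, 90) lies inside a parent at (0, 0, 100, 100), whereas a child whose right edge reaches 120 spills out of that parent and would be flagged as a display problem.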
  • FIG. 6 is a schematic diagram showing the operating environment of the user interface display recognition program provided by the embodiment of the present application.
  • the user interface display recognition program 600 is installed and runs in the terminal device 60.
  • the terminal device 60 can be a mobile terminal, a palmtop computer, a server, or the like.
  • the terminal device 60 can include, but is not limited to, a memory 601, a processor 602, and a display 603.
  • FIG. 7 is a functional block diagram of the user interface display recognition program 600 provided by the embodiment of the present application.
  • the user interface display identification device provided by the embodiment of the present application may also include these functional modules.
  • The user interface display recognition program 600 may be divided into one or more modules, which are stored in the memory 601 and executed by one or more processors (in this embodiment, the processor 602) to implement the present application.
  • The user interface display recognition program 600 can be divided into a first screenshot processing unit 701, a second screenshot processing unit 702, a cosine distance determining unit 703, a cosine distance judging unit 704, an element control obtaining unit 705, an element control detecting unit 706, and a user interface display determining unit 707.
  • The first screenshot processing unit 701 is configured to acquire a target interface screenshot of the user interface to be detected, and extract a feature vector of the target interface screenshot.
  • The second screenshot processing unit 702 is configured to acquire a normal interface screenshot of the pre-stored normal user interface, and extract a feature vector of the normal interface screenshot.
  • The cosine distance determining unit 703 is configured to determine a cosine distance between the target interface screenshot and the normal interface screenshot according to the feature vector of the target interface screenshot and the feature vector of the normal interface screenshot.
  • The cosine distance judging unit 704 is configured to determine whether the cosine distance exceeds a preset threshold.
  • The element control obtaining unit 705 is configured to acquire the level and position of each element control of the user interface to be detected if it is determined that the cosine distance exceeds the preset threshold.
  • The element control detecting unit 706 is configured to detect whether the level and position of each element control are abnormal.
  • The user interface display determining unit 707 is configured to determine that the user interface to be detected is displayed normally if the level and position of each element control are detected to be normal.
  • the first screenshot processing unit 701 may be divided into a sample selecting unit 7011, a first matrix forming unit 7012, a second matrix obtaining unit 7013, and a feature vector determining unit 7014.
  • the sample selection unit 7011 is configured to acquire a screenshot of the target interface of the user interface to be detected, select a first preset number of samples from the target interface screenshot, and extract a second preset number of features for each sample.
  • the first matrix constituting unit 7012 is configured to form a first matrix according to the first preset number of samples and a second preset number of features extracted by each sample.
  • the second matrix obtaining unit 7013 is configured to obtain a second matrix according to the first matrix and the transposed matrix of the first matrix.
  • the feature vector determining unit 7014 is configured to calculate a feature vector of the second matrix, and determine a feature vector of the target interface screenshot according to the feature vector of the second matrix.
  • the cosine distance determining unit 703 may be divided into a vector length calculating unit 7031, a vector inner product obtaining unit 7032, and a cosine distance processing unit 7033.
  • the vector length calculation unit 7031 is configured to separately calculate a feature vector of the target interface screenshot and a length of the feature vector of the normal interface screenshot.
  • the vector inner product obtaining unit 7032 is configured to obtain an inner product of the feature vector of the target interface screenshot and the feature vector of the normal interface screenshot.
  • The cosine distance processing unit 7033 is configured to determine the cosine distance between the target interface screenshot and the normal interface screenshot according to the length of the feature vector of the target interface screenshot, the length of the feature vector of the normal interface screenshot, and the inner product.
  • the element control obtaining unit 705 may be divided into a command generating unit 7051 and an element control processing unit 7052.
  • the command generating unit 7051 is configured to generate an operation state acquisition command through the SDK if it is determined that the cosine distance exceeds the preset threshold.
  • the element control processing unit 7052 is configured to acquire, in real time, the level and position of each element control of the user interface to be detected according to the running state acquisition command.
  • The element control detecting unit 706 may be further divided into a child control determining unit 7061 and a child control detecting unit 7062.
  • The child control determining unit 7061 is configured to determine, according to the level of each element control, the child controls among the element controls and the parent control corresponding to each child control.
  • The child control detecting unit 7062 is configured to detect whether the target child control is positioned at the front end of its corresponding target parent control, and whether the range occupied by the target child control exceeds the range occupied by the target parent control, where the target child control is any one of the determined child controls.
  • All or part of the processes of the above embodiments of the present application may also be completed by a computer program instructing the relevant hardware. The computer program may be stored in a computer readable storage medium and, when executed by a processor, implements the steps of the respective method embodiments described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to the field of mobile terminal technologies, and provides a user interface display identification method, a terminal device, a computer readable storage medium, and an apparatus. The method includes the steps of: acquiring a target interface screenshot of a user interface to be detected and extracting a feature vector of the target interface screenshot; acquiring a normal interface screenshot of a pre-stored normal user interface and extracting a feature vector of the normal interface screenshot; determining, according to the feature vector of the target interface screenshot and the feature vector of the normal interface screenshot, a cosine distance between the target interface screenshot and the normal interface screenshot; determining whether the cosine distance exceeds a preset threshold; if the cosine distance exceeds the preset threshold, acquiring the level and position of each element control of the user interface; detecting whether the level and position of each element control are normal; and, if the detected level and position of each element control are normal, determining that the user interface is displayed normally, which solves the low script maintenance efficiency of conventional user interface display detection methods.
PCT/CN2018/097518 2018-01-12 2018-07-27 User interface display identification method, terminal device, storage medium and apparatus WO2019136961A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810029553.8A CN108363599B (zh) 2018-01-12 2018-01-12 User interface display identification method and terminal device
CN201810029553.8 2018-01-12

Publications (1)

Publication Number Publication Date
WO2019136961A1 true WO2019136961A1 (fr) 2019-07-18

Family

ID=63011346

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/097518 WO2019136961A1 (fr) 2018-01-12 2018-07-27 User interface display identification method, terminal device, storage medium and apparatus

Country Status (2)

Country Link
CN (1) CN108363599B (fr)
WO (1) WO2019136961A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111600995A (zh) * 2020-05-15 2020-08-28 上海莉莉丝科技股份有限公司 Mobile device and method for correcting deviation of a game user interface of the mobile device
CN112686338A (zh) * 2021-03-10 2021-04-20 卡斯柯信号(北京)有限公司 Image information recognition method, apparatus, device and storage medium

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109144868B (zh) * 2018-08-15 2022-02-01 无线生活(杭州)信息科技有限公司 Page judgment method and device
CN109240923B (zh) * 2018-08-31 2021-06-04 福建天泉教育科技有限公司 Method for generating interface test scripts and computer readable storage medium
CN109446061B (zh) * 2018-09-17 2022-06-10 平安科技(深圳)有限公司 Page detection method, computer readable storage medium and terminal device
CN111475396A (zh) * 2019-01-24 2020-07-31 北京嘀嘀无限科技发展有限公司 Program detection method, apparatus, electronic device and storage medium
CN109886431B (zh) * 2019-01-25 2021-05-07 北京首都国际机场股份有限公司 Automatic inspection system for airport flight-information display terminals
CN109857673B (zh) * 2019-02-25 2022-02-15 北京云测信息技术有限公司 Control recognition method and device
CN109976854B (zh) * 2019-03-22 2023-02-24 维沃移动通信有限公司 Application processing method and terminal device
CN110309073B (zh) * 2019-06-28 2021-07-27 上海交通大学 Automated detection method, system and terminal for mobile application user interface errors
CN112231034A (zh) * 2019-12-23 2021-01-15 北京来也网络科技有限公司 Method and device for recognizing software interface elements combining RPA and AI
CN111880984A (zh) * 2020-06-09 2020-11-03 上海容易网电子商务股份有限公司 Monitoring system for the running state of Android interactive-screen applications

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110235902A1 (en) * 2010-03-29 2011-09-29 Ebay Inc. Pre-computing digests for image similarity searching of image-based listings in a network-based publication system
CN103927480A (zh) * 2013-01-14 2014-07-16 腾讯科技(深圳)有限公司 Method, device and system for identifying malicious web pages
CN106874753A (zh) * 2016-12-30 2017-06-20 中国建设银行股份有限公司 Method and device for identifying abnormal interfaces
CN107025174A (zh) * 2017-04-06 2017-08-08 网易(杭州)网络有限公司 Method, device and readable storage medium for user interface anomaly testing of a device
CN107315682A (zh) * 2017-06-22 2017-11-03 北京凤凰理理它信息技术有限公司 Browser compatibility testing method, device, storage medium and electronic device



Also Published As

Publication number Publication date
CN108363599A (zh) 2018-08-03
CN108363599B (zh) 2019-07-19

Similar Documents

Publication Publication Date Title
WO2019136961A1 (fr) User interface display identification method, terminal device, storage medium and apparatus
US11842438B2 (en) Method and terminal device for determining occluded area of virtual object
JP6815707B2 (ja) 顔姿勢検出方法、装置及び記憶媒体
RU2672181C2 (ru) Способ и устройство для генерирования команды
CN105320921B (zh) 双眼定位方法及双眼定位装置
US20150186004A1 (en) Multimode gesture processing
WO2019223461A1 (fr) Procédé de détection de toucher et support de stockage lisible par ordinateur
JP2017534123A (ja) フレキシブル表示装置の操作制御方法
US9069415B2 (en) Systems and methods for finger pose estimation on touchscreen devices
WO2022002262A1 (fr) Procédé et appareil de reconnaissance de séquences de caractères basés sur la vision artificielle, dispositif et support
WO2019019372A1 (fr) Procédé et dispositif d'exploitation et de commande d'image pour terminal mobile, terminal mobile et support
CN112683169A (zh) 物体尺寸测量方法、装置、设备及存储介质
CN112446917B (zh) 一种姿态确定方法及装置
JP2017068465A (ja) 情報処理装置、制御方法、及びプログラム
WO2019090691A1 (fr) Procédé et terminal d'essai du type singe
CN111141217A (zh) 物体测量方法、装置、终端设备及计算机存储介质
CN108984097B (zh) 触控操作方法、装置、存储介质及电子设备
EP2975503A2 (fr) Dispositif tactile et procédé tactile correspondant
CN109614175B (zh) 用户界面异常处理方法、装置、计算机设备及存储介质
CN114202804A (zh) 行为动作识别方法、装置、处理设备及存储介质
US20170185831A1 (en) Method and device for distinguishing finger and wrist
CN109325945B (zh) 图像处理方法、装置、电子设备及储存介质
CN112837214A (zh) 三维人像的获取方法、视频分析方法、装置、设备和介质
US8937600B1 (en) Method and apparatus for interpreting multi-touch events
JP2015084193A (ja) 携帯電子機器及び制御プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18899869

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 13.11.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18899869

Country of ref document: EP

Kind code of ref document: A1