CN114924686A - Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment - Google Patents


Info

Publication number
CN114924686A
CN114924686A
Authority
CN
China
Prior art keywords
gesture
view
core management class
finger touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210850145.5A
Other languages
Chinese (zh)
Other versions
CN114924686B (en)
Inventor
谭斌
尹欣荣
肖灵聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xingka Software Technology Development Co Ltd
Original Assignee
Shenzhen Xingka Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xingka Software Technology Development Co Ltd filed Critical Shenzhen Xingka Software Technology Development Co Ltd
Priority to CN202310208397.2A priority Critical patent/CN116820320A/en
Priority to CN202210850145.5A priority patent/CN114924686B/en
Publication of CN114924686A publication Critical patent/CN114924686A/en
Application granted granted Critical
Publication of CN114924686B publication Critical patent/CN114924686B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to the technical field of automobile diagnosis, and in particular to an intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment. The intelligent gesture response processing method for the automobile diagnosis equipment comprises the following steps: creating a corresponding core management class in a view layout existing in the source code; creating a gesture view through the core management class; generating a custom animated icon within the gesture view; displaying the animated icon on the screen; registering a listening area for the gesture in the core management class to monitor finger touch actions in real time; judging the validity of the finger touch action and the touch area in the gesture view; and, when the judgment is valid, returning the corresponding animated icon and finger touch action response through the core management class. The gesture-based back operation of the automobile diagnosis equipment thus becomes smoother and more attractive, gesture icons of different styles can be flexibly displayed per item for the user's back gesture, and the gesture response function is enriched, making the automobile diagnosis equipment more intelligent.

Description

Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment
Technical Field
The invention relates to the technical field of automobile diagnosis, in particular to an intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment.
Background
In the technical field of automobile diagnosis, the rapid development of the intelligent era has pushed the design of diagnosis equipment toward improving user experience. Yet if returning to the previous interface layer on a diagnosis device can only be done through a fixed navigation button on the interface, the interaction is rather limited; moreover, a single gesture graphic cannot meet diversified and personalized requirements.
For example, publication No. CN105549838A, published 2016.05.04, discloses a method for destroying the current Activity triggered by a sliding gesture, which includes: opening an application to enter a first Activity interface; pressing one side of the screen and slowly sliding toward the other side; when the slide reaches one third of the screen width, the application executes the finish() method; and destroying the current Activity interface to return to a second Activity interface. That scheme provides a gesture response in which sliding from left to right returns to the previous layer and destroys the current Activity, achieving the effect of clicking the back button, thereby facilitating user operation and improving the experience.
That scheme achieves the back operation through gestures and expands the operation modes, but its gesture-response UI is monotonous and its function simple, so it cannot meet diversified and personalized requirements.
Disclosure of Invention
In order to remedy the monotonous UI and limited functionality of gesture response in conventional automobile diagnosis equipment, the invention provides an intelligent gesture response processing method for automobile diagnosis equipment, comprising the following steps:
s100, creating a corresponding core management class in a view layout existing in a source code;
s200, creating a gesture view through the core management class;
s300, generating a self-defined animation icon in the gesture view;
s400, displaying the animation icon on a screen;
s500, registering a monitoring area of the gesture in the core management class to monitor a finger touch action in real time;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, returning corresponding animation icons and finger touch action responses through the core management class when the judgment result in the S600 is valid.
Preferably, the gesture view in step S200 is carried on the whole screen.
Preferably, the listening area of the gesture is registered through WindowManagerService (WMS) in step S500.
Preferably, when the listening area of the gesture is registered in the core management class in step S500, the range and position of the listening area are dynamically adjusted according to the usage scenario and the operating user's authority, where the range and position may take a user-defined irregular (non-geometric) shape fitted to the device screen layout.
Preferably, step S600 judges the sliding distance of the finger to achieve different responses.
Preferably, the gesture view generated in step S200 is associated, mapped and bound with the authority ID of the operating user, that is, the same gesture view responds differently according to the authority of the operating user.
Preferably, in step S700, the number of responses returned within a preset time is dynamically set and adjusted according to the frequency of finger touch actions detected in real time.
The invention also provides an intelligent gesture response processing device for automobile diagnosis equipment, comprising:
The core management module is used for creating a corresponding core management class in the view layout existing in the source code;
the creating module is used for creating a gesture view through the core management class;
the drawing module is used for generating a self-defined animation icon in the gesture view;
the display module is used for displaying the animation icon on a screen;
the monitoring module is used for registering a monitoring area of the gesture in the core management class so as to monitor the finger touch action in real time;
the judging module is used for judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and the response module is used for returning the corresponding animated icon and finger touch action response through the core management class when the judging module judges the action valid.
The invention also provides an automobile diagnosis device, which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the steps of the gesture response processing method as described in any of the above when executing the computer program.
The present invention also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the gesture response processing method as described in any of the above.
Based on the above, compared with the prior art, the gesture response processing method provided by the invention enables the gesture return operation of the automobile diagnosis equipment to be smoother and more attractive, and can flexibly display different styles of gesture icons according to items for a user to return by utilizing gestures. Meanwhile, different response actions can be made according to preset settings, and the gesture response function is enriched.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in their description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention, and those skilled in the art can derive other drawings from them without creative effort. The drawings are schematic and are not intended to limit the present invention.
Fig. 1 is a flowchart of a gesture intelligent response processing method for an automobile diagnostic device according to the present invention;
FIG. 2 is an overall flowchart of the practical operation of embodiment 1 of the present invention;
FIG. 3 is a schematic structural diagram of an intelligent gesture response processing apparatus for an automotive diagnostic device according to the present invention;
fig. 4 is a schematic structural diagram of an automotive diagnostic apparatus provided by the present invention.
Reference numerals are as follows:
10 core management module; 20 creation module; 30 drawing module; 40 display module; 50 monitoring module; 60 judging module; 70 response module.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are some, but not all, embodiments of the invention. The technical features devised in the different embodiments described below can be combined with each other as long as they do not conflict. All other embodiments obtainable by a person skilled in the art without inventive effort, based on the embodiments of the present invention, fall within the scope of protection of the present invention.
In the description of the present invention, it should be noted that, unless otherwise defined, all terms used herein (including technical and scientific terms) have the meanings commonly understood by those of ordinary skill in the art to which the invention belongs and should not be construed as limiting it. Terms should be interpreted consistently with their meaning in the context of this specification and the relevant art, and not in an idealized or overly formal sense unless expressly so defined herein.
Example 1
To solve the problems that the gesture-response UI is monotonous and its function simple, failing to meet diversified and personalized requirements, this embodiment provides an intelligent gesture response processing method for an automobile diagnostic device (see figs. 1 and 2), applied to the automobile diagnostic device and comprising the following steps:
s100, creating a corresponding core management class in a view layout existing in a source code;
s200, creating a gesture view through the core management class;
s300, generating a self-defined animation icon in the gesture view;
s400, displaying the animation icon on a screen;
s500, registering a monitoring area of the gesture in a core management class to monitor a finger touch action in real time;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, when the judgment result in the S600 is valid, returning the corresponding animation icon and the corresponding finger touch action response through the core management class.
The intelligent gesture response processing method for the automobile diagnosis equipment can be implemented as secondary development based on Android 10. In actual development, an EdgeBackGestureHandler class can be created in SystemUI through a DI (dependency injection) framework; this EdgeBackGestureHandler class is the core management class of the whole method. At construction time the core management class initializes the preset parameters and variables required for gesture judgment, and it injects them into the gesture view (NavigationBarEdgePanel) when creating it. A custom animated icon is then generated in the gesture view and displayed on the screen. Meanwhile, the core management class registers a listening area to monitor finger touch actions in real time; the specific listening-area setup is constructed with reference to InputChannel. The gesture view judges the validity of the finger touch action and the touch area from the monitored data such as touch position and touch distance, and when valid, the corresponding animated icon and finger touch action response are returned through the core management class.
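The construct-then-inject flow described above can be sketched in plain Java. The names CoreGestureManager, GestureParams and GestureView (and all the pixel values) are illustrative stand-ins for SystemUI's EdgeBackGestureHandler/NavigationBarEdgePanel, not the actual Android classes:

```java
// Minimal sketch: the core management class is built with preset gesture
// parameters and injects them into the gesture view it creates.
class CoreGestureManager {
    // Preset parameters needed for gesture judgment (illustrative values).
    static final class GestureParams {
        final int edgeTriggerMinPx;   // min distance from the edge to trigger
        final int edgeTriggerMaxPx;   // max distance from the edge to trigger
        final int validSlidePx;       // width of the valid slide window
        GestureParams(int min, int max, int slide) {
            this.edgeTriggerMinPx = min;
            this.edgeTriggerMaxPx = max;
            this.validSlidePx = slide;
        }
    }

    // Stand-in for the gesture view (NavigationBarEdgePanel in the text).
    static final class GestureView {
        final GestureParams params;   // injected by the core management class
        GestureView(GestureParams params) { this.params = params; }
    }

    private final GestureParams params;

    CoreGestureManager() {
        // Initialize the preset parameters at construction time, as described.
        this.params = new GestureParams(30, 40, 30);
    }

    // Create the gesture view and inject the preset parameters into it.
    GestureView createGestureView() {
        return new GestureView(params);
    }
}
```

The point of the sketch is only the ownership direction: parameters live in the manager and flow one way into the view it creates.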
Referring to fig. 2, S100-S400 use the factory design pattern to establish multiple sets of gesture-UI styles, switchable per user preference. If user A wants a prompt UI of a white arrow inside a red dot, with sliding on either the left or right side returning the interface to the previous layer with the same effect, API method 1 is established for that requirement; if user B wants an intuitive ripple animation effect, with a left slide returning to the upper-level page and a right slide pulling out the sidebar, API method 2 is established. Once established, each method responds according to the actual operation. Such customized animated icons allow, on the one hand, tailoring to each user's behavior habits to improve efficiency, and on the other hand, providing personalized service to improve the use experience.
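The factory-pattern idea for the two example users can be sketched as follows. The style keys, interface and return strings are all hypothetical illustrations of "API method 1" and "API method 2", not names from the patent:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch: a factory maps a chosen gesture-UI style to the object that
// describes its icon and answers swipe gestures.
class GestureUiFactory {
    interface GestureUi {
        String describeIcon();             // what the animated icon looks like
        String onSwipe(String direction);  // response for a "left"/"right" swipe
    }

    // "API method 1": white arrow in a red dot; both edges return back.
    static class RedDotArrowUi implements GestureUi {
        public String describeIcon() { return "red dot with white arrow"; }
        public String onSwipe(String direction) { return "back"; }
    }

    // "API method 2": ripple effect; left = back, right = pull out sidebar.
    static class RippleUi implements GestureUi {
        public String describeIcon() { return "ripple animation"; }
        public String onSwipe(String direction) {
            return direction.equals("left") ? "back" : "sidebar";
        }
    }

    private static final Map<String, Supplier<GestureUi>> REGISTRY = new HashMap<>();
    static {
        REGISTRY.put("redDotArrow", RedDotArrowUi::new);
        REGISTRY.put("ripple", RippleUi::new);
    }

    static GestureUi create(String style) {
        Supplier<GestureUi> s = REGISTRY.get(style);
        if (s == null) throw new IllegalArgumentException("unknown style: " + style);
        return s.get();
    }
}
```

Registering styles in a map keeps adding a third or fourth gesture UI a one-line change, which is the flexibility the factory pattern is chosen for here.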
To complete the response, S500 performs the corresponding listening step: the distance between the finger's press position and the screen edge is monitored, and when the finger touches within 30-40 pixels of the left or right edge, a response instruction is sent to the API method appointed by the system and the custom icon effect is displayed on the screen. Then, as the finger slides left or right, a slide distance within 0-30 pixels is taken as the preset valid listening window. Step S600 then judges whether the action falls within the valid listening area; if so, S700 executes the return to the previous page. If S500 detects that the press position is more than 30-40 pixels from the edge (but less than the full screen width), the area is invalid and the judgment of S600 is not performed. Further, since the application runs in the normal immersive mode, a slide from the edge would not be recognized as a gesture and no signal could be sent; therefore the sticky immersive mode is adopted globally, which lets the device optimize the sliding effect without a notification bar at the top of the system. At the code level, the creation of the gesture view and the registration of the listening area may be performed simultaneously.
Preferably, the gesture view in step S200 may be carried on the whole screen.
Specifically, in step S500, the listening area of the gesture may be registered through WindowManagerService (WMS).
Example 2
A fixed response area is unsuitable for some practical scenarios. The invention therefore also provides embodiment 2 to adapt to different application scenarios, comprising the following steps:
s100, creating a corresponding core management class in a view layout existing in a source code;
s200, creating a gesture view through the core management class;
s300, generating a self-defined animation icon in the gesture view;
s400, displaying the animation icon on a screen;
s500, monitoring a finger touch action in real time in a monitoring area of a registered gesture in a core management class, and dynamically adjusting the range and the position of the monitoring area according to a use scene and the authority of an operating user, wherein the range and the position are in a shape of a self-defined non-geometric rule according to the screen layout of equipment;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, when the judgment result in the S600 is valid, returning the corresponding animation icon and the corresponding finger touch action response through the core management class.
If the gesture response area is fixed, it may not suit some application scenarios: the automobile diagnosis device may be placed high or low, where a fixed area on the screen is inconvenient to reach; some screen areas may also be physically damaged, making a gesture response placed there unusable. Therefore, in step S500, a person skilled in the art can preset and dynamically adjust the range and position of the listening area according to the usage scenario and the operating user's authority; the specific range and position may be standard geometric shapes or custom shapes fitted to the device screen layout. In practice this can be realized by pre-partitioning, by subsequent selection, or by custom partitioning. Since the listening area need not cover the whole screen, hardware resources are saved and physically damaged areas can be avoided.
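One way to sketch such an adjustable listening area is to model the region as a point-membership test, so a rectangle, a custom shape, or a shape with a damaged patch carved out are all interchangeable. The class and helper names here are hypothetical, not from the patent:

```java
import java.util.function.BiPredicate;

// Sketch of embodiment 2: the listening region is a swappable (x, y) membership
// test, so its range and position can be adjusted dynamically at runtime.
class ListeningRegion {
    private BiPredicate<Integer, Integer> contains;

    ListeningRegion(BiPredicate<Integer, Integer> contains) {
        this.contains = contains;
    }

    // Dynamically replace the region, e.g. per usage scenario or user authority.
    void adjust(BiPredicate<Integer, Integer> newRegion) {
        this.contains = newRegion;
    }

    boolean hit(int x, int y) { return contains.test(x, y); }

    // A plain rectangular region (a strip along one screen edge, say).
    static BiPredicate<Integer, Integer> rect(int x0, int y0, int x1, int y1) {
        return (x, y) -> x >= x0 && x <= x1 && y >= y0 && y <= y1;
    }

    // The same region with a physically damaged area carved out of it.
    static BiPredicate<Integer, Integer> excluding(
            BiPredicate<Integer, Integer> base,
            BiPredicate<Integer, Integer> damaged) {
        return (x, y) -> base.test(x, y) && !damaged.test(x, y);
    }
}
```

Composing predicates this way gives the "non-geometric" shapes the embodiment mentions without the region code ever knowing how the shape was built.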
Moreover, an irregular (non-geometric) shape can be made unusable to users without professional training, preventing faults caused by misoperation or disordered operation by non-professional users.
Example 3
To further address the inefficiency of gesture responses that must be repeated many times in practical applications, the invention also provides embodiment 3 to improve the efficiency of gesture operation response, comprising the following steps:
s100, creating a corresponding core management class in a view layout existing in a source code;
s200, creating a gesture view through the core management class;
s300, generating a self-defined animation icon in the gesture view;
s400, displaying the animation icon on a screen;
s500, registering a monitoring area of the gesture in the core management class to monitor a finger touch action in real time;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view, and judging the sliding distance of the finger to realize different responses;
and S700, returning corresponding animation icons and finger touch action responses through the core management class when the judgment result in the S600 is valid.
If the diagnosis equipment's software has many interfaces, turning them page by page via gesture responses is inefficient. Hence the sliding distance of the finger is judged in S600 to realize different responses. Specifically, the 0-30 pixel finger-slide window judged in step S600 may be divided into 3 segments, each corresponding to turning 1, 2 or 3 pages. A user who wants to jump directly to the 2nd or 3rd page can thus control the page turn by the sliding distance, improving efficiency and saving the device's software and hardware resources.
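A minimal sketch of that segment mapping, assuming the 0-30 px window is split into three equal 10 px segments (the text says three segments but does not fix the cut points, so the thresholds below are an assumption):

```java
// Sketch of embodiment 3: map the valid 0-30 px slide window onto 1/2/3 pages.
class SlideDistancePager {
    // Returns how many pages to turn for a given slide distance, or 0 when the
    // slide falls outside the valid 0-30 px window (no response).
    static int pagesFor(int slideDistancePx) {
        if (slideDistancePx < 0 || slideDistancePx > 30) return 0; // invalid
        if (slideDistancePx <= 10) return 1;   // first segment: turn 1 page
        if (slideDistancePx <= 20) return 2;   // second segment: turn 2 pages
        return 3;                              // third segment: turn 3 pages
    }
}
```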
Example 4
In actual use of the automobile diagnosis equipment, operators vary and misoperation occurs easily. The invention therefore also provides embodiment 4 to facilitate the operation and management of the equipment, comprising the following steps:
s100, creating a corresponding core management class in a view layout existing in a source code;
s200, establishing a gesture view through the core management class, and performing associated mapping binding on the generated gesture view and the operation user authority ID, namely that the response results of the same user gesture are different due to different operation user authorities;
s300, generating a self-defined animation icon in the gesture view;
s400, displaying the animation icon on a screen;
s500, registering a monitoring area of the gesture in a core management class to monitor a finger touch action in real time;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, when the judgment result in the S600 is valid, returning the corresponding animation icon and the corresponding finger touch action response through the core management class.
Current gesture responses lack management-authority settings: anyone who enters can operate regardless of authority, which is unfavorable for management. Therefore the gesture view generated in step S200 is bound to the operating user's authority ID, i.e. the same gesture view responds differently under different operating authorities (for example, users with different authorities performing the same horizontal slide may trigger different functions). A high-level manager entering the equipment interface gets a large gesture response area and many operable gesture types, whereas an ordinary user gets a relatively small response area and a more limited set of gesture types. This prevents the management problem of ordinary operators continuing to operate after a high-level manager has logged in. In a specific implementation, different authority accounts are set at the equipment system level, each with a corresponding operating-user authority ID and a gesture library configured for that ID's authority. Depending on the equipment model and configuration, the corresponding authority account is activated by fingerprint identification, gesture input, password entry or the like, and the corresponding operating-user authority ID is configured. When the core management class creates the gesture view, it injects the logic for judging the operating user's authority ID; after creation, the gesture view completes the associative mapping binding with that ID, and can then traverse the gesture library and make the corresponding judgments.
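The authority-ID-to-gesture-library binding can be sketched as a two-level map. The permission IDs, gesture names and response strings below are purely illustrative:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of embodiment 4: the same gesture resolves to different responses
// depending on the operating user's authority ID.
class PermissionGestureMap {
    // authority ID -> (gesture name -> response)
    private final Map<String, Map<String, String>> gestureLibrary = new HashMap<>();

    // Bind one gesture under one authority ID to its response.
    void bind(String permissionId, String gesture, String response) {
        gestureLibrary.computeIfAbsent(permissionId, k -> new HashMap<>())
                      .put(gesture, response);
    }

    // Traverse the gesture library bound to this authority ID; gestures not
    // granted to this authority get no response.
    String respond(String permissionId, String gesture) {
        return gestureLibrary.getOrDefault(permissionId, Map.of())
                             .getOrDefault(gesture, "no-op");
    }
}
```

Because the outer key is the authority ID, an ordinary user's library can simply omit the manager-only gestures, which is exactly the "smaller response type set" behavior described above.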
Example 5
When the intelligent gesture response processing method of the automobile diagnosis equipment is actually operated, an operator may act too fast and cause misoperation. The invention therefore also provides embodiment 5 to optimize the method, comprising the following steps:
s100, creating a corresponding core management class in a view layout existing in a source code;
s200, creating a gesture view through the core management class;
s300, generating a self-defined animation icon in the gesture view;
s400, displaying the animation icon on a screen;
s500, registering a monitoring area of the gesture in the core management class to monitor a finger touch action in real time;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, when the judgment of S600 is valid, returning the corresponding animated icon and finger touch action response through the core management class, and dynamically adjusting the number of responses returned within a preset time according to the frequency of finger touch actions detected in real time.
Existing gesture response processing methods suffer from misoperation when the user operates too fast per unit time during a back operation. Therefore the number of responses returned within a preset time may be capped in S700. Specifically, only 1 gesture response per unit time can be honored, for example only 1 response per 0.5 s, which both avoids misoperation and prevents frequent operation from wasting the equipment's hardware resources.
The number of responses returned within the preset time can further be adjusted dynamically from the frequency of finger touch actions detected in real time. When many touch actions are detected within a certain period, operation is frequent and quick response is needed, so the number of responses returned within the preset time can be increased appropriately; conversely, when few touch actions are detected, operation is infrequent, no quick response is needed, and the number can be reduced appropriately. This satisfies operational use while saving the equipment's software and hardware resources.
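The capped-and-adjustable response behavior is essentially a sliding-window rate limiter. The sketch below passes the current time in explicitly so it is deterministic; the window size, caps and the frequency threshold are illustrative, not values fixed by the patent:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of embodiment 5: cap responses per window, and raise/lower the cap
// from the observed touch frequency.
class ResponseRateLimiter {
    private final long windowMs;
    private int maxResponsesPerWindow;
    private final Deque<Long> recentResponses = new ArrayDeque<>();

    ResponseRateLimiter(long windowMs, int initialMax) {
        this.windowMs = windowMs;
        this.maxResponsesPerWindow = initialMax;
    }

    // Dynamically adjust the cap from the real-time touch frequency:
    // frequent touches get a higher cap, sparse touches a lower one.
    void adjustForTouchFrequency(int touchesInLastWindow) {
        maxResponsesPerWindow = touchesInLastWindow > 5 ? 3 : 1;
    }

    // Returns true if a response may be issued at time nowMs.
    boolean tryRespond(long nowMs) {
        // Drop response timestamps that have aged out of the window.
        while (!recentResponses.isEmpty()
                && nowMs - recentResponses.peekFirst() >= windowMs) {
            recentResponses.pollFirst();
        }
        if (recentResponses.size() >= maxResponsesPerWindow) return false;
        recentResponses.addLast(nowMs);
        return true;
    }
}
```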
Example 6
To solve the above problems, the present invention further provides Embodiment 6, which optimizes the automobile diagnostic device and includes the following steps:
S100, creating a corresponding core management class in a view layout existing in a source code;
S200, creating a gesture view through the core management class;
S300, generating a self-defined animation icon in the gesture view;
S400, displaying the animation icon on a screen;
S500, registering a monitoring area of the gesture in the core management class to monitor finger touch actions in real time, wherein the monitoring cycle frequency differs for operators with different authorities and for different devices;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
S700, when the judgment of S600 is valid, returning the corresponding animation icon and finger touch action response through the core management class.
Listening continuously in a gesture response processing method wastes device resources. In step S500, the monitoring cycle frequency therefore differs for operators with different authorities and for different devices. For example, a senior administrator has broad operation authority and can perform complicated operations, so high-frequency monitoring is used; a general user has a smaller gesture response interface and a more limited set of gesture response types, so the monitoring frequency is set lower.
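A minimal sketch of authority-dependent monitoring frequency, assuming a simple three-level authority model (the Authority levels and the interval values are illustrative, not taken from the patent):

```java
import java.util.EnumMap;
import java.util.Map;

// Hypothetical mapping from operator authority to monitoring period:
// higher authority -> shorter polling interval -> higher-frequency listening.
public class MonitorSchedule {
    public enum Authority { ADMINISTRATOR, TECHNICIAN, GENERAL_USER }

    private static final Map<Authority, Long> POLL_INTERVAL_MILLIS =
            new EnumMap<>(Authority.class);
    static {
        POLL_INTERVAL_MILLIS.put(Authority.ADMINISTRATOR, 10L);  // high-frequency listening
        POLL_INTERVAL_MILLIS.put(Authority.TECHNICIAN, 50L);
        POLL_INTERVAL_MILLIS.put(Authority.GENERAL_USER, 200L);  // fewer gesture types: low frequency
    }

    /** Returns the monitoring cycle period, in milliseconds, for an authority level. */
    public static long pollIntervalFor(Authority a) {
        return POLL_INTERVAL_MILLIS.get(a);
    }
}
```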
Preferably, the monitoring can also be woken according to the timing of the diagnostic work of the automobile diagnostic equipment, for example stopping the monitoring after the diagnostic work starts and waking it when the work reaches its final stage. Preferably, when the automobile diagnostic equipment receives no operation within a set time, which the user can configure, the monitoring is stopped and the equipment reduces the screen brightness; once an operation is received, the brightness is restored and the monitoring module is woken at the same time.
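The idle sleep/wake behaviour above can be sketched as a small state machine. The class and method names (ListenerPowerManager, tick, onOperation) and the use of millisecond timestamps are assumptions made for this example, not part of the patent:

```java
// Hypothetical sketch: after a configurable idle timeout the listener stops
// and the screen dims; any subsequent operation restores both.
public class ListenerPowerManager {
    private final long idleTimeoutMillis;   // user-configurable idle time
    private long lastOperationMillis;
    private boolean listening = true;
    private boolean screenDimmed = false;

    public ListenerPowerManager(long idleTimeoutMillis, long nowMillis) {
        this.idleTimeoutMillis = idleTimeoutMillis;
        this.lastOperationMillis = nowMillis;
    }

    /** Called periodically; stops listening and dims the screen once idle. */
    public void tick(long nowMillis) {
        if (listening && nowMillis - lastOperationMillis >= idleTimeoutMillis) {
            listening = false;
            screenDimmed = true;   // lower brightness to save device resources
        }
    }

    /** Any user operation restores brightness and wakes the listener. */
    public void onOperation(long nowMillis) {
        lastOperationMillis = nowMillis;
        listening = true;
        screenDimmed = false;
    }

    public boolean isListening() { return listening; }
    public boolean isScreenDimmed() { return screenDimmed; }
}
```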
Example 7
This embodiment further provides an intelligent gesture response processing apparatus for automobile diagnostic equipment which, as shown in fig. 3, comprises:
The core management module is used for creating a corresponding core management class in the view layout existing in the source code;
the creating module is used for creating a gesture view through the core management class;
the drawing module is used for generating a self-defined animation icon in the gesture view;
the display module is used for displaying the animation icon on a screen;
the monitoring module is used for registering a monitoring area of the gesture in the core management class so as to monitor the finger touch action in real time;
the judging module is used for judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and the response module is used for returning the corresponding animation icon and finger touch action response through the core management class when the judgment of the judging module is valid.
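The judging module's validity check — touch position inside the registered listening area, touch distance above a threshold — can be sketched as follows. The rectangular area and the minimum-swipe threshold are simplifying assumptions, since the patent allows irregular listening areas, and all names here are hypothetical:

```java
// Hypothetical sketch of the judging module (S600). A rectangle stands in
// for the possibly irregular listening area, and a Euclidean swipe-distance
// threshold stands in for the touch-distance check.
public class GestureJudge {
    private final int left, top, right, bottom;   // registered listening area
    private final float minSwipeDistance;         // minimum valid touch distance

    public GestureJudge(int left, int top, int right, int bottom, float minSwipeDistance) {
        this.left = left;
        this.top = top;
        this.right = right;
        this.bottom = bottom;
        this.minSwipeDistance = minSwipeDistance;
    }

    /** The action is valid when the touch-down point lies inside the listening
     *  area and the finger moved at least the threshold distance. */
    public boolean isValid(float downX, float downY, float upX, float upY) {
        boolean inArea = downX >= left && downX <= right
                      && downY >= top && downY <= bottom;
        float dx = upX - downX;
        float dy = upY - downY;
        return inArea && Math.hypot(dx, dy) >= minSwipeDistance;
    }
}
```

For a 40 px edge strip and a 30 px threshold, a 50 px swipe starting inside the strip is valid, while a short swipe or one starting outside the strip is not.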
It should be noted that the division of the above apparatus into modules is only a logical division; in an actual implementation, the modules may be wholly or partly integrated into one physical entity or kept physically separate. The modules may be implemented entirely as software invoked by a processing element, entirely as hardware, or partly as software invoked by a processing element and partly as hardware. For example, the processing module may be a separately arranged processing element, may be integrated into a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code whose functions are invoked and executed by a processing element of the apparatus. The other modules are implemented similarly. In addition, the modules may be wholly or partly integrated together or implemented independently. The processing element described here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
Example 8
The present embodiment provides an automotive diagnostic apparatus, as shown in fig. 4, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the gesture response processing method as described in any of the above when executing the computer program.
Embodiments of the present invention further provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the gesture response processing method as described in any of the above.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses, devices and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. Those of ordinary skill in the art will appreciate that the various illustrative modules and steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus, device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only a logical division, and there may be other divisions when the actual implementation is performed, or units having the same function may be grouped into one unit, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
In addition, it will be appreciated by those skilled in the art that, although there may be many problems with the prior art, each embodiment or aspect of the present invention may improve on only one or several of them, without necessarily solving all of the technical problems listed for the prior art or in the background at the same time. Those skilled in the art will also understand that nothing in this description should be taken as an additional limitation on any claim.
Although terms such as source code, core management class and monitoring are used frequently herein, the possibility of using other terms is not excluded. These terms are used merely to describe and explain the essence of the present invention more conveniently, and construing them as imposing any additional limitation would be contrary to the spirit of the present invention. It is further noted that relational terms such as first and second may be used herein solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises", "comprising" and any variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or electronic device that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising an …" does not exclude the presence of other identical elements in the process, method, article or electronic device that comprises the element.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An intelligent gesture response processing method for automobile diagnosis equipment is characterized by comprising the following steps:
S100, creating a corresponding core management class in a view layout existing in a source code;
S200, creating a gesture view through the core management class;
S300, generating a self-defined animation icon in the gesture view;
S400, displaying the animation icon on a screen;
S500, registering a monitoring area of the gesture in the core management class to monitor a finger touch action in real time;
S600, judging the validity of the finger touch action and of the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, when the judgment of S600 is valid, returning the corresponding animation icon and finger touch action response through the core management class.
2. The intelligent gesture response processing method for the automobile diagnostic equipment, according to claim 1, is characterized in that: in step S200, the gesture view is carried on the whole screen.
3. The intelligent gesture response processing method for the automobile diagnostic equipment, according to claim 1, is characterized in that: in step S500, the listening area of the gesture is registered through WindowManagerService.
4. The intelligent gesture response processing method for the automobile diagnostic equipment according to claim 1, characterized in that: in step S500, when the monitoring area of the gesture is registered in the core management class, the range and the position of the monitoring area are dynamically adjusted according to the use scenario and the authority of the operating user, and the range and the position may take a self-defined irregular, non-geometric shape according to the device screen layout.
5. The intelligent gesture response processing method for the automobile diagnosis equipment as claimed in claim 1, wherein: in step S600, different responses are realized by determining the sliding distance of the finger.
6. The intelligent gesture response processing method for the automobile diagnostic equipment according to claim 1, characterized in that: the gesture view generated in step S200 is associated with, mapped to and bound to the operating user's permission ID, that is, the same gesture view responds differently under different operating user permissions.
7. The intelligent gesture response processing method for the automobile diagnostic equipment according to any one of claims 1 to 6, characterized in that: in step S700, the number of times of response return within a preset time is dynamically adjusted according to the frequency of the real-time detection of the finger touch action.
8. An intelligent gesture response processing apparatus for automobile diagnostic equipment, characterized by comprising:
The core management module is used for creating a corresponding core management class in the view layout existing in the source code;
the creating module is used for creating a gesture view through the core management class;
the drawing module is used for generating a self-defined animation icon in the gesture view;
the display module is used for displaying the animation icon on a screen;
the monitoring module is used for registering a monitoring area of the gesture in the core management class so as to monitor the finger touch action in real time;
the judging module is used for judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and the response module is used for returning the corresponding animation icon and finger touch action response through the core management class when the judgment of the judging module is valid.
9. An automotive diagnostic device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the gesture response processing method according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the gesture response processing method according to any one of claims 1 to 7.
CN202210850145.5A 2022-07-20 2022-07-20 Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment Active CN114924686B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310208397.2A CN116820320A (en) 2022-07-20 2022-07-20 Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment
CN202210850145.5A CN114924686B (en) 2022-07-20 2022-07-20 Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210850145.5A CN114924686B (en) 2022-07-20 2022-07-20 Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310208397.2A Division CN116820320A (en) 2022-07-20 2022-07-20 Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment

Publications (2)

Publication Number Publication Date
CN114924686A true CN114924686A (en) 2022-08-19
CN114924686B CN114924686B (en) 2023-03-14

Family

ID=82815606

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310208397.2A Pending CN116820320A (en) 2022-07-20 2022-07-20 Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment
CN202210850145.5A Active CN114924686B (en) 2022-07-20 2022-07-20 Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310208397.2A Pending CN116820320A (en) 2022-07-20 2022-07-20 Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment

Country Status (1)

Country Link
CN (2) CN116820320A (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110078560A1 (en) * 2009-09-25 2011-03-31 Christopher Douglas Weeldreyer Device, Method, and Graphical User Interface for Displaying Emphasis Animations for an Electronic Document in a Presentation Mode
US8509986B1 (en) * 2012-04-27 2013-08-13 Innova Electronics, Inc. Automotive diagnostic tool with projection display and virtual input
CN105425966A (en) * 2015-12-14 2016-03-23 珠海全志科技股份有限公司 Gesture control method and device based on Android system
CN106445243A (en) * 2016-11-07 2017-02-22 深圳Tcl数字技术有限公司 Touch responding device and method for intelligent equipment
US20170102697A1 (en) * 2015-10-08 2017-04-13 General Motors Llc Selecting a vehicle function to control using a wearable electronic device
CN106886331A (en) * 2017-01-12 2017-06-23 青岛海信移动通信技术股份有限公司 A kind of data processing method of touch terminal, device and touch terminal
WO2017113407A1 (en) * 2015-12-31 2017-07-06 华为技术有限公司 Gesture recognition method and apparatus, and electronic device
US20170308751A1 (en) * 2016-04-26 2017-10-26 Hyundai Motor Company Wearable device and vehicle diagnosis apparatus including the same
CN107357479A (en) * 2016-05-10 2017-11-17 中兴通讯股份有限公司 The management method and device of application program
CN107660278A (en) * 2015-06-19 2018-02-02 英特尔公司 To the technology of the computing resource of control electronics
CN109271220A (en) * 2018-08-16 2019-01-25 广州优视网络科技有限公司 Method, calculating equipment and the storage medium that the page returns are controlled by gesture operation
CN110069207A (en) * 2019-04-24 2019-07-30 努比亚技术有限公司 Touch control operation response method, device, mobile terminal and readable storage medium storing program for executing
US20200090430A1 (en) * 2018-09-17 2020-03-19 Westinghouse Air Brake Technologies Corporation Diagnostic System for a Transit Vehicle
CN113110771A (en) * 2021-04-01 2021-07-13 Tcl通讯(宁波)有限公司 Desktop application icon display control method and device, terminal equipment and storage medium
CN113508360A (en) * 2020-02-11 2021-10-15 荣耀终端有限公司 Card display method, electronic device and computer readable storage medium
CN113821128A (en) * 2020-06-18 2021-12-21 华为技术有限公司 Terminal device, gesture operation method thereof and medium

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110078560A1 (en) * 2009-09-25 2011-03-31 Christopher Douglas Weeldreyer Device, Method, and Graphical User Interface for Displaying Emphasis Animations for an Electronic Document in a Presentation Mode
US8509986B1 (en) * 2012-04-27 2013-08-13 Innova Electronics, Inc. Automotive diagnostic tool with projection display and virtual input
US20130289820A1 (en) * 2012-04-27 2013-10-31 Innnova Electronics, Inc. Automotive Diagnostic Tool with Virtual Display and Input
CN107660278A (en) * 2015-06-19 2018-02-02 英特尔公司 To the technology of the computing resource of control electronics
US20170102697A1 (en) * 2015-10-08 2017-04-13 General Motors Llc Selecting a vehicle function to control using a wearable electronic device
CN105425966A (en) * 2015-12-14 2016-03-23 珠海全志科技股份有限公司 Gesture control method and device based on Android system
WO2017113407A1 (en) * 2015-12-31 2017-07-06 华为技术有限公司 Gesture recognition method and apparatus, and electronic device
US20170308751A1 (en) * 2016-04-26 2017-10-26 Hyundai Motor Company Wearable device and vehicle diagnosis apparatus including the same
CN107357479A (en) * 2016-05-10 2017-11-17 中兴通讯股份有限公司 The management method and device of application program
CN106445243A (en) * 2016-11-07 2017-02-22 深圳Tcl数字技术有限公司 Touch responding device and method for intelligent equipment
CN106886331A (en) * 2017-01-12 2017-06-23 青岛海信移动通信技术股份有限公司 A kind of data processing method of touch terminal, device and touch terminal
CN109271220A (en) * 2018-08-16 2019-01-25 广州优视网络科技有限公司 Method, calculating equipment and the storage medium that the page returns are controlled by gesture operation
US20200090430A1 (en) * 2018-09-17 2020-03-19 Westinghouse Air Brake Technologies Corporation Diagnostic System for a Transit Vehicle
CN110069207A (en) * 2019-04-24 2019-07-30 努比亚技术有限公司 Touch control operation response method, device, mobile terminal and readable storage medium storing program for executing
CN113508360A (en) * 2020-02-11 2021-10-15 荣耀终端有限公司 Card display method, electronic device and computer readable storage medium
CN113821128A (en) * 2020-06-18 2021-12-21 华为技术有限公司 Terminal device, gesture operation method thereof and medium
CN113110771A (en) * 2021-04-01 2021-07-13 Tcl通讯(宁波)有限公司 Desktop application icon display control method and device, terminal equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TECHMERGER: "In-depth Analysis of the Implementation Principle of Android System Back Gestures", 《HTTPS://JUEJIN.CN/POST/7103503592119599117》 *

Also Published As

Publication number Publication date
CN114924686B (en) 2023-03-14
CN116820320A (en) 2023-09-29

Similar Documents

Publication Publication Date Title
CN106020613B (en) A kind of operating method and mobile terminal of unread message
CN104573552B (en) A kind of method and device of hiden application icon
CN104035706B (en) Display methods and electronic installation
CN107231677A (en) One kind breath screen awakening method, device and mobile terminal
CN110147256B (en) Multi-screen interaction method and device
CN106354373A (en) Icon moving method and mobile terminal
CN106681623A (en) Screenshot picture sharing method and mobile terminal
CN103189830A (en) Mode switching
CN106843654A (en) The method and mobile terminal of a kind of terminal multi-job operation
CN105094508A (en) Method and apparatus for performing window control on application program of mobile terminal
CN105335048A (en) Electron equipment with concealed application icon and application icon conceal method
CN111201501A (en) Method for providing haptic feedback to an operator of a touch sensitive display device
CN106557259B (en) A kind of operating method and mobile terminal of mobile terminal
CN106228085B (en) The method for secret protection and mobile terminal of application program
CN107037971A (en) Application management device, mobile terminal and method
CN106484301A (en) A kind of method of hiden application and terminal
CN104516640A (en) Touch display device and method for dynamically setting touch confinement region
CN107272949A (en) Method of controlling operation thereof and device, computer installation and storage medium
CN106126090B (en) The control method and electronic equipment of a kind of electronic equipment
CN107291226A (en) Control method and device, terminal based on touch gestures
WO2022105447A1 (en) Iot device control method, apparatus, terminal, and storage medium
CN114924686B (en) Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment
CN111796746B (en) Volume adjusting method, volume adjusting device and electronic equipment
CN110908580B (en) Method and device for controlling application
CN108037912A (en) Show the method, apparatus of page info

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant