CN110968250A - Method and system for realizing touch screen gesture simulation in unity editor environment - Google Patents

Method and system for realizing touch screen gesture simulation in unity editor environment

Info

Publication number
CN110968250A
CN110968250A
Authority
CN
China
Prior art keywords
gesture
unity
client
plug-in
raspberry pi
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911208684.3A
Other languages
Chinese (zh)
Inventor
白启扉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tibet ningsuan Technology Group Co.,Ltd.
Original Assignee
Dilu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dilu Technology Co Ltd filed Critical Dilu Technology Co Ltd
Priority to CN201911208684.3A
Publication of CN110968250A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/30 Creation or generation of source code
    • G06F 8/33 Intelligent editors

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method and system for simulating touch screen gestures in a Unity editor environment, comprising the following steps: a Unity client imports an encapsulated gesture plug-in; an intermediate service is built on the WebSocket protocol; the intermediate service bidirectionally connects a Raspberry Pi with the Unity client; the Raspberry Pi, with a touch pad connected as a peripheral, transmits gesture messages to the server side of the intermediate service using a gesture code library packaged on the Raspberry Pi; the server side forwards the gesture messages to the Unity client; and the gesture plug-in parses them for app logic simulation. The invention has the beneficial effects that the touch pad can be operated directly on the PC through the Raspberry Pi, and gestures can be developed and debugged in the Unity editor, improving development efficiency.

Description

Method and system for realizing touch screen gesture simulation in unity editor environment
Technical Field
The invention relates to the technical field of Unity development, and in particular to a method for simulating touch screen gestures in a Unity editor environment and a system for simulating touch screen gestures on a PC.
Background
With the development of the internet in recent years, the internet (also called the international network) refers to a huge network formed by interconnecting networks: the constituent networks are linked to one another by a set of common protocols to form a logically single, global network, which includes network devices such as switches and routers, various connection links, various servers, and countless computers and terminals. Using the internet, information can be sent instantly to people thousands of miles away; the internet is the foundation of the information society.
Gesture development is also a popular research topic on the internet. In the field of interaction design, gestures and controls differ greatly in natural, touch-based interactive interfaces. Currently, when mobile games and apps are developed with Unity, touch and gesture features usually have to be released to a real device for testing; compiling Unity builds for the mobile platform therefore takes much time and is inconvenient.
Disclosure of Invention
This section is for the purpose of summarizing some aspects of embodiments of the invention and to briefly introduce some preferred embodiments. In this section, as well as in the abstract and the title of the invention of this application, simplifications or omissions may be made to avoid obscuring the purpose of the section, the abstract and the title, and such simplifications or omissions are not intended to limit the scope of the invention.
The present invention has been made in view of the above problems in the conventional art.
Therefore, one technical problem solved by the present invention is: providing a method for simulating touch screen gestures that improves development efficiency and enables gesture debugging in the Unity editor.
In order to solve the above technical problems, the invention provides the following technical scheme: a method for simulating touch screen gestures in a Unity editor environment, comprising the following steps: a Unity client imports an encapsulated gesture plug-in; an intermediate service is built on the WebSocket protocol; the intermediate service bidirectionally connects a Raspberry Pi with the Unity client; the Raspberry Pi, with a touch pad connected as a peripheral, transmits gesture messages to the server side of the intermediate service using a gesture code library packaged on the Raspberry Pi; the server side forwards the gesture messages to the Unity client; and the gesture plug-in parses them for app logic simulation.
As a preferred solution of the method for simulating touch screen gestures in a Unity editor environment, the method includes: the gesture plug-in of the Unity client is implemented as follows: the gesture plug-in is written in the C# language; the gesture code is packaged into the gesture plug-in; the imported gesture plug-in comprises a main prefab, which the user drags into a scene to serve as a non-destroyed node of the scene and take effect globally; and the gesture plug-in provides an interface that the user overrides to implement customized gesture handling.
As a preferred solution of the method for simulating touch screen gestures in a Unity editor environment, the method includes: building the WebSocket-based intermediate service comprises the following steps: using Node.js as the server; starting the Unity client to register with the server and keep a long connection; and starting the Raspberry Pi's touch pad client to register with the service and keep a long connection.
As a preferred solution of the method for simulating touch screen gestures in a Unity editor environment, the method includes: the Raspberry Pi, serving as an independent peripheral client after the touch pad is connected, is set up in the following steps: writing the peripheral client in the C++ language; calling the driver of the touch pad; packaging a basic gesture library according to the different events of the touch pad, corresponding one-to-one with the gesture formats of the gesture plug-in; and the client converting the operating behavior of the touch pad into a message queue and sending it to the server side.
As a preferred solution of the method for simulating touch screen gestures in a Unity editor environment, the method includes: a response frequency is set when the operations are converted into the message queue and sent to the server side.
As a preferred solution of the method for simulating touch screen gestures in a Unity editor environment, the method includes: the server side receives messages from the Raspberry Pi client and forwards them to the Unity client; the Unity client does not distinguish the gesture source and processes each message exactly as it would a real gesture.
As a preferred solution of the method for simulating touch screen gestures in a Unity editor environment, the method includes: the WebSocket intermediate service comprises the following steps: creating a Maven project; creating an HTML front-end page; modifying web.xml to add a welcome page; and after the Java code is written and run, the server side immediately receives the client's messages.
The invention also solves the technical problem of providing a system for simulating touch screen gestures in a Unity editor environment, which improves development efficiency and enables gesture debugging in the Unity editor.
In order to solve the above technical problems, the invention provides the following technical scheme: a system for simulating touch screen gestures on a PC comprises a Unity client, a communication module, a Raspberry Pi, and a touch pad client; the Unity client is used for importing the packaged gesture plug-in; the communication module is used for establishing a bidirectional intermediate-service communication connection between the Unity client and the Raspberry Pi, the Raspberry Pi being connected with the touch pad client to serve as the peripheral end; gesture development is performed through the interface provided by the Raspberry Pi, and the gesture plug-in parses the gestures for app logic simulation.
The invention has the beneficial effects that the touch pad can be operated directly on the PC through the Raspberry Pi, and gestures can be developed and debugged in the Unity editor, improving development efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise. Wherein:
FIG. 1 is a schematic overall flowchart illustrating a method for simulating a touch screen gesture in a unity editor environment according to a first embodiment of the present invention;
fig. 2 is a timing diagram of handshake establishment according to the first embodiment of the present invention;
FIG. 3 is a schematic diagram of a system for simulating touch screen gestures in a unity editor environment according to a second embodiment of the present invention;
fig. 4 is a schematic block diagram illustrating an overall principle of a system for simulating a touch screen gesture in a unity editor environment according to a first embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, specific embodiments accompanied with figures are described in detail below, and it is apparent that the described embodiments are a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making creative efforts based on the embodiments of the present invention, shall fall within the protection scope of the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways than those specifically described and will be readily apparent to those of ordinary skill in the art without departing from the spirit of the present invention, and therefore the present invention is not limited to the specific embodiments disclosed below.
Furthermore, reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
The present invention will be described in detail with reference to the drawings, wherein the cross-sectional views illustrating the structure of the device are not enlarged partially in general scale for convenience of illustration, and the drawings are only exemplary and should not be construed as limiting the scope of the present invention. In addition, the three-dimensional dimensions of length, width and depth should be included in the actual fabrication.
Meanwhile, in the description of the present invention, it should be noted that the terms "upper, lower, inner and outer" and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation and operate, and thus, cannot be construed as limiting the present invention. Furthermore, the terms first, second, or third are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The terms "mounted, connected and connected" in the present invention are to be understood broadly, unless otherwise explicitly specified or limited, for example: can be fixedly connected, detachably connected or integrally connected; they may be mechanically, electrically, or directly connected, or indirectly connected through intervening media, or may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
Example 1
Referring to the schematic diagrams of figs. 1-2, a method for simulating a touch screen gesture in a Unity editor environment is provided for this embodiment. It should be noted that this embodiment refers to the PC terminal in a Unity editor environment: a touch pad can be operated directly on the PC through a Raspberry Pi, and debugging gestures can be developed in the Unity editor. Compiling and releasing to a real device is too time-consuming and labor-intensive, and real-device debugging is not as direct and convenient as editor debugging, so this approach effectively improves development efficiency.
In particular, the method for simulating touch screen gestures in a Unity editor environment comprises the following steps.
S1: the Unity client imports the packaged gesture plug-in.
The gesture plug-in of the Unity client is implemented as follows:
the gesture plug-in is written in the C# language;
the gesture code is packaged into the gesture plug-in;
the imported gesture plug-in comprises a main prefab, which the user drags into a scene to serve as a non-destroyed node of the scene and take effect globally. A prefab, also called a preset, is a resource type: a reusable game object stored in the project view. A prefab can be placed into multiple scenes, multiple times per scene; when you add a prefab to a scene, an instance of it is created. All instances are linked to the original prefab and are essentially clones of it: no matter how many instances exist in your project, any change you make to the prefab applies to all of them. When the prefab source changes, the changes apply to all linked game objects; for example, if a new script is added to a prefab, all linked game objects immediately contain the script. However, it is possible to change the properties of a single instance while keeping the link: overriding an attribute of one instance makes its variable name bold, and overridden attributes are unaffected by changes to the prefab source. This lets you modify prefab instances to make them unique without breaking their link to the source. The gesture plug-in provides an interface that the user overrides to implement customized gesture handling.
It should be noted that the interface is implemented as an API (application programming interface): a set of predefined functions or conventions that link the different components of a software system. The goal is to give applications and developers access to a set of routines based on certain software or hardware without having to access native code or understand the details of the internal workings. The C# language is an object-oriented, high-level programming language released by Microsoft that runs on top of the .NET Framework and .NET Core (fully open source, cross-platform).
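To make the pattern concrete, a minimal C# sketch of such a plug-in component is given below. It is only an illustration under assumptions: the names GestureSimulator, OnTap, OnSwipe, and OnPinch are hypothetical, since the patent does not disclose the plug-in's actual API; only the prefab, non-destroyed-node, and user-override pattern come from the text.

using UnityEngine;

// Minimal sketch of the gesture plug-in's overridable interface (hypothetical names).
// The component sits on the main prefab; the user drags the prefab into a scene,
// where it survives scene loads and takes effect globally.
public class GestureSimulator : MonoBehaviour
{
    protected virtual void Awake()
    {
        DontDestroyOnLoad(gameObject); // act as the scene's non-destroyed node
    }

    // Users subclass and override these hooks to customize gesture handling.
    public virtual void OnTap(Vector2 position) { }
    public virtual void OnSwipe(Vector2 start, Vector2 end) { }
    public virtual void OnPinch(float deltaDistance) { }
}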
S2: building an intermediate service based on the WebSocket protocol.
Building the WebSocket-based intermediate service comprises the following steps:
using Node.js as the server;
starting the Unity client to register with the server and keep a long connection;
starting the Raspberry Pi's touch pad client to register with the service and keep a long connection.
Further, the WebSocket intermediate service comprises the following steps:
creating a Maven project;
creating an HTML front-end page;
modifying web.xml to add a welcome page;
writing the Java code;
after running, the server side immediately receives the client's messages.
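On the Unity side, the register-and-keep-a-long-connection step might look like the following minimal C# sketch using .NET's System.Net.WebSockets.ClientWebSocket (assuming a scripting runtime where that type is available); the endpoint ws://localhost:8080 and the register message shape are assumptions, since the patent does not disclose the actual wire format.

using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

public static class UnityRegistrationClient
{
    // Connects to the intermediate service, registers, and keeps a long connection.
    public static async Task RunAsync()
    {
        var socket = new ClientWebSocket();
        await socket.ConnectAsync(new Uri("ws://localhost:8080"), CancellationToken.None);

        // Identify this connection as the Unity client (hypothetical message shape).
        byte[] register = Encoding.UTF8.GetBytes("{\"type\":\"register\",\"role\":\"unity\"}");
        await socket.SendAsync(new ArraySegment<byte>(register),
            WebSocketMessageType.Text, true, CancellationToken.None);

        // Long-connection receive loop: the server pushes forwarded gesture messages here.
        var buffer = new byte[4096];
        while (socket.State == WebSocketState.Open)
        {
            WebSocketReceiveResult result =
                await socket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
            string json = Encoding.UTF8.GetString(buffer, 0, result.Count);
            Console.WriteLine(json); // hand off to the gesture plug-in in practice (steps S5/S6)
        }
    }
}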
It should be noted that WebSocket is a communication protocol newly added with HTML5, and the currently popular browsers support it, such as Chrome, Safari, Firefox, Opera, and IE. Chrome supported the protocol earliest, starting from Chrome 12, and as the protocol draft has continued to change, each browser's implementation has been continuously updated.
The WebSocket protocol is a bidirectional communication protocol built on TCP. Like HTTP, it transmits data over TCP, but it differs from HTTP in two ways. First, WebSocket is bidirectional: after a connection is established, the WebSocket server and the Browser/UA can each actively send or receive data, just like a socket; the difference is that WebSocket is a simple protocol that emulates a socket on top of the Web. Second, WebSocket requires a handshake to connect: similar to TCP, the client and server must complete a handshake, and only after it succeeds can they communicate with each other.
After receiving the handshake request sent by the Browser/UA, if the packet data and format are correct and the protocol version numbers of the client and server match, the WebSocket server accepts the handshake and returns a corresponding reply, transmitted using the HTTP protocol. After receiving the server's reply packet, if its content and format are correct, the Browser considers the connection successful and triggers an onopen event, at which point the Web developer can send data to the server through the send interface. Otherwise, the handshake fails, the Web application receives an onerror event, and it can learn the reason for the connection failure.
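For reference, a typical handshake exchange (the sample key and accept value are taken from RFC 6455) looks like this:

GET /chat HTTP/1.1
Host: server.example.com
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13

HTTP/1.1 101 Switching Protocols
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=

The Sec-WebSocket-Accept value is the base64-encoded SHA-1 of the client key concatenated with a fixed GUID, which is how each side proves it is speaking WebSocket rather than plain HTTP.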
Framing and parsing of WebSocket data packets are implemented through open-source library files whose interfaces can be called. For example, pywebsocket is written in Python, achieves cross-platform support well, and is simple to extend; WebKit currently uses it to build a WebSocket server for its LayoutTests. WebSocket-Node is written in JavaScript and built on Node.js. Still other libraries are written in C/C++ and are more customizable, letting the developer participate in everything from TCP listening to frame assembly.
S3: bidirectionally connecting the Raspberry Pi with the Unity client by using the intermediate service.
S4: the Raspberry Pi is connected with the touch pad as a peripheral, and transmits gesture messages to the server side of the intermediate service using the gesture code library packaged on the Raspberry Pi.
The Raspberry Pi, serving as an independent peripheral client after the touch pad is connected, is set up in the following steps:
writing the peripheral client in the C++ language;
calling the driver of the touch pad;
packaging a basic gesture library according to the different events of the touch pad, corresponding one-to-one with the gesture formats of the gesture plug-in;
the client converting the operating behavior of the touch pad into a message queue and sending it to the server side. A response frequency is set when converting to the message queue and sending to the server side, to prevent gestures from being sent too frequently.
S5: the server side forwards the gesture message to the Unity client.
The server side receives the message from the Raspberry Pi client and forwards it to the Unity client;
the Unity client does not distinguish the gesture source and processes the message exactly as it would a real gesture.
S6: the gesture plug-in parses the message for app logic simulation.
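The parse-and-dispatch step might look like the following minimal C# sketch, reusing the hypothetical JSON shape shown earlier (GestureMessage and its fields are assumed names). The one point taken from the patent is that simulated and real gestures share a single handling path.

using System;
using UnityEngine;

[Serializable]
public class GestureMessage
{
    public string name;      // e.g. "tap", "swipe", "pinch" (hypothetical values)
    public Vector2[] points;
    public long timestamp;
}

public class GestureReceiver : MonoBehaviour
{
    // Called with each message the intermediate service forwards from the Raspberry Pi.
    public void OnServerMessage(string json)
    {
        GestureMessage msg = JsonUtility.FromJson<GestureMessage>(json);
        HandleGesture(msg); // same entry point a real touch gesture would use
    }

    void HandleGesture(GestureMessage msg)
    {
        // App logic simulation: raise the same events a real touch screen would,
        // without inspecting where the gesture came from.
    }
}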
Scenario 1:
Previously, the touch screen gesture features of an app had to be compiled and packaged onto a real phone, with each compile-and-package cycle taking about 10 minutes; the real phone could only write log files for debugging, which is inefficient, and after every code change the app had to be packaged again for testing. With the traditional method, compiling and packaging takes close to 10 minutes and debugging is inconvenient. The present method requires no packaging, is debugged directly, and supports breakpoint debugging.
To verify and explain the technical effect adopted in this method, this embodiment compares it against a different method by testing, using scientific demonstration to compare the test results and verify the real effect of this method. The traditional technical scheme uses the existing real-device debugging mode, while this method is based on the Raspberry Pi touch pad mode; an actual simulation test was run to compare and record the time consumed in development. The comparative results recorded in the actual measurements are given in Table 1 below.
Table 1: and (6) actual comparison.
Mode of development Debugging of existing real machine Raspberry pie-based touch pad
Compile time 300s 5s
Packing time 300s Is free of
Time of flight 60s Is free of
Debugging mode Log debugging Breakpoint debugging
Difficulty of debugging Is very difficult to be Is obviously convenient
It should be recognized that embodiments of the present invention can be realized and implemented by computer hardware, a combination of hardware and software, or by computer instructions stored in a non-transitory computer readable memory. The methods may be implemented in a computer program using standard programming techniques, including a non-transitory computer-readable storage medium configured with the computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner, according to the methods and figures described in the detailed description. Each program may be implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language. Furthermore, the program can be run on a programmed application specific integrated circuit for this purpose.
Further, the operations of processes described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The processes described herein (or variations and/or combinations thereof) may be performed under the control of one or more computer systems configured with executable instructions, and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications) collectively executed on one or more processors, by hardware, or combinations thereof. The computer program includes a plurality of instructions executable by one or more processors.
Further, the method may be implemented in any type of computing platform operatively connected to a suitable interface, including but not limited to a personal computer, mini computer, mainframe, workstation, networked or distributed computing environment, separate or integrated computer platform, or in communication with a charged particle tool or other imaging device, and the like. Aspects of the invention may be embodied in machine-readable code stored on a non-transitory storage medium or device, whether removable or integrated into a computing platform, such as a hard disk, optically read and/or write storage medium, RAM, ROM, or the like, such that it may be read by a programmable computer, which when read by the storage medium or device, is operative to configure and operate the computer to perform the procedures described herein. Further, the machine-readable code, or portions thereof, may be transmitted over a wired or wireless network. The invention described herein includes these and other different types of non-transitory computer-readable storage media when such media include instructions or programs that implement the steps described above in conjunction with a microprocessor or other data processor. The invention also includes the computer itself when programmed according to the methods and techniques described herein. A computer program can be applied to input data to perform the functions described herein to transform the input data to generate output data that is stored to non-volatile memory. The output information may also be applied to one or more output devices, such as a display. In a preferred embodiment of the invention, the transformed data represents physical and tangible objects, including particular visual depictions of physical and tangible objects produced on a display.
Example 2
Referring to the illustrations of figs. 3-4, this embodiment provides a system for simulating touch screen gestures in a Unity editor environment. The system comprises a Unity client 100, a communication module 200, a Raspberry Pi 300, and a touch pad client 400: the Unity client 100 is used for importing the packaged gesture plug-in; the communication module 200 is used for establishing a bidirectional intermediate-service communication connection between the Unity client 100 and the Raspberry Pi 300, the Raspberry Pi being connected with the touch pad client 400 to serve as the peripheral end; gesture development is performed through the interface provided by the Raspberry Pi 300, and the gesture plug-in parses the gestures for app logic simulation.
As used in this application, the terms "component," "module," "system," and the like are intended to refer to a computer-related entity, either hardware, firmware, a combination of hardware and software, or software in execution. For example, a component may be, but is not limited to being: a process running on a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of example, both an application running on a computing device and the computing device can be a component. One or more components can reside within a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers. In addition, these components can execute from various computer readable media having various data structures thereon. The components may communicate by way of local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the internet with other systems by way of the signal).
It should be noted that the above-mentioned embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention, which should be covered by the claims of the present invention.

Claims (8)

1. A method for simulating touch screen gestures in a Unity editor environment, characterized by comprising the following steps:
importing the encapsulated gesture plug-in by the Unity client;
building an intermediate service based on the WebSocket protocol;
bidirectionally connecting a Raspberry Pi with the Unity client by using the intermediate service;
connecting the Raspberry Pi with a touch pad as a peripheral, and transmitting a gesture message to a server side of the intermediate service by using a gesture code library packaged on the Raspberry Pi;
forwarding, by the server side, the gesture message to the Unity client; and
parsing, by the gesture plug-in, the gesture message for app logic simulation.
2. The method for simulating touch screen gestures in a Unity editor environment of claim 1, wherein the gesture plug-in of the Unity client is implemented as follows:
the gesture plug-in is written in the C# language;
the gesture code is packaged into the gesture plug-in;
the imported gesture plug-in comprises a main prefab, which a user drags into a scene to serve as a non-destroyed node of the scene and take effect globally; and
the gesture plug-in provides an interface that the user overrides to implement customized gesture handling.
3. The method for simulating touch screen gestures in a Unity editor environment of claim 1 or 2, wherein building the WebSocket-based intermediate service comprises the following steps:
using Node.js as the server;
starting the Unity client to register with the server and keep a long connection; and
starting the touch pad client of the Raspberry Pi to register with the service and keep a long connection.
4. The method for simulating touch screen gestures in a Unity editor environment of claim 3, wherein the Raspberry Pi, serving as an independent peripheral client after the touch pad is connected, is set up in the following steps:
writing the peripheral client in the C++ language;
calling the driver of the touch pad;
packaging a basic gesture library according to the different events of the touch pad, corresponding one-to-one with the gesture formats of the gesture plug-in; and
converting, by the client, the operating behavior of the touch pad into a message queue and sending it to the server side.
5. The method for simulating touch screen gestures in a Unity editor environment of claim 4, wherein a response frequency is set when converting to the message queue and sending to the server side.
6. The method for simulating touch screen gestures in a Unity editor environment of claim 4 or 5, characterized by comprising the following steps:
receiving, by the server side, the message from the Raspberry Pi client and forwarding it to the Unity client; and
the Unity client not distinguishing the gesture source and processing the message exactly as it would a real gesture.
7. The method for simulating touch screen gestures in a Unity editor environment of claim 6, wherein the WebSocket intermediate service comprises the following steps:
creating a Maven project;
creating an HTML front-end page;
modifying web.xml to add a welcome page; and
writing the Java code, after whose running the server side immediately receives the client's messages.
8. A system for simulating touch screen gestures in a Unity editor environment, characterized in that: the system comprises a Unity client (100), a communication module (200), a Raspberry Pi (300), and a touch pad client (400);
the Unity client (100) is used for importing the packaged gesture plug-in;
the communication module (200) is used for establishing a bidirectional intermediate-service communication connection between the Unity client (100) and the Raspberry Pi (300), the Raspberry Pi being connected with the touch pad client (400) to serve as the peripheral end; and
gesture development is performed through the interface provided by the Raspberry Pi (300), and the gesture plug-in parses the gestures for app logic simulation.
CN201911208684.3A 2019-11-30 2019-11-30 Method and system for realizing touch screen gesture simulation in unity editor environment Pending CN110968250A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911208684.3A CN110968250A (en) 2019-11-30 2019-11-30 Method and system for realizing touch screen gesture simulation in unity editor environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911208684.3A CN110968250A (en) 2019-11-30 2019-11-30 Method and system for realizing touch screen gesture simulation in unity editor environment

Publications (1)

Publication Number Publication Date
CN110968250A (en) 2020-04-07

Family

ID=70032406

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911208684.3A Pending CN110968250A (en) 2019-11-30 2019-11-30 Method and system for realizing touch screen gesture simulation in unity editor environment

Country Status (1)

Country Link
CN (1) CN110968250A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108628455A (en) * 2018-05-14 2018-10-09 中北大学 A kind of virtual husky picture method for drafting based on touch-screen gesture identification
CN110113430A (en) * 2019-05-21 2019-08-09 大连大学 A kind of communication means between mobile phone and raspberry pie based on cloud database

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108628455A (en) * 2018-05-14 2018-10-09 中北大学 A kind of virtual husky picture method for drafting based on touch-screen gesture identification
CN110113430A (en) * 2019-05-21 2019-08-09 大连大学 A kind of communication means between mobile phone and raspberry pie based on cloud database

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Anonymous *

Similar Documents

Publication Publication Date Title
Lekić et al. IoT sensor integration to Node-RED platform
US10095609B1 (en) Intermediary for testing content and applications
US9715370B2 (en) Method and system for providing content
TWI520061B (en) Mobile device application framework
CN103309660B (en) Mobile solution cross-platform development method
US8612947B2 (en) System and method for remotely compiling multi-platform native applications for mobile devices
US9652364B1 (en) Cloud service for mobile testing and debugging
TW201915850A (en) Method for generating application program, apparatus, system, device, and medium
CN109800173A (en) Program debugging method, device and storage medium
CN104298591A (en) WebApp remote debugging method and system
Hales HTML5 and JavaScript Web Apps
CN103747074B (en) mobile monitoring system based on Web server
CN110389755B (en) Code processing method and device, electronic equipment and computer readable storage medium
CN102323880A (en) Mobile phone application interface development method and terminal based on browser parsing mode
CN110263279B (en) Page generation method and device, electronic equipment and computer readable storage medium
CN109828921A (en) HTML5 webpage automated function test method, system and electronic equipment
CN112631590B (en) Component library generation method, device, electronic equipment and computer readable medium
CN104731869A (en) Page display method and device
CN111026439A (en) Application program compatibility method, device, equipment and computer storage medium
CN113127361A (en) Application program development method and device, electronic equipment and storage medium
CN112988588B (en) Client software debugging method and device, storage medium and electronic equipment
CN113778405A (en) Cross-platform APP construction method, device, system and medium
AU2019222873B2 (en) Method and system for providing content
CN109120473B (en) Interface request frame implementation method, interface test method and corresponding device
CN103139298A (en) Method for transmitting network data and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210207

Address after: 11 / F, Liuwu building, Liuwu New District, Lhasa City, Tibet Autonomous Region, 850000

Applicant after: Tibet ningsuan Technology Group Co.,Ltd.

Address before: Building C4, No.55 Liyuan South Road, moling street, Nanjing, Jiangsu Province

Applicant before: DILU TECHNOLOGY Co.,Ltd.