CN115002274B - Control method and device, electronic equipment and computer readable storage medium


Info

Publication number
CN115002274B
Authority
CN
China
Prior art keywords
service
target
call information
application
target service
Prior art date
Legal status
Active
Application number
CN202210495300.6A
Other languages
Chinese (zh)
Other versions
CN115002274A (en)
Inventor
李轩恺
王剑锋
郑爱华
董伟鑫
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210495300.6A
Publication of CN115002274A
Application granted
Publication of CN115002274B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72484 User interfaces specially adapted for cordless or mobile telephones wherein functions are triggered by incoming communication events
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Abstract

The present application discloses a control method, a control apparatus, an electronic device, and a non-volatile computer-readable storage medium. The method includes: determining the application scene in which the device is currently located, and determining a service set corresponding to the application scene; when a trigger event is detected in the application scene, determining, in the service set, a target service corresponding to the trigger event; determining initial call information of the target service, and determining target call information according to the initial call information; running the target service according to the target call information; when the target service is interrupted, determining whether a preset first interactive operation is received; and if so, resuming the interrupted service. In this way, the target service can run across platforms, the usage scenarios of the electronic device are expanded, and when the service is interrupted it can be quickly resumed through an interactive operation.

Description

Control method and device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of consumer electronics, and more particularly, to a control method, a control apparatus, an electronic device, and a non-volatile computer-readable storage medium.
Background
With the popularization of smartphones, almost everyone uses a smartphone for communication services. When a user is using an application and a request such as an incoming phone call or voice call is received, the current application is switched to run in the background. To prevent applications from being interrupted by call requests, smartphones generally provide a do-not-disturb mode, but in that mode incoming calls are no longer announced, so important calls are easily missed.
Disclosure of Invention
The embodiment of the application provides a control method, a control device, an electronic device and a nonvolatile computer readable storage medium.
The control method of the embodiments of the present application includes: determining the application scene in which the device is currently located, and determining a service set corresponding to the application scene; when a trigger event is detected in the application scene, determining, in the service set, a target service corresponding to the trigger event; determining initial call information of the target service and determining target call information according to the initial call information, where the target call information matches the electronic device that runs the target service; running the target service according to the target call information; when the target service is interrupted, determining whether a preset first interactive operation is received; and if so, resuming the interrupted service.
The control apparatus of the embodiments of the present application includes a scene perception module, a service governance module, and a service execution module. The scene perception module is configured to determine the application scene in which the device is currently located and to determine a service set corresponding to the application scene. The service governance module is configured to, when a trigger event is detected in the application scene, determine a target service corresponding to the trigger event in the service set, determine initial call information of the target service, and determine target call information according to the initial call information, where the target call information matches the electronic device that runs the target service. The service execution module is configured to run the target service according to the target call information, determine whether a preset first interactive operation is received when the target service is interrupted, and resume the interrupted service when the preset first interactive operation is received.
The electronic device of the embodiments of the present application includes a processor. The processor is configured to: determine the application scene in which the device is currently located and determine a service set corresponding to the application scene; when a trigger event is detected in the application scene, determine, in the service set, a target service corresponding to the trigger event; determine initial call information of the target service and determine target call information according to the initial call information, where the target call information matches the electronic device that runs the target service; run the target service according to the target call information; determine, when the target service is interrupted, whether a preset first interactive operation is received; and if so, resume the interrupted service.
The non-transitory computer-readable storage medium of the embodiments of the present application contains a computer program which, when executed by one or more processors, causes the processors to perform a control method of: determining the application scene in which the device is currently located, and determining a service set corresponding to the application scene; when a trigger event is detected in the application scene, determining, in the service set, a target service corresponding to the trigger event; determining initial call information of the target service and determining target call information according to the initial call information, where the target call information matches the electronic device that runs the target service; running the target service according to the target call information; determining, when the target service is interrupted, whether a preset first interactive operation is received; and if so, resuming the interrupted service.
In the control method, the control apparatus, the electronic device, and the non-volatile computer-readable storage medium of the embodiments of the present application, the service set corresponding to the application scene is determined by determining the application scene in which the device is currently located, and, when a trigger event is detected in the application scene, the target service responding to the trigger event is determined in the service set. The corresponding target service is thus determined quickly by identifying the application scene and detecting the trigger event, which improves the user experience. It can be understood that the types of services differ across platforms (such as the Android platform, a server platform, and the like); for example, a service of the Android platform cannot run normally on a server platform. Therefore, after the initial call information of the target service is converted into target call information matching the electronic device that runs the target service, the target service can run stably on that electronic device; the target service can run across platforms, and the usage scenarios of the electronic device are expanded.
In addition, after the target service is run, if the service is found to be interrupted, it is determined whether a preset first interactive operation is received, so that the interrupted target service is quickly resumed once the first interactive operation is received. Therefore, when the target service is interrupted, it can be quickly resumed through an interactive operation; without relying on a do-not-disturb mode, important calls are not missed, the target service keeps serving the user continuously, and the user experience is ensured.
Additional aspects and advantages of embodiments of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flow diagram of a control method of certain embodiments of the present application;
FIG. 2 is a schematic illustration of a control device according to certain embodiments of the present application;
FIG. 3 is a schematic plan view of an electronic device according to some embodiments of the present application;
FIG. 4 is a schematic diagram of the relationship of applications and services of certain embodiments of the present application;
FIG. 5 is a schematic illustration of a scenario of a control method of certain embodiments of the present application;
FIGS. 6 and 7 are flow diagrams of control methods of certain embodiments of the present application;
FIGS. 8 and 9 are schematic diagrams of scenes of control methods of certain embodiments of the present application;
FIG. 10 is a flow chart of a control method of certain embodiments of the present application;
FIG. 11 is a schematic illustration of a scenario of a control method of certain embodiments of the present application;
FIGS. 12 and 13 are flow diagrams of control methods of certain embodiments of the present application;
FIGS. 14 and 15 are schematic diagrams of scenes of control methods of certain embodiments of the present application;
FIG. 16 is a flow chart of a control method of certain embodiments of the present application;
FIG. 17 is a schematic diagram of the structure of an application script of some embodiments of the present application;
FIG. 18 is a schematic illustration of a control method of certain embodiments of the present application;
FIG. 19 is a schematic illustration of a scenario of a control method of certain embodiments of the present application;
FIG. 20 is a flow chart of a control method of certain embodiments of the present application;
FIGS. 21 and 22 are schematic illustrations of control methods of certain embodiments of the present application;
FIGS. 23-25 are flow diagrams of control methods of certain embodiments of the present application;
FIG. 26 is a schematic illustration of a scenario of a control method of certain embodiments of the present application;
FIG. 27 is a flow chart of a control method of certain embodiments of the present application;
FIG. 28 is a schematic diagram of a control method of certain embodiments of the present application;
FIG. 29 is a flow chart of a control method of certain embodiments of the present application;
FIG. 30 is a flow chart of a control method of certain embodiments of the present application;
FIG. 31 is a schematic illustration of a scenario of a control method of certain embodiments of the present application;
FIG. 32 is a schematic diagram of a service dispatch system of some embodiments of the present application;
FIG. 33 is a schematic diagram of a control method of certain embodiments of the present application;
fig. 34 is a schematic diagram of a connection state of a non-volatile computer readable storage medium and a processor according to some embodiments of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the embodiments of the present application and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1, a control method is provided in an embodiment of the present application. The control method comprises the following steps:
011: determining the current application scene and determining a service set corresponding to the application scene;
012: under the condition that a trigger event is detected in an application scene, determining a target service corresponding to the trigger event in a service set;
013: determining initial call information of the target service, determining target call information according to the initial call information, wherein the target call information is matched with the electronic equipment 100 running the target service;
014: operating a target service according to the target call information;
015: judging whether a preset first interactive operation is received or not when the target service is interrupted;
016: if yes, the interrupted target service is restored.
Referring to fig. 2, a control device 10 is provided in an embodiment of the present application. The control device 10 includes a scene perception module 11, a service governance module 12, and a service execution module 13. The control method of the embodiments of the present application is applicable to the control device 10. The scene perception module 11 is configured to perform step 011; the service governance module 12 is configured to perform steps 012 and 013; and the service execution module 13 is configured to perform steps 014, 015, and 016.
Referring to fig. 3, the embodiment of the application further provides an electronic device 100. The electronic device 100 includes a processor 20. The control method of the embodiment of the present application may be applied to the electronic apparatus 100. The processor 20 is configured to perform steps 011 to 016.
The electronic device 100 includes a housing 30. The electronic device 100 may be a cell phone, tablet computer, display device, notebook computer, teller machine, gate, smart watch, head display device, gaming machine, etc. As shown in fig. 3, the embodiment of the present application is described taking the electronic device 100 as an example of a mobile phone, and it is understood that the specific form of the electronic device 100 is not limited to the mobile phone. The housing 30 may also be used to mount functional modules of the electronic device 100, such as a display device, an imaging device, a power supply device, and a communication device, so that the housing 30 provides protection for the functional modules from dust, falling, water, and the like.
Specifically, the control apparatus 10 may be provided in the electronic device 100, or the control apparatus 10 may be provided in the server 200 of the cloud, or the control apparatus 10 may be provided in both the electronic device 100 and the server 200.
For example, the processor 20 includes a first processor 21 in the electronic device 100 and a second processor 22 in the server 200. The first processor 21 or the second processor 22 may perform steps 011 to 014. Alternatively, the first processor 21 may perform step 011, the second processor 22 may perform steps 012 and 013, the first processor 21 may perform step 014, and so on.
The scene perception module 11 may determine the application scene in which it is currently located.
The application scene may be the usage scenario in which the electronic device 100 is currently located: interaction information is acquired, and the usage scenario is then determined from the interaction information. For example, if the image recognized by the camera of the current electronic device 100, or of another electronic device 100 communicatively connected to the current electronic device 100 (for example, the mobile phone is the current electronic device 100 and the vehicle is the other electronic device 100), or the position information detected by the Global Positioning System (GPS), indicates that the usage scenario is a shopping mall, the application scene is determined to be a mall scene; if the determined usage scenario is a subway, the application scene is determined to be a subway scene. Alternatively, the electronic device 100 may receive user input to determine the usage scenario: when a gesture input, touch input, voice input, or the like satisfies a preset condition, the corresponding usage scenario is determined; for example, if the gesture is a two-finger tap, the usage scenario is determined to be a cross-end screen-capture scenario.
For example, the scene perception module 11 may determine the current application scene according to the position information of the electronic device 100 (such as the GPS information of the mobile phone or of the vehicle); for example, when the position information indicates that the electronic device 100 is in an underground garage, the application scene may be determined to be a garage scene. It can be understood that different application scenes correspond to different service sets: a garage scene generally involves services such as "parking-space navigation" and "vehicle management", while a subway scene generally involves services such as "ride code" and "arrival prompt"; the service set is therefore determined according to the application scene.
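As a rough illustration of this mapping, the following minimal Kotlin sketch resolves a perceived location to an application scene and then looks up a preset service set for that scene. All names, tags, and the lookup table are assumptions for illustration only; the patent does not prescribe any concrete API.

```kotlin
// Hypothetical sketch: perceived context -> application scene -> preset service set.
enum class AppScene { GARAGE, SUBWAY, MALL, UNKNOWN }

// Preset scene-to-service-set table (it could also be learned from the user's
// usage frequency and duration, as described in the text).
val sceneServiceSets: Map<AppScene, Set<String>> = mapOf(
    AppScene.GARAGE to setOf("parking-space navigation", "vehicle management"),
    AppScene.SUBWAY to setOf("ride code", "arrival prompt")
)

// Very coarse location-based scene inference; the location tags are illustrative.
fun inferScene(locationTag: String): AppScene = when (locationTag) {
    "underground_garage" -> AppScene.GARAGE
    "subway_station" -> AppScene.SUBWAY
    "shopping_mall" -> AppScene.MALL
    else -> AppScene.UNKNOWN
}

fun serviceSetFor(scene: AppScene): Set<String> = sceneServiceSets[scene] ?: emptySet()

fun main() {
    val scene = inferScene("underground_garage")
    println("scene=$scene services=${serviceSetFor(scene)}")   // GARAGE and its two services
}
```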
A service is a technical concept: when a functional entity exposes an interface that can be called, with defined input parameters, an executed function, and returned results, that callable interface is a service.
The entity that provides the interface is called the service provider. One service provider may expose multiple services. The type of entity acting as a service provider is not limited.
The services may be Android UI services, awareness services, cloud services (e.g., cloud Software as a Service (SaaS)), voice services, system services (e.g., Android applications), third-party services (e.g., Google browser plug-ins), and the like.
For example, a plug-in of WPS OFFICE is exposed to the electronic device 100 through a service gateway; the electronic device 100 invokes the plug-in through the WPS service gateway, thereby implementing a corresponding service such as a picture-insertion service or a text-insertion service.
A service is the minimum unit in the running of an application and is used to accomplish a specific task. For example, the parking-space navigation service implements a navigation function, the vehicle management service implements vehicle locking/unlocking, the ride-code service implements a pop-up ride-code function, the arrival-prompt service implements the function of prompting the user that the station has been reached, the screen-capture service implements a device screen-capture function, and the screen-casting service implements display projection between devices, and so on. For example, a map application includes a navigation service, an arrival-reminder service, and the like.
A service called while an application is running may be a service on the current device or a service on another device. For example, during running, the application may obtain screenshots of other devices by calling the screen-capture service on those devices, and display those screenshots by calling the picture-display service.
Referring to fig. 4, an application may be considered as a collection of services (e.g., application 1 through application 3 in fig. 4) for implementing specific business logic through calls between services. Unlike a conventional application that can only run on a specific operating system, the service of the application (such as application 3 in fig. 4) of the present application may be a service on a different operating system, that is, the application supports heterogeneous operating systems, and cross-platform running may be achieved.
Unlike a traditional application, whose installation process requires all the services used at runtime to be installed locally in advance, the services in this application support dynamic deployment: the services corresponding to a piece of business logic are dynamically deployed to the current device based on the business logic that the device executes at runtime. Accordingly, when the functions of the same application are implemented by a plurality of devices, the services deployed on different devices may differ because their business logic may differ; that is, the services of the application have the characteristic of differentiated deployment.
It can be understood that the service set corresponding to the application scenario may be preset, or may be determined based on the number of times and the duration of use of the service that the user uses in different application scenarios. For example, when the user uses the ride code service every time he enters and exits the subway scene, the ride code service may be used as one of the service sets corresponding to the subway scene.
After determining the application scene, a trigger event can be detected in the current application scene, so that a target service corresponding to the trigger event in the service set is determined.
The trigger event has an association relation with the services in the service set, and the target service corresponding to the trigger event can be rapidly determined according to the trigger event.
The triggering event can be an event generated by user interaction, such as a user clicking a display screen, user voice or key input, data collected by a sensor meeting preset conditions, and the like. The data collected by the sensor meets the preset condition may be that the brightness of the ambient light collected by the ambient light sensor reaches the preset brightness, or that the current position is located at the preset position detected by the GPS, or the like.
For example, after it is determined, from the position information detected by GPS, that the user has entered a subway scene, it can be inferred that the user wants to take the subway, so it is determined that a trigger event of the "ride code" service is detected; at this time the "ride code" service is determined to be the target service from the service set of the subway scene. When it is determined, from the position information of the electronic device 100, the voice information picked up by the microphone, and the like, that the user has boarded the subway, it can be inferred that the user may need an arrival reminder, so it is determined that a trigger event of the "arrival prompt" service is detected; at this time the "arrival prompt" service is determined to be the target service from the service set of the subway scene.
After the target service is determined, if it is already installed in the electronic device 100, it can be run directly. When the target service is not installed, or in order to save memory of the electronic device 100, the service may be stored in the cloud server 200, and the electronic device 100 obtains the target service from the cloud server 200 when it is needed.
Services of different systems or platforms have different upper-layer abstractions of processes and different inter-process calling modes, so each service needs to be preset with initial call information when it is registered; the initial call information may be the information written when the service is registered. When the system or platform for which the target service was built differs from that of the current electronic device 100, the electronic device 100 cannot run the target service directly according to the initial call information; instead, the initial call information of the target service must be converted into target call information matching the device, so that the target service can run stably on the current electronic device 100.
In one example, the current device is an android system, and the target service is a windows service, which cannot be directly used by the current device. After determining the initial call information of the target service, the initial call information needs to be converted. If the initial call information can include an android parameter a, an android parameter B and an android parameter C, the target service cannot directly process the android parameter a, the android parameter B and the android parameter C.
Therefore, the current device (or the server 200) may convert the initial call information into the target call information, for example, convert the android parameter a, the android parameter B, and the android parameter C into the windows parameter A, windows parameter B and the windows parameter C, and then input the windows parameter A, windows parameter B and the windows parameter C into the target service, so as to implement the call of the target service.
It can be understood that the android parameters and windows parameters may be only different in format and the content actually included is the same, for example, the target service is a navigation service, and when the destination is input for navigation, the android parameters and windows parameters both include the actual "destination", so that it is ensured that after the server 200 converts the initial call information into the target call information, the target service can be correctly called, and the situation that the target service may be wrongly called after the conversion is performed is avoided.
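A minimal sketch of this parameter conversion is shown below, assuming a simple key-value representation of the call information and a purely format-level mapping; the data structure, field names, and platform tags are all hypothetical, since the patent does not define concrete formats.

```kotlin
// Hypothetical sketch: convert initial call information (Android-style format)
// into target call information matching the platform of the device that runs
// the target service. Only the format changes; the content (e.g. the actual
// "destination") is preserved.
data class CallInfo(val platform: String, val params: Map<String, String>)

fun convertCallInfo(initial: CallInfo, targetPlatform: String): CallInfo {
    if (initial.platform == targetPlatform) return initial   // nothing to convert
    val converted = initial.params.mapKeys { (key, _) ->
        // Adapt only the key format; values such as the destination stay intact.
        key.removePrefix("${initial.platform}.").let { "$targetPlatform.$it" }
    }
    return CallInfo(targetPlatform, converted)
}

fun main() {
    val initial = CallInfo(
        platform = "android",
        params = mapOf("android.destination" to "Company P", "android.mode" to "drive")
    )
    val target = convertCallInfo(initial, "windows")
    println(target)   // windows.destination=Company P, windows.mode=drive
    // The converted target call information would then be passed to the target service.
}
```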
Referring to fig. 5, while the target service is running (for example, user M is using the navigation service while driving to company P), a call, an alarm clock, or the like occupies the current display area 40, and the navigation service is interrupted (for example, it enters background operation). Although the navigation service can continue voice navigation, that affects user M's call, and user M cannot see the route in real time through the display area 40 and may easily take a wrong turn, causing a delay. Alternatively, the navigation service may be interrupted outright; in that case voice navigation no longer disturbs the call, but for a user M who is unfamiliar with the route, losing the navigation service may make it difficult to drive to the company accurately and waste a lot of time.
The processor 20 may obtain a flag bit of the target service, where the flag bit indicates whether the target service is running in the foreground or in the background: a flag bit of 0 indicates background operation, and a flag bit of 1 indicates foreground operation. When the flag bit is 0, the target service is determined to be interrupted. Alternatively, even when the flag bit is 1, if the target service has paused its corresponding operation, it may still be determined to be interrupted. For example, user M wants to drive to the company and the navigation service is paused because of an incoming call: voice navigation is suspended, the user drives only according to the displayed route during the call, and the navigation service is not actually providing navigation.
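A compact sketch of this interruption check follows, assuming a hypothetical service-state structure (a foreground flag bit plus a paused flag); the patent describes only the logic, not an API.

```kotlin
// Hypothetical service state: flag bit 1 = foreground, 0 = background,
// plus whether the service has paused its own operation.
data class ServiceState(val foregroundFlag: Int, val paused: Boolean)

// A service counts as interrupted when it was pushed to the background,
// or when it is nominally in the foreground but has paused its operation
// (e.g. voice navigation suspended during a call).
fun isInterrupted(state: ServiceState): Boolean =
    state.foregroundFlag == 0 || state.paused

fun main() {
    println(isInterrupted(ServiceState(foregroundFlag = 0, paused = false))) // true
    println(isInterrupted(ServiceState(foregroundFlag = 1, paused = true)))  // true
    println(isInterrupted(ServiceState(foregroundFlag = 1, paused = false))) // false
}
```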
Therefore, after user M finishes the call, the user naturally wants to resume the interrupted target service. Hence, after determining that the target service is interrupted, the processor 20 determines in real time whether the electronic device 100 receives a preset first interactive operation. For example, the preset first interactive operation may include touching a first preset area of the electronic device 100 (such as the back of the electronic device 100 or a preset area of its display screen), so that the first interactive operation is completed quickly by tapping the first preset area; or shaking the electronic device 100; or drawing a preset gesture (such as a circle) in a second preset area of the electronic device 100 (such as the back of the electronic device 100), which prevents false touches on the back of the electronic device 100 and allows the first interactive operation to be completed accurately; or a voice input, such as the user saying "back to navigation service" to the electronic device 100, to complete the first interactive operation quickly; or the gaze point of user M detected by the electronic device 100 moving along a preset trajectory, for example, confirming the first interactive operation when the gaze point sweeps from top to bottom.
The processor 20 may determine whether the first preset area of the electronic device 100 is touched; and/or determine whether the difference in the posture information of the electronic device 100 is greater than a preset threshold; and/or determine whether the second preset area of the electronic device 100 receives the preset gesture; and/or determine whether the electronic device 100 receives a preset voice input; and/or determine whether the movement trajectory of the gaze point detected by the electronic device 100 corresponds to the preset trajectory. The first interactive operation is therefore simple, and its different forms can meet the operating preferences of different users.
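The sketch below shows one way these alternative detections could be dispatched through a single check; the event types, area names, phrases, and threshold are assumptions for illustration and are not specified by the patent.

```kotlin
// Hypothetical interaction events that could count as the preset "first
// interactive operation". Names and thresholds are illustrative only.
sealed class InteractionEvent {
    data class Touch(val area: String) : InteractionEvent()
    data class Shake(val postureDelta: Float) : InteractionEvent()
    data class Gesture(val area: String, val shape: String) : InteractionEvent()
    data class Voice(val utterance: String) : InteractionEvent()
    data class Gaze(val trajectory: String) : InteractionEvent()
}

const val SHAKE_THRESHOLD = 2.5f   // assumed posture-difference threshold

fun isFirstInteraction(e: InteractionEvent): Boolean = when (e) {
    is InteractionEvent.Touch   -> e.area == "first_preset_area"       // e.g. tap on the back
    is InteractionEvent.Shake   -> e.postureDelta > SHAKE_THRESHOLD    // shake the device
    is InteractionEvent.Gesture -> e.area == "second_preset_area" && e.shape == "circle"
    is InteractionEvent.Voice   -> e.utterance.contains("back to navigation")
    is InteractionEvent.Gaze    -> e.trajectory == "top_to_bottom"     // gaze sweep
}

fun main() {
    println(isFirstInteraction(InteractionEvent.Shake(3.0f)))                                // true
    println(isFirstInteraction(InteractionEvent.Gesture("second_preset_area", "circle")))    // true
}
```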
After the processor 20 detects that the user has completed the first interaction, it may determine that the user wants to resume the interrupted target service, thereby quickly resuming the interrupted target service, e.g., running the interrupted target service (e.g., navigation service) in the foreground again, to reduce the impact of the interruption of the target service.
If the first interactive operation is not detected, it indicates that although the target service is running in the background, its corresponding operation has already been completed (for example, the navigation service had already guided the user to the company before the user answered the call), and the user no longer needs to resume the navigation service.
With the control method, the control device 10, and the electronic device 100 described above, the service set corresponding to the application scene is determined by determining the application scene in which the electronic device is currently located, and, when a trigger event is detected in the application scene, the target service responding to the trigger event is determined in the service set. The corresponding target service is thus determined quickly by identifying the application scene and detecting the trigger event, which improves the user experience. It can be understood that the types of services differ across platforms (such as the Android platform, a server platform, and the like); for example, a service of the Android platform cannot run normally on a server platform. Therefore, after the initial call information of the target service is converted into target call information matching the electronic device that runs the target service, the target service can run stably on that electronic device; the target service can run across platforms, and the usage scenarios of the electronic device are expanded.
In addition, after the target service is run, if the service is found to be interrupted, it is determined whether a preset first interactive operation is received, so that the interrupted target service is quickly resumed once the first interactive operation is received. Therefore, when the target service is interrupted, it can be quickly resumed through an interactive operation; without relying on a do-not-disturb mode, important calls are not missed, the target service keeps serving the user continuously, and the user experience is ensured.
Referring to fig. 2, 3 and 6, in some embodiments, before resuming the interrupted target service, the control method further includes:
017: judging whether the operation corresponding to the target service is completed or not;
018: in case the operation is not completed, a step of resuming the interrupted target service is entered.
In certain embodiments, the service execution module 13 is further configured to perform step 017 and step 018.
In certain embodiments, processor 20 is configured to perform step 017 and step 018.
Specifically, it can be understood that the reason a user wants to resume a target service is generally that the target service has not yet completed its corresponding operation. For example, the user places an order through a take-out (food delivery) service; until the meal arrives, the take-out service keeps running, providing the rider's position in real time, an estimated arrival-time prompt, and similar services. After the user switches to another application such as a game, the take-out service is interrupted and runs in the background, and the user may occasionally want to check the delivery status (for example, while a game character is waiting to respawn); at this time, the user can quickly resume the interrupted take-out service through the first interactive operation.
To prevent the interrupted target service from being resumed when the user does not actually want it, for example because the first interactive operation is triggered by mistake (the take-out has already been delivered and the take-out service has completed its operation, but a mis-operation such as shaking the electronic device triggers the first interactive operation, so the take-out service occupies the current main interface again and harms the user experience), the processor 20 may, before resuming the interrupted target service, determine whether the target service has completed its corresponding operation (for example, the take-out has been delivered, so the rider's position and the estimated arrival time are no longer provided). If the operation corresponding to the take-out service is not completed, it is determined that the user wants to check the delivery status, so the take-out service is resumed, achieving a fast switch to view the delivery status. If the corresponding operation of the take-out service has been completed, the trigger is likely a mis-operation, and the interrupted take-out service need not be resumed. This ensures that the target service is resumed accurately.
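A minimal sketch of steps 017/018 under the same assumptions (hypothetical names, no real API) is given below: the interrupted service is resumed only if its corresponding operation is not yet complete.

```kotlin
// Hypothetical sketch of steps 017/018: resume the interrupted target service
// only if its corresponding operation has not been completed yet.
data class TargetService(val name: String, val operationCompleted: Boolean)

fun onFirstInteraction(service: TargetService, resume: (TargetService) -> Unit) {
    if (!service.operationCompleted) {
        resume(service)   // e.g. bring the take-out service back to the foreground
    } else {
        // Likely a false trigger (operation already finished); do not resume.
    }
}

fun main() {
    onFirstInteraction(TargetService("take-out", operationCompleted = false)) {
        println("resuming ${it.name}")
    }
}
```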
Referring to fig. 2, 3 and 7, in some embodiments, before resuming the interrupted target service, the control method further includes:
019: sending out confirmation information, where the confirmation information asks whether to resume the interrupted target service; and
020: and determining whether to resume the interrupted target service according to the second interaction operation.
In some embodiments, the service execution module 13 is further configured to perform step 019 and step 020.
In certain embodiments, processor 20 is configured to perform steps 019 and 020.
Specifically, it can be understood that even though the operation corresponding to the target service has been completed, sometimes the user still wants to reopen the interrupted target service. For example, after the navigation service has finished navigating, a friend sends the user location information, and the user wants to see the distance and route between them; at this moment the user wants to quickly reopen the navigation service that has just finished. Therefore, to give the initiative to the user while still allowing fast recovery of the target service, referring to fig. 8, when the user triggers the first interactive operation, confirmation information is issued (for example, "Resume the navigation service?") to prompt the user to resume the interrupted target service through a second interactive operation (for example, a selection operation). Alternatively, referring to fig. 9, the second interactive operation may be the same as the first interactive operation; for example, after the user performs the first interactive operation (such as a shake), the same operation is performed again in response to the confirmation information, thereby resuming the interrupted target service. In this way, the confirmation information prompts the user to resume the interrupted target service in a lightweight manner, and the initiative to start the target service is given to the user, so that a wider range of scenarios can be covered.
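Steps 019/020 can be sketched as below, again with hypothetical names and a simulated user response rather than any real UI API.

```kotlin
// Hypothetical sketch of steps 019/020: ask the user before resuming an
// interrupted service whose operation has already completed.
fun confirmAndResume(
    serviceName: String,
    askUser: (prompt: String) -> Boolean,   // true if the second interaction confirms
    resume: (String) -> Unit
) {
    if (askUser("Resume $serviceName?")) resume(serviceName)
}

fun main() {
    confirmAndResume(
        serviceName = "navigation service",
        askUser = { prompt -> println(prompt); true },   // simulate the user confirming
        resume = { println("resumed $it") }
    )
}
```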
Referring to fig. 2, 3 and 10, in some embodiments, step 016 comprises:
0161: in the first electronic device 110 and/or the second electronic device 120 connected to the first electronic device 110, the interrupted target service is resumed.
In certain embodiments, service execution module 13 is further configured to perform step 0161.
In certain embodiments, processor 20 is configured to perform step 0161.
Specifically, the electronic device 100 of the present application may include a first electronic device 110 and a second electronic device 120, where the first electronic device 110 and the second electronic device 120 are communicatively connected, for example, the first electronic device 110 is a mobile phone, and the second electronic device 120 is a watch.
In one example, referring to fig. 11, when the user runs the target service on the first electronic device 110, the current target service (such as the navigation service) is interrupted by an incoming call. The user then resumes the interrupted target service by performing the first interactive operation on the first electronic device 110 or on the second electronic device 120. When the user performs the first interactive operation on the first electronic device 110, the interrupted target service is resumed on the first electronic device 110 (as shown in fig. 11, the navigation service is resumed in the display area 51 of the first electronic device 110). When the user performs the first interactive operation on the second electronic device 120, the interrupted target service is resumed on the second electronic device 120 (as shown in fig. 11, the navigation service is resumed in the display area 52 of the second electronic device 120, while the display area 51 of the first electronic device 110 continues to display the call interface). It can be appreciated that the user can select an idle electronic device 100 whose current display interface is not occupied by the call or the like to resume the target service, so that the interrupted target service can be quickly resumed on the idle electronic device 100 without affecting the call; the target service can thus flow between different electronic devices 100, providing the user with a continuous target service. Of course, since the mobile phone has a larger display screen than the watch, the user may choose not to resume the interrupted target service on the idle second electronic device 120 and instead resume it on the first electronic device 110 for a better service experience. Specifically, a selection box may pop up before the interrupted target service is resumed, so that the user can choose to resume it on the first electronic device 110 and/or the second electronic device 120, thereby satisfying the user's needs to the greatest extent and improving the user experience.
In another example, when the target service is executed in the second electronic device 120, if the call request is received so that the target service is interrupted, the interrupted target service may be restored in the second electronic device 120 and/or the idle first electronic device 110 when the first interactive operation is received.
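One possible device-selection policy for step 0161 is sketched below; the device model and the "busy display" criterion are assumptions made for illustration, since the patent also allows the user to choose explicitly via a selection box.

```kotlin
// Hypothetical sketch of step 0161: resume the interrupted target service on
// the device that received the first interactive operation, or fall back to an
// idle connected device, so the service can flow between devices.
data class Device(val name: String, val displayBusy: Boolean)

fun chooseResumeDevice(interactedOn: Device, peer: Device): Device = when {
    !interactedOn.displayBusy -> interactedOn   // e.g. the watch that was tapped
    !peer.displayBusy         -> peer           // otherwise the idle peer device
    else                      -> interactedOn   // both busy: a selection box could let the user decide
}

fun main() {
    val phone = Device("phone", displayBusy = true)    // showing the call interface
    val watch = Device("watch", displayBusy = false)
    println(chooseResumeDevice(interactedOn = watch, peer = phone).name)   // watch
}
```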
Referring to fig. 2, 3, and 12, in some embodiments, step 011 includes:
0111: and acquiring the interaction information, and determining the current application scene according to the interaction information.
In some embodiments, the context awareness module 11 is further configured to perform step 0111.
In certain embodiments, the processor 20 is configured to perform step 0111.
Specifically, the interaction information may be any information that the electronic device 100 can acquire and that is generated during any interaction process, and it includes at least one of: input information, sensor information, status information, position information of the electronic device 100, and running information of applications of the electronic device 100.
The input information may include voice interaction information, text interaction information, touch interaction information, and the like, which is not limited here. The sensor information may be an image collected by the camera of the electronic device 100, posture information collected by a posture sensor, ambient-light brightness information collected by an ambient-light sensor, sound information collected by a microphone, or the like. The status information may be the current state of the electronic device 100, such as whether the electronic device 100 is communicating with other devices, and the time and battery level of the electronic device 100. The position information represents the location of the electronic device 100, such as home, the company, a shopping mall, or a specific position in a room. The running information of an application includes whether the current application has completed its task (for example, whether the navigation application has completed navigation, or whether the shopping application has settled the current shopping cart), and so on.
In this embodiment, the application scenarios may include basic application scenarios and advanced application scenarios. Basic application scenarios, such as environment, time, activity, vehicle, location, and nearby devices, can be perceived through the interaction information of the electronic device 100, and advanced application scenarios can be inferred from the basic ones: for example, an ordering scenario can be obtained by reasoning over the basic time/location scenario; a shopping scenario can be obtained by reasoning over the basic activity scenario; a working scenario can be obtained by reasoning over the basic vehicle scenario; a scenic-spot scenario can be obtained by reasoning over the basic location scenario; and a screen-casting scenario can be obtained by reasoning over nearby devices in the basic scenario, which is not limited here.
In some embodiments, the application scenario is determined by at least one of location awareness, vehicle awareness, activity state awareness, device gesture awareness, nearby device awareness, environmental state awareness, time awareness, without limitation.
As one implementation, the application scenario is determined based on the perceived location of the electronic device 100. Location awareness may locate the electronic device 100 through Global Positioning System (GPS) technology, through Beidou positioning technology, and the like, which is not limited here. For example, when the position information indicates that the device is currently in a subway station, the application scenario is determined to be a subway scenario; when the position information indicates that the device is currently in a garage, the application scenario is determined to be a garage scenario.
As one implementation, the application scenario is determined based on the perceived vehicle type. Vehicle awareness may calculate the speed of change of the electronic device 100 in the horizontal and vertical directions through its acceleration sensor and, combined with machine learning and similar techniques, determine whether the electronic device 100 is in a driving state. When the electronic device 100 is determined to be in the driving state, the vehicle type, such as a car, bus, train, or plane, is distinguished through sound recognition, since the ambient noise corresponding to different vehicles differs.
As an implementation, the application scenario is determined based on the perceived activity state. The activity state sensing may calculate the change speed of the electronic device 100 in the horizontal direction and the vertical direction through the acceleration sensor of the electronic device 100, and determine whether the user corresponding to the electronic device 100 is in a static state, a walking state, a running state, and the like in combination with technologies such as machine learning, which is not limited herein. For example, when the user is in a stationary state, the application scene may be determined to be a sleep scene; when the user is in the running state, the application scene can be determined to be the running scene.
As an implementation, the application scenario is determined based on the perceived device state. The device state awareness may obtain the state in which the electronic device 100 is located through the operating system of the electronic device 100. For example, whether an audio playback device is connected to the electronic device 100, whether a wireless module of the electronic device 100 is connected, whether a screen of the electronic device 100 is lit, and the like are not limited herein. For example, when the electronic device 100 is connected with an audio playback device, it is determined that the application scene is an audio playback scene.
As one implementation, an application scenario is determined based on perceived device gestures. The device gesture sensing may evaluate whether the electronic device 100 is right-side-up or downward, is resting on a desktop or is placed in a pocket, backpack, etc., by using sensors such as an acceleration sensor, a gyroscope, a magnetometer, etc. of the electronic device 100, in combination with techniques such as machine learning, etc., without limitation. For example, when the electronic device 100 is facing down, the application scene is determined to be a mute scene.
As one implementation, the application scenario is determined from perceived nearby devices. Nearby-device awareness may identify nearby devices through Bluetooth, Wi-Fi, and other broadcasts sent by those devices. The nearby devices may include, for example, smartphones, smart televisions, smart watches, smart headsets, and smart automobiles, which is not limited here. For example, when the nearby device is a smart television, the application scenario is determined to be a screen-casting scenario.
As an implementation, the application scenario is determined according to the perceived environmental state. Environmental state sensing may identify the environmental state in which the electronic device 100 is currently located through sensors of barometers, thermometers, ambient light, etc. of the electronic device 100. For example, when the environmental state is night, it is determined that the application scene is a night scene shooting scene.
As one implementation, the application scenario is determined according to the perceived time. Time awareness may obtain the current system time of the electronic device 100 and calculate the date, day of the week, and other information to determine whether it is a weekday or weekend, and whether it is morning, afternoon, night, or late night; it may also determine whether the date is a holiday in the current country or region. For example, when the time is a weekday and within working hours, the application scenario is determined to be a working scenario.
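The sketch below illustrates how several of these perceived basic contexts could be fused into one application scenario. The context fields and the inference rules are assumptions chosen to mirror the examples above; the patent does not fix any particular rule set.

```kotlin
// Hypothetical sketch: fusing basic perceived contexts into an application scenario.
data class BasicContext(
    val location: String? = null,      // e.g. "subway_station", "company"
    val vehicle: String? = null,       // e.g. "bus", "train"
    val activity: String? = null,      // e.g. "stationary", "running"
    val nearbyDevice: String? = null,  // e.g. "smart_tv"
    val isWorkingHours: Boolean = false
)

fun inferScenario(c: BasicContext): String = when {
    c.nearbyDevice == "smart_tv"                            -> "screen-casting scenario"
    c.location == "subway_station" || c.vehicle == "train"  -> "subway scenario"
    c.location == "company" && c.isWorkingHours             -> "working scenario"
    c.activity == "running"                                 -> "running scenario"
    else                                                    -> "default scenario"
}

fun main() {
    println(inferScenario(BasicContext(nearbyDevice = "smart_tv")))                        // screen-casting
    println(inferScenario(BasicContext(location = "company", isWorkingHours = true)))      // working
}
```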
Referring to fig. 2, 3, and 13, in some embodiments, step 011 further includes:
0112: determining an application script corresponding to an application scene, wherein the application script comprises at least one service identifier;
0113: and taking a set formed by the services corresponding to all the service identifiers in the application script as a service set.
In some embodiments, the scene perception module 11 is further configured to perform step 0112 and step 0113.
In some embodiments, processor 20 is configured to perform step 0112 and step 0113.
Specifically, the application script may include at least one service identifier; through each service identifier, the corresponding service can be found in the application market of the server 200, and each service identifier has a corresponding service. The application script may further include control logic used to control the services in the service set corresponding to the application scenario, that is, a section of general business logic described in a scripting language, where the scripting language may be XML or JavaScript. The calling and management relationships between services are realized through the application script, which describes that section of general business logic.
When determining the application script corresponding to the application scene, the service list corresponding to the application scene can be determined first, the service list comprises at least one service identifier corresponding to the service, and the application script corresponding to the application scene can be generated according to the at least one service identifier included in the service list.
When the application script is generated according to the service list, an initial application script corresponding to the application scenario is first generated from the service identifiers in the service list; the initial application script is then adjusted according to editing data for it, yielding an edited application script, which is the application script corresponding to the application scenario.
The current editing data can be determined according to input information of a user, interaction information collected under an application scene, user portraits, historical editing data of the user on the application script and the like. That is, as users continuously use the electronic device, the service set corresponding to each application scenario may be continuously updated according to the editing data, so that each user realizes a personalized service set for different application scenarios, so as to improve user experience.
In one embodiment, the edit data may also be determined based on user subscription operations to the service.
The subscription initiator device can obtain service lists of all services by accessing the application store server, then receive subscription operation on the services in the service lists, and generate application scripts corresponding to the applications according to service identifiers corresponding to the services subscribed by the user. It will be appreciated that one service may be used solely as one application in an application store, and that a plurality of services subscribed to by the user can also be used as one application in an application store.
Illustratively, as shown in fig. 14, a user may access the application store and search, through a search box 72 in an application store interface 71, for all services corresponding to a particular application scenario (e.g., scenario A), forming a service list in which each service corresponds to a subscription button 73. After a service receives the user's click on its subscription button 73, the subscription button 73 changes into an unsubscribe button 74, indicating that the service is subscribed; the subscription to the corresponding service can be canceled by clicking the unsubscribe button 74. The application script corresponding to the application scene is then generated according to the service identifiers of the one or more subscribed services.
Referring to fig. 15, when the device detects that the application subscription of the user changes, an application script of the application to which the user is newly subscribed is downloaded from the application store to the device.
If the cloud (e.g., the server 200) detects that the application subscriptions of device A and device B have changed, the service identifiers of the newly subscribed applications are sent to device A and device B respectively through change notifications, and device A and device B then download the corresponding application scripts from the application store according to the respective service identifiers so as to complete deployment of the applications.
After the application script is determined, the set of services corresponding to the at least one service identifier included in the application script can be used as the service set. For example, the services corresponding to the service identifiers under the control logic of a single trigger may be used as one service set, or the services corresponding to the service identifiers under the control logic of all triggers in the application script may be used as one service set.
Referring to fig. 2, 3 and 16, in some embodiments, step 012 includes:
0121: under the condition that a trigger event is detected in an application scene, triggering a target trigger corresponding to the trigger event in the application script;
0122: and calling a corresponding target control logic in the application script through the target trigger, and determining a target service corresponding to the trigger event in the service set according to the target control logic.
In certain embodiments, the service governance module 12 is configured to perform steps 0121 and 0122.
In certain embodiments, the processor 20 is configured to perform steps 0121 and 0122.
Specifically, referring to fig. 17, an application script may be composed of several triggers (e.g., trigger A, trigger B, trigger C, and trigger D in fig. 17), each of which is composed of a trigger event, control logic, and several services controlled by the control logic. The services in the application script are not the services themselves, but rather the service identifiers of the services (e.g., service 1, service 2, and service 3 in fig. 17).
The trigger event includes an event generated by interaction (such as reaching a specific position, reaching a specific time, a user interaction input, etc.), and the control logic is the business logic corresponding to the trigger, generally formed by combining a plurality of services according to specified logic. When a certain trigger event occurs, the corresponding trigger is triggered, so that the corresponding control logic is executed and the corresponding services are called according to the control logic.
Referring to fig. 18, the plurality of triggers in an application script are related in terms of business logic and cooperate to complete a scene application. Such a scene application needs to respond to a number of different events (i.e., trigger events), each event corresponding to a behavior (e.g., an action in fig. 18). By creating different triggers and combining them together, a complete application is formed.
For example, consider a scene application that meets the user's need for "music that follows the user": when the user is in the living room, the audio stream is transferred to the living-room speaker for playback; when the user is in the bedroom, the audio stream is transferred to the bedroom speaker for playback; and when the user goes out, the audio stream is transferred to the mobile phone and a Bluetooth headset for playback. Each of these conditions constitutes a trigger event with a corresponding action.
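To make the structure of such a script concrete, the following is a minimal sketch, assuming a JSON-like script representation expressed in TypeScript; the field names (triggerEvent, controlLogic, services) and the service identifiers below are illustrative assumptions and not the format mandated by this embodiment.

// Illustrative shape of an application script for the "music follows the user" scene.
interface Trigger {
  triggerEvent: string;        // e.g. a perceived location change
  controlLogic: string;        // business logic described in a scripting language
  services: string[];          // service identifiers controlled by the control logic
}

interface ApplicationScript {
  scene: string;
  triggers: Trigger[];
}

const musicFollowsUser: ApplicationScript = {
  scene: "music-follows-user",
  triggers: [
    { triggerEvent: "user-in-living-room", controlLogic: "route audio to living-room speaker", services: ["audio-route-service", "living-room-speaker-service"] },
    { triggerEvent: "user-in-bedroom",     controlLogic: "route audio to bedroom speaker",     services: ["audio-route-service", "bedroom-speaker-service"] },
    { triggerEvent: "user-goes-out",       controlLogic: "route audio to phone and earphones", services: ["audio-route-service", "bluetooth-earphone-service"] },
  ],
};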
Under the application scene, under the condition that a trigger event is detected, a target trigger corresponding to the trigger event in the application script can be triggered; and calling target control logic corresponding to the target trigger through the target trigger, so that the target service corresponding to the trigger event in the service set is determined through the target control logic.
For example, after the trigger event shown in fig. 17 is triggered, a target trigger (i.e., trigger a) corresponding to the trigger event in the application script may be determined, and then target control logic (i.e., control logic in fig. 17) corresponding to the trigger a is invoked by the trigger a, so that a target service (such as service 1, service 2, etc.) corresponding to the trigger event in the service set is determined by the target control logic.
It will be appreciated that the target control logic may control a plurality of services, and those services are executed according to the target control logic; for example, the plurality of services controlled by the target control logic may be ordered and executed sequentially in that order. In this way, the target service to be executed currently can be determined according to the target control logic.
Optionally, the trigger event may be an event triggered by the user, such as a tapping event or a shaking event, or an event output after a service is invoked, such as an event returned after the service corresponding to the previous trigger is invoked, which is not limited in this embodiment.
The services in the application are connected in series by the control logic, and the device on which a service is deployed can be determined at run time; after the current device downloads the application script, the services in the service set invoked by the control logic are deployed on the device as needed. While the application runs, different services can be deployed to the same device or differentially deployed to different devices according to the requirements of the control logic, so that different tasks are executed on different devices, and the deployment process is dynamic. If the deployment of a service fails on a device, the user is prompted that the service is currently unavailable.
For example, referring to FIG. 19, for a cross-end screen capture application, the services invoked to implement cross-end screen capture include a double-finger tap service, a picture display service, and a screen capture service. When the user account subscribes to the cross-end screen capture application, the mobile phone, the tablet, the in-vehicle device, and the television under the user account all download the corresponding application script, and after downloading the application script, the double-finger tap service is deployed in advance based on the initial trigger.
When the user double-finger taps the screen of the mobile phone, the trigger event is determined to be "cross-end screenshot", and the mobile phone determines, based on the target trigger and the target control logic in the application script, that the business logic to be executed is to request the other devices (such as the tablet, the in-vehicle device, and the television) to invoke their screen capture services.
After receiving the request, each of the other devices determines that the trigger event is "screenshot", determines, based on the target trigger and the target control logic in the application script, that the business logic to be executed is to deploy the screen capture service, runs the screen capture service to capture the current interface, and sends the screenshot to the mobile phone.
After the mobile phone receives the screenshot, it determines that the trigger event is "display image", and, based on the target trigger and the target control logic in the application script, determines that the business logic to be executed is to deploy the picture display service and run the picture display service to display the screenshot. In this way, cross-end screen capture is realized through dynamic deployment of services.
Referring to fig. 2, 3 and 20, in some embodiments, step 013 comprises:
0131: initiating a target call request for the target service, wherein the target call request comprises initial call information of the target service;
0132: and calling the corresponding service proxy according to the initial call information in the target call request, and determining the target call information corresponding to the initial call information through the service proxy.
In certain embodiments, the service governance module 12 is also used to perform steps 0131 and 0132.
In certain embodiments, processor 20 is configured to perform steps 0131 and 0132.
Specifically, referring to fig. 21, after service registration, the electronic device 100 may call a service directly; alternatively, after the user orchestrates the registered services into an integral application, the application is called, for example through the subscription mode shown in fig. 14, so as to generate one or more application scripts corresponding to different application scenes. The electronic device 100 then quickly determines the target service according to the application scenario and the trigger event.
After determining the target service, the processor 20 may first initiate a target call request for the target service, where the target call request includes initial call information of the target service. The initial call information may include the basic attributes of the service, such as the service name, service ID, and service description; the calling mode of the service, such as the calling protocol type and private parameters of the protocol; and the parameter definition of the service, such as the input-parameter list of the service (e.g., parameter ID, parameter name, parameter type), the output-parameter list (e.g., parameter ID, parameter name, parameter type), and the conversion rules between service-type parameters (e.g., conversion script, description, service ID).
In order to achieve compatibility between services of the same type, a conversion rule between the parameters of the current service and the standard parameters (i.e., a conversion rule between service-type parameters) is defined at the time of service registration. For example, suppose the navigation services include a navigation service A and a navigation service B, where the destination parameter in navigation service A is named "destination" and the destination parameter in navigation service B is named "final location". By establishing a conversion rule between the parameters of navigation service A and the standard parameters, and between the parameters of navigation service B and the standard parameters, parameter compatibility between the services is achieved, and both navigation service A and navigation service B can be called through the standard parameters. Whether destination navigation is performed by navigation service A or navigation service B, it can be invoked through the standard destination parameter.
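The following is a minimal sketch of such a parameter conversion rule, assuming the rule can be expressed as a simple mapping from standard parameter names to service-specific parameter names; the function name and the rule shape are assumptions for illustration, while the parameter names follow the navigation example above.

// Sketch: converting standard parameters into a specific service's own parameter names.
type ConversionRule = Record<string, string>;  // standard parameter name -> service parameter name

const navigationServiceA: ConversionRule = { destinationPoint: "destination" };
const navigationServiceB: ConversionRule = { destinationPoint: "final location" };

function toServiceParams(standard: Record<string, unknown>, rule: ConversionRule): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [stdName, value] of Object.entries(standard)) {
    out[rule[stdName] ?? stdName] = value;     // fall back to the standard name if no rule exists
  }
  return out;
}

// Calling either navigation service through the same standard parameter:
const params = { destinationPoint: "XX parking lot" };
toServiceParams(params, navigationServiceA);   // { destination: "XX parking lot" }
toServiceParams(params, navigationServiceB);   // { "final location": "XX parking lot" }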
For example, the parameters of the service may be defined in a table specifying, for each parameter, information such as the parameter ID, parameter name, and parameter type.
The initial call information may be stored in the electronic device 100, or the initial call information may be stored in the server 200, in which case the processor 20 may receive the initial call information corresponding to the target service transmitted by the server 200.
Then, the processor 20 calls the service agent corresponding to the initial call information according to the initial call information in the target call request, thereby converting the initial call information into the target call information through the service agent.
It will be appreciated that the service agent may be deployed locally, such as at the electronic device 100; alternatively, the service agent may be disposed in the server 200 in the cloud, and the processor 20 sends the target call request to the server 200, and the server 200 calls the corresponding service agent according to the target call request, and converts the initial call information into the target call information.
With continued reference to fig. 21, it may be appreciated that, in order for an application to support a heterogeneous operating system, so as to implement cross-platform operation, a service gateway may convert, through a service proxy deployed locally (e.g., by the electronic device 100) or by the server 200, a target call request in a unified format into a real service call, i.e., a service call that meets the current platform service specification.
After conversion by the service agent, the initial call information is turned into target call information with which the current electronic device 100 can directly run the target service, so that the call to the target service conforms to the platform service specification of the current electronic device 100.
For example, if the Messenger service, the deep service, or other services shown in fig. 21 are called, interface adaptation is performed by the corresponding service proxy of the service gateway, so that the target call request conforms to the real service call of the Messenger service, the deep service, etc., and these services are called correctly.
In some embodiments, the initial call information includes a service type and registration call information; the service type may include android system service, windows system service, etc., and the registration call information includes the basic attribute of the service, the call mode of the service, and the parameter definition of the service.
Depending on the type of service to be provided, the electronic device 100 may be deployed with different service agents as needed, where different service agents correspond to different programming languages (e.g., Java, js, php, C, C++, etc.), different deployment forms (e.g., applets, browser plug-ins, applications, etc.), or different operating environments (e.g., virtual machines, browsers, operating systems, containers, etc.).
Moreover, because different operating systems have different upper-layer abstractions of processes and different inter-process calling modes, the same call request is converted into different concrete real calls inside the service gateway. The conversion for each service type can be completed by a designated service agent, and service agents can be added or removed according to the actual deployment situation; for example, a Windows service gateway only needs to deploy Windows-related service agents, and an Android service gateway only needs to deploy Android-related service agents.
Illustratively, as shown in fig. 22, the service gateway may include at least one of an Android service agent, a cloud restful agent, an Android dynamic service agent, and a Web dynamic service agent. For example, for an Android service, the initial call information may be converted into the target call information by the Android service agent so as to adapt to electronic devices 100 of different systems or platforms. For another example, for a cloud restful service, the initial call information may be converted into the target call information by the cloud restful agent so as to adapt to electronic devices 100 of different systems or platforms. Other cases are not listed here.
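The following is a minimal sketch of the gateway's agent selection, assuming agents are registered in a map keyed by service type; the interface and type names (ServiceGateway, ServiceAgent, InitialCallInfo, TargetCallInfo) are assumptions introduced here for illustration rather than names defined by this embodiment.

// Sketch of a service gateway choosing a service agent by service type.
interface InitialCallInfo { serviceType: string; registration: Record<string, unknown>; }
interface TargetCallInfo  { endpoint: string; payload: Record<string, unknown>; }

interface ServiceAgent {
  convert(info: InitialCallInfo): TargetCallInfo;   // produce a call matching the real platform specification
}

class ServiceGateway {
  private agents = new Map<string, ServiceAgent>();

  register(serviceType: string, agent: ServiceAgent): void {
    this.agents.set(serviceType, agent);             // e.g. "android-service", "cloud-restful"
  }

  toTargetCall(info: InitialCallInfo): TargetCallInfo {
    const agent = this.agents.get(info.serviceType);
    if (!agent) throw new Error(`no service agent for type ${info.serviceType}`);
    return agent.convert(info);
  }
}

In this design, adding or removing a platform only means registering or removing the corresponding agent, which matches the deployment flexibility described above.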
In some implementations, a service may be invoked through an interface exposed by the service.
For example, the externally exposed interface of a cloud RestFul service is defined in a corresponding format, and the externally exposed interface of an Android-side deep service is likewise defined in a corresponding format; such interface definitions include fields such as the type information, provider type information, link information, and link standard information described below.
The type information represents the real interface type of the service, and different types require different attributes to be configured. The provider type information indicates which application can provide this service; for example, a navigation-type service may be provided by Amap or Baidu Maps. The link information refers to the address used to access the cloud. The link standard information refers to the calling method of the link.
When the interface exposed by the target service is called, the service gateway selects a corresponding service agent according to the service type (corresponding to the type information in the interface definition) in the initial call information, and then the service agent converts the registered call information into target call information which is suitable for the interface type of the real service provider.
Therefore, the service provider does not need to make any interface adaptation modification; as long as this information is indicated when the service is registered, the appropriate service agent can be determined quickly, the registration call information is converted into the target call information, and the target service is finally run according to the target call information. For example, the interface exposed by the target service is called and the target call information (such as the input parameters) is input to the interface, and the target service produces output parameters according to the input parameters, thereby completing the call of the target service.
In this way, the target service can be normally operated by the electronic device 100 of any system or platform, and cross-platform operation of the target service is realized.
Referring to fig. 2, 3 and 23, in some embodiments, step 014 includes:
0141: operating the target service according to the target call information under the condition that the target service is of the first service type;
0142: and determining a host service corresponding to the target service under the condition that the target service is of the second service type, and operating the target service according to the target call information under the condition that the host service is in an operating state.
In some embodiments, the service operation module 13 is further configured to perform step 0141 and step 0142.
In certain embodiments, processor 20 is configured to perform step 0141 and step 0142.
In particular, services have different service types; for example, a first service type may be a dynamic service and a second service type may be a static service.
A static service is a pre-installed service whose invocation depends on a host application: the host application needs to be installed, and the static service can be called only when the host service in the host application is running.
A dynamic service is a service that supports dynamic deployment and can be dynamically deployed on a device while the application is running. The invocation of a dynamic service does not depend on a pre-installed application, and the callable dynamic services are stored uniformly in the server 200 in the cloud. In the case where the target service is of the first service type, the target service may be acquired directly from the server 200 and deployed dynamically, so that the target service is run directly according to the target call information.
In one possible implementation, the target service is a static service, i.e., a pre-installed service, for which the host application and access path are defined. The service governance module 12 is configured to manage the static services provided by host applications on the local device. After obtaining the target call information, the service governance module 12 queries the target service on the local device and determines the host service corresponding to the target service; before the target service is run, the host service needs to be running first, so that the target service can be run normally according to the target call information.
Optionally, if the target service is started successfully, the electronic device 100 may determine the target service agent corresponding to the target service according to the initial call information of the target service (specifically, the service type therein) and perform parameter conversion through the target service agent. Based on the host application and access path (the interface exposed by the target service) corresponding to the target service, the electronic device 100 then obtains the target service code set stored on the current electronic device 100 and calls the target service by running the target service code set.
Alternatively, if the target service fails to start, or the target service does not meet the pre-installation condition (e.g., the host application is not installed or the host service is not running, so that the target service is unavailable), the electronic device 100 may prompt the user that the call has failed.
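The following is a minimal sketch of the dispatch performed in step 014, distinguishing the two service types before running the target service; the helper names on the runtime context (isHostServiceRunning, downloadAndDeploy, runService, etc.) are hypothetical and stand in for the behaviors described above.

type ServiceType = "dynamic" | "static";

interface RuntimeContext {
  isHostServiceRunning(hostService: string): boolean;
  startHostService(hostService: string): Promise<boolean>;
  isDeployed(service: string): boolean;
  downloadAndDeploy(service: string): Promise<void>;
  runService(service: string, targetCallInfo: Record<string, unknown>): Promise<unknown>;
}

// Sketch: run the target service according to its service type.
async function runTargetService(
  service: string,
  type: ServiceType,
  targetCallInfo: Record<string, unknown>,
  ctx: RuntimeContext,
  hostService?: string,
): Promise<unknown> {
  if (type === "static") {
    // A static service can only run once its host service is running.
    if (hostService && !ctx.isHostServiceRunning(hostService)) {
      const started = await ctx.startHostService(hostService);
      if (!started) throw new Error("target service unavailable: host service failed to start");
    }
  } else if (!ctx.isDeployed(service)) {
    // A dynamic service is fetched from the server 200 and deployed on demand.
    await ctx.downloadAndDeploy(service);
  }
  return ctx.runService(service, targetCallInfo);
}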
In one illustrative example, the static service invocation procedure is shown in FIG. 24.
Step 1801, a target call request for a target service is initiated to the service gateway.
Step 1802, the service gateway verifies whether the request is legal; if it is legal, step 1803 is executed, and if it is not legal, step 1809 is executed.
Step 1803, detecting whether the host service is started; if so, the target service is started and step 1804 is performed; if not, a prompt is given that starting the target service has failed.
Step 1804, detecting whether a target service agent corresponding to the target service exists; if so, execute step 1805; if not, step 1809 is performed.
Step 1805, converting, by the target service agent, the initial call information in the target call request into target call information.
Step 1806, calling the target service based on the converted target calling information to obtain an output parameter.
In step 1807, output parameter conversion is performed by the target service agent to generate a call result.
Step 1808, return a call result.
Step 1809, return call failure.
In another possible implementation, the target service is a dynamic service, and the service governance module 12 obtains from the server 200 the target service code set corresponding to the target service, so as to run the target service code set according to the target call information, thereby implementing the call of the target service.
Optionally, the electronic device 100 needs to determine a target service agent corresponding to the target service according to initial call information of the target service, and perform parameter conversion through the target service agent, so as to output converted target call information, so as to call the target service based on the target call information.
In one illustrative example, the call procedure for a dynamic service is shown in FIG. 25.
Step 1901, a target call request is initiated to a service gateway.
Step 1902, the service gateway verifies whether the request is legal; if it is legal, step 1903 is performed, and if it is not legal, step 1910 is performed.
Step 1903, detecting whether there is a target service agent corresponding to the target service; if so, go to step 1904; if not, step 1910 is performed.
In step 1904, the initial call information in the target call request is converted into target call information by the target service agent.
Step 1905, detecting whether a target service is deployed; if deployed, go to step 1906; if not, step 1907 is performed.
Step 1906, calling the target service based on the converted target call information to obtain the output parameter.
In step 1907, the target service is downloaded and deployed from the application repository, and the target service is invoked based on the converted target invocation information to obtain the output parameters.
In step 1908, output parameter conversion is performed by the target service agent to generate a call result.
Step 1909, return call results.
Step 1910, return call failure.
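The flow of fig. 25 above can be summarized in the following minimal TypeScript sketch; the gateway helpers (isLegal, findAgent, isDeployed, downloadAndDeploy, invoke) are hypothetical names for the behaviors described in steps 1901 to 1910, not an API defined by this embodiment.

// Sketch of the dynamic-service call flow (steps 1901-1910).
async function callDynamicService(request: {
  serviceId: string;
  serviceType: string;
  initialCallInfo: Record<string, unknown>;
}, gw: {
  isLegal(r: unknown): boolean;
  findAgent(serviceType: string): { toTarget(i: Record<string, unknown>): Record<string, unknown>;
                                    toResult(o: unknown): unknown } | undefined;
  isDeployed(serviceId: string): boolean;
  downloadAndDeploy(serviceId: string): Promise<void>;
  invoke(serviceId: string, targetCallInfo: Record<string, unknown>): Promise<unknown>;
}): Promise<unknown> {
  if (!gw.isLegal(request)) throw new Error("call failed: illegal request");            // 1902 -> 1910
  const agent = gw.findAgent(request.serviceType);
  if (!agent) throw new Error("call failed: no matching service agent");                // 1903 -> 1910
  const targetCallInfo = agent.toTarget(request.initialCallInfo);                       // 1904
  if (!gw.isDeployed(request.serviceId)) {
    await gw.downloadAndDeploy(request.serviceId);                                      // 1905 -> 1907
  }
  const output = await gw.invoke(request.serviceId, targetCallInfo);                    // 1906/1907
  return agent.toResult(output);                                                        // 1908 -> 1909
}

The static-service flow of fig. 24 differs only in that, instead of downloading the service on demand, the gateway first checks that the host service is started before invoking the target service.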
The subsequent steps after the electronic device 100 completes the service call also differ for different service call types. In one possible implementation, in the case where the target service is a synchronous call service, the electronic device 100 obtains the call result of the target service, so as to call the next service based on the call result (until the calls of all services under the trigger are completed); in the case where the target service is an asynchronous call service, the electronic device 100 obtains the service output event of the target service, which triggers a trigger in the target application script.
The service return value may be data, a file, an instruction, or the like; for example, the return value of the screen capture service may be the captured picture. The trigger triggered by the service output event may be a trigger other than the target trigger, or may be the target trigger itself, which is not limited in this embodiment.
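The following is a minimal sketch of how the two call types can be handled after the service completes; the callback names are hypothetical and only illustrate the chaining and event-triggering described above.

// Sketch: handle a call result for synchronous vs. asynchronous services.
async function handleCallResult(
  mode: "sync" | "async",
  callService: () => Promise<unknown>,
  callNextService: (input: unknown) => Promise<void>,
  fireTriggerEvent: (event: unknown) => void,
): Promise<void> {
  if (mode === "sync") {
    const result = await callService();
    await callNextService(result);        // the return value feeds the next service under the trigger
  } else {
    const outputEvent = await callService();
    fireTriggerEvent(outputEvent);        // the output event triggers a trigger in the application script
  }
}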
Referring to fig. 26 and 27, in some embodiments, the main flow of service operation is as follows:
In step 2101, a trigger event triggers a specific trigger in device A.
In step 2102, device A executes the control logic of the trigger.
In step 2103, device A executes the business logic according to the control logic, changes the state information, and synchronizes it. The state information on the devices is peer-to-peer: each device maintains all state information required for the application to run, and when the state information on one device changes, the state information of all devices is synchronized to the latest state through data synchronization.
In one example, as shown in fig. 28, when different devices under the same user account are running the same application, device a deploys service 1 and service 2, device B deploys service 1 and service 3, and device C deploys service 4 and service 5. When the service 3 is invoked in the device B, the output event of the service 3 changes the state information, and the device B synchronizes the state information to the device a and the device C in a data synchronization manner, so that the device a and the device C update their own state information. When the service 5 is invoked in the device C, the output event of the service 5 changes the state information, and the device C synchronizes the state information to the device a and the device B in a data synchronization manner, so that the device a and the device B update their own state information.
Through this synchronization mechanism, all devices under the user account are equivalent, and the disconnection of a single device does not affect the running of the application. For example, when screen images of other devices are captured by a double-finger tap and sent to the current device, the user can tap any one of the devices to complete the operation, and if any device goes offline, normal screen capture on the remaining devices is not affected.
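The following is a minimal sketch of such peer-to-peer state synchronization, assuming a simple version counter and an abstract broadcast transport; the structure and names (StateStore, broadcast, version) are illustrative assumptions rather than the concrete mechanism of this embodiment.

// Sketch: peer-to-peer state synchronization between devices under one user account.
interface StateStore {
  state: Record<string, unknown>;
  version: number;
}

function applyLocalChange(
  store: StateStore,
  key: string,
  value: unknown,
  broadcast: (update: { key: string; value: unknown; version: number }) => void,
): void {
  store.state[key] = value;
  store.version += 1;
  broadcast({ key, value, version: store.version });   // push the change to all other devices
}

function applyRemoteUpdate(store: StateStore, update: { key: string; value: unknown; version: number }): void {
  if (update.version > store.version) {                // only accept newer state
    store.state[update.key] = update.value;
    store.version = update.version;
  }
}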
In step 2104, the service governance module invokes the target service according to the control logic, and the invocation may be made through the service gateway; for example, the service governance module invokes service 2 of device C through the service gateway. There may be one or more target services, presented in a list: if there is only one service in the service list, it is invoked directly; if there are multiple services, an appropriate service is selected according to a policy.
Currently, two policies are provided: policy 1 pops up the service list so that the user can select a service by himself; policy 2 automatically determines the target service on the user's behalf based on the user information, device information, user behavior, service information, and the like, for example based on the user's historical usage information when running the target service.
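A minimal sketch of these two selection policies follows; the scoring heuristic (picking the historically most-used candidate) is an assumption standing in for the richer signals listed above.

// Sketch: select a target service from one or more candidates.
interface CandidateService { id: string; historicalUseCount: number; }

function selectTargetService(
  candidates: CandidateService[],
  policy: "ask-user" | "auto",
  askUser: (list: CandidateService[]) => CandidateService,
): CandidateService {
  if (candidates.length === 0) throw new Error("no candidate service");
  if (candidates.length === 1) return candidates[0];            // single service: call it directly
  if (policy === "ask-user") return askUser(candidates);        // policy 1: pop up the service list
  // policy 2: pick the service the user has historically used most in this scene
  return candidates.reduce((best, c) => (c.historicalUseCount > best.historicalUseCount ? c : best));
}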
Step 2105, for a synchronous call, the application runtime directly obtains the return value of the service and uses this return value as the input parameter of the subsequent service; if there is subsequent business logic, the flow returns to step 2103, otherwise execution ends.
For an asynchronous call, the called service, after completing its work, sends a trigger event to device A, which responds to the trigger event and triggers the corresponding trigger, returning again to step 2101.
Referring to fig. 2, 3, and 29, in some embodiments, the initial call information includes registration call information and/or dynamic call information; step 013 comprises:
0133: acquiring registration call information of a target service stored by current equipment and/or acquiring dynamic call information generated by other services; the registration calling information is used for representing the attribute related to the target service, and the dynamic calling information is used for representing the input variable required by running the target service;
0134: and determining target call information according to the registration call information and the dynamic call information.
In certain embodiments, the service governance module 12 is also used to perform steps 0133 and 0134.
In certain embodiments, processor 20 is configured to perform steps 0133 and 0134.
Specifically, the initial call information includes registration call information and/or dynamic call information; the registration call information is used to indicate the attribute related to the target service, and the specific content thereof is referred to the foregoing description and will not be repeated herein. The dynamic call information includes input variables generated by other services or the electronic device 100, for example, the input variables of the target service are related parameters generated by other services in the target service set except the target service.
When the target service is operated according to the target call information, the input variable is input to the target service, and the output variable can be obtained, so that the operation of the target service is realized.
The specific calling process may be as follows. The interface exposed by the target service is POST http://xxx.x.x.x:8888/service/call, and the interface needs to obtain the following information:
{
A: string, // type information
B: string, // the device that calls the service: if the target service is called on the current device, this field is set to the local device or left empty; if another device is determined to call the target service, the ID of that other device is indicated; for example, when a watch connected to the mobile phone calls the target service, the ID of the watch is input, and the mobile phone directly controls the watch to call the target service
C: [] // input-parameter list of the service call (i.e., the input variables)
}
It can be seen that when a target service is invoked through the interface, it is necessary to specify the service type, the device on which the service is located (a device that calls the target service may be designated, or certain devices may be filtered out), and the input-parameter list, where the input-parameter list may include the input variables.
The target service can then produce output parameters according to this information. For example, if the current device calls a cloud map service for navigation, the type information in the input parameters is cloud restful, the device that calls the service is local, and the input variables include the destination address; the target service then outputs a navigation route and a map of the surroundings as output parameters, thereby realizing navigation.
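As a usage illustration of the reconstructed request body above, the following sketch issues such a call for the cloud navigation example; the JSON field shapes inside C, the fetch call, and the placeholder address are illustrative assumptions, not the exact wire format of this embodiment.

// Sketch: calling the exposed interface for the cloud navigation example.
async function callCloudNavigation(destination: string): Promise<unknown> {
  const body = {
    A: "cloud restful",                                  // type information: a cloud RestFul service
    B: "",                                               // empty: the service is called on the local device
    C: [{ name: "destination", value: destination }],    // input variables
  };
  const response = await fetch("http://xxx.x.x.x:8888/service/call", {  // placeholder address from the text
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  return response.json();   // e.g. a navigation route and surrounding map as output parameters
}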
In one example, for a subway scenario, after the user boards the subway, the target service is determined to be an arrival reminder service, and an arrival reminder is given as the user arrives at each station. That is, when the arrival reminder service is called, it outputs station information according to the position information input by the electronic device 100 (i.e., as the input variable); when the output station information is the target station, a ride-code service is called according to the target station (i.e., as the input variable) so as to output ride-code information.
In another example, for a cross-end screen capture scenario, if a trigger event of a cross-end screen capture is detected (e.g., the current electronic device 100 receives a screen capture gesture), the screen capture gesture service is executed, and the screen capture gesture is used as the input variable to request invocation of the screen capture service of another electronic device 100 (e.g., a device logged in to the same account as the current electronic device 100); after capturing the screen according to the request, the screen capture service sends the screenshot to the current electronic device 100.
For ease of understanding, please refer to fig. 30, which is described below in terms of an overall operation flow of the control method of the present application.
As shown in steps S1 and S2, the services of the present application are developed by service developers and uploaded to the server 200. In a narrow sense, this refers to developers developing services according to a preset service framework; in a broad sense, it also covers services developed by general service developers, i.e., services not developed in accordance with the preset service framework.
In step S3, the scene sensing module 11 senses the application scene.
In step S4, the electronic device 100 detects a triggering event according to the interaction information.
In step S5, after detecting the trigger event, the service governance module 12 may determine the trigger corresponding to the trigger event, thereby determining one or more target services corresponding to the control logic in the trigger.
In step S6, if the electronic device 100 does not have the target service locally, the target service is downloaded from the server 200 and deployed to the electronic device 100.
In step S7, if there are multiple target services, the target service is determined according to the user selection, or the user is automatically assisted to determine the target service according to the user information, the device information, the user behavior, the service information, and other data.
In step S8, the target service is deployed to the electronic device 100 and invoked, where the target service may be deployed on the current device or on other devices of different systems or platforms, so as to implement cross-platform operation.
In step S9, when invoking the target service, the service governance module 12 may send a target call request to the service gateway.
In step S10, the service gateway determines a service proxy corresponding to the service type in the target call request, and the service proxy converts the registration call information into target call information and sends the target call information to the service operation module 13.
Step S11: the service operation module 13 operates the target service according to the target call information.
Referring to fig. 30 and 31, in an example, the scene sensing module 11 of the electronic device 100 determines according to the location information that the application scene is a shopping scene. When the user arrives at the parking lot entrance, the service governance module 12 detects that the trigger event is "parking space navigation", determines the trigger corresponding to "parking space navigation", and determines one or more target services (such as a navigation service) corresponding to the control logic in the trigger; if the navigation service is not deployed locally on the electronic device 100, the navigation service is downloaded from the server 200 and deployed.
After the navigation service is deployed, the service governance module 12 calls the navigation service: the service governance module 12 sends a target call request to the service gateway, the service gateway determines the service agent corresponding to the service type in the target call request, the service agent converts the registration call information into target call information and sends it to the service operation module 13, and the service operation module 13 runs the navigation service to guide the user to an empty parking space.
After parking is completed, the service governance module 12 detects that the trigger event is "shopping", determines the trigger corresponding to "shopping", and determines one or more target services (e.g., a shopping service) corresponding to the control logic in the trigger; if the shopping service is not deployed locally on the electronic device 100, the shopping service is downloaded from the server 200 and deployed.
Finally, the service governance module 12 calls the shopping service: the service governance module 12 sends a target call request to the service gateway, the service gateway determines the service agent corresponding to the service type in the target call request, the service agent converts the registration call information into target call information and sends it to the service operation module 13, and the service operation module 13 runs the shopping service to guide the user while shopping, for example by navigating the user quickly to the location of a desired commodity, recommending commodities, adding a commodity to the shopping cart when the user touches its label, or displaying a pick-up code.
In certain embodiments, by decomposing applications into services, the present application provides a new development model, namely a layered development model.
In particular, the system provides a generic mechanism to mask the complex details of applications. A service developer only needs to provide single-function services that can be used by multiple applications; the service developer may be an ordinary developer. Application developers focus on the business logic, and the system provides code and low-code development approaches. An application developer can be an ordinary developer or even an ordinary user, which lowers the development threshold and broadens the developer base.
The development of an application is divided among several roles; for external developers, the main roles are service developers and application developers.
The service is used as a component element of the application, and the main flow is as follows:
1. and developing the service.
In theory, the application store can access services from any operating system and platform, so developers can keep their original development mode, and developers on different platforms such as Windows, Android, and the cloud can develop related services using the technologies they are familiar with.
2. The dynamic service is submitted to the server 200.
The system provides certain dynamic service mechanisms; if a developer develops a dynamic service based on the mechanisms provided by the system, the developer needs to package the service and submit it to the server 200 after the development of the service is complete.
3. The application store accesses the service.
The main sub-flows of service development are as follows:
1. the third party service provider issues a service in the application store, specifies information related to the name of the service, the description of the service, and the like.
2. The calling mode of the third-party service is specified, including the calling path, input parameters, output parameters, conversion rules between the input parameters and the output parameters, and the like.
3. The installation information of a third-party service is of two kinds. One is a pre-installed service, for example a video playback service obtained along with the installation of a video application such as iQIYI, for which the host application information and the supported versions need to be specified. The other is a dynamic service, whose code resides at the server 200 and which can be dynamically deployed to a designated device; the type of the dynamic service and its address in the application store need to be specified.
4. After the service is submitted, the application store reviews it; if the review passes, the service becomes visible in the application development tool.
Development flow of application developer:
Application development tools of different levels, such as code, low-code, and no-code tools, are provided, so that both professional developers and ordinary users can develop applications.
In some implementations, users may share applications.
Applications run on a per-user basis, and the system provides a unified framework to support sharing and joint use of applications among different users. The application developer only needs to specify the relevant sharing interactions (such as bump-to-share, QR code, SMS, WeChat, etc.) when packaging the application, and sharing of the application among different users can be realized without additional code development; a set of mechanisms is provided for authorizing and revoking the sharing and ensuring the security of the behavior.
This mechanism greatly enhances the application experience. With a shared-clipboard application, the devices under a user's account can share copy and paste; if a device is bumped against a friend's mobile phone, content copied on one phone can be pasted directly on the other. A multi-camera application allows the cameras of all devices under the user's account to be selected when taking a picture, and if the user's phone is bumped against a friend's phone, the camera of the friend's phone can also be selected. As another example, when a batch picture-processing application on the mobile phone is bumped against a friend's phone, the processing speed is doubled, and bumping it against a computer increases the processing speed further.
As another example, a collaborative drawing-board application allows multiple devices of a user to doodle together; it can run on any device and system and flow among the user's devices, and if another user's phone is bumped against it, or a link is sent to other users, they can doodle together. A super-game application distributes the audio, video, and control capabilities of the host device running the game to different target devices, so that one game process is shared among multiple target devices; if the user bumps against another user's mobile phone, the picture, sound, control, and other capabilities of the current game can be transferred to the other user's mobile phone.
Based on the above embodiments, the application of the present application has the following features:
1. Compared with a traditional application that follows the device, the application of the present application follows the person: the user acquires the application bound to the user through the subscription service, and the application is bound only to the user, not to a specific device.
2. The application is distributed naturally, different services of the application can be deployed to the same device or differentially deployed to different devices, different tasks are executed on different devices, and multi-device operation can be supported.
3. The application of the application supports heterogeneous operating systems, and the conversion from initial call information to target call information is realized through the service gateway, so that the service of the application (such as application 3 in fig. 4) can be the service on different operating systems, and thus the application can run across multiple platforms.
4. Services in applications of the present application support dynamic deployment, i.e., a service is downloaded from the application store only when the user needs to use it, thereby supporting a tap-and-use experience.
5. The definition of an application of the present application is consistent on different devices, which ensures a consistent user experience; at the same time, when the functions of the same application are realized across a plurality of devices, the business logic executed on different devices may differ, so the services deployed on different devices may also differ. The service sets of the application thus differ across devices and have the characteristic of differential deployment, so that the characteristics of different devices can be utilized.
6. The application of the application is an application for service assembly, the service can be used by a plurality of applications, the application is formed by combining a plurality of services, and the service has high reusability, so that the development difficulty is reduced.
7. The visual development tool reduces the development threshold, enables the common user to develop micro-applications, particularly the micro-applications related to the equipment characteristics, and improves the use experience of the equipment.
8. The application provides an access entry for a third party application developer, and the third party service can be conveniently accessed to the system on the premise of not modifying the original application by only specifying the service name and description, the calling mode of the service and other information.
9. The user subscription system enables personalized equipment experience to be possible, and because the application is bound with the user, after different users are switched by the same equipment, the application corresponding to the user is also different, so that the user can enable the same equipment to have different characteristics in different user hands by subscribing different applications.
10. The dedicated sharing and joint-use system, such as the shared-clipboard application, enables micro-applications to take effect across different users and enriches the interactive experience.
Referring to fig. 32, a schematic diagram of a control device 1000 according to an exemplary embodiment of the present application is shown. At least one electronic device 100 and a server 200 are included in the system.
In one possible implementation, the electronic device 100 is provided with a service governance module 12, where the service governance module 12 serves as a core for controlling the running of applications, and includes a script management module 1211, an event bus module 1212, an application scheduling module 1213, a runtime module 1214, and a service scheduling module 1215.
The script management module 1211 is configured to manage application scripts of each application stored in the electronic device 100, and is responsible for parsing the application scripts, so as to determine triggers in the application scripts and services under the triggers. Optionally, the script management module 1211 is further configured to download a corresponding application script when receiving a subscription operation to the application, and delete an application script corresponding to the application when receiving a subscription cancellation operation to the application.
The event bus module 1212 is configured to cooperate with the application scheduling module 1213 to implement application scheduling based on the trigger event.
In some embodiments, the event bus module 1212 downloads a trigger corresponding to a plurality of applications, and after receiving the trigger event, the event bus module 1212 sends the trigger event to the application scheduling module 1213 to determine an application to be run from a plurality of applications, because the same trigger event may trigger a plurality of subscribed applications. The application scheduling module 1213 may automatically determine the application based on the scheduling policy, or may manually select the application by the user.
The runtime module 1214 is used to execute control logic in the application script to interact with the service dispatch module 1215 based on the control logic, with service calls made by the service dispatch module 1215.
In addition to the service administration module 12, a data synchronization module 14 and a service gateway 15 are also provided in the electronic device 100.
The data synchronization module 14 is configured to perform state synchronization to other devices when the state of the service administration module 12 is changed (e.g., the state is changed due to an event generated by a service), so as to ensure the consistency of state information of the different devices.
The service gateway 15 is configured to make a service call based on the target call request of the service scheduling module 1215, and specifically includes an agent management module 151 and a lifecycle management module 152. The agent management module 151 is provided with different service agents, and the service agents are used for converting initial call information of the target call request in a unified format to obtain target call information conforming to the electronic device 100.
The lifecycle management module 152 is used to manage the lifecycle of the service.
In the running process of the application, the electronic device 100 can not only perform service call through its own service gateway 15, but also perform service call (determined by the service logic of the application) through the service gateway 15 of other devices or the server 200, so that cross-device service call can be implemented.
For services of different deployment manners, as shown in fig. 32, the electronic device 100 is further provided with a static service module 16 and a dynamic service module 17. The static service module 16 is used for managing pre-installed static services, and the dynamic service module 17 is used for managing dynamic services deployed dynamically.
In some embodiments, the services contained in dynamic services module 17 may correspond to different operating environments, different deployment modalities, or different programming development languages. In fig. 32, the dynamic service module 17 is schematically illustrated as including the android dynamic service 1251, the Web dynamic service 1252, and the container dynamic service 1253, but this configuration is not limited thereto.
Server 200 is a server, a service cluster formed by a plurality of servers, or a cloud computing center. The functions implemented by the servers in the server cluster are divided, as shown in fig. 32, and the server cluster includes a user resource management server 212, an application store server 222, a service market server 223, and a cloud service library 224.
The user resource management server 212 is used for managing user accounts using applications, managing applications subscribed under different user accounts, managing invoked services, managing binding relations between user accounts and devices, and performing security verification in the interaction process.
The application store server 222 is used to provide application subscription services. When the user needs to use the application, the application search can be performed by the application search engine provided by the application store server 222, and the subscription application can be selected from the search results.
In some embodiments, after receiving a subscription operation for an application, the application store server 222 sends the user account and the service identifier of the subscribed application to the user resource management server 212, and the user resource management server 212 updates the subscription relationship between the user account and the application. Further, the user resource management server 212 is also configured to determine the other devices under the user account and push subscription notifications to them, so that the other devices download the application script of the application from the application store server 222. Illustratively, after the user "Zhang San" subscribes to an application through a smart phone, the in-vehicle device and the tablet computer logged in to Zhang San's user account receive the subscription notification pushed by the user resource management server, and download the application script of the application based on the service identifier in the notification.
The service marketplace server 223 is developer-oriented and provides service query services. A developer can search for services through the service search engine provided by the service marketplace server 223, apply the found services in the applications being developed, and then upload the developed applications to the application store server 222 for other users to download.
In addition, the developer may also perform dynamic service development and upload the developed dynamic service to the application store server 222.
The subsequent electronic device 100 may download and deploy the dynamic service from the application store server 222.
It should be noted that, the foregoing embodiments only illustrate the infrastructure of the system, and other computer devices (such as a developer device) or servers implementing other functions may also be included in the system, and the present embodiment is not limited to this configuration. The explanation will be made below with reference to a specific example based on the control device 1000 described above. Specifically, the description is given by taking the example of the user realizing the cross-end screen capturing application:
specifically, please refer to fig. 32 and 33 in combination, a user subscribes to a cross-end screen capture application through an application marketplace. The user resource management server 221 pushes a subscription notification containing the service identification of the cross-end screen capture application to all devices under the user account to facilitate subsequent service deployment.
The cross-end screen capturing application comprises a double-finger tapping service, a picture display service and a screen capturing service. The script management module 1211 of all devices under the user account analyzes the application script, and determines that the double-finger tapping service needs to be deployed first according to the analysis content.
The user may perform a double-finger tap on any device (such as the mobile phone 101 shown in fig. 33), thereby triggering a screen capture event. The event bus module 1212 and the application scheduling module 1213 determine the service to be run according to the screen capture event, and the service scheduling module 1215 then schedules the double-finger tap service. After the double-finger tap service runs, a target call request for acquiring the screen captures of the other devices is sent to those devices (such as the computer 102 shown in fig. 33). The agent management module 151 in the service gateway 15 of the computer 102 determines the service agent for the target call request and converts the initial call information in the target call request to generate target call information; the event bus module 1212 and the application scheduling module 1213 in the computer 102 cooperate to determine the service to be deployed and run (namely the screen capture service); and the service scheduling module 1215 schedules the screen capture service according to the target call information so as to obtain the screen capture image, and then sends a call request containing the screen capture image back to the mobile phone 101.
The agent management module 151 in the service gateway 15 of the mobile phone determines the service agent corresponding to the target call request sent by the computer and converts the initial call information in the target call request to generate target call information; the event bus module 1212 and the application scheduling module 1213 cooperate to determine the service to be deployed and run (namely the picture display service); and the service scheduling module 1215 schedules the picture display service according to the target call information. The picture display service displays the screen capture image of the computer 102 contained in the target call information and may store it, thereby realizing cross-end screen capture.
In other words, according to the control logic in the application script, the screen capturing service of each device other than the tapped device is called: each such device determines that the business logic to be executed is capturing and sending a screenshot, downloads and dynamically deploys the screen capturing service, calls the screen capturing service to capture the current device picture, and feeds the picture back to the mobile phone. After receiving the screen capture pictures fed back by the other devices, the mobile phone calls the picture display service to display the screenshots.
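Since the screen capturing service is deployed on demand, the sketch below illustrates one possible download-then-run flow, assuming a hypothetical ServiceStore standing in for the application store server 222 and a Device class that caches deployed code sets; the details are illustrative only.

```python
# Sketch of dynamic service deployment on a device; all names are hypothetical.
class ServiceStore:
    """Stand-in for the application store server."""
    def download(self, service_id: str) -> str:
        return f"# code set for {service_id}"

class Device:
    def __init__(self, store: ServiceStore):
        self.store = store
        self.local_code_sets = {}   # service_id -> code set already deployed

    def ensure_deployed(self, service_id: str) -> str:
        # Non-localized services are downloaded and deployed only when needed.
        if service_id not in self.local_code_sets:
            self.local_code_sets[service_id] = self.store.download(service_id)
        return self.local_code_sets[service_id]

    def capture_and_send(self, target_call_info: dict, reply_to):
        self.ensure_deployed("screenshot")
        image = b"fake-image-bytes"        # placeholder for the captured picture
        reply_to(image)                    # feed the picture back to the requesting phone

phone_received = []
computer = Device(ServiceStore())
computer.capture_and_send({"format": "png"}, phone_received.append)
print(len(phone_received), "screenshot(s) fed back to the phone")
```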
Referring to fig. 34, embodiments of the present application also provide a non-transitory computer readable storage medium 200 containing a computer program 201. The computer program 201, when executed by one or more processors 20, causes the one or more processors 20 to perform the control method of any of the embodiments described above.
Referring to fig. 1, for example, the computer program 201, when executed by one or more processors 20, causes the processors 20 to perform the following control method (a schematic sketch of the flow is given after the numbered steps):
011: determining the current application scene and determining a service set corresponding to the application scene;
012: under the condition that a trigger event is detected in an application scene, determining a target service corresponding to the trigger event in a service set;
013: determining initial call information of the target service, determining target call information according to the initial call information, wherein the target call information is matched with the electronic equipment 100 running the target service;
014: operating a target service according to the target call information;
015: judging whether a preset first interactive operation is received or not when the target service is interrupted;
016: if yes, the interrupted target service is restored.
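To make the relationship between steps 011 to 016 easier to follow, here is a minimal end-to-end sketch in Python; DemoDevice and every helper on it are placeholders assumed for illustration, not parts of the disclosed apparatus.

```python
# Hypothetical end-to-end sketch of steps 011-016; every name is a placeholder.
class DemoDevice:
    def detect_scene(self):              return "home"
    def service_set_for(self, scene):    return {"double_tap": "screenshot"}
    def wait_for_trigger(self, scene):   return "double_tap"
    def initial_call_info(self, svc):    return {"service_type": svc, "display": 0}
    def to_target_call_info(self, info): return {**info, "screen_id": 0}
    def run(self, svc, info):            print("run", svc, info); return {"interrupted": True}
    def first_interaction_received(self): return True   # e.g. a touch on a preset area
    def resume(self, handle):            print("resuming interrupted service")

def control_loop(device):
    scene = device.detect_scene()                        # 011: current application scene
    services = device.service_set_for(scene)             # 011: service set for that scene
    event = device.wait_for_trigger(scene)               # 012: trigger event detected
    target_service = services[event]                     # 012: target service for the event
    initial = device.initial_call_info(target_service)   # 013: initial call information
    target_info = device.to_target_call_info(initial)    # 013: target call information
    handle = device.run(target_service, target_info)     # 014: run the target service
    if handle["interrupted"] and device.first_interaction_received():  # 015
        device.resume(handle)                            # 016: restore the service

control_loop(DemoDevice())
```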
In the description of the present specification, reference to the terms "certain embodiments," "in one example," "illustratively," and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, the different embodiments or examples described in this specification, and the features of those embodiments or examples, may be combined by those skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps of the process. Alternative implementations, in which functions may be executed out of the order shown or discussed (including substantially concurrently or in reverse order, depending on the functionality involved), are also included within the scope of the preferred embodiments of the present application, as would be understood by those reasonably skilled in the art to which the embodiments of the present application pertain.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and are not to be construed as limiting the present application, and that variations, modifications, alternatives, and substitutions may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (19)

1. A control method, characterized by comprising:
determining an application scene at present, and determining a service set corresponding to the application scene, wherein the application scene is a use scene at present of the electronic equipment;
under the condition that a trigger event is detected in the application scene, determining a target service corresponding to the trigger event in the service set; different application scenes correspond to different service sets, and the trigger event has an association relation with the service in the service sets;
determining initial call information of the target service, determining target call information according to the initial call information, and converting the initial call information into target call information matched with the target service, wherein the target call information is matched with the electronic equipment running the target service;
operating the target service according to the target call information;
detecting whether a user completes a first interactive operation when the target service is interrupted, so as to determine whether the user needs to recover the interrupted target service;
if yes, the interrupted target service is restored.
2. The control method according to claim 1, characterized by further comprising:
acquiring a flag bit of the target service, wherein the flag bit is used for indicating that the target service runs in a foreground or a background; and
determining that the target service is interrupted under the condition that the flag bit indicates that the target service runs in the background.
3. The control method according to claim 1, wherein the detecting whether the user has completed the first interactive operation comprises:
judging whether a first preset area of the electronic equipment is touched; and/or
judging whether the difference value of the attitude information of the electronic equipment is larger than a preset threshold value; and/or
judging whether a second preset area of the electronic equipment receives a preset gesture; and/or
judging whether the electronic equipment receives a preset voice input; and/or
judging whether the moving track of the gaze point detected by the electronic equipment accords with a preset track.
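Purely as an illustration of how the checks in claim 3 might be combined (this sketch is not part of the claims; the sensor helpers, region names, thresholds and preset values are all assumptions):

```python
# Illustrative only; thresholds, regions and sensor helpers are hypothetical.
def first_interaction_detected(device,
                               pose_threshold=0.2,
                               preset_gesture="swipe_up",
                               preset_phrase="resume",
                               preset_track="circle"):
    checks = (
        device.touched_region("status_bar"),                        # first preset area touched
        device.pose_delta() > pose_threshold,                       # attitude change above threshold
        device.gesture_in_region("bottom_edge") == preset_gesture,  # preset gesture in second area
        device.last_voice_command() == preset_phrase,               # preset voice input received
        device.gaze_track_matches(preset_track),                    # gaze track matches preset track
    )
    return any(checks)   # the claim allows any combination via "and/or"

class _DemoSensors:
    """Stand-in sensor readings so the sketch can be executed."""
    def touched_region(self, name):       return name == "status_bar"
    def pose_delta(self):                 return 0.05
    def gesture_in_region(self, name):    return "none"
    def last_voice_command(self):         return ""
    def gaze_track_matches(self, track):  return False

print(first_interaction_detected(_DemoSensors()))   # True: the preset area was touched
```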
4. The control method according to claim 1, characterized by further comprising, before resuming the interrupted target service:
judging whether the operation corresponding to the target service is completed or not; and
and in the case that the operation is not completed, entering the step of resuming the interrupted target service.
5. The control method according to claim 4, characterized by further comprising, before resuming the interrupted target service:
sending confirmation information, wherein the confirmation information is used for confirming whether to resume the interrupted target service; and
determining whether to resume the interrupted target service according to a second interactive operation.
6. The control method according to claim 1, characterized in that the resuming of the interrupted target service comprises:
and restoring the interrupted target service in the first electronic device and/or a second electronic device connected with the first electronic device.
7. The control method according to claim 1, wherein the determining the current application scenario includes:
and acquiring interaction information, and determining the current application scene according to the interaction information.
8. The control method according to claim 1, wherein the determining a service set corresponding to the application scenario includes:
determining an application script corresponding to the application scene, wherein the application script comprises at least one service identifier;
and taking a set formed by all the services corresponding to the service identifiers in the application script as the service set.
9. The control method according to claim 8, wherein the determining an application script corresponding to the application scenario includes:
determining a service list corresponding to the application scene;
and generating an application script corresponding to the application scene according to at least one service identifier contained in the service list.
10. The control method according to claim 9, wherein the generating the application script corresponding to the application scenario according to the at least one service identifier included in the service list includes:
generating an initial application script corresponding to the application scene according to at least one service identifier contained in the service list;
and acquiring editing data of the initial application script, and adjusting the initial application script based on the editing data to obtain an application script corresponding to the application scene.
11. The control method according to claim 8, wherein the determining a target service corresponding to the trigger event in the service set in the case that the trigger event is detected in the application scenario includes:
triggering a target trigger corresponding to the trigger event in the application script under the condition that the trigger event is detected in the application scene;
and calling corresponding target control logic through the target trigger, and determining target service corresponding to the trigger event in the service set according to the target control logic.
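As a loose illustration of claims 8 and 11 taken together (not part of the claims; the script layout, the lambda-based control logic and every identifier below are assumptions), a trigger event might select its control logic from the application script, and the control logic might then pick the target service out of the service set:

```python
# Illustration of claims 8 and 11 only; the script format is hypothetical.
APP_SCRIPT = {
    "scene": "multi_device_office",
    "services": ["double_finger_tap", "screenshot", "picture_display"],
    "triggers": {
        # trigger event -> control logic: which service in the set to run next
        "double_finger_tap_event": lambda service_set: service_set["screenshot"],
    },
}

def on_event(event, app_script, service_set):
    control_logic = app_script["triggers"][event]   # target trigger fires its control logic
    return control_logic(service_set)               # control logic picks the target service

service_set = {name: f"<{name} service>" for name in APP_SCRIPT["services"]}
print(on_event("double_finger_tap_event", APP_SCRIPT, service_set))
```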
12. The control method according to claim 1, wherein the determining initial call information of the target service, and determining target call information according to the initial call information, comprises:
initiating a target call request for the target service, wherein the target call request comprises initial call information of the target service;
and calling a corresponding service agent according to the initial call information in the target call request, and determining the target call information corresponding to the initial call information through the service agent.
13. The control method according to claim 12, wherein the initial call information includes a service type and registration call information; the step of calling the corresponding service agent according to the initial call information in the target call request, and determining the target call information corresponding to the initial call information through the service agent comprises the following steps:
calling a service agent corresponding to the service type, and converting the registration call information into corresponding target call information according to a preset format.
14. The control method according to any one of claims 1 to 13, characterized in that the running the target service according to the target call information includes:
operating the target service according to the target call information under the condition that the target service is of a first service type;
and determining a host service corresponding to the target service under the condition that the target service is of a second service type, and operating the target service according to the target call information under the condition that the host service is in an operating state.
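The distinction drawn in claim 14 can be sketched as follows (illustrative only; the service-type labels, the "host" field and the runtime helpers are assumptions rather than names defined in this application):

```python
# Hypothetical sketch of claim 14: first-type services run directly, while
# second-type services run only when their host service is already running.
FIRST_TYPE, SECOND_TYPE = "first", "second"

class DemoRuntime:
    def __init__(self):
        self.running = {"camera"}          # pretend the host service is already up
    def is_running(self, service_id):
        return service_id in self.running
    def run(self, service_id, call_info):
        self.running.add(service_id)
        print("running", service_id, "with", call_info)

def run_target_service(service, target_call_info, runtime):
    if service["type"] == FIRST_TYPE:
        runtime.run(service["id"], target_call_info)   # first type: run directly
    elif runtime.is_running(service["host"]):          # second type: host must be running
        runtime.run(service["id"], target_call_info)

run_target_service({"id": "beauty_filter", "type": SECOND_TYPE, "host": "camera"},
                   {"screen_id": 0}, DemoRuntime())
```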
15. The control method according to any one of claims 1 to 13, characterized in that the initial call information includes registration call information and/or dynamic call information; the determining the initial call information of the target service, and determining the target call information according to the initial call information includes:
acquiring registration call information of the target service stored by the current equipment and/or acquiring dynamic call information generated by other services; the registration calling information is used for representing the attribute related to the target service, and the dynamic calling information is used for representing the input variable required by running the target service;
and determining target call information according to the registration call information and the dynamic call information.
16. The control method according to any one of claims 1 to 13, characterized in that the determining initial call information of the target service includes:
receiving initial call information corresponding to the target service, which is sent by a server;
the operation of the target service according to the target call information comprises the following steps:
under the condition that the target service is the localization service of the current equipment, acquiring a target service code set stored by the current equipment, and operating the target service code set according to the target calling information;
and under the condition that the target service is the non-localized service of the current equipment, acquiring a target service code set which is sent by the server and corresponds to the target service, and operating the target service code set according to the target calling information.
17. A control apparatus, characterized by comprising:
the scene perception module is used for determining an application scene at present and determining a service set corresponding to the application scene, wherein the application scene is a use scene at present of the electronic equipment;
the service management module is used for determining a target service corresponding to the trigger event in the service set under the condition that the trigger event is detected in the application scene; different application scenes correspond to different service sets, and the trigger event has an association relation with the service in the service sets; determining initial call information of the target service, determining target call information according to the initial call information, and converting the initial call information into target call information matched with the target service, wherein the target call information is matched with electronic equipment running the target service; and
the service running module is used for running the target service according to the target call information, detecting whether the user completes the first interactive operation so as to determine whether the user needs to resume the interrupted target service, and resuming the interrupted target service when it is detected that the user completes the first interactive operation and it is determined that the user needs to resume the interrupted target service.
18. Electronic equipment, characterized by comprising a processor, wherein the processor is used for: determining an application scene where the electronic equipment is currently located, and determining a service set corresponding to the application scene, wherein the application scene is a use scene where the electronic equipment is currently located; under the condition that a trigger event is detected in the application scene, determining a target service corresponding to the trigger event in the service set, wherein different application scenes correspond to different service sets, and the trigger event has an association relation with the service in the service sets; determining initial call information of the target service, determining target call information according to the initial call information, and converting the initial call information into target call information matched with the target service, wherein the target call information is matched with the electronic equipment running the target service; operating the target service according to the target call information; detecting whether a user completes a first interactive operation when the target service is interrupted, so as to determine whether the user needs to recover the interrupted target service; and restoring the interrupted target service when it is detected that the user completes the first interactive operation and it is determined that the user needs to restore the interrupted target service.
19. A non-transitory computer readable storage medium comprising a computer program which, when executed by a processor, causes the processor to perform the control method of any one of claims 1-16.
CN202210495300.6A 2022-05-07 2022-05-07 Control method and device, electronic equipment and computer readable storage medium Active CN115002274B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210495300.6A CN115002274B (en) 2022-05-07 2022-05-07 Control method and device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210495300.6A CN115002274B (en) 2022-05-07 2022-05-07 Control method and device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN115002274A CN115002274A (en) 2022-09-02
CN115002274B true CN115002274B (en) 2024-02-20

Family

ID=83026087

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210495300.6A Active CN115002274B (en) 2022-05-07 2022-05-07 Control method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN115002274B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974652A (en) * 2023-09-22 2023-10-31 星河视效科技(北京)有限公司 Intelligent interaction method, device, equipment and storage medium based on SAAS platform

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107133094A (en) * 2017-06-05 2017-09-05 努比亚技术有限公司 Application management method, mobile terminal and computer-readable recording medium
CN109067990A (en) * 2018-08-20 2018-12-21 麒麟合盛网络技术股份有限公司 A kind of application service execution method and device
CN109684047A (en) * 2018-08-21 2019-04-26 平安普惠企业管理有限公司 Event-handling method, device, equipment and computer storage medium
CN110427239A (en) * 2019-07-30 2019-11-08 维沃移动通信有限公司 A kind of event-handling method, terminal device and computer readable storage medium
CN111488444A (en) * 2020-04-13 2020-08-04 深圳追一科技有限公司 Dialogue method and device based on scene switching, electronic equipment and storage medium
CN113971049A (en) * 2020-07-23 2022-01-25 海信视像科技股份有限公司 Background service management method and display device
CN112199623A (en) * 2020-09-29 2021-01-08 上海博泰悦臻电子设备制造有限公司 Script execution method and device, electronic equipment and storage medium
CN113961309A (en) * 2021-10-21 2022-01-21 上海波顿诺华智能科技有限公司 Information processing method, information processing device, electronic equipment and computer storage medium
CN114265641A (en) * 2021-12-14 2022-04-01 Oppo广东移动通信有限公司 Control method, electronic device, and computer-readable storage medium
CN115150507A (en) * 2022-05-07 2022-10-04 Oppo广东移动通信有限公司 Service scheduling method and system, electronic device and computer readable storage medium

Also Published As

Publication number Publication date
CN115002274A (en) 2022-09-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant