CN110647231A - Data processing method, device and machine readable medium - Google Patents


Info

Publication number
CN110647231A
Authority
CN
China
Prior art keywords
scene, equipment, met, data, condition
Legal status
Pending
Application number
CN201810580632.8A
Other languages
Chinese (zh)
Inventor
张培阳
刘欣
吴兴昊
张亚楠
邵茂材
王迅
Current Assignee
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201810580632.8A
Publication of CN110647231A


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/3287Power saving characterised by the action undertaken by switching off individual functional units in the computer system

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)

Abstract

Embodiments of the present application provide a data processing method, an apparatus, a device, and a machine-readable medium, where the method includes: judging, according to event information of a device, whether a trigger condition is met; if the trigger condition is met, judging whether a scene detection condition is met; and if the scene detection condition is met, detecting whether the device is in the scene; where the event includes at least one of the following events: a power event, a network event, a traffic event, a displacement event, and a task reception event, the task corresponding to a scene. Embodiments of the present application can reduce the resources consumed by the device.

Description

Data processing method, device and machine readable medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a data processing method, a data processing apparatus, a device, and a machine-readable medium.
Background
With the development of communication technology, large internet enterprises continuously introduce various mobile internet services, and LBS (Location Based Services) is receiving increasing attention. LBS is a value-added service that obtains the position information of a terminal by means of positioning and, with the support of a geographic information system platform, provides corresponding services to the user. For example, when it is detected that a user enters a certain shop, promotion information, coupons, and the like of the shop are provided to the user.
At present, the process by which a terminal provides a service specifically includes: periodically collecting data from a sensor, such as a GPS (Global Positioning System) sensor; judging, according to the sensor data, whether a service providing condition is met; providing the service if it is met; and returning to the periodic collection otherwise.
In practical applications, taking GPS as an example of the sensor, the process of acquiring GPS data generally includes: receiving signals from a plurality of GPS satellites, calculating the distances between the GPS satellites and the terminal from those signals, and determining the position of the terminal from the distances. This acquisition process usually consumes considerable resources, such as CPU and traffic resources. Moreover, to keep the service real-time, the collection period is usually on the order of seconds, so the collection frequency is high, further increasing the terminal's resource consumption.
Disclosure of Invention
The technical problem to be solved by the embodiments of the present application is to provide a data processing method that can reduce the resources consumed by devices.
Correspondingly, the embodiments of the present application also provide a data processing apparatus, a device, and a machine-readable medium, so as to ensure the implementation and application of the above method.
In order to solve the above problem, an embodiment of the present application discloses a data processing method, including:
judging whether the trigger condition is met or not according to the event information of the equipment;
if the trigger condition is met, judging whether the scene detection condition is met;
if the scene detection condition is met, detecting whether the equipment is in the scene;
wherein the event comprises at least one of the following events:
a power event, a network event, a traffic event, a displacement event, and a task reception event; the task corresponds to a scene.
On the other hand, the embodiment of the present application further discloses a data processing apparatus, including:
the first judgment module is used for judging whether the trigger condition is met or not according to the event information of the equipment;
the second judgment module is used for judging whether the scene detection condition is met or not under the condition that the trigger condition is met; and
the scene detection module is used for detecting whether the equipment is in the scene or not under the condition of meeting the scene detection condition;
wherein the event comprises at least one of the following events:
a power event, a network event, a traffic event, a displacement event, and a task reception event; the task corresponds to a scene.
In another aspect, an embodiment of the present application further discloses an apparatus, including:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform one or more of the methods described above.
In yet another aspect, embodiments of the present application disclose one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform one or more of the methods described above.
Compared with the prior art, the embodiment of the application has the following advantages:
the method comprises the steps of detecting whether equipment is in a scene or not under the condition that the equipment accords with scene detection conditions; since it may not be detected whether the device is in the scene in the case that the scene detection condition is not met, resources consumed to detect whether the device is in the scene may be reduced.
In addition, whether the judgment of the embodiment of the application accords with the scene detection condition or not can correspond to the trigger condition, and whether the judgment of the embodiment of the application accords with the trigger condition or not is judged according to the event information of the equipment, so that the trigger condition can be intelligently adjusted according to the event information of the equipment, the resource consumed by the equipment can be reduced, and the use experience of a user can be improved. For example, the triggering condition may be intelligently adjusted according to the electric quantity of the device, specifically, if the electric quantity of the device exceeds an electric quantity threshold, the triggering condition is met, or if the electric quantity of the device does not exceed the electric quantity threshold and a charging event is detected, the triggering condition is met; or if the electric quantity of the equipment does not exceed the electric quantity threshold value and the charging event is not detected, the triggering condition is not met; therefore, for devices such as mobile phones or wearable devices which are sensitive to electric quantity and power consumption, the trigger condition can be intelligently adjusted, and therefore the electric quantity of the devices can be saved.
Drawings
FIG. 1 is an illustration of an application environment for a data processing method of the present application;
FIG. 2 is a flow chart of steps of a first embodiment of a data processing method of the present application;
FIG. 3 is a schematic comparison of scene detection in the conventional technology and in an embodiment of the present application;
FIG. 4 is a flowchart illustrating steps of a second embodiment of a data processing method according to the present application;
FIG. 5 is a flowchart illustrating the steps of a third embodiment of a data processing method according to the present application;
FIG. 6 is a block diagram of a data processing system according to an embodiment of the present application;
FIG. 7 is an interaction diagram of a data processing system according to an embodiment of the present application;
FIG. 8 is a block diagram of an embodiment of a data processing apparatus of the present application; and
FIG. 9 is a schematic structural diagram of a device according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments that can be derived from the embodiments given herein by a person of ordinary skill in the art are intended to be within the scope of the present disclosure.
While the concepts of the present application are susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and are described herein in detail. It should be understood, however, that this description is not intended to limit the application to the particular forms disclosed; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the application.
Reference in the specification to "one embodiment," "an embodiment," "a particular embodiment," or the like means that the embodiment described may include a particular feature, structure, or characteristic, but not every embodiment necessarily includes it. Moreover, such phrases do not necessarily refer to the same embodiment. Further, where a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments, whether or not explicitly described. In addition, it should be understood that items in a list of the form "at least one of A, B, and C" may mean: (A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C). Likewise, items listed in the form "at least one of A, B, or C" may mean: (A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C).
In some cases, the disclosed embodiments may be implemented as hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be executed by one or more processors. A machine-readable storage medium may be implemented as a storage device, mechanism, or other physical structure (e.g., a volatile or non-volatile memory, a media disc, or another physical storage device) for storing or transmitting information in a form readable by a machine.
In the drawings, some structural or methodical features may be shown in a particular arrangement and/or order. However, such a specific arrangement and/or order may not be required. Rather, in some embodiments, such features may be arranged in a manner and/or order different from that shown in the figures. Moreover, the inclusion of a structural or methodical feature in a particular figure does not imply that the feature is required in all embodiments; in some embodiments, it may not be included, or it may be combined with other features.
The embodiment of the application provides a data processing scheme which can judge whether a trigger condition is met or not according to event information of equipment; if the trigger condition is met, judging whether the scene detection condition is met; and if the scene detection condition is met, detecting whether the equipment is in the scene.
In the embodiments of the present application, "scene" is originally a term from the film and television industry, referring to a picture of life composed of certain people and their activities at a certain time and place. Extended to the field of communications technology, it is a product or application that a merchant proposes to meet the specific needs of a class of users. For example, a user comes to the Jingdong shopping mall and wants to buy an Apple computer; or a user opens WeChat to catch up on the gossip in the friend circle; or a user comes to a gas station and needs to refuel; or a user comes to a cafe and needs to relax; or a user comes to a scenic spot and needs a tour guide. These are all scenes.
Context data corresponding to a scene may refer to all scene data and interaction data (e.g., date, time of day, location, behavior, action, form, and spatio-temporal elements) collected during an instance of device usage with respect to an object or user. Objects that a device may interact with include, but are not limited to: other user devices (e.g., cell phones), peripheral devices such as Bluetooth headsets and keyboards, server devices, or entities within the immediate environment, such as landmarks, machines, and vehicles.
Alternatively, a scene may be defined by a scene model. The scene model may include any one or a combination of the fields to which environment data corresponds, such as a time field, a location field, and a device status field (e.g., the speed of a vehicle). It is understood that any scene required by a merchant or a person skilled in the art may be set according to actual application requirements; the embodiments of the present application do not limit the specific scene or scene model.
The embodiments of the present application may provide the following examples of scenes:
Example 1
In example 1, the scenario may include: a POI (Point of interest) scene, where the environment data corresponding to the scene may include: location data corresponding to the points of interest.
Examples of POIs may include: gas stations, parking lots, shopping malls, stores, etc. The POI scene may be used to provide a service corresponding to the POI to the user in case the user arrives at the POI. The service may be: guide or preference information corresponding to the POI, and the like.
The location data corresponding to a point of interest may include: longitude and latitude data, base station data, WIFI data, and the like, where the base station data may be the base stations whose coverage includes the point of interest, and the WIFI data may be the WIFI networks used at the point of interest.
Example 2
In example 2, the scenario may include: a device state scenario, where the context data corresponding to the scenario may include: preset status data of the device.
Taking a vehicle as an example, the device state scenario may include: vehicle overspeed, low vehicle fuel, and the like. Taking devices such as air conditioners and purifiers as examples, the device state scenario may include: a device parameter within a preset range, and the like. A device state scene may be used to send a reminder to the user when the device is in a certain state, so as to improve the user experience. For example, when the vehicle is speeding, a warning is given to the user to avoid an accident or being photographed by a speed camera. For another example, when the vehicle is low on fuel, a refueling reminder is sent to the user. For another example, when the air parameters of the purifier indicate poor air quality, a mode switching reminder is sent to the user. Similarly, a maintenance reminder for the vehicle, and the like, may be sent to the user in a specific scene.
The preset state data of the device may refer to state data of the device in a certain aspect, such as for a vehicle overspeed scene, the preset state data may include: an upper speed limit of the vehicle; for another example, for a low vehicle fuel scenario, the preset status data may include: a lower fuel limit of the vehicle; for another example, for a home device status scenario, the preset status data may include: status data of the home devices, and the like. It can be understood that the preset state data of the device is not limited in the embodiments of the present application.
Example 3
In example 3, the scenario may include: mode scenarios, the environmental data may include: and characteristic data corresponding to the mode.
Taking a vehicle as an example, the mode scenario may include: a route switching scenario, e.g., switching the navigation route when the route selected by the user becomes congested. Taking devices such as air conditioners and purifiers as examples, the mode scenario may include: a mode switching scenario, e.g., when the air parameters of the purifier indicate poor air quality, switching the purifier from a first operation mode to a second operation mode, where the purification rate of the first operation mode is lower than that of the second operation mode.
Taking a route switching scenario as an example, the feature data corresponding to the mode may include: historical routes corresponding to the workday navigation mode, historical routes corresponding to the holiday navigation mode and the like. Taking a mode switching scenario as an example, the feature data corresponding to the mode may include: a range of operating parameters corresponding to an operating mode.
Examples 1 to 3 describe scenes and their corresponding environment data in detail. It can be understood that a person skilled in the art may adopt any one or a combination of examples 1 to 3, or other scenes, according to practical application requirements; the embodiments of the present application do not limit the specific scenes or the environment data corresponding to them.
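As a concrete aid, the following is a minimal sketch of a scene model covering the example types above; all class and field names are illustrative assumptions rather than anything prescribed by the present application.

```java
import java.util.Optional;

/**
 * A minimal sketch of a scene model: any combination of a time field, a
 * location field, and a device-status field defines a scene. All names
 * here are illustrative assumptions.
 */
public class SceneModel {
    /** Location field, e.g., the coordinates of a point of interest. */
    public record Location(double latitude, double longitude) {}

    private final String sceneId;                   // e.g., a POI identifier
    private final Optional<String> timeField;       // e.g., "weekday" / "holiday"
    private final Optional<Location> locationField; // example 1: POI scene
    private final Optional<Double> speedUpperLimit; // example 2: device state (km/h)

    public SceneModel(String sceneId,
                      Optional<String> timeField,
                      Optional<Location> locationField,
                      Optional<Double> speedUpperLimit) {
        this.sceneId = sceneId;
        this.timeField = timeField;
        this.locationField = locationField;
        this.speedUpperLimit = speedUpperLimit;
    }

    public String sceneId()                   { return sceneId; }
    public Optional<String> timeField()       { return timeField; }
    public Optional<Location> locationField() { return locationField; }
    public Optional<Double> speedUpperLimit() { return speedUpperLimit; }
}
```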
The conventional technology generally needs to periodically detect whether the device is in a scene, and the process of detecting whether the device is in the scene may include: data of a sensor such as a Global Positioning System (GPS) is periodically collected, and whether a service provision condition is satisfied is determined based on the data of the sensor. However, the above-mentioned acquisition process of GPS data usually requires a lot of resources.
The embodiments of the present application introduce the concept of a scene detection condition: whether the device is in the scene is detected only when the scene detection condition is met. Since the detection may be skipped when the scene detection condition is not met, the resources consumed by detecting whether the device is in the scene can be reduced.
In addition, the judgment of whether the scene detection condition is met may itself depend on a trigger condition, and whether the trigger condition is met is judged according to event information of the device; the trigger condition can therefore be adjusted intelligently according to the event information, reducing the power consumption of the device and improving the user experience.
According to the embodiments of the present application, when the device is in a scene, a service corresponding to the scene can be provided to the user. For example, when the user comes to a gas station, the user is provided with refueling offer information (such as a coupon or voucher) corresponding to the gas station; or, when the user comes to a cafe, the user is provided with offer information (e.g., today's specials) corresponding to the cafe; or, when the user comes to a scenic spot, the user is provided with tour guide information corresponding to the spot, or information about supermarkets within it, and the like. It can be understood that a person skilled in the art may determine the service corresponding to a scene according to actual application requirements; the embodiments of the present application do not limit the specific service corresponding to a scene.
The data processing method provided by the embodiments of the present application can be applied, for example, to the application environment shown in FIG. 1.
As shown in fig. 1, the client 100 and the server 200 are located in a wired or wireless network, through which the client 100 and the server 200 perform data interaction.
Optionally, the client may run on a device; for example, the client may be an APP running on the device, such as an e-commerce APP, an instant messaging APP, an input method APP, or an APP of the operating system itself. The embodiments of the present application do not limit the specific APP corresponding to the client. Optionally, the above devices may specifically include, but are not limited to: smart phones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, vehicle-mounted computers, desktop computers, set-top boxes, smart televisions, wearable devices, and the like. It is to be understood that the embodiments of the present application do not limit the specific devices.
Method embodiment one
Referring to fig. 2, a flowchart illustrating steps of a first embodiment of a data processing method according to the present application is shown, which may specifically include the following steps:
step 201, judging whether a trigger condition is met according to event information of equipment;
step 202, if the trigger condition is met, judging whether the scene detection condition is met;
and 203, if the scene detection condition is met, detecting whether the equipment is in the scene.
The method embodiment shown in fig. 2 may be executed by a client and/or a server, and it is understood that the embodiment of the present application does not impose a limitation on a specific execution subject of the method embodiment shown in fig. 2.
In step 201, the event may refer to an external and/or internal event of the device during operation. Examples of external events may include: I/O (Input/Output) events, etc. Examples of internal events may include: an operation event of the application program and the like, such as a trigger event and the like for a certain control in a page, an input event and the like for certain information. It will be appreciated that the user may determine the events in the configuration file according to actual data collection requirements.
Optionally, an event may correspond to a trigger object and/or a triggered object. The object that triggers an event is referred to as the trigger object; the object that receives the event is referred to as the triggered object. For example, if an event is that a user clicks a control in a page or inputs content into a text box, the trigger object corresponding to the event is the user, and the triggered object is the control or the text box. For another example, in an IOT (Internet of Things) service scenario, an event may be that a control application corresponding to a control device or a sensor device sends a control instruction to an Internet of Things device, or receives data sent by an Internet of Things device; the trigger object corresponding to such an event may be the control application, and the triggered object may be the Internet of Things device.
In an alternative embodiment of the present application, the event may include, but is not limited to, at least one of the following events: a power event, a network event, a traffic event, a displacement event, and a task reception event; the task corresponds to a scene.
The power event may refer to an event related to the power of the device battery, such as the power of the device battery (referred to as the power of the device for short) reaching a certain specific value, or whether the device is in a charging state.
A network event may refer to an event related to a device network, such as whether a device is connected to the network, or the type of network to which the device is connected.
A traffic event may refer to an event related to the data traffic the device generates when uploading and downloading data through the operator's wireless network, and may include: the traffic consumed by the device, the upper limit of traffic available to the device, and the like.
A displacement event may refer to a real-time displacement produced by a device.
A task reception event may refer to a task received by the device, in particular a task received from a server.
The embodiment of the application can provide the following technical scheme for judging whether the trigger condition is met or not according to the event information of the equipment:
technical solution A1
In technical solution A1, the judging in step 201 of whether the trigger condition is met according to the event information of the device specifically includes:
if the electric quantity of the equipment exceeds the electric quantity threshold value, the triggering condition is met; or
If the electric quantity of the equipment does not exceed the electric quantity threshold value and a charging event is detected, the triggering condition is met; or
If the electric quantity of the equipment does not exceed the electric quantity threshold value and the charging event is not detected, the triggering condition is not met.
The power threshold may be determined by a person skilled in the art according to actual application requirements; for example, it may be 30%, 50%, or the like. The trigger condition is thus adjusted intelligently according to the battery level of the device: if the battery level exceeds the power threshold, the trigger condition is met; if the battery level does not exceed the power threshold but a charging event is detected, the trigger condition is met; and if the battery level does not exceed the power threshold and no charging event is detected, the trigger condition is not met. For devices sensitive to battery level and power consumption, such as mobile phones or wearable devices, this intelligent adjustment of the trigger condition saves battery power.
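As an illustration of technical solution A1, the following is a minimal sketch in Java (the application itself names JAVA code as one possible code type); the class and method names, and the representation of battery level as a fraction, are assumptions made for illustration.

```java
/**
 * Sketch of technical solution A1: the trigger condition is met when the
 * battery level exceeds a threshold, or when it does not but a charging
 * event has been detected. All names here are illustrative assumptions.
 */
public class PowerTrigger {
    private final double powerThreshold; // e.g., 0.30 for a 30% threshold

    public PowerTrigger(double powerThreshold) {
        this.powerThreshold = powerThreshold;
    }

    /**
     * @param batteryLevel current battery level in [0, 1]
     * @param charging     whether a charging event has been detected
     * @return true if the trigger condition is met
     */
    public boolean isTriggered(double batteryLevel, boolean charging) {
        if (batteryLevel > powerThreshold) {
            return true;  // battery level exceeds the threshold: condition met
        }
        return charging;  // below the threshold: met only while charging
    }
}
```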
Technical solution A2
In technical solution A2, the judging in step 201 of whether the trigger condition is met according to the event information of the device specifically includes:
if the flow of the equipment exceeds a flow threshold, the triggering condition is met; or
If the flow of the equipment does not exceed the flow threshold and the equipment is accessed to the local area network, the triggering condition is met; or
If the flow of the equipment does not exceed the flow threshold and the equipment is not accessed to the local area network, the triggering condition is not met.
The traffic threshold may be determined by a person skilled in the art according to actual application requirements; for example, it may be 50 MB, 20 MB, or the like. The trigger condition is thus adjusted intelligently according to the traffic of the device: if the traffic exceeds the traffic threshold, the trigger condition is met; if the traffic does not exceed the traffic threshold but the device has accessed a local area network, the trigger condition is met; and if the traffic does not exceed the traffic threshold and the device has not accessed a local area network, the trigger condition is not met. For devices sensitive to traffic, such as mobile phones or wearable devices, this intelligent adjustment of the trigger condition saves traffic.
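Technical solution A2 has the same shape as A1, keyed on traffic instead of battery level; a sketch under the same naming assumptions:

```java
/**
 * Sketch of technical solution A2: below the traffic threshold, the device
 * must be on a local area network for the trigger condition to hold.
 * Names are illustrative assumptions.
 */
public class TrafficTrigger {
    private final long trafficThresholdBytes; // e.g., 50 MB

    public TrafficTrigger(long trafficThresholdBytes) {
        this.trafficThresholdBytes = trafficThresholdBytes;
    }

    public boolean isTriggered(long trafficBytes, boolean onLocalAreaNetwork) {
        if (trafficBytes > trafficThresholdBytes) {
            return true;            // traffic exceeds the threshold: condition met
        }
        return onLocalAreaNetwork;  // otherwise: met only on a LAN (e.g., WIFI)
    }
}
```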
Technical solution A3
In technical solution A3, the judging in step 201 of whether the trigger condition is met according to the event information of the device specifically includes:
if the displacement of the equipment in the preset time period does not exceed the position threshold, the equipment does not accord with the triggering condition; or
And if the displacement of the equipment in the preset time period exceeds the position threshold, judging whether the trigger condition is met according to the preset time interval.
The preset time period and the position threshold may be determined by a person skilled in the art according to actual application requirements; for example, the length of the preset time period may be 1 s (second), 2 s, 3 s, 4 s, or 5 s, and the displacement of the device within the preset time period can represent the speed of the device. Technical solution A3 is applicable to location-related scenes; in this case, whether the trigger condition is met may be judged according to the displacement of the device within the preset time period. Specifically, if the displacement does not exceed the position threshold, e.g., the device is stationary, the trigger condition may not be met; if the displacement exceeds the position threshold, whether the trigger condition is met is judged according to a preset time interval. Optionally, the larger the displacement within the preset time period, the smaller the preset time interval; conversely, the smaller the displacement, the larger the preset time interval.
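A sketch of technical solution A3 follows; the interval formula linking larger displacement to a smaller preset time interval is an assumption, since the application only states the monotonic relationship.

```java
/**
 * Sketch of technical solution A3: a stationary device never triggers;
 * a moving device triggers at an interval that shrinks as it moves faster.
 * Names and the interval formula are illustrative assumptions.
 */
public class DisplacementTrigger {
    private final double positionThresholdMeters; // "position threshold"
    private final long   windowMillis;            // preset time period

    public DisplacementTrigger(double positionThresholdMeters, long windowMillis) {
        this.positionThresholdMeters = positionThresholdMeters;
        this.windowMillis = windowMillis;
    }

    public boolean isTriggered(double displacementMeters, long millisSinceLastCheck) {
        if (displacementMeters <= positionThresholdMeters) {
            return false; // device is effectively stationary: condition not met
        }
        return millisSinceLastCheck >= presetIntervalMillis(displacementMeters);
    }

    /** Larger displacement (faster movement) yields a smaller preset interval. */
    private long presetIntervalMillis(double displacementMeters) {
        double speed = displacementMeters / (windowMillis / 1000.0); // m/s
        long interval = (long) (60_000 / Math.max(speed, 1.0));      // assumed formula
        return Math.max(interval, 1_000);                            // floor at 1 s
    }
}
```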
Technical solution A4
In technical solution A4, the judging in step 201 of whether the trigger condition is met according to the event information of the device specifically includes: if a task reception event is detected, the trigger condition is met.
In the embodiments of the present application, a task may be used to trigger the scene detection, and the task may be determined by a person skilled in the art or a service operator. The information of the task may include: information of the scene, such as an identification of the scene. As an example, a task may be associated with the store "Starbucks (western music town)", and the information of the task may include: information of the store, such as its name and its position data. The client may store tasks issued by the server in a task list, and the data structure corresponding to the task list may include, but is not limited to: a queue, an array, and the like.
It should be noted that the above technical solutions A1 to A4 are optional embodiments for judging whether the trigger condition is met. In practice, a person skilled in the art may determine the specific judging process according to actual application requirements, and the trigger condition may be configured by the server or the user, for example in the configuration information of a task; the embodiments of the present application do not limit the specific trigger condition.
In step 202, the scene detection condition may be used as a trigger condition for detecting whether the device is in the scene, and a person skilled in the art may determine the scene detection condition according to the actual application requirement.
The embodiments of the present application provide the following judgment schemes for judging whether the scene detection condition is met:
judgment scheme 1,
In judgment scheme 1, the scene may include: a point-of-interest scene; the judging in step 202 of whether the scene detection condition is met may include: judging whether the device is within the coverage of the surrounding network corresponding to the point of interest. The device in judgment scheme 1 may be a mobile phone, a tablet computer, a vehicle-mounted device, or the like.
In this embodiment of the application, optionally, the coverage area of the surrounding network may be matched with the surrounding area corresponding to the interest point. The surrounding region may refer to the region of the ring around the point of interest. Optionally, the distance between the surrounding area and the point of interest may not exceed a distance threshold, examples of which may include: 50 meters, 100 meters, etc., it is understood that the embodiments of the present application do not impose any limitation on the specific distance between the surrounding area and the point of interest.
In an alternative embodiment of the present application, the frequency of use or frequency of occurrence of the surrounding network may satisfy a frequency condition. In practical applications, statistics may be collected on the frequency of use or occurrence of the networks surrounding the point of interest, where the frequency condition may include: the frequency of use or occurrence exceeds a first frequency threshold; or the surrounding networks, sorted by frequency of use or occurrence, rank within the top N (N being a natural number); and so on. In this way, high-frequency surrounding networks are used for judging the scene detection condition.
In an optional embodiment of the present application, the types of the surrounding network may include: mobile networks and/or WIFI (Wireless Fidelity).
In an application example of the application, base stations and/or WIFI detected by devices at high frequencies of about 50 meters around an interest point can be mined as a surrounding network corresponding to the interest point; therefore, whether the equipment is in the coverage range of the surrounding network corresponding to the interest point can be judged, if yes, the scene detection condition can be considered to be met, and if not, the scene detection condition can be considered to be not met.
It should be noted that the network information (base station and/or WIFI) corresponding to the device is generally obtained by the operating system of the device; specifically, the operating system obtains the network information and updates a cache according to a preset period, where the cache stores the network information corresponding to the device. Detecting the GPS data corresponding to the device is high-frequency, high-power detection, while detecting the network information is low-frequency, low-power detection: the detection period of GPS data is on the order of seconds, whereas the detection of network information is driven by changes in the network information. Since whether the device is in the scene is detected only when the device is within the coverage of the surrounding network corresponding to the point of interest, and that detection may be skipped otherwise, resources such as CPU and traffic consumed by scene detection can be reduced.
Referring to FIG. 3, a schematic comparison of scene detection in the conventional technology and in the embodiment of the present application is shown. The scene detection process in the conventional technology specifically includes: detecting whether the device is in the scene area, which is high-frequency, high-power detection. The scene detection process of the embodiment of the present application specifically includes: detecting, through a first detection, whether the device is in the area near the scene; if so, detecting, through a second detection, whether the device is in the scene area; otherwise, executing the first detection cyclically. Since the first detection is low-frequency, low-power detection, resources such as CPU and traffic consumed by detecting whether the device is in the scene area can be reduced. The scene area may refer to the area corresponding to the scene, specifically the area where the point of interest is located; the area near the scene may refer to an area whose distance from the scene area does not exceed a distance threshold.
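A minimal sketch of the first, low-power detection of judgment scheme 1: the device's cached network identifiers are matched against the set mined around the point of interest. The identifier representation and the mining step are assumptions.

```java
import java.util.Set;

/**
 * Sketch of judgment scheme 1: the scene detection condition is met when a
 * currently visible base-station or WIFI identifier appears in the set mined
 * around the point of interest (e.g., within roughly 50 meters).
 */
public class SurroundingNetworkCheck {
    private final Set<String> minedNetworkIds; // cell IDs / WIFI BSSIDs mined around the POI

    public SurroundingNetworkCheck(Set<String> minedNetworkIds) {
        this.minedNetworkIds = minedNetworkIds;
    }

    /** Low-frequency, low-power check against cached network information. */
    public boolean meetsSceneDetectionCondition(Set<String> visibleNetworkIds) {
        for (String id : visibleNetworkIds) {
            if (minedNetworkIds.contains(id)) {
                return true; // within the POI's surrounding network coverage
            }
        }
        return false;        // skip the high-power GPS scene detection
    }
}
```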
Judgment scheme 2
In judgment scheme 2, the scene may include: a device state scene; the device may include: a vehicle; and the judging in step 202 of whether the scene detection condition is met may specifically include: judging whether a speed measuring device exists on the road where the vehicle is located.
The device state scene may specifically include: a vehicle overspeed scene. A speed measuring device is used to measure the speed of the vehicle. Current vehicle speed measurement modes may specifically include: radar speed measurement, average-speed (interval) measurement, buried-coil speed measurement, and the like; the speed measuring devices corresponding to these modes are, respectively: radar, cameras at road checkpoints, buried induction coils, and the like. It can be understood that the embodiments of the present application do not limit the specific speed measuring device.
The scene detection process of the conventional technology is specifically: while the vehicle travels on any road, periodically calculating its speed and judging whether the speed exceeds a speed threshold (e.g., 120 km/h), which consumes considerable resources. The scene detection process of the embodiment of the present application may specifically include: judging whether a speed measuring device exists on the road where the vehicle is located, and only if so, periodically calculating the speed of the vehicle and judging whether it exceeds the speed threshold. Since overspeed detection is performed only on the portion of roads equipped with speed measuring devices, the resources consumed by the device can be reduced compared with whole-course detection.
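A sketch of judgment scheme 2 under assumed names; the 120 km/h limit comes from the example above, and the road-ID set is assumed to be mined from map data.

```java
import java.util.Set;

/**
 * Sketch of judgment scheme 2: overspeed detection only runs on road
 * segments known to carry a speed measuring device.
 */
public class OverspeedCheck {
    private final Set<String> roadsWithSpeedDevices; // assumed: mined from map data
    private final double speedThresholdKmh = 120.0;  // example threshold

    public OverspeedCheck(Set<String> roadsWithSpeedDevices) {
        this.roadsWithSpeedDevices = roadsWithSpeedDevices;
    }

    /** Scene detection condition: a speed measuring device exists on this road. */
    public boolean meetsSceneDetectionCondition(String roadId) {
        return roadsWithSpeedDevices.contains(roadId);
    }

    /** Scene detection proper: the periodically computed speed exceeds the threshold. */
    public boolean inOverspeedScene(double currentSpeedKmh) {
        return currentSpeedKmh > speedThresholdKmh;
    }
}
```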
Judgment scheme 3
In judgment scheme 3, the scene may include: a device state scene; the device may include: a vehicle; and the judging in step 202 of whether the scene detection condition is met may specifically include: judging whether the distance between the position of the vehicle and the position of a preset residence is smaller than a distance threshold.
The device state scene may specifically include: a low-vehicle-fuel scene. The scene detection process of the conventional technology is specifically: detecting the fuel quantity of the vehicle in real time and judging whether it is below a fuel threshold. The scene detection process of the embodiment of the present application specifically includes: judging whether the distance between the position of the vehicle and the position of the preset residence is smaller than the distance threshold; if so, the user is about to reach the preset residence, so the fuel quantity of the vehicle can be detected and compared with the fuel threshold, and a refueling reminder is sent to the user if it is below the threshold. The preset residence may be an address of the user; the embodiments of the present application do not limit the specific preset residence.
In the embodiments of the present application, the fuel threshold may be a preset value, or a fuel value determined according to a corresponding trip of the vehicle. The process of determining the fuel threshold according to a corresponding trip may specifically include: for a trip of the vehicle, determining the fuel value corresponding to the trip according to a first fuel reading before the trip starts and a second fuel reading after the trip ends. The trip may correspond to the device's time, such as a weekday trip (e.g., a commute to or from work) or a holiday trip, where a weekday trip is, for example, a trip between home and the workplace. In one example, the fuel value corresponding to a weekday trip may be determined from the vehicle's historical trips, and the fuel threshold may exceed that value. Of course, the embodiments of the present application do not limit the specific fuel threshold.
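Where the fuel threshold is derived from historical trips rather than preset, the derivation might look like the following sketch; the margin factor is an assumption, not part of the application.

```java
import java.util.List;

/**
 * Sketch of the trip-based fuel threshold from judgment scheme 3: the
 * threshold is derived from the fuel consumed on historical trips (first
 * reading before a trip starts minus second reading after it ends).
 */
public class FuelThreshold {
    /** One historical trip: fuel readings before it starts and after it ends. */
    public record Trip(double fuelBefore, double fuelAfter) {}

    /** Threshold = average fuel consumed per trip, scaled by a safety margin. */
    public static double fromTrips(List<Trip> trips, double marginFactor) {
        if (trips.isEmpty()) {
            throw new IllegalArgumentException("need at least one historical trip");
        }
        double total = 0;
        for (Trip t : trips) {
            total += t.fuelBefore() - t.fuelAfter(); // fuel consumed on this trip
        }
        return (total / trips.size()) * marginFactor; // e.g., 1.2 keeps a 20% reserve
    }
}
```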
Judgment scheme 4
In judgment scheme 4, the scene may include: a mode scene; the device may include: a vehicle; and the judging in step 202 of whether the scene detection condition is met may specifically include: judging whether the congestion probability of the road where the vehicle is located exceeds a probability threshold.
The mode scene may include: route switching. The scene detection process of the conventional technology is specifically: while the vehicle travels on any road, periodically (e.g., every 2 minutes) querying whether the road ahead is congested and, if so, switching the route; the periodic query consumes considerable resources. The embodiment of the present application can determine the congestion probability of a road through data mining; in this way, while the vehicle is traveling, it can be judged whether the congestion probability of the road where the vehicle is located exceeds the probability threshold, and only if so is it queried whether the road ahead is congested, the route being switched if it is. Since the check of whether to switch the route is performed only on roads with a high congestion probability, the resources of the device can be saved.
Judgment scheme 5
In judgment scheme 5, the scene may include: a mode scene; the device may include: a vehicle; and the judging in step 202 of whether the scene detection condition is met may specifically include: judging whether the accident frequency of the road where the vehicle is located exceeds a second frequency threshold.
The mode scenario may include: a security mode scenario. The operating modes of the vehicle may include: the normal mode and the safe mode, compared with the normal mode, the safe mode can provide more safety information or more functions, for example, dangerous places such as sharp bends, bridges, tunnels, crossings and the like can be prompted, and for example, functions disturbing driving such as music functions, video functions or call functions and the like can be turned off.
The scene detection process of the conventional technology is specifically: while the vehicle travels on any road, periodically (e.g., every 2 minutes) querying whether an accident has occurred ahead and, if so, entering the safety mode; the periodic query consumes considerable resources. The accident frequency of a road can be determined by data mining; in this way, while the vehicle is traveling, it can be judged whether the accident frequency of the road where the vehicle is located exceeds the second frequency threshold, and only if so is it queried whether an accident has occurred ahead, the safety mode being entered if it has. Since the check of whether to enter the safety mode is performed only on roads with a high accident frequency, the resources of the device can be saved.
The process of judging whether the scene detection condition is met has been described in detail through judgment schemes 1 to 5. It can be understood that a person skilled in the art may adopt any one or a combination of judgment schemes 1 to 5 according to actual application requirements; the embodiments of the present application do not limit the specific process of judging whether the scene detection condition is met.
Step 203 may detect whether the device is in the scene if the scene detection condition is met. For example, for a point of interest scene, the process of detecting whether the device is in the scene may include: the method comprises the steps of periodically acquiring GPS data corresponding to equipment, determining position data (such as longitude and latitude data) corresponding to the equipment according to the acquired GPS data, matching the position data corresponding to the equipment with the position data corresponding to a point of interest, and determining that the equipment is in a point of interest scene if matching is successful. Of course, the embodiment of the present application does not impose any limitation on the specific process of detecting whether the device is in the scene.
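A minimal sketch of the point-of-interest match just described, using a haversine great-circle distance; the 100 m match radius is an assumed value, since the application does not fix one.

```java
/**
 * Sketch of the POI match in step 203: once the scene detection condition is
 * met, periodically collected GPS fixes are compared with the POI's location
 * data. The match radius is an illustrative assumption.
 */
public class PoiSceneDetector {
    private static final double EARTH_RADIUS_M = 6_371_000.0;
    private static final double MATCH_RADIUS_M = 100.0; // assumed value

    /** Great-circle (haversine) distance between two lat/lon points, in meters. */
    static double distanceMeters(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    /** The device is "in the scene" when its GPS fix falls within the radius. */
    public static boolean inPoiScene(double devLat, double devLon,
                                     double poiLat, double poiLon) {
        return distanceMeters(devLat, devLon, poiLat, poiLon) <= MATCH_RADIUS_M;
    }
}
```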
In an optional embodiment of the present application, the process in step 203 of detecting, when the scene detection condition is met, whether the device is in the scene may specifically include: if the scene corresponding to a task meets the scene detection condition, adding the task to a scheduling list; acquiring a target task from the scheduling list; and detecting whether the device is in the scene corresponding to the target task.
In the embodiments of the present application, a task may be used to trigger the scene detection, and the task may be determined by a person skilled in the art or a service operator. The information of the task may include: information of the scene, such as an identification of the scene. As an example, a task may be associated with the store "Starbucks (western music town)", and the information of the task may include: information of the store, such as its name and its position data. The client may store tasks issued by the server in a task list, and the data structure corresponding to the task list may include, but is not limited to: a queue, an array, and the like.
In summary, the data processing method of the embodiments of the present application detects whether the device is in the scene only when the scene detection condition is met; since the detection may be skipped when the scene detection condition is not met, the resources consumed by detecting whether the device is in the scene can be reduced.
In addition, the judgment of whether the scene detection condition is met may itself depend on a trigger condition, and whether the trigger condition is met is judged according to event information of the device; the trigger condition can therefore be adjusted intelligently according to the event information, reducing the resources consumed by the device and improving the user experience.
Method embodiment two
Referring to fig. 4, a flowchart illustrating steps of a second embodiment of the data processing method of the present application is shown, which may specifically include the following steps:
step 401, judging whether a trigger condition is met according to event information of the equipment;
step 402, if the trigger condition is met, judging whether the scene detection condition is met;
step 403, if a scene detection condition is met, determining a scene detection code for detecting the scene and environment data corresponding to the scene detection code;
step 404, executing the scene detection code; and detecting whether the equipment is in the scene or not according to the environment data and the equipment data in the execution process of the scene detection code.
Compared with the first method embodiment shown in FIG. 2, the embodiment shown in FIG. 4 refines, through steps 403 and 404, the process of detecting whether the device is in the scene.
In the embodiments of the present application, the scene detection code is used to perform scene detection, i.e., to detect whether the device is in a certain scene. The scene detection code is dynamic: it can be issued dynamically by the server to the client, so that the client can use it to perform scene detection and then provide the service corresponding to the scene according to the detection result. Since scene detection code can be added to and updated on the client, the implementation cycle of the service corresponding to a scene can be shortened, and the flexibility and extensibility of scene detection can be improved.
In the embodiment of the present application, the code may refer to a source file written by a programmer in a language supported by a development tool, and is a set of explicit rules for representing information in a discrete form by characters, symbols or signal symbols.
In one embodiment of the present application, the type of the scene detection code may include: interpreted code. Interpreted code may refer to code that is interpreted for execution. Examples of interpreted code may include: JavaScript (JS) script code, JAVA code, and the like. Interpreted code can be interpreted and executed by a parsing engine without being compiled, which enhances the convenience of adding and updating scene detection code and can shorten its implementation cycle.
In another embodiment of the present application, the type of the scene detection code may include: compiled code. Compiled code may be understood as comprising instructions that can be read and executed by one or more computing or processing modules during execution of the scene detection code; the terms "compiled code" and "instructions" are known to those skilled in the art. Compiled code is, for example, the code of an application, software program, or software system. The compiled code may be binary code; in actual application, code such as C++ may be compiled in advance into compiled code.
Interpreted code or compiled code is independent of a specific language, so the scene detection code has cross-language and cross-platform characteristics. The scene detection code of the embodiments of the present application can therefore be well decoupled from the platform, without attention to the platform's specific implementation, so that different platforms can perform scene detection based on the same scene detection code; this cross-platform characteristic facilitates porting of the scene detection.
In an optional embodiment of the present application, the process of determining the scene detection code and the environment data corresponding to the scene detection code in step 403 may specifically include: matching the information of the task with the information of the scene detection code; and determining a scene detection code and environment data corresponding to the scene detection code according to the matching result.
In the embodiments of the present application, the client may receive a task issued by the server, where the task may be used to trigger scene detection, and the task may be determined by a person skilled in the art or a service operator. The information of the task may include: information of the scene, such as an identification of the scene. As an example, a task may be associated with the store "Starbucks (western music town)", and the information of the task may include: information of the store, such as its name and its position data. The client may store tasks issued by the server in a task list, and the data structure corresponding to the task list may include, but is not limited to: a queue, an array, and the like.
The information of the scene detection code may include: information of a scene. Therefore, the information of the task and the information of the scene detection codes can be matched, and the scene detection codes which are successfully matched are used as the scene detection codes corresponding to the task, namely the scene detection codes which need to be executed.
In an embodiment of the present application, before matching the information of the task with the information of the scene detection code, the task in the task list may be scheduled by using a scheduling algorithm to obtain a target task, and then the information of the target task may be matched with the information of the scene detection code. The factors on which the scheduling algorithm depends may include, but are not limited to: the priority of the task, the resource quota of the task, etc.
Priority is a parameter that determines, when a plurality of tasks are pending, which task is processed first; a task with a higher priority is usually processed first. The resource quota may be used to control the resources occupied by a task and may be an upper limit on those resources, where the resources include, but are not limited to: traffic, computational load, power consumption, and the like. In practical application, if the resources actually occupied by a task exceed its resource quota, the task is not eligible for scheduling; conversely, if they do not exceed the quota, the task is eligible for scheduling.
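A sketch of the scheduling just described, combining the two factors; field names and the skip-on-exhausted-quota policy are illustrative assumptions.

```java
import java.util.Comparator;
import java.util.PriorityQueue;

/**
 * Sketch of task scheduling: tasks enter a scheduling list ordered by
 * priority, and a task is only eligible while its actual resource usage
 * stays within its resource quota.
 */
public class TaskScheduler {
    public static class Task {
        final String sceneId;
        final int priority;         // higher value = scheduled first
        final double resourceQuota; // upper limit on resources (e.g., traffic)
        double resourceUsed;        // resources actually consumed so far

        Task(String sceneId, int priority, double resourceQuota) {
            this.sceneId = sceneId;
            this.priority = priority;
            this.resourceQuota = resourceQuota;
        }
    }

    private final PriorityQueue<Task> schedulingList =
            new PriorityQueue<>(Comparator.comparingInt((Task t) -> t.priority).reversed());

    public void add(Task task) {
        schedulingList.add(task);
    }

    /** Returns the next target task, skipping tasks that exhausted their quota. */
    public Task nextTargetTask() {
        while (!schedulingList.isEmpty()) {
            Task t = schedulingList.poll();
            if (t.resourceUsed <= t.resourceQuota) {
                return t; // eligible: within quota and highest priority
            }
        }
        return null;      // no schedulable task
    }
}
```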
In step 404, the scene detection code determined in step 403 is executed to obtain corresponding execution data.
According to one embodiment, the scene detection code may be interpreted code, and the scene detection code may be executed using an interpretation engine. Examples of interpretation engines include a JavaScript engine, which may be used to execute JS script code.
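A purely illustrative sketch of the interpreted-code path follows, assuming the issued scene detection code arrives as the body of a JavaScript function taking (envData, deviceData); a real interpretation engine would add sandboxing and resource limits.

```typescript
// Illustrative only: run issued interpreted code with the JS runtime's
// own evaluation facilities; a production engine would sandbox this.
type ExecData = { inScene: boolean; basis?: unknown; error?: string };

function runInterpreted(codeBody: string, envData: unknown, deviceData: unknown): ExecData {
  try {
    const detect = new Function("envData", "deviceData", codeBody);
    return detect(envData, deviceData) as ExecData;
  } catch (e) {
    // execution error data: produced when running the code fails
    return { inScene: false, error: String(e) };
  }
}

// Hypothetical issued code body: compare a preset speed limit (environment
// data) against the device's real-time speed (device data).
const codeBody = "return { inScene: deviceData.speed > envData.speedLimit };";
console.log(runInterpreted(codeBody, { speedLimit: 120 }, { speed: 130 })); // { inScene: true }
```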
According to another embodiment, the scene detection code may be compiled code, and the compiled code may be carried in a file with the .so (shared object) suffix; the client 100 may then directly execute the .so file.
In the execution process of the scene detection code, whether the device is in the scene corresponding to the scene detection code is detected according to the environment data and the device data. Optionally, the data of the device may include: sensor data of the device. A sensor is a detection device that can sense measured information and convert it, according to a certain rule, into an electric signal or another required form of information for output, so as to meet the requirements of information transmission, processing, storage, display, recording, control, and the like. Such sensors may include, but are not limited to: an acceleration sensor, a GPS sensor, a gravity sensor, a fingerprint sensor, a barometric pressure sensor, a heart rate sensor, a distance sensor, an air quality sensor, a temperature sensor, etc. It can be understood that the embodiment of the present application places no restriction on the specific sensors.
In an optional embodiment of the present application, a sensor interface may be preset, so that the sensor data of the device is obtained by calling the sensor interface. The sensor interface may sit above the HAL (Hardware Abstraction Layer), which hides the hardware interface details of a specific platform and provides a virtual hardware platform for the operating system, giving it hardware independence and allowing it to be ported across various platforms.
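One way such a sensor interface above the HAL might be abstracted is sketched below; the interface and type names are assumptions, and each platform would supply its own implementation behind the same interface.

```typescript
// Hypothetical sensor interface sitting above the HAL: callers obtain
// sensor data by type without seeing platform-specific details.
type SensorType = "gps" | "acceleration" | "pressure";

interface SensorInterface {
  read(type: SensorType): Promise<number[]>;
}

// Each platform provides its own implementation; callers stay unchanged,
// which is what makes the interface portable across platforms.
class MockHalSensors implements SensorInterface {
  async read(type: SensorType): Promise<number[]> {
    // A real implementation would dispatch into the platform HAL here.
    return type === "gps" ? [116.4, 39.9] : [0, 0, 0];
  }
}
```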
Optionally, the process of detecting whether the device is in the scene corresponding to the scene detection code according to the environment data and the device data may specifically include: and matching the environmental data with the sensor data of the equipment, and if the matching is successful, determining that the equipment is in a scene corresponding to the scene detection code.
Taking the scene of the point of interest as an example, the environmental data may be location data (such as longitude and latitude data, base station data, WIFI data, and the like) corresponding to the point of interest, and the sensor data of the device may be location data of the device, so that the location data corresponding to the point of interest may be matched with the location data of the device.
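As a hedged sketch of this point-of-interest matching, the following compares the location data of a point of interest (environment data) with the position of the device (sensor data) using a haversine distance; the 100 m radius is an assumed example value.

```typescript
// Great-circle distance in meters between two latitude/longitude points.
function haversineMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// The device is considered in the point-of-interest scene when it is
// within an assumed 100 m of the POI's location data.
function inPoiScene(poi: { lat: number; lon: number }, dev: { lat: number; lon: number }): boolean {
  return haversineMeters(poi.lat, poi.lon, dev.lat, dev.lon) <= 100;
}
```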
Taking the device state scenario as an example, the preset state data of the device may be matched with the real-time state data of the device. For example, for a vehicle overspeed scene, the upper speed limit of the vehicle is matched against the real-time speed of the vehicle; for a low-fuel scene, the lower limit of the vehicle's fuel quantity is matched against the real-time fuel quantity; for a home device state scene, the preset state data of the home device is matched against the real-time state data of the home device, and so on.
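A correspondingly small sketch of device-state matching, with purely illustrative threshold values:

```typescript
// Device-state matching: compare preset state data (environment data)
// against real-time state data of the vehicle. Values are examples.
const overspeed = (limitKmh: number, speedKmh: number) => speedKmh > limitKmh;
const lowFuel = (lowerLimitL: number, fuelL: number) => fuelL < lowerLimitL;

console.log(overspeed(120, 135)); // true -> vehicle overspeed scene
console.log(lowFuel(8, 5));       // true -> low fuel scene
```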
Taking a mode scene as an example, the feature data corresponding to the mode may be matched with the real-time data of the device.
In another optional embodiment of the present application, before the scene detection code is executed, the environment data may be preprocessed; the preprocessing may convert the environment data into a preset format. The preset format may be a format matched with the scene detection code, and it can be understood that the embodiment of the present application does not limit the specific preset format.
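A minimal preprocessing sketch, assuming the preset format expected by the scene detection code is flat numeric fields; the shape of the raw environment data is hypothetical.

```typescript
// Format conversion: turn hypothetical raw environment data (nested,
// string-valued) into a flat numeric preset format.
interface RawEnv { location?: { lat: string; lon: string } }
interface PresetEnv { lat: number; lon: number }

function preprocess(raw: RawEnv): PresetEnv {
  return {
    lat: Number(raw.location?.lat ?? 0),
    lon: Number(raw.location?.lon ?? 0),
  };
}

console.log(preprocess({ location: { lat: "31.23", lon: "121.47" } })); // { lat: 31.23, lon: 121.47 }
```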
The execution data obtained in the embodiment of the present application may include at least one of the following: a detection result, detection-basis data, and execution error data. The detection result may include: in the scene, or not in the scene; the detection-basis data may refer to the basis on which it is detected whether the device is in the scene corresponding to the scene detection code; the execution error data may refer to the corresponding error data in case the execution of the scene detection code goes wrong.
In an optional embodiment of the present application, when the detection result is that the device is in the scene, the information of the scene may be sent to a preset application of the device according to the configuration of the scene, so that the preset application provides a service corresponding to the scene. Alternatively, the information of the scene may be sent to the server, so that the server performs a secondary detection of whether the device is in the scene corresponding to the scene detection code; in this case, the secondary detection result of the server may be received and further processing performed according to it.
In an optional embodiment of the present application, the method of the embodiment of the present application may further include: sending the execution data corresponding to the scene detection code to a server. The execution data enables the server to analyze the execution of the scene detection code on the client.
In summary, in the data processing method according to the embodiment of the present application, the scene detection code is dynamic, and can be dynamically issued to the client by the server, so that the client performs scene detection by using the scene detection code, and further can provide a service corresponding to a scene according to a scene detection result; the scene detection code can support the addition and the update to the client, so the realization period of the service corresponding to the scene can be shortened, and the flexibility and the expansibility of the scene detection can be improved.
In addition, in the embodiment of the application, the scene detection code is used for detecting whether the device is in a scene corresponding to the scene detection code, and the scene detection code is convenient to add and update and is beneficial to service expansion.
In addition, the types of the scene detection code may include: compiled code or interpreted code, which is not affected by the implementation language of the client and is convenient to maintain.
In addition, the method and the device facilitate gray scale release of scene detection codes. Gray scale release refers to a release method that transitions smoothly between the old version and the new version. A/B testing may be performed: some users continue to use scene detection code A while others start to use scene detection code B; if no problems arise with scene detection code B, its scope is gradually enlarged until all users are migrated onto scene detection code B. Gray scale release improves product stability, since problems can be found and corrected at the initial gray scale stage, limiting their impact on the product.
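One common way to implement such a staged rollout is sketched below under stated assumptions (the hashing scheme and the percentages are not from the source): users are bucketed deterministically, and scene detection code B is served only up to the current rollout percentage.

```typescript
// Deterministically bucket a user id into 0..99 via a simple hash.
function bucket(userId: string): number {
  let h = 0;
  for (const ch of userId) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  return h % 100;
}

// rolloutPercent is enlarged over time (e.g. 5 -> 20 -> 50 -> 100) as
// scene detection code B proves stable, until all users run code B.
function useCodeB(userId: string, rolloutPercent: number): boolean {
  return bucket(userId) < rolloutPercent;
}

console.log(useCodeB("user-42", 20)); // true or false, but stable per user
```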
In an application example of the present application, assume that a scenario A required by a service operator is: when a vehicle enters a gas station and its engine is turned off, a notification is issued so that information can be pushed. The server of the embodiment of the present application may then mine the position data of gas stations and write a corresponding scene detection code A, whose detection basis is: judging whether the vehicle is currently within the region of a gas station and whether its engine is off; if both hold, the vehicle is determined to be in the scene. The scene detection code A and the position data of the gas station are then issued to the client.
After receiving the scene detection code A and the location data of the gas station, the client may execute the scene detection code A to detect whether the device is in scene A; if so, the client sends the detection result to the server or to the preset application, so that the server or the preset application sends a corresponding notification to the service operator.
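A hedged sketch of what scene detection code A might look like for this example follows; the geofence test, field names and radius are illustrative assumptions rather than the patent's actual code.

```typescript
// Scene A: vehicle inside a gas station's region AND engine off.
interface GasStation { lat: number; lon: number; radiusM: number }   // mined environment data
interface VehicleData { lat: number; lon: number; engineOn: boolean } // device data

// Equirectangular approximation, adequate for geofences of a few hundred meters.
function approxMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const mPerDegLat = 111320;
  const mPerDegLon = 111320 * Math.cos((lat1 * Math.PI) / 180);
  return Math.hypot((lat2 - lat1) * mPerDegLat, (lon2 - lon1) * mPerDegLon);
}

function inSceneA(station: GasStation, v: VehicleData): boolean {
  const inRegion = approxMeters(station.lat, station.lon, v.lat, v.lon) <= station.radiusM;
  return inRegion && !v.engineOn; // both detection bases must hold
}
```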
Method embodiment three
Referring to fig. 5, a flowchart illustrating steps of a third embodiment of the data processing method in the present application is shown, which may specifically include the following steps:
step 501, receiving a task sent by a server;
step 502, registering sensor data according to task information;
The sensor data includes, but is not limited to, GPS data, base station data, WIFI data, Bluetooth (Beacon) data, accelerometer data, gyroscope data, magnetometer data, traffic data, camera vision data, etc. A machine vision camera transmits the image projected through the lens onto its sensor to a machine device capable of storage, analysis, and/or display.
Step 503, judging whether a scene detection condition is met or not according to a scene corresponding to the task, if so, executing step 504, and if not, executing step 507;
step 504, adding the task to a scheduling list;
step 505, acquiring a target task from a scheduling list, and detecting whether the equipment is in a scene corresponding to the target task, if so, executing step 506 and step 507, otherwise, executing step 507;
step 506, executing the action according to the action configuration of the task;
for example, information of the target task may be transmitted to a server or a preset application;
step 507, judging whether the trigger condition is met or not according to the event information of the device; if so, executing step 503; otherwise, executing step 507 in a loop.
In this embodiment of the present application, step 503 may be executed in a loop, and the time when step 503 is executed next time may be determined by the determination result of step 507.
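The control flow of steps 501 to 507 can be summarized in the sketch below; the helper functions are stubs standing in for the mechanisms described above, and the loop is bounded only so that the sketch terminates.

```typescript
// Control-flow sketch of method embodiment three (fig. 5, steps 503-507).
interface Task3 { name: string }

const scheduleList: Task3[] = [];
const sceneConditionMet = (_t: Task3) => Math.random() > 0.5;              // step 503 (stub)
const inTargetScene = (_t: Task3) => Math.random() > 0.5;                  // step 505 (stub)
const runActions = (t: Task3) => console.log("action for", t.name);        // step 506
const waitForTrigger = () => new Promise<void>((r) => setTimeout(r, 100)); // step 507 (stub)

async function handleTask(task: Task3): Promise<void> {
  // steps 501/502 (receive task, register sensor data) assumed done.
  for (let i = 0; i < 3; i++) {
    if (sceneConditionMet(task)) {                   // step 503
      scheduleList.push(task);                       // step 504
      const target = scheduleList.shift()!;          // step 505: take target task
      if (inTargetScene(target)) runActions(target); // step 506
    }
    await waitForTrigger();                          // step 507: wait for trigger condition
  }
}

handleTask({ name: "demo" });
```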
To sum up, for a received task, the data processing method of the embodiment of the present application may first determine whether the scene detection condition is met under the initial conditions; if not, the time at which the scene detection condition is next evaluated may be determined by the event information of the device and the trigger condition. The trigger condition can thus be adjusted intelligently according to the event information of the device, which reduces the resources consumed by the device and improves the user experience.
In addition, in the embodiment of the present application, after detecting whether the device is in the scene corresponding to the target task, the time at which the scene detection condition is next met may again be determined according to the trigger condition.
Referring to fig. 6, a schematic structural diagram of a data processing system of the present application is shown, which may specifically include: a server 601 and a client 602;
The client 602 may include: a data interaction module 621, an operation opportunity module 622, a sensor abstraction module 623, a task scheduling module 624, a scene detection module 625, and a scene trigger module 626;
the data interaction module 621 interacts with the server 601, and specifically, the data interaction module 621 receives a task sent by the server 601 and sends execution data of the task to the server 601.
The operation opportunity module 622 is configured to determine, for the scene corresponding to a task, whether the scene detection condition is met; this judgment may be a low-power-consumption coarse sensing detection. Moreover, the operation opportunity module 622 can adaptively and dynamically determine the running opportunity according to the trigger condition and the event information of the device.
The sensor abstraction module 623 provides an abstract acquisition interface for sensor data and is called by the operation opportunity module 622 and the scene detection module 625.
The task scheduling module 624 is configured to schedule tasks meeting the scene detection condition; alternatively, the tasks may be scheduled according to their priorities.
The scene detection module 625 is configured to detect whether the device is in the scene; specifically, it may execute the scene detection code and, during the execution of the scene detection code, detect whether the device is in the scene according to the environment data and the device data. The scene detection module 625 may obtain the detection result and send it to the scene trigger module 626.
The scene trigger module 626 is configured to perform scene triggering according to the detection result; specifically, it may send the information of the task to a preset application according to the configuration information of the task, or send the information of the task to the server 601 so that the server 601 detects whether the device is in the scene, thereby obtaining a more accurate detection result. It can be understood that the server 601 may also send the more accurate detection result to the data interaction module 621.
In summary, the embodiment of the present application provides a configurable operation opportunity module 622 whose running opportunity can be adaptively adjusted, thereby achieving low-power-consumption intelligent sensing.
Referring to fig. 7, an interaction diagram of a data processing system according to an embodiment of the present application is shown, which may specifically include: an application layer, a system service layer, a cloud service layer, a cloud data query layer and a cloud data mining layer;
wherein the application layer may include: the service-side application, that is, the application corresponding to the service operator side;
the system service layer may include: a code generation module 701, a data mining module 702, a running opportunity module 703, a task scheduling module 704, a code execution module 705, a scene triggering module 706, a first positioning module 707 and a data acquisition module 709;
the code generation module 701 is configured to generate a scene detection code;
the data mining module 702 is configured to mine environment data corresponding to a scene;
for the running opportunity module 703, the task scheduling module 704, the code execution module 705 and the scene triggering module 706, reference may be made to fig. 6, and details are not repeated here. The running opportunity module 703 may be triggered by a task. The data utilized by the code execution module 705 may include: scene detection codes, scene fingerprints (such as the WIFI fingerprint of a store), sensor data corresponding to the scene, and the like, where the scene fingerprints and the sensor data corresponding to the scene may serve as the environment data corresponding to the scene. The scene trigger module 706 may issue trigger notifications according to the action configuration issued by the server, and the running opportunity module 703 and the code execution module 705 may obtain the required data from the sensors.
The first positioning module 707 is configured to position a device corresponding to the client;
the data collection module 709 is used for collecting behavior data generated by a user through a device, including but not limited to: POI data, store data, personal behavior data, and the like.
It should be noted that the code generation module 701 and the data mining module 702 of the system service layer are located in the server, and the operation opportunity module 703, the task scheduling module 704, the code execution module 705, the scene triggering module 706, the first positioning module 707 and the data acquisition module 709 of the system service layer are located in the client.
The modules of the cloud service layer may be located at a server, the server may be a cloud server, and the cloud service layer may include: a second positioning module 708, a data collection module 710 and a scene intelligent perception module 711; the 3 modules interact with the client respectively to collect positioning data, behavior data and scene data of the client.
Optionally, the transmission channel between the data mining module 702 and the scene intelligent perception module 711 may include, but is not limited to: HTTPS (Hyper Text Transfer Protocol over Secure Socket Layer), CMNS (Connection-Mode Network Service), and the like.
The cloud data mining layer may be used to collect client-generated behavior data. The cloud data query layer may be configured to store the behavior data, and the databases used for storage may include: RDS (Relational Database Service), Tair (a distributed database), etc.
In summary, in the embodiment of the present application, the running opportunity module 703 resolves the contradiction between power consumption and real-time sensing, so that a continuous sensing effect can be achieved while the power consumption and traffic of the device are reduced. Specifically, the running opportunity module 703 activates scene detection only when the device is near the scene, and at other times performs only low-frequency, low-power-consumption detection of the scene region, thereby resolving the contradiction between low power consumption and high real-time performance of scene detection.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combinations of acts, but those skilled in the art will recognize that the embodiments of the present application are not limited by the order of the acts described, as some steps may be performed in other orders or concurrently. Further, those skilled in the art will also appreciate that the embodiments described in the specification are preferred embodiments, and that the acts involved are not necessarily required by the embodiments of the present application.
The embodiment of the application also provides a data processing device.
Referring to fig. 8, a block diagram of a data processing apparatus according to an embodiment of the present application is shown, which may specifically include the following modules:
a first judging module 801, configured to judge whether a trigger condition is met according to event information of a device;
a second judging module 802, configured to judge whether the scene detection condition is met under the condition that the trigger condition is met; and
a scene detection module 803, configured to detect whether the device is in the scene when the scene detection condition is met;
wherein the event may include at least one of the following events:
a power event, a network event, a traffic event, a displacement event, and a task reception event; the task corresponds to a scene.
Optionally, the first determining module 801 may include:
the first judgment submodule is used for meeting the triggering condition if the electric quantity of the equipment exceeds the electric quantity threshold; or
The second judgment submodule is used for meeting the triggering condition if the electric quantity of the equipment does not exceed the electric quantity threshold and a charging event is detected; or
And the third judgment submodule is used for not meeting the triggering condition if the electric quantity of the equipment does not exceed the electric quantity threshold and the charging event is not detected.
Optionally, the first determining module 801 may include:
the fourth judgment submodule is used for meeting the triggering condition if the flow of the equipment exceeds the flow threshold; or
The fifth judgment submodule is used for meeting the triggering condition if the flow of the equipment does not exceed the flow threshold and the equipment is accessed to the local area network; or
And the sixth judgment submodule is used for not meeting the triggering condition if the flow of the equipment does not exceed the flow threshold and the equipment is not accessed to the local area network.
Optionally, the first determining module 801 may include:
the seventh judgment submodule is used for judging that the equipment does not meet the triggering condition if the displacement of the equipment in the preset time period does not exceed the position threshold; or
And the eighth judging submodule is used for judging whether the triggering condition is met or not according to the preset time interval if the displacement of the equipment in the preset time period exceeds the position threshold.
Optionally, the first determining module 801 may include:
and the ninth judgment submodule is used for meeting the triggering condition if the task receiving event is detected.
Optionally, the scene may include: a point of interest scene, and the second determining module 802 may include:
and the tenth judging submodule is used for judging whether the equipment is in the coverage range of the surrounding network corresponding to the interest point.
Optionally, the coverage of the surrounding network is matched with the surrounding area corresponding to the interest point.
Optionally, the usage frequency of the surrounding network exceeds a first frequency threshold.
Optionally, the types of the surrounding network may include: a mobile network and/or a wireless fidelity network.
Optionally, the scene may include: a device state scenario, the device may include: a vehicle, and the second determining module 802 may include:
and the eleventh judging submodule is used for judging whether a speed measuring device exists on the road where the vehicle is located.
Optionally, the scene may include: a device state scenario, the device may include: a vehicle, and the second determining module 802 may include:
and the twelfth judging submodule is used for judging whether the congestion probability of the route where the vehicle is located exceeds a probability threshold value.
Optionally, the scene may include: a device state scenario, the device may include: a vehicle, and the second determining module 802 may include:
and the thirteenth judging submodule is used for judging whether the distance between the position of the vehicle and the position of the preset residence is smaller than the distance threshold value.
Optionally, the scene may include: a mode scenario, the device may include: a vehicle, and the second determining module 802 may include:
and the fourteenth judging submodule is used for judging whether the accident frequency of the road where the vehicle is located exceeds a second frequency threshold value.
Optionally, the scene detection module 803 may include:
the determining submodule is used for determining a scene detection code for detecting the scene and environment data corresponding to the scene detection code; and
the code execution submodule is used for executing the scene detection code; and detecting whether the equipment is in the scene or not according to the environment data and the equipment data in the execution process of the scene detection code.
Optionally, the scene may include: point of interest scenarios, the environmental data may include: location data corresponding to the points of interest; or
The scene may include: device state scenarios, the environmental data may include: preset state data of the device; or
The scene may include: mode scenarios, the environmental data may include: historical data corresponding to the pattern.
Optionally, the apparatus may further include:
and the sending module is used for sending the execution data corresponding to the scene detection code to a server.
Optionally, the execution data may include at least one of the following data: detecting the result, detecting the basis data, and executing the error data.
Optionally, the second determining module 802 may include:
and the fifteenth judgment submodule is used for judging whether the scene detection condition is met or not according to the scene corresponding to the task.
Optionally, the scene detection module 803 may include:
the adding submodule is used for adding the task to the scheduling list if the scene corresponding to the task meets the scene detection condition;
the target task determining submodule is used for determining a target task from the scheduling list; and
and the detection submodule is used for detecting whether the equipment is in a scene corresponding to the target task.
Embodiments of the application can be implemented as a system or apparatus employing any suitable hardware and/or software for the desired configuration. Fig. 9 schematically illustrates an example apparatus 1100 that may be used to implement various embodiments described herein.
For one embodiment, fig. 9 illustrates an exemplary apparatus 1100, which apparatus 1100 may comprise: one or more processors 1102, a system control module (chipset) 1104 coupled to at least one of the processors 1102, a system memory 1106 coupled to the system control module 1104, a non-volatile memory (NVM)/storage 1108 coupled to the system control module 1104, one or more input/output devices 1110 coupled to the system control module 1104, and a network interface 1112 coupled to the system control module 1104. The system memory 1106 may include: instructions 1162, the instructions 1162 being executable by the one or more processors 1102.
The processor 1102 may include one or more single-core or multi-core processors, and the processor 1102 may include any combination of general-purpose or special-purpose processors (e.g., graphics processors, application processors, baseband processors, etc.). In some embodiments, the apparatus 1100 is capable of operating as a server, a target device, a wireless device, etc., as described in embodiments herein.
In some embodiments, the apparatus 1100 may include one or more machine-readable media (e.g., system memory 1106 or NVM/storage 1108) having instructions and one or more processors 1102 configured to execute the instructions, in conjunction with the one or more machine-readable media, to implement the modules included in the aforementioned apparatus to perform the actions described in embodiments of the present application.
System control module 1104 for one embodiment may include any suitable interface controller to provide any suitable interface to at least one of processors 1102 and/or any suitable device or component in communication with system control module 1104.
System control module 1104 for one embodiment may include one or more memory controllers to provide an interface to system memory 1106. The memory controller may be a hardware module, a software module, and/or a firmware module.
System memory 1106 for one embodiment may be used to load and store data and/or instructions 1162. For one embodiment, system memory 1106 may include any suitable volatile memory, such as suitable DRAM (dynamic random access memory). In some embodiments, system memory 1106 may include: double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
System control module 1104 for one embodiment may include one or more input/output controllers to provide an interface to NVM/storage 1108 and input/output device(s) 1110.
NVM/storage 1108 for one embodiment may be used to store data and/or instructions 1182. NVM/storage 1108 may include any suitable non-volatile memory (e.g., flash memory, etc.) and/or may include any suitable non-volatile storage device(s), e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives, etc.
NVM/storage 1108 may include storage resources that are physically part of the device on which device 1100 is installed or may be accessible by the device and not necessarily part of the device. For example, NVM/storage 1108 may be accessed over a network via network interface 1112 and/or through input/output devices 1110.
Input/output device(s) 1110 for one embodiment may provide an interface for apparatus 1100 to communicate with any other suitable device, input/output devices 1110 may include communication components, audio components, sensor components, and so forth.
Network interface 1112 of one embodiment may provide an interface for device 1100 to communicate over one or more networks and/or with any other suitable device, and device 1100 may communicate wirelessly with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols, such as to access a communication standard-based wireless network, such as WiFi, 2G, or 3G, or a combination thereof.
For one embodiment, at least one of the processors 1102 may be packaged together with logic for one or more controllers (e.g., memory controllers) of the system control module 1104. For one embodiment, at least one of the processors 1102 may be packaged together with logic for one or more controllers of the system control module 1104 to form a System in Package (SiP). For one embodiment, at least one of the processors 1102 may be integrated on the same die as logic for one or more controllers of the system control module 1104. For one embodiment, at least one of the processors 1102 may be integrated on the same die with logic for one or more controllers of the system control module 1104 to form a system on a chip (SoC).
In various embodiments, the apparatus 1100 may include, but is not limited to: a computing device such as a desktop computing device or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, the apparatus 1100 may have more or fewer components and/or different architectures. For example, in some embodiments, device 1100 may include one or more cameras, keyboards, Liquid Crystal Display (LCD) screens (including touch screen displays), non-volatile memory ports, multiple antennas, graphics chips, Application Specific Integrated Circuits (ASICs), and speakers.
Wherein, if the display includes a touch panel, the display screen may be implemented as a touch screen display to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The present application also provides a non-transitory readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to an apparatus, the apparatus may be caused to execute instructions (instructions) of methods in the present application.
Provided in one example is an apparatus comprising: one or more processors; and one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform a method as in the embodiments of the present application, which may include: the method shown in fig. 2 or fig. 3 or fig. 4 or fig. 5 or fig. 6 or fig. 7.
One or more machine-readable media are also provided in one example, having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform a method as in embodiments of the application, which may include: the method shown in fig. 2 or fig. 3 or fig. 4 or fig. 5 or fig. 6 or fig. 7.
The specific manner in which each module performs operations of the apparatus in the above embodiments has been described in detail in the embodiments related to the method, and will not be described in detail here, and reference may be made to part of the description of the method embodiments for relevant points.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the true scope of the embodiments of the application.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The data processing method, the data processing apparatus, the device, the machine-readable medium, and the device-based operating system provided by the present application are described in detail above, and specific examples are applied herein to illustrate the principles and embodiments of the present application, and the description of the above embodiments is only used to help understand the method and the core ideas of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (40)

1. A data processing method, comprising:
judging whether the trigger condition is met or not according to the event information of the equipment;
if the trigger condition is met, judging whether the scene detection condition is met;
if the scene detection condition is met, detecting whether the equipment is in the scene;
wherein the event comprises at least one of the following events:
a power event, a network event, a traffic event, a displacement event, and a task reception event; the task corresponds to a scene.
2. The method of claim 1, wherein the determining whether the trigger condition is met according to the event information of the device comprises:
if the electric quantity of the equipment exceeds the electric quantity threshold value, the triggering condition is met; or
If the electric quantity of the equipment does not exceed the electric quantity threshold value and a charging event is detected, the triggering condition is met; or
If the electric quantity of the equipment does not exceed the electric quantity threshold value and the charging event is not detected, the triggering condition is not met.
3. The method of claim 1, wherein the determining whether the trigger condition is met according to the event information of the device comprises:
if the flow of the equipment exceeds a flow threshold, the triggering condition is met; or
If the flow of the equipment does not exceed the flow threshold and the equipment is accessed to the local area network, the triggering condition is met; or
If the flow of the equipment does not exceed the flow threshold and the equipment is not accessed to the local area network, the triggering condition is not met.
4. The method of claim 1, wherein the determining whether the trigger condition is met according to the event information of the device comprises:
if the displacement of the equipment in the preset time period does not exceed the position threshold, the equipment does not accord with the triggering condition; or
And if the displacement of the equipment in the preset time period exceeds the position threshold, judging whether the trigger condition is met according to the preset time interval.
5. The method of claim 1, wherein the determining whether the trigger condition is met according to the event information of the device comprises:
and if the task receiving event is detected, the triggering condition is met.
6. The method of claim 1, wherein the scene comprises: a point of interest scene, the determining whether the scene detection condition is met comprising:
and judging whether the equipment is in the coverage range of the surrounding network corresponding to the interest point.
7. The method of claim 6, wherein the coverage of the surrounding network matches the surrounding area corresponding to the point of interest.
8. The method of claim 6, wherein the frequency of use of the surrounding network exceeds a first frequency threshold.
9. The method of claim 6, wherein the type of the surrounding network comprises: a mobile network and/or a wireless fidelity network.
10. The method of claim 1, wherein the scene comprises: a device state scenario, the device comprising: a vehicle, the determining whether the scene detection condition is met, comprising:
and judging whether a speed measuring device exists on the road where the vehicle is located.
11. The method of claim 1, wherein the scene comprises: a device state scenario, the device comprising: a vehicle, the determining whether the scene detection condition is met, comprising:
and judging whether the congestion probability of the route where the vehicle is located exceeds a probability threshold value.
12. The method of claim 1, wherein the scene comprises: a device state scenario, the device comprising: a vehicle, the determining whether the scene detection condition is met, comprising:
and judging whether the distance between the position of the vehicle and the position of the preset residence is smaller than a distance threshold value.
13. The method of claim 1, wherein the scene comprises: a mode scenario, the device comprising: a vehicle, the determining whether the scene detection condition is met, comprising:
and judging whether the accident frequency of the road where the vehicle is located exceeds a second frequency threshold value.
14. The method according to any one of claims 1 to 13, wherein detecting whether the device is in the scene comprises:
determining a scene detection code for detecting the scene and environment data corresponding to the scene detection code;
executing the scene detection code; and detecting whether the equipment is in the scene or not according to the environment data and the equipment data in the execution process of the scene detection code.
15. The method of claim 14, wherein the scene comprises: a point of interest scene, the environmental data comprising: location data corresponding to the points of interest; or
The scene comprises the following steps: a device state scenario, the environmental data comprising: preset state data of the device; or
The scene comprises the following steps: a pattern scenario, the environmental data comprising: historical data corresponding to the pattern.
16. The method of claim 14, further comprising:
and sending the execution data corresponding to the scene detection code to a server.
17. The method of claim 16, wherein the execution data comprises at least one of: detecting the result, detecting the basis data, and executing the error data.
18. The method according to any one of claims 1 to 13, wherein the determining whether the scene detection condition is met comprises:
and judging whether the scene detection conditions are met or not according to the scene corresponding to the task.
19. The method according to any one of claims 1 to 13, wherein the detecting whether the device is in the scene if the scene detection condition is met comprises:
if the scene corresponding to the task meets the scene detection condition, adding the task to a scheduling list;
acquiring a target task from the scheduling list;
and detecting whether the equipment is in a scene corresponding to the target task.
20. A data processing apparatus, comprising:
the first judgment module is used for judging whether the trigger condition is met or not according to the event information of the equipment;
the second judgment module is used for judging whether the scene detection condition is met or not under the condition that the trigger condition is met; and
the scene detection module is used for detecting whether the equipment is in the scene or not under the condition of meeting the scene detection condition;
wherein the event comprises at least one of the following events:
a power event, a network event, a traffic event, a displacement event, and a task reception event; the task corresponds to a scene.
21. The apparatus of claim 20, wherein the first determining module comprises:
the first judgment submodule is used for meeting the triggering condition if the electric quantity of the equipment exceeds the electric quantity threshold; or
The second judgment submodule is used for meeting the triggering condition if the electric quantity of the equipment does not exceed the electric quantity threshold and a charging event is detected; or
And the third judgment submodule is used for not meeting the triggering condition if the electric quantity of the equipment does not exceed the electric quantity threshold and the charging event is not detected.
22. The apparatus of claim 20, wherein the first determining module comprises:
the fourth judgment submodule is used for meeting the triggering condition if the flow of the equipment exceeds the flow threshold; or
The fifth judgment submodule is used for meeting the triggering condition if the flow of the equipment does not exceed the flow threshold and the equipment is accessed to the local area network; or
And the sixth judgment submodule is used for not meeting the triggering condition if the flow of the equipment does not exceed the flow threshold and the equipment is not accessed to the local area network.
23. The apparatus of claim 20, wherein the first determining module comprises:
the seventh judgment submodule is used for judging that the equipment does not meet the triggering condition if the displacement of the equipment in the preset time period does not exceed the position threshold; or
And the eighth judging submodule is used for judging whether the triggering condition is met or not according to the preset time interval if the displacement of the equipment in the preset time period exceeds the position threshold.
24. The apparatus of claim 20, wherein the first determining module comprises:
and the ninth judgment submodule is used for meeting the triggering condition if the task receiving event is detected.
25. The apparatus of claim 20, wherein the scene comprises: a point of interest scene, the second judging module comprising:
and the tenth judging submodule is used for judging whether the equipment is in the coverage range of the surrounding network corresponding to the interest point.
26. The apparatus of claim 25, wherein the coverage of the surrounding network matches the surrounding area corresponding to the point of interest.
27. The apparatus of claim 25, wherein the frequency of use of the surrounding network exceeds a first frequency threshold.
28. The apparatus of claim 25, wherein the type of the surrounding network comprises: a mobile network and/or a wireless fidelity network.
29. The apparatus of claim 20, wherein the scene comprises: a device state scenario, the device comprising: a vehicle, the second determination module comprising:
and the eleventh judging submodule is used for judging whether a speed measuring device exists on the road where the vehicle is located.
30. The apparatus of claim 20, wherein the scene comprises: a device state scenario, the device comprising: a vehicle, the second determination module comprising:
and the twelfth judging submodule is used for judging whether the congestion probability of the route where the vehicle is located exceeds a probability threshold value.
31. The apparatus of claim 20, wherein the scene comprises: a device state scenario, the device comprising: a vehicle, the second determination module comprising:
and the thirteenth judging submodule is used for judging whether the distance between the position of the vehicle and the position of the preset residence is smaller than the distance threshold value.
32. The apparatus of claim 20, wherein the scene comprises: a mode scenario, the device comprising: a vehicle, the second determination module comprising:
and the fourteenth judging submodule is used for judging whether the accident frequency of the road where the vehicle is located exceeds a second frequency threshold value.
33. The apparatus of any of claims 20 to 32, wherein the scene detection module comprises:
the determining submodule is used for determining a scene detection code for detecting the scene and environment data corresponding to the scene detection code; and
the code execution submodule is used for executing the scene detection code; and detecting whether the equipment is in the scene or not according to the environment data and the equipment data in the execution process of the scene detection code.
34. The apparatus of claim 33, wherein the scene comprises: a point of interest scene, the environmental data comprising: location data corresponding to the points of interest; or
The scene comprises the following steps: a device state scenario, the environmental data comprising: preset state data of the device; or
The scene comprises the following steps: a pattern scenario, the environmental data comprising: historical data corresponding to the pattern.
35. The apparatus of claim 33, further comprising:
and the sending module is used for sending the execution data corresponding to the scene detection code to a server.
36. The apparatus of claim 35, wherein the execution data comprises at least one of: detecting the result, detecting the basis data, and executing the error data.
37. The apparatus according to any one of claims 20 to 32, wherein the second determining module comprises:
and the fifteenth judgment submodule is used for judging whether the scene detection condition is met or not according to the scene corresponding to the task.
38. The apparatus of any of claims 20 to 32, wherein the scene detection module comprises:
the adding submodule is used for adding the task to the scheduling list if the scene corresponding to the task meets the scene detection condition;
the target task determining submodule is used for determining a target task from the scheduling list; and
and the detection submodule is used for detecting whether the equipment is in a scene corresponding to the target task.
39. An apparatus, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method recited by one or more of claims 1-19.
40. One or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform the method recited by one or more of claims 1-19.
CN201810580632.8A 2018-06-07 2018-06-07 Data processing method, device and machine readable medium Pending CN110647231A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810580632.8A CN110647231A (en) 2018-06-07 2018-06-07 Data processing method, device and machine readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810580632.8A CN110647231A (en) 2018-06-07 2018-06-07 Data processing method, device and machine readable medium

Publications (1)

Publication Number Publication Date
CN110647231A true CN110647231A (en) 2020-01-03

Family

ID=69008560

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810580632.8A Pending CN110647231A (en) 2018-06-07 2018-06-07 Data processing method, device and machine readable medium

Country Status (1)

Country Link
CN (1) CN110647231A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060218271A1 (en) * 2005-03-16 2006-09-28 Nokia Corporation Triggered statistics reporting
US20080027632A1 (en) * 2006-03-28 2008-01-31 Mauderer Hans P Storage and visualization of points of interest in a navigation system
US20080070588A1 (en) * 2006-09-19 2008-03-20 Drew Morin Device based trigger for location push event
CN101358853A (en) * 2008-08-08 2009-02-04 凯立德欣技术(深圳)有限公司 Interest point search method, interest point search method thereof and navigation system
US20160286360A1 (en) * 2013-11-11 2016-09-29 Yandex Europe Ag Location service(s) management for mobile device(s)
WO2015197000A1 (en) * 2014-06-25 2015-12-30 可牛网络技术(北京)有限公司 Method and apparatus for controlling hardware state of mobile terminal
CN107948923A (en) * 2016-10-13 2018-04-20 阿里巴巴集团控股有限公司 A kind of information processing method based on virtual fence, client and server

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111416859A (en) * 2020-03-17 2020-07-14 武汉慧联无限科技有限公司 Monitoring method, monitoring equipment and computer readable storage medium
CN112330094A (en) * 2020-10-09 2021-02-05 广州市物联万方电子科技有限公司 Container scheduling method and device and server
CN112597899A (en) * 2020-12-24 2021-04-02 北京市商汤科技开发有限公司 Behavior state detection method and device, electronic equipment and storage medium
CN116320019A (en) * 2023-05-16 2023-06-23 荣耀终端有限公司 Data acquisition method, medium and electronic equipment
CN116320019B (en) * 2023-05-16 2023-10-27 荣耀终端有限公司 Data acquisition method, medium and electronic equipment

Similar Documents

Publication Publication Date Title
US10297148B2 (en) Network computer system for analyzing driving actions of drivers on road segments of a geographic region
CN110647231A (en) Data processing method, device and machine readable medium
US10768000B2 (en) Content presentation based on travel patterns
CN104812654A (en) Dynamically providing position information of transit object to computing device
US10168177B2 (en) Navigation system with destination action mechanism and method of operation thereof
US8751426B2 (en) Apparatus and method for generating context-aware information using local service information
US20200318983A1 (en) Route safety determination system
JP2010128815A (en) Information distribution system, information distribution server and program
US20200318982A1 (en) Location safety determination system
US10708729B2 (en) Outputting an entry point to a target service
Kassim et al. IoT bus tracking system localization via GPS-RFID
CN104660684A (en) Method and device for updating road net data information
CN110866178A (en) Data processing method, device and machine readable medium
KR20150008653A (en) Method for utilizing Usage Log of Portable Terminal and Apparatus for using the same
CN103644921A (en) Method and device for realizing street view display
CN107750339B (en) Detecting a context of a user using a mobile device based on wireless signal characteristics
CN110582054A (en) data processing method, device and machine readable medium
US20130179260A1 (en) Predicting Trends Using A Geographic Position System
CN110672086B (en) Scene recognition method, device, equipment and computer readable medium
CN110647691B (en) Data processing method, device and machine readable medium
CN111402620A (en) Arrival reminding method, device, terminal and storage medium
CN116709501A (en) Service scene identification method, electronic equipment and storage medium
US20210288726A1 (en) Location accuracy using local transmitters
KR20130118190A (en) Separate type navigation system
US20240175695A1 (en) Information processing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201224

Address after: Room 603, 6 / F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, China

Applicant after: Zebra smart travel network (Hong Kong) Ltd.

Address before: A four-storey 847 mailbox in Grand Cayman Capital Building, British Cayman Islands

Applicant before: Alibaba Group Holding Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200103