CN109302684B - Scene determination method for terminal device, cloud server, and storage medium


Info

Publication number
CN109302684B
Authority
CN
China
Prior art keywords
information
terminal device
scene
terminal
physical information
Prior art date
Legal status
Active
Application number
CN201811322846.1A
Other languages
Chinese (zh)
Other versions
CN109302684A (en)
Inventor
张震
张大刚
胡峰
李星毅
Current Assignee
Maipian Technology Shenzhen Co ltd
Original Assignee
Maipian Technology Shenzhen Co ltd
Priority date
Filing date
Publication date
Application filed by Maipian Technology Shenzhen Co ltd filed Critical Maipian Technology Shenzhen Co ltd
Priority to CN201811322846.1A priority Critical patent/CN109302684B/en
Publication of CN109302684A publication Critical patent/CN109302684A/en
Application granted granted Critical
Publication of CN109302684B publication Critical patent/CN109302684B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/029Location-based management or tracking services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Human Resources & Organizations (AREA)
  • Primary Health Care (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Marketing (AREA)
  • Educational Technology (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Information Transfer Between Computers (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The invention discloses a scene determination method for a terminal device, which comprises: acquiring physical information uploaded to the cloud by each terminal device, wherein the physical information comprises at least one of wireless connection information, position information, and association information with other terminal devices; and analyzing the physical information according to a preset algorithm to confirm whether the terminal devices are in the same scene. The invention also discloses a cloud server and a storage medium. In this way, by analyzing the physical information uploaded to the cloud by the terminal devices, it can be determined whether the terminal devices are in the same scene, or which terminal devices are present in the same scene, so that the terminal devices in a scene can be used in linkage.

Description

Scene determination method for terminal device, cloud server, and storage medium
Technical Field
The present invention relates to the field of scene determination, and in particular, to a scene determination method for a terminal device, a cloud server, and a storage medium.
Background
Traditional terminal learning devices either lack a time and space positioning module and the corresponding functional design, or position in time and space too inaccurately. Most terminal learning devices (touch-and-talk pens and the like) must be connected one-to-one to other terminals, such as pen caps, tablets, or mobile phones, to obtain time and space information. Providing a new high-end learning device would significantly increase cost and price, waste the original resources, and fail to improve the utilization of existing devices.
As a result, terminal learning devices are limited in use, complicated to operate, and cannot be used in association with one another, so they cannot produce a linked effect in the classroom; learning efficiency is low and the usage scenarios of the learning tools are severely restricted. Moreover, because the learning devices cannot provide accurate time and space positioning information, the scene of each terminal cannot be known, and the learning devices cannot be automatically associated for linked teaching.
Disclosure of Invention
The main objective of the present invention is to provide a scene determination method for a terminal device, a cloud server, and a storage medium, so as to solve the problem that current terminal learning devices cannot provide accurate time and space positioning information, so that the scene of each terminal cannot be obtained and the learning devices cannot be automatically associated for linked teaching.
To achieve the above object, the present invention provides a scene determination method for a terminal device, the method comprising:
acquiring physical information uploaded to the cloud by each terminal device, wherein the physical information comprises at least one of wireless connection information, position information, and association information with other terminal devices;
and analyzing the physical information according to a preset algorithm to confirm whether the terminal devices are in the same scene.
Optionally, the step of acquiring the physical information uploaded to the cloud by each terminal device includes:
sending an information request instruction to each terminal device;
and acquiring, according to the information request instruction, the physical information uploaded to the cloud by each terminal device.
Optionally, after the step of acquiring the physical information uploaded to the cloud by each terminal device, the method further includes:
classifying the physical information uploaded to the cloud by the terminal devices to confirm the association among the terminal devices.
Optionally, the step of analyzing the physical information according to a preset algorithm to confirm whether the terminal devices are in the same scene includes:
confirming identification information of the terminal device according to the physical information;
matching the identification information with information in a pre-stored database according to a preset algorithm;
if the matching degree between the identification information and the information in the pre-stored database is greater than a preset value, confirming the scene position corresponding to the identification information;
and taking the confirmed scene position as the scene position of the terminal device, and confirming whether the terminal devices are in the same scene.
Optionally, after the step of matching the identification information with the information in the pre-stored database, the method further includes:
if the matching degree between the identification information and the information in the pre-stored database is smaller than or equal to the preset value, re-acquiring the physical information of the terminal device, and returning to the step of confirming the identification information of the terminal device according to the physical information.
Optionally, the step of taking the confirmed scene position as the scene position of the terminal device includes:
acquiring the real position of each terminal device, and comparing the confirmed scene position with the real position of the terminal device;
if the scene position is within a preset area around the real position, taking the confirmed scene position as the scene position of the terminal device;
and if the scene position is outside the preset area around the real position, taking the real position as the scene position of the terminal device, and storing the real position.
Optionally, after the step of analyzing the physical information according to a preset algorithm to confirm whether the terminal devices are in the same scene, the method further includes:
feeding back the confirmation result to each terminal device.
In the embodiments of the present invention, the physical information uploaded to the cloud by each terminal device is acquired, the physical information comprising at least one of wireless connection information, position information, and association information with other terminal devices, and the physical information is analyzed according to a preset algorithm to confirm whether the terminal devices are in the same scene. In this way, by analyzing the physical information uploaded to the cloud by the terminal devices, it can be determined whether the terminal devices are in the same scene, or which terminal devices are present in the same scene, so that the terminal devices in a scene can be used in linkage.
To achieve the above object, the present invention further provides a cloud server, the cloud server comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the scene determination method of the terminal device described above.
To achieve the above object, the present invention further provides a storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the scene determination method of the terminal device described above.
With the cloud server and the storage medium disclosed by the invention, it can be determined, by analyzing the physical information uploaded to the cloud by each terminal device, whether the terminal devices are in the same scene, or which terminal devices are present in the same scene, so that the terminal devices in a scene can be used in linkage.
Drawings
FIG. 1 is a schematic flowchart of the steps of a first embodiment of the present invention;
FIG. 2 is a schematic flowchart of the steps of a second embodiment of the present invention;
FIG. 3 is a schematic flowchart of the steps of a third embodiment of the present invention;
FIG. 4 is a schematic flowchart of the steps of a fourth embodiment of the present invention;
FIG. 5 is a schematic flowchart of the steps of a fifth embodiment of the present invention;
FIG. 6 is a schematic flowchart of the steps of a sixth embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, if directional indications (such as up, down, left, right, front, and back) are involved in the embodiments of the present invention, the directional indications are only used to explain the relative positional relationship, motion, and the like between components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indications change accordingly.
In addition, the technical solutions of the various embodiments may be combined with each other, provided that such combinations can be realized by a person skilled in the art; when technical solutions are contradictory or a combination cannot be realized, the combination should be considered not to exist and falls outside the protection scope of the present invention.
In the embodiments of the present invention, the terminal device is a learning device worn by a student, and the cloud server processes the data of the terminal devices and performs scene determination for them. In this embodiment, the terminal device does not include a time and space positioning module and the corresponding functional design, or its time and space positioning is not accurate enough; examples include student learning devices such as a touch-and-talk pen, a schoolbag, or a pen holder. The cloud server can therefore determine the scene of a terminal device and position it relatively accurately based on the device's imprecise time and space data or other data, and can also send the determination result to the terminal device of the school or the parents, so that the school or the parents can learn the student's position and study situation in real time. For example, if the positioning information for the period from 8:00 to 9:30 is the Grade One area, the school or the parents can clearly know the student's situation once the cloud server sends them this information.
In an embodiment, when the terminal devices include learning devices worn by a number of students, the cloud server may determine, according to the information of the respective terminal devices, whether the terminal devices are in the same area, or which terminal devices are present in the same area. For example, if thirty students attend class in the Grade One area from 8:00 to 9:30, it can be determined that thirty terminal devices are present in the Grade One area during that period, that is, thirty students are in class in the Grade One area from 8:00 to 9:30.
Of course, in other embodiments, the method may also be used for other terminal devices (for example, devices without higher-end components such as a screen), and may also be used for scene determination for company employees, so as to monitor their working status in real time. In some embodiments, the method can also be used in other fields such as tour groups, which is not limited in this application.
In an embodiment of the present invention, as shown in FIG. 1, which is a schematic flowchart of the first embodiment of the present invention, the scene determination method of the terminal device includes:
S10, acquiring physical information uploaded to the cloud by each terminal device, wherein the physical information comprises at least one of wireless connection information, position information, and association information with other terminal devices;
In an embodiment, there are a plurality of terminal devices, which should be identical devices or different devices having the same functions, so that the terminal devices can be associated and interact with one another; that is, the terminal devices can interact through wireless modules such as Bluetooth, WiFi, and infrared, thereby sharing data, courses, and the like.
In one embodiment, the physical information uploaded to the cloud by each terminal device is acquired. Specifically, the cloud server may acquire the physical information uploaded to the cloud by each terminal device; that is, the physical information acquired by the cloud server includes at least one of wireless connection information, location information, and association information with other terminal devices, or a combination of several of these. The wireless connection information includes the connection information of wireless modules such as Bluetooth, WiFi, and infrared, that is, the Bluetooth, WiFi, infrared, and similar connections between the terminal devices. Of course, a terminal device may also include a GPS module, so that GPS positioning and the like can be performed.
As an example, the cloud server may obtain the physical information of the respective terminal devices during or over a period of time, for example, the wireless connection information, location information, and association information with other terminal devices of the plurality of terminal devices; the physical information may also include GPS positioning information. The association information with other terminal devices includes data sharing records, course sharing records, and the like.
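As a concrete illustration of what such a record might look like, the following is a minimal sketch of a physical-information structure, assuming hypothetical field names (the patent does not prescribe a specific schema):

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class PhysicalInfo:
    device_id: str                                               # identifies the reporting terminal device
    wifi_connections: List[str] = field(default_factory=list)   # e.g. SSIDs/BSSIDs the device is connected to
    bluetooth_peers: List[str] = field(default_factory=list)    # nearby Bluetooth device addresses
    gps_position: Optional[Tuple[float, float]] = None          # (latitude, longitude), if a GPS module exists
    timestamp: Optional[float] = None                            # reporting time; the server may fill it if absent
    associated_devices: List[str] = field(default_factory=list)  # data/course sharing records with other terminals

# Example: a touch-and-talk pen that only has WiFi and Bluetooth modules.
info = PhysicalInfo(device_id="pen-001",
                    wifi_connections=["classroom-1F-AP"],
                    bluetooth_peers=["pen-002", "pen-003"])
```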
In one embodiment, step S10 further includes:
S11, sending an information request instruction to each terminal device;
In this step, the cloud server sends an information request instruction to each terminal device, and each terminal device receives the information request instruction.
S12, acquiring, according to the information request instruction, the physical information uploaded to the cloud by each terminal device;
In this step, the physical information uploaded to the cloud by each terminal device is acquired according to the information request instruction. After receiving the information request instruction, each terminal device responds to it and sends the corresponding physical information to the cloud server, which receives the physical information of the terminal device; alternatively, the cloud server directly retrieves the physical information of a terminal device after receiving that terminal device's response to the information request instruction.
In this process, the terminal device may transmit the physical information to the cloud server wirelessly, for example, via WiFi or Bluetooth; of course, the terminal device may also upload the physical information to cloud storage, and the cloud server may obtain the physical information of the terminal device directly from the cloud storage.
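A minimal sketch of this request-and-collect flow is given below. The Terminal stub, its send_request/read_reply methods, and the cloud-storage fallback are all illustrative assumptions; the patent does not specify a transport or an interface.

```python
class Terminal:
    """Stub standing in for the connection to a real terminal device."""
    def __init__(self, device_id, pending_report=None):
        self.device_id = device_id
        self._pending = pending_report
    def send_request(self, instruction):
        pass                      # in practice: transmit the information request instruction (S11)
    def read_reply(self, timeout=5.0):
        return self._pending      # in practice: wait for the uploaded physical information (S12)

def collect_physical_info(terminals, cloud_storage=None):
    """Send an information request instruction to each terminal and gather the replies."""
    collected = {}
    for t in terminals:
        t.send_request("REPORT_PHYSICAL_INFO")
        reply = t.read_reply()
        if reply is None and cloud_storage is not None:
            # Alternative path: the terminal has already uploaded to cloud storage,
            # so the server fetches the record from there instead.
            reply = cloud_storage.get(t.device_id)
        if reply is not None:
            collected[t.device_id] = reply
    return collected
```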
That is, in this embodiment, the terminal device needs to be configured with one or more of a WiFi module, a GPS module, a Bluetooth module, and a timing module, and of course may also be configured with other modules, such as an infrared module or a data cloud-upload module, as long as their physical information can be sent to the cloud server; this is not limited in this application.
Optionally, the physical information includes at least one of the wireless connection information, current location information, current time information, and association information with other terminal devices of the terminal device. The wireless connection information of the terminal device is the connection information of the WiFi module or of the Bluetooth module; for example, when the terminal device has a WiFi or Bluetooth connection, the WiFi or Bluetooth connection information of the terminal device is obtained, that is, the wireless connection information of the terminal device is obtained and sent to the cloud server. In this process, if the terminal device is equipped with a GPS module or a timing module, the position and time information of the terminal device may also be acquired and sent to the cloud server. In other embodiments, if the terminal device has no timing module, the time at which the physical information is sent to the cloud server is used as the current time information of the terminal device.
Certainly, when the terminal device is equipped with a Bluetooth module, it may also be associated with other terminal devices through the Bluetooth module; in this case, the association information between the terminal device and the other terminal devices may be obtained and sent to the cloud server.
That is, in this embodiment, when the terminal device is configured with one of the WiFi module, GPS module, Bluetooth module, and timing module, the information associated with that module may be converted into physical information and sent to the cloud server; when the terminal device is configured with two or more of these modules, the information associated with those modules may be converted into physical information and sent to the cloud server.
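The following is a sketch of this terminal-side packaging step under the assumption of a JSON upload to a hypothetical endpoint; the field names, the transport, and the URL are illustrative and not prescribed by the patent.

```python
import json
import time
import urllib.request

def build_physical_info(device_id, wifi=None, gps=None, bluetooth_peers=None, clock=None):
    """Pack whatever module readings the device has into one physical-information record."""
    return {
        "device_id": device_id,
        "wireless_connection": wifi,               # from the WiFi module, if present
        "position": gps,                           # (lat, lon) from the GPS module, if present
        "bluetooth_peers": bluetooth_peers or [],  # association information from the Bluetooth module
        # If there is no timing module, the server may instead use the upload time.
        "timestamp": clock if clock is not None else time.time(),
    }

def upload_to_cloud(info, url="https://cloud.example.com/physical-info"):   # hypothetical endpoint
    data = json.dumps(info).encode("utf-8")
    request = urllib.request.Request(url, data=data,
                                     headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(request, timeout=5.0)
```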
S20, analyzing the physical information according to a preset algorithm to confirm whether the terminal devices are in the same scene;
In this step, after the cloud server acquires the physical information of the terminal devices, it analyzes the physical information according to a preset algorithm and extracts from it at least one or a combination of the WiFi connection information, Bluetooth information, position information, time information, and association information with other terminal devices of the terminal device, so as to make the scene determination for the terminal device.
Specifically, after receiving the physical information of the terminal device, the cloud server analyzes it according to a preset algorithm, which may be a matching algorithm. For example, when the physical information of the terminal device is WiFi connection information, the WiFi connection between the terminal device and other terminal devices implies an effective WiFi connection area, that is, an area within which the terminal devices can establish a wireless connection via WiFi; therefore, the cloud server can determine whether the terminal device is in the same scene as other terminal devices according to the WiFi connection information and the position information of the other terminal devices connected to it via WiFi.
Of course, if the terminal device is equipped with a GPS module and a timing module, the current location information from the GPS module and the current time information from the timing module may be used directly as the physical information of the terminal device and sent to the cloud server, so as to obtain the scene determination for the terminal device.
In this embodiment, when the terminal device is configured with a WiFi module, a GPS module, a Bluetooth module, and a timing module, that is, when the physical information of the terminal device includes the wireless connection information, current location information, current time information, and association information with other terminal devices, the cloud server may analyze the physical information of the terminal device, or its association information with other terminal devices, according to the accuracy of these modules, for example, a WiFi module connection range within 100 m, a GPS module accuracy within 80%, and a Bluetooth module connection range within 10 m; after the analysis, the current location information and current time information of the terminal device are extracted from the physical information, and the scene determination for the terminal device is obtained.
In an embodiment, when the physical information uploaded to the cloud by the terminal device includes several kinds of information, for example WiFi connection information, Bluetooth information, location information, time information, and association information with other terminal devices, the region of the terminal device may be determined according to the proportion of each kind of information. For example, when the WiFi connection information points to a first region with a proportion of 40% and the other information points to a second region with a proportion of 60%, it may be determined that the terminal device is in the second region.
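A minimal sketch of this proportion-based rule, assuming illustrative region labels and weights: each kind of physical information votes for a region, and the region with the largest total proportion is taken as the device's scene.

```python
from collections import defaultdict

def decide_region(evidence):
    """evidence: iterable of (region, weight) pairs contributed by the WiFi, Bluetooth,
    GPS, time, and association information."""
    totals = defaultdict(float)
    for region, weight in evidence:
        totals[region] += weight
    return max(totals, key=totals.get)

# WiFi points to the first region with proportion 0.4; the remaining information
# points to the second region with a combined proportion of 0.6, so it wins.
print(decide_region([("region-1", 0.4), ("region-2", 0.6)]))   # -> region-2
```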
In the embodiments of the present invention, the physical information uploaded to the cloud by each terminal device is acquired, the physical information comprising at least one of wireless connection information, position information, and association information with other terminal devices, and the physical information is analyzed according to a preset algorithm to confirm whether the terminal devices are in the same scene. In this way, by analyzing the physical information uploaded to the cloud by the terminal devices, it can be determined whether the terminal devices are in the same scene, or which terminal devices are present in the same scene, so that the terminal devices in a scene can be used in linkage.
Based on the foregoing embodiment, as shown in FIG. 2, which is a schematic flowchart of the second embodiment of the present invention, after S10 the method further includes:
S13, classifying the physical information uploaded to the cloud by the terminal devices to confirm the association among the terminal devices;
In this step, the physical information uploaded to the cloud by each terminal device is classified to confirm the association among the terminal devices.
In an embodiment, when the cloud server obtains the physical information uploaded to the cloud by each terminal device, it classifies the physical information: the WiFi connection information, Bluetooth information, location information, time information, and association information with other terminal devices in each piece of physical information are categorized. For example, the WiFi connection information belonging to a third area is grouped into one class and the WiFi connection information belonging to a fourth area into another, so as to identify the terminal devices corresponding to the WiFi connection information of the third area; since the terminal devices in the third area are connected to one another via WiFi, the association among the terminal devices in the third area is confirmed. The WiFi connection information of the fourth area is handled in the same way as that of the third area, which is not limited here.
Certainly, the Bluetooth information, location information, time information, association information with other terminal devices, and similar information may also be categorized; the method is the same as the categorization of the WiFi connection information and is not repeated here.
In this way, the physical information uploaded to the cloud by the terminal devices can be classified to confirm the association among the terminal devices, and it can then be determined, according to this association information, whether the terminal devices are in the same scene, or which terminal devices are present in the same scene, so that the terminal devices in a scene can be used in linkage.
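As a sketch of this classification step, the records can be grouped by the WiFi network they report, so that devices sharing a network are associated with one another; the field names follow the illustrative PhysicalInfo record sketched earlier and are assumptions.

```python
from collections import defaultdict

def group_by_wifi(records):
    """records: mapping of device_id -> PhysicalInfo (see the earlier sketch)."""
    groups = defaultdict(list)
    for device_id, info in records.items():
        for ssid in info.wifi_connections:   # devices reporting the same network end up together
            groups[ssid].append(device_id)
    return groups   # e.g. {"classroom-3-AP": ["pen-001", "pen-002"], ...}
```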
Based on the first embodiment, as shown in FIG. 3, which is a schematic flowchart of the third embodiment of the present invention, S20 further includes:
S21, confirming the identification information of the terminal device according to the physical information;
In this embodiment, after the cloud server receives the physical information uploaded to the cloud by the terminal device, the identification information of the terminal device is confirmed according to the physical information.
In an embodiment, the identification information of the terminal device includes the model and characteristics of the terminal device, for example, a Model A touch-and-talk pen, a Model B touch-and-talk pen, or a touch-and-talk pen equipped with a timing module, a GPS module, and the like, which is not limited here.
S22, matching the identification information with information in a pre-stored database according to a preset algorithm;
In one embodiment, the identification information is matched with the information in a pre-stored database according to a preset algorithm; in this embodiment the preset algorithm is a matching algorithm. The signal source corresponding to the identification information is looked up in the pre-stored database, and then, according to the scene information contained in the pre-stored signal source entry, which further includes location information, time information, association information with other terminal devices, and the like, the scene area covered by the signal source corresponding to the identification information is determined.
In an embodiment, after the scene area covered by the signal source corresponding to the identification information is found, the identification information is matched with the information in the pre-stored database; for example, the location information in the physical information is matched against the scene area contained in the pre-stored database.
In one embodiment, the matching degree ranges from 0 to 100%. When the location information in the physical information is exactly the same as the scene area contained in the pre-stored database, the matching degree between them is 100%; when the location information in the physical information is completely different from the scene area contained in the pre-stored database, the matching degree between them is 0.
Further, when the physical information received by the cloud server includes several pieces, several pieces of identification information are extracted as well. After each piece of identification information is extracted, the cloud server looks up the signal source corresponding to each piece of identification information in the pre-stored signal database, so as to determine, according to the scene area of each pre-stored signal source, the scene area of the signal source corresponding to each piece of identification information. Matching is then performed once the scene area of the signal source of each piece of identification information has been determined; the specific matching method is the same as above and is not repeated here.
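A minimal sketch of such a matching degree follows, using the fraction of the terminal's observed signal sources that fall inside the pre-stored scene area as the overlap measure; this particular measure, like the identifiers below, is an assumption, since the patent only requires some matching algorithm producing a degree between 0 and 100%.

```python
def matching_degree(observed_sources, stored_scene_sources):
    """Fraction of the observed signal sources that appear in the pre-stored scene entry."""
    if not observed_sources:
        return 0.0
    overlap = set(observed_sources) & set(stored_scene_sources)
    return len(overlap) / len(observed_sources)

degree = matching_degree(["ap-3f-east", "beacon-12"],
                         ["ap-3f-east", "beacon-12", "beacon-13"])
print(degree)          # 1.0, i.e. 100%
print(degree >= 0.6)   # True: above the example preset value of 60%, so the scene position is confirmed
```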
S23, if the matching degree between the identification information and the information in the pre-stored database is greater than a preset value, confirming the scene position corresponding to the identification information;
In this step, when the matching degree between the identification information and the information in the pre-stored database is greater than or equal to the preset value, the WiFi connection information, Bluetooth information, location information, time information, and association information with other terminal devices of the terminal device are extracted from the physical information, and the scene position corresponding to the identification information is then obtained.
Preferably, the preset value in this embodiment is 60%; that is, when the matching degree is greater than or equal to 60%, the WiFi connection information, Bluetooth information, location information, time information, and association information with other terminal devices of the terminal device may be extracted from the physical information, and the scene position corresponding to the identification information is then obtained.
In one embodiment, for example, suppose that in the physical information the wireless connection information points to area A with a proportion of 40%, while all the other information points to area B, that is, 60%; the cloud server then determines the location information of the terminal device and, according to the matching algorithm, determines that the terminal device is located in area B.
In other embodiments, the preset value may also be set to other values, such as 70% or 75%; the specific value may be set as needed and is not limited in this application.
S24, taking the confirmed scene position as the scene position of the terminal device, and confirming whether the terminal devices are in the same scene;
When the cloud server makes the determination for a terminal, it can do so according to the signal strength of the signal source indicated by the identification information. For better understanding, an example follows: when the cloud server has determined that the physical information uploaded to the cloud by a terminal device corresponds to a signal source, and the signal source has a strong transmission capability, terminals in the same scene as the signal source can detect the physical information data it emits, and terminals in an adjacent scene can detect it as well, but the signal strengths detected in the two scenes differ. The scene area covered by the signal source can therefore be determined according to the coverage range of its wireless signal, and the scene position is then determined within that area according to the detectable signal strength. For example, if a terminal in the same scene as the signal source of the terminal device detects a signal strength of -11 dBm and a terminal in an adjacent scene detects -6 dBm, the cloud server can compare the signal strength it receives with these values to determine the scene position of the terminal device corresponding to the signal source.
In an embodiment, the confirmed scene position is used as the scene position of the terminal device, so as to confirm whether the terminal devices belong to the same scene, or to confirm which terminal devices are present in the same scene.
In the embodiments of the present invention, the identification information of the terminal device is confirmed according to the physical information, the identification information is matched with the information in the pre-stored database according to the preset algorithm, and if the matching degree between them is greater than the preset value, the scene position corresponding to the identification information is confirmed and taken as the scene position of the terminal device, so as to confirm whether the terminal devices are in the same scene. In this way, by matching the identification information derived from the physical information uploaded to the cloud by each terminal device against the pre-stored database with a preset algorithm, it can be determined whether the terminal devices are in the same scene, or which terminal devices are present in the same scene, so that the terminal devices in a scene can be used in linkage.
Based on the third embodiment, as shown in FIG. 4, which is a schematic flowchart of the fourth embodiment of the present invention, after S22 the method further includes:
S25, if the matching degree between the identification information and the information in the pre-stored database is smaller than or equal to the preset value, re-acquiring the physical information of the terminal device, and returning to the step of confirming the identification information of the terminal device according to the physical information;
In this step, when the matching degree between the identification information and the information in the pre-stored database is smaller than the preset value, the physical information of the terminal device is re-acquired and the method returns to the step of confirming the identification information of the terminal device according to the physical information, that is, returns to S21.
In other embodiments, the preset value may also be set to other values, such as 70% or 75%; the specific value may be set as needed and is not limited in this application.
In this way, when the matching degree between the identification information and the information in the pre-stored database is low, the physical information of the terminal device is re-acquired and the step of confirming the identification information according to the physical information is repeated, which improves the accuracy of the matching between the identification information and the pre-stored database and hence the accuracy of the cloud server's determination.
Based on the third embodiment, as shown in FIG. 5, which is a schematic flowchart of the fifth embodiment of the present invention, after S24 the method further includes:
s26, acquiring the real position of each terminal device, and comparing the confirmed scene position with the real position of the terminal device;
s27, if the scene position is in the preset area of the real position, taking the confirmed scene position as the scene position of the terminal device;
and S28, if the scene position is outside the preset area of the real position, taking the real position as the scene position of the terminal equipment, and storing the real position.
In an embodiment, the real position of each terminal device is obtained, and the confirmed scene position is compared with the real position of the terminal device. Specifically, the real position may be confirmed by a GPS module of the terminal device or a GPS module of another terminal device, and after the real position of each terminal device is confirmed, the confirmed scene position may be compared with the real position of the terminal device.
In an embodiment, if the scene position is within a preset area of the real position, the confirmed scene position is used as the scene position where the terminal device is located. At this time, the judgment result of the cloud server is more accurate, that is, the cloud server stores the confirmed scene position, so that the cloud server can quickly and accurately judge when the judgment is performed twice or more.
In an embodiment, if the scene position is outside the preset area of the real position, the real position is used as the scene position where the terminal device is located, and the real position is stored. At this time, the judgment result of the cloud server is inaccurate, that is, the cloud server takes the real position as the scene position where the terminal device is located, and stores the real position, so that the cloud server can quickly and accurately judge when the judgment is performed for two times or more.
In an embodiment, the preset area is an area within 5m from the real position, that is, if the scene position is within 5m of the real position, the judgment of the cloud server is more accurate, and the judgment method of the cloud server does not need to be updated; if the scene position is in an area outside the real position 5m, the determination of the cloud server is inaccurate, and the determination method of the cloud server needs to be updated, and of course, the determination method of the cloud server may be manually updated by a user, or the determination method may be automatically updated according to the cloud server, which is not limited herein.
In an embodiment, the preset area may also be an area from the real position X, where the value of X may be set according to specific needs, and is not limited herein.
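A minimal sketch of this comparison, assuming the positions are (latitude, longitude) pairs and measuring the separation with the haversine formula; the 5 m radius is the example preset area given above.

```python
import math

def within_preset_area(scene_pos, real_pos, radius_m=5.0):
    (lat1, lon1), (lat2, lon2) = scene_pos, real_pos
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    distance_m = 2 * 6_371_000 * math.asin(math.sqrt(a))   # Earth radius of roughly 6371 km
    return distance_m <= radius_m

def resolve_scene_position(scene_pos, real_pos):
    # S27: inside the preset area, keep the confirmed scene position;
    # S28: outside it, fall back to the real position (which would then be stored).
    return scene_pos if within_preset_area(scene_pos, real_pos) else real_pos
```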
The real position of each terminal device is acquired, and the confirmed scene position is compared with the real position of the terminal device; if the scene position is within the preset area around the real position, the confirmed scene position is taken as the scene position of the terminal device; and if the scene position is outside the preset area around the real position, the real position is taken as the scene position of the terminal device and stored. In this way, the real position can be compared with the confirmed scene position, and whether the determination result of the cloud server is accurate is decided according to the comparison result, so as to decide whether the determination method of the cloud server should be updated, thereby improving the accuracy of the cloud server's determinations.
Based on the first embodiment, as shown in FIG. 6, which is a schematic flowchart of the sixth embodiment of the present invention, after S20 the method further includes:
s30, feeding back the confirmation results to the terminal devices;
in this step, when the cloud server confirms whether the terminal devices are in the same scene, the confirmation result is fed back to each terminal device, where the confirmation information includes location information and time information of the terminal device, and may also include, but is not limited to, wireless connection information and information related to other terminal devices, and the like, which is not limited in this application.
Specifically, the terminal device includes many types of terminal devices, and in an embodiment, the terminal device may be a terminal device at a student end, or may be a terminal device at a teacher end, or a terminal device at a home terminal, so that the students, the teachers, and the parents can all obtain more accurate scene determination method information, and thus the terminal device of the teacher or the parents can monitor whether the terminal devices of the students or the children are in the same scene, so as to obtain the learning conditions of the students or the children, for example, the terminal devices of the students or the children are all in an area where the terminal devices of the students or the children are located in the year, and when the cloud server confirms that the terminal devices of the students or the children are in the area where the terminal devices of the students or the parents are located in the year, the terminal devices of the teachers or the parents can monitor the area where the terminal devices of the students or the children are located.
Of course, when the confirmation information contains the association information with the other terminal device, the terminal device may implement course or courseware sharing or the like with the other terminal device.
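The fed-back confirmation result could be packaged along the following lines; the field names are illustrative assumptions, the patent only requiring location and time information plus, optionally, wireless connection and association information.

```python
def build_confirmation(scene_position, timestamp, peers_in_scene=(), wireless=None):
    return {
        "scene_position": scene_position,               # e.g. "grade-1-area"
        "time": timestamp,
        "devices_in_same_scene": list(peers_in_scene),  # association information enabling course/courseware sharing
        "wireless_connection": wireless,                # optional wireless connection details
    }

result = build_confirmation("grade-1-area", "2018-11-07T08:00:00",
                            peers_in_scene=["pen-002", "pen-003"])
```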
In addition, this embodiment is not limited to the field of student management and may also be applied to other fields such as company employee management, which is not limited in this application.
In this way, the scene determination information can be fed back to the terminal devices, so that the information can be shared and managed intelligently.
Based on all the above embodiments, an embodiment of the present invention further provides a cloud server, where the cloud server includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the steps of the scene determination methods of the terminal device in all the above embodiments are implemented. For the specific steps, refer to the above embodiments, which are not repeated here.
Based on all the above embodiments, an embodiment of the present application further provides a storage medium having a computer program stored thereon; when the computer program is executed by a processor, the steps of the scene determination methods of the terminal device in all the above embodiments are implemented. For the specific steps, refer to the above embodiments, which are not repeated here.
In addition, the storage medium described above may be any combination of one or more media. The medium may be a signal medium or a storage medium. The storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the storage medium include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a flash memory, an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A signal medium may include a propagated data signal with program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A signal medium may also be any medium that is not a storage medium and that can communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
Program code embodied on a storage medium may be transmitted over any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
The above description is a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (8)

1. A scene determination method for a terminal device, the scene determination method comprising:
acquiring physical information uploaded to the cloud by each terminal device, wherein the physical information comprises at least one of wireless connection information, position information, and association information with other terminal devices;
confirming identification information of the terminal device according to the physical information;
matching the identification information with information in a pre-stored database according to a preset algorithm;
if the matching degree between the identification information and the information in the pre-stored database is greater than a preset value, confirming a scene position corresponding to the identification information;
and taking the confirmed scene position as the scene position of the terminal device, and confirming whether the terminal devices are in the same scene.
2. The scene determination method for a terminal device according to claim 1, wherein the step of acquiring the physical information uploaded to the cloud by each terminal device comprises:
sending an information request instruction to each terminal device;
and acquiring, according to the information request instruction, the physical information uploaded to the cloud by each terminal device.
3. The scene determination method for a terminal device according to claim 2, wherein after the step of acquiring the physical information uploaded to the cloud by each terminal device, the method further comprises:
classifying the physical information uploaded to the cloud by the terminal devices to confirm the association among the terminal devices.
4. The scene determination method for a terminal device according to claim 1, wherein after the step of matching the identification information with the information in the pre-stored database, the method further comprises:
if the matching degree between the identification information and the information in the pre-stored database is smaller than or equal to the preset value, re-acquiring the physical information of the terminal device, and returning to the step of confirming the identification information of the terminal device according to the physical information.
5. The scene determination method for a terminal device according to claim 1, wherein the step of taking the confirmed scene position as the scene position of the terminal device comprises:
acquiring the real position of each terminal device, and comparing the confirmed scene position with the real position of the terminal device;
if the scene position is within a preset area around the real position, taking the confirmed scene position as the scene position of the terminal device;
and if the scene position is outside the preset area around the real position, taking the real position as the scene position of the terminal device, and storing the real position.
6. The scene determination method for a terminal device according to claim 1, wherein after the step of taking the confirmed scene position as the scene position of the terminal device and confirming whether the terminal devices are in the same scene, the method further comprises:
feeding back the confirmation result to each terminal device.
7. A cloud server, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the scene determination method for a terminal device according to any one of claims 1 to 6.
8. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the scene determination method for a terminal device according to any one of claims 1 to 6.
CN201811322846.1A 2018-11-07 2018-11-07 Scene determination method for terminal device, cloud server, and storage medium Active CN109302684B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811322846.1A CN109302684B (en) 2018-11-07 2018-11-07 Scene determination method for terminal device, cloud server, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811322846.1A CN109302684B (en) 2018-11-07 2018-11-07 Scene determination method for terminal device, cloud server, and storage medium

Publications (2)

Publication Number Publication Date
CN109302684A CN109302684A (en) 2019-02-01
CN109302684B true CN109302684B (en) 2020-08-11

Family

ID=65145170

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811322846.1A Active CN109302684B (en) 2018-11-07 2018-11-07 Scene determination method for terminal device, cloud server, and storage medium

Country Status (1)

Country Link
CN (1) CN109302684B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113811901B (en) * 2019-06-21 2024-03-08 三菱电机楼宇解决方案株式会社 Position confirmation system for equipment

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104050177B (en) * 2013-03-13 2018-12-28 腾讯科技(深圳)有限公司 Streetscape generation method and server
CN106486124A (en) * 2015-08-28 2017-03-08 中兴通讯股份有限公司 A kind of method of speech processes and terminal
CN106250435B (en) * 2016-07-26 2019-12-06 广东石油化工学院 user scene identification method based on mobile terminal noise map
CN106911563A (en) * 2017-02-21 2017-06-30 苏州亮磊知识产权运营有限公司 Same scene based reminding method based on mobile terminal shooting picture and position verification
CN107166645B (en) * 2017-05-18 2019-07-02 厦门瑞为信息技术有限公司 A kind of air conditioning control method based on indoor scene analysis
CN107770054A (en) * 2017-11-01 2018-03-06 上海掌门科技有限公司 Chat creation method and equipment under a kind of same scene
CN108445520A (en) * 2017-12-25 2018-08-24 达闼科技(北京)有限公司 A kind of indoor and outdoor based on high in the clouds builds drawing method, device, electronic equipment and computer program product
CN108415683A (en) * 2018-03-07 2018-08-17 深圳车盒子科技有限公司 More scene voice householder methods, intelligent voice system, equipment and storage medium
CN108460138A (en) * 2018-03-09 2018-08-28 北京小米移动软件有限公司 Music recommends method, apparatus, equipment and storage medium

Also Published As

Publication number Publication date
CN109302684A (en) 2019-02-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant