CN113723372B - Prompting method and device, computer equipment and computer readable storage medium - Google Patents


Info

Publication number
CN113723372B
CN113723372B (application CN202111280172.5A)
Authority
CN
China
Prior art keywords
sensor
trigger level
signal
time
target object
Prior art date
Legal status
Active
Application number
CN202111280172.5A
Other languages
Chinese (zh)
Other versions
CN113723372A (en
Inventor
何至军
曹丽娜
李向灿
Current Assignee
Beijing Longzhi Digital Technology Service Co Ltd
Original Assignee
Shanghai Zhuohan Technology Co ltd
Beijing Zhuojianzhihan Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Zhuohan Technology Co ltd, Beijing Zhuojianzhihan Technology Co ltd filed Critical Shanghai Zhuohan Technology Co ltd
Priority to CN202111280172.5A priority Critical patent/CN113723372B/en
Publication of CN113723372A publication Critical patent/CN113723372A/en
Application granted granted Critical
Publication of CN113723372B publication Critical patent/CN113723372B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G08 — SIGNALLING
    • G08B — SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 — Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/18 — Status alarms
    • G08B 21/24 — Reminder alarms, e.g. anti-loss alarms
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 — Aspects of pattern recognition specially adapted for signal processing

Landscapes

  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Emergency Alarm Devices (AREA)
  • Train Traffic Observation, Control, And Security (AREA)

Abstract

The disclosure relates to the field of computer technology and provides a prompting method and apparatus. In the method, a plurality of sensors are arranged at different positions of a target object; the trigger level signals acquired by the sensors, together with the time identifiers corresponding to those signals, are obtained; signal acquisition information is determined from all the trigger level signals and their time identifiers; a user behavior type corresponding to the target object is determined from the signal acquisition information; and if the user behavior type is a preset behavior type, prompt information is output. The method can thus remind the user in real time to stop performing an action of that behavior type, avoiding accidents caused by improper operation. Real-time supervision of the actions a user performs on the target object is thereby achieved, reducing the risk of dangerous accidents and improving the user experience.

Description

Prompting method and device, computer equipment and computer readable storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and an apparatus for prompting, a computer device, and a computer-readable storage medium.
Background
When a user operates an existing track-type object, improper use carries certain risks. For example, where the track-type object is a slide, a child may move along the slide against its intended direction (e.g., climbing up the sliding surface) while no parent or teacher is watching, leading to an accident caused by reverse climbing.
Conventionally, to avoid accidents caused by improper operation of a track-type object, a warning sign is attached to the object in the hope that it will warn or remind users. This approach has significant limitations, however, and accidents caused by improper operation remain difficult to avoid, so a new prompting scheme for track-type objects is needed.
Disclosure of Invention
In view of this, the present disclosure provides a prompting method, an apparatus, a computer device, and a computer-readable storage medium, to address the difficulty in the prior art of avoiding accidents caused by improper operation when a user uses a track-type object.
In a first aspect of the embodiments of the present disclosure, a method for prompting is provided, where the method includes:
acquiring trigger level signals respectively acquired by a plurality of sensors and time marks corresponding to the trigger level signals; wherein the plurality of sensors are respectively arranged at different positions of the target object;
determining signal acquisition information according to all trigger level signals and time marks corresponding to all trigger level signals;
determining a user behavior type corresponding to the target object according to the signal acquisition information;
and if the user behavior type is a preset behavior type, outputting prompt information.
In a second aspect of the embodiments of the present disclosure, a prompting device is provided, where the device includes:
the information acquisition module is used for acquiring trigger level signals respectively acquired by a plurality of sensors and time marks corresponding to the trigger level signals; wherein the plurality of sensors are respectively arranged at different positions of the target object;
the information determining module is used for determining signal acquisition information according to all the trigger level signals and the time identifiers corresponding to all the trigger level signals;
the type determining module is used for determining the user behavior type corresponding to the target object according to the signal acquisition information;
and the information output module is used for outputting prompt information if the user behavior type is a preset behavior type.
In a third aspect of the embodiments of the present disclosure, a computer device is provided, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the above method when executing the computer program.
In a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, which stores a computer program, which when executed by a processor, implements the steps of the above-mentioned method.
Compared with the prior art, the embodiments of the present disclosure have the following beneficial effects: a plurality of sensors are arranged at different positions of a target object, and the trigger level signals acquired by the sensors, together with their corresponding time identifiers, are obtained; signal acquisition information is then determined from all the trigger level signals and their time identifiers; a user behavior type corresponding to the target object is determined from the signal acquisition information; and if that type is a preset behavior type, prompt information is output. The type of action a user performs on the target object can thus be identified from the information collected by the preset sensors, and when the identified behavior type is the preset type, prompt information is output to remind the user in real time to stop the action, avoiding accidents caused by improper operation. Real-time supervision of the user's actions on the target object is thereby achieved, reducing the risk of dangerous accidents and improving the user experience.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed for the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and other drawings can be obtained by those skilled in the art without inventive efforts.
Fig. 1 is a scene schematic diagram of an application scenario provided by an embodiment of the present disclosure;
FIG. 2 is a flow chart of a prompting method provided by an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an architecture of a prompting system provided by an embodiment of the present disclosure;
FIG. 4 is a signal schematic of a sinusoidal signal received by a sensor provided by an embodiment of the present disclosure;
FIG. 5 is a scene schematic diagram of a slide application scenario provided by an embodiment of the present disclosure;
FIG. 6 is a scene schematic diagram of a slide application scenario provided by an embodiment of the present disclosure;
FIG. 7 is a scene schematic diagram of a slide application scenario provided by an embodiment of the present disclosure;
FIG. 8 is a scene schematic diagram of a slide application scenario provided by an embodiment of the present disclosure;
FIG. 9 is a block diagram of a prompting device provided by an embodiment of the present disclosure;
fig. 10 is a schematic diagram of a computer device provided by an embodiment of the disclosure.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to one skilled in the art that the present disclosure may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present disclosure with unnecessary detail.
A presentation method and apparatus according to an embodiment of the present disclosure will be described in detail below with reference to the accompanying drawings.
In the prior art, to avoid accidents caused by improper operation when a user uses a track-type object, a warning sign is attached to the object in the hope of warning or reminding the user. This approach has significant limitations, however, and accidents caused by improper operation remain difficult to avoid.
To solve the above problems, the present disclosure provides a prompting method in which a plurality of sensors are arranged at different positions of a target object, and the trigger level signals acquired by the sensors, together with their corresponding time identifiers, are obtained; signal acquisition information is then determined from all the trigger level signals and their time identifiers; a user behavior type corresponding to the target object is determined from the signal acquisition information; and if that type is a preset behavior type, prompt information is output. The type of action a user performs on the target object can thus be identified from the information collected by the preset sensors, and a prompt is output when the identified behavior type matches the preset type, reminding the user in real time to stop the action and avoiding accidents caused by improper operation. Real-time supervision of the user's actions on the target object is thereby achieved, reducing the risk of dangerous accidents and improving the user experience.
For example, the embodiments of the present disclosure may be applied to an application scenario as shown in fig. 1. This scenario may include a target object 1, a plurality of sensors 2, and a device 3. The device 3 may be a server or a terminal device, i.e., an electronic device with data-processing capability such as a laptop, a desktop computer, a tablet computer, or a smartphone. The target object may be a track-type object or another object on which the user can move.
Specifically, in the application scenario shown in fig. 1, the plurality of sensors 2 may be arranged at different positions on the target object 1. When the user passes the position at which a sensor 2 is installed, that sensor acquires a trigger level signal; the device 3 obtains the trigger level signals acquired by the sensors 2 and marks each with a corresponding time identifier. The device 3 then determines the signal acquisition information from all the trigger level signals and their time identifiers, determines the user behavior type corresponding to the target object from that information, and, if the user behavior type is a preset behavior type, outputs a prompt message. In this way, when the identified user behavior type is the preset behavior type, the device 3 can remind the user in real time to stop the action, avoiding accidents caused by improper operation, realizing real-time supervision of the user's actions on the target object, reducing the risk of dangerous accidents, and improving the user experience.
It should be noted that the specific type, number, and combination of the devices in the application scenario may be adjusted according to actual requirements; the embodiments of the present disclosure do not limit this.
It should be noted that the above application scenarios are only illustrated for the convenience of understanding the present disclosure, and the embodiments of the present disclosure are not limited in any way in this respect. Rather, embodiments of the present disclosure may be applied to any scenario where applicable.
Fig. 2 is a flowchart of a prompting method provided in an embodiment of the present disclosure. One prompting method of fig. 2 may be performed by device 3 of fig. 1. As shown in fig. 2, the prompting method includes:
s201: acquiring trigger level signals respectively acquired by a plurality of sensors and time marks corresponding to the trigger level signals.
In this embodiment, the target object is an object on which the user can move; in one implementation it may be a track-type object such as a slide, a sliding track, or a crawling cableway. A plurality of sensors may be disposed on the target object, each at a different position. When the user passes the position at which a sensor is installed, the sensor acquires a trigger level signal; conversely, a sensor acquiring a trigger level signal indicates that a user has passed that sensor's position.
After the trigger level signal acquired by a sensor is obtained, a corresponding time identifier can be attached to it according to its acquisition time, so that the moment at which the user passed the sensor's position is recorded. In one implementation, the time identifier may be a specific time, such as 14:05:01, or a sequence-number identifier, such as "1" or "2", which indicates the acquisition order of the trigger level signals.
The system architecture of fig. 3 is taken as an example. The sensor array unit (i.e., the sensors) may include a plurality of electromagnetic-wave sensors; it transmits microwave signals and synchronously receives echo signals of the same amplitude reflected back from the space. The received spatial signals are converted from sinusoidal signals into digital level signals (i.e., trigger level signals) and transmitted to the data buffer unit. The data buffer unit is a temporary signal store responsible for storing and counting trigger level signals within a certain time; after the trigger level signals have accumulated for a preset time, the data buffer unit attaches a timestamp mark (i.e., a time identifier) to the trigger level signals of that period.
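The buffering-and-timestamping behavior described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name, record shape, and five-second window are assumptions.

```python
import time
from collections import deque

class DataBuffer:
    """Temporary store that tags each trigger level signal with a time
    identifier, mirroring the data buffer unit described above."""

    def __init__(self, window_seconds=5.0):
        self.window = window_seconds
        self.records = deque()  # (sensor_id, level, time_identifier)

    def push(self, sensor_id, level, timestamp=None):
        # Attach a time identifier on arrival; a monotonic clock stands in
        # for the buffer unit's timestamp mark.
        ts = time.monotonic() if timestamp is None else timestamp
        self.records.append((sensor_id, level, ts))
        self._evict(ts)

    def _evict(self, now):
        # Keep only the trigger level signals accumulated within the preset
        # time window ("within a certain time" in the description above).
        cutoff = now - self.window
        while self.records and self.records[0][2] < cutoff:
            self.records.popleft()
```

A downstream unit would then read `records` as the basis for the signal acquisition information.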
S202: and determining signal acquisition information according to all the trigger level signals and the time marks corresponding to all the trigger level signals.
In this embodiment, since the trigger level signals within a period of time reflect the user's behavior on the target object, information related to the trigger level signals within that period may serve as the signal acquisition information; for example, the information related to all trigger level signals within five seconds may be used. This related information concerns the acquisition of each trigger level signal and may include, for example, the time identifier corresponding to the signal and the location information of the sensor that acquired it.
Continuing with the system architecture of fig. 3: after the data buffer unit attaches a timestamp mark (i.e., a time identifier) to the trigger level signals of a time period, it may determine the signal acquisition information from all the trigger level signals and their corresponding time identifiers and send that information to the data analysis unit.
S203: and determining the user behavior type corresponding to the target object according to the signal acquisition information.
In this embodiment, after the signal acquisition information is obtained, it may be analyzed to extract information reflecting the form of the user's behavior on the target object, from which the user behavior type corresponding to the target object can be determined. The user behavior type is the category of that behavior, for example stationary, moving in a certain direction, moving slowly, or moving quickly.
As an example, information about the form of the user's behavior on the target object, such as whether the user is moving, in which direction, and how fast, may be derived from the time identifiers of the trigger level signals in the signal acquisition information and the position information of the sensors that acquired them, and the user behavior type corresponding to the target object determined accordingly.
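The direction-and-speed analysis above can be sketched as a small classifier. All names, the event format, and the 5-second gap are illustrative assumptions, not taken from the patent.

```python
def classify_behavior(events, positions, max_gap=5.0):
    """Infer a user behavior type from timestamped trigger events.

    events: list of (sensor_id, timestamp) pairs, in any order.
    positions: dict mapping sensor_id to distance from the track's start.
    Returns "forward", "reverse", or "stationary".
    """
    # Order triggers by their time identifiers (earliest first).
    ordered = sorted(events, key=lambda e: e[1])
    if len(ordered) < 2:
        return "stationary"
    (first, t0), (last, t1) = ordered[0], ordered[-1]
    # Triggers too far apart, or repeated hits on one position, imply no movement.
    if t1 - t0 > max_gap or positions[first] == positions[last]:
        return "stationary"
    # Movement toward increasing distance from the start is forward motion.
    return "forward" if positions[last] > positions[first] else "reverse"
```

Speed could be estimated analogously as `(positions[last] - positions[first]) / (t1 - t0)`.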
Continuing with the system architecture of fig. 3: after the data analysis unit receives the signal acquisition information, it analyzes that information, determines the user behavior type corresponding to the target object, and sends the type to the core processing unit. In one implementation, both the data analysis unit and the core processing unit may be MCUs.
S204: and if the user behavior type is a preset behavior type, outputting prompt information.
In this embodiment, the preset behavior type is a behavior type that warrants a reminder; in one implementation, it may be a behavior type likely to cause accidents. For example, when the target object is a slide, moving along the slide in the reverse direction easily leads to collisions, so reverse movement may be set as the preset behavior type. The preset behavior type may be configured in advance according to actual requirements.
Specifically, after the user behavior type is obtained, whether it is a preset behavior type can be judged. If the user behavior type determined from the signal acquisition information is a preset behavior type, indicating that the user's behavior on the target object needs a reminder, a prompt message for that behavior type can be generated and output. In this embodiment, the prompt message may take at least one of the following forms: a voice message, a text message, a picture message, or a somatosensory message (for example, vibration of the target object or of a device carried by the user); other forms are also possible and are not described again here. Specifically: a voice message may be delivered by voice broadcast or a buzzer; a text message may be pushed to the terminal device of a track manager or of a supervisor of the user acting on the target object; a picture message may likewise be pushed to the terminal device of a track manager or supervisor; and for a somatosensory message, the terminal device of a track manager, of a supervisor, or of the user acting on the target object can be made to vibrate, achieving the reminding effect.
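The check-and-output step above can be sketched as follows. The payload structure, channel names, and message texts are illustrative assumptions; only the "match preset type, then emit prompts on several channels" logic comes from the description.

```python
def build_prompt(behavior, preset=("reverse",)):
    """Return prompt payloads when the behavior matches a preset type,
    mirroring the voice / text / haptic message forms listed above."""
    if behavior not in preset:
        # No reminder is needed for allowed behavior types.
        return []
    return [
        {"channel": "voice", "content": "Please do not move against the track direction."},
        {"channel": "text", "to": "track_manager", "content": "Reverse movement detected."},
        {"channel": "haptic", "to": "supervisor_device", "content": "vibrate"},
    ]
```

A core processing unit would hand the returned payloads to a communication unit for delivery.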
Continuing with the system architecture of fig. 3: after the core processing unit obtains the user behavior type, it analyzes whether the type is a preset behavior type; if so, it generates a prompt message according to the user behavior type and sends it to the communication unit, which forwards the prompt message over the Internet of Things to the reminding device, and the reminding device issues the prompt accordingly.
Compared with the prior art, the embodiments of the present disclosure have the following beneficial effects: a plurality of sensors are arranged at different positions of a target object, and the trigger level signals acquired by the sensors, together with their corresponding time identifiers, are obtained; signal acquisition information is then determined from all the trigger level signals and their time identifiers; a user behavior type corresponding to the target object is determined from the signal acquisition information; and if that type is a preset behavior type, prompt information is output. The type of action a user performs on the target object can thus be identified from the information collected by the preset sensors, and when the identified behavior type is the preset type, prompt information is output to remind the user in real time to stop the action, avoiding accidents caused by improper operation. Real-time supervision of the user's actions on the target object is thereby achieved, reducing the risk of dangerous accidents and improving the user experience.
Next, a specific implementation of S201, i.e., "acquiring the trigger level signals respectively acquired by the plurality of sensors and the time identifiers corresponding to the trigger level signals", is described. In one implementation, S201 may include: for each sensor, if the sine wave signal received by the sensor satisfies a signal trigger condition, generating a trigger level signal and the time identifier corresponding to that trigger level signal.
In this embodiment, after a sensor receives a sine wave signal, it may determine whether the signal satisfies the signal trigger condition; if so, indicating that a user is passing the sensor's location, a trigger level signal and its corresponding time identifier are generated. The sensor in this embodiment may be an antenna module with an integrated transceiver, which transmits a microwave signal through its antenna and synchronously receives an echo of the same amplitude reflected back from the space. When a user is present at the antenna module's position, the module receives such an echo and converts the received spatial signal from a sine wave into a digital level signal (i.e., a trigger level signal); when a user is detected at that position, the trigger level signal may be a high-level signal.
It is understood that the signal trigger condition is that the product of the frequency of the sinusoidal signal received by the sensor and the acquisition time equals the corresponding product for the signal emitted by the sensor. As shown in fig. 4, signals received at different antenna modules and reflected over different spatial distances incur a time error, so the sinusoidal signal received by the sensor may be written (reconstructed here from the surrounding description; the original rendered formula images are unavailable) as

    y = A·sin(ωt + φ) + ε

where y is the echo signal, A is the amplitude, ω is the frequency, t is the acquisition time, ωt is the product of frequency and acquisition time, φ is the signal offset introduced when the signal loops back, and ε is spatial noise and interference. When the product ωt for the received sinusoidal signal equals the corresponding product ω₀t₀ for the signal emitted by the sensor, i.e. when

    ωt = ω₀t₀,

the sine wave signal received by the sensor satisfies the signal trigger condition, and a trigger level signal and its corresponding time identifier are generated: the user passing the sensor's position causes the sensor to receive the sine wave signal, convert it into a level signal to obtain the trigger level signal, and record the corresponding time identifier.
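The echo model and trigger condition can be sketched numerically. This is an illustrative reconstruction: the function names are invented, and the tolerance is an assumption (the description states exact equality, which a noisy measurement would satisfy only approximately).

```python
import math

def echo(amplitude, omega, t, phi, noise=0.0):
    """Echo model y = A*sin(omega*t + phi) + epsilon, reconstructed from the
    description of the received sinusoidal signal."""
    return amplitude * math.sin(omega * t + phi) + noise

def is_triggered(omega, t, omega_emit, t_emit, tol=1e-3):
    """Signal trigger condition: the frequency-time product of the received
    signal matches that of the emitted signal, within a small tolerance."""
    return abs(omega * t - omega_emit * t_emit) <= tol
```

When `is_triggered` holds, the module would emit a high-level trigger signal and record its time identifier.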
Next, a specific implementation of S202, that is, "determining signal acquisition information according to all trigger level signals and the time stamp corresponding to each trigger level signal" will be described. Specifically, in one implementation, S202 may include:
s202 a: and sequencing all the trigger level signals according to the time marks corresponding to all the trigger level signals to obtain a signal sequencing result.
After all the trigger level signals and their corresponding time identifiers are obtained, the trigger level signals may be sorted by those time identifiers, for example from the earliest time to the latest, i.e., in the order of their time identifiers from front to back, to obtain the signal sorting result.
For example, assuming that the time identifier corresponding to the trigger level signal a is sequence number identifier "1" and the time identifier corresponding to the trigger level signal B is sequence number identifier "2", the larger the number of the sequence number identifier is, the later the time corresponding to the sequence number identifier is, and therefore, the obtained signal sorting result may be trigger level signal a → trigger level signal B.
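The sorting step can be sketched as follows. The record shape is an assumption; the key function handles both identifier forms mentioned earlier (sequence numbers like "1" and clock times like "14:05:01"), since numeric strings would otherwise mis-sort lexicographically ("10" before "2").

```python
def time_key(time_id):
    """Order numeric sequence identifiers numerically and clock-style
    identifiers (HH:MM:SS) lexicographically."""
    s = str(time_id)
    return (0, int(s)) if s.isdigit() else (1, s)

def sort_signals(signals):
    # Order all trigger level signals by their time identifiers, earliest
    # first, producing the signal sorting result.
    return sorted(signals, key=lambda rec: time_key(rec["time_id"]))
```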
S202 b: and determining signal acquisition information according to the signal sequencing result and the position identification of the sensor corresponding to each trigger level signal.
The signal sequencing result and the position identification of the sensor corresponding to each trigger level signal can reflect the behavior form of the user on the target object. Therefore, in this embodiment, the position identifier of the sensor corresponding to each trigger level signal may also be obtained; the position identifier of the sensor is identification information that can reflect the position of the sensor on the target object, and for example, the position identifier may be coordinates of the sensor. Therefore, the signal sequencing result and the position identification of the sensor corresponding to each trigger level signal can be used as signal acquisition information.
Next, a specific implementation manner of S203, namely, "determining the user behavior type corresponding to the target object according to the signal acquisition information" will be introduced. Specifically, in one implementation, S203 may include: and determining the user behavior type corresponding to the target object according to the signal sequencing result, the time identifier corresponding to each trigger level signal and the position identifier of the sensor.
In this embodiment, determination of the user behavior type is explained taking a track-type object as the target object. The sensors may be disposed at the bottom or side of the track-type object; for example, if the track-type object is a slide, the sensors may be placed at the bottom or side of the slide to detect whether the user passes the positions where they are installed. In one implementation, the distance between adjacent sensors may be 60 cm. If the target object is a track-type object, the user behavior types may include track forward movement, track reverse movement, and track stop, and the preset behavior type may be track reverse movement.
In one implementation, the plurality of sensors disposed on the target object may include a first sensor and a second sensor, and the signal sorting result is obtained by sorting all the trigger level signals according to the time marks of the trigger level signals from front to back.
If the signal sorting result indicates that the trigger level signal of the first sensor is located before the trigger level signal of the second sensor (i.e. the acquisition time of the trigger level signal of the first sensor is earlier than that of the second sensor), the time interval between the time identifiers of the two trigger level signals is less than a preset time length (for example, 5 seconds), and the position identifier of the first sensor is closer to the starting position of the target object than that of the second sensor (i.e. the first sensor is closer to the starting position of the track than the second sensor), it can be determined that the user behavior type corresponding to the target object is track forward movement.
Next, as illustrated in fig. 5, assume that the target object is a slide, the first sensor A is disposed at a position whose identifier is a, and the second sensor B is disposed at a position whose identifier is b. The signal sorting result is that the trigger level signal of the first sensor A is located before the trigger level signal of the second sensor B (that is, the acquisition time of the trigger level signal of sensor A is earlier than that of sensor B), and the time interval between the time identifiers of the two trigger level signals is less than the preset time length. Since the position identifier a of the first sensor A is closer to the starting position of the slide than the position identifier b of the second sensor B, it can be determined that the behavior of the user on the slide is sliding from the starting position to the ending position, and thus that the user behavior type corresponding to the slide is track forward movement.
If the signal sorting result indicates that the trigger level signal of the first sensor is located after the trigger level signal of the second sensor (i.e. the acquisition time of the trigger level signal of the first sensor is later than that of the second sensor), the time interval between the time identifiers of the two trigger level signals is less than the preset time length (for example, 5 seconds), and the position identifier of the first sensor is closer to the starting position of the target object than that of the second sensor (i.e. the first sensor is closer to the starting position of the track than the second sensor), it can be determined that the user behavior type corresponding to the target object is track reverse movement.
Next, as illustrated in fig. 6, assume that the target object is a slide, the first sensor A is disposed at a position whose identifier is a, and the second sensor B is disposed at a position whose identifier is b. The signal sorting result is that the trigger level signal of the first sensor A is located after the trigger level signal of the second sensor B (that is, the acquisition time of the trigger level signal of sensor A is later than that of sensor B), and the time interval between the time identifiers of the two trigger level signals is less than the preset time length. Since the position identifier a of the first sensor A is closer to the starting position of the slide than the position identifier b of the second sensor B, it can be determined that the behavior of the user on the slide is sliding from the ending position to the starting position, and thus that the user behavior type corresponding to the slide is track reverse movement.
If the signal sorting result only includes the trigger level signal of the first sensor or the trigger level signal of the second sensor, or the signal sorting result includes both trigger level signals but the time interval between their time identifiers is greater than or equal to the preset time length, it can be determined that the user behavior type corresponding to the target object is track stop.
Next, an example will be described with reference to figs. 7 and 8. As shown in fig. 7, within the preset time length the signal sorting result only includes the trigger level signal of the first sensor A, and therefore it can be determined that the user behavior type corresponding to the slide is track stop; as shown in fig. 8, the signal sorting result only includes the trigger level signal of the second sensor B, and thus it can likewise be determined that the user behavior type corresponding to the slide is track stop.
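The three decision branches described above (forward movement, reverse movement, track stop) can be summarized in a short illustrative sketch. This is a reconstruction for readability, not the patented implementation; the function name, the enum, the pair representation of a trigger level signal, and the convention that a smaller position value lies closer to the starting position are all assumptions for the example:

```python
from enum import Enum

class UserBehavior(Enum):
    FORWARD = "track forward movement"
    REVERSE = "track reverse movement"
    STOP = "track stop"

def classify_behavior(signals, preset_seconds=5.0):
    """signals: (time_identifier_s, position_identifier) pairs for the trigger level
    signals collected from the first and second sensors; a smaller position value
    is assumed to be closer to the starting position of the track."""
    if len(signals) < 2:
        # Only one of the two sensors fired within the window: track stop.
        return UserBehavior.STOP
    # The "signal sorting result": order the trigger level signals front to back
    # by their time identifiers.
    (t_first, pos_first), (t_second, pos_second) = sorted(signals)[:2]
    if t_second - t_first >= preset_seconds:
        # Triggers separated by at least the preset time length also count as a stop.
        return UserBehavior.STOP
    # The earlier-firing sensor being nearer the start implies forward movement;
    # otherwise the user moved toward the start, i.e. reverse movement.
    return UserBehavior.FORWARD if pos_first < pos_second else UserBehavior.REVERSE
```

For the fig. 5 situation (sensor A at the start triggering 2 seconds before sensor B, positions 0.6 m apart), `classify_behavior([(0.0, 0.0), (2.0, 0.6)])` yields forward movement.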
All the above optional technical solutions may be combined arbitrarily to form the optional embodiments of the present disclosure, and are not described herein again.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 9 is a schematic diagram of a prompting device provided in an embodiment of the present disclosure. As shown in fig. 9, the presentation apparatus includes:
an information obtaining module 901, configured to obtain trigger level signals respectively acquired by multiple sensors and time identifiers corresponding to the trigger level signals; wherein, the plurality of sensors are respectively arranged at different positions of the target object;
an information determining module 902, configured to determine signal acquisition information according to all trigger level signals and time identifiers corresponding to the trigger level signals;
a type determining module 903, configured to determine a user behavior type corresponding to the target object according to the signal acquisition information;
and an information output module 904, configured to output a prompt message if the user behavior type is a preset behavior type.
Optionally, the information obtaining module 901 is configured to:
for each sensor, if the sinusoidal wave signal received by the sensor meets a signal triggering condition, generating a triggering level signal and a time identifier corresponding to the triggering level signal; the signal triggering condition is that the product of the frequency and the time of the sine wave signal is the same as the signal emission frequency value of the sensor.
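Taken literally, the stated signal triggering condition — the product of the sine wave's frequency and time equals the sensor's signal emission frequency value — might be sketched as below. The tolerance, the return convention, and the use of the system clock as the time identifier are assumptions added for the example, not details given by the disclosure:

```python
import math
import time

def maybe_trigger(wave_frequency_hz, elapsed_time_s, emission_frequency_value, rel_tol=1e-6):
    """Return a (trigger_level_signal, time_identifier) pair when the received sine
    wave satisfies the stated condition frequency * time == emission frequency value,
    compared with a small relative tolerance; return None otherwise."""
    if math.isclose(wave_frequency_hz * elapsed_time_s, emission_frequency_value, rel_tol=rel_tol):
        return (1, time.time())  # high level plus the acquisition-time identifier
    return None
```

Under this sketch, a 50 Hz wave observed for 2 seconds triggers a sensor whose signal emission frequency value is 100, and does not trigger one whose value is 99.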
Optionally, the information determining module 902 is configured to:
sequencing all the trigger level signals according to the time marks corresponding to all the trigger level signals to obtain a signal sequencing result;
and determining signal acquisition information according to the signal sequencing result and the position identification of the sensor corresponding to each trigger level signal.
Optionally, the type determining module 903 is configured to:
and determining the user behavior type corresponding to the target object according to the signal sequencing result, the time identifier corresponding to each trigger level signal and the position identifier of the sensor.
Optionally, the target object is a track-type object, and the preset behavior type is track reverse movement.
Optionally, the plurality of sensors includes a first sensor and a second sensor; the signal sorting result is obtained by sorting all the trigger level signals from front to back according to the time marks of all the trigger level signals;
a type determination module 903, configured to:
if the signal sequencing result is that the trigger level signal of the first sensor is located before the trigger level signal of the second sensor, the time interval between the time identifier of the trigger level signal of the first sensor and the time identifier of the trigger level signal of the second sensor is less than the preset time length, and the position identifier of the first sensor is closer to the starting position of the target object than the position identifier of the second sensor, the user behavior type corresponding to the target object is that the track moves in the forward direction;
if the signal sequencing result is that the trigger level signal of the first sensor is behind the trigger level signal of the second sensor, the time interval between the time identifier of the trigger level signal of the first sensor and the time identifier of the trigger level signal of the second sensor is less than the preset time length, and the position identifier of the first sensor is closer to the starting position of the target object than the position identifier of the second sensor, the user behavior type corresponding to the target object is that the track moves reversely;
and if the signal sequencing result only comprises the trigger level signal of the first sensor or the trigger level signal of the second sensor, or the signal sequencing result comprises the trigger level signal of the first sensor and the trigger level signal of the second sensor, and the time interval between the time identifier of the trigger level signal of the first sensor and the time identifier of the trigger level signal of the second sensor is greater than or equal to the preset time length, the user behavior type corresponding to the target object is the track stop.
Optionally, the sensor is an antenna module integrating transceiving.
Compared with the prior art, the embodiments of the disclosure have the following beneficial effects. The embodiment of the disclosure discloses a prompting device, which includes: an information acquisition module for acquiring the trigger level signals respectively acquired by a plurality of sensors and the time identifiers corresponding to the trigger level signals, wherein the plurality of sensors are respectively arranged at different positions of the target object; an information determining module for determining signal acquisition information according to all the trigger level signals and their corresponding time identifiers; a type determining module for determining the user behavior type corresponding to the target object according to the signal acquisition information; and an information output module for outputting prompt information if the user behavior type is a preset behavior type. It can be seen that, in this embodiment, the type of behavior the user performs on the target object can be identified from the information collected by the preset sensors, and when the identified user behavior type is the preset behavior type, prompt information can be output. The user can thus be reminded in real time to stop the behavior, avoiding accidents caused by improper operation; real-time supervision of, and reminders about, the actions the user performs on the target object are thereby realized, which reduces the risk of dangerous accidents and improves the user experience.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present disclosure.
Fig. 10 is a schematic diagram of a computer device 10 provided by an embodiment of the present disclosure. As shown in fig. 10, the computer device 10 of this embodiment includes: a processor 1001, a memory 1002, and a computer program 1003 stored in the memory 1002 and executable on the processor 1001. The steps in the method embodiments described above are implemented when the processor 1001 executes the computer program 1003. Alternatively, the processor 1001 realizes the functions of the modules in the device embodiments described above when executing the computer program 1003.
Illustratively, the computer program 1003 may be partitioned into one or more modules, which are stored in the memory 1002 and executed by the processor 1001 to carry out the present disclosure. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, the instruction segments describing the execution of the computer program 1003 in the computer device 10.
The computer device 10 may be a desktop computer, a notebook computer, a palm computer, a cloud server, or other computer devices. Computer device 10 may include, but is not limited to, a processor 1001 and a memory 1002. Those skilled in the art will appreciate that fig. 10 is merely an example of a computer device 10 and is not intended to limit computer device 10 and that computer device 10 may include more or fewer components than shown, or some of the components may be combined, or different components, e.g., the computer device may also include input output devices, network access devices, buses, etc.
The Processor 1001 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 1002 may be an internal storage module of the computer device 10, such as a hard disk or a memory of the computer device 10. The memory 1002 may also be an external storage device of the computer device 10, such as a plug-in hard disk provided on the computer device 10, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, memory 1002 may also include both internal and external memory modules of computer device 10. The memory 1002 is used for storing computer programs and other programs and data required by the computer apparatus. The memory 1002 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the division of the above-mentioned functional modules is illustrated as an example; in practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above. The functional modules in the embodiments may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module; the integrated module may be implemented in the form of hardware or in the form of a software functional module. In addition, the specific names of the functional modules are only used to distinguish them from one another and are not used to limit the protection scope of the present disclosure. For the specific working processes of the modules in the system described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus/computer device and method may be implemented in other ways. For example, the apparatus/computer device embodiments described above are merely illustrative: the division of modules is merely a logical functional division, and there may be other divisions in actual implementation; multiple modules or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or modules, and may be in electrical, mechanical, or other form.
Modules described as separate components may or may not be physically separate, and components displayed as modules may or may not be physical modules; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional modules in the embodiments of the present disclosure may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
The integrated modules, if implemented in the form of software functional modules and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, the present disclosure may implement all or part of the flow of the methods in the above embodiments by instructing the relevant hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, may implement the steps of the method embodiments described above. The computer program may comprise computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like.
The above examples are only intended to illustrate the technical solutions of the present disclosure, not to limit them; although the present disclosure has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present disclosure, and are intended to be included within the scope of the present disclosure.

Claims (7)

1. A method for prompting, the method comprising:
acquiring trigger level signals respectively acquired by a plurality of sensors and time marks corresponding to the trigger level signals; the sensors are respectively arranged at different positions of a target object, the sensors comprise a first sensor and a second sensor, and the target object is a rail object;
determining signal acquisition information according to all trigger level signals and time marks corresponding to all trigger level signals;
determining a user behavior type corresponding to the target object according to the signal acquisition information;
if the user behavior type is a preset behavior type, outputting prompt information;
the acquiring of the trigger level signals respectively acquired by the plurality of sensors and the time identifier corresponding to the trigger level signals includes:
for each sensor, if the sinusoidal wave signal received by the sensor meets a signal triggering condition, generating a triggering level signal and a time identifier corresponding to the triggering level signal; the signal triggering condition is that the product of the frequency and the time of the sine wave signal is the same as the signal emission frequency value of the sensor;
the determining signal acquisition information according to each trigger level signal and the time identifier corresponding to each trigger level signal includes:
sequencing all the trigger level signals according to the time marks corresponding to all the trigger level signals to obtain a signal sequencing result, wherein the signal sequencing result is obtained by sequencing all the trigger level signals according to the time marks of all the trigger level signals from front to back;
determining signal acquisition information according to the signal sequencing result and the position identification of the sensor corresponding to each trigger level signal;
the determining the user behavior type corresponding to the target object according to the signal acquisition information includes:
if the signal sequencing result indicates that the trigger level signal of the first sensor is located before the trigger level signal of the second sensor, the time interval between the time identifier of the trigger level signal of the first sensor and the time identifier of the trigger level signal of the second sensor is less than a preset time length, and the position identifier of the first sensor is closer to the starting position of the target object than the position identifier of the second sensor, the user behavior type corresponding to the target object is track forward movement;
and if the signal sequencing result shows that the trigger level signal of the first sensor is behind the trigger level signal of the second sensor, the time interval between the time identifier of the trigger level signal of the first sensor and the time identifier of the trigger level signal of the second sensor is less than a preset time length, and the position identifier of the first sensor is closer to the starting point position of the target object than the position identifier of the second sensor, the user behavior type corresponding to the target object is track reverse movement.
2. The method of claim 1, wherein the predetermined behavior type is track reverse movement.
3. The method according to claim 1, wherein the determining the user behavior type corresponding to the target object according to the signal acquisition information further comprises:
and if the signal sequencing result only comprises the trigger level signal of the first sensor or the trigger level signal of the second sensor, or the signal sequencing result comprises the trigger level signal of the first sensor and the trigger level signal of the second sensor, and the time interval between the time identifier of the trigger level signal of the first sensor and the time identifier of the trigger level signal of the second sensor is greater than or equal to a preset time length, the user behavior type corresponding to the target object is the track stop.
4. A method according to any of claims 1-3, wherein the sensor is a transceiver integrated antenna module.
5. A prompting device, the device comprising:
the information acquisition module is used for acquiring trigger level signals respectively acquired by a plurality of sensors and time marks corresponding to the trigger level signals; the sensors are respectively arranged at different positions of a target object, the sensors comprise a first sensor and a second sensor, and the target object is a rail object;
the information determining module is used for determining signal acquisition information according to all the trigger level signals and the time identifications corresponding to all the trigger level signals;
the type determining module is used for determining the user behavior type corresponding to the target object according to the signal acquisition information;
the information output module is used for outputting prompt information if the user behavior type is a preset behavior type;
wherein,
the information acquisition module is specifically configured to: for each sensor, if the sinusoidal wave signal received by the sensor meets a signal triggering condition, generating a triggering level signal and a time identifier corresponding to the triggering level signal; the signal triggering condition is that the product of the frequency and the time of the sine wave signal is the same as the signal emission frequency value of the sensor;
the information determination module is specifically configured to: sequencing all the trigger level signals according to the time marks corresponding to all the trigger level signals to obtain a signal sequencing result, wherein the signal sequencing result is obtained by sequencing all the trigger level signals according to the time marks of all the trigger level signals from front to back; determining signal acquisition information according to the signal sequencing result and the position identification of the sensor corresponding to each trigger level signal;
the type determination module is specifically configured to:
if the signal sequencing result indicates that the trigger level signal of the first sensor is located before the trigger level signal of the second sensor, the time interval between the time identifier of the trigger level signal of the first sensor and the time identifier of the trigger level signal of the second sensor is less than a preset time length, and the position identifier of the first sensor is closer to the starting position of the target object than the position identifier of the second sensor, the user behavior type corresponding to the target object is track forward movement;
and if the signal sequencing result shows that the trigger level signal of the first sensor is behind the trigger level signal of the second sensor, the time interval between the time identifier of the trigger level signal of the first sensor and the time identifier of the trigger level signal of the second sensor is less than a preset time length, and the position identifier of the first sensor is closer to the starting point position of the target object than the position identifier of the second sensor, the user behavior type corresponding to the target object is track reverse movement.
6. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 4 when executing the computer program.
7. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 4.
CN202111280172.5A 2021-11-01 2021-11-01 Prompting method and device, computer equipment and computer readable storage medium Active CN113723372B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111280172.5A CN113723372B (en) 2021-11-01 2021-11-01 Prompting method and device, computer equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111280172.5A CN113723372B (en) 2021-11-01 2021-11-01 Prompting method and device, computer equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN113723372A (en) 2021-11-30
CN113723372B (en) 2022-01-18

Family

ID=78686289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111280172.5A Active CN113723372B (en) 2021-11-01 2021-11-01 Prompting method and device, computer equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113723372B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105243732A (en) * 2015-10-30 2016-01-13 深圳怡化电脑股份有限公司 Filtering method and system of sensor information during banknote information
CN107844741A (en) * 2017-09-11 2018-03-27 浙江新再灵科技股份有限公司 The detection warning system and method that children drive in the wrong direction on escalator
CN108909378A (en) * 2018-07-26 2018-11-30 宁波琻捷电子科技有限公司 Vehicle tyre localization method and system
CN109921796A (en) * 2017-12-13 2019-06-21 林项武 A kind of circuit that the electric signal of mechanical periodicity is converted to direct current signal level
CN111046832A (en) * 2019-12-24 2020-04-21 广州地铁设计研究院股份有限公司 Image recognition-based retrograde determination method, device, equipment and storage medium
CN111144247A (en) * 2019-12-16 2020-05-12 浙江大学 Escalator passenger reverse-running detection method based on deep learning
CN111382705A (en) * 2020-03-10 2020-07-07 创新奇智(广州)科技有限公司 Reverse behavior detection method and device, electronic equipment and readable storage medium
CN111924695A (en) * 2020-07-09 2020-11-13 上海市隧道工程轨道交通设计研究院 Intelligent safety protection system for subway escalator and working method of intelligent safety protection system
CN112836667A (en) * 2021-02-20 2021-05-25 上海吉盛网络技术有限公司 Method for judging falling and retrograde of passenger on ascending escalator
CN112919294A (en) * 2021-03-25 2021-06-08 衢州市特种设备检验中心 Escalator control method and system and escalator system
CN113307133A (en) * 2021-06-18 2021-08-27 南京信息工程大学 Elevator passenger reverse driving safety monitoring control system and monitoring method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2481250C (en) * 2002-04-08 2011-09-27 Newton Security, Inc. Tailgating and reverse entry detection, alarm, recording and prevention using machine vision
CN204569120U (en) * 2015-01-22 2015-08-19 杭州西奥电梯有限公司 A guidance system for safely riding an escalator
CN105203100B (en) * 2015-10-29 2018-12-25 小米科技有限责任公司 Method and device for intelligently guiding a user to take an elevator
CN207209724U (en) * 2017-07-31 2018-04-10 江南嘉捷电梯股份有限公司 A self-diagnosing escalator

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on machine-vision-based escalator safety detection methods; Li Deyuan; China Excellent Doctoral and Master's Dissertations Full-text Database (Master's), Information Science and Technology; 2018-05-15; pp. I138-502 *

Also Published As

Publication number Publication date
CN113723372A (en) 2021-11-30

Similar Documents

Publication Publication Date Title
CN106952303B (en) Vehicle distance detection method, device and system
CN110784825B (en) Method and device for generating vehicle running track
US10860877B2 (en) Logistics parcel picture processing method, device and system
CN111131783A (en) Monitoring method and device based on electronic fence, terminal equipment and storage medium
CN108848274B (en) Message reminding method, device and equipment
Nieto et al. On creating vision‐based advanced driver assistance systems
CN110047513A (en) Video monitoring method and device, electronic equipment and storage medium
CN109960959B (en) Method and apparatus for processing image
CN112651266A (en) Pedestrian detection method and device
CN113723372B (en) Prompting method and device, computer equipment and computer readable storage medium
CN106610716A (en) Gesture recognition method and device
CN110853364B (en) Data monitoring method and device
CN113963316A (en) Target event determination method and device, storage medium and electronic device
CN108957460A (en) Vehicle distance detection method and device, and computer-readable storage medium
CN109188533B (en) Detection method, device and equipment of electronic equipment
CN113438318B (en) Performance test system and method of cloud control platform, electronic equipment and storage medium
CN114513608A (en) Movement detection method and device and electronic equipment
CN113009432B (en) Method, device and equipment for improving measurement accuracy and target detection accuracy
CN112261582B (en) Electronic price tag moving position determining method, device and system
CN112669351A (en) Rapid movement detection method and device
CN105653025A (en) Information processing method and electronic equipment
CN111521174A (en) Method, device, medium and electronic equipment for generating synchronization signal of combined inertial navigation system
Behrisch et al. Modelling Bluetooth inquiry for SUMO
CN107786734B (en) Information identification method, server and computer storage medium
CN113687358B (en) Target object identification method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220805

Address after: 5305, floor 5, building 6, No. 8, Beiyuan street, Chaoyang District, Beijing 100020

Patentee after: Beijing Longzhi Digital Technology Service Co.,Ltd.

Address before: 101100-090, floor 1, building 1, No. 2, Jufu North Road, Jufuyuan national industry development base, Tongzhou District, Beijing

Patentee before: Beijing zhuojianzhihan Technology Co.,Ltd.

Patentee before: Shanghai zhuohan Technology Co.,Ltd.