CN114463654A - State detection method, device, equipment and computer storage medium

State detection method, device, equipment and computer storage medium

Info

Publication number
CN114463654A
Authority
CN
China
Prior art keywords: environment, image, similarity, images, equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210122988.3A
Other languages
Chinese (zh)
Inventor
彭应亮
王前卫
时代奇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Alibaba China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Alibaba China Co Ltd
Priority to CN202210122988.3A
Publication of CN114463654A
Legal status: Pending (current)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

An embodiment of the present application provides a method, an apparatus, a device and a computer storage medium for detecting a device state. The device state detection method includes: acquiring at least two environment images captured of the surroundings of the device, where the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition; performing image recognition on the environment images and determining, according to the recognition result, a reference region corresponding to a stationary object in each environment image, where a stationary object is an object whose state or form does not change as the shooting time changes; obtaining the similarity between the reference regions of environment images that are adjacent in shooting time; and determining that the device is in a stationary state if the similarities are all greater than or equal to a preset similarity threshold.

Description

State detection method, device, equipment and computer storage medium
Technical Field
Embodiments of the present application relate to the technical field of data processing, and in particular to a method, an apparatus, a device and a computer storage medium for detecting a device state.
Background
With the development of electronic information technology, many devices run applications with map navigation functions, such as map applications, ride-hailing applications, or local life-service applications. To implement certain functions, such as AR navigation, these applications need to detect whether the device is in a moving state or a stationary state.
In the prior art, the state of a device is generally detected from data output by sensors mounted on the device, such as an Inertial Measurement Unit (IMU), an accelerometer or gyroscope, or a Global Positioning System (GPS) receiver. Because these sensors have measurement errors and their output data are not entirely accurate, state detection based on sensor data can be inaccurate, which in turn affects the implementation of the corresponding functions.
Disclosure of Invention
In view of the above, embodiments of the present application provide a method, an apparatus, a device and a computer storage medium for detecting a device status, so as to solve some or all of the above problems.
According to a first aspect of embodiments of the present application, there is provided a device state detection method, including: acquiring at least two environment images captured of the surroundings of a device, where the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition; performing image recognition on the environment images and determining, according to the recognition result, a reference region corresponding to a stationary object in each environment image, where a stationary object is an object whose state or form does not change as the shooting time changes; obtaining the similarity between the reference regions of environment images that are adjacent in shooting time; and determining that the device is in a stationary state if the similarities are all greater than or equal to a preset similarity threshold.
According to a second aspect of embodiments of the present application, there is provided a device state detection apparatus, including: an acquisition module, configured to acquire at least two environment images captured of the surroundings of a device, where the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition; an identification module, configured to perform image recognition on the environment images and determine, according to the recognition result, a reference region corresponding to a stationary object in each environment image, where a stationary object is an object whose state or form does not change as the shooting time changes; a calculation module, configured to obtain the similarity between the reference regions of environment images that are adjacent in shooting time; and a determination module, configured to determine that the device is in a stationary state if the similarities are all greater than or equal to a preset similarity threshold.
According to a third aspect of embodiments of the present application, there is provided an electronic device, including a processor, a memory, a communication interface and a communication bus, where the processor, the memory and the communication interface communicate with one another through the communication bus; the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations corresponding to the device state detection method of the first aspect.
According to a fourth aspect of embodiments of the present application, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the device status detection method as in the first aspect.
With the device state detection method, apparatus, device and computer storage medium provided by the embodiments of the present application, at least two environment images captured of the surroundings of the device are acquired, where the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition; image recognition is performed on the environment images and a reference region corresponding to a stationary object is determined in each environment image according to the recognition result, where a stationary object is an object whose state or form does not change as the shooting time changes; the similarity between the reference regions of environment images that are adjacent in shooting time is obtained; and the device is determined to be in a stationary state if the similarities are all greater than or equal to a preset similarity threshold. By determining whether the device is stationary from the similarity of environment images, the embodiments of the present application reduce errors introduced by hardware and improve the accuracy of device state detection.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some of the embodiments described in the present application, and that those skilled in the art can derive other drawings from them.
Fig. 1 is a schematic view of a scenario of a device state detection method according to an embodiment of the present application;
Fig. 2 is a flowchart of a device state detection method according to an embodiment of the present application;
Fig. 3 is a schematic diagram illustrating a similarity calculation effect according to an embodiment of the present application;
Fig. 4 is a structural diagram of a device state detection apparatus according to a second embodiment of the present application;
Fig. 5 is a schematic structural diagram of an electronic device according to a third embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions in the embodiments of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application shall fall within the protection scope of the embodiments of the present application.
The following further describes specific implementations of embodiments of the present application with reference to the drawings of the embodiments of the present application.
Embodiment 1
For ease of understanding, an application scenario of the device state detection method provided in the first embodiment of the present application is described first with reference to fig. 1, which is a scenario diagram of the device state detection method provided in the first embodiment of the present application. The scenario shown in fig. 1 includes an electronic device 101 and a device 102. The electronic device 101 may be a device that executes the device state detection method provided in the first embodiment of the present application.
The electronic device 101 may be a terminal device such as a smart phone, a tablet computer, a notebook computer, or the like, and the electronic device 101 may also be a network device such as a server, which is not limited in this application.
The electronic device 101 may access a network, connect to a cloud through the network, and exchange data. The network includes a Local Area Network (LAN), a Wide Area Network (WAN) and mobile communication networks, such as the World Wide Web (WWW), Long Term Evolution (LTE) networks, 2G networks (2nd Generation Mobile Network), 3G networks (3rd Generation Mobile Network), 5G networks (5th Generation Mobile Network), and so on. Of course, this is merely an example and does not limit the present application.
Taking the device 102 in fig. 1 as a vehicle as an example, the electronic device acquires at least two environment images captured at a preset shooting angle of view. The environment images indicate the environment around the device, and the preset shooting angle of view does not change relative to the device within a preset time period; that is, the shooting angle of view of the environment images is fixed relative to the device during that period. A reference region corresponding to a stationary object is identified in each environment image, where a stationary object may include an object whose position and appearance in the environment image do not change relative to the ground, such as utility poles, buildings or station signs, which are unlikely to change within a short time. Therefore, if the device is stationary, the stationary objects in the environment images should have a very high similarity, whereas if the device is moving, the stationary objects will appear differently in different environment images. By contrast, objects that may change by themselves, such as trees and traffic lights, are not suitable as stationary objects: tree leaves sway in the wind and traffic lights change over time, so the images these objects present in the environment images may change even though the device does not move. The similarity between the reference region of each environment image and the corresponding region in the previous environment image is calculated to obtain at least one similarity. If each similarity is greater than or equal to a similarity threshold, the environment around the device has not changed relative to the device, and the device can be determined to be stationary. If any similarity is less than the similarity threshold, the environment around the device has changed relative to the device; since the stationary objects do not move relative to the ground, it can be determined that the device has moved relative to its surroundings and is therefore not stationary.
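The patent text itself contains no code; the following is a minimal end-to-end sketch of the decision just described, under the assumption that a reference-region extractor and a similarity measure are available. All names below are placeholders for illustration, not the patent's API.

```python
# Minimal end-to-end sketch (assumption): the device is considered stationary
# only if every pair of consecutive reference regions is similar enough.
def device_is_stationary(images, extract_reference_region, region_similarity,
                         threshold: float = 0.9) -> bool:
    # images: environment images ordered by shooting time, taken at the same
    # shooting angle of view relative to the device.
    regions = [extract_reference_region(img) for img in images]
    return all(region_similarity(prev, curr) >= threshold
               for prev, curr in zip(regions, regions[1:]))
```

Here `extract_reference_region` and `region_similarity` stand in for the recognition step (step 202) and the similarity step (step 203) detailed below.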
The device state detection method provided by an embodiment of the present application is described in detail below with reference to the scenario shown in fig. 1. It should be noted that fig. 1 is only one application scenario of the device state detection method provided by the embodiments of the present application and does not mean that the method must be applied to the scenario shown in fig. 1. Optionally, the device state detection method provided by the embodiments of the present application may be applied to an electronic device; that is, the electronic device is the execution subject of the method. The electronic device may be a terminal device such as a smart phone, a tablet computer or a notebook computer, or a network device such as a server. As shown in fig. 2, which is a flowchart of the device state detection method provided by an embodiment of the present application, the method includes the following steps:
step 201, at least two captured environment images of the environment surrounding the device are obtained, the captured view angles of the environment images are the same, and the capture time difference of the environment images meets a set condition.
The device may be any object whose state of motion is to be detected; for example, the device may be a vehicle, a person, a bicycle, or the like. The environment images indicate the environment around the device, and the preset shooting angle of view does not change relative to the device. If the shooting angle of view changed, even two images with high similarity could not determine whether the device has moved; therefore, the preset shooting angles of view of the at least two environment images are required to be fixed relative to the device, and the differences between their shooting times must satisfy the set condition. For example, when the device is a vehicle, the preset shooting angle of view may be an angle of view toward the front of the vehicle, toward the rear of the vehicle, and so on. The shooting time difference may be a preset time interval and may be set by those skilled in the art as needed.
In one example, the environment images may be captured by the electronic device itself, or captured by another device and then transmitted to the electronic device. In another example, the environment images may be image frames from a video stream within a preset time period, or at least two images captured separately within the preset time period.
In one specific example, acquiring the at least two environment images includes: acquiring a video stream obtained by shooting the surroundings of the device, and extracting at least two image frames from the video stream as the at least two environment images. Continuously shooting the surroundings of the device at the preset shooting angle of view to obtain a video stream, and then extracting the environment images from that stream, ensures that the acquired environment images are up to date.
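As an illustration of extracting environment images from a video stream, a minimal OpenCV sketch is given below; the sampling interval, frame count and file path are assumptions for illustration, not values fixed by the patent.

```python
# Minimal sketch (assumption): sample environment images from a video stream
# at a fixed time interval using OpenCV.
import cv2

def sample_environment_images(video_path: str, interval_s: float = 1.0, max_frames: int = 5):
    """Return up to max_frames frames spaced roughly interval_s seconds apart."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0        # fall back if FPS metadata is missing
    step = max(int(round(fps * interval_s)), 1)
    frames, index = [], 0
    while len(frames) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:                      # keep one frame per interval
            frames.append(frame)
        index += 1
    cap.release()
    return frames
```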
Optionally, in an embodiment, acquiring the at least two environment images captured of the surroundings of the device includes: acquiring the measured speed of the device from a speed measurement module; and acquiring the at least two environment images when the measured speed of the device is less than or equal to a preset speed value.
The measured speed output by the speed measurement module may be determined before the environment images are acquired. If the measured speed is high, then even though the measurement module has hardware errors, it can still show that the device is not stationary, and at high speed the influence of those hardware errors on the measured speed is weak, so the environment images need not be acquired for state detection. When the device speed is low, the errors account for a larger proportion of the measured speed and have a greater influence on the result, and at that point the environment images can be acquired to detect the state. The preset speed value may be any value greater than 0 km/h and less than or equal to 5 km/h; of course, this is only an exemplary description. Acquiring the environment images only when the measured speed is less than or equal to the preset speed value reduces the amount of data computation.
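A minimal sketch of this speed-gated acquisition is shown below; the 5 km/h value is only the exemplary upper bound mentioned above, and the names are placeholders.

```python
# Minimal sketch (assumption): trigger the image-based check only at low
# measured speeds, where sensor error dominates the reading.
PRESET_SPEED_KMH = 5.0  # exemplary value; the text allows any value in (0, 5] km/h

def should_use_image_detection(measured_speed_kmh: float) -> bool:
    # At higher speeds the sensor reading alone already shows the device is moving.
    return measured_speed_kmh <= PRESET_SPEED_KMH
```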
Step 202, image recognition is performed on the environment images, and a reference region corresponding to a stationary object in each environment image is determined according to the recognition result, where a stationary object is an object whose state or form does not change as the shooting time changes.
A stationary object includes an object in the environment image whose position and appearance do not change relative to the ground. For example, stationary objects may include structures such as utility poles, buildings and signal towers, whereas objects that change easily, such as trees and traffic lights, are not suitable as reference objects. A stationary object may also be the part of an object whose position and appearance do not change relative to the ground; for example, stationary objects may include the trunk of a tree, the support pole of a traffic light, and so on.
There are many implementations of image recognition of the environment image, and two specific examples are presented here for illustration.
In a first example, performing image recognition on an environment image, and determining a reference region corresponding to a stationary object in the environment image according to a recognition result includes:
identifying changing objects in the environment image by using an image recognition model, where a changing object is an object whose state or form changes as the shooting time changes; and segmenting the environment image, matting out the area occupied by the changing objects, and determining the remaining area as the reference region.
By matting out the area occupied by the changing objects to determine the reference region, this embodiment of the present application makes the reference region as large as possible, which can improve the accuracy of the similarity calculation.
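A minimal sketch of this matting step is shown below, assuming a separate segmentation model has already produced a per-pixel mask of changing objects; the model itself and the function names are assumptions.

```python
# Minimal sketch (assumption): remove the area occupied by changing objects so
# that only stationary content remains in the reference region.
import numpy as np

def reference_region(image: np.ndarray, changing_mask: np.ndarray):
    """image: H x W x 3 environment image; changing_mask: H x W bool array that is
    True where a changing object (tree, traffic light, pedestrian, ...) was detected."""
    ref_mask = ~changing_mask      # everything that is not a changing object
    matted = image.copy()
    matted[changing_mask] = 0      # matte out the changing-object pixels
    return ref_mask, matted
```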
In a second example, performing image recognition on an environment image, and determining a reference region corresponding to a stationary object in the environment image according to a recognition result includes:
identifying a fixed-position object in the environment image as the stationary object by using an image recognition model; and segmenting the environment image and determining the area occupied by the fixed-position object as the reference region.
Determining the reference region by segmenting out the area occupied by the fixed-position object allows the reference region to be obtained quickly, and the algorithm is simple.
Step 203, the similarity between the reference regions of environment images that are adjacent in shooting time is obtained.
In this application, optionally, in an embodiment, obtaining the similarity between the reference regions of environment images that are adjacent in shooting time includes:
acquiring environment images that are adjacent in shooting time; for each environment image, dividing the reference region of the environment image into at least two image units; calculating the similarity between corresponding image units of environment images that are adjacent in shooting time, where corresponding image units are image units at the same position in the environment images; and determining the similarity between the reference regions of the environment images that are adjacent in shooting time based on the similarities between the image units of the environment images.
As shown in fig. 3, which is a schematic diagram illustrating the similarity calculation effect according to an embodiment of the present application, fig. 3 shows an environment image A and an environment image B. The reference region in environment image A is the area occupied by a building. N image units a are determined in the reference region of environment image A, where one image unit may include at least one pixel, and N image units b are determined at the corresponding positions in environment image B. Each pair of image units at corresponding positions forms a group, yielding N groups of image units, and the similarity between image unit a and image unit b in each group is calculated, producing the similarities of the N groups of image units. Determining N image units in the reference region and computing their similarities, rather than computing over the entire reference region, reduces the amount of computation and speeds up the similarity calculation.
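A minimal sketch of this unit-wise comparison is shown below; normalised cross-correlation is used here only as one possible similarity measure and, like the 32-pixel unit size, is an assumption rather than a measure mandated by the patent.

```python
# Minimal sketch (assumption): split two reference regions into unit x unit
# patches and compare corresponding patches with normalised cross-correlation.
import numpy as np

def unit_similarities(ref_a: np.ndarray, ref_b: np.ndarray, unit: int = 32):
    """ref_a, ref_b: greyscale reference regions of identical shape (H x W)."""
    sims = []
    h, w = ref_a.shape
    for y in range(0, h - unit + 1, unit):
        for x in range(0, w - unit + 1, unit):
            a = ref_a[y:y + unit, x:x + unit].astype(np.float64).ravel()
            b = ref_b[y:y + unit, x:x + unit].astype(np.float64).ravel()
            a -= a.mean()
            b -= b.mean()
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            # Flat patches (e.g. matted-out areas) are treated as identical.
            sims.append(1.0 if denom == 0 else float(np.dot(a, b) / denom))
    return sims

def region_similarity(ref_a: np.ndarray, ref_b: np.ndarray, unit: int = 32) -> float:
    """Aggregate the per-unit similarities into a single score for the two regions."""
    sims = unit_similarities(ref_a, ref_b, unit)
    return float(np.mean(sims)) if sims else 1.0
```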
Step 204, if the similarities are all greater than or equal to a preset similarity threshold, it is determined that the device is in a stationary state.
It should be noted that the similarity may be expressed as a percentage, where 100% means the images are identical, and the similarity threshold may be set to any value between 80% and 100%; alternatively, the similarity may be expressed on a scale of 0 to 100, where 100 means identical, and the threshold may be set to any value between 80 and 100. This is only an exemplary illustration. The similarity may be calculated each time an environment image is acquired, or calculated together after several environment images have been acquired.
Optionally, in a first example, the calculation is performed each time an environment image is acquired. The electronic device periodically acquires an environment image shot at the preset shooting angle of view, performs image recognition on the environment image of the current period, determines the reference region corresponding to the stationary object in that image according to the recognition result, and calculates the similarity between the reference region of the current period's environment image and the corresponding region in the previous environment image. If the similarities of K consecutive periods are all greater than or equal to the similarity threshold, the device is determined to be stationary, where K is the number of environment images acquired within the preset time period in step 201. Calculating the similarity in real time allows whether the device is stationary to be determined in real time, giving a faster response in scenarios that require real-time detection.
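A minimal sketch of this per-period decision is shown below; the class name, the value of K and the threshold are assumptions used for illustration.

```python
# Minimal sketch (assumption): keep the similarities of the last K periods and
# report the device as stationary only when all of them reach the threshold.
from collections import deque

class StationaryDetector:
    def __init__(self, k: int = 5, threshold: float = 0.9):
        self.k = k
        self.threshold = threshold
        self.window = deque(maxlen=k)

    def update(self, similarity: float) -> bool:
        """Feed the similarity computed for the current period; return True if stationary."""
        self.window.append(similarity)
        return (len(self.window) == self.k
                and all(s >= self.threshold for s in self.window))
```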
Optionally, in a second example, at least two environment images captured at the preset shooting angle of view in the past may be acquired at the current moment and sorted by time. Image recognition is performed on each environment image to determine the reference region corresponding to the stationary object, and for each environment image the similarity between its reference region and the corresponding region in the previous environment image is calculated. If the similarities are all greater than or equal to the similarity threshold, the device is determined to have been stationary within the preset time period; otherwise, it can be determined that the device was not stationary within that period.
Optionally, in a specific application scenario, taking Augmented Reality (AR) navigation as an example, the method further includes: after the device is determined to be stationary within the preset time period, generating an AR navigation instruction according to the acquired information; and performing navigation guidance based on the AR navigation instruction.
The device state detection method provided by the embodiments of the present application acquires at least two environment images captured of the surroundings of the device, where the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition; performs image recognition on the environment images and determines, according to the recognition result, a reference region corresponding to a stationary object in each environment image, where a stationary object is an object whose state or form does not change as the shooting time changes; obtains the similarity between the reference regions of environment images that are adjacent in shooting time; and determines that the device is in a stationary state if the similarities are all greater than or equal to a preset similarity threshold. By determining whether the device is stationary from the similarity of environment images, the embodiments of the present application reduce errors introduced by hardware and improve the accuracy of device state detection.
Embodiment 2
Based on the method described in the first embodiment, a second embodiment of the present application provides a device state detection apparatus configured to execute any of the methods described in the first embodiment. Referring to fig. 4, the device state detection apparatus 40 includes:
an acquisition module 401, configured to acquire at least two environment images captured of the surroundings of the device, where the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition;
an identification module 402, configured to perform image recognition on the environment images and determine, according to the recognition result, a reference region corresponding to a stationary object in each environment image, where a stationary object is an object whose state or form does not change as the shooting time changes;
a calculation module 403, configured to obtain the similarity between the reference regions of environment images that are adjacent in shooting time; and
a determination module 404, configured to determine that the device is in a stationary state if the similarities are all greater than or equal to a preset similarity threshold.
The device state detection apparatus provided by the embodiments of the present application acquires at least two environment images captured of the surroundings of the device, where the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition; performs image recognition on the environment images and determines, according to the recognition result, a reference region corresponding to a stationary object in each environment image, where a stationary object is an object whose state or form does not change as the shooting time changes; obtains the similarity between the reference regions of environment images that are adjacent in shooting time; and determines that the device is in a stationary state if the similarities are all greater than or equal to a preset similarity threshold. By determining whether the device is stationary from the similarity of environment images, the embodiments of the present application reduce errors introduced by hardware and improve the accuracy of device state detection.
Embodiment 3
Based on the method described in the first embodiment, a third embodiment of the present application provides an electronic device configured to execute the method described in the first embodiment. Referring to fig. 5, which is a schematic structural diagram of the electronic device provided in the third embodiment of the present application, the specific embodiments of the present application do not limit the specific implementation of the electronic device.
As shown in fig. 5, the electronic device may include: a processor (processor)502, a Communications Interface 504, a memory 506, and a communication bus 508.
Wherein:
the processor 502, communication interface 504, and memory 506 communicate with one another via a communication bus 508.
A communication interface 504 for communicating with other electronic devices such as a terminal device or a server.
The processor 502 is configured to execute the program 510, and may specifically perform the relevant steps in the above method embodiments.
In particular, program 510 may include program code that includes computer operating instructions.
The processor 502 may be a CPU, an Application-Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application. The electronic device includes one or more processors, which may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs and one or more ASICs.
The memory 506 is configured to store a program 510. The memory 506 may comprise a high-speed RAM memory and may also include a non-volatile memory, such as at least one disk memory.
The program 510 may specifically be used to cause the processor 502 to perform any of the methods in the previous embodiments.
For the specific implementation of each step in the program 510, reference may be made to the corresponding steps and the corresponding descriptions of the units in the foregoing device state detection method embodiments, which are not repeated here. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the devices and modules described above may refer to the corresponding process descriptions in the foregoing method embodiments and are not repeated here.
The electronic device provided by the embodiments of the present application acquires at least two environment images captured of the surroundings of the device, where the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition; performs image recognition on the environment images and determines, according to the recognition result, a reference region corresponding to a stationary object in each environment image, where a stationary object is an object whose state or form does not change as the shooting time changes; obtains the similarity between the reference regions of environment images that are adjacent in shooting time; and determines that the device is in a stationary state if the similarities are all greater than or equal to a preset similarity threshold. By determining whether the device is stationary from the similarity of environment images, the embodiments of the present application reduce errors introduced by hardware and improve the accuracy of device state detection.
Embodiment 4
Based on the method described in the first embodiment, a fourth embodiment of the present application provides a computer storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the method described in the first embodiment.
It should be noted that, according to the implementation requirement, each component/step described in the embodiment of the present application may be divided into more components/steps, and two or more components/steps or partial operations of the components/steps may also be combined into a new component/step to achieve the purpose of the embodiment of the present application.
The above-described methods according to the embodiments of the present application may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, a RAM, a floppy disk, a hard disk or a magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium and downloaded through a network to be stored in a local recording medium, so that the methods described herein can be processed by such software stored on a recording medium using a general-purpose computer, a dedicated processor, or programmable or dedicated hardware such as an ASIC or an FPGA. It will be appreciated that the computer, processor, microprocessor controller or programmable hardware includes storage components (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code which, when accessed and executed by the computer, processor or hardware, implements the device state detection method described herein. Furthermore, when a general-purpose computer accesses code for implementing the device state detection method illustrated herein, the execution of the code transforms the general-purpose computer into a special-purpose computer for performing the device state detection method illustrated herein.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The above embodiments are only used for illustrating the embodiments of the present application, and not for limiting the embodiments of the present application, and those skilled in the relevant art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present application, so that all equivalent technical solutions also belong to the scope of the embodiments of the present application, and the scope of patent protection of the embodiments of the present application should be defined by the claims.

Claims (10)

1. A device state detection method, comprising:
acquiring at least two environment images captured of the surroundings of a device, wherein the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition;
performing image recognition on the environment images, and determining, according to a recognition result, a reference region corresponding to a stationary object in each environment image, wherein the stationary object refers to an object whose state or form does not change as the shooting time changes;
obtaining the similarity between the reference regions of environment images that are adjacent in shooting time; and
determining that the device is in a stationary state if the similarities are all greater than or equal to a preset similarity threshold.
2. The method according to claim 1, wherein performing image recognition on the environment image and determining the reference region corresponding to the stationary object in the environment image according to the recognition result comprises:
identifying a changing object in the environment image by using an image recognition model, wherein the changing object refers to an object whose state or form changes as the shooting time changes; and
segmenting the environment image, matting out the area occupied by the changing object, and determining the remaining area as the reference region.
3. The method according to claim 1, wherein performing image recognition on the environment image and determining the reference region corresponding to the stationary object in the environment image according to the recognition result comprises:
identifying a fixed-position object in the environment image as the stationary object by using an image recognition model; and
segmenting the environment image, and determining the area occupied by the fixed-position object as the reference region.
4. The method according to claim 1, wherein obtaining the similarity between the reference regions of environment images that are adjacent in shooting time comprises:
acquiring environment images that are adjacent in shooting time;
for each environment image, dividing the reference region of the environment image into at least two image units;
calculating the similarity between corresponding image units of environment images that are adjacent in shooting time, wherein corresponding image units refer to image units at the same position in the environment images; and
determining the similarity between the reference regions of the environment images that are adjacent in shooting time based on the similarities between the image units of the environment images.
5. The method according to any one of claims 1-4, wherein acquiring the at least two environment images captured of the surroundings of the device comprises:
acquiring a video stream obtained by shooting the surroundings of the device; and
extracting at least two image frames from the video stream as the at least two environment images.
6. The method according to claim 1, wherein acquiring the at least two environment images captured of the surroundings of the device comprises:
acquiring a measured speed of the device from a speed measurement module; and
acquiring the at least two environment images when the measured speed of the device is less than or equal to a preset speed value.
7. The method according to any one of claims 1-4, wherein after determining that the device is stationary, the method further comprises:
generating an augmented reality (AR) navigation instruction according to acquired information; and
performing navigation guidance based on the AR navigation instruction.
8. A device state detection apparatus, comprising:
an acquisition module, configured to acquire at least two environment images captured of the surroundings of a device, wherein the environment images share the same shooting angle of view and the differences between their shooting times satisfy a set condition;
an identification module, configured to perform image recognition on the environment images and determine, according to a recognition result, a reference region corresponding to a stationary object in each environment image, wherein the stationary object refers to an object whose state or form does not change as the shooting time changes;
a calculation module, configured to obtain the similarity between the reference regions of environment images that are adjacent in shooting time; and
a determination module, configured to determine that the device is in a stationary state if the similarities are all greater than or equal to a preset similarity threshold.
9. An electronic device, comprising: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with one another through the communication bus; and
the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform operations corresponding to the device state detection method according to any one of claims 1-7.
10. A computer storage medium having stored thereon a computer program which, when executed by a processor, implements the device state detection method according to any one of claims 1-7.
CN202210122988.3A 2022-02-09 2022-02-09 State detection method, device, equipment and computer storage medium Pending CN114463654A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210122988.3A CN114463654A (en) 2022-02-09 2022-02-09 State detection method, device, equipment and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210122988.3A CN114463654A (en) 2022-02-09 2022-02-09 State detection method, device, equipment and computer storage medium

Publications (1)

Publication Number Publication Date
CN114463654A true CN114463654A (en) 2022-05-10

Family

ID=81414227

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210122988.3A Pending CN114463654A (en) 2022-02-09 2022-02-09 State detection method, device, equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN114463654A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117576490A (en) * 2024-01-16 2024-02-20 口碑(上海)信息技术有限公司 Kitchen environment detection method and device, storage medium and electronic equipment
CN117576490B (en) * 2024-01-16 2024-04-05 口碑(上海)信息技术有限公司 Kitchen environment detection method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
KR102189262B1 (en) Apparatus and method for collecting traffic information using edge computing
CN110246147B (en) Visual inertial odometer method, visual inertial odometer device and mobile equipment
CN110146097B (en) Method and system for generating automatic driving navigation map, vehicle-mounted terminal and server
CN110553648A (en) method and system for indoor navigation
US20200334571A1 (en) Method and apparatus for training trajectory classification model, and electronic device
CN108550258B (en) Vehicle queuing length detection method and device, storage medium and electronic equipment
CN112798811B (en) Speed measurement method, device and equipment
CN111261016A (en) Road map construction method and device and electronic equipment
CN111210477A (en) Method and system for positioning moving target
CN109357679B (en) Indoor positioning method based on significance characteristic recognition
US20200279395A1 (en) Method and system for enhanced sensing capabilities for vehicles
CN111507204A (en) Method and device for detecting countdown signal lamp, electronic equipment and storage medium
CN113945937A (en) Precision detection method, device and storage medium
CN110969864A (en) Vehicle speed detection method, vehicle driving event detection method and electronic equipment
CN113012215A (en) Method, system and equipment for space positioning
CN114463654A (en) State detection method, device, equipment and computer storage medium
CN113256683B (en) Target tracking method and related equipment
CN113658265A (en) Camera calibration method and device, electronic equipment and storage medium
CN113344906A (en) Vehicle-road cooperative camera evaluation method and device, road side equipment and cloud control platform
CN114264310A (en) Positioning and navigation method, device, electronic equipment and computer storage medium
CN113744236B (en) Loop detection method, device, storage medium and computer program product
CN111339226B (en) Method and device for constructing map based on classification detection network
CN113628284A (en) Pose calibration data set generation method, device and system, electronic equipment and medium
CN111833253A (en) Method and device for constructing spatial topology of interest points, computer system and medium
CN113379591B (en) Speed determination method, speed determination device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination