CN111553196A - Method, system, device and storage medium for detecting hidden camera


Publication number
CN111553196A
Authority
CN
China
Prior art keywords
thermal
image
hidden camera
detecting
imaging device
Prior art date
Legal status
Pending
Application number
CN202010260452.9A
Other languages
Chinese (zh)
Inventor
郝田田
郭凯
王梓童
王浩
Current Assignee
Beijing Sankuai Online Technology Co Ltd
Original Assignee
Beijing Sankuai Online Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sankuai Online Technology Co Ltd
Priority to CN202010260452.9A
Publication of CN111553196A


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/10 - Terrestrial scenes
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G06N 3/08 - Learning methods


Abstract

The application discloses a method, a system, a device and a storage medium for detecting a hidden camera. The method for detecting the hidden camera comprises the following steps: acquiring multiple frames of thermal images acquired by a thermal imaging device; determining a target thermal image containing a hidden camera from the multiple frames of thermal images to generate a detection result; and sending the detection result to an electronic device to remind a user of privacy security risks. The method and the device realize automatic identification of hidden cameras based on thermal images and improve detection efficiency.

Description

Method, system, device and storage medium for detecting hidden camera
Technical Field
The present application relates to the field of computer data processing, and in particular, to a method, system, apparatus, and storage medium for detecting a hidden camera.
Background
With the development of science and technology, electronic products have entered a period of rapid development, and various new electronic products, such as smart phones, tablet computers, notebook computers, and camera devices, have been introduced to the market, bringing great convenience to people's lives and enriching them. Meanwhile, driven by the pursuit of portability and refinement, electronic products are becoming increasingly miniaturized and portable. However, while these pocket-sized, portable electronic products are convenient, they can also be exploited by people with ulterior motives to carry out illegal activities.
Taking the miniature camera as an example, voyeurism through miniature cameras has become increasingly rampant, seriously infringing citizens' privacy and disrupting people's normal lives.
A miniature camera, also called a hidden camera, can be covertly installed in a hotel room. A pinhole camera in particular is extremely small, with an aperture generally on the order of millimeters, making it highly inconspicuous; it can be installed in easily overlooked fixtures and furnishings such as walls, lamps, and sockets to monitor the activities of people in the room. As technology advances, hidden cameras keep getting smaller and concealment techniques keep improving, making detection work that was already difficult even harder, greatly increasing inspectors' workload and resulting in low detection efficiency.
In view of this, it is desirable to provide some solutions capable of detecting a hidden image capturing device.
Disclosure of Invention
In view of the above-mentioned shortcomings of the related art, it is an object of the present application to provide a method, system, device, and storage medium for detecting a hidden camera, so as to overcome the above-mentioned technical problems of the related art that it is difficult to detect a hidden camera.
To achieve the above and other related objects, a first aspect of the disclosure provides a method for detecting a hidden camera, including the steps of: acquiring a plurality of frames of thermal images acquired by a thermal imaging device; determining a target thermal image containing a hidden camera from the multiple frames of thermal images to generate a detection result; and sending the detection result to an electronic device for reminding a user of privacy security risks.
A second aspect of the disclosure provides a hidden camera detection system, including an obtaining module, a detecting module, and a sending module, where the obtaining module is configured to obtain multiple frames of thermal images obtained by a thermal imaging device and identification data corresponding to each frame of thermal image; the detection module is used for determining a target thermal image containing a hidden camera from the multi-frame thermal image to generate a detection result; the sending module is used for sending the detection result to an electronic device so as to remind a user of privacy security risks.
A third aspect of the present disclosure provides a cloud server system, including at least one storage device and at least one processing device, where the at least one storage device is used to store at least one program; the at least one processing device is connected to the storage device, and is configured to execute and implement the method for detecting a hidden camera provided in the first aspect of the disclosure when the at least one program is executed.
A fourth aspect of the present disclosure provides a method for detecting a hidden camera, including the following steps: displaying the acquired video stream in a preview interface when detecting that a thermal imaging device works; when a detection result of the hidden camera is received, prompt information related to the detection result is generated to be displayed in the preview interface; and the detection result is obtained by detecting the multi-frame thermal image acquired by the thermal imaging device by a hidden camera detection system.
A fifth aspect of the present disclosure provides a hidden camera detection system, including an obtaining module, a generating module, and a display module, where the obtaining module is configured to synchronously obtain a video stream when detecting that a thermal imaging device is working, and receive a detection result of a hidden camera, where the detection result is obtained by detecting, by a hidden camera detection system, a plurality of frames of thermal images obtained by the thermal imaging device; the generation module is used for generating prompt information related to the detection result to be displayed in a preview interface when the detection result of the hidden camera is received; the display module is used for displaying the synchronously acquired video stream in the preview interface and displaying prompt information related to the detection result in the preview interface.
A sixth aspect of the present disclosure provides an electronic apparatus, comprising: a display, at least one memory, and at least one processor, wherein the at least one memory is to store at least one program; the at least one processor is connected with the at least one memory and configured to execute and implement the method for detecting a hidden camera provided by the fourth aspect of the disclosure when the at least one program is executed.
A seventh aspect of the present disclosure provides an electronic apparatus, including: a display, a thermal imaging unit for scanning a current scene to acquire successive frames of thermal images of the current scene, at least one memory for storing at least one program, and at least one processor; the at least one processor is connected with the at least one memory and the thermal imaging unit, and is configured to execute and implement the method for detecting a hidden camera provided in the fourth aspect of the disclosure when running the at least one program.
An eighth aspect of the present disclosure provides a computer-readable storage medium storing at least one program which, when executed by a processor, implements the method of detecting a hidden camera as provided in the first aspect of the present disclosure or the method of detecting a hidden camera as provided in the fourth aspect of the present disclosure.
In summary, according to the method, system, apparatus, and storage medium for detecting a hidden camera disclosed in the present application, on one hand, the server side realizes automatic identification of hidden cameras based on thermal images by executing the first method for detecting a hidden camera, thereby improving detection efficiency. The electronic device, by executing the second method for detecting a hidden camera, promptly presents the detection result to the user in a visible or otherwise perceptible manner, so that the user can learn of the privacy leakage risk in time and the user's privacy is protected. On the other hand, the method for detecting a hidden camera disclosed in the present application can also locate the hidden camera so that the user can confirm its position and conveniently perform subsequent operations. In addition, whether on the server side or on the electronic device, the scanning coverage range can be determined based on the field angle parameters and the posture data of the thermal imaging device, and the scanned area is displayed to the user on the electronic device in a visual manner, which greatly reduces scanning or detection omissions caused by user negligence and ensures the comprehensiveness of detection.
Other aspects and advantages of the present application will be readily apparent to those skilled in the art from the following detailed description. Only exemplary embodiments of the present application have been shown and described in the following detailed description. As those skilled in the art will recognize, the disclosure of the present application enables those skilled in the art to make changes to the specific embodiments disclosed without departing from the spirit and scope of the invention as it is directed to the present application. Accordingly, the descriptions in the drawings and the specification of the present application are illustrative only and not limiting.
Drawings
The specific features of the invention to which this application relates are set forth in the appended claims. The features and advantages of the invention to which this application relates will be better understood by reference to the exemplary embodiments described in detail below and the accompanying drawings. The brief description of the drawings is as follows:
fig. 1 is a flowchart illustrating a first method for detecting a hidden camera according to an embodiment of the present invention.
Fig. 2 is a schematic diagram illustrating a connection between a thermal imaging apparatus and an electronic device according to an embodiment of the disclosure.
Fig. 3 is a schematic view illustrating a range of viewing angles of an image capturing device and a thermal imaging device according to an embodiment of the present disclosure.
Fig. 4 is a flow chart illustrating the determination of scan coverage in one embodiment of the present application.
Fig. 5 is a flowchart illustrating a second method for detecting a hidden camera according to an embodiment of the present disclosure.
FIG. 6 is a schematic interface diagram illustrating a preview interface according to an embodiment of the present application.
Fig. 7 is a schematic interface diagram of a preview interface in a further embodiment of the present application.
Fig. 8 is a schematic diagram of a result display interface in an embodiment of the present application.
Fig. 9 is a flow chart illustrating the determination of scan coverage in one embodiment of the present application.
Fig. 10 is a block diagram of a first hidden camera detection system according to an embodiment of the present invention.
Fig. 11 is a block diagram of a second hidden camera detection system according to an embodiment of the present application.
Fig. 12 is a block diagram of a cloud server system according to an embodiment of the present invention.
Fig. 13 is a block diagram of a first electronic device according to an embodiment of the present application.
FIG. 14 is a block diagram of a second electronic device according to the present application in one embodiment.
Detailed Description
The following description of the embodiments of the present application is provided for illustrative purposes, and other advantages and capabilities of the present application will become apparent to those skilled in the art from the present disclosure.
In the following description, reference is made to the accompanying drawings that describe several embodiments of the application. It is to be understood that other embodiments may be utilized and that changes in the module or unit composition, electrical, and operation may be made without departing from the spirit and scope of the present disclosure. The following detailed description is not to be taken in a limiting sense, and the scope of embodiments of the present application is defined only by the claims of the issued patent. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
Although the terms first, second, etc. may be used herein to describe various elements, information, or parameters in some instances, these elements or parameters should not be limited by these terms. These terms are only used to distinguish one element or parameter from another. For example, the first identification data may be referred to as second identification data, and similarly, the second identification data may be referred to as first identification data, without departing from the scope of the various described embodiments. The first identification data and the second identification data both describe identification data, but they are not the same identification data unless the context clearly indicates otherwise. Depending on the context, the word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining".
Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means any of the following: A; B; C; A and B; A and C; B and C; A, B and C. An exception to this definition will occur only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
Private places that users inevitably visit, such as hotels and fitting rooms, may have miniature cameras illegally installed in concealed spots by voyeurs, so users face the risk of being secretly filmed and their privacy cannot be guaranteed. Although some software schemes detect whether a hidden camera exists in a room by exploiting the fact that hidden cameras often communicate over Wi-Fi, such schemes can only detect cameras that use Wi-Fi networking; cameras that are not networked or that use a 4G network cannot be effectively detected.
Others use hardware to detect hidden cameras. For example, the relatively common infrared detection method exploits the fact that some hidden cameras emit infrared light when operating, and detects them by checking whether a red dot appears on the detector's screen. This approach can only find hidden cameras that rely on an infrared emitter for supplementary lighting and is ineffective against hidden cameras that do not use infrared fill light. As another example, the magnetic induction method, which detects the magnetic field around a hidden camera, requires a professional instrument and related professional knowledge, making it impractical for daily use.
In view of this, the present application discloses a method for detecting a hidden camera, which is used on a server side to automatically analyze thermal images and determine whether a hidden camera exists in a current scene, exploiting the fact that a hidden camera generates heat when operating. To distinguish it from the method for detecting a hidden camera used on the client, which will be described later, the server-side method is referred to as the first method for detecting a hidden camera. The server may communicate with the client through a network, which may be, for example, the internet, a mobile network, a local area network (LAN), a wide area network (WAN), a storage area network (SAN), one or more intranets, or a suitable combination thereof; the present application does not limit the type or protocol of the communication network between the client and the server.
In some embodiments of the present application, the server may be deployed on one or more physical servers according to factors such as function and load. When distributed across multiple physical servers, the server side may be composed of servers based on a cloud architecture. For example, a cloud-based server includes a public cloud server and a private cloud server, where the public or private cloud server provides Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), or Infrastructure-as-a-Service (IaaS). The private cloud server is, for example, a Mei Tuo cloud computing service platform, an Array cloud computing service platform, an Amazon cloud computing service platform, a Baidu cloud computing platform, a Tencent cloud computing platform, or the like. The server may also be formed by a distributed or centralized cluster of servers. For example, the server cluster is composed of at least one physical server, each physical server hosting a plurality of virtual servers; each virtual server runs at least one functional module of the server side, and the virtual servers communicate with one another through a network.
Referring to fig. 1, which is a flowchart illustrating a first method for detecting a hidden camera according to an embodiment of the present invention, as shown in the drawing, the first method for detecting a hidden camera includes steps S10, S11, and S12, and the first method for detecting a hidden camera is described in detail with reference to fig. 1 to 4.
In step S10, the server acquires a plurality of frames of thermal images acquired by a thermal imaging device.
When the thermal imaging device receives a detection instruction, it scans the current scene as the user moves it along a trajectory. The detection instruction may be triggered manually: for example, if a user powers on the thermal imaging apparatus, the detection instruction is triggered, and when the thermal imaging apparatus detects the power-on action, it has received the detection instruction; as another example, the user triggers a start button arranged on the thermal imaging device; as yet another example, an application program of the corresponding client for detecting hidden cameras (for example, a mobile phone APP) sends the detection instruction to the thermal imaging apparatus. The current scene refers to a place that needs to be checked for hidden cameras, for example, private places such as hotel rooms, public toilets, fitting rooms, and homestays.
In an embodiment, the thermal imaging device acquires continuous frames of thermal images while scanning the current scene. In one example, the thermal imaging device uploads the continuous frames of thermal images obtained by scanning to the server, and the multiple frames of thermal images obtained by the server are the continuous frames scanned after the thermal imaging device received the detection instruction. In another example, after receiving the detection instruction, the thermal imaging device uploads, while scanning the current scene, a thermal image at a preset time interval or at a preset frame interval. To avoid the omission of an area that has been scanned but never uploaded to the server for identification, the preset time interval should be shorter than the time the thermal imaging device takes to sweep across one field of view, and two adjacent thermal images acquired at the preset frame interval should have a partially overlapping area. For example, when the thermal imaging device receives a detection instruction, the thermal image scanned at that moment is uploaded as the first frame sent to the server; after an interval of 30 ms, the currently scanned thermal image is uploaded as the second frame; after another 30 ms, the currently scanned thermal image is uploaded as the third frame; uploading continues in this manner until the thermal imaging device finishes the current scan, and the server thereby acquires multiple frames of thermal images. As another example, when the thermal imaging device receives the detection instruction, the thermal image scanned at that moment is uploaded as the first frame sent to the server; after an interval of 15 frames, the currently scanned thermal image is uploaded as the second frame; after another 15 frames, the currently scanned thermal image is uploaded as the third frame; and so on until the thermal imaging device finishes the current scan, and the server thereby acquires multiple frames of thermal images.
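The interval-based upload policy described above can be sketched in a few lines of Python. This is an illustration only, not code from the patent: capture_next_frame, upload_to_server, and scan_active are hypothetical placeholders standing in for the thermal imager's SDK and the network client, and the 30 ms / 15-frame values are the example figures from the text.

import time

UPLOAD_INTERVAL_MS = 30      # assumed preset interval time
UPLOAD_FRAME_GAP = 15        # assumed preset interval frame count

def upload_by_time(capture_next_frame, upload_to_server, scan_active):
    """Upload the currently scanned thermal frame every UPLOAD_INTERVAL_MS."""
    last_upload = 0.0
    while scan_active():
        frame = capture_next_frame()
        now = time.monotonic() * 1000.0
        if now - last_upload >= UPLOAD_INTERVAL_MS:
            upload_to_server(frame)          # becomes the next frame the server sees
            last_upload = now

def upload_by_frame_count(capture_next_frame, upload_to_server, scan_active):
    """Upload one frame out of every UPLOAD_FRAME_GAP scanned frames."""
    index = 0
    while scan_active():
        frame = capture_next_frame()
        if index % UPLOAD_FRAME_GAP == 0:    # adjacent uploads still overlap in view
            upload_to_server(frame)
        index += 1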
It should be noted that the thermal imaging apparatus may also be connected to an electronic device, and an application program loaded in the electronic device triggers the detection instruction for the thermal imaging apparatus and uploads the continuous frame images acquired by the thermal imaging apparatus. The electronic device is, for example, a device loaded with an APP application or having web page/website access capability, and may also be understood as the client mentioned later. The electronic device includes components such as memory, a memory controller, one or more processing units (CPUs), peripheral interfaces, RF circuitry, audio circuitry, a speaker, a microphone, an input/output (I/O) subsystem, a display screen, other output or control devices, and external ports, which communicate via one or more communication buses or signal lines. The electronic device includes, but is not limited to, personal computers such as desktop computers, notebook computers, tablet computers, smart phones, smart televisions, and the like. The electronic device can also be composed of a host with a plurality of virtual machines and a human-computer interaction device (such as a touch display screen, a keyboard, and a mouse) corresponding to each virtual machine.
Please refer to fig. 2, which is a schematic diagram of the connection between a thermal imaging apparatus and an electronic device in an embodiment of the present application, in which a in fig. 2 is a schematic front view of the connection and b in fig. 2 is a schematic back view. In the embodiment shown in fig. 2, the electronic device 10 is a smartphone; as mentioned above, the electronic device is not limited to a smartphone, and for simplicity the following embodiments are described using a smartphone as an example. The electronic device 10 is loaded with an application 11 of the corresponding client for detecting hidden cameras. When the application 11 is opened, or a start item (not shown) in the preview interface presented after the application 11 is opened is triggered, the thermal imaging apparatus 12 receives the detection instruction. The thermal imaging device 12 scans the current scene as the user moves it, and transmits the continuous frames of thermal images obtained by scanning to the electronic device 10. The electronic device 10 then either uploads the continuous frames of thermal images to the server in the manner described in the first example, or uploads, in the manner described in the other example, a thermal image scanned by the thermal imaging device 12 at a preset time interval or preset frame interval, so that the server obtains the multiple frames of thermal images acquired by the thermal imaging device and uploaded by the electronic device 10.
In addition, it should be noted that the multiple frames of thermal images acquired by the server in the above embodiments may be a set of thermal images that are continuously acquired in real time as the thermal imaging device scans the current scene, and the number of thermal images in the set of thermal images may continuously increase as the scanning time of the thermal imaging device increases. The multiple frames of thermal images acquired by the server in the above embodiment may also be a set of thermal images that are uploaded uniformly after the thermal imaging device finishes scanning the current scene, where the number of thermal images in the set of thermal images is the number of thermal images uploaded uniformly after the scanning is finished.
In practical application, in order to mark each frame of image, when the server side obtains the multiple frames of thermal images, the server side also obtains identification data corresponding to each frame of thermal images, and the identification data can be used as a mapping relation or an index relation to determine each frame of image so as to search, match or screen the image. For example, since the thermal imaging device is moving at a moment when scanning the current scene, and the posture of each frame of image acquired by the thermal imaging device is different, the identification data may be posture data of the thermal imaging device, and each frame of image may be determined by using the posture data as a mapping relation or an index relation. For another example, the identification data may also be time data, and the time data corresponding to each frame of image obtained by the thermal imaging device is used as a mapping relation or an index relation of each frame of image, in practice, the time data is, for example, in units of microseconds, milliseconds, or seconds, so as to record a specific time corresponding to each frame of image, for example, the shooting time of the nth frame of image is 4500 milliseconds or 4.5 seconds; but not limited thereto, in order to ensure the accuracy of image matching, the identification data may also include both attitude data and time data of the thermal imaging apparatus.
In some embodiments, the pose data is gyroscope measurement data output by a gyroscope. Generally, a gyroscope can measure the rotation rate of its carrier about the three coordinate axes of a three-dimensional space, and integrating these angular velocity values over time yields the attitude data of the carrier. The carrier may be, for example, the thermal imaging device, with the gyroscope being a sensor disposed in the thermal imaging device; the carrier may also be, for example, an electronic device connected to the thermal imaging device (e.g., the electronic device 10 shown in fig. 2), with the gyroscope being a sensor disposed in the electronic device. Since the thermal imaging device follows the movement of the electronic device, the data measured by that gyroscope reflects the posture of the thermal imaging device scanning the current scene, i.e., it is the posture data of the thermal imaging device. In an embodiment, each frame of image may be indexed by its attitude data; in practice the attitude data is, for example, the gyroscope measurement output when each frame of image is captured, namely the rotation data about the X, Y, and Z axes provided by the gyroscope, as sketched below.
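As an illustration only (the patent gives no code), the following Python sketch shows one way a frame record could bundle a thermal image with its identification data, i.e., a timestamp plus an attitude obtained by integrating the gyroscope's angular-velocity output; read_gyro is a hypothetical sensor call returning angular velocity in deg/s about the X, Y, and Z axes.

from dataclasses import dataclass
import time

@dataclass
class FrameRecord:
    frame_id: int
    timestamp_ms: float          # time data used as an index for matching
    attitude_deg: tuple          # (rx, ry, rz) integrated from gyro output
    thermal_image: bytes

def make_record(frame_id, thermal_image, read_gyro, attitude, last_t):
    """Integrate angular velocity since last_t and build the frame's record."""
    now = time.monotonic()
    dt = now - last_t
    wx, wy, wz = read_gyro()                         # angular velocity in deg/s
    attitude = (attitude[0] + wx * dt,
                attitude[1] + wy * dt,
                attitude[2] + wz * dt)
    record = FrameRecord(frame_id, now * 1000.0, attitude, thermal_image)
    return record, attitude, now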
In order to facilitate later display for a user to visually determine the position of the hidden camera, in some embodiments, the server may further obtain a related image corresponding to each frame of the thermal image when obtaining the plurality of frames of the thermal image. The image associated with each frame of thermal image is an image having the same or overlapping scene with each frame of thermal image, and the image and the thermal image are associated by adopting the identification data. In an embodiment, the associated image of each frame of thermal image is a simple-stroke image or a cartoon image generated by the server side performing image reconstruction or conversion on each frame of thermal image, and both images have the same identification data.
In another embodiment, the associated image is a live-action image with a view-angle overlapping area acquired synchronously with each frame of thermal image, and the live-action image and its associated thermal image have the same identification data. For example, the related image is a real image captured by an image capturing device, as shown in fig. 3, which is a schematic view illustrating a viewing angle range of the image capturing device and the thermal imaging device in an embodiment of the present invention, wherein a viewing angle of the image capturing device is, for example, α, a viewing field range of the image capturing device is a1, the real image captured with the viewing angle α is a related image a2, a viewing angle of an infrared objective lens (a lens for acquiring a thermal image) of the thermal imaging device is, for example, β, a viewing field range of the infrared objective lens is B1, the thermal image B2 is captured with the viewing angle β, and in the embodiment shown in fig. 3, a viewing angle overlapping region of the related image a2 and the thermal image B2 is a region C. The camera device may be, for example, a camera disposed on the thermal imaging device and facing the same direction as the infrared objective lens (e.g., the infrared objective lens 13 of the thermal imaging device 12 in fig. 2), and may also be, for example, a camera on an electronic apparatus connected to the thermal imaging device (e.g., the camera 14 on the electronic apparatus 10 shown in fig. 2). In this way, when the live-action image captured by the image capturing device and the thermal image captured by the infrared objective lens of the thermal imaging device are acquired synchronously, the live-action image and the thermal image have the same time data and posture data, so that the live-action image and the thermal image are associated by the same identification data.
In step S11, the server determines a target thermal image including a hidden camera from the multiple frames of thermal images to generate a detection result.
In some embodiments, the server may match the acquired multiple frames of thermal images by using preset image features including a hidden camera, thereby determining a target thermal image with the hidden camera. The target thermal image is a thermal image with a hidden camera in a plurality of frames of thermal images.
In some embodiments, the server side uses a preset detection model obtained through machine learning, and inputs each frame of thermal image into the preset detection model to identify the target thermal image containing a hidden camera. The preset detection model may be a neural network model; in some examples, it is based on a convolutional neural network structure comprising an input layer, at least one hidden layer, and at least one output layer. The input layer receives a thermal image or a preprocessed thermal image; a hidden layer comprises a convolution layer and an activation function layer, and may further comprise at least one of a normalization layer, a pooling layer, and a fusion layer; the output layer outputs the image marked with the hidden camera. The connection mode is determined by the connection relation of each layer in the neural network structure, for example a front-to-back connection relation set for data transfer, or connections between layers (including full connections) set according to the convolution kernel size of each hidden layer. In the training set for the preset detection model, thermal images containing a hidden camera may be used as positive samples and thermal images without a hidden camera as negative samples, and the trained detection model serves as the preset detection model for identifying hidden cameras. In one example, the trained detection model outputs at least one of the target thermal image containing the hidden camera or its corresponding identification data. In another example, the trained detection model also identifies the position of the area where the hidden camera is located in the image and outputs a positioning data set for the identified hidden camera, where the positioning data set includes the identification data corresponding to the target thermal image and the pixel coordinates of the hidden camera in the target thermal image; the pixel coordinates may be, for example, the coordinates of the center point of the area where the hidden camera is located, or a coordinate matrix of the boundary points of that area.
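As a rough illustration of the kind of convolutional detection model described above, the following PyTorch sketch takes a single-channel thermal image, passes it through convolution, normalization, activation, and pooling layers, and outputs a presence score plus a coarse box (a minimal positioning data set). The layer sizes and the two-headed output are assumptions for illustration, not the patent's actual network.

import torch
import torch.nn as nn

class HiddenCameraDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # thermal image: 1 channel
            nn.BatchNorm2d(16),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, 1)   # probability a camera is present
        self.regressor = nn.Linear(32 * 4 * 4, 4)    # (x, y, w, h) of the hot region

    def forward(self, x):
        h = self.features(x).flatten(1)
        presence = torch.sigmoid(self.classifier(h))
        box = self.regressor(h)
        return presence, box

Training such a model would use thermal images containing a hidden camera as positive samples and thermal images without one as negative samples, as stated above.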
After determining the target thermal image containing the hidden camera according to the manner in the above embodiment, the server generates a detection result based on the target thermal image. The detection result comprises at least one of the target thermal image or identification data corresponding to the target thermal image, a related image of the target thermal image determined according to the identification data, position information of the hidden camera in the target thermal image or the related image thereof, privacy safety warning information, or information of the hidden camera which is not found.
In some embodiments, the detection result comprises the target thermal image. And the server directly takes the determined target thermal image containing the hidden camera as a detection result.
In some embodiments, the detection result includes identification data corresponding to the target thermal image. In this embodiment, after the server determines the target thermal image, the identification data corresponding to the target thermal image is used as the detection result. The identification data corresponding to the target thermal image is used for matching against continuous frames of thermal images pre-stored in an electronic device, or against the continuous frames of associated images pre-stored in the electronic device. For example, as shown in fig. 2, the electronic device 10 uploads the continuous frame images acquired by the thermal imaging apparatus 12; in other words, the target thermal image is one of those continuous frames. The electronic device may pre-store the continuous frames of thermal images together with the identification data corresponding to each frame, determine from them the thermal image whose identification data matches the detection result, and display it to the user as the target thermal image. Furthermore, the electronic device may also pre-store the associated image of each frame of thermal image obtained by the camera device as described above, where the associated image and the corresponding thermal image share the same identification data; the electronic device may then determine, from the pre-stored consecutive associated images, the live-action image whose identification data matches the detection result, and display it to the user. For the process by which the electronic device matches the identification data in the detection result against the pre-stored consecutive thermal images or associated images, and the related display, please refer to the description of the second method for detecting a hidden camera applied to the client and figs. 5 to 9, which will not be repeated here; a matching sketch is given below.
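A minimal Python sketch of this matching step, under assumed data shapes (not the patent's code): the client keeps a dictionary keyed by identification data and looks up the frames named in the detection result.

def match_detection(detection_ids, prestored_frames):
    """prestored_frames: dict mapping identification data -> (thermal_image, associated_image)."""
    matches = []
    for ident in detection_ids:
        entry = prestored_frames.get(ident)
        if entry is not None:
            thermal, associated = entry
            matches.append((ident, thermal, associated))  # shown to the user as target images
    return matches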
As mentioned above, when acquiring multiple frames of thermal images, the server also acquires a related image corresponding to each frame of thermal image, so in some embodiments, the detection result includes a related image of the target thermal image determined according to the identification data. In this embodiment, after the server determines the target thermal image, the server obtains a related image of the target thermal image based on the identification data of the target thermal image, and uses the related image as a detection result.
In some embodiments, the detection result includes position information of the hidden camera in the target thermal image. The position information is obtained based on a result of recognizing and outputting a plurality of frames of thermal images by the server based on a preset detection model. In an example, the position information is a positioning data set, and as mentioned above, the positioning data set includes identification data corresponding to the target thermal image and pixel coordinates of the hidden camera in the target thermal image; in another example, the location information includes a location mark marked in the target thermodynamic diagram, in this example, the server makes a location mark on the target thermodynamic image based on the location data set, where the location mark is, for example, a location frame defined according to pixel coordinates, or a location point set according to pixel coordinates (the location point is, for example, presented in a manner of an easily observable mark such as "+", "×" or a square), and here, the server takes the target thermodynamic image marked with the location mark as a detection result. The position information is not limited to this, in other examples, the server side presets the detection model to recognize the hidden camera in the multi-frame thermal image, and also to recognize a reference object near the hidden camera, and the position information may include a position of the hidden camera relative to the reference object, for example, the server side determines that the hidden camera is above the television according to a recognition result output by the preset detection model, and then takes information related to the existence of the privacy camera above the television as a detection result.
In some embodiments, the detection result includes position information of the hidden camera in an associated image of the target thermal image, wherein the position information of the hidden camera in the associated image includes a positioning mark marked in the associated image.
In an example, the associated image is a simple stroke image or a cartoon image generated by the server side performing image reconstruction or conversion on each frame of thermal image, and the server side makes a positioning mark on the associated image based on a positioning data set output by a preset detection model, where the positioning mark is, for example, a positioning frame defined according to pixel coordinates or a positioning point set according to pixel coordinates (the positioning point is, for example, presented in an easily observable mark manner such as "+", "×" or a square frame), and here, the server side takes the associated image marked with the positioning mark as a detection result.
In another example, the associated image is a live-action image acquired by a camera device, and the positioning mark marked in the associated image is used by the server to map the position information of the hidden camera in the target thermal image into the associated image based on the spatial position between the thermal imaging device and the camera device. Specifically, the server determines a related image of a target thermal image based on identification data of the target thermal image in a positioning data set output by a preset detection model, then maps a pixel coordinate in the positioning data set to a pixel coordinate of a hidden camera in the related image based on a spatial position between a thermal imaging device and a camera device, and defines a positioning frame or a positioning point (the positioning point is presented in an easily observed mark manner such as a "+", "×" or a square frame) in the related image according to the pixel coordinate of the hidden camera in the related image as a positioning mark, where the server takes a live-action image marked with the positioning mark as a detection result.
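The mapping from a pixel coordinate in the target thermal image to the associated live-action image can be sketched as below. This is a simplified assumption: it treats the infrared objective lens and the camera as rigidly mounted and close together, so a per-axis scale plus a calibrated pixel offset suffices; a real implementation would also account for parallax and lens distortion.

def map_thermal_to_rgb(pt_thermal, thermal_size, rgb_size, offset_px=(0, 0)):
    """Map a (x, y) pixel from the thermal frame into the live-action frame."""
    tx, ty = pt_thermal
    tw, th = thermal_size        # thermal image resolution (w, h)
    rw, rh = rgb_size            # live-action image resolution (w, h)
    ox, oy = offset_px           # calibrated offset between the two optical axes
    return (tx * rw / tw + ox, ty * rh / th + oy)

# Example: a detection at (40, 25) in a 160x120 thermal frame, drawn on a
# 1280x960 live-action frame with a small calibrated offset.
x, y = map_thermal_to_rgb((40, 25), (160, 120), (1280, 960), offset_px=(12, -8))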
Here, it should be noted that, since the aforementioned image pickup device may be, for example, a camera disposed on the thermal imaging device and facing the same direction as the infrared objective lens, the spatial position between the thermal imaging device and the image pickup device is expressed as the spatial position between the infrared objective lens of the thermal imaging device and the camera facing the same direction as that lens. The aforementioned image pickup device may also be, for example, a camera on an electronic apparatus connected to the thermal imaging device, such as the camera 14 of the electronic apparatus 10 connected to the thermal imaging device 12 in fig. 2; in that case the spatial position between the thermal imaging device and the image pickup device is expressed as the spatial position between the infrared objective lens 13 on the thermal imaging device 12 and the camera 14 on the electronic apparatus 10.
In some embodiments, the server determines that a target thermal image including a hidden camera exists in multiple frames of thermal images, and uses privacy security warning information as a detection result. The privacy safety warning message is, for example, a vibration instruction message, which is used to instruct an electronic device (e.g., the electronic device 10 shown in fig. 2) to generate vibration to prompt a user that there is a risk of hiding a camera; the privacy security alert information may also be, for example, a voice prompt message, which is used to instruct an electronic device (e.g., the electronic device 10 shown in fig. 2) to generate a voice prompt to remind a user of the risk of hiding the camera; the privacy and safety warning information may also be, for example, text prompt information, where the text prompt information is used to instruct an electronic device (e.g., the electronic device 10 shown in fig. 2) to display prompt text on a display screen of the electronic device (e.g., scroll display "hidden camera exists in current scene, please check") to remind a user of risk of the hidden camera; the privacy security warning information may also be, for example, picture prompt information, where the picture prompt information is used to instruct an electronic device (for example, the electronic device 10 shown in fig. 2) to display a prompt picture on a display screen of the electronic device, and a user can know that a risk of hiding a camera exists in a current scene by seeing the picture displayed on the display screen. However, the privacy security alert information is not limited to this, and the privacy security alert information may also be, for example, information obtained by combining at least two of the foregoing examples, for example, the privacy security prompt information includes vibration instruction information and text prompt information, so that the electronic device may also generate vibration when displaying the prompt text on the display screen thereof, and a user may know that the current scene has a risk of hiding the camera through the sensible information.
In some embodiments, when the server determines that no hidden camera exists in the multi-frame thermal image, the server takes an undiscovered hidden camera information as a detection result. The undiscovered hidden camera information may be, for example, voice information, which is used to instruct an electronic device (e.g., the electronic device 10 shown in fig. 2) to play relevant information of the undiscovered hidden camera in voice (e.g., voice play "no privacy security risk is found in the current scene, please stay in the current scene") to notify the user; the undiscovered hidden camera information may also be, for example, text information, which is used to instruct an electronic device (e.g., the electronic device 10 shown in fig. 2) to display prompt text on its display screen (e.g., scroll display "hide camera is not found in current scene, please leave your heart in place") to notify the user; the undiscovered hidden camera information may also be, for example, picture information, where the picture information is used to instruct an electronic device (e.g., the electronic device 10 shown in fig. 2) to display a prompt picture on a display screen of the electronic device, and a user can know that a hidden camera is not detected in a current scene by seeing the picture displayed on the display screen. However, the undiscovered hidden camera information may also be, for example, information obtained by combining at least two of the above examples, for example, the undiscovered hidden camera information includes voice information and text information, so that the electronic device may also play relevant information of the undiscovered hidden camera in a voice manner when displaying the prompt text on a display screen of the electronic device, and a user may know that the hidden camera is not detected in the current scene through the sensible information.
The detection result is not limited to the foregoing embodiment, and in other embodiments, the detection result may also be at least two combinations of the foregoing embodiments, which are not described herein again.
In step S12, the server sends the detection result to an electronic device for reminding the user of privacy security risk.
In an embodiment, the server sends the detection result to the electronic device when it detects that the thermal imaging device has finished uploading images; in other words, the server sends the detection result to the electronic device when it determines that the current scan of the scene by the thermal imaging device has ended, and what the electronic device shows the user is a summary of the detection results of that scan. For example, if the server cannot acquire any image uploaded by the thermal imaging device within a preset time length, the server has detected that the image uploading has ended, that is, the scan has ended; likewise, if the server obtains power-off information of the thermal imaging device, it has detected that the image uploading has ended. As another example, when the server acquires the scan-end information sent by the electronic device loaded with the application program (e.g., the mobile phone APP) for detecting hidden cameras, which the electronic device sends based on the user triggering the scan-end item in the application program, the server has detected that the image uploading by the thermal imaging device has ended.
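A sketch of the timeout rule just described, using hypothetical helper names: if no new frame arrives within the preset time length, the server treats the upload, and hence the current scan, as finished and sends the summarized result.

UPLOAD_TIMEOUT_S = 5.0   # assumed preset time length

def serve_scan(receive_frame, handle_frame, send_result):
    while True:
        frame = receive_frame(timeout=UPLOAD_TIMEOUT_S)   # returns None on timeout
        if frame is None:
            send_result()        # no frame within the preset time: scan is over
            return
        handle_frame(frame)      # detect hidden cameras in the newly uploaded frame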
It should be noted that the trigger operations mentioned above and below include, but are not limited to, clicking (for example, clicking with an input device such as a mouse, or tapping with a finger), long pressing, or repeated touching. It should be understood that a user performing a trigger operation on an object may be described as the user triggering that object; for example, a user performing a trigger operation on a reservation interface may also be described as the user triggering the reservation interface, and a user triggering a hot key region indicates that the user performed a trigger operation on that region, and so on. Similar cases will not be described in detail below. It should be understood that the scan-end item or the aforementioned start item refers to a control or virtual key through which instructions or information can be input by operations such as triggering or clicking.
In another embodiment, the server sends the detection result to the electronic device in real time, in other words, the server sends the detection result generated by detecting the acquired multi-frame thermal image in steps S10 and S11 to the electronic device in real time, and the electronic device will continuously show the detection result of the current scan for the user along with the progress of the current scan of the thermal imaging device on the current scene.
The electronic device is an electronic device (for example, the electronic device 10 shown in fig. 2) loaded with a detection hidden camera related application program corresponding to the client, and please refer to the following description of fig. 5 to 9 for receiving and displaying the detection result, which is not described herein again.
In addition, in order to avoid the occurrence of a missed scanning area after the current scene is scanned based on the moving track of the thermal imaging device by the user, the first method for detecting a hidden camera further includes a step of acquiring a field angle parameter of the thermal imaging device and a step of determining a scanning coverage range, where the two steps may occur in or after step S10, or before, during, or after step S11, depending on the actual situation, and the occurrence order is not limited herein.
Referring to fig. 4, which is a flowchart illustrating the scan coverage determining step in an embodiment of the present application, as shown in the figure, the scan coverage determining step includes steps S20 and S21.
In step S20, the server detects that the image uploading by the thermal imaging device is finished, and calculates a scanning coverage range based on the attitude data and the field angle parameter to generate progress information;
the detection of the end of acquiring the image by the thermal imaging device by the server can be realized by any one of acquiring no image uploaded by the thermal imaging device within a preset time period, acquiring power-off information of the thermal imaging device by the server, and acquiring scanning end information sent by electronic equipment loaded with an application program (such as a mobile phone APP) corresponding to the detection hidden camera based on a scanning end item in the user-triggered operation application program by the server.
In some embodiments, as previously described, the pose data is gyroscope measurement data output by a gyroscope built into the thermal imaging device or into an electronic device connected to it. Accordingly, the server can determine from the gyroscope measurement data the rotation angles of the thermal imaging device about the three coordinate axes (X, Y, and Z) relative to its initial position, and determine the scanning coverage range in combination with the field angle parameter, where the scanning coverage range can be expressed as a percentage. For example, if the server determines that the rotation angles of the thermal imaging device about the three coordinate axes relative to its initial position all reach 360°, it determines that the current scan has one-hundred-percent full coverage. If at least one of the rotation angles does not reach 360°, the server adds the extent covered by the thermal imager's field angle to the measured rotation angle on each axis, and computes the percentage corresponding to that combined angle as the coverage of the current scan.
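One possible reading of this computation, as a hedged Python sketch: the swept angle on each axis is the accumulated rotation plus the imager's field angle on that axis, capped at 360°, and the reported coverage is derived from the least-covered axis. How the per-axis figures are combined into a single percentage is an assumption for illustration.

def scan_coverage(rotation_deg, fov_deg):
    """rotation_deg, fov_deg: per-axis rotation since the start pose and per-axis field angle."""
    per_axis = [min(abs(r) + f, 360.0) / 360.0 for r, f in zip(rotation_deg, fov_deg)]
    return 100.0 * min(per_axis)   # full coverage only once every axis reaches 360 degrees

# e.g. scan_coverage((270.0, 360.0, 360.0), (57.0, 44.0, 0.0)) -> about 90.8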
In some embodiments, the server generates progress information based on the determined scanning coverage range. The progress information may be at least one of text information containing the scanning coverage range, picture information containing the scanning coverage range, or voice information containing the scanning coverage range, and is used by the electronic device to present, in a manner perceivable by the user, the range covered by the current detection, so as to avoid missed scanning.
In step S21, the server sends the progress information to the electronic device to remind the user of the completeness of the detection. In this embodiment, the server sends the progress information generated in step S20 to an electronic device (for example, the electronic device 10 shown in fig. 2); for how the electronic device receives and displays the progress information, please refer to the following description of fig. 5 to 9, which is not repeated here.
The application further discloses a method for detecting a hidden camera, which is applied to the client side to display to a user, according to the detection result of the server for the hidden camera, the hidden camera detected in the current scene. Specifically, this method for detecting a hidden camera is executed by an application program loaded on the client.
In some embodiments of the present application, the client is, for example, an electronic device loaded with an APP application or having a web page/website access capability, and is hereinafter referred to as an electronic device. The electronic device includes components such as memory, a memory controller, one or more processing units (CPUs), peripheral interfaces, RF circuitry, audio circuitry, speakers, microphones, input/output (I/O) subsystems, display screens, other output or control devices, and external ports, which communicate via one or more communication buses or signal lines. The electronic device includes, but is not limited to, personal computers such as desktop computers, notebook computers, tablet computers, smart phones, smart televisions, and the like. The electronic device can also be an electronic device consisting of a host with a plurality of virtual machines and a human-computer interaction device (such as a touch display screen, a keyboard and a mouse) corresponding to each virtual machine.
Referring to fig. 5, which is a flowchart illustrating a second method for detecting a hidden camera according to an embodiment of the present invention, as shown in the figure, the second method for detecting a hidden camera includes steps S30 and S31, and the following describes the second method for detecting a hidden camera in detail with reference to fig. 5 to 9.
In step S30, the electronic device detects that a thermal imaging device is operating and displays the acquired video stream in a preview interface.
In one embodiment, the electronic device is coupled to the thermal imaging device. In an example, as shown in fig. 2, the thermal imaging device 12 is connected to the electronic device 10 through an external interface 15 of the electronic device 10, the electronic device 10 is loaded with an application 11 for executing the second method for detecting a hidden camera, and the electronic device 10 can interact with the thermal imaging device 12 through the external interface 15. In another example, the thermal imaging device may also be connected to the electronic device via a wireless communication circuit of the electronic device, where the wireless communication circuit includes 2G-5G, WiFi, Bluetooth, 315/433 MHz RF circuits, and the like, and the electronic device may interact with the thermal imaging device via the wireless communication circuit.
In an embodiment, upon receiving a detection instruction, the thermal imaging device scans the current scene along the moving track imparted to it by the user; in other words, the thermal imaging device is regarded as operating when it receives the detection instruction, and the electronic device is regarded as detecting the operation of the thermal imaging device when it detects the detection instruction. The detection instruction may be triggered manually. For example, when the user powers on the thermal imaging device, the detection instruction is triggered, and when the electronic device detects the power-on action of the thermal imaging device, the detection instruction is detected. For another example, the user triggers a start button arranged on the thermal imaging device to trigger the detection instruction, and the electronic device detects the detection instruction when it detects the start signal sent by the thermal imaging device. For yet another example, the electronic device detects the detection instruction when it detects that the application program is opened, or that a start item in a preview interface presented after the application program is opened is triggered. The current scene specifically refers to a place where the presence of a hidden camera needs to be checked, for example, private places such as hotels, public restrooms, fitting rooms and homestays.
The preview interface is loaded when the application program, loaded in the electronic device and used for executing the second method for detecting a hidden camera, is opened. Specifically, the application program may load the preview interface based on the electronic device detecting that the thermal imaging device is turned on, for example, when the electronic device detects the power-on action of the thermal imaging device as described above. The application program may also be opened based on the electronic device detecting a trigger operation; for example, the electronic device loads the preview interface when it detects that the user clicks the icon of the application program.
In one embodiment, the acquired video stream consists of continuous frames of thermal images acquired by the thermal imaging device. Specifically, while scanning the current scene, the thermal imaging device obtains the continuous frames of thermal images produced by the scan and transmits them to the electronic device in real time. In another example, the electronic device also temporarily saves the acquired continuous frames of thermal images in its memory for subsequent use.
In practical applications, in order to mark each frame of thermal image, the electronic device also acquires, when acquiring the continuous frames of thermal images, identification data corresponding to each frame of thermal image. The identification data can be used as a mapping relation or an index relation to determine each frame of image, so that images can be searched, matched or screened. For example, since the thermal imaging device is constantly moving while scanning the current scene, the attitude at which each thermal image is acquired differs; the identification data may therefore be attitude data of the thermal imaging device, with each thermal image determined by using the attitude data as a mapping relation or an index relation. For another example, the identification data may also be time data, where the time data corresponding to each frame of thermal image obtained by the thermal imaging device serves as the mapping relation or index relation of that frame; in practice, the time data is expressed, for example, in microseconds, milliseconds or seconds so as to record the specific time corresponding to each frame of image, for example, the shooting time of the Nth frame of image being 4500 milliseconds, i.e., 4.5 seconds. The identification data is not limited thereto; in order to ensure the accuracy of image matching, the identification data may also include both the attitude data and the time data of the thermal imaging device.
In some examples, the attitude data is gyroscope measurement data output by a gyroscope. In general, the angular rate sensor of a gyroscope can measure the rotation rate of its carrier about the three coordinate axes of a three-dimensional space, and integrating the measured angular rate yields the attitude data of the carrier. The carrier may be, for example, the thermal imaging device, with the gyroscope being a sensor disposed in the thermal imaging device; the carrier may also be, for example, an electronic device connected to the thermal imaging device (such as the electronic device 10 shown in fig. 2), with the gyroscope being a sensor disposed in the electronic device. Since the thermal imaging device follows the moving track of the electronic device in the embodiment shown in fig. 2, the gyroscope measurement data reflects the attitude at which the thermal imaging device scans the current scene, that is, the attitude data of the thermal imaging device. In an embodiment, each frame of image may also be determined by using the attitude data as a mapping relation or an index relation; for example, in practice the attitude data is the gyroscope measurement data output when each frame of image is captured, namely the rotation output data about the X, Y and Z axes provided by the gyroscope.
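As an illustration of using identification data as an index relation, the following sketch stores each frame under a key built from its time data and attitude data; the data structures and field names are assumptions made for the sketch, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Optional, Tuple

# Identification data combining time data (milliseconds) and attitude data
# (accumulated rotation angles about the X, Y and Z axes, in degrees).
IdData = Tuple[int, float, float, float]

@dataclass
class FrameRecord:
    thermal_image: Any                      # one frame of thermal image
    associated_image: Optional[Any] = None  # synchronized live-action frame, if any

@dataclass
class FrameIndex:
    frames: Dict[IdData, FrameRecord] = field(default_factory=dict)

    def add(self, ident: IdData, record: FrameRecord) -> None:
        # The identification data acts as the mapping/index relation.
        self.frames[ident] = record

    def find(self, ident: IdData) -> Optional[FrameRecord]:
        # Later searching/matching (e.g. locating a target thermal image)
        # reduces to a dictionary lookup on the identification data.
        return self.frames.get(ident)
```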
In another embodiment, the acquired video stream consists of continuous frames of associated images associated with the continuous frames of thermal images. The continuous frames of associated images are live-action images that are acquired synchronously with the continuous frames of thermal images and have an overlapping field-of-view region with them; each frame of live-action image corresponds one-to-one to a frame of thermal image, and the two are associated by the identification data. For example, as shown in fig. 3, the field angle of the camera device is α and its field-of-view range is A1, and the live-action image captured within the field angle α is the associated image A2; the field angle of the infrared objective lens of the thermal imaging device (the lens for acquiring thermal images) is β and its field-of-view range is B1, and the thermal image B2 is captured within the field angle β. In the embodiment shown in fig. 3, the overlapping field-of-view region of the associated image A2 and the thermal image B2 is region C. The camera device may be, for example, a camera disposed on the thermal imaging device and facing the same direction as the infrared objective lens (the lens for acquiring thermal images, such as the infrared objective lens 13 of the thermal imaging device 12 in fig. 2), or, for example, the camera 14 on the electronic device 10 shown in fig. 2. Since the live-action image shot by the camera device and the thermal image shot by the infrared objective lens of the thermal imaging device are obtained synchronously, they have the same time data and attitude data, and therefore the same identification data. It is noted that, in some examples, the electronic device also temporarily stores the acquired continuous frames of associated images in its memory for subsequent use.
In the foregoing embodiment, the preview interface is displayed by using a single-screen display, as shown in fig. 6, which is an interface schematic diagram of the preview interface in an embodiment of the present application, where the D area is a single-screen display area, and the displayed content is a video stream formed by consecutive frames of thermal images or consecutive frames of associated images.
In another embodiment, the obtained video stream includes both the continuous frames of thermal images obtained by the thermal imaging device and the continuous frames of associated images associated with them; for these, please refer to the descriptions in the foregoing embodiments, which are not repeated here. In this embodiment, the preview interface is displayed in a split-screen manner. As shown in fig. 7, which is an interface schematic diagram of the preview interface in another embodiment of the present application, the preview interface is split into an E area and an F area, where the E area displays the video stream formed by the continuous frames of thermal images and the F area displays the video stream formed by the continuous frames of associated images. In some examples, the thermal image displayed in the E area and the associated image displayed in the F area at the same moment are images in one-to-one correspondence, so that the E area and the F area display the same scene. In this way, the user can visually determine, through the associated images, the area covered by the current thermal image, and can further determine the area where a detected hidden camera is located.
It should be noted that, in order to facilitate coordinate transformation between the continuous frames of thermal images and the continuous frames of associated images, in some embodiments the electronic device, when obtaining the video stream, further obtains the spatial position between the thermal imaging device that obtains the thermal images and the camera device that obtains the associated images. When the camera device is, as described above, a camera disposed on the thermal imaging device and facing the same direction as the infrared objective lens, this spatial position is expressed as the spatial position between the infrared objective lens of the thermal imaging device and that camera. When the camera device is, for example, the camera 14 on the electronic device 10 connected to the thermal imaging device 12 as shown in fig. 2, this spatial position is expressed as the spatial position between the infrared objective lens 13 on the thermal imaging device 12 and the camera 14 on the electronic device 10.
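The coordinate transformation itself is not detailed in the text. Under the simplifying assumptions that the two lenses face the same direction with roughly parallel optical axes and that the scene is far compared with the baseline between them, the mapping from a thermal-image pixel to an associated-image pixel can be approximated as a field-of-view-based scaling plus a fixed offset encoding the spatial position, as sketched below; all names and the approximation itself are illustrative assumptions.

```python
from typing import Tuple

def thermal_to_associated_pixel(u_t: float, v_t: float,
                                thermal_size: Tuple[int, int],
                                assoc_size: Tuple[int, int],
                                thermal_fov_deg: Tuple[float, float],
                                assoc_fov_deg: Tuple[float, float],
                                offset_px: Tuple[int, int] = (0, 0)) -> Tuple[int, int]:
    """Approximate mapping of a pixel in the thermal image (field angle beta)
    to the corresponding pixel in the associated live-action image (field
    angle alpha), assuming parallel optical axes and a distant scene."""
    wt, ht = thermal_size
    wa, ha = assoc_size
    # Degrees per thermal pixel times associated-image pixels per degree.
    sx = (thermal_fov_deg[0] / wt) * (wa / assoc_fov_deg[0])
    sy = (thermal_fov_deg[1] / ht) * (ha / assoc_fov_deg[1])
    # Work relative to the image centres, then re-centre in the associated
    # image and apply the fixed offset given by the lenses' spatial position.
    u_a = (u_t - wt / 2) * sx + wa / 2 + offset_px[0]
    v_a = (v_t - ht / 2) * sy + ha / 2 + offset_px[1]
    return int(round(u_a)), int(round(v_a))
```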
In step S31, when the electronic device receives the detection result of the hidden camera, a prompt message related to the detection result is generated to be displayed in the preview interface.
The detection result is obtained by a hidden camera detection system detecting multiple frames of thermal images acquired by the thermal imaging device. The multiple frames of thermal images may be, for example, the continuous frames of thermal images obtained by the scanning of the thermal imaging device; in other words, they are the same images as the continuous frames of thermal images acquired by the thermal imaging device in the video stream. The multiple frames of thermal images may also be, for example, a set of thermal images sampled from the scan of the thermal imaging device at a preset interval time or a preset interval frame number; in other words, they are a subset of the continuous frames of thermal images in the video stream, where the preset interval time is less than the time the thermal imaging device takes to sweep one field-of-view range, and two adjacent thermal images sampled at the preset interval frame number have a partially overlapping region. In addition, the multiple frames of thermal images acquired by the hidden camera detection system may be uploaded by the thermal imaging device or by the electronic device, which is not limited herein.
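The sampling at a preset interval can be illustrated with the sketch below; the interval values are placeholders, and the only constraint taken from the text is that adjacent sampled frames must still overlap.

```python
from typing import Any, List, Sequence, Tuple

def sample_by_frame_interval(frames: Sequence, interval: int = 5) -> List:
    """Keep every Nth frame (preset interval frame number); the interval is
    assumed small enough that adjacent sampled frames partially overlap."""
    return list(frames[::interval])

def sample_by_time_interval(frames_with_ts: Sequence[Tuple[int, Any]],
                            interval_ms: int = 200) -> List:
    """Keep one frame per preset interval time; interval_ms must stay below
    the time the device needs to sweep one field-of-view range."""
    sampled, last_ts = [], None
    for ts, frame in frames_with_ts:
        if last_ts is None or ts - last_ts >= interval_ms:
            sampled.append(frame)
            last_ts = ts
    return sampled
```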
The hidden camera detection system is, for example, a system that is disposed at a server and is used to execute the first method for detecting a hidden camera, and the detection of the hidden camera on multiple frames of thermal images and the sending of the detection result are as described in the foregoing steps S10 to S12, and are not described herein again. According to the aforementioned steps S10 to S12, the hidden camera detection system sends the detection result to the electronic device.
In an embodiment, the detection result includes a target thermal image containing the hidden camera, or an associated image of that target thermal image. To distinguish these from the thermal images and associated images acquired by the electronic device, the target thermal image in the detection result sent by the hidden camera detection system is referred to as a first target thermal image and its associated image as a first associated image, while the target thermal image containing the hidden camera among the continuous frames of images prestored in the electronic device is referred to as a second target thermal image and the associated image associated with it as a second target associated image; the same terms are used in the following description. The first target thermal image is the thermal image, among the multiple frames of thermal images detected by the hidden camera detection system, in which a hidden camera is present. The first associated image is an image associated with the first target thermal image through the identification data, for example, a line-drawing image or cartoon-style image generated by the hidden camera detection system through image reconstruction or conversion of the first target thermal image, or, for example, a live-action image acquired synchronously with the first target thermal image and having an overlapping field-of-view region with it.
In this embodiment, the electronic device presents the first target thermal image or the first associated image associated with the first target thermal image in the preview interface as the prompt information related to the detection result.
In an embodiment, the detection result includes identification data of a first target thermal image including the hidden camera, and in order to distinguish the first target thermal image from identification data of each frame of thermal image acquired by the electronic device, the identification data sent by the hidden camera detection system is referred to as first identification data, and the identification data of each frame of thermal image acquired by the electronic device is referred to as second identification data, which is also referred to in the following description.
In an example, continuous frames of thermal images obtained when the thermal imaging device works are prestored in the electronic device, and the electronic device matches the continuous frames of thermal images according to the first identification data to determine a second target thermal image including a hidden camera, and displays the second target thermal image as prompt information in a preview interface. Specifically, the electronic device may match second identification data identical to the first identification data from the second identification data set, and find out a thermal image having the second identification data from the consecutive frame thermal images as a second target thermal image. The second identification data set is a set of second identification data of each frame of thermal image acquired by the electronic device.
In another example, continuous frames of associated images associated with the continuous frames of thermal images are prestored in the electronic device; the electronic device matches the continuous frames of associated images according to the first identification data to determine a second target associated image, and displays the second target associated image in the preview interface as the prompt information. Specifically, the electronic device may match, from the second identification data set, second identification data identical to the first identification data, and find the associated image having that second identification data from the continuous frames of associated images as the second target associated image.
In an embodiment, the detection result includes position information of the hidden camera in the first target thermal image or in the first associated image associated with it. In an example, the position information is a positioning data set, where the positioning data set includes the first identification data corresponding to the first target thermal image and the pixel coordinates of the hidden camera in the first target thermal image. In this example, the step of the electronic device generating prompt information related to the detection result includes: determining, based on the positioning data set, the position of the hidden camera in the continuous frames of images acquired by the thermal imaging device during operation, or in the continuous frames of associated images associated with them, and setting a positioning mark at that position in the continuous frames of images or in the continuous frames of associated images for display in the preview interface. For example, the electronic device may match, from the second identification data set, second identification data identical to the first identification data, find the thermal image having that second identification data from the continuous frames of thermal images as the second target thermal image, and then place a positioning mark on the hidden camera in the second target thermal image according to the pixel coordinates in the positioning data set. For another example, the electronic device may match, from the second identification data set, second identification data identical to the first identification data, find the associated image having that second identification data from the continuous frames of associated images as the second target associated image, map the pixel coordinates in the positioning data set to the pixel coordinates of the hidden camera in the second target associated image based on the spatial position between the thermal imaging device and the camera device, and place a positioning mark in the second target associated image according to the mapped pixel coordinates. The electronic device thereby displays, in the preview interface, the second target associated image provided with the positioning mark or the second target thermal image provided with the positioning mark.
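A sketch of this lookup-and-mark flow is given below, using OpenCV only to draw the positioning frame and positioning point; the dictionary layout, helper names and mark style are assumptions made for illustration.

```python
from typing import Callable, Dict, Optional, Tuple
import cv2          # used only for drawing the positioning mark
import numpy as np

def mark_hidden_camera(frame_store: Dict[tuple, Dict[str, np.ndarray]],
                       first_id: tuple,
                       pixel_xy: Tuple[int, int],
                       to_assoc_pixel: Optional[Callable[[int, int], Tuple[int, int]]] = None,
                       box_half: int = 10) -> Optional[np.ndarray]:
    """Find the frame whose second identification data equals first_id and
    place a positioning mark at the hidden camera's pixel coordinates."""
    record = frame_store.get(first_id)
    if record is None:
        return None  # no prestored frame matches the first identification data
    x, y = pixel_xy
    if to_assoc_pixel is not None and record.get("associated") is not None:
        # Map the thermal pixel into the associated image (second target
        # associated image) using the spatial position between the lenses.
        img = record["associated"].copy()
        x, y = to_assoc_pixel(x, y)
    else:
        img = record["thermal"].copy()  # second target thermal image
    # Positioning frame defined from the pixel coordinates.
    cv2.rectangle(img, (x - box_half, y - box_half),
                  (x + box_half, y + box_half), (0, 0, 255), 2)
    # A "+"-shaped positioning point at the same location.
    cv2.drawMarker(img, (x, y), (0, 0, 255),
                   markerType=cv2.MARKER_CROSS, markerSize=12, thickness=2)
    return img
```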
In another example, the position information is a positioning mark marked in a first target thermal image or a positioning mark marked in a first related image of the first target thermal image, and the detection result sent by the hidden camera detection system is the first target thermal image provided with the positioning mark or the first related image provided with the positioning mark. In this example, the electronic device presents the first target thermal image with the positioning mark or the first associated image with the positioning mark as prompt information related to the detection result in the preview interface.
The positioning mark in the above examples is, for example, a positioning frame defined according to the pixel coordinates, or a positioning point set according to the pixel coordinates (the positioning point being presented as an easily observable mark such as a "+", "×" or small square), and the positioning frame or positioning point is used to indicate the hidden camera.
In another example, the position information is a position of the hidden camera relative to a reference object, and the electronic device displays the position information of the hidden camera relative to the reference object as prompt information in the preview interface. For example, if the detection result sent by the hidden camera detection system is "the hidden camera is above the television", the electronic device may play the voice information of "the hidden camera is above the television" in a voice manner, or the electronic device displays the text information of "the hidden camera is above the television" in the preview interface, or the electronic device displays the picture information of "the hidden camera is above the television" in the preview interface.
In an embodiment, the detection result includes privacy security warning information, and the prompt information is the privacy security warning information. The privacy security warning information is, for example, vibration instruction information, and the electronic device (for example, the electronic device 10 shown in fig. 2) generates a vibration as the prompt information based on the vibration instruction information to remind the user that there is a risk of a hidden camera. The privacy security warning information may also be, for example, voice prompt information, and the electronic device plays a voice prompt as the prompt information based on the voice prompt information to remind the user that there is a risk of a hidden camera. The privacy security warning information may also be, for example, text prompt information, and the electronic device displays prompt text in its preview interface based on the text prompt information (for example, scroll-displays "A hidden camera exists in the current scene, please check") to remind the user that there is a risk of a hidden camera. The privacy security warning information may also be, for example, picture prompt information, and the electronic device displays a prompt picture in its preview interface based on the picture prompt information, so that the user, upon seeing the prompt picture, knows that there is a risk of a hidden camera in the current scene. The privacy security warning information may also be, for example, a combination of at least two of the foregoing; for example, if it includes both vibration instruction information and text prompt information, the electronic device generates a vibration while displaying the prompt text in its preview interface, and the user learns, through this perceivable information, that the current scene carries a risk of a hidden camera.
In an embodiment, the detection result includes information indicating that no hidden camera was found (no-hidden-camera-found information), and the prompt information is the no-hidden-camera-found information. The no-hidden-camera-found information may be, for example, voice information, and the electronic device (for example, the electronic device 10 shown in fig. 2) plays the related information based on the voice information (for example, voice-plays "No privacy security risk has been found in the current scene, please rest assured") to notify the user. The no-hidden-camera-found information may also be, for example, text information, and the electronic device displays prompt text in its preview interface based on the text information (for example, scroll-displays "No hidden camera has been found in the current scene, please rest assured") to notify the user. The no-hidden-camera-found information may also be, for example, picture information, and the electronic device displays a prompt picture in its preview interface based on the picture information, so that the user, upon seeing the picture displayed in the preview interface, knows that no hidden camera was detected in the current scene. The no-hidden-camera-found information may also be, for example, a combination of at least two of the foregoing; for example, if it includes both voice information and text information, the electronic device also plays the related information by voice while displaying the prompt text in its preview interface, and the user learns, through this perceivable information, that no hidden camera was detected in the current scene.
As described in the first method for detecting a hidden camera provided in the present application, the detection result may be sent to the electronic device after the thermal imaging device finishes uploading images, or may be sent to the electronic device as soon as it is generated. In view of this, in an embodiment, the prompt information generated by the electronic device in the foregoing embodiments is displayed in the preview interface in real time. For example, as shown in fig. 6, the G area in the drawing is a floating window arranged above the video stream in the preview interface, and the floating window is used for displaying the historical prompt information of the current scan. Each time the electronic device acquires a detection result during the current scan of the thermal imaging device, it pops up the prompt information related to that detection result over the video stream in the preview interface for a preset duration, and after the preset duration is reached, moves the prompt information into the G area of the floating window, such as the historical prompt information 16 in the G area. For another example, when the prompt information is the second target associated image provided with the positioning mark, the second target thermal image provided with the positioning mark, or the first target thermal image provided with the positioning mark, the electronic device replaces, in the continuous frames of thermal images displayed in the video stream, the thermal image having the same second identification data as the second target thermal image with the second target thermal image provided with the positioning mark or the first target thermal image provided with the positioning mark, or replaces, in the continuous frames of associated images in the video stream, the associated image having the same second identification data as the second target thermal image with the second target associated image provided with the positioning mark. In this way, the user can observe the position of the hidden camera in the video stream in real time.
In another embodiment, the prompt information generated by the electronic device in the foregoing embodiments is presented in the preview interface at the end of the thermal imaging scan. In an example, when the electronic device detects that the thermal imaging scan has ended, it loads a result display interface in the preview interface. As shown in fig. 8, which is a schematic diagram of the result display interface in an embodiment of the present application, the H area in the result display interface shows a summary of the prompt information generated based on the detection results of the current scan. The electronic device may detect the end of the thermal imaging scan as follows, for example: if the electronic device receives no image uploaded by the thermal imaging device within a preset time period, the electronic device detects that the scan of the thermal imaging device has ended; if the electronic device acquires power-off information of the thermal imaging device, the electronic device detects that the scan has ended; and if the electronic device detects that the user has triggered the scanning-end item in the preview interface of the running application program (for example, a mobile phone APP), the electronic device detects that the scan has ended.
As described in the first method for detecting a hidden camera provided in the present application, the hidden camera detection system may further determine the coverage range of the current scan of the thermal imaging device and send the progress information to the electronic device. In view of this, in some embodiments, the second method for detecting a hidden camera further includes receiving the progress information for display in the preview interface. Specifically, when the electronic device detects that the thermal imaging scan has ended, it loads a result display interface in the preview interface; as shown in fig. 8, in addition to the summary of the prompt information generated based on the detection results of the current scan, the result display interface also displays the progress information in the J area.
In addition, the determination of the coverage range of the current scan of the thermal imaging device can also be performed on the electronic device. In view of this, the second method for detecting a hidden camera further includes a step of determining the scanning coverage range. Referring to fig. 9, which is a flowchart of determining the scanning coverage range in an embodiment of the present application, as shown in the figure, the step of determining the scanning coverage range includes steps S40, S41 and S42.
In step S40, the electronic device acquires the field angle parameter of the thermal imaging device. This step may occur during or after step S30, or before, during, or after step S31, depending on the actual situation, and the order of occurrence is not limited herein.
In step S41, the electronic device detects that the scan of the thermal imaging device has ended, and calculates the scanning coverage range based on the attitude data contained in the second identification data and the field angle parameter to generate progress information.
The electronic device may detect that the scan of the thermal imaging device has ended, for example, in any of the following ways: the electronic device receives no image uploaded by the thermal imaging device within a preset time period; the electronic device acquires power-off information of the thermal imaging device; or the electronic device detects that the user has triggered the scanning-end item in the preview interface of the running application program (for example, a mobile phone APP).
In some embodiments, as previously described, the attitude data is gyroscope measurement data output by a gyroscope built into the thermal imaging device or into an electronic device connected to the thermal imaging device. The electronic device can therefore determine, from the gyroscope measurement data, the rotation angles of the thermal imaging device about the three coordinate axes (X axis, Y axis and Z axis) relative to its initial position, and determine the scanning coverage range in combination with the field angle parameter, where the scanning coverage range may be expressed as a percentage. For example, if the electronic device determines that the rotation angles of the thermal imaging device about all three coordinate axes relative to its initial position reach 360°, it determines that the coverage of the current scan is one hundred percent, i.e., full coverage. As another example, if the electronic device determines that at least one of the rotation angles about the three coordinate axes does not reach 360°, it extends each determined rotation angle by the angular range covered by the field angle of the thermal imaging device, and calculates the percentage of a full rotation reached by the extended angles as the coverage of the current scan.
In some embodiments, the electronic device generates progress information based on the determined scanning coverage, and the progress information may be at least one of text information containing the scanning coverage, picture information containing the scanning coverage, or voice information containing the scanning coverage, for example.
In step S42, the electronic device displays the progress information in the preview interface for reminding the user of the completeness of the detection.
As mentioned above, the electronic device loads a result display interface in the preview interface when it detects that the thermal imaging scan has ended. Further, in some embodiments, the electronic device presents the progress information in the result display interface.
The progress information may be, for example, voice information containing the scanning coverage range, and the electronic device plays the scanning coverage range based on the voice information (for example, voice-plays "This scan has reached seventy percent coverage, please confirm whether to continue scanning"). The progress information may also be, for example, text information containing the scanning coverage range, and the electronic device displays prompt text in its result display interface based on the text information (for example, scroll-displays "This scan has reached seventy percent coverage, please confirm whether to continue scanning") to notify the user. The progress information may also be, for example, picture information containing the scanning coverage range, and the electronic device displays a prompt picture in the result display interface based on the picture information, as shown in fig. 8, so that the user can learn the scanning coverage range by seeing the picture displayed in the result display interface. The progress information is not limited thereto and may also be a combination of at least two of the above examples; those skilled in the art can combine the above examples as needed, and details are not repeated here.
In summary, according to the methods for detecting a hidden camera disclosed in the present application, on the one hand, the server realizes automatic identification of hidden cameras based on thermal images by executing the first method for detecting a hidden camera, which improves detection efficiency, and the electronic device promptly displays the detection result to the user in a visual or otherwise perceivable manner by executing the second method for detecting a hidden camera, so that the user can learn of privacy leakage risks in time and the user's privacy security is ensured. On the other hand, the methods for detecting a hidden camera disclosed in the present application can also locate the hidden camera so that the user can confirm its position, which facilitates subsequent operations. In addition, whether on the server or on the electronic device, the scanning coverage range can be determined based on the field angle parameter and the attitude data of the thermal imaging device, and the scanned area is displayed to the user on the electronic device in a visual manner, which greatly reduces missed scanning or missed detection caused by the user's negligence and ensures the comprehensiveness of the detection.
In an embodiment, the hidden camera detection system is, for example, a service end, and in order to distinguish from a hidden camera detection system subsequently used in an electronic device, the hidden camera detection system used in the service end is referred to as a first hidden camera detection system, which is also referred to as a first hidden camera detection system later. The server may be arranged on one or more entity servers according to various factors such as function, load, and the like. When distributed in a plurality of entity servers, the server may be composed of servers based on a cloud architecture. For example, the server based on the cloud architecture includes a public cloud server and a private cloud server, where the public or private cloud server includes SaaS, PaaS, IaaS, and the like. The private cloud service end comprises a Mei Tuo cloud computing service platform, an Array cloud computing service platform, an Amazon cloud computing service platform, a Baidu cloud computing platform, a Tencent cloud computing platform and the like. The server may also be formed by a distributed or centralized cluster of servers. For example, the server cluster is composed of at least one entity server. Each entity server is provided with a plurality of virtual servers, each virtual server runs at least one functional module in the server side, and the virtual servers are communicated with each other through a network.
Referring to fig. 10, which is a block diagram of a first hidden camera detection system according to an embodiment of the present application, as shown in the figure, the first hidden camera detection system 20 includes an obtaining module 21, a detection module 22, and a sending module 23. The obtaining module 21 is configured to acquire multiple frames of thermal images acquired by a thermal imaging device and the identification data corresponding to each frame of thermal image; the detection module 22 is configured to determine a target thermal image containing a hidden camera from the multiple frames of thermal images to generate a detection result; and the sending module 23 is configured to send the detection result to an electronic device to remind a user of privacy security risks.
The obtaining module 21, the detecting module 22, and the sending module 23 in the first hidden camera detecting system 20 cooperatively perform the steps S10 to S12 according to the functions of the modules described above, which is not described herein again.
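For orientation only, the cooperation of the three modules can be outlined as the following skeleton; the detector and sender callables are placeholders for the trained detection model and the push channel to the electronic device, neither of which is specified here.

```python
from typing import Callable, Iterable, List, Tuple

class FirstHiddenCameraDetectionSystem:
    """Illustrative skeleton of fig. 10; not the actual implementation."""

    def __init__(self, detector: Callable, sender: Callable):
        self.detector = detector  # assumed model: frame -> bool (hidden camera present)
        self.sender = sender      # assumed channel pushing results to the electronic device

    def acquire(self, thermal_frames: Iterable, identification_data: Iterable) -> List[Tuple]:
        # Obtaining module 21: pair each thermal frame with its identification data.
        return list(zip(identification_data, thermal_frames))

    def detect(self, paired_frames: List[Tuple]) -> List[Tuple]:
        # Detection module 22: keep the frames flagged as containing a hidden
        # camera; these form the detection result (target thermal images).
        return [(ident, frame) for ident, frame in paired_frames if self.detector(frame)]

    def send(self, detection_result: List[Tuple]) -> None:
        # Sending module 23: push the detection result to the electronic device.
        self.sender(detection_result)
```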
In some embodiments, the obtaining module 21 is further configured to obtain the field angle parameter of the thermal imaging device.
In some embodiments, the detection module 22 is further configured to determine the scanning coverage, and the detection module 22 performs the foregoing steps S20 and S21 to determine the scanning coverage, which is not described herein again.
In an embodiment, the hidden camera detection system is, for example, a client, and in order to distinguish from the hidden camera detection system used in the server, the hidden camera detection system used in the client is referred to as a second hidden camera detection system, which is also referred to in the following description. The client is, for example, an electronic device loaded with an APP application or having web/website access capabilities, and includes components such as memory, a memory controller, one or more processing units (CPUs), peripheral interfaces, RF circuitry, audio circuitry, speakers, microphones, input/output (I/O) subsystems, a display screen, other output or control devices, and external ports, which communicate via one or more communication buses or signal lines. The electronic device includes, but is not limited to, a desktop computer, a notebook computer, a tablet computer, a smart phone, a smart television, and the like. The user terminal may also be an electronic device composed of a host with a plurality of virtual machines and a human-computer interaction device (such as a touch display screen, a keyboard and a mouse) corresponding to each virtual machine.
In an embodiment, the client communicates over a network with a server in a network system, for example a hidden camera detection system or platform. The network may be the Internet, one or more intranets, local area networks, wide area networks, storage area networks, or the like, or a suitable combination thereof; the type or protocol of the communication network between the client and the server is not limited in the embodiments of the present application.
In some embodiments of the present application, the server may be disposed on one or more entity servers according to various factors such as function, load, and the like. When distributed in a plurality of entity servers, the server may be composed of servers based on a cloud architecture. For example, the server based on the cloud architecture includes a public cloud server and a private cloud server, where the public or private cloud server includes SaaS, PaaS, IaaS, and the like. The private cloud service end comprises a Mei Tuo cloud computing service platform, an Array cloud computing service platform, an Amazon cloud computing service platform, a Baidu cloud computing platform, a Tencent cloud computing platform and the like. The server may also be formed by a distributed or centralized cluster of servers. For example, the server cluster is composed of at least one entity server. Each entity server is provided with a plurality of virtual servers, each virtual server runs at least one functional module in a server of the first hidden camera detection system, and the virtual servers are communicated with each other through a network.
Referring to fig. 11, which is a block diagram of a second hidden camera detection system according to an embodiment of the present application, as shown in the figure, the second hidden camera detection system 30 includes an obtaining module 31, a generating module 32, and a display module 33. The obtaining module 31 is configured to synchronously acquire a video stream when it is detected that a thermal imaging device is working, and to receive a detection result of the hidden camera, where the detection result is obtained by the first hidden camera detection system detecting multiple frames of thermal images acquired by the thermal imaging device. The generating module 32 is configured to generate, upon receiving the detection result of the hidden camera, prompt information related to the detection result for display in a preview interface. The display module 33 is configured to display the synchronously acquired video stream in the preview interface and to display the prompt information related to the detection result in the preview interface.
In some embodiments, the obtaining module 31 is further configured to receive progress information sent by the first hidden camera detection system, and the displaying module 33 is further configured to display the progress information in a preview interface.
In some embodiments, the second hidden camera detection system 30 further includes a detection module (not shown) for detecting a trigger operation in the current display interface of the electronic device or detecting the operating state of the thermal imaging device. In an embodiment, the detection module is, for example, a touch screen detector or an event monitor. The touch screen detector detects that the user has performed a corresponding operation when a trigger event occurs in the currently displayed page, such as an operation of opening the application program, or an operation of triggering the start item or the scanning-end item in the application program. The event monitor receives event information from a peripheral interface, where the event information includes information about a sub-event (for example, a user touch on the touch-sensitive display system as part of a multi-touch gesture).
The obtaining module 31, the generating module 32, the displaying module 33, and the detecting module in the second hidden camera detecting system 30 cooperatively perform the step S30 and the step S31 according to the functions of the modules described above, which is not described herein again.
In some embodiments, the obtaining module 31, the generating module 32, and the display module 33 are further configured to cooperatively perform the foregoing steps S40 to S42 to determine the scanning coverage range, which is not repeated here. For example, the obtaining module 31 performs the aforementioned step S40, the generating module 32 performs the aforementioned step S41, and the display module 33 performs the aforementioned step S42, thereby determining and displaying the scanning coverage range.
In an embodiment, the cloud server system may be arranged on one or more entity servers according to a plurality of factors such as functions, loads, and the like. When distributed in a plurality of entity servers, the server may be composed of servers based on a cloud architecture. For example, the server based on the cloud architecture includes a public cloud server and a private cloud server, where the public or private cloud server includes SaaS, PaaS, IaaS, and the like. The private cloud service end comprises a Mei Tuo cloud computing service platform, an Array cloud computing service platform, an Amazon cloud computing service platform, a Baidu cloud computing platform, a Tencent cloud computing platform and the like. The server may also be formed by a distributed or centralized cluster of servers. For example, the server cluster is composed of at least one entity server. Each entity server is provided with a plurality of virtual servers, each virtual server runs at least one functional module in the server, and the virtual servers are communicated with each other through a network.
Referring to fig. 12, which is a block diagram illustrating a cloud server system according to an embodiment of the present application, as shown, the cloud server system 40 includes at least one storage device 41 and at least one processing device 42.
The at least one storage device 41 for storing at least one program; in an embodiment, the storage device 41 comprises a storage server or memory, which may comprise high speed random access memory, and may also comprise non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In certain embodiments, the memory may also include memory that is remote from the one or more processors, such as network-attached memory accessed via RF circuitry or external ports and a communication network (not shown), which may be the internet, one or more intranets, local area networks, wide area networks, storage area networks, and the like, or suitable combinations thereof. The memory controller may control access to the memory by other components of the device, such as the CPU and peripheral interfaces.
The at least one processing device 42 is connected to the storage device 41 and is configured to execute the at least one program so as to perform at least one of the embodiments described above for the first method for detecting a hidden camera. The processing device 42 is, for example, a server, such as an application server, that includes a processor operatively coupled with a memory and/or a non-volatile storage device. More specifically, the processor may execute instructions stored in the memory and/or the non-volatile storage device to perform operations in the computing device, such as generating image data and/or transmitting image data to an electronic display. As such, the processor may include one or more general purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more field-programmable gate arrays (FPGAs), or any combination thereof.
For the sake of distinguishing from another electronic device provided in the following description, an electronic device provided herein is referred to as a first electronic device, please refer to fig. 13, which is a block diagram of the first electronic device in an embodiment of the present application, and as shown in the drawing, the first electronic device 50 of the present application includes: a display 51, at least one memory 52, and at least one processor 53.
In an embodiment, the first electronic device is, for example, an electronic device loaded with an APP application or having web/website access capability, and the electronic device includes components such as a memory, a memory controller, one or more processing units (CPUs), a peripheral interface, RF circuitry, audio circuitry, a speaker, a microphone, an input/output (I/O) subsystem, a display screen, other output or control devices, and an external port, which communicate via one or more communication buses or signal lines. The electronic device includes, but is not limited to, personal computers such as desktop computers, notebook computers, tablet computers, smart phones, smart televisions, and the like. The first electronic device may also be an electronic device composed of a host with a plurality of virtual machines and a human-computer interaction device (such as a touch display screen, a keyboard and a mouse) corresponding to each virtual machine.
The display 51 is functionally implemented by a graphics module in the electronic device, which includes various known software components for rendering and displaying graphics on a touch screen, and a controller for displaying the same. Note that the term "graphic" includes any object that may be displayed to a user, including but not limited to text, web pages, icons (e.g., user interface objects including soft keys), digital images, videos, animations and the like. The display screen is, for example, a touch screen, and provides both an output interface and an input interface between the device and the user. The touch screen controller receives/sends electrical signals from/to the touch screen. The touch screen then displays visual output to the user. This visual output may include text, graphics, video, and any combination thereof.
The at least one memory 52 is for storing at least one program; in embodiments, the memory may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In certain embodiments, the memory may also include memory that is remote from the one or more processors, such as network attached memory that is accessed via RF circuitry or external ports and a communications network, which may be the internet, one or more intranets, local area networks, wide area networks, storage area networks, and the like, or suitable combinations thereof. The memory controller may control access to the memory by other components of the device, such as the CPU and peripheral interfaces.
In an embodiment, the at least one processor 53 is connected to the at least one memory 52, and is configured to execute and implement at least one embodiment described in the above second method for detecting a hidden camera, such as the embodiment described in any one of fig. 5 to 9, when the at least one program is executed. In an embodiment, the processor 53 is operatively coupled to memory and/or non-volatile storage. More specifically, the processor may execute instructions stored in the memory and/or the non-volatile storage device to perform operations in the computing device, such as generating image data and/or transmitting image data to an electronic display. As such, the processor may include one or more general purpose microprocessors, one or more special purpose processors, one or more field programmable logic arrays, or any combination thereof.
In some embodiments, the first electronic device 50 further comprises an interface unit (not shown) for connecting a thermal imaging device to acquire consecutive thermal images acquired by the thermal imaging device.
In some embodiments, the first electronic device 50 further includes a camera device (not shown) for acquiring consecutive frame-related images of the consecutive thermal images, such as real-world images associated with each thermal image in a one-to-one correspondence.
In order to distinguish from the aforementioned electronic device, the electronic device is referred to as a second electronic device, please refer to fig. 14, which is a block diagram of the second electronic device in an embodiment, and as shown in the drawing, the second electronic device 60 of the present application includes: a display 61, a thermal imaging unit 62, at least one memory 63, and at least one processor 64.
In an embodiment, the second electronic device is, for example, an electronic device loaded with an APP application or having web/website access capability, and the electronic device includes components such as a memory, a memory controller, one or more processing units (CPUs), a peripheral interface, RF circuitry, audio circuitry, a speaker, a microphone, an input/output (I/O) subsystem, a display screen, other output or control devices, and an external port, which communicate via one or more communication buses or signal lines. The electronic device includes, but is not limited to, personal computers such as desktop computers, notebook computers, tablet computers, smart phones, smart televisions, and the like. The second electronic device may also be an electronic device composed of a host with a plurality of virtual machines and a human-computer interaction device (such as a touch display screen, a keyboard and a mouse) corresponding to each virtual machine.
The function of the display 61 is performed by a graphics module in the electronic device, which includes various known software components for rendering and displaying graphics on a touch screen, together with a controller for driving the display. Note that the term "graphic" includes any object that may be displayed to a user, including but not limited to text, web pages, icons (e.g., user interface objects including soft keys), digital images, videos, animations, and the like. The display screen is, for example, a touch screen, and provides both an output interface and an input interface between the device and the user. A touch screen controller receives/sends electrical signals from/to the touch screen, and the touch screen displays visual output to the user. This visual output may include text, graphics, video, and any combination thereof.
The thermal imaging unit 62 is coupled to the at least one processor 64 for scanning the current scene to acquire successive frames of thermal images of the current scene. The thermal imaging unit 62 is a device that converts the temperature distribution of a target object into a thermal image by means of infrared radiation detection, signal processing, photoelectric conversion, and the like, using infrared thermal imaging technology. The thermal imaging unit 62 generally includes an optical-mechanical assembly, a focusing/zooming assembly, an internal non-uniformity correction assembly (hereinafter referred to as an internal correction assembly), an imaging circuit assembly, and an infrared detector/refrigerator assembly, where the optical-mechanical assembly mainly includes an infrared objective lens and structural members; these components cooperate to acquire consecutive frames of thermal images of the current scene.
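As a rough, non-limiting illustration of the temperature-to-image conversion performed by the thermal imaging unit 62 (in the device itself this is done by the detector and imaging circuit assemblies described above), the following Python sketch maps a 2-D temperature distribution onto 8-bit pixel intensities. The function name temperatures_to_thermal_image and the toy 4x4 scene are hypothetical and only illustrate the idea that a warm component, such as a powered camera, stands out as a bright region in the resulting thermal frame.

import numpy as np

def temperatures_to_thermal_image(temp_c: np.ndarray) -> np.ndarray:
    """Map a 2-D array of temperatures (deg C) onto an 8-bit grayscale thermal image.

    Hypothetical helper: the patent's thermal imaging unit performs this step in
    hardware; this sketch only illustrates turning a temperature distribution
    into pixel intensities.
    """
    t_min, t_max = float(temp_c.min()), float(temp_c.max())
    span = max(t_max - t_min, 1e-6)               # avoid division by zero on a flat scene
    normalized = (temp_c - t_min) / span          # scale to 0.0 .. 1.0
    return (normalized * 255).astype(np.uint8)    # hotter pixels appear brighter

# Example: a 4x4 "scene" at 22 deg C with one hot spot (e.g., a powered lens/IR emitter)
scene = np.full((4, 4), 22.0)
scene[1, 2] = 35.0
frame = temperatures_to_thermal_image(scene)
print(frame)  # the hot spot maps to 255, the background to 0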
The at least one memory 63 is for storing at least one program; in embodiments, the memory may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In certain embodiments, the memory may also include memory that is remote from the one or more processors, such as network attached memory that is accessed via RF circuitry or external ports and a communications network, which may be the internet, one or more intranets, local area networks, wide area networks, storage area networks, and the like, or suitable combinations thereof. The memory controller may control access to the memory by other components of the device, such as the CPU and peripheral interfaces.
In an embodiment, the at least one processor 64 is connected to the at least one memory 63, and is configured to execute and implement at least one embodiment described in the above second method for detecting a hidden camera, such as the embodiment described in any one of fig. 5 to 9, when the at least one program is executed. In an embodiment, the processor 64 is operatively coupled to memory and/or non-volatile storage. More specifically, the processor may execute instructions stored in the memory and/or the non-volatile storage device to perform operations in the computing device, such as generating image data and/or transmitting image data to an electronic display. As such, the processor may include one or more general purpose microprocessors, one or more special purpose processors, one or more field programmable logic arrays, or any combination thereof.
In some embodiments, the second electronic device 60 further includes a camera device (not shown) for acquiring consecutive frame-related images of the consecutive thermal images, such as live-action images associated with each thermal image in a one-to-one correspondence.
The present application also provides a computer-readable and writable storage medium storing a computer program that, when executed, implements at least one of the embodiments described above for the first method of detecting a hidden camera, such as the embodiments described in any of fig. 1 to 4, or implements at least one of the embodiments described above for the second method of detecting a hidden camera, such as the embodiments described in any of fig. 5 to 9.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application.
In the embodiments provided herein, the computer-readable and writable storage medium may include read-only memory, random-access memory, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory, a USB flash drive, a removable hard disk, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable and writable storage media and data storage media do not include connections, carrier waves, signals, or other transitory media, but are instead non-transitory, tangible storage media. Disk and disc, as used in this application, include Compact Disc (CD), laser disc, optical disc, Digital Versatile Disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
In one or more exemplary aspects, the functions described in the computer program of the methods described herein may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. The steps of a method or algorithm disclosed herein may be embodied in a processor-executable software module, which may be located on a tangible, non-transitory computer-readable and/or writable storage medium. Tangible, non-transitory computer readable and writable storage media may be any available media that can be accessed by a computer.
The flowcharts and block diagrams in the figures described above of the present application illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The above embodiments are merely illustrative of the principles and utilities of the present application and are not intended to limit the application. Any person skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present application. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical concepts disclosed in the present application shall be covered by the claims of the present application.

Claims (34)

1. A method for detecting a hidden camera is characterized by comprising the following steps:
acquiring a plurality of frames of thermal images acquired by a thermal imaging device;
determining a target thermal image containing a hidden camera from the multiple frames of thermal images to generate a detection result;
and sending the detection result to an electronic device for reminding a user of privacy security risks.
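For orientation only, the following Python sketch strings together the three steps of claim 1. The names thermal_source, detector, notify and the dictionary layout of detection_result are hypothetical interfaces assumed for illustration; they are not part of the claimed method.

def detect_hidden_camera(thermal_source, detector, notify, device):
    """Minimal sketch of the three steps of claim 1 under assumed interfaces."""
    detection_result = {"found": False, "target_frames": []}
    # Step 1: acquire multiple frames of thermal images from the thermal imaging device
    for frame_id, frame in thermal_source.frames():
        # Step 2: determine whether this frame is a target thermal image containing a hidden camera
        if detector.contains_hidden_camera(frame):
            detection_result["found"] = True
            detection_result["target_frames"].append(frame_id)
    # Step 3: send the detection result to the electronic device to warn the user of privacy risks
    notify(device, detection_result)
    return detection_result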
2. The method of claim 1, wherein the step of obtaining a plurality of frames of thermal images obtained by a thermal imaging device comprises obtaining identification data corresponding to each frame of thermal image.
3. The method for detecting a hidden camera according to claim 2, wherein the detection result includes at least one of: the target thermal image or the identification data corresponding to the target thermal image, an associated image of the target thermal image determined according to the identification data, position information of the hidden camera in the target thermal image or in the associated image thereof, privacy security warning information, or information indicating that no hidden camera was found.
4. The method for detecting a hidden camera according to claim 3, wherein the identification data corresponding to the target thermal image is used for matching against consecutive frames of thermal images pre-stored in the electronic device, or against consecutive frames of associated images, pre-stored in the electronic device, that are associated with the consecutive frames of thermal images.
5. The method for detecting the hidden camera according to claim 4, wherein the associated image is a live-action image with an overlapping field-of-view area, acquired synchronously with each frame of thermal image.
6. The method for detecting a hidden camera according to claim 3, wherein the position information of the hidden camera in the target thermal image comprises at least one of a set of positioning data, a positioning mark marked in the target thermal image, and a position relative to a reference, wherein the set of positioning data comprises the identification data corresponding to the target thermal image and the pixel coordinates of the hidden camera in the target thermal image.
7. The method for detecting the hidden camera according to claim 3, wherein the position information of the hidden camera in the associated image of the target thermal image comprises a positioning mark marked in the associated image; wherein the positioning mark in the associated image is obtained by mapping the position of the hidden camera in the target thermal image into the associated image based on the spatial relationship between the thermal imaging device and the camera device that acquires the associated image.
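Claim 7 leaves the mapping between the two sensors unspecified beyond being "based on the spatial position between the thermal imaging device and the camera device". One possible realization, assuming the two views can be related by a pre-calibrated planar homography, is sketched below; the matrix H and the helper map_thermal_pixel_to_rgb are assumptions made for illustration, not the patent's method.

import numpy as np

def map_thermal_pixel_to_rgb(px_thermal, H_thermal_to_rgb):
    """Project a pixel (u, v) from the thermal image into the associated live-action
    image using a pre-calibrated 3x3 planar homography. This is one way to encode
    the spatial relationship between the two cameras; other calibrations (full
    extrinsics plus depth) would work equally well.
    """
    u, v = px_thermal
    p = H_thermal_to_rgb @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical calibration result for thermal and RGB sensors mounted side by side
H = np.array([[1.05, 0.00, 12.0],
              [0.00, 1.05,  8.0],
              [0.00, 0.00,  1.0]])
print(map_thermal_pixel_to_rgb((160, 120), H))  # -> where to draw the mark in the live-action frame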
8. The method for detecting the hidden camera according to claim 3, wherein the privacy security warning information includes any one or any combination of vibration instruction information, voice prompt information, text prompt information and picture prompt information.
9. The method for detecting a hidden camera according to claim 2, wherein the identification data comprises attitude data and/or time data of the thermal imaging device.
10. The method for detecting the hidden camera according to claim 9, further comprising the steps of acquiring a field angle parameter of the thermal imaging device, and:
upon detecting that image uploading from the thermal imaging device has finished, calculating a scanning coverage range based on the attitude data and the field angle parameter to generate progress information;
and sending the progress information to the electronic device, so as to remind the user of the completeness of the detection.
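The coverage calculation in claim 10 is not spelled out. A minimal sketch, assuming coverage is tracked on a single yaw axis with a fixed horizontal field angle and a 1-degree grid, might look like the following; the function scan_progress, its parameters, and the full 360-degree sweep target are illustrative assumptions.

def scan_progress(yaw_angles_deg, horizontal_fov_deg, target_sweep_deg=360.0):
    """Estimate horizontal scanning coverage from per-frame yaw (attitude data)
    and the camera's horizontal field angle, as a fraction of a full sweep.
    Simplified to one rotation axis; pitch could be handled the same way.
    """
    covered = set()
    half_fov = horizontal_fov_deg / 2.0
    for yaw in yaw_angles_deg:
        lo, hi = int(yaw - half_fov), int(yaw + half_fov)
        for deg in range(lo, hi + 1):
            covered.add(deg % 360)          # record each 1-degree bin seen by some frame
    return min(len(covered) / target_sweep_deg, 1.0)

# Frames captured while the user pans the device from 0 to 90 degrees in 10-degree steps
progress = scan_progress(yaw_angles_deg=range(0, 100, 10), horizontal_fov_deg=50.0)
print(f"scan coverage: {progress:.0%}")     # roughly 140 of 360 degrees covered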
11. The method for detecting the hidden camera according to claim 1, wherein the step of determining the target thermal image containing the hidden camera from the multiple frames of thermal images to generate the detection result comprises: inputting the multiple frames of thermal images into a preset detection model to identify a target thermal image containing the hidden camera.
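Claim 11 only requires a "preset detection model". Given the G06N3 neural-network classification of this application, one plausible but purely illustrative realization is a small convolutional classifier scored per frame, sketched here in PyTorch; the architecture, layer sizes, and the 0.5 threshold are assumptions, not anything disclosed by the patent.

import torch
import torch.nn as nn

class ThermalSpotDetector(nn.Module):
    """Illustrative stand-in for the preset detection model: a tiny CNN that
    scores a single-channel thermal frame for a camera-like hot spot."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 1)

    def forward(self, x):                      # x: (batch, 1, H, W) normalized thermal frames
        z = self.features(x).flatten(1)
        return torch.sigmoid(self.head(z))     # probability that the frame contains a hidden camera

model = ThermalSpotDetector().eval()
frames = torch.rand(4, 1, 120, 160)            # a batch of four thermal frames
with torch.no_grad():
    scores = model(frames)
target_mask = scores.squeeze(1) > 0.5          # frames flagged as target thermal images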
12. The method of claim 1, wherein the step of sending the detection result to an electronic device comprises:
and when it is detected that image uploading from the thermal imaging device has ended, sending the detection result to the electronic device; or sending the detection result to the electronic device in real time.
13. A hidden camera detection system, comprising:
the acquisition module is used for acquiring a plurality of frames of thermal images acquired by a thermal imaging device and identification data corresponding to each frame of thermal image;
the detection module is used for determining a target thermal image containing a hidden camera from the multi-frame thermal image to generate a detection result;
and the sending module is used for sending the detection result to an electronic device so as to remind the user of privacy security risks.
14. A cloud server system, comprising:
at least one storage device for storing at least one program;
at least one processing device, connected to the storage device, for executing and implementing the method for detecting a hidden camera according to any one of claims 1 to 12 when running the at least one program.
15. A method for detecting a hidden camera is characterized by comprising the following steps:
displaying the acquired video stream in a preview interface when it is detected that a thermal imaging device is operating;
when a detection result of the hidden camera is received, generating prompt information related to the detection result for display in the preview interface; wherein the detection result is obtained by a hidden camera detection system detecting the multiple frames of thermal images acquired by the thermal imaging device.
16. The method for detecting a hidden camera according to claim 15, wherein the video stream comprises consecutive frames of thermal images acquired by the thermal imaging device and/or consecutive frame-related images associated with the consecutive frames of thermal images, and the consecutive frame-related images are live-action images with overlapping field-of-view areas acquired synchronously with the consecutive frames of thermal images.
17. The method for detecting a hidden camera according to claim 15, wherein the step of displaying the acquired video stream in a preview interface when it is detected that a thermal imaging device is operating comprises: acquiring identification data corresponding to each frame of thermal image in the consecutive frames of thermal images acquired by the thermal imaging device, wherein the identification data comprises attitude data and/or time data of the thermal imaging device.
18. The method for detecting a hidden camera according to claim 17, further comprising the steps of acquiring a field angle parameter of the thermal imaging device, and:
upon detecting that the thermal imaging device has finished scanning, calculating a scanning coverage range based on the attitude data and the field angle parameter to generate progress information;
and displaying the progress information in the preview interface, so as to remind the user of the completeness of the detection.
19. The method according to claim 15, wherein the detection result comprises a target thermal image containing the hidden camera or an associated image of the target thermal image, and the prompt information displayed in the preview interface is the target thermal image or the associated image.
20. The method of claim 15, wherein the detection result comprises identification data of a target thermal image containing the hidden camera, and the step of generating the prompt information related to the detection result comprises:
matching, according to the identification data, the consecutive frames of images acquired while the thermal imaging device is operating, or the consecutive frame associated images associated with those images, to determine a target thermal image or a target associated image containing the hidden camera, wherein the prompt information displayed in the preview interface is the target thermal image or the target associated image.
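A minimal sketch of the matching step in claim 20, assuming the electronic device keeps a buffer keyed by identification data (for example capture timestamps) that maps each identifier to the locally stored thermal frame and its associated live-action frame; the buffer layout and the field name "target_frames" are hypothetical, reusing the result structure from the earlier sketch.

def find_target_frames(detection_result, frame_buffer):
    """Match identification data from the detection result against locally
    buffered frames to recover the target thermal image and/or its associated
    live-action image for display in the preview interface."""
    matches = []
    for ident in detection_result.get("target_frames", []):   # identification data of target frames
        if ident in frame_buffer:                              # frame_buffer: ident -> (thermal, associated)
            matches.append(frame_buffer[ident])
    return matches

# Example with timestamps as identification data
buffer = {"2020-04-03T10:15:02.100": ("thermal_frame_17", "rgb_frame_17")}
result = {"found": True, "target_frames": ["2020-04-03T10:15:02.100"]}
print(find_target_frames(result, buffer))   # -> [("thermal_frame_17", "rgb_frame_17")]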
21. The method for detecting a hidden camera according to claim 15, wherein the detection result comprises position information of the hidden camera in a target thermal image or in an associated image thereof.
22. The method of claim 21, wherein the position information is a set of positioning data, and the step of generating the prompt information related to the detection result comprises:
determining, based on the set of positioning data, the position of the hidden camera in the consecutive frames of thermal images acquired while the thermal imaging device is operating, or in the consecutive frame associated images associated with those thermal images;
and setting a positioning mark according to the determined position, to be displayed in the preview interface.
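The last step of claim 22, setting a positioning mark for display in the preview interface, could be rendered, for example, with OpenCV as below; the marker style, the label text, and the frame size are illustrative choices rather than anything specified by the patent.

import cv2
import numpy as np

def overlay_positioning_mark(preview_frame: np.ndarray, xy: tuple) -> np.ndarray:
    """Draw a positioning mark at the mapped hidden-camera location on the frame
    shown in the preview interface."""
    marked = preview_frame.copy()
    cv2.circle(marked, xy, 12, (0, 0, 255), 2)                 # red ring around the suspect spot
    cv2.putText(marked, "suspected camera", (xy[0] + 16, xy[1]),
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)  # short label next to the ring
    return marked

# Hypothetical 480x640 live-action preview frame and a mapped pixel position
frame = np.zeros((480, 640, 3), dtype=np.uint8)
annotated = overlay_positioning_mark(frame, (320, 200))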
23. The method for detecting the hidden camera according to claim 21, wherein the position information is a positioning mark marked in the target thermal image or a positioning mark marked in an associated image of the target thermal image, and the prompt information displayed in the preview interface is the target thermal image with the positioning mark or the associated image with the positioning mark.
24. The method according to claim 21, wherein the position information is a position of the hidden camera with respect to a reference, and the prompt information displayed in the preview interface is the position information of the hidden camera with respect to the reference.
25. The method according to claim 15, wherein the detection result includes privacy security warning information or information indicating that no hidden camera was found, and the prompt information is the privacy security warning information or the information indicating that no hidden camera was found.
26. The method for detecting the hidden camera according to claim 25, wherein the privacy security warning message includes any one or any combination of a vibration prompt message, a voice prompt message, a text prompt message, and a picture prompt message.
27. The method for detecting the hidden camera according to claim 15, wherein the prompt information is displayed in the preview interface in real time or when the thermal imaging device finishes scanning.
28. A hidden camera detection system, comprising:
the acquisition module is used for synchronously acquiring a video stream when it is detected that a thermal imaging device is operating, and for receiving a detection result of the hidden camera, wherein the detection result is obtained by a hidden camera detection system detecting a plurality of frames of thermal images acquired by the thermal imaging device;
the generating module is used for generating prompt information related to the detection result to be displayed in a preview interface when the detection result of the hidden camera is received;
the display module is used for displaying the synchronously acquired video stream in the preview interface; and displaying prompt information related to the detection result in the preview interface.
29. An electronic device, comprising:
a display;
at least one memory for storing at least one program;
at least one processor, coupled to the at least one memory, configured to execute and implement the method for detecting a hidden camera according to any of claims 15 to 27 when running the at least one program.
30. The electronic device of claim 29, further comprising an interface unit for interfacing with a thermal imaging device to acquire successive frames of thermal images acquired by the thermal imaging device.
31. The electronic device of claim 30, further comprising a camera device for capturing successive frame-related images of said successive frames of thermal images.
32. An electronic device, comprising:
a display;
a thermal imaging unit for scanning the current scene to acquire successive frames of thermal images of the current scene;
at least one memory for storing at least one program;
at least one processor, coupled to the at least one memory and the thermal imaging unit, configured to execute and implement the method for detecting a hidden camera according to any of claims 15 to 27 when running the at least one program.
33. The electronic device according to claim 32, further comprising a camera device for acquiring successive frame-related images of said successive frames of thermal images.
34. A computer-readable storage medium, in which at least one program is stored, wherein the at least one program, when executed by a processor, implements the method for detecting a hidden camera according to any one of claims 1 to 12 or the method for detecting a hidden camera according to any one of claims 15 to 27.
CN202010260452.9A 2020-04-03 2020-04-03 Method, system, device and storage medium for detecting hidden camera Pending CN111553196A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010260452.9A CN111553196A (en) 2020-04-03 2020-04-03 Method, system, device and storage medium for detecting hidden camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010260452.9A CN111553196A (en) 2020-04-03 2020-04-03 Method, system, device and storage medium for detecting hidden camera

Publications (1)

Publication Number Publication Date
CN111553196A true CN111553196A (en) 2020-08-18

Family

ID=72005582

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010260452.9A Pending CN111553196A (en) 2020-04-03 2020-04-03 Method, system, device and storage medium for detecting hidden camera

Country Status (1)

Country Link
CN (1) CN111553196A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113205062A (en) * 2020-12-28 2021-08-03 武汉纺织大学 Virtual dress trying-on system capable of displaying trying-on effect in real time
KR102337790B1 (en) * 2021-06-07 2021-12-09 (주)지슨 Hidden camera detector and method using temperature change
CN113891067A (en) * 2021-09-24 2022-01-04 浙江大学 Wireless network camera positioning method and device, storage medium and electronic equipment
WO2023045519A1 (en) * 2021-09-27 2023-03-30 中兴通讯股份有限公司 Hidden camera detection method, terminal device, electronic device, and storage medium
WO2023155567A1 (en) * 2022-02-16 2023-08-24 Oppo广东移动通信有限公司 Information interaction method and related device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002122678A (en) * 2001-01-30 2002-04-26 Masanobu Kujirada Detector of camera, etc.
JP2002156464A (en) * 2001-01-30 2002-05-31 Masanobu Kujirada Apparatus and method for detection of hidden camera
CN102420898A (en) * 2011-09-27 2012-04-18 惠州Tcl移动通信有限公司 Mobile phone-based panoramic photographing realization method and mobile phone
CN106303198A (en) * 2015-05-29 2017-01-04 小米科技有限责任公司 Photographing information acquisition methods and device
CN109272549A (en) * 2018-08-31 2019-01-25 维沃移动通信有限公司 A kind of location determining method and terminal device of infrared heat point
CN110223284A (en) * 2019-06-11 2019-09-10 深圳市启芯众志科技有限公司 A kind of detection method and detection device of the pinhole cameras based on intelligent terminal
CN110233970A (en) * 2019-06-27 2019-09-13 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment, computer readable storage medium

Similar Documents

Publication Publication Date Title
EP3466070B1 (en) Method and device for obtaining image, and recording medium thereof
CN111553196A (en) Method, system, device and storage medium for detecting hidden camera
KR102593824B1 (en) Method for controlling a camera and electronic device thereof
KR102620138B1 (en) Method for Outputting Screen and the Electronic Device supporting the same
US9661214B2 (en) Depth determination using camera focus
WO2020253655A1 (en) Method for controlling multiple virtual characters, device, apparatus, and storage medium
KR102576654B1 (en) Electronic apparatus and controlling method thereof
CN108304075B (en) Method and device for performing man-machine interaction on augmented reality device
US8938558B2 (en) Modifying functionality based on distances between devices
CN107562288B (en) Response method based on infrared touch device, infrared touch device and medium
WO2019179237A1 (en) Satellite-view electronic map acquisition method and device, apparatus, and storage medium
US10733799B2 (en) Augmented reality sensor
US11132842B2 (en) Method and system for synchronizing a plurality of augmented reality devices to a virtual reality device
US11637968B2 (en) Image photographing method of electronic device and electronic device
CN107077200B (en) Reflection-based control activation
CN111988729A (en) Discovery and connection of remote devices
WO2016124146A1 (en) Display device camouflage/recovery system and control method
KR102651793B1 (en) Computer readable recording medium and electronic apparatus for performing video call
US10366495B2 (en) Multi-spectrum segmentation for computer vision
CN110134902B (en) Data information generating method, device and storage medium
CN111383251B (en) Method, device, monitoring equipment and storage medium for tracking target object
JP7293362B2 (en) Imaging method, device, electronic equipment and storage medium
US20170076144A1 (en) Video display system, image display control method, and recording medium storing image display control program
CN114600162A (en) Scene lock mode for capturing camera images
CN112804481B (en) Method and device for determining position of monitoring point and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination