CN109272549B - Method for determining position of infrared hotspot and terminal equipment - Google Patents


Info

Publication number
CN109272549B
CN109272549B (granted from application CN201811012517.7A)
Authority
CN
China
Prior art keywords
image
infrared
terminal device
hot spot
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811012517.7A
Other languages
Chinese (zh)
Other versions
CN109272549A (en)
Inventor
王勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201811012517.7A
Publication of CN109272549A
Application granted
Publication of CN109272549B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G06T2207/10048 Infrared image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Telephone Function (AREA)

Abstract

The embodiment of the invention provides a method for determining the position of an infrared hotspot and terminal equipment, relates to the field of communication technology, and aims to solve the problem that a hidden camera cannot be found accurately and efficiently in the prior art. The method comprises the following steps: receiving a first input of a user while a first image is displayed on a first interface; in response to the first input, displaying a first identifier at a first position of the first image according to a second image; wherein the first image and the second image are images acquired by a non-infrared camera and an infrared camera, respectively, of the same photographic subject in a bright environment and a dark environment, and the second image comprises M infrared hot spots; the position of each infrared hot spot in the second image is associated with its position in the first image; the first position is the position of a target infrared hot spot, the target infrared hot spot is at least one of the M infrared hot spots, and M is greater than or equal to 1.

Description

Method for determining position of infrared hotspot and terminal equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a method for determining the position of an infrared hotspot and terminal equipment.
Background
Nowadays, with the continuous improvement of people's living standards, opportunities to travel and go on business trips are also increasing, and staying in hotels has become commonplace. However, for profit, unscrupulous or lawless persons may install hidden cameras (e.g., pinhole cameras) in hotels to capture the privacy of others, which can have serious adverse effects on the individuals being spied upon and their families.
At present, however, a user can only rely on personal life experience to search for hidden cameras that may exist in a room, and therefore cannot find a hidden camera efficiently and accurately.
Disclosure of Invention
The embodiment of the invention provides a method for determining the position of an infrared hotspot and terminal equipment, which are used to solve the problem that the prior art cannot accurately and efficiently find a hidden camera.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, a method for determining a location of an infrared hotspot is provided, where the method includes:
receiving a first input of a user under the condition that a first image is displayed on a first interface;
displaying a first identifier on a first position of the first image according to a second image in response to the first input;
wherein the first image and the second image are images acquired by a non-infrared camera and an infrared camera, respectively, of the same photographic subject in a bright environment and a dark environment; the second image comprises M infrared hot spots; the position of each infrared hot spot in the second image is associated with the position of the infrared hot spot in the first image; the first position is the position of a target infrared hotspot, the target infrared hotspot is at least one of the M infrared hotspots, and M is greater than or equal to 1.
In a second aspect, an embodiment of the present invention further provides a terminal device, including:
the receiving module is used for receiving a first input of a user under the condition that the first image is displayed on the first interface;
the display module is used for responding to the first input received by the receiving module and displaying a first identifier on a first position of the first image according to a second image;
wherein the first image and the second image are images acquired by a non-infrared camera and an infrared camera, respectively, of the same photographic subject in a bright environment and a dark environment; the second image comprises M infrared hot spots; the position of each infrared hot spot in the second image is associated with the position of the infrared hot spot in the first image; the first position is the position of a target infrared hotspot, the target infrared hotspot is at least one of the M infrared hotspots, and M is greater than or equal to 1.
In a third aspect, an embodiment of the present invention provides a terminal device, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the method for determining a location of an infrared hotspot according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the method for determining a location of an infrared hotspot according to the first aspect.
In the embodiment of the present invention, a hidden camera (e.g., a pinhole camera) emits heat when operating and thereby produces an infrared signature. Accordingly, when the first interface displays the first image (i.e., the non-infrared image collected by the non-infrared camera) and the terminal device receives the first input of the user, the terminal device searches for the positions of all suspicious thermal points (i.e., infrared hot spots) according to the second image (i.e., the infrared image collected by the infrared camera). Because the first image and the second image are images acquired by the non-infrared camera and the infrared camera of the same photographic subject, the position of each infrared hot spot in the second image is associated with its position in the first image. The terminal device can therefore display the first identifier at the position of the target infrared hot spot (e.g., a hidden camera) on the first image to prompt the user of that position, so that the user can quickly locate the hidden camera and avoid privacy disclosure.
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a method for determining a position of an infrared hotspot according to an embodiment of the present invention;
fig. 3 is a schematic interface diagram applied to a method for determining a location of an infrared hotspot according to an embodiment of the present invention;
fig. 4 is a second schematic interface diagram applied to the method for determining a location of an infrared hot spot according to the embodiment of the present invention;
fig. 5 is a third schematic interface diagram applied to a method for determining a position of an infrared hot spot according to an embodiment of the present invention;
fig. 6 is a fourth schematic interface diagram applied to the method for determining a position of an infrared hot spot according to the embodiment of the present invention;
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 8 is a second schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that "/" in this context means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association between objects and indicates that three relationships are possible; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. "Plurality" means two or more.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" in an embodiment of the present invention is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of these words is intended to present related concepts in a concrete fashion.
The "position" mentioned in the embodiment of the present invention may be a point, or may be a region, and the size, area, shape, and the like of the region are not limited in this invention. For example, the first position of the first image may be a point on the first image or a region on the first image.
The "infrared camera" mentioned in the embodiments of the present invention generally appears paired with an infrared light generator (e.g., an infrared fill light). The infrared light generator generates infrared light to illuminate a photographic subject, and the infrared camera then shoots the subject illuminated by the infrared light to obtain an infrared image, which is generally a black-and-white or greenish image. It should be noted that, for better imaging, the infrared image is usually acquired in a dark environment. The dark environment refers to an environment with weak ambient light, for example an environment where the ambient light intensity is smaller than a preset value, such as a room at night or a room with blackout curtains drawn during the day.
The "non-infrared camera" mentioned in the embodiments of the present invention is any camera other than an infrared camera, for example, a typical front camera in a smartphone; the present invention is not limited thereto.
The terminal device in the embodiment of the invention can be a mobile terminal device and can also be a non-mobile terminal device. The mobile terminal device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), etc.; the non-mobile terminal device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present invention are not particularly limited.
In addition, the "infrared camera" and the "non-infrared camera" mentioned in the embodiment of the present invention may be an internal module integrated in the terminal device, that is, may be an internal module of the terminal device, or may be an external module that is detachable from the terminal device, which is not limited in this respect.
The execution main body of the method for determining the position of the infrared hotspot provided by the embodiment of the present invention may be the terminal device (including a mobile terminal device and a non-mobile terminal device), or may also be a functional module and/or a functional entity capable of implementing the method for determining the position of the infrared hotspot in the terminal device, which may be specifically determined according to actual use requirements, and the embodiment of the present invention is not limited. The following takes a terminal device as an example to exemplarily describe the method for determining the position of the infrared hotspot provided by the embodiment of the invention.
The terminal in the embodiment of the present invention may be a terminal having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment applied to the method for determining the location of an infrared hotspot according to the embodiment of the present invention, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the method for determining the location of the infrared hotspot provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the method for determining the location of the infrared hotspot may operate based on the android operating system shown in fig. 1. Namely, the processor or the terminal device can realize the method for determining the position of the infrared hotspot provided by the embodiment of the invention by running the software program in the android operating system.
The method for determining the position of an infrared hotspot according to the embodiment of the present invention is described below with reference to a flowchart of the method for determining the position of an infrared hotspot shown in fig. 2, where as shown in fig. 2, the method for determining the position of an infrared hotspot according to the embodiment of the present invention includes steps 201 and 202:
step 201: in the case where the first interface displays the first image, the terminal device receives a first input of a user.
In an embodiment of the present invention, the first input of the user may specifically be: a click by the user on the first interface, a press by the user on the first interface, a specific gesture by the user on the first interface, a click operation by the user on a specific control, or the like.
The clicking operation can be a single-click operation, a double-click operation or a continuous clicking operation for a preset number of times. The specific gesture may be any one of a single-click gesture, a sliding gesture, a pressure recognition gesture, a long-press gesture, a double-press gesture, and a double-click gesture.
Step 202: in response to the first input, the terminal device displays a first identifier at a first position of the first image according to the second image.
In an embodiment of the present invention, the first image is an image collected by a non-infrared camera, the second image is an image collected by an infrared camera, and the first image and the second image are images collected by the non-infrared camera and the infrared camera with respect to the same shooting object, respectively.
In an embodiment of the present invention, the first image may be an image acquired by the non-infrared camera in a bright environment, and the second image may be an image acquired by the infrared camera in a dark environment. The bright environment refers to an environment where the ambient light intensity is greater than a preset light intensity, for example, outdoors under the sun, in the daytime, or an indoor space into which sunlight penetrates.
For example, after selecting the position W to be photographed, the user may use the non-infrared camera to capture the non-infrared image in a bright environment, then convert the photographing environment from the bright environment to a dark environment (e.g., turn off the light, pull the curtain, etc.), and then continue to capture the infrared image in the dark environment using the infrared camera at the position W. And then, the terminal equipment analyzes the infrared image to extract all infrared hot spots, and then marks the position of the target infrared hot spot on the non-infrared image by using the first identification so as to prompt a user to check and eliminate hidden dangers.
For example, in a case where an infrared camera and a non-infrared camera are integrated in a terminal device, the user needs to capture the non-infrared image and the infrared image at the same position with the same photographing posture. Specifically, taking a mobile phone as an example, the mobile phone may record its posture when the first picture is taken and require the second picture to be taken in the same posture. For example, the posture during photographing can be obtained through the mobile phone's Gsensor: the phone judges whether its current placing direction is the same as the placing direction during the previous photographing, and reminds the user to adjust the phone until the two directions match, so as to obtain two images of the same photographic subject.
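The posture check described above can be sketched as a comparison of two accelerometer (Gsensor) readings: the gravity vector recorded at the first shot is compared with the live reading before the second shot. The function names and the 5-degree tolerance below are illustrative assumptions, not details from this embodiment.

```python
import math

def angle_between(v1, v2):
    """Angle in degrees between two 3-axis accelerometer readings."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos))

def same_pose(first_shot, live, tolerance_deg=5.0):
    """True when the phone's current orientation matches the first shot's."""
    return angle_between(first_shot, live) <= tolerance_deg

first = (0.1, 9.7, 0.3)       # gravity vector at the first (non-infrared) shot
aligned = (0.15, 9.68, 0.28)  # nearly identical reading: user may shoot
tilted = (4.0, 8.8, 0.5)      # clearly tilted: prompt the user to adjust

print(same_pose(first, aligned))  # True
print(same_pose(first, tilted))   # False
```

In practice the live reading would be polled repeatedly, with an on-screen prompt shown until `same_pose` returns true.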
In an embodiment of the present invention, that the first image and the second image are images collected by a non-infrared camera and an infrared camera, respectively, of the same photographic subject means that: the shape and position of the photographic subject in the first image are the same as those in the second image, or the proportion of the photographic subject whose shape and position are the same in both images is greater than a preset threshold. For example, the preset threshold may be 90%; that is, if 90% or more of the photographic subject has the same shape and position in the first image and the second image, the two images are regarded as images acquired of the same photographic subject.
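As a minimal illustration of this criterion, assuming each image's photographic subject is represented as a set of occupied pixel coordinates (an assumption made here for clarity; the embodiment does not specify a representation), the matching proportion can be computed as the fraction of the combined subject area that coincides:

```python
def matching_fraction(subject_a, subject_b):
    """Fraction of the combined subject area where the two images coincide."""
    union = subject_a | subject_b
    if not union:
        return 0.0
    return len(subject_a & subject_b) / len(union)

def same_subject(subject_a, subject_b, threshold=0.90):
    """Apply the preset threshold (90% in the example above)."""
    return matching_fraction(subject_a, subject_b) >= threshold

# Two nearly identical pixel masks: one pixel removed, one pixel added.
mask_rgb = {(x, y) for x in range(10) for y in range(10)}   # 100 pixels
mask_ir = (mask_rgb - {(0, 0)}) | {(10, 10)}                # 99 shared + 1 new

print(round(matching_fraction(mask_rgb, mask_ir), 3))  # 0.98
print(same_subject(mask_rgb, mask_ir))                 # True
```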
It should be noted that the position of the subject may be a position of the subject in the image or a relative position between the subject and another subject.
In an embodiment of the present invention, the second image includes: m infrared hot spots, wherein the infrared temperature corresponding to each infrared hot spot is greater than or equal to a preset threshold value, and M is greater than or equal to 1.
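One plausible way to extract the M infrared hot spots, sketched here as an assumption rather than this embodiment's actual implementation, is to threshold a per-pixel infrared temperature map and group the hot pixels into connected regions, each region being one hot spot whose centroid gives its position:

```python
def find_hotspots(temp_map, threshold):
    """Return (centroid_row, centroid_col) for each connected hot region."""
    rows, cols = len(temp_map), len(temp_map[0])
    hot = {(r, c) for r in range(rows) for c in range(cols)
           if temp_map[r][c] >= threshold}
    hotspots = []
    while hot:
        # Flood-fill one connected component (4-neighbourhood).
        stack = [hot.pop()]
        component = []
        while stack:
            r, c = stack.pop()
            component.append((r, c))
            for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if nb in hot:
                    hot.remove(nb)
                    stack.append(nb)
        cr = sum(r for r, _ in component) / len(component)
        cc = sum(c for _, c in component) / len(component)
        hotspots.append((cr, cc))
    return hotspots

# Toy temperature map: one 2x2 warm region and one isolated hot pixel.
temps = [
    [20, 20, 20, 20, 20],
    [20, 45, 46, 20, 20],
    [20, 44, 45, 20, 20],
    [20, 20, 20, 20, 50],
]
print(len(find_hotspots(temps, threshold=40)))  # M = 2 hot spots
```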
In an embodiment of the present invention, the first position of the first image is a position where a target infrared hot spot is located, and the target infrared hot spot is at least one of the M infrared hot spots.
In an embodiment of the present invention, the first identifier may be information such as a pattern, a picture, or text.
In an example, if the first identifier is a pattern, the shape of the pattern may be any possible shape, such as a circle, a rectangle, a triangle, a diamond, or a polygon, which may be determined according to actual use requirements; the embodiment of the present invention is not limited thereto. Further, the pattern may be displayed on the first image with high or low brightness.
In the embodiment of the present invention, since the first image and the second image are images captured by the non-infrared camera and the infrared camera respectively for the same shooting object, that is, the first image and the second image are images captured for the same scene (for example, images shot by the non-infrared camera and the infrared camera for the same corner of a certain room), the position of each infrared hot spot in the second image is associated with the position of the infrared hot spot in the first image.
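Under the assumption that the two images frame the same scene from the same pose (lens offsets between the two cameras are ignored in this sketch; a real device would need a calibration step), the association can be as simple as scaling a hot spot's pixel position from the infrared image's resolution to the non-infrared image's resolution:

```python
def map_position(ir_point, ir_size, rgb_size):
    """Map a pixel position from the infrared image to the non-infrared image."""
    (x, y), (iw, ih), (rw, rh) = ir_point, ir_size, rgb_size
    return (x * rw / iw, y * rh / ih)

# A hot spot at (80, 60) in a 320x240 infrared image lands at the
# corresponding position in a 1280x960 non-infrared image:
print(map_position((80, 60), (320, 240), (1280, 960)))  # (320.0, 240.0)
```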
Illustratively, after the terminal device determines M infrared hot spots in the second image, the positions of the infrared hot spots in the first image can be determined, and by analyzing the physical structure of the object, the objects at the positions of the infrared hot spots in the first image are identified, and the target infrared hot spots are screened out.
Example 1: Taking finding a hidden camera as an example, the following items in a room are, in general, ones in which a hidden camera may be installed: lighters, cartoon pendants, bath foam, beverage cans, smoke detectors, and the like. Therefore, the terminal device can screen out the possible positions of a hidden camera according to such prior knowledge and check them one by one to eliminate hidden dangers.
Example 2: the terminal device can determine whether each infrared hotspot is overlapped with the position of the normal circuit module, so as to determine whether the infrared hotspot is a target infrared hotspot. For example, if the infrared hot spot S1 overlaps with a normal circuit module, it is determined as a normal infrared hot spot, and for example, a circuit of a television generates heat, so that an infrared hot spot is generated, and at this time, the infrared hot spot can be removed; if the positions of the infrared hot spot S1 and the normal circuit module are not overlapped, namely the position of the infrared hot spot S1 is not supposed to have a circuit, the infrared hot spot is determined as a target infrared hot spot, for example, a place where an object such as a bed head, a wardrobe and the like cannot have a circuit, and the infrared hot spot is detected to be present, so that the infrared hot spot is determined as the target infrared hot spot.
For example, the terminal device may mark only the location of the target infrared hotspot on the first interface. As shown in fig. 3, after an RGB image of a room a is collected by an RGB camera of a mobile phone, the image is displayed on a first interface (31 in fig. 3), and at this time, when a user clicks the first interface 31, two target infrared hot spots, namely, a hot spot a (as shown in fig. 3, the hot spot is located on a wall surface) and a hot spot B (as shown in fig. 3, the hot spot is located on a wine bottle on a tea table) are displayed on the first interface and are identified by circles.
Optionally, in an embodiment of the present invention, the step 201 specifically includes the following steps:
step 201 a: the terminal device receives a first input of a user when the first split screen displays a first image and the second split screen displays a second image.
The first interface is an interface in the first split screen.
In the embodiment of the invention, the terminal device can split the interface of the display screen of the terminal device into the first split screen and the second split screen in a split screen mode, wherein the first split screen displays the first image, and the second split screen displays the second image.
For example, the terminal device may display the first image and the second image separately in a split screen format. As shown in fig. 4, after the user acquires an RGB image of the room a through the RGB camera of the mobile phone and acquires an infrared image of the room a at the same angle through the infrared camera of the mobile phone (the image is indicated as an infrared image by oblique lines in fig. 4), the RGB image is displayed on the interface of the first split screen (i.e., 31 in fig. 4), and the infrared image is displayed on the interface of the second split screen (i.e., 32 in fig. 4). At this time, when the user clicks the first interface 31, two target infrared hot spots, i.e., a hot spot a (as shown in fig. 4, the hot spot is located on a wine bottle on a tea table) and a hot spot B (as shown in fig. 4, the hot spot is located on a wall surface) are displayed on the first interface and are identified by circles.
Further optionally, in an embodiment of the present invention, after the step 201a, the method further includes:
step 201b 1: the terminal device receives a second input of the user.
In the embodiment of the present invention, the second input is used to trigger the terminal device to display the second image in a full screen.
For example, the second input may be an operation of the user on the second split screen, an operation of the user on the first split screen, an operation of the user on a specific control on the first split screen or the second split screen, a specific gesture of the user on the first split screen or the second split screen, or the like.
Step 201b 2: and the terminal equipment responds to the second input, exits the split screen state and displays the second image in a full screen mode.
Further optionally, in the embodiment of the present invention, the first image is a real-time non-infrared image currently acquired by a non-infrared camera; or, the second image is a real-time infrared image currently acquired by the infrared camera. For example, the terminal device may acquire an infrared image acquired by the infrared camera in advance, and then, when the non-infrared camera starts to take a picture, the terminal device displays a real-time non-infrared image (i.e., a preview image acquired by the non-infrared camera) acquired by the non-infrared camera currently on the first split screen, and displays an infrared image acquired by the infrared camera in advance on the second split screen, that is, the interface of the first split screen is a preview interface; or the terminal device may acquire a non-infrared image acquired by the non-infrared camera in advance, and then, when the infrared camera starts to take a picture, the non-infrared image acquired by the non-infrared camera in advance is displayed on the first split screen, and a real-time infrared image (i.e., a preview image acquired by the infrared camera) currently acquired by the infrared camera is displayed on the second split screen, that is, the interface of the second split screen is a preview interface.
Further optionally, in the embodiment of the present invention, when the infrared camera starts to take a picture, the terminal device may display a real-time infrared image (i.e., a preview image acquired by the infrared camera) currently acquired by the infrared camera on the second split screen, and display a panoramic image acquired by the non-infrared camera in advance on the first split screen. The panoramic image is synthesized by a plurality of continuous first images, and the plurality of first images are images shot by the non-infrared camera in the panoramic mode.
For example, the first split screen may be played from one end of the panoramic image in a video playing mode, and the content of the panoramic image displayed in real time in the first split screen is associated with the preview screen currently displayed in the second split screen (i.e. the shooting objects of the two are the same), for example, the playing speed of the panoramic image is the same as the moving speed of the screen in the preview interface.
Optionally, in the embodiment of the present invention, the target infrared hot spot includes N infrared hot spots, where N is greater than or equal to 1, and N is less than M. At this time, the method further includes the steps of:
step A1: and the terminal equipment displays the second identification on M-N second positions of the first image according to the second image.
The M-N second positions are positions of other infrared hot spots except the target infrared hot spot in the M infrared hot spots in the first image.
In an embodiment of the present invention, the second identifier may be information such as a pattern, a picture, or text.
In an example, if the second identifier is a pattern, the shape of the pattern may be any possible shape, such as a circle, a rectangle, a triangle, a diamond, or a polygon, which may be determined according to actual use requirements; the embodiment of the present invention is not limited thereto. Further, the pattern may be displayed on the first image with high or low brightness.
For example, the terminal device may mark the positions of all the infrared hot spots. As shown in fig. 5, after an RGB image of room A is collected by the RGB camera of a mobile phone, the image is displayed on the first interface (31 in fig. 5). If the user then clicks the first interface 31, 4 infrared hot spots are displayed on the first interface. Hot spot C (in fig. 5, located in a desk lamp) and hot spot D (in fig. 5, located in a television) are two normal infrared hot spots: the desk lamp and the television legitimately emit heat, so hot spot C and hot spot D can be excluded, and they are identified by square frames. Hot spot A (in fig. 5, located on the wall surface) and hot spot B (in fig. 5, located on the wine bottle on the tea table) are target infrared hot spots and are identified by circles.
Optionally, in an embodiment of the present invention, the method further includes:
Step A2: the terminal device displays a third identifier at M third positions of the second image.
The M third positions are the positions of the M infrared hotspots in the second image.
In the embodiment of the present invention, the third identifier may be a pattern, a picture, a character, or other such information.
In an example, if the third identifier is a pattern, the pattern may take any possible shape, such as a circle, a rectangle, a triangle, a diamond, or another polygon, which may be determined according to actual use requirements; the embodiment of the present invention is not limited thereto. Further, the pattern may be displayed on the second image with high or low brightness.
For example, the terminal device may mark the position of each infrared hotspot in both split screens at the same time. As shown in fig. 6, after the user acquires an RGB image of room A through the RGB camera of a mobile phone and acquires an infrared image of room A at the same angle through the infrared camera of the mobile phone (oblique lines in fig. 6 indicate the infrared image), the RGB image is displayed on the interface of the first split screen (i.e., 31 in fig. 6), and the infrared image is displayed on the interface of the second split screen (i.e., 32 in fig. 6). If the user then clicks the first interface 31, the four infrared hotspots (for their description, refer to the description of fig. 5 above) are displayed on the first interface and are identified by triangles on the second split-screen interface 32.
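The dual split-screen marking can be viewed as applying the same hotspot list to both views, with a per-screen identifier shape (circles on the RGB split screen, triangles on the infrared split screen in fig. 6). A minimal sketch, under the assumption that the two split screens show resolution-matched images so positions can be shared directly (all names are illustrative):

```python
def split_screen_markers(ir_hotspots, first_shape="circle", second_shape="triangle"):
    """Return the identifiers drawn on each split screen for the same
    set of infrared hotspots (positions given in shared coordinates)."""
    first = [(first_shape, p) for p in ir_hotspots]    # first split screen (RGB image)
    second = [(second_shape, p) for p in ir_hotspots]  # second split screen (IR image)
    return first, second

first, second = split_screen_markers([(40, 30), (100, 75)])
print(len(first), second[0][0])  # 2 triangle
```

If the two images differ in resolution, each position would first pass through a coordinate mapping of the kind sketched for step A1.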
It should be noted that the first identifier, the second identifier, and the third identifier may be the same or different, and this is not limited in this embodiment of the present invention.
It should be noted that step A2 may be executed before step 201 or after step 201; the present invention is not limited thereto.
According to the method for determining the position of an infrared hotspot provided by the embodiment of the present invention, a hidden camera (for example, a pinhole camera) emits heat when working and thus produces infrared radiation. Therefore, with the first image (i.e., the non-infrared image acquired by the non-infrared camera) displayed on the first interface, the terminal device, after receiving the first input of the user, searches for the positions of all suspicious heat points (i.e., the infrared hotspots) according to the second image (i.e., the infrared image acquired by the infrared camera). Meanwhile, since the first image and the second image are images acquired by the non-infrared camera and the infrared camera for the same shooting object, the position of each infrared hotspot in the second image is associated with its position in the first image. The terminal device can therefore display the first identifier at the position of the target infrared hotspot (such as the hidden camera) on the first image to prompt the user, so that the user can quickly locate the hidden camera and avoid privacy disclosure.
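The "search for the positions of all suspicious heat points" step is not specified algorithmically in the patent. One common approach is to threshold the thermal frame and take pixels above the threshold as hotspot candidates; the sketch below shows that idea in plain Python (the threshold value and the toy frame are illustrative assumptions):

```python
def find_hotspots(thermal, threshold=200):
    """Return (row, col) positions whose thermal reading exceeds the
    threshold: a crude stand-in for the patent's hotspot search."""
    hits = []
    for r, row in enumerate(thermal):
        for c, value in enumerate(row):
            if value > threshold:
                hits.append((r, c))
    return hits

# Toy 4x4 thermal frame: two pixels stand out against a cool background.
frame = [
    [20, 20, 20, 20],
    [20, 250, 20, 20],
    [20, 20, 20, 230],
    [20, 20, 20, 20],
]
print(find_hotspots(frame))  # [(1, 1), (2, 3)]
```

A production implementation would group adjacent hot pixels into blobs (connected-component labeling) and apply an adaptive threshold relative to the ambient temperature rather than a fixed constant.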
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention, and as shown in fig. 7, the terminal device 400 includes: a receiving module 401 and a display module 402, wherein:
the receiving module 401 is configured to receive a first input of a user when the first interface displays the first image.
A display module 402, configured to, in response to the first input received by the receiving module 401, display the first identifier at the first position of the first image according to the second image.
The first image and the second image are respectively images acquired, whether in a bright environment or a dark environment, by the non-infrared camera and the infrared camera for the same shooting object, and the second image includes M infrared hotspots. The position of each infrared hotspot in the second image is associated with the position of that infrared hotspot in the first image. The first position is the position of a target infrared hotspot, the target infrared hotspot is at least one of the M infrared hotspots, and M is greater than or equal to 1.
Optionally, the receiving module 401 is specifically configured to: receiving a first input of a user under the condition that the first split screen displays a first image and the second split screen displays a second image; the first interface is an interface in the first split screen.
Optionally, the first image is a real-time non-infrared image currently acquired by a non-infrared camera; or, the second image is a real-time infrared image currently acquired by the infrared camera.
Optionally, where the target infrared hotspot includes N infrared hotspots (N is greater than or equal to 1 and less than M), the display module 402 is further configured to display a second identifier at M-N second positions of the first image according to the second image; the M-N second positions are the positions, in the first image, of the infrared hotspots other than the target infrared hotspot among the M infrared hotspots.
Optionally, the display module 402 is further configured to display a third identifier at M third positions of the second image; the M third positions are positions of the M infrared hot spots in the second image.
According to the terminal device provided by the embodiment of the present invention, a hidden camera (for example, a pinhole camera) emits heat when working and thus produces infrared radiation. Therefore, with the first image (i.e., the non-infrared image acquired by the non-infrared camera) displayed on the first interface, the terminal device, after receiving the first input of the user, searches for the positions of all suspicious heat points (i.e., the infrared hotspots) according to the second image (i.e., the infrared image acquired by the infrared camera). Meanwhile, since the first image and the second image are images acquired by the non-infrared camera and the infrared camera for the same shooting object, the position of each infrared hotspot in the second image is associated with its position in the first image. The terminal device can therefore display the first identifier at the position of the target infrared hotspot (such as the hidden camera) on the first image to prompt the user, so that the user can quickly locate the hidden camera and avoid privacy disclosure.
The terminal device provided by the embodiment of the present invention can implement each process implemented by the terminal device in the above method embodiments, and is not described here again to avoid repetition.
Fig. 8 is a schematic diagram of a hardware structure of a terminal device for implementing various embodiments of the present invention, where the terminal device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the configuration of the terminal device 100 shown in fig. 8 does not constitute a limitation of the terminal device; the terminal device 100 may include more or fewer components than those shown, combine some components, or use a different arrangement of components. In the embodiment of the present invention, the terminal device 100 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 107 is configured to receive a first input of a user while the first image is displayed on the first interface; the processor 110 is configured to display a first identifier at a first position of the first image according to the second image in response to the first input received by the user input unit 107. The first image and the second image are respectively images acquired, whether in a bright environment or a dark environment, by the non-infrared camera and the infrared camera for the same shooting object, and the second image includes M infrared hotspots; the position of each infrared hotspot in the second image is associated with the position of that infrared hotspot in the first image; the first position is the position of a target infrared hotspot, the target infrared hotspot is at least one of the M infrared hotspots, and M is greater than or equal to 1.
According to the terminal device provided by the embodiment of the present invention, a hidden camera (for example, a pinhole camera) emits heat when working and thus produces infrared radiation. Therefore, with the first image (i.e., the non-infrared image acquired by the non-infrared camera) displayed on the first interface, the terminal device, after receiving the first input of the user, searches for the positions of all suspicious heat points (i.e., the infrared hotspots) according to the second image (i.e., the infrared image acquired by the infrared camera). Meanwhile, since the first image and the second image are images acquired by the non-infrared camera and the infrared camera for the same shooting object, the position of each infrared hotspot in the second image is associated with its position in the first image. The terminal device can therefore display the first identifier at the position of the target infrared hotspot (such as the hidden camera) on the first image to prompt the user, so that the user can quickly locate the hidden camera and avoid privacy disclosure.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used to receive and send signals during message transmission or a call. Specifically, after downlink data is received from a base station, it is passed to the processor 110 for processing; uplink data is transmitted to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal device 100 provides the user with wireless broadband internet access via the network module 102, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 and output.
The terminal device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
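The posture identification mentioned above (for example, horizontal/vertical screen switching) can be reduced to comparing the gravity components the accelerometer reports along the device axes. A minimal sketch of that idea, with illustrative axis values rather than any real sensor API:

```python
def screen_orientation(ax, ay):
    """Infer portrait vs landscape from the gravity components (m/s^2)
    along the device's x and y axes while the device is held still."""
    # Gravity dominates the axis pointing "down"; whichever axis carries
    # more of it tells us how the device is oriented.
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(screen_orientation(0.1, 9.7))  # portrait
print(screen_orientation(9.6, 0.3))  # landscape
```

A real implementation would low-pass filter the readings and add hysteresis so the screen does not flicker between orientations near the 45-degree boundary.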
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device 100. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 1071 using a finger, a stylus, or any suitable object or attachment). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to that type. Although in fig. 8 the touch panel 1071 and the display panel 1061 serve as two independent components to implement the input and output functions of the terminal device 100, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement those functions; this is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 100 or may be used to transmit data between the terminal apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the terminal device 100, connects various parts of the entire terminal device 100 by various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the terminal device 100. Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules that are not shown, and are not described in detail here.
Optionally, an embodiment of the present invention further provides a terminal device, which includes a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program, when executed by the processor, implements each process of the above-mentioned method for determining the position of an infrared hotspot and can achieve the same technical effect; details are not repeated here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned method for determining a location of an infrared hotspot, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A method for determining a location of an infrared hotspot, the method comprising:
receiving a first input of a user under the condition that a first image is displayed on a first interface;
displaying a first identifier on a first position of the first image according to a second image in response to the first input;
wherein the first image and the second image are respectively images acquired by a non-infrared camera and an infrared camera for the same shooting object, and the second image comprises M infrared hot spots; the position of each infrared hot spot in the second image is associated with the position of the infrared hot spot in the first image; the first position is the position of a target infrared hotspot, the target infrared hotspot is at least one of the M infrared hotspots, and M is greater than or equal to 1.
2. The method of claim 1, wherein receiving a first input from a user while the first interface displays the first image comprises:
receiving a first input of a user under the condition that the first split screen displays a first image and the second split screen displays a second image;
and the first interface is an interface in the first split screen.
3. The method of claim 2,
the first image is a real-time non-infrared image currently acquired by a non-infrared camera;
or the second image is a real-time infrared image currently acquired by the infrared camera.
4. The method of claim 1, wherein the target infrared hotspot comprises N infrared hotspots, N being greater than or equal to 1, N being less than M, the method further comprising:
displaying second marks on M-N second positions of the first image according to the second image;
the M-N second positions are positions of other infrared hot spots except the target infrared hot spot in the M infrared hot spots in the first image.
5. The method according to any one of claims 1 to 4, further comprising:
displaying a third mark on M third positions of the second image;
wherein the M third locations are locations of the M infrared hot spots in the second image.
6. A terminal device, comprising:
the receiving module is used for receiving a first input of a user under the condition that the first image is displayed on the first interface;
the display module is used for responding to the first input received by the receiving module and displaying a first identifier on a first position of the first image according to a second image;
wherein the first image and the second image are respectively images acquired by a non-infrared camera and an infrared camera for the same shooting object, and the second image comprises M infrared hot spots; the position of each infrared hot spot in the second image is associated with the position of the infrared hot spot in the first image; the first position is the position of a target infrared hotspot, the target infrared hotspot is at least one of the M infrared hotspots, and M is greater than or equal to 1.
7. The terminal device of claim 6, wherein the receiving module is specifically configured to:
receiving a first input of a user under the condition that the first split screen displays a first image and the second split screen displays a second image;
and the first interface is an interface in the first split screen.
8. The terminal device of claim 7,
the first image is a real-time non-infrared image currently acquired by a non-infrared camera;
or the second image is a real-time infrared image currently acquired by the infrared camera.
9. The terminal device of claim 6, wherein the target infrared hotspot comprises N infrared hotspots, N being greater than or equal to 1, N being less than M;
the display module is further configured to display a second identifier at M-N second positions of the first image according to the second image; the M-N second positions are positions of other infrared hot spots except the target infrared hot spot in the M infrared hot spots in the first image.
10. The terminal device according to any one of claims 6 to 9, wherein the display module is further configured to display a third identifier at M third positions of the second image;
wherein the M third locations are locations of the M infrared hot spots in the second image.
CN201811012517.7A 2018-08-31 2018-08-31 Method for determining position of infrared hotspot and terminal equipment Active CN109272549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811012517.7A CN109272549B (en) 2018-08-31 2018-08-31 Method for determining position of infrared hotspot and terminal equipment

Publications (2)

Publication Number Publication Date
CN109272549A CN109272549A (en) 2019-01-25
CN109272549B true CN109272549B (en) 2021-04-23

Family

ID=65154922

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811012517.7A Active CN109272549B (en) 2018-08-31 2018-08-31 Method for determining position of infrared hotspot and terminal equipment

Country Status (1)

Country Link
CN (1) CN109272549B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111553196A (en) * 2020-04-03 2020-08-18 北京三快在线科技有限公司 Method, system, device and storage medium for detecting hidden camera
CN112565502A (en) * 2020-11-13 2021-03-26 苏州熙烁数字科技有限公司 Intelligent privacy protection method and system based on signal identification

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102778978A (en) * 2012-06-18 2012-11-14 广州视睿电子科技有限公司 Touch system based on infrared light identification
CN102871784B (en) * 2012-09-21 2015-04-08 中国科学院深圳先进技术研究院 Positioning controlling apparatus and method
CN104835275A (en) * 2014-02-07 2015-08-12 上海炯歌电子科技有限公司 Infrared imaging alarm device used for burglarproof doors and windows
CN106548467B (en) * 2016-10-31 2019-05-14 广州飒特红外股份有限公司 The method and device of infrared image and visual image fusion
CN106683039B (en) * 2016-11-21 2020-10-02 云南电网有限责任公司电力科学研究院 System for generating fire situation map
KR20180093589A (en) * 2017-02-14 2018-08-22 엘아이지넥스원 주식회사 Apparatus and method for extracting feature point to fusing image about maneuvering target

Also Published As

Publication number Publication date
CN109272549A (en) 2019-01-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant