CN107682665B - Monitoring method and mobile terminal - Google Patents


Info

Publication number
CN107682665B
Authority
CN
China
Prior art keywords
mobile terminal
video image
monitored object
area
monitoring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710822912.0A
Other languages
Chinese (zh)
Other versions
CN107682665A (en)
Inventor
吴丽芳
黄华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201710822912.0A
Publication of CN107682665A
Application granted
Publication of CN107682665B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/0202 Child monitoring systems using a transmitter-receiver system carried by the parent and the child
    • G08B21/0261 System arrangements wherein the object is to detect trespassing over a fixed physical boundary, e.g. the end of a garden

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Child & Adolescent Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Alarm Systems (AREA)
  • Telephone Function (AREA)

Abstract

The invention provides a monitoring method and a mobile terminal. The mobile terminal is provided with a camera, and the method comprises the following steps: acquiring a video image of a monitored object through the camera; identifying the monitored object from the video image; judging an activity area where the monitored object is located; and executing preset reminding processing according to the activity area. The mobile terminal thus provides the user with a real-time monitoring and reminding function: parents can use the mobile terminal to watch over a child in places such as an amusement park or outdoors, and when the child moves far away from the parents or leaves the monitoring range of the mobile terminal, the mobile terminal issues a warning to remind the parents, so that the personal safety of the child is protected and the parents are freed from having to watch the child continuously.

Description

Monitoring method and mobile terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a monitoring method and a mobile terminal.
Background
With the improvement of the material quality of life, entertainment activities have become increasingly rich, and parents in particular attach great importance to their children taking part in such activities. However, today's entertainment venues are large, contain many facilities, and are usually full of other children as well. Once a child enters such a venue, the child tends to run and play everywhere, while most parents pick a fixed spot and watch the child from a distance. From time to time the parents chat with the people around them or play with their mobile phones, the child leaves the parents' sight, and finding the child again in the crowd takes considerable effort. If the child is not found in time, the child may be exposed to accidents or dangers, such as getting lost or even being abducted. Although current mobile terminals are increasingly powerful, none of them provides a function for monitoring a child's activities in real time.
Disclosure of Invention
The embodiments of the invention provide a monitoring method and a mobile terminal, and aim to solve the problem that mobile terminals in the prior art lack a real-time monitoring function.
According to an aspect of the embodiments of the present invention, there is provided a monitoring method applied to a mobile terminal having a camera, the method including:
acquiring a video image of a monitored object through the camera;
identifying the monitored object from the video image;
judging an activity area where the monitored object is located;
and executing preset reminding processing according to the activity area.
According to an aspect of the embodiments of the present invention, there is provided a mobile terminal having a camera, the mobile terminal including:
the video image acquisition module is used for acquiring a video image of the monitored object through the camera;
the monitoring object identification module is used for identifying the monitoring object from the video image;
the activity area judging module is used for judging the activity area of the monitored object;
the first reminding processing execution module is used for executing preset reminding processing according to the activity area.
According to still another aspect of the embodiments of the present invention, there is provided a mobile terminal including:
a processor and a memory;
the memory is used for storing a program for executing the monitoring method;
the processor is configured to execute programs stored in the memory.
According to still another aspect of the embodiments of the present invention, there is provided a computer-readable storage medium having a monitoring program stored thereon, the monitoring program, when executed by a processor, implementing the steps of the monitoring method described above.
According to the embodiments of the invention, the mobile terminal acquires a video image of the monitored object through the camera, identifies the monitored object from the video image, judges the activity area where the monitored object is located, and executes preset reminding processing according to the activity area. The mobile terminal thus provides the user with a real-time monitoring and reminding function: parents can use the mobile terminal to watch over a child in places such as an amusement park or outdoors, and when the child moves far away from the parents or leaves the monitoring range of the mobile terminal, the mobile terminal issues a warning to remind the parents, so that the personal safety of the child is protected and the parents are freed from having to watch the child continuously.
The foregoing is only an overview of the technical solutions of the present invention. Specific embodiments of the present invention are described below so that the technical means of the present invention can be understood more clearly and the above and other objects, features, and advantages of the present invention become more readily apparent.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
FIG. 1 is a flow chart illustrating steps of a monitoring method according to a first embodiment of the present invention;
FIG. 2 is a flow chart illustrating steps of a monitoring method according to a second embodiment of the present invention;
FIG. 3 is a block diagram of a mobile terminal according to a third embodiment of the present invention;
FIG. 4 is a second block diagram of a mobile terminal according to the third embodiment of the present invention;
FIG. 5 is a block diagram of a mobile terminal according to a fourth embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a mobile terminal according to a fifth embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
The monitoring method provided by the embodiment of the invention is described in detail.
Referring to fig. 1, a flowchart illustrating steps of a monitoring method in an embodiment of the present invention is shown, where the monitoring method is applied to a mobile terminal having a camera, and the method includes:
and 101, acquiring a video image of a monitored object through the camera.
In this embodiment, when the user needs to monitor, the user may start the monitoring function or the monitoring program. After monitoring is started, the camera is turned on, the user points the camera at the monitored object, the mobile terminal acquires a video image of the monitored object through the camera, and the user can view the video image of the monitored object on the screen of the mobile terminal. For example, a parent takes a child to an amusement place; during monitoring, the parent points the camera of the mobile phone at the child, and the camera acquires video images of the child in the amusement place.
Step 102, identifying the monitored object from the video image.
In this embodiment, after acquiring the video image of the monitored object, the mobile terminal identifies the monitored object. The identification may be performed in various ways, for example by recognizing the human face of the monitored object, or by recognizing its height, body shape, or the like. The identification mode is not limited in this embodiment of the invention and can be set according to the actual situation.
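As a purely illustrative sketch, face-based identification could be realised with an off-the-shelf detector; the library choice and function names below are assumptions of this description and not part of the claimed method.

```python
import cv2  # OpenCV is only an illustrative choice; the embodiment does not prescribe a library

# Stock frontal-face detector shipped with OpenCV (assumed here for illustration).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def find_face_candidates(frame_bgr):
    """One possible realisation of step 102: locate candidate faces in a video frame.
    A real implementation would then compare each candidate against the monitored
    object's enrolled face (or height / body shape) to pick out the monitored object."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```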
Step 103, judging the activity area where the monitored object is located.
In this embodiment, the mobile terminal first divides the area where the monitored object is located into a plurality of activity areas, and after identifying the monitored object, determines which activity area the monitored object is in. For example, it is determined whether the monitored object is in a safe area or in an unsafe area. The activity areas are not limited in this embodiment of the invention and can be set according to the actual situation.
Step 104, executing preset reminding processing according to the activity area.
In this embodiment, after the activity area where the monitored object is located has been determined, different reminding processes are executed for different activity areas. For example, if the monitored object is in a safe area, no prompt is given; if the monitored object is not in the safe area, an audible alarm can be sounded, the screen can flash, a video image can pop up on the screen, and so on. The reminding processing is not limited in this embodiment of the invention and can be set according to the actual situation.
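Taken together, steps 101 to 104 form a simple sensing loop. The following minimal sketch shows one way such a loop could be organised; the callables `get_frame`, `detect_monitored_object`, `classify_area` and `remind` are placeholders assumed for illustration only.

```python
import time

NOT_FOUND = "not_found"  # used when the monitored object cannot be located in the frame

def monitoring_loop(get_frame, detect_monitored_object, classify_area, remind,
                    interval_s=0.1):
    """Illustrative loop over steps 101-104: acquire a frame, identify the
    monitored object, judge its activity area, then execute the reminding."""
    while True:
        frame = get_frame()                          # step 101: video image from the camera
        target = detect_monitored_object(frame)      # step 102: face / height / body-shape match
        if target is None:
            remind(NOT_FOUND)                        # object occluded or outside the shot
        else:
            area = classify_area(target)             # step 103: which activity area?
            remind(area)                             # step 104: preset reminding processing
        time.sleep(interval_s)                       # sample a few times per second
```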
In summary, in the embodiment of the present invention, the mobile terminal obtains a video image of the monitored object through the camera, identifies the monitored object from the video image, judges the activity area where the monitored object is located, and executes preset reminding processing according to the activity area. The mobile terminal thus provides the user with a real-time monitoring and reminding function: parents can use the mobile terminal to watch over a child in places such as an amusement park or outdoors, and when the child moves far away from the parents or leaves the monitoring range of the mobile terminal, the mobile terminal issues a warning to remind the parents, so that the personal safety of the child is protected and the parents are freed from having to watch the child continuously.
Example two
Referring to fig. 2, a flow chart of the steps of a monitoring method in another embodiment of the invention is shown. The method is applied to the mobile terminal with the camera, and comprises the following steps:
Step 201, acquiring a video image of a monitored object through the camera.
Step 202, obtaining the appearance characteristics of the monitored object.
In this embodiment, before the monitored object is identified, the appearance features of the monitored object may be obtained in advance. For example, a photo or a video of the monitored object is taken, and the mobile terminal extracts the appearance features of the monitored object from the photo or the video; appearance features entered by the user may also be acquired. The way the features are obtained is not limited in this embodiment of the invention and can be set according to the actual situation.
Optionally, the appearance features include at least one of height, skin tone, hair style, and clothing. For example, the mobile terminal determines from the picture that the monitored object is about 120 cm tall, has fair skin and short hair, and wears a green short-sleeve shirt with cartoon patterns and a pair of colored shorts. The appearance features may also include facial features, worn accessories, and the like. This is not limited in detail in the embodiment of the present invention and may be set according to the actual situation.
After the appearance features of a monitored object are obtained, a feature table can be generated from them, and several monitored objects can be monitored simultaneously according to their respective feature tables. For example, after the appearance features of monitored object A are obtained, a feature table A' can be generated for monitored object A; after the appearance features of monitored object B are obtained, a feature table B' can be generated for monitored object B; the mobile terminal then monitors monitored object A and monitored object B simultaneously according to feature table A' and feature table B'.
Step 203, identifying the monitored object from the video image according to the appearance characteristics.
In this embodiment, after the mobile terminal obtains the appearance features, it identifies the monitored object in the video image according to those features. For example, monitored object A is identified from the video image as being about 120 cm tall, with fair skin and short hair, wearing a green short-sleeve shirt with a cartoon pattern and a pair of colored shorts. The more appearance features the mobile terminal acquires, the more accurately the monitored object can be identified.
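As an illustration of steps 202 and 203, a feature table can be represented as a small key-value record and matched against the attributes of each detected person. The field names and the simple vote-counting rule below are assumptions of this sketch, not requirements of the embodiment.

```python
# Assumed feature tables A' and B' for two monitored objects (step 202).
feature_tables = {
    "A": {"height_cm": 120, "skin": "fair", "hair": "short",
          "top": "green cartoon short-sleeve", "bottom": "colored shorts"},
    "B": {"height_cm": 110, "skin": "fair", "hair": "long",
          "top": "red dress", "bottom": "white sandals"},
}

def match_monitored_object(candidate, tables, min_matches=3):
    """Step 203 (sketch): count how many recorded appearance features a detected
    person shares with each table; the more features agree, the more confident
    the identification."""
    best_name, best_score = None, 0
    for name, table in tables.items():
        score = sum(1 for key, value in table.items() if candidate.get(key) == value)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= min_matches else None

# A detection that agrees with table "A" on four of five features is matched to A.
detected = {"height_cm": 120, "skin": "fair", "hair": "short",
            "top": "green cartoon short-sleeve", "bottom": "blue jeans"}
print(match_monitored_object(detected, feature_tables))  # -> "A"
```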
Step 204, judging the activity area where the monitored object is located.
In this embodiment, the activity area where the monitored object is located is determined. Specifically, the monitored object is judged to be in a safe area or an unsafe area according to the distance between the monitored object and the camera. For example, when the distance between the monitored object and the camera is less than 2 meters, the monitored object is in the safe area; when the distance is greater than 2 meters, the monitored object is in an unsafe area. The unsafe area includes a warning area or a dangerous area. For example, when the distance between the monitored object and the camera is greater than 2 meters and less than 3 meters, the monitored object is in the warning area; when the distance is greater than 3 meters, the monitored object is in the dangerous area. The sizes of the safe area, the warning area and the dangerous area are not limited in this embodiment of the invention and can be set according to the actual situation.
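Using the example thresholds above (2 meters and 3 meters), the area judgment of step 204 reduces to two comparisons; the sketch below keeps the thresholds configurable, since the embodiment does not fix them. How the distance itself is measured (for example from the apparent size of the monitored object in the frame) is not specified in the embodiment.

```python
SAFE, WARNING, DANGER = "safe", "warning", "danger"

def classify_activity_area(distance_m, safe_limit_m=2.0, warning_limit_m=3.0):
    """Step 204 (sketch): map the object-to-camera distance onto an activity area."""
    if distance_m < safe_limit_m:
        return SAFE           # within 2 m: safe area
    if distance_m < warning_limit_m:
        return WARNING        # between 2 m and 3 m: warning area
    return DANGER             # beyond 3 m: dangerous area

print(classify_activity_area(1.5))  # safe
print(classify_activity_area(2.4))  # warning
print(classify_activity_area(5.0))  # danger
```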
Step 205, executing preset reminding processing according to the activity area.
In this embodiment, different reminding processes are executed when the monitored object is in different activity areas. Specifically, if the monitored object is in the safe area, the video image is displayed in real time and the monitored object is highlighted. For example, when monitored object A is in the safe area, the mobile terminal displays the video image on the screen in real time and marks monitored object A in red in the video image. If the monitored object is in the unsafe area, at least one reminding process of sounding an alarm and generating a prompt pop-up window is executed. For example, if the monitored object is in the warning area, a short beeping alert is issued; when the monitored object is in the dangerous area, a long continuous alarm is sounded and a pop-up window is generated to prompt the user to pay attention to the monitored object, so that the user can adjust the distance to it. Furthermore, the time the monitored object spends in the unsafe area can be timed, and the audible alarm or the pop-up window can be generated only after a set duration is exceeded. For example, when the monitored object enters the unsafe area from the safe area and returns to the safe area within 15 seconds, no warning is given; when the time in the unsafe area exceeds 1 minute, an audible alarm is sounded; and when it exceeds 2 minutes, a pop-up window is generated. This is not limited in detail in the embodiment of the present invention and may be set according to the actual situation.
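The timed reminding described above (tolerate a brief excursion, sound an alarm after about a minute, pop up after about two) can be captured by a small state holder such as the following sketch; the concrete durations are the example values from this paragraph, not fixed requirements.

```python
import time

class UnsafeDwellReminder:
    """Sketch of step 205: decide the reminding action from the current activity
    area and the time already spent outside the safe area."""

    def __init__(self, beep_after_s=60.0, popup_after_s=120.0):
        self.beep_after_s = beep_after_s
        self.popup_after_s = popup_after_s
        self.entered_unsafe_at = None

    def update(self, area):
        if area == "safe":
            self.entered_unsafe_at = None              # back in the safe area: reset
            return "display_and_highlight"             # show the video, mark the object
        if self.entered_unsafe_at is None:
            self.entered_unsafe_at = time.monotonic()  # just left the safe area
        dwell = time.monotonic() - self.entered_unsafe_at
        if dwell >= self.popup_after_s:
            return "sound_alarm_and_popup"             # e.g. after 2 minutes
        if dwell >= self.beep_after_s:
            return "sound_alarm"                       # e.g. after 1 minute
        return "no_warning_yet"                        # e.g. a 15-second excursion
```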
Step 206, if the monitored object is not in any activity area, sending out a sound alarm and displaying a pre-stored video image.
In this embodiment, when the monitored object is blocked by an object, is outside the shooting area of the camera, or is otherwise not recognized by the mobile terminal, it can be determined that the monitored object is not in any activity area. In that case an audible alarm is sounded and, at the same time, a pre-stored video image can be displayed on the screen of the mobile terminal, so that the user can use it to find a monitored object that is blocked by an object or has left the shooting area of the camera.
Step 207, extracting and storing the video image of a first set duration before the reminding processing, so that the user can search for the monitored object according to the video image.
In this embodiment, the video image may be extracted and stored in the mobile terminal so that the user can search for the monitored object according to it. The stored video image may cover a first set duration immediately before the mobile terminal performs the reminding processing. For example, when monitored object A is blocked by an object, the mobile terminal sounds an alarm and at the same time extracts and stores the video of the 3 minutes before the alarm; the user can then review the stored video to find monitored object A. The first set duration is not limited in this embodiment of the present invention and may be set according to the actual situation.
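One straightforward way to keep the video of the first set duration available is a ring buffer of recent frames that is dumped to storage the moment the reminding is triggered. The sketch below assumes 3 minutes at 10 frames per second to match the example; the encoding of the saved clip is omitted.

```python
import collections
import time

class PreAlertVideoBuffer:
    """Sketch of step 207: retain the most recent frames so that the video of
    the first set duration before the reminding processing can be stored."""

    def __init__(self, seconds=180, fps=10):
        self.frames = collections.deque(maxlen=seconds * fps)

    def push(self, frame):
        """Called for every captured frame; old frames fall out automatically."""
        self.frames.append((time.time(), frame))

    def save_on_alert(self):
        """Called when the alarm fires; a real implementation would encode these
        frames into a clip in the mobile terminal's storage for later review."""
        return list(self.frames)
```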
Step 208, detecting whether the mobile terminal displays the video image.
In this embodiment, after the video image is acquired through the camera, it is detected whether the mobile terminal is displaying the video image. Specifically, it is detected whether the program running in the foreground of the mobile terminal is the monitoring program. For example, when the mobile terminal displays the video image, the monitoring program is running in the foreground and the user can follow the monitored object on the screen of the mobile terminal; when the mobile terminal does not display the video image, an application other than the monitoring program is running in the foreground.
Step 209, if the mobile terminal does not display the video image, starting timing.
In this embodiment, if the mobile terminal does not display the video image, that is, an application other than the monitoring program is running in the foreground, timing is started. For example, if the user switches to instant messaging software, timing starts when the mobile terminal displays the interface of the instant messaging software; if the user browses a web page, timing starts when the web page is opened.
Step 210, executing preset reminding processing when the counted accumulated duration exceeds a second set duration.
In this embodiment, the counted accumulated duration is the time for which the foreground of the mobile terminal has been running other applications. If the accumulated duration exceeds the second set duration, it indicates that the user has not paid attention to the monitored object for a long time, and the user needs to be reminded to do so. For example, if the user reads a novel with reading software and the cumulative reading time exceeds 10 minutes, the mobile terminal can sound an alarm or display the video image on the screen. This is not limited in detail in the embodiment of the present invention and may be set according to the actual situation.
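Steps 208 to 210 amount to timing how long the monitoring program has been out of the foreground. The sketch below models this as the continuous time since the monitoring screen was last shown, with the 10-minute example as the default limit; how the foreground application is actually detected is platform specific and left out of the sketch.

```python
import time

class ForegroundAttentionTimer:
    """Sketch of steps 208-210: start timing when the video image is no longer
    displayed and report when the accumulated time exceeds the second set duration."""

    def __init__(self, limit_s=600.0):
        self.limit_s = limit_s
        self.left_foreground_at = None

    def update(self, monitoring_in_foreground):
        if monitoring_in_foreground:
            self.left_foreground_at = None              # user is watching the video image
            return False
        if self.left_foreground_at is None:
            self.left_foreground_at = time.monotonic()  # another application took the foreground
        away_s = time.monotonic() - self.left_foreground_at
        return away_s >= self.limit_s                   # True -> execute preset reminding processing
```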
In summary, in the embodiment of the present invention, the mobile terminal obtains a video image of the monitored object through the camera, identifies the monitored object from the video image, judges the activity area where the monitored object is located, and executes preset reminding processing according to the activity area. The mobile terminal thus provides the user with a real-time monitoring and reminding function: parents can use the mobile terminal to watch over a child in places such as an amusement park or outdoors, and when the child moves far away from the parents or leaves the monitoring range of the mobile terminal, the mobile terminal issues a warning to remind the parents, so that the personal safety of the child is protected and the parents are freed from having to watch the child continuously.
It should be noted that the foregoing method embodiments are described as a series of acts or combinations for simplicity in explanation, but it should be understood by those skilled in the art that the present invention is not limited by the order of acts or acts described, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
EXAMPLE III
The mobile terminal provided in the embodiments of the present invention can implement the details of the monitoring method in the first to second embodiments, and achieve the same effect.
Referring to fig. 3, a block diagram of a mobile terminal in an embodiment of the present invention is shown. The mobile terminal has a camera and includes a video image acquisition module 301, a monitored object identification module 302, an activity area determination module 303, and a first reminding processing execution module 304:
a video image obtaining module 301, configured to obtain a video image of a monitored object through the camera;
a monitored object identification module 302, configured to identify the monitored object from the video image;
an activity area determination module 303, configured to determine an activity area where the monitored object is located;
a first reminding processing executing module 304, configured to execute a preset reminding process according to the active area.
On the basis of fig. 3, optionally, before the monitoring object identifying module 302, the mobile terminal further includes an appearance feature obtaining module 305, see fig. 4:
an appearance feature obtaining module 305, configured to obtain an appearance feature of the monitored object;
the monitored object identifying module 302 is specifically configured to identify the monitored object from the video image according to the appearance feature.
Optionally, the appearance features include at least one of height, skin tone, hair style, and clothing.
On the basis of fig. 3, optionally, the active area determining module 303 is specifically configured to determine that the monitored object is located in a safe area or an unsafe area according to a distance between the monitored object and the camera, where the unsafe area includes a warning area or a dangerous area.
On the basis of fig. 3, optionally, the first reminder processing execution module 304 includes a video image display sub-module 3041 and a reminder processing execution sub-module 3042, as shown in fig. 4:
a video image display sub-module 3041, configured to display the video image in real time and highlight the monitored object if the monitored object is in the safe area;
a reminding processing execution sub-module 3042, configured to execute at least one reminding process of sounding an alarm and generating a prompt popup if the monitored object is in the insecure area.
On the basis of fig. 3, optionally, the mobile terminal further includes a second reminding processing executing module 306, see fig. 4:
and a second reminding processing execution module 306, configured to send out a sound alarm and display a pre-stored video image if the monitored object is not located in any active area.
On the basis of fig. 3, optionally, the mobile terminal further includes a video image storage module 307, see fig. 4:
and the video image storage module 307 is configured to extract and store a video image with a first set time length before the reminding processing, so that a user searches for the monitored object according to the video image.
On the basis of fig. 3, optionally, the mobile terminal further includes a video image detection module 308, a timing module 309, and a third reminding processing execution module 310, as shown in fig. 4:
a video image detection module 308, configured to detect whether the mobile terminal displays the video image;
a timing module 309, configured to start timing if the mobile terminal does not display the video image;
and a third reminding processing executing module 310, configured to execute a preset reminding process when the counted accumulated time length exceeds the second set time length.
In summary, in the embodiment of the present invention, the mobile terminal obtains a video image of the monitored object through the camera, identifies the monitored object from the video image, judges the activity area where the monitored object is located, and executes preset reminding processing according to the activity area. The mobile terminal thus provides the user with a real-time monitoring and reminding function: parents can use the mobile terminal to watch over a child in places such as an amusement park or outdoors, and when the child moves far away from the parents or leaves the monitoring range of the mobile terminal, the mobile terminal issues a warning to remind the parents, so that the personal safety of the child is protected and the parents are freed from having to watch the child continuously.
Example four
An embodiment of the present invention provides a mobile terminal, including: a processor and a memory;
the memory is used for storing a program for executing the monitoring method described in the first and second embodiments above;
the processor is configured to execute programs stored in the memory.
Fig. 5 is a block diagram of a mobile terminal according to another embodiment of the present invention. The mobile terminal 500 shown in fig. 5 includes: at least one processor 501, memory 502, at least one network interface 504, and a user interface 503. The various components in the mobile terminal 500 are coupled together by a bus system 505. It is understood that the bus system 505 is used to enable connection communications between these components. The bus system 505 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 505 in FIG. 5.
The user interface 503 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, a trackball, a touch pad, or a flexible screen).
It is to be understood that the memory 502 in embodiments of the present invention may be either volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), SyncLink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 502 of the systems and methods described in connection with the embodiments of the invention is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 502 stores elements, executable modules or data structures, or a subset thereof, or an expanded set thereof as follows: an operating system 5021 and application programs 5022.
The operating system 5021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application 5022 includes various applications, such as a media player (MediaPlayer), a Browser (Browser), and the like, for implementing various application services. The program for implementing the method according to the embodiment of the present invention may be included in the application program 5022.
In the embodiment of the present invention, the processor 501 is configured to obtain a video image of a monitored object through the camera by calling a program or an instruction stored in the memory 502, specifically, a program or an instruction stored in the application 5022; identifying the monitored object from the video image; judging an activity area where the monitored object is located; and executing preset reminding processing according to the activity area.
The method disclosed by the above embodiments of the present invention may be applied to the processor 501 or implemented by the processor 501. The processor 501 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the method may be carried out by an integrated hardware logic circuit in the processor 501 or by instructions in the form of software. The processor 501 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and can implement or perform the various methods, steps and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be executed directly by a hardware decoding processor, or by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in the memory 502, and the processor 501 reads the information in the memory 502 and completes the steps of the method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described in this disclosure may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described in this disclosure. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, the processor 501 is further configured to: acquiring the appearance characteristics of the monitored object; the identifying the monitored object from the video image comprises: and identifying the monitored object from the video image according to the appearance characteristic.
Optionally, the appearance features include at least one of height, skin tone, hair style, and clothing.
Optionally, the processor 501 is further configured to: and judging that the monitored object is in a safe region or an unsafe region according to the distance between the monitored object and the camera, wherein the unsafe region comprises a warning region or a dangerous region.
Optionally, the processor 501 is further configured to: if the monitored object is in the safe area, displaying the video image in real time and highlighting the monitored object; if the monitored object is in the unsafe zone, executing at least one reminding process of sounding an alarm and generating a prompt pop-up window; and if the monitored object is not in any activity area, sending out a sound alarm and displaying a pre-stored video image.
Optionally, the processor 501 is further configured to: and extracting and storing the video image with the first set time length before the reminding processing so that a user can search the monitored object according to the video image.
Optionally, the processor 501 is further configured to: detecting whether the mobile terminal displays the video image; if the mobile terminal does not display the video image, timing is started; and when the accumulated time length exceeds the second set time length, executing preset reminding processing.
The mobile terminal 500 can implement the processes implemented by the mobile terminal in the foregoing embodiments; to avoid repetition, the details are not described here again. In the embodiment of the invention, the mobile terminal acquires a video image of the monitored object through the camera, identifies the monitored object from the video image, judges the activity area where the monitored object is located, and executes preset reminding processing according to the activity area. The mobile terminal thus provides the user with a real-time monitoring and reminding function: parents can use the mobile terminal to watch over a child in places such as an amusement park or outdoors, and when the child moves far away from the parents or leaves the monitoring range of the mobile terminal, the mobile terminal issues a warning to remind the parents, so that the personal safety of the child is protected and the parents are freed from having to watch the child continuously.
EXAMPLE five
Fig. 6 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention. Specifically, the mobile terminal in fig. 6 may be a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), or a vehicle-mounted computer.
The mobile terminal in fig. 6 includes a Radio Frequency (RF) circuit 610, a memory 620, an input unit 630, a display unit 640, a processor 660, an audio circuit 670, a Wi-Fi (Wireless Fidelity) module 680, and a power supply 690.
The input unit 630 may be used, among other things, to receive numeric or character information input by a user and to generate signal inputs related to user settings and function control of the mobile terminal. Specifically, in the embodiment of the present invention, the input unit 630 may include a touch panel 631. The touch panel 631 may collect touch operations performed by a user (e.g., operations performed by the user on the touch panel 631 by using any suitable object or accessory such as a finger or a stylus) thereon or nearby, and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 631 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 660, and can receive and execute commands sent by the processor 660. In addition, the touch panel 631 may be implemented using various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 631, the input unit 630 may also include other input devices 632, and the other input devices 632 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
Among them, the display unit 640 may be used to display information input by a user or information provided to the user and various menu interfaces of the mobile terminal. The display unit 640 may include a display panel 641, and optionally, the display panel 641 may be configured in the form of an LCD or an Organic Light-Emitting Diode (OLED).
It should be noted that the touch panel 631 may cover the display panel 641 to form a touch display screen. When the touch display screen detects a touch operation on or near it, the operation is transmitted to the processor 660 to determine the type of the touch event, and the processor 660 then provides a corresponding visual output on the touch display screen according to the type of the touch event. The touch display screen comprises an application program interface display area and a common control display area. The arrangement of these two display areas is not limited, as long as the two areas can be distinguished from each other, for example arranged one above the other or side by side. The application interface display area may be used to display the interface of an application. Each interface may contain at least one interface element such as an icon and/or a widget desktop control of an application. The application interface display area may also be an empty interface that does not contain any content. The common control display area is used for displaying frequently used controls, such as setting buttons, interface numbers, scroll bars, and application icons like the phone book icon.
The processor 660 is a control center of the mobile terminal, connects various parts of the whole mobile phone by using various interfaces and lines, and executes various functions and processes data of the mobile terminal by operating or executing software programs and/or modules stored in the first memory 621 and calling data stored in the second memory 622, thereby performing overall monitoring of the mobile terminal. Optionally, processor 660 may include one or more processing units.
In the embodiment of the present invention, the processor 660 is configured to obtain a video image of the monitored object through the camera by calling a software program and/or a module stored in the first memory 621 and/or data stored in the second memory 622; identifying the monitored object from the video image; judging an activity area where the monitored object is located; and executing preset reminding processing according to the activity area.
Optionally, the processor 660 is further configured to: acquiring the appearance characteristics of the monitored object; the identifying the monitored object from the video image comprises: and identifying the monitored object from the video image according to the appearance characteristic.
Optionally, the appearance features include at least one of height, skin tone, hair style, and clothing.
Optionally, the processor 660 is further configured to: and judging that the monitored object is in a safe region or an unsafe region according to the distance between the monitored object and the camera, wherein the unsafe region comprises a warning region or a dangerous region.
Optionally, the processor 660 is further configured to: if the monitored object is in the safe area, displaying the video image in real time and highlighting the monitored object; if the monitored object is in the unsafe zone, executing at least one reminding process of sounding an alarm and generating a prompt pop-up window; and if the monitored object is not in any activity area, sending out a sound alarm and displaying a pre-stored video image.
Optionally, the processor 660 is further configured to: and extracting and storing the video image with the first set time length before the reminding processing so that a user can search the monitored object according to the video image.
Optionally, the processor 660 is further configured to: detecting whether the mobile terminal displays the video image; if the mobile terminal does not display the video image, timing is started; and when the accumulated time length exceeds the second set time length, executing preset reminding processing.
Therefore, in the embodiment of the invention, the mobile terminal acquires a video image of the monitored object through the camera, identifies the monitored object from the video image, judges the activity area where the monitored object is located, and executes preset reminding processing according to the activity area. The mobile terminal thus provides the user with a real-time monitoring and reminding function: parents can use the mobile terminal to watch over a child in places such as an amusement park or outdoors, and when the child moves far away from the parents or leaves the monitoring range of the mobile terminal, the mobile terminal issues a warning to remind the parents, so that the personal safety of the child is protected and the parents are freed from having to watch the child continuously.
For the embodiment of the mobile terminal, since it is basically similar to the method embodiment, the description is relatively simple, and for relevant points, reference may be made to part of the description of the method embodiment.
An embodiment of the present invention further provides a computer-readable storage medium, where a monitoring program is stored on the computer-readable storage medium, and the monitoring program, when executed by a processor, implements the steps of the monitoring method described above.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As is readily imaginable to the person skilled in the art: any combination of the above embodiments is possible, and thus any combination between the above embodiments is an embodiment of the present invention, but the present disclosure is not necessarily detailed herein for reasons of space.
The monitoring schemes provided herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with the teachings herein. The structure required to construct a system incorporating aspects of the present invention will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the monitoring solution of the present invention may be embodied in a form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A monitoring method is applied to a mobile terminal with a camera, and comprises the following steps:
acquiring a video image of a monitored object through the camera;
identifying the monitored object from the video image;
judging an activity area where the monitored object is located;
executing preset reminding processing according to the activity area;
wherein, the judging the activity area where the monitoring object is located includes:
judging that the monitored object is in a safe region or an unsafe region according to the distance between the monitored object and the camera, wherein the unsafe region comprises a warning region or a dangerous region;
wherein the executing of the preset reminding processing according to the active area comprises:
if the monitored object is in the safe area, displaying the video image in real time and highlighting the monitored object;
if the time length of the monitored object in the non-safe area exceeds the set time length, executing at least one reminding process of sending out a sound alarm and generating a prompt pop-up window;
the method further comprises the following steps:
if the monitored object is not in any activity area, sending out a sound alarm and displaying a pre-stored video image;
wherein the method further comprises:
detecting whether the mobile terminal displays the video image;
if the mobile terminal does not display the video image, timing is started;
when the accumulated time length exceeds the second set time length, executing preset reminding processing;
wherein the detecting whether the mobile terminal displays the video image comprises:
detecting whether the foreground operation of the mobile terminal is a monitoring program;
if the mobile terminal does not display the video image, starting timing, including:
if the foreground of the mobile terminal runs other application programs except the monitoring program, timing is started;
before the determining the activity area where the monitoring object is located, the method further includes:
and dividing the area where the monitoring object is located into a plurality of active areas.
2. The method of claim 1, wherein prior to said identifying the monitored object from the video image, the method further comprises:
acquiring the appearance characteristics of the monitored object;
the identifying the monitored object from the video image comprises:
and identifying the monitored object from the video image according to the appearance characteristic.
3. The method of claim 2, wherein the appearance features include at least one of height, skin tone, hair style, and clothing.
4. The method of claim 1, further comprising:
and extracting and storing the video image with the first set time length before the reminding processing so that a user can search the monitored object according to the video image.
5. A mobile terminal having a camera, the mobile terminal comprising:
a video image acquisition module configured to acquire a video image of a monitored object through the camera;
a monitored object identification module configured to identify the monitored object from the video image;
an activity area judgment module configured to judge the activity area where the monitored object is located;
a first reminding processing execution module configured to execute preset reminding processing according to the activity area;
wherein the activity area judgment module is specifically configured to judge that the monitored object is located in a safe area or a non-safe area according to a distance between the monitored object and the camera, the non-safe area comprising a warning area or a dangerous area;
wherein the first reminding processing execution module comprises:
a video image display sub-module configured to display the video image in real time and highlight the monitored object if the monitored object is in the safe area;
a reminding processing execution sub-module configured to execute at least one of sounding an audible alarm and generating a reminding pop-up window if the time length for which the monitored object stays in the non-safe area exceeds the set time length;
wherein the mobile terminal further comprises:
a second reminding processing execution module configured to sound an audible alarm and display a pre-stored video image if the monitored object is not in any activity area;
wherein the mobile terminal further comprises:
a video image detection module configured to detect whether the mobile terminal displays the video image;
a timing module configured to start timing if the mobile terminal does not display the video image;
a third reminding processing execution module configured to execute the preset reminding processing when the accumulated time length of the timing exceeds the second set time length;
wherein the video image detection module is specifically configured to:
detect whether the program running in the foreground of the mobile terminal is the monitoring program;
the timing module is specifically configured to:
start timing if the foreground of the mobile terminal runs an application program other than the monitoring program;
and the mobile terminal is further configured to:
divide the area where the monitored object is located into a plurality of activity areas.
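The activity area judgment module in claim 5 classifies the monitored object as being in the safe area or the non-safe area from its distance to the camera. A toy Kotlin sketch of that decision is shown below; the 3 m and 6 m thresholds and the enum names are assumptions, and the claim itself does not fix whether nearer or farther from the camera counts as safer. The first reminding processing execution module would then highlight the object while it is in the safe area and escalate to an audible alarm or pop-up once its stay in the warning or dangerous area exceeds the set time length.

// Illustrative classification of the activity area by distance to the camera.
enum class ActivityArea { SAFE, WARNING, DANGEROUS }

fun classifyArea(distanceMeters: Double): ActivityArea = when {
    distanceMeters <= 3.0 -> ActivityArea.SAFE       // assumed safe radius
    distanceMeters <= 6.0 -> ActivityArea.WARNING    // assumed warning ring
    else -> ActivityArea.DANGEROUS                   // assumed dangerous zone beyond the warning ring
}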
6. The mobile terminal of claim 5, wherein the mobile terminal further comprises:
an appearance feature acquisition module configured to acquire appearance features of the monitored object before the monitored object identification module identifies the monitored object;
and the monitored object identification module is specifically configured to identify the monitored object from the video image according to the appearance features.
7. The mobile terminal of claim 6, wherein the appearance features include at least one of height, skin tone, hair style, and clothing.
8. The mobile terminal of claim 5, wherein the mobile terminal further comprises:
a video image storage module configured to extract and store the video image of the first set time length before the reminding processing, so that a user can search for the monitored object according to the video image.
CN201710822912.0A 2017-09-13 2017-09-13 Monitoring method and mobile terminal Active CN107682665B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710822912.0A CN107682665B (en) 2017-09-13 2017-09-13 Monitoring method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710822912.0A CN107682665B (en) 2017-09-13 2017-09-13 Monitoring method and mobile terminal

Publications (2)

Publication Number Publication Date
CN107682665A CN107682665A (en) 2018-02-09
CN107682665B (en) 2021-04-16

Family

ID=61136402

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710822912.0A Active CN107682665B (en) 2017-09-13 2017-09-13 Monitoring method and mobile terminal

Country Status (1)

Country Link
CN (1) CN107682665B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108942955A (en) * 2018-07-04 2018-12-07 广东技术师范学院 Household monitoring robot
CN110097736B (en) * 2019-04-23 2021-11-30 维沃移动通信有限公司 Alarm method and alarm device
CN110443984B (en) * 2019-06-27 2021-01-26 维沃移动通信有限公司 Monitoring method and mobile terminal
CN111653057A (en) * 2020-05-27 2020-09-11 惠州Tcl移动通信有限公司 Behavior supervision method and device, storage medium and mobile terminal
CN115311832B (en) * 2022-08-04 2024-06-07 深圳市双金格科技有限公司 Method and device for intelligently reminding operation, storage medium and desk lamp

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011097598A (en) * 2010-11-22 2011-05-12 Nec Corp Mobile phone with remote monitoring function and remote monitoring system using mobile phone with camera
CN105049458A (en) * 2015-09-08 2015-11-11 北京奇虎科技有限公司 Wearable device remote tracking method, mobile terminal and cloud server
CN105848094A (en) * 2016-05-11 2016-08-10 大连乐图数字科技有限公司 Tracking method, device and system
CN205880946U (en) * 2016-06-03 2017-01-11 浙江大学软件学院(宁波)管理中心(宁波软件教育中心) Access control system of kindergarten based on face identification

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2880458B1 (en) * 2004-12-31 2007-04-20 Cit Alcatel ALERT SYSTEM COMPRISING A MOBILE ALERT TERMINAL AND AN ALERT SERVER
KR101003180B1 (en) * 2008-07-31 2010-12-22 한국산업기술대학교산학협력단 The remote blackbox system and its intelligent monitoring method
CN203206346U (en) * 2013-03-01 2013-09-18 深圳市宝捷讯电子有限公司 A master-slave cell phone system
US20150296183A1 (en) * 2014-04-14 2015-10-15 Hee Moon Cho Outside surveillance apparatus in which interphone and tv are interlinked and control method using the same
FR3023699B1 (en) * 2014-07-21 2016-09-02 Withings METHOD AND DEVICE FOR MONITORING A BABY AND INTERACTING
CN106022209B * 2016-04-29 2019-09-17 杭州华橙网络科技有限公司 Method and device for range estimation and processing based on face detection
CN106803928B (en) * 2017-01-22 2021-02-23 宇龙计算机通信科技(深圳)有限公司 Reminding method, reminding device and terminal

Also Published As

Publication number Publication date
CN107682665A (en) 2018-02-09

Similar Documents

Publication Publication Date Title
CN107682665B (en) Monitoring method and mobile terminal
US11520376B2 (en) Wearable electronic device and display method of wearable electronic device according to sensor data
CN106095295B (en) Processing method based on fingerprint identification and mobile terminal
US9691256B2 (en) Method and device for presenting prompt information that recommends removing contents from garbage container
EP3121701A1 (en) Method and apparatus for single-hand operation on full screen
US10942580B2 (en) Input circuitry, terminal, and touch response method and device
CN106406710B (en) Screen recording method and mobile terminal
CN105468695B (en) Information display method and device
CN107181913B Photographing method and mobile terminal
US9940521B2 (en) Visibility enhancement devices, systems, and methods
CN107613203B (en) Image processing method and mobile terminal
US10095377B2 (en) Method and device for displaying icon badge
US20140049487A1 (en) Interactive user interface for clothing displays
EP3016048B1 (en) Method and device for displaying a reminder based on geographic criteria
CN106681592B (en) Display switching method and device based on electronic equipment and electronic equipment
CN108366169B (en) Notification message processing method and mobile terminal
CN107665434B (en) Payment method and mobile terminal
CN103135930A (en) Touch screen control method and device
CN107506130B (en) Character deleting method and mobile terminal
CN105072258B (en) Scene modes of mobile terminal switching method, device and mobile terminal
CN107241491B (en) Message prompting method, mobile terminal and storage medium
JP2015090569A (en) Information processing device and information processing method
CN107454255B (en) Lyric display method, mobile terminal and computer readable storage medium
US20230152956A1 (en) Wallpaper display control method and apparatus and electronic device
KR20160062746A (en) Method and apparatus for prompting device connection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant