CN113748668A - Image processing method, mobile terminal and electronic equipment

Info

Publication number: CN113748668A (application); CN113748668B (granted)
Application number: CN202080030104.0A
Authority: CN (China)
Applicant / Assignee: SZ DJI Technology Co Ltd
Inventors: 翁松伟, 郝贵伟
Other languages: Chinese (zh)
Legal status: Granted; Active
Prior art keywords: image frame, controlled device, mobile terminal, target image

Classifications

    • H04N23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N5/76: Television signal recording
    • H04N7/183: Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • H04N7/185: CCTV systems for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • G01S19/46: Determining position by combining measurements of signals from a satellite radio beacon positioning system with a supplementary radio-wave measurement

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Embodiments of the present application disclose an image processing method applied to a control end, comprising the following steps: receiving image frames sent by a controlled device, and storing target image frames among the image frames to obtain an image frame set, wherein the image frames are live view images captured by the controlled device; and, after the connection with the controlled device is broken, reading from the image frame set the target image frames from the moments adjacent to the disconnection, for playback display. In the method provided by the embodiments of the present application, the control end stores image frames of the live view images captured by the controlled device, so that after the connection with the controlled device is broken, the stored live view frames can be read and displayed; the user can then search for the controlled device based on the live view recorded before disconnection, which improves search efficiency.

Description

Image processing method, mobile terminal and electronic equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, a mobile terminal, an electronic device, and a computer-readable storage medium.
Background
A LiveView image is a live view image. During flight, an unmanned aerial vehicle (drone) can send the LiveView images it captures to a wirelessly connected control end, so that the user can receive and view them there. However, the wireless connection between the control end and the drone will inevitably be broken for various reasons. Once it is broken, if the drone is not within the user's field of view, the user cannot learn the drone's position and will have difficulty finding it.
Disclosure of Invention
In view of the above, an object of the present application is to solve the technical problem that a user cannot find a controlled device after the controlled device is disconnected from the control end.
A first aspect of the embodiments of the present application provides an image processing method applied to a control end, the method comprising:
receiving image frames sent by a controlled device, and storing target image frames among the image frames to obtain an image frame set, wherein the image frames are live view images captured by the controlled device;
and after the connection with the controlled device is broken, reading from the image frame set the target image frames from the moments adjacent to the disconnection, for playback display.
A second aspect of the embodiments of the present application provides an image processing method applied to a controlled device, comprising:
acquiring image frames of a captured live view image, wherein the image frames include pictures obtained by the controlled device shooting around its perimeter when a specified event occurs;
and sending the image frames to a control end, so that the control end stores target image frames among the image frames to obtain an image frame set and, after its connection with the controlled device is broken, reads from the image frame set the target image frames from the moments adjacent to the disconnection for playback display.
A third aspect of the embodiments of the present application provides a mobile terminal, where the mobile terminal is connected to a controlled device;
the mobile terminal includes: a processor and a memory storing a computer program;
the processor, when executing the computer program, implements the steps of:
receiving image frames sent by the controlled device, and storing target image frames among the image frames to obtain an image frame set, wherein the image frames are live view images captured by the controlled device;
and after the connection with the controlled device is broken, reading from the image frame set the target image frames from the moments adjacent to the disconnection, for playback display.
A fourth aspect of the embodiments of the present application provides an electronic device, controlled by a control end connected to it;
the electronic device comprises a camera, a processor and a memory storing a computer program;
the processor, when executing the computer program, implements the steps of:
acquiring image frames of a live view image captured by the camera, wherein the image frames are obtained by the electronic device controlling the camera to shoot around its perimeter when a specified event occurs;
and sending the image frames to the control end, so that the control end stores target image frames among the image frames to obtain an image frame set and, after its connection with the electronic device is broken, reads from the image frame set the target image frames from the moments adjacent to the disconnection for playback display.
A fifth aspect of an embodiment of the present application provides a control system, including: the control end and the controlled equipment;
the controlled device is configured to send image frames of the captured live view image to the control end;
the control end is configured to receive the image frames and store target image frames among them to obtain an image frame set, and, after its connection with the controlled device is broken, to read from the image frame set the target image frames from the moments adjacent to the disconnection for playback display.
A sixth aspect of embodiments of the present application provides a computer-readable storage medium having a computer program stored thereon; the computer program, when executed by a processor, implements any of the image processing methods provided by the first aspect above.
A seventh aspect of embodiments of the present application provides a computer-readable storage medium storing a computer program; the computer program, when executed by a processor, implements any of the image processing methods provided by the second aspect described above.
In the image processing method provided by the embodiments of the present application, the control end stores image frames of the live view images captured by the controlled device, so that after the connection with the controlled device is broken, the stored live view frames can be read and displayed; the user can then search for the controlled device based on the live view recorded before disconnection, which improves search efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a scene schematic diagram of controlling an unmanned aerial vehicle according to an embodiment of the present application.
Fig. 2 is a flowchart of an image processing method according to an embodiment of the present application.
Fig. 3 is a schematic diagram of a drone surround-shooting scene according to an embodiment of the present application.
Fig. 4 is a flowchart of another image processing method provided in an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of a control system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them.
Remote control equipment includes wireless and wired remote control equipment. Wireless remote control equipment comprises a control end and a controlled end; the controlled end may also be called a controlled device. A wireless connection channel can be established between the control end and the controlled device: by operating the control end, the user causes it to issue corresponding instructions, the instructions are transmitted to the controlled device over the wireless channel, and the controlled device executes corresponding actions according to the received instructions, thereby implementing remote control.
Remote control devices come in many types, such as remote-controlled cars, boats, aircraft, and robots, and also include household appliances equipped with a remote control, such as air conditioners, televisions, and fans.
A drone is also a kind of remote control equipment. As a complete product it usually comprises a remote controller and the drone itself, where the remote controller corresponds to the control end and the drone corresponds to the controlled device. Note that in practice the drone's control end need not be a remote controller. In one case the control end may be a mobile terminal, i.e. the mobile terminal connects directly to the drone and controls it directly. In another case the control end may be a combination of a remote controller and other devices. For example, referring to fig. 1, a scene schematic diagram of controlling a drone according to an embodiment of the present application, the remote controller may be connected to a mobile terminal; the remote controller and the mobile terminal together can then be regarded as the drone's control end.
The mobile terminal may be any of various electronic devices with a processing chip, such as a mobile phone, a tablet, a notebook computer, or smart glasses, and its connection to the remote controller may likewise take various forms, such as a wireless connection (e.g. Bluetooth or WiFi) or a physical data cable.
In a scenario where the control end comprises a mobile terminal and a remote controller connected to each other, it may be the remote controller that establishes the connection with the drone. When the user wishes to interact with the drone, several interaction modes are available. In one mode, the user operates on the mobile terminal; a signal generated by the mobile terminal based on the operation is transmitted to the remote controller, which forwards it to the wirelessly connected drone. In another mode, the user operates directly on the remote controller, and the signal generated by the remote controller is transmitted directly to the drone.
During flight, the user can receive the pictures captured by the drone at the control end in real time. Specifically, the drone carries a camera, and the images the camera captures in real time form a LiveView image (the captured images may be compressed to form the LiveView image or, in one embodiment, left unprocessed). The LiveView image is a live view image whose picture is what the camera is capturing in real time. After the LiveView image is obtained, the drone sends it to the control end through a wireless transmission module, so that the user sees the drone's picture at the control end in real time.
Transmission of the LiveView image relies on the wireless connection between the control end and the drone, but that connection will inevitably be interrupted for various reasons: the drone may fly into an area without signal, collide with an obstacle and crash, run too low on power to maintain the connection, and so on. Once the wireless connection is broken, the user can no longer learn the drone's position from the LiveView image; if the drone has flown out of the user's field of view, it will be hard to find.
To solve the above problem, an embodiment of the present application provides an image processing method. Referring to fig. 2, fig. 2 is a flowchart of the image processing method provided by the embodiment. The method can be applied to the control end; for the control end, see the related description above, which is not repeated here. The method comprises the following steps:
s201, receiving image frames sent by controlled equipment, and storing target image frames in the image frames to obtain an image frame set.
The controlled device may be any electronic device with motion and shooting functions, such as a drone or an unmanned ship. The following description uses a drone as an example, but the drone is only an example and should not be construed as limiting the controlled device.
The image frames received from the controlled device are the live view images, i.e. LiveView images, captured by the controlled device. The LiveView image has been described above and is not detailed again here.
S202, after the connection with the controlled device is broken, reading from the image frame set the target image frames from the moments adjacent to the disconnection, for playback display.
In the video stream corresponding to the LiveView image, the pictures captured by the camera are organized in units of frames, so what the control end receives from the controlled device are image frames.
The original purpose of the LiveView image is live framing, i.e. letting the user know what picture would be recorded or stored if the shooting button were pressed, so in the original design the LiveView image is not stored. However, the applicant found that after the drone is disconnected from the control end by an event such as a collision or low power, the user has no reference clue when searching for it and can only rely on memory of the moments before disconnection. In that case search efficiency is very low, and a beginner may even panic.
Based on this finding, the applicant proposes the following scheme: while the connection between the control end and the controlled device is intact, the received image frames of the live view image sent by the controlled device are stored, and the stored frames form an image frame set, which may be understood as a buffered segment of video. After the connection is broken for whatever reason, the control end can read the frames in the stored image frame set for playback display, i.e. replay the LiveView image from before the disconnection, to help the user find the landing position or current position of the controlled device, improving search efficiency and easing the tension of novice users.
If the moment at which the control end is disconnected from the controlled device is called the first moment, the image frames read from the image frame set may be the frames from moments within a certain time range of the first moment; that is, the "adjacent moments" should be understood as moments within a certain range of the first moment, not limited to one or two adjacent frames. When that range is small, the frames read are those closest in time to the first moment, for example the 5 minutes or the 30 seconds before it. The pictures shown in playback are then close to the present, which makes them a better reference for finding the controlled device. Of course, if what the control end caches is a short segment of video (e.g. 30 s) that is continually updated with the real-time frames returned by the controlled device, so that the cache always holds the latest video from the drone, then after disconnection the cached segment can simply be played back directly, and the replayed video helps the user locate the drone once communication between the control end and the controlled device is interrupted.
The adjacent moments mentioned in the present application may be a single moment, several adjacent and consecutive moments, or several spaced moments.
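By way of illustration only, the selection of frames from "adjacent moments" could be sketched as follows. This is a minimal Python sketch; the `FrameRecord` type, the `select_playback_frames` name, and the 30-second default window are assumptions for illustration, not part of the claimed method.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FrameRecord:
    timestamp: float  # seconds, recorded when the frame was received
    data: bytes       # encoded image frame

def select_playback_frames(frame_set: List[FrameRecord],
                           disconnect_time: float,
                           window_s: float = 30.0) -> List[FrameRecord]:
    """Return the cached frames whose timestamps fall within window_s
    seconds before the first moment (the disconnection time), i.e. the
    frames from the 'adjacent moments' described above."""
    return [f for f in frame_set
            if disconnect_time - window_s <= f.timestamp <= disconnect_time]
```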
Since storing the LiveView image occupies part of the control end's limited storage space, the stored image frame set can be constrained by a specified condition. The constraint can take various forms. For example, in one embodiment the specified condition may be a first condition limiting the amount of data corresponding to the image frame set. A data-amount threshold, such as 50 MB, may be set, and the set's data amount limited to at most that threshold, or to within a certain range around it.
As another example, in one embodiment the specified condition may be a second condition limiting the number of frames contained in the image frame set. In a video stream the duration of a frame is usually fixed, so limiting the number of frames amounts to limiting the playing duration of the video formed by the set. In a specific implementation, a frame-count threshold may be set, for example 7200 frames (at a frame rate of 24 FPS, a playing duration of 300 s), and the number of frames in the set limited to at most that threshold, or to within a certain range around it.
If storing the currently received frame would make the image frame set violate the specified condition (e.g. the data amount would exceed the data-amount threshold, or the frame count would exceed the frame-count threshold), in one embodiment the image frame set may be updated with the frames received in real time so that its storage footprint stays appropriate. Specifically, a frame received in real time may replace the frame in the set with the earliest time; in other words, the oldest stored frame is discarded and the newly received frame is stored, so that the frames shown in playback correspond to the latest pictures before disconnection.
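A minimal sketch of such a bounded cache is given below, combining the first condition (data amount), the second condition (frame count), and the replace-the-earliest-frame update. The class name and the use of a deque are illustrative assumptions; the thresholds follow the example values in the text (50 MB, 7200 frames).

```python
from collections import deque

class FrameCache:
    """Bounded cache of target image frames (a sketch, not the claimed
    implementation): storing a new frame evicts the earliest frames
    until both specified conditions hold again."""

    def __init__(self, max_frames: int = 7200, max_bytes: int = 50 * 1024 * 1024):
        self.frames = deque()   # (timestamp, encoded_frame) pairs, oldest first
        self.total_bytes = 0
        self.max_frames = max_frames
        self.max_bytes = max_bytes

    def store(self, timestamp: float, encoded_frame: bytes) -> None:
        self.frames.append((timestamp, encoded_frame))
        self.total_bytes += len(encoded_frame)
        # Replace the earliest frames: discard old frames until the
        # frame-count and data-amount conditions are both satisfied.
        while len(self.frames) > self.max_frames or self.total_bytes > self.max_bytes:
            _, oldest = self.frames.popleft()
            self.total_bytes -= len(oldest)
```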
For the user searching for the drone, the larger the time span of the LiveView images that can be played back at the control end, the better the user can determine the drone's position. For example, suppose the LiveView images the user can play back under scheme A cover the last 1 minute before disconnection, while those under scheme B cover the last 5 minutes. Scheme B clearly provides more flight information, so a user under scheme B can find the drone more easily than one under scheme A.
However, the image frame set must be limited in storage space, and the larger the time span of the LiveView images, the more storage the set requires. The embodiment of the present application therefore provides an implementation in which frames are extracted from the image frames received from the controlled device; that is, the received frames need not all be stored, and only target image frames extracted from them are stored. The same time span then needs less storage, and the same storage can hold a longer time span. Moreover, for helping a user find the drone, the time span of the playback video matters more than its fluency.
In the frame-extraction process, frames must be selected according to their properties (e.g. I-frames, P-frames). Since this part is prior art, how to extract frames is not described in detail here.
It should be noted that in the above embodiment the target image frames are extracted from the received frames, but in some other embodiments all received frames may be target image frames and may all be stored.
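As a rough illustration of such frame extraction, one simple policy is to keep only frames that can be decoded independently of their neighbours (I-frames). The sketch below assumes a decoded-stream iterator whose frame objects expose an `is_keyframe` attribute; that attribute is an assumption for illustration, not a specific library API.

```python
def extract_target_frames(stream):
    """Keep only independently decodable frames (I-frames), so that the
    same storage budget covers a longer time span at the cost of
    playback fluency."""
    for frame in stream:
        if frame.is_keyframe:  # assumed attribute of the decoder's frame object
            yield frame
```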
After the connection between the control end and the drone is broken, playback of the LiveView image can be offered to the user, and many interaction designs are possible. For example, a playback button can pop up for the user to click after disconnection; when the user inputs a playback instruction by clicking the button, reading and playing of the LiveView image is triggered. As another example, the playback function can be placed on a function page for finding the aircraft, and reading and playing is triggered when the user inputs a playback instruction on that page.
To let the user find the drone (controlled device) even faster, in one embodiment the drone's state information can also be transmitted, alongside the LiveView image frames, while the connection between the control end and the drone is intact. After receiving the image frames and the state information sent by the drone, the control end can establish an association between the image frames and the temporally corresponding state information, and when the user plays back the LiveView image, the state information can be displayed together with its associated frames to assist the search.
There are various ways to associate temporally corresponding image frames with state information. For example, temporally corresponding image frames and state information may be stored together in association; or the associations between frames and state information may be recorded in a configuration file, so that when the live view image is played back, the corresponding frames and state information can be read according to the recorded associations and displayed together.
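One possible shape for that association is sketched below, under the assumption that frames and state records are matched by timestamp; the `DroneState` fields and the 0.5 s matching tolerance are illustrative, and the index is only a stand-in for the configuration file mentioned above.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class DroneState:
    height_m: float
    position: Tuple[float, float]  # (latitude, longitude)
    heading_deg: float
    battery_pct: int

class StateIndex:
    """Records the association between state information and the
    temporally corresponding image frames, keyed by timestamp."""

    def __init__(self) -> None:
        self._by_time: Dict[float, DroneState] = {}

    def record(self, timestamp: float, state: DroneState) -> None:
        self._by_time[timestamp] = state

    def lookup(self, frame_time: float, tolerance_s: float = 0.5) -> Optional[DroneState]:
        # Return the state record closest in time to the frame,
        # if any lies within the tolerance.
        if not self._by_time:
            return None
        nearest = min(self._by_time, key=lambda t: abs(t - frame_time))
        return self._by_time[nearest] if abs(nearest - frame_time) <= tolerance_s else None
```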
The image frames and the state information can be displayed together in various ways. For example, in one embodiment the state information may be overlaid on the display interface showing the image frames. In a specific implementation a display switch may also be provided: when the switch is on (a display instruction is input), the state information is overlaid on the frame's display interface, and when it is off, the state information is hidden. In another embodiment the state information may have its own display interface; when the user needs to consult it while playing back the live view image, a switching instruction switches the current frame's display interface to the state information's display interface.
The drone's state information may include various information such as height, position, direction of motion, attitude, speed, battery level, and distance to surrounding objects. This information may be produced by the various sensors and other hardware modules on the drone: for example, the position may come from the drone's GPS positioning module, the attitude from its inertial measurement unit (IMU), and the height and distance from a perception module.
The perception module may include a TOF (time of flight) camera, which captures depth images from which the distance to a photographed object can be calculated. The photographed object may be anything in the surroundings, such as a building, in which case the calculated distance is the distance to that building; or it may be the ground, in which case the calculated distance is a height. In one embodiment, a perception data map obtained by processing the depth images captured by the perception module can be transmitted to the control end, so that the user can consult it when playing back the live view image.
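As a sketch of that distance calculation, assuming the depth image arrives as a NumPy array and some detector supplies a boolean mask of the object's pixels (both assumptions for illustration):

```python
import numpy as np

def object_distance_m(depth_image: np.ndarray, object_mask: np.ndarray) -> float:
    """Estimate the distance to a photographed object from a TOF depth
    image by taking a robust average of the depth values inside the
    object's region."""
    values = depth_image[object_mask]
    if values.size == 0:
        return float("nan")          # object not visible in this frame
    return float(np.median(values))  # median resists noisy depth pixels
```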
Different state information can be displayed in different designated areas. For example, position, attitude, speed, and battery level can be overlaid at a corner of the display interface, such as the lower right or upper right, while the direction of motion can be shown at the top. The present application does not limit where the state information is displayed.
Moreover, the display form of each kind of state information can be defined as needed: the position may be shown as a GPS map, the direction of motion as a virtual arrow, and the distance information fused with the image frames. Distance information is the distance to a surrounding object, for example 100 m to building A and 150 m to building B. When displayed, it can be placed next to the corresponding object: "100 m" next to building A in the frame's display interface, and "150 m" next to building B.
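For instance, the fused display of distance labels could be sketched with OpenCV as below; `anchor_xy`, the pixel position of the object in the frame, is assumed to come from elsewhere (e.g. the perception data), and the font and colour choices are arbitrary.

```python
import cv2  # OpenCV, assumed available on the control end

def overlay_distance(frame_bgr, label: str, anchor_xy: tuple):
    """Draw a distance label such as '100m' next to the on-screen object
    it refers to, fusing the distance information with the image frame."""
    cv2.putText(frame_bgr, label, anchor_xy,
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame_bgr
```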
Since the drone's flight environment is complicated and changeable, the connection between the drone and the control end will inevitably drop, but it often recovers within a short time (e.g. 1 second), so playing back the LiveView image is not appropriate for every disconnection. In one embodiment, a third condition limiting the disconnection duration can be set, based on a duration threshold; for example, the disconnection duration must be greater than or equal to the threshold. After the control end is disconnected from the controlled device, it can then further judge whether the disconnection duration satisfies the third condition, and only if it does are the LiveView image frames read for display.
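A sketch of that third-condition check follows; the 3-second threshold is an illustrative value, not one given in the text.

```python
import time

DISCONNECT_THRESHOLD_S = 3.0  # illustrative duration threshold

def should_offer_playback(disconnect_time: float) -> bool:
    """Only treat the link as genuinely lost, and read the LiveView
    frames for display, once the disconnection has lasted at least the
    duration threshold; brief dropouts then never trigger playback."""
    return time.time() - disconnect_time >= DISCONNECT_THRESHOLD_S
```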
In one embodiment, the storing of LiveView image frames described above may itself happen only under certain conditions; the control end need not buffer a segment of LiveView at all times. For example, the control end may store the received LiveView frames while the controlled device performs an automatic return flight because its power is low (a low-power return flight). As another example, the control end may begin storing LiveView frames only when an abnormality or another event occurs at the controlled device.
Although there are many reasons why the drone may be disconnected from the control end, some disconnections are foreseeable. For those, in one implementation the drone can shoot around its perimeter before the connection drops, so as to record its position more comprehensively; after the disconnection, the user can play back the surround-shot LiveView images at the control end and find the drone faster.
These foreseeable-disconnection situations may be called specified events; that is, the drone performs the above surround shooting of its perimeter when a specified event occurs. A specified event may be any event from which disconnection can be foreseen. For example, it may be the controlled device entering a low-power state, specifically the drone's battery level being less than or equal to a battery threshold. When this low-power event occurs, the drone can foresee that its connection to the control end is about to drop, shoot around its perimeter before the connection is broken, and inform the control end of its position in the form of LiveView images.
As another example, the specified event may be a collision of the controlled device with an obstacle. After colliding with an obstacle, the drone can in many cases foresee disconnection. For example, in one case the drone is forced to descend (fall) because of the collision, and it can be expected that after a secondary impact with the ground the drone will have difficulty maintaining its connection to the control end, so it can shoot around its perimeter before the connection is broken.
As another example, the specified event may be instability of the connection between the controlled device and the control end. For example, during flight the connection may become more and more unstable, with transient disconnections occurring more and more frequently. When the disconnection frequency exceeds a frequency threshold, the drone can shoot around its perimeter, i.e. film the environment of its position in a circle, so that it is not hard to find after the connection to the control end is completely lost.
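A sketch of such an instability trigger, counting transient drops in a sliding window, is given below; the 60-second window and the threshold of 5 drops are illustrative assumptions.

```python
from collections import deque
import time

class LinkStabilityMonitor:
    """Fires the surround-shooting trigger when transient disconnections
    become too frequent, before the link is completely lost."""

    def __init__(self, window_s: float = 60.0, max_drops: int = 5):
        self.window_s = window_s
        self.max_drops = max_drops
        self.drops = deque()  # timestamps of recent transient drops

    def on_transient_disconnect(self) -> bool:
        now = time.time()
        self.drops.append(now)
        while self.drops and now - self.drops[0] > self.window_s:
            self.drops.popleft()
        # True => disconnection frequency exceeds the frequency threshold.
        return len(self.drops) > self.max_drops
```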
It should be noted that the drone can implement surround shooting in various ways. For example, it can carry a gimbal and steer the camera through the gimbal; or it can rotate its body by adjusting its attitude, thereby sweeping the camera around.
Since the drone is usually at altitude, in order to capture pictures that reflect its location (otherwise it might film a circle of sky), it can shoot around its current position at an oblique downward viewing angle. Further, since the drone usually descends after the specified event occurs, in one embodiment the oblique downward angle may gradually steepen as the drone's height decreases.
The surround shooting can be described in terms of attitude angles. As the drone's height decreases during surround shooting, in one embodiment the drone adjusts the camera's pitch angle through the gimbal, changing the shooting angle in the pitch direction; in another embodiment it can instead adjust the pitch angle of the fuselage directly, which carries the camera's shooting angle with it. Meanwhile, while the pitch axis is adjusted through the gimbal, the drone can also adjust the camera's yaw angle through the gimbal, so that the camera rotates in the horizontal plane and films the surrounding environment all around; in another embodiment the drone can instead rotate the camera's view horizontally by adjusting the yaw angle of the body; and in yet another embodiment the drone can turn left or right by adjusting the gimbal's pitch angle and the body's roll angle, also tilting the camera downward to record the surroundings. Referring to fig. 3, fig. 3 is a schematic view of a drone shooting around its perimeter according to an embodiment of the present application.
It should be noted that surround shooting is not limited to a single circle: in some scenes the drone may shoot half a circle (180 degrees), three quarters of a circle (270 degrees), two circles, and so on, without limitation.
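The attitude schedule for such a surround shot could be sketched as follows: yaw setpoints sweep the chosen arc while the downward pitch steepens as the height decreases. The altitude-to-pitch mapping, the angle limits, and the step count are all illustrative assumptions, not values from the text.

```python
def surround_shot_angles(height_m: float, arc_deg: float = 360.0, steps: int = 36,
                         min_h: float = 2.0, max_h: float = 120.0):
    """Yield (yaw_deg, pitch_deg) setpoints for one surround shot. The
    oblique downward pitch grows from -20 deg at max_h to -80 deg at
    min_h, matching the 'steeper as the drone descends' behaviour."""
    h = min(max(height_m, min_h), max_h)
    frac = (max_h - h) / (max_h - min_h)  # 0 at high altitude, 1 near the ground
    pitch = -20.0 - 60.0 * frac
    for i in range(steps):
        yield (i * arc_deg / steps, pitch)
```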
As described above for the control end, in one embodiment the control end may comprise a mobile terminal and a remote controller, with the mobile terminal connected to the remote controller and the remote controller wirelessly connected to the controlled device. The image frames of the LiveView image sent by the controlled device can then be received by the remote controller, forwarded by the remote controller to the mobile terminal, and stored by the mobile terminal.
As described above, the mobile terminal may be smart glasses. Smart glasses can connect to the drone directly, or through a remote controller; they can receive the image frames of the live view image captured by the drone and cache them in their memory. After an accident such as a crash disconnects them from the drone, the smart glasses can read the stored live view frames and play them back for the user, so that the user can find the drone quickly.

The above describes the image processing method provided by the embodiments of the present application. With this method, the control end stores the frames of the live view image, so that after the connection with the controlled device is broken the stored frames can be read and displayed, letting the user search for the drone based on the live view recorded before disconnection and improving search efficiency.
Referring to fig. 4, fig. 4 is a flowchart of another image processing method according to an embodiment of the present application. The method is applied to the controlled device; for the controlled device, see the related description above. The method comprises the following steps:
s401, acquiring an image frame of a shot live-view image, wherein the image frame comprises a picture obtained by surrounding shooting the periphery of the controlled device when a specified event occurs.
S402, sending the image frames to a control end, so that the control end stores target image frames among the image frames to obtain an image frame set and, after its connection with the controlled device is broken, reads from the image frame set the target image frames from the moments adjacent to the disconnection for playback display.
It should be noted that the specified event is not necessarily an abnormal event of the controlled device; it may also be a normal event, such as normal operation of the controlled device.
Optionally, the specified event includes the battery level of the controlled device being less than or equal to a battery threshold.
Optionally, the specified event includes a collision of the controlled device with an obstacle.
Optionally, the specified event includes that the connection between the controlled device and the control terminal is unstable.
Optionally, the controlled device comprises a drone.
Optionally, the surround shooting of the perimeter includes:
shooting around the current position at an oblique downward viewing angle.
Optionally, the oblique downward viewing angle increases as the height of the controlled device decreases.
Optionally, the method further includes:
and acquiring current state information, and sending the state information to the control end so as to facilitate the control end to establish an association relationship between the state information and the target image frame at the corresponding time, and to display the state information and the associated target image frame in a matching manner.
Optionally, the state information includes one or more of the following: height, position, direction of motion, attitude, speed, battery level, and distance to surrounding objects.
Optionally, the location is determined by GPS positioning.
Optionally, the distance information is calculated according to a depth image obtained by shooting a peripheral object.
Optionally, the control end includes a mobile terminal and/or a remote controller.
Optionally, the controlled device is wirelessly connected with the remote controller, and the mobile terminal is connected with the remote controller;
the image frames are sent to the remote controller by the controlled equipment, so that the remote controller can transmit the image frames to the mobile terminal for storage.
Optionally, the image frame set is used to assist the user in finding the landing position of the controlled device.

In the image processing method provided by this embodiment, the controlled device sends the frames of the live view image to the control end, so that the control end can store them and, after the connection with the controlled device is broken, read and display the stored frames to help the user search for the controlled device. In addition, the controlled device can shoot around its perimeter when a specified event occurs, so that the live view images stored at the control end indicate the controlled device's position and environment, further improving the efficiency with which the user finds it.
The above describes another image processing method provided by the embodiments of the present application. Referring to fig. 5, fig. 5 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application. The mobile terminal can be connected with a controlled device.
The mobile terminal includes: a processor 510 and a memory 520 storing computer programs;
the processor, when executing the computer program, implements the steps of:
receiving image frames sent by the controlled device, and storing target image frames among the image frames to obtain an image frame set, wherein the image frames are live view images captured by the controlled device;
and after the connection with the controlled device is broken, reading from the image frame set the target image frames from the moments adjacent to the disconnection, for playback display.
Optionally, the processor is further configured to update the image frame set according to the image frames received in real time when the image frame set does not meet a specified condition.
Optionally, when updating the image frame set according to the image frames received in real time, the processor is specifically configured to replace the earliest target image frame in the image frame set with the image frame received in real time.
Optionally, the specified condition includes a first condition for limiting a data amount corresponding to the image frame set.
Optionally, the first condition includes that the data amount corresponding to the image frame set is less than or equal to a data amount threshold.
Optionally, the specified condition includes a second condition for limiting a number of frames included in the image frame set.
Optionally, the second condition includes that a number of frames included in the image frame set is less than or equal to a frame number threshold.
Optionally, the processor is further configured to receive state information sent by the controlled device and establish the association between the state information and the temporally corresponding target image frames; the state information is displayed together with its associated target image frames.
Optionally, when establishing the association between the state information and the temporally corresponding target image frames, the processor is specifically configured to store them together in association.
Optionally, when displaying the state information together with the associated target image frame, the processor is specifically configured to display the state information in a designated area of the target image frame's display interface.
Optionally, the display of the state information is triggered by a display instruction input by the user.
Optionally, when displaying the state information together with the associated target image frame, the processor is specifically configured to switch the target image frame's display interface to the state information's display interface according to a switching instruction input by the user.
Optionally, the state information includes one or more of the following: height, position, direction of motion, attitude, speed, battery level, and distance to surrounding objects.
Optionally, the position is used for displaying on a display interface of the target image frame in the form of a GPS map.
Optionally, the moving direction is used for displaying on a display interface of the target image frame in the form of a virtual arrow.
Optionally, the distance information is used for displaying on an object corresponding to the distance information in the display interface of the target image frame.
Optionally, the image frame is obtained by the controlled device performing surround shooting on the periphery when a specified event occurs.
Optionally, the specified event includes the battery level of the controlled device being less than or equal to a battery threshold.
Optionally, the specified event includes a collision of the controlled device with an obstacle.
Optionally, the specified event includes that the connection between the controlled device and the remote controller is unstable.
Optionally, the controlled device comprises a drone.
Optionally, the surround shooting of the perimeter includes:
shooting around the current position at an oblique downward viewing angle.
Optionally, the oblique downward viewing angle increases as the height of the controlled device decreases.
Optionally, the target image frame is extracted from the image frames.
Optionally, when executing the storing of target image frames among the image frames, the processor is specifically configured to store the target image frames while the controlled device performs a low-power return flight.
Optionally, the processor is further configured to determine that a duration of disconnection from the controlled device satisfies a third condition before reading the target image frame for display, where the third condition is set based on a duration threshold.
Optionally, the third condition includes that the disconnection duration is greater than or equal to the duration threshold.
Optionally, reading the target image frame for display is triggered by a playback instruction input by a user.
Optionally, the mobile terminal is connected to the controlled device through a remote controller.
Optionally, the image frame set is used to assist the user in finding the landing position of the controlled device.
The mobile terminal provided by this embodiment can store the image frames of the live view image captured by the controlled device, so that after the connection with the controlled device is broken, the stored live view frames can be read and displayed; the user can then search for the controlled device based on the live view recorded before disconnection, which improves search efficiency.
The above describes the mobile terminal provided by the embodiments of the present application. Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device is controlled by a control end connected to it;
the electronic device comprises a camera 610, a processor 620 and a memory 630 storing computer programs;
the processor, when executing the computer program, implements the steps of:
acquiring image frames of a live view image captured by the camera, wherein the image frames are obtained by the electronic device controlling the camera to shoot around its perimeter when a specified event occurs;
and sending the image frames to the control end, so that the control end stores target image frames among the image frames to obtain an image frame set and, after its connection with the electronic device is broken, reads from the image frame set the target image frames from the moments adjacent to the disconnection for playback display.
Optionally, the specified event includes the battery level of the electronic device being less than or equal to a battery threshold.
Optionally, the specified event includes a collision of the electronic device with an obstacle.
Optionally, the specified event includes that the connection between the electronic device and the remote controller is unstable.
Optionally, the electronic device includes a drone.
Optionally, when executing the control of the camera to shoot around the perimeter, the processor is specifically configured to control the camera to shoot around the current position at an oblique downward viewing angle.
Optionally, the oblique downward viewing angle increases as the height of the electronic device decreases.
Optionally, the processor is further configured to acquire current state information and send it to the remote controller, so that after the mobile terminal receives the state information from the remote controller, it establishes the association between the state information and the temporally corresponding target image frames and displays the state information together with its associated target image frames.
Optionally, the state information includes one or more of the following: height, position, direction of motion, attitude, speed, battery level, and distance to surrounding objects.
Optionally, the location is determined by GPS positioning.
Optionally, the electronic device further includes: a TOF camera;
the distance information is obtained through calculation according to a depth image, and the depth image is obtained through shooting of peripheral objects through the TOF camera.
Optionally, the electronic device further includes: a gimbal;
the image frames are obtained by the electronic device controlling the camera, through the gimbal, to shoot around the perimeter.
Optionally, the control end includes a mobile terminal and/or a remote controller.
Optionally, the electronic device is connected to the mobile terminal through the remote controller;
the image frames are sent to the remote controller and forwarded to the mobile terminal by the remote controller, and the target image frames are stored by the mobile terminal.
Optionally, the image frame set is used to assist the user in finding the landing position of the controlled device.
The electronic device provided by this embodiment can send the image frames of the live view image to the control end, so that the control end can store them once received and, after the connection with the electronic device is disconnected, read the stored image frames for display, thereby assisting the user in searching for the electronic device. In addition, the electronic device performs surround shooting of its surroundings when a specified event occurs, so that the live view image stored at the control end indicates the position and environment of the electronic device, further improving the efficiency with which the user locates it.
The foregoing is a description of an electronic device provided in an embodiment of the present application. Referring to fig. 7, fig. 7 is a schematic structural diagram of a control system according to an embodiment of the present disclosure. The control system includes: a control end 710 and a controlled device 720;
the controlled device 720 is configured to send an image frame of the captured live view image to the control end;
the control end 710 is configured to receive the image frames and store target image frames among the image frames to obtain an image frame set; and, after the connection with the controlled device is disconnected, to read from the image frame set the target image frames at moments adjacent to the disconnection for playback display.
Optionally, the control end is further configured to update the image frame set according to the image frame received in real time when the image frame set does not meet a specified condition.
Optionally, when performing the update of the image frame set according to the image frames received in real time, the control end is specifically configured to replace the temporally earliest target image frame in the image frame set with the image frame received in real time.
Optionally, the specified condition includes a first condition for limiting a data amount corresponding to the image frame set.
Optionally, the first condition includes that the data amount corresponding to the image frame set is less than or equal to a data amount threshold.
Optionally, the specified condition includes a second condition for limiting a number of frames included in the image frame set.
Optionally, the second condition includes that a number of frames included in the image frame set is less than or equal to a frame number threshold.
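Taken together, the update rule and the two specified conditions describe a bounded buffer: the newest target frames are kept, and the temporally earliest frame is evicted once either limit would be exceeded. A minimal sketch, assuming byte-string frames and illustrative limits (neither limit value is specified by this disclosure):

from collections import deque

class ImageFrameSet:
    def __init__(self, max_frames: int = 120, max_bytes: int = 64 * 2**20):
        self.frames = deque()          # (timestamp, frame_bytes), oldest first
        self.max_frames = max_frames   # second condition: frame-count limit
        self.max_bytes = max_bytes     # first condition: data-amount limit
        self.total_bytes = 0

    def add(self, timestamp: float, frame: bytes) -> None:
        self.frames.append((timestamp, frame))
        self.total_bytes += len(frame)
        # Update the set while it fails either specified condition.
        while (len(self.frames) > self.max_frames
               or self.total_bytes > self.max_bytes):
            _, oldest = self.frames.popleft()   # evict the earliest target frame
            self.total_bytes -= len(oldest)

Under this scheme the image frame set always holds the most recent target frames leading up to a possible disconnection, which is what the playback display relies on.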
Optionally, the controlled device is further configured to obtain its own state information and send the state information to the control end;
the control end is further configured to receive the state information and establish an association between the state information and the target image frame at the corresponding time; the state information is used for being displayed together with its associated target image frame.
Optionally, the control end is specifically configured to, when the association relationship between the state information and the target image frame at the corresponding time is established, associate and store the state information and the target image frame at the corresponding time.
Optionally, when the control end performs the display of the status information in cooperation with the associated target image frame, the control end is specifically configured to display the status information on a designated area of a display interface of the target image frame.
Optionally, the display of the status information is triggered by a display instruction input by a user.
Optionally, when the control end performs the display of the state information in cooperation with the associated target image frame, the control end is specifically configured to switch the display interface of the target image frame to the display interface of the state information according to a switching instruction input by a user.
Optionally, the status information includes one or more of the following: height, position, direction of motion, attitude, speed, battery level, and distance information to peripheral objects.
Optionally, the position is determined by the controlled device through GPS positioning, and the position is used for displaying on a display interface of the target image frame in the form of a GPS map.
Optionally, the moving direction is used for displaying on a display interface of the target image frame in the form of a virtual arrow.
Optionally, the distance information is calculated by the controlled device according to a depth image obtained by shooting a peripheral object, and the distance information is used for displaying on an object corresponding to the distance information in the display interface of the target image frame.
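One way to realize the association between a state record and the target image frame at the corresponding time is nearest-timestamp matching, sketched below (nearest-timestamp matching is an assumption for illustration; the disclosure only requires association by corresponding time):

import bisect

def associate_state(frame_times: list[float], state_time: float) -> int:
    # Index of the stored target frame closest in time to the state record.
    i = bisect.bisect_left(frame_times, state_time)
    if i == 0:
        return 0
    if i == len(frame_times):
        return len(frame_times) - 1
    before, after = frame_times[i - 1], frame_times[i]
    return i - 1 if state_time - before <= after - state_time else i

# e.g. a GPS fix stamped at 12.4 s is attached to the frame at 12.5 s:
print(associate_state([10.0, 11.5, 12.5, 14.0], 12.4))  # -> 2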
Optionally, the image frame is obtained by the controlled device performing surround shooting on the periphery when a specified event occurs.
Optionally, the specified event includes that the battery level of the controlled device is less than or equal to a battery level threshold.
Optionally, the specified event includes a collision of the controlled device with an obstacle.
Optionally, the specified event includes that the connection between the controlled device and the control terminal is unstable.
Optionally, the controlled device comprises a drone.
Optionally, when performing the surround shooting of the periphery, the controlled device is specifically configured to shoot around the current position at an obliquely downward viewing angle.
Optionally, the obliquely downward viewing angle is raised as the height of the controlled device decreases.
Optionally, the target image frame is extracted from the image frames.
Optionally, when performing the storing of the target image frame among the image frames, the control end is specifically configured to store the target image frame when the controlled device returns in a low-power state.
Optionally, the control terminal is further configured to determine that a duration of disconnection from the controlled device satisfies a third condition before reading the target image frame for display, where the third condition is set based on a duration threshold.
Optionally, the third condition includes that the disconnection duration is greater than or equal to the duration threshold.
Optionally, reading the target image frame for display is triggered by a playback instruction input by a user.
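A minimal sketch of the third condition acting as a playback gate (the 5-second threshold is an illustrative assumption):

import time

DISCONNECT_THRESHOLD_S = 5.0   # assumed duration threshold

class PlaybackGate:
    def __init__(self):
        self.disconnected_at = None

    def on_link_lost(self):
        self.disconnected_at = time.monotonic()

    def on_link_restored(self):
        self.disconnected_at = None

    def playback_available(self) -> bool:
        # The stored frames are offered for playback only once the link
        # has been down for at least the duration threshold.
        return (self.disconnected_at is not None
                and time.monotonic() - self.disconnected_at >= DISCONNECT_THRESHOLD_S)

A user-issued playback instruction would then be honoured only while playback_available() returns True.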
Optionally, the control end includes a mobile terminal and/or a remote controller.
Optionally, the mobile terminal is connected with the remote controller, and the remote controller is wirelessly connected with the controlled device;
the image frames are acquired from the controlled equipment by the remote controller and then are sent to the mobile terminal, and the target image frames are stored by the mobile terminal.
Optionally, the image frame set is used to assist the user in finding the landing position of the controlled device.
In the control system provided by the embodiment of the present application, the control end can store the image frames of the live view image captured by the controlled device, so that after the connection with the controlled device is disconnected, it can read and display the stored live view image frames. The user can then search for the drone according to the live view image captured before the disconnection, which improves the efficiency of locating the aircraft.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium; the computer program, when executed by a processor, implements any of the image processing methods as provided herein for application to a control terminal.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium; the computer program, when executed by a processor, implements any of the image processing methods as provided herein for application to a controlled device.
In the above embodiments, multiple implementations are provided for each step; as to which implementation is adopted for a given step, a person skilled in the art may, in the absence of conflict or contradiction, freely select or combine them according to the actual situation, thereby forming various further embodiments. For brevity, these are not described one by one here, but it should be understood that such further embodiments also fall within the scope of the disclosure of the embodiments of the present application.
Embodiments of the present application may take the form of a computer program product embodied on one or more storage media including, but not limited to, disk storage, CD-ROM, optical storage, and the like, in which program code is embodied. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The method and device provided by the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is intended only to help understand the method and core idea of the present application. Meanwhile, a person skilled in the art may, following the idea of the present application, make changes to the specific implementations and the scope of application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (123)

1. An image processing method is applied to a control end, and the method comprises the following steps:
receiving image frames sent by a controlled device, and storing target image frames among the image frames to obtain an image frame set; wherein the image frames are image frames of a live view image captured by the controlled device;
and after the connection with the controlled device is disconnected, reading from the image frame set the target image frames at moments adjacent to the disconnection between the control end and the controlled device, for playback display.
2. The image processing method according to claim 1, further comprising:
when the image frame set does not meet the specified conditions, updating the image frame set according to the image frames received in real time.
3. The image processing method of claim 2, wherein said updating the set of image frames from the image frames received in real-time comprises:
replacing the temporally earliest target image frame in the image frame set with the image frames received in real time.
4. The image processing method according to claim 2, wherein the specified condition includes a first condition for limiting an amount of data corresponding to the image frame set.
5. The method of claim 4, wherein the first condition comprises an amount of data corresponding to the set of image frames being less than or equal to a data amount threshold.
6. The image processing method according to claim 2, wherein the specified condition includes a second condition for limiting the number of frames included in the image frame set.
7. The image processing method of claim 6, wherein the second condition comprises a number of frames contained in the set of image frames being less than or equal to a frame number threshold.
8. The image processing method according to claim 1, further comprising:
receiving state information sent by the controlled equipment, and establishing an incidence relation between the state information and the target image frame at the corresponding time; and the state information is used for being matched with the target image frame associated with the state information for display.
9. The image processing method according to claim 8, wherein said establishing an association between the state information and the target image frame at a corresponding time comprises:
and storing the state information in association with the target image frame at the corresponding time.
10. The image processing method of claim 8, wherein displaying the status information in association with the associated target image frame comprises:
and displaying the state information on a designated area of a display interface of the target image frame.
11. The image processing method according to claim 10, wherein the display of the state information is triggered by a display instruction input by a user.
12. The image processing method of claim 8, wherein displaying the status information in association with the associated target image frame comprises:
and switching the display interface of the target image frame to the display interface of the state information according to a switching instruction input by a user.
13. The image processing method of claim 8, wherein the status information comprises one or more of: height, position, direction of motion, attitude, speed, battery level, and distance information to peripheral objects.
14. The image processing method according to claim 13, wherein the position is used for displaying on a display interface of the target image frame in the form of a GPS map.
15. The image processing method according to claim 13, wherein the direction of motion is used for displaying on a display interface of the target image frame in the form of a virtual arrow.
16. The image processing method according to claim 13, wherein the distance information is used for displaying on an object corresponding to the distance information in a display interface of the target image frame.
17. The image processing method according to any one of claims 1 to 16, wherein the image frame is obtained by the controlled device by performing surround shooting on the periphery when a specified event occurs.
18. The image processing method according to claim 17, wherein the specified event includes a power level of the controlled device being less than or equal to a power level threshold.
19. The image processing method according to claim 17, wherein the specified event includes collision of the controlled device with an obstacle.
20. The image processing method according to claim 17, wherein the specified event includes that the connection between the controlled device and the control terminal is unstable.
21. The image processing method of claim 17, wherein the controlled device comprises a drone.
22. The image processing method according to claim 21, wherein said taking a surround shot of the periphery comprises:
and shooting around the current position at an obliquely downward viewing angle.
23. The image processing method according to claim 22, wherein the obliquely downward viewing angle is raised as the height of the controlled device decreases.
24. The image processing method of claim 1, wherein the target image frame is extracted from the image frames.
25. The image processing method according to any one of claims 1 to 24, wherein the storing of the target image frame among the image frames comprises:
and when the controlled device returns with low power, storing a target image frame in the image frames.
26. The image processing method of claim 1, wherein prior to reading the target image frame for display, the method further comprises:
determining that the time duration of disconnection with the controlled device satisfies a third condition, the third condition being set based on a time duration threshold.
27. The method according to claim 26, wherein the third condition comprises a duration of the disconnection being greater than or equal to the duration threshold.
28. The image processing method according to claim 1, wherein reading the target image frame for playback display is triggered by a playback instruction input by a user.
29. The image processing method according to claim 1, wherein the control terminal comprises a mobile terminal and/or a remote controller.
30. The image processing method according to claim 29, wherein the mobile terminal is connected to the remote controller, and the remote controller is wirelessly connected to the controlled device;
the image frames are acquired from the controlled equipment by the remote controller and then are sent to the mobile terminal, and the target image frames are stored by the mobile terminal.
31. The image processing method of claim 1, wherein the set of image frames is used to assist a user in finding a landing location of the controlled device.
32. An image processing method is applied to a controlled device, and comprises the following steps:
acquiring an image frame of a captured live view image, wherein the image frame includes a picture obtained by surround shooting of the periphery of the controlled device when a specified event occurs;
and sending the image frames to a control end, so that the control end stores target image frames among the image frames to obtain an image frame set and, after being disconnected from the controlled device, reads from the image frame set the target image frames at moments adjacent to the disconnection for playback display.
33. The image processing method of claim 32, wherein the specified event comprises a power level of the controlled device being less than or equal to a power level threshold.
34. The image processing method of claim 32, wherein the specified event comprises the controlled device colliding with an obstacle.
35. The image processing method according to claim 32, wherein the specified event includes that the connection between the controlled device and the control terminal is unstable.
36. The image processing method of claim 32, wherein the controlled device comprises a drone.
37. The image processing method according to claim 36, wherein said taking a surround shot of the periphery comprises:
and shooting around the current position at an obliquely downward viewing angle.
38. The image processing method of claim 37, wherein the obliquely downward viewing angle is raised as the height of the controlled device decreases.
39. The image processing method according to claim 32, further comprising:
and acquiring current state information and sending the state information to the control end, so that the control end establishes an association between the state information and the target image frame at the corresponding time, and displays the state information together with its associated target image frame.
40. The image processing method of claim 39, wherein the status information comprises one or more of: height, position, direction of motion, attitude, speed, battery level, and distance information to peripheral objects.
41. The image processing method of claim 40, wherein the location is determined by GPS positioning.
42. The image processing method according to claim 40, wherein the distance information is calculated from a depth image obtained by photographing a peripheral object.
43. The image processing method according to claim 32, wherein the control terminal comprises a mobile terminal and/or a remote controller.
44. The image processing method according to claim 43, wherein the controlled device is wirelessly connected with the remote controller, and the mobile terminal is connected with the remote controller;
the image frames are sent to the remote controller by the controlled equipment, so that the remote controller can transmit the image frames to the mobile terminal for storage.
45. The image processing method of claim 32, wherein the set of image frames is used to assist a user in finding a landing location of the controlled device.
46. A mobile terminal is characterized in that the mobile terminal is connected with a controlled device;
the mobile terminal includes: a processor and a memory storing a computer program;
the processor, when executing the computer program, implements the steps of:
receiving image frames sent by the controlled device, and storing target image frames among the image frames to obtain an image frame set; wherein the image frames are image frames of a live view image captured by the controlled device;
and after the connection with the controlled device is disconnected, reading from the image frame set the target image frames at moments adjacent to the disconnection for playback display.
47. The mobile terminal of claim 46, wherein the processor is further configured to update the set of image frames based on the image frames received in real-time when the set of image frames does not satisfy a specified condition.
48. The mobile terminal of claim 47, wherein the processor, when executing the updating of the set of image frames from the image frames received in real time, is specifically configured to replace the temporally earliest target image frame in the image frame set with the image frames received in real time.
49. The mobile terminal of claim 47, wherein the specified condition comprises a first condition for limiting an amount of data corresponding to the set of image frames.
50. The mobile terminal of claim 49, wherein the first condition comprises an amount of data corresponding to the set of image frames being less than or equal to a data amount threshold.
51. The mobile terminal of claim 47, wherein the specified condition comprises a second condition for limiting a number of frames included in the set of image frames.
52. The mobile terminal of claim 51, wherein the second condition comprises a number of frames included in the set of image frames being less than or equal to a frame number threshold.
53. The mobile terminal of claim 46, wherein the processor is further configured to receive status information sent by the controlled device, and establish an association between the status information and the target image frame at a corresponding time; and the state information is used for being matched with the target image frame associated with the state information for display.
54. The mobile terminal of claim 53, wherein the processor, when performing the associating relationship between the status information and the target image frame at the corresponding time, is specifically configured to store the status information in association with the target image frame at the corresponding time.
55. The mobile terminal of claim 53, wherein the processor, when executing the displaying of the status information in coordination with the associated target image frame, is specifically configured to display the status information on a designated area of a display interface of the target image frame.
56. The mobile terminal of claim 55, wherein the display of the status information is triggered by a display instruction input by a user.
57. The mobile terminal of claim 53, wherein the processor, when executing the displaying of the status information in cooperation with the associated target image frame, is specifically configured to switch a display interface of the target image frame to a display interface of the status information according to a switching instruction input by a user.
58. The mobile terminal of claim 53, wherein the status information comprises one or more of: height, position, direction of motion, attitude, speed, battery level, and distance information to peripheral objects.
59. The mobile terminal of claim 58, wherein the position is used for displaying on a display interface of the target image frame in the form of a GPS map.
60. The mobile terminal of claim 58, wherein the direction of motion is used for displaying on a display interface of the target image frame in the form of a virtual arrow.
61. The mobile terminal of claim 58, wherein the distance information is configured to be displayed on an object corresponding to the distance information in the display interface of the target image frame.
62. The mobile terminal of any of claims 46-61, wherein the image frames are captured by the controlled device around the perimeter when a specified event occurs.
63. The mobile terminal of claim 62, wherein the specified event comprises a power level of the controlled device being less than or equal to a power level threshold.
64. The mobile terminal of claim 62, wherein the specified event comprises the controlled device colliding with an obstacle.
65. The mobile terminal of claim 62, wherein the specified event comprises an unstable connection between the controlled device and the remote control.
66. The mobile terminal of claim 62, wherein the controlled device comprises a drone.
67. The mobile terminal of claim 66, wherein the surround shooting of the periphery comprises:
and shooting around the current position at an obliquely downward viewing angle.
68. The mobile terminal of claim 67, wherein the obliquely downward viewing angle is raised as the height of the controlled device decreases.
69. The mobile terminal of claim 46, wherein the target image frame is extracted from the image frames.
70. The mobile terminal according to any of claims 46 to 69, wherein the processor, when executing the storing of the target image frames among the image frames, is specifically configured to store the target image frames when the controlled device returns in a low-power state.
71. The mobile terminal of claim 46, wherein the processor is further configured to determine that a duration of disconnection from the controlled device satisfies a third condition before reading the target image frame for display, the third condition being set based on a duration threshold.
72. The mobile terminal of claim 71, wherein the third condition comprises the duration of the disconnection being greater than or equal to the duration threshold.
73. The mobile terminal of claim 46, wherein reading the target image frame for playback display is triggered by a playback instruction input by a user.
74. The mobile terminal of claim 46, wherein the mobile terminal is connected to the controlled device via a remote control.
75. The mobile terminal of claim 46, wherein the set of image frames is configured to assist a user in finding a landing location of the controlled device.
76. An electronic device is characterized by being controlled by a control terminal connected with the electronic device;
the electronic device comprises a camera, a processor and a memory storing a computer program;
the processor, when executing the computer program, implements the steps of:
acquiring an image frame of a live view image shot by the camera, wherein the image frame is obtained by the electronic device controlling the camera to perform surround shooting of its surroundings when a specified event occurs;
and sending the image frames to the control end, so that the control end stores target image frames among the image frames to obtain an image frame set and, after being disconnected from the electronic device, reads from the image frame set the target image frames at moments adjacent to the disconnection for playback display.
77. The electronic device of claim 76, wherein the specified event comprises a charge of the electronic device being less than or equal to a charge threshold.
78. The electronic device of claim 76, wherein the specified event comprises the electronic device colliding with an obstacle.
79. The electronic device of claim 76, wherein the specified event comprises an unstable connection between the electronic device and the remote control.
80. The electronic device of claim 76, wherein the electronic device comprises a drone.
81. The electronic device of claim 80, wherein the processor, when executing the controlling of the camera to perform the surround shooting, is specifically configured to control the camera to shoot around the current position at an obliquely downward viewing angle.
82. The electronic device of claim 81, wherein the obliquely downward viewing angle is raised as the height of the electronic device decreases.
83. The electronic device according to claim 76, wherein the processor is further configured to acquire current state information and send the state information to the remote controller, so that after the mobile terminal receives the state information from the remote controller, it establishes an association between the state information and the target image frame at the corresponding time and displays the state information together with its associated target image frame.
84. The electronic device of claim 83, wherein the status information comprises one or more of: height, position, direction of motion, attitude, speed, battery level, and distance information to peripheral objects.
85. The electronic device of claim 84, wherein the location is determined by GPS positioning.
86. The electronic device of claim 84, further comprising: a TOF camera;
the distance information is calculated from a depth image, and the depth image is obtained by shooting peripheral objects with the TOF camera.
87. The electronic device of claim 76, further comprising: a gimbal;
the image frame is obtained by the electronic device controlling the camera, via the gimbal, to perform the surround shooting.
88. The electronic device of claim 76, wherein the control terminal comprises a mobile terminal and/or a remote control.
89. The electronic device of claim 88, wherein the electronic device is connected to the mobile terminal via the remote control;
the image frames are sent to the remote controller and forwarded to the mobile terminal by the remote controller, and the target image frames are stored by the mobile terminal.
90. The electronic device of claim 76, wherein the set of image frames is configured to assist a user in finding a landing location of the controlled device.
91. A control system, comprising: the control end and the controlled equipment;
the controlled device is configured to send the image frames of the captured live view image to the control end;
the control end is configured to receive the image frames and store target image frames among the image frames to obtain an image frame set; and, after the connection with the controlled device is disconnected, to read from the image frame set the target image frames at moments adjacent to the disconnection for playback display.
92. The control system of claim 91, wherein the control end is further configured to update the set of image frames according to the image frames received in real time when the set of image frames does not satisfy a specified condition.
93. The control system according to claim 92, wherein said control end, when performing said updating of said set of image frames based on the image frames received in real time, is specifically configured to replace the temporally earliest target image frame in the image frame set with the image frame received in real time.
94. The control system of claim 91, wherein the specified condition comprises a first condition for limiting an amount of data corresponding to the set of image frames.
95. The control system of claim 94, wherein the first condition comprises an amount of data corresponding to the set of image frames being less than or equal to a data amount threshold.
96. The control system of claim 92 wherein the specified condition comprises a second condition for limiting a number of frames contained in the set of image frames.
97. The control system of claim 96 wherein the second condition comprises a number of frames included in the set of image frames being less than or equal to a frame number threshold.
98. The control system according to claim 91, wherein the controlled device is further configured to obtain state information of the controlled device, and send the state information to the control end;
the control end is further configured to receive the state information and establish an association between the state information and the target image frame at the corresponding time; and the state information is used for being displayed together with its associated target image frame.
99. The control system according to claim 98, wherein said control terminal is specifically configured to store said status information in association with said target image frame at a corresponding time when said establishing of the association relationship between said status information and said target image frame at a corresponding time is performed.
100. The control system according to claim 98, wherein the control terminal is specifically configured to display the status information on a designated area of a display interface of the target image frame when performing the display of the status information in cooperation with the associated target image frame.
101. The control system of claim 100, wherein the display of the status information is triggered by a display instruction entered by a user.
102. The control system according to claim 98, wherein the control terminal is specifically configured to switch the display interface of the target image frame to the display interface of the status information according to a switching instruction input by a user when the status information is displayed in cooperation with the associated target image frame.
103. The control system of claim 98, wherein the status information includes one or more of: height, position, direction of motion, attitude, speed, battery level, and distance information to peripheral objects.
104. The control system of claim 103, wherein the position is determined by the controlled device through GPS positioning, and the position is used for displaying on a display interface of the target image frame in the form of a GPS map.
105. The control system of claim 103, wherein the direction of motion is used for displaying on a display interface of the target image frame in the form of a virtual arrow.
106. The control system according to claim 103, wherein the distance information is calculated by the controlled device according to a depth image obtained by shooting a peripheral object, and the distance information is used for displaying on an object corresponding to the distance information in the display interface of the target image frame.
107. The control system according to any one of claims 91-106, wherein the image frames are captured by the controlled device around the perimeter at the occurrence of a specified event.
108. The control system of claim 107, wherein the specified event comprises a charge of the controlled device being less than or equal to a charge threshold.
109. The control system of claim 107, wherein the specified event comprises the controlled device colliding with an obstacle.
110. The control system of claim 107, wherein the specified event comprises an unstable connection between the controlled device and the control terminal.
111. The control system of claim 107, wherein the controlled device comprises a drone.
112. The control system according to claim 111, wherein the controlled device, when performing the surround shooting of the periphery, is specifically configured to shoot around the current position at an obliquely downward viewing angle.
113. The control system of claim 112, wherein the obliquely downward viewing angle is raised as the height of the controlled device decreases.
114. The control system of claim 91, wherein the target image frames are extracted from the image frames.
115. The control system according to any one of claims 91 to 114, wherein the control end is specifically configured to perform the storing of the target image frame among the image frames when the controlled device returns in a low-power state.
116. The control system of claim 91, wherein the control terminal is further configured to determine that a duration of disconnection from the controlled device satisfies a third condition before reading the target image frame for display, the third condition being set based on a duration threshold.
117. The control system of claim 116, wherein the third condition comprises a length of time for the disconnection being greater than or equal to the length of time threshold.
118. The control system of claim 91, wherein reading the target image frame for playback display is triggered by a playback instruction input by a user.
119. The control system of claim 91, wherein the control terminal comprises a mobile terminal and/or a remote control.
120. The control system according to claim 119, wherein the mobile terminal is connected to the remote controller, and the remote controller is wirelessly connected to the controlled device;
the image frames are acquired from the controlled equipment by the remote controller and then are sent to the mobile terminal, and the target image frames are stored by the mobile terminal.
121. The control system of claim 91, wherein the set of image frames is configured to assist a user in finding a landing location of the controlled device.
122. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program; the computer program, when executed by a processor, implements an image processing method as claimed in any one of claims 1-31.
123. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program; the computer program, when executed by a processor, implements an image processing method as claimed in any one of claims 32-45.
CN202080030104.0A 2020-06-19 2020-06-19 Image processing method, mobile terminal and electronic equipment Active CN113748668B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/097216 WO2021253436A1 (en) 2020-06-19 2020-06-19 Image processing method, mobile terminal, and electronic device

Publications (2)

Publication Number Publication Date
CN113748668A true CN113748668A (en) 2021-12-03
CN113748668B CN113748668B (en) 2023-09-12

Family

ID=78728400

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080030104.0A Active CN113748668B (en) 2020-06-19 2020-06-19 Image processing method, mobile terminal and electronic equipment

Country Status (2)

Country Link
CN (1) CN113748668B (en)
WO (1) WO2021253436A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104679873A (en) * 2015-03-09 2015-06-03 深圳市道通智能航空技术有限公司 Aircraft tracing method and aircraft tracing system
US20150358591A1 (en) * 2014-06-04 2015-12-10 Jae Wan Kim Security method using image frame, device for executing the method, and recording medium that stores the method
US20180359413A1 (en) * 2016-01-29 2018-12-13 SZ DJI Technology Co., Ltd. Method, system, device for video data transmission and photographing apparatus
WO2018227350A1 (en) * 2017-06-12 2018-12-20 深圳市大疆创新科技有限公司 Control method for homeward voyage of unmanned aerial vehicle, unmanned aerial vehicle and machine-readable storage medium
CN110261880A (en) * 2019-06-19 2019-09-20 深圳市道通智能航空技术有限公司 A kind of method, system and unmanned plane for searching for unmanned plane

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109429028A (en) * 2017-08-30 2019-03-05 深圳市道通智能航空技术有限公司 A kind of method and apparatus of unmanned plane image reproducing
US10827123B1 (en) * 2018-01-05 2020-11-03 Gopro, Inc. Modular image capture systems
CN109765587B (en) * 2019-03-06 2024-02-09 深圳飞马机器人股份有限公司 Unmanned aerial vehicle positioning system, unmanned aerial vehicle positioning method and unmanned aerial vehicle monitoring system

Also Published As

Publication number Publication date
CN113748668B (en) 2023-09-12
WO2021253436A1 (en) 2021-12-23

Similar Documents

Publication Publication Date Title
US20200302803A1 (en) Unmanned aerial vehicle return method and device, storage medium and unmanned aerial vehicle
US11722647B2 (en) Unmanned aerial vehicle imaging control method, unmanned aerial vehicle imaging method, control terminal, unmanned aerial vehicle control device, and unmanned aerial vehicle
KR20190080780A (en) Electronic apparatus and method for controlling same
WO2019119434A1 (en) Information processing method, unmanned aerial vehicle, remote control apparatus, and non-volatile storage medium
CN107450573B (en) Flight shooting control system and method, intelligent mobile communication terminal and aircraft
CN108702464B (en) Video processing method, control terminal and mobile device
CN110383814B (en) Control method, unmanned aerial vehicle, remote control device and nonvolatile storage medium
US11228737B2 (en) Output control apparatus, display terminal, remote control system, control method, and non-transitory computer-readable medium
CN106406343A (en) Control method, device and system of unmanned aerial vehicle
US20210152750A1 (en) Information processing apparatus and method for controlling the same
CN110945452A (en) Cloud deck, unmanned aerial vehicle control method, cloud deck and unmanned aerial vehicle
WO2022021438A1 (en) Image processing method, image control method, and related device
CN110337806A (en) Group picture image pickup method and device
JP2018070010A (en) Unmanned aircraft controlling system, controlling method and program thereof
CN110622089A (en) Following control method, control terminal and unmanned aerial vehicle
CN114727028B (en) Image exposure method and device and unmanned aerial vehicle
CN113795803A (en) Flight assistance method, device, chip, system and medium for unmanned aerial vehicle
CN110278717B (en) Method and device for controlling the flight of an aircraft
CN113795805A (en) Flight control method of unmanned aerial vehicle and unmanned aerial vehicle
CN113905211B (en) Video patrol method, device, electronic equipment and storage medium
CN111526280A (en) Control method and device of camera device, electronic equipment and storage medium
CN113748668B (en) Image processing method, mobile terminal and electronic equipment
WO2021217408A1 (en) Unmanned aerial vehicle system, and control method and device therefor
CN110291776B (en) Flight control method and aircraft
JP2019097137A (en) Generation device, generating system, imaging system, movable body, generation method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant