WO2021253436A1 - Image processing method, mobile terminal, and electronic device - Google Patents

Image processing method, mobile terminal, and electronic device

Info

Publication number
WO2021253436A1
WO2021253436A1 (PCT/CN2020/097216)
Authority
WO
WIPO (PCT)
Prior art keywords
image frame
controlled device
target image
mobile terminal
processing method
Prior art date
Application number
PCT/CN2020/097216
Other languages
English (en)
French (fr)
Inventor
翁松伟
郝贵伟
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/097216 priority Critical patent/WO2021253436A1/zh
Priority to CN202080030104.0A priority patent/CN113748668B/zh
Publication of WO2021253436A1 publication Critical patent/WO2021253436A1/zh

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/46 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being of a radio-wave signal type

Definitions

  • This application relates to the field of image processing technology, and in particular to an image processing method, mobile terminal, electronic device, and computer-readable storage medium.
  • A LiveView image is a live view image.
  • The drone can send the captured LiveView image to the wirelessly connected control terminal, so that the user can watch the LiveView image on the control terminal.
  • The wireless connection between the control terminal and the drone will inevitably be broken for various reasons. After the connection is broken, if the drone is not within the user's field of view, the user cannot know the drone's location and will have difficulty finding it.
  • one of the objectives of the present invention is to solve the aforementioned technical problem that the user cannot find the controlled device after the connection between the controlled device and the control terminal is disconnected.
  • the first aspect of the embodiments of the present application provides an image processing method applied to a control terminal, and the method includes:
  • the second aspect of the embodiments of the present application provides an image processing method applied to a controlled device, including:
  • The image frame is sent to the control terminal, so that the control terminal can store the target image frame in the image frame to obtain an image frame set.
  • After disconnecting from the controlled device, the control terminal reads, from the image frame set, the target image frames at moments adjacent to the disconnection for playback display.
  • a third aspect of the embodiments of the present application provides a mobile terminal, where the mobile terminal is connected to a controlled device;
  • the mobile terminal includes: a processor and a memory storing a computer program
  • the processor implements the following steps when executing the computer program:
  • the fourth aspect of the embodiments of the present application provides an electronic device, which is controlled by a control terminal connected to the electronic device;
  • the electronic device includes a camera, a processor, and a memory storing a computer program
  • the processor implements the following steps when executing the computer program:
  • The image frame is sent to the control terminal so that the control terminal can store the target image frame in the image frame to obtain an image frame set.
  • After disconnecting from the electronic device, the control terminal reads, from the image frame set, the target image frames at moments adjacent to the disconnection for playback display.
  • the fifth aspect of the embodiments of the present application provides a control system, including: a control terminal and a controlled device;
  • the controlled device is used to send the image frame of the captured live view image to the control terminal;
  • The control terminal is used to receive the image frame and store the target image frame in it to obtain an image frame set; after disconnecting from the controlled device, it reads, from the image frame set, the target image frames at moments adjacent to the disconnection for playback display.
  • The sixth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, it implements any one of the image processing methods provided in the first aspect.
  • The seventh aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, it implements any one of the image processing methods provided in the second aspect.
  • The control terminal can store the image frames of the live view image taken by the controlled device, so that after the connection with the controlled device is broken, the stored live view image frames can be read and displayed. This lets users search for the controlled device based on the real-time view image from just before the disconnection, which improves search efficiency.
  • FIG. 1 is a schematic diagram of a scene of controlling a drone provided by an embodiment of the present application.
  • Fig. 2 is a flowchart of an image processing method provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a scene of a drone surrounding shooting provided by an embodiment of the present application.
  • Fig. 4 is a flowchart of another image processing method provided by an embodiment of the present application.
  • Fig. 5 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present application.
  • Fig. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • Fig. 7 is a schematic structural diagram of a control system provided by an embodiment of the present application.
  • the remote control device includes a wireless remote control device and a wired remote control device.
  • a wireless remote control device it includes a control terminal and a controlled terminal.
  • the controlled terminal can also be called a controlled device.
  • A wireless connection channel can be established between the control terminal and the controlled device. The user operates the control terminal to make it send out corresponding instructions, which are transmitted to the controlled device through the wireless connection channel; the controlled device then executes the actions corresponding to the received instructions, realizing remote control.
  • remote control devices such as remote control cars, remote control boats, remote control airplanes, remote control robots, etc.
  • various household appliances equipped with remote controls such as air conditioners, TVs, fans, etc.
  • UAV is also a kind of remote control equipment. From the perspective of the entire product, it usually includes a remote control and an unmanned aerial vehicle.
  • the remote control corresponds to the control terminal, and the unmanned aerial vehicle corresponds to the controlled device.
  • the control terminal of the drone is not necessarily a remote control.
  • The control terminal can also be a mobile terminal; that is, the mobile terminal can be directly connected to the drone, and the user can control the drone directly through the mobile terminal.
  • the control terminal can be a combination of a remote control and other devices.
  • Figure 1 is a schematic diagram of a scenario for controlling a drone provided by an embodiment of the present application. As shown in Figure 1, the remote controller can be connected to the mobile terminal; in this case, the remote controller and the mobile terminal together can be considered the control terminal of the drone.
  • The mobile terminal can be any of a variety of electronic devices with processing chips, such as mobile phones, tablets, notebooks, and smart glasses.
  • Its connection with the remote control can also take various forms, such as wireless connections like Bluetooth and WiFi, or a connection through a physical data line.
  • In this case, it may be the remote controller that establishes the connection with the drone.
  • When the user wants to interact with the drone, there are a variety of interaction methods to choose from.
  • one way of interaction is that the user can operate on the mobile terminal.
  • the signal generated by the mobile terminal based on the operation can be transmitted to the remote control, and the remote control forwards the signal to the drone that is wirelessly connected to it.
  • another way of interaction may be that the user directly operates on the remote control, and the signal generated by the remote control based on the operation can be directly sent to the drone.
  • the user can watch the pictures taken by the drone in real time on the control terminal.
  • The drone is equipped with a camera, and the images captured by the camera in real time form the LiveView image (the captured images can be compressed to form the LiveView image, although in one embodiment they may be left unprocessed). The so-called LiveView image is the live view image: the picture corresponding to the LiveView image is the picture being shot by the camera in real time.
  • the drone can send the LiveView image to the control terminal through the wireless transmission module, so that the user can view the picture taken by the drone in real time on the control terminal.
  • The transmission of LiveView images relies on the wireless connection between the control terminal and the drone, but this connection will inevitably be interrupted for various reasons. For example, the drone may fly into an area with no signal, collide with an obstacle and crash, or have too little power left to maintain the wireless connection. In each of these situations the wireless connection between the drone and the control terminal is broken, and once it is broken the user cannot learn the drone's location through the LiveView image. If the drone has flown out of the user's field of view, the user will have great difficulty finding it.
  • FIG. 2 is a flowchart of an image processing method provided by an embodiment of the present application.
  • This method can be applied to the control terminal.
  • the method includes the following steps:
  • S201. Receive an image frame sent by a controlled device, and store a target image frame in the image frame to obtain an image frame set.
  • the controlled device may be an electronic device with motion and shooting functions, such as unmanned aerial vehicles, unmanned vehicles, and unmanned ships.
  • In the following, drones are often used as the example, but they are examples only and should not be construed as limiting the controlled device.
  • The image frame received from the controlled device is the live view image taken by the controlled device, that is, the LiveView image.
  • The LiveView image has been described above and is not repeated here.
  • the image captured by the camera is in frame units, so the control terminal receives the image frame from the controlled device.
  • Since the original purpose of the LiveView image is to let users view in real time, that is, to know what image would be recorded or stored if the capture button were pressed now, LiveView images are not stored in the original design.
  • However, the applicant found that after the drone is disconnected from the control terminal due to an event such as a collision or low battery, the user has no clues to refer to when looking for the drone and can only rely on memories from before the disconnection. In this case, the efficiency of finding the aircraft is very low; a novice user may even panic in such a situation.
  • The applicant therefore proposed this solution: while the connection between the control terminal and the controlled device is still alive, the received image frames of the LiveView image sent by the controlled device can be stored, and the stored image frames form an image frame set.
  • the set of image frames can be understood as a piece of video that has been buffered.
  • The control terminal can read the image frames in the stored image frame set for playback display; that is, the LiveView image from before the disconnection is played back to help the user find the landing position or current position of the controlled device. This improves the user's efficiency in finding the aircraft and, for novice users, also reduces their sense of tension.
  • The image frame read from the image frame set can be an image frame at a moment within a certain time range of the first moment (the moment of disconnection).
  • That is, the aforementioned adjacent moments can be understood as moments within a certain time range of the first moment, and are not limited to one or two adjacent frames.
  • This time range is a small span of time; in other words, the image frames selected are those at moments close to the first moment, or, equivalently, at moments adjacent to the first moment. For example, the range can be the 5 minutes before the first moment, the 30 seconds before the first moment, and so on.
  • The closer the time of the played-back picture is to the current time, the more useful it is for finding the controlled device.
  • In one embodiment, the target images buffered by the control terminal form a short video (for example, 30 s).
  • This short video is updated with the image frames returned by the controlled device in real time, so what the control terminal buffers is always the latest video returned by the drone.
  • After the connection is broken, this video can be played back and displayed directly, so that once communication between the control terminal and the controlled device is interrupted, the playback video helps the user find the drone's location.
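The buffering-and-playback idea above can be sketched as follows. This is a hypothetical illustration; the patent does not prescribe an implementation, and all names here are invented. The control terminal keeps timestamped frames and, on disconnection, selects those at "adjacent moments", i.e. within a short window before the disconnect time:

```python
from collections import deque

# Illustrative buffer of (timestamp_seconds, frame_payload) pairs.
buffer = deque()

def frames_for_playback(buffer, disconnect_time, window=30.0):
    """Return frames whose timestamps fall within `window` seconds
    before the moment the connection was lost ("adjacent moments")."""
    return [f for t, f in buffer if disconnect_time - window <= t <= disconnect_time]

# Example: frames arriving once per second for 100 s, link lost at t = 100.
for t in range(100):
    buffer.append((float(t), f"frame-{t}"))

clip = frames_for_playback(buffer, disconnect_time=100.0, window=30.0)
# The clip covers only the most recent ~30 s of LiveView footage.
```

The window value (30 s here) stands in for the "certain time range" the patent leaves open.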
  • The adjacent moments mentioned in this application may be a single adjacent moment or multiple consecutive adjacent moments, or, of course, multiple moments at intervals.
  • the set of stored image frames can be restricted by specified conditions.
  • the specified condition may be the first condition for restricting the amount of data corresponding to the image frame set, that is, restricting the data amount of the image frame set.
  • For example, a data volume threshold can be set, such as 50 MB, and the data volume of the image frame set can be restricted to be less than or equal to the threshold, or to lie within a certain range near it.
  • the specified condition may be a second condition used to limit the number of frames included in the image frame set.
  • The duration corresponding to one frame is usually fixed, so limiting the number of frames in the image frame set can be understood as limiting the playback duration of the video formed by the set.
  • For example, a frame-count threshold can be set, such as 7200 frames (at a frame rate of 24 FPS this corresponds to a playback time of 300 s), and the number of frames in the set can be restricted to be less than or equal to the threshold, or to lie within a certain range around it.
  • the image frame set can be updated based on the image frames received in real time, so that the storage space occupied by the image frame set is kept appropriate.
  • For example, the image frame received in real time can replace the image frame with the earliest corresponding time in the image frame set. In other words, the oldest stored image frame is discarded and the newly received image frame is stored, so that the image frames displayed in playback correspond to the latest moments before disconnection.
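Both specified conditions, together with the "replace the earliest frame" update rule, can be realized by a bounded buffer that evicts the oldest frames first. The sketch below is illustrative only (the class name and defaults are invented; the 7200-frame and 50 MB figures echo the examples above):

```python
from collections import deque

class FrameSet:
    """Keeps at most `max_frames` frames (second condition) and at most
    `max_bytes` of data (first condition), discarding the oldest frames
    first so playback always covers the latest moments before disconnection."""
    def __init__(self, max_frames=7200, max_bytes=50 * 1024 * 1024):
        self.frames = deque()
        self.max_frames = max_frames
        self.max_bytes = max_bytes
        self.total_bytes = 0

    def add(self, frame: bytes):
        self.frames.append(frame)
        self.total_bytes += len(frame)
        # Evict the earliest frames until both limits hold again.
        while len(self.frames) > self.max_frames or self.total_bytes > self.max_bytes:
            self.total_bytes -= len(self.frames.popleft())

# Example: a 5-frame budget always retains the 5 most recent frames.
fs = FrameSet(max_frames=5, max_bytes=10**6)
for i in range(8):
    fs.add(bytes([i]) * 10)   # 10-byte dummy frames tagged 0..7
```

After the loop, only the frames tagged 3 through 7 remain, matching the behavior described above.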
  • Suppose that in solution A the user can play back the LiveView images from the 1 minute preceding the disconnection, while in solution B the user can play back the LiveView images from the 5 minutes preceding the disconnection.
  • Solution B provides the user with more flight information, so a user of solution B can find the drone more easily than a user of solution A.
  • the stored image frame set needs to be limited by storage space.
  • The embodiments of the present application therefore provide an implementation in which frames are extracted from the image frames received from the controlled device and stored; that is, the received image frames need not all be stored, and target image frames can be extracted and stored instead. In this way, the same time span requires less storage space, and the same storage space can cover a longer time span.
  • For the purpose of finding the aircraft, the time span of the playback video also matters more than its smoothness.
  • In the foregoing embodiment the target image frames are extracted from the received image frames, but in some other embodiments all received image frames may be target image frames and may all be stored.
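The frame-extraction idea can be sketched as a simple decimation step. The extraction interval is not specified by the patent; `keep_every=4` below is purely illustrative:

```python
def extract_targets(frames, keep_every=4):
    """Keep one frame out of every `keep_every` received frames, so the
    same storage budget covers roughly `keep_every` times the time span
    (at the cost of playback smoothness)."""
    return frames[::keep_every]

received = list(range(24))           # e.g. one second of 24 FPS video
targets = extract_targets(received)  # 6 target frames stored instead of 24
```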
  • a playback button can be popped up for the user to click after the connection is disconnected.
  • the playback function can be set on the function page of the aircraft.
  • The controlled device's status information can also be transmitted along with the image frames of the LiveView image.
  • After the control terminal receives the image frames and status information sent by the drone, it can associate each image frame with the status information corresponding to the same time.
  • In this way, during playback the status information can be displayed in coordination with the associated image frame to help the user find the aircraft.
  • For example, an image frame can be stored in association with the status information corresponding to the same time.
  • Alternatively, the association between image frames and status information can be recorded in a configuration file; then, when the LiveView image is played back, the corresponding image frame and status information can be read according to the associations recorded in the configuration file and displayed together.
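One hypothetical way to realize such a configuration file is a list of records keyed by timestamp, which playback joins against. The record layout and field names below are invented for illustration:

```python
import json

def associate(frames, statuses):
    """frames: {timestamp: frame_id}; statuses: {timestamp: status dict}.
    Returns a 'configuration file' (JSON text) linking each frame to the
    status information captured at the same time, when one exists."""
    records = [
        {"t": t, "frame": frame, "status": statuses.get(t)}
        for t, frame in sorted(frames.items())
    ]
    return json.dumps(records)

config = associate(
    {0: "f0", 1: "f1"},
    {0: {"height_m": 30, "battery_pct": 12},
     1: {"height_m": 28, "battery_pct": 11}},
)

# During playback, each frame and its associated status are read back together.
loaded = json.loads(config)
```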
  • For example, the status information can be overlaid on the display interface of the image frame.
  • A display switch can also be provided: when the user turns on the display switch (inputs a display instruction), the status information is overlaid on the display interface of the image frame, and when the user turns off the display switch, the status information is hidden.
  • Alternatively, the status information can have its own display interface. When the user is playing back the LiveView image and needs to consult the status information, the user can input a switching instruction to switch from the display interface of the current image frame to the display interface of the status information.
  • the status information of the drone can include various information, such as height, position, direction of movement, attitude, speed, power, distance information from surrounding objects, and so on.
  • This information can be determined by various sensors or other hardware modules configured on the drone.
  • For example, the position information can be determined by the drone's GPS positioning module, and the attitude information can be determined by the drone's inertial measurement unit (IMU).
  • The height and distance information can be determined by the drone through its perception module.
  • Different status information can be assigned to different areas for display: position, attitude, speed, power, and similar information can be overlaid on the corners of the display interface, such as the lower right or upper right corner, while the movement direction can be displayed at the top of the display interface, and so on.
  • This application does not limit the display position of the status information.
  • the display form of different status information can also be defined according to requirements.
  • location information can be displayed in the form of GPS map
  • movement direction can be displayed in the form of virtual arrows
  • distance information can be displayed in fusion with image frames.
  • The distance information here is the distance to surrounding objects; for example, the distance to building A is 100 m and the distance to building B is 150 m.
  • The distance information can be displayed around the corresponding object: for example, 100 m can be displayed around building A in the image frame's display interface, and 150 m can be displayed around building B.
  • a third condition for limiting the disconnection time can be set.
  • The third condition may be set based on a duration threshold; for example, it may require that the duration of disconnection be greater than or equal to the threshold. After the control terminal is disconnected from the controlled device, it can then further judge whether the duration of the disconnection satisfies the third condition, and only when the third condition is satisfied does it read the image frames of the LiveView image for display.
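The third condition amounts to a simple guard before playback is triggered. The threshold value below is illustrative, not from the patent:

```python
def should_play_back(disconnect_duration_s, threshold_s=3.0):
    """Trigger playback only when the link has been down for at least
    `threshold_s` seconds, so brief drops that recover on their own
    do not needlessly interrupt the user with a playback screen."""
    return disconnect_duration_s >= threshold_s
```

For instance, `should_play_back(5.0)` starts playback, while `should_play_back(1.0)` ignores a momentary drop.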
  • the above-mentioned storage of the image frames of the LiveView image may be performed under certain conditions, and the control terminal does not necessarily need to cache a section of the LiveView image at any time.
  • the control terminal can store the received image frame of the LiveView image when the power of the controlled device is low and returns to home automatically (return to home with low power).
  • As another example, the control terminal may start to store the image frames of the LiveView image only when an abnormality or other event occurs in the controlled device.
  • The drone can be made to shoot its surroundings before disconnection, so as to record its location more comprehensively. After disconnection, the user can replay on the control terminal the LiveView images captured by the drone, so that the drone can be found more quickly.
  • the drone can perform the above-mentioned surround shooting of the surroundings.
  • As for the designated event, it can be any of a variety of events from which a disconnection can be foreseen.
  • the designated event can be that the controlled device is in a low power state, specifically, that the power of the drone is less than or equal to the power threshold.
  • In a low power state, the drone can foresee that its connection with the control terminal is about to be broken; it can take a surround shot of its surroundings before the connection is lost and inform the control terminal of its location in the form of a LiveView image.
  • the designated event may be a collision between the controlled device and an obstacle.
  • In many collision scenarios, the drone can foresee the coming disconnection.
  • For example, when the drone is forced to descend (fall) due to a collision, it is foreseeable that the drone will struggle to maintain its connection with the control terminal after a secondary collision with the ground, so the drone can shoot its surroundings before the connection is lost.
  • the designated event may be that the connection between the controlled device and the control terminal is unstable.
  • For example, the connection between the drone and the control terminal may become more and more unstable, with short disconnections occurring at a higher and higher frequency.
  • When the frequency of disconnections exceeds a frequency threshold, the drone can be made to shoot a circle around itself, recording the environment in which it is located, to prevent the drone from being hard to find after the connection with the control terminal is completely lost.
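Detecting an "unstable connection" could, for example, count short disconnections in a sliding window and compare against a frequency threshold. This is a sketch under those assumptions; the window length and threshold are invented:

```python
from collections import deque

class LinkMonitor:
    """Flags the link as unstable when more than `max_drops` short
    disconnections occur within the last `window_s` seconds."""
    def __init__(self, window_s=60.0, max_drops=3):
        self.window_s = window_s
        self.max_drops = max_drops
        self.drops = deque()   # timestamps of recent drops

    def record_drop(self, t):
        self.drops.append(t)
        # Discard drops that have slid out of the window.
        while self.drops and self.drops[0] < t - self.window_s:
            self.drops.popleft()

    def unstable(self):
        return len(self.drops) > self.max_drops

mon = LinkMonitor(window_s=60.0, max_drops=3)
for t in (1.0, 10.0, 20.0, 30.0):   # four short drops inside one minute
    mon.record_drop(t)
# mon.unstable() is now True, which would trigger surround shooting.
```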
  • For example, the drone can be equipped with a gimbal, and the camera can be controlled through the gimbal to achieve surround shooting; as another example, the drone can rotate its body by adjusting its attitude, likewise realizing surround shooting by the camera.
  • For example, the drone can be made to shoot around its current location from an obliquely downward viewing angle. Furthermore, considering that the occurrence of the above-mentioned designated events is usually followed by the drone landing, in one embodiment the obliquely downward viewing angle can be gradually raised as the drone's height decreases.
  • the drone can adjust the pitch angle of the camera through the gimbal, so that the shooting angle of view has a pitch direction.
  • the drone can also directly adjust the pitch of the fuselage, so as to drive the camera's shooting angle of view to change in the pitch direction.
  • the drone can also adjust the yaw angle (yaw) of the camera through the gimbal, so that the camera can rotate in a circle on the horizontal plane, so as to realize surrounding shooting of the surrounding environment.
  • The drone can also adjust the yaw angle of the fuselage to drive the camera's shooting angle in a horizontal circular rotation.
  • In another embodiment, the drone can also turn to the left or right by adjusting the pitch angle of the gimbal together with the roll angle of the drone body, likewise driving the camera to record the surrounding environment from an obliquely downward angle.
  • FIG. 3 is a schematic diagram of a surrounding shooting scene of a drone provided by an embodiment of the present application.
  • When the drone shoots its surroundings, it is not limited to exactly one circle. In some scenes, the drone can shoot half a circle (that is, 180 degrees), three quarters of a circle (that is, 270 degrees), two circles, and so on; this application does not limit the number of circles.
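The surround-shooting behavior, with the obliquely downward angle rising as the drone descends, could be sketched as generating yaw/pitch setpoints for the camera. This is a hypothetical planner, not DJI's actual API; negative pitch means the camera points downward, and all parameter values are invented:

```python
def surround_setpoints(steps=8, laps=1.0, pitch_low_deg=-60.0, pitch_high_deg=-20.0):
    """Return (yaw_deg, pitch_deg) camera setpoints covering `laps` circles.
    Pitch interpolates from steeply downward toward level over the course
    of the maneuver, mirroring the viewing angle being raised as the
    drone's height decreases. `laps` may be 0.5, 0.75, 1, 2, etc."""
    total = int(steps * laps)
    points = []
    for i in range(total):
        yaw = (360.0 * laps * i / total) % 360.0
        frac = i / max(total - 1, 1)   # 0 at the start, 1 near landing
        pitch = pitch_low_deg + (pitch_high_deg - pitch_low_deg) * frac
        points.append((yaw, pitch))
    return points

pts = surround_setpoints()   # one full circle sampled at 8 headings
```

Each setpoint would be handed to the gimbal (or, in the fuselage-rotation embodiment, to the flight controller) as the drone records its surroundings.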
  • the control terminal may include a mobile terminal and a remote controller, wherein the mobile terminal is connected with the remote controller, and the remote controller is wirelessly connected with the controlled device.
  • The image frames of the LiveView image sent by the controlled device can be received by the remote control, which forwards them to the mobile terminal; the mobile terminal then stores them.
  • the mobile terminal may be smart glasses.
  • The smart glasses can be connected to the drone directly, or through the remote control; they can receive the image frames of the LiveView image taken by the drone and cache them in their memory.
  • the smart glasses can read the stored LiveView image and play back the image frame to the user, so that the user can quickly find the drone.
  • The control terminal can store the image frames of the live view image, so that after the connection with the controlled device is broken, the stored live view image frames can be read for display. Users can then search for the drone based on the real-time view image from before the disconnection, which improves the efficiency of finding the aircraft.
  • FIG. 4 is a flowchart of another image processing method provided by an embodiment of the present application. This method is applied to the controlled device.
  • the method includes:
  • S402. Send the image frame to the control terminal, so that the control terminal can store the target image frame in the image frame to obtain an image frame set and, after disconnecting from the controlled device, read from the image frame set the target image frames at moments adjacent to the disconnection for playback display.
  • the specified event is not necessarily an abnormal event of the controlled device, but may also be a normal event such as the normal operation of the controlled device.
  • the specified event includes that the power of the controlled device is less than or equal to a power threshold.
  • the designated event includes a collision between the controlled device and an obstacle.
  • the specified event includes that the connection between the controlled device and the control terminal is unstable.
  • the controlled device includes a drone.
  • the surround shooting of the surroundings includes:
  • the obliquely downward viewing angle is raised as the height of the controlled device decreases.
  • it also includes:
  • the state information includes one or more of the following: height, position, movement direction, posture, speed, power, and distance information from surrounding objects.
  • the position is determined by GPS positioning.
  • the distance information is calculated based on a depth image obtained by shooting surrounding objects.
  • control terminal includes a mobile terminal and/or a remote controller.
  • the controlled device is connected to the remote control wirelessly, and the mobile terminal is connected to the remote control;
  • the image frame is sent by the controlled device to the remote control, so that the remote control transmits the image frame to the mobile terminal for storage.
  • the image frame set is used to assist the user in finding the landing position of the controlled device.
  • the controlled device can send the image frames of the live view image to the control terminal, so that the control terminal can store them; after the connection with the controlled device is broken, the stored image frames can be read and displayed, helping the user find the controlled device.
  • the controlled device can also take pictures of the surroundings when a specified event occurs, so that the real-time view image stored by the control terminal can indicate the location and environment of the controlled device, which further improves the efficiency of the user in finding the controlled device.
  • FIG. 5 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
  • the mobile terminal can be connected with the controlled device.
  • the mobile terminal includes: a processor 510 and a memory 520 storing a computer program
  • the processor implements the following steps when executing the computer program:
  • the processor is further configured to, when the set of image frames does not meet a specified condition, update the set of image frames according to the image frames received in real time.
  • the processor, when updating the image frame set according to the image frames received in real time, is specifically configured to replace the target image frame with the earliest corresponding time in the image frame set with the image frame received in real time.
  • the specified condition includes a first condition for limiting the amount of data corresponding to the set of image frames.
  • the first condition includes that the amount of data corresponding to the set of image frames is less than or equal to a data amount threshold.
  • the specified condition includes a second condition for limiting the number of frames included in the image frame set.
  • the second condition includes that the number of frames included in the image frame set is less than or equal to a frame number threshold.
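The bounded cache described by the first and second conditions, together with the replace-the-earliest-frame update rule, can be sketched as a ring-buffer style store. This is a hedged illustration only; the class name, thresholds, and `(timestamp, bytes)` frame representation are assumptions, not part of the claims:

```python
from collections import deque

class LiveViewCache:
    """Caches target image frames under a frame-count and data-amount limit.

    When storing a new frame would violate either limit, the frame with the
    earliest corresponding time is discarded first (the update rule above).
    """

    def __init__(self, max_frames=7200, max_bytes=50 * 1024 * 1024):
        self.max_frames = max_frames   # second condition: frame-number threshold
        self.max_bytes = max_bytes     # first condition: data-amount threshold
        self.frames = deque()          # (timestamp, frame_bytes), oldest on the left
        self.total_bytes = 0

    def store(self, timestamp, frame_bytes):
        # Drop the earliest frames until the new frame fits both limits.
        while self.frames and (
            len(self.frames) + 1 > self.max_frames
            or self.total_bytes + len(frame_bytes) > self.max_bytes
        ):
            _, oldest = self.frames.popleft()
            self.total_bytes -= len(oldest)
        self.frames.append((timestamp, frame_bytes))
        self.total_bytes += len(frame_bytes)

    def playback_near(self, disconnect_time, window_s=30.0):
        # Frames at moments adjacent to the disconnection time.
        return [f for t, f in self.frames
                if disconnect_time - window_s <= t <= disconnect_time]
```

Because the oldest frame is always evicted first, the cached segment is always the most recent video returned before the disconnection, which is what the playback feature relies on.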
  • the processor is further configured to receive status information sent by the controlled device and establish an association between the status information and the target image frame at the corresponding time; the status information is used to be displayed together with its associated target image frame.
  • the processor, when establishing the association between the status information and the target image frame at the corresponding time, is specifically configured to store the status information in association with the target image frame at the corresponding time.
  • the processor is specifically configured to display the state information on a designated area of a display interface of the target image frame when performing coordinated display of the state information and the associated target image frame.
  • the display of the status information is triggered by a display instruction input by the user.
  • the processor, when displaying the status information together with its associated target image frame, is specifically configured to switch the display interface of the target image frame to the display interface of the status information according to a switching instruction input by the user.
  • the state information includes one or more of the following: height, position, movement direction, posture, speed, power, and distance information from surrounding objects.
  • the location is used to be displayed on the display interface of the target image frame in the form of a GPS map.
  • the movement direction is used to be displayed on the display interface of the target image frame in the form of a virtual arrow.
  • the distance information is used to display on an object corresponding to the distance information in a display interface of the target image frame.
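One way to realize the timestamp association and coordinated display described in the bullets above is a lookup table keyed by frame time. The sketch below is a hedged illustration; the status field names and the text-overlay representation are assumptions:

```python
class StatusAssociator:
    """Associates received status information with the target frame at the
    corresponding time, so playback can show them together."""

    def __init__(self):
        self.status_by_time = {}

    def associate(self, timestamp, status):
        # status: e.g. {"height": 35.0, "battery": 18} (illustrative fields)
        self.status_by_time[timestamp] = status

    def display_overlay(self, timestamp, frame_label):
        # Returns the text that would be overlaid on a designated area of the
        # frame's display interface; returns the frame alone if no status exists.
        status = self.status_by_time.get(timestamp)
        if status is None:
            return frame_label
        info = ", ".join(f"{k}={v}" for k, v in sorted(status.items()))
        return f"{frame_label} [{info}]"
```

The same table could equally be persisted as a configuration file recording frame-to-status associations, as the description also contemplates.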
  • the image frame is obtained by the controlled device surrounding the surrounding shooting when a specified event occurs.
  • the specified event includes that the power of the controlled device is less than or equal to a power threshold.
  • the specified event includes a collision between the controlled device and an obstacle.
  • the specified event includes that the connection between the controlled device and the remote controller is unstable.
  • the controlled device includes a drone.
  • the surround shooting of the surroundings includes shooting around the current position with an obliquely downward viewing angle.
  • the obliquely downward viewing angle is raised as the height of the controlled device decreases.
  • the target image frame is extracted from the image frame.
  • the processor, when storing the target image frame among the image frames, is specifically configured to store the target image frame among the image frames when the controlled device returns on low battery.
  • the processor is further configured to, before reading the target image frame for display, determine that the duration of the disconnection from the controlled device satisfies a third condition, the third condition being set based on a duration threshold.
  • the third condition includes that the duration of the disconnection is greater than or equal to the duration threshold.
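The third condition acts as a debounce: brief link drops that recover within a second or so should not trigger playback. A minimal sketch, with the threshold value as an assumption:

```python
def should_trigger_playback(disconnect_duration_s, duration_threshold_s=5.0):
    """Third condition: playback is offered only when the disconnection has
    lasted at least the duration threshold (short drops often self-recover)."""
    return disconnect_duration_s >= duration_threshold_s
```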
  • reading the target image frame for display is triggered by a playback instruction input by the user.
  • the mobile terminal is connected to the controlled device through a remote controller.
  • the image frame set is used to assist the user in finding the landing position of the controlled device.
  • the mobile terminal provided in this embodiment can store the image frames of the real-time view image taken by the controlled device, so that after the connection with the controlled device is disconnected, the stored real-time view image frame can be read for display.
  • the user can search for the controlled device based on the live view image before the connection is disconnected, which improves the efficiency of searching.
  • FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • the electronic device is controlled by the control terminal connected to the electronic device;
  • the electronic device includes a camera 610, a processor 620, and a memory 630 storing a computer program
  • the processor implements the following steps when executing the computer program:
  • the image frames are sent to the control terminal, so that the control terminal stores the target image frames among the image frames to obtain an image frame set and, after disconnecting from the electronic device, reads from the image frame set the target image frames at moments adjacent to the disconnection for playback display.
  • the specified event includes that the power of the electronic device is less than or equal to a power threshold.
  • the specified event includes a collision between the electronic device and an obstacle.
  • the specified event includes that the connection between the electronic device and the remote control is unstable.
  • the electronic equipment includes a drone.
  • the processor, when controlling the camera to perform surround shooting, is specifically configured to control the camera to shoot around the current position with an obliquely downward angle of view.
  • the obliquely downward viewing angle is raised as the height of the electronic device decreases.
  • the processor is further configured to obtain current status information and send it to the remote controller, so that, after receiving the status information from the remote controller, the mobile terminal establishes an association between the status information and the target image frame at the corresponding time and displays the status information together with its associated target image frame.
  • the state information includes one or more of the following: height, position, movement direction, posture, speed, power, and distance information from surrounding objects.
  • the position is determined by GPS positioning.
  • it also includes: a TOF camera;
  • the distance information is computed from a depth image, and the depth image is obtained by photographing surrounding objects with the TOF camera.
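How distance information might be derived from a TOF depth image: take a robust statistic of the depth values falling on the object of interest. This is a hedged sketch; the rectangular region format and the choice of a median are illustrative assumptions:

```python
from statistics import median

def object_distance(depth_image, region):
    """Estimate the distance (in the depth image's units) to the object
    covered by `region`, a (row0, row1, col0, col1) rectangle.

    The median is used so stray invalid pixels do not skew the result.
    """
    r0, r1, c0, c1 = region
    values = [
        depth_image[r][c]
        for r in range(r0, r1)
        for c in range(c0, c1)
        if depth_image[r][c] > 0  # 0 often marks "no return" in TOF data
    ]
    if not values:
        raise ValueError("no valid depth samples in region")
    return median(values)
```

The same computation against a region covering the ground gives the height estimate mentioned in the description.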
  • it also includes: a gimbal;
  • the image frames are obtained by the electronic device controlling the camera through the gimbal to perform surround shooting.
  • control terminal includes a mobile terminal and/or a remote controller.
  • the electronic device is connected to the mobile terminal through the remote controller;
  • the image frame is sent to the remote control and forwarded by the remote control to the mobile terminal, and the target image frame is stored by the mobile terminal.
  • the image frame set is used to assist the user in finding the landing position of the controlled device.
  • the electronic device provided in this embodiment can send the image frame of the live view image to the control terminal, so that after acquiring the image frame of the live view image, the control terminal can store the image frame of the live view image, and further, After the connection with the electronic device is disconnected, the control terminal can read the stored image frame for display to assist the user in searching for the electronic device.
  • the electronic device can also photograph its surroundings when a specified event occurs, so that the live view image stored by the control terminal can indicate the location and environment of the electronic device, which further improves the user's efficiency in finding the electronic device.
  • FIG. 7 is a schematic structural diagram of a control system provided by an embodiment of the present application.
  • the control system includes: a control terminal 710 and a controlled device 720;
  • the controlled device 720 is configured to send image frames of the captured live view image to the control terminal;
  • the control terminal 710 is configured to receive the image frames and store the target image frames among them to obtain an image frame set; after disconnecting from the controlled device, read from the image frame set the target image frames at moments adjacent to the disconnection for playback display.
  • control terminal is further configured to, when the set of image frames does not meet a specified condition, update the set of image frames according to the image frames received in real time.
  • the control terminal, when updating the image frame set according to the image frames received in real time, is specifically configured to replace the target image frame with the earliest corresponding time in the image frame set with the image frame received in real time.
  • the specified condition includes a first condition for limiting the amount of data corresponding to the set of image frames.
  • the first condition includes that the amount of data corresponding to the set of image frames is less than or equal to a data amount threshold.
  • the specified condition includes a second condition for limiting the number of frames included in the image frame set.
  • the second condition includes that the number of frames included in the image frame set is less than or equal to a frame number threshold.
  • the controlled device is further configured to obtain its own status information and send the status information to the control terminal;
  • the control terminal is also configured to receive the status information and establish an association between the status information and the target image frame at the corresponding time; the status information is used to be displayed together with its associated target image frame.
  • the control terminal, when establishing the association between the status information and the target image frame at the corresponding time, is specifically configured to store the status information in association with the target image frame at the corresponding time.
  • control terminal is specifically configured to display the state information on a designated area of the display interface of the target image frame when performing coordinated display of the state information and the associated target image frame.
  • the display of the status information is triggered by a display instruction input by the user.
  • the control terminal, when displaying the status information together with its associated target image frame, is specifically configured to switch the display interface of the target image frame to the display interface of the status information according to a switching instruction input by the user.
  • the state information includes one or more of the following: height, position, movement direction, posture, speed, power, and distance information from surrounding objects.
  • the location is determined by the controlled device through GPS positioning, and the location is used to display on the display interface of the target image frame in the form of a GPS map.
  • the movement direction is used to be displayed on the display interface of the target image frame in the form of a virtual arrow.
  • the distance information is calculated by the controlled device from a depth image obtained by shooting surrounding objects, and the distance information is used to be displayed on the object corresponding to it in the display interface of the target image frame.
  • the image frames are obtained by the controlled device performing surround shooting of its surroundings when a specified event occurs.
  • the specified event includes that the power of the controlled device is less than or equal to a power threshold.
  • the specified event includes a collision between the controlled device and an obstacle.
  • the specified event includes that the connection between the controlled device and the control terminal is unstable.
  • the controlled device includes a drone.
  • when performing surround shooting of the surroundings, the controlled device is specifically configured to shoot around the current position with an obliquely downward angle of view.
  • the obliquely downward viewing angle is raised as the height of the controlled device decreases.
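The raising of the obliquely downward angle as the aircraft descends can be sketched as a simple mapping from altitude to camera pitch. The altitude bounds and angle range below are illustrative assumptions, not values taken from the application:

```python
def camera_pitch_deg(height_m, min_height=2.0, max_height=50.0,
                     low_pitch=-10.0, high_pitch=-60.0):
    """Pitch command for surround shooting: steeply downward when high,
    raised toward the horizon as the aircraft descends.

    0 deg = level, negative = pointing below the horizon.
    """
    h = max(min_height, min(max_height, height_m))
    frac = (h - min_height) / (max_height - min_height)  # 0 when low, 1 when high
    return low_pitch + frac * (high_pitch - low_pitch)
```

In practice this pitch could be commanded through the gimbal or by tilting the airframe, as the description notes; yaw rotation for the surround sweep is handled separately.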
  • the target image frame is extracted from the image frame.
  • the control terminal, when storing the target image frame among the image frames, is specifically configured to store the target image frame among the image frames when the controlled device returns on low battery.
  • the control terminal is further configured to, before reading the target image frame for display, determine that the duration of the disconnection from the controlled device satisfies a third condition, the third condition being set based on a duration threshold.
  • the third condition includes that the duration of the disconnection is greater than or equal to the duration threshold.
  • reading the target image frame for display is triggered by a playback instruction input by the user.
  • control terminal includes a mobile terminal and/or a remote controller.
  • the mobile terminal is connected to the remote control, and the remote control is wirelessly connected to the controlled device;
  • the image frame is acquired by the remote controller from the controlled device and then sent to the mobile terminal, and the target image frame is stored by the mobile terminal.
  • the image frame set is used to assist the user in finding the landing position of the controlled device.
  • the control terminal can store the image frames of the live view image captured by the controlled device, so that after the connection with the controlled device is broken, the stored live view image frames can be read and displayed, allowing users to search for the drone based on the live view image from before the disconnection, which improves the efficiency of finding the aircraft.
  • the embodiments of the present application also provide a computer-readable storage medium storing a computer program; when executed by a processor, the computer program implements any of the image processing methods applied to a control terminal provided in the present application.
  • the embodiments of the present application also provide a computer-readable storage medium storing a computer program; when executed by a processor, the computer program implements any of the image processing methods applied to a controlled device provided in the present application.
  • the embodiments of the present application may adopt the form of a computer program product implemented on one or more storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing program codes.
  • Computer usable storage media include permanent and non-permanent, removable and non-removable media, and information storage can be achieved by any method or technology.
  • the information can be computer-readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include, but are not limited to: phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disc (DVD) or other optical storage, Magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices or any other non-transmission media can be used to store information that can be accessed by computing devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

Embodiments of the present application disclose an image processing method applied to a control terminal. The method includes: receiving image frames sent by a controlled device, and storing target image frames among the image frames to obtain an image frame set, where the image frames are a live view image captured by the controlled device; and, after disconnecting from the controlled device, reading from the image frame set the target image frames at moments adjacent to the time when the control terminal was disconnected from the controlled device, for playback display. With the method provided by the embodiments of the present application, the control terminal can store the image frames of the live view image captured by the controlled device, so that after the connection with the controlled device is broken, the stored live view image frames can be read and displayed, allowing the user to search for the controlled device based on the live view image from before the disconnection, which improves search efficiency.

Description

Image Processing Method, Mobile Terminal, and Electronic Device
Technical Field
The present application relates to the technical field of image processing, and in particular to an image processing method, a mobile terminal, an electronic device, and a computer-readable storage medium.
Background
A LiveView image is a real-time framing (live view) image. During the flight of a drone, the drone can send the LiveView image it captures to a wirelessly connected control terminal, so that the user can watch the LiveView image on the control terminal. However, the wireless connection between the control terminal and the drone will inevitably be broken for a variety of reasons. After that wireless connection is broken, if the drone is not within the user's field of view, the user cannot know the drone's position and will have difficulty finding it.
Summary
In view of this, one object of the present invention is to solve the above-mentioned technical problem that the user cannot find the controlled device after the connection between the controlled device and the control terminal is broken.
A first aspect of the embodiments of the present application provides an image processing method applied to a control terminal, the method including:
receiving image frames sent by a controlled device, and storing target image frames among the image frames to obtain an image frame set, where the image frames are a live view image captured by the controlled device;
after disconnecting from the controlled device, reading from the image frame set the target image frames at moments adjacent to the time when the control terminal was disconnected from the controlled device, for playback display.
A second aspect of the embodiments of the present application provides an image processing method applied to a controlled device, including:
acquiring image frames of a captured live view image, the image frames including frames obtained by the controlled device performing surround shooting of its surroundings when a specified event occurs;
sending the image frames to a control terminal, so that the control terminal stores target image frames among the image frames to obtain an image frame set and, after disconnecting from the controlled device, reads from the image frame set the target image frames at moments adjacent to the time when the control terminal was disconnected from the controlled device, for playback display.
A third aspect of the embodiments of the present application provides a mobile terminal, the mobile terminal being connected to a controlled device;
the mobile terminal includes a processor and a memory storing a computer program;
when executing the computer program, the processor implements the following steps:
receiving image frames sent by the controlled device, and storing target image frames among the image frames to obtain an image frame set, where the image frames are a live view image captured by the controlled device;
after disconnecting from the controlled device, reading from the image frame set the target image frames at moments adjacent to the time when the control terminal was disconnected from the controlled device, for playback display.
A fourth aspect of the embodiments of the present application provides an electronic device, controlled by a control terminal connected to the electronic device;
the electronic device includes a camera, a processor, and a memory storing a computer program;
when executing the computer program, the processor implements the following steps:
acquiring image frames of a live view image captured by the camera, the image frames being obtained by the electronic device controlling the camera to perform surround shooting of its surroundings when a specified event occurs;
sending the image frames to the control terminal, so that the control terminal stores target image frames among the image frames to obtain an image frame set and, after disconnecting from the electronic device, reads from the image frame set the target image frames at moments adjacent to the time when the control terminal was disconnected from the controlled device, for playback display.
A fifth aspect of the embodiments of the present application provides a control system, including a control terminal and a controlled device;
the controlled device is configured to send image frames of a captured live view image to the control terminal;
the control terminal is configured to receive the image frames and store target image frames among the image frames to obtain an image frame set, and, after disconnecting from the controlled device, read from the image frame set the target image frames at moments adjacent to the time when the control terminal was disconnected from the controlled device, for playback display.
A sixth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program; when executed by a processor, the computer program implements any of the image processing methods provided in the first aspect above.
A seventh aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program; when executed by a processor, the computer program implements any of the image processing methods provided in the second aspect above.
With the image processing method provided by the embodiments of the present application, the control terminal can store the image frames of the live view image captured by the controlled device, so that after the connection with the controlled device is broken, the stored live view image frames can be read and displayed, allowing the user to search for the controlled device based on the live view image from before the disconnection, which improves search efficiency.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from them without creative effort.
FIG. 1 is a schematic diagram of a scene of operating a drone provided by an embodiment of the present application.
FIG. 2 is a flowchart of an image processing method provided by an embodiment of the present application.
FIG. 3 is a schematic diagram of a scene of a drone performing surround shooting provided by an embodiment of the present application.
FIG. 4 is a flowchart of another image processing method provided by an embodiment of the present application.
FIG. 5 is a schematic structural diagram of a mobile terminal provided by an embodiment of the present application.
FIG. 6 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
FIG. 7 is a schematic structural diagram of a control system provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application.
Remote control devices include wireless and wired remote control devices. A wireless remote control device includes a control terminal and a controlled end; the controlled end may also be called a controlled device. A wireless connection channel can be established between the control terminal and the controlled device; by operating the control terminal, the user causes it to issue corresponding instructions, which are transmitted to the controlled device through the wireless connection channel, and the controlled device executes corresponding actions according to the received instructions, realizing the remote control function.
There are many kinds of such remote control devices, such as remote-controlled cars, boats, aircraft, and robots, as well as various household appliances equipped with remote controllers, such as air conditioners, televisions, and fans.
A drone is also a remote control device. As a complete product, it usually includes a remote controller and the drone itself, where the remote controller corresponds to the control terminal and the drone corresponds to the controlled device. Note that in practice the control terminal of a drone is not necessarily a remote controller: in one case, the control terminal may be a mobile terminal, i.e., the mobile terminal can be connected directly to the drone and operate it directly. In another case, the control terminal may be a combination of a remote controller and other devices. For example, see FIG. 1, a schematic diagram of a scene of operating a drone provided by an embodiment of the present application. As shown in FIG. 1, a remote controller can be connected to a mobile terminal; in this case, the remote controller and the mobile terminal together can be regarded as the control terminal of the drone.
The mobile terminal may be any electronic device with a processing chip, such as a mobile phone, tablet, laptop, or smart glasses, and its connection to the remote controller may take various forms, for example a wireless connection such as Bluetooth or WiFi, or a connection through a physical data cable.
In a scenario where the control terminal includes a mobile terminal and a remote controller connected to each other, it may be the remote controller that establishes the connection with the drone. When the user wants to interact with the drone, there are several ways to do so. For example, the user can operate on the mobile terminal; the signal generated by the operation is transmitted to the remote controller, which forwards it to the wirelessly connected drone. Alternatively, the user can operate directly on the remote controller, in which case the signal generated by the operation is sent directly to the drone.
During the drone's flight, the user can watch the images captured by the drone in real time on the control terminal. Specifically, the drone carries a camera, and the images it captures in real time form a LiveView image (the captured images may be compressed to form the LiveView image, though in one implementation they may also be left unprocessed). The so-called LiveView image is a real-time framing image: the picture corresponding to the LiveView image is what the camera is capturing in real time. After obtaining the LiveView image, the drone can send it to the control terminal through a wireless transmission module, so that the user can watch the picture captured by the drone in real time on the control terminal.
The transmission of the LiveView image is based on the wireless connection between the control terminal and the drone, but that connection will inevitably be interrupted for a variety of reasons. For example, the drone may fly into an area with no signal, collide with an obstacle and crash, or run too low on battery to maintain the wireless connection. Any of these situations will break the wireless connection between the drone and the control terminal. Once the connection is broken, the user can no longer learn the drone's position from the LiveView image, and if the drone has already flown out of the user's field of view, the user will have difficulty finding it.
To solve the above problem, an embodiment of the present application provides an image processing method; see FIG. 2, a flowchart of an image processing method provided by an embodiment of the present application. The method can be applied to a control terminal; for an understanding of the control terminal, refer to the relevant description above, which is not repeated here. The method includes the following steps:
S201. Receive image frames sent by a controlled device, and store target image frames among the image frames to obtain an image frame set.
The controlled device may be an electronic device with motion and shooting capabilities, such as a drone, an unmanned vehicle, or an unmanned boat. The following description mostly uses a drone as an example, but the drone is only an example and should not be understood as a limitation on the controlled device.
The image frames received from the controlled device are a live view image captured by the controlled device, i.e., a LiveView image, which has been described above and is not repeated here.
S202. After disconnecting from the controlled device, read from the image frame set the target image frames at moments adjacent to the time when the control terminal was disconnected from the controlled device, for playback display.
In the video stream corresponding to the LiveView image, the pictures captured by the camera come in units of frames, so what the control terminal receives from the controlled device are image frames.
Since the original purpose of the LiveView image is only to let the user frame a shot in real time, i.e., to let the user know what picture would be recorded or stored if the shutter were pressed now, the LiveView image is not stored in the original design. However, the applicant found that after the drone is disconnected from the control terminal due to an event such as a collision or low battery, the user has no clues to refer to when looking for the drone and can only rely on memory from before the disconnection; in this situation, search efficiency is very low, and a novice user may even panic.
Based on this finding, the applicant proposes the present solution: while the connection between the control terminal and the controlled device is still up, the received image frames of the LiveView image sent by the controlled device can be stored, and the stored frames form an image frame set. The image frame set can be understood as a cached segment of video. In this way, after the connection with the controlled device is broken for some reason, the control terminal can read the image frames in the stored set for playback display, i.e., replay the LiveView image from before the disconnection, to help the user find the landing position or current position of the controlled device, improve search efficiency, and, for novice users, reduce anxiety.
If the moment at which the control terminal is disconnected from the controlled device is called the first moment, the image frames read from the image frame set may be frames from moments within a certain time range of the first moment; that is, the above-mentioned adjacent moments can be understood as moments within a certain time range of the first moment, not limited to one or two adjacent frames. For example, the time range may be a small duration, i.e., frames from moments close to the first moment, or, in other words, frames from moments adjacent to the first moment, such as the 5 minutes or the 30 seconds before the first moment. In this way, the replayed picture is closer in time to the present, which is more useful as a reference for finding the controlled device. Of course, if the target images cached by the control terminal form a short video segment (for example, 30 s), and this segment is updated according to the real-time image frames returned by the controlled device, so that the cached images are always the latest video returned by the drone, then after the disconnection the cached segment can simply be replayed directly, so that after communication between the control terminal and the controlled device is interrupted, the replayed video helps the user find the drone's position.
It should be noted that the adjacent moments mentioned in this application may be one adjacent moment, multiple adjacent and continuous moments, or, of course, multiple spaced moments.
Considering that storing the LiveView image occupies some of the control terminal's limited storage space, the stored image frame set can be constrained by a specified condition. There are several ways to impose the constraint. For example, in one implementation, the specified condition may be a first condition used to limit the amount of data corresponding to the image frame set, i.e., a limit on the data volume of the set. A data amount threshold, for example 50 MB, can be set; the data volume of the image frame set can then be constrained to be less than or equal to the threshold, or to lie within a definite range around it.
As another example, in one implementation, the specified condition may be a second condition used to limit the number of frames contained in the image frame set. In a video stream, the duration corresponding to one frame is usually fixed, so limiting the number of frames in the set can be understood as limiting the playback duration of the video it forms. In a concrete implementation, a frame number threshold can be set, for example 7200 frames (frame rate: 24 FPS, playback duration: 300 s); the number of frames in the set can then be constrained to be less than or equal to the threshold, or to lie within a definite range around it.
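The relation between the frame-number threshold and the playback duration in the example above (7200 frames at 24 FPS) works out as:

```python
def playback_duration_s(frame_count, fps):
    """Playback duration of the cached segment, assuming a fixed frame rate."""
    return frame_count / fps

# 7200 frames at 24 FPS give a 300 s (5 minute) replayable segment.
```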
For a currently received image frame, if storing it would cause the image frame set to violate the above specified condition (for example, the data volume would exceed the data amount threshold, or the frame count would exceed the frame number threshold), then in one implementation the image frame set can be updated according to the image frames received in real time, so that the storage space occupied by the set stays appropriate. Concretely, the image frame received in real time can replace the target image frame in the set whose corresponding time is earliest; in other words, the oldest stored frame is discarded and the newly received frame is stored, so that the replayed frames correspond to the latest pictures from before the disconnection.
For the user's search, the larger the time span of the LiveView image that can be replayed on the control terminal, the easier it is for the user to determine where the drone is. For example, in scheme A the user can replay the LiveView image from 1 minute ago up to the present disconnection, while in scheme B the user can replay it from 5 minutes ago up to the present disconnection. Obviously, scheme B provides the user with more flight information, so a user with scheme B can find the drone more easily than a user with scheme A.
However, the stored image frame set is subject to storage-space limits, and the larger the time span of the LiveView image, the more storage the set requires. For this reason, an embodiment of the present application provides an implementation in which the image frames received from the controlled device can be stored by frame extraction; that is, not all received frames need to be stored, and target image frames can be extracted from them for storage. In this way, the same time span needs less storage, and the same storage can hold a longer time span. Moreover, for the user's search, the time span of the replayed video matters more than its smoothness.
When performing the above frame extraction, frames need to be selected according to their properties (e.g., I-frames, P-frames); since this part is prior art, how to extract frames is not described in detail here.
It should be noted that in the above implementation the target image frames are extracted from the received image frames, but in some other implementations all received image frames may be target image frames and may all be stored.
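The frame extraction step (storing only selected target frames so that a longer time span fits in the same storage) can be sketched by keeping only independently decodable frames; treating I-frames as the only safe extraction points is the assumption here:

```python
def extract_target_frames(frames):
    """Keep only frames that can be decoded on their own (I-frames).

    P-frames reference earlier frames, so storing them without their reference
    frames would make the cached segment unplayable; `frames` is a list of
    (frame_type, payload) pairs.
    """
    return [(t, p) for t, p in frames if t == "I"]
```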
After the connection between the control terminal and the drone is broken, the user can be offered playback of the LiveView image. In the concrete interaction design, there are several feasible implementations. For example, a playback button can pop up after the disconnection for the user to tap; when the user inputs a playback instruction by tapping the button, reading and playing of the LiveView image is triggered. Alternatively, the playback function can be placed on the aircraft's function page; when the user inputs a playback instruction on that page, reading and playing of the LiveView image is triggered.
To let the user find the drone (controlled device) faster, in one implementation, while the connection between the control terminal and the drone is still up, the drone's status information can be transmitted in addition to the image frames of the LiveView image. After receiving the image frames and the status information sent by the drone, the control terminal can associate the image frames with the status information corresponding in time; then, when the user replays the LiveView image, the status information can be displayed together with its associated image frames to help the user search.
There are several ways to associate the image frames with the status information corresponding in time. For example, the temporally corresponding image frames and status information can be stored in association; alternatively, a configuration file can record the association between different image frames and status information, and during LiveView playback the corresponding image frames and status information can be read according to the association information recorded in the file and displayed together.
There are several ways to display an image frame together with status information. For example, in one implementation, the status information can be overlaid on the display interface of the image frame. In a concrete implementation, a display switch can also be provided: when the user turns it on (inputs a display instruction), the status information is overlaid on the image frame's display interface; when the user turns it off, the status information is hidden. In another implementation, the status information can have its own display interface; when the user wants to consult the status information while replaying the LiveView image, the user can input a switching instruction to switch the current image frame display interface to the status information display interface.
The drone's status information may include various information, such as height, position, movement direction, attitude, speed, battery level, and distance information to surrounding objects. This information can be determined by various sensors or other hardware modules on the drone; for example, position information can be determined by the drone's GPS positioning module, attitude information by the drone's inertial measurement unit (IMU), and height and distance information by the drone's perception module.
The perception module may include a TOF (Time of Flight) camera, which can capture depth images from which the distance to the photographed object can be computed. The photographed object may be any surrounding object, such as a building, in which case the computed distance is the distance to that building; it may also be the ground, in which case the computed distance is the height. In one implementation, a perception data map obtained by processing the depth images captured by the perception module can also be transmitted to the control terminal, so that the user can optionally consult it during LiveView playback.
Depending on the type of status information, different display areas can be assigned: for example, position, attitude, speed, and battery level can be overlaid in a corner of the display interface, such as the lower right or upper right, while the movement direction can be displayed at the top of the interface. The present application does not limit where status information is displayed.
Moreover, the display form of different status information can also be defined as needed: for example, position information can be displayed in the form of a GPS map, the movement direction in the form of a virtual arrow, and the distance information can be fused with the image frame for display. The distance information is the distance to surrounding objects, for example 100 m to building A and 150 m to building B. When displayed, the distance information can be shown next to the corresponding object, for example 100 m next to building A and 150 m next to building B in the image frame's display interface.
Considering that the drone's flight environment is complex and changeable, its connection with the control terminal will inevitably break, but in many cases the break is restored within a very short time (for example 1 second), so it is not appropriate to replay the LiveView image on every disconnection. In one implementation, a third condition can be set to constrain the disconnection time. Specifically, the third condition can be set based on a duration threshold, for example that the duration of the disconnection is greater than or equal to the threshold. Then, after the control terminal is disconnected from the controlled device, it can further judge whether the disconnection duration satisfies the third condition, and only if it does, read the image frames of the LiveView image for display.
In one implementation, the above storage of the LiveView image frames may be performed only under certain conditions; the control terminal does not necessarily have to cache a segment of the LiveView image at all times. For example, the control terminal may start storing the received LiveView image frames only when the controlled device, because of low battery, performs an automatic return (low-battery return). As another example, the control terminal may start storing the LiveView image frames only when the controlled device experiences an anomaly or some other event.
Although there are many reasons for a disconnection between the drone and the control terminal, some disconnections are foreseeable. For these foreseeable cases, in one implementation, the drone can be made to perform surround shooting of its surroundings before the disconnection, to record its location more comprehensively; then, after the disconnection, the user can replay the surround-shot LiveView image on the control terminal and thus find the drone more quickly.
These foreseeable disconnection cases can be called specified events; that is, when a specified event occurs, the drone can perform the above surround shooting of its surroundings. The specified event may be any event from which a disconnection can be foreseen. For example, it may be that the controlled device is in a low-battery state, specifically that the drone's battery level is less than or equal to a battery threshold. When the low-battery event occurs, the drone can foresee that its connection with the control terminal is about to break, and can perform surround shooting of its surroundings before the connection breaks, informing the control terminal of its location in the form of the LiveView image.
As another example, the specified event may be a collision between the controlled device and an obstacle. When the drone collides with an obstacle, there are many cases in which the drone can foresee a disconnection. For instance, when the drone is forced to descend (fall) because of the collision, it can be foreseen that it will have difficulty maintaining the connection with the control terminal after a second collision with the ground, so the drone can perform surround shooting of its surroundings before the connection breaks.
As yet another example, the specified event may be that the connection between the controlled device and the control terminal is unstable. For example, during flight the connection between the drone and the control terminal becomes less and less stable, with brief disconnections occurring more and more frequently. When the disconnection frequency exceeds a frequency threshold, the drone can be made to perform surround shooting of its surroundings, i.e., circular shooting of the environment at its location, to prevent the drone from being hard to find after the connection with the control terminal breaks completely.
It should be noted that there are several ways for the drone to implement surround shooting of its surroundings. For example, the drone can be equipped with a gimbal, and surround shooting can be achieved by controlling the camera through the gimbal; alternatively, the drone can rotate its body by adjusting its attitude, thereby achieving surround shooting with the camera.
Considering that the drone is usually at a relatively high altitude, in order to capture images that reflect its location (otherwise it might capture a circle of sky), the drone can shoot around its current position with an obliquely downward viewing angle. Further, considering that the above specified events are usually accompanied by the drone's descent, in one implementation, while the drone shoots with an obliquely downward viewing angle, that angle can gradually be raised as the drone's height decreases.
The scene of the drone shooting its surroundings can be described in terms of attitude angles. While the drone shoots around its surroundings, as its height decreases, in one implementation the drone can adjust the camera's pitch through the gimbal, changing the shooting angle in the pitch direction; of course, in another implementation the drone can directly adjust the pitch of the airframe, thereby driving the camera's shooting angle to change in the pitch direction. Meanwhile, while adjusting the camera's pitch axis through the gimbal, the drone can also adjust the camera's yaw through the gimbal, making the camera rotate in a circle in the horizontal plane and thus achieving surround shooting of the surrounding environment; in another implementation, the drone can adjust the yaw of the airframe, driving the camera's shooting angle in a horizontal circular rotation; in yet another implementation, the drone can adjust the gimbal's pitch angle and the airframe's roll, making the drone turn left or right, which can also point the camera obliquely downward to record the surrounding environment. See FIG. 3, a schematic diagram of a scene of a drone performing surround shooting provided by an embodiment of the present application.
It should be noted that when the drone performs surround shooting of its surroundings, it is not limited to shooting exactly one circle; in some scenarios, the drone may shoot half a circle (180 degrees), three quarters of a circle (270 degrees), two circles, or any other number of circles, which the present application does not limit.
From the earlier description of the control terminal, in one implementation the control terminal may include a mobile terminal and a remote controller, where the mobile terminal is connected to the remote controller, and the remote controller is wirelessly connected to the controlled device. The image frames of the LiveView image sent by the controlled device can be received by the remote controller, which forwards them to the mobile terminal, and the mobile terminal stores them.
From the earlier description of the mobile terminal, the mobile terminal may be smart glasses. When the mobile terminal is smart glasses, the glasses can be connected to the drone directly, or through the remote controller; they can receive the image frames of the LiveView image captured by the drone and cache them in the glasses' memory. After the drone suffers an accident such as a crash and the connection with the drone is broken, the smart glasses can read the stored image frames of the LiveView image and replay them to the user, so that the user can find the drone quickly. The above is the description of the image processing method provided by the embodiments of the present application. With this method, the control terminal can store the image frames of the live view image, so that after the connection with the controlled device is broken, the stored live view image frames can be read and displayed, allowing the user to search for the drone based on the live view image from before the disconnection, which improves search efficiency.
See FIG. 4 below, a flowchart of another image processing method provided by an embodiment of the present application. This method is applied to a controlled device; for the controlled device, refer to the relevant description above. The method includes:
S401. Acquire image frames of a captured live view image, the image frames including frames obtained by the controlled device performing surround shooting of its surroundings when a specified event occurs.
S402. Send the image frames to a control terminal, so that the control terminal stores target image frames among the image frames to obtain an image frame set and, after disconnecting from the controlled device, reads from the image frame set the target image frames at moments adjacent to the time when the control terminal was disconnected from the controlled device, for playback display.
It should be noted that the specified event is not necessarily an abnormal event of the controlled device; it may also be a normal event such as the controlled device operating normally.
Optionally, the specified event includes the battery level of the controlled device being less than or equal to a battery threshold.
Optionally, the specified event includes a collision between the controlled device and an obstacle.
Optionally, the specified event includes the connection between the controlled device and the control terminal being unstable.
Optionally, the controlled device includes a drone.
Optionally, the surround shooting of the surroundings includes:
shooting around the current position with an obliquely downward viewing angle.
Optionally, the obliquely downward viewing angle is raised as the height of the controlled device decreases.
Optionally, the method further includes:
acquiring current status information and sending the status information to the control terminal, so that the control terminal establishes an association between the status information and the target image frames at the corresponding time and displays the status information together with its associated target image frames.
Optionally, the status information includes one or more of the following: height, position, movement direction, attitude, speed, battery level, and distance information to surrounding objects.
Optionally, the position is determined by GPS positioning.
Optionally, the distance information is computed from a depth image obtained by photographing surrounding objects.
Optionally, the control terminal includes a mobile terminal and/or a remote controller.
Optionally, the controlled device is wirelessly connected to the remote controller, and the mobile terminal is connected to the remote controller;
the image frames are sent by the controlled device to the remote controller, so that the remote controller transmits the image frames to the mobile terminal for storage.
Optionally, the image frame set is used to help the user find the landing position of the controlled device. With the image processing method provided by this embodiment, the controlled device can send the image frames of the live view image to the control terminal, so that the control terminal can store them and, after the connection with the controlled device is broken, read the stored image frames for display, helping the user search for the controlled device. Moreover, the controlled device can perform surround shooting of its surroundings when a specified event occurs, so that the live view image stored by the control terminal can indicate the location and environment of the controlled device, further improving the user's efficiency in finding the controlled device.
The above is the description of another image processing method provided by the embodiments of the present application. See FIG. 5 below, a schematic structural diagram of a mobile terminal provided by an embodiment of the present application. The mobile terminal can be connected to a controlled device.
The mobile terminal includes a processor 510 and a memory 520 storing a computer program;
when executing the computer program, the processor implements the following steps:
receiving image frames sent by the controlled device, and storing target image frames among the image frames to obtain an image frame set, where the image frames are a live view image captured by the controlled device;
after disconnecting from the controlled device, reading from the image frame set the target image frames at moments adjacent to the time when the control terminal was disconnected from the controlled device, for playback display.
可选的,所述处理器还用于,当所述图像帧集合不满足指定条件时,根据实时接收到的所述图像帧对所述图像帧集合进行更新。
可选的,所述处理器在执行所述根据实时接收到的所述图像帧对所述图像帧集合进行更新时具体用于,用实时接收到的所述图像帧替换所述图像帧集合中的对应时间最早的所述目标图像帧。
可选的,所述指定条件包括用于限制所述图像帧集合对应的数据量的第一条件。
可选的,所述第一条件包括所述图像帧集合对应的数据量小于或等于数据量阈值。
可选的,所述指定条件包括用于限制所述图像帧集合所包含的帧数的第二条件。
可选的,所述第二条件包括所述图像帧集合中所包含的帧数小于或等于帧数阈值。
Optionally, the processor is further configured to receive status information sent by the controlled device and establish an association between the status information and the target image frames of the corresponding time, where the status information is used for display together with its associated target image frames.
Optionally, when establishing the association between the status information and the target image frames of the corresponding time, the processor is specifically configured to store the status information in association with the target image frames of the corresponding time.
Optionally, when displaying the status information together with its associated target image frames, the processor is specifically configured to display the status information in a designated region of the display interface of the target image frames.
Optionally, the display of the status information is triggered by a display instruction input by the user.
Optionally, when displaying the status information together with its associated target image frames, the processor is specifically configured to switch, according to a switching instruction input by the user, from the display interface of the target image frames to the display interface of the status information.
Optionally, the status information includes one or more of the following: altitude, position, direction of motion, attitude, speed, battery level, and distance information to surrounding objects.
Optionally, the position is displayed on the display interface of the target image frames in the form of a GPS map.
Optionally, the direction of motion is displayed on the display interface of the target image frames in the form of a virtual arrow.
Optionally, the distance information is displayed on the object corresponding to the distance information in the display interface of the target image frames.
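One simple way to establish the time association described above is nearest-timestamp matching; the dictionary-based records below are illustrative assumptions, since the application does not prescribe a data layout:

```python
def associate_status(frames, statuses):
    """Sketch: associate each status record (altitude, position, etc.)
    with the target image frame of the corresponding time by picking the
    frame whose timestamp is closest to the status timestamp, so that
    each frame can later be displayed together with its status."""
    associations = {}
    for status in statuses:
        nearest = min(frames, key=lambda f: abs(f["t"] - status["t"]))
        associations.setdefault(nearest["t"], []).append(status)
    return associations
```

During playback, the display interface would look up the current frame's timestamp in this mapping to overlay the GPS map, virtual arrow, or distance labels.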
Optionally, the image frames are obtained by the controlled device performing surround shooting of its surroundings when a specified event occurs.
Optionally, the specified event includes the battery level of the controlled device being less than or equal to a battery threshold.
Optionally, the specified event includes the controlled device colliding with an obstacle.
Optionally, the specified event includes the connection between the controlled device and the remote controller being unstable.
Optionally, the controlled device includes a UAV.
Optionally, the surround shooting of the surroundings includes:
shooting around the current position from an obliquely downward viewing angle.
Optionally, the obliquely downward viewing angle is raised as the altitude of the controlled device decreases.
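One way to realize "the obliquely downward angle is raised as the altitude decreases" is a simple linear mapping from altitude to gimbal pitch; the specific bounds below are illustrative assumptions, not values from this application:

```python
def surround_pitch_deg(altitude_m, max_alt=30.0, low_pitch=-60.0, high_pitch=-10.0):
    """Map altitude to an obliquely downward gimbal pitch in degrees
    (negative = pointing down).  At max_alt and above the camera looks
    steeply down (low_pitch); as the aircraft descends, the view is
    raised toward high_pitch so the nearby surroundings stay in frame."""
    ratio = min(max(altitude_m / max_alt, 0.0), 1.0)
    return high_pitch + ratio * (low_pitch - high_pitch)
```

Raising the view at low altitude keeps landmarks around the landing point visible, rather than filling the frame with the ground directly beneath the aircraft.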
Optionally, the target image frames are extracted from the image frames.
Optionally, when storing the target image frames among the image frames, the processor is specifically configured to store the target image frames when the controlled device is returning on low battery.
Optionally, the processor is further configured to determine, before reading the target image frames for display, that the duration of disconnection from the controlled device satisfies a third condition, the third condition being set based on a duration threshold.
Optionally, the third condition includes the duration of disconnection being greater than or equal to the duration threshold.
Optionally, reading the target image frames for display is triggered by a playback instruction input by the user.
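The third condition above can be sketched as a small gate that offers playback only once the link has been down long enough; the function and parameter names are illustrative assumptions:

```python
import time

def should_offer_playback(disconnect_time, duration_threshold_s=5.0, now=None):
    """Third condition: playback of the stored target frames is offered
    only after the connection to the controlled device has been lost for
    at least the duration threshold, so brief link drops that recover on
    their own do not trigger the find-my-aircraft playback."""
    if disconnect_time is None:      # still connected
        return False
    now = time.time() if now is None else now
    return (now - disconnect_time) >= duration_threshold_s
```

When this returns true, the mobile terminal would read the frames adjacent to the disconnection moment from the image frame set and play them back.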
Optionally, the mobile terminal is connected to the controlled device through a remote controller.
Optionally, the image frame set is used to assist the user in finding the landing position of the controlled device.
The mobile terminal provided by this embodiment can store the image frames of the live-view image captured by the controlled device, so that after the connection to the controlled device is lost it can read and display the stored live-view frames, allowing the user to search for the controlled device based on the live-view images captured before the disconnection and improving search efficiency.
The above describes a mobile terminal provided by the embodiments of this application. Referring now to FIG. 6, FIG. 6 is a schematic structural diagram of an electronic device provided by an embodiment of this application. The electronic device is controlled by a control end connected to the electronic device.
The electronic device includes a camera 610, a processor 620, and a memory 630 storing a computer program;
the processor implements the following steps when executing the computer program:
obtaining image frames of a live-view image captured by the camera, the image frames being obtained by the electronic device controlling the camera to perform surround shooting of its surroundings when a specified event occurs;
sending the image frames to the control end, so that the control end stores target image frames among the image frames to obtain an image frame set and, after disconnecting from the electronic device, reads from the image frame set the target image frames whose timestamps are adjacent to the moment at which the control end disconnected from the electronic device, for playback display.
Optionally, the specified event includes the battery level of the electronic device being less than or equal to a battery threshold.
Optionally, the specified event includes the electronic device colliding with an obstacle.
Optionally, the specified event includes the connection between the electronic device and the remote controller being unstable.
Optionally, the electronic device includes a UAV.
Optionally, when controlling the camera to perform surround shooting of the surroundings, the processor is specifically configured to control the camera to shoot around the current position from an obliquely downward viewing angle.
Optionally, the obliquely downward viewing angle is raised as the altitude of the electronic device decreases.
Optionally, the processor is further configured to obtain current status information and send the status information to the remote controller, so that after the mobile terminal receives the status information from the remote controller, it establishes an association between the status information and the target image frames of the corresponding time and displays the status information together with its associated target image frames.
Optionally, the status information includes one or more of the following: altitude, position, direction of motion, attitude, speed, battery level, and distance information to surrounding objects.
Optionally, the position is determined through GPS positioning.
Optionally, the electronic device further includes a TOF camera;
the distance information is calculated from depth images, the depth images being obtained by photographing surrounding objects with the TOF camera.
Optionally, the electronic device further includes a gimbal;
the image frames are obtained by the electronic device controlling the camera through the gimbal to perform the surround shooting.
Optionally, the control end includes a mobile terminal and/or a remote controller.
Optionally, the electronic device is connected to the mobile terminal through the remote controller;
the image frames are sent to the remote controller and forwarded by the remote controller to the mobile terminal, and the target image frames are stored by the mobile terminal.
Optionally, the image frame set is used to assist the user in finding the landing position of the electronic device.
The electronic device provided by this embodiment can send the image frames of the live-view image to the control end so that, after obtaining them, the control end can store them; then, after the connection to the electronic device is lost, the control end can read and display the stored image frames to assist the user in searching for the electronic device. Moreover, the electronic device can also perform surround shooting of its surroundings when a specified event occurs, so that the live-view images stored by the control end indicate the position and environment of the electronic device, further improving the efficiency with which the user finds the electronic device.
The above describes an electronic device provided by the embodiments of this application. Referring now to FIG. 7, FIG. 7 is a schematic structural diagram of a control system provided by an embodiment of this application. The control system includes a control end 710 and a controlled device 720.
The controlled device 720 is configured to send captured image frames of a live-view image to the control end.
The control end 710 is configured to receive the image frames and store target image frames among the image frames to obtain an image frame set, and, after disconnecting from the controlled device, to read from the image frame set the target image frames whose timestamps are adjacent to the moment at which the control end disconnected from the controlled device, for playback display.
Optionally, the control end is further configured to update the image frame set according to image frames received in real time when the image frame set does not satisfy a specified condition.
Optionally, when updating the image frame set according to image frames received in real time, the control end is specifically configured to replace the earliest target image frame in the image frame set with the image frames received in real time.
Optionally, the specified condition includes a first condition for limiting the amount of data corresponding to the image frame set.
Optionally, the first condition includes the amount of data corresponding to the image frame set being less than or equal to a data amount threshold.
Optionally, the specified condition includes a second condition for limiting the number of frames contained in the image frame set.
Optionally, the second condition includes the number of frames contained in the image frame set being less than or equal to a frame count threshold.
Optionally, the controlled device is further configured to obtain its own status information and send the status information to the control end;
the control end is further configured to receive the status information and establish an association between the status information and the target image frames of the corresponding time, where the status information is used for display together with its associated target image frames.
Optionally, when establishing the association between the status information and the target image frames of the corresponding time, the control end is specifically configured to store the status information in association with the target image frames of the corresponding time.
Optionally, when displaying the status information together with its associated target image frames, the control end is specifically configured to display the status information in a designated region of the display interface of the target image frames.
Optionally, the display of the status information is triggered by a display instruction input by the user.
Optionally, when displaying the status information together with its associated target image frames, the control end is specifically configured to switch, according to a switching instruction input by the user, from the display interface of the target image frames to the display interface of the status information.
Optionally, the status information includes one or more of the following: altitude, position, direction of motion, attitude, speed, battery level, and distance information to surrounding objects.
Optionally, the position is determined by the controlled device through GPS positioning, and the position is displayed on the display interface of the target image frames in the form of a GPS map.
Optionally, the direction of motion is displayed on the display interface of the target image frames in the form of a virtual arrow.
Optionally, the distance information is calculated by the controlled device from depth images obtained by photographing surrounding objects, and the distance information is displayed on the object corresponding to the distance information in the display interface of the target image frames.
Optionally, the image frames are obtained by the controlled device performing surround shooting of its surroundings when a specified event occurs.
Optionally, the specified event includes the battery level of the controlled device being less than or equal to a battery threshold.
Optionally, the specified event includes the controlled device colliding with an obstacle.
Optionally, the specified event includes the connection between the controlled device and the control end being unstable.
Optionally, the controlled device includes a UAV.
Optionally, in performing the surround shooting of its surroundings, the controlled device is specifically configured to shoot around the current position from an obliquely downward viewing angle.
Optionally, the obliquely downward viewing angle is raised as the altitude of the controlled device decreases.
Optionally, the target image frames are extracted from the image frames.
Optionally, when storing the target image frames among the image frames, the control end is specifically configured to store the target image frames when the controlled device is returning on low battery.
Optionally, the control end is further configured to determine, before reading the target image frames for display, that the duration of disconnection from the controlled device satisfies a third condition, the third condition being set based on a duration threshold.
Optionally, the third condition includes the duration of disconnection being greater than or equal to the duration threshold.
Optionally, reading the target image frames for display is triggered by a playback instruction input by the user.
Optionally, the control end includes a mobile terminal and/or a remote controller.
Optionally, the mobile terminal is connected to the remote controller, and the remote controller is wirelessly connected to the controlled device;
the image frames are obtained by the remote controller from the controlled device and then sent to the mobile terminal, and the target image frames are stored by the mobile terminal.
Optionally, the image frame set is used to assist the user in finding the landing position of the controlled device.
In the control system provided by the embodiments of this application, the control end can store the image frames of the live-view image captured by the controlled device, so that after the connection to the controlled device is lost it can read and display the stored live-view frames, allowing the user to search for the UAV based on the live-view images captured before the disconnection and improving the efficiency of recovering the aircraft.
An embodiment of this application further provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, it implements any of the image processing methods applied to a control end provided by this application.
An embodiment of this application further provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, it implements any of the image processing methods applied to a controlled device provided by this application.
The above embodiments provide multiple implementations for each step. As to which implementation a given step adopts, provided there is no conflict or contradiction, a person skilled in the art may freely select or combine them according to the actual situation, thereby constituting various different embodiments. Due to space limitations, this document does not elaborate on each of these embodiments, but it should be understood that they also fall within the scope disclosed by the embodiments of this application.
The embodiments of this application may take the form of a computer program product implemented on one or more storage media containing program code (including, but not limited to, magnetic disk storage, CD-ROM, and optical storage). Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to: phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. The terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
The method and device provided by the embodiments of this application have been described in detail above. Specific examples are used herein to explain the principles and implementations of this application, and the description of the above embodiments is only intended to help understand the method of this application and its core idea. Meanwhile, a person of ordinary skill in the art may make changes to the specific implementations and the scope of application based on the idea of this application. In summary, the content of this specification should not be construed as limiting this application.

Claims (123)

  1. An image processing method, applied to a control end, the method comprising:
    receiving image frames sent by a controlled device, and storing target image frames among the image frames to obtain an image frame set, wherein the image frames are a live-view image captured by the controlled device;
    after disconnecting from the controlled device, reading from the image frame set the target image frames whose timestamps are adjacent to the moment at which the control end disconnected from the controlled device, for playback display.
  2. The image processing method according to claim 1, further comprising:
    when the image frame set does not satisfy a specified condition, updating the image frame set according to image frames received in real time.
  3. The image processing method according to claim 2, wherein updating the image frame set according to image frames received in real time comprises:
    replacing the earliest target image frame in the image frame set with the image frames received in real time.
  4. The image processing method according to claim 2, wherein the specified condition comprises a first condition for limiting the amount of data corresponding to the image frame set.
  5. The image processing method according to claim 4, wherein the first condition comprises the amount of data corresponding to the image frame set being less than or equal to a data amount threshold.
  6. The image processing method according to claim 2, wherein the specified condition comprises a second condition for limiting the number of frames contained in the image frame set.
  7. The image processing method according to claim 6, wherein the second condition comprises the number of frames contained in the image frame set being less than or equal to a frame count threshold.
  8. The image processing method according to claim 1, further comprising:
    receiving status information sent by the controlled device, and establishing an association between the status information and the target image frames of the corresponding time, wherein the status information is used for display together with its associated target image frames.
  9. The image processing method according to claim 8, wherein establishing the association between the status information and the target image frames of the corresponding time comprises:
    storing the status information in association with the target image frames of the corresponding time.
  10. The image processing method according to claim 8, wherein displaying the status information together with its associated target image frames comprises:
    displaying the status information in a designated region of the display interface of the target image frames.
  11. The image processing method according to claim 10, wherein the display of the status information is triggered by a display instruction input by a user.
  12. The image processing method according to claim 8, wherein displaying the status information together with its associated target image frames comprises:
    switching, according to a switching instruction input by a user, from the display interface of the target image frames to a display interface of the status information.
  13. The image processing method according to claim 8, wherein the status information comprises one or more of the following: altitude, position, direction of motion, attitude, speed, battery level, and distance information to surrounding objects.
  14. The image processing method according to claim 13, wherein the position is displayed on the display interface of the target image frames in the form of a GPS map.
  15. The image processing method according to claim 13, wherein the direction of motion is displayed on the display interface of the target image frames in the form of a virtual arrow.
  16. The image processing method according to claim 13, wherein the distance information is displayed on the object corresponding to the distance information in the display interface of the target image frames.
  17. The image processing method according to any one of claims 1-16, wherein the image frames are obtained by the controlled device performing surround shooting of its surroundings when a specified event occurs.
  18. The image processing method according to claim 17, wherein the specified event comprises the battery level of the controlled device being less than or equal to a battery threshold.
  19. The image processing method according to claim 17, wherein the specified event comprises the controlled device colliding with an obstacle.
  20. The image processing method according to claim 17, wherein the specified event comprises the connection between the controlled device and the control end being unstable.
  21. The image processing method according to claim 17, wherein the controlled device comprises a UAV.
  22. The image processing method according to claim 21, wherein the surround shooting of the surroundings comprises:
    shooting around the current position from an obliquely downward viewing angle.
  23. The image processing method according to claim 22, wherein the obliquely downward viewing angle is raised as the altitude of the controlled device decreases.
  24. The image processing method according to claim 1, wherein the target image frames are extracted from the image frames.
  25. The image processing method according to any one of claims 1 to 24, wherein storing the target image frames among the image frames comprises:
    storing the target image frames among the image frames when the controlled device is returning on low battery.
  26. The image processing method according to claim 1, wherein, before reading the target image frames for display, the method further comprises:
    determining that the duration of disconnection from the controlled device satisfies a third condition, the third condition being set based on a duration threshold.
  27. The image processing method according to claim 26, wherein the third condition comprises the duration of disconnection being greater than or equal to the duration threshold.
  28. The image processing method according to claim 1, wherein reading the target image frames for playback display is triggered by a playback instruction input by a user.
  29. The image processing method according to claim 1, wherein the control end comprises a mobile terminal and/or a remote controller.
  30. The image processing method according to claim 29, wherein the mobile terminal is connected to the remote controller, and the remote controller is wirelessly connected to the controlled device;
    the image frames are obtained by the remote controller from the controlled device and then sent to the mobile terminal, and the target image frames are stored by the mobile terminal.
  31. The image processing method according to claim 1, wherein the image frame set is used to assist a user in finding the landing position of the controlled device.
  32. An image processing method, applied to a controlled device, comprising:
    obtaining captured image frames of a live-view image, the image frames including frames obtained by the controlled device performing surround shooting of its surroundings when a specified event occurs;
    sending the image frames to a control end, so that the control end stores target image frames among the image frames to obtain an image frame set and, after disconnecting from the controlled device, reads from the image frame set the target image frames whose timestamps are adjacent to the moment at which the control end disconnected from the controlled device, for playback display.
  33. The image processing method according to claim 32, wherein the specified event comprises the battery level of the controlled device being less than or equal to a battery threshold.
  34. The image processing method according to claim 32, wherein the specified event comprises the controlled device colliding with an obstacle.
  35. The image processing method according to claim 32, wherein the specified event comprises the connection between the controlled device and the control end being unstable.
  36. The image processing method according to claim 32, wherein the controlled device comprises a UAV.
  37. The image processing method according to claim 36, wherein the surround shooting of the surroundings comprises:
    shooting around the current position from an obliquely downward viewing angle.
  38. The image processing method according to claim 37, wherein the obliquely downward viewing angle is raised as the altitude of the controlled device decreases.
  39. The image processing method according to claim 32, further comprising:
    obtaining current status information and sending the status information to the control end, so that the control end establishes an association between the status information and the target image frames of the corresponding time and displays the status information together with its associated target image frames.
  40. The image processing method according to claim 39, wherein the status information comprises one or more of the following: altitude, position, direction of motion, attitude, speed, battery level, and distance information to surrounding objects.
  41. The image processing method according to claim 40, wherein the position is determined through GPS positioning.
  42. The image processing method according to claim 40, wherein the distance information is calculated from depth images obtained by photographing surrounding objects.
  43. The image processing method according to claim 32, wherein the control end comprises a mobile terminal and/or a remote controller.
  44. The image processing method according to claim 43, wherein the controlled device is wirelessly connected to the remote controller, and the mobile terminal is connected to the remote controller;
    the image frames are sent by the controlled device to the remote controller, so that the remote controller transmits the image frames to the mobile terminal for storage.
  45. The image processing method according to claim 32, wherein the image frame set is used to assist a user in finding the landing position of the controlled device.
  46. A mobile terminal, wherein the mobile terminal is connected to a controlled device;
    the mobile terminal comprises a processor and a memory storing a computer program;
    the processor implements the following steps when executing the computer program:
    receiving image frames sent by the controlled device, and storing target image frames among the image frames to obtain an image frame set, wherein the image frames are a live-view image captured by the controlled device;
    after disconnecting from the controlled device, reading from the image frame set the target image frames whose timestamps are adjacent to the moment at which the mobile terminal disconnected from the controlled device, for playback display.
  47. The mobile terminal according to claim 46, wherein the processor is further configured to update the image frame set according to image frames received in real time when the image frame set does not satisfy a specified condition.
  48. The mobile terminal according to claim 47, wherein, when updating the image frame set according to image frames received in real time, the processor is specifically configured to replace the earliest target image frame in the image frame set with the image frames received in real time.
  49. The mobile terminal according to claim 47, wherein the specified condition comprises a first condition for limiting the amount of data corresponding to the image frame set.
  50. The mobile terminal according to claim 49, wherein the first condition comprises the amount of data corresponding to the image frame set being less than or equal to a data amount threshold.
  51. The mobile terminal according to claim 47, wherein the specified condition comprises a second condition for limiting the number of frames contained in the image frame set.
  52. The mobile terminal according to claim 51, wherein the second condition comprises the number of frames contained in the image frame set being less than or equal to a frame count threshold.
  53. The mobile terminal according to claim 46, wherein the processor is further configured to receive status information sent by the controlled device and establish an association between the status information and the target image frames of the corresponding time, wherein the status information is used for display together with its associated target image frames.
  54. The mobile terminal according to claim 53, wherein, when establishing the association between the status information and the target image frames of the corresponding time, the processor is specifically configured to store the status information in association with the target image frames of the corresponding time.
  55. The mobile terminal according to claim 53, wherein, when displaying the status information together with its associated target image frames, the processor is specifically configured to display the status information in a designated region of the display interface of the target image frames.
  56. The mobile terminal according to claim 55, wherein the display of the status information is triggered by a display instruction input by a user.
  57. The mobile terminal according to claim 53, wherein, when displaying the status information together with its associated target image frames, the processor is specifically configured to switch, according to a switching instruction input by a user, from the display interface of the target image frames to a display interface of the status information.
  58. The mobile terminal according to claim 53, wherein the status information comprises one or more of the following: altitude, position, direction of motion, attitude, speed, battery level, and distance information to surrounding objects.
  59. The mobile terminal according to claim 58, wherein the position is displayed on the display interface of the target image frames in the form of a GPS map.
  60. The mobile terminal according to claim 58, wherein the direction of motion is displayed on the display interface of the target image frames in the form of a virtual arrow.
  61. The mobile terminal according to claim 58, wherein the distance information is displayed on the object corresponding to the distance information in the display interface of the target image frames.
  62. The mobile terminal according to any one of claims 46-61, wherein the image frames are obtained by the controlled device performing surround shooting of its surroundings when a specified event occurs.
  63. The mobile terminal according to claim 62, wherein the specified event comprises the battery level of the controlled device being less than or equal to a battery threshold.
  64. The mobile terminal according to claim 62, wherein the specified event comprises the controlled device colliding with an obstacle.
  65. The mobile terminal according to claim 62, wherein the specified event comprises the connection between the controlled device and the remote controller being unstable.
  66. The mobile terminal according to claim 62, wherein the controlled device comprises a UAV.
  67. The mobile terminal according to claim 66, wherein the surround shooting of the surroundings comprises:
    shooting around the current position from an obliquely downward viewing angle.
  68. The mobile terminal according to claim 67, wherein the obliquely downward viewing angle is raised as the altitude of the controlled device decreases.
  69. The mobile terminal according to claim 46, wherein the target image frames are extracted from the image frames.
  70. The mobile terminal according to any one of claims 46 to 69, wherein, when storing the target image frames among the image frames, the processor is specifically configured to store the target image frames among the image frames when the controlled device is returning on low battery.
  71. The mobile terminal according to claim 46, wherein the processor is further configured to determine, before reading the target image frames for display, that the duration of disconnection from the controlled device satisfies a third condition, the third condition being set based on a duration threshold.
  72. The mobile terminal according to claim 71, wherein the third condition comprises the duration of disconnection being greater than or equal to the duration threshold.
  73. The mobile terminal according to claim 46, wherein reading the target image frames for playback display is triggered by a playback instruction input by a user.
  74. The mobile terminal according to claim 46, wherein the mobile terminal is connected to the controlled device through a remote controller.
  75. The mobile terminal according to claim 46, wherein the image frame set is used to assist a user in finding the landing position of the controlled device.
  76. An electronic device, wherein the electronic device is controlled by a control end connected to the electronic device;
    the electronic device comprises a camera, a processor, and a memory storing a computer program;
    the processor implements the following steps when executing the computer program:
    obtaining image frames of a live-view image captured by the camera, the image frames being obtained by the electronic device controlling the camera to perform surround shooting of its surroundings when a specified event occurs;
    sending the image frames to the control end, so that the control end stores target image frames among the image frames to obtain an image frame set and, after disconnecting from the electronic device, reads from the image frame set the target image frames whose timestamps are adjacent to the moment at which the control end disconnected from the electronic device, for playback display.
  77. The electronic device according to claim 76, wherein the specified event comprises the battery level of the electronic device being less than or equal to a battery threshold.
  78. The electronic device according to claim 76, wherein the specified event comprises the electronic device colliding with an obstacle.
  79. The electronic device according to claim 76, wherein the specified event comprises the connection between the electronic device and the remote controller being unstable.
  80. The electronic device according to claim 76, wherein the electronic device comprises a UAV.
  81. The electronic device according to claim 80, wherein, when controlling the camera to perform surround shooting of the surroundings, the processor is specifically configured to control the camera to shoot around the current position from an obliquely downward viewing angle.
  82. The electronic device according to claim 81, wherein the obliquely downward viewing angle is raised as the altitude of the electronic device decreases.
  83. The electronic device according to claim 76, wherein the processor is further configured to obtain current status information and send the status information to the remote controller, so that after the mobile terminal receives the status information from the remote controller, it establishes an association between the status information and the target image frames of the corresponding time and displays the status information together with its associated target image frames.
  84. The electronic device according to claim 83, wherein the status information comprises one or more of the following: altitude, position, direction of motion, attitude, speed, battery level, and distance information to surrounding objects.
  85. The electronic device according to claim 84, wherein the position is determined through GPS positioning.
  86. The electronic device according to claim 84, further comprising a TOF camera;
    the distance information is calculated from depth images, the depth images being obtained by photographing surrounding objects with the TOF camera.
  87. The electronic device according to claim 76, further comprising a gimbal;
    the image frames are obtained by the electronic device controlling the camera through the gimbal to perform the surround shooting.
  88. The electronic device according to claim 76, wherein the control end comprises a mobile terminal and/or a remote controller.
  89. The electronic device according to claim 88, wherein the electronic device is connected to the mobile terminal through the remote controller;
    the image frames are sent to the remote controller and forwarded by the remote controller to the mobile terminal, and the target image frames are stored by the mobile terminal.
  90. The electronic device according to claim 76, wherein the image frame set is used to assist a user in finding the landing position of the electronic device.
  91. A control system, comprising a control end and a controlled device;
    the controlled device is configured to send captured image frames of a live-view image to the control end;
    the control end is configured to receive the image frames and store target image frames among the image frames to obtain an image frame set, and, after disconnecting from the controlled device, to read from the image frame set the target image frames whose timestamps are adjacent to the moment at which the control end disconnected from the controlled device, for playback display.
  92. The control system according to claim 91, wherein the control end is further configured to update the image frame set according to image frames received in real time when the image frame set does not satisfy a specified condition.
  93. The control system according to claim 92, wherein, when updating the image frame set according to image frames received in real time, the control end is specifically configured to replace the earliest target image frame in the image frame set with the image frames received in real time.
  94. The control system according to claim 91, wherein the specified condition comprises a first condition for limiting the amount of data corresponding to the image frame set.
  95. The control system according to claim 94, wherein the first condition comprises the amount of data corresponding to the image frame set being less than or equal to a data amount threshold.
  96. The control system according to claim 92, wherein the specified condition comprises a second condition for limiting the number of frames contained in the image frame set.
  97. The control system according to claim 96, wherein the second condition comprises the number of frames contained in the image frame set being less than or equal to a frame count threshold.
  98. The control system according to claim 91, wherein the controlled device is further configured to obtain its own status information and send the status information to the control end;
    the control end is further configured to receive the status information and establish an association between the status information and the target image frames of the corresponding time, wherein the status information is used for display together with its associated target image frames.
  99. The control system according to claim 98, wherein, when establishing the association between the status information and the target image frames of the corresponding time, the control end is specifically configured to store the status information in association with the target image frames of the corresponding time.
  100. The control system according to claim 98, wherein, when displaying the status information together with its associated target image frames, the control end is specifically configured to display the status information in a designated region of the display interface of the target image frames.
  101. The control system according to claim 100, wherein the display of the status information is triggered by a display instruction input by a user.
  102. The control system according to claim 98, wherein, when displaying the status information together with its associated target image frames, the control end is specifically configured to switch, according to a switching instruction input by a user, from the display interface of the target image frames to a display interface of the status information.
  103. The control system according to claim 98, wherein the status information comprises one or more of the following: altitude, position, direction of motion, attitude, speed, battery level, and distance information to surrounding objects.
  104. The control system according to claim 103, wherein the position is determined by the controlled device through GPS positioning, and the position is displayed on the display interface of the target image frames in the form of a GPS map.
  105. The control system according to claim 103, wherein the direction of motion is displayed on the display interface of the target image frames in the form of a virtual arrow.
  106. The control system according to claim 103, wherein the distance information is calculated by the controlled device from depth images obtained by photographing surrounding objects, and the distance information is displayed on the object corresponding to the distance information in the display interface of the target image frames.
  107. The control system according to any one of claims 91-106, wherein the image frames are obtained by the controlled device performing surround shooting of its surroundings when a specified event occurs.
  108. The control system according to claim 107, wherein the specified event comprises the battery level of the controlled device being less than or equal to a battery threshold.
  109. The control system according to claim 107, wherein the specified event comprises the controlled device colliding with an obstacle.
  110. The control system according to claim 107, wherein the specified event comprises the connection between the controlled device and the control end being unstable.
  111. The control system according to claim 107, wherein the controlled device comprises a UAV.
  112. The control system according to claim 111, wherein, in performing the surround shooting of its surroundings, the controlled device is specifically configured to shoot around the current position from an obliquely downward viewing angle.
  113. The control system according to claim 112, wherein the obliquely downward viewing angle is raised as the altitude of the controlled device decreases.
  114. The control system according to claim 91, wherein the target image frames are extracted from the image frames.
  115. The control system according to any one of claims 91 to 114, wherein, when storing the target image frames among the image frames, the control end is specifically configured to store the target image frames among the image frames when the controlled device is returning on low battery.
  116. The control system according to claim 91, wherein the control end is further configured to determine, before reading the target image frames for display, that the duration of disconnection from the controlled device satisfies a third condition, the third condition being set based on a duration threshold.
  117. The control system according to claim 116, wherein the third condition comprises the duration of disconnection being greater than or equal to the duration threshold.
  118. The control system according to claim 91, wherein reading the target image frames for playback display is triggered by a playback instruction input by a user.
  119. The control system according to claim 91, wherein the control end comprises a mobile terminal and/or a remote controller.
  120. The control system according to claim 119, wherein the mobile terminal is connected to the remote controller, and the remote controller is wirelessly connected to the controlled device;
    the image frames are obtained by the remote controller from the controlled device and then sent to the mobile terminal, and the target image frames are stored by the mobile terminal.
  121. The control system according to claim 91, wherein the image frame set is used to assist a user in finding the landing position of the controlled device.
  122. A computer-readable storage medium storing a computer program, wherein, when the computer program is executed by a processor, it implements the image processing method according to any one of claims 1-31.
  123. A computer-readable storage medium storing a computer program, wherein, when the computer program is executed by a processor, it implements the image processing method according to any one of claims 32-45.
PCT/CN2020/097216 2020-06-19 2020-06-19 Image processing method, mobile terminal, and electronic device WO2021253436A1 (zh)


Also Published As

Publication number Publication date
CN113748668A (zh) 2021-12-03
CN113748668B (zh) 2023-09-12


Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20941064; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20941064; Country of ref document: EP; Kind code of ref document: A1)