CN112732146B - Image display method and device and storage medium - Google Patents


Info

Publication number
CN112732146B
Authority
CN
China
Prior art keywords
image
target
display
display interface
selection instruction
Prior art date
Legal status
Active
Application number
CN201911032409.0A
Other languages
Chinese (zh)
Other versions
CN112732146A (en)
Inventor
张哲维
刘波
Current Assignee
Guangzhou Xaircraft Technology Co Ltd
Original Assignee
Guangzhou Xaircraft Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co Ltd
Priority to CN201911032409.0A
Publication of CN112732146A
Application granted
Publication of CN112732146B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 — Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 — Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 — Interaction techniques based on GUIs using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 — Arrangements for image or video recognition or understanding
    • G06V 10/20 — Image preprocessing
    • G06V 10/26 — Segmentation of patterns in the image field; cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; detection of occlusion
    • G06V 10/267 — Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds

Abstract

The application discloses an image display method and device, and a storage medium. The method includes: determining an image set corresponding to a target monitoring device; selecting, from the image set, a first image and a second image of a target area acquired at different time points, where the first image and the second image show the state of a target object in the target area; determining, based on the first image and the second image, a first target image and a second target image to be displayed in different areas of a first display interface; and displaying the first target image and the second target image in the first display interface. The scheme addresses the technical problems that displaying images for comparison in separate interfaces occupies terminal memory and produces a poor comparison effect.

Description

Image display method and device and storage medium
Technical Field
The present disclosure relates to the field of image display, and in particular, to an image display method and apparatus, and a storage medium.
Background
In existing image comparison technology, images of a target area are compared by opening two separate programs, each displaying one image. Compared with displaying both images in the same interface, this approach not only occupies more of the terminal's memory but also makes the visual comparison less intuitive.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the present application provide an image display method and device, and a storage medium, so as to at least solve the technical problems that displaying images for comparison in separate interfaces occupies terminal memory and produces a poor comparison effect.
According to one aspect of the embodiments of the present application, an image display method is provided, including: determining an image set corresponding to a target monitoring device; selecting, from the image set, a first image and a second image of a target area acquired at different time points, where the first image and the second image show the state of a target object in the target area; determining, based on the first image and the second image, a first target image and a second target image to be displayed in different areas of a first display interface; and displaying the first target image and the second target image in the first display interface.
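The four steps above can be sketched in code. This is an illustrative sketch only; the patent does not prescribe a data model or API, so all names (`Image`, `get_image_set`, etc.) and the row-of-pixels representation are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Image:
    device_id: str   # which monitoring device captured the image
    timestamp: int   # acquisition time point
    pixels: list     # list of pixel rows (placeholder representation)

def get_image_set(all_images, target_device_id):
    """Step 1: determine the image set corresponding to the target device."""
    return [img for img in all_images if img.device_id == target_device_id]

def select_pair(image_set, t1, t2):
    """Step 2: pick two images of the target area taken at different times."""
    first = next(img for img in image_set if img.timestamp == t1)
    second = next(img for img in image_set if img.timestamp == t2)
    return first, second

def make_target_images(first, second, split_x, width):
    """Step 3: crop each image to the display area it will occupy."""
    first_target = [row[:split_x] for row in first.pixels]
    second_target = [row[split_x:width] for row in second.pixels]
    return first_target, second_target

def show(first_target, second_target):
    """Step 4: render both crops side by side in one interface (stub)."""
    return [left + right for left, right in zip(first_target, second_target)]
```

A minimal run: crop the left half of one image and the right half of the other, then join them into a single side-by-side view.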
Optionally, the target monitoring device is any one of the monitoring devices that monitor state information of the target object in the target area. Determining the image set corresponding to the target monitoring device includes: displaying, in a second display interface, the device identifiers of all monitoring devices that monitor the state information of the target object in the target area; receiving, in the second display interface, a first selection instruction for a target device identifier among those device identifiers; and, in response to the first selection instruction, displaying in a third display interface the images acquired at different time points by the monitoring device corresponding to the target device identifier, taking all images displayed in the third display interface as the images in the image set.
Optionally, selecting the first image and the second image acquired at different time points of the target region from the set of images comprises: receiving a second selection instruction in the third display interface, and selecting a first image from all images displayed in the third display interface based on the second selection instruction; and receiving a third selection instruction in the third display interface, and selecting a second image from all images displayed in the third display interface based on the third selection instruction.
Optionally, the method further includes: in response to the first selection instruction, showing a trigger control for detecting a trigger instruction; and, when the trigger control is triggered, generating the first display interface that displays the first target image and the second target image.
Optionally, the different areas of the first display interface include a first display area for displaying the first target image and a second display area for displaying the second target image, with a division identifier arranged between the two areas.
Optionally, the division identifier includes a dividing line that splits the first display interface into the first display area and the second display area. Determining the first target image and the second target image displayed in different areas of the first display interface based on the first image and the second image then includes: determining first contour information of the first display area and second contour information of the second display area, taking the dividing line as the boundary; extracting the first target image from the first image based on the first contour information; and extracting the second target image from the second image based on the second contour information.
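A minimal sketch of this contour-based extraction, assuming a vertical dividing line at column `split_x` and images represented as lists of pixel rows (both are assumptions; the patent leaves the image representation and the shape of the dividing line open). Contour information for each display area is modeled as a per-pixel membership mask, and the composite takes each pixel from the image whose area contains it:

```python
def region_masks(width, height, split_x):
    """Contour info for the two display areas, as per-pixel membership masks
    with the dividing line at column split_x as the boundary."""
    first = [[x < split_x for x in range(width)] for _ in range(height)]
    second = [[x >= split_x for x in range(width)] for _ in range(height)]
    return first, second

def compose(first_image, second_image, first_mask, second_mask):
    """Build the displayed frame: each pixel comes from the image whose
    display area (mask) contains that pixel position."""
    h, w = len(first_image), len(first_image[0])
    return [[first_image[y][x] if first_mask[y][x] else second_image[y][x]
             for x in range(w)] for y in range(h)]
```

With this representation, moving the dividing line only changes `split_x`; the masks and the composite are recomputed from it.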
Optionally, the division identifier is movable, and the sizes of the first display area and the second display area vary as it moves.
According to another aspect of the embodiments of the present application, an image display method is provided, including: displaying, in a second display interface, the device identifiers of all monitoring devices that monitor state information of a target object in a target area; receiving, in the second display interface, a first selection instruction for a target device identifier among those device identifiers; in response to the first selection instruction, displaying in a third display interface the images acquired at different time points by the monitoring device corresponding to the target device identifier; receiving a second selection instruction in the third display interface and selecting a first image, based on it, from all images displayed there; receiving a third selection instruction in the third display interface and selecting a second image, based on it, from all images displayed there; and displaying the first image and the second image for comparison in a first display area and a second display area of a first display interface.
Optionally, displaying the first image and the second image for comparison in the first display area and the second display area of the first display interface includes: determining, based on the first image and the second image, a first target image and a second target image to be displayed in the first and second display areas respectively, where a first image range corresponding to the first target image and a second image range corresponding to the second target image jointly form the image range of the target area; and displaying the first target image and the second target image simultaneously in the first display interface.
Optionally, a division identifier is disposed between the first display area and the second display area, the division identifier is movable, and the size of the first display area and the size of the second display area change along with the movement of the division identifier.
According to another aspect of the embodiments of the present application, there is provided an image displaying apparatus including: the first determining module is used for determining an image set corresponding to the target monitoring equipment; a selection module, configured to select, from the image set, a first image and a second image acquired at different time points of the target region, where the first image and the second image are used to show a state of a target object in the target region; the second determining module is used for respectively determining a first target image and a second target image displayed in different areas of the first display interface based on the first image and the second image; and the display module is used for displaying the first target image and the second target image in the first display interface.
According to still another aspect of the embodiments of the present application, there is provided a non-volatile storage medium including a stored program, wherein the program controls a device on which the storage medium is located to execute the above-described image presentation method when the program runs.
According to another aspect of the embodiments of the present application, a processor is also provided, connected to a memory and configured to execute program instructions in the memory, where the program instructions perform the image display method described above.
In the embodiments of the application, a first image and a second image of a target area, acquired at different time points, are selected from an image set, where the first image and the second image show the state of a target object in the target area; a first target image and a second target image to be displayed in different areas of a first display interface are determined based on the first image and the second image; and the first target image and the second target image are displayed in the first display interface. This scheme achieves the effect of displaying images of different areas and times in the same interface, solving the technical problems that displaying images for comparison in separate interfaces occupies terminal memory and produces a poor comparison effect.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flow chart of a method for displaying an image according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a display device for displaying images according to an embodiment of the present application;
FIG. 3 is a schematic flow chart diagram illustrating another method for displaying an image according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an image presentation interface according to an embodiment of the application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
For better understanding of the embodiments of the present application, the technical terms referred to in the embodiments of the present application are briefly described as follows:
Image range: the spatial extent covered by an image; an image can thus be regarded as depicting a specific spatial range.
In accordance with an embodiment of the present application, there is provided a method embodiment of a method for presenting an image, it being noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer-executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
Fig. 1 is a schematic flow chart of an image displaying method according to an embodiment of the present application, as shown in fig. 1, the method includes the following steps:
step S102, determining an image set corresponding to target monitoring equipment;
In some embodiments of the present application, because the installation positions or shooting angles of the monitoring devices differ, the ranges of the images they acquire also differ. To ease subsequent comparison, the acquired images may be grouped by device: each monitoring device corresponds to one image set, and the images in that set are the images acquired by that device.
The target monitoring device is any one of the monitoring devices used to monitor state information of the target object in the target area. Before the image set is selected, the device identifiers of all monitoring devices that monitor the state information of the target object in the target area are displayed in the second display interface; a first selection instruction for a target device identifier among them is received in the second display interface; and, in response to the first selection instruction, the images acquired at different time points by the monitoring device corresponding to the target device identifier are displayed in the third display interface, with all images displayed there taken as the images in the image set, i.e., the images corresponding to that device identifier.
Optionally, the second display interface may be an interface dedicated to displaying monitoring device identifiers, for example one showing different folder identifiers (a folder identifier may stand for a monitoring device); the third display interface may be the interface presented after clicking the folder with the corresponding identifier, used for displaying the images in that folder.
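Grouping images into one set per monitoring device, as in the folder analogy above, can be sketched as follows (the record layout `(device_id, timestamp, path)` is an assumption for illustration):

```python
from collections import defaultdict

def group_by_device(records):
    """Build one image set per monitoring device, like one folder per device.

    records: iterable of (device_id, timestamp, path) tuples.
    Returns {device_id: [(timestamp, path), ...]} sorted by time, so the
    third display interface can list each device's images chronologically.
    """
    sets = defaultdict(list)
    for device_id, timestamp, path in records:
        sets[device_id].append((timestamp, path))
    for images in sets.values():
        images.sort()  # chronological order within each device's set
    return dict(sets)
```

Selecting a target device identifier then amounts to a dictionary lookup on the returned mapping.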
Step S104, selecting a first image and a second image which are acquired at different time points of a target area from an image set, wherein the first image and the second image are used for showing the state of a target object in the target area;
in some embodiments of the present application, when selecting the first image and the second image acquired at different points in time of the target region from the set of images, this may be achieved by: receiving a second selection instruction in the third display interface, and selecting a first image from all images displayed in the third display interface based on the second selection instruction; and receiving a third selection instruction in the third display interface, and selecting a second image from all images displayed in the third display interface based on the third selection instruction.
The first, second, and third selection instructions include, but are not limited to, mouse click instructions and keyboard selection instructions.
Step S106, respectively determining a first target image and a second target image displayed in different areas of a first display interface based on the first image and the second image;
the first image range corresponding to the first target image and the second image range corresponding to the second target image together constitute the image range of the target region, and the meaning of "the image range that together constitutes the target region" may be expressed as: the first image range and the second image range can be spliced into the target area in space, and the boundary of the first image and the boundary of the second image are shown to be matched on two sides of the dividing line on the display interface. Specifically, the method comprises the following steps: the first image range corresponding to the first target image and the second image range corresponding to the second target image are image ranges of different sub-regions in the target region, for example, the target region is divided into a first sub-region and a second sub-region, the first image range and the second image range are respectively used for displaying the first sub-region and the second sub-region, wherein images corresponding to the first image range and the second image range both include states of the target object.
In other embodiments of the present application, the first image range and the second image range may cover the same region of the target area, i.e., the first sub-region and the second sub-region coincide. In this way, images of the same region at different time points can be displayed for comparison.
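The two ways of pairing image ranges, complementary sub-regions that tile the target region versus the same region at two time points, can be illustrated with one-dimensional ranges (the 1-D simplification and all function names are assumptions for illustration):

```python
def image_ranges(target_region, mode, split=None):
    """Return (first_range, second_range) shown in the two display areas.

    target_region: (x0, x1) span of the monitored area (1-D for simplicity).
    mode "split": complementary sub-regions that tile the target region.
    mode "same":  identical region, compared across two time points.
    """
    x0, x1 = target_region
    if mode == "split":
        return (x0, split), (split, x1)
    return (x0, x1), (x0, x1)

def tiles_region(first_range, second_range, target_region):
    """Check the 'jointly form the image range of the target area' property:
    the two ranges meet at the split and cover the whole region."""
    return (first_range[0] == target_region[0]
            and first_range[1] == second_range[0]
            and second_range[1] == target_region[1])
```

Only the "split" mode satisfies the tiling property; the "same" mode instead supports before/after comparison of one region.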
step S108, displaying the first target image and the second target image in the first display interface.
In some embodiments of the present application, the first display interface may be opened via a control on the interface: in response to the first selection instruction, a trigger control for detecting a trigger instruction is shown; when the trigger control is triggered, the first display interface displaying the first target image and the second target image is generated.
The first display interface may have multiple different areas, for example two, including but not limited to a first display area for displaying the first target image and a second display area for displaying the second target image, with a division identifier arranged between them.
In some embodiments, the division identifier includes a dividing line that splits the first display interface into the first display area and the second display area. In that case, determining the first target image and the second target image displayed in different areas of the first display interface based on the first image and the second image includes: determining first contour information of the first display area and second contour information of the second display area, taking the dividing line as the boundary; extracting the first target image from the first image based on the first contour information; and extracting the second target image from the second image based on the second contour information.
Optionally, the division identifier is movable, and the sizes of the first display area and the second display area vary as it moves. Specifically, when the division identifier is used to adjust the two areas, a movement instruction for the dividing line between the first display area and the second display area is received, and in response to that instruction the dividing line is moved to adjust the sizes of the two areas.
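A minimal sketch of moving the dividing line, assuming a vertical line at pixel column `split_x` (the clamping that keeps both areas non-empty is an added assumption, not stated in the patent):

```python
def move_divider(split_x, delta, width, min_area=1):
    """Move the dividing line by delta pixels, clamped so that both
    display areas keep at least min_area columns."""
    return max(min_area, min(width - min_area, split_x + delta))
```

After a move, the first area spans columns `[0, split_x)` and the second `[split_x, width)`, so the amount one area grows is exactly the amount the other shrinks.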
In some embodiments of the present application, the first image and the second image show the growth state of crops in the first sub-region and the second sub-region respectively. To make the change information of the crops in a region easier to obtain, the following steps may be added after the sizes of the two display areas are adjusted: determining the change region of the first or second display area after the adjustment; comparing, within the change region, the image acquired at the target time corresponding to the first display area with the image acquired at the target time corresponding to the second display area, to obtain the change information of the crop growth state; and displaying that change information in the display interface, where the two target times may differ. In an alternative embodiment, the change region is the region bounded by the dividing line's position before the movement and its position after the movement, and the change information is determined as the difference between the image shown in that region before the dividing line moved and the image shown there afterwards.
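The change region and the change information derived from it can be sketched as follows, assuming a vertical dividing line, grayscale images as lists of rows, and mean absolute pixel difference as the change measure (all assumptions; the patent does not fix a difference metric):

```python
def change_region(old_split, new_split):
    """Column range swept by the dividing line between its old and new
    positions; this is the region whose displayed image changed."""
    return min(old_split, new_split), max(old_split, new_split)

def growth_change(first_image, second_image, old_split, new_split):
    """Change information: mean absolute pixel difference between the two
    time points, restricted to the change region."""
    lo, hi = change_region(old_split, new_split)
    diffs = [abs(first_image[y][x] - second_image[y][x])
             for y in range(len(first_image))
             for x in range(lo, hi)]
    return sum(diffs) / len(diffs) if diffs else 0.0
```

The returned score could then be rendered in the display interface alongside the compared images.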
For example, in the display interface shown in fig. 4, the middle solid line represents the movable dividing line, the upper arrow indicates its movement direction toward the second display area, and the right dotted line marks the position it reaches. Moving the dividing line toward the second display area enlarges the display area of the first display area and shrinks that of the second; moving it in the opposite direction does the reverse. The sizes of the two display areas can thus be adjusted by moving the dividing line, and the amount by which one area grows equals the amount by which the other shrinks.
Specifically, to display the change information of the crop growth state in the display interface, crop feature information in the images, such as crop height and color, can be extracted and compared, yielding a conclusion on whether the crop growth state is healthy.
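A toy version of this feature comparison, using mean green level as a stand-in for the crop features mentioned above (the greenness proxy, the threshold, and the labels are illustrative assumptions, not the patent's method):

```python
def crop_features(image):
    """Toy feature extraction: mean green channel as a proxy for plant health.

    image: list of rows of (r, g, b) tuples.
    """
    pixels = [px for row in image for px in row]
    mean_green = sum(px[1] for px in pixels) / len(pixels)
    return {"mean_green": mean_green}

def compare_growth(earlier, later, threshold=10):
    """Label the growth state 'healthy' unless greenness dropped sharply
    between the two acquisition times."""
    drop = (crop_features(earlier)["mean_green"]
            - crop_features(later)["mean_green"])
    return "healthy" if drop <= threshold else "declining"
```

In practice the feature set (height from stereo or reference markers, color indices such as excess green) and the decision rule would be domain-specific.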
In the list mode, monitoring stations of the same type are arranged into one group in the monitoring station list interface; in the map mode, multiple monitoring station identifiers are displayed on a map at their actual geographic positions. Pictures taken by the same monitoring station can be placed in a corresponding folder. Clicking a monitoring station identifier opens a new interface showing the pictures that station took in different time periods. Hovering over a picture displays an "add to comparison" icon; clicking it makes that picture the first picture to be compared, and the second picture is chosen by the same operation. Clicking confirm then opens an interface for the comparison display. In the editing mode, custom edited content (pictures, text notes, etc.) can be saved locally as a picture.
Fig. 2 is a schematic structural diagram of an image display device according to an embodiment of the present application. The device is used for realizing the method shown in fig. 1, and as shown in fig. 2, the device comprises:
a first determining module 20, configured to determine an image set corresponding to a target monitoring device;
a selection module 22, configured to select, from the image set, a first image and a second image acquired at different time points of the target region, where the first image and the second image are used to show a state of a target object in the target region;
a second determining module 24, configured to determine, based on the first image and the second image, a first target image and a second target image displayed in different areas of the first display interface, where a first image range corresponding to the first target image and a second image range corresponding to the second target image jointly form an image range of the target area;
and a display module 26, configured to display the first target image and the second target image in the first display interface.
It should be noted that, reference may be made to the description related to the embodiment shown in fig. 1 for a preferred implementation of the embodiment shown in fig. 2, and details are not described here again.
Fig. 3 is a schematic flowchart of another image displaying method according to an embodiment of the present application, and as shown in fig. 3, the method includes:
step S302, displaying the device identifications of all monitoring devices for monitoring the state information of the target object in the target area in a second display interface;
step S304, receiving a first selection instruction of a target device identifier in the device identifiers of all the monitoring devices in a second display interface;
step S306, in response to the first selection instruction, displaying in a third display interface the images in the image set acquired at different time points by the monitoring device corresponding to the target device identifier;
step S308, receiving a second selection instruction in the third display interface, and selecting a first image from all images displayed in the third display interface based on the second selection instruction;
step S310, receiving a third selection instruction in a third display interface, and selecting a second image from all images displayed in the third display interface based on the third selection instruction;
step S312, displaying the first image and the second image for comparison in the first display area and the second display area of the first display interface, where "for comparison" means, among other things, that the first image and the second image are displayed simultaneously in different areas of the same interface.
In some embodiments of the present application, displaying the first image and the second image for comparison in the first display area and the second display area of the first display interface proceeds as follows: determining, based on the first image and the second image, a first target image and a second target image to be displayed in the first and second display areas respectively, where a first image range corresponding to the first target image and a second image range corresponding to the second target image jointly form the image range of the target area; and displaying the first target image and the second target image simultaneously in the first display interface.
Optionally, a division identifier is disposed between the first display area and the second display area; the division identifier is movable, and the sizes of the first display area and the second display area change as the division identifier moves.
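The movable division identifier can be sketched as a clamped divider position: moving it resizes both display areas while their widths always sum to the interface width. Function names and the one-pixel minimum are illustrative assumptions, not details from the patent.

```python
def move_divider(position, delta, interface_width):
    """Return the new divider position after a drag of `delta` pixels,
    clamped so both display areas keep at least one pixel of width."""
    return max(1, min(interface_width - 1, position + delta))

def area_sizes(position, interface_width):
    # The first display area spans columns [0, position); the second
    # spans [position, interface_width). Their sizes sum to the width.
    return position, interface_width - position

# Dragging the divider right grows the first area and shrinks the second.
pos = move_divider(400, +150, 1000)
sizes = area_sizes(pos, 1000)
```

Clamping keeps both areas visible, so the comparison view never degenerates into a single full-width image while the user drags the identifier.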
It should be noted that, for preferred implementations of the embodiment shown in fig. 3, reference may be made to the relevant descriptions in Embodiments 1 and 2, and details are not repeated here.
According to still another aspect of the embodiments of the present application, there is provided a non-volatile storage medium including a stored program, wherein, when the program runs, it controls a device on which the storage medium is located to execute the above image display method. For example, instructions may be executed to implement the following functions: determining an image set corresponding to a target monitoring device; selecting, from the image set, a first image and a second image acquired for a target area at different time points, wherein the first image and the second image are used for showing the state of a target object in the target area; respectively determining, based on the first image and the second image, a first target image and a second target image displayed in different areas of a first display interface; and displaying the first target image and the second target image in the first display interface.
According to another aspect of the embodiments of the present application, there is also provided a processor connected to a memory and configured to execute program instructions in the memory, the program instructions being configured to perform the above image display method. For example, instructions may be executed to implement the following functions: determining an image set corresponding to a target monitoring device; selecting, from the image set, a first image and a second image acquired for a target area at different time points, wherein the first image and the second image are used for showing the state of a target object in the target area; respectively determining, based on the first image and the second image, a first target image and a second target image displayed in different areas of a first display interface; and displaying the first target image and the second target image in the first display interface.
The above-mentioned serial numbers of the embodiments of the present application are merely for description, and do not represent the advantages and disadvantages of the embodiments.
In the embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described apparatus embodiments are merely illustrative. For example, the division of the units may be merely a logical functional division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that is essential, or that contributes over the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
The foregoing is merely a preferred embodiment of the present application. It should be noted that those skilled in the art can make several improvements and modifications without departing from the principles of the present application, and these improvements and modifications should also fall within the protection scope of the present application.

Claims (13)

1. An image display method, comprising:
determining an image set corresponding to target monitoring equipment, wherein the target monitoring equipment is determined from all monitoring equipment through a first selection instruction, and the image set comprises images acquired by the target monitoring equipment at different time points;
selecting, from the image set, a first image and a second image acquired for a target area at different time points based on a second selection instruction and a third selection instruction respectively, wherein the first image and the second image are used for showing the state of a target object in the target area;
respectively determining a first target image and a second target image displayed in different areas of a first display interface based on the first image and the second image;
and displaying the first target image and the second target image in the first display interface.
2. The method according to claim 1, wherein the target monitoring device is any one of the monitoring devices for monitoring state information of the target object in the target area; and determining the image set corresponding to the target monitoring device comprises:
displaying the device identifications of all monitoring devices for monitoring the state information of the target object in the target area in a second display interface;
receiving a first selection instruction of a target device identifier in the device identifiers of all the monitoring devices in the second display interface;
and in response to the first selection instruction, displaying images acquired by the monitoring device corresponding to the target device identification at different time points in a third display interface, and taking all the images displayed in the third display interface as the images in the image set.
3. The method of claim 2, wherein selecting, from the image set, the first image and the second image acquired for the target area at different time points comprises:
receiving a second selection instruction in the third display interface, and selecting the first image from all images displayed in the third display interface based on the second selection instruction;
receiving a third selection instruction in the third presentation interface, and selecting the second image from all images presented in the third presentation interface based on the third selection instruction.
4. The method of claim 2, further comprising:
in response to the first selection instruction, displaying a trigger control for detecting a trigger instruction; and when the trigger control is triggered, generating the first display interface for displaying the first target image and the second target image.
5. The method of claim 1, wherein the different areas of the first display interface comprise: a first display area for displaying the first target image, and a second display area for displaying the second target image; and a division identifier is arranged between the first display area and the second display area.
6. The method of claim 5, wherein
the division identifier comprises: a dividing line for dividing the first display interface into the first display area and the second display area; and
respectively determining, based on the first image and the second image, a first target image and a second target image displayed in different areas of the first display interface comprises: determining first contour information of the first display area and second contour information of the second display area by taking the dividing line as a division boundary; extracting the first target image from the first image based on the first contour information, and extracting the second target image from the second image based on the second contour information.
7. The method of claim 5, wherein the division identifier is movable, and the sizes of the first display area and the second display area change as the division identifier moves.
8. The method of claim 1, wherein
a first image range corresponding to the first target image and a second image range corresponding to the second target image jointly form the image range of the target area; or,
the first image range corresponding to the first target image and the second image range corresponding to the second target image are image ranges of different sub-areas in the target area.
9. An image display method, comprising:
displaying the device identifications of all monitoring devices for monitoring the state information of the target object in the target area in a second display interface;
receiving a first selection instruction of a target device identifier in the device identifications of all the monitoring devices in the second display interface;
in response to the first selection instruction, displaying, in a third display interface, images collected at different time points by the monitoring device corresponding to the target device identifier;
receiving a second selection instruction in the third display interface, and selecting a first image from all images displayed in the third display interface based on the second selection instruction;
receiving a third selection instruction in the third display interface, and selecting a second image from all images displayed in the third display interface based on the third selection instruction;
respectively determining, based on the first image and the second image, a first target image and a second target image displayed in a first display area and a second display area of a first display interface;
and displaying the first target image and the second target image for comparison in the first display area and the second display area.
10. The method of claim 9, wherein displaying the first image and the second image for comparison in the first display area and the second display area of the first display interface comprises:
respectively determining a first target image and a second target image displayed in a first display area and a second display area of the first display interface based on the first image and the second image, wherein a first image range corresponding to the first target image and a second image range corresponding to the second target image jointly form an image range of the target area;
and simultaneously displaying the first target image and the second target image in the first display interface.
11. The method according to claim 9, wherein a division identifier is provided between the first display area and the second display area, the division identifier is movable, and the sizes of the first display area and the second display area change as the division identifier moves.
12. An apparatus for displaying an image, comprising:
a first determining module, configured to determine an image set corresponding to a target monitoring device, wherein the target monitoring device is determined from all monitoring devices through a first selection instruction, and the image set comprises images acquired by the target monitoring device at different time points;
a selection module, configured to select, from the image set, a first image and a second image acquired for a target area at different time points based on a second selection instruction and a third selection instruction respectively, wherein the first image and the second image are used for showing the state of a target object in the target area;
a second determining module, configured to respectively determine, based on the first image and the second image, a first target image and a second target image displayed in different areas of a first display interface; and
a display module, configured to display the first target image and the second target image in the first display interface.
13. A non-volatile storage medium comprising a stored program, wherein, when the program runs, the program controls a device in which the storage medium is located to execute the image display method according to any one of claims 1 to 8.
CN201911032409.0A 2019-10-28 2019-10-28 Image display method and device and storage medium Active CN112732146B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911032409.0A CN112732146B (en) 2019-10-28 2019-10-28 Image display method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911032409.0A CN112732146B (en) 2019-10-28 2019-10-28 Image display method and device and storage medium

Publications (2)

Publication Number Publication Date
CN112732146A CN112732146A (en) 2021-04-30
CN112732146B true CN112732146B (en) 2022-06-21

Family

ID=75589353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911032409.0A Active CN112732146B (en) 2019-10-28 2019-10-28 Image display method and device and storage medium

Country Status (1)

Country Link
CN (1) CN112732146B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113360063B (en) * 2021-05-31 2023-02-03 网易(杭州)网络有限公司 Image display method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108845733A (en) * 2018-05-31 2018-11-20 Oppo广东移动通信有限公司 Screenshot method, device, terminal and storage medium
CN109873979A (en) * 2019-01-07 2019-06-11 广东思理智能科技股份有限公司 Camera-based static image difference comparison method and device
CN110147708A (en) * 2018-10-30 2019-08-20 腾讯科技(深圳)有限公司 A kind of image processing method and relevant apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10216762B2 (en) * 2014-06-04 2019-02-26 Panasonic Corporation Control method and non-transitory computer-readable recording medium for comparing medical images

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108845733A (en) * 2018-05-31 2018-11-20 Oppo广东移动通信有限公司 Screenshot method, device, terminal and storage medium
CN110147708A (en) * 2018-10-30 2019-08-20 腾讯科技(深圳)有限公司 A kind of image processing method and relevant apparatus
CN109873979A (en) * 2019-01-07 2019-06-11 广东思理智能科技股份有限公司 Camera-based static image difference comparison method and device

Also Published As

Publication number Publication date
CN112732146A (en) 2021-04-30

Similar Documents

Publication Publication Date Title
CN109862414B (en) Mask bullet screen display method and device and server
CN108712665B (en) Live broadcast list generation method and device, server and storage medium
CN109767447B (en) Template matching method, device, equipment and medium
CN109947967A (en) Image-recognizing method, device, storage medium and computer equipment
CN105975142A (en) Method and device for icon moving
CN106155496A (en) A kind of information displaying method and device
US11894021B2 (en) Data processing method and system, storage medium, and computing device
CN110865753B (en) Application message notification method and device
CN112732146B (en) Image display method and device and storage medium
CN111539481B (en) Image labeling method, device, electronic equipment and storage medium
JP2010026975A (en) Information processor and information processing program
CN105094597A (en) Batch picture selecting method and apparatus
CN110515515A (en) Information displaying method and device, storage medium
JP5894707B2 (en) Information processing apparatus, information processing method, and program for information processing apparatus
CN113010738A (en) Video processing method and device, electronic equipment and readable storage medium
CN112732377A (en) Image display method and device and storage medium
CN112752127B (en) Method and device for positioning video playing position, storage medium and electronic device
CN108241515A (en) Application shortcut method for building up and terminal
CN107085521A (en) A kind of icon display method and device
CN109919164A (en) The recognition methods of user interface object and device
WO2018171234A1 (en) Video processing method and apparatus
CN108388395A (en) Image cropping method, apparatus and terminal
CN109241218B (en) Data point display method and device
CN110215702B (en) Method and device for controlling grouping in game
CN112764635B (en) Display method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
CB02 Change of applicant information

Address after: 510000 Block C, 115 Gaopu Road, Tianhe District, Guangzhou City, Guangdong Province

Applicant after: Guangzhou Jifei Technology Co.,Ltd.

Address before: 510000 Block C, 115 Gaopu Road, Tianhe District, Guangzhou City, Guangdong Province

Applicant before: Guangzhou Xaircraft Technology Co.,Ltd.

SE01 Entry into force of request for substantive examination
GR01 Patent grant