CN112000218A - Object display method and device

Object display method and device

Info

Publication number
CN112000218A
Authority
CN
China
Prior art keywords
target object
displayed
image
display end
target
Prior art date
Legal status
Pending
Application number
CN201910447742.1A
Other languages
Chinese (zh)
Inventor
卢毓智
王磊
刘登勇
Current Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd, Beijing Jingdong Shangke Information Technology Co Ltd filed Critical Beijing Jingdong Century Trading Co Ltd
Priority to CN201910447742.1A
Publication of CN112000218A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The invention discloses an object display method and device, and relates to the technical field of computers. One embodiment of the method comprises: collecting an image to be displayed; identifying a target object from the image to be displayed; determining the position information of the target object in a display end; and adjusting the target object to a target area in the display end according to the position information so as to display the target object in the target area. In this embodiment, the target object in the image to be displayed can be automatically adjusted to the target area of the display end so that the target object is displayed in the target area, which makes the target object convenient to view and improves the user experience.

Description

Object display method and device
Technical Field
The invention relates to the technical field of computers, in particular to an object display method and device.
Background
At present, after an acquisition end acquires an image, the image is generally displayed directly on a display end. For example, an image captured by a camera is displayed directly on the camera's display screen or on another display terminal such as a computer or mobile phone.
In the process of implementing the invention, the inventor finds that at least the following problems exist in the prior art:
When viewing an image, a user usually has an object of particular interest; for example, when the acquired image is a makeup-trial photograph of the user, the object of interest is the user's face area. Because of factors such as the user's movement while the image is acquired, or a mismatch between the display scales of the image and the display end, the object the user focuses on may end up in an area of the display end that is inconvenient to view. For example, when the user's face area lies in an edge area of the display end, it is inconvenient for the user to view the face area, and the user experience is poor.
Disclosure of Invention
In view of this, embodiments of the present invention provide an object display method and apparatus, which can automatically adjust a target object in an image to be displayed to a target area of a display end, so that the target object is displayed in the target area, thereby facilitating viewing of the target object and further improving user experience.
To achieve the above object, according to an aspect of an embodiment of the present invention, there is provided an object display method including:
collecting an image to be displayed;
identifying a target object from the image to be displayed;
determining the position information of the target object in a display end;
and adjusting the target object to a target area in the display end according to the position information so as to display the target object in the target area.
Optionally, the adjusting the target object to the target area in the display end according to the position information includes:
placing the image to be displayed and the display end in the same coordinate system;
calculating a first distance between the center of the target object and the center of the target area in the coordinate system according to the position information of the target object in the coordinate system and the position of the target area in the coordinate system;
and adjusting the target object to the target area according to the first distance.
Optionally, the target area is a central area of the display end;
the adjusting the target object to the target area in the display end according to the position information includes:
calculating a second distance between the center of the target object and the edge of the display end according to the position information of the target object;
determining the moving amount and the moving direction of the target object to the central area according to the second distance;
and moving the target object to the central area of the display end according to the moving amount and the moving direction.
Optionally, the calculating, according to the second distance, a moving amount of the target object to the central region includes:
determining two second distances between the center of the target object and mutually parallel edges in the display end;
calculating the movement amount by using the following calculation formula according to the two determined second distances;
M = |a - b| / 2
wherein M represents the amount of movement, a represents one of the two second distances, and b represents the other of the two second distances.
Optionally, the object display method further includes:
determining a focal length adjustment amount according to the position information of the target object in the display end, the resolution of the image to be displayed and the resolution of the display end;
and adjusting the display scale of the target object according to the focal length adjustment amount.
Optionally, the identifying a target object from the image to be displayed and the determining the position information of the target object corresponding to the image to be displayed include:
selecting a plurality of candidate areas from the image to be displayed;
carrying out normalization processing on the candidate area;
determining the category of the candidate region after normalization processing by using a convolutional neural network model;
and determining the target object and the position information of the target object according to the category of the candidate region.
According to a second aspect of embodiments of the present invention, there is provided an object display apparatus, including an object identification module, a position determination module and an adjustment module; wherein:
the object identification module is used for acquiring an image to be displayed and identifying a target object from the image to be displayed;
the position determining module is used for determining the position information of the target object in the image to be displayed;
the adjusting module is used for adjusting the target object to a target area in a display end according to the position information so as to display the target object in the target area.
Optionally, the adjusting module is configured to place the image to be displayed and the display end in the same coordinate system; calculating a first distance between the center of the target object and the center of the target area in the coordinate system according to the position information of the target object in the coordinate system and the position of the target area in the coordinate system; and adjusting the target object to the target area according to the first distance.
Optionally, the target area is a central area of the display end;
the adjusting module is used for calculating a second distance between the center of the target object and the edge of the display end according to the position information of the target object; determining the moving amount and the moving direction of the target object to the central area according to the second distance; and moving the target object to the central area of the display end according to the moving amount and the moving direction.
Optionally, the adjusting module is configured to determine two second distances between a center of the target object and mutually parallel edges of the display end; calculating the movement amount by using the following calculation formula according to the two determined second distances;
M = |a - b| / 2
wherein M represents the amount of movement, a represents one of the two second distances, and b represents the other of the two second distances.
Optionally, the adjusting module is further configured to determine a focal length adjustment amount according to the position information of the target object in the display end, the resolution of the image to be displayed, and the resolution of the display end, and adjust the display scale of the target object according to the focal length adjustment amount.
According to a third aspect of embodiments of the present invention, there is provided a server, including: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method according to any one of the preceding first aspects.
According to a fourth aspect of embodiments of the present invention, there is provided a computer readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method according to any one of the first aspect.
One embodiment of the above invention has the following advantages or benefits: according to the position information of the target object in the image to be displayed, the target object is adjusted to the target area in the display end, so that the target object is displayed in the target area, the target object can be conveniently checked, and the user experience can be improved.
Further effects of the above-mentioned non-conventional alternatives will be described below in connection with the embodiments.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
fig. 1 is a flowchart illustrating an object display method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a position of a target object at a display end according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a position of a target object at a display end according to another embodiment of the present invention;
FIG. 4 is a flowchart illustrating an object display method according to another embodiment of the present invention;
fig. 5 is a schematic structural diagram of an object display apparatus according to an embodiment of the present invention;
FIG. 6 is an exemplary system architecture diagram in which embodiments of the present invention may be employed;
fig. 7 is a schematic block diagram of a computer system suitable for use in implementing a terminal device or server of an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, in which various details of embodiments of the invention are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
As shown in fig. 1, an embodiment of the present invention provides an object display method, which includes the following steps S101 to S104:
step S101: and collecting an image to be displayed.
When the acquisition end acquires the image to be displayed, the acquired content may be video information; in that case, the image to be displayed can be extracted from the acquired video information by frame-extraction preprocessing. In addition, automatic exposure, automatic white balance, automatic backlight and automatic overexposure processing at the acquisition end (such as a camera) can be performed according to the acquisition-site environment and an analysis of the acquired image to be displayed, and noise filtering, contrast enhancement, image scaling and the like can then be applied to the image to be displayed, so that the target object can be identified from the image to be displayed in a later stage.
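As an illustration of this preprocessing step, a minimal sketch in Python with OpenCV is given below; the frame-sampling interval, denoising strength, contrast gain and target size are assumptions made for the sketch, not values prescribed by this embodiment.

```python
import cv2

def extract_and_preprocess(video_path, frame_interval=30, target_size=(1280, 720)):
    """Extract frames from captured video and apply basic preprocessing
    (noise filtering, contrast enhancement, scaling) before recognition."""
    capture = cv2.VideoCapture(video_path)
    frames, index = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % frame_interval == 0:
            # Noise filtering on the color frame.
            frame = cv2.fastNlMeansDenoisingColored(frame, None, 10, 10, 7, 21)
            # Simple linear contrast enhancement.
            frame = cv2.convertScaleAbs(frame, alpha=1.2, beta=0)
            # Scale to the size expected by the recognition step.
            frame = cv2.resize(frame, target_size)
            frames.append(frame)
        index += 1
    capture.release()
    return frames
```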
Step S102: and identifying a target object from the image to be displayed.
Optionally, an image recognition technique may be used to identify the target object from the image to be displayed. For example, a plurality of candidate areas may be selected from the image to be displayed; normalization processing is carried out on the candidate areas; the category of each normalized candidate area is determined by using a convolutional neural network model; and the target object and the position information of the target object are determined according to the categories of the candidate areas.
Specifically, 2000 to 3000 candidate regions are selected from the image to be displayed by using a Selective Search method, and each candidate region is then normalized to the same size, for example 227 × 227. The normalized candidate regions are then processed by a convolutional neural network model, and each candidate region is discriminated by a linear two-class SVM classifier. The input of the SVM classifier is the 4096-dimensional feature vector output by the convolutional neural network model, and its output indicates whether the candidate region belongs to the target class; for example, when the target object is a face, the output indicates whether the candidate region is a face. Finally, according to the categories respectively corresponding to the candidate regions, the candidate regions belonging to the target object can be determined, that is, the target object is identified from the image to be displayed.
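A hedged sketch of this recognition pipeline (Selective Search proposals, 227 × 227 normalization, CNN features, linear two-class SVM) is shown below in Python. The OpenCV Selective Search API (from opencv-contrib), the scikit-learn classifier and the `extract_features` placeholder standing in for the convolutional neural network are assumptions of the sketch, not components mandated by the embodiment.

```python
import cv2
from sklearn.svm import LinearSVC

def propose_regions(image, max_regions=2000):
    """Generate candidate regions with Selective Search (requires opencv-contrib)."""
    ss = cv2.ximgproc.segmentation.createSelectiveSearchSegmentation()
    ss.setBaseImage(image)
    ss.switchToSelectiveSearchFast()
    return ss.process()[:max_regions]           # each rect is (x, y, w, h)

def normalize_region(image, rect, size=(227, 227)):
    """Crop a candidate region and normalize it to a fixed size."""
    x, y, w, h = rect
    return cv2.resize(image[y:y + h, x:x + w], size)

def locate_target(image, extract_features, svm: LinearSVC):
    """Return the rect of the first candidate region classified as the target object.

    `extract_features` stands in for the CNN mapping a 227x227 crop to a
    4096-dimensional feature vector; `svm` is a trained two-class classifier."""
    for rect in propose_regions(image):
        crop = normalize_region(image, rect)
        features = extract_features(crop).reshape(1, -1)
        if svm.predict(features)[0] == 1:        # 1 = target class (e.g. a face)
            return rect                          # position info of the target object
    return None
```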
Step S103: and determining the position information of the target object in the display end.
Because the position information of each candidate region in the image to be displayed is known, once the candidate region corresponding to the target object is determined, the position information of the target object in the image to be displayed can be determined from the position information of that candidate region, and the position information of the target object in the display end can then be determined according to the relative display position of the image to be displayed and the display end. Specifically, when the image to be displayed is shown on the display end, the center of the image to be displayed generally coincides with the center of the display end, and the position information of the target object in the display end can be determined according to the display ratio between the image to be displayed and the display end.
In addition, the image to be displayed and the display end may be placed in the same coordinate system, for example with the origin of the coordinate system at the center of the image to be displayed and the center of the display end (assuming here that the two centers coincide). The coordinates of the target object in this coordinate system, that is, the position information of the target object at the display end, can then be determined from the position information of the target object in the image to be displayed.
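Under the stated assumption that the center of the image coincides with the center of the display end, the mapping from image pixel coordinates to display-end coordinates can be sketched as follows; the function name, the uniform width-based scaling and the parameter names are illustrative only.

```python
def image_to_display_coords(point, image_size, display_size):
    """Map a point in image pixel coordinates to coordinates whose origin is
    the shared center of the image to be displayed and the display end.

    Assumes the image is shown centered on the display end and scaled
    uniformly to the display width (an assumption of this sketch)."""
    (px, py), (iw, ih), (dw, dh) = point, image_size, display_size
    scale = dw / iw                      # display ratio between image and display end
    x = (px - iw / 2.0) * scale          # horizontal offset from the common center
    y = (py - ih / 2.0) * scale          # vertical offset from the common center
    return x, y                          # position of the point at the display end
```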
Step S104: and adjusting the target object to a target area in the display end according to the position information so as to display the target object in the target area.
Still taking the example that the image to be displayed and the display end are located in the same coordinate system, the target object can be adjusted to the target area of the display end in the following manner: calculating a first distance between the center of the target object and the center of the target area in the coordinate system according to the position information of the target object in the coordinate system and the position of the target area in the coordinate system; and adjusting the target object to the target area according to the first distance.
Here, the positions of the image to be displayed and the display end may be as shown in fig. 2, where square area A represents the target object in the image to be displayed, rounded rectangle B represents the display end, rectangle C represents the image to be displayed, and square area D represents the target area in the display end. In fig. 2, the center of the image to be displayed and the center of the display end coincide, and this common center is the origin of the coordinate system in which both are located. Because the image to be displayed and the display end are located in the same coordinate system, the first distance AD between the center of target object A and the center of target area D can be calculated directly in that coordinate system; moving target object A toward target area D by the first distance AD then adjusts the target object into the target area, so that the target object is displayed in the target area.
Generally, to make viewing even more convenient for the user, the target area used to display the target object is the central area of the display end. Since the center of the display end is the coordinate origin of the coordinate system, it is easy to understand that the center of the central area of the display end is also located at the coordinate origin. The first distance between the center of the target object and the center of the target area can therefore be determined directly from the coordinates of the center of the target object in the coordinate system, which reduces the computation required for the first distance, helps speed up the adjustment of the target object, and further improves the user experience.
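The centering step can be summarized by the small Python sketch below, with illustrative names: when the target area is the central area and the origin is the display center, the first distance is simply the norm of the target center's coordinates, and the required shift is their negative.

```python
import math

def first_distance_to_center(target_center):
    """First distance from the target object's center to the display center,
    assuming the coordinate origin is the center of the display end."""
    x, y = target_center
    return math.hypot(x, y)

def shift_to_center(target_center):
    """Translation that moves the target object's center onto the display center."""
    x, y = target_center
    return -x, -y
```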
When the target area is the central area of the display end, the moving amount of the target object can be calculated by other methods, for example, the second distance between the center of the target object and the edge of the display end can be calculated according to the position information of the target object; determining the moving amount and the moving direction of the target object to the central area according to the second distance; and moving the target object to the central area of the display end according to the moving amount and the moving direction.
When determining the moving amount, two second distances between the center of the target object and the mutually parallel edges of the display end may be determined; calculating the movement amount according to the two determined second distances by using the following calculation formula;
M = |a - b| / 2
wherein M represents the amount of movement, a represents one of the two second distances, and b represents the other of the two second distances.
In this embodiment, a schematic position diagram of the image to be displayed and the display end may be as shown in fig. 3, where a square area a represents a target object in the image to be displayed, a rectangle B represents the display end, a rectangle C represents the image to be displayed, and the target area is a central area of the display end, which is not shown in the figure. Referring to fig. 3, the determination process of the moving amount and the moving direction will be described in detail by taking the calculation of the second distance between the center of the target object and the two vertical edges of the display end as an example.
First, the second distance a between the center of the target object and the left vertical edge b1 of the display end may be calculated, and then the second distance b between the center of the target object and the right vertical edge b2 of the display end may be calculated. The calculation formula
M = |a - b| / 2
may then be used to calculate the amount by which the target object is moved toward the central area in the horizontal direction. The moving direction of the target object toward the central area in the horizontal direction can be determined from the relative magnitudes of a and b: when a > b, the target object is closer to the right side of the display end, so the moving direction is leftward and the target object is adjusted toward the central area of the display end; conversely, when a < b, the target object is closer to the left side of the display end, so the moving direction is rightward. When a = b, the target object does not need to be moved in the horizontal direction, and the movement amount is 0.
It can be understood that the second distances between the center of the target object and the two horizontal (top and bottom) edges of the display end can be calculated by the same method, so as to determine the moving amount and moving direction of the target object toward the central area in the vertical direction.
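A minimal sketch of this edge-distance calculation follows (Python, illustrative names). It assumes the target center is given in display-end pixel coordinates with the origin at the top-left corner of the display end.

```python
def move_toward_center(target_center, display_size):
    """Movement amount and direction, horizontally and vertically, that bring
    the target object's center to the central area, using M = |a - b| / 2."""
    cx, cy = target_center
    width, height = display_size

    a, b = cx, width - cx                 # second distances to the left/right edges
    dx = abs(a - b) / 2.0
    horizontal = "left" if a > b else "right" if a < b else "none"

    c, d = cy, height - cy                # second distances to the top/bottom edges
    dy = abs(c - d) / 2.0
    vertical = "up" if c > d else "down" if c < d else "none"

    return (dx, horizontal), (dy, vertical)
```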
Further, the focal length adjustment amount can be determined according to the position information of the target object in the display end, the resolution of the image to be displayed and the resolution of the display end; and adjusting the display scale of the target object according to the focal length adjustment amount.
For example, when the target object is located at the central position, the focal length adjustment amount can be determined from the ratio between the resolution of the image to be displayed and the resolution of the display end, so that the original display scale is maintained when the image to be displayed is shown on the display end according to the focal length adjustment amount. When the target object is relatively near an edge, however, for example when the distance (a and/or b) between the center of the target object and an edge of the display end is greater than the side length (assume e) of the target area in the display end, it may be determined that the target object is near the edge. In that case, if the target object were simply adjusted to the target area of the display end and displayed at the original display scale, part of the edge area of the display end might not be filled by the image to be displayed. The display scale of the target object can then be appropriately enlarged according to the actual situation, so that when the target object is displayed in the central area, the other areas of the display end are also filled by the image to be displayed, which improves the appearance of the display screen and further improves the user experience.
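One hedged way to express the scale adjustment in code is sketched below; the enlargement policy (scaling just enough that the shifted image still fills the display width) and all names are assumptions of this sketch, since the text does not give a formula for the focal length adjustment amount.

```python
def adjusted_display_scale(image_size, display_size, target_center):
    """Display scale for the image to be displayed.

    Keeps the scale implied by the ratio of display resolution to image
    resolution when the target object is central, and enlarges it when the
    object sits off-center so that shifting the object into the central
    area leaves no unfilled strip at the display edge (sketch assumption)."""
    iw, ih = image_size
    dw, dh = display_size
    base_scale = min(dw / iw, dh / ih)     # resolution ratio (original proportion)

    cx, _ = target_center
    a, b = cx, dw - cx                     # second distances to the vertical edges
    shift = abs(a - b) / 2.0               # horizontal shift toward the center
    if shift == 0:
        return base_scale

    # Enlarge so that, after the shift, the image still covers the full width.
    return base_scale * (dw + 2.0 * shift) / dw
```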
The following describes the object display method provided by the embodiment of the present invention in detail with reference to the application scenario of AR intelligent makeup trial, in which Augmented Reality (AR) technology is used for intelligent makeup trial. It can be understood that in the AR intelligent makeup trial scenario the target object is the user's face area and the target area in the display end is the central area of the display end; that is, the object display method aims to display the face area in the central area of the display end, so that the user can more conveniently view his or her makeup trial effect. As shown in fig. 4, in this application scenario, the object display method provided by the embodiment of the present invention may include the following steps:
step S401: and collecting an image to be displayed.
In addition to the face region, the image to be displayed also includes a background region where the user is located.
Step S402: and recognizing a face area from the image to be displayed.
Step S403: and calculating the distance between the center of the face area and the edge of the display end according to the position information of the face area in the display end.
When the face area is recognized from the image to be displayed by the image recognition technique, the recognition result may be a rectangular region (matrix) corresponding to the face area; in that case, the distance may also be calculated between the edge of this face-area rectangle and the edge of the display end.
Step S404: and determining the movement amount and the movement direction of the face area to the central area of the display end according to the calculated distance.
Step S405: and moving the face area to the central area of the display end according to the movement amount and the movement direction so as to display the face area by using the central area of the display end.
In this application scenario, the object display method provided by the embodiment of the invention can automatically adjust the user's face area to the central area of the display end, so that the face area is located in the central area of the display screen. During AR makeup trial the user only needs to keep the face area within the acquisition range of the image to be displayed, without continuously adjusting his or her position to keep the face area in the center of the display screen.
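To tie steps S401 to S405 together, a minimal end-to-end sketch in Python with OpenCV is given below. The Haar-cascade detector is used only as a stand-in for the face-recognition step of this embodiment, and the function and parameter names are illustrative assumptions.

```python
import cv2
import numpy as np

def center_face_on_display(frame, display_size):
    """Detect the face area in a captured frame and shift the frame so that
    the face area sits in the central area of the display end."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return frame                            # no face area recognized; show as-is

    x, y, w, h = faces[0]
    cx, cy = x + w / 2.0, y + h / 2.0           # center of the face area
    dw, dh = int(display_size[0]), int(display_size[1])
    shift_x, shift_y = dw / 2.0 - cx, dh / 2.0 - cy   # movement amount and direction

    translation = np.float32([[1, 0, shift_x], [0, 1, shift_y]])
    return cv2.warpAffine(frame, translation, (dw, dh))
```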
As shown in fig. 5, an embodiment of the present invention provides an object display apparatus 500, including: an object identification module 501, a position determination module 502 and an adjustment module 503; wherein:
the object identification module 501 is configured to acquire an image to be displayed and identify a target object from the image to be displayed;
the position determining module 502 is configured to determine position information of the target object in the image to be displayed;
the adjusting module 503 is configured to adjust the target object to a target area in a display end according to the position information, so that the target object is displayed in the target area.
Optionally, the adjusting module 503 is configured to place the image to be displayed and the display end in the same coordinate system; calculating a first distance between the center of the target object and the center of the target area in the coordinate system according to the position information of the target object in the coordinate system and the position of the target area in the coordinate system; and adjusting the target object to the target area according to the first distance.
Optionally, the target area is a central area of the display end; the adjusting module 503 is configured to calculate a second distance between the center of the target object and the edge of the display end according to the position information of the target object; determining the moving amount and the moving direction of the target object to the central area according to the second distance; and moving the target object to the central area of the display end according to the moving amount and the moving direction.
Optionally, the adjusting module 503 is configured to determine two second distances between the center of the target object and the mutually parallel edges of the display end; calculating the movement amount by using the following calculation formula according to the two determined second distances;
M = |a - b| / 2
wherein M represents the amount of movement, a represents one of the two second distances, and b represents the other of the two second distances.
Optionally, the adjusting module 503 is further configured to determine a focal length adjustment amount according to the position information of the target object in the display end, the resolution of the image to be displayed, and the resolution of the display end, and adjust the display scale of the target object according to the focal length adjustment amount.
An embodiment of the present invention further provides a server, including: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method according to any one of the preceding first aspects.
An embodiment of the present invention further provides a computer-readable medium, on which a computer program is stored, where the computer program is configured to, when executed by a processor, implement the method according to any one of the above first aspects.
Fig. 6 illustrates an exemplary system architecture 600 to which an object display method or an object display apparatus according to an embodiment of the present invention may be applied.
As shown in fig. 6, the system architecture 600 may include terminal devices 601, 602, 603, a network 604, and a server 605. The network 604 serves to provide a medium for communication links between the terminal devices 601, 602, 603 and the server 605. Network 604 may include various types of connections, such as wire, wireless communication links, or fiber optic cables, to name a few.
A user may use the terminal devices 601, 602, 603 to interact with the server 605 via the network 604 to receive or send messages or the like. Various communication client applications, such as shopping applications, web browser applications, search applications, instant messaging tools, mailbox clients, social platform software, and the like, may be installed on the terminal devices 601, 602, and 603.
The terminal devices 601, 602, 603 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 605 may be a server providing various services, such as a background management server (for example only) providing support for shopping websites browsed by users using the terminal devices 601, 602, 603. The backend management server may analyze and perform other processing on the received data such as the product information query request, and feed back a processing result (for example, target push information, product information — just an example) to the terminal device.
It should be noted that the object display method provided by the embodiment of the present invention is generally executed by the server 605, and accordingly, the object display apparatus is generally disposed in the server 605.
It should be understood that the number of terminal devices, networks, and servers in fig. 6 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 7, shown is a block diagram of a computer system 700 suitable for use with a terminal device implementing an embodiment of the present invention. The terminal device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU)701, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the system 700 are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program performs the above-described functions defined in the system of the present invention when executed by the Central Processing Unit (CPU) 701.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented by software or hardware. The described modules may also be provided in a processor, which may be described as: a processor includes an object identification module, a location determination module, and an adjustment module. The names of these modules do not in some cases constitute a limitation on the module itself, and for example, the object recognition module may also be described as a "module that recognizes a target object from an image to be displayed".
As another aspect, the present invention also provides a computer-readable medium that may be contained in the apparatus described in the above embodiments; or may be separate and not incorporated into the device. The computer readable medium carries one or more programs which, when executed by a device, cause the device to comprise: collecting an image to be displayed; identifying a target object from the image to be displayed; determining the position information of the target object in a display end; and adjusting the target object to a target area in the display end according to the position information so as to display the target object in the target area.
According to the technical scheme of the embodiment of the invention, the target object is adjusted to the target area in the display end according to the position information of the target object in the image to be displayed, so that the target object is displayed in the target area, the target object is convenient to view, and the user experience is further improved.
The above-described embodiments should not be construed as limiting the scope of the invention. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An object display method, comprising:
collecting an image to be displayed;
identifying a target object from the image to be displayed;
determining the position information of the target object in a display end;
and adjusting the target object to a target area in the display end according to the position information so as to display the target object in the target area.
2. The method according to claim 1, wherein the adjusting the target object to a target area in the display end according to the position information comprises:
placing the image to be displayed and the display end in the same coordinate system;
calculating a first distance between the center of the target object and the center of the target area in the coordinate system according to the position information of the target object in the coordinate system and the position of the target area in the coordinate system;
and adjusting the target object to the target area according to the first distance.
3. The method of claim 1, wherein the target area is a central area of the display end;
the adjusting the target object to the target area in the display end according to the position information includes:
calculating a second distance between the center of the target object and the edge of the display end according to the position information of the target object;
determining the moving amount and the moving direction of the target object to the central area according to the second distance;
and moving the target object to the central area of the display end according to the moving amount and the moving direction.
4. The method of claim 3, wherein said calculating an amount of movement of said target object to said central region based on said second distance comprises:
determining two second distances between the center of the target object and mutually parallel edges in the display end;
calculating the movement amount by using the following calculation formula according to the two determined second distances;
M = |a - b| / 2
wherein M represents the amount of movement, a represents one of the two second distances, and b represents the other of the two second distances.
5. The method of claim 1, further comprising:
determining a focal length adjustment amount according to the position information of the target object in the display end, the resolution of the image to be displayed and the resolution of the display end;
and adjusting the display scale of the target object according to the focal length adjustment amount.
6. The method according to any one of claims 1 to 5, wherein the identifying a target object from the image to be displayed and the determining the position information of the target object corresponding to the image to be displayed comprise:
selecting a plurality of candidate areas from the image to be displayed;
carrying out normalization processing on the candidate area;
determining the category of the candidate region after normalization processing by using a convolutional neural network model;
and determining the target object and the position information of the target object according to the category of the candidate region.
7. An object display apparatus, comprising: an object identification module, a position determination module and an adjustment module; wherein:
the object identification module is used for acquiring an image to be displayed and identifying a target object from the image to be displayed;
the position determining module is used for determining the position information of the target object in the image to be displayed;
the adjusting module is used for adjusting the target object to a target area in a display end according to the position information so as to display the target object in the target area.
8. The apparatus of claim 7,
the adjusting module is used for placing the image to be displayed and the display end in the same coordinate system; calculating a first distance between the center of the target object and the center of the target area in the coordinate system according to the position information of the target object in the coordinate system and the position of the target area in the coordinate system; and adjusting the target object to the target area according to the first distance.
9. A server, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
10. A computer-readable medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-6.
CN201910447742.1A (priority date 2019-05-27, filing date 2019-05-27) Object display method and device, published as CN112000218A, status: Pending

Priority Applications (1)

Application Number: CN201910447742.1A; Priority Date: 2019-05-27; Filing Date: 2019-05-27; Title: Object display method and device


Publications (1)

Publication Number: CN112000218A; Publication Date: 2020-11-27

Family ID: 73461271

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination